Californian artificial intelligence (AI) chip start-up SambaNova Systems has developed a new semiconductor to give its customers access to higher-quality AI models at a lower overall cost.
SambaNova CEO Rodrigo Liang noted that the new chip, named SN40L, “is specifically built for large language models running enterprise applications.” He added: “We’ve built a full stack that has allowed us to really understand the enterprise use case really well.”
Taiwan Semiconductor Manufacturing Company Limited currently manufactures the chip for SambaNova. The Palo Alto, California-based company says the SN40L is designed to run AI models more than twice the size of the model behind OpenAI’s latest version of ChatGPT.
The chip comes equipped with two advanced forms of memory and is capable of powering a 5-trillion-parameter model. SambaNova has stated that, with this hardware combination, its customers will be able to run larger AI models without trading size for accuracy.
Liang believes that large businesses seeking to deploy AI in novel ways face a different and more complex set of considerations than users of consumer software like ChatGPT. Security, accuracy, and privacy are all requirements that AI technology must be designed to meet if it is to be useful for enterprise customers.
Currently, Nvidia, famous for its powerful GPUs, dominates the AI chip market; however, other technology companies such as Intel, Advanced Micro Devices, and SambaNova are closing in, giving customers greater choice.