According to two insiders, OpenAI is collaborating with Broadcom to develop a new type of artificial intelligence (AI) chip specifically designed to run trained AI models.
The sources say the AI startup and the chipmaker are also in talks with TSMC, the world's largest contract chip manufacturer. OpenAI has been planning a custom chip for about a year and remains committed to the effort, they added, but the discussions are still at an early stage.
In June, it was reported that Broadcom had discussed manufacturing an artificial intelligence chip for OpenAI.
Designing and producing a chip is a long and expensive process. OpenAI's effort is not focused on graphics processing units (GPUs), the chips used to train and build generative AI models, a market dominated by Nvidia. Instead, it is pursuing a specialized chip to run its software and respond to user requests, a process known as inference. Investors and analysts expect demand for inference chips to grow as more technology companies use AI models to perform increasingly complex tasks.
A source familiar with the matter said OpenAI may continue to explore building its own network of foundries, or chip factories, but the startup has concluded that developing custom chips with partners is a faster and more feasible path for now. Earlier reports said OpenAI had shelved its plans to build its own chip manufacturing operations.
Broadcom is the largest designer of application-specific integrated circuits (ASICs), chips built to serve a single purpose specified by the customer. Its largest customer in this area is Google, a subsidiary of Alphabet. Broadcom also works with Meta and TikTok owner ByteDance.
In September, when asked whether the huge demand for AI training had brought in new customers, Broadcom CEO Hock Tan said he would add names to the customer list only once a project reached volume shipments.
He said during an earnings conference call, "For any customer, this is not an easy product to deploy, so we don't consider a proof of concept to be mass production."
OpenAI's services require enormous computing power to develop and operate, most of which currently comes from Nvidia chips. To meet that demand, the company has been seeking alternatives to Nvidia, including using AMD's AI accelerators and developing its own chip in-house.
OpenAI is also actively planning data center investments and partnerships; those facilities would eventually run such AI chips. The startup's leadership has made the case to the US government for building larger data centers, and OpenAI CEO Sam Altman has sounded out global investors, including some in the Middle East, in hopes that they will help fund the projects.