OpenAI Partners with Broadcom, TSMC to Develop First Custom AI Chip

OpenAI is collaborating with Broadcom and TSMC to create its first AI chip by 2026

OpenAI, the company behind ChatGPT, has begun working with Broadcom and Taiwan Semiconductor Manufacturing Company (TSMC) to develop its first custom-designed artificial intelligence chip, Reuters reports. The company is focusing on an inference chip, which applies already-trained AI models to new data rather than training the models in the first place.
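
For readers less familiar with the distinction: training is the step that produces a model's weights, while inference is the step that applies the finished model to new inputs. A rough sketch of inference, using a made-up toy model in PyTorch (illustrative only, not anything OpenAI-specific), looks like this:

    import torch

    # Toy model used purely for illustration; a real system would load
    # weights produced by an earlier training run.
    model = torch.nn.Sequential(
        torch.nn.Linear(16, 32),
        torch.nn.ReLU(),
        torch.nn.Linear(32, 4),
    )

    model.eval()  # switch layers such as dropout into inference behavior

    with torch.no_grad():  # inference needs no gradients, unlike training
        new_input = torch.randn(1, 16)  # one new example arriving at serving time
        prediction = model(new_input)   # forward pass only
        print(prediction)

Because the forward pass is all that inference requires, chips built for it can skip much of the machinery that training demands, which is one reason companies design separate silicon for the two workloads.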

The artificial intelligence company has assembled a specialized team of approximately 20 chip engineers for this project, including Thomas Norrie and Richard Ho, who previously worked on Tensor Processing Units at Google. Through its collaboration with Broadcom, OpenAI has secured manufacturing capacity with TSMC, with plans to produce its first custom chip by 2026.

Supply Chain Maneuvering

In response to growing infrastructure demands, OpenAI is pursuing a multi-pronged strategy to secure chip supply and manage costs. The company is diversifying its processor sources, adding AMD chips alongside Nvidia's graphics processing units (GPUs) through Microsoft's Azure platform.

Initially, OpenAI considered establishing its own network of chip manufacturing facilities, known as foundries. However, the company abandoned these plans due to the substantial costs and time requirements involved, choosing instead to concentrate on chip design efforts.

The news had an immediate impact on the stock market, with Broadcom shares rising more than 4.5% and AMD shares gaining 3.7% following the announcement.

Market Impact

OpenAI’s decision to develop its own chip while maintaining relationships with multiple suppliers mirrors strategies employed by other major technology companies, including Amazon, Meta, Google, and Microsoft. As one of the largest purchasers of Nvidia’s GPUs, OpenAI’s moves could influence the broader technology sector.

Currently, Nvidia maintains over 80% of the market share in AI chips. However, supply constraints and increasing costs have prompted major customers to explore alternatives. AMD has projected $4.5 billion in AI chip sales for 2024, following the launch of its MI300X chips in late 2023.

The development comes as OpenAI faces significant operational costs. The company is projected to experience a $5 billion loss this year on $3.7 billion in revenue, with computing costs representing its largest expense category. These costs include hardware, electricity, and cloud services required for processing large datasets and developing AI models.

Despite pursuing chip supply diversification, OpenAI maintains its commitment to working with Nvidia, particularly for access to Nvidia’s new generation of Blackwell chips. The company has been careful to preserve its relationship with the dominant chip manufacturer while pursuing its own development initiatives.

