OpenAI is developing a custom AI chip with TSMC and Broadcom

OpenAI is developing a custom AI chip, working with TSMC and Broadcom on a project that could reshape its hardware foundation. According to a recent report from Reuters, OpenAI has decided to abandon earlier plans to create its own network of foundries due to the high costs and time demands involved. It is now focused on designing custom chips in-house and relying on industry experts for manufacturing.

Sources familiar with the initiative say OpenAI has secured manufacturing capacity with TSMC and hopes to produce its first custom AI chip by 2026, though the timeline could shift as the project progresses. Reports of the company’s interest in an AI server chip emerged in July 2024, but the idea of designing its own chips had surfaced as early as late 2023. CEO Sam Altman has been a major proponent of OpenAI having its own chips, exploring funding opportunities under the codename “Project Tigris” before his brief ouster and rehiring in November 2023.

OpenAI now has around 20 people dedicated to chip design, including former Lightmatter chip engineering lead and Google TPU head Richard Ho, who joined as head of hardware last year. Broadcom, known for its key role in developing Google’s TPU AI chips, is also contributing to OpenAI’s chip effort.

To meet its current demand, OpenAI has begun broadening its hardware mix, using AMD chips alongside Nvidia GPUs to train its AI models. The shift aims to reduce the company’s reliance on Nvidia, whose high-cost GPUs have faced supply constraints; sources said those cost and supply issues partly drove OpenAI’s pursuit of a custom chip.

Microsoft, OpenAI’s largest investor with nearly $14 billion in funding, has also been instrumental in expanding its hardware options. In May 2024, Microsoft announced that AMD’s MI300X accelerators would be available on its Azure platform, which OpenAI uses to access AMD chips. Scott Guthrie, Microsoft’s EVP for Cloud and AI, called the MI300X “the most cost-effective GPU out there right now for Azure OpenAI.”