Sam Altman hopes to take on Nvidia with new global network of AI chip factories
OpenAI’s CEO is in talks with TSMC
OpenAI's CEO Sam Altman has embarked on a global campaign to create a network of artificial intelligence chip factories that can take on Nvidia's dominance of the technology.
Larger AI labs like OpenAI are spending billions on Nvidia GPUs to train the next generation of large language models. They are then spending more still to run those models for consumers.
To tackle this problem, some of the bigger companies are looking at ways to shrink their models, improve efficiency and even create new, cheaper custom chips, but making advanced semiconductors is both expensive and complicated.
For his new chip project, Altman has spoken to several potential investors, as the cost is likely to run into the billions. Potential backers include Abu Dhabi-based G42 and Japan's SoftBank Group, and he is said to be in talks with Taiwanese manufacturer TSMC to make the chips.
Why does Sam Altman want to make AI chips?
Nvidia became a trillion-dollar company for the first time last year off the back of its near-monopoly on the high-end GPUs capable of training the most advanced AI models.
Earlier this month, Meta announced it was buying 350,000 Nvidia H100 GPUs to train a future superintelligence and make it open source. Dubbed the first chip designed for generative AI, the H100 costs about $30,000 and is in very high demand.
Google trained its next-generation Gemini model on its own chips, known as Tensor Processing Units (TPUs), which it has been developing for more than a decade.
This would have significantly reduced the overall cost of training such a large model and given Google’s developers greater control over how it was trained and optimized.
What is involved in making chips?
Making semiconductors is expensive. It takes a lot of natural resources, funding and research to reach a point where any new chip can perform at the highest level.
There are a limited number of fabrication facilities around the world able to construct the type of high-end chip needed by OpenAI, leading to a potential bottleneck in training the next generation of models.
Altman wants to boost this global capacity with a new network of fabrication facilities dedicated exclusively to AI chips.
OpenAI is expected to partner with a company like Intel, TSMC or Samsung to produce its own AI chips, or it could work with existing investor Microsoft, which announced last year that it was making its own AI chips to run AI services within its Azure cloud platform.
What is the bigger picture?
Amazon has its own Trainium chips that run inside its AWS cloud service for training AI models, and Google Cloud uses TPUs. However, despite having their own chips, all of the major cloud companies make heavy use of Nvidia's H100 processors.
Altman will also have to contend with continued improvements from Nvidia, which could draw investors away from OpenAI’s chip project.
Nvidia confirmed its GH200 Grace Hopper chips last year, and Intel’s new Meteor Lake processors include dedicated AI hardware that could see more AI models run locally rather than in the cloud.