ChatGPT is more useful to society than cryptocurrency, says Nvidia

Nvidia GPU (Image credit: Shutterstock)

The world’s processing power is better spent building artificial intelligence apps like ChatGPT and Google Bard than mining cryptocurrency. That’s according to Nvidia, which would rather see its GPUs used for gaming and AI.

Michael Kagan, Nvidia’s chief technology officer, said the company never embraced cryptocurrencies, and that it constrained its RTX 30-series graphics cards to limit their use for mining.

“All this crypto stuff, it needed parallel processing, and [Nvidia] is the best, so people just programmed it to use for this purpose,” he told the Guardian. “They bought a lot of stuff, and then eventually it collapsed, because it doesn’t bring anything useful for society. AI does.”

Despite Nvidia’s reluctance to engage with crypto enthusiasts, the company undoubtedly did well out of the Bitcoin and Ethereum bull market. In all likelihood, it sold a fair number of GPUs to those wanting to mine digital currencies.

“I never believed that [crypto] is something that will do something good for humanity,” Kagan added. “You know, people do crazy things, but they buy your stuff, you sell them stuff. But you don’t redirect the company to support whatever it is.”

Powering the revolution

Nvidia GeForce RTX 4090 being held by Nvidia CEO Jensen Huang (Image credit: Nvidia)

While cryptocurrency may be struggling (Ethereum can no longer be mined, and the time and power requirements for Bitcoin are prohibitive), the AI revolution is in full swing. That’s been good for Nvidia. According to UBS analyst Timothy Arcuri, OpenAI, the company behind ChatGPT, used around 10,000 of Nvidia’s GPUs to train the model.

When the question of how many Nvidia GPUs were used in its development was put to the chatbot directly, ChatGPT returned the following answer: “The exact number and type of GPUs used during my training process are not publicly disclosed, but it is known that the OpenAI team used a large-scale transformer-based language model architecture and trained me on a massive dataset of text.”

Currently, tens of thousands of Nvidia’s A100 and H100 Tensor Core GPUs handle training and inference for AI models like ChatGPT, running on Microsoft’s Azure cloud service.

Meanwhile, at the company’s annual GTC conference last week, Nvidia CEO Jensen Huang said his company was the engine powering “the iPhone moment of AI”.

Jeff Parsons
UK Editor-in-Chief

Jeff is UK Editor-in-Chief for Tom’s Guide, looking after the day-to-day output of the site’s British contingent. Rising early and heading straight for the coffee machine, Jeff loves nothing more than dialling into the zeitgeist of the day’s tech news.

A tech journalist for over a decade, he’s travelled the world testing any gadget he can get his hands on. Jeff has a keen interest in fitness and wearables as well as the latest tablets and laptops. A lapsed gamer, he fondly remembers the days when problems were solved by taking out the cartridge and blowing away the dust.