# Jensen Huang: AI Has To Do '100 Times More' Computation Now Than When ChatGPT Was Released
2025-02-27 14:22:01


In an interview with CNBC's Jon Fortt on Wednesday, Nvidia CEO Jensen Huang said next-gen AI will need 100 times more compute than older models as a result of new reasoning approaches that think "about how best to answer" questions step by step. From a report: "The amount of computation necessary to do that reasoning process is 100 times more than what we used to do," Huang told Fortt in the interview, which followed the chipmaker's fourth-quarter earnings report. He cited DeepSeek's R1, OpenAI's GPT-4 and xAI's Grok 3 as examples of models that use a reasoning process.
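A rough back-of-the-envelope sketch (not from the interview) of why step-by-step reasoning inflates per-query compute: assuming the common approximation that decoding costs about 2 × parameters FLOPs per generated token, and using purely illustrative token counts (not figures Huang cited), a long chain-of-thought before the final answer can push inference cost up by roughly two orders of magnitude.

```python
# Illustrative sketch only: assumed model size and token counts,
# plus the rough "2 * params FLOPs per generated token" decode estimate.

PARAMS = 70e9                  # hypothetical model size (parameters)
FLOPS_PER_TOKEN = 2 * PARAMS   # approximate decode cost per generated token

direct_answer_tokens = 50      # short, one-shot reply
reasoning_tokens = 5_000       # long chain-of-thought generated before the reply

direct_cost = direct_answer_tokens * FLOPS_PER_TOKEN
reasoning_cost = (reasoning_tokens + direct_answer_tokens) * FLOPS_PER_TOKEN

print(f"direct answer:  {direct_cost:.2e} FLOPs")
print(f"with reasoning: {reasoning_cost:.2e} FLOPs")
print(f"ratio: ~{reasoning_cost / direct_cost:.0f}x")   # ~100x with these assumed counts
```

With these assumed numbers the ratio works out to roughly 100x, which is one way to read Huang's claim: the extra computation comes from generating many more tokens per query, not from any single token becoming more expensive.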

Huang pushed back on concerns that DeepSeek's more efficient models would reduce demand for computing power, saying DeepSeek popularized reasoning models that will need more chips. "DeepSeek was fantastic," Huang said. "It was fantastic because it open sourced a reasoning model that's absolutely world class." Huang added that Nvidia's percentage of revenue from China has fallen by about half due to the export restrictions, and that the company faces other competitive pressures there, including from Huawei.

Developers will likely search for ways around export controls through software, whether it be for a supercomputer, a personal computer, a phone or a game console, Huang said. "Ultimately, software finds a way," he said. "You ultimately make that software work on whatever system that you're targeting, and you create great software." Huang said that Nvidia's GB200, which is sold in the United States, can generate AI content 60 times faster than the versions of the company's chips that it sells to China under export controls.

[Read more of this story](https://slashdot.org/story/25/02/27/0158229/jensen-huang-ai-has-to-do-100-times-more-computation-now-than-when-chatgpt-was-released?utm_source=atom1.0moreanon&utm_medium=feed) at Slashdot.