Business & Tech

Jensen Huang says the 3 elements of AI scaling are all advancing. Nvidia’s Blackwell demand will prove it.

Image: Jensen Huang and Sam Altman. Scaling has been a key concern for AI leaders.

  • Reports on an AI progress slowdown raised concerns about model scaling on Nvidia’s earnings call.
  • An analyst asked whether models are plateauing and whether Nvidia’s Blackwell chips could help.
  • Huang said there are three elements in scaling and that each continues to advance.

If the foundation models driving the panicked rush toward generative AI stop improving, Nvidia will have a problem: the company’s whole value proposition rests on continued demand for more and more computing power.

Concerns about scaling laws started recently with reports that OpenAI’s progress in improving its models was slowing. But Jensen Huang isn’t worried.

The Nvidia CEO got the question Wednesday, on the company’s third-quarter earnings call. Has progress stalled? And could the power of Nvidia’s Blackwell chips start it up again?

“Foundation model pre-training scaling is intact and it’s continuing,” Huang said.

He added that scaling isn’t as narrow as many think.

In the past, it may have been true that models only improved with more data and more pre-training. Now, AI can generate synthetic data and check its own answers, in effect training itself. But we’re running out of data that hasn’t already been ingested by these models, and the impact of synthetic data on pre-training is debatable.
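That generate-and-check loop can be sketched with a toy example. Everything here is illustrative, not any lab's actual pipeline: a deliberately unreliable "model" proposes answers to arithmetic problems, an automatic checker verifies them, and only verified pairs are kept as synthetic training data.

```python
import random

random.seed(0)  # deterministic for the sake of the example

def propose_answer(a, b):
    """A stand-in 'model': usually right, sometimes off by one."""
    guess = a + b
    if random.random() < 0.3:  # simulate an unreliable generator
        guess += random.choice([-1, 1])
    return guess

def verify(a, b, answer):
    """An automatic checker; for arithmetic, correctness is decidable."""
    return answer == a + b

def make_synthetic_dataset(n_problems, attempts_per_problem=4):
    """Keep only (problem, answer) pairs the checker accepts."""
    dataset = []
    for _ in range(n_problems):
        a, b = random.randint(0, 99), random.randint(0, 99)
        for _ in range(attempts_per_problem):
            ans = propose_answer(a, b)
            if verify(a, b, ans):
                dataset.append(((a, b), ans))
                break
    return dataset

data = make_synthetic_dataset(1000)
print(len(data), "verified examples")
```

The point of the sketch is that the bottleneck moves from data collection to verification: for domains with a reliable checker (math, code), a model can keep minting training pairs; where no checker exists, the value of the synthetic data is exactly as debatable as the article notes.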

As the AI ecosystem matures, tools for improving models are gaining importance. The first generation of post-training improvement for models came from armies of humans checking AI’s responses one by one.

Huang shouted out OpenAI’s o1 model, code-named Strawberry, which uses more modern strategies like “chain-of-thought reasoning” and “multi-path planning.” Both tactics encourage the models to think longer and in a more step-by-step fashion, so their responses are more considered.

“The longer it thinks, the better and higher quality answer it produces,” Huang said.
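The idea that answer quality scales with “thinking” time has a classical analogue in iterative numerical methods, where each extra refinement step buys accuracy. A minimal sketch — an analogy only, not how o1 actually works — using Newton’s method for square roots:

```python
def sqrt_estimate(x, steps):
    """Newton's method: each extra 'thinking' step refines the answer."""
    guess = x if x > 1 else 1.0
    for _ in range(steps):
        guess = 0.5 * (guess + x / guess)
    return guess

# More compute spent at answer time -> a better answer, at a latency cost.
for steps in (1, 3, 6):
    err = abs(sqrt_estimate(2.0, steps) - 2.0 ** 0.5)
    print(f"{steps} steps -> error {err:.2e}")
```

Each iteration reuses the previous guess, so the error shrinks monotonically with the number of steps — the same trade Huang describes: more computation per answer in exchange for a higher-quality result.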

Pre-training, post-training improvements, and new reasoning strategies all improve models, Huang said. Of course, if a model is doing more computing to answer the same fundamental question, higher-powered compute becomes necessary, especially since users want their responses just as fast, if not faster.
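The back-of-the-envelope arithmetic behind that point, with purely hypothetical numbers: if a reasoning model generates ten times as many tokens per answer and users expect the same latency, per-query throughput must rise by the same factor.

```python
# Hypothetical numbers, purely illustrative.
tokens_direct = 200        # tokens to produce a direct answer
tokens_reasoning = 2000    # tokens when the model "thinks" step by step
latency_target_s = 5.0     # users want the answer just as fast as before

throughput_old = tokens_direct / latency_target_s       # tokens/sec needed before
throughput_needed = tokens_reasoning / latency_target_s  # tokens/sec needed now
print(throughput_needed / throughput_old)  # 10.0 -> 10x the compute per query
```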

The demand for Blackwell is the result, he said.

After all, the first generation of foundation models took about 100,000 Hopper chips to build. “You know, the next generation starts at 100,000 Blackwells,” Huang said. The company said commercial shipments of Blackwell chips are just beginning.

Read the original article on Business Insider



This article was originally published by Emma Cosgrove at Business Insider: https://www.businessinsider.com/jensen-huang-scaling-ai-plateau-openai-nvidia-blackwell-chips-2024-11
