Nvidia is betting $25 billion that the AI boom isn’t over yet

Written by Stephen Nellis and Max A. Cherney

(Reuters) – NVIDIA CEO Jensen Huang said he expects the artificial intelligence boom to continue into the next year, and made what may be the biggest single bet yet in the technology sector to support his optimism.

The company’s sales forecast on Wednesday beat Wall Street’s estimates, and Nvidia said it would buy back another $25 billion of its shares, a move companies typically make when their leadership believes the stock is undervalued. Yet Nvidia’s share price has more than tripled this year and was set to hit an all-time high after Wednesday’s results.

Nvidia said it plans to ramp up production of its hardware next year, dispelling doubts some analysts had about how long the AI craze could last. The company has a near monopoly on the computing systems used to run services like ChatGPT, OpenAI’s popular generative AI chatbot.

“We have an excellent vision for the year and into next year, and we are already planning the next generation infrastructure with leading (cloud computing companies) and data center builders,” Huang told investors on a conference call.

In an interview with Reuters, Huang said two things are driving this demand: the shift from traditional data centers built around central processors to ones built around powerful Nvidia chips, and the growing use of content generated by AI systems in everything from legal contracts to marketing materials.

“These two fundamental trends are what underlie everything we see, and we’re about a quarter of the way into it,” he said. “It’s hard to say how many quarters we have ahead of us, but this fundamental shift is not going to end. This is not a one-quarter phenomenon.”

Huang’s move to buy back shares when they’re more expensive than they’ve ever been tops the bets even other big tech companies are making on AI, but it comes as Nvidia’s price-to-earnings multiple has fallen to around 43 from 60 in May, after analysts raised their earnings estimates.

Microsoft, for its part, said the $10.7 billion in capital expenditures it made in its fiscal fourth quarter — a large portion of which went toward Nvidia hardware — is a number that will continue to rise. It has also invested $10 billion in OpenAI.

Meta Platforms, Amazon.com’s AWS cloud computing unit and others have collectively bet tens of billions of dollars on AI-related hardware and products.

Demand for chips has given Nvidia the cash for an investor payday. The company reported that its adjusted gross margin nearly doubled to 71.2% in the second quarter, while most semiconductor companies’ gross margins fall in the 50% to 60% range.

Nvidia’s $4.32 billion in stock buybacks is “light,” said Kinngai Chan, an analyst with Summit Insights Group.

“We believe (Nvidia) will continue to exceed the $16 billion guidance figure for the October quarter as demand continues to outpace supply,” Chan said, referring to the company’s third-quarter revenue forecast.

To be sure, some analysts do not see unlimited demand. SemiAnalysis’s Dylan Patel said many tech companies are spending big on Nvidia GPUs this year before figuring out how to actually make money from products developed with those chips.

“They have to overinvest in GPUs or risk missing the boat,” Patel said. “At some point, the real use cases will change, and many of these players will stop investing, although others will likely continue to accelerate the investment.”

Huang declined to predict how long the AI boom will last beyond next year. The biggest risk Nvidia faces, he said, is supply security.

The company said the biggest driver of sales this quarter was the HGX system, a complete computing system built around Nvidia’s chips. This system is far more complex than the chip alone, and any missing component can delay shipments.

“We get a lot of cooperation from our supply chain. It’s a complex supply chain,” Huang told Reuters. “People think it’s a GPU chip. But it’s a very complex GPU system. It weighs 70 pounds. It has 35,000 components. And it costs $200,000.”

(Reporting by Stephen Nellis and Max A. Cherney in San Francisco; Additional reporting by Shafi Mehta; Editing by Sonali Paul)
