Return on AI: Why Nvidia's Q2 results are so keenly anticipated
The numbers, and commentary from CEO Jensen Huang, will be microscopically examined for a sense of just how much further AI is expected to grow

Nvidia Corp, which rubs shoulders with Apple and Microsoft among the world's most valuable public companies, reports its fiscal second-quarter results later today in the US (August 29, 2.30 am in India).
While capital markets investors are eager to find out how much more upside there is likely to be on the stock, the results and commentary from Co-Founder and CEO Jensen Huang may also give us a sense of what’s next in the world of generative artificial intelligence based on large language models (LLMs).
In less than three years, Nvidia’s graphics processing units (GPUs), which started out in computer games, have become the de facto underpinnings of generative AI, as Microsoft, Google, Amazon Web Services—hyperscalers, as these cloud providers are now called—and Meta gobbled up everything the chipmaker could provide.
Nvidia’s data centre revenues have gone from $2.37 billion in Q2 of fiscal year 2022 (Nvidia follows a February–January financial calendar) to $10.32 billion in Q2 of FY24. In Q1 of FY25, that figure was $22.6 billion. The company’s projection for Q2 FY25 total revenue is $28 billion, most of which will come from sales to the hyperscalers and other data centre customers, while the gaming segment, which has declined, is a distant second.
The dominance of Nvidia’s chips in the data centres that are powering the world’s biggest AI models has led commentators on financial platforms such as Bloomberg to describe the upcoming earnings as a “macro event”.
“Part of it is, Nvidia is the face of AI at this point, or one of the faces of AI,” Alvin Nguyen, senior analyst at Forrester Research, said in an interview with Forbes India on August 24. “That’s part of being essentially number one by a large, dominant margin.”
Neil Shah, vice president of research at Counterpoint Technology Market Research, echoes this: Nvidia, for now, “is the only show in town” when it comes to providing these high-end processors, known as accelerators, Shah said in an interview with Forbes India on August 24.
The deciding factor has been that Nvidia also developed a sophisticated software platform that makes it easy for developers to use its chips and build powerful applications. Its gaming heritage helped, and Huang also led timely acquisitions that added heft and depth to the platform.
Over the last two to three years, the world has witnessed the rise of ChatGPT and the underlying LLMs that power it and other such generative AI models. This has propelled much of the AI industry, taking many companies by surprise, Shah said, with the power of LLMs simplifying and automating a plethora of tasks.
Training such models with higher accuracy required heavy-duty compute power, in addition to good data. “In running just one epoch, which could involve billions of parameters, it takes like hundreds of millions of dollars,” Shah says. An “epoch” in this context refers to one full pass through a data set while training an AI model.
First Published: Aug 28, 2024, 13:56