Nvidia CEO Jensen Huang sees at least $1 trillion of AI chip revenue opportunity through 2027
The figure signals Huang’s confidence that Nvidia can remain the biggest player in the market for AI chips amid growing competition and investor doubts about whether its strategy of plowing back its profits into the AI ecosystem is paying off.
Huang did not offer more details on the forecast. But it marks a big step up from the roughly $500 billion revenue opportunity through 2026 that Nvidia reiterated on its last earnings call.
Shares of Nvidia – the world’s most valuable listed company with a market value of more than $4.3 trillion – briefly jumped on the news but pared those gains and were last up 1.4%.
Huang is speaking at a hockey arena with a capacity of more than 18,000 at an event that has become one of the biggest showcases of AI technology.
At the four-day conference, he is also expected to lay out how the top AI chipmaker plans to adapt to a rapidly changing AI landscape.
He started the keynote by making the argument that part of Nvidia’s competitive advantage was its CUDA chip programming software, which some analysts regard as its strongest shield. “The installed base is what attracts developers who then create (the) new algorithms that achieve the breakthrough” technologies, Huang said. “We are in every cloud. We’re in every computer company. We serve just about every single industry.”
IN-DEMAND AI CHIPS
The keynote is also likely to include detail on a next-generation AI chip called Feynman, named after late American physicist Richard Feynman.
Huang is also likely to talk about data centers, digital assistants known as AI agents and physical AI such as robots.
Another focus is likely to be Groq, a chip startup from which Nvidia licensed technology for $17 billion in December. Groq specializes in fast and cheap “inference” computing work, in which an AI model takes what it has already learned and uses it to answer a question or make a prediction in real time.
After spending hundreds of billions of dollars in recent years on chips for training their AI models, companies such as OpenAI, Anthropic and Facebook owner Meta Platforms are shifting toward serving hundreds of millions of users who are tapping those AI systems.
Nvidia faces greater competition in the market for chips for inference-computing work than it does for AI-training chips, and analysts expect the company to shore up its defenses against rivals looking to regain market share they lost to Nvidia in recent years.
Analysts also expect Nvidia to elaborate on why it invested $2 billion each in Lumentum and Coherent, both of which make lasers for sending information between chips in the form of beams of light.
Despite that increased competition, some of which is coming from Nvidia’s own customers designing their own chips, Nvidia remains central to the global AI ecosystem.
Nations such as Saudi Arabia are building custom AI systems for their own populations using its chips, and it is one of the few large U.S. companies that continues to release open-source AI software, a growing field of competition between the U.S. and China.