Just caught something pretty significant that Jensen Huang dropped on investors last month, and it's worth paying attention to if you're watching the AI infrastructure space.

So Nvidia's about to roll out its next-gen Vera Rubin platform in the second half of this year, and the specs are honestly wild. We're talking about AI models training with 75% fewer GPUs than on their current Blackwell chips, plus a 90% reduction in inference token costs. For context, that's the kind of efficiency jump that actually changes the economics of running AI services at scale.

Here's where it gets interesting though. During their earnings call back in late February, Jensen Huang made a comment that really highlighted how massive this opportunity actually is. When someone asked whether customers could keep up their heavy capex spending on data centers, Huang essentially said the world has historically spent around $400 billion annually on classical computing infrastructure. But for AI workloads, he suggested we'll need roughly a thousand times more capacity than that.

Last year, Huang mentioned that AI data center infrastructure spending could hit $4 trillion annually by 2030. At the time that sounded like a big number, but if he's right about the sheer scale of compute required, especially as inference costs fall and usage accelerates, it starts to look more credible.

Looking at the numbers, Nvidia just pulled in $215.9 billion in revenue for fiscal 2026, up 65% year-over-year, with data center sales hitting $193.7 billion. They're guiding for $78 billion in Q1 fiscal 2027, which would be a 77% jump. Most of that's obviously coming from the data center business.

What's wild is the valuation. The stock's trading at a P/E of 36.1 right now, which is actually 41% below its 10-year average of 61.6. Wall Street's consensus for fiscal 2027 earnings is $8.23 per share, giving it a forward P/E of just 21.5. For comparison, the S&P 500 trades at 24.7 trailing P/E today. So if earnings hit estimates and the stock doesn't move, Nvidia could actually become cheaper than the broad market.
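The valuation math above checks out if you run it back-of-the-envelope. A quick sketch, using only the figures quoted in this post (the implied share price and trailing EPS are derived values, not reported ones):

```python
# Back-of-the-envelope check on the valuation figures quoted above.
trailing_pe = 36.1        # current trailing P/E
ten_year_avg_pe = 61.6    # 10-year average P/E
fy2027_eps_est = 8.23     # Wall Street consensus FY2027 EPS ($)
forward_pe = 21.5         # quoted forward P/E

# Discount of the current multiple to its 10-year average
discount = 1 - trailing_pe / ten_year_avg_pe   # ~0.41, i.e. ~41% below average

# Share price implied by the forward multiple (derived, not quoted)
implied_price = forward_pe * fy2027_eps_est    # ~ $177

# Trailing EPS implied by that price at the current multiple (derived)
implied_trailing_eps = implied_price / trailing_pe

print(f"discount to 10y avg P/E: {discount:.0%}")
print(f"implied share price: ${implied_price:.0f}")
print(f"implied trailing EPS: ${implied_trailing_eps:.2f}")
```

In other words, the ~41% discount and the 21.5 forward multiple are internally consistent with one another given the $8.23 EPS estimate.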

I'm not calling price targets here, but if you're thinking about the sheer magnitude of what Jensen Huang is describing with AI infrastructure buildout, and you look at where valuations actually sit relative to historical averages, the risk-reward feels pretty interesting at these levels. The Vera Rubin ramp starting this year could be a meaningful catalyst too.