Nvidia (NASDAQ: NVDA) didn’t just beat expectations in Q4 FY26; it effectively reset the ceiling for the entire AI infrastructure complex.
The company delivered $68.1 billion in quarterly revenue, more than triple the $22.1 billion reported just two years ago (Q4 FY24). But the real catalyst was forward guidance.
Management projected $78 billion for Q1 FY27 (±2%), roughly $5–6 billion above consensus, a decisive signal that the AI infrastructure build-out is not slowing.
Executive Snapshot
- Q4 FY26 Revenue: $68.1B (+77% Y/Y)
- Next Quarter Guide: $78B (+77% Y/Y)
- Gross Margin: ~75%
- Data Center Revenue: $62.3B (+75% Y/Y)
- Networking Revenue: $11B (3.5x Y/Y)
- Other Income: $5.6B (equity gains, incl. Intel stake)
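The headline arithmetic above can be reproduced in a few lines. This is a minimal sanity-check sketch; every figure comes from the release as quoted in this article, and the ±2% band is management's stated guidance tolerance:

```python
# Sanity-check the headline figures from Nvidia's Q4 FY26 print.
# All dollar amounts are in billions and are taken from the article above.

q4_fy26_revenue = 68.1   # reported quarterly revenue
q4_fy24_revenue = 22.1   # same quarter two years earlier

# Two-year multiple: 68.1 / 22.1 ≈ 3.08x, i.e. slightly more than triple.
two_year_multiple = q4_fy26_revenue / q4_fy24_revenue

# Q1 FY27 guidance of $78B ±2% implies a band of roughly $76.4B–$79.6B.
guide_mid = 78.0
guide_low = guide_mid * 0.98
guide_high = guide_mid * 1.02

print(f"Two-year growth multiple: {two_year_multiple:.2f}x")
print(f"Guidance band: ${guide_low:.1f}B – ${guide_high:.1f}B")
```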
Despite the beat and raise, shares traded flat to slightly down in early action, a classic “sell the news” rotation after a heavily anticipated print.
Inside the Engine Room: Data Center = 91% of Revenue
The vast majority of Nvidia’s revenue now comes from a single segment.
Revenue Breakdown (Q4 FY26)
| Segment | Revenue | Y/Y Growth |
|---|---|---|
| Data Center | $62.3B | +75% |
| Gaming | $3.7B | +47% |
| Professional Visualization | $1.3B | +159% (+74% Q/Q) |
| Automotive | $604M | +6% |
Data Center alone accounts for roughly 91% of total revenue, underscoring Nvidia’s transformation from GPU vendor to AI infrastructure backbone.
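The concentration figure follows directly from the table. A short sketch, using the segment revenues in billions as reported above, confirms the ~91% Data Center share:

```python
# Segment revenue for Q4 FY26, in $B, from the table above.
segments = {
    "Data Center": 62.3,
    "Gaming": 3.7,
    "Professional Visualization": 1.3,
    "Automotive": 0.604,
}

total_revenue = 68.1  # reported total; the small remainder sits in OEM/other

for name, revenue in segments.items():
    share = revenue / total_revenue * 100
    print(f"{name}: {share:.1f}% of revenue")
# Data Center: 62.3 / 68.1 ≈ 91.5%, matching the ~91% cited in the text.
```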
The Two Engines: Compute + Networking
1. Compute — $51.3B
This is the core AI accelerator business: Hopper and now Blackwell.
The transition from Hopper (H100) to Blackwell (GB200/GB300) marked a generational shift this quarter.
- Blackwell delivered billions in its first full shipping quarter
- 2.2x training performance improvement
- ~30x faster inference for reasoning workloads
Management described demand as “insatiable,” with systems effectively sold out through mid-2026.
2. Networking — $11B (The Underrated Story)
Nvidia is quietly becoming one of the largest networking vendors in the world.
- Spectrum-X Ethernet platform growing triple digits
- InfiniBand + Ethernet revenue hit $11B
- Management now frames this as the “world’s largest networking business”
As AI clusters scale to tens of thousands of GPUs, networking bandwidth becomes as mission-critical as the silicon itself.
Nvidia is no longer selling chips; it’s selling complete AI factories.
The Customer Mix Is Changing
CFO Colette Kress confirmed hyperscalers (Microsoft, Google, Meta, Amazon) still represent slightly over 50% of Data Center revenue.
But the fastest growing category is “the rest.”
The Three New Buyers:
- Sovereign AI Nations: Countries including India, Japan, Denmark, and Saudi Arabia are building domestic AI clouds to keep data localized.
- Enterprise AI Agents: Salesforce, ServiceNow, Accenture, and others are buying Blackwell systems to power workflow automation and agentic AI.
- AI Model Builders: Labs like OpenAI and Anthropic effectively write nine-figure checks for training clusters.
CEO Jensen Huang calls these facilities “AI Factories”, data centers that convert electricity and raw data into digital intelligence.
The China Nuance: Compute vs. Networking
Management guided to zero Data Center compute revenue from China in the $78B forecast.
That distinction matters.
While U.S. export controls restrict advanced AI accelerators, Nvidia can still sell:
- Networking hardware
- Lower tier chips not classified as advanced compute
Management’s deliberate use of the narrower term “compute revenue” reflects a precise reading of the regulatory landscape: the export controls target advanced accelerators, not the full product line.
Blackwell → Rubin: One Year Cadence Confirmed
The AI cycle is accelerating.
Earlier this year at CES 2026, Nvidia unveiled the Rubin architecture and the Vera CPU, confirming a structural shift to a strict one-year product cadence (versus the historical two-year cycle).
Roadmap
- Blackwell: Sold out through mid-2026
- Rubin: Launching 2H 2026
- Rubin Ultra: Scheduled for 2027
- Rubin targets 10x lower inference cost vs. Blackwell
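The cost claim in the roadmap can be framed in concrete terms: at one-tenth the inference cost per token, the same serving budget covers ten times the volume. In the sketch below, the 10x reduction is the Rubin target cited above, while the $2.00-per-million-token baseline and the $1,000 budget are illustrative placeholders, not disclosed figures:

```python
# Illustrative only: the baseline price and budget are hypothetical
# placeholders; the 10x reduction is the Rubin target cited above.
blackwell_cost_per_mtok = 2.00                       # $ per 1M tokens (assumed)
rubin_cost_per_mtok = blackwell_cost_per_mtok / 10   # 10x lower target

budget = 1_000.0  # hypothetical monthly serving budget, in dollars

blackwell_tokens = budget / blackwell_cost_per_mtok  # millions of tokens served
rubin_tokens = budget / rubin_cost_per_mtok

print(f"Blackwell: {blackwell_tokens:,.0f}M tokens")
print(f"Rubin:     {rubin_tokens:,.0f}M tokens (10x the volume per dollar)")
```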
This cadence is designed to stay ahead of competitors such as AMD and custom silicon from hyperscalers.
The $5.6B “Other Income” Easter Egg
A notable income statement detail:
Nvidia reported $5.6 billion in Other Income, largely from gains on equity investments including its strategic stake in Intel.
Even outside core operations, the company is compounding capital.
DGX Cloud and the Asset Light Strategy
Contrary to common perception, Nvidia does not own massive hyperscale data centers like cloud providers do.
Its model:
- DGX Cloud: Nvidia owned hardware deployed inside partner data centers (Azure, Google Cloud, Oracle).
- Colocation (Equinix, Digital Realty): Nvidia supplies blueprints + silicon, real estate firms own the buildings.
- Internal R&D Clusters: Eos and Selene used for chip design and AI research.
The strategy: sell the shovels, not own the gold mine.
Risks and Constraints
Despite the momentum, several headwinds remain:
1. China Restrictions
Guiding to zero China compute revenue takes that market out of the model entirely; the restrictions remain a structural headwind, though any easing would be incremental upside.
2. Supply Chain Complexity
Advanced packaging (CoWoS) remains capacity constrained.
3. Valuation Expectations
With price targets from Goldman Sachs, Morgan Stanley, and Bank of America ranging from $250 to $275, and Cantor Fitzgerald at $300, expectations are elevated.
The Bottom Line
This was not just a strong quarter.
It was confirmation that:
- AI infrastructure spend is accelerating
- Networking is becoming equal to compute in strategic importance
- Sovereign AI is a new multi year demand vector
- Nvidia has locked in a product roadmap through at least 2027
The $78B guide tells the market one thing clearly:
The AI build-out is still in early innings, and Nvidia remains the primary toll collector.

