Why Nvidia Can Still Break Records Despite the Skeptics

Nvidia isn't just a chip company anymore. It’s the heartbeat of a global shift in how we process information. If you're looking at the stock chart and waiting for a massive correction, you might be waiting a long time. The "bubble" talk misses the point entirely because it ignores the physical reality of data centers.

Everyone asks if Nvidia can repeat the staggering growth of last year. That’s the wrong question. The real question is whether the world has finished building its AI infrastructure. The answer is a loud, resounding no. We're only in the early stages of a complete hardware overhaul that happens once every few decades.

The Blackwell Factor and the End of General Computing

The transition from traditional CPUs to GPUs isn't a trend. It’s a structural necessity. Old-school data centers are becoming obsolete because they can't handle the sheer weight of generative AI models. Nvidia’s Blackwell architecture isn't just a slight improvement over the H100. It’s a massive leap in efficiency and power.

Jensen Huang has been vocal about the concept of "sovereign AI." Countries like Japan, France, and Canada want their own domestic AI capacity. They don't want to rely on American or Chinese clouds. This creates a whole new layer of demand that didn't exist two years ago. When a nation-state decides it needs an AI supercomputer, it doesn't look for the cheapest option. It looks for the standard. Right now, Nvidia is the standard.

The H100 was the gold rush starter. Blackwell is the permanent settlement. While competitors like AMD and Intel are shipping decent products, they’re fighting for the leftovers. Nvidia’s software moat—specifically CUDA—is what keeps developers locked in. Switching to a different chip isn't just about the hardware. It's about rewriting millions of lines of code. Most companies won't do that to save a few bucks on a chip.

Why the Competition is Still Miles Behind

You’ll hear a lot about "in-house" chips from Google, Amazon, and Meta. People think this is the "Nvidia killer." It's not. These companies are building custom silicon for specific internal tasks, but they're still buying Nvidia units by the truckload for everything else. Why? Because Nvidia’s ecosystem is flexible.

A Google TPU is great if you’re doing exactly what Google designed it for. But if you’re a startup trying to invent a new type of video generation, you need the versatility of a Green Team GPU.

The Software Moat is Deep

  • CUDA is the industry language. Nearly every AI researcher learned on it.
  • The library ecosystem is massive. From medical imaging to fluid dynamics, there's a pre-optimized Nvidia library for it.
  • Developer mindshare stays put. It takes years to build a community. Nvidia started building this one nearly two decades ago, when CUDA launched in 2007.

The supply chain is the only real bottleneck. TSMC, the foundry that actually makes these chips, is running at full tilt. Nvidia’s ability to repeat its record-breaking performance depends almost entirely on how many wafers they can grab from TSMC. So far, they’ve managed to jump to the front of the line every single time.

Revenue Diversification Beyond the Data Center

While the world stares at data center revenue, the automotive and gaming sectors are quietly evolving. Self-driving tech is hitting a second wind. Tesla's FSD software and the driver-assistance systems of Chinese EV makers alike are trained on massive clusters of GPUs. This isn't just about selling a chip for the car; it's about selling thousands of chips to the company developing the car.

Gaming used to be Nvidia’s bread and butter. Now it’s a side hustle, but a side hustle that still brings in billions. With the next generation of 50-series cards on the horizon, the upgrade cycle will likely trigger another spike in consumer spending. People forget that even without AI, Nvidia was a powerhouse.

Financial Reality vs Market Fear

Let's talk numbers without getting bogged down in a spreadsheet. Nvidia’s margins are unheard of in the hardware world. They're pulling software-like margins—around 70% or more—on physical products. That’s because they aren't selling silicon. They’re selling time. If an H100 cluster lets a pharmaceutical company find a drug candidate six months faster, the price of the chip is irrelevant.

The bears love to point at the P/E ratio. But if you look at the forward earnings, Nvidia often looks cheaper than some "boring" consumer staple stocks. Growth is priced in, sure, but the growth hasn't slowed down yet. Each quarter, they manage to beat expectations that analysts thought were impossible.
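The gap between the trailing multiple the bears quote and the forward multiple the bulls quote is just arithmetic. A quick sketch makes the point (the numbers below are purely illustrative, not Nvidia's actual price or earnings):

```python
# Hypothetical figures for illustration only -- not Nvidia's actual numbers.
price = 130.0          # share price
trailing_eps = 2.00    # earnings per share over the last 12 months
forward_eps = 4.00     # consensus EPS estimate for the next 12 months

# The same stock, two very different multiples.
trailing_pe = price / trailing_eps   # what the P/E screeners show
forward_pe = price / forward_eps     # what forward-looking investors see

print(f"Trailing P/E: {trailing_pe:.1f}")  # 65.0
print(f"Forward P/E:  {forward_pe:.1f}")   # 32.5
```

If earnings double year over year, the forward multiple is half the trailing one. That's the whole mechanism behind "expensive" stocks looking cheap on forward numbers: the denominator is growing faster than the price.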

What Could Actually Go Wrong

  • Geopolitical tension. If something happens in the Taiwan Strait, the entire tech world breaks. Not just Nvidia.
  • Hyperscaler exhaustion. If Microsoft or Meta decide they've overbuilt and stop ordering for a few quarters.
  • AI utility wall. If companies realize they can't actually make money from the AI models they're training.

So far, none of these have happened. In fact, companies are doubling down. Mark Zuckerberg recently stated Meta would have 350,000 H100s by the end of 2024. That’s an insane amount of capital expenditure.

The Shift to Inference

The biggest misconception is that Nvidia only wins during the "training" phase. Training is when you teach the AI. Inference is when the AI answers your questions. As more AI apps go live, the demand for inference will explode.

Inference requires speed and low latency. Nvidia’s newer chips are specifically designed to dominate this space. If you think the demand ends once the models are "trained," you’re misunderstanding the lifecycle of software. Training happens once in a while. Inference happens every second of every day.

Basically, the infrastructure for the new internet is being laid right now. You don't stop building the road just because the first car drove over it. You pave more lanes. You build exits. You build gas stations. Nvidia is the company selling the asphalt, the rollers, and the blueprints.

Don't wait for a return to "normal." This is the new normal. The record-breaking year wasn't a fluke; it was a proof of concept. To see where this goes next, keep your eyes on the capital expenditure reports of the big four cloud providers. As long as they are spending, Nvidia is winning. Check the next quarterly 10-Q filing for "Inventory Purchase Obligations." That's the real leading indicator of whether the momentum is holding or fading.

Kenji Flores

Kenji Flores has built a reputation for clear, engaging writing that transforms complex subjects into stories readers can connect with and understand.