We’ve been so focused on the speed of chips and the size of datasets that we forgot one thing: someone has to actually build and maintain the physical infrastructure that makes AI run. In a recent interview, Jensen Huang dropped a line that stuck with me. He said the biggest bottleneck for AI isn’t compute or algorithms—it’s something much more mundane. It’s plumbers.
Not literally, of course. He meant the people who install, connect, and maintain the massive data centers, power grids, and cooling systems that AI models depend on. The metaphor is a little funny and a little sobering. I’ve been chewing on it for a while, and I want to share eight notes I took from that interview and the broader picture it points to.
Note 1: The age of “just add more GPUs” is ending
For years, the playbook was simple: to train a bigger model, you buy more GPUs. But now, with clusters scaling to hundreds of thousands of units, the challenge shifts from raw compute to physical assembly. You can’t just order a pallet of chips. You need to wire them, cool them, power them, and make sure they talk to each other without catching fire. Jensen called this a “systems problem.” And systems problems don’t get solved by faster chips alone.
Note 2: The “plumber” shortage is real
Data center construction is booming, but the skilled labor pool is not keeping up. According to a recent report from McKinsey, the global data center workforce needs to grow by 30% over the next three years just to meet current demand. And we’re not talking about PhDs in machine learning. We’re talking about electricians, HVAC technicians, network cable installers, and yes, plumbers—people who can handle the physical side of infrastructure. Without them, the most advanced GPU is just an expensive paperweight.
Note 3: Energy is the silent bottleneck
Every new AI model consumes staggering amounts of electricity. Training a single large language model can consume as much electricity as hundreds of homes use in a year. But the grid isn’t ready. Many regions have years-long wait times for new power connections. Jensen pointed out that the real constraint is not a chip’s performance per watt, but the ability to get enough megawatts to a single site. That’s a civil engineering problem, not a semiconductor problem.
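The "hundreds of homes" comparison is easy to sanity-check with a back-of-envelope calculation. The figures below are rough public estimates, not numbers from the interview: a GPT-3-scale training run has been estimated at roughly 1,300 MWh, and an average US household uses about 10,600 kWh of electricity per year.

```python
# Back-of-envelope check on the "hundreds of homes" claim.
# Both inputs are rough, assumed figures for illustration only.
TRAINING_ENERGY_MWH = 1_300        # published estimate for a GPT-3-scale run
HOUSEHOLD_KWH_PER_YEAR = 10_600    # approximate average US household usage

training_kwh = TRAINING_ENERGY_MWH * 1_000
homes_per_year = training_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"One training run is roughly {homes_per_year:.0f} homes' annual electricity")
```

That lands at roughly a hundred-plus homes for a 2020-era model; frontier training runs today are believed to be one or two orders of magnitude larger, which is where "hundreds of homes" quickly becomes "a small town."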
Note 4: Software is only half the game
We tend to celebrate software innovations—the latest attention mechanism, the new loss function. But hardware deployment is equally hard. Jensen talked about the “AI factory” concept: a physical plant where raw data comes in and intelligence comes out. Running that factory efficiently requires software that optimizes cooling, power distribution, and networking. It’s not glamorous, but it’s where the real value capture will happen.
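The industry already has a standard yardstick for how efficiently an "AI factory" turns grid power into compute: PUE, or Power Usage Effectiveness, defined as total facility power divided by power delivered to IT equipment. A minimal sketch, with purely illustrative numbers:

```python
# Power Usage Effectiveness (PUE): total facility power / IT power.
# A PUE of 1.0 is ideal (every watt reaches the compute hardware).
def pue(it_power_kw: float, cooling_kw: float, overhead_kw: float) -> float:
    """Total facility power divided by power delivered to IT equipment."""
    total = it_power_kw + cooling_kw + overhead_kw
    return total / it_power_kw

# A hypothetical 10 MW GPU hall with 3 MW of cooling and 0.5 MW of
# lighting, UPS losses, and other overhead.
print(f"PUE = {pue(10_000, 3_000, 500):.2f}")  # → 1.35
```

Every point of PUE shaved off through better cooling and power distribution is capacity reclaimed for actual computation, which is exactly the unglamorous optimization work the factory framing points at.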
Note 5: The bottleneck moves upstream
When one part of the pipeline gets faster, the bottleneck shifts. First it was data labeling, then training time, then inference latency. Now, with model sizes plateauing in some areas, the bottleneck is moving upstream to infrastructure construction and operational talent. The companies that can build and run data centers quickly will have a massive advantage over those that just buy cloud credits.
Note 6: Jensen’s “plumber” is not a joke
In the interview, he said it with a straight face: “The world needs more plumbers.” He was making a broader point about the dignity and necessity of skilled trades. In an AI-driven economy, we still need people who can fix a pipe, install a transformer, or crimp a cable. The digital revolution rests on an analog foundation. Neglecting that foundation is a strategic error.
Note 7: The return of local thinking
Global supply chains for GPU manufacturing are fragile. But the real bottleneck is local: zoning laws, building permits, environmental reviews, and local labor markets. Jensen noted that many planned data centers are delayed not by technology but by municipal regulations. This is a reminder that AI progress is not just a tech story—it’s a story about policy, urban planning, and workforce development.
Note 8: What this means for the rest of us
If you’re building an AI startup, don’t just optimize your model. Think about where it runs. If you’re a student, consider that the highest-value skills might not be in Python or PyTorch, but in understanding how systems come together—electrical, mechanical, and thermal. And if you’re an investor, look beyond the AI chip makers. Look at the companies that build data centers, train electricians, and solve the power grid problem. Those are the real picks and shovels of the AI gold rush.
So here’s the takeaway: the next wave of AI innovation won’t come from a breakthrough in architecture. It will come from a thousand small improvements in how we build and run the physical infrastructure. And the people who make that possible? They might not have PhDs. But they have pipe wrenches.