NVIDIA: Here come multi-MW server racks
- Timothy Beggans

- Jun 3, 2025
- 2 min read

As artificial intelligence evolves, so too must the infrastructure that powers it. Behind every cutting-edge AI model lies a data center with a growing appetite for energy—pushing the boundaries of traditional power delivery systems.
NVIDIA, a cornerstone of AI innovation, is not just building the brains of AI; it’s also reimagining the circulatory system that powers it. Their latest advancement? An 800V high-voltage DC (HVDC) power distribution architecture designed specifically for next-generation AI factories.
Why This Matters
Today’s AI data centers typically distribute power within the rack at 54V DC, an approach that struggles to keep up with rising energy demands. As workloads grow and GPUs become more power-hungry, the limitations of this model become increasingly apparent:
- Excessive copper usage
- Space constraints
- Energy inefficiencies
NVIDIA’s new 800V HVDC approach addresses these challenges head-on:
- Improves power efficiency by reducing conversion losses
- Simplifies infrastructure with higher-voltage transmission
- Reduces copper usage and rack space requirements
- Enables multi-megawatt racks, essential for future AI needs
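The physics behind these bullet points is straightforward: for a fixed rack power, current scales inversely with voltage, and conduction loss in the copper scales with the square of current. A rough sketch with illustrative numbers (a hypothetical 1 MW rack and an assumed 0.1 mΩ copper run, not NVIDIA's published figures):

```python
def rack_current_and_loss(power_w, voltage_v, resistance_ohm):
    """Return (feed current in A, conduction loss in W) for a DC distribution feed."""
    current = power_w / voltage_v          # I = P / V
    loss = current ** 2 * resistance_ohm   # P_loss = I^2 * R
    return current, loss

# Hypothetical 1 MW rack over an assumed 0.1 mOhm copper path
i54, loss54 = rack_current_and_loss(1_000_000, 54, 1e-4)
i800, loss800 = rack_current_and_loss(1_000_000, 800, 1e-4)

print(f" 54 V feed: {i54:8.0f} A, conduction loss {loss54 / 1000:6.2f} kW")
print(f"800 V feed: {i800:8.0f} A, conduction loss {loss800 / 1000:6.2f} kW")
```

With these assumptions, moving from 54V to 800V cuts current by roughly 15x and resistive loss by roughly 220x for the same copper cross-section, which is exactly why higher voltage translates into less copper and smaller busbars.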
This transition moves AC-to-DC conversion out of individual racks to centralized, facility-level rectification, while the final DC-to-DC step-down happens closer to the GPU, potentially right at the server board. NVIDIA is working with Infineon, Texas Instruments, and other industry leaders to accelerate this shift, aiming for rollout starting in 2027.
The GPU Angle
It’s not just data center racks that are evolving; individual GPUs carry onboard power delivery as well. Voltage regulator modules (VRMs) and the power distribution network (PDN) on the printed circuit board (PCB) ensure efficient energy flow, keeping these high-performance engines running at peak efficiency.
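To see why board-level regulation matters, consider the IR drop across the PDN. GPU core rails run at low voltage and very high current, so even micro-ohm-scale resistance costs a meaningful fraction of the rail. A back-of-the-envelope estimate with assumed numbers (0.8 V rail, 1000 A peak current, 50 µΩ board-plus-package path; not a specific GPU's specs):

```python
# Illustrative PDN voltage-droop estimate (assumed values, not a real GPU spec)
CORE_VOLTAGE_V = 0.8          # assumed GPU core rail
CORE_CURRENT_A = 1000.0       # assumed peak core current
PDN_RESISTANCE_OHM = 50e-6    # assumed 50 micro-ohm board + package path

droop_v = CORE_CURRENT_A * PDN_RESISTANCE_OHM    # V = I * R
droop_pct = 100 * droop_v / CORE_VOLTAGE_V       # droop as % of the rail

print(f"IR droop: {droop_v * 1000:.0f} mV ({droop_pct:.1f}% of the rail)")
```

Fifty millivolts of droop is over 6% of an 0.8 V rail, which is why VRMs are placed as close to the die as physically possible.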
This dual-front innovation—from rack-scale power systems to component-level regulation—underscores NVIDIA’s deep commitment to the energy infrastructure of tomorrow.
A Wake-Up Call for the Energy Industry
The energy landscape must adapt. As we accelerate AI deployment across industries, energy will be the limiting factor. Traditional utility and microgrid designs must evolve in parallel with compute infrastructure. We need more efficient, flexible, and intelligent power delivery solutions—now more than ever.
We are not just short on compute—we are short on energy.
The future of AI is electric. But it requires an energy system built for scale, speed, and sustainability. This is not just an NVIDIA story—this is a call to action for energy professionals, policymakers, and innovators everywhere.