Even as artificial intelligence (AI) saves us energy, it also inhales it: The International Energy Agency estimates that global data centers will consume more than 1,000 terawatt-hours (TWh) of power in 2026—that’s roughly equivalent to the entire electricity consumption of Japan, and more than double the 460 TWh consumed in 2022.1
AI’s thirst for power is a familiar headline by now. Less familiar, though, is the intensifying race to build more power-efficient data centers to keep pace with exploding electricity demand and preserve global power grids.
Traditional data centers—the backbones of the AI revolution—might reach 1 million square feet and contain hundreds of racks of hardware (servers and routers), along with cabling, cooling and security systems. Running all that equipment requires a complex array of substations, transformers and power-distribution units, along with dual power supplies and on-site generators—whatever it takes to keep the data flowing 24/7.
For decades, a 48-volt (V) power-distribution system could light up enough 15-kilowatt (kW) racks to meet the processing and storage needs of a typical data center; today, system architects are designing racks that need 600 kW to 1 megawatt (MW) of power.2 Delivering 600 kW at 48 V would demand a whopping 12,500 amperes of current,3 along with a daunting volume of cables and components to handle it. Delivering 1 MW through a traditional 48 V system, for instance, could require more than 200 kg of copper cabling, which is impractical given the sheer bulk and the resistive energy losses involved.
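The current figures above follow directly from the power relation I = P / V. A quick back-of-the-envelope sketch (the rack power and voltages come from the text; the function name is just for illustration):

```python
def current_amps(power_watts: float, voltage_volts: float) -> float:
    """Current required to deliver a given power at a given bus voltage (I = P / V)."""
    return power_watts / voltage_volts

# A 600 kW rack on a traditional 48 V bus
i_48v = current_amps(600_000, 48)    # 12,500 A

# The same 600 kW rack on an 800 V HVDC bus
i_800v = current_amps(600_000, 800)  # 750 A

print(f"48 V bus:  {i_48v:,.0f} A")
print(f"800 V bus: {i_800v:,.0f} A")
print(f"Current reduction: {i_48v / i_800v:.1f}x")
```

Moving from 48 V to 800 V carries the same power with roughly one-seventeenth the current, which is why the cabling and component burden shrinks so dramatically.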
Industry leaders, including AI-chip juggernaut Nvidia, are rising to this latest challenge by developing a new architecture—essentially, the rules for how a center is organized and assembled—based on an 800 V high-voltage direct current (HVDC) power-distribution system.
In this configuration, alternating current (AC) from the power grid is converted to direct current (DC) at the data center, near the rack perimeter. Then, inside the racks, higher-efficiency DC-DC converters with solid-state transformers (SSTs) “step down” the voltage from 800 V to the levels required by the AI chips within the servers, increasing the system’s overall efficiency and reliability. Operating at higher voltage can reduce the amount of copper required by up to 45%; lower energy losses by eliminating multiple AC-to-DC and DC-to-DC conversion stages; and deliver up to five-fold increases in overall power efficiency compared with conventional 48 V methods.4
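The savings described above rest on a basic physical relationship: for a given conductor, resistive loss scales with the square of the current (P_loss = I²R). The copper and efficiency figures are the cited estimates; the sketch below (with an illustrative cable resistance that is not from the article) simply shows the underlying scaling:

```python
def resistive_loss_watts(power_watts: float, voltage_volts: float,
                         resistance_ohms: float) -> float:
    """I^2 * R loss in a distribution conductor carrying power P at bus voltage V."""
    current = power_watts / voltage_volts
    return current ** 2 * resistance_ohms

R = 0.0001  # ohms; illustrative cable resistance, not a figure from the article

loss_48v = resistive_loss_watts(600_000, 48, R)
loss_800v = resistive_loss_watts(600_000, 800, R)

# Raising the bus voltage from 48 V to 800 V cuts current ~16.7x,
# so I^2 * R loss in the same conductor drops by (800/48)^2, roughly 278x.
print(f"Loss at 48 V:  {loss_48v:,.0f} W")
print(f"Loss at 800 V: {loss_800v:,.1f} W")
print(f"Loss ratio: {loss_48v / loss_800v:.0f}x")
```

In practice, operators trade some of that loss reduction for thinner conductors instead, which is where the copper savings come from; the exact percentage depends on the distribution topology.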
We believe broader migration to 800 V HVDC architecture will open new growth opportunities for key players at every stage of the power-distribution network:
- Solid-state transformers. The SST market is projected to grow at 32% a year, hitting nearly $1 billion by 2030, led by major players such as Eaton, GE Vernova, Schneider Electric, Siemens, Delta Electronics and innovative startups like DG Matrix, which recently received investment from ABB.5
- Hybrid supercapacitors. HSCs—developed by companies such as Flex and Musashi Seimitsu—provide rapid energy storage and release, allowing AI server racks to handle peak power demands and maintain stable performance at higher voltage levels. The global supercapacitor market is growing at 19% a year and is expected to reach $9.6 billion by 2032.6
- Gallium nitride semiconductor chips. GaN chips are used in high-efficiency DC-DC converters to step voltage down to the levels needed by AI processors. Developed by chipmakers such as Infineon, Navitas, Renesas, STMicroelectronics and Innoscience, these chips boost power density while constraining data-center footprints. The GaN power-device market is projected to grow 49% a year and hit $4.4 billion by 2030.7
As AI applications continue to demand more computational firepower, big tech players and governments are exploring scalable strategies for meeting greater energy demands. The U.S. government is issuing executive orders to revitalize nuclear power,8 while hyperscalers Meta, Microsoft and Google are signing 20-year clean-energy deals.
We believe the push for data-center power efficiency will only accelerate as demand for myriad AI applications continues to rise. Stay tuned.