"AI feels like magic largely because most of us don't understand how it really works, Axios' Amy Harder writes.
That "magic" is actually a giant stack of energy, hardware and software working together so your computer can turn a few typed words into a giant cat sightseeing in Seattle. Let's break it down, working backwards:
Step 8: I use an AI tool like ChatGPT or Gemini to write a prompt for an image of my cat stretching next to Seattle's Space Needle and ... voila!
Step 7: Inside a data center, the image I requested is crunched by powerful chips called GPUs (graphics processing units).
All this computing creates a lot of heat. So data centers need massive cooling systems, which use a lot of electricity.
Step 6: The companies running these AI systems rely on GPUs built mainly by one company, Nvidia.
A GPU is designed to handle huge numbers of small calculations at once, which is exactly what AI needs.
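As an illustration (not from the article), the kind of workload a GPU parallelizes can be sketched in a few lines of Python. AI models spend most of their time on large matrix multiplications, which break into millions of independent multiply-add operations; the sizes below are hypothetical, and real models use far larger matrices:

```python
import numpy as np

# A toy "layer": multiply a batch of inputs by a weight matrix.
# Each output value is an independent dot product, which is why
# hardware that runs many small calculations at once excels here.
batch = np.random.rand(64, 512)     # 64 inputs, 512 features each
weights = np.random.rand(512, 256)  # transform 512 features into 256

out = batch @ weights  # roughly 8.4 million multiply-adds, all independent

print(out.shape)  # (64, 256)
```

On a CPU these multiply-adds run a handful at a time; a GPU runs thousands of them simultaneously, which is the whole reason AI companies compete for these chips.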
Step 5: Those GPUs sit inside cloud infrastructure that large tech companies own and operate.
These companies provide the software that lets AI systems actually run on all that hardware.
Step 4: Because millions of people are using AI, data centers need vast amounts of electricity.
Global electricity demand from AI-optimized data centers is projected to more than quadruple by 2030.
Step 3: Then there's the world of companies that build and operate the data centers themselves.
Step 2: The data centers from Step 3 need to connect to the electric grid.
Key players, from grid operators to utilities to firms acting as middlemen, help data center builders secure two scarce resources: land and power.
Step 1: It all starts with an original energy source, such as a wind farm, nuclear plant or natural gas plant, that powers data centers."
Axios