How Much Energy does AI Use? Total Grid Compute Power & Total Energy Resources on Planet Earth 2023


ABSTRACT — How much energy does AI use? The purpose of this post is to get some big numbers out there: to establish some heuristics for how much energy the world produces annually (across all sources: coal, solar, nuclear, etc.), how much of that is used by the global compute grid, and how much of that goes to private industry vs. consumers vs. the military. And, perhaps most importantly, to answer the looming questions:

The Big Energy / Compute Questions

  • How much Energy does AI Use, in total?
    (what percentage of the earth’s total energy production)?
    • subset: What percentage of global energy is converted into electrical grid power (as opposed to, for instance, oil & gas, which directly fuels planes, trains, & automobiles)?
      quick answer: 20%
    • How much of total global electricity generation does AI use, in total? (including both training runs, and inference / operation)
      quick answer: it appears on the surface to be roughly 0.3%
    • subset: How much of total global electricity production is consumed by AI training runs?
    • subset: How much of total global electricity supply is consumed by AI inference / operation?
  • How much of global supercompute power is consumed by AI, in total?
    (as percentage of Top500)?
    • shocker: Top500 power consumption is barely on the radar. Crypto mining, cloud services, and AI usage (see detailed charts below) each dwarf the total consumption of the “Top500” fastest supercomputers in the world, which collectively draw less than 0.01% (one one-hundredth of one percent) of total global electricity generation.
    • How much of global supercompute is deployed towards AI training? towards AI inference / operation?
    • quick heuristic: as of 2022, AI training and inference used less than 1% of global supercompute power. In 2023, that could possibly hit 2% (SWAG). This, though, is the stat to watch: it could plausibly increase to 20% or more within 1-5 years.
  • How much does a large model AI training run cost?
    (c. 2023 GPT4 equivalent, ~2T parameters)
    • How much of that cost is infrastructure (GPU / A100 / H100 purchases)
    • How much of that cost is electrical power?
    • How much is that cost if outsourced (i.e. OpenAI leasing Microsoft Azure, Anthropic using AWS)?
    • quick answer: ChatGPT roughly $10 million (compute + energy). GPT4 roughly $150 million (outsourced to MSFT Azure)
    • I am preparing a more detailed analysis of this question to be posted “soon.”
  • What is the total active compute power (MIPS) available across all planet earth, c. 2023?
    • break down by: smartphones, IoT devices, smart cars, personal computers / laptops, corporate datacenters, cloud grids, supercomputers.
  • How much of global electrical power is consumed by Bitcoin mining operations?
    • quick answer: roughly 0.8%, as of 2022
    • How much of global electricity is consumed by all cryptocurrencies, in total?
    • quick answer: roughly 1.2%, as of 2022

NOTE: This page is a work in progress. It is being continually updated as I gain insight into the data, and do the (basic) math & analysis. The questions listed above, all of which center on nailing “How much energy does AI use?”, are being actively explored. Once all those questions are answered (and global estimates of these numbers vary widely), I will mark this project as complete.

You may also wish to explore the “solution” to the very real possibility of AI hitting a global “ceiling” of compute & energy availability:

  • Infinite Abundance: How Quantum Computing and Nuclear Fusion solve the Energy/Compute limit problem. (COMING SOON!!)

Some of this might shock you…


The Top Line: Global Energy & Electricity

In 2022, the total energy generated globally by all sources — and consumed by all sectors of human civilization — was roughly 160,000 terawatt-hours (TWh). This is called the total primary energy supply. [source]

Energy Generation Sources are where energy comes from.
They are generally segmented as:

  1. Coal & Oil (“fossil”)
  2. Natural Gas,
  3. Nuclear,
  4. Solar & Wind (“renewables”)

Energy Consumption Sectors are where energy is used.
They are generally classified as:

  1. Residential,
  2. Transportation,
  3. Industrial (factories, manufacturing),
  4. Commercial (office buildings), &
  5. Government / Military.

Electricity accounts for about 20% of the world’s total final consumption of energy. This is generally viewed in contrast to Oil, which in its refined form of gasoline powers internal combustion engines directly, without intermediate conversion to electricity… or Natural Gas, which is burned directly to heat homes. However, electricity’s share of useful energy services is higher than its 20% share of final consumption, because of its efficiency: electric motors and heat pumps convert most of their input energy into useful work, while combustion engines and furnaces waste most of theirs as heat (see the sketch below). Electrical Power is central to many aspects of daily life and becomes ever more so as it spreads to new end-uses, such as electric vehicles (EVs) and heat pumps.
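To make that efficiency point concrete, here is a minimal back-of-envelope sketch. The 85% and 30% conversion efficiencies are illustrative assumptions (rough, commonly cited ballparks), not measured figures; the energy totals are the ones used throughout this post.

```python
# Back-of-envelope: why electricity's ~20% share of final energy delivers a
# larger share of useful "energy services". Efficiencies are rough assumptions.

TOTAL_ENERGY_TWH = 160_000   # annual energy from all sources (figure used in this post)
ELECTRICITY_TWH = 30_000     # the ~20% of it that is delivered as electricity
FUELS_TWH = TOTAL_ENERGY_TWH - ELECTRICITY_TWH

EFF_ELECTRIC = 0.85    # assumed: electric motors / heat pumps turn most input into useful work
EFF_COMBUSTION = 0.30  # assumed: engines, furnaces, boilers lose most input as waste heat

useful_electric = ELECTRICITY_TWH * EFF_ELECTRIC   # ~25,500 TWh of useful work
useful_fuels = FUELS_TWH * EFF_COMBUSTION          # ~39,000 TWh of useful work
share = useful_electric / (useful_electric + useful_fuels)

print(f"Electricity's share of useful energy services: ~{share:.0%}")  # ~40%, not 20%
```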

How much energy is used by Compute and other segments annually?

This is the start of our answer. Out of the total 30,000 TWh of electricity generated and consumed globally every year, a full 600 TWh, or 2%, is used to power datacenter-class computing devices (this does not include home PCs, laptops, smartphones, IoT devices, etc.). Soooo… of that, how much energy does AI use?

Well, let’s start with:

How much Energy is used by all the Computing Devices in the World?

Of the 30,000 TWh of electricity generated on planet Earth annually, roughly 5% (1,500 TWh) is used to power the global compute grid & devices. This broad category includes all things digital & net-connected:

[NOTE: there is conflicting data online here across multiple sources. Part of the confusion is units: terawatt-hours (TWh) measure energy, while gigawatts (GW) measure power, i.e. a rate of energy use. A figure like “1,250 TWh == 13,000 GW” therefore doesn’t reconcile as written; 1,250 TWh spread over a year is an average draw of only ~143 GW. See the conversion sketch after the list below.]

  • data centers, public & private clouds
  • private servers & server farms
  • laptop & desktops computers
  • smartphones & tablets
  • supercomputers
  • network hardware (routers, repeaters, firewalls)
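To resolve the TWh-vs-GW note above: TWh is an amount of energy, GW is a rate of consumption, so an annual TWh figure maps to an average continuous draw in GW by dividing by the hours in a year. A minimal conversion sketch, using only figures already quoted in this post:

```python
# TWh measures energy (power x time); GW measures power (a rate).
# To compare an annual energy total to a power figure, divide by hours per year.

HOURS_PER_YEAR = 8_760  # 365 * 24

def avg_power_gw(annual_twh: float) -> float:
    """Average continuous power draw (GW) implied by an annual energy total (TWh)."""
    return annual_twh * 1_000 / HOURS_PER_YEAR   # 1 TWh = 1,000 GWh

print(avg_power_gw(1_500))    # all compute, ~1,500 TWh/yr   -> ~171 GW running 24/7
print(avg_power_gw(1_250))    # the disputed 1,250 TWh       -> ~143 GW, nowhere near 13,000 GW
print(avg_power_gw(30_000))   # all electricity, ~30,000 TWh -> ~3,400 GW running 24/7
```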

Looked at through another lens,
you can break that 5% down as follows:

  • 1.6% end user devices
    (smartphones, laptops, desktops)
  • 1.4% data centers
    (clouds: Google, AWS, Microsoft)
  • 2.0% network hardware
    (routers, firewalls, NoCs)

— source: WikiPedia: IT Management

If we take the consumer devices out of the picture, and look only at crypto mining and commercial-class clouds and datacenters, we get this:

[Chart: How much electricity do all the computers in the world use?]

What’s fascinating is the concentration of that power consumption in a handful of global networks and corporations. The top four consumers within that 5% were:

  1. Bitcoin Network (BTC) (!!!)
  2. Amazon (AWS)
  3. Microsoft (Azure) (which, btw, is the sole provider of OpenAI Compute)
  4. Google (Cloud, search, gMail, AI)

How much energy is used by Cloud Computing and AI?

Further drilling down into the Cloud+AI segment (and ignoring, for now, the conjecture that crypto mining could very easily be a proxy for AI inference…), we see, again, that the vast majority of global consumption is driven by a very few massive corporations. To get at this, we will ask: who are the major global cloud and AI companies, and how much of the total electricity on earth do they consume to power their services?

[Chart: the major global cloud and AI companies, and their share of total global electricity consumption]

And finally, just for comparison purposes, let’s take a quick look at how much electrical power is consumed by global cryptocurrency (“crypto”) mining operations, which are actually collectively larger than the entire global cloud compute + AI training and inference drain (…for now!)

[Chart: annual electricity consumption of global crypto mining]


Bitcoin & Energy: The Crypto Power Suck

The Bitcoin network component, consuming very close to 1% of the planet’s total electricity generation, astounds me:

[Chart: How much energy does AI use? For starters, let’s look at Bitcoin…]

Bitcoin, still discounted by many traditional financial experts as a “fluke”, is currently devouring close to 1/100th of the entire electricity generation on Planet Earth… more, in fact, than 90% of the world’s countries each consume individually. More detail:

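A quick back-of-envelope, using only the figures quoted earlier in this post (the country comparison in the comment is approximate):

```python
# Bitcoin, using the numbers from earlier in this post.
GLOBAL_ELECTRICITY_TWH = 30_000
BTC_SHARE = 0.008                      # ~0.8% of global electricity (2022 estimate above)

btc_twh = GLOBAL_ELECTRICITY_TWH * BTC_SHARE     # ~240 TWh per year
btc_avg_gw = btc_twh * 1_000 / 8_760             # ~27 GW of continuous, around-the-clock draw

print(f"Bitcoin: ~{btc_twh:.0f} TWh/yr, ~{btc_avg_gw:.0f} GW running 24/7")
# For scale: only a few dozen countries consume more than ~200 TWh of electricity per
# year, which is roughly where the "more than 90% of all countries" comparison comes from.
```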

So: How much Energy does AI use?

Here’s the summary:

It takes a massive amount of energy to create, train, tune — and more recently, to run — an AI entity. For instance, it is rumored that the “training cost” for ChatGPT (3.0) was somewhere in the range of $2-10 million USD; I’ll use the $10 million figure (compute + energy) from the quick answer above. That wouldn’t be such a problem, given its great success. However, there are several major complications:

  1. most historical development costs were assessed in terms of man-hours and cost-basis accounting. You’d say: “We need 100 engineers, plus support staff, for 3 years, and that’s how much it will cost to build product X.” But that’s not how modern AI is built.
  2. The size of the training data sets required to continue linear improvement (as measured by benchmark accuracy scores on a variety of tasks) grows exponentially. We’re already used to exponentials in tech due to Moore’s Law, which decrees that the speed of compute for a given cost will double every 18 months. But this is a bit more extreme. The current AI curve is: the size of the training data set required for continued linear progress doubles roughly every 100 days.
  3. That’s not so much a problem for storage. It is a massive problem for compute resources. Training compute scales roughly with the square of the data set size (since model size tends to grow in step with the data), so doubling the data set means roughly 4x the compute needed to learn from it. So if the required data pile grows ~10x in 12 months (2x2x2 and a little more), then the compute power needed to train on that data must grow by roughly 100x. [support: When Altman was asked, “What’s to prevent another AI company from just replicating what you published regarding GPT-3, and releasing their own model?”, he cockily replied (paraphrasing): “There’s only so much compute resource available on Planet Earth. Larger models require massive data centers to train and operate. We have $10 billion, a large portion of which we will be spending on compute. There will only be resources (both financial- and compute-wise) for a few entities globally to be running billion-dollar training runs.”]
  4. Even factoring in Moore’s Law, that means the cost of training an AI in 2023 will start to approach the 9-figure zone ($100 million), and God knows what that says about 2024. What it says to me is: we’d better find some efficiencies!
  5. That $10 million isn’t spent across 3 years. It’s spent with a single button push, across about 21 days. It’s not manpower. It’s largely two things: raw electricity, and data center rental (“compute”). As in, a few hundred megawatts running 300,000 servers for about 21 days… at redline (see the back-of-envelope sketch after this list). At the end of 21 days? Your bank account is $10 million the poorer, and — hopefully — you have a well-trained AI model at your beck and call. Well, “well-trained” after some kludgy RLHF, censoring, muzzling, etc.
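Here is a hedged sketch of item 5’s arithmetic. The per-server draw and electricity price are assumptions chosen purely for illustration (the real numbers aren’t published); the point is that a cluster in the hundreds-of-megawatts range, run flat-out for about three weeks, lands right around the rumored $10 million, whereas a 100-gigawatt draw would cost billions in electricity alone.

```python
# Back-of-envelope for the ~$10 million, ~21-day training run (illustrative assumptions).
SERVERS = 300_000
KW_PER_SERVER = 1.0          # assumption: ~1 kW per server, including cooling overhead
DAYS = 21
PRICE_PER_KWH = 0.07         # assumption: bulk industrial electricity rate, USD

power_mw = SERVERS * KW_PER_SERVER / 1_000           # ~300 MW at redline (not 100 GW)
energy_kwh = SERVERS * KW_PER_SERVER * 24 * DAYS     # ~151 million kWh (~151 GWh)
electricity_cost = energy_kwh * PRICE_PER_KWH        # ~$10.6 million

print(f"Cluster draw:     ~{power_mw:,.0f} MW")
print(f"Energy consumed:  ~{energy_kwh / 1e6:,.0f} GWh")
print(f"Electricity bill: ~${electricity_cost / 1e6:,.1f} million")

# Item 3's scaling heuristic (compute ~ data^2, if model size grows with the data):
data_growth = 10
print(f"{data_growth}x more data -> ~{data_growth ** 2}x more training compute")
```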

OK. C’mon Now. How much energy does AI use?

Well, I still haven’t answered the question. Dammit. Honestly, the data is really hard to come by. And clearly, it’s ratcheting up at an unprecedented growth rate. So the better question in the interim might be (to get us closer): “Of the total global energy consumed by the compute grid (see above), how much of that is used specifically to train & operate AI entities?”

Lacking any easily found online sources (do you know of any? hit me up!), I’ll SWAG it for you. I will say, 3% in 2022, moving up to 10% in 2023, possibly up to 20% in 2024.

So, to answer succinctly How much energy does AI use?

  • total energy: 160,000 TWh
  • electrical (20% of total): 30,000 TWh
  • all compute (5% of electrical): 1,500 TWh
  • commercial compute (2% of electrical): 600 TWh
  • cryptocurrency mining (0.7% of electrical): ~200 TWh
  • AI (SWAG: ~10% of all compute): up to 150 TWh

By this admittedly SWAG heuristic, AI training and inference would therefore account for up to 1/10th of 1% of the total energy generated on planet earth in 2023 (about half a percent of global electricity), possibly moving up to a full 1 percent of global electricity by 2024… it’s anybody’s guess where it goes from there.
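For the record, here is the whole SWAG chain as one small script, so the units stay straight (everything in TWh per year; the AI share is my guess, not a measurement):

```python
# The summary chain above, end to end (all figures in TWh per year).
total_energy = 160_000                 # all primary energy, all sources
electricity  = 30_000                  # ~20% of total, delivered as electricity
all_compute  = electricity * 0.05      # 1,500 TWh: all computing devices
commercial   = electricity * 0.02      # 600 TWh: datacenter-class compute only
crypto       = electricity * 0.007     # ~210 TWh: cryptocurrency mining
ai_2023      = all_compute * 0.10      # SWAG: AI at ~10% of all compute in 2023 -> 150 TWh

print(f"AI as share of global electricity: {ai_2023 / electricity:.2%}")   # 0.50%
print(f"AI as share of total energy:       {ai_2023 / total_energy:.2%}")  # 0.09%, i.e. ~1/10th of 1%
```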

There you go. You’re welcome. 

And PS: If you have better data, or more data points for this analysis, please direct me to sources.



Addendum, May 30, 2023: 

What is the total available Global Compute Power?

In other words, what kind of machine-calculation horsepower is harnessable by humans globally? Not just the aforementioned supercomputers and data centers, but all active PCs, laptops, and smartphones too? (Also, eventually, I’d be interested to add smartwatches, smart speakers… anything with a decent chip in it that doesn’t generally run at more than 10% of its rated capacity/performance.)

This merits further research (I’m particularly compelled to dive into how much latent horsepower is sitting around… in the spirit of SETI@Home, Folding@Home, etc., what could we do if we put all those pocket-based supercomputers — i.e. smartphones — to real work, aside from making live-action animations of our faces for SnapChat & Instagram?).
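For a sense of scale, here is a deliberately loose sketch of that latent smartphone fleet. Every number in it is an assumption for illustration (device count, per-device throughput, idle fraction), not a measurement:

```python
# Rough sketch of the idle "pocket supercomputer" fleet. All inputs are loose assumptions.
SMARTPHONES_IN_USE = 6.5e9    # assumption: ~6-7 billion active smartphones worldwide
FLOPS_PER_PHONE = 1e12        # assumption: ~1 TFLOPS of GPU/NPU throughput per device
IDLE_FRACTION = 0.9           # assumption: ~90% of that capacity sits unused

latent_flops = SMARTPHONES_IN_USE * FLOPS_PER_PHONE * IDLE_FRACTION
print(f"Latent smartphone compute: ~{latent_flops:.1e} FLOPS")   # ~5.9e21 FLOPS
# ...which is the same order of magnitude as the ~10^22 FLOPS global estimate quoted below.
```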

And as a top line, I find this essay by “Jack” to be as good a starting point as any. Quoting:

“To understand how far we can expect to scale [total available global] compute, let’s try to estimate the total [present day] global computing capacity, and how fast it’s growing.

We can start with estimated global annual spending on computer hardware, which is roughly $1 trillion, and growing at a rate of about 3-4% annually (Statistica). A common estimate of [present-day] compute capacity is about 3 years worth of this [aggregate] spending – roughly $3 trillion.

For comparison, world GDP is ~$90 trillion and growing at ~3% annually.
[corollary: more than 1% of total global human work / production / manpower / output is concerned with creating computing machinery]

Next, we look at the price-performance of computing hardware as FLOPS vs $. [We see] a steady decline in the $ price per GFLOPS — prices generally halve every 1.5 years — in other words, you can buy double the computing power in FLOPS for the same dollar cost every 1.5 years. The current cost is estimated at about $0.3 / GFLOPS (AI Impacts).

Combining these two estimates puts global computing capacity at about 10^22 FLOPS, doubling every 1.5 years.

The growth comes almost entirely from improved price-performance (~60% annual growth), while total compute hardware spending is growing at ~3-4% annually.”
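Jack’s two inputs do multiply out to his headline number; here is the quick check:

```python
# Sanity check of the quoted estimate.
global_spend_usd = 3e12      # ~3 years of ~$1 trillion/yr hardware spending
usd_per_gflops = 0.3         # ~$0.3 per GFLOPS (1 GFLOPS = 1e9 FLOPS)

global_flops = (global_spend_usd / usd_per_gflops) * 1e9
print(f"Global compute capacity: ~{global_flops:.0e} FLOPS")   # ~1e+22 FLOPS

# Doubling every 1.5 years is ~59% annual growth in price-performance:
annual_growth = 2 ** (1 / 1.5) - 1
print(f"Implied annual growth: ~{annual_growth:.0%}")          # ~59%
```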



OK. Now that you’ve got a grip on those metrics, let’s…

Connect the Dots: How the Cryptocurrency Ecosystem and AI are really Two Sides of the same (bit)coin.