ChatGPT's energy emergency - this is how much power OpenAI and others consume in a week

This week, two reports revealed the growing environmental costs of generative AI and LLM training. The data is shocking: a single GPT-4 query can use up to three bottles of water, a year's worth of weekly queries consumes enough electricity to power nine homes, and this sustainability problem could get ten times worse by 2030.

Much of the cost stems from the growing demand for data centers to power AI models. As new data centers are built and existing ones expand, new issues are emerging around their internal and external impacts.

This week, data from the University of California was shared with The Washington Post and spotted by our friends at Tom's Hardware.

The paper notes that water usage depends on local conditions and the proximity of water sources: where cooling uses less water, more electricity tends to be consumed instead, and vice versa. For example, it takes an estimated 235 milliliters (about a cup) of water to compose a 100-word email in Texas, whereas it takes 1,408 milliliters (the equivalent of three 16-ounce water bottles) to compose the same email in Washington state.

That may not sound like much, but ChatGPT is not used just once for a single 100-word email. Moreover, the OpenAI-built chatbot requires plenty of power even for plain text, and the report does not appear to discuss image or video generation in detail.

As an example from the EPRI report, a typical Google search uses about 0.3 Wh of power per query; ChatGPT, on the other hand, requires about 2.9 Wh, roughly ten times as much.

In other words, if one in ten working Americans used GPT-4 once a week for a year (i.e., 17 million people, 52 queries each), the corresponding electricity demand of 121,517 MWh would be equivalent to the electricity consumed by all households in Washington, DC (an estimated 671,803) for 20 days. On top of that, 105 million gallons of water would be used just to cool the servers.
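As a sanity check, the figures above can be run through some quick arithmetic. This is a sketch using only the numbers quoted in this article; the per-email energy it derives is implied rather than stated, and it comes out far higher than EPRI's 2.9 Wh per-query figure because the two reports measure different workloads.

```python
# Sanity-check the article's figures using only the numbers quoted above.
users = 17_000_000        # one in ten working Americans
queries_per_user = 52     # one 100-word email a week for a year
total_mwh = 121_517       # quoted electricity demand
dc_households = 671_803   # estimated households in Washington, DC

total_queries = users * queries_per_user
wh_per_query = total_mwh * 1_000_000 / total_queries  # MWh -> Wh
print(f"~{wh_per_query:.0f} Wh per 100-word email (implied)")

# Spread the same total across DC households over 20 days:
kwh_per_household_day = total_mwh * 1_000 / (dc_households * 20)  # MWh -> kWh
print(f"~{kwh_per_household_day:.1f} kWh per DC household per day")
```

The implied figures come out to roughly 137 Wh per email and about 9 kWh per household per day, the latter being a plausible daily electricity use for a US home, so the article's comparison holds together internally.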

A second major report released earlier this week reinforces this problem. Bloomberg reported that the push to insert AI into every possible aspect of the high-tech world is fueling a boom in utilities across the US seeking to build new gas-fired power generation facilities to meet surging electricity demand.

Much of the demand appears to be coming from three sources: AI data centers, manufacturing facilities, and electric vehicles. The article does not break down how much demand each of these three vectors actually places on the system, but according to the Electric Power Research Institute, data center demand is expected to grow to as much as 10 times its current level by 2030, the equivalent of 9% of total US electricity generation.

Domestic power companies have announced plans to build nearly twice as many new power plants to meet this demand, most of which will be built in Texas.

“We were poised to shift away from expensive and polluting infrastructure like coal and gas power plants, the energy systems of the past,” Kendall Koberwig, Clean Virginia's advocacy director, told Bloomberg. “A lot of people are feeling the whiplash.”

As Bloomberg notes, the recent resurgence of fracking has helped utilities move away from coal to natural gas, reducing their environmental impact. However, gas plants tend to leak methane, which “has 80 times the global warming impact of carbon dioxide in the first 20 years in the atmosphere.”

Furthermore, once built, gas plants don't go away quickly; they may operate for 40 years or more. Meanwhile, electric utilities that had promised to reduce carbon dioxide emissions have had to scale back their plans. In Bloomberg's example, PacifiCorp had projected a 78% reduction in emissions by 2030; with the announcement of the new power plants, that projection was revised down to 63%.

Tom's Guide AI expert Ryan Morrison says of this new report: “While it's true that AI is having a significant impact on the environment, largely due to its enormous computational power, it's also true that AI could help address that impact.

“AI may be the solution to not only the broader energy problems facing society, but also to AI's own energy problems. As the technology improves, it will be used to design more efficient computing and cooling systems that minimize environmental impact. AI may be needed to design our way to net zero.”

AI consumes a lot of power, and these new data centers will affect more than the environment and climate. The Los Angeles Times reported in August that data centers in Santa Clara consume 60% of the city's electricity, an appetite that raises the risk of blackouts from power shortages. Water and electricity bills for people living in the area are also likely to go up; companies like PG&E say their customers' bills will not rise, but it is clear that they are passing infrastructure costs on to their customers, and that data centers are not paying their fair share.

This seems to run counter to the commitments that major AI companies like OpenAI, Google, Meta, and Microsoft have made to reduce their environmental impact. A Microsoft representative told the Post that the company is “working toward a way to cool data centers that completely eliminates water consumption.” For many of these companies, however, AI takes precedence over everything else, including community and environmental impacts.
