While advances in AI appear unstoppable and major tech companies are deeply integrating AI into their product lines, the implications of the power consumption required to run all those prompts are becoming clearer.
Google has spent much of the last few months deploying Gemini across every product it makes and defending its AI Overviews feature, which incorrectly suggested users put glue on their pizza or eat rocks. Yesterday, Google confirmed a 48% increase in greenhouse gas emissions driven by data center energy consumption, and said its "extremely ambitious" goal of achieving net-zero emissions by 2030 "will not be easy."
As reported by The Guardian, the International Energy Agency (IEA) has suggested that Google's electricity consumption could also double from 2022 levels.

Google is investing heavily in tools, resources, and infrastructure to minimize emissions, but it is up against the massive energy consumption required to train and run its artificial intelligence models in its data centers.
A study last year by AI startup Hugging Face found that generating a single AI image can use as much energy as fully charging a smartphone, and in terms of emissions, the cost of initially training the models is even higher.
Google has contracted with companies like Reddit to help train the large language models (LLMs) that power the Gemini chatbot, and its DeepMind research division continues to push generative AI research, including a model that adds audio to silent videos, among other projects.
Google is not alone. Microsoft continues to push AI advances at the risk of missing its own emissions targets, and OpenAI's Sam Altman appeared to confirm this May that the company burned through $520 million in cash in 2023.
In other words, there are no signs of AI progress slowing down, or of emissions falling. Let's hope that, as Bill Gates told Sky News last week, the trade-off pays off, and the technology unlocked by investing in AI eventually helps tackle the climate crisis and balance things out.