
The power of artificial intelligence continues to impress the world, but it comes at a growing environmental cost. AI tools, especially general-purpose models like ChatGPT, are consuming vast amounts of energy. UNESCO has now raised an urgent concern: AI must become more efficient, both in model size and in how we use it.
A new study presented during the AI for Good Global Summit in Geneva reveals that small changes can create massive environmental savings. Simply shortening prompts and shifting to smaller AI models could cut energy consumption by as much as 90%. The potential of AI is vast, but its energy footprint must be managed before it outpaces sustainability efforts.
The Alarming Cost of Running AI Models
Generative AI is already woven into everyday tools, but it’s quietly draining electricity at a staggering rate. OpenAI CEO Sam Altman recently said that each ChatGPT query uses 0.34 Wh of electricity, 10 to 70 times more than a standard Google search. With about a billion queries daily, that adds up to 310 GWh annually, equivalent to powering three million people in Ethiopia for a year.
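The arithmetic behind such headline figures is easy to check. The sketch below simply multiplies the per-query figure by the daily volume; both inputs are the estimates reported above, not measurements, so the output is an order-of-magnitude check rather than a definitive number.

```python
# Back-of-envelope estimate of ChatGPT's annual electricity use.
# Both inputs are reported estimates, not measurements.
WH_PER_QUERY = 0.34      # Altman's figure: energy per query, in Wh
QUERIES_PER_DAY = 1e9    # "about a billion queries daily", as cited above

wh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365
gwh_per_year = wh_per_year / 1e9  # 1 GWh = 1e9 Wh
print(f"Estimated annual consumption: {gwh_per_year:.0f} GWh")
# Prints ~124 GWh with these inputs; the 310 GWh headline figure
# therefore implies a higher volume, roughly 2.5 billion queries a day.
```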
UNESCO warned that AI energy demand is doubling every 100 days. The study highlighted how the race to build bigger and smarter models is putting pressure on global energy systems, water resources, and rare minerals. This growth also risks deepening inequality, as only a few countries can afford the energy required to run these models at scale.
Smaller AI Models Offer Smarter Efficiency
One of the key takeaways from the UNESCO report is that smaller AI models can be far more energy-efficient. Large models like GPT-4o or Gemini have to process massive volumes of data to respond accurately across a wide variety of topics. But most queries don’t need that level of complexity.
Using smaller, task-specific models reduces the load on servers and slashes power use. When paired with shorter prompts, performance stays consistent while electricity use drops dramatically. In fact, cutting prompt lengths from 300 words to 150 words, combined with a switch to a smaller model, showed energy savings of up to 90% without any loss in output quality.
Tech Giants Already Moving Toward Miniature Models
In response to rising concerns over the AI energy footprint, major tech companies are already building more efficient solutions. Google has released Gemma, Microsoft has launched Phi-3, and OpenAI offers its lighter GPT-4o mini. Even the French firm Mistral AI has introduced a more compact model called Ministral.
These smaller AI models use fewer parameters but still perform well on routine or specialised tasks. They are also more accessible to developers and organisations with limited infrastructure, helping democratise AI access while saving energy.
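In practice, developers can combine both tiers: route routine queries to a lightweight model and reserve the large one for genuinely complex requests. The sketch below shows one way this could look; the model names mirror those mentioned above, but the routing heuristic is a toy assumption, not any vendor’s actual API.

```python
# Minimal sketch of a model router: routine prompts go to a small model,
# complex ones to the full-size model. The heuristic here is deliberately
# naive and illustrative; a production router would use far better signals.
SMALL_MODEL = "gpt-4o-mini"  # lightweight model, as mentioned above
LARGE_MODEL = "gpt-4o"       # full-size fallback for hard queries

COMPLEX_HINTS = ("prove", "step by step", "analyse in depth", "derive")

def pick_model(prompt: str) -> str:
    """Route short, routine prompts to the small model."""
    is_long = len(prompt.split()) > 150
    looks_complex = any(hint in prompt.lower() for hint in COMPLEX_HINTS)
    return LARGE_MODEL if (is_long or looks_complex) else SMALL_MODEL

print(pick_model("Summarise this paragraph in one sentence."))  # gpt-4o-mini
print(pick_model("Derive the formula step by step."))           # gpt-4o
```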
Prompt Size Plays a Critical Role
Surprisingly, it’s not just model size that matters. The way users interact with AI also affects energy usage. The UNESCO report found that shorter prompts dramatically lower power needs. Long, wordy prompts mean more tokens for the model to process, and every extra token requires more computation and therefore more energy.
Trimming prompts to be more direct not only speeds up responses but also lowers the processing burden. When combined with a specialised model, this approach delivers a high-quality answer while consuming far less electricity.
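As a concrete illustration, even mechanical trimming of filler phrases shrinks the number of tokens a model has to process. The snippet below is a minimal sketch; the filler list is a toy assumption, and the real saving comes from writing direct prompts in the first place.

```python
import re

# Toy list of filler phrases that add tokens without adding meaning.
FILLER = [
    r"\bi was wondering if you could\b",
    r"\bif it's not too much trouble\b",
    r"\bplease kindly\b",
]

def trim_prompt(prompt: str) -> str:
    """Strip filler phrases and collapse leftover whitespace."""
    for pattern in FILLER:
        prompt = re.sub(pattern, "", prompt, flags=re.IGNORECASE)
    return re.sub(r"\s+", " ", prompt).strip()

verbose = "I was wondering if you could please kindly summarise this report."
print(trim_prompt(verbose))  # "summarise this report."
print(len(verbose.split()), "->", len(trim_prompt(verbose).split()), "words")
```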
Reducing the AI Energy Footprint Is Now Essential
The growing reliance on AI should not come at the cost of the planet. If unchecked, the AI energy footprint could undermine sustainability goals and strain already-limited global resources. But this future isn’t inevitable.
As the UNESCO study shows, simple tweaks, like cutting down prompt length and switching to smaller AI models, can make AI far more sustainable. The time to act is now, not just for tech companies, but for users and developers who shape how AI is used daily.