
As artificial intelligence becomes a part of everyday life, new research is raising concerns about its environmental footprint. According to a new study published in Frontiers in Communication, certain large language models (LLMs) may generate up to 50 times more greenhouse gas emissions than others when answering the same question. The findings suggest that accuracy and sustainability do not always go hand in hand, and that users may have more control over AI's environmental impact than they think.
Heavy Reasoning Equals Heavy Emissions
Researchers from Hochschule München University of Applied Sciences in Germany evaluated 14 different LLMs, ranging from 7 to 72 billion parameters. Parameters are the learned internal values that determine how a model interprets and generates human-like language. In their tests on 1,000 benchmark questions, the team found that models designed for complex reasoning, such as the 70-billion-parameter Cogito, emitted significantly more carbon than smaller models tuned to give concise answers.
The reason lies in how these models process language. LLMs convert words into numeric units called tokens, which they then process to produce a response. Reasoning models insert extra "thinking tokens" to work through a problem before answering. On average, the reasoning models generated 543.5 tokens per question, while concise models needed only 37.7. That additional computation translates directly into higher energy use and CO₂ emissions.
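To put those token counts in perspective, here is a rough back-of-the-envelope sketch in Python. Only the average token counts (543.5 and 37.7) come from the study; the energy-per-token and grid carbon-intensity constants are illustrative assumptions chosen to show the arithmetic, not values reported by the researchers.

```python
# Rough back-of-the-envelope estimate of per-query emissions from token counts.
# Only the average token counts below come from the study; the energy-per-token
# and grid-intensity constants are illustrative assumptions.

AVG_TOKENS_PER_QUERY = {
    "reasoning_model": 543.5,  # average tokens per question (reported in the study)
    "concise_model": 37.7,     # average tokens per question (reported in the study)
}

ENERGY_PER_TOKEN_KWH = 2e-6      # assumed inference energy per generated token (kWh)
GRID_INTENSITY_G_PER_KWH = 480   # assumed grid carbon intensity (g CO2e per kWh)

def estimated_emissions_g(tokens: float) -> float:
    """Estimate grams of CO2-equivalent emitted while generating a given number of tokens."""
    return tokens * ENERGY_PER_TOKEN_KWH * GRID_INTENSITY_G_PER_KWH

for name, tokens in AVG_TOKENS_PER_QUERY.items():
    print(f"{name}: ~{estimated_emissions_g(tokens):.3f} g CO2e per query")

# Whatever constants you plug in, the linear model makes the point: roughly 14x
# more tokens per answer means roughly 14x more energy and emissions per answer.
```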
Accuracy Comes at a Carbon Cost
Interestingly, the most accurate models were also the most polluting. For example, Cogito, a 70-billion-parameter reasoning model, achieved 84.9% accuracy but produced roughly three times the CO₂ emissions of similarly sized models that gave concise answers. The researchers also noted that no model that kept emissions under 500 grams of CO₂ equivalent while answering the 1,000 benchmark questions managed to exceed 80% accuracy, highlighting a clear accuracy-sustainability trade-off.
This means that if you want the smartest possible answer, you’re also likely demanding more from the planet in the form of energy consumption, data center use, and carbon output.
How Users Can Reduce AI’s Carbon Footprint
While the environmental impact of AI may seem out of your hands, users can actually make greener choices. The researchers suggest using concise prompts and opting for lightweight models when possible. Tasks that don't require deep reasoning, such as summarizing text or answering straightforward questions, can be handled by smaller models like GPT-3.5, which consume far less power.
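As a sketch of what this advice could look like in practice, the snippet below routes a request to a hypothetical lightweight model unless the task genuinely calls for heavy reasoning, and asks for concise answers by default. The model names, task categories, and helper functions are made-up examples, not part of the study or any particular provider's API.

```python
# Illustrative sketch of "use the lightest model that can do the job".
# Model names, task categories, and helpers here are hypothetical examples,
# not recommendations from the study or any specific provider's API.

LIGHTWEIGHT_MODEL = "small-concise-model"    # cheap, low-emission default
HEAVYWEIGHT_MODEL = "large-reasoning-model"  # reserve for genuinely hard tasks

REASONING_TASKS = {"math_proof", "multi_step_planning", "code_debugging"}

def pick_model(task_type: str) -> str:
    """Route a task to the heavyweight model only when it plausibly needs it."""
    return HEAVYWEIGHT_MODEL if task_type in REASONING_TASKS else LIGHTWEIGHT_MODEL

def build_prompt(question: str) -> str:
    """Ask for a concise answer by default to cut down on generated tokens."""
    return f"{question}\n\nAnswer concisely, without showing step-by-step reasoning."

print(pick_model("summarize_text"))   # -> small-concise-model
print(pick_model("math_proof"))       # -> large-reasoning-model
print(build_prompt("List three uses of vinegar."))
```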
As study author Maximilian Dauner puts it, "Users can significantly reduce emissions by prompting AI to generate concise answers or limiting the use of high-capacity models to tasks that genuinely require that power." In other words, you don't need a supercomputer to write a shopping list, so don't use one.
The Bigger Picture
AI is already transforming communication, work, and research. But this study is a timely reminder of the delicate balance between technological progress and environmental stewardship. The energy and water consumed by the data centers that run AI contribute to a climate burden that is only growing.
While further research is needed, particularly on local grid emissions and cross-model comparisons, this study is one of the first to quantify the per-query climate cost of LLMs. As we continue integrating AI into everything from education to enterprise, choosing when and how to use these tools could help reduce their impact on the planet.