Think before you prompt: Reduce your AI carbon footprint with ROCKS

By Maria Toro-Troconis, University of Cambridge

Every time you ask ChatGPT a question or generate an AI image, you’re tapping into powerful servers that burn energy, consume water, and leave a real-world carbon footprint. While Generative AI tools can feel instant, seamless and free, they come at a hidden environmental cost. In this post, I’ll discuss how we can use these tools more responsibly, without trading convenience for climate damage.

Climate change and digital transformation are among the most influential forces shaping our era. How we navigate and integrate these evolving trends will be critical in determining the course of humanity’s future throughout the 21st century and beyond (GPAI, 2021; Masterson, 2024).

Generative artificial intelligence (GenAI) has rapidly transformed how we interact with machines, particularly through natural language processing (NLP). From chatbots and automated writing assistants to creative tools and research companions, GenAI is increasingly embedded in our daily digital lives. These tools, like OpenAI’s ChatGPT, Google’s Gemini, Anthropic’s Claude, and DeepSeek, are powered by large language models (LLMs) that generate human-like responses based on prompts we give them. 

How these systems understand us depends on their ability to process natural language, breaking down our words into manageable, computationally understandable pieces called tokens. The ease with which we can now generate responses using GenAI belies the immense computational power and environmental cost behind each interaction.

How do AI models process our prompts?

The way AI models process our prompts has an impact on the environment. When you type a question into an LLM like ChatGPT, it does not read your message word-for-word in the way humans do. Instead, it breaks down your input into tokens: small fragments of text, typically chunks of 3–4 characters or individual words. For instance, the sentence “Think before you prompt” may be split into six tokens. Each token processed consumes computational resources, requiring both energy and water for the massive data centres that power these models.
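
The token arithmetic is easy to approximate. The sketch below is a crude illustration, not any model's real tokenizer (production systems use byte-pair-encoding libraries instead); it simply applies the common rule of thumb of roughly four characters per token for English text:

```python
def approx_token_count(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token
    rule of thumb for English text. Real tokenizers (e.g. BPE) differ."""
    return max(1, round(len(text) / 4))

prompt = "Think before you prompt"
print(approx_token_count(prompt))  # 23 characters -> prints 6
```

Even this back-of-the-envelope count shows how quickly tokens accumulate: a long, rambling prompt, plus the model's reply, plus every retry, all add to the total processed.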

This is not just a technical detail; it’s a sustainability concern.

The environmental footprint of a prompt

According to the World Economic Forum (2024), AI can help us tackle climate change: it can predict the weather, track icebergs and identify pollution. It can also improve agricultural output and reduce its environmental impact. However, engaging with LLMs carries an environmental footprint that needs ethical regulation and greener tech to prevent risks and inequalities (UNU EHS, 2024).  

Let’s break down the environmental cost of an interaction with a GenAI LLM:

  • According to research by Luccioni et al. (2024), generating a single AI image can use as much energy as half a smartphone charge.
  • Charging the average smartphone requires 0.022 kWh of energy (US Environmental Protection Agency, 2024), which means that the most efficient text-generation model uses as much energy as 9% of a full smartphone charge for 1,000 inferences (prompts and answers).
  • Training a single large AI model can generate a carbon footprint comparable to the total emissions produced by five cars throughout their entire lifespan (Walther, 2024). 
  • The energy used to train OpenAI’s GPT-3 language model could directly evaporate 700,000 litres of clean water (Li et al., 2023).
  • AI demand is projected to account for 4.2–6.6 billion cubic metres of water withdrawal in 2027, equivalent to half the total annual water withdrawal of the United Kingdom (Li et al., 2023).
  • AI depends on continuously operating large data centres which consume vast amounts of energy around the clock. These facilities are estimated to account for between 2.5% and 3.7% of global annual carbon emissions (Cho, 2023).
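
The smartphone comparison above is simple arithmetic. As a quick sketch using only the figures cited in the list (0.022 kWh per charge, 9% of a charge per 1,000 inferences for the most efficient text-generation model):

```python
# Figures cited above: EPA's average smartphone charge, and
# Luccioni et al.'s "9% of a full charge per 1,000 inferences".
CHARGE_KWH = 0.022            # energy for one full smartphone charge (kWh)
FRACTION_PER_1000 = 0.09      # 9% of a charge per 1,000 inferences

energy_per_1000 = CHARGE_KWH * FRACTION_PER_1000   # kWh per 1,000 prompts
energy_per_prompt_kwh = energy_per_1000 / 1000     # kWh per single prompt
energy_per_prompt_wh = energy_per_prompt_kwh * 1000  # convert kWh -> Wh

print(f"{energy_per_1000:.5f} kWh per 1,000 inferences")  # prints 0.00198
print(f"{energy_per_prompt_wh:.5f} Wh per prompt")
```

The per-prompt figure looks tiny, which is exactly why the cost feels invisible; multiplied across billions of daily queries, and across far less efficient models, it becomes the data-centre footprint described above.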

GenAI also increases the need for resource-heavy hardware like Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs). Producing these components requires rare-earth metals, whose extraction often causes environmental harm and high emissions, adding to AI’s overall ecological impact (Walther, 2024).

It’s not just about AI efficiency anymore; it’s about digital sustainability. And the more iterative or unclear your prompts are, the more resources are used (CoDesignS AI Framework, 2024).

Comparing Leading LLMs: Environmental impact

Let’s have a closer look at the environmental impact of some of the leading LLMs.

As presented in Table 1 below, environmental impact is strongly influenced by the LLM’s token input capacities and efficiency. Models with large token limits, such as Gemini 1.5 Pro and GPT-4o, require significant computational power, leading to higher energy consumption. In contrast, DeepSeek V3 claims to be more environmentally sustainable, using only about one-tenth of the computational resources compared to similar models, although this is yet to be independently verified (Calma, 2025). Claude 3.7’s environmental performance remains unclear due to limited transparency. Overall, while handling larger contexts improves capabilities, it also increases environmental costs unless offset by greater model efficiency.

Table 1: Environmental impact of Large Language Models (LLMs) (CoDesignS ESD AI Coach, 2024)

Every additional prompt adds more tokens to the pile: more energy, more water, more emissions. It’s tempting to view interactions with LLMs as “free”, but behind the scenes there is a real cost.

Less is more: Why every prompt matters

This brings us to a pivotal mindset shift: less is more. Instead of multiple iterations and retries, the goal should be to get the best response in fewer interactions. Every refined, thoughtful prompt saves resources and time.

Prompt engineering is the skill of crafting clear, effective inputs to get the desired outputs from GenAI models. It’s like learning to ask the right question in the right way. The better the prompt, the better (and faster) the answer.

Prompt engineering is an essential tool in reducing GenAI’s environmental footprint (CoDesignS AI Framework, 2024). It reduces trial and error, minimises redundant queries, and shortens conversation length. Not only does this save money and computing time, but it directly reduces electricity and water usage.

But how do we create better prompts? This is where a framework like ROCKS comes into play.

Optimise your interactions: Introducing the ROCKS method

Developed as part of the open access CoDesignS AI Framework (2024), ROCKS is a practical, easy-to-remember method to help us optimise our interactions with GenAI systems. 

Figure 2: The ROCKS method (CoDesignS AI Framework, 2024)

As presented in Figure 2, ROCKS stands for:

Role: Identify your role.
Objective: State your objective.
Community: Specify your audience.
Key: Describe the tone or style, and any related parameters.
Shape: Note the desired format of the output.

An example of a prompt using the ROCKS method

For example, instead of writing:

“Can you help me make my lecture more interactive?”

Use the ROCKS structure for greater clarity and focus, leading to fewer follow-up questions and lower energy consumption:

Role: I am a lecturer in Pathology teaching Year 2 MBBS students.  

Objective: I aim to make my lecture on the Pathology of the Head more interactive, dynamic, and enjoyable to better engage students during the session.  

Community: The lecture will be delivered in-person to around 300 Year 2 MBBS students in a large lecture theatre setting. 

Key/Tone: The tone should be engaging, energetic, and inclusive, encouraging active participation and maintaining attention across a large group.  

Shape: I would like suggestions for practical, easy-to-implement methods using PowerPoint and Kahoot, focusing on live interaction, quizzes, and gamified learning elements that are scalable for a large cohort.

Using the ROCKS method, you can craft more precise and purposeful prompts that not only improve the quality of AI responses but also contribute to more efficient and sustainable GenAI use.
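
One way to internalise the structure is to template it. The helper below is a hypothetical illustration, not part of the CoDesignS AI Framework itself; it simply assembles the five ROCKS fields into a single, well-structured prompt:

```python
def rocks_prompt(role: str, objective: str, community: str,
                 key: str, shape: str) -> str:
    """Assemble a ROCKS-structured prompt: Role, Objective,
    Community, Key (tone/style) and Shape (output format)."""
    return "\n".join([
        f"Role: {role}",
        f"Objective: {objective}",
        f"Community: {community}",
        f"Key: {key}",
        f"Shape: {shape}",
    ])

prompt = rocks_prompt(
    role="I am a lecturer in Pathology teaching Year 2 MBBS students.",
    objective="Make my lecture on the Pathology of the Head more interactive.",
    community="Around 300 Year 2 MBBS students, in person, in a large theatre.",
    key="Engaging, energetic and inclusive; encourage active participation.",
    shape="Practical methods using PowerPoint and Kahoot: quizzes and games.",
)
print(prompt)
```

Filling in all five fields before sending forces the clarity that would otherwise emerge only through several rounds of follow-up prompts, which is precisely where the resource savings come from.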

Conclusion: Prompt with Purpose

GenAI is transforming how we think and work, but every interaction draws on the planet’s finite resources. By learning to prompt carefully and with purpose, we can reduce wasteful iterations, improve response quality and minimise the environmental impact of our digital habits. Tools like the ROCKS method empower us to be not just efficient, but ethical. As educators, learners and professionals, we have a responsibility to pair innovation with sustainability. The next time you engage with an AI, pause to refine your prompt; not just for a better answer, but for a better future.

Think before you prompt: because better inputs create better outputs for us and a better future for the planet (CoDesignS ESD AI Coach, 2024).

REFERENCES

Calma, J. (2025) ‘AI is ‘an energy hog,’ but DeepSeek could change that’. The Verge. January. Available at: https://www.theverge.com/climate-change/603622/deepseek-ai-environment-energy-climate last accessed: 7 March 2025

CoDesignS ESD AI Coach (2025). Available at: https://codesignsesd.org/codesigns-ai-coach/ last accessed: 7 March 2025

CoDesignS AI Framework (2024). Available at: https://aldesd.org/7574-2/ last accessed: 7 March 2025

Cho, R. (2023). ‘AI’s Growing Carbon Footprint’. Columbia Climate School Newsletter. June. Available at: https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/ last accessed: 7 March 2025

GPAI – The Global Partnership on Artificial Intelligence (2021). ‘Climate Change and AI – Recommendations for Government action’. Available at: https://www.gpai.ai/projects/climate-change-and-ai.pdf last accessed: 7 March 2025

Li, P., Yang, J., Islam, M.A., Ren, S. (2023) ‘Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models’. Available at: https://doi.org/10.48550/arXiv.2304.03271

Luccioni, S., Jernite, Y., Strubell, E. (2024). Power Hungry Processing: Watts Driving the Cost of AI Deployment? In Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’24). Association for Computing Machinery, New York, NY, USA, 85–99. https://doi.org/10.1145/3630106.3658542

Masterson, V. (2024). ‘9 ways AI is helping tackle climate change’. Blog post. February. Available at: https://www.weforum.org/stories/2024/02/ai-combat-climate-change/ last accessed: 7 March 2025

UNU EHS (2024). ‘5 Insights into AI as a Double-Edged Sword in Climate Action’. Blog post. June. Available at: https://unu.edu/ehs/series/5-insights-ai-double-edged-sword-climate-action last accessed: 7 March 2025

US Environmental Protection Agency. 2024. Greenhouse Gases Equivalencies Calculator – Calculations and References. https://www.epa.gov/energy/greenhouse-gases-equivalencies-calculator-calculations-and-references last accessed: 7 March 2025

Walther, C. (2024). ‘Generative AI’s Impact On Climate Change: Benefits And Costs’. Blog post. November. Available at: https://www.forbes.com/sites/corneliawalther/2024/11/12/generative-ais-impact-on-climate-change-benefits-and-costs/ last accessed: 7 March 2025

Maria Toro-Troconis, PhD (she/her) is a Blended Learning Specialist at the University of Cambridge. Maria is also the Founder and Director of the Association for Learning Design and Education for Sustainable Development (ALDESD). She has led digital education transformation programmes for UNICEF, UNESCO, and UNDP and was recognised as one of the top international education influencers of 2021 (Edruptors). Her interests lie in the field of education for sustainable development and digital learning innovations.

Did you enjoy reading this? To become a member of our community, see Membership details here www.alt.ac.uk/membership
