November 10, 2025
Sarvesh V Pandiarajan

Growing AI That Gives Back

Exploring the environmental costs of AI and how we can build a more sustainable, reciprocal relationship between artificial intelligence and the communities it serves.

Symbiosis - representing the reciprocal relationship between AI and communities

Photo by Kris-Mikael Krister

AI has rapidly and effectively ingrained itself into many aspects of life. Every day, millions of people, from students to business professionals to researchers, type their questions into ChatGPT, expecting their problems to be solved in approximately four seconds. However, these solutions come at a cost hidden from many: an environmental one.

Altman, we have a problem

When you think about AI, chances are your mind goes straight to LLMs, or large language models. These are the models that power your favorite "AI" tools like ChatGPT, Gemini, or Claude. However, as we'll explore shortly, every query to an LLM comes at a cost, one rooted in the fundamental methods used to build these technologies.

You may have heard someone say that "ChatGPT is as smart as the smartest human being in the world." That's because these models are trained on a vast array of data covering much of the information humans have gathered over our time on this planet. However, training models on this immense amount of data requires large data centers and comes at a cost, both environmentally and ethically.

It's not just the training process either, as Adam Zewe from MIT points out, "deploying these models in real-world applications, enabling millions to use generative AI in their daily lives, and then fine-tuning the models to improve their performance draws large amounts of energy long after a model has been developed."

To get a more in-depth analysis, I sat down with Vaishakh Balu, whose research and involvement in the computer science department at NC State have let him see firsthand how models are trained. When asked what concerns him most about the environmental effects of AI's proliferation, Balu explains:

"AI is now accessible to the masses that have the necessary facilities [and] AI engines' electricity consumption has accounted for that of an entire nation. Most of this energy usage could be used for other needs, but our society doesn't seem to prioritize that right now."

— Vaishakh Balu

Data center infrastructure

Photo by Taylor Vick

To put this cost into perspective, here are some comparisons:

The Cost Per Query

Slide to see how many homes could be powered and pools filled based on the number of AI queries


Based on 0.32 Wh/query for electricity and 0.32 mL/query for water. Average US home uses ~10.7 MWh/year. Olympic pool = ~2.5 million liters.

In the simulation above, the maximum figure of 2.5 billion queries represents the average number of queries that ChatGPT handles per day. With that in mind, every day of ChatGPT usage consumes enough electricity to power around 75 homes for a year.

A pattern emerges here. Balu explains, "there's no doubt at all that AI has helped us learn many new things…but nowadays, it seems like the entire world is so dependent on AI for the simplest of things." As with many other environmental issues, the problem lies not with the practice itself (in this case, the use of AI), but with the widespread acclimation to it, which has fueled an explosion of new, similar AI companies that don't have environmental interests in mind.
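For readers who want to check the slider's math, here is a minimal back-of-envelope sketch using the figures quoted above. The constants are this article's assumptions (0.32 Wh and 0.32 mL per query, EIA's 10.7 MWh per home-year, 2.5 million liters per Olympic pool), not independently measured values:

```python
# Back-of-envelope conversion from query counts to homes powered
# and pools filled, using this article's assumed constants.
WH_PER_QUERY = 0.32        # electricity per query, in watt-hours
ML_PER_QUERY = 0.32        # water per query, in milliliters
MWH_PER_HOME_YEAR = 10.7   # average US home electricity use (EIA)
LITERS_PER_POOL = 2.5e6    # Olympic swimming pool capacity

def impact(queries: int) -> tuple[float, float]:
    """Return (home-years of electricity, Olympic pools of water)."""
    mwh = queries * WH_PER_QUERY / 1e6      # Wh -> MWh
    liters = queries * ML_PER_QUERY / 1e3   # mL -> L
    return mwh / MWH_PER_HOME_YEAR, liters / LITERS_PER_POOL

# ~one day of ChatGPT traffic (2.5 billion queries)
homes, pools = impact(2_500_000_000)
print(f"{homes:.0f} home-years of electricity, {pools:.2f} pools of water")
```

Running this with 2.5 billion queries gives roughly 75 home-years of electricity, the figure cited above, and about a third of an Olympic pool of water.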

Looking to tradition for innovation

Robin Wall Kimmerer is a botanist who writes in her book, Braiding Sweetgrass, about the lessons we can take from the Indigenous Potawatomi culture in preserving the environment. You may be wondering how this relates to AI at all, and while the connection may not be immediately obvious, there is a concept from Braiding Sweetgrass that applies surprisingly well.

Throughout the book, Kimmerer develops the concept of the "grammar of animacy." Unlike European languages, Kimmerer explains, "Potawatomi does not divide the world into masculine and feminine. Nouns and verbs both are animate and inanimate" (Kimmerer, Braiding Sweetgrass, p. 53), and the criteria for animacy aren't as strict as the criteria for being considered alive in most Western settings. In Potawatomi, a rock, for example, can be animate. I encourage anyone who finds this interesting to read Braiding Sweetgrass, but for now the question persists: how does any of this relate to AI?

Well, the process of building AI models closely resembles the process of growing and harvesting crops. We've already established that data is the backbone of these models, yet we treat that data very poorly. Data is often described as being mined, scraped, or extracted; these terms tell the whole story. Kimmerer offers the concept of the honorable harvest, explaining that "In gathering roots, just plunging in will get you nothing but a hole. We have to unlearn hurrying. This is all about slowness. 'First we give. Then we take.'" (Kimmerer, Braiding Sweetgrass, p. 233).

With data, there is currently no such honorable harvest. Whether it's a painter's art being scraped without their knowledge to train a new image generation model, or the trillions of data points collected from everyday people without their permission, there is an argument to be made that this obsessive rush to extract the most data the fastest comes at the cost of the data's quality.

Looking to the future

Apart from the gathering of data, the concept of the honorable harvest is also powerful when applied to the physical communities these data processing centers reside near. Laksha Pawar studies sustainable data solutions at the University of North Carolina at Charlotte and is the co-founder of the Pure Ripple Foundation, whose goal is to empower underserved communities through sustainable water solutions and break the cycle of poverty. To put their work into perspective, the Pure Ripple Foundation recently built a crucial well for a rural community in Pakistan that had dangerously limited access to water.

When asked about his vision for the future of AI training and management, Pawar proposed a more cyclical, reciprocal model:

"The power to control these technologies should come with an obligation to give back. If we see the waste created from these processes as an opportunity to sustainably power another process, we can create less waste and empower the communities we're taking from."

— Laksha Pawar

Stockholm cityscape

Photo by Jon Flobrant

There are already projects paving the way to a more sustainable AI future through this kind of symbiotic data center design. Take Stockholm Data Parks, which repurposes the excess heat produced as an unwanted byproduct of running data centers to provide heating for the city's residents. It's a great example of a project that intentionally puts the environment at its starting point rather than shoehorning it in later for favorable press.

If we are to truly make AI symbiotic with the communities it takes from, we must put those communities first. We must remember that artificial intelligence was created to serve the world, not the other way around.

References

Kimmerer, R. W. (2013). Braiding Sweetgrass: Indigenous Wisdom, Scientific Knowledge and the Teachings of Plants. Milkweed Editions.

This article draws inspiration from Kimmerer's concepts of the "grammar of animacy" and the "honorable harvest," applying these indigenous principles to the relationship between AI development and environmental stewardship.

Zewe, A. (2025, January 17). Explained: Generative AI's environmental impact. MIT News. Retrieved from https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

This article provided foundational information about the electricity and water consumption of generative AI models, data center energy demands, and the environmental implications of AI deployment.

Mytton, D. (2025, June 12). ChatGPT energy usage is 0.34 Wh per query. /dev/sustainability. Retrieved from https://www.devsustainability.com/p/chatgpt-energy-usage-is-034-wh-per

This source provided the energy consumption figure of 0.32-0.34 Wh per query used in the calculations for this article.

TechCrunch. (2025, July 21). ChatGPT users send 2.5 billion prompts a day. Retrieved from https://techcrunch.com/2025/07/21/chatgpt-users-send-2-5-billion-prompts-a-day/

This article provided the daily query volume figure of 2.5 billion queries used as the maximum value in the interactive simulation.

AI Basics. (n.d.). How much energy does ChatGPT really use? Retrieved from https://ai-basics.com/how-much-energy-does-chatgpt-really-use/

This source provided additional context and verification for energy consumption calculations.

U.S. Energy Information Administration. (2023). Average annual electricity consumption per U.S. residential utility customer. Retrieved from EIA.gov

This source provided the figure of 10.7 MWh per year for average U.S. household electricity consumption used in the calculations.

Acknowledgements

Special thanks to Vaishakh Balu for sharing insights from his research and involvement in the computer science department at NC State, providing firsthand perspective on how AI models are trained and the environmental concerns associated with their proliferation.

Special thanks to Laksha Pawar for sharing his vision on sustainable AI development and the work of the Pure Ripple Foundation in empowering underserved communities through sustainable water solutions.