Thirsty for AI? Training GPT-3 required 185,000 gallons of water, study finds
As AI continues to evolve, so does its environmental impact. A recent pre-print research paper estimates that training OpenAI's GPT-3 in Microsoft's U.S. data centers directly consumed a whopping 185,000 gallons of water, all of it used for cooling.
- To put that into perspective, that's enough water to produce 320 Tesla electric vehicles or 370 BMW cars!
Things to Consider Next Time You Wanna Ask ChatGPT 2+2: What about the water cost of everyday interactions with AI like ChatGPT? The paper estimates that a simple conversation of 25 to 50 questions consumes roughly the equivalent of a 500-milliliter bottle of water.
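The per-question cost implied by that estimate is easy to work out. A minimal back-of-the-envelope sketch (the bottle size and question range come from the paper; the per-question split is our own arithmetic, not the researchers' methodology):

```python
# Figures from the pre-print: one 500 ml bottle per 25-50 question session.
BOTTLE_ML = 500
QUESTIONS_LOW, QUESTIONS_HIGH = 25, 50

# Fewer questions per bottle means more water per question, and vice versa.
per_question_high = BOTTLE_ML / QUESTIONS_LOW    # 20 ml per question
per_question_low = BOTTLE_ML / QUESTIONS_HIGH    # 10 ml per question

print(f"Roughly {per_question_low:.0f}-{per_question_high:.0f} ml of water per question")
# Roughly 10-20 ml of water per question
```

In other words, every individual prompt carries a small but nonzero water footprint, on the order of a tablespoon or two.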
- Interestingly, the researchers found that if the training had been done at Microsoft's Asia-based data centers, the water consumption would have tripled.
Looking Ahead: The study, conducted by researchers from the University of California, Riverside and the University of Texas at Arlington, highlights the importance of addressing water footprints in AI model development. As AI continues to advance, it's crucial for creators to consider the environmental impact and strive for sustainable solutions.