Turns out there's another problem with AI: its environmental toll
https://www.theguardian.com/technology/2023/aug/01/techscape-environment-cost-ai-artificial-intelligence
-snip-
Let's start with the water use. Training GPT-3 used up 3.5m litres of water through datacentre usage, according to one academic study, and that's assuming it used more efficient US datacentres. If it was trained on Microsoft's datacentres in Asia, the water usage balloons to closer to 5m litres.
Prior to the integration of GPT-4 into ChatGPT, researchers estimated that the generative AI chatbot would use up 500ml of water (a standard-sized water bottle) for every 20 questions and corresponding answers. And ChatGPT was only likely to get thirstier with the release of GPT-4, the researchers forecast.
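The quoted figure works out to a small amount per exchange, but it compounds quickly at scale. A minimal back-of-envelope sketch of that arithmetic, where the daily-exchange count is a made-up illustration rather than a figure from the article:

```python
# Back-of-envelope check of the article's water figure:
# 500 ml per 20 question-and-answer exchanges.
ML_PER_20_EXCHANGES = 500

ml_per_exchange = ML_PER_20_EXCHANGES / 20        # 25 ml per single exchange

# Hypothetical daily load, purely for illustration (not from the article).
daily_exchanges = 10_000_000
litres_per_day = daily_exchanges * ml_per_exchange / 1000

print(f"{ml_per_exchange:.0f} ml per exchange")
print(f"{litres_per_day:,.0f} litres/day at {daily_exchanges:,} exchanges")
```

At that assumed load, the per-bottle figure already adds up to hundreds of thousands of litres a day, which is why the researchers flag per-query water use at all.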
Estimating energy use, and the resulting carbon footprint, is trickier. One third-party analysis by researchers estimated that training GPT-3, a predecessor of ChatGPT, consumed 1,287 MWh and led to emissions of more than 550 tonnes of carbon dioxide equivalent, similar to flying a return journey between New York and San Francisco 550 times.
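The flight comparison can be sanity-checked from the two numbers the article gives. A minimal sketch, using only the quoted figures (1,287 MWh, 550+ tonnes CO2e, 550 return flights):

```python
# Sanity-check the article's comparison: 550 tonnes CO2e from 1,287 MWh
# of training, equated to 550 New York-San Francisco return flights.
TRAINING_MWH = 1287
TRAINING_TCO2E = 550
RETURN_FLIGHTS = 550

# Implied emissions per return flight: ~1 tonne CO2e per trip.
tonnes_per_flight = TRAINING_TCO2E / RETURN_FLIGHTS

# Implied grid carbon intensity of the electricity used, in kg CO2e per MWh.
kg_co2e_per_mwh = TRAINING_TCO2E * 1000 / TRAINING_MWH

print(f"{tonnes_per_flight:.1f} t CO2e per return flight implied")
print(f"{kg_co2e_per_mwh:.0f} kg CO2e per MWh implied")
```

The implied figures (roughly one tonne per return flight, and an implied grid intensity in the low hundreds of kg CO2e per MWh) are in the ballpark of commonly cited estimates, so the comparison is internally consistent.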
-snip-
Sacrificing performance to reduce ecological impact seems unlikely. But we need to rethink AI's use, and fast. Technology analysts at Gartner believe that by 2025, unless a radical rethink takes place in how we develop AI systems to better account for their environmental impact, the energy consumption of AI tools will be greater than that of the entire human workforce. By 2030, machine learning training and data storage could account for 3.5% of all global electricity consumption. Pre-AI revolution, datacentres used up 1% of the world's electricity demand in any given year.
-snip-