The problem is that we only have a finite amount of energy. If all of our clean energy output goes toward AI, then sure, the AI is running on clean power, but it means we have to fall back on dirtier energy sources for things that are objectively more important than AI: powering homes, food production, hospitals, etc.
Even “clean” energy still has environmental downsides: noise pollution (which impacts local wildlife), large land footprints (deforestation), heavy water use for cooling, and emissions that aren’t greenhouse gases. Ultimately we’re still using unfathomably large amounts of energy to train and run a corporate chatbot trained on all our personal data, and that energy use still has consequences even if it’s “clean.”
Also cooling! Right now, each person’s interaction with ChatGPT uses roughly a bottle’s worth of water per 100 words generated (according to a 2023 research study). That estimate was based on GPT-4, so it may be slightly more or slightly less now, but probably more, considering their models have actually gotten more expensive for them to host (more energy used -> more heat produced -> more cooling needed).
Now consider how that scales with the number of people using ChatGPT every day. Even if the energy is clean, everything else about AI isn’t.
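To make the scaling concrete, here’s a back-of-envelope sketch. Only the ~one bottle (~500 ml) per 100 words figure comes from the study mentioned above; the number of daily users and the words generated per user are hypothetical assumptions picked purely for illustration:

```python
# Back-of-envelope estimate of daily cooling-water use.
# Only ML_PER_100_WORDS comes from the cited 2023 study;
# the other two inputs are hypothetical assumptions.

ML_PER_100_WORDS = 500        # ~one bottle of water per 100 words (study estimate)
daily_users = 100_000_000     # assumed daily users (hypothetical)
words_per_user = 300          # assumed words generated per user per day (hypothetical)

daily_ml = daily_users * (words_per_user / 100) * ML_PER_100_WORDS
daily_liters = daily_ml / 1000
print(f"{daily_liters:,.0f} liters of cooling water per day")
# -> 150,000,000 liters of cooling water per day
```

Swap in your own assumptions; the point is that a per-message figure that sounds tiny multiplies into an enormous daily total.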