https://arab.news/rrsfx
Artificial intelligence is rapidly transforming economies, science, and governance. Yet behind these benefits lies a steadily growing cost: environmental degradation.
AI systems demand enormous amounts of energy and water, consumed by the giant computing clusters, data centers, and cooling systems they rely on. This tension is at the heart of the AI paradox: the very technology that can help mitigate climate change may also worsen it. Depending on its design, power source, and governance, AI can be either a solution or a liability for the planet.
Current AI models are inherently energy-intensive. Training large language models can consume hundreds to thousands of megawatt-hours of electricity. For example, training GPT-3 reportedly required approximately 1,287 MWh of electricity, resulting in over 550 tonnes of carbon emissions when powered by a conventional electricity grid.
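The reported figures can be sanity-checked with simple arithmetic. The grid carbon intensity used below (~0.43 kg CO2 per kWh, typical of fossil-heavy grids) is an assumption chosen for illustration, not a sourced value:

```python
# Back-of-the-envelope check of the reported GPT-3 training figures.
# Assumption: grid carbon intensity of ~0.43 kg CO2 per kWh (illustrative
# value for a fossil-fuel-heavy grid, not a sourced number).
training_energy_mwh = 1287            # reported training energy for GPT-3
grid_intensity_kg_per_kwh = 0.43      # assumed grid carbon intensity

# MWh -> kWh, multiply by intensity, kg -> tonnes
emissions_tonnes = training_energy_mwh * 1000 * grid_intensity_kg_per_kwh / 1000
print(f"Estimated emissions: {emissions_tonnes:.0f} tonnes CO2")
```

At that assumed intensity, the calculation lands at roughly 553 tonnes, consistent with the "over 550 tonnes" figure cited above.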
Energy demand does not stop at training. Inference — the computation performed each time a model responds to a user query — occurs continuously and grows with global usage. Even a tiny energy cost per query adds up significantly across millions of daily queries. Moreover, the carbon footprint of AI depends heavily on the electricity grid: fossil-fuel-heavy grids produce far more emissions than renewable-based ones.
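The scaling effect described above can be made concrete. Both numbers below — the per-query energy cost and the daily query volume — are hypothetical, chosen only to show how small per-query costs accumulate at scale:

```python
# Illustrative scaling of inference energy. All inputs are assumptions:
# ~0.3 Wh per query and 10 million queries per day are hypothetical
# figures used only to demonstrate the accumulation effect.
wh_per_query = 0.3                     # assumed energy per query (Wh)
queries_per_day = 10_000_000           # assumed global daily query volume

daily_mwh = wh_per_query * queries_per_day / 1_000_000  # Wh -> MWh
annual_mwh = daily_mwh * 365
print(f"Daily: {daily_mwh:.1f} MWh, annual: {annual_mwh:.0f} MWh")
```

Under these assumptions, a year of inference (about 1,095 MWh) approaches the energy of the single training run cited above — illustrating why inference, not just training, dominates long-term footprints.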






