AI operations, especially training large language models, are extremely energy-intensive. For instance, training GPT-3 required about 1,287 MWh of electricity, emitting roughly 502 metric tons of CO₂, comparable to the annual emissions of 112 gasoline-powered cars.
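These figures can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below derives the grid carbon intensity implied by the reported energy and emissions numbers, and converts the emissions into car-year equivalents; the per-car constant (~4.6 metric tons CO₂ per gasoline car per year, the EPA's typical estimate) is an assumption, not from the text above.

```python
# Back-of-the-envelope check of the cited GPT-3 training figures.
ENERGY_MWH = 1287          # reported training energy for GPT-3
EMISSIONS_T_CO2 = 502      # reported emissions, metric tons of CO2
CAR_T_CO2_PER_YEAR = 4.6   # assumed EPA average per gasoline car (not from source)

# Implied carbon intensity of the electricity used for the training run
intensity_kg_per_kwh = (EMISSIONS_T_CO2 * 1000) / (ENERGY_MWH * 1000)
print(f"Implied grid intensity: {intensity_kg_per_kwh:.3f} kg CO2/kWh")

# Car-year equivalents implied by the emissions figure
car_years = EMISSIONS_T_CO2 / CAR_T_CO2_PER_YEAR
print(f"Roughly {car_years:.0f} car-years of driving")
```

The implied intensity of about 0.39 kg CO₂/kWh is close to the average U.S. grid mix, and the car-year figure (~109 under this assumed constant) is consistent with the 112 cited above, so the reported numbers hang together.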
Projections warn that AI data centers could account for 8–35% of U.S. electricity use by 2030, sharply increasing demand for power that is often still generated from fossil fuels.
AI’s environmental footprint extends beyond energy: data centers generate electronic waste, rely on rare earth mining, and consume vast quantities of water for cooling. AI-related water usage is projected to reach 4.2–6.6 billion cubic meters by 2027.