Environment & Energy
As Bill Gates Pimps AI To Fight Warming, IEA Projects Datacenter Energy Use To Equal That Of Japan By 2026
If you want evidence of Microsoft's progress towards its environmental moonshot goal, then look closer to earth: at a building site on a west London industrial estate. The company's Park Royal datacentre is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition is jarring with its target of being carbon negative by 2030.
Microsoft says the centre will be run fully on renewable energy. However, the construction of datacentres and the servers they are filled with means that the company's scope 3 emissions, such as CO2 related to the materials in its buildings and the electricity people consume when using products such as an Xbox, are more than 30% above their 2020 level. As a result, the company is exceeding its overall emissions target by roughly the same rate. This week, Microsoft's co-founder, Bill Gates, claimed AI would help combat climate change because big tech is "seriously willing" to pay extra to use clean electricity sources in order to say that they're using green energy.
Training and operating the AI models that underpin products such as OpenAI's ChatGPT and Google's Gemini uses a lot of electricity to power and cool the associated hardware, with additional carbon generated by making and transporting the related equipment. "It is a technology that is driving up energy consumption," says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies. The International Energy Agency estimates that datacentres' total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by research firm SemiAnalysis.
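As a back-of-the-envelope check on the IEA projection quoted above: "double from 2022 levels to 1,000 TWh in 2026" implies a 2022 baseline of roughly 500 TWh (the exact IEA baseline figure may differ), which works out to a compound growth rate of nearly 19% per year. A minimal sketch:

```python
# Rough check of the doubling claim quoted from the IEA.
# Assumption: a ~500 TWh baseline in 2022 is implied by "double ... to 1,000 TWh";
# the IEA's own baseline estimate may differ somewhat.
baseline_2022_twh = 500.0    # implied 2022 datacentre consumption
projected_2026_twh = 1000.0  # IEA upper estimate for 2026
years = 2026 - 2022

# Compound annual growth rate needed to double over four years
cagr = (projected_2026_twh / baseline_2022_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 18.9% per year
```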
The upfront energy cost of training AI is astronomical. That keeps smaller companies (and even smaller governments) from competing in the sector if they do not have a spare $100m to throw at a training run. But it is dwarfed by the cost of actually running the resulting models, a process known as inference. According to analyst Brent Thill, at the investment firm Jefferies, 90% of the energy cost of AI sits in that inference phase: the electricity used when people ask an AI system to respond to factual queries, summarise a chunk of text or write an academic essay.
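The 90/10 split quoted from Thill implies that a model's lifetime inference energy is nine times its training energy. The training figure in the sketch below is purely illustrative, not from the article:

```python
# If inference accounts for 90% of an AI system's lifetime energy cost,
# training is the remaining 10%, so lifetime inference energy is 9x training.
# The 50 GWh training figure is a made-up illustration, not a reported number.
training_energy_gwh = 50.0
inference_share = 0.90
training_share = 1.0 - inference_share

lifetime_inference_gwh = training_energy_gwh * (inference_share / training_share)
print(f"Lifetime inference energy: {lifetime_inference_gwh:.0f} GWh")  # 450 GWh
```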
https://www.theguardian.com/business/article/2024/jun/29/ai-drive-brings-microsofts-green-moonshot-down-to-earth-in-west-london
eppur_se_muova
(37,665 posts)
It's not like we've reached the limits of that, by a long shot. Keep the fake (not artificial, FAKE) stuff out of it. Clearly, it's part of the problem, not part of the solution.
bucolic_frolic
(47,607 posts)
SCOTUS will find a way to make it profitable and healthy for the wealthy while dehumanizing the poor. It won't always be free.
hatrack
(61,194 posts)
Because without blockchain, poorly rendered portraits of women with 13 fingers and Bored Ape NFTs, civilization will crumble!!!!
progree
(11,463 posts)https://www.statista.com/statistics/201794/us-electricity-consumption-since-1975/#:~:text=Electricity%20consumption%20in%20the%20United,electricity%20by%20the%20producing%20entity.
It's also the combined usage of Texas, California, and Florida.
Another: https://www.eia.gov/energyexplained/electricity/use-of-electricity.php
Lots of graphs and pie charts