https://i0.wp.com/www.beaude.net/no-flux/wp-content/uploads/2023/04/Halpern_final.gif?w=676&ssl=1

“According to OpenAI’s charter, its mission is ‘to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity.’ Leaving aside the question of whether AGI is achievable, or if outsourcing work to machines will benefit all of humanity, it is clear that large-language A.I. engines are creating real harms to all of humanity right now. According to an article in Science for the People, training an A.I. engine requires tons of carbon-emitting energy. ‘While a human being is responsible for five tons of CO2 per year, training a large neural LM [language model] costs 284 tons. In addition, since the computing power required to train the largest models has grown three hundred thousand times in six years, we can only expect the environmental consequences of these models to increase.’”
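The figures in the quote can be put in perspective with a quick back-of-the-envelope calculation. This is only a sketch: the constants are just the numbers quoted above, and the "annual growth" figure is a derived estimate assuming steady exponential growth over the six-year period.

```python
import math

# Figures as quoted from the Science for the People article above.
HUMAN_CO2_TONS_PER_YEAR = 5      # annual CO2 footprint attributed to one person
TRAINING_CO2_TONS = 284          # quoted one-off cost of training a large LM
GROWTH_FACTOR = 300_000          # quoted compute growth over the period
YEARS = 6

# Training one model emits as much CO2 as roughly 57 person-years.
person_years = TRAINING_CO2_TONS / HUMAN_CO2_TONS_PER_YEAR
print(f"Training one model ≈ {person_years:.1f} person-years of emissions")

# A 300,000x increase over six years implies, if growth were steady,
# about an 8x increase in compute demand every year: 300000 ** (1/6).
annual_growth = GROWTH_FACTOR ** (1 / YEARS)
print(f"Implied annual compute growth ≈ {annual_growth:.1f}x")
```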

Source: What We Still Don’t Know About How A.I. Is Trained | The New Yorker