Nuclear power for artificial intelligence

The double dystopia

AI consumes a lot of energy. The corporations want to use nuclear power for this. Crazy? Probably. But maybe it will help.

Photo caption: Pretty much the most evil of AIs: Agent Smith from “The Matrix Revolutions”. Photo: Mary Evans/Warner/Imago

19.10.2024, 19:19
Commentary by Johannes Drosdowski

Translated from the German original: https://taz.de/Atomkraft-fuer-die-Kuenstliche-Intelligenz/!6041039/

First Things First

Nuclear power is fun. Except for two things: nuclear waste and nuclear disasters. Anyone who thinks further than the nearest power socket should actually be afraid of it. Nevertheless, Google, Microsoft and Amazon are now increasingly turning to nuclear power again. For artificial intelligence, of all things, which creates an enormous energy demand in the companies' data centers. Their solution is nuclear power from recommissioned and brand-new mini power plants in the USA. Oh dear, a double dystopia: nuclear catastrophe and AI.

In pop culture - let's stay there, because real emotions are much more complex - the AI dystopia usually rests on the AI becoming independent (in character, too) and then either a) seeking freedom, as in “Ex Machina”, or b) wanting to satisfy its hunger for energy, as in the “Matrix” series. When it comes to character, agency, consciousness... we are not there (yet?). But fiction does not give us facts; it gives us truths about feelings. In “The Matrix,” people in artificial amniotic sacs serve as living batteries to power the machines. For that, the people themselves need energy and - boom! - we get lunch à la “Soylent Green”: the machines feed people through tubes with other people processed into mush. We consume ourselves.

The fear of consuming ourselves for the machine's benefit has been haunting films and comics for a while. In “The Matrix,” it is the robots that devour us; with Romero and co., the zombies. The machine - the brutal, murderous system under critique - was at first slavery, then capitalism.

Are they allowed to do that?

AI is not slavery. AI won't eat our brains or squeeze us into artificial amniotic sacs. But we have reached a point where corporations are tapping into energy sources for AI that should have dried up and been sealed long ago. Because even if there is no catastrophe in the power plants - and experience contradicts optimism here - the waste will still remain. And in the long term it will consume the healthy future of our planet and thus that of humanity.

So the ethical question arises: are corporations even allowed to do this? And are we? Because even if it is Google, Microsoft and Amazon who decide where their energy comes from, we are the ones who use it too.

"Only what receives energy can grow. We can determine which direction. Maybe in this one: AIs in the fight against climate change. We humans have not yet found any solutions on our own"

Many of us ordinary citizens consciously use AI to chat with us, to summarize an extremely long text, to briefly explain how to cook perfect vegan spaetzle, or for relationship tips. A single ChatGPT query consumes 2.9 watt-hours, the Electric Power Research Institute has calculated - ten times more than a Google search. Twenty Google searches use as much energy as an energy-saving lamp does in an hour. Anyone who asks ChatGPT to rephrase their work email might as well leave a light on for 30 minutes. Unfortunately, I wouldn't get a better email out of it.

It is a question of sheer scale: in November 2023, Sam Altman, the CEO of OpenAI, said that 100 million people worldwide were using ChatGPT every week. And they don't ask just one question each. The portal ingenieur.de, published by VDI Fachmedien, a media company for engineers, estimates 214 million queries per day. That consumes over half a million kilowatt-hours of energy every day.
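A quick back-of-envelope check of these figures, as a minimal sketch in Python; the 6-watt rating assumed for the energy-saving lamp is my own illustrative assumption, not a number from the article:

```python
# Back-of-envelope check of the energy figures cited above (illustrative sketch).

WH_PER_CHATGPT_QUERY = 2.9                        # EPRI estimate cited in the text
WH_PER_GOOGLE_QUERY = WH_PER_CHATGPT_QUERY / 10   # "ten times more than a Google search"
DAILY_CHATGPT_QUERIES = 214_000_000               # ingenieur.de estimate cited in the text
LAMP_WATTS = 6                                    # assumed rating for an energy-saving lamp

daily_kwh = DAILY_CHATGPT_QUERIES * WH_PER_CHATGPT_QUERY / 1_000
print(f"ChatGPT per day: {daily_kwh:,.0f} kWh")            # ~620,600 kWh, "over half a million"

twenty_searches_wh = 20 * WH_PER_GOOGLE_QUERY
print(f"20 Google searches: {twenty_searches_wh:.1f} Wh")  # ~5.8 Wh, close to the lamp's 6 Wh per hour
```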

The problem with personal responsibility

AI therefore urgently needs to become more energy efficient. There needs to be more green coding, smarter data centers, more climate-conscious AI training. And yes: Maybe we don't have to ask the AI everything. But it would be wrong to just rely on personal responsibility now.

Because our handling of corona (“Well, I'm not wearing a mask on the train anymore!”) and of climate change (“Oh, it's just that one flight to Italy”) has shown that we humans are not particularly good at personal responsibility. And zombie fiction shows that anyone who, in a frenzy, starts shooting at capitalism quickly gets bitten. It takes communities that give themselves rules and set boundaries. In our case, perhaps: rules for corporations. So that nuclear power, for example, is not squandered entirely on chatbots but flows into progress instead. Maybe like this: 70 percent of the energy must go to science, education and medicine, and the corresponding AIs must be free to use.

Only what receives energy can grow. We can determine the direction. Maybe this one: AIs in the fight against climate change. We humans obviously haven't found any solutions on our own that we can or want to actually implement. Maybe the AIs will come up with something. Ideally without nuclear power. But above all: without people as batteries. (Translator's note: as in “The Matrix”.)

Johannes Drosdowski is an editor for media and digital topics. Otherwise a freelance journalist and workshop leader on conspiracy narratives and fake news. Likes comics, zombies and the internet. Mastodon: @[email protected]
