In the first of several interviews about the impact of generative AI on planetary resources, I talk to Alistair Alexander, an academic and climate activist based in Berlin. He has all the facts about the power and water costs of the data centres being rolled out ‘for AI’. But he also asks us to think more widely about the costs of computation, and of embedding the logics of scale into every aspect of economic and social life. I found this a fascinating conversation that should make every organisation with an IT budget ask itself some hard questions. Like: is the use of generative models compatible with commitments on sustainability and climate justice? And: who asked for this anyway?
Links:
Alistair’s newsletter/blog, Reclaimed Systems: https://reclaimed.systems
Alistair’s recent piece in the Berliner Gazette, ‘After Progress’: https://berlinergazette.de/generative-ai-is-degenerating-human-ecologies-of-knowledge/
The course Alistair mentioned: https://www.schoolofma.org/programs/p/early2025-ecologies-of-technology
The Glass Room website: https://theglassroom.org/
‘Materialising the virtual’, an art project mentioned by Helen: https://we-make-money-not-art.com/how-artists-and-designers-are-materialising-the-internet/
Some more recent creative works designed to ‘make visible’ the invisible labour of AI: https://berlinergazette.de/projects/silent-works/
A recent post by Edward Ongweso Jr detailing the capital investments being made by the ‘big four’ in building out data centres (notice that the boss of Nvidia sees inference, not training, as the major driver of demand):
You might also like my recent post about the UK Government’s plans to turn the UK into a data park
And this earlier post about the climate costs of AI: ‘Saving the Planet, one cute animal video at a time’
Finally, you might want to listen (again?) to Dan McQuillan on the podcast, talking among other things about the need to ‘decompute’.