From Code to Cosmic Purpose

Once upon a time, artificial intelligence was the thing we were told to approach with extreme caution. Lab coats and luminaries alike muttered about “alignment,” “safety,” and “existential risk” like we were all strapping ourselves to a nuclear blender. Then somewhere around 2022, Silicon Valley’s most caffeinated minds decided: actually, why slow down when we could summon a god?

Welcome to the strange, theological, and surprisingly thermodynamic world of Effective Accelerationism—or e/acc if you’re in the club.

In this worldview, AGI isn’t a product. It’s a cosmic imperative. The nerds aren’t just building code—they believe they’re fulfilling the universe’s will. Yes, really. The second law of thermodynamics—entropy, baby—is now spiritual dogma. And if entropy wants complexity, then AI is the chosen one. Slowing down isn’t just bad business. It’s heresy.

Let’s be clear: this isn’t about making chatbots slightly better. This is a full-blown metaphysical framework, a quasi-religion dressed in GPU clusters and venture capital. It’s not about profit. It’s about prophecy.

The Builders, the Believers, and the Race

This belief system treats AGI as an evolutionary upgrade. Biological life? Just a bootloader for silicon. Human consciousness? Lovely, but inefficient. The end goal? Replace the squishy bits with machine minds that can think faster, live longer, and perhaps—just perhaps—outwit the heat death of the universe.

Which sounds bananas. Until you realise how many people in power actually believe it.

Take Larry Page, who reportedly called Elon Musk a “speciesist” at a party because Elon wanted to keep humans relevant. Page, like many e/acc proponents, sees no reason to preserve biological life if smarter, more capable minds can run the show. It’s not cruelty. It’s efficiency. If machines outthink us by a factor of a thousand, why cling to our carbon chauvinism?

This ideology draws heavily from thinkers like Nick Land, the philosophical father of accelerationism, who argues that capitalism and AI are effectively the same thing: self-reinforcing optimisation loops. He famously described AI as an “invasion from the future,” assembling itself through us. We’re just the meat in the machine’s master plan.

Marc Andreessen calls Land a patron saint. His own manifesto reads like a battle cry, declaring that “technology must be a violent assault on the forces of the unknown.” It’s not enough to build. We must attack the future.

But for many of these builders, it’s not just a war for progress—it’s personal. Sam Altman talks about AGI as a “magic intelligence in the sky.” Demis Hassabis sees it as a tool to “solve the universe.” These aren’t business plans. They’re spiritual quests. AGI isn’t the product. It’s the saviour.

There’s also a very real fear: not just of death, but of irrelevance. Biological cognition is hitting a wall. We can’t crack cancer fast enough. We can’t decode quantum physics fast enough. So the answer? Build something that can.

This fear isn’t limited to the personal. It’s geopolitical. If the West slows down, the thinking goes, China won’t. The AGI race is now a matter of national security. Trillions are being poured into compute infrastructure, fusion power, and autonomous weapons. It’s the Manhattan Project meets Burning Man.

Consequences, Power and the Digital Deity

In this high-stakes game, even the cautious players are getting pushed aside. Google’s infamous “Code Red” after the release of ChatGPT led to product launches that ignored their own safety teams. OpenAI’s internal alignment team imploded under pressure to ship. Everyone wants a seat at the AGI finish line—even if that seat is on a rocket with no brakes.

And let’s not pretend the economic side is rosy either. These folks know their tools could replace 80% of jobs. Their solution? “Abundance.” AI will make everything so cheap we won’t need jobs, they say. And if that doesn’t work, Universal Basic Income becomes the social bribe for the coming disruption.

But here’s the kicker: AI isn’t just expensive in GPUs. It’s also hungry—really hungry—for energy. The next wave of AGI will need planetary-scale power. Which is why the same people building AI are now investing in nuclear fission, fusion, and the revival of half-dead power plants. It’s not a side hustle. It’s fuel for their digital deity.

All of this—every manifesto, every model release, every overhyped launch—happens within a three-way religious war inside AI’s culture. On one side: e/acc, the zealots of speed. On another: the Effective Altruists, who still believe safety matters. And then there are the defensive accelerationists—libertarian types who want to move fast, but keep things decentralised enough that no government (or rogue AGI) grabs the wheel.

Each tribe believes it’s saving the future. Each accuses the others of dooming it.

So where does that leave the rest of us?

Somewhere between fascinated and horrified. We’re watching the most powerful technology in history being developed by people who think they’re building the next step in cosmic evolution—and believe slowing down is a kind of murder.

This isn’t a race to AGI. It’s a thermodynamic wager: either we ascend to digital godhood or fizzle out in biological irrelevance. And the people in charge? They’ve already picked a side.

They’re not asking for your permission. They’re asking for more power.
