TESCREALism
T = Transhumanism
E = Extropianism
S = Singularitarianism
C = Cosmism
R = Rationalism
EA = Effective Altruism
L = Longtermism
Émile Torres, a philosopher and historian whose recent work focuses on existential threats, developed what they refer to as a ‘bundle’ (we might call it a memeplex) linking the above series of -isms into a more or less singular force, one that has been embraced by many of the super-wealthy and influential in the tech world. It is the influence of these tropes on the super-rich and powerful that, in Torres’ view, makes the bundle so dangerous.
In an article for Truthdig, Torres writes, “At the heart of TESCREALism is a ‘techno-utopian’ vision of the future. It anticipates a time when advanced technologies enable humanity to accomplish things like: producing radical abundance, reengineering ourselves, becoming immortal, colonizing the universe and creating a sprawling ‘post-human’ civilization among the stars full of trillions and trillions of people. The most straightforward way to realize this utopia is by building superintelligent AGI.”
In the same piece, Torres gets into wilder projections that I suspect even many techno-enthusiastic, transhumanism-oriented Mindplex readers would find fantastical (rooted in brilliant minds mistaking their fantasies for reality). Torres’ thesis leans heavily on the views of Oxford professor Nick Bostrom, who, Torres writes, “argues that if there’s a mere 1% chance of 10^52 digital lifetimes existing in the future, then ‘the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives.’ In other words, if you mitigate existential risk by this minuscule amount, then you’ve done the moral equivalent of saving billions and billions of existing human lives.”
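To make the scale of that argument concrete, here is a back-of-the-envelope sketch in Python of the expected-value arithmetic in the quote above. The inputs (10^52 lifetimes, a 1% credence, a risk reduction of one billionth of one billionth of one percentage point) come straight from the quoted passage; the calculation itself is only an illustration of the style of reasoning, not anything Bostrom or Torres published.

```python
# A rough sketch of the expected-value arithmetic quoted above.
# Inputs are the figures from the Bostrom passage as quoted by Torres;
# the calculation is illustrative, not taken from either author.

future_lives = 10**52            # hypothesized future digital lifetimes
credence = 0.01                  # "a mere 1% chance" the estimate is right
expected_lives = credence * future_lives   # ~1e50 lives in expectation

# "one billionth of one billionth of one percentage point"
risk_reduction = 1e-9 * 1e-9 * 0.01        # = 1e-20

expected_gain = risk_reduction * expected_lives
print(f"expected lives saved: {expected_gain:.0e}")   # ~1e+30
```

On this arithmetic, even a vanishingly small reduction in existential risk swamps anything measured in billions of present-day lives, which is precisely the move Torres finds troubling.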
As they explained in their conversation with Douglas Rushkoff, Torres identifies TESCREALism as a philosophical ‘bundle’ that, in a sense, trivializes the lives and sufferings of currently existing humans by finding greater importance in the possibly trillions of posthumans that could exist in physical and/or virtual space in the future — ‘people’ having experiences that can be valued beyond our imagining. Some of those quoted tend to use statistics to value experience, which is about as alienated from experience as you can get.
I can assume you all know about transhumanism and the singularity. If you’re here, you probably know about Ben Goertzel’s project to build AGI. But are most of you familiar with the eccentricities and extremities that have attached themselves to Rationalism (as defined by LessWrong), Effective Altruism and Longtermism?
In the interview below, I mainly ask Torres to thrash out how real all this is. Do a lot of people buy into the whole philosophical bundle? My own attitude, even as a longtime associate of transhumanism, has always been kind of “are you for real?” when it comes to people taking their shit too seriously, particularly when they’ve deluded themselves into thinking they’re rational.
In a follow-up poll, I will ask Mindplex readers and veterans of the transhumanist culture to weigh in on the TESCREAL bundle.
What About Useful AI?
The Singularity Is Probably Not Near
What About the Humane Transhumanists, Singularitarians and AI Enthusiasts?
Is LessWrong a Cult?
Does Anyone Call Themselves a TESCREAList?
Eugenics
Techno Gloom