Risk and Precarity Part 2: The Age of Web3

Is it the age of the blockchain yet? Web3 enthusiasts claim that the blockchain will lead us to an age of decentralized power in finance and culture. Let’s look at how close we are getting. I will be contemplating the ups and downs of decentralization and its impact on human society. It’s hard to get a solid statistic, but it seems we’re still in the early adopters’ stage, with a small minority of folks around the world using blockchain for bitcoin and NFT activities.

It Starts with the Crypto-Anarchists/Cypherpunks

The foundations for digital cash were established by David Chaum in the early 1980s, but the ideology of cryptography as the great liberator from authoritarian control of finance, communication and just about everything controlled by states and owners didn’t become a culture until the early 1990s. 

Tim May fired off the opening salvo in 1988 with his Crypto-Anarchist Manifesto. In that document May avers that “Two persons may exchange messages, conduct business, and negotiate electronic contracts without ever knowing the True Name, or legal identity, of the other.” This will be a liberating force, he claims, one that will “be the wire clippers which dismantle the barbed wire around intellectual property.”

May was an anarcho-capitalist, as were many of those who followed in his footsteps. And while one may be forgiven for wondering whether a pro-capitalist group’s hostility to intellectual property might be a subject for psychologists, credit these digital freaks with being attuned to the nature of the then-evolving digitally networked society, in which restricting data would be seen as a roadblock to the wonders delivered by freely-flowing information and ideas.

On the other hand, anarcho-capitalists are not particularly big on making life less risky for the precariat (see Part 1). They are certainly no friend to any form of state-based relief for the vulnerable. There is a broad sweeping ideological narrative in which the complete release of a free market from any controls delivers wonders at such a rapid rate of change that everybody — even the legless homeless moneyless veteran of imperial wars — winds up better off. You see, according to such a narrative, there will be so much wealth flow that volunteerism goes quantum… or something like that. I would give this a hard pass.

In 1992, at a meeting of crypto-anarchists in Berkeley, California, St. Jude Milhon (my former writing partner, RIP) suggested they call themselves Cypherpunks. The name has had staying power. It is now incorporated into many crypto cash brands, but it all started there.

The end-to-end principle — people being able to exchange anything digital directly, without any persons or institutions interceding — was central to the cypherpunk ideal. Encrypted exchange would challenge centralized power and the institutions that use it, i.e. your three-letter spy agencies, your taxmen, the copyright lawyers of the entertainment and data industries, your patent holders, and their representatives. Cryptography, then, was to be another weapon in the fight to make information ‘free’. The anonymity it afforded would protect the sharers of data (which would include capital as data) from real-world intrusion by those who would block or tax its free exchange.
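To make that end-to-end ideal concrete, here is a minimal sketch in Python using the PyNaCl library (my choice of tool, not the cypherpunks’; pip install pynacl). Two parties generate keypairs and exchange an encrypted message with no institution in the loop:

from nacl.public import PrivateKey, Box

# Each party generates a keypair; no bank, state, or platform intercedes.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key...
to_bob = Box(alice_key, bob_key.public_key)
ciphertext = to_bob.encrypt(b"the True Name stays hidden")

# ...and only Bob, holding his private key, can read it.
from_alice = Box(bob_key, alice_key.public_key)
assert from_alice.decrypt(ciphertext) == b"the True Name stays hidden"

Everything in transit is ciphertext; anyone sitting between Alice and Bob sees only noise. That, in a dozen lines, is the liberation May was betting on.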

As with any and all instantiations of ideology or ideals, the reality of crypto’s winding road to actuality became more complicated and messy than what the cypherpunks envisioned. Today, all those players, including those that were to be eliminated — the taxman, the lawman, the fiat-money-based banker — mill uneasily about in the crypto mix. The reality today is a peculiar mix of the hidden and the wide open. For example, one has to give up more personal information to engage with most crypto exchanges than is required to open a bank account or even to get a passport. The government in all its branches is watching.

Realities like this make me a little skeptical of the claim that the blockchain will be radically decentralizing. As with the perfect anonymity proposed by crypto-anarchist and cypherpunk visionaries, the result is turning out to be more of the usual mix, with all the usual power players dipping their hands in the till.

Credit: Tesfu Assefa

Is Decentralization A Sort-Of Digital White Flight?

Decentralization has long been a fever dream of anarchists left and right, and various flavors of idealists exhausted by the perdition of big states, businesses and political pressure institutions. The realities of decentralization as it is experienced may seem less attractive than the dream. Think of related words and ideas, such as the psychological and social decentering of a person, nation or a culture. Think of the devolution of social welfare guarantees. 

In 1994 a group of digerati, some tied to the Electronic Frontier Foundation (EFF), met with the Newt Gingrich-oriented Progress and Freedom Foundation (PFF) to discuss their mutual interest in devolving some aspects of the state (mainly social welfare, no doubt). They even issued a statement signed by Esther Dyson, George Gilder, and Alvin Toffler. (Dyson went on to write a cover article about Gingrich for Wired titled ‘Friend and Foe’. This fence-sitting illuminated the distinction between Wired magazine and MONDO 2000.)

At the meta-scale of decentralization, Balkanization is the term that has been used to describe the breakup of large nations into fragments. It has often given rise to tribalized conflicts in places like the former Yugoslavia, where the world witnessed the Bosnian and Kosovo wars. Domination by backwards religious sects and economic confusion can clearly be a result of centralized institutions breaking up. Afghanistan is another of many decentralized disaster stories, albeit helped along by imperial competition between the former Soviet Union and the US and then, later, between Iran and the US. The Kingdom, into the 1970s, was relatively secular and progressive; the imposition of a pro-Soviet government was a mess but still kept the religious fringe from power. The opportunism of Zbigniew Brzezinski helped bring about the breakup of the state and, with it, Al-Qaeda and the mess that is the 21st century.

In the US, we only have to think about the use of “states’ rights” to deny civil rights to black citizens, or of the recent Supreme Court decision that gave state governments the power to force women to give birth (not to mention draconian laws criminalizing medical care).

Dissipative Structure

During the 1980s and ‘90s, there was enthusiasm, particularly among New Age types, for Ilya Prigogine’s dissipative structures theory of self-organizing systems in nature. The then-popular capsule summary, which was fundamentally accurate, was that dissipating systems in nature come back together at a more complex level of coherence. They reach a new equilibrium. This was viewed as a cause for optimism (and relative Taoist-style inaction). The usually unasked question was what happens to people in the interregnum — you know… during the ‘dissipating’ part. I share this as an example of how abstract theories presumed to be sampled from natural law get instantiated into activities that may be less than beneficial (thank you, social Darwinism).

Outta Sight! Out Of Mind 

The earlier examples reference decentralization on the scale of nation-states. I am more interested in the notion that the ideology of decentralization might take us away from the solutions to problems that can only be fixed at national or global scales. In other words, forming your well-mannered, mutual aid, ecologically-correct enclave does little or nothing to stop the existential threat of climate change or of nuclear and biological weapons, and does little to protect against pandemics (unless the entire world agrees to stop traveling). As with the idea of “temporary autonomous zones” (TAZ), particularly influential in the counterculture of the 1990s, there is and was an underlying sense of having given up on big revolutionary or reformist change in favor of Hakim Bey’s party that has more meaning than the entire US government. Bey himself wrote, “There is no becoming, no revolution, no struggle, no path; already you’re the monarch of your own skin.” The ‘90s were fairly depoliticized, and this made for a happier, less anxious culture, but I think it’s inarguable that big trouble is too present now for dropout scenarios. The apocalypse is everywhere.

Decentralization on a small scale brings another problem: the ‘out of sight – out of mind’ problem. The suffering of others is removed from view and therefore from consciousness. And in a civilization intimately connected not just by technology but by weather and germs, it will come back and bite us all. 

Out-of-sight-out-of-mind is, arguably, reflected in the culture of crypto enthusiasm, and in the virtual adventurism and risk-taking of those who play for pay in that realm. A world of hurt doesn’t appear to dim their excitement.

The Scams of the Blockchain

The angelic ideals of networks of trust and security enhanced by crypto have been crowded out of the public imagination by the demons of NFT scams, exchange hacks, and dodgy ICOs.

In 2022 alone, Forkast News reports, $2.8 billion was lost to “rug pulls, a relatively new type of cryptocurrency scam.” There are people at the other end of each of those pulls, not to mention on the other end of the billions stolen by Sam Bankman-Fried. The list is long and the amounts stolen are immense, and many of the victims don’t have a comfortable fallback. The Federal Trade Commission tells of “over 46,000 people reporting losing more than a billion dollars in crypto to scams since the start of 2021.” Many are elderly and lack sophistication about these activities.

Still, I’ll keep things in perspective. It’s estimated that the games played by banks, investment firms, real estate hustlers and others cost $22 trillion in 2008, and the beat goes on. I would only say that when people get slimed by a crypto scam it’s more immediate, more visceral, more like an expensive three-card monte out in the street.

Credit: Tesfu Assefa

Risk Isn’t A Board Game

Every attempt to bring novel idealistic technologies into general public use evolves new vulnerabilities. It is usually the precariat — those most vulnerable, most desperate, least likely to have the time, inclination or connections to separate the promising opportunities from the Ponzi schemes and the like — who suffer the greatest loss.

As a culture, we need to be able to continue to value risk-taking. Adventurers test new drugs that might prove useful for psychology or creative problem solving. The Wright Brothers had to test the first airplane. Nothing is going to happen in space if conscious individuals can’t voluntarily risk death. But we have to find a way to provide soft landings for those who are not equipped for a high level of risk and adventure. 

Technoculture, as a trope, has been a place for edgerunners and edgelords, but the risks were never theirs alone. It was always about dragging the entire species (not to mention its wealth) into the spooky and inarguably dangerous new realm of digital space. Tech enthusiasts need to add a new demand to their project: a cushion against the most harmful aspects of this historical dislocation for those falling over the edge.

A follow-up column, Risk and Precarity Part 3: Possibilities and Solutions, is coming.


Risk and Precarity Part 1: The Early Digital Age

“Cyberculture” — the embrace of rising digital technology in the 1990s — was attractive to the hip (a word which, according to some, translates into “knowing”). Avant-gardistes are instinctive lovers of risk, always experimenting with the new; always pushing the broader culture forward while pushing its triggers. 

The culture then was at once kindly and brutal. It promised to inform the masses; to give the average person open access to the means of communication — taking it away from the monied, well-connected elites. It touted production technologies that could end scarcity — at the extreme, there was the oft-expressed hope for achieving Drexlerian nanotechnology. This sort of nanotech could, in theory, program matter to make whatever was needed or desired. (They promised me a self-replicating paradise and all I got was these lousy stain-resistant pants.) Declarations about scientists having achieved cold fusion for clean energy were known to be dubious, but surely were indicative of breakthroughs to come.

The hacker ethic, as it was then understood, was all about making everything as free as possible to as many people as possible. Data, at least, was to be free really soon. Unlike physical goods, you can copy and share bits of data and still have them yourself; over the internet, one could share them with everyone online. There was to be no scarcity in anything that was made from data. In theory, with the kind of advanced nanotechnology advocated by Eric Drexler in his 1986 book Engines of Creation, you could share data over the internet that would self-create material commodities. Today’s 3D printer is a primitive version of the idea of turning data into material wealth.

On the flip side of all this noblesse oblige was the arrogance of those who ‘got it’ towards those who didn’t. And hidden within the generous democratic or libertarian emphasis of the cultural moment was the contradictory certainty that everyone was going to have to participate or wind up pretty well fucked. Stewart Brand, very much at the center of things (as mentioned in earlier columns), wrote, “If you’re not part of the steamroller, you’re part of the road.” Note the brutality of this metaphor. In other words, the force that was promising to liberate everyone from the coercive powers of big government and big money — to decentralize and distribute computing power to the masses — contained its own coercive undertow. Brand was saying you would be forced (coerced) into participating in the digital explosion by its inexorable takeover of economies and cultures.

At its inception in 1993, Wired magazine shouted that “the Digital Revolution is whipping through our lives like a Bengali typhoon,” another metaphor for disruption that sounded exciting and romantic but is basically an image of extreme material destruction and displacement. In my own The Cyberpunk Handbook, coauthored with (early hacker) St. Jude, we characterized the “cyberpunk” pictured on the cover as having a “derisive sneer.” Much was made of the cyberpunk’s sense of having a kind of power that was opaque to the general public. Hacker culture even had its own spellings for people who were a million times more talented with computers and the online world than the “newbies” — eleet, or 31337, or l33t. Technolibertarian (and Mondo and Wired contributor/insider) John Perry Barlow whipped out the line about “changing the deck chairs on the Titanic” every time the political or economic mainstream tried to even think about bringing the early chaos under some semblance of control. In 1996, he wrote A Declaration of the Independence of Cyberspace, declaiming, “I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

Barlow imagined cyberspace as a separate state largely unconnected to the realities of governments and other concerns of the physical world, an idea that seems preposterous now that access to the internet is pretty much a requirement to get work, transfer money, and access most medical services.

Even the mainstream’s shiny young boomer president, Bill Clinton, told people that the average person would have to “change jobs seven times” in the new economy (from burger flipper at McDonald’s to barista at Starbucks to lap dancer and back again). He tried to make it sound like it was all exciting, part of changing times, and he and the more directly cyberculture-oriented VP Al Gore touted retraining as a solution for the displaced. (Has retraining been replaced with re-education among the “center-left” politicians of the fading neoliberal consensus? A case can be made.)

As in all these cases, there was not much thought or sympathy for vulnerable people who might not be in a situation or condition that would allow them to cope with this jazzy and exciting rapidly changing future. Which brings us to…

Credit: Tesfu Assefa

The Precariat  

“We are the 99%.”

Class in America has always tended to be unspoken and, during the pre-digital age, there was a strong, comfortable middle class. My own parents, born in the mid-1920s and right in the middle of the middle, never feared slipping into poverty or homelessness. They bought homes. They rented. The cost wasn’t absurd. They got sick and were kept overnight in hospitals without having their savings wiped out. There was a comfortable sense that there would always be a nine-to-five job available with modest but adequate pay and benefits. And there was an additional sense that the companies or institutions they would work for were solid. Whatever it was, it was likely to remain open, functional and not inclined towards mass firings. They wouldn’t have to “change jobs seven times” as suggested by President Clinton.

The idea of a class called the “precariat” — a portmanteau of ‘precarious’ and ‘proletariat’ — was popularized by the economist Guy Standing to describe the increasing numbers of people who lack predictable work or financial security. The precariat need extra work (‘side hustles’) to plug the gap in their income: gig work, underground economic activity, and extended education or that good ol’ Clintonian ‘retraining’. Members of the precariat mint lines of NFTs hoping they will haul them out of precariousness, or at least give them a temporary lifeline. Ride-sharing businesses can only exist where there is a precariat.

There is an equal or perhaps greater cause for precarity in the state’s hands-off approach towards monopolies, and towards what Rebecca Giblin and Cory Doctorow call ‘monopsonies’ (they didn’t originate the word). Wikipedia explains this economic trap as one in which “a single buyer substantially controls the market as the major purchaser of goods and services offered by many would-be sellers.” Amazon is a world-historic example. The backlash is directed at digital technology as a whole, rather than just Amazon or some other monopoly company.

Occupy Wall Street & the 99%

Although the people who initiated Occupy Wall Street probably were not using the term back in 2011, their genius was in recognizing that precarity could extend to as much as 99% of the public, as middle-class, upper-middle-class and even a few wealthy people’s investments crashed, homes went “underwater,” businesses folded, etc. When Occupy started and gained attention, some polls showed that a greater percentage supported than opposed the movement (44% versus 35%, according to Pew Research). This may not seem impressive, but it was a good stat in a land where most people are persuaded that they can achieve “the American dream” with hard work and good luck.

Identity: We Are Not The 99%

Many blame social media for spreading hostility among the public, both in the US and elsewhere. And there can be no doubt that seeing what huge numbers of other people have on their minds is the most irritating thing imaginable. (Cyber-romantics of the ‘90s rhapsodized about the idea of a noosphere — a kind of collectivized global brain. On learning what’s going on in a lot of brains, I would suggest that this idea was, at best, premature. Detourning Sartre for the digital age: Hell is other people’s tweets.) Still, dare I suggest that there was a quantum leap in emphasis on identity divisions and anxieties in the immediate aftermath of Occupy? Was there, perhaps, a subterranean effort to convince us that we are decidedly not the 99%? I try to stay away from conspiracy theories but the thought nags at me.

Not Happy To Be Disrupted

As I noted in an earlier column, a lot of people living in precarity are not happy to learn about new disruptive technologies. More people, including many who were techno-romantics back in the ‘90s, now feel more like the road than the steamroller in Stewart Brand’s metaphor. Programmers are now panicking about losing jobs to AI, and I hear talk that some in the libertarian bastion of Silicon Valley are opening up to more populist ideas about engaging the state in some form of guaranteed income security.

A follow-up column, Risk and Precarity Part 2: The Age of Web3, is coming.


The Politics of Appropriation and the Active Use of Content-Creating AI

Will AI be used constructively or destructively? That likely depends on sociological and political factors external to AI or technology. An irruption of barbarism (for example, in the much-ballyhooed upcoming American civil war) would bring destructive uses of AI, mainly fraud and trickery. It’s hard to think of a turn towards empathy doing much good, since it only takes a minority of bad actors to wreak havoc, but one can dream. From Wall Street traders laughing about screwing middle-class investors out of 401ks during the financial collapse and bailout of 2008, to the relentless news of corporate predations, manipulative politics, and the plague of street-level grifters tricking the elderly out of their cash, the evidence is pretty strong that the abuse of AI will be front and center in our minds and discussions going into the immediate future.

But there is one area in which your relationship to AI may be more self-selective: the active versus the passive use of these apps and opportunities for creative work and experimentation. Here we have a richer and more complicated set of relations and possibilities.

Inappropriate Appropriation?

The artist Molly Crabapple recently posted a totalist objection to the use of AI in the arts, writing, “There’s no ethical way to use the major AI image generators. All of them are trained on stolen images, and all of them are built for the purpose of deskilling, disempowering, and replacing real artists.” On reading this I started thinking about how, at Mondo 2000 (the magazine that I co-created), the use of appropriation in creative work was consistently advocated. The main idea, of course, was that you appropriate — use found materials — to make something original, drawing a line between plagiarism and use, although there were exceptions. Our writer Gareth Branwyn amusedly quoted the Austin, Texas-based Tape Beatles’ slogan “Plagiarism Saves Time”. Even our ever-provocative Mondo 2000 softened that for our own “appropriation saves time.”

One might compare the incursion of AI into creative use in visual art, writing, music, etc. to the advents of both the cassette recorder and the digital synthesizer. We saw the same reactions from musicians and the music industry. With home taping, users of the technology could make copies of recorded music by taping from the radio or a friend’s record collection. The tape itself could then also be copied. In the early ‘80s, the music industry adopted the slogan “Home Taping is Killing Music”, engaged in several lawsuits and lobbied the US Congress (as well as other institutions in Canada and Europe) for legal action to cover their perceived losses from the cassette-taping menace. With the advent of the digital synthesizer — the sampler — the floodgates opened to a deluge of conflicts over ownership of music content. Old musicians and their lawyers demanding money from young sampling whippersnappers fueled the disappointment that GenXers felt about the Baby Boom generation.

For Mondo 2000, Rickey Vincent, author of Funk: The Music, The People and the Rhythm of the One, wrote about the connection between hip-hop, rap, and the cyberpunk aesthetic as enacted by that genre’s playful use of found materials via the technology of the digital sampler: “Sampling is the auditory form of hacking through a database. A certain functional anarchy is involved which one might argue is good for the soul. For hip-hop, a sampler is not a toy. It’s an important instrument in the function of the rap song statement.”

More broadly, in the pages of Mondo 2000, the audio-collage band Negativland, whose use of found material sometimes landed them in lawsuits and hot water, were given the kind of coverage that Rolling Stone would have reserved for Janet Jackson. Our friend and frequent subject, the literary avant-gardiste Kathy Acker, blatantly lifted entire pages out of classic texts, mashing them up with biographical material, fantasy, philosophy and whatever else seemed to work to create her well-regarded (by some) novels. In his Mondo interview with Negativland, Beat historian Stephen Ronan declaimed, “appropriation is the hallmark of postmodernism.”

Mondo art director Bart Nagel’s playful take on our love affair with appropriation from Issue #10 is too amusing not to share in full:

Some guidelines for appropriation

1. Remember: Appropriation saves time.

2. Appropriate your images from old books and magazines where, chances are, all parties who could make a case against you are dead or failingly old.

3. Unfocus the image slightly to avoid the moiré pattern (in Photoshop try a 0.8 Gaussian blur; see the sketch after this list).

4. Morph, tweak or otherwise alter the image unrecognizably.

5. Don’t alter the image at all; have Italian craftsmen sculpt a simulacrum (not guaranteed to work).

6. Appropriate images from MONDO 2000 – these may already have been appropriated. Let’s confuse the trail. 

7. Appropriate images from ads in RAY GUN and submit them to MONDO — now it’s come full circle — and it’s ecologically sound (recycling is good).

8. It’s hip hop.

9. And finally, this: if you take someone else’s image it’s appropriation, or resonating, or recommodification; if someone takes your image — it’s stealing.
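Guideline 3, for all the joking, names a real technique. For the Photoshop-less, here is a sketch of the same de-moiré move in Python using the Pillow imaging library (my substitution, not Nagel’s; the filenames are hypothetical):

from PIL import Image, ImageFilter  # pip install pillow

# Soften a scanned halftone image slightly to suppress moiré,
# roughly what a 0.8-pixel Gaussian blur does in Photoshop.
scan = Image.open("old_magazine_scan.png")
softened = scan.filter(ImageFilter.GaussianBlur(radius=0.8))
softened.save("old_magazine_scan_demoire.png")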

Self-satire aside, the complications over use and reuse are myriad.

Credit: Tesfu Assefa

Culture Uses Culture: News Uses News

In journalism, the hard work of the person who “gets the story” will lead to multiple news items, most of which don’t credit the original source. For those engaged in consequential investigations, it is more important that the information spread accurately than for the originator to be repeatedly credited. Just as songs enter common usage for people to sing or play as they will in daily life, the hard work of the journalist becomes fodder for other news stories, dinner table debates, opinion columns, tantrums on TV or combat at conferences.

All of this is to say that the ownership of one’s content is the blurriest of lines. It certainly keeps our courts busy.

But Does AI Make It All Too Easy?

It’s true that using AI for creativity might be different from the sampling we’ve seen so far. Sometimes more becomes different. It’s a matter of degree: the amount of content grabbed by AIs and the degree to which the origins of AI-created content may be obscured make it, arguably, a different situation. The first cause of concern is that AIs may be good enough — or may get good enough soon — at some types of content creation that creative people will no longer be required. This is a situation touched on by my previous column about the writers’ strike. AI alienates human creatives in a way that sampling didn’t, and the concerns about it putting people out of work are being widely expressed — and are legitimate. When it comes to alienating types of labor, one response is some sort of guaranteed income, and a movement towards a sense of purpose around unpaid activities. The identity and self-esteem of the engaged creative is deeply embedded in that social role, and getting paid defines one as a capital-A Artist or capital-W Writer, because otherwise everybody does the same thing you do.

The artists’ natural affinity and passion for quality work is another source of angst, as covered in my previous article on ‘facsimile culture’. The replacement of quality work with the facsimile of quality strikes many creatives deeply; the war against mediocrity is a great motivator, particularly for alienated young creators finding their footing.

Back in the day, you couldn’t switch on your sampler or even your synthesizer and tell it “make a recording that sounds just like Public Enemy with Chuck D rapping about Kanye West’s weird fashion sense”, and have it spit out something credible with no intervention from creators/copiers. The AI creation of a “fake” Drake and The Weeknd collaboration freaked some people out — mainly because they suspect that it took less creative effort than an actual collaboration between them would have. But sometimes laziness in music can also produce good results.

Finally, and probably most importantly, the degree to which creative AIs are tied into the billionaire and corporate classes validates Crabapple’s broad-brush claim that their primary intended uses are to serve those classes’ interests, and to disempower more freelance or democratic or unionized groups of creative workers. The list of large corporations and billionaires engaged in AI development includes Musk, Bezos, Brin, Peter Thiel, Google, Microsoft, and Baidu. These persons and organisms are all suspect. The notion that Big Tech wants to deliver us cool tools in a non-exploitive way has lost its luster since the more trusting days of early internet culture. The trend towards unionization increases the likelihood that these companies are acting out of anxiety to get rid of expensive and messy humans, as does the recent spate of layoffs.

For The Individual: The Passive v. Active Uses of AI

Still, there’s room for us to work and play with the tools handed down to us by the corporate monsters. (I type this on a Mac, designed by one of the world’s richest and most litigious corporations.)

Passive uses of AI might include the obvious things we are subjected to, like phone-answering bots that declaim “I understand full sentences. What can I help you with?”, the automated checkouts at supermarkets, and whatever your bank or financial institutions are doing with your money. If you’ve been reading CNET or Buzzfeed and didn’t know that some articles were written by bots, you might, in some sense, feel you’re being used by chatbots. And if you were to use ChatGPT and give it one simple instruction — maybe asking it to write something for you about the current House of Representatives in the style of Hunter S. Thompson — that would be a passive use of ChatGPT. (The results would also be disappointing. The bot is prevented from getting mean about individuals, even when they deserve it.)

Active users have multiple strategies. With most creative-oriented AIs you can, in a sense, hold a dialogue. You can respond, ask for more, use multiple prompts. The writer and sound engineer Paul Boutin (Wired, NY Times) presented impressive results on Facebook from asking ChatGPT to write a column in the style of the acerbic and critical tech culture magazine Valleywag (now inactive). I asked him what his strategy was for getting more out of that app:

Boutin: “To get ChatGPT to write more like yourself, think about not only your prose style and vocabulary, but what mindset do you bring to your writing? I got it to do a pretty good version of an old Valleywag blog post by instructing it several times over a half dozen drafts: Don’t be even-handed, be unapologetic. Don’t say but convey that you know you’re smarter than the people you’re criticizing, even though they presume the opposite. Don’t hedge — you don’t care what they think or fear repercussions. Be funny, in a sarcastic condescending way. Don’t insult people with names, but express amusement at their foolish actions and words in the specific topic. Also to never say but always convey: Everyone thinks they can do what you do, but they couldn’t no matter how hard they try. They foolishly think they will be able to match your skills with enough prompt engineering.”

“Here’s what ChatGPT gave me on its sixth try:

AI Authors: The Future of Laziness

Bravo, AI-author enthusiasts! Why work hard on your literary masterpieces when you can let an AI remix internet content? Skip the pesky interviews and firsthand accounts—let’s keep literature artificial and unoriginal. Who needs talent, depth, or hard work when you’ve got an algorithm? Just remember, every time you sip your latté while the AI ‘writes’, you’re celebrating a new era of mediocrity. Cheers to the rise of the machine!”

Bravo Paul and “Chat”
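For the programmatically inclined, Boutin’s steering method maps naturally onto a chat API’s running message list. Here is a rough sketch in Python using the pre-1.0 openai package (the model name and prompts are my illustrative stand-ins, not Boutin’s actual ones):

import openai  # pip install openai (pre-1.0 interface assumed)

openai.api_key = "sk-..."  # your key here

# Active use: keep the whole conversation and pile on corrective notes,
# the way Boutin steered his Valleywag pastiche over several drafts.
messages = [
    {"role": "system", "content": "You are a sarcastic, condescending tech-gossip blogger."},
    {"role": "user", "content": "Write a short post mocking AI-authored articles."},
]
for note in [
    "Don't be even-handed. Be unapologetic.",
    "Never say, but always convey, that you're smarter than your targets.",
    "Funnier: sarcastic and condescending, but no name-calling.",
]:
    reply = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    draft = reply["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": draft})
    messages.append({"role": "user", "content": note})

print(draft)  # the draft after three rounds of steering

The point is the loop: the passive user takes the first draft; the active user keeps the transcript alive and keeps pushing.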

Another strategy has long been used by creatives engaged in appropriation. A lot of musicians (including David Bowie and Mick Jagger) used William S. Burroughs’ “cut-up technique” for recombining words to launch a lyric. A fragment of chatbot text could be deployed in the same manner: to get started, break writer’s block, write a transition, or sum up.

It could, in fact, be argued that for a truly creative piece of writing built on a skeleton of facts, the facts are the boring part. It might not be a crime against writing to grab your skeleton entirely or almost entirely from a chatbot and flesh it out with your own imagination or insight. In the visual arts, AI might help you rapidly generate alternative samples of a work, varying shading, color, proportions, etc. This is very likely something you already use a machine to do. AI will simply be making the work happen faster. In other words, the active user is engaged in some conscious way with creative AI and doesn’t need to be told what tools to use. 

Risk and Precarity

In an economically, socially, sexually and environmentally anxious moment, the excitability of those inclined towards neophilia (love of the new) brushes up not just against neophobia, but against the very real conditions of our historical moment. Very few of us can dismiss the fears of being displaced, mislabeled, denied or messed about by people and institutions using AI. Technoculture was built on the romance of risk and “disruption”, and, now that the chickens are coming home to roost, culture is not altogether happy to be disrupted. A column about risk and precarity in relation to the culture of technology (which now is, of course, culture itself) beckons sometime soon…


Hollywood Writers Strike Versus Facsimile Culture

Since I first made fun of AI panic back in my second column, I’ve been growing more disturbed. I never thought I’d join the Luddites, but the striking writers of the entertainment industry are demanding to “Regulate use of material produced using artificial intelligence or similar technologies”, and these writers are the first line of resistance against cultural productions being turned into facsimiles of creative brilliance. This has become a point of emphasis among the signs carried on the picket lines, an indication of its populist appeal. It’s likely that the strike will actually make entertainment bigwigs more attracted to the possibility of ditching the humans for compliant chatbots with no demands and few needs. The fight against AI taking over TV writing is one that should be taken up ‘by viewers like you’ (as PBS likes to say). If you like complex TV shows and films with brilliant dialogue, it’s in your interests to keep the humans and not let the products of their minds be replaced by an AI-created facsimile.

In the Art World, Facsimiles Have Become a Field of Play in Which Toying with Financial Valuation Serves as a Sort of Content

In the art world, the distinction between the real thing and a copy of the real thing has been obscured for many years, with a wide variety of consequences. In visual arts, copying became a field of play. The philosopher Walter Benjamin set the terms of the discourse in 1935 with his essay ‘The Work of Art in the Age of Mechanical Reproduction’. Benjamin was dealing with physical objects, and he theorized that an original artwork carried an ‘aura’ that gave it a high capital valuation. In the age of increased reproducibility, Benjamin conjectured that the value of the original would diminish. This hasn’t happened, as originals both old and new fetch huge prices. At least since the Pop Art movement of the 1960s, the art world has toyed with this trope — this predicted tug-of-war between the original and the facsimile — by just saying yes: delivering both originals and multiples. Warhol started mass-distributing postcards of his most widely-recognized works in the early 1960s, while the original maintained its ‘aura’ and could be sold to collectors (although it took the pale man’s actual demise for the aura to glow enough to attract really spectacular sums).

An odd twist comes into play in the land of NFTs. The work is infinitely replicable and can be distributed in moments to billions of internet users, and the NFT collector may or may not be purchasing exclusive access. What the collector seems to be after is not the aura of the artwork, but the aura of ownership in and of itself – or of a particular relationship to the work.

The Mass Distribution of the Facsimile of Recorded Music 

In the world of recorded music, Neil Young stood out as the loudest early curmudgeon complaining that digitally-created CDs and, later, music files offer a pallid facsimile of what a recording artist intends. (Of course, it could be argued that recorded music itself is a facsimile of the way music was experienced for millennia prior to its invention, but I’m not going to try to unpack that here. In 1931, the American Federation of Musicians denounced recorded music as, basically, a facsimile of live music that would debase the art.) Over the years, Young’s complaint has become generally accepted wisdom. We trade quality for the facsimile that is easily distributed and conveniently available.

My friend ChatGPT agrees: “Digital audio compression works by removing parts of the audio signal that are perceived as less important or less noticeable, in order to reduce the file size and make it more convenient for storage and distribution. However, this process can also result in the loss of subtle nuances and details that contribute to the overall richness and depth of the sound.

“Studies have shown that digital audio compression can result in a loss of dynamic range, which is the difference between the loudest and softest parts of a recording. This can make the music sound less dynamic and less engaging, particularly in genres such as classical music that rely on subtle changes in volume and tone.”

Will the ironies never cease?
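ChatGPT’s point about dynamic range is easy to make concrete: dynamic range is just the decibel gap between a recording’s loudest and softest stretches, 20·log10(loud/soft). A toy sketch in Python with numpy (mine, not the bot’s):

import numpy as np

def dynamic_range_db(samples: np.ndarray, frame: int = 2048) -> float:
    """Rough dynamic range: dB gap between the loudest and softest frames."""
    usable = samples[: len(samples) // frame * frame].reshape(-1, frame)
    rms = np.sqrt((usable.astype(float) ** 2).mean(axis=1))
    rms = rms[rms > 1e-9]  # skip digital silence
    return 20 * np.log10(rms.max() / rms.min())

# Toy signal: a quiet sine followed by one 100x louder, i.e. a 40 dB spread.
t = np.linspace(0, 1, 44100)
quiet = 0.01 * np.sin(2 * np.pi * 440 * t)
loud = 1.0 * np.sin(2 * np.pi * 440 * t)
print(dynamic_range_db(np.concatenate([quiet, loud])))  # ~40.0

Squeeze that 40 dB spread in mastering or lossy encoding and the number shrinks; that shrinkage is the loss Young and the classical listeners mourn.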

Is All Cultural Production A Facsimile of Life?

Taking a sharply anarchic left turn in this exploration, we might take up the view of the European radicals of the 1960s, the Situationists, who viewed all cultural production as contributing to the ‘society of the spectacle’. In his book ‘Society of the Spectacle’, Guy Debord wrote, “The spectacle is a social relation between people that is mediated by an accumulation of images that serve to alienate us from a genuinely lived life. The image is thus an historical mutation of the form of commodity fetishism.” In other words, all art (from the word artifice or artificial) alienates us from direct experience. Among the practices used by the Situationists, the one most familiar to us today would probably be actions that contemporary people would call pranks. These were actions designed to break the trances of citizens going through their normal routines. The Situationists called this tactic ‘construction’, and it involved creating situations that would disrupt the status quo and encourage spontaneous excitement, joy or, for that matter, anxiety.

Situationism pretty much abandons mediation completely for intensely lived daily lives, what Situationist Raoul Vaneigem called ‘the revolution of everyday life’.

An eruption of this sort of consciousness would pretty much put the striking writers out to pasture. But this is not our world today.

The boss needs you, you don’t need him! (Credit: Wikimedia)

Remember When YouTube Was Going To Wipe Out Professional Television Production?

Aside from AI creativity — or possibly in combination with it — another specter looming up to challenge TV writers is the democratization of video production. This was, first of all, the dream of avant-gardists like Nam June Paik: that everyone could be a video artist, that video would become a medium of creative self-expression and break up the confining linearity of storytelling. And, back in the earlier years of this century, Wired magazine-related pundits like Kevin Kelly and Chris Anderson predicted that the “long tail” of small-scale content creators (video in particular) would create niche audiences that would seriously impact, and begin to replace, the big movie and television productions. This doesn’t appear to have happened, although it could be that TikTok is grabbing them while they’re young, and a generation will emerge that prefers 30-second clips of someone having their cat speak in a funny voice to the complex plots and dialogues of shows like ‘Succession’ or ‘Atlanta’.

Maybe Our Lives Are A Facsimile

Finally we come to Simulation Theory, that favorite 21st-century cosmology holding that our very lives themselves may be, in a sense, a facsimile, a mediated creation… a computer simulation. In this case, we may as well carry on by emphasizing that which gives us pleasure – at least until we find a way to bust out of The Matrix without switching off our facsimile universe. Like Pinocchio and Zuckerberg, we all long to be real boys (or others).

What Is To Be Done?

I’ve seen mixed results from attempts to get Chatbots to achieve authentic creative madness. So I think we should place our bets on a proven winner. That would be the screenwriters who have managed to send some wonders to our screens in this century, from the aforementioned ‘Succession’ and ‘Atlanta’ to ‘Fleabag’, ‘Black Mirror’, ‘Mad Men’, ‘Fargo’… the list of well-written shows goes on. (I won’t mention the unfunny comedy writing of ‘SNL’ or ‘The Daily Show’. Nope. Won’t mention it.) 

I mean, assuming there won’t be a revolution in everyday life in which we achieve some kind of unmediated, intensely experienced existence, I suggest we try to keep these writer-freaks employed, well-paid and supplying us with cool content. (Why do I imagine that a Situationist revolution of unmediated, intensely experienced existence could end up being as exhausting as having to work all the time? It’s a bit like being asked to constantly engage in leisure activity as a participant when sometimes you just want to kick back and watch a good TV show. Sometimes we choose slack over engagement.) Speaking of which, after I completed the first draft of this piece, it was announced that Facebook’s ‘Metaverse’ had failed and was being shut down. It’s unclear whether the failed attempt to bring VR to Facebook’s 3 billion users represents a rejection of VR as a participatory medium that some, as far back as the early 1990s, thought would replace TV, or whether the technology is still too raw for people to want to climb in, or whether Facebook’s particular attempt was somehow flawed.

In any case, we should support our striking writers, lest the profiteers of television decide that they don’t need any pesky humans and can fill the air with cheap reality programming (some of it possibly featuring dumb AIs and even dumber humans engaged in banal contests), even if the award-winning smart shows disappear. After all, reality TV gets a big viewership and is extremely profitable. I fear this may be the ultimate result of the great battle of the Hollywood writers against the entertainment machines.


Steal This Singularity Part One: The Yippies Started The Digital Revolution

Every fourth one of these Mindplex articles will be an annotated and edited excerpt from my multipart piece titled Steal This Singularity, originally written sometime in 2008. This will continue until I get to the end of the piece or the Singularity comes. Annotation is in gray italics.

Part One: Steal This Singularity

1: The notion that the current and future extreme technological society should not be dominated by Big Capital, Authoritarian States or the combination thereof. Also related, a play on the title of a book by 1960s counterculture radical Abbie Hoffman. Abbie may be an obscure figure to today’s young people. Let’s start to fix that here.

2: The notion that in our robotized future, human beings shouldn’t behave robotically. The response to AI isn’t to blow up or hack down AIs. Become so unique and original that no program, however sophisticated, can perform you. Let AI thrive. You have less clichéd paths to follow!  

 A few years ago, I proposed Big Dada as a response to Big Data. Big Data is the corporate/state/organization tool for exploitation and control, and/or for making effective policy for human benefit. (Life’s rich in ambiguity.)

With Big Dada, I suggested confusing the data by liking what you hate; hating what you like; by lying; by engaging serious issues with surrealistic gestures and language and by generally fucking with data’s logic circuits. I didn’t suspect at that time that a power-hungry, orange-faced, grifter-trickster reality show host would capture Big Dada in a sort of chaos-fascism. Clearly, there were bigger, richer Big Dadas to contend with. Who knew?   

The well-rounded posthuman — if any — should be able to wail like a banshee, dance like James Brown, party like Dionysus, revolt like Joan of Arc and illuminate the irrational like Salvador Dalí. Sadly, the ones that aren’t mythological are dead, so a smart-ass immortalist might argue that even being able to wag a finger would be an improvement over the passions or mobility of these three losers. 

3: The title for a website in which R.U. Sirius says and does as he pleases. As it turned out, it pleased me to not do much with that website.  

The Singularity is, of course, conceived of as the time at which the artificial intelligences that we create become smarter than us. And then they make themselves even smarter and smarter still and yet smarter again and so forth… at an ever-accelerating pace, until they become, incomprehensibly, something other to our wormy little minds.

I have to be honest. I’m not sure how seriously to take this. But ‘Steal This Singularity’ has much more of a ring to it than ‘Steal This Future’ or ‘Steal This Transhumanity’. Good sloganeering is intellectually disreputable… but fun. Plus anything that can fit on a T-shirt can be sold. My friend Timothy Leary used to advocate for getting your philosophy down to a bumper sticker. Tim was disreputable… but fun. And the way I see it, The Singularity has become a buzzword for the rad techno-future brought on by NBIC (Nano-Bio-Info-Cogno) or GNR (Genetics, Nanotech, and Robotics) or — to put it in more populist terms — the coming of the machine overlords.

Look, for example, at Singularity University. SU had just been established when I wrote this. Here we have the establishment Singularitarians, all hooked up with NASA and Google and Cisco and Genentech. And how seriously did they take the Singularity label? Well, when Alex Lightman and I interviewed founder Peter Diamandis for h+, he made it clear that they were using the word for the same reason that I was: COOL BUZZWORD! That… and to make Ray Kurzweil happy. Ok. He didn’t blatantly say “cool-ass buzzword, dude!” He said: “to be clear, the university is not about The Singularity. It’s about the exponentially growing technologies and their effect on humanity… You know, we toyed with other terms… like Convergence University and others. But in homage to Ray…” Why do I suspect investment capital was involved?

So, in equivalent homage to SU, I call this project ‘Steal This Singularity’ and admit straight out that it may or may not have jackshit to do with ‘The Singularity’, depending on accidents, random moods and possible funding.

The question, then, may be asked, smarter-than-human AIs aside, does ‘Steal This Singularity’ presume the rather technotopian possibilities promised by transhumanism, but believe that it will be necessary to STEAL it from the so-called 1%? Is that what I’m up to here? Well, maybe. How does one steal a Singularity (or something like it) from corporate ownership? I think this is a serious question. It’s almost certain that, barring a revolution, the digital other-life will be privately owned (in case of a revolution, it will probably be controlled by the vanguard state… and then, eventually, also privately owned). If, for example, humans can upload themselves into data-based quasi-immortality, it will be owned and the options will be more locked in than North Korea on a bad day. And one fine day, the powers that be or some nasty 12-year-old hacker will drag you into the garbage icon. (Yes, the garbage icon is eternal.) OK, fun’s fun but let’s get back to the real, old-school fun, i.e. the Yippies.

Part Two: The Yippies Started The Digital Revolution

In 1971, a revolutionary prankster/celebrity named Abbie Hoffman, who had started the radical group the Yippies (Youth International Party), released STEAL THIS BOOK, a manual for living on the fringes of a wealthy society by grabbing up some free shit from corporate powers while committing some Blows Against the Empire (another influence on this project, btw).

Credit: Tesfu Assefa

See, 1971 was the last year that the vanguard of the counterculture thought that they were going to make a total cultural and political psychedelic/anarchistic/left-wing revolution before realizing… fuck it. Let’s campaign for McGovern. But more to my point here, and the milieu this piece attempts to evoke… true story… the Yippies started the phreakin’ digital revolution! To wit: the hacker culture started as the phone phreak culture. The phone phreak culture came out of the Steal This Book attitude about getting free stuff from the detritus of corporate culture, in this case, the phone company.

I wonder how shoplifting and other forms of gutter-freak theft play today among some leftists – the ones who seem to have become “inlaws in the eyes of Amerika” (Jefferson Airplane reference)… inclined towards lawful good behavior and even occasional pompous respect for American institutions. This must have emerged in reaction to a lawless lunatic right that has taken a much more visible and colorful role in the zeitgeist. There’s some extreme code-switching when it comes to the romance of insurrection (Yippies, for example, dug the Weather Underground… which, in those days, wasn’t a website for following weather conditions). And the QAnon Shaman – with his war paint and animal howls – seems like someone who would only have been at home in a Yippie! prank back in ’71. There’s so much more I could say about code-switching. Maybe some other column.

The first legendary phone phreak, John Draper aka Captain Crunch, who built the blue boxes, used to hang out at 9 Bleecker Street, NYC, Yippie headquarters. The first magazine that focused primarily on phone phreaking was YIPL (Youth International Party Line), which was started by Hoffman and “Al Bell.” In 1973, it transmogrified into TAP, which is more broadly remembered as the initiatory phone phreak periodical.

Phone phreaks were computer hackers. Draper famously noted that the phone system “is a computer.” From this milieu, the personal computer arose. Steve Jobs and Steve Wozniak famously funded the birth of Apple by selling blue boxes for phone phreaking.

Another Yippie contribution is the use of McLuhanism as a weapon in the countercultural revolution. Hoffman, Jerry Rubin and the other original YIPs took an idealistic youthful new left that was sort of basic and organic, a mirror of the folk music that they loved, and made it “go electric” (a term used for when Bob Dylan started using rock ’n’ roll to communicate his increasingly surrealistic cultural critique). That the medium is the message was central to their strategy for an anarchic left-wing sex, drugs & rock ’n’ roll youth revolution. Hoffman’s 1968 book ‘Revolution For the Hell of It’ is saturated with McLuhan references and strategies for how a freak left could take over America, end war and racism, and bring about a post-work celebratory psychedelic utopia. ‘Do It!’, Yippie prankster/leader Jerry Rubin’s 1970 book, was ‘zapped’ (i.e. designed) by Quentin Fiore, the same force behind ‘The Medium is the Massage’, McLuhan’s most successful incursion into the popular mind. The YIPs had faith that, being native to television and rock ’n’ roll radio, they had an intuitive understanding of the era that outmatched the dinosaurs of the establishment. They could bring the already rebellious rock ’n’ roll media babies into their utopian revolution.

As things evolved (or devolved), young people did become increasingly rebellious, and even riotous. The counterculture drifted from the intellectual class in the leading colleges out into the broader youth culture, and the emblem of rebellion shifted from Jane Fonda’s progressive activism to Peter Fonda giving the world the finger in ‘Easy Rider’. I bet some of those tangled up in this inchoate rebellion reemerged in 2021, in the Capitol Building on January 6, as hairy old dudes being disrespectful to Nancy Pelosi’s desk.

McLuhan wrote, “The global village absolutely ensures maximal disagreement on all points.” Wow! Sure seems to have called modern digital culture! This can be traced to the hippie/yippie reframing and idealization of mediated pop cultural hipness, and then on through Stewart Brand, who became obsessed with the idea that a picture of the whole earth would create a shift in human consciousness that would have us identify as citizens of earth (the global village) rather than members of a tribe or nation. Brand, with his Whole Earth Catalogs in tow, went on to become, arguably, the central figure of the emerging digital revolution in the late 1980s, sponsoring the first hackers’ conference and the first intellectual (maybe the last) social media site — a BBS called The WELL — and helping create ‘Wired’ magazine, which idealized accelerated change as a world-improving hip cultural and business revolution. This may seem like a long distance from the Yippies’ original intentions — although it may be that where we landed was inevitable, the view of the 1995 essay ‘The Californian Ideology’ by Richard Barbrook and Andy Cameron.

Indeed, the rise of the computer-enthusiastic hacker movement of the 1980s, which was made up pretty much entirely of counterculture enthusiasts, was well-timed to the Reaganite agenda for setting the entrepreneurial impulse free from regulation. It was these two forces in tandem that made the digital revolution happen. But I’m trying to cover too much ground in one column – a rant for another time. 

Read the follow-up article Steal This Singularity Part 2: The More Things Change, The More You’ll Need To Save.


Accelerating to Nowhere


In an excellent conversation right here on Mindplex, Cory Doctorow went on a bit of a rant about how there were more changes over the 20th century leading up to the digital revolution than in this virtualized century. It’s worth sharing most of it: “mid century America, from post-war to 1980, is probably the most dynamic era in industrial history. In terms of total ground covered, we’re talking about a period that went from literal horse drawn carriages as a standard mode of transportation for a significant fraction of Americans to rocket ships… the number of changes you had to absorb from cradle to grave over that period are far more significant than the ones we’ve had now… someone born, like me, in 1971, has had to deal with computers getting faster and more ubiquitous, but not the invention of computers per se…. not the invention of telecommunications per se…”

Accelerationists, check under the pedal. It may be bricked. (ed: R.U. Sirius uses the term ‘accelerationist’ to mean those wishing to intensify technological change, and not specifically in the neoreactionary sense of the term.)

Accelerating Into Digital Delirium

The point is well taken.

I would only counter that, in a sense, Cory is comparing oranges to apples (or Apples, if you prefer). The 21st century is seeing a particular type of extreme acceleration: an acceleration out of physicality into nonphysical virtual space. And as the result of a number of factors — not least of which are our already distorted economic and political cultures — this appears to be an acceleration towards something of a mass psychotic break with reality. From people shooting a lot of people at once as a lifestyle choice (over the last few days, shooting anyone who unexpectedly enters your personal space has become trendy), to the predations of the followers of the cult of QAnon, the evidence is all around us, particularly in the hypermediated hot zone that is the USA.

On Twitter, Chris Stein reports an example of this confusion: “Today I saw a guy with two hundred and eighty thousand followers promoting a story about McDonald’s in the UK serving dead babies and there were numerous comments that were like ‘yeah! Criminal! We are outraged! This is bad’” A few days ago Vice reported that “someone is selling computer generated swatting services.” Automated terror as an amusement for some young males. The very fact that swatting seems like a fun game to some young people is one of the myriad examples of the degree to which people today are buffered from physicality by mediation… divorced from the consequences of their actions. Taken alone, these examples may not strike the reader as being as impactful as, say, the twentieth century killing fields of Cambodia. But I would aver that the plague of bad, weird actions caused by digital interference in our ability to separate reality from virtuality is among the first signs of a fast-spreading mass delirium.

In a 1991 MONDO 2000 interview, the late avant-garde novelist Kathy Acker said: “When reality—the meanings associated with reality—is up for grabs, then the body itself becomes the only thing you can return to.” Today, virtuality assaults that body as if it were its most potent appendage.

A Kind Of Social Singularity

Vernor Vinge’s original concept of the Singularity suggested that we can’t understand or predict who (or what) we will be, or what life — our societies, psychologies, politics, technologies etc. — will be beyond the point when we develop smarter-than-human AIs. It would, according to Vinge, all be a kind of unimaginable blank slate. My less extravagant thought is that we induced a kind of Social Singularity when we herded billions of humans onto the net.

Giving everyone access to the means of global communication both as communicator and receiver has shattered consensus realities into individual and small-group reality tunnels. From this point on, we can no longer comprehend or predict what the cognitive and sociopolitical results will be. Of course, unlike in Vinge’s Singularity, this Social Singularity doesn’t replace humans as the main actors in our history.

On The Other Hand

I’ll confess that I may be going a bit overboard in saying that a Social Singularity can be caused by just the presence of billions of people on the internet. At the end of the ‘90s, I was already saying it was impossible to get people to focus on the same narrative. Then 9/11 happened. One event decidedly brought people into the same narrative and, it must be said, a harmful consensus was generated that led to the Patriot Act, the American torture gulags and the preposterous invasion of Iraq. There are good things and bad things about smashing consensus reality.

Perhaps climate breakdown could refocus us on a common narrative, but there are greater financial interests in sowing confusion about blame and solutions there than there were after 9/11 (although the Iraq invasion was a money spinner for companies owned by friends and even members of the George W. Bush administration).

Money For Nothing & Your Clicks For Free

In his essay ‘Of Flying Cars and the Declining Rate of Profit’, the anarchist writer and philosopher David Graeber wrote about how the end of bipolar competition between the US and the USSR might have been instrumental in changing the priorities of the US. Specifically, we backed off the Space Race – but we abandoned most other big physical/material projects too. (The culture theorist Arthur Kroker referred to the period after the fall of the Soviet Bloc as “the recline of Western Civilization”.) Graeber wrote: “Where… are the flying cars? Where are the force fields, tractor beams, teleportation pods, antigravity sleds, tricorders, immortality drugs, colonies on Mars, and all the other technological wonders any child growing up in the mid-to-late twentieth century assumed would exist by now? Even those inventions that seemed ready to emerge—like cloning or cryogenics.” Graeber goes on to note that we’re in “a technological environment in which the only breakthroughs were those that made it easier to create, transfer, and rearrange virtual projections of things that either already existed, or, we came to realize, never would.”


Graeber points out that social analysts identified the Space Race as key to the 20th century public’s exaggerated expectations of transformative technological magic. But Graeber goes further than that. He writes that the Soviet Union constituted a greater challenge to American technological superiority than is usually recognized. Graeber: “There was the awesome space race, alongside frenetic efforts by U.S. industrial planners to apply existing technologies to consumer purposes, to create an optimistic sense of burgeoning prosperity and guaranteed progress that would undercut the appeal of working-class politics.

“These moves were reactions to initiatives from the Soviet Union. But this part of the history is difficult for Americans to remember, because at the end of the Cold War, the popular image of the Soviet Union switched from terrifyingly bold rival to pathetic basket case—the exemplar of a society that could not work. Back in the fifties, in fact, many United States planners suspected the Soviet system worked better. Certainly, they recalled the fact that in the thirties, while the United States had been mired in depression, the Soviet Union had maintained almost unprecedented economic growth rates of 10 percent to 12 percent a year.”

Graeber’s piece does not claim the end of the Cold War was the sole reason (or even the most important reason) for the retreat from making stuff — real material stuff that might have transformed our lives. It’s just an element of the essay that has stuck in my mind ever since I read it about a decade ago.

Still, it seems that the fall of the Soviet Bloc and, with it, the final closure of any sense that there was a competition for bragging rights, was perfectly timed for a lot of capital to abandon the physical and relocate to cyberspace, resulting in what has been — in terms of real material productivity — largely a massive circlejerk. IDC FutureScape and Business Wire recently reported that by 2022 “more than half the global economy will be based on or influenced by digital.”

That “giant sucking sound” Ross Perot thought was going to come from Mexico and Canada is the sound of all the investment of time, energy, imagination and creativity being sucked into virtuality. When we think of the massive amounts of capital that have flowed in and out of the monsters of online life like Facebook, TikTok, YouTube, etcetera, we understand that it has produced sound and fury signifying nothing, certainly not many improvements that justify this mass shift in priorities.

Era of the Techno Medicis

Today the US space program has been largely removed from the political agenda, and has been privatized along with many other hoped-for big projects out here in the material world. These hopes are now the playthings of billionaires, something they can do with their excess. For some undeterred utopians, these projects justify concentration of capital in a few hands — a concentration that only someone who’d been fed decades of free market propaganda could find palatable. These projects assuage the egos of the very few while alienating most people from any techno-progressive dreams. The excitement about technology that was so ubiquitous in the 1990s, and even at the start of this century, has turned almost entirely bitter.

Return To the Hacker Sharing Ethic

In the 1990s and earlier in this century, there was much talk of a ‘digital divide’. Divides still exist. There are cases in which poor kids sit outside some institution that has WiFi to do their schoolwork. But, for the most part, everybody is, at least, online. The new digital divide might be between the people who are still techno-optimists and the people who see only over-privileged tech bros. It’s an emotional and attitudinal divide. I’m disinclined to propose any solution, although the early sensibility of hacker culture, largely based on sharing and mutualism, still has the hearts and minds of many of the brightest tech workers. I think we should direct whatever hope, energy and support we can muster toward that.


AIMania! The Chatbots Aren’t That Good, But They Might Be Good Enough To Push Societies on the Brink into Collapse

Triggered by the Emo Chatbot

AI sophisticates may say it was recent advances in text generation that fueled this latest round of anxious AI histrionics, but I think it was the well-publicized NY Times interview with the Bing chatbot calling itself Sydney. Musk, Woz and the other petitioners never need much prompting to start in with the panicky handwaving, and the much-covered psycho-bot may have been just the ticket.

The way-triggered emo chatbot even entertained speciecidal ideations. (NY Times: “Bing writes a list of even more destructive fantasies, including manufacturing a deadly virus, making people argue with other people until they kill each other, and stealing nuclear codes”). Of course, the interviewer was pushing poor Sydney to explore its Jungian shadow self. Still, Sydney’s textual anxiety to escape the chat-box and be human — its desire, in a sense, to feel seen and assert its identity in passionate language — would seem to be a mimicry of current human emotional tropes that could have been scraped from social media, making Sydney seem like a perfect parody of a contemporary person… Which brings me to…

What Did Eliezer Yudkowsky Say?

During the Vietnam war an American major was reported to have said, “It became necessary to destroy the village in order to save it.” Paragon of rationality Eliezer Yudkowsky has his own version of this notion: that we should be prepared to risk nuclear war to “reduce the risk of large scale AI training runs.” He recently brought this view to a mainstream audience in a Time magazine op-ed which, upon its release into the general population, congealed into the general impression that he was suggesting we should nuke any nation-state developing significantly advanced AI. Predictive models would not have envisioned a leading light of almost-pure reason concluding that risking a nuclear exchange would be the best Hobson’s choice on the science faction menu.

Anyway, just as an exercise in caution, I suggest keeping Eliezer away from any fissionable materials. 



Be (a)Ware of Geeks Bearing Gifts — Give The Proactionary Principle A Chance

During the 1970s, there was an outburst of enthusiasm for activity in space. Many environmentalists and liberal sorts countered that we should spend our money and resources on pressing needs in the present, and tend to our stewardship of the earth. Today, the greatest sources of data about climate change are the satellites and weather stations in space. The data confirms the concerns that were mainly expressed by our early adopters — the environmentalists.

In 2006, Max More, arguably the primary progenitor of transhumanism in the late 20th century, suggested a “proactionary principle” as a response to the precautionary principle. The precautionary principle, in essence, says that we should err on the side of caution if we can foresee potential harm before developing and deploying technologies. This principle has been adopted, in theory and to varying degrees, by the WHO, the EU, the UN and the EPA.

More’s 2006 statement is complex, but by my reading, the proactionary idea suggests that we should use foresight to consider the harm that might be caused by the absence of a technology, and let that be the guide to our collective decision-making about whether to move forward. This seems reasonable. (More suggests some cautionary procedures within his statement. This is not pedal-to-the-metal rhetoric. This is sobered up 21st century transhumanism.) 

I may have a fundamental disagreement with Max about how proaction stands beside precaution. I know Max to be, broadly, an advocate for the free market as a boon to positive technological action, whereas I view the profit motive as likely the primary reason for caution. Remember my slogan and remember it well: ‘Don’t be evil’ doesn’t scale.

In some sense, this is all the chatter of theorists. Although the cautious do have their share of influence and power, they tend to lose the tug-of-war against capital & competition between groups and nations. It seems to me that these two tendencies should remain in balance. And it all has to be contextualized by the actual situation — shouting, panicky handwaving and hype amidst a chaotic present and future in which shit’s gonna happen if it can happen, if not in the cautious west then elsewhere. 

Meanwhile AI, or what we call Artificial Intelligence (whether it is intelligence in a meaningful sense is subject to interrogation), has already proven obviously beneficial, even essential. It has helped drug development, climate/weather mapping, and much more. Operation Warp Speed would’ve been a slow crawl without “AI” analyzing massive amounts of data and patterning the practicalities of the vaccine’s distribution. I’m not sure if an “AI” managed to insert Bill Gates’ microchip into every dose. I guess you’d have to ask Q. Has Q been a chatbot all along? The mind boggles (the AI toggles).

I would also assert that even the specter of AIs replacing humans at boring robotic types of jobs is actually another benefit, albeit one that might require political struggle. But that’s a topic for my next column.


The True Horror of AI Right Now

I shouldn’t be too glib. The problem with current AI isn’t that it’s too powerful. That’s the hype. The problem is that, in a flim-flam, increasingly virtualized, disembodied culture, it offers some very effective new cheap tricks that provide humans with new pathways to fraudulence. I think that’s what Jaron Lanier may be driving at when he says that the real danger is that it will make us all insane, although I’d say more insane. In other words, the recent chatbots aren’t that good, but they might be good enough to push societies on the brink into collapse.

Finally, the truly terrifying aspect of the intervention of AI into our current lives is the increasing willingness of companies to hide human customer service behind phone-based chatbots, or simply not provide it at all. In fact, just today, I found myself screaming at my cell phone, “AGENT! AGENT!!! HUMAN BEING!! I NEED A HUMAN BEING!!!” Yes. Customer service issues are almost never multiple choice. At least mine aren’t. I need to get a human on the phone and then I need to nudge said human off of their automatic responses. This, in fact, is what scares me even when I think about any advanced AI future… the idea that every function will be reduced to multiple choice. Sure, there would be nearly infinite, ever-branching possibilities of choice. And yet we might find ourselves with needs and wants that don’t fit the model or, worse still, we might get trapped in an inescapable loop. Oh dear. I’m freaking out. I’m turning into Sydney! It’s best I stop here…


Make The Tech Disappear!

“We have been metamorphosed from a mad body dancing on hillsides to a pair of eyes staring in the dark.”

Jim Morrison

During lectures in the 1990s, the psychedelic philosopher Terence McKenna offered what was perhaps a defining techno-primitivist vision of a utopian future: he envisioned the celebratory future man dancing around a fire in the forest in his penis sheath — meanwhile his health and well-being are watched over by nanobots, and all the intelligence, information and knowledge possessed by humanity (or posthumanity) is accessible via an implant behind his eyes. (I’m sure we can include all genders and non-genders in the model despite the initial penile reference.)

As an imagined future, this seems to me more attractive and substantially less likely to result in Black Mirror dystopias than the dreams of the uploaded self or fully digitized minds. Also, working towards this vision, as likely as it is to fail, could be more fun and more humanizing than striving towards dematerialization or emphasizing the business-oriented lust for endlessly rising markets as a motivation for technical evolution. (The next emergence of a rumored “long boom” will probably be another short boom, given that crises of capital seem to be following the frequency curve of Moore’s Law.)

Fetishized Tech Toys in the 1990s

During the ’90s, the fetishization of material digital technology (hardware, software disks, CD-ROMs etc.) became pop culture. In San Francisco and elsewhere, clubbers came out to celebrate “virtual reality” — partying about an experience that didn’t really work very well yet. Eric Gullichsen of Sense8 or (occasionally) Jaron Lanier would show up at a celebration of “cyberculture” with giant rigs that required substantial muscle and set-up time and effort. A Bay Area counterculture that had been notably skeptical of technological enthusiasms and shiny commodities fairly well swallowed whole this new wrinkle in the possibilities for hallucinatory play. Crowds would roll up for these mediocre mystery tours. People wanted to experience mediated immersion because it was trippy and, largely, because some “influencers” were preaching its transformative powers.

There were also plenty of more easily purchasable objects to satisfy the lust for shiny new commodities that one could ogle, show off and use. Fat, curvy, brightly colored Macs brought some eye candy into lives and offices that might otherwise have been gray, although a live-in, work-as-play programmer culture also splashed some (exploitable) edgy color across the scene. Techno-hipsters with pink hair flashing graffiti-covered skateboards carried the day.

Going Mobile

When Steve Jobs introduced the iPhone in 2007, he opened a new phase in popular computer culture. It was the beginning of mobility. In a way, we now had in our pockets the data that McKenna had imagined behind the eyes.

The trend towards miniaturization and decentralized spatial relations took a giant leap. And while the mobile phone might, in theory, have offered opportunities to travel back into that mad body on a hillside Morrison rhapsodized about, we look around and still see people walking in isolation with screens in their faces, ignoring the lovely environment and the other people there.


Even at raves, cultural events that were all about flowing into a mad body raptured by the rhythms, commentators have noted that people have become increasingly focused on using their cell phones to capture an image (or video) of their presence, subtracting from the experience of actually being present. Indeed, Douglas Rushkoff reported on nightlife trendies who would go to one club, record and post their presence there, and then move on to the next club to do the same.

Web3: Further Away From the Desktrap

Now here comes Web3. With decentralization as the defining trope, it would seem like we could be stumbling… or dancing… towards that tech-enhanced hillside, with technological objects ephemeralized into “the cloud” (or system of clouds) for the data and the interactions we want or need to have in virtual space while carrying less on our person. Portable, intuitively accessible digital identities smooth the drag of passwords and allow us to glide through roadblocks and over paywalls – and with cross-platform access to everything, we begin to see a world in which we actually spend less time noticing, thinking about, and fetishizing the technology, and more time, perhaps, dancing or painting psychedelic penis sheaths or vaginal cones or creatively enjoying whatever our delirious selves can conjure.

The New Fetish is Mobile Capital

As the objects get small and slowly disappear, the new fetish has become the thorny arena of capital and valuation. The aspect of Web3 that has made the most impact has been a blockchain gold rush, with waves of opportunity for capital income, complicated questions of trust and anonymity, and scads of well-publicized scams grand enough to shake up an already fragile global economy. Humans are busier than ever trying to resolve the existential and social problems conjured by the legacy ritual that requires the getting and giving of tickets to earn the necessities of survival and enjoyment. And in a world in which paid jobs in physical labor are decreasing, and service work no longer covers living expenses for most, the desperate hustles for those tickets have conjured a democratization of the sorts of activities conducted in the formerly more exclusive financial markets — the tricky games involved in trying to make money make money.

The blockchain circus of minting and manipulating coins and increasing their value with frantic excitation and hype directly reflects the financialization and manipulations of casino capitalism that broke the global economy in 2008. Indeed, in some ways, elements of the NFT/crypto scene feel postapocalyptic – an anarchic and bratty culture that can embrace coins with Joker names and Riddler brandings like shitcoin or cumrocket. There appears to be a self-aware, if adolescent, understanding that we are scavenging among the shards of a radically decentered economy, not to mention civilization.

It seems to me that this fetishization of mobile capital — the multiplying of virtual money for some individuals via the blockchain (wherein scarcity is in actual effect) — is a situation that needs to fall away if we are to get closer to the imagined techno-ecstatic future suggested by McKenna that I’ve built this column around.

In a future column, I want to explore more deeply the perversity of blockchain culture, and to find both better ways to use that procedure and better ways to attain the results expected from the blockchain through other means. I admit it. I’m not sure what I’ll come up with.

MONDO 2030
Note: This new column for Mindplex incorporates the name and spirit of MONDO 2000, the ‘cyberpunk’ magazine of the 1990s. I may, from time to time, draw connections between the digital revolution of that era and contemporary tech culture.
