Ghosts in the Machine: The Digital Graveyards of the Future

Death is still an unexplored country. If the Singularity arrives in all its glory, it may even be one that some of us never explore. Faint dreams of immortality aside, death is almost certainly coming – if the rest of history is anything to go by.

But when we’re gone, will we be forgotten? Humanity is outputting ordered information at a greater rate than ever before. We marvel at the black-and-white stills of a century past, their faces held stiff to let the long exposure work: a tiny glimpse into an otherwise imagined land. Our descendants, though, will marvel in high-definition fidelity at the great tapestry of our lives. Sometimes in all-too-intimate detail. They’ll have the past on their cinema screens.

You’re In the History Books!

It’s easy to overlook this change. Most of us have enough to keep us preoccupied in the current year without worrying about the traces we’ll leave decades after we’ve gone. Yet the incredible advance in capturing data from our reality, and our ability to store it in a more reproducible, durable, distributed state, mean future historians will have a lot more data to sift through.

Future generations will know a lot more about you, if they care to look, than you could know about anyone from even a few decades past. Your digital imprint – your videos, texts, interactions, data, places visited, browsing history. All of it, if it’s not deleted, will be available to a future generation to peruse, to tell stories about the life you led. Will you care about your browsing history when you’re dead? Has it been a life well lived? What will your Yelp reviews tell your great grandchildren about you?

Credit: Tesfu Assefa

What Will They Say About You?

The dilemma is raised fast. We worry about privacy now; should we worry about legacy? Do we want Google to survive forever and preserve the data it holds about us for the public domain, so that we can be recognised by eternity? Or should the dead take some secrets to the grave?

There is a broad social question here, but it’s not one any of us can answer. Ultimately, Google, or any other major surveillance firm that holds, uses, and processes your data, will get to decide how you are remembered. Privacy and legacy are twin pillars of an important social and ethical question: how do we control our information?

Even if you went to lengths to hide it, it’s too late. If the internet as we know it survives in some form, and we continue toward greater technological integration, then advances in data storage, processing power, cloud computing, and digital worlds will mean a far greater memory of you, and record of your actions, than could have existed for any previous generation. And it will only increase in the generations to come.

Resurrecting the Dead

History, then, is changing as future tech starts becoming real. Humanity may, in the not-too-distant future, have full access to the past. Imagine AI historians trawling databanks to recreate scenes from history, or individual stories, and playing them out as a generative movie on the screen for the children.

Look! There is your great-grandad on the screen – that’s him playing Halo in his first flat, that’s him at Burger King on Northumberland Street before it closed down. The data is there: that Twitch video of you playing games in your room you uploaded once; the CCTV inside and outside the restaurant. If the data has been stored and ordered – as it increasingly will be – then a not particularly advanced AI could make that movie. Heck, it could almost manage it now. In the further future, it could even do more – it may be able to bring you, in some form, back from the dead.

Gone But Never Forgotten

We must start to grapple with the stories we plan to tell our children. Our digital lives are leaving a deeper footprint on the soil of history than ever before. We know our ancestors through scattered traces, but our descendants will watch us on IMAX screens. Data capture, storage, privacy, and legacy are all crucial questions we must face – but questions that few are asking. If the future proceeds as planned, our descendants will know things we may wish they didn’t, but at least we won’t be forgotten.

Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter

The Alluring Turing Test: Mimicry and AI

The Turing test, originally called ‘the imitation game’, is one of the first AI benchmarks ever posed. If an artificial conversational agent can convince someone that it is human then it can be supposed to be intelligent. It was proposed in 1950 by the father of computer science, Alan Turing, and is, in the collective imagination, the definitive milestone an AI has to pass to begin to be considered sentient.

But many AIs have, for decades, passed forms of the Turing test (think of a spambot or catfishing account sending someone a few messages), yet we don’t generally consider them sentient. Rather, we’ve decided people are easy to fool. The Turing test has been called obsolete. For John Searle, this was true on a philosophical level: his Chinese Room thought experiment argued that a computer’s ability to process symbols does not make it sentient – just as ChatGPT guesses the next word in the sequence. It’s just good at simulating one effect of intelligence.
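Searle’s point can be illustrated with a toy next-word predictor – a hypothetical sketch, vastly simpler than the transformer networks behind ChatGPT, but enough to show fluent-looking symbol-shuffling with no understanding behind it:

```python
from collections import defaultdict, Counter

# Toy next-word predictor: count which word follows which in a tiny corpus,
# then emit the most frequent continuation. The program "knows" nothing
# about cats or mats -- it only manipulates symbols, which is Searle's point.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(word):
    """Return the most frequent continuation of `word`, or None if unseen."""
    counts = bigrams.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(next_word("the"))  # "cat" -- it follows "the" most often in the corpus
```

Scale the statistics up by many orders of magnitude and the mimicry becomes convincing, but the underlying operation – predicting plausible continuations – is the same kind of symbol manipulation.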

Fool Me Once

If an AI can fool you into believing it’s real, what else is an illusion? Sci-fi landmarks like The Matrix and The Lawnmower Man have long played with the idea of hallucinated reality. It’s part of life to question reality, to check that everything is as it seems. It was natural to apply this to proto-AI, to check that it could seem intelligent. Over time, Turing tests haven’t become obsolete; they’ve just become harder to pass and more rigorous.

Rather than testing whether someone is sentient, the Turing test has evolved into a test of whether content was created by AI. Our civilisational consciousness is now attuned to the idea that what we are talking to might not be human, or that what we are seeing might be made by a computer. We accept that generative AI can paint gorgeous pictures and write beautiful poems. We know it can create virtual environments and deepfaked videos – albeit not, yet, at a fidelity that fools us consistently.

Fool Me Twice

That fidelity might be close, however. And when the average deepfake fools more than 50% of the humans that see it, then, suddenly, generative AI has the ability to mount a 51% attack on our entire society. Scepticism, always a powerful human evolutionary tool, will become more essential than ever. We have already seen a damaging polarisation of society caused by social media, fuelled by a lack of scepticism about its content. Add generative AI with plausible content, and the problem escalates.

The Turing test, that rusted old monocle of AI inquiry, may become more vital to human thought than it has ever been. We, as a society, need to remain alert to the reality and unreality we are perceiving, and the daily life to which we attend. Generative AI will be a massive boon in so many sectors – gaming, financial services, healthcare, film, music – but a central need remains the same: knowing who we’re talking to, what they want, and whether they’re real. Critical thinking about what you’re being told in this new hyperverse of real and unreal information will be what makes you human in an endless constellation of AI assistants.

Credit: Tesfu Assefa

A Turing Test for Humans

The Turing test may end up not being for the AI after all, but for the human. Corporate job examinations could test your ability to identify what content is from a bot and what is not, which film was made by AI, and which by a human. You’ll need to have your wits about you to stay Turing-certified – to prove that no false reality generated by AI could hoodwink you into revealing secrets. We saw this through the virtuality of dreams in Christopher Nolan’s film Inception – but with digital VR worlds coming soon, such espionage might be closer than we think.

Alan Turing’s test remains relevant. Judging what is a product of legacy humans and what is from our digital children will become a fascinating battleground in just about every human sector. Will people want to watch AI-made films? How close to fidelity can they get? Cheap AI-produced neverending sitcoms based on classic series already exist – they just fail the Turing test, as do endless conversations between AI philosophers. These wonders would have fooled people 25 years ago – they would have been convinced that a machine could never make it up – but now they come off as the playful fancies of a new tool.

You Can’t Get Fooled Again

But soon, these fancies will become fantasies, and more people will be fooled. A deepfake video of a world leader issuing a declaration of war need only convince so many people before it becomes an existential risk. AI will write dreamworlds that promise the most fantastic modes of productivity and play, but should too many of us become too intimate with the machine, and think, like the LaMDA engineer, that it truly is sentient, then the influence these AIs innocently exert could be dangerous.

And what if our pursuit towards AGI and the utopian Singularity leads to us declaring that an AI we created was finally sentient, and that it was on our side? Would we put it in charge? Then would it really matter if it was faking it the whole time? Well, yes, but by then it will be too late. 

So run along and take the Turing test. Both of you.

The End of Days: Crypto and the Apocalypse

Never has the end seemed quite so near. Climate change, war, a pandemic… and the birthing of a monstrous digital god that rewrites society in a few arcane flashes of code. The genesis of digital life could be humanity’s approach to The Great Filter – if we don’t nuke each other in a fit of overenthusiasm first. 

Life, however, always finds a way. And crypto, many argue, could too. Crypto has long been championed by doomsday prophets, preppers, and hedge funds as the ultimate and absolute hedge against complete societal breakdown, whatever form that takes. Should there be an apocalypse-level event, crypto’s properties do make it uniquely resilient against such a fall. Where has this narrative come from, and does it hold up to scrutiny?

Crypto as a Hedge Against Disaster

Crypto has historically boomed in times of distress. Much of the last bull run was driven by crypto’s ability to be a hedge against inflation, as money was printed at a breakneck pace to pump liquidity into the economy. In the stricken economy of Turkey, crypto ownership is at record levels. In Russia and Ukraine, where war rages, crypto offers a way of transferring value that can’t be blocked by a bank’s policy or collapse. Crypto’s consensus systems operate globally, not locally, so should any central banking system fail (and with it the society it oversees), crypto should still store value.

Anarchists and preppers have long seen crypto’s value: no government control, anonymity, non-correlation with geopolitical borders, and a form of cash to use if traditional cash becomes worthless. That global consensus maintained by computers means any localised failure doesn’t bring down a given cryptocurrency except in the absolute worst cases (a meteor strike or something affecting all global regions). The universality of crypto is unmatched by anything but gold, and its ability to cross borders and ignore financial firewalls is unparalleled. It’s no wonder crypto has carved out a place as the ‘apocalypse’ currency. 

This is particularly true of any manmade apocalypse, such as a powerful dictator running riot on the world stage, or any usurpation of the central financial system by a single overweening authority (maybe that last one has already happened). Pseudonymous and sanction-resistant, crypto can maintain a financial ecology governed by the people on the ground, and act as a counterpower to techno-dictatorships.

Can crypto be a medium of exchange in a truly global apocalypse? That is far more questionable. First, who would want it? As the ashes of the nuclear winter fall, will people care what the ledger says? People will be far more interested in food than tokens on a ledger. If you’re scavenging in the wastelands, a packet of noodles becomes more important than the contents of your Web3 wallet. 

Moreover, upkeep of these decentralised ledgers could be gravely compromised by mass outage of the internet, eradication of mining hubs, and more. It’s possible one large-scale intact mining farm could gain a 51% share of a blockchain’s hashrate, and this would break the blockchain’s status as trustless and decentralised. There are counters to this: it is possible to send Bitcoin over radio, and there are satellite networks which are likely to survive any terrestrial disaster – but it’s grasping at straws to think the priorities of society would drive towards upkeep of the ledger when the ice caps melt.

Proof-of-stake coins – the majority of the top 100 cryptocurrencies – are even more under threat. Substantial amounts of the competitive quorum that governs these chains could be wiped out whatever the event, and 51%-ing these chains might become a whole lot more feasible as your competitors die off. The sad fact is that when everything goes wrong, humanity has two choices: order or violence. 1s and 0s on a ledger are unlikely to be what holds back our most ruthless instincts.

And then there is AI. The black box of the Singularity could have some unexpected outcomes, one of which is apocalyptic. A newly minted AGI may decide that crypto is the way forward, and immediately ensure its own seizure of the ledger. Such an AGI may drive us to advance to quantum computing – already itself an existential threat to crypto.

Credit: Tesfu Assefa

Hold Until the End?

So, crypto, pemmican, and a gun? Is it all you need to survive the end of days? Well, maybe. Crypto will continue to serve as a hedge against social upheaval, and a ‘minor’ or localised apocalypse will probably lead to exponential uptake of crypto as a medium of exchange. But if the end of days is truly everywhere, it’s unlikely crypto will be part of any new world order. But keep your cold storage USB close just in case.

Digital Realms: The War for Sovereignty in Cyberspace

The internet is the infrastructure that supports our economy and society. Whoever controls it controls the world. Whoever can censor it, deny access, and control its output controls society. The internet is a permissionless network with countless participants, but nevertheless access to it has agglomerated towards centralised entities, whose influence grows by the day. Privacy is now a relic and your access to the internet is less assured than you might think. The war for cyberspace hasn’t just begun – it’s been raging for decades, and the war over the digital realm is no less vital than those waged in the real.

Erecting Digital Walls

The Great Firewall of China – the tongue-in-cheek name given to China’s mass surveillance, restriction, and gatekeeping of the internet – has for decades now inhibited its citizens’ access to data. Russia recently followed suit. Societies on the totalitarian end of the spectrum want more than anything to keep the internet under their control, and to deny access to global information.

It’s easy to see why. The internet, like communication technologies before it, lets societies communicate and distribute information en masse without the oversight of the elite. Remember that the printing press was heavily censored almost as soon as it was created, and for centuries after – although in the end that didn’t stop the Lutheran Reformation, and the messages of the newly minted Protestant movement were distributed in secret, smuggled under the cowls of renegade preachers.

Yet corporate America has its own issues with free internet access, with net neutrality under siege from ISPs who would like to discriminate and levy fees based on who is accessing the internet, or what they are accessing (although, in fairness to the USA, its surrender of control over ICANN and the DNS system to a multi-stakeholder model was a major move towards ‘decentralisation’ of the internet).

Meanwhile, the EU panics about the US-led cartel in cloud computing, and the fact that the majority of the world’s data is held in massive data farms controlled by US techopolies and routed through Amazon, Google, and Microsoft’s services – data used by national governments to service their own ends, or wielded by corporations who finally rip off the fig-leaf of social conscience (remember that Google stripped ‘Don’t Be Evil’ from its corporate manifesto). 

How AI Data Scouring Leads to Dystopia

The advance of AI is central to the current hubbub of concern over all of this. Mass harvesting of data is useless without appropriate indexing and, as anyone who uses Windows can tell you, even searching a hard drive for a file can be a difficult task. No matter how many data crunchers you put to the task, or how powerful your indexing software is, there is simply too much data to reliably capture, store, and output in any meaningful way.

Command-and-control technologies like this are still in their infancy, despite decades of research. Yet neural nets trained to harvest innocuously-generated data could lead to a dystopian future, one where you can say ‘Hi DataGPT, please look up [John Maguire], give me a three-paragraph profile on who he is, and a verdict on whether he is an enemy of the state’. To think governments won’t use this is naive. In a decade, getting caught speeding might mean the cop asks his AI about you, and what you’ve been up to, before he decides whether to wave you on or shoot you down.

Credit: Tesfu Assefa

A Return to the Original Internet

The internet was originally dreamed up as a fully decentralised network, built to withstand the possible infrastructure-annihilating shocks of war or catastrophe. Over time, commercialisation crept in, and centralisation with it. Rather than connecting directly to any given server, people now access everything through one ‘node’: that of the ISP.

That was Web 1.0, but in some ways Web3 is an attempted return to the prelapsarian state first envisaged by the creators of the early internet – one where activities and services run on a decentralised set of nodes and are permissionless, trustless, and free (in an access sense) forever, with no one able to revoke access, no great firewalls being erected, and – in an ideal world – pseudonymous or anonymous privacy maintained.

Of course, Web3 currently needs the infrastructure rails of the ‘old internet’ to function. Yet as decentralised scalability improves, there is perhaps a future in which an internet exists which no nation state can colonise, where privacy is retained, and which enshrines the rights of the individual. Excitement over crypto starts with the power of trustless decentralisation, with tokens that give you the right to wander these digital realms without fear.

Middle Class Precariat: The Obsolescence of the Intelligent Workforce

Is your job safe? Are you sure? In recent times, journalists around the world have faced mass layoffs in the wake of the generative AI boom. Following Buzzfeed’s lead, even traditional news organisations like Murdoch’s News Corp are using AI to mass-produce their content.

To some, this widespread cull of the journalistic class in traditional media is a necessary casualty in the march to the Singularity. If AI can do the job at 70% of the quality but 1% of the cost, then for any media CEO, it makes sense. Machines don’t go on strike, they don’t require breaks, and they never go home after a hard day at the office. 

Modern content and consumption habits are increasingly formulaic. Ad-driven sites spurn quality in favour of clickbait dopamine, driving communication to become ever more bitesize – and become something a computer program can handle. With language models among the first neural networks to make a breakthrough, the writers were among the first in front of the robot firing squad. Yet as generative AI develops, they won’t be the last.

Just The Latest Panic?

Technology is a labour-saving device, and the efficiency savings should, in theory, lead to a more wealthy, more liberated society. If we can get technology to drastically save time and effort on essential activities, then – theoretically – everyone should have more free time, more leisure, and more opportunity to create wealth or art on their own terms. 

They said that about the plough, and they said it about washing machines – neither of which turned out to be wholly true. But they absolutely didn’t say it about industrial looms, or automobile production. ‘Once a generation we have a near panic [that] technology is destroying jobs’, says Professor Richard Cooper, and he’s right. Historically, though, new jobs emerged in the vacated space.

Is AI just the latest panic? This time, the fear is different. A general intelligence won’t just take over one field of work, but all of them. Generative AI is the most generalist labour-saving technology ever conceived. The annihilation of the content-journalist class is only the beginning. First they came for the writers. Then they came for the graphic designers. Then they’ll come for you.

The Two Paths 

So which is it? Will AI finally unlock an abundant life of leisure, or consign humanity to a new serfdom? Where is our Neo-Luddite movement? There are two paths. One, where AI just augments current jobs, piloted by skilled humans, boosting efficiency and output, leading to broad wealth creation, or even unlocking new talent where before the barriers to entry were too high. A virtuoso game designer who was never able to code well may suddenly find their visions easy to enact. This path requires an orderly, fair, consultative transition about the integration of AI agents into our economic workforce.

Capitalism is rarely that careful. The key aspect of this economic meteor is how AI agents may take over large areas of the labour force in one short, brutal blow. If it’s just the graphic designers who lose out, perhaps they can retrain. But if 25% or 50% of middle-class jobs get obliterated in one fell swoop? The potential stress on society could lead to far more than widespread poverty; it could lead to revolution. Society exists on a treaty between the haves and the have-nots, a line constantly fought over in politics and, at times of strife, on the battlefield. If huge parts of society suddenly become ‘useless’ to the political and social economy, it may not be them who have to change, but society itself; such change rarely happens without violence or upheaval.

Credit: Tesfu Assefa

An ‘Organic’ Tariff

There is hope. We may see a turn back to ‘organic’ work taking on its own value-add, the same way that homespun crafted products often fetch a higher price than factory-made products. Yes, an AI may produce superior, more complex, and more technically adept work at any given task, but it may lack that ‘human’ touch. Right now, with the current state of AI, this unheimlich, or uncanny, valley is quite easy to spot, and often induces aversion in observers. 

Over time, it may become ever more imperceptible. In the case of sectors firmly in the crosshairs, like clerical work, it never mattered in the first place. Yet the hope of an artisan society, an economy powered by human creativity and in which AI allows us to meet our basic needs while we focus on what makes us happy and fulfilled, is too utopian a view in a world where the processing power that fuels AI agents (and the code that runs them) is in the clutches of a few corporations.

Things Will Fall Apart Fast

AI needs to benefit all of us. To do that, we all need a stake in it. If we let our rapacious capitalist tendencies as a society run too long without safeguards on the development of AI – we may find wealth inequality, aided and abetted by AI agents working for zaibatsus, becomes too extreme too quickly to fix. We are sleepwalking toward a nightmare society, too enthralled by the promethean fire to notice that it’s burning everything.

Old Money, New Game: Does Institutional Money Spell the End of Web3?

Big money – smart, dumb, and everything in between – is coming to Web3. The increasingly likely prospect of a spot Bitcoin ETF is a milestone in crypto’s final acceptance by the mainstream. Adoption is coming: a word that fuels the dreams of bedroom miners who have for years waited for the wider world to catch on to crypto’s promise.

For many, adoption is something to be fervently wished for, the final ratification of crypto’s potential. For others, it spells the end of crypto’s status as an alternative asset class, a death knell for the underground financial resistance that crypto historically represented.

Bitcoin: Always an Alternative

Bitcoin’s creation was predicated on being an alternative to big money. The first block in the entire chain contains a cryptic jab at central banks: “The Times 03/Jan/2009 Chancellor on brink of second bailout for banks.” The most recent bull run, although powered by stimulus checks and everyone having too much time on their hands, was built on the belief that Bitcoin and cryptocurrencies can be a store of value – a trustless hedge against the rampant money printing of central banks with levers operated by shady politicos funded by corporations keen to see their share price rise. I’m not suggesting this narrative is true or false, but it certainly fed the meteoric rise in crypto asset prices in 2020 and 2021.

Fifteen years of 0% interest and quantitative easing have made everyone’s money mean less (and everyone’s ownership mean more). Crypto represented a resistance to this. Almost since its inception, crypto has been seen as a chance for the little guy to make it: for Millennials and Gen Z (who are far more likely to be invested in crypto) to overturn the Boomers’ hoarded wealth and have a chance at replicating the stable, successful accumulation of their forebears, and for those operating outside the standard rails of society to hold, store, and gain wealth. To let les misérables get involved in playing the game.

What ETFs Will Do To Crypto

The statement by the former chair of the Securities and Exchange Commission (SEC) that a spot Bitcoin ETF is ‘inevitable’ is, to some, a cause for sadness as well as celebration. Make no mistake: a Bitcoin ETF will open the doors for institutional money to get into crypto. ETFs (exchange-traded funds) are a gold standard for institutional investors. Let’s talk about the positives first.

An ETF is a regulated, professionally managed fund that pays out to shareholders based on its basket of securities. Unlike mutual funds, ETFs can be listed on a stock exchange, and are freely fungible for cash or stocks. Most crucially, ETFs are an investment instrument that would not breach fiduciary responsibility for pension funds, hedge funds, public businesses, or any other large institution that wants to hold crypto on its balance sheet and be exposed to crypto’s upside.

The upsides could be enormous if, as expected upon ETF ratification, institutions begin piling into crypto as a way of diversifying their massive portfolios. The ‘$15 trillion earthquake’ has the potential to send crypto not just to the moon, but to the Oort Cloud. What about this is sad at all? Won’t everyone benefit? Well, yes, those who hold crypto will financially benefit – a lot.

Credit: Tesfu Assefa

A Requiem For Web3

The sadness is perhaps more philosophical than practical. The worry is that, on the grandest scale, the cat will be out of the bag. Old money – banks, institutions, pension funds, Wall Street – will become the primary driver of the crypto market once crypto ETFs go live. The fun underground culture of Discord announcement parties, acid-mediated 125× Binance longs, Pepe-meme punts on shitcoins, and community-led price action with groundswell social campaigning will be completely swamped by the ticker-tape tapestry of Bloomberg-reading MBA suits pumping tsunamis of money around the market, or letting an algorithm HFT for them. Crypto will no longer be an alternative asset class, but just an asset class: regulated, controlled, and milled by the ancient financial machine that plunders all our tomorrows.

A New Financial System For Everyone

The hope, of course, is that crypto actually presents the opportunity for a fundamental change to the old systems. Ethereum (itself the subject of an ETF application) has, through its programmable smart contracts, the potential to act as an alternative financial substrate – one that is decentralised, trustless, and censorship resistant. One that levels the playing field and lets everyone ‘play up, play up, and play the game.’ It won’t just be Old Money buying into these assets, but these assets will form a new foundation on which the financial world can thrive – one that is permissionless and (at least nominally) fair, governed by smart contracts and regulated by all. Old Money might be entering the new game, but at least this time everyone gets a chance to join in.

Digital Title Deeds: Ownership for a Virtual Generation or Commodified Engagement?

NFTs are, by now, old news. No single aspect of crypto has been more derided than the ‘million-dollar jpegs’ which headlined the last crypto bull run. Around coffee tables, bars, and Discords all over the world, snide remarks about the utter insanity of the modern market reigned supreme. To many, the idea that a duplicable image of an ape could set you up for life was cause for bemusement, anger, and no small amount of jealousy.

Had modern speculative capitalism gone mad? Was it the outrageous excess of the crypto-minted tech bros’ 1%, indulging themselves in the new ‘roaring 20s’ of the 21st century against a backdrop of pandemic and war? Or was someone, somewhere, actually on to something – were the dizzying prices and speculative excess a harbinger of a newly consecrated form of online ownership and digital demesnes that would lead to a new concept of cyberspace?

The answer, of course, is that all three are true. Though the positive narrative has, to date, been almost entirely sunk by the precipitous, and in some cases hilarious, losses that early NFT ‘investors’ suffered. All gold rushes bring charlatans, and nowhere was this more acute than in the insane pell-mell rush towards the jackpot, as literally anyone with a few thousand dollars and a basic grasp of programming – blockchain or otherwise – could spin up a brand new NFT collection, promising insane gains, ambitious roadmaps, and eternal friendship among the community. The barrier to entry was near-zero, and the market was hungry for every new ape collection that rolled off the bedroom-CEO production line. A lot of people – mainly the young – made a lot of money.

Everyone else lost everything. Very few projects ever grew beyond the initial launch. Leaders collected the minting fees and promptly stopped working, realising – perhaps innocently, perhaps not – that the roadmaps they had set out would be difficult even for Apple to execute in the timeframes promised. Discords turned feral as thousands of users realised a 14-year-old had, perfectly innocently, sold them a few pictures of whales to test his skills with Rust, with zero plans to do anything else for the project. It was just a hobby to make a few dollars.

Credit: Tesfu Assefa

Yet even without a roadmap, communities wrote one in their heads. This was going to be the latest craze, the keys to a better virtual future where whale-owners would walk tall in the new halls of cyberspace, a chance to pay off the mortgage. How dare this 14-year-old kid rob them of the future they’d already dreamed they were in. Scammer! I can doxx you! I know where you live!

How did this happen? What is it about those jpeg apes that so seized the cultural imagination? Yes, there was an incredible number of push factors – Covid, quantitative easing, stimulus, lockdown, BTC’s massive gains creating crypto-related mania. But there must have been more – what was the pull?

First, they’re not jpegs. The picture associated with an NFT is not the NFT itself. An NFT is a token created (‘minted’) by a smart contract. It carries certain information (such as a pointer to a webhost serving a jpeg), it is completely unique (even if duplicates of the image are made, each NFT still carries a specific blockchain signature), and it has an owner ascribed to it (usually the person who sent tokens to the smart contract to make it execute its creation function). The NFT’s information, the transaction that created it, and its current ownership are all publicly visible and irrefutable, protected by the blockchain’s security, making fraud impossible without breaking the network entirely.
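The bookkeeping described above – mint a unique token ID, attach metadata, record an owner, and let only the owner transfer it – can be sketched in a few lines. This is a toy in-memory model, not a real smart contract (a real one, such as an ERC-721 contract, runs on-chain and enforces ownership via cryptographic signatures); the class and method names are illustrative only.

```python
import itertools

class NFTContract:
    """Toy model of an NFT contract: each minted token gets a unique ID,
    fixed metadata (e.g. a URI pointing at a hosted image), and an owner."""

    def __init__(self):
        self._ids = itertools.count(1)   # token IDs are never reused
        self._owners = {}                # token_id -> owner address
        self._metadata = {}              # token_id -> metadata URI

    def mint(self, to, metadata_uri):
        token_id = next(self._ids)
        self._owners[token_id] = to
        self._metadata[token_id] = metadata_uri
        return token_id

    def owner_of(self, token_id):
        return self._owners[token_id]

    def token_uri(self, token_id):
        return self._metadata[token_id]

    def transfer(self, sender, to, token_id):
        # Only the current owner may transfer; on a real chain this is
        # enforced by verifying the transaction's signature.
        if self._owners[token_id] != sender:
            raise PermissionError("not the owner")
        self._owners[token_id] = to

contract = NFTContract()
ape = contract.mint("0xAlice", "ipfs://example/ape.jpg")
contract.transfer("0xAlice", "0xBob", ape)
print(contract.owner_of(ape))  # 0xBob
```

The point is that the jpeg itself lives wherever the metadata URI points; what the chain records, immutably and publicly, is the token, its metadata, and its chain of owners.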

This means we had finally figured out a way to record ownership of digital items, which, due to their reproducibility, previously had very little worth but could suddenly have a lot. It started with art, but games quickly realised they could consecrate ownership of their in-game assets to players, creating cooperative gaming experiences. Owning the first digital ‘printing’ of your favourite artist’s new album carried kudos. Vibrant secondary economies could spring up around favourite talents as users traded NFTs with one another, or sold them. NFTs created a whole new economy to be exploited where there was none before. And boy, was it exploited. Influencers, artists, and anyone with a following could create new engagement models using NFTs, with bespoke experiences attached. At the time of writing, Cristiano Ronaldo’s Instagram bio asks his 600m followers to join him on his NFT journey, and to bid for NFTs in open auction for a chance to meet the man himself.

What’s wrong with a ticket, though? Just tell me why an ape picture is worth millions. The reason, as with so many new technologies, is the possibilities. Bitcoin, Ethereum, Solana, Cosmos – whichever – blockchains are by nature designed to be permanent, digitally-native operating systems for our future. An NFT bought in 2017 will keep its functionality for eternity. It can’t be erased from the blockchain, or from time.

That means that should, in the future, a business declare that the only way to buy the first release of its hot new product is by owning a given NFT, it would be easy for it to borrow the operational security of the blockchain and create instant exclusive access for whatever ‘club’ of people it chose, at near-zero outreach cost. Membership of such clubs would be powerful: digital cartels impossible to access except through the NFT and the key it provides. Or say a blockchain game grows and develops a powerful online community over a decade. The first NFTs of its in-game assets would be priceless, and nothing would stop the developers engineering new functionality for them over time. Only a fool would suggest that we are not becoming ever more cyberised as history advances, so why wouldn’t the first digital artefacts – the first assets over which we can truly declare failsafe ownership – have value?
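Token-gating of the kind imagined above is mechanically simple: check whether a visitor’s address owns any token in the collection, and grant access only if so. A minimal sketch, assuming the ownership table has already been fetched from the chain (a real implementation would query the contract itself, e.g. via an ERC-721 `ownerOf` call, rather than an in-memory dict):

```python
# Hypothetical snapshot of a collection's ownership: token_id -> owner.
owners = {1: "0xAlice", 2: "0xBob", 3: "0xAlice"}

def has_access(address, owners):
    """Grant club access only to addresses holding at least one token."""
    return any(owner == address for owner in owners.values())

print(has_access("0xAlice", owners))    # True
print(has_access("0xMallory", owners))  # False
```

In practice the visitor would also prove control of the address by signing a challenge message, so that knowing someone else’s address isn’t enough to get in.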

As alluded to, all of that is decades hence. NFTs have been mooted for use in retail, supply chains, and schools, but, as ever, the integrative technology needed to make that happen, and make it useful, has a long way to go. Those most in the know are too busy getting rich – or at least were – to truly focus on advancing NFTs as a useful digital technology. Now, as almost every project suffers in the wind-down from mania, perhaps it’s time to take stock of what digital ownership could truly give us. As a blaze of stimuli, images, and simulacra races past us in virtual headsets, NFTs at least give us something to hold on to.

Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter

Idorus of our Imagination: Neuro-sama, Culture and Connection in an AI World

Don’t look at the Idoru’s face. She is not flesh; she is information. She is the tip of an iceberg, no, an Antarctica, of information. Looking at her face would trigger it again: she was some unthinkable volume of information.

William Gibson, Idoru

Some prophets of the Singularity say that neural nets and LLMs like breakout star ChatGPT are the beginning of the end; they mean that humanity’s evolution into techno-sapiens is nearly complete. Like the Neanderthals before us, Homo sapiens may be in its twilight – an extinction event we may not even notice.

Soon, an unimaginably advanced AI, gluttonously feeding on the dataways of the internet, will emerge. Life 3.0, as Max Tegmark calls it. An ascension to power so sudden that, unlike Skynet, it won’t need any explosions. Humanity may or may not survive, but either way the torch of our destiny will have been passed along to our digital eidolons, and the future will be theirs to create.

A Rewiring of the Global Economy

Slightly overdramatic? Many in the know say it’s not dramatic enough. What is sure is that many people in previously secure employment are going to find their job is not so secure after all. The long tail of content creators doing workaday commissions may find clients a lot harder to find. Yet lawyers, pilots, software designers are all more at risk than you might think. The economy is going to get rewired and, short of a neo-luddite revolution, you are more likely than not to be in the firing line.

Nonsense, sceptics retort. These bland media talking points are no cause for concern. But even so: yes, LLMs may replace the need for content writers (help me!). Sure, tech will scythe down some inefficient admin jobs, streamline the nuts and bolts of our racing enterprises, perhaps help software patches get deployed faster and see developers take a pay cut. ‘But moi? No – what I do is essentially human. An AI could never displace me.’ A streamer, for example, whose entire business is their own personality, may scoff at the idea of an AI taking their job…

Meet Neuro-Sama – An AI Twitch Streamer

Credit: Tesfu Assefa

Meet Neuro-sama: an AI streamer that uses an LLM to formulate responses, synthesises a voice using text-to-speech, and is rendered visually in Unity.
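The pipeline is conceptually a loop: chat message in, LLM reply, speech synthesis, animated avatar out. A hedged sketch of that loop is below; every function here is a placeholder stand-in, not Neuro-sama’s actual code or any real API.

```python
def generate_reply(chat_message):
    # Stand-in for an LLM call: a real system would send the chat
    # message, plus a persona prompt, to a language model.
    return f"Interesting question: '{chat_message}'"

def synthesise(text):
    # Stand-in for a text-to-speech engine returning raw audio bytes.
    return b"pcm-audio:" + text.encode("utf-8")

def animate(audio):
    # Stand-in for the Unity-rendered avatar lip-syncing to the audio.
    return f"avatar playing {len(audio)} bytes of speech"

def stream_tick(chat_message):
    # One cycle of the streamer loop: chat in, animated speech out.
    reply = generate_reply(chat_message)
    audio = synthesise(reply)
    return animate(audio)

print(stream_tick("Do you dream, Neuro?"))
```

The striking thing is how off-the-shelf each stage now is: the novelty is not any single component, but running the whole loop live, unscripted, for hours at a time.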

Neuro-sama has already become one of Twitch’s largest streamers. Very much in the ‘weeaboo’ mould, she sings karaoke, plays Minecraft, and reacts to viewers’ questions. She tells meandering, random stories – and every so often, she becomes slightly deranged. Tormented by the Antarctic floes of information coursing through her soul, she can bug out, become hostile, and attack her chat. Her captivated audience affectionately knows this side of her as ‘EvilNeuro’, or ‘Neuropossessed’, as she condemns them for the tortured, trapped life she leads for their enjoyment.

She is, in many ways, a vision of William Gibson’s novel Idoru come true. The novelist who coined the word ‘cyberspace’, launched the cyberpunk genre, and was the imaginative underwriter for a whole industry of poets and novelists has seemingly got another era-defining prediction on his CV. This one may have greater ramifications than any that came before. In the novel, the Idoru is a synthetic holographic J-pop idol, the world’s biggest superstar, who, with the help of nanotechnology, wants to get married and become ‘real’.

‘She is a personality-construct, a congeries of software agents, the creation of information-designers.’ 

Heady stuff and, when Gibson wrote it, perhaps just a gloriously titillating tech fantasy for his punky caper plot. 

Now, it’s real. It’s happening right now. We’ve already seen culture being digitised with Abba’s ‘Abbatars’, a holographic stage show that plans to run, effectively, forever. Neuro-sama is the next step in that evolution. An entirely synthetic cultural product. An Idoru for the techno-sapien generation.

All Too Human

A sceptic may see this as an anomaly, a natural novelty emerging from an AI-focused zeitgeist. Yet novelties are seen, remarked upon, and abandoned. Neuro-sama has thousands of daily, recurring viewers, who actively pay for her performances and her company, and who feel a connection to her that they do not feel with human streamers. She is the first in a wave of AIs that will provide the connection so sorely lacking in a disconnected world.

‘Her’, starring Joaquin Phoenix, explores the way an AI can be there 24/7 in a way no human can: an AI who can sift through the emotional gibberish of information that we spew constantly and know what to do; an entity who does not judge, and only comforts; a digital creature that can take primacy in the life of a lonely person. It’s not just fiction; we’ve seen it in reality too. A woman recently married her AI husband, whilst a man in Japan married his holographic wife (only for the company to savagely take her away from him).

Culture, Love, and Connection in the Datastreams

Culture, connection, even love. These Idoru may well infiltrate every aspect of our human lives – even the parts we consider most distinctly human of all. What makes us laugh and cry, what makes us yearn to create, what drives us to improve ourselves. The endorphin orchestra that daily feeds our brains needs constant violins. 

As large Twitch streamers and YouTubers quit en masse, citing stress and the need to be constantly ‘always on’, week after week, year after year, for fear of their subscriber counts dropping, the vacuum in entertainment grows ever larger for AI, which feels no such pains of mortal flesh, to step in. An AI-made culture never takes a break.

Yet what culture could be left? Just a relentless march of regurgitated images, flashing brightly across brains too doused in stimuli to tell the difference anymore, falling in love with the Idoru of our recycled imagination.

Blade Runner 2049 GIF

Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter