There is no spoon. The spoon is in your head. We don’t need a computer-generated dream world to tell us that. The world we perceive “out there” exists “in here.” There’s no other way it could be.
Consider: It’s cold, so I’m sitting by the fire, sipping sweet tea, listening to music, petting my kitty, smelling the muffins cooling in the kitchen, gazing out the window at the smiley blue Amazon van. The warmth, the sweetness, the sounds, the soft feel of the fur, the photons of light impinge on my senses as a messy smattering of electrical signals—largely misinformation. The image of the truck in my eyeball, for example, is upside down, obscured by blood vessels, riddled with blind spots and floaters, blurred by my movements. All this is filled in, reframed, color-corrected, resized. The truck doesn’t look like a toy, even though I can easily block it with my thumb.
Ho hum, you say. Even Galileo knew that perception is internal, that qualities such as soft and sweet “can no more be ascribed to the external objects than can the tickling or pain caused sometimes by touching such objects.” But consider how convincingly we project the stuff in our heads out into the world, like a ventriloquist throwing his voice. The lilting harmonies bouncing around inside my skull get flung across the room to the turntable, the smells relocated to the kitchen, the truck displaced to the street.
Artists like to toy with our attachment to reality in part because it’s such fun—and so easy—to do. The late San Francisco artist Bob Miller fashioned an inside-out corner that appears as a box which turns to follow you as you walk by. The fact that this is impossible doesn’t bother your brain in the least. You can poke your finger in the box-that-isn’t-there, yet the illusion persists. (Bob was initially denied a patent on his sculpture because, the patent people argued, it only existed in a person’s head. As if there’s anything that doesn’t.)
The queasiness that comes with confronting such uneasy truths is made palpable in the Matrix movies: Reality is whatever we make it. It’s whatever we collectively agree upon, whether it’s bitcoin, atoms, or the devil.
Science gives us the tools to extend the reach for the “real,” opening doors to vastly broader vistas—visions we often dismiss as unreal. When Galileo saw mountains on the moon and moons around Jupiter, people pooh-poohed them as distortions in his telescope. Science writers have called out astronomers for using “false colors” to visualize objects we can’t perceive directly because they emit signals in otherwise indiscernible x-rays or infrared. As if there were any other way!
Is what you see with your glasses off more real than what you see with them on? If your focus clears, is the image before you fake?
So much of what’s real is indiscernible. Billions of neutrinos rain right through me as I sip my tea. Are they less real than the tea? Is an NFT less real than a dollar? Is your virtual meeting less real than the vivid memory you have of that meeting that never happened?
The world designed by human senses makes “sense,” as we put it. But there’s an argument to be made that the colorless, soundless, impalpable structures of symbols and relationships of science are far more revealing.
The code says more than the woman in red.
Science is too smart to try to define reality. (Or as the physicist Steven Weinberg put it: “When I say something is real, it means I’m giving it a measure of respect.”) For scientific cred, you need multiple lines of evidence. You see the spoon, but can you touch it? Will it make a sound if you bang it on the wall? Will it hang from your nose?
If you’re not sure the bullet is real, would you stand in its path to find out?
It’s not disturbing or strange to make up worlds in our heads. It’s very different when computers do it.
For one thing, computers exaggerate and codify our love of putting things in boxes. Are you type A or type B? Gen X or Gen Z? Do you think or feel? Are you LGBT or Q? Cat person or dog person? Happy or depressed?
My answer to such questions invariably hovers between “all of the above,” “none of the above,” “kinda sorta one of the above,” and “depends on when you ask.” Mostly, it’s some version of: “I don’t understand the question.” And there’s no box for that.
We can hardly blame computers entirely. A Trivial Pursuit question posed to me years ago asked: How many colors are there in the rainbow? It didn’t make any sense. The spectrum is continuous. The number is infinite. Duh. I was wrong, though. Someone had decided the number was seven.
In a similar spirit, the psychologist Paul Ekman decided in the late 1960s that the number of human facial expressions could be counted. He came up with six: anger, happiness, surprise, disgust, sadness, and fear. These supposedly reflected internal emotional states. Though largely discredited, such categorizations are still widely used by AI platforms worldwide.
Boxing creates neat borders, ties loose ends, banishes uncertainty, erases fuzzy edges, cuts out ambiguities. The cat is either alive or it’s dead. No waffling about. No essay questions.
Reality to a computer is based on multiple choice. There’s no other way it could be.
Once you’re stuffed in a box, computers tend to lock you in; your choices narrow. Whether you’re “single” or “looking” makes a difference. Sometimes we’re not even the ones checking the boxes. If an AI tags your expression as “angry,” good luck getting that job. Worse, it’s not just data that gets locked in; the protocols and programs for handling data rigidify. Hardening of the arteries, so to speak. Potentially fatal clots in the flow of information.
All that feeds the dogma of inevitability, which makes machine learning seem forever entrenched, too complex to understand much less regulate, too powerful to refuse.
We should resist that kind of thinking, says Jaron Lanier, a founding father of virtual reality and so much else. “The net didn’t design itself. We design it.” I hope he’s right.
I see my image in the mirror on the way to grab a muffin and am reminded that people sometimes refer to the (my) “KC Cole” face; it makes them laugh for some reason. I wonder what kind of category it fits into. Do I even want to know?
Now comes the really creepy part. When I first started writing about the “future of reality,” it occurred to me that the worlds depicted in The Matrix were already all too real. I didn’t dare say that, because I thought people would dismiss me as your typical technologically challenged old person.
But it turned out I had a lot of esteemed (and younger) company. Hardly Luddites, those pushing us to take the red pills these days are mostly visionaries like Lanier. A decade ago, he wrote the classic book You Are Not a Gadget; increasingly, he argues, you are. To be specific, you (we) are becoming “computer peripherals attached to the great computing clouds.”
Microsoft’s Kate Crawford, in Atlas of AI, describes the perils of cramming our complex and fluid personal and social realities into “representations of the world made solely for machines.” AI forces the “systemization of the unsystematizable,” reduces depth, kills grace notes, flattens experience and us along with it.
Lanier and Co. believed that the world would be a better place if everyone shared information freely; instead, he describes it as a place where we’re watched all the time, handing over data whether we want to or not. (I wonder if the Amazon driver outside is being watched by some version of a Smith—the better to keep him in line. We all know the answer to that.)
The humongous stores of data scraped off the internet—our faces, habits, health, finances, kids, lovers, favorite actors, vacations, conversations with our Roombas—go to megacomputers that tweak what you see in order to keep you hooked, sell you stuff. It’s a one-way street. We’re transparent to the megaservers, but they’re opaque to us. Distant corporations use the data to change our lives “in unfathomable ways,” Lanier writes. “You never really know what might have been if someone else’s cloud algorithm had come to a different conclusion about your potential as a loan taker, a date, or an employee.”
The movie-version Matrix gets its fuel from human batteries. The giant networks of machines we ironically refer to as “clouds” also feed on humans: those who mine rare minerals, assemble devices, drive trucks, load packages, translate text, label and evaluate objects and faces (often incorrectly; especially if you’re female, dark-skinned, or otherwise other).
It takes many thousands of people to sustain the illusion of fluffy, weightless automation. For these tasks, humans are cheaper than robots.
Attempting to take in this underground reality is as jarring as Neo’s first look at the vast array of baby-powered fuel cells. No one wants to hear about the costs: the enormous carbon footprint of computing, the draining of community water and power supplies, the reliance on taxpayer-funded infrastructure, sewers, gas lines, fiber optics, you name it. There’s a reason these megaservers are hidden away in remote locales.
Too disquieting to imagine, I turn away; HBO is pushing the new Sex and the City at me; it knows who I am. Now even my cat (I’m actually a dog person) is hooked on TV. Who can resist the tempting spread of digital bread and circuses, birds and cake?
But truth be told, my favorite devices are analog: my friends, my cat, the soup, the fire, the Amazon driver. There’s a lot to be said for touching, feeling, sniffing, tasting, hugging, bumping into things. We evolved to use these senses of ours. We have minds, but also bodies (though they do tend to fall apart).
And let’s get real: The human mind is a mess. The reality evolution left us is an accidental collection of spare parts and make-dos. Consciousness is chattery, convoluted, slippery, spotty, fickle.
Intelligent design my arse.
It occurred to me that quantum computers could make even artificial reality at least a little more real. They can handle complexity. The cat can be alive and dead in a quantum computer. Everything is all of the above until it isn’t.
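To make the alive-and-dead-at-once idea concrete, here is a minimal sketch in plain Python (my own toy illustration, not real quantum hardware or any actual quantum SDK): a single qubit holds both outcomes at once as amplitudes, and only a measurement forces one answer.

```python
import random

# A single qubit as two amplitudes: index 0 = "alive", index 1 = "dead".
# Equal amplitudes of 1/sqrt(2) mean the cat is in both states at once.
amp = [2 ** -0.5, 2 ** -0.5]

def measure(amplitudes):
    """Collapse the superposition: pick an outcome with probability |amplitude|^2."""
    p_alive = abs(amplitudes[0]) ** 2
    return "alive" if random.random() < p_alive else "dead"

# Before measurement, both outcomes coexist and their probabilities sum to 1.
probs = [abs(a) ** 2 for a in amp]
print([round(p, 3) for p in probs])

# Repeated measurements split roughly 50/50 between the two answers.
outcomes = [measure(amp) for _ in range(10_000)]
print(outcomes.count("alive") / len(outcomes))
```

Until the `measure` call, the state really is “all of the above”; classical multiple-choice only appears at the moment you look.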
But technology isn’t the problem, the people close to the machines tell me. It’s the power structure, stupid. Too few choices. The spoils all flow to the toll-keepers. We need to claw it back. Make it transparent. Spread the wealth around. Make it by humans and for humans—as was originally intended. Make the internet great again. Build Back Better.
They tell me: The future of reality is what we make it.
In the meantime, I’m tempted to start selling T-shirts that might tap into a new kind of identity politics: This device is analog!
Or maybe: Analog (and only analog) Lives!
Analog and Proud!