ARGUMENT ONE
YOU ARE LOSING YOUR FREE WILL
WELCOME TO THE CAGE THAT GOES EVERYWHERE WITH YOU
Something entirely new is happening in the world. Just in the last five or ten years, nearly everyone started to carry on their person, all the time, a little device called a smartphone that’s suitable for algorithmic behavior modification. A lot of us are also using related devices called smart speakers on our kitchen counters or in our car dashboards. We’re being tracked and measured constantly, and receiving engineered feedback all the time. We’re being hypnotized little by little by technicians we can’t see, for purposes we don’t know. We’re all lab animals now.
Algorithms gorge on data about you, every second. What kinds of links do you click on? What videos do you watch all the way through? How quickly are you moving from one thing to the next? Where are you when you do these things? Who are you connecting with in person and online? What facial expressions do you make? How does your skin tone change in different situations? What were you doing just before you decided to buy something or not? Whether to vote or not?
All these measurements and many others have been matched up with similar readings about the lives of multitudes of other people through massive spying. Algorithms correlate what you do with what almost everyone else has done.
The algorithms don’t really understand you, but there is power in numbers, especially in large numbers. If a lot of other people who like the foods you like were also more easily put off by pictures of a candidate portrayed in a pink border instead of a blue one, then you probably will be too, and no one needs to know why. Statistics are reliable, but only as idiot demons.
Are you sad, lonely, scared? Happy, confident? Getting your period? Experiencing a peak of class anxiety?
So-called advertisers can seize the moment when you are perfectly primed and then influence you with messages that have worked on other people who share traits and situations with you.
I say “so-called” because it’s just not right to call direct manipulation of people advertising. Advertisers used to have a limited chance to make a pitch, and that pitch might have been sneaky or annoying, but it was fleeting. Furthermore, lots of people saw the same TV or print ad; it wasn’t adapted to individuals. The biggest difference was that you weren’t monitored and assessed all the time so that you could be fed dynamically optimized stimuli—whether “content” or ad—to engage and alter you.
Now everyone who is on social media is getting individualized, continuously adjusted stimuli, without a break, so long as they use their smartphones. What might once have been called advertising must now be understood as continuous behavior modification on a titanic scale.
Please don’t be insulted. Yes, I am suggesting that you might be turning, just a little, into a well-trained dog, or something less pleasant, like a lab rat or a robot. That you’re being remote-controlled, just a little, by clients of big corporations. But if I’m right, then becoming aware of it might just free you, so give this a chance, okay?
A scientific movement called behaviorism arose before computers were invented. Behaviorists studied new, more methodical, sterile, and nerdy ways to train animals and humans.
One famous behaviorist was B. F. Skinner. He set up a methodical system, known as a Skinner box, in which caged animals got treats when they did something specific. There wasn’t anyone petting or whispering to the animal, just a purely isolated mechanical action—a new kind of training for modern times. Various behaviorists, who often gave off rather ominous vibes, applied this method to people. Behaviorist strategies often worked, which freaked everyone out, eventually leading to a bunch of creepy “mind control” sci-fi and horror movie scripts.
An unfortunate fact is that you can train someone using behaviorist techniques, and the person doesn’t even know it. Until very recently, this rarely happened unless you signed up to be a test subject in an experiment in the basement of a university’s psychology building. Then you’d go into a room and be tested while someone watched you through a one-way mirror. Even though you knew an experiment was going on, you didn’t realize how you were being manipulated. At least you gave consent to be manipulated in some way. (Well, not always. There were all kinds of cruel experiments performed on prisoners, on poor people, and especially on racial targets.)
This book argues in ten ways that what has become suddenly normal—pervasive surveillance and constant, subtle manipulation—is unethical, cruel, dangerous, and inhumane. Dangerous? Oh, yes, because who knows who’s going to use that power, and for what?
THE MAD SCIENTIST TURNS OUT TO CARE ABOUT THE DOG IN THE CAGE
You may have heard the mournful confessions from the founders of social media empires, which I prefer to call “behavior modification empires.”
Here’s Sean Parker, the first president of Facebook:
We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever.… It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.… The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway … it literally changes your relationship with society, with each other.… It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.1
Here’s Chamath Palihapitiya, former vice president of user growth at Facebook:
The short-term, dopamine-driven feedback loops we’ve created are destroying how society works.… No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem.… I feel tremendous guilt. I think we all knew in the back of our minds—even though we feigned this whole line of, like, there probably aren’t any bad unintended consequences. I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen.… So we are in a really bad state of affairs right now, in my opinion. It is eroding the core foundation of how people behave by and between each other. And I don’t have a good solution. My solution is I just don’t use these tools anymore. I haven’t for years.2
Better late than never. Plenty of critics like me have been warning for a while now that bad stuff was happening, but to hear this from the people who did the stuff is progress.
For years, I had to endure quite painful criticism from friends in Silicon Valley because I was perceived as a traitor for criticizing what we were doing. Lately I have the opposite problem. I argue that Silicon Valley people are for the most part decent, and I ask that we not be villainized; I take a lot of fresh heat for that. Whether I’ve been too hard or too soft on my community is hard to know.
The more important question now is whether anyone’s criticism will matter. It’s undeniably out in the open that a bad technology is doing us harm, but will we—will you, meaning you—be able to resist and help steer the world to a better place?
Companies like Facebook, Google, and Twitter are finally trying to fix some of the massive problems they created, albeit in a piecemeal way. Is it because they are being pressured or because they feel that it’s the right thing to do? Probably a little of both.
The companies are changing policies, hiring humans to monitor what’s going on, and hiring data scientists to come up with algorithms to avoid the worst failings. Facebook’s old mantra was “Move fast and break things,”3 and now they’re coming up with better mantras and picking up a few pieces from a shattered world and gluing them together.
This book will argue that the companies on their own can’t do enough to glue the world back together.
Because people in Silicon Valley are expressing regrets, you might think that now you just need to wait for us to fix the problem. That’s not how things work. If you aren’t part of the solution, there will be no solution.
This first argument will introduce a few key concepts behind the design of addictive and manipulative network services. Awareness is the first step to freedom.
Copyright © 2018 by Jaron Lanier