Can this tech pioneer convince you to delete your social media accounts?
Jaron Lanier says evidence shows addictive design was deliberately used for platforms
Originally published on Oct. 5, 2018.
Jaron Lanier has never created a social media account, but that doesn't make him a technophobic luddite.
In fact, the Silicon Valley pioneer is considered a founding father of virtual reality and worked on the growth of the Internet in the 1990s. He's currently an interdisciplinary scientist at Microsoft.
And he's urging you to delete your social media accounts and free yourself from their toxicity and disastrous effects.
Lanier breaks down his reasoning with evidence in his book Ten Arguments for Deleting Your Social Media Accounts Right Now. He spoke to The Current's Friday host Piya Chattopadhyay.
Here is part of that conversation.
How did someone so steeped in the world of tech end up feeling like social media is fundamentally a destructive force?
The destructive force is not exactly social media. The problem is that behind the scenes, the social media companies — especially Facebook and Google, through YouTube and some of its other services — use algorithms that are designed to get you riled up so you get more and more addicted.
I would like to think that I am smarter than these companies and their ability to lure me in. How are they getting me to be addicted?
The way this works is not through intelligence, it's through relentless statistics. So a company will measure as much information about you as possible. They're ultra information hoarders. They gather every little thing they possibly can about you: how your body moves, your facial expressions, all kinds of stuff. And then they try to put you in a bin with millions of other people who might bear some similarity to you.
So it's not that they're smart, it's not that they have any insight into how to hypnotize you or get you addicted. It's just that statistics work. It's math.
You ask the users — and I'll admit I'm on social media — not to be insulted when you suggest we may be turning into well-trained dogs as a result of social media.
But what about people who feel their eyes are wide open? I'm doing it through a critical lens and I'm not being manipulated. What do you say to me, and others?
For some, and it's a minority, they can say that and be correct. There's some number of people who have a great experience on it. But even Facebook's own scientists report that, on the whole, people get sad or lose quality of life. You know, one of the symptoms of being addicted to something is anhedonia — a loss of joy in life. Overall, statistically, you're more likely to experience that and to not realize why it's happening, just like somebody who's addicted to gambling or some other behavioural addiction.
But is there evidence that people are actually addicted to their social media?
There is a world of research on this and, furthermore, the people who created these systems have publicly stated that they used addictive designs, as has been studied by scientists in the field of behaviourism. Sean Parker, the first president of Facebook, has stated that for the record. So I think we can really say that the design is both deliberately and effectively addictive.
I imagine, given the book that you've just written, that you have cancelled all your social media accounts?
I never got them to begin with. I always thought the manipulation machine in the background was unethical and destructive so I never was on board.
Would people call you a luddite even though you work in the tech world?
No. I suspect I've been involved in more new technologies than almost anyone; there'd be very few people alive who could match my record for that. But the thing is, I strive to be an ethical technologist. I don't think you can be a technologist without also being an ethicist. It's just not right.
What do you mean by that?
Technology gives us the wiggle room to become better toward each other, but it doesn't guarantee it. Ethics can only come from the heart, at the end of the day. So to be a technologist without also being an ethicist is not to be helpful. You don't achieve anything. You might just be sending humanity backwards.
I go on Twitter and ... people are so mean. You argue that social media makes what we say not only meaningless, but that it destroys our capacity for empathy. I want you to connect those two things.
The thing is that early in the history of the science of behaviourism, a famous behaviourist named [Ivan] Pavlov showed that you could get a dog to salivate through the ringing of a bell. What he did is he rang a bell while the dogs were eating, and then they would associate food with the bell, and then the bell itself became the treat. So you can use symbols instead of actual candy as a treat, or electric shock as a punishment.
And so in our social media technologies that are manipulative, we use social pain — which might be rejection, humiliation or just being ignored — as the equivalent of the electric shock from classic behaviourism. Then the occasional time when you're liked or something you do goes viral is the equivalent of the treat; it's like Pavlov's bell for the dogs. And so there's a whole science about how you time these things in order to get people more engaged.
And there's an unfortunate little fact of human psychology: in the real world, negative emotions like anger, paranoia, fear, resentment and jealousy are roughly balanced by positive ones like admiration, affection and trust. In the cyber world, because we're measuring so fast, the algorithms tend to pick up on the emotions that rise the quickest, and the negative emotions — the fight-or-flight emotions — rise really quickly. So the algorithms elicit them more, because they're just economical, they're effective.
This Q & A has been edited for clarity and length. Listen to the full conversation near the top of this page.
Produced by The Current's Alison Masemann.