Interview with Aza Raskin, Center for Humane Technology, San Francisco, October 2018
Patrick Chappatte: Realigning tech with humanity's best interests. What do you mean by that? Has technology surpassed our capacities?
I think we're just at that point now. With tech, it's like being in an abusive relationship: they gaslight you. On one hand, they use a hundred engineers behind every screen, with metrics and supercomputers, to try to make this thing as addictive as possible. And on the other hand, they tell you it's your fault if you use it too much.
Trump’s election, was that a wake-up call in Silicon Valley?
I worry they mostly see it as a PR problem. I don't think there's a willingness to try to accept responsibility for what these platforms are doing. Something like two-thirds of Americans get their news from social media and YouTube. 70% of views on YouTube come from the recommendation engine. And this becomes like a feeding trough for the human mind. A New York Times op-ed called YouTube "the great radicalizer of our time."
We work with governments, the E.U. and the U.S., on amending the rules that say platforms are not responsible for content that their users post. But freedom of speech is not the same thing as freedom of reach. We want to make companies legally liable again for any content that they amplify to more than 100,000 people.
Zuckerberg and others have started talking about regulation. Is that true? Not a cliché?
It's true. I know some of this stuff because my dad was the guy who did the Macintosh. And so I grew up with many of these other kids.
What rules did your dad give you?
Their idea was: technology should extend human creativity. It should be a bicycle for the mind. Not these things that, as Huxley warned, amuse us to death. Do we program apps, or are we programming people?
Are smartphones the problem because they’re in our pockets?
Yeah, the surface area that touches our bodies has gone way up. Most college kids experience what they call phantom buzzes, where you think your phone is buzzing. It's physically changing our physiology. You know you're addicted when you start checking your phone before you pee in the morning - or while you're doing it. This surface area created the jack into our brains that all this other stuff flows from.
So, the iPhone was Steve Jobs' bad idea?
This plus our engagement-based business models. Our attention is the most valuable resource in the world. The companies that are doing the best in the stock market are all the big ones, Google, Facebook, that are learning to exploit the human as a resource. They're flourishing.
Are we at risk of destroying ourselves?
We’ll soon be able to open up the human skull and make it believe whatever you want it to believe. That's what microtargeting is fantastic at. The new iPhone reads your face in real-time 3-D. Netflix or YouTube might know the exact moment when you get bored. Computers are better at reading microexpressions than humans. Now apply that to an entire society. We can microtarget the perfect political ad because we know exactly when you're happy or sad or sardonic. We can generate an ad that uses a face that you can't help but trust.
The perfect manipulation tools. Is there a way to avoid that?
We've always thought Western liberal democracy is based on the idea that we as humans make sovereign decisions based on our own lived experience. Not true anymore. Look at Google's AI voice that sounds just like a human. Imagine all the ways this will be abused. In the next political campaign, say in 2020, they'll just spin up an entirely virtual call center, the equivalent of a hundred thousand people, and it's going to be testing all of the different scripts. You're going to get a call, it's going to sound a little bit like your dad, and it will have the perfect script for just you. This might be the way we do influence in the future. It's terrifying.
Do we need a new humanism?
Yeah. We think of it as a new field of design, or of technology. It's why we called it "Humane". We want this technology to be protective of human vulnerability, sensitive to our frailties, and to extend the best parts of us. We call it, like, "technology-society interaction design" or "human-centered design". In our age, when what a programmer makes scales up to 100 million users in a couple of months, every piece of code we write is inherently political.
The dark side of the Force won in technology?
Um, no. I mean, look, open source won…
Really? Don't you think it's social media?
It's a different dark side, yes. Facebook has lust, pride, envy, gluttony; it has it all. They found that teens, when they're depressed, buy more cosmetics. So the AI targets especially teen women and just starts shoving beauty products at them exactly when they're feeling most vulnerable. Each one of us has these kinds of vulnerabilities.
Isn’t there a little paradox? The personal computer pioneers had the idealistic goal of giving "Power to the people". Haven’t social media accomplished that?
These are not spaces for us to express ourselves for our own purpose. Platforms are not good, they're not bad, but they're not neutral. They always tilt the flow of human behavior in some way. Twenty years ago, fame was not on the top 10 list of things that people wanted in their life. Today it's number two. You have to ask this question: are we our best selves when we're using social media?
Do you imagine something really bad happening?
The next infopocalypse?... Look at climate change: we know we're heading someplace really bad, but unless people feel it, they don't change. That's humans. So I think it's gonna have to get worse before it gets better. But on the other hand, there is an awakening. Almost everyone I know in Silicon Valley has heard of the Center for Humane Technology. People want to go into work feeling like they're doing something good for the world. The people on the inside: that, I think, is the fastest way we can change Silicon Valley.