The Social Dilemma
Film writer Vickie Curtis ’07 explains how the technology that connects us might destroy us.
The tech industry insiders who invented infinite scrolling, the Facebook “Like” button and so many of the other elements that help make social media so addictive didn’t set out to radically alter the fabric of society. But they did.
Now, Vickie Curtis ’07 is helping to warn the world. She is one of three writers of The Social Dilemma, a wildly popular Netflix docudrama that explores the dangerous human impact of social networking, as told by the very people who created the platforms.
“What they were realizing at the time we were filming wasn’t that there were quirks to the thing they made that needed fixing; it was that they had helped create these three billion–tentacled monsters that are actually shifting the course of human history,” she said.
Curtis joined Austin Jenkins ’95, an Olympia, Washington–based political reporter with the Northwest News Network, for a virtual conversation in November about the process of crafting the film, which premiered at the 2020 Sundance Film Festival and was released on Netflix in September. More than 400 alumni, parents, faculty, staff and students took part in the live event, which was hosted by Conn’s Office of Alumni and Parent Engagement.
CC Magazine has edited the conversation for clarity and length.
Austin Jenkins: Can you tell us a little bit about the origin of this film?
Vickie Curtis: In January 2018, the director, Jeff Orlowski, organized a group of folks to meet with [former Google employee] Tristan Harris, who features as sort of a protagonist in the film. A lot of Jeff’s friends were working for these companies in Silicon Valley, and more and more people were coming to him having left Twitter, Google and Facebook, saying, “I have regrets,” or “Things have taken a turn for the worse and I want out.” The more stories he heard, the more he was wondering, “Could this be a film?”
At first the thought was, “Is this a big enough issue? Does this just mean everyone’s addicted to their phone, which we all already know?” There’s no big reveal there. But the more experts we talked to, the more we realized it is so much bigger than [someone] being advertised to, or being addicted to the phone or looking at Facebook too much. It is really an existential threat that is tearing apart the fabric of society.
AJ: When I watched the documentary, I immediately thought of the 1999 film The Insider, which was about a former tobacco industry scientist who was ultimately convinced by 60 Minutes to tell the story of what was really going on inside Big Tobacco. I’m curious whether you think there is a comparison to be made between Big Tobacco of yesterday and Big Tech of today.
VC: There are ways in which I would say yes. It’s an industry where the incentives behind the product are not aligned with the incentives of the people using it. For Big Tobacco, they’re making cigarettes that we now know—and they did know—cause cancer. I haven’t met any cigarette smokers who smoke cigarettes to get cancer. It’s not an outcome they’re looking for. With tech, there’s a similar thing: the product is these platforms that are addictive, that are misinforming us, making us more narcissistic, more anxious, more depressed, and that are tearing apart some of our institutions. For that reason, I would say it’s probably more dangerous than tobacco, because it isn’t just having effects on individuals; it’s having effects on larger institutions as well.
AJ: There are all these light bulb or “aha” moments when you’re watching the film, and one is that social media, and tech in general, is the only industry outside of the drug industry that talks about its customers as “users.” I was also struck by the line, “If you’re not paying for the product, you are the product.”
VC: A great question for us at the beginning of the filmmaking process was, “If these companies are worth hundreds of billions of dollars, why? We’re not paying them, so who is paying them?” Advertisers. Advertisers are paying them to target their advertisements to people who will be the most susceptible to that advertisement.
To figure out who’s most susceptible, Facebook and Google create an avatar of you, which is a collection of up to 29,000 data points about what kind of person you are. They’re monitoring things like how fast you scroll, how you move your mouse, how long it takes you to absorb an article, which things you’ve clicked on in the past, and what time of day you click on certain kinds of information. Google Maps and Google phones send all of your real-life habits, where you physically go, back to Google headquarters.
They have a ton of data on who you are as a person, and then they can group you and say, “Okay, there are 29 other people just like Austin in his neighborhood, and they all are doing this, so why don’t we advertise that to Austin, too? We know he’s likely to be susceptible to that thing.”
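[Editor’s note: For readers who want a concrete picture of the profiling Curtis describes, here is a minimal, purely illustrative sketch in Python. The Profile fields, the similarity measure and the 0.95 threshold are all invented for this example; a real platform tracks thousands of signals (the film cites up to 29,000) and uses far more sophisticated models.]

```python
import math
from dataclasses import dataclass

@dataclass
class Profile:
    """A toy behavioral profile. Real platforms track thousands of
    signals like these; the film cites up to 29,000 data points."""
    user_id: str
    scroll_speed: float      # average scroll speed, e.g. pixels per second
    dwell_time: float        # average seconds spent absorbing an article
    late_night_share: float  # fraction of activity after midnight

def similarity(a: Profile, b: Profile) -> float:
    """Cosine similarity between two users' behavior vectors."""
    va = (a.scroll_speed, a.dwell_time, a.late_night_share)
    vb = (b.scroll_speed, b.dwell_time, b.late_night_share)
    dot = sum(x * y for x, y in zip(va, vb))
    norms = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(x * x for x in vb))
    return dot / norms if norms else 0.0

def lookalikes(target: Profile, population: list[Profile],
               threshold: float = 0.95) -> list[Profile]:
    """Return users who behave like the target: the 'people just like
    Austin' whom an advertiser can be sold access to."""
    return [p for p in population
            if p.user_id != target.user_id
            and similarity(target, p) >= threshold]
```

The point of the sketch is the grouping step: once behavior has been quantified, finding “the 29 other people just like Austin” is a simple nearest-neighbor query over those data points.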
And on one level, people are like, “Oh, well, this just means the advertisements I see are relevant to me, so, great, this is a pair of shoes I want to buy.” But the algorithm isn’t perfect, nor is it trying to figure out what shoes you want to buy. It’s trying to figure out how to keep you on the platform longer, what gets you hooked. It preys on our fears, doubts and insecurities, so it’s going to show us more outrageous, salacious, fearmongering information in order to keep us there so that we will see more ads, click on more ads. That makes data this really powerful tool for shifting and manipulating people’s beliefs and behaviors.
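[Editor’s note: The objective Curtis describes, keeping you on the platform, can likewise be caricatured in a few lines. Every name and weight below is invented, not any platform’s actual code; the point is what the score rewards, time and reaction, and what it omits, accuracy and well-being.]

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_watch_seconds: float  # model's guess at how long you'll linger
    predicted_click_prob: float     # model's guess that you'll click or share
    outrage_score: float            # emotionally charged content rates high

def engagement_score(post: Post) -> float:
    # The weights are arbitrary. What matters is what is absent:
    # no term for accuracy, truthfulness or the user's well-being.
    return (0.5 * post.predicted_watch_seconds
            + 30.0 * post.predicted_click_prob
            + 10.0 * post.outrage_score)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order the feed purely by predicted engagement, highest first."""
    return sorted(candidates, key=engagement_score, reverse=True)
```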
AJ: I read during the Cambridge Analytica scandal that they could get enough data points to eventually know more about you than you know about yourself.
VC: Absolutely, because a lot of it is subconscious. Like, I don’t know how fast I scroll; I don’t know what my mouse-click patterns are; I don’t know what my personality profile is based on those habits of mine. They have a whole understanding of your personality profile based on the information that they’ve collected on you, and that determines which particular conspiracy theory to show you next.