Rob Wiblin & Tristan Harris, 80,000 Hours Podcast with Rob Wiblin

#88 – Tristan Harris on the need to change the incentives of social media companies

03 Dec 2020 • 155 min • EN

In its first 28 days on Netflix, the documentary The Social Dilemma — about the possible harms being caused by social media and other technology products — was seen by 38 million households in about 190 countries and in 30 languages.

Over the last ten years, the idea that Facebook, Twitter, and YouTube are degrading political discourse and grabbing and monetizing our attention in an alarming way has gone mainstream to such an extent that it's hard to remember how recently it was a fringe view.

It feels intuitively true that our attention spans are shortening, we're spending more time alone, we're less productive, there's more polarization and radicalization, and that we have less trust in our fellow citizens, due to having less of a shared basis of reality.

But while it all feels plausible, how strong is the evidence that it's true?

In the past, people have worried about every new technological development — often in ways that seem foolish in retrospect. Socrates famously feared that being able to write things down would ruin our memory. At the same time, historians think that the printing press probably generated religious wars across Europe, and that the radio helped Hitler and Stalin maintain power by giving them and them alone the ability to spread propaganda across the whole of Germany and the USSR. Fears about new technologies aren't always misguided.

Tristan Harris, leader of the Center for Humane Technology and co-host of the Your Undivided Attention podcast, is arguably the most prominent person working on reducing the harms of social media, and he was happy to engage with Rob's good-faith critiques.

• Links to learn more, summary and full transcript.
• FYI, the 2020 Effective Altruism Survey is closing soon: https://www.surveymonkey.co.uk/r/EAS80K2

Tristan and Rob provide a thorough exploration of the merits of possible concrete solutions – something The Social Dilemma didn't really address.

Given that these companies are mostly trying to design their products in the way that makes them the most money, how can we get that incentive to align with what's in our interests as users and citizens?

One way is to encourage a shift to a subscription model. One claim in The Social Dilemma is that the machine learning algorithms on these sites try to shift what you believe and what you enjoy in order to make it easier to predict what content recommendations will keep you on the site. But if you paid a yearly fee to Facebook in lieu of seeing ads, their incentive would shift towards making you as satisfied as possible with their service — even if that meant using it for five minutes a day rather than 50.

Despite all the negatives, Tristan doesn't want us to abandon the technologies he's concerned about. He asks us to imagine a social media environment designed to regularly bring our attention back to what each of us can do to improve our lives and the world.

Just as we can focus on the positives of nuclear power while remaining vigilant about the threat of nuclear weapons, we could embrace social media and recommendation algorithms as the largest mass-coordination engine we've ever had — tools that could educate and organise people better than anything that has come before. The tricky and open question is how to get there.

Rob and Tristan also discuss:
• Justified concerns vs. moral panics
• The effect of social media on politics in the US and developing countries
• Tips for individuals

Get this episode by subscribing: type 80,000 Hours into your podcasting app.

Producer: Keiran Harris.
Audio mastering: Ben Cordell. Transcriptions: Sofia Davis-Fogel.

From "80,000 Hours Podcast with Rob Wiblin"
