
Episode 2240: Ray Brescia on how our private lives have been politicized by social media
Have our private lives become inevitably political in today’s age of social media? Ray Brescia certainly thinks so. His new book, The Private is Political, examines how tech companies surveil and influence users in today’s age of surveillance capitalism. Brescia argues that private companies collect vast amounts of personal data with fewer restrictions than governments, potentially enabling harassment and manipulation of marginalized groups. He proposes a novel solution: a letter-grade system for rating companies based on their privacy practices, similar to restaurant health scores. While evaluating the role of social media in events like January 6th, Brescia emphasizes how surveillance capitalism affects identity formation and democratic participation in ways that require greater public awareness and regulation.

Here are the 5 KEEN ON takeaways from the conversation with Ray Brescia:

* Brescia argues that surveillance capitalism is now essentially unavoidable - even people who try to stay "off the grid" are likely to be tracked through various digital touchpoints in their daily lives, from store visits to smartphone interactions.

* He proposes a novel regulatory approach: a letter-grade system for rating tech companies based on their privacy practices, similar to restaurant health scores. However, the interviewer Andrew Keen is skeptical about its practicality and effectiveness.

* Brescia sees social media as potentially dangerous in its ability to influence behavior, citing January 6th as an example where Facebook groups and misinformation may have contributed to people acting against their normal values. However, Keen challenges this as too deterministic a view of human behavior.

* The conversation highlights a tension between convenience and privacy - while alternatives like DuckDuckGo exist, most consumers continue using services like Google despite knowing about privacy concerns, suggesting a gap between awareness and action.
* Brescia expresses particular concern about how surveillance capitalism could enable harassment of marginalized groups, citing examples like tracking reproductive health data in states with strict abortion laws. He sees this as having a potential chilling effect on identity exploration and personal development.

The Private is Political: Full Transcript

Interview by Andrew Keen

KEEN: About 6 or 7 years ago, I hosted one of my most popular shows featuring Shoshana Zuboff talking about surveillance capitalism. She wrote "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power"—a book I actually blurbed. Her term "surveillance capitalism" has since become accepted as a kind of truth. Our guest today, Ray Brescia, a distinguished professor of law at Albany Law School, has a new book, "The Private is Political: Identity and Democracy in the Age of Surveillance Capitalism." Ray, you take the age of surveillance capitalism for granted. Is that fair? Is surveillance capitalism just a given in February 2025?

RAY BRESCIA: I think that's right. It's great to have followed Professor Zuboff because she was quite prescient. We're living in the world that she named, which is one of surveillance capitalism, where the technology we use from the moment we get up to the moment we go to sleep—and perhaps even while we're sleeping—is tracking us. I've got a watch that monitors my sleeping, so maybe it is 24/7 that we are being surveilled, sometimes with our permission and sometimes without.

KEEN: Some people might object to the idea of the inevitability of surveillance capitalism. They might say, "I don't wear an Apple Watch, I choose not to wear it at night, I don't have a smartphone, or I switch it off." There's nothing inevitable about the age of surveillance capitalism. How would you respond to that?
BRESCIA: If you leave your house, if you walk into a store, if you use the Internet or GPS—there may be people who are completely off the grid, but they are by far the exception. Even for them, there are still ways to be surveilled. Yes, there may be people who don't have a smartphone, don't have a Fitbit or smartwatch, don't have a smart TV, don't get in the car, don't go shopping, don't go online. But they really are the exception.

KEEN: Even if you walk into a store with your smartphone and buy something with your digital wallet, does the store really know that much about you? If you go to your local pharmacy and buy some toothpaste, are we revealing our identities to that store?

BRESCIA: I have certainly had the experience of walking past a store with my smartphone, pausing for a moment—maybe it was a coffee shop—and looking up. Within minutes, I received an ad pushed to me by that store. Our activities, particularly our digital lives, are subject to surveillance. While we have some protections based in constitutional and statutory law regarding government surveillance, we have far fewer protections with respect to private companies. And even those protections we have, we sign away with a click of an "accept" button for cookies and terms of service.

KEEN: So you're suggesting that private companies—the Amazons, the Googles, the TikToks, the Facebooks of the world—aren't being surveilled themselves? It's only us, the individual, the citizen?

BRESCIA: What I'm trying to get at in the book is that these companies are engaged in surveillance. Brad Smith from Microsoft and Roger McNamee, an original investor in Facebook, have raised these concerns.
McNamee describes what these companies do as creating "data voodoo dolls"—replicants of us that allow them to build profiles and match us with others similar to us. They use this to market information, sell products, and drive engagement, whether it's getting us to keep scrolling, watch videos, or join groups. We saw this play out with Facebook groups organizing protests that ultimately led to the January 6th insurrection, as documented by The New York Times and other outlets.

KEEN: You live up in Hastings-on-Hudson and work in Albany. Given the nature of this book, I can guess your politics. Had you been in Washington, D.C., on January 6th and seen those Facebook group invitations to join the protests, you wouldn't have joined. This data only confirms what we already think. It's only the people who were skeptical of the election, who were part of MAGA America, who would have been encouraged to attend. So why does it matter?

BRESCIA: I don't think that's necessarily the case. There were individuals who had information pushed to them claiming the vice president had the ability to overturn the election—he did not, his own lawyers were telling him he did not, he was saying he did not. But people were convinced he could. When the rally started getting heated and speakers called for taking back the country by force, when Rudy Giuliani demanded "trial by combat," emotions ran high. There are individuals now in jail who are saying, "I don't want a pardon. What I did that day wasn't me." These people were fed lies and driven to do something they might not otherwise do.

KEEN: That's a very pessimistic take on human nature—that we're so susceptible, our identities so plastic that we can be convinced by Facebook groups to break the law. Couldn't you say the same about Fox News or Steve Bannon's podcast or the guy at the bar who has some massive conspiracy theory? At what point must we be responsible for what we do?

BRESCIA: We should always be responsible for what we do.
Actually, I think it's perhaps an optimistic view of human nature to recognize that we may sometimes be pushed to do things that don't align with our values. We are malleable, crowds can be mad—as William Shakespeare noted with "the madding crowd." Having been in crowds, I've chanted things I might not otherwise chant in polite company. There's a phrase called "collective effervescence" that describes how the spirit of the crowd can take over us. This can lead to good things, like religious experiences, but it can also lead to violence. All of this is accelerated with social media. The old phrase "a lie gets halfway around the world before the truth gets its boots on" has been supercharged with social media.

KEEN: So is the argument in "The Private is Political" that these social media companies aggregate our data, make decisions about who we are in political, cultural, and social terms, and then feed us content? Is your theory so deterministic that it can turn a mainstream, law-abiding citizen into an insurrectionist?

BRESCIA: I wouldn't go that far. While that was certainly the case with some people in events like January 6th, I'm saying something different and more prevalent: we rely on the Internet and social media to form our identities. It's easier now than ever before in human history to find people like us, to explore aspects of ourselves—whether it's learning macramé, advocating in the state legislature, or joining a group promoting clean water. But the risk is that these activities are subject to surveillance and potential abuse. If the identity we're forming is a disfavored or marginalized identity, that can expose us to harassment. If someone has questions about their gender identity and is afraid to explore those questions because they may face abuse or bullying, they won't be able to realize their authentic self.

KEEN: What do you mean by harassment and abuse? This argument exists both on the left and right. J.D.
Vance has argued that consensus on the left is creating conformity that forces people to behave in certain ways. You get the same arguments on the left. How does it actually work?

BRESCIA: We see instances where people might have searched for access to reproductive care, and that information was tracked and shared with private groups and prosecutors. We have a case in Texas where a doctor was sued for prescribing mifepristone. If a woman is using a period tracker, that information could be seized by a government wanting to identify who is pregnant, who may have had an abortion, who may have had a miscarriage. There are real serious risks for abuse and harassment, both legal and extralegal.

KEEN: We had Margaret Atwood on the show a few years ago. Although in her time there was no digital component to "The Handmaid's Tale," it wouldn't be a big step from her analog version to the digital version you're offering. Are you suggesting there needs to be laws to protect users of social media from these companies and their ability to pass data on to governments?

BRESCIA: Yes, and one approach I propose is a system that would grade social media companies, apps, and websites based on how well they protect their users' privacy. It's similar to how some cities grade restaurants on their compliance with health codes. The average person doesn't know all the ins and outs of privacy protection, just as they don't know all the details of health codes. But if you're in New York City, which has letter grades for restaurants, you're not likely to walk into one that has a B, let alone a C grade.

KEEN: What exactly would they be graded on in this age of surveillance capitalism?

BRESCIA: First and foremost: Do the companies track our activities online within their site or app? Do they sell our data to brokers? Do they retain that data? Do they use algorithms to push information to us?
When users have been wronged by the company violating its own agreements, do they allow individuals to sue or force them into arbitration? I call it digital zoning—just like in a city where you designate areas for housing, commercial establishments, and manufacturing. Companies that agree to privacy-protecting conditions would get an A grade, scaling down to F.

KEEN: The world is not a law school where companies get graded. Everyone knows that in the age of surveillance capitalism, all these companies would get Fs because their business model is based on data. This sounds entirely unrealistic. Is this just a polemical exercise, or are you serious?

BRESCIA: I'm dead serious. And I don't think it's the heavy hand of the state. In fact, it's quite the opposite—it's a menu that companies can choose from. Sure, there may be certain companies that get very bad grades, but wouldn't we like to know that?

KEEN: Who would get the good grades? We know Facebook and Google would get bad grades. Are there social media platforms that would avoid the F grades?

BRESCIA: Apple is one that does less of this. Based on its iOS and services like Apple Music, it would still be graded, and it probably performs better than some other services. Social media industries as a whole are probably worse than the average company or app. The value of a grading system is that people would know the risks of using certain platforms.

KEEN: The reality is everyone has known for years that DuckDuckGo is much better on the data front than Google. Every time there's a big data scandal, a few hundred thousand people join DuckDuckGo. But most people still use Google because it's a better search engine. People aren't bothered. They don't care.

BRESCIA: That may be the case. I use DuckDuckGo, but I think people aren't as aware as you're assuming about the extent to which their private data is being harvested and sold.
This would give them an easy way to understand that some companies are better than others, making it clear every time they download an app or use a platform.

KEEN: Let's use the example of Facebook. In 2016, the Cambridge Analytica scandal blew up. Everyone knew what Facebook was doing. And yet Facebook in 2025 is, if anything, stronger than it's ever been. So people clearly just don't care.

BRESCIA: I don't know that they don't care. There are a lot of things to worry about in the world right now. Brad Smith called Cambridge Analytica "privacy's Three Mile Island."

KEEN: And he was wrong.

BRESCIA: Yes, you're right. Unlike after Three Mile Island, when we clamped down on nuclear power, we did almost nothing to protect consumer privacy. That's something we should be exploring in a more robust fashion.

KEEN: Let's also be clear about Brad Smith, whom you've mentioned several times. As Microsoft's number two person, he's perhaps not the most disinterested observer. Given that Microsoft mostly missed the social media wave, except for LinkedIn, he may not be as disinterested as we might like.

BRESCIA: That may be the case. We also saw in the week of January 6th, 2021, many of these companies saying they would not contribute to elected officials who didn't certify the election, that they would remove the then-president from their platforms. Now we're back in a world where that is not the case.

KEEN: Let me get one thing straight. Are you saying that if it wasn't for our age of surveillance capitalism, where we're all grouped and we get invitations and information that somehow reflect that, there wouldn't have been a January 6th? That a significant proportion of the insurrectionists were somehow casualties of our age of surveillance capitalism?

BRESCIA: That's a great question. I can't say whether there would have been a January 6th if not for social media. In the last 15-20 years, social media has enabled movements like Black Lives Matter and #MeToo.
Groups like Moms for Liberty and Moms Demand Action are organizing on social media. Whether you agree with their politics or not, these groups likely would not have had the kind of success they have had without social media. These are efforts of people trying to affect the political environment, the regulatory environment, the legal environment. I applaud such efforts, even if I don't agree with them. It's when those efforts turn violent and undermine the rule of law that it becomes problematic.

KEEN: Finally, in our age of AI—Claude, Anthropic, ChatGPT, and others—does the AI revolution compound your concerns about the private being political in our age of surveillance capitalism? Is it the problem or the solution?

BRESCIA: There is a real risk that what we see already on social media—bots amplifying messages, creating campaigns—is only going to accelerate. The AI companies—OpenAI, Anthropic, Google, Meta—should absolutely be graded in the same way as social media companies. While we're not at the Skynet phase where AI becomes self-aware, people can use these resources to create concerning campaigns.

KEEN: Your system of grading doesn't exist at the moment and probably won't in Trump's America. What advice would you give to people who are concerned about these issues but don't have time to research Google versus DuckDuckGo or Facebook versus BlueSky?

BRESCIA: There are a few simple things folks can do. Look at the privacy settings on your phone. Use browsers that don't harvest your data. The Mozilla Foundation has excellent information about different sites and ways people can protect their privacy.

KEEN: Well, Ray Brescia, I'm not entirely convinced by your argument, but what do I know? "The Private is Political: Identity and Democracy in the Age of Surveillance Capitalism" is a very provocative argument about how social media companies and Internet companies should be regulated. Thank you so much, and best of luck with the book.
BRESCIA: Thanks, it's been a pleasure to have this conversation.

Ray Brescia is the Associate Dean for Research & Intellectual Life and the Hon. Harold R. Tyler Professor in Law & Technology at Albany Law School. He is the author of Lawyer Nation: The Past, Present, and Future of the American Legal Profession and The Future of Change: How Technology Shapes Social Revolutions, and editor of Crisis Lawyering: Effective Legal Advocacy in Emergency Situations and How Cities Will Save the World: Urban Innovation in the Face of Population Flows, Climate Change, and Economic Inequality.

Named as one of the "100 most connected men" by GQ magazine, Andrew Keen is amongst the world's best known broadcasters and commentators. In addition to presenting the daily KEEN ON show, he is the host of the long-running How To Fix Democracy interview series. He is also the author of four prescient books about digital technology: CULT OF THE AMATEUR, DIGITAL VERTIGO, THE INTERNET IS NOT THE ANSWER and HOW TO FIX THE FUTURE. Andrew lives in San Francisco, is married to Cassandra Knight, Google's VP of Litigation & Discovery, and has two grown children.
From "Keen On America"