Screens and Nightmares
Civil rights lawyer and researcher Rashida Richardson discusses how The Social Dilemma resonates in an era of COVID and crisis.
FROM THE “SUR-FAKE” SERIES, 2015
Tech innovation has created complications for our democracy, our mental health, and our society. We sit at a pivotal moment: We’re in the midst of a pandemic that’s furthering the divide between the haves and the have-nots. It’s pulling us further into the filter bubbles created by the very same algorithms that promised to make the world more open and connected. At stake: the sway of misinformation, a restructuring of truth, the future of civil rights. For these reasons, Rashida Richardson finds herself pretty busy lately.

Richardson is a civil rights lawyer who focuses on the intersections of technology, race, and society. She got her start at Facebook in 2011, where she lasted all of three months, realizing that she could have more impact outside of the Silicon Valley bubble. She has since become a thorn in the side of powerful tech companies, challenging lawmakers and executives to pay attention to an often-overlooked aspect of tech: who loses in the race to innovate.

Richardson is currently a visiting scholar at Rutgers Law School, researching how A.I., big data, and data-driven technologies are implemented in society. She previously served as the director of policy research at the A.I. Now Institute, and she worked on privacy and surveillance issues for the A.C.L.U. Most recently, she was featured as an expert in Jeff Orlowski’s documentary The Social Dilemma, about the destructive impacts of social networks on our lives and behavior.

The documentary was released in the months before the global pandemic lockdown, but the current environment has only exacerbated problems arising from tech innovation: government surveillance, facial recognition practices that disproportionately impact minorities, algorithmic bias. For Richardson, calling attention to the unintended consequences of technology has become a civil rights matter. I recently spoke with her about those consequences, and about the growing resonance of The Social Dilemma.
Laurie Segall: You got your start about the time that I got my start covering technology — right out of the Recession. It was such an extraordinary time. You had these companies and these economies that were coming out of scarcity. It was almost like this Wild West.
Rashida Richardson: You’re dealing with innovation at a time when Facebook’s slogan, “Move fast and break things,” wasn’t understood as a problematic slogan. What gets broken, who deals with the burden of the things that are broken, and is “broken” a good thing? I’m abstracting it to a certain degree, but we’re now dealing with the burdens of that inability to see the downside of innovation at all costs. I’ll stop there. That seems dark enough.
Many of the people who created these problems and designed these products are raising red flags. Why now?
RR: There’s been a lot of robust advocacy happening, especially on the state and local levels — whether we’re talking about bans and pushbacks against facial recognition, or even hiring algorithms. In some ways, “why now” is a result of many years of trying to force these issues to greater awareness on the Hill and elsewhere. I hope that policymakers understand that changes need to happen yesterday for us to not go into a nightmare scenario.
Many of us have lived our lives on social media. Our data is out there. Whether or not you like it, you’ve opted in.
RR: People need to think more critically about their actions and data collection. You don’t need a computer to have data collected about you. A credit card is another common way that information is obtained about us. Discount cards at the grocery store are recording everything you purchase. If we had better public education to understand the wide variety of ways that data is collected about us, then people could make better-informed choices about what they want to opt in and opt out of. You’re always trading convenience for something else. People need to question whether that convenience is worth it, and understand what those inherent trade-offs are. Stay engaged and figure out your own perspective.
Could there be any upside with respect to mental health and the gathering of some of that data to help us better understand ourselves?
RR: It’s tough with mental health, but I’ll give you a complicated positive. One thing that predictive analytics or data analytics is very good at is finding patterns in historical data. If you feed a bunch of mammograms into a system, the system can find a pattern and actually spot cancer better than the human eye. I’d see that as a good thing, but it’s complicated if that dataset is not representative of our society. This has been shown in research: That mammogram-scan system may not work well for me, a Black woman, because there’s less data about Black people.
As a civil rights lawyer and someone who’s looked at data, surveillance, and the impact of algorithms your whole career, what would your message be to folks in Silicon Valley who are building the products that impact our lives?
RR: Where do we start? One of the issues is who gets to define the problems that technology is going to solve. Mostly that’s been the Tim Cooks of the world. First, there needs to be more community engagement, trying to integrate different voices in the room around what the problems are and what solutions are being pursued. They also need to hire more people who are not represented at these companies right now.
Watch The Social Dilemma on Netflix now.