The Perilous Power Of Social Media Platforms
Social media platforms have immense power: to shut down voices and amplify what we see. But is that singular power perilous to democracy?
On the digital century’s impact on democracy
Shoshana Zuboff: “What is this fight for the soul of our information civilization? It’s a fight about whether or not our digital century is going to be compatible with democracy. And let me be even more forceful than that, not only compatible with democracy, but will our digital century be a place where democracy can thrive, where we can thrive, where our ideas and our civilization can thrive?
“This is what is under assault right now from the tech companies who practice an economic logic that I’ve called surveillance capitalism. … There’s that old saying ‘believing is seeing.’ For the past 20 years or so, we’ve been believing that the digital was going to be the golden age of democracy.
“And that’s what we were seeing, because we believed it. And it’s because it’s what we were told. While we were believing, these companies have engineered what I call an epistemic coup. Epistemic means not just knowledge, but how we know. How we can know things in the world.
“And what they’ve done is they have declared our personal information their private property, so they own all the information. They have all the rights to the information based on their ownership. They know more than we do. The gap between what we can know and what can be known about us is growing exponentially every moment. So there’s new extreme inequality.”
On AI on digital platforms
Shoshana Zuboff: “The AI is a model, and these AIs are engineered. And so what I’ve been trying to help folks understand is that there is a very specific institutional logic in YouTube, in Google, in Facebook, in all of the surveillance capitalist platforms. And this institutional logic is an economic logic.
“That’s what’s called surveillance capitalism. And how does it produce revenue? It produces revenue by extracting massive amounts of human-generated data. Massive scale. What does that mean? We know from Facebook’s own documents, its AI backbone ingests trillions of data points every day and produces six million predictions of human behavior every second.
“That’s the kind of scale we’re talking about. These companies are extractors. They can’t target, they can’t recommend, they can’t use subliminal cues to get us to look at things or do things or join things. They can’t use engineered social comparison dynamics.
“They can’t use any of their targeting mechanisms unless they have so much data that they know who we are. They know our personality profile, our sexual orientation, our political orientation, our emotional proclivities and so forth. So we’re talking about massive scale extraction.”
On the Google algorithm and how it extracts information
Shoshana Zuboff: “These extraction processes are geared to one thing: volume at scale. They are indifferent to meaning. They are indifferent to our considerations. Is it true or is it false? That’s what journalism is based on. That is the raison d’être of the fourth estate that was part of the originating context of our democracy: to protect truth and to distinguish between truth and falsehood. We no longer have that.
“The platforms extract. Let me read you one tiny little comment from Eric Schmidt, former CEO of Google. Obviously, this is critical to the YouTube situation. In 2017, Google was criticized for spreading disinformation and misinformation just like this.
“And he’s defending the algorithmic operations. And what he says to the journalist is ‘there is a line we really can’t cross. It is very difficult for us to understand truth.’ There is a line we cannot cross. It is very difficult for us to understand truth. So here we have an economic machine that is extracting everything, and truth doesn’t even figure anywhere on the radar as a metric. It is indifferent to truth. Truth versus lies: that’s invisible.
“So this means that the largest data company, whose mission was to organize and make accessible the world’s information, cannot tell the difference between truth and lies. This failing does not impede their success. On the contrary, it’s essential to their success. Without it, they don’t get the scale. Without it, they don’t get the targeting. Without it, they don’t get their trillion dollar ad markets.”
On how to defeat an ‘epistemic coup’
Ramesh Srinivasan: “I think it’s really important for us before diving into critique of any technology platform, though that is essential, to think about what kind of society we want to live in. … If we want to live in a democracy, what are the fundamental building blocks upon which that democracy rests? And one of the key components of that is the idea and the vision of living in a society where we have a certain kind of baseline of what is evidence and what are facts.
“And the ability to voice different perspectives and opinions and be aware of one another’s perspectives, have the opportunity to have debate and dialogue. I mean, our principles of democracy rest and rely upon that. You know, I used to live in Boston, right next to the Boston Common, and we all remember what the Boston Common was all about. The town square, that civic life, is something we need to restore and humanize in technology.”
And is that possible if the sense of commonality that you’re talking about seems to be ebbing away due to these very technologies and their pernicious effects?
Ramesh Srinivasan: “What we have occurring is the privatization of every aspect of public life, and that’s not just true when it comes to technology. But even if we want to look at technology itself, we can understand that these things rely upon private corporate-driven value systems, which are about profitability and equity valuation, about just expanding and expanding what you’re worth in this bizarro stock market that we have right now, as we saw with GameStop last week.
“So what we need to do to address these issues is get back to the basics: a deeper understanding and analysis, guided by public interest values, of what the Internet and new technologies should look like. Because every kind of sector imaginable — taxi companies, accommodation — these are all taken over by private technology corporations whose entire job — their interest and their vision — is making as much money as possible. Never mind the effects.”
On how to take power back from tech companies
Shoshana Zuboff: “None of this discussion is a discussion about technology. Everyone wants to be able to harvest and thrive in the possibilities and promise of digital technology, how it can solve individual needs, how it can solve society’s needs, how it can make our cities better, how it strengthens our democracies and spreads democracy. This is what we all want. Something has gotten in the way.
“And how did this happen? We allowed surveillance capitalism to thrive and flourish for 20 years while democracy was asleep at the switch, as it were. This left a void. And it’s these companies and their economic logic that have filled the void. So they’re making all the decisions, they have all the power, and now we take that back. We’re not putting an end to the digital. We’re freeing the digital from a narrow economic logic that does not use data to actually serve individuals’ needs, society’s needs … that is not aligned with democracy.”
How do we free ourselves from this narrow economic logic?
Shoshana Zuboff: “We want to go right back to the illegitimate foundation of these economics. They took our experience. They surveil our experience. They translated it into data. They claim that data is their private property, and then they use it for their commercial operations. This is fundamentally illegitimate, right at the very base. It’s stealing.
“Ask any eight-year-old. They took something from me. They didn’t ask. Now they say it belongs to them. Ah, that’s called stealing. This whole thing is built on a bed of sand, fundamentally illegitimate. We need to codify the new rights, just as we codified workers’ rights and consumer rights a century ago.
“Rights that say you have no right to my face, you have no right to my photos, you have no right to all of my data. This belongs to me. And as a citizen of a democratic society, I decide what is shared. I decide how it is shared, with whom it is shared and for what purpose. These rights have to come under the governance of law and democratic institutions.
“The rights and laws that take care of us in our daily lives in the real world cannot vaporize at the cyber border. The digital has to move into democracy’s house, right? And we have to find ways for the digital to thrive in democracy’s house. So we have the innovation and we have the solutions for climate, solutions for disease, solutions for our daily lives that we are all desperately looking for.”
What are specific laws that you would change, or new actions, new organizations that you would recommend?
Ramesh Srinivasan: “The Internet was originally funded by United States taxpayers. Of course, that’s not the Internet we experience today, but it is the original architecture upon which every technology platform and company operates, including our mobile applications at this point. So United States taxpayers invested in the Internet.
“Without that Internet, a company like Facebook or Uber or Amazon would not exist and would not have any value, just like the Amazon trucks I see combing my street outside my apartment would be worthless without the roads we paid for. And so what we can’t have is a system, a political economy, where all the economic value goes to an extremely small number of folks who make their own private, self-serving decisions and all of the costs are dumped back onto us, especially with something that we originally paid for, like our roads, like the Internet itself.
“So if I take that logic and try to think about how do I restore humanity to technology, how do I allow certain value systems that we aspire toward to be the guiding lights for where we take technology? We have to think about this in multiple ways. Certainly we need to think about it in the image of democracy that I alluded to earlier, which is the idea of some sort of common baseline of why we see what we see when we go online and the opportunity to understand, reflect and debate with one another.
“But I think when we want to think of what I’ve called for in my last book, “Beyond the Valley: A Digital Bill of Rights,” we have to think not just in terms of democracy as some sort of political concept, but its actual enactment in our country. And so there are a few components to this.
“I think there are a few layers to this, and I think they all interact quite nicely with one another. And I’ve been talking with folks pretty significantly across the political spectrum in Congress about some of these ideas. So first is the personal layer. As human beings, as members of a society, especially of a democracy, it’s important for us to know what is being collected about us.
“What is being collected about us, and by whom? How long do they have that data for? How many of us realize or know that all of our credit card data, for example, can be bought, sold and trafficked, and even sold to shady third parties like Cambridge Analytica, which used such data. … So on the intimate, personal level, we need to understand as human beings what we’re being fed and why and what is being known about us, and have greater agency and power over directing that experience. So that’s the privacy, transparency and accountability level.
“But I think we then have to shift to understanding what will protect our democracy. And so there’s a great amount of attention, and I know WBUR has covered this, on what we call Section 230, which basically shields online tech companies from liability for the content that users publish on their platforms. Obviously, it’s impossible to control all of it. And I’m definitely not for censorship.
“But I think what we need are a certain set of enforceable guidelines that technology companies and platforms must follow in terms of their decisions around what we end up seeing and what counts as appropriate or inappropriate conduct warranting deplatforming.
“They need to be consistent across the board with that. And most importantly, there are the algorithms feeding us content, which we know are optimized to make us all inflamed, addicted and polarized. We need public oversight and accountability of those algorithmic systems, because they are destroying the public interest.”
From The Reading List
New York Times: “The Coup We Are Not Talking About” — “Two decades ago, the American government left democracy’s front door open to California’s fledgling internet companies, a cozy fire lit in welcome. In the years that followed, a surveillance society flourished in those rooms, a social vision born in the distinct but reciprocal needs of public intelligence agencies and private internet companies, both spellbound by a dream of total information awareness.”
Harvard Business Review: “How to Hold Social Media Accountable for Undermining Democracy” — “The storming of the U.S. Capitol Building on Wednesday by a mob of pro-Trump insurrectionists was shocking, but it was not surprising to anyone who has followed the growing prominence of conspiracy theorists, hate groups, and purveyors of disinformation online.”
CBS News: “A protected right? Free speech and social media” — “A decade ago this very month, in Cairo’s Tahrir Square, social media was being praised. Its role as an organizing tool during the pro-democracy rallies had many calling the Arab Spring the ‘Facebook Revolution’ instead.”
New York Times: “Opinion: You Are Now Remotely Controlled” — “The debate on privacy and law at the Federal Trade Commission was unusually heated that day. Tech industry executives ‘argued that they were capable of regulating themselves and that government intervention would be costly and counterproductive.'”
This article was originally published on WBUR.org.
Copyright 2021 NPR. To see more, visit https://www.npr.org.