There are a lot of regrets coming out of Silicon Valley these days as the dark side of the tech revolution becomes increasingly apparent, from smartphone addiction to the scandal over the misuse of personal information belonging to some 87 million Facebook users.
Facebook Chief Operating Officer Sheryl Sandberg expressed her regrets in an interview last week with NPR. "We know that we did not do enough to protect people's data," Sandberg said. "I'm really sorry for that. Mark [Zuckerberg] is really sorry for that, and what we're doing now is taking really firm action."
Facebook CEO Mark Zuckerberg will have a chance to express his regrets when he testifies in front of Congress later this week.
But the remorse coming out of Silicon Valley isn't limited to high-profile leaders like Sandberg and Zuckerberg. Investors and people who helped build some of the problematic technologies are also shouldering blame; some are even turning their attention to fixing the problems.
When Sandy Parakilas went to work for Facebook in 2011, he says, he deeply believed in its mission of bringing the world closer together and building community. At the time, the Arab Spring was in full bloom and social media companies were getting credit for helping to launch a revolution.
"I was extremely excited about the power of social media to advance democracy all over the world," Parakilas says.
But his optimism would be tempered by the reality of Facebook's hunger for raw data about its users. He didn't like the direction the company was going.
"They have a business model that is going to push them continuously down a road of deceiving people," he says. "It's a surveillance advertising business model."
Parakilas says he tried to warn his managers at Facebook that they were at risk of putting private information into the wrong hands. But the company was growing fast and making money. Its leaders believed connecting people was inherently good.
Many of its earliest investors believed in its mission, too. But now Roger McNamee, who helped mentor Zuckerberg, says he feels bad about what has happened, "because at the end of the day these were my friends. I helped them be successful. I wanted them to be successful."
As part of his penance, McNamee helped found the Center for Humane Technology. The center is trying to "realign technology with humanity's best interests." Parakilas has also joined the effort as an adviser.
While Facebook may be in the headlines now, there is plenty of regret going around Silicon Valley from people who were part of other companies.
Guillaume Chaslot joined Google/YouTube in 2010. He, too, started as a true believer. "We could only make things better if people were more connected," he says. "If everybody could say what he wanted to say, things would naturally get better."
But Chaslot says he noticed the main goal at YouTube wasn't to inform people; it was to keep people watching videos for as long as possible. "This goal has some very bad side effects and I started to notice the side effect as I worked at YouTube," he says.
Among the side effects he noticed: People tended to get only one point of view on a topic — and not always the right one. For example, a search for "moon landing" might bring up videos from conspiracy theorists arguing that NASA faked the whole event.
Chaslot tried to create an algorithm that would show people different points of view. But, he says, his bosses weren't interested.
A spokesperson from the company says it has updated its algorithms since Chaslot left. According to the company, it no longer just tries to keep people on the site for as long as possible; the goal is to measure through surveys how satisfied users are with the time they spend on the site.
Chaslot left in 2013. But he continued to lose sleep over what was happening on YouTube. From the outside, he observed the site fill up with conspiracy theories and divisive content. He privately met with former colleagues and tried to warn them. But nothing began to change until after the presidential election, when news of Russian interference brought more attention to the kinds of videos on YouTube.
Chaslot now says he wishes he'd gone public sooner. "Now that's what I'm doing but with a bit of a delay," he says. He has even started a site to track what kinds of videos surface when you search terms like "Is the Earth flat or round?" and "vaccine facts." The results bring up plenty of factually incorrect conspiracy theories.
Of course, it may be easier for many techies to speak out now — investors have done well and employees were paid well for their work. Still, it's probably good news that the very people who helped create the problem are now using their inside knowledge to fix it.
STEVE INSKEEP, HOST:
Facebook says that starting today it will begin sending a message to roughly 87 million people. These are people whose information may have been shared with Cambridge Analytica - the political data firm - without their consent. Disclosing this information is part of Facebook's damage control as founder Mark Zuckerberg prepares to testify this week before Congress. NPR's Laura Sydell reports there are some feelings of remorse right now in Silicon Valley.
LAURA SYDELL, BYLINE: Many of the people who helped build Facebook started out believing in its mission of building community and bringing the world closer together.
SANDY PARAKILAS: I was extremely excited about the power of social media to advance democracy all over the world.
SYDELL: Sandy Parakilas joined Facebook in the summer of 2011 in the midst of the Arab Spring as journalists around the world watched how social media helped the protesters.
(SOUNDBITE OF ARCHIVED RECORDING)
UNIDENTIFIED REPORTER: The weapons of the activists of the so-called Arab Spring weren't guns and bombs but the Internet and the mobile phone.
SYDELL: But Parakilas' optimism would be tempered by the reality of Facebook's hunger for raw data about its users.
PARAKILAS: And that is going to push them continuously down a road of deceiving people. It's a surveillance-advertising business model.
SYDELL: Parakilas says he tried to warn his managers at Facebook that they were at risk of putting private information in the wrong hands, but the company was growing fast and making money. Its leaders believed connecting people was inherently good. Parakilas says he left Facebook in 2012 feeling doubts about the company. Now that the problems with its massive data collection system have become public, several early investors are also expressing regrets. Roger McNamee says he helped mentor CEO Mark Zuckerberg.
ROGER MCNAMEE: I still feel terrible about it because at the end of the day, these were my friends. I helped them be successful. I wanted them to be successful.
SYDELL: There's plenty of regret these days in Silicon Valley, and more people are coming forward about the negative effect tech is having on society. Guillaume Chaslot says he joined Google-YouTube in 2010. Chaslot came all the way from France. He too started out optimistic.
GUILLAUME CHASLOT: We could only make things better if people were more connected. If everybody could say what he wanted to say, things would naturally get better.
SYDELL: But Chaslot says he noticed the main goal at YouTube wasn't to inform people. It was to keep people watching as long as possible. He spoke to NPR on Skype.
CHASLOT: This goal has some very bad side effects, and I started to notice the side effect as I worked at YouTube.
SYDELL: Among the side effects that Chaslot noticed was that people tended to get one point of view on a topic, and not always the right one. For example, search for moon landing on YouTube, and you might get a video like this.
(SOUNDBITE OF ARCHIVED RECORDING)
UNIDENTIFIED PERSON: This particular lecture looks and analyzes the possibility that man has or has not stepped foot on the moon. This is the NASA-Apollo hoax.
SYDELL: Chaslot tried to create an algorithm that would show people different points of view, but he says his bosses weren't interested. A spokesperson from the company says it has updated its algorithm since Chaslot left. According to the company, the goal is not just to try to keep people on the site for as long as possible. The goal now is to measure how satisfied users are. Chaslot left in 2013, but he says from the outside, he watched the site fill up with conspiracy theories and divisive content. He warned his former colleagues, but nothing began to change until the presidential election brought more attention to the kinds of videos on the site.
CHASLOT: I should've spoken publicly about these issues. Now, that's what I'm doing, but there was a bit of a delay.
SYDELL: Chaslot says he's still losing sleep over what's going on there. He's even started a site to keep track of YouTube's search algorithms. Of course it may be easier for many techies to speak out now. Investors have done well. Employees were given good salaries for their work. Still, it's probably good news that the very people who helped create the problem are now using their inside knowledge to fix it. Laura Sydell, NPR News. Transcript provided by NPR, Copyright NPR.