
The Central Question Behind Facebook: 'What Does Mark Zuckerberg Believe In?'

"To an extraordinary degree, Mark Zuckerberg is Facebook ... so if you're going to understand Facebook in any meaningful way, the conversation really has to start with him and end with him," journalist Evan Osnos says.
Justin Sullivan
/
Getty Images
"To an extraordinary degree, Mark Zuckerberg is Facebook ... so if you're going to understand Facebook in any meaningful way, the conversation really has to start with him and end with him," journalist Evan Osnos says.

Last week, Facebook announced the most serious security breach in its history, in which unknown hackers were able to log onto the accounts of nearly 50 million Facebook users.

That breach was just one of several crises plaguing the world's largest social media platform. Free speech issues and the Russian disinformation campaign targeting the 2016 election had already put Facebook and its founder, Mark Zuckerberg, under scrutiny ahead of the midterm elections.

Journalist Evan Osnos, who wrote about Facebook and Zuckerberg recently for The New Yorker, says the company has come up against "a growing and really serious decline of public trust, both among politicians and among the general public."

Osnos notes that Facebook, which now has 2.2 billion monthly active users, is larger than any country in the world: "It's really closer in terms of scale and reach to a political ideology or a religious faith. ... In literal terms, it now has as many adherents as Christianity."

At the heart of the company is Zuckerberg, who remains something of a mystery despite the fact that he controls 60 percent of Facebook's voting shares.

"There is this profound question mark around what does Mark Zuckerberg believe in? What does he stand for? What does he care about?" Osnos says. "To an extraordinary degree, Mark Zuckerberg is Facebook ... so if you're going to understand Facebook in any meaningful way, the conversation really has to start with him and end with him."


Interview Highlights

On Facebook's most recent data breach

This is the largest security breach in Facebook's history. And what was unusual about this, and what sets it apart from other cases like Cambridge Analytica, was that this was outright theft. This was a case of hackers or [a] hacker — we still don't know who it was — finding essentially an underprotected door and walking through it and taking control of at least 50 million Facebook user accounts. ...

In this case, the hackers were able to get total control of the accounts. So they were able to get control of your privacy settings. They could go into your messages. They could post things on your behalf.

At this point, Facebook says they haven't found any evidence of these hackers doing that, so that only heightens the mystery. They don't know why they did it. They don't know if this was a foreign government or if this [was] individuals [or] if this was a criminal act.

On former Facebook operations manager Sandy Parakilas being an early whistleblower for data and privacy concerns

Sandy Parakilas had joined Facebook in 2011 and was one of the people responsible for going out and figuring out whether the data that they were giving to developers was being misused in any way. Were people violating privacy? Were they taking more data than they were supposed to?

And what he found was that they were. In some cases, programmers, for instance, were siphoning off people's pictures and their private messages. In other cases, he found a developer that was building essentially shadow profiles — profiles for people who'd never given their consent, including for children. This was a case in which there was just this feeling of it being the Wild West. This data was out there and nobody was paying attention, and he raised alarms internally. ...

He said, "Look, we need to do a major audit to go out and figure out where is our data — who has it and how are they using it?" And as he says, he was told, "That's not going to happen because if you do it, you may not want to know what you're going to find." Meaning that they may have already lost control of so much of that data that they didn't really want to discover the full reach.

On Mark Zuckerberg implying in 2010 that privacy is no longer a social norm

It caused a big uproar at the time. He said, "Look, this is a generational difference. We don't feel the same way about privacy that our parents and grandparents did." And people said, "That's wild, that's not right." Privacy is built into the very nature of the United States. It's really embedded in the Bill of Rights, and his belief was that this view of privacy was, as it was often described, an antique, and that we needed to push people further.

In the early days of Facebook, there was a theme, a phrase that was bandied about called "radical transparency," the idea that you had to be aggressively transparent in order to be modern. The sense was, as one person had put it, that in the future, because of Facebook and other things like it that were exploding the boundaries of privacy, extramarital affairs would become impossible. People couldn't hide things like that. They could no longer hide their lives outside of work from their lives at work. And they believed that to be a virtue — the sense that there would be this fusion, this union of our private selves and our public selves. But that put them at odds with the public.

The key fact, I think, was that over and over again, Mark Zuckerberg believed that being at odds with the public was not a sign you were doing something wrong; it was a sign that you were doing something innovative. Their mantra, their motto, of course, became "move fast and break things." And that motto really captured the way that they see the world.

On how Facebook, leading up to the 2016 election, saw it as a major revenue opportunity

Facebook had used its lobbying power. It had argued to the Federal Election Commission that it should be exempted from rules that require television advertising to be identified by the source of the funding. You know, that point at the end where they say who paid for the ad. They said we shouldn't have to follow those rules because we're a new technology, and in their filings, they said you don't want to stifle the growth of new innovation.

But, as a result, that meant that it was in a sense a very dark terrain. Things that were being posted on Facebook, that were ads around politics, were in many cases of mysterious origin. It was very hard to know who was posting them and why.

On Facebook's offer to embed employees in both the Trump and Clinton campaigns to help them use the platform effectively

The Clinton campaign rejected the offer. They thought they had more or less enough of their own technical capability to do it. But the Trump campaign embraced it eagerly. They were a much smaller, almost [a] sort of shoestring operation. They had very little of the seasoned political expertise that was rallying around other presidential candidates.

And so Facebook moved employees into the Trump campaign headquarters, and they helped them craft their messages. They helped them figure out how to reach the largest possible audience, how to test different messages — many, many messages a day — to figure out how small differences — changing the background color or changing the text or the font — would impact the number of people who would click on it, and ultimately might give money and support the candidate. ...

So later, in the end, after Donald Trump won the election, the senior campaign strategists were very clear. As one of them, Theresa Hong, said to an interviewer, "Without Facebook, we wouldn't have won." They played an absolutely essential role in the process.

On Facebook's initial reluctance to admit the full scope of the Russian disinformation campaign during the 2016 election

Initially, they estimated that fewer than 10 million Facebook users might have been affected by Russian disinformation, and they later had to revise that in preparation for testimony in Congress and said that actually as many as 150 million Facebook users were affected by Russian disinformation. And what's remarkable about that is how efficient it was as a conduit for disinformation, because the Russian Internet Research Agency, which was reporting to the Kremlin, had fewer than 100 members of its staff on this project, and yet they were able to reach as many as 150 million Facebook users. That is extraordinary.

On Facebook removing Alex Jones' site Infowars, which promotes false conspiracies

This is just the front edge of an unbelievably complex problem, which is: What are the bounds of free speech? ... What do we consider to be out of bounds? What is in effect shouting "fire" in a crowded theater, and what is legitimate, provocative, unsavory speech? And these are some of the hardest problems that we face. And they're now, let's face it, in the hands of the engineers — the people who created this incredibly powerful application. ...

Even if you're not a fan of Infowars — and God knows I'm not — it has to make a person uneasy to know that there is now a company which is capable of deciding not only what kind of information it's going to suppress, but also which kind of information it's going to promote. And, on any given day, there are people who are going to be offended by those choices. But the tools by which Facebook is held accountable are not the tools that we use in politics. It's not like you vote the bums out. It's not like people are appointed to Facebook's board as if they were Supreme Court justices. This is a case in which a private company is making profound choices about the contours and the boundaries of political expression, and we don't have obvious tools with which to regulate them.

Amy Salit and Seth Kelley produced and edited the audio of this interview. Bridget Bentz and Molly Seavy-Nesper adapted it for the Web.

Copyright 2020 Fresh Air. To see more, visit Fresh Air.

Dave Davies is a guest host for NPR's Fresh Air with Terry Gross.