by Jeremy Fassler
Creators, inventors, and CEOs often have a hard time understanding the consequences of their products when they are used for ill, arguing that technology, when it first comes into existence, is a tabula rasa -- that is, neither all good nor all bad. Professor Melvin Kranzberg of Georgia Tech argued the opposite: technical developments often have many unforeseen consequences:
“Technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves, and the same technology can have quite different results when introduced into different contexts or under different circumstances.”
Now that we are aware of the impact social media had on the 2016 election, the behavior of Facebook founder and CEO Mark Zuckerberg has exemplified this problem, especially the lame defense of his company he posted after Donald Trump said it had "always been anti-Trump."
True, the people who run Facebook were Hillary supporters - Facebook COO Sheryl Sandberg is quoted in the Podesta emails as saying she badly wanted her to win - but those who took advantage of the company were pro-Trump. With increasing evidence that Facebook may have been a deciding factor in swinging the election to Trump, it is on Zuckerberg's shoulders to prove that he understands the consequences of his product. So far, he has not delivered on that proof, as his defense below makes evident.
Now, to be fair to Zuckerberg, this is a step up from where he was in November 2016, when, in the days following the election, he called the idea that fake news disseminated through his platform had influenced the outcome "crazy." Thanks to the excellent reporting done this week by The Washington Post, we know that President Obama told Zuckerberg a few weeks later that he had to take this possibility seriously, which, to his credit, he did. Facebook, at the request of Senate Intelligence Committee Vice Chairman Mark Warner, worked to find the accounts and pages purchased by the Internet Research Agency, the Russian troll farm that bought at least 3,000 ads throughout election season, set up fake accounts, and organized fake events. And to Facebook's credit, the company has agreed to cooperate with the Senate and House Intelligence Committees, turning over the 3,000 ads to them for further research. Given all this, it makes sense that Zuckerberg says he "regrets" his earlier remarks, which he calls "dismissive." But all hindsight is 20-20, and unfortunately, he still has yet to process what happened, as these remarks indicate:
"Every day I work to bring people together and build a community for everyone. We hope to give all people a voice and create a platform for all ideas.
"Trump says Facebook is against him. Liberals say we helped Trump. Both sides are upset about ideas and content they don't like. That's what running a platform for all ideas looks like."
This idea, that "both sides" use Facebook to spread information and take offense at the ideas they don't like, is not just intellectually dishonest; it reveals just how successful the Russians were at duping Facebook's users with false information. Rather than fully apologize, Zuckerberg wants us to see this as an honest mistake that comes from running a free-speech platform where everyone has the power to spread ideas. His naivete in this regard has led him to make some terrible decisions about how Facebook handles free speech.
Between 2014 and 2015, Russian trolls managed to get Ukrainian Euromaidan activists suspended from Facebook by claiming that the anti-Kremlin protestors were engaging in hate speech. Many of the flagged posts were harmless: activist Yaroslav Matiushyn said he was suspended for posting a photo of a rainbow. When Zuckerberg held a town hall meeting in May 2015, Ukrainians submitted question after question asking why he allowed Russians to abuse the "report" function on his website; thanks to their submissions, the top 20 questions worldwide all dealt with this topic. But Zuckerberg dismissed the complaints, saying, "We did the right thing according to our policies in taking down the posts," standing up for those who hit "report" because he couldn't believe they would do so in bad faith.
Other decisions like this have needed no interference from foreign powers; they come straight from the company itself. In 2011, Facebook lobbied the FEC to block rules requiring that the sponsors of political ads be displayed. (Facebook is now working to change its practices so that clicking on an ad reveals who sponsored it.) The absence of such rules allowed the Russians to dupe gullible voters into believing the worst about Hillary Clinton and the best about Jill Stein. Similarly, in 2013, Chief Justice Roberts struck down a key provision of the Voting Rights Act because he believed racism was no longer as bad as it used to be. The outcome of Facebook's lobbying may prove to have been as consequential to the 2016 election as Roberts' ruling.
Zuckerberg cannot be totally blind to these missteps; otherwise he wouldn't be as defensive of his company as he is. He once said that "there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news," in an attempt to defend those whom the Russian operation targeted. As far as I can tell, this translates as: Facebook's success relies on Zuckerberg's belief that people are good at heart and only want to serve one another's interests.
This reliance on trust has made Facebook and its News Feed the 21st-century equivalent of Main Street. Back in the 19th and early 20th centuries, you went to Main Street every day to run errands, learn what was happening in the world, and see your friends at the general store. Facebook serves that function now, and Zuckerberg is the sheriff of your new town. That's the image he presents of himself on the site: a good guy who runs a major charity, donates to people in need, and talks with regular, non-billionaire folks around the country. Now that his failings have been exposed, they threaten not just his company's future, but our democracy. And if we can't rely on Facebook to be an unbiased marketplace of ideas, whom can we trust?
"People trust people," he said, "not corporations." Maybe, in a time when public trust in institutions is failing, he has a point. But as an excuse for his company's actions, or lack thereof, Zuckerberg's remark is akin to another industry's famous slogan, used to dodge responsibility when its products cause harm:
"Guns don't kill people, people kill people."