It’s Not Facebook’s Fault

By Shaun Tan

Founder, Editor-in-Chief, and Staff Writer

28/12/2020

Picture Credit: www.shopcatalog.com

Poor Facebook. In recent years, it’s been one of America’s most maligned companies. Since November 2016, it, along with other social media platforms, has been blamed for helping Donald Trump get elected president by facilitating the dissemination of misinformation that favored him. In the intervening years, it’s been singled out for special condemnation because of the reluctance of its founder and CEO, Mark Zuckerberg, to do more to censor content on the site, and his refusal to ban political advertising there. Its measures to reduce the spread of hate speech and disinformation, including removing racist and misogynistic posts about Vice President-elect Kamala Harris and appending links to accurate information beneath certain posts, have been deemed inadequate, and it’s been criticized for not censoring inflammatory posts and for allowing political ads with falsehoods in them to run. President-elect Joe Biden has heavily criticized Facebook for “propagating falsehoods,” and has said Section 230 of the Communications Decency Act, which protects Facebook (and other sites) from liability for their users’ posts, should be removed.


But these criticisms are unfair. Trying to censor hate speech is a fool’s errand, since no one can really define what “hate speech” even is, especially when many people (particularly on the left) habitually use the term to describe virtually anything they disagree with. Expecting Facebook, or any social media platform for that matter, to comb through everything its users post for hate speech or misinformation is similarly unrealistic: on YouTube alone, more than 82 years’ worth of video content is uploaded every day. No matter how deep your pockets are, you’re not going to be able to hire enough people to do that, and it’s not something a computer program is likely to get right either.


Nor would having techies in Silicon Valley police their platforms for hate speech and misinformation be desirable even if it were feasible. Recall how in 2017 James Damore, a software engineer at Google, was branded a sexist by his colleagues and fired for thoughtfully questioning the wisdom of the company’s attempts to achieve a 50/50 male-female ratio among its employees. What possible confidence can we have that these people can be entrusted to determine what’s hate speech and misinformation and what’s not?


On the contrary, if Facebook should be criticized for something, it should be for doing too much to try to combat hate speech and misinformation. It holds itself out as a neutral platform on which anyone can share their thoughts (and it enjoys protection from liability under Section 230 on this basis), and yet at times it breaches that supposed neutrality and acts more like a publisher by curating content, suppressing or removing posts or accounts it deems objectionable. By deliberately suppressing dubious stories about Hunter Biden, and banning pages linked to former White House Chief Strategist Steve Bannon for advertising baseless claims of election fraud, it invites suspicion that it’s biased against Republicans and conservatives. Sometimes, Facebook’s overzealous curation ends up suppressing genuine debate: a friend of mine recently had his post labelled as “hate speech” and censored because it criticized the Islamist notion that the Muslim zakat tax shouldn’t be processed by a secular payments system.


Overall, though, Facebook is the social media giant that treats people most like thinking adults. Whilst TikTok’s offering is limited to an endless series of extremely short and extremely mindless videos, and Instagram’s is limited to pictures and vapid “inspirational” quotes, Facebook allows people to upload and see pretty much anything they want in whatever format they want. Whilst Twitter’s 280-character limit seems specifically designed to encourage the oversimplification of ideas into snappy quotes (which people often end up having to qualify or apologize for later), a Facebook post can be as long as you like, enabling users to develop and explain their ideas at length.

Facebook is the social media giant that treats people most like thinking adults.

Facebook also seems to have more respect for freedom of expression and democracy, and more intellectual humility, than the other social media giants. In a speech at Georgetown University in October 2019, Zuckerberg outlined how his company had rightly cracked down on fake accounts (accounts that pretend to be people they’re not). He expressed alarm, however, at the rising calls for censorship in the name of protecting democracy. “Increasingly, we’re seeing people try to define more speech as dangerous because it may lead to political outcomes they see as unacceptable,” he said. “Some hold the view that since the stakes are so high, they can no longer trust their fellow citizens with the power to communicate and decide what to believe for themselves. I personally believe this is more dangerous for democracy over the long term than almost any speech. Democracy depends on the idea that we hold each other’s right to express ourselves and be heard above our own desire to always get the outcomes we want.”

He’s right. Democracy is premised on the idea that people have the freedom to choose for themselves what to believe, what to share, and what to vote for – and freedom of choice must include the freedom to choose wrongly. They might not get it right all the time, but confidence in the system requires only that, as E. B. White wrote, “more than half of the people are right more than half of the time.” Calls to restrict the spread of certain ideas because they’re supposedly dangerous reflect a lack of confidence in democracy and in the people. Contrast Zuckerberg’s speech with Twitter Founder and CEO Jack Dorsey’s tweets explaining why Twitter banned political ads:

A political message earns reach when people decide to follow an account or retweet. Paying for reach removes that decision, forcing highly optimized and targeted political messages on people.


In Dorsey’s estimation, people are like mindless automatons, who have no choice but to follow an account or retweet a post if it’s put in front of them.


In truth, those who blame social media for the unpleasantness they sometimes see on it, and for people’s bad decisions (including the decision to elect Trump), are barking up the wrong tree. You almost feel sorry for the social media companies, scapegoated for the weaknesses of democracy, for the failings of the people. Is it their fault if, given the ability to post anything, so many people choose to post baseless conspiracy theories? Is it their fault if, given the option of following the social media accounts of CBS, The Atlantic, and the Wall Street Journal, so many people choose to follow Breitbart and Infowars instead? Is it their fault if so many voters are apparently so impressionable that seeing a picture of Hillary Clinton sporting devil horns, photoshopped by Russian trolls, is enough to influence their vote?


The fault, and the ultimate responsibility for choosing what to believe and who to vote for, lies, as it always has, with the people themselves. “[F]iguring out when politicians are full of shit is the responsibility of the voters and no one else,” said the comedian Bill Maher. “People have to build up an immunity to falsehoods. We can’t pass the buck to a referee.” Democracy only works if you can assume a minimum level of intelligence and decency among the populace; in the absence of that, no social media restrictions and no tweaking of the algorithms can compensate.

The fault, and the ultimate responsibility for choosing what to believe and who to vote for, lies, as it always has, with the people themselves.

Ultimately, social media sites and applications are (mostly) neutral platforms, and serve a valuable social function as such. They’re inclusive and allow anyone to join. They produce no content themselves: all content on them is produced and shared by their users. They’re therefore really just mirrors, reflecting society as a whole, or, rather, the large portion of society that uses them. They reflect us: our loves and hates, our hopes and fears and prejudices, the good, the bad, and the ugly. If there’s hatred and stupidity on them, that’s because so many of us are hateful and stupid. (And, since social media platforms give us a lot of freedom to choose who we want to befriend and follow online, and even which ads we want to see, if you’re seeing a lot of hatred and stupidity on them, that’s because your friends (or your “friends”) are hateful and stupid, or because you’ve deliberately chosen to follow hateful and stupid people, pages, or channels.)

That’s a problem, but it’s a problem with us humans, and cannot be laid at social media’s door. This dark side of society can be seen in the half of America that voted for Trump; social media didn’t create those people – they were there all along – and suppressing them online won’t make them change or go away. If we’re honest with ourselves, that darkness probably lives, to a greater or lesser extent, in every one of us, just as there’s goodness, to a greater or lesser extent, in the hearts of Trump supporters. Blaming social media companies for unpleasantness on their platforms makes as much sense as blaming WhatsApp and Apple because some people choose to use their products to send unpleasant messages to each other.

“The fault, dear Brutus,” says Cassius in William Shakespeare’s Julius Caesar, “is not in our stars, but in ourselves.” And so it is. Social media merely reflects these faults, along with everything else. A social media platform can only be as good as the people who use it, just as a reflection can only be as beautiful as the person who casts it. And the most pointless thing you can do if you don’t like the reflection you see is to blame the mirror.