Facebook Believes Americans Are Good at Evaluating Their Sources, And Other Comfortable Delusions

oh my god shut up

Mark Zuckerberg would like you to know that he cares a lot about disinformation and bots and propaganda. He is very concerned about this, and is also very aware that he possesses terrifying technological powers. (See, his brow! Consider how it furrows!) And so on January 19th, he made another one of his big announcements.  He’s decided, in his serene wisdom, to trust the people of Facebook to determine what is true. Nothing could possibly go wrong.  

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Zuckerberg chirped in his announcement (I always imagine him chirping in these, like a smug billionaire chickadee). “We decided that having the community determine which sources are broadly trusted would be most objective.” Users will be asked to rate the credibility of news sources, though only those that Facebook determines they are familiar with, through some mysterious and possibly eldritch method. These “ongoing quality surveys” will then be used to determine which news sources pop up most often in users’ news feeds. Will there be any effort to correct for craven partisan sentiment? No, apparently there will not be. Will there be some mechanism for avoiding another mass and gleeful ratfucking by 4chan and 8chan and whatever other slugbeasts lurk within the Internet? No, apparently there will not be. Everything will be fine!
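If you want to see just how gameable that design is, here is a back-of-the-napkin sketch in Python of what “let the community rate the sources” looks like when done naively. To be painfully clear: this is my own guess at the mechanics, not Facebook’s actual code, and every name and number in it is invented.

```python
# A toy model of "community-rated" source credibility -- my own guess
# at the mechanics, not Facebook's actual system. All names invented.
from collections import defaultdict

def trust_scores(survey_responses):
    """Average raw user ratings (1-5) per news source, with no correction
    for partisanship, brigading, or whether the rater ever read the source."""
    totals = defaultdict(lambda: [0.0, 0])
    for user, source, rating in survey_responses:
        totals[source][0] += rating
        totals[source][1] += 1
    return {src: total / count for src, (total, count) in totals.items()}

# The obvious failure mode: a coordinated bloc of hostile raters.
organic = [(f"reader{i}", "local-paper.example", 5) for i in range(100)]
brigade = [(f"pepe{i}", "local-paper.example", 1) for i in range(500)]

print(trust_scores(organic))            # {'local-paper.example': 5.0}
print(trust_scores(organic + brigade))  # drops to roughly 1.67
```

Run it and the “broadly trusted” local paper craters the moment five hundred Pepes show up to the survey. That is the entire scheme.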

On January 19th, we learned that Facebook is the last organization in the entire world that still has great faith in the research and assessment powers of the average American. Is Facebook actually that unfathomably, enormously naive? Well, maybe. Or perhaps they are, once again, betting that we are stupid enough to believe that Facebook is making a legitimate effort to correct itself, and that we will then stop being so mad at them. 

Which is insulting. 

Any creature more intelligent than an actual avocado knows that Facebook’s user-rating scheme is doomed to miserable failure. Researchers Alan Dennis, Antino Kim and Tricia Moravec elegantly diagnosed the project’s many, many problems in a BuzzFeed post, drawing on their research on fake news and news-source ratings. They conclude, as you’d think should be obvious, that user ratings for news sources are a very different thing from user ratings for toasters. “Consumer reviews of products like toasters work because we have direct experience using them,” they wrote. “Consumer reviews of news sources don’t work because we can’t personally verify the facts from direct experience; instead, our opinions of news are driven by strong emotional attachments to underlying sociopolitical issues.”

Facebook, if we are to believe that they are not actively hoodwinking us, legitimately believes that the American people have, in the past year, somehow become astute and critical consumers of the news. But this is impossible. Facebook’s magical thinking is roughly equivalent to putting a freezer-burned Hot Pocket in a microwave and hoping that it will, in three minutes, turn into a delicious brick-oven pizza. There is no transmutation and there is no improvement. The Hot Pocket of ignorance and poor civic education will remain flaccid and disappointing no matter how much you hope and wish and pray.

there is some trippy ass clipart for Facebook on pixabay

This doesn’t mean there is no hope for the information ecosystem of the United States. It does not mean that this ongoing nightmare is permanent. As Dennis, Kim, and Moravec suggest, Facebook could grow a spine and start employing actual experts. Experts empowered to filter. Experts who are empowered to deem what is bullshit and what is not. But of course, this is what scares them most of all. See what Zuckerberg wrote in his Big Announcement: “The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

“Not comfortable with.” Consider that wording. They’re not comfortable with doing the one thing that might actually help to dislodge the cerebral-fluid-sucking leech that is currently wrapped around the brainstems of the social-media-using public. It would be so awful if Facebook were made uncomfortable.

And it will do anything to avoid discomfort. Mark Zuckerberg and Facebook are simply abdicating responsibility again. They know that these “checks” won’t work. They know damn well that hiring editors and engaging in meaningful moderation is what they haven’t tried, and what is most likely to work, and what is most likely to earn them the ire of the Trump cult that now squats wetly in the White House. Cowardice has won out, again: they’ve simply come up with another semi-clever way to fob off responsibility on their users. When these “credibility checks” inevitably fail or are compromised by hordes of wild-eyed Pepes, Facebook will, right on schedule, act surprised and aghast, then quickly pretend it never happened. You should be insulted that they think we’ll just keep falling for this. We have to stop falling for this.

These so-called credibility checks are just Facebook’s latest milquetoast and insulting effort to pretend it is dealing with its disinformation problem. Just a few weeks ago, Facebook announced that it would be reducing public content on the news feed, ostensibly to social-engineer “meaningful social interactions with family and friends” for its users. This might sound all well and good – if you are much more comfortable with being socially engineered by blank-eyed boys from Silicon Valley than I am – or at least it does until you hear from people who have already undergone this change. Facebook is fond of using countries it deems insignificant markets as guinea pigs for its changes, and in 2017, Sri Lanka, Guatemala, Cambodia, Slovakia, Bolivia, and Serbia were shoved in the direction of “meaningful social interaction.” (One does wonder about the selection, considering the unpleasant history these nations share.) The results were, to quote local journalists in Guatemala, “catastrophic.” Reporters in these countries suddenly found their publications – important sources of information in fragile political systems – deprived of their largest source of readership and income.

Adam Mosseri, head of Facebook’s News Feed, responded to these reporters’ anguish with the serene, Athenian calm that only tech evangelicals can muster: “The goal of this test is to understand if people prefer to have separate places for personal and public content. We will hear what people say about the experience to understand if it’s an idea worth pursuing any further.” (Whoops, we broke your already-fragile democracy! Move fast! Break things!) Dripping a new shampoo line into a little white bunny rabbit’s quivering eyeballs is also a test. The difference between the two? Testing your new product on embattled reporters in formerly war-torn nations is much more socially acceptable.

Facebook has also recently attempted to socially engineer us into being better citizens. In late 2017, I wrote about Facebook’s ill-considered civic engagement tools or “constituent services,” which were meant to (in a nutshell) make it easier for you to badger your representative or for your representative to badger you back. Using these tools, of course, required a Facebook account – and you also had to tell Facebook where you lived, so it could match you up with your representative.  Facebook would very much like a world in which people need to submit to having a Facebook account to meaningfully communicate with their representatives. Facebook would, we can probably assume, very much like a world where pretty much everything is like Facebook. This is probably not going to change. 

Yes, I know: Zuckerberg furrowed his brow somewhere in his mansion and said that he might consider cutting his profits to reduce the gigantic social problem that he’s engendered. By that, he means doing things that might actually address the disinformation problem: these things might take a variety of forms, from actually hiring experts and editors, to actually paying for news (as, incredibly, Rupert Murdoch just suggested), to hiring and meaningfully compensating a competent army of moderators. But consider our available evidence. Do we really believe that he’ll defy his (scary) board and do the right thing? Or will he and Facebook once again choose comfort, and do nothing at all?

“We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard,” said John F. Kennedy, in a quote that I am dead certain Facebook employees like to trot out as they perfect methods of micro-targeting underpants ads to under-25 men who like trebuchets, or perfect new Messenger stickers of farting cats, or sort-of-accidentally rupture American democracy. Perhaps someday Facebook will develop an appetite for dealing with things that are actually hard, that are actually uncomfortable.

I’m not holding my breath. 
