Facebook Destroys Everything: Part 3

When Covid arrived, I was, like most reasonable people, terrified of the virus. I was also absolutely terrified by the glittering, data-hoovering opportunity that a global pandemic represented for the always-hungry likes of Facebook and Google.

My fears about how Big Tech might take advantage of this planet-sized tragedy only solidified after it came out in early March 2020 that the Trump administration had been holding conference calls with Silicon Valley to discuss how they might be able to work together on battling the pandemic – and if the companies had any useful data they might be willing to share with the federal government.

My mind filled with visions of an unholy alliance between privacy-destroying tech firms and the deranged Trump administration, who could use public health concerns to legally mandate that Americans cough up their health and location data to both Zuck and the MAGA set in exchange for access to Covid testing and vaccines.

There was some precedent for my paranoia.

I’d spent the last decade watching as Facebook sweet-talked governments, medical systems, and non-profits around the world into adopting their platform for communicating with the public about crises, seducing decision-makers with promises of an easy, domesticated solution that would liberate them from having to futz with building and updating their own websites.

I’d also watched in disgust as, after crucial organizations became comfortable with pushing out vital information on Facebook, the company began to make it harder and harder for people to find or to view those potentially life-saving posts if they weren’t already logged in. The end-game was obvious: they were building a world where if someone wanted to look at updates from their city government on local flooding, or see what their local hospital was saying about flu vaccinations, they’d have to submit to becoming legible to Facebook first.

Covid, then, represented a massive opportunity for a company that was already so clearly hell-bent on taking advantage of disasters and crises as a way to herd even more organizations and people into its blue, walled-off paddocks.

And while it was true that Facebook and Trump regularly sparred with one another in public, as GOP leaders complained that the platform was unfairly censoring them (when in truth, the site was doing the exact opposite), it was a different story in private.

i was a kid when this happened and it just keeps getting dumber and more insidious the more I read about it

At the time, Facebook policy vice president and former George W Bush policy advisor Joel Kaplan – a notorious participant in the 2000 “Brooks Brothers” riot that helped secure Bush the presidency – was working overtime to win the MAGA set’s trust. Why wouldn’t Zuckerberg and his highly-paid and ethically suspect colleagues take the opportunity to partner, at least for now, with the Trump administration?

Much to my surprise, and relief, both Trump and Facebook spectacularly fumbled the world-domination bag.

In retrospect, it was hardly surprising that the rift between the MAGAs and Big Tech began over disinformation.

In early March, as the world became horribly aware that Covid was both real and destined to become real bad, Facebook joined forces with Google and Twitter to announce that their sites would make a special effort to counter the spread of egregious misinformation about the pandemic.

Then came May 25th, 2020, and the brutal murder of George Floyd at the hands of a bloodthirsty Minneapolis cop. As protests against police brutality ignited across the United States, social media users were confronted with a tsunami of hate speech and disinformation directed against Black Americans and activists. Perhaps anticipating what would happen next, Trump hastened to sign an executive order on “preventing online censorship,” although it was almost entirely symbolic in practice.

President Trump then, in the course of making his own contributions to the fire hose of racist bullshit that swirled around the Internet at the time, crossed a line. In ominous May 29th posts on both Facebook and Twitter, he declared that “when the looting starts, the shooting starts.”

Twitter acted relatively quickly to limit the public’s ability to view or interact with Trump’s post, citing their rules against “glorifying violence.”

Facebook, meanwhile, didn’t do shit.

As both the public and national media took note of the two social media platforms’ distinctly different approaches to Trump’s violent rhetoric, Mark Zuckerberg was eventually forced to say something. In an impressive display of weasel-words, Zuckerberg wrote a lengthy post justifying his decision to leave the President’s egregiously terms-of-service-violating emission up, claiming (as he had before in response to Myanmar) that the company “shouldn’t be the arbiter of truth.”

Mark was, I suspect, surprised when his word salad failed to turn down the heat on both himself and his company.

Repulsed Facebook employees publicly called both Zuckerberg and Joel Kaplan out, accusing their leaders of bending over to accommodate the whims of the GOP. Soon, over 800 advertisers had joined a boycott against the company, including heavy-hitters like Coca-Cola, Ford, and Unilever. Caught between a rock and a hard place, Zuckerberg finally agreed at the end of June to do more to remove violence-inciting posts and to label politicians’ posts that flagrantly flouted the company’s policies.

While many critics from the left were temporarily quieted by this move, Facebook’s woes weren’t over yet.

zuckerberg and fauci touching base

In mid-July, Zuckerberg, in a rare display of semi-human sentiment, openly criticized the Trump administration’s stunningly shit response to the virus in a live interview with Dr. Anthony Fauci. Soon after the Fauci comments, Zuckerberg insisted to Axios that he didn’t have a secret deal with Trump, as some media outlets had begun to speculate – though he did confirm that he spoke with the President “from time to time.” Trump, for his part, largely kept quiet about these open provocations. For a few days, it seemed like Zuckerberg was, infuriatingly, managing to once again get away with his obfuscating aw-shucks act.

Then in early August, Trump claimed (falsely) in a Fox and Friends interview, which he shared on Facebook, that children are “almost immune” from Covid-19. Facebook, pushing its luck, decided that it would hold the President to its terms of service: it deleted Trump’s video.

what a time to be alive

Predictably, Trump lost his shit, and perhaps even more predictably, he lost his shit during an interview with Geraldo Rivera.

After deeming his comment on Covid to be “a perfect statement, a statement about youth,” he took up his old claim that Facebook was censoring him. “They’re doing anybody, on the right, anybody, any Republican, any conservative Republican is censored and look at the horrible things they say on the left,” Trump wailed to Geraldo’s sympathetic listeners.

By September, Trump was making ominous noises at the White House about taking “concrete legal steps” against social media sites that censored conservatives online. The relationship between the President and Facebook would remain distrustful at best until Trump – grudgingly – left office.

Which was, of course, a good thing. The Trump administration’s wildly unpredictable behavior and constant hostility to Silicon Valley’s prideful overlords ensured that both the government and Big Tech would fail to pull off the frightening privacy-destroying partnership I’d been so afraid of when the pandemic first began.

But bad as the relationship between Trump and Zuck now was, Donald Trump was still allowed on the platform. Which he used to spew claims about voter fraud up to and after the 2020 election, and where his supporters openly discussed the plans that would eventually lead to January 6th in ever-more-deranged Facebook groups.

On that particular day of infamy, Facebook did suspend Trump’s account. But only after Twitter did it first. (Trump now has his Facebook account back, but he doesn’t use it much. The moment has passed).

Facebook found little friendliness from the new Biden administration, populated by staffers who were far less enamored with big tech than the technocrats of the Obama era had been.

thanks Facebook!

Biden’s team immediately criticized the company for failing to adequately control rampant disinformation about the Covid vaccine, as the Democrat-led administration set about frantically picking up the pieces of the GOP’s disastrous pandemic response. Meanwhile, even as it battled with Biden in public, Facebook (per whistleblower revelations) carefully tracked the spread of Covid disinformation internally – while consistently sharing as little of its findings with the new Democrat-led government as possible.

Eventually, Facebook did begrudgingly give in to Biden administration pressure to take down obvious Covid-19 bullshit. It was a move that was in alignment, you might recall, with what Zuckerberg publicly claimed he was going to do when the pandemic began.

It was also a choice that the GOP is now, as I write this in the summer of 2023, using to bolster their nonsense claims (which they’ve been making in one form or another since 2015) that the Biden administration is unjustly censoring the GOP on social media.

A Louisiana judge recently used this exact rationale to ban federal agencies and officials from working with social media companies to address “protected speech.” And much of the media continues to politely ignore the fact that Trump and the GOP have spent years blatantly pressuring social media companies to cater to them, actions they’ve figured out they can obfuscate by shrieking as loudly as possible about how they’re being oppressed by the Coastal Elite.

blue pretzel/ouroboros

And then came Meta.

At the end of 2021, Zuckerberg, high on an in-house supply incomprehensible to groveling, ground-dwelling peasants like us, announced that his company would be changing its name to Meta, placing products like Facebook, Instagram, and WhatsApp under the same blandly ominous title.

What’s more, the whole shambling horror would be pivoting operations over to something he’d dubbed the Metaverse, an incomprehensible concept that was – I think, it’s terrifically hard to say for sure – positioned somewhere in between hideous NFTs of vomiting apes, The Blockchain (such as it is), and a 2005-era VR video game where you don’t have any legs. Supposedly, it was a play to attract more young people, more hip people, to Meta’s increasingly geriatric lineup of products. After all, nothing says youthful cool like dropping fake computer money on virtual branded real estate.

turns out that people just want to be sexy 20-foot dragon ladies in VR worlds, not dead-eyed dorks posing in front of monuments

Unsurprisingly to everyone who isn’t Mark Zuckerberg, the Metaverse was a majestic, world-beating failure. Meta hemorrhaged money, burning billions of dollars in pursuit of a lame product that nobody wanted. The company’s frantic flailing drove even more people away from a platform that was both grotesquely ethically compromised and now terminally lame. For the first time ever, in early 2022, Facebook started losing users.

Facebook, or Meta, was by no means dead. But Facebook, surprisingly, had stopped feeling inevitable.

wow, he’s just like us

As the world became aware of Elon Musk’s manure-brained battle to weasel out of buying Twitter in 2022, the attention of what remained of tech journalism shifted away from Zuckerberg’s failings to Musk’s even splashier, rocket-fuel-stained antics. By 2022, the Metaverse’s incredible, legless failure had conditioned many people to view the company as more absurd than outright evil. I noticed a considerable uptick in fluff pieces about how Mark Zuckerberg was learning BJJ, like a normal human with normal, relatable hobbies.

For Mark, Elon Musk’s incredible two-year effort to light his own reputation on fire has also had the remarkably convenient knock-on effect of making him seem reasonable. “Yes, Zuckerberg’s companies ransack private data and tear apart societies, and he does openly thirst for world domination,” some reasoned, “but you also don’t see him promoting creepy eugenics theories, blowing up rockets in environmentally sensitive areas, or directly meddling in the Ukraine War.”

And so, Zuckerberg and the Metaverse and everything else were able to slink back into the shadows for a bit. Sure, there were still stories about how the company was failing to control hate speech in conflict zones. How it had been slapped with more historically huge and yet affordable fines from the European Union. How people in poor countries were getting charged for their supposedly free Facebook-branded mobile data. But the media had, largely, shifted its coverage of man-made horrors beyond our comprehension to the latest, splashiest abominations that Elon was involved in.

When Elon Musk finally did walk into Twitter HQ with a shit-eating grin and a stupid Home Depot sink in his hands, his status as the Internet’s new Most Hated Man was secured. And it became terribly apparent that Twitter as we knew it, as I knew it, was gone for good, and something much, much worse was going to take its place.

relics from the old, fun internet

Enter Threads. 

Meta’s Twitter-killer features little news by design, in line with Meta’s new hardline strategy against accommodating those press-room bastards who have inflicted so many indignities on them in the past. It also has even less moderation than Facebook or Instagram ever did, echoing both Musk and Zuckerberg’s profoundly cynical, if hard to argue with, realization that governments don’t have the courage to force them to make their websites less evil. Unsurprisingly, the site already has a hate-speech problem.

 Somehow, some people, mourning over the terminally-ill wreck of what was once Twitter, are still hailing Zuckerberg as something of a savior, or at least, as someone who’s substantially less evil than Elon Musk (which is wrong, but is very convenient for Zuck). Others are shrugging and leaning into Threads, shifting back into the once all-powerful idea that Facebook is inevitable, that resisting it is as foolish as shooting into the eye of a hurricane.

As for me? I’m somewhat afraid of Threads, albeit less so now, in August, than I was when it first came out in July, as it’s become clear that the service isn’t becoming the default Twitter-replacement that Meta had so fervently hoped it would be. But I’m also angry about Threads, the kind of rage that develops when you see your oldest and most loathsome enemy somehow survive threat after threat, and continue to shamble hungrily on.

I’m angry about how Mark Zuckerberg and Facebook and all the rest of his horrible companies have been able to spend the last 15 years getting away with it, how they never seem to suffer truly meaningful consequences for constantly, continuously, making the world worse. And I’m also angry about how so many people know what Zuckerberg is, and know what he’s done, and are still willing to give him yet another chance.

facebook has always made me feel like I’m trapped in a Bruegel painting

“Maybe this time, he won’t be evil!” people say, and then he does something evil again, and the same people claim that this was, somehow, a surprise. It’s a lot like inviting the Dread Vampire Zartok into your home, even after he’s drained the blood of your neighbors, because he hasn’t drained your blood yet. It’s a form of collective madness, or at least, it makes me, and everyone else who has spent years trying to warn people, feel mad.

Oh, I’d like to imagine that Mark Zuckerberg sleeps terribly.

That every night, the hungry ghosts of the dead close in upon him.

The small, charred ghosts of the Rohingya children burned alive in their homes, who still smell faintly of smoke and cooked flesh.

The pale and bloated ghosts of the people who drowned in the Mediterranean after fleeing ethnic cleansing in their home countries, whose faces have been nibbled upon by deep-sea fish.

The suicides.

The men and women slowly tortured to death in secret Syrian prison cells. 

They gather around him, and they whisper things that cannot be written into his ear. And he is tormented. 

But that’s a fantasy. 

Mark Zuckerberg is a man who sleeps well. He has hobbies. He enjoys nondescript barbecue sauce. He’s happily married. He has none of the freakish, manic anxiety that swirls around Elon Musk. Zuckerberg is self-assured.

He walks, serene, under a shield of plausible deniability. After all: he didn’t burn those Rohingya villages himself. He didn’t lead the soldiers that chased those Muslim Indians off of their land, or the vigilantes killing their ethnic enemies in Ethiopia.

He didn’t personally destroy the self-esteem of teenage girls, or publicly stream a mass-shooting at a mosque in New Zealand, or coordinate storming the Capitol on January 6th. He didn’t spread the lies that persuaded millions of Americans to wave off the vaccines that might have saved their lives, and he didn’t give those Kenyan moderators the PTSD that makes them see the faces of the screaming dead at night. 

Certainly, Zuckerberg would acknowledge that his website played a role. But who’s to say how much of one? It is so hard to quantify these things. And there are fewer and fewer people left who have the time and the resources to try.

“But can we really blame Facebook for that?” some people will say. “Wasn’t journalism already in trouble before he came along?”

Maybe. But isn’t it interesting how Mark Zuckerberg and his company exist entirely in a cocoon of plausible deniability, in an ecosystem they’ve designed to exquisitely accommodate their own version of reality?

Perhaps I am too hard on Mark Zuckerberg.

Perhaps he deserves another chance to connect the world, like he says he always meant to do. Move fast. Break things. You have to make a few mistakes to get ahead. Just a few little mistakes. 

“They trust me. Dumb fucks,” Mark Zuckerberg famously wrote, back in the early years, before people had learned what he was.

No. I won’t be posting on Threads. 


You Don’t Want to Delete Your Facebook (And That’s OK)

Everyone should stop using Facebook. Everyone is not going to stop using Facebook this week. That’s OK. There’s a middle ground between deleting your account forever and spending all of your waking, earthly hours refreshing your Facebook feed. And we should be telling our relatives and friends about that middle ground, instead of telling them they have to stop using Facebook right away. We can counter the sense of helplessness that many people feel about their relationship to Facebook and to other social media platforms – but we’re going to need to do it in an incremental, careful way.

You’re reading this and you don’t live under a rock, so you’re probably very much aware that Facebook is under an immense amount of heat right now. Last weekend, the world found out that the eminently creepy (albeit over-hyped) Cambridge Analytica voter-profiling company scraped 50 million Facebook users’ profiles, information that they used on behalf of Donald Trump’s campaign in the 2016 election. It was the latest in a year-and-a-half-long succession of failures and embarrassments for Facebook, from their widely derided failure to do something about Russian bots to their ham-fisted attempts at fighting fake news to their disturbing treatment of their underpaid moderators.

Facebook seemed invulnerable for a very long time, but this latest scandal, on top of all the others, actually seems to have wounded it: Facebook’s stock value dropped by as much as 8 percent in the US and the UK, and Mark Zuckerberg had already lost $9 billion of his net worth by Tuesday. The leviathan has been hit, there’s blood in the water, and it’s easy for us privacy paranoiacs to feel like Captain Ahab. For the first time that we can remember, there’s an opportunity to take Facebook down, or at least to weaken it by reducing its user base.

“You’ve got to delete your Facebook profile, it’s the only way!” we tell our friends and relatives, waving around our harpoons. Whereupon our friends and relatives smile (so as not to provoke us) and back away. They don’t want to delete Facebook entirely, or maybe they can’t due to their job, or because it’s the only way to communicate with their families. It’s not like it matters, they think. Facebook already has all my information. 

The Captain Ahab approach is not a good way to get people to alter their social media habits, and it’s not a good way to convince people to better protect their privacy from companies like Facebook. When we get all Captain Ahab, we’re forgetting some important realities about human beings and how most human beings feel about Facebook. 

 

“Ego non baptizo te in nomine patris, sed in nomine diaboli!” (“I baptize you not in the name of the father, but in the name of the devil!”)

First: we forget that most people don’t know much about how their Facebook data is used and abused. Endless news stories about the evils of Facebook may fill them with unease and distrust, but they’re not getting much good information on what to do about it.   A 2015 Pew Research Center survey found that 47% of Americans lacked confidence in their knowledge of what companies actually do with their personal data.  This 2016 survey from the UK found that while 74% of Internet users believed that they adequately protected their online data, only 28% of respondents had actually turned off location tracking on the platforms they use, and only 31% had changed their social media privacy settings. This confusion is  in part because Facebook and other social media companies have done a bang-up job of obfuscating what they’re up to: another recent study found that Facebook’s privacy policy became much less transparent and much harder for mere mortals to understand in the decade from 2005 to 2015. 

Your conflicted Facebook-using friends and relations are living in what frustrated security researchers call the “privacy paradox”: most people will swear up and down that privacy is important to them, and then will continue to share their personal information widely on the Internet. This is not because they are stupid.  It is because they believe that they live in a dark, howling Internet panopticon from which they cannot escape. (I’m exaggerating, but only kinda). This 2016 focus-group study found that young people were aware of the risks of sharing their information online. They just didn’t think they could do anything about those risks: they felt that “privacy violations are inevitable and opting out is not an option.” They’ve  fallen prey to privacy cynicism, which is defined rather succinctly by these researchers as ” an attitude of uncertainty, powerlessness and mistrust towards the handling of personal data by online services, rendering privacy protection behavior subjectively futile.” 

I met a traveler in an antique…look, you get it, hint hint.

Many people also feel powerless because they think Facebook is unkillable. The average person viewed Facebook as a doofy college-kid rumor service back in 2007: now, most see it as a bit like an unstoppable, inescapable international hive-mind. I’m sure Facebook would be just fine with being viewed sometime decades hence as an inscrutable but appeasable deity: provide your data tribute, and the crops will flourish! Withhold your tribute and face its wrath! Facebook knows no past or present or death!

Thankfully, this is horseshit.

In the last 20 years, we’ve watched former juggernauts like AOL, Yahoo, MySpace, Ask Jeeves, and many many more weaken and die. From one point of view, Facebook is already dying: young people have correctly identified that Facebook is now dominated by their elderly and incoherent relatives, and they’re ditching the platform in droves.  For the first time in a decade, Facebook usage has decreased amongst Americans, dropping from 67% to 62%,  while Google and YouTube usage continues to grow.

These people might be backing away because they’ve lost trust in Facebook. Trust is everything for social media companies like Facebook: people’s willingness to share the data that social media companies must feast upon to survive is dependent upon how much they trust the platform not to wantonly abuse it. A 2017 study from the UK found that only one in four Britons trust social media, and a majority believe that social media companies aren’t adequately regulated. A mere 35 percent of Bay Area residents say that they trust social media companies. A study from October found that while a majority of respondents do believe that Facebook’s effect on society is positive overall, they also trusted Facebook the least of the “big five” tech companies (and only 60% knew that Facebook owns Instagram). We can work with this.

Yep, that’s a hideous dolphin figurine.

Second: people are absolutely horrible at quitting things cold turkey. Look, I’ve spent many, many hours of my fleeting and precious life sitting slack-jawed on my couch, refreshing Facebook like a Skinner-box-trained rat. I know that it’s fiendishly hard to stop using social media. Some scientists now believe that social media can be the focus of a true psychological addiction, just like World of Warcraft or gambling or collecting hideous dolphin figurines. A PLOS One study found that heavy Internet users exhibited physical “withdrawal” symptoms and anxiety when they suddenly stopped using social media.

Changing your relationship with the Internet and social media is particularly difficult because they are such fundamental parts of modern life: abstinence isn’t really an option. You can live a normal, productive life without WoW or cigarettes, but it’s just about impossible to live normally without the Internet. It can also be hard to go without Facebook: many people do need it for their jobs, or to stay attached to relatives who may not be as up for getting off Facebook as they are. 

So what can we ask people to do? What are some realistic, relatively easy things that people can do to better protect their privacy? How can people scale back their Facebook usage and the data they share with Facebook, without deleting their profile entirely? Here are some suggestions.

  • Figure out the motivation behind your compulsion to use Facebook. “Cyber psychologist” John Suler (what a great job title) suggested this type of scrutiny in a Quartz article: “Is it a need for dependency, to feel important and powerful, to express anger, to release oneself from guilt? In compulsive behaviors, people are expressing such needs but rarely does the activity actually resolve those needs.” If you know why you’re spending hours combing through your colleague’s second-cousin’s dog photos, you’ll have a better sense of what you need to do to stop. You can also try restriction apps, like Self Control – they’ve helped me reduce my own “pigeon pecking at a button” behavior immensely. 
  • Turn off location sharing. I do not use location sharing on any of my devices. There is no good reason for Facebook to know where you are. 
  • Turn off Facebook’s platform feature. This feature is what allows third-party apps and other websites to integrate with Facebook, and it’s also what permits these third-party apps to slurp up lots and lots of your data. Shut that sucker off. No, you won’t be able to play Farmville anymore, you deviant. 
  • Review your third-party app settings. If you don’t want to take the nuclear option of turning off Facebook’s platform –  though you really, really should – you can still review your third party app settings and revoke access to apps you distrust. (Don’t trust any of them). Buzzfeed has a good guide here. You should do this for all the social media sites you use, not just Facebook. 
  • Stop liking things. “Likes” give Facebook useful information on how to advertise to you. Do not do that. 
  • Stop Facebook from tracking you across the Internet. Facebook extensively tracks users, both on the platform and on sites that have a Facebook “like button” – yes, they’re following you even when you aren’t on Facebook itself. There are a number of good ways to stop this tracking, on your computer and on your phone: I like the uBlock Origin browser plug-in, and the 1Blocker app for mobile devices. (For a cruder do-it-yourself option, see the sketch after this list.)
  • Lock down your privacy settings. Review your privacy settings at least once a month: Facebook has an infuriating habit of resetting them. 
  • Delete as much information as you can possibly stand from your Facebook profile. Delete as many old posts as you can possibly stand. You can download your Facebook archive if you don’t want to lose those memories entirely. 
  • Facebook targeted ads are majestically creepy, and you should opt out of them right now. You can do this in your Facebook account settings. HowToGeek has a nice guide to opting out of these ads on multiple platforms.
  • Read these other guides to protecting your privacy on Facebook. Here’s a good one from the Guardian. Here’s one from CNBC. Here’s one from Motherboard.
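If you’d rather not install anything at all, you can also null-route tracker domains at the operating-system level, via your hosts file. Here’s a minimal Python sketch of that approach. To be clear about what’s assumed: the domain list below is my own illustrative guess at hosts used by Facebook’s embedded widgets and pixel, not an authoritative inventory, so check a maintained blocklist (like the ones uBlock Origin ships with) before relying on it, and know that blocking these may break “Log in with Facebook” on other sites.

```python
# A minimal sketch, not an official tool: print /etc/hosts entries that
# null-route domains commonly associated with Facebook's off-site tracking.
# The domain list is an illustrative assumption -- verify it against a
# maintained tracker blocklist before using it.

TRACKER_DOMAINS = [
    "connect.facebook.net",  # serves the Like-button / pixel JavaScript
    "pixel.facebook.com",    # assumed pixel endpoint -- verify before use
    "an.facebook.com",       # assumed Audience Network domain -- verify
]

def hosts_entries(domains):
    """Return hosts-file lines that send each domain to 0.0.0.0 (nowhere)."""
    return [f"0.0.0.0 {domain}" for domain in domains]

if __name__ == "__main__":
    print("# Append these lines to /etc/hosts (Windows: C:\\Windows\\System32\\drivers\\etc\\hosts):")
    for line in hosts_entries(TRACKER_DOMAINS):
        print(line)
```

A browser plug-in like uBlock Origin does the same job more thoroughly and keeps its blocklists updated for you, which is why it’s still the first thing I recommend; the hosts-file trick is just the option that works everywhere, including in apps that ignore your browser extensions.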

Don’t get me wrong. We do not live in a perfect world. Us Captain Ahabs are not going to convince every Facebook user to rise up and delete their profile as one in a Glorious Attention Revolution, in which Facebook evaporates into a puff of dark and oily mist, and all the Facebook money is redistributed to the world’s privacy-loving children, and Mark Zuckerberg is forced to live in penitent exile in a hole in the forest on a very remote island. We are not going to harpoon this stupid privacy-hating white whale right now. 

What we can do is slowly starve Facebook: by cutting down on our time using Facebook and the amount of information we share with it, we can reduce its ration of nutrient-rich data krill. Facebook’s advertisers are dependent on your attention and on Facebook’s knowledge about you, and their job gets a lot harder if you provide less of both. By starving Facebook, we reduce its power over us and its power over our government and over our minds. It’s absolutely true that users can only do so much: we are going to need regulation with teeth to truly loosen Facebook’s grip over our societies. Still, we can help bring about that regulation and help alter how our communities approach Facebook by altering our own behavior and helping others do the same.

I don’t necessarily want Facebook to die (though I’m not sure I’d be very sad). I do want it to be humbled. I want Facebook and its leaders to realize that we do live in a world where actions have consequences, and where the actions of gigantic companies that control mind-exploding quantities of data have some of the most important consequences of all.  We can do our small part to make this happen. In short: Facebook users of the world, unite. You have nothing to lose but constant interaction with your racist uncle. 

Facebook Believes Americans Are Good at Evaluating Their Sources, And Other Comfortable Delusions

oh my god shut up

Mark Zuckerberg would like you to know that he cares a lot about disinformation and bots and propaganda. He is very concerned about this, and is also very aware that he possesses terrifying technological powers. (See, his brow! Consider how it furrows!) And so on January 19th, he made another one of his big announcements.  He’s decided, in his serene wisdom, to trust the people of Facebook to determine what is true. Nothing could possibly go wrong.  

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Zuckerberg chirped in his announcement (I always imagine him chirping in these, like a smug billionaire chickadee). “We decided that having the community determine which sources are broadly trusted would be most objective.” Users will be asked to rate the credibility of news sources, though only those that Facebook determines they are familiar with, through some mysterious and possibly eldritch method. These “ongoing quality surveys” will then be used to determine which news sources pop up most often in users’ news feeds. Will there be any effort to correct for craven partisan sentiment? No, apparently there will not be. Will there be some mechanism for avoiding another mass and gleeful ratfucking by 4chan and 8chan and whatever other slugbeasts lurk within the Internet? No, apparently there will not be. Everything will be fine!
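Facebook never published the actual mechanics of these surveys, but here’s a toy sketch – invented numbers, a hypothetical newspaper, and my own assumption of a simple average – of why a crowd-sourced trust score with no correction for partisanship or coordination is so easy to game:

```python
# A toy model, NOT Facebook's actual system (which was never made public):
# if "broad trust" is just an uncorrected average of user ratings, a small
# coordinated brigade can sink any outlet it dislikes.
from statistics import mean

def trust_score(ratings):
    """Average a list of 1-5 credibility ratings into a single score."""
    return mean(ratings)

# 200 ordinary users rate a hypothetical local newspaper fairly highly...
organic_ratings = [4] * 150 + [3] * 50
# ...then 300 coordinated accounts each submit the minimum possible rating.
brigade_ratings = [1] * 300

print(trust_score(organic_ratings))                    # 3.75 -> "broadly trusted"
print(trust_score(organic_ratings + brigade_ratings))  # 2.1  -> buried in the feed
```

Any real fix would have to down-weight or exclude coordinated raters – which is precisely the kind of judgment call Facebook says it isn’t comfortable making.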

On January 19th, we learned that Facebook is the last organization in the entire world that still has great faith in the research and assessment powers of the average American. Is Facebook actually that unfathomably, enormously naive? Well, maybe. Or perhaps they are, once again, betting that we are stupid enough to believe that Facebook is making a legitimate effort to correct itself, and that we will then stop being so mad at them. 

Which is insulting. 

Any creature more intelligent than an actual avocado knows that Facebook’s user-rating scheme is doomed to miserable failure. Researchers  Alan Dennis, Antino Kim and Tricia Moravec elegantly diagnosed the project’s many, many problems in a Buzzfeed post, drawing on their research on fake news and news-source ratings. They conclude, as you’d think should be obvious, that user-ratings for news sources are a very different thing than user-ratings for toasters. “Consumer reviews of products like toasters work because we have direct experience using them,” they wrote. “Consumer reviews of news sources don’t work because we can’t personally verify the facts from direct experience; instead, our opinions of news are driven by strong emotional attachments to underlying sociopolitical issues.”

Facebook, if we are to believe that they are not actively hoodwinking us, legitimately believes that the American people have, in the past year, somehow become astute and critical consumers of the news. But this is impossible. Facebook’s magical thinking is roughly equivalent to putting a freezer-burned Hot Pocket in a microwave and hoping that it will, in three minutes, turn into a delicious brick-oven pizza. There is no transmutation and there is no improvement. The Hot Pocket of ignorance and poor civic education will remain flaccid and disappointing no matter how much you hope and wish and pray.

there is some trippy ass clipart for Facebook on pixabay

This doesn’t mean there is no hope for the information ecosystem of the United States. It does not mean that this ongoing nightmare is permanent. As Dennis, Kim, and Moravec suggest, Facebook could grow a spine and start employing actual experts. Experts empowered to filter. Experts who are empowered to deem what is bullshit and what is not. But of course, this is what scares them most of all. See what Zuckerberg wrote in his Big Announcement: “The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

“Not comfortable with.” Consider that wording. They’re not comfortable with doing the one thing that might actually help to dislodge the cerebral-fluid sucking leech that is currently wrapped around the brainstems of the social-media using public. It would be so awful if Facebook was made uncomfortable.

And it will do anything to avoid discomfort. Mark Zuckerberg and Facebook are simply abdicating responsibility again. They know that these “checks” won’t work. They know damn well that hiring editors and engaging in meaningful moderation is what they haven’t tried, and what is most likely to work, and what is most likely to earn them the ire of the Trump cult that now squats wetly in the White House. Cowardice has won out, again: they’ve simply come up with another semi-clever way to fob off responsibility on their users. When these “credibility checks” inevitably fail or are compromised by hordes of wild-eyed Pepes, Facebook will, right on schedule, act surprised and aghast, then quickly pretend it never happened. You should be insulted that they think we’ll just keep falling for this. We have to stop falling for this.

These so-called credibility checks are just Facebook’s latest milquetoast and insulting effort to pretend it is dealing with its disinformation problem. Just a few weeks ago, Facebook announced that it would be reducing public content on the news feed. This is meant to social-engineer “meaningful social interactions with family and friends” for its users. This might sound well and good – if you are much more comfortable with being socially engineered by blank-eyed boys from Silicon Valley than I am – or at least it does until you hear from people who have already undergone this change. Facebook is fond of using countries in markets it deems insignificant as guinea pigs for its changes, and in 2017, Sri Lanka, Guatemala, Cambodia, Slovakia, Bolivia, and Serbia were shoved in the direction of “meaningful social interaction.” (One does wonder about the selection, considering the unpleasant history these nations share). The results were, to quote local journalists in Guatemala, “catastrophic.” Reporters in these countries suddenly found their publications – important sources of information in fragile political systems – deprived of their largest source of readership and income.

Adam Mosseri, head of Facebook’s News Feed, responded to these reporters’ anguish with the serene, Athenian calm that only tech evangelists can muster: “The goal of this test is to understand if people prefer to have separate places for personal and public content. We will hear what people say about the experience to understand if it’s an idea worth pursuing any further.” (Whoops, we broke your already-fragile democracy! Move fast! Break things!) Dripping a new shampoo line into a little white bunny rabbit’s quivering eyeballs is also a test. The difference between the two? Testing your new product on embattled reporters in formerly war-torn nations is much more socially acceptable.

Facebook has also recently attempted to socially engineer us into being better citizens. In late 2017, I wrote about Facebook’s ill-considered civic engagement tools or “constituent services,” which were meant to (in a nutshell) make it easier for you to badger your representative or for your representative to badger you back. Using these tools, of course, required a Facebook account – and you also had to tell Facebook where you lived, so it could match you up with your representative.  Facebook would very much like a world in which people need to submit to having a Facebook account to meaningfully communicate with their representatives. Facebook would, we can probably assume, very much like a world where pretty much everything is like Facebook. This is probably not going to change. 

Yes, I know: Zuckerberg furrowed his brow somewhere in his mansion and said that he might consider cutting his profits to reduce the gigantic social problem that he’s engendered. By that, he means doing things that might actually address the disinformation problem: these things might take a variety of forms, from actually hiring experts and editors, to actually paying for news (as, incredibly, Rupert Murdoch just suggested) to hiring and meaningfully compensating a competent army of moderators. But consider our available evidence.  Do we really believe that he’ll flout his (scary) board and do the right thing? Or will he and Facebook once again choose comfort, and do nothing at all? 

“We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard,” said John F. Kennedy, in a quote that I am deadly certain Facebook employees like to trot out as they perfect methods of micro-targeting underpants ads to under-25 men who like trebuchets, or perfect new Messenger stickers of farting cats, or sort-of-accidentally rupture American democracy. Perhaps someday Facebook will develop an appetite for dealing with things that are actually hard, that are actually uncomfortable.

I’m not holding my breath.