Faine Opines

Southeast Asia, liberation technology, drones, and pontification


You Don’t Want to Delete Your Facebook (And That’s OK)

Everyone should stop using Facebook. Everyone is not going to stop using Facebook this week. That’s OK. There’s a middle ground between deleting your account forever and spending all of your waking, earthly hours refreshing your Facebook feed. And we should be telling our relatives and friends about that middle ground, instead of telling them they have to stop using Facebook right away. We can counter the sense of helplessness that many people feel about their relationship to Facebook and other social media platforms – but we’re going to need to do it in an incremental, careful way.

You’re reading this and you don’t live under a rock, so you’re probably very much aware that Facebook is under an immense amount of heat right now. Last weekend, the world found out that the eminently creepy (albeit over-hyped) Cambridge Analytica voter-profiling company scraped 50 million Facebook users’ profiles, information that they used to target voters on behalf of Donald Trump in the 2016 election. It was the latest in a year-and-a-half-long succession of failures and embarrassments for Facebook, from their widely derided failure to do something about Russian bots to their ham-fisted attempts at fighting fake news to their disturbing treatment of their underpaid moderators.

Facebook seemed invulnerable for a very long time, but this latest scandal, on top of all the others, actually seems to have wounded it: Facebook’s stock value dropped by as much as 8 percent in the US and the UK, and Mark Zuckerberg had already lost $9 billion of his net worth by Tuesday. The leviathan has been hit, there’s blood in the water, and it’s easy for us privacy paranoiacs to feel like Captain Ahab. For the first time that we can remember, there’s an opportunity to take Facebook down, or at least to weaken it by reducing its user base.

“You’ve got to delete your Facebook profile, it’s the only way!” we tell our friends and relatives, waving around our harpoons. Whereupon our friends and relatives smile (so as not to provoke us) and back away. They don’t want to delete Facebook entirely, or maybe they can’t due to their job, or because it’s the only way to communicate with their families. It’s not like it matters, they think. Facebook already has all my information. 

The Captain Ahab approach is not a good way to get people to alter their social media habits, and it’s not a good way to convince people to better protect their privacy from companies like Facebook. When we get all Captain Ahab, we’re forgetting some important realities about human beings and how most of them actually feel about Facebook.

 

“Ego non baptizo te in nomine patris, sed in nomine diaboli!” (“I baptize you not in the name of the father, but in the name of the devil!”)

First: we forget that most people don’t know much about how their Facebook data is used and abused. Endless news stories about the evils of Facebook may fill them with unease and distrust, but they’re not getting much good information on what to do about it. A 2015 Pew Research Center survey found that 47% of Americans lacked confidence in their knowledge of what companies actually do with their personal data. This 2016 survey from the UK found that while 74% of Internet users believed that they adequately protected their online data, only 28% of respondents had actually turned off location tracking on the platforms they use, and only 31% had changed their social media privacy settings. This confusion exists in part because Facebook and other social media companies have done a bang-up job of obfuscating what they’re up to: another recent study found that Facebook’s privacy policy became much less transparent and much harder for mere mortals to understand in the decade from 2005 to 2015.

Your conflicted Facebook-using friends and relations are living in what frustrated security researchers call the “privacy paradox”: most people will swear up and down that privacy is important to them, and then will continue to share their personal information widely on the Internet. This is not because they are stupid. It is because they believe that they live in a dark, howling Internet panopticon from which they cannot escape. (I’m exaggerating, but only kinda.) This 2016 focus-group study found that young people were aware of the risks of sharing their information online. They just didn’t think they could do anything about those risks: they felt that “privacy violations are inevitable and opting out is not an option.” They’ve fallen prey to privacy cynicism, which is defined rather succinctly by these researchers as “an attitude of uncertainty, powerlessness and mistrust towards the handling of personal data by online services, rendering privacy protection behavior subjectively futile.”

I met a traveler in an antique…look, you get it, hint hint.

Many people also feel powerless because they think Facebook is unkillable. The average person viewed Facebook as a doofy college-kid rumor service back in 2007; now, most see it as something like an unstoppable, inescapable international hive-mind. I’m sure Facebook would be just fine with being viewed, sometime decades hence, as an inscrutable but appeasable deity: provide your data tribute, and the crops will flourish! Withhold your tribute and face its wrath! Facebook knows no past or present or death!

Thankfully, this is horseshit.

In the last 20 years, we’ve watched former juggernauts like AOL, Yahoo, MySpace, Ask Jeeves, and many, many more weaken and die. From one point of view, Facebook is already dying: young people have correctly identified that Facebook is now dominated by their elderly and incoherent relatives, and they’re ditching the platform in droves. For the first time in a decade, Facebook usage amongst Americans has decreased, dropping from 67% to 62%, while Google and YouTube usage continues to grow.

These people might be backing away because they’ve lost trust in Facebook. Trust is everything for social media companies like Facebook: people’s willingness to share the data that these companies must feast upon to survive depends upon how much they trust the platform not to wantonly abuse it. A 2017 study from the UK found that only one in four Britons trust social media, and a majority believe that social media companies aren’t adequately regulated. A mere 35 percent of Bay Area residents say that they trust social media companies. A study from October found that while a majority of respondents do believe that Facebook’s effect on society is positive overall, they also trusted Facebook the least of the “big five” tech companies (and only 60% knew that Facebook owns Instagram). We can work with this.

Yep, that’s a hideous dolphin figurine.

Second: people are absolutely horrible at going cold turkey on things. Look, I’ve spent many, many hours of my fleeting and precious life sitting slack-jawed on my couch, refreshing Facebook like a Skinner-box-trained rat. I know that it’s fiendishly hard to stop using social media. Some scientists now believe that social media can be the focus of a true psychological addiction, just like World of Warcraft or gambling or collecting hideous dolphin figurines. A PLOS One study found that heavy Internet users exhibited physical “withdrawal” symptoms and anxiety when they suddenly stopped using social media.

Changing your relationship with the Internet and social media is particularly difficult because they are such fundamental parts of modern life: abstinence isn’t really an option. You can live a normal, productive life without WoW or cigarettes, but it’s just about impossible to live normally without the Internet. It can also be hard to go without Facebook: many people do need it for their jobs, or to stay attached to relatives who may not be as up for getting off Facebook as they are. 

So what can we ask people to do? What are some realistic, relatively easy things that people can do to better protect their privacy? How can people scale back their Facebook usage and the data they share with Facebook, without deleting their profiles entirely? Here are some suggestions.

  • Figure out the motivation behind your compulsion to use Facebook. “Cyber psychologist” John Suler (what a great job title) suggested this type of scrutiny in a Quartz article: “Is it a need for dependency, to feel important and powerful, to express anger, to release oneself from guilt? In compulsive behaviors, people are expressing such needs but rarely does the activity actually resolve those needs.” If you know why you’re spending hours combing through your colleague’s second-cousin’s dog photos, you’ll have a better sense of what you need to do to stop. You can also try restriction apps, like Self Control – they’ve helped me reduce my own “pigeon pecking at a button” behavior immensely. 
  • Turn off location sharing. I do not use location sharing on any of my devices. There is no good reason for Facebook to know where you are. 
  • Turn off Facebook’s platform feature. This feature is what allows third-party apps and other websites to integrate with Facebook, and it’s also what permits these third-party apps to slurp up lots and lots of your data. Shut that sucker off. No, you won’t be able to play Farmville anymore, you deviant. 
  • Review your third-party app settings. If you don’t want to take the nuclear option of turning off Facebook’s platform –  though you really, really should – you can still review your third party app settings and revoke access to apps you distrust. (Don’t trust any of them). Buzzfeed has a good guide here. You should do this for all the social media sites you use, not just Facebook. 
  • Stop liking things. “Likes” give Facebook useful information on how to advertise to you. Do not do that. 
  • Stop Facebook from tracking you across the Internet. Facebook extensively tracks users, both on the platform and on sites that have a Facebook “like” button – yes, they’re following you even when you aren’t on Facebook itself. There are a number of good ways to stop this tracking, on your computer and on your phone: I like the uBlock Origin browser plug-in, and the 1Blocker app for mobile devices. (A minimal example of the filter-rule approach follows at the end of this list.)
  • Lock down your privacy settings. Review your privacy settings at least once a month: Facebook has an infuriating habit of resetting them. 
  • Delete as much information as you can possibly stand from your Facebook profile. Delete as many old posts as you can possibly stand. You can download your Facebook archive if you don’t want to lose those memories entirely. 
  • Opt out of Facebook’s targeted ads. They are majestically creepy, and you should opt out of them right now. You can do this in your Facebook account settings. HowToGeek has a nice guide to opting out of these ads on multiple platforms.
  • Read these other guides to protecting your privacy on Facebook. Here’s a good one from the Guardian. Here’s one from CNBC. Here’s one from Motherboard.
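For the tracking item above, here is a minimal sketch of what blocking Facebook’s off-site trackers can look like in practice: a few static filter rules you could paste into uBlock Origin’s “My filters” pane (other content blockers accept very similar syntax). The specific domains are my own illustrative picks for where Facebook’s embedded widgets and tracking pixels tend to load from, not an official or exhaustive list.

    ! Block Facebook resources when they are embedded on other people's sites,
    ! while still letting facebook.com load when you visit it directly.
    ||facebook.com^$third-party
    ||facebook.net^$third-party
    ||fbcdn.net^$third-party

The “third-party” option is the important bit: it cuts off the like-button and tracking-pixel requests that fire on other sites, while leaving ordinary visits to Facebook itself alone.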

Don’t get me wrong. We do not live in a perfect world. We Captain Ahabs are not going to convince every Facebook user to rise up and delete their profile as one in a Glorious Attention Revolution, in which Facebook evaporates into a puff of dark and oily mist, and all the Facebook money is redistributed to the world’s privacy-loving children, and Mark Zuckerberg is forced to live in penitent exile in a hole in the forest on a very remote island. We are not going to harpoon this stupid privacy-hating white whale right now.

What we can do is slowly starve Facebook: by cutting down on our time using Facebook and the amount of information we share with it, we can reduce its ration of nutrient-rich data krill. Facebook’s advertisers are dependent on your attention and on what Facebook knows about you, and their job gets a lot harder if you provide less of both. By starving Facebook, we reduce its power over us, over our government, and over our minds. It’s absolutely true that users can only do so much: we are going to need regulation with teeth to truly loosen Facebook’s grip on our societies. Still, we can help bring about that regulation and help alter how our communities approach Facebook by altering our own behavior and helping others do the same.

I don’t necessarily want Facebook to die (though I’m not sure I’d be very sad). I do want it to be humbled. I want Facebook and its leaders to realize that we do live in a world where actions have consequences, and where the actions of gigantic companies that control mind-exploding quantities of data have some of the most important consequences of all.  We can do our small part to make this happen. In short: Facebook users of the world, unite. You have nothing to lose but constant interaction with your racist uncle. 


Facebook Believes Americans Are Good at Evaluating Their Sources, And Other Comfortable Delusions

oh my god shut up

Mark Zuckerberg would like you to know that he cares a lot about disinformation and bots and propaganda. He is very concerned about this, and is also very aware that he possesses terrifying technological powers. (See, his brow! Consider how it furrows!) And so on January 19th, he made another one of his big announcements.  He’s decided, in his serene wisdom, to trust the people of Facebook to determine what is true. Nothing could possibly go wrong.  

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Zuckerberg chirped in his announcement (I always imagine him chirping in these, like a smug billionaire chickadee). “We decided that having the community determine which sources are broadly trusted would be most objective.” Users will be asked to rate the credibility of news sources, though only those that Facebook determines they are familiar with, through some mysterious and possibly eldritch method. These “ongoing quality surveys” will then be used to determine which news sources pop up most often in users’ news feeds. Will there be any effort to correct for craven partisan sentiment? No, apparently there will not be. Will there be some mechanism for avoiding another mass and gleeful ratfucking by 4chan and 8chan and whatever other slugbeasts lurk within the Internet? No, apparently there will not be. Everything will be fine!

On January 19th, we learned that Facebook is the last organization in the entire world that still has great faith in the research and assessment powers of the average American. Is Facebook actually that unfathomably, enormously naive? Well, maybe. Or perhaps they are, once again, betting that we are stupid enough to believe that Facebook is making a legitimate effort to correct itself, and that we will then stop being so mad at them. 

Which is insulting. 

Any creature more intelligent than an actual avocado knows that Facebook’s user-rating scheme is doomed to miserable failure. Researchers  Alan Dennis, Antino Kim and Tricia Moravec elegantly diagnosed the project’s many, many problems in a Buzzfeed post, drawing on their research on fake news and news-source ratings. They conclude, as you’d think should be obvious, that user-ratings for news sources are a very different thing than user-ratings for toasters. “Consumer reviews of products like toasters work because we have direct experience using them,” they wrote. “Consumer reviews of news sources don’t work because we can’t personally verify the facts from direct experience; instead, our opinions of news are driven by strong emotional attachments to underlying sociopolitical issues.”

Facebook, if we are to believe that they are not actively hoodwinking us, legitimately believes that the American people have, in the past year, somehow become astute and critical consumers of the news. But this is impossible. Facebook’s magical thinking is roughly equivalent to putting a freezer-burned Hot Pocket in a microwave and hoping that it will, in three minutes, turn into a delicious brick-oven pizza. There is no transmutation and there is no improvement. The Hot Pocket of ignorance and poor civic education will remain flaccid and disappointing no matter how much you hope and wish and pray.

there is some trippy ass clipart for Facebook on pixabay

This doesn’t mean there is no hope for the information ecosystem of the United States. It does not mean that this ongoing nightmare is permanent. As Dennis, Kim, and Moravec suggest, Facebook could grow a spine and start employing actual experts. Experts empowered to filter. Experts who are empowered to deem what is bullshit and what is not. But of course, this is what scares them most of all. See what Zuckerberg wrote in his Big Announcement: “The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

“Not comfortable with.” Consider that wording. They’re not comfortable with doing the one thing that might actually help to dislodge the cerebral-fluid sucking leech that is currently wrapped around the brainstems of the social-media using public. It would be so awful if Facebook was made uncomfortable.

And it will do anything to avoid discomfort. Mark Zuckerberg and Facebook are simply abdicating responsibility again. They know that these “checks” won’t work. They know damn well that hiring editors and engaging in meaningful moderation is what they haven’t tried, and what is most likely to work, and what is most likely to earn them the ire of the Trump cult that now squats wetly in the White House. Cowardice has won out, again: they’ve simply come up with another semi-clever way to fob off responsibility on their users. When these “credibility checks” inevitably fail or are compromised by hordes of wild-eyed Pepes, Facebook will, right on schedule, act surprised and aghast, then quickly pretend it never happened. You should be insulted that they think we’ll just keep falling for this. We have to stop falling for this.

These so-called credibility checks are just Facebook’s latest milquetoast and insulting effort to pretend it is dealing with its disinformation problem. Just a few weeks ago, Facebook announced that it would be reducing public content on the news feed, the aim being to social-engineer “meaningful social interactions with family and friends” for its users. This might sound well and good – if you are much more comfortable with being socially engineered by blank-eyed boys from Silicon Valley than I am – or at least it does until you hear from people who have already undergone this change. Facebook is fond of using countries from markets it deems insignificant as guinea pigs for its changes, and in 2017, Sri Lanka, Guatemala, Cambodia, Slovakia, Bolivia, and Serbia were shoved in the direction of “meaningful social interaction.” (One does wonder about the selection, considering the unpleasant history these nations share.) The results were, to quote local journalists in Guatemala, “catastrophic.” Reporters in these countries suddenly found their publications – important sources of information in fragile political systems – deprived of their largest source of readership and income.

Adam Mosseri, head of Facebook’s News Feed, responded to these reporters’ anguish with the serene, Athenian calm that only tech evangelicals can muster: “The goal of this test is to understand if people prefer to have separate places for personal and public content. We will hear what people say about the experience to understand if it’s an idea worth pursuing any further.” (Whoops, we broke your already-fragile democracy! Move fast! Break things!) Dripping a new shampoo line into a little white bunny rabbit’s quivering eyeballs is also a test. The difference between the two? Testing your new product on embattled reporters in formerly war-torn nations is much more socially acceptable.

Facebook has also recently attempted to socially engineer us into being better citizens. In late 2017, I wrote about Facebook’s ill-considered civic engagement tools or “constituent services,” which were meant to (in a nutshell) make it easier for you to badger your representative or for your representative to badger you back. Using these tools, of course, required a Facebook account – and you also had to tell Facebook where you lived, so it could match you up with your representative.  Facebook would very much like a world in which people need to submit to having a Facebook account to meaningfully communicate with their representatives. Facebook would, we can probably assume, very much like a world where pretty much everything is like Facebook. This is probably not going to change. 

Yes, I know: Zuckerberg furrowed his brow somewhere in his mansion and said that he might consider cutting his profits to reduce the gigantic social problem that he’s engendered. By that, he means doing things that might actually address the disinformation problem: these things might take a variety of forms, from actually hiring experts and editors, to actually paying for news (as, incredibly, Rupert Murdoch just suggested) to hiring and meaningfully compensating a competent army of moderators. But consider our available evidence.  Do we really believe that he’ll flout his (scary) board and do the right thing? Or will he and Facebook once again choose comfort, and do nothing at all? 

“We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard,” said John F. Kennedy, in a quote that I am deadly certain Facebook employees like to trot out as they perfect methods of micro-targeting underpants ads to under-25 men who like trebuchets, or perfect new Messenger stickers of farting cats, or sort-of-accidentally rupture American democracy. Perhaps someday Facebook will develop an appetite for dealing with things that are actually hard, that are actually uncomfortable.

I’m not holding my breath. 

