WSJ’s Facebook Files: behind the scandals and the paywall

Originally published on Sumner Newscow on October 10, 2021

When I was 14 years old, I got in trouble with my mom.

This wasn’t out of the ordinary; I’d been in trouble with her most of my life up to that point. But there was something new about this lecture: it was over something I did online. Kids my age were just starting to develop into the internet junkies we are today, but I’d never had a substantive conversation about it with an authority figure.

Why did I get in trouble? I lied about creating a Facebook account.

In Facebook’s early days, one had to be 18 to start an account, which was easy enough to lie about, but the platform was so much better than its predecessor Myspace that I needed to get on.

To start, it was far more user-friendly and made interaction easier than anything I’d experienced online up to that point.

Flash-forward more than a decade, and that novel social media site my mom yelled at me over has ballooned into one of the biggest companies in the world, one that may bear responsibility for global disasters like interference in the 2016 election and the worsening of the COVID-19 pandemic.

How we got here is a much larger discussion for a much longer piece, but we have new information thanks to thousands of leaked internal Facebook documents and a series of landmark articles from the Wall Street Journal titled the Facebook Files.

The internal memos, presentations, and intra-staff communications were provided by Frances Haugen, a whistle-blower and former employee who has since testified about their contents to Congress. The documents paint a horrid picture of Facebook’s criminal negligence in policing its platform and its ruthless pursuit of profits over the people who made them billionaires, and they show the company understood the harm it was causing the whole time.

In short, we now know what Facebook knew and when it knew it.

But because this is such an important series of articles and the Wall Street Journal is a little lax on its ethics, you won’t be able to read them unless you have an account. Here’s the quick and dirty version.

FB allows “elite” members to spread disinformation and hate speech they’d otherwise restrict

Facebook spouts, every chance it gets, that all its users “play by the same rules.” Not that we didn’t already suspect otherwise, but now we have proof that’s not the case at all.

Facebook actually has a policy of allowing “high-profile” accounts (politicians, celebrities, fake news “experts,” and so on) to violate those rules.

They get that privilege because Facebook is more terrified of being embarrassed in the media for overly restricting a big account than it is of failing to police what that account promotes.

Speaking of privilege, this practice is ironically named “whitelisting.”

The actual program Facebook uses to work around its community guidelines is called XCheck. It started out with a few dozen names that the company dedicated a staff member to look after. Now the list includes tens of thousands of names, which isn’t manageable, and it has created some high-level mishaps.

One such mishap was Trump’s notorious post, “When the looting starts, the shooting starts.” FB’s algorithm scored the post 90 out of 100 on the likelihood that it violated community standards, but Facebook didn’t take it down because of the former president’s status on the XCheck list.

Another, equally high-level offender was soccer star Neymar. Neymar has been credibly accused of rape by a Brazilian woman who claims he assaulted her in a Paris hotel.

The footballer responded by posting a video of his text conversation with her, in which he calls her a liar and reveals her real name, prompting thousands of his supporters to attack her online and in person. But because he was whitelisted, Facebook allowed that video to stay online for over a day.

The documents reviewed by WSJ claim the problem with XCheck is pervasive and touches every part of the company. Facebook has chosen on several occasions not to share data about the program with its own oversight board, so we still don’t know the full extent of the damage.

Instagram knows how much it hurts teenagers and it doesn’t care

It’s no secret that Instagram is toxic. More than any other social media platform, it seems almost purpose-built to make young people insecure about their bodies.

While SnapChat filters try to keep your attention on the face and make that face goofier, Instagram leans into the toxic “perfect body” motif. Its filters are designed to touch up any facial blemishes and add unnecessary contour, all adding up to a fictional, “idealized” version of yourself.

If you’re even passively interested in a workout routine or a makeup brand, Instagram will bury you in targeted ads for achieving the “perfect” body, whether that’s fitness programs or sketchy diet pills.

But why is that such a bad thing?

If adults want to chase a near-impossible body shape against all health and reason, let them enjoy those kinds of ads. And if the content really is that debilitating, they can log off the app entirely. Well, there are two big flaws in that thinking (besides the main one of crazy body standards):

  1. 40% of Instagram’s users are 22 years old and younger.
  2. It’s not so easy to just quit.

It’s not just me (and by extension, the Wall Street Journal) saying that. Facebook knows that and has known for a while.

FB ran its own study of young Instagram users in 2019 and 2020, examining how they feel after being on the app. It found a few key issues.

The biggest is that Instagram creates a “negative social comparison” that isn’t found on other social media platforms.

The comparison happens in young people, mostly girls, who see fictionalized bodies that they could never hope to achieve, nor should they want to. Instagram bombards its viewers with touched-up pictures (or even real ones) starring people with body types far different from the vast majority of its users.

Teenage testimonials from that study describe their feelings after being on the app as “like a kick in the gut” and say they “feel too big or too ugly after just a few minutes.”

The reward from Instagram works on young people’s brains almost like an addiction, too. Think of it more like a gambling problem than alcoholism: the chance to see something great, or even just to see their friends, keeps kids coming back again and again.

Body image issues affect all genders, but they’re especially harmful to young girls because of how society conditions them to be pretty in order to be valued. Instagram just gamifies that harmful practice and knows it.

Facebook is set to launch a new Instagram for kids 13 and younger next year.

Facebook already tried, and failed, to make its platform less toxic

If you’ve been on Facebook, you’re probably most familiar with the News Feed function. It’s the home page and pulls all the most relevant posts from your friends, pages you follow, targeted ads and so on into one place. You’ve probably also noticed that it’s gotten much worse recently.

That’s largely due to a big change Facebook made to value “meaningful social interactions” over any other metric.

Meaningful social interactions are those between friends and family members rather than brands. FB meant to highlight community, arguably what drew people to the website in the first place, but it made some critical mistakes in how it measured that.

Here’s how FB decides what is “meaningful”:

  • Emoticon reacts rather than just likes;
  • Shares (no matter who the original poster was, more on that in a second);
  • Comments;
  • Longer comments with “passionate” arguments.
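To make the incentive problem concrete, here’s a minimal sketch of how a weighted scoring system like this could work. The signal names and point values below are my own illustrative guesses, not Facebook’s actual numbers; the leaked documents describe which signals were valued, not a public formula.

    # A toy "meaningful social interactions" scorer. The weights are
    # hypothetical, chosen only to mirror the ranking described above:
    # reactions beat plain likes, and reshares and long, "passionate"
    # comments count the most.
    WEIGHTS = {
        "like": 1,
        "reaction": 5,       # anger, love, etc. outweigh a plain like
        "comment": 15,
        "long_comment": 30,  # impassioned back-and-forth is prized
        "reshare": 30,       # credited regardless of the original poster
    }

    def msi_score(interactions):
        """Sum weighted interaction counts into one ranking score."""
        return sum(WEIGHTS.get(kind, 0) * count
                   for kind, count in interactions.items())

    # A pleasant post that only collects likes loses badly to an
    # inflammatory one that provokes anger, arguments, and reshares.
    calm_post = {"like": 200, "comment": 5}
    angry_post = {"reaction": 80, "long_comment": 40, "reshare": 25}

    print(msi_score(calm_post))   # 275
    print(msi_score(angry_post))  # 2350

Under any weighting along these lines, whatever provokes the strongest reactions floats to the top of the feed, which is exactly the problem described below.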

In addition to declining users overall, Facebook has been bleeding “active” users in recent years. An active user is anyone who comments, shares, and otherwise interacts with people daily, as opposed to every once in a while.

Creating a value system like the one above would reward those who use the app more often, whether they know it or not.

The big problem came when they realized that controversial subjects performed better than anything else under the new value system. If someone posted a nasty political take, a fake news story, or something insensitive, it was more likely to receive the “anger” reaction, impassioned comments, and long comment chains, all things the new system disproportionately overvalued.

Worse, the metrics didn’t care who originally posted those negative stories once they were shared. That meant trolls could willfully create fake news, then spread it through all of Facebook without much issue.

This is what people mean when they say Russia “hacked” the 2016 election. It was almost exactly their strategy, and Facebook didn’t attempt to fix the problem until 2019.

It failed.

Facebook knows drug cartels and human traffickers use its platform

This particular problem plays out largely outside the USA and Canada, as do 90 percent of Facebook’s users these days. That’s concerning when you know the company spends over 80 percent of its moderation time on North American content, far more than on the rest of the world combined.

Hate groups and terrorists using Facebook to organize isn’t a new phenomenon.

From the Myanmar genocide over the last few years to the domestic terrorist attack on the U.S. Capitol in January, Facebook has plenty of blood on its hands already. But we’ll focus on three new examples of the website’s culpability in major crimes.

The first concerns CJNG, the largest drug cartel in Mexico.

CJNG uses Facebook to create events for its “training camps,” where the descriptions lay out in plain language that enrollees will be tortured or killed if they try to leave early. On top of that, the cartel has run several Instagram accounts proudly displaying gold-plated guns and gruesome pictures of its victims.

Facebook is keenly aware of CJNG and the verbiage it uses to post on the site, but the company has been slow to boot it off the platform. The cartel uses thinly veiled acronyms and slang to slip past Facebook’s automatic hate-speech sensors, and when its accounts do get kicked off, it’s far too easy to make new ones.

Facebook, unsurprisingly, is loath to explain why it’s having such a hard time keeping the cartel off its service, and it’s under no obligation to explain further. Unless Congress makes it, of course. Whether it’s a lack of manpower or a lack of competency, FB is at serious fault.

The second criminal enterprise heavily misusing Facebook is a human trafficking ring in the Middle East.

I would love to tell you a specific country rather than the broad, opaque (and arguably racist) classification of “the Middle East,” but WSJ declines to give more details.

The ring in question posts ads for “maids” on Facebook and includes a list of their “skills” that are made up of sexual acts. It’s sexual slavery and the women forced into it are usually kidnapped and have their families threatened if they attempt to escape.

The reason we, the public, even know about this “maids” scheme is that the BBC ran a story about it. Facebook didn’t remove the ads until Apple threatened to kick the app off its App Store.

In an internal memo, a Facebook researcher fully admits the company knew about the practice before the BBC story dropped.

In that same admission, the company stated it didn’t want to “alienate” the buyers (read: slavers). That habit of valuing website engagement over human life will be even more apparent in the next example.

The third and final case (that I’ll mention) involves Ethiopian citizens and politicians openly calling for the genocide of a minority group on Facebook, uninterrupted.

The Tigrayan minority is a small section of the Ethiopian populace and one under extreme threat recently. Public figures write terrible posts that call them dogs and demand their extinction, while others echo the sentiment to a terrifying degree.

And if you remember all the way back to one story ago, Facebook loves that sort of “drama.”

There’s a special sort of racism at play in this part of the story, however. Facebook, a company valued at nearly 900 billion dollars, refuses to employ someone who speaks the languages necessary to monitor that hate speech.

So, not only is the company ignoring the problem, it refuses to pay even one person to read it.

A former Facebook vice president told WSJ that the company sees the violence it helps cause in developing countries as “just the price of doing business.” So, maybe they shouldn’t be allowed to do that business anymore.

Facebook is the home of anti-vax conspiracy theories

Of course, it is.

Before going any further on this: if you’re still unsure whether you should get the COVID-19 vaccine, please read my earlier article, “25 COVID vaccine myths debunked,” which should convince you to get the shot.

If you aren’t unsure and simply refuse to get it, however, you’re either a bad person, being lied to, or both, and Facebook is largely to blame for that.

An internal memo found that 41 percent of FB comments about the vaccine promoted misinformation, yet the company has been hopeless at stopping it. It also doesn’t know how to crack down on anti-vax’s cousin, the QAnon conspiracy theory, so it’s especially bad at this.

It turns out most of the disinformation comes from a select few accounts, which the company internally calls “big whales.”

They discovered that of the more than 150,000 accounts deactivated for vaccine misinformation, just 5 percent of them produced over half of all the content. That means even when the misinformation on your timeline looks like a random article your uncle shared, it more likely originated with a select few trolls dedicated to pushing harmful conspiracy theories.

Facebook causes a lot of harm, but the attack on truth it has helped wage is immeasurable and will have the longest-lasting impact of anything discussed in this article.

So, what should we do?

It’s not like we can just ban Facebook.

Facebook isn’t just a social media website anymore. It’s how groups find each other, the vast majority of which aren’t dangerous. It’s how friends organize a barbecue. It’s the only way we remember our relatives’ birthdays.

Instagram is the avenue by which thousands of people make their living. It also provides a great outlet for healthy expression in addition to its harmful practices.

WhatsApp, another Facebook property, is the most common form of communication in the entire world. When it went down the other day (mysteriously, at the same time Facebook’s whistle-blower was appearing before Congress), millions of people were suddenly cut off from friends and family.

What we should’ve done is not let it get this big. But such an action would’ve been a crime against the great god of capitalism, so now we have two options.

Regulate them

Since Facebook is basically infrastructure now, let’s treat it like infrastructure and appoint a few dozen people to look after it.

Facebook already tried appointing its own oversight board after its last major scandal (when it got caught letting Cambridge Analytica harvest our personal information ahead of the 2016 election), and it lied to that board. It’s time we put some people in the room it can’t lie to.

There should be a legitimate oversight board that has the power to see any part of Facebook’s communications, data, and financial records it deems relevant to public safety.

That might sound like “government overreach” to you, but we’ve literally never had a problem like this before. And it’s becoming harder and harder to watch these old members of Congress, who don’t even know how Facebook makes money, try to hold the company accountable.

It’s time to get some less technophobic people in power to make these decisions and watch over Mark Zuckerberg’s shoulder.

Break them up

Back in the old days of America, when businesses got too big and started hurting people, we broke them up.

Facebook is the principal conduit for communication and social media posts, through its flagship platform and all its affiliate companies. And as we’ve seen from these files, the company isn’t capable of managing it all.

So, let’s force them to at least sell off Instagram and WhatsApp. I don’t care who they sell to or who remains in charge, but the first step to holding everyone accountable is making sure each company only has itself to worry about.

No matter what we do, however, it’s vital we do it soon and do it right. Because how much more of this can we take?
