How Social Media Giants like Facebook Successfully Monetized Hate


As we know, Facebook (and with it Instagram, WhatsApp, and Oculus VR) went down for several hours on Monday.

The Guardian’s UK technology editor, Alex Hern, shared how this likely happened: Facebook’s OWN operations run THROUGH Facebook. Here’s Hern’s simplified breakdown:

“Facebook (accidentally, we assume) sent an update to a deep-level routing protocol on the internet that said, basically, ‘hey we don’t have any servers any more xoxo’…

Normally, this would be quite easy to fix. You just send another update saying, ‘oh, don’t worry, we have servers, they’re here, xoxo.’ Things still break, it takes a while for the message to spread to all corners of the internet, egg on face, but livable…

But Facebook runs EVERYTHING through Facebook…

So when its servers were booted off the internet, it also booted off… the ability to send that follow-up message…

And the ability to log-in to the system that would send the follow-up message…

And the ability to use the smartcard door lock to the building that contains the servers that control the system that sends the follow-up message…

And the messaging service you use to contact the head of physical security to tell them they need to high-tail it to the data centre out east with a physical key to override the smartcard door lock on the front door…

FACEBOOK: ‘We’ll run everything our company does through our own products and platforms, that way we’ll have the tightest corporate culture imaginable!’

*FACEBOOK accidentally deletes its products and platforms from the entire internet*

FACEBOOK: ‘Oh no.’”
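To make that chain of failures a little more concrete, here is a toy Python sketch of the dependency loop Hern describes. It is not real BGP, and every name in it is invented; it only illustrates why withdrawing your own routes can also withdraw your ability to un-withdraw them.

```python
# Toy illustration of the dependency loop Hern describes: NOT real BGP,
# just a sketch. Every name and structure here is invented.

# A simplified stand-in for "the internet's routing table":
# which networks the rest of the internet currently knows how to reach.
routing_table = {
    "facebook.com servers": True,
    "facebook internal tools": True,  # dashboards, chat, smartcard door locks...
}

def reachable(network):
    return routing_table.get(network, False)

def publish_routing_update(networks, available):
    """Announce (or withdraw) reachability for a set of networks.

    The catch: pushing an update requires Facebook's own internal tools,
    because Facebook runs everything through Facebook.
    """
    if not reachable("facebook internal tools"):
        raise RuntimeError("can't reach the tools needed to push the fix")
    for net in networks:
        routing_table[net] = available

# 1. The accidental update: "hey we don't have any servers any more xoxo"
publish_routing_update(["facebook.com servers", "facebook internal tools"], available=False)

# 2. The attempted follow-up: "oh, don't worry, we have servers, they're here, xoxo"
try:
    publish_routing_update(["facebook.com servers", "facebook internal tools"], available=True)
except RuntimeError as err:
    print("Stuck:", err)  # the fix can't be published from inside the outage
```

The second call fails for the same reason the first one succeeded: the tools that publish routing updates sit behind the very routes that were withdrawn.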

But there’s, of course, the other circulating concern: it’s odd that this happened the morning after 60 Minutes aired a special in which whistleblower Frances Haugen shared that, during her time at Facebook, she saw “conflicts of interest between what was good for the public and what was good for Facebook.”

So let’s learn more about Facebook’s business model… especially what else runs on their own platforms...

Facebook’s OWN research found that 64% of the time a person joined an extremist Facebook group, they did so because the platform recommended it.

(Roger McNamee, author of Zucked: Waking Up to the Facebook Catastrophe).

Facebook also acknowledged that pages and groups associated with QAnon extremism had at least 3 million members.

It’s clear then: by Facebook’s own numbers, its recommendations helped radicalize nearly 2 million people (64% of 3 million is roughly 1.9 million).

And with this year having been kicked off by an insurrection of extremists, it’s not far-fetched to say that Facebook (and thereby Instagram) need to be held accountable as enablers of these groups, that is, of the people joining and subscribing to extremist, violent ideologies, especially given the provable, large-scale spread of the misinformation that fuels them.

We’ve heard a lot about how social media can’t legally be regulated because of freedom of speech, and about how the platforms are actually protected by Section 230 of the Communications Decency Act of 1996, which shields interactive computer services from liability for user-generated content.

That protection was originally created so that, fairly enough, a carrier like T-Mobile can’t be held liable if two people using its network talk about doing something illegal (for example). But things have gotten a lot more nuanced since 1996, and the regulations have done anything but keep up with the times.

Roger McNamee points out how social media corporations hide behind the First Amendment to justify their policies, claiming they don’t want to be arbiters of truth.

But algorithmic amplification of extreme (and often false) content is a business choice made in pursuit of profit (just as Haugen shared).

Eliminating or altering these algorithms would reduce the harm from hate speech, disinformation, and conspiracy theories, without any limitation on free speech. As Renee DiResta of the Stanford Internet Observatory says, free speech is not the same as free reach.

Indeed, in America, you can absolutely stand up on a park bench and scream lies so long as you’re not harming anyone. But what companies like Facebook actually do is amplify the screaming to people who otherwise wouldn’t be looking for it. And when people see it over and over again, the disinformation becomes truth in their minds, even if it has no factual standing.

Similarly, YouTube’s video recommendation algorithm drives roughly 700 million hours of watch time, about 70% of the platform’s use, per day. Guillaume Chaslot, who helped write YouTube’s recommendation algorithms, actually left YouTube after discovering the harm it was doing to society:

The algorithm makes no effort to prioritize what is truthful or balanced, and it was notably not neutral in how it recommended videos during the 2016 presidential race.

Chaslot now spends time shedding light on these hidden patterns. He explains that, in the case of the 2016 election, no matter where you started, the recommendation algorithm was more likely to push you in a pro-Trump, anti-Clinton direction.
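Audits like the ones Chaslot now runs can be sketched in a few lines: pick many different seed videos, repeatedly follow the platform’s top recommendation, and measure where the chains drift. The Python below is hypothetical (the get_top_recommendation() function and the lean scores are invented placeholders, not a real YouTube API or real data), but it shows the shape of the measurement.

```python
# Hypothetical sketch of a recommendation-drift audit.
# get_top_recommendation() and the lean scores are placeholders,
# not a real YouTube API or real data.
import random

# Invented "political lean" scores for a handful of videos (-1 to +1).
LEAN = {"vid_a": -0.8, "vid_b": -0.2, "vid_c": 0.0, "vid_d": 0.6, "vid_e": 0.9}

def get_top_recommendation(video_id):
    """Placeholder: a real audit would record the platform's actual
    'Up next' recommendation for this video."""
    return random.choice(list(LEAN))

def follow_chain(seed, hops=20):
    """Follow the top recommendation `hops` times and average the lean."""
    current, leans = seed, []
    for _ in range(hops):
        current = get_top_recommendation(current)
        leans.append(LEAN[current])
    return sum(leans) / len(leans)

# The question the audit answers: does where you end up depend on where
# you start, or does every chain drift in the same direction?
for seed in LEAN:
    print(seed, "->", round(follow_chain(seed), 2))
```

With placeholder data the chains wander randomly; Chaslot’s finding was that on the real platform they didn’t: they converged in one direction regardless of the seed.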

This is because the goal is to keep people engaged -- with extreme content.

The flat earth videos instead of the scientifically-backed round earth videos. The exciting propaganda instead of the peer-reviewed analyses. What keeps people online (what keeps social media platforms making the most money) is the disturbing, hateful, and extreme content.
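None of this requires an engineer to explicitly choose hate; it falls out of the objective. Here is a minimal, hypothetical sketch of engagement-based ranking (the field names and weights are invented for illustration, not Facebook’s actual system): each candidate post is scored by predicted engagement, and nothing in the objective asks whether the post is true.

```python
# Minimal, hypothetical sketch of engagement-based feed ranking.
# Field names and weights are invented for illustration.

posts = [
    {"title": "Peer-reviewed climate analysis",
     "predicted_clicks": 0.02, "predicted_comments": 0.01, "predicted_shares": 0.005},
    {"title": "OUTRAGEOUS conspiracy THEY don't want you to see",
     "predicted_clicks": 0.30, "predicted_comments": 0.15, "predicted_shares": 0.20},
]

def engagement_score(post):
    # The objective is engagement, full stop. Nothing here knows or cares
    # whether the post is accurate.
    return (1.0 * post["predicted_clicks"]
            + 2.0 * post["predicted_comments"]
            + 3.0 * post["predicted_shares"])

# Rank the feed by predicted engagement, highest first.
feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(round(engagement_score(post), 2), post["title"])
```

The conspiracy post wins every time, not because anyone asked for it, but because outrage predicts clicks, comments, and shares better than accuracy does.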

What this means is that, in our society, social media corporations have successfully monetized hate, and they have a vested interest in spreading it.

These social media platforms have become exactly what they claimed they didn’t want to be: arbiters of truth, except what they fight for is whatever is profitable, which is decidedly not the truth.

Currently, there are no U.S. laws that require any social media platform to protect our democracy, our public square, or our elections. The rules are left to the CEOs of these private, for-profit companies, and under their current business model they have no incentive to change.

At this point, people are using the tools of social media exactly as they’re designed: the algorithms are being used for hate.

Social media corporations must step up, even if it means altering their entire business model and changing their product. They cannot continue to focus only on growth, engagement, and profit, because it comes at the dire expense of our democracy, privacy, and safety.

The government must step up as well -- by holding corporations accountable for the harm they do to our society. This can begin with demanding transparency around how these recommendation engines work: how curation, amplification, and targeting actually happen.

Most of us have no clue how these algorithms work, or how Facebook has successfully monetized hate and misinformation.

It’s not enough to amend or repeal Section 230. It’s not enough for Apple to update user privacy agreements. It’s not enough to do anything less than hold the businesses and engineers responsible for enabling harm against our society.

Roger McNamee goes on to say, “Policymakers must take action. The harms of internet platforms are no longer contained or abstract, they are destabilizing our society and our government. The Biden administration will not be able to stop the pandemic and revive the economy without limiting disinformation and conspiracy theories spread by internet platforms.”

Ultimately, we as average citizens can:

  1. Pressure our representatives to step up to advocate for our society.

  2. Educate family and friends who blindly utilize these tools and platforms so they understand how they’re being manipulated.

  3. Do our research, vet our sources, understand where information comes from, and engage with people that have different viewpoints than ourselves (How? See here and here.)

2020 and 2021 have been incredibly dark years. But darkness is also where we have space for new things to come to light, including the many things we have to work on. It’s time to take action on these things that have come to the surface.

Our safety and our (physical, mental, emotional, and spiritual) health depend on it.
