In a very uncharacteristic move, Facebook has today issued an official rebuttal of claims made in the new Netflix documentary ‘The Social Dilemma’, which aims to provide an overview of the various ways in which social media platforms have become increasingly divisive and dangerous as their usage has grown.
Most reviews of The Social Dilemma have actually been highly critical, noting that while the documentary does make some valid and important points, it descends into sensationalism, which ultimately dilutes its key messaging.
But clearly, it’s got Facebook concerned. The Social Network generally lets criticisms and claims of this sort pass without comment, but with reports that many users have considered deleting their Facebook and Instagram accounts after watching the documentary, the company felt the need to issue a two-page, seven-point response to its core points.
Facebook’s responses are as you would expect:
- On social media addiction – “[we] prioritize meaningful social interactions”
- On people as the product – “we don’t sell your information to anyone”
- On algorithms – “Portraying algorithms as ‘mad’ may make good fodder for conspiracy documentaries, but the reality is a lot less entertaining”
- On data usage – “Despite what the film suggests, we have policies that prohibit businesses from sending us sensitive data about people”
- On polarization – “The overwhelming majority of content that people see on Facebook is not polarizing or even political”
- On election interference – “the film leaves out what we have done since 2016 to build strong defenses to stop people from using Facebook to interfere in elections”
- On misinformation – “The idea that we allow misinformation to fester on our platform, or that we somehow benefit from this content, is wrong”
Facebook has shared its opinions on all of these topics in the past, so there’s nothing surprising in its responses. The only surprise, as noted, is that Facebook felt the need to respond at all – if anything, an official response from Facebook will only add more fuel to the fire, and prompt more people to check out the documentary.
If Facebook felt a need to respond, it must have struck a nerve, right? There must be something to it worth checking out. Right?
And while Facebook’s responses are as expected, they do somewhat misconstrue some key elements of the documentary’s case.
For example, in response to the claim that ‘people are the product’ on Facebook, the company twists that question into one about personal data sharing, which is not exactly what the documentary makers mean. Even if Facebook doesn’t provide your information to advertisers directly, it does indeed make a lot of money from its advanced ad targeting systems, which utilize people’s personal information to better focus Facebook ads.
In this sense, people are the product, data is the product, and Facebook has the most intricate database of personal information ever created – even if it doesn’t, as it notes, share those insights directly with advertisers.
You could argue that Facebook’s counter-claims on polarization and misinformation are also a little misleading, but a lot of it comes down to how you choose to answer the question, as opposed to what the actual answer might be.
As has been well documented, Facebook has at times turned a blind eye to controversial issues and topics from which it derives user engagement benefits, while various experiments have shown that users can indeed become more politically entrenched via algorithmic recommendations and selective blocking tools.
Facebook has even underlined that case itself – earlier this year, Facebook’s head of VR and AR, Andrew Bosworth, published what was intended to be an internal memo, in which he acknowledged that Facebook’s algorithm essentially incites further division through news content exposure.
But Facebook tends to re-frame these questions in its responses, shifting the focus onto other areas – like, for example, what it’s done to improve its political ad efforts since 2016. That’s true – Facebook has improved on this front – but there are still significant concerns that the platform is being used for political misinformation and voter manipulation programs across the world.
Saying ‘yeah, but…’ doesn’t necessarily rebut the core premise in many of these cases.
Which, again, is why it’s strange that Facebook has bothered to respond at all, because it only puts more focus onto its processes, and leads to posts like this one, which further question its claims.
I can only imagine that Facebook has seen a significant jump in the number of people either deactivating their accounts, or claiming that they will (ironically, on Facebook), which has prompted its PR team to take action. But it seems misguided – Facebook would have been better off sitting this one out, and letting the initial discussion around the documentary fade on its own.