Well, this is quite unusual.
On Friday, Facebook released a long statement about the movie The Social Dilemma in the form of a PDF on its website.
In all the time I’ve been following the social media juggernaut, I haven’t seen it respond like this to a specific movie or to specific claims from a single source.
It’s a notable counter-offensive, and it makes me think the movie must be having an impact on people who use the service and are starting to wonder about the real value of Facebook.
I first wrote about The Social Dilemma a little over a week ago and made a simple request: look at the app you use most and consider deleting it for a while. It’s a test of how much you rely on that app, or whether you’re even addicted to it.
I removed Instagram from my phone and I still feel the urge to browse through that feed or post a photo. (Then again, I was starting to have some success with it.)
Facebook took aim at a few of the most surprising claims, but the one that stands out most has to do with misinformation. The company claims it does not make money from fake news posts or conspiracy theories.
They say: “The idea that we allow misinformation to fester on our platform, or that we somehow benefit from this content, is wrong.”
The document is a little light on details, though.
For one thing, how do we even define fake news?
I see posts on Facebook that question the severity of the pandemic and whether it is important to wear a mask. Some would argue that is fake news and an obviously false claim; others insist mask-wearing should be optional.
What Facebook really profits from is not so much outright lies and misinformation as opinion on highly debatable topics. If I post today about not wearing a mask, it goes against federal guidelines, but it’s not really fake news. It’s my opinion, and it would likely not get flagged. Facebook profits from posts like that because we’re all seeing ads alongside them. (I might even try testing my theory.)
Some users push those boundaries, of course. Posts I’ve seen over the weekend wishing that President Trump would die from coronavirus are an obvious violation; they read like a death wish to me. But what about the people who merely drop hints? What about mild implications? Facebook has admitted it doesn’t have perfect AI routines for spotting misinformation.
The company profits from debate and opinion. Only occasionally do opinions cross over into actual misinformation or outright lies.
The official rebuttal also addresses social media addiction, but in a way that seems woefully sparse. They have a team of experts advising them, they changed the feed, and they don’t promote viral videos. “The change led to a decrease of 50 million hours a day worth of time spent on Facebook. That isn’t the kind of thing you do if you are simply trying to drive people to use your services more.”
That reads as defensive. Tell us how much time people do spend on Facebook and what else you are doing to curb addiction, and we might be interested.
Overall, the rebuttal falls a bit flat.
To alter a line from Shakespeare:
“The social network doth protest too much, methinks.”