Over the past two years, Facebook has made a concerted push on its Groups feature as a means to boost on-platform engagement and keep people logging in.
In the company’s official communications on this, Facebook has reiterated the popularity of groups – over 1.4 billion Facebook users engage in groups every month, and more than 400 million people are members of what Facebook calls ‘very meaningful groups’, or groups that become a central part of their interactive experience. Those numbers have likely risen even further during the COVID-19 lockdowns – but at the same time, a more skeptical view of Facebook groups is that they enable the platform to hide some of the more controversial discussions within the app from public view.
Sure, you might be annoyed if you see an anti-vax post in your main feed – but if Facebook moves that discussion into a private group, you won’t even know about it, and that user can then continue their discussion, giving Facebook that extra engagement. Rather than removing some of these more concerning posts and debates, Facebook’s groups push has, in part, been about moving them into more private areas – which, of course, means that Facebook still enables them to proliferate. The platform has, in many cases, been identified as a key facilitator in growing fringe, and sometimes dangerous, real-world movements.
Which is where Facebook’s latest revision of its group rules comes in.
This week, Facebook has announced a range of new regulations for groups, which will see the implementation of new penalties and restrictions for those that violate its Community Standards, even in private groups.
Among the various measures, Facebook will:
- Remove all groups related to militia and anarchist organizations from recommendations, while also restricting them in search, and reducing their content in News Feeds. “We also remove these groups when they discuss potential violence, even if they use veiled language and symbols.”
- Cease recommending health-related groups to users in its discovery surfaces. Users will still be able to invite friends to health groups, while specific searches will still surface such groups, but Facebook will no longer include them in general recommendations to users.
- Reduce exposure for groups which repeatedly share content rated false by fact-checkers, and remove groups that either continue to violate the rules or share content that violates its Community Standards. Group admins are also now notified each time a piece of content rated false by fact-checkers is posted in their group.
- Stop admins and moderators of groups taken down for policy violations from creating new groups for a period of time, while members who have any Community Standards violations in a group will now need to get approval for their posts within that group for the following 30 days.
These are important measures, given the way Facebook groups can be used to amplify concerning content and movements.
Back in April, a report showed that Facebook was hosting thousands of groups and Pages, with millions of members and followers, that supported the QAnon conspiracy theory. Facebook has since taken action on QAnon, and various other related groups, but the fact that these networks ever existed on the platform at all is a major concern – and this is just one example of how Facebook facilitates the spread of concerning, divisive movements through its groups offering.
In other noted concerns:
- Facebook hosts various climate change denial groups, and has applied differing interpretations of its fact-checking process to such content, even exempting some climate change denial posts by labeling them as ‘opinion’.
- Earlier this year, various Facebook groups, with millions of members, were found to be spreading COVID-19 conspiracy theories, with the ‘Stop 5G UK’ group just one example of those gaining significant traction.
- An investigation conducted by ProPublica last year uncovered secret, members-only Facebook groups of current and former Border Patrol agents, in which they joked about the deaths of migrants.
Many of these cases don’t see significant press coverage because the discussions occur within private groups, so they may not even be reported or seen by regular users, which limits the potential for enforcement.
But Facebook says that it does apply its Community Standards to all groups, even private ones:
“Our Community Standards apply to public and private groups, and our proactive detection tools work across both. That means even if someone doesn’t report an issue to us, our AI can detect potentially violating content and we can remove it.”
Clearly, that AI is not always effective, but hopefully, with this renewed push, Facebook is stepping up its enforcement and will work to remove violating content in order to limit its spread.
In addition to this, Facebook is also going to start archiving groups that don’t have an active admin, while also prompting active members to become admins of those groups where the original manager has left.
“Moving forward, when a single remaining admin chooses to step down, they can invite members to become admins. If no invited members accept, we will suggest admin roles to members who may be interested. If no one accepts, we’ll archive the group.”
Essentially, Facebook is moving to ensure that there’s somebody in place to enforce group rules, which could be another measure in reducing instances of groups going against the platform’s guidelines and potentially spreading misinformation and hate.
It’s good to see Facebook looking to step up its action in this respect, but as highlighted in these examples, Facebook groups remain problematic, and likely will for some time yet. And that, of course, also comes down to how Facebook chooses to take action – yes, it’s going to push harder against those that violate its rules, but those rules often allow a lot of leeway in what’s acceptable, and in what Facebook itself deems to be misinformation.
What the impacts will be, only time will tell.