San Francisco (Asian Independent): Cracking down on Groups that spread harmful content, hate speech and misinformation, Facebook said on Thursday that admins and moderators of Groups taken down for policy violations will be barred from creating any new Group for a period of time.
Members with any Community Standards violations in a Group will now need approval for their posts in that Group for the next 30 days, Facebook said in a blog post.
“This stops their post from being seen by others until an admin or moderator approves it,” said Tom Alison, VP of Engineering at Facebook.
Sometimes admins may step down or leave their Groups.
“Our proactive detection continues to operate in these Groups, but we know that active admins can help maintain the community and promote more productive conversations.
“We now suggest admin roles to members who may be interested. A number of factors go into these suggestions, including whether people have a history of Community Standards violations,” Facebook said.
In the coming weeks, the social network will begin archiving Groups that have been without an admin for some time.
“Moving forward, when a single remaining admin chooses to step down, they can invite members to become admins. If no invited members accept, we will suggest admin roles to members who may be interested. If no one accepts, we’ll archive the group,” Facebook said.
Facebook said it removed about 1.5 million pieces of content in Groups for violating its policies on organised hate, and about 12 million pieces for violating its policies on hate speech.
“When it comes to groups themselves, we will take an entire group down if it repeatedly breaks our rules or if it was set up with the intent to violate our standards. Over the last year, we took down more than 1 million groups for violating these policies,” Alison said.
Currently, if admins or moderators repeatedly approve posts that violate Community Standards, Facebook removes the Group.
Groups that repeatedly share content rated false by fact-checkers are not recommended to other people on Facebook.
“We rank all content from these groups lower in News Feed and limit notifications so fewer members see their posts,” Alison noted.
In another bid to help people get health information from authoritative sources, Facebook said it will no longer show health Groups in recommendations.