Facebook Adds Tools To Combat Misinformation in Groups

Facebook said Wednesday it's adding new tools that could make it easier to combat the spread of misinformation in Groups.

Facebook Groups, which can be public or private, are online spaces where people can chat about various topics including hiking, parenting and cooking. But users have also used Groups to spread misinformation about the coronavirus, elections and vaccines. False claims and propaganda have become a big problem on Facebook, especially after Russia's invasion of Ukraine. In some cases, people have used old footage or photoshopped images to misrepresent what's happening in those countries.



Facebook will let administrators who run Groups automatically decline any posts that have been marked false by fact-checkers.



Facebook

One new feature will let administrators who run Facebook Groups automatically decline any incoming posts that have been marked false by the company's third-party fact-checkers. The social network said that will help reduce how many people see misinformation.

Facebook didn't say whether posts typically get fact-checked before they're shared in a Group. A company spokeswoman said the social network is also working on a new way for administrators to remove posts that are flagged for containing false claims after they've been posted to a Group.

Facebook partners with more than 80 fact-checking organizations, such as PolitiFact, Reuters and The Associated Press, to help identify false claims. Users who try to share a fact-checked post see a warning that there's false information in the post, but they can share the content if they want. Facebook doesn't share data about how much content gets fact-checked on its platform.

The release of the new tools shows how Facebook is trying to ramp up its attempts to combat misinformation. There have been questions, though, about how well labeling misinformation on social media works. In 2020, a study by MIT found that labeling false news could lead users to believe unlabeled stories even when those stories contained misinformation. The MIT researchers call this phenomenon the "implied truth effect." Facebook said that more than 95% of the time when people see a fact-checking label, they don't end up viewing the content.

The social network also announced new features meant to make it easier for administrators to manage and share Groups. Administrators, for example, will be able to send invites via email and share QR codes that direct people to a Group's About page, where they can learn about the community and join. More than 1.8 billion people use Facebook Groups every month.

Social media sites have also been used to spread scams, so users should be wary about clicking on links or scanning QR codes. Facebook said the QR codes for Groups include the social network's logo.
