Facebook, owned by Meta, has made a significant change to its fact-checking controls for US users, giving them a degree of power over the platform’s ranking algorithm. The new setting lets users adjust the visibility of fact-checked content, a move that has drawn both praise and concern.
Previously, the platform’s algorithm automatically pushed flagged posts lower in users’ feeds if fact-checkers identified them as misleading or false. With the introduction of the new “content reduced by fact-checking” option in Facebook’s settings, however, users can now control how prominently debunked posts appear.
Selecting the “reduce more” option further decreases the prominence of fact-checked posts in a user’s feed, potentially removing them from view altogether. Conversely, the “don’t reduce” option increases the visibility of fact-checked content, making it more likely to appear higher in the feed.
According to a Meta spokesperson, the change aims to give users greater control over the ranking of posts in their feeds, and was based on feedback from users who wanted more agency in curating their Facebook experience. The option was initially rolled out in May but went largely unnoticed until users began discovering it in the settings.
This move by Facebook comes at a time when the United States is experiencing a highly polarized political climate, which has heightened the scrutiny of content moderation on social media platforms. Some conservative advocates in the US claim that the government has influenced or collaborated with platforms like Facebook and Twitter to suppress right-leaning content under the guise of fact-checking.
The impact of Facebook’s change to its fact-checking controls is still uncertain. While it allows users to customize their feeds to their preferences, critics argue that it may inadvertently benefit purveyors of misinformation by granting their content greater visibility. As the debate over content moderation continues, Facebook’s decision to hand this control to users will shape how information spreads on the platform.
It remains to be seen how users will adopt these new controls and whether the change will effectively address concerns about the spread of false or misleading content. With the power now in users’ hands, the responsibility for curating a trustworthy online environment falls to each individual as they decide how visible fact-checked posts are in their personal feeds.