Sick to death of the misinformation that has been running rampant on its platform for years now, Facebook announced on Tuesday that it plans to roll out new measures aimed at curbing the spread of fake news by allowing group administrators to appoint designated “experts” in their spaces.
In implementing the new tool — which will soon become available in select groups on desktop and mobile, according to CNET — Facebook hopes to crack down on content that violates its rules by promoting hate, conspiracies, or straight-up falsehoods.
After being named an “expert” in their group, designated individuals will receive official badges that appear next to their names, which will, ostensibly, act as an easy signifier that they are more knowledgeable than the average user on a given topic. But who will be responsible for bestowing such an honor? That’s the rub, a Facebook spokesperson told CNET; the selection of experts will be “all up to the discretion of the admin to designate experts who they believe are knowledgeable on certain topics.”
If you’re the administrator of an anti-vax Facebook group, then, the person you’d designate as an “expert” in your particular space would likely bear little resemblance to what most Facebook users would consider an expert on vaccine science. It’s the same type of lazy equivocation Facebook has been relying on for years: Although CEO Mark Zuckerberg insists on taking the most mealy-mouthed stance possible when it comes to fact-checking bothsidesism, the reality is that Facebook is constantly making decisions — political ones — about which type of content is allowed to take up space and even thrive on its platform.
The initiative is part of a larger campaign by Facebook to stop the spread of disreputable news content, one that has met with mixed results in recent years. As Gizmodo previously reported, the advocacy group Avaaz found that “had Facebook tackled misinformation more aggressively when the pandemic first hit in March 2020 (rather than waiting until October), the platform could have stopped 10.1 billion estimated views of content on the top-performing pages that repeatedly shared misinformation ahead of Election Day.”