Facebook is becoming a bit more like Nextdoor in an attempt to expand its Groups functionality. The only problem is that Facebook seems to be borrowing one of Nextdoor’s more controversial ideas: giving more power to community moderators.
On Wednesday, the company announced that it was significantly expanding the powers of community moderators in its Groups. Administrators can now do a number of new things, such as automatically blocking certain people from commenting in conversations based on factors like how long they have been a member of the group. Facebook says the new tools are designed to help “administrators play a key role in helping maintain a safe and healthy culture.”
There are other new powers available to admins, too, such as an AI-powered alert that flags “controversial and unhealthy” conversations and new summaries that moderators can use to review each member’s activity in a particular group. Asked whether the new features were inspired by Nextdoor’s moderation system, Facebook spokesman Leonard Lam said: “Our product team talks regularly with our admin community to better understand their needs, and the features we announced today are a direct reflection of the feedback we’ve received from them.”
The approach closely resembles the way Nextdoor, the neighborhood-based social platform, has handled moderation for years. The problem is that Nextdoor’s model has not really worked. Its community is plagued by a haphazard approach to misinformation, complaints about toxic fights between group members, and accusations of biased and inconsistent community moderators.
Maybe things will work out differently for Facebook. But the new approach to moderation is not the only example of Facebook trying to be more like Nextdoor. The company is also preparing to launch a Nextdoor-style feature in the U.S. called Neighborhoods – it’s already available in Canada – that lets users create and join groups limited to specific geographic areas, which is exactly what Nextdoor does. Facebook will also rely on unpaid community moderators to enforce its Neighborhoods feature guidelines, which are designed to keep content “relevant and friendly.” Nextdoor does this too.
Assigning users to act as community moderators has its problems, something Nextdoor knows all too well. In recent years, Nextdoor has encountered many of the same moderation issues as Facebook, including the spread of hate speech, conspiracy theories, and political misinformation. Nextdoor faced criticism last year when unpaid community moderators censored and removed posts supporting Black Lives Matter protests following the murder of George Floyd. The company later clarified that such posts were in fact allowed, and earlier this year it rolled out an anti-racism notification system that warns users who are about to post potentially racist content. Medical misinformation about Covid-19 is also a problem, users told Recode in February. They also complained that the platform’s community-based moderation system had allowed conspiracy theories to flourish.
Nextdoor has also struggled to handle political discussions. As Recode reported last year, Nextdoor groups can be overwhelmed by tense political arguments that its unpaid moderators are either ill-equipped or unmotivated to resolve. The platform’s problems with political speech were laid bare after the Capitol riot on January 6, when Nextdoor quietly stopped recommending political groups (Facebook did the same at about the same time).
Nextdoor’s moderation model is far from perfect, but Facebook is betting that making itself more like Nextdoor – which has grown more and more popular during the pandemic – may pay off. Ultimately, the two platforms appear to be converging on group-based interactions and AI-assisted community moderation, even as both Facebook and Nextdoor continue to struggle with misinformation, racism, and toxic discourse.
Today’s news is just another sign that Nextdoor and Facebook are getting more and more alike, which is probably bad news if you joined Nextdoor to avoid Facebook, or vice versa.