Inside Nextdoor’s ‘Karen problem’

Illustration by William Joel / The Verge

Can Nextdoor really be a social network for communities if black people don’t feel safe on it?

Kalkidan G. moved to Rancho Santa Fe because it was one of the nicer neighborhoods in San Diego. The community was gated, the schools were some of the best in the area, and it was only a short drive from restaurants and grocery stores. Sure, it was pretty white, but being one of the few black families didn’t seem like an issue to Kalkidan. People were friendly enough, at least to her face. Then, she downloaded Nextdoor.

Kalkidan found the app, a neighborhood-focused social network, useful for local news and vetting repair companies. She’s used it for “everything” over the last few years, even if the comments on her posts about contracting companies would spiral into unwanted political conversations. She could brush that off. But as Black Lives Matter protests began to take place in her area, her white neighbors voiced their condemnation of the movement. All the vital information about organizing peaceful protests was drowned out by comments of “All Lives Matter,” “#BeachLivesMatter,” and, at times, threats of violence.

One protest, planned by community high-schoolers, was scheduled to take place last week just five minutes down the road from Kalkidan’s home at a local shopping area. But her neighbors on Nextdoor were quick to assume it was a planned “riot.” One post alerting the neighborhood to the protest came from a Rancho Santa Fe community “lead,” or volunteer Nextdoor moderator, looking to verify these “riot” reports with local law enforcement on the platform.

“Apparently the Target is already boarded up,” the lead wrote. “I pray this doesn’t come to our neighborhood but everyone should plan to stay safe.”

The post’s comments quickly descended into a fight between users who shared the lead’s unwarranted fears and others who called her out for spreading misinformation. One neighbor threatened the protestors, writing, “If anyone gets unruly or violent, I plan on coming with pepper spray and a stun gun to help the police.” He continued, “Looters need to be taught a lesson. If they get violent, we need to hit them back 10 fold and protect our community.”

Last week, Nextdoor put out its first company statement in response to the death of George Floyd and the resulting protests. “Black lives matter,” the statement said. “You are not alone. Everyone should feel safe in their neighborhood.”

Despite its public statements, black users on Nextdoor are being silenced by community moderators after participating in discussions about race. Some are opting to leave the app altogether while others are considering moving out of their neighborhoods based on what they’ve seen on the platform. “As a black person, I don’t feel safe at all using it for anything,” Kalkidan told The Verge. “I’m always terrified, thinking ‘Oh my god. I already know what so-and-so thinks of us.’ This is a very horrible situation to be in.”

The promise of a gated community like Rancho Santa Fe is to protect its members from outside intruders, but for Kalkidan, it’s her neighbors who she’s beginning to fear the most. While the same kinds of discussions are happening across Facebook and Twitter, Nextdoor’s hyper-local groups make tensions more intimate, more personal. Kalkidan knows the names and street addresses of people on her Nextdoor feed. She sees them when she goes for a walk, shops at the grocery store, and attends school events.

The Verge spoke with black Nextdoor users around the country who found the app opened a window into how their neighbors truly feel about them at this moment. Even though racists have lurked on Nextdoor for years, they’ve come out in full force over the last few days.

“Facebook and Twitter, somehow they feel a little distant,” Kalkidan said. “Nextdoor is another level. I mean, it’s literally next-door.”


For years, Nextdoor has struggled to shed its reputation as a “snitch” app, used by white and wealthy users to racially profile their neighbors and report them to the police. There are meme accounts, like @BestofNextdoor, dedicated to sharing shockingly bad Nextdoor posts. The accounts highlight posts from “Karens” complaining about everything from children laughing outside to their neighbors’ Wi-Fi names. The problem has gotten so bad that, just in the last week, Rep. Alexandria Ocasio-Cortez (D-NY) called for Nextdoor to “publicly deal w/ their Karen problem.”

It’s hard to outgrow a reputation, especially when it’s been calcified as a meme. But Nextdoor’s challenges stem from the fact that it’s set up to be self-governed. Unpaid “community leads” are in charge of reporting and removing posts that violate the app’s community guidelines. According to Nextdoor, the first users to launch a neighborhood forum are appointed leads. Those leads can then appoint others “based on their behavior and qualifications,” and an algorithm can select new leads from members who are active in the community, such as by inviting new users to join.

This hands-off approach is what allows Nextdoor to operate at the scale it does. By outsourcing moderation to untrained and unpaid volunteers, the company has been able to expand into over 200,000 neighborhoods across the country. But it has also empowered community members to strike down posts they personally don’t like. All across the country, Nextdoor posts advertising protests get struck down by community moderators while racist and inflammatory messages, some calling for direct violence against black people and protestors, are left to stand.

Leads don’t go through any formal training from Nextdoor before receiving the authority to strike posts, and the guidelines listed on the site are vague enough for leads to interpret them in different ways. There are no rules promoting diversity in moderation leadership either. Around the same time the company issued its public statement last week, that same language was published on Nextdoor feeds, and in a private forum known as the National Leads Forum, as first reported by BuzzFeed News, some community moderators were enraged by Nextdoor’s decision to support the Black Lives Matter movement.

“I would like to see Nextdoor post a ‘White lives matter’ [post],” one moderator from Orlando, Florida, wrote. “Sometimes, we need to remember ‘All lives matter!’”

“With everything going on in the nation these days, wtf are you thinking about when you come onto the forum and toss out a topic that’s going to ignite people?!?!? Is adding more fuel to this fire your goal?” another lead from Durham, North Carolina, said.

In a separate thread, posted on June 3rd, one lead in Atlanta wrote, “From a practical perspective, volunteer Leads do not have the proper tools to handle these situations; the tools we do have have not been functioning properly for weeks; and we don’t have training from Nextdoor about how to deal with the content moderation issues that can arise with these challenging topics.”

Guidance from the company has been minimal. Gordon Strause, Nextdoor’s director of community, responded to questions regarding Black Lives Matter moderation in a separate thread, calling for leads “to take a step back” from moderating these discussions. “I would let people have their say as long as they’re expressing their own beliefs and not attacking others,” Strause wrote.

In a statement to The Verge, a Nextdoor spokesperson said that allowing Black Lives Matter content on the platform has caused “some confusion” among moderators, but it did not offer any additional guidance. “We want your neighborhood on Nextdoor to reflect your actual neighborhood, and therefore being community moderated is important,” the spokesperson said.

Still, simple posts from black users publicizing protests or looking to engage in thoughtful conversations about race continue to be taken down by leads. Adeyanju Giwa, a black Nextdoor user from Irvine, California, even had her account disabled after responding to a post from a white user suggesting that President Donald Trump declare martial law because of the protests in their neighborhood. Giwa’s comments were reported, and her account was disabled for using a “fake name,” even though she used her real one.

After Giwa contacted Nextdoor support, her account was reinstated. “Your account was reported for a fake name and the auto name classifier temporarily disabled your account,” Lorie, a Nextdoor support employee, said in an email to Giwa.

The Verge found several other instances where people of color had their accounts disabled for violating the real names policy over the last week after they took part in discussions about race in their communities. Emery Real Bird, a Native American man living in Washington, DC, had his account disabled for allegedly violating Nextdoor’s real name policy after posting a meme to his community thread.

“Nextdoor takes our real name policy extremely seriously, but that means we also have to be serious about enforcing that policy in the right way,” Garrett, a Nextdoor escalations manager wrote to Real Bird in an email. “Our training for support agents who work real name cases includes a section on recognizing Native American names; but clearly that section of the training needs to be strengthened.”

Any user can report another person for using a fake name on Nextdoor without providing any evidence. “Clearly there should be some sort of verification before [an account] is blocked,” Real Bird wrote to Nextdoor support. “I do hope that Nextdoor realizes actions that invalidate a person’s identity take a deep toll on their ability to exist within a community.”

Outside of the platform’s moderation problems, Nextdoor has spent years recruiting law enforcement onto the app, according to OneZero. Not only do police departments have access to Nextdoor’s community forums, but the platform launched a feature in 2016 that allows users to forward certain crime and safety posts directly to law enforcement. Serah Blackstone-Fredericks, a black writer from Oakland, found a post in her local Nextdoor forum last week that was just a photo of a black man on a bike in her neighborhood. “Suspicious man looking into Del Rio Cir Carports,” the post was titled. Singling out black people as “suspicious” is commonplace in Blackstone-Fredericks’ community forum, and it has happened frequently enough that she decided to take things into her own hands last week, penning a letter to Nextdoor CEO Sarah Friar demanding that the app forbid users from profiling each other based on race.

“Within many neighborhoods, there is a lot of neighborhood watch behavior that escalates into non-POC (people of color) people applauding one another for taking photos of people’s kids stealing candy, writing about a Black person robbing a store in the daytime, and posting things that should be sent directly to the police on Nextdoor,” Blackstone-Fredericks wrote in her letter. “One thing for certain is that we need change.”


As of publication, Nextdoor has not publicly released new moderation guidelines in light of the recent protests. “We want all neighbors to feel welcome, safe, and respected when using Nextdoor. As a community-building platform, racism has no place on Nextdoor and is completely counter to our purpose, values, and Community Guidelines,” a company spokesperson told The Verge.

Still, nonprofits like Color of Change and even Nextdoor meme accounts like @BestofNextdoor are pushing the app to commit to a series of demands to ensure that black, indigenous, and other people of color feel safe on the platform.

“It’s clear that they’re not addressing the problems,” Jade Magnus Ogunnaike, deputy senior campaign manager for Color of Change, told The Verge. “Nextdoor needs to commit to not only recruiting within the black community, but they really need to bring in civil rights experts on staff.”

She continued, “They need to commit to continuous evaluations and public reporting of how racial profiling and discrimination are showing up on their platform.”

The user behind the BestofNextdoor Twitter account, Jenn Takahashi, and Andrea Cervone, a former Georgia city council member, launched a petition on Monday calling on Sarah Friar, Nextdoor’s CEO, to commit to rolling out implicit bias and anti-racism training for every community lead and increasing transparency around moderation, like releasing a yearly report on data trends and committing to quarterly reviews of moderators.

“Designated moderator leads on Nextdoor can remove content at will, there is no accountability for them, and there are murky guidelines for how Nextdoor picks the leads it chooses to empower,” the petition reads.

Nextdoor may have launched as an app to “spread the word about a lost dog” or “find a new home for an outgrown bicycle” — and for many, it works pretty well as a hyper-local forum, a more accessible and less spammy alternative to Craigslist — but the company needs to ask itself: how useful is it if black members don’t feel safe on the platform? As the threats of violence and racist posts become increasingly prevalent and dangerous, black users are being forced off the app altogether. What is the value of a community-based social network that excludes people?

“Honestly, it boils down to this Nextdoor stuff and seeing what your neighbors are saying about you,” Kalkidan said. Last week, she called a family meeting with her husband and three kids, all under the age of nine, to talk about what she’s seen on Nextdoor. Because of her neighbors, Kalkidan said that she was forced to explain police brutality and systemic racism to her kids for the first time.

“It’s just so sad because we wanted to keep that innocence for our kids,” she said.

For Giwa, the app has reaffirmed her fears about her neighbors. “I don’t think it’s healthy for me to even have an account,” she said. “I’m definitely moving, but this is so emotionally corrosive.”

Written by Makena Kelly
This article first appeared on The Verge: https://www.theverge.com/21283993/nextdoor-app-racism-community-moderation-guidance-protests