
How a Social Media Company Drastically Reduced Online Racial Profiling

By Leon Kaye

Today, fewer people know their neighbors, and many do not feel connected to their local communities. Bay Area startup NextDoor promised to be a game-changer in how people interact with each other in their own neighborhoods. After all, at a time when many consumers have grown less trusting of peer-review sites such as Angie’s List, or are completely creeped out by Craigslist, why not use technology to break down barriers where we live and avoid the tiresome banter that consumes our Facebook feeds?

NextDoor was onto something: it is now a popular means of communication across at least 115,000 neighborhoods, and the company estimates that, thanks to its growth, at least 65 percent of Americans now live in a neighborhood with a NextDoor group. The site has proven to be a great tool for getting rid of unwanted stuff. It can also help a distraught owner find a lost pet, locate a dependable babysitter or a handyman or handywoman, and, of course, keep an eye on crime.

On that last point, however, some NextDoor users became too vigilant and assumed the worst about some of their neighbors. As the blog Fusion reported in March 2015, the site’s crime and safety postings were bogged down with what amounted to racial profiling. A typical posting would report "suspicious characters" who turned out to be nothing more than people of color in hoodies. In one incident, an Oakland resident responded to a safety posting about "sketch characters checking out a house" to let his neighbors know they were, in fact, friends who had shown up at the wrong house. The commenter responded with a brusque, "That's a relief."

Such comments sound innocent on the surface, but in aggregate they represent a persistent distrust of people of color at first sight, simply for walking outside, and that is a deeply disturbing trend. When such assumptions go unchallenged in an online community forum, people of color quickly, and rightly, begin to feel unwelcome. And no one should be made to feel they need to be the PC police in a social setting.

Additional comment moderation or community meetings could not solve this problem. It was both widespread and insidious, and it represented a real problem for NextDoor: the company risked losing not only the feel-good atmosphere of neighborhood friendliness it sought to cultivate, but its business along with it.

Another challenge for NextDoor was that this issue sprang up amid the racial tensions that have flared across the U.S. in recent years. NextDoor could amplify those tensions by doing nothing, or it could address them head-on and become a beacon of change.

NextDoor took the latter route, not only turning the orthodoxy of technology design on its head but also taking plenty of time to get it right. So far, the company insists the outcome is a 75 percent reduction in racial profiling in its crime and public safety postings.

To learn more, TriplePundit spoke with Kelsey Grady, NextDoor’s head of communications, from her office in San Francisco.

“One of the most important things to know is that we are not done with this process,” Grady told us. “We still have a lot of work that needs to be done, especially with our mobile application. But we’ve reduced these instances, and in the meantime have gotten a lot of great feedback from our users and peers at other companies.”

By October of last year, NextDoor realized that the volume of racial profiling incidents left the company no choice but to make addressing the problem a priority. It did not matter that the overwhelming majority of these posts came from users who did not intend to incite stereotypical fears but were largely acting on unconscious bias. Even if these posts were not the result of racist trolls purposely fanning the flames of hate, NextDoor decided a design change was needed.

To that end, as Wired explained this summer, NextDoor broke with the unwritten rule of minimizing the number of steps in a technology product’s design. It has been inculcated in us that, when using a website or smartphone app, the fewer steps between us and our purchase or published comment, the better.

But speed and ease pose plenty of risks. Hence the nasty note we may get after clicking “haha” in reaction to a Facebook post when we meant to click “sad.” So NextDoor went against convention and added steps to the process users follow when posting entries related to crime and safety.

“We realized that it doesn’t matter how small the numbers are, the fact is that any racial profiling incident can have a hugely negative impact in any neighborhood,” Grady said. “Maybe we couldn’t exactly measure that posting’s impact, but as a tech company, we had to get past our conventional analytic testing as we had a moral obligation to get this right.”

The process evolved from January to August of this year, which is a million years in tech-company time. The company simultaneously released six different iterations of the crime and safety posting process to different online neighborhood groups to determine which one best reduced racial profiling. One variant made no changes to the product’s interface, serving as a control group. Meanwhile, moderators from NextDoor and its partner organizations manually combed through the site to flag postings that were clearly instances of racial profiling.
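The article does not describe NextDoor’s testing infrastructure, but a minimal sketch of this kind of multivariate rollout, with assumed arm names, hashing scheme, and numbers, might look like this:

```python
import hashlib

# Hypothetical sketch of the multivariate test the article describes: each
# neighborhood is deterministically assigned to one of six posting-flow
# variants or a no-change control arm, and moderators' manual flags are then
# compared across arms. Everything here is an illustrative assumption, not
# NextDoor's actual implementation.

ARMS = ["control", "flow_1", "flow_2", "flow_3", "flow_4", "flow_5", "flow_6"]

def assign_arm(neighborhood_id: str) -> str:
    """Deterministically map a neighborhood to one experiment arm."""
    digest = hashlib.sha256(neighborhood_id.encode("utf-8")).hexdigest()
    return ARMS[int(digest, 16) % len(ARMS)]

def profiling_rate(flagged: int, total_posts: int) -> float:
    """Share of crime-and-safety posts moderators flagged as racial profiling."""
    return flagged / total_posts if total_posts else 0.0

if __name__ == "__main__":
    # Compare a hypothetical variant against the control after manual flagging.
    results = {"control": (40, 1000), "flow_3": (10, 1000)}
    for arm, (flagged, total) in results.items():
        print(f"{arm}: {profiling_rate(flagged, total):.1%} of posts flagged")
```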

Throughout the product redesign, NextDoor partnered with community organizations in Oakland, California, including 100 Black Men and Neighbors for Racial Justice. Public officials in Oakland were also consulted for their expertise. In the end, the company and its stakeholder partners settled on the variant that eliminated the most occurrences of racial profiling.

One of the first steps for the company was to describe racial profiling clearly and succinctly for users. NextDoor defines racial profiling not so much by what is said as by what is left out. The most frequent examples of racial profiling, in fact, were posts that lacked a complete description of an individual.

Simply put, “black male” or “white woman” is far from enough when it comes to gauging dodgy behavior in one’s neighborhood. “When you have a post that has only the race and gender in such a description, that is damaging to anyone who can be described by those terms,” Grady explained.

In addition to requiring more detail about a person’s appearance, the site now requires details about the person’s behavior. “Let’s face it, it’s not criminal to make a U-turn,” Grady said. “Nor is it criminal to walk down the street.”

As users submit a crime and safety posting, they are asked questions that nudge them to look inward: is what that person is doing really suspicious if race or ethnic background is taken out of the equation? Clothing must now be described from head to toe; after all, just about everyone under the age of 60 sports a hoodie from time to time, if not daily. The more detailed the description, the more helpful a resident can be to neighbors and, in more dire cases, to law enforcement.
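The article does not show NextDoor’s code, but a minimal sketch of that kind of posting-flow check, under assumed field names and prompts, might look like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of the validation the article describes: a crime and
# safety post that mentions a person's race or gender is held back until it
# also includes head-to-toe clothing details and a specific behavior. All
# field names and prompt wording are illustrative assumptions.

@dataclass
class PersonDescription:
    race: str = ""
    gender: str = ""
    clothing: str = ""  # head-to-toe clothing details
    behavior: str = ""  # what the person was actually observed doing

def posting_prompts(desc: PersonDescription) -> list[str]:
    """Return the nudges a user must answer before the post is accepted."""
    prompts = []
    if (desc.race or desc.gender) and not desc.clothing:
        prompts.append("Describe clothing from head to toe, not just race or gender.")
    if not desc.behavior:
        prompts.append("Describe the specific behavior that seemed suspicious.")
    return prompts

# Example: a bare "black male" report is bounced back with both nudges.
print(posting_prompts(PersonDescription(race="black", gender="male")))
```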

“With the new posting flow, it only makes sense to ask for more detail, as you can share better details with your neighbors, really motivate people to be on the lookout for truly suspicious behavior, and finally, if necessary, to share this information with police,” Grady continued.

There is no way any algorithm or design flow can weed out all racial profiling. So many of the assumptions we make based on someone’s race or ethnicity are due to the systemic racism and unconscious bias endemic in the U.S. No technology or company will be able to stamp out these assumptions 100 percent.

But what NextDoor accomplished puts other companies on notice that better design is a critical step in eliminating harmful assumptions. Good design can help us treat each other with more respect and empathy. And as Grady made clear, this is still an ongoing process for NextDoor, from which the company and its employees can continue to learn.

“We’re not done yet, as we’re still working with our partner organizations. And we’re still holding regular stakeholder calls so we can update each other on what’s going on,” Grady told us. “No one at this company was an expert at racial profiling, but what we were able to do is merge our software development team’s expertise with that of local community groups and city officials to arrive at a conclusion.”

Image credit: David Sawyer/Flickr; NextDoor, Screenshots 


Leon Kaye has written for 3p since 2010 and became executive editor in 2018. His previous work includes writing for the Guardian as well as other online and print publications. In addition, he’s worked in sales executive roles within technology and financial research companies, as well as for a public relations firm, for which he consulted with one of the globe’s leading sustainability initiatives. Currently living in Central California, he’s traveled to 70-plus countries and has lived and worked in South Korea, the United Arab Emirates and Uruguay.

Leon’s an alum of Fresno State, the University of Maryland, Baltimore County and the University of Southern California's Marshall Business School. He enjoys traveling abroad as well as exploring California’s Central Coast and the Sierra Nevadas.
