
Research also tells us that the most active users on platforms like Nextdoor tend to be the most biased to begin with. Gemma Galdon Clavell, a Spanish policy analyst who has developed an audit for reducing bias and improving algorithmic predictions, has seen this trend when conducting research on safety-focused WhatsApp and Facebook groups. “There’s a tendency of the [active] group to see themselves as the only legitimate users of the space, and everyone who’s not part of the group immediately becomes suspect,” she says.

Ultimately, the Nextdoor platform may not survive if only a small segment of the real-life neighborhood feels welcome and respected enough to stay online. To fix the problem, Clavell says the company should start by publicly auditing its platform on an ongoing basis and tracking its psychological impact on users.

“You want to know those things before they become a problem,” she says, referencing Zoom’s admission that it hadn’t thought through security concerns before launching. “You want to avoid that situation where you have to make all these improvements on-the-go. Zoom managed to do that, but other companies collapsed.” Nextdoor appears to be making the same gamble.
