Three Privacy Risks in Community Policing

The CRMs, apps, and AI that serve corporations require a different approach when applied to policing. Law enforcement needs vetted technology built with an understanding of impact and law, and with trust, oversight, and transparency at its core.

Despite the authoritative nature of policing, police forces are still required to abide by privacy and data protection laws. This is especially true when they process a wide range of personal data and when they perform investigative duties. Further, they are bound by a set of legal procedures that guarantee due process. Community policing is no different.

Even though community policing implies an informal dimension of public security, integrating officers and civic participation does not mean that privacy and data protection rights loosen. In fact, this form of alternative policing requires a heightened sense of privacy awareness. If the risks are not duly addressed, they endanger not only police officers but also neighborhoods and entire families.

How? Here are three ways:

1. If a community policing app aims to solve crimes but depends on civic participation, the community members using it must be fully protected. This means complete assurance that their personal data remains anonymous, so that vindictive perpetrators cannot harm civic participants.

Protected groups remain the most vulnerable to revealed identities. They include minors (especially victims of bullying or domestic violence), victims of domestic violence, elderly people who live alone, threatened neighbors, rehabilitated offenders and citizens with criminal records, as well as civil servants.

Revealing identities can also result in false accusations. Let’s remember: alleged offenders still have rights and need protection.
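One way an app could deliver that assurance is to store reports under salted pseudonyms rather than real identities, so the report store never contains a name that a vindictive perpetrator could extract. A minimal sketch, assuming a hypothetical user ID and function name (`pseudonymize_reporter` and `citizen-42` are illustrative, not from any real platform):

```python
import hashlib
import secrets

def pseudonymize_reporter(reporter_id: str, salt: bytes) -> str:
    """Derive a stable pseudonym so repeat reports can be linked
    without storing the reporter's real identity alongside them."""
    return hashlib.sha256(salt + reporter_id.encode("utf-8")).hexdigest()[:16]

# A secret salt kept separate from the report store; if it is later
# destroyed, pseudonyms can no longer be tied back to real identities.
salt = secrets.token_bytes(32)

report = {
    "reporter": pseudonymize_reporter("citizen-42", salt),  # hypothetical user ID
    "category": "vandalism",
    "details": "Broken streetlight reported near the plaza.",
}
```

Keeping the salt outside the report database means a leak of the reports alone does not expose who filed them.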

2. We’re all aware of how quickly pictures and videos circulate through social media. When a community manager creates a post containing inappropriate language and spreads misinformation about alleged suspects, the impact has long-term ramifications as it reaches viewers far and wide. Even if the contents are later removed, the harm remains.

3. As always, cost factors result in choices that aren’t always in our best interest. When technology is outsourced to third-party services, the responsibility for processing sensitive data is outsourced with it. If a third-party service fails to consider security and privacy risks, the platform can accidentally disclose sensitive data, such as the identity of the accused and/or the accuser.
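A common mitigation for this third risk is data minimization: strip identifying fields before any record leaves for an outsourced service. A minimal sketch, with illustrative field names (no real platform's schema is implied):

```python
# Fields a platform might strip before handing a report to any
# third-party service (the field list is illustrative).
SENSITIVE_FIELDS = {"accuser_name", "accused_name", "address", "phone"}

def minimize_for_third_party(record: dict) -> dict:
    """Return a copy of the record with sensitive fields removed,
    so an outsourced service never receives identifying data."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

incident = {
    "category": "noise complaint",
    "accuser_name": "Jane Doe",
    "accused_name": "John Roe",
    "district": "Centro",
}
print(minimize_for_third_party(incident))
# {'category': 'noise complaint', 'district': 'Centro'}
```

The third party can still do its job (categorize, route, analyze), but a breach on its side cannot reveal the accused or the accuser.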

Limits of Regulation

While we have Regulation (EU) 2016/679 and Directive (EU) 2016/680 to govern the handling of data in law enforcement, not all risks can be addressed through regulation.

We could cite many examples of security protocols gone awry.

A well-known case that is still pending is that of Francisco Nicolás Gómez Iglesias, a former law student known in the Spanish press as “El pequeño Nicolás,” who passed himself off as an envoy of Spain’s King Felipe VI and as a secret government agent. For years, he enlisted corrupt local police officers who were complicit in exploiting their access to driver’s license databases.

Impact Assessment Required


Innovations require a rigorous impact assessment and the deployment of corresponding organizational, technical and legal safeguards and measures.

If you want to know if your algorithms work or not, let's talk!

  1. Select a date and time. 
  2. Enter your name and email.
  3. Invite your colleagues.