
COVID-19 has highlighted the importance of protecting our privacy rights. Can you tell us more about that?

Gemma Galdon-Clavell, PhD

Since the ’80s, advances have been made at the technological level. Big data allows us to accomplish things that were once unthinkable and is opening up new job opportunities.

However, there are very few policies that protect us. Every hospital, every city council, every autonomous community that uses software generates data, and we’re not as alarmed as we should be that the data we generate can easily be shared, exchanged, and exploited.

Because data use hasn’t been planned for beyond its initial management, we have to gather minute-by-minute figures on the number of beds, tests performed, or active health professionals. We lack that infrastructure, and there are no technical reasons why we’re missing the mark on this.

You pointed out that what we’re seeing is “pandemic theater”. What do you mean?

Gemma Galdon-Clavell, PhD

In security and risk management, we often use the term “security theater.” An example is at airports, with multiple security cameras or facial recognition software. These technologies aren’t failproof, but they are sold to the public as if they were real solutions from the future. Theater is when we generate the feeling of security and safety rather than actually offering it.

With this pandemic, the best example is infrared cameras, which are supposedly used to detect someone with a fever but only measure temperature at the surface level, which says nothing about what is happening inside the body. And at the epidemiological level, the real problem is asymptomatic individuals.

You also stressed the development of trustworthy, explainable, and ethical algorithms and technology. During this state of emergency, will we inadvertently come to depend on risky tech that violates this trust?

Gemma Galdon-Clavell, PhD

That’s what I’m afraid of. When companies promote their software as the only technology available to public institutions, those systems keep collecting data as time passes, creating technical legacies that will be very difficult to undo.

To switch from one technology to another, you usually have to make a new investment and return to the beginning. There are many administrations that are held hostage because of a technological decision made years ago with companies that no longer provide updates to their systems or do not allow them to adapt to new ones.

How can distance-education platforms impact the rights of minors?

Gemma Galdon-Clavell, PhD

Teachers’ conversations, images, and opinions are stored on servers over which they have absolutely no control. Many teachers use free commercial applications, and if you don’t pay for the app, you are the product.

If a minor uses a commercial platform, all of their data (biometric data, geographical location, etc.) will be collected and sent to a server over which you also have no control.

Our children are the product of large multinationals that sell that data to third parties. Not to mention phone apps, which may require microphone or camera access permissions.

Pedophiles have been detected and caught in virtual classrooms with minors. We can’t have infrastructure without security measures. Zoom, for example, has had to improve its privacy protections during the pandemic.

Another debate that has intensified during the pandemic is fake news shared through social media and forums. How do we deal with this issue?

Gemma Galdon-Clavell, PhD

As Internet use increases, so does the ability to lie and manipulate. There are two options: let this new reality be managed by private companies such as Facebook or Google, or have it managed by the countries themselves.

It is more democratic to establish legal rules that determine the limits of deception and manipulation. Platforms already censor a lot of content, not for political opinions but for misinformation. Shouldn’t the elected officials we democratically vote for regulate this issue? If we don’t like private companies doing it, we should allow the state to legislate and then critique its proposals if we disagree.

In the EU, we’re considering an app that tracks the movement of the population and lets you know if you have been in contact with COVID-19 patients. You pointed out that for it to be effective, at least 60% of the population has to use it. What other characteristics must it meet for users to trust it?

Gemma Galdon-Clavell, PhD

For starters, it must be very transparent, very respectful of privacy, and explain exactly how it works.

If what you are building is an epidemiological alert system, it should also help a person get access to a test, just as when they call an emergency hotline. Experts are working on that.

And there has to be no doubt that it won’t be used for other purposes. That is our struggle at the moment, because some countries want access to more data but do not realize that if the population won’t use the app, the data is useless.

Original Spanish interview, here