Although awareness of algorithms and their potential for discrimination has increased significantly over the past five years, Gemma Galdon Clavell, director of Barcelona-based algorithmic auditing consultancy Eticas, tells Computer Weekly that too many in the tech sector still wrongly see technology as socially and politically neutral, creating major problems in how algorithms are developed and deployed.
On top of this, Galdon Clavell says most organisations deploying algorithms have very little awareness or understanding of how to address the challenges of bias, even if they do recognise it as a problem in the first place.
The state of algorithmic auditing
Many of the algorithms Eticas works on are “so badly developed, oftentimes our audit work is not just to audit but to actually reassess where everything’s being done”, Galdon Clavell says.
While analysing and processing data as part of an algorithm audit is not a particularly lengthy process, Eticas’s audits take “six to nine months” because of how much work goes into understanding how algorithm developers are making decisions and where all the data is actually coming from, she adds.
“Basically all these algorithms have a really messy back end, like someone’s not even been labelling the data or indexing everything they’ve been using. There are so many ad-hoc decisions we find in algorithms with a social impact – it’s just so irresponsible, it’s like someone building a medicine and forgetting to list the ingredients they used,” she says, adding that 99% of the algorithms she comes across are in this state.