Personal Collections

Blanca Vives on Invisible Women: Exposing Data Bias in a World Designed for Men

9 February, 2022

The architecture of bias in artificial intelligence

“Seeing men as the human default is fundamental to the structure of human society”. This is the premise of Caroline Criado Perez’s book. The text depicts how female perspectives and needs have often been envisioned as a deviation from men’s and have thus been misrepresented in all aspects of society: from urban design to the healthcare system.

Reading the book got me thinking about how this exact same premise applies to Artificial Intelligence (AI). Although it might seem that externalizing the decision-making process and leaving it to technology could be a path to avoiding the bias present in human decisions, the truth is that technology has turned out to be just as biased.

AI is fed with data, which is then processed by an algorithm to reach a conclusion and function accordingly. Data, therefore, forms the bricks in the architecture of AI. But if we train algorithms with inherently biased data, and if our assumptions as a society are built disregarding certain segments of the population, we are simply perpetuating the same pattern that Criado Perez explains in her book.

There are already numerous examples of how our society’s structure shapes the architecture of bias in AI. Take the recruiting system created by Amazon. It identified the traits most common in the resumes of successful individuals within the company and then looked for those traits in the resumes of applicants. The tool never went far: during the trial period, the team discovered that, because most people with a successful career within the company were men, the system systematically ruled out female candidates. This is a great example of how statistics may generally paint a true picture of our society, but they also carry the bias that society was built on. And data analysis is all about statistics.
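The mechanism behind that failure can be sketched in a few lines. The following toy model is purely illustrative (it is not Amazon’s system, and the resume tokens and the hypothetical women-coded token `womens_chess_club` are invented): it scores each resume keyword by how often it appeared on “successful” historical resumes, so a token that correlates with gender in the biased history drags down an otherwise identical applicant.

```python
from collections import Counter

# Hypothetical historical hires, labeled by career success. Because most
# successful hires were men, the women-coded token "womens_chess_club"
# appears only among the less-successful group -- the bias is in the data.
historical = [
    (["python", "leadership"], True),
    (["python", "teamwork"], True),
    (["java", "leadership"], True),
    (["python", "womens_chess_club"], False),
    (["java", "womens_chess_club"], False),
    (["teamwork"], False),
]

def train(data):
    """Weight each token by the success rate of resumes containing it."""
    hits, totals = Counter(), Counter()
    for tokens, successful in data:
        for t in tokens:
            totals[t] += 1
            if successful:
                hits[t] += 1
    return {t: hits[t] / totals[t] for t in totals}

def score(weights, tokens):
    """Average the learned token weights; unseen tokens count as neutral 0.5."""
    return sum(weights.get(t, 0.5) for t in tokens) / len(tokens)

weights = train(historical)

# Two applicants with identical skills; one also lists a women-coded activity.
a = score(weights, ["python", "leadership"])
b = score(weights, ["python", "leadership", "womens_chess_club"])
print(a > b)  # → True: the proxy token alone lowers the second score
```

Nothing in the training step mentions gender explicitly; the model simply inherits the correlation baked into its history, which is exactly why auditing outcomes matters more than auditing intentions.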

As a lawyer, I have been following this matter for years now. Some of the legal initiatives to prevent bias in AI aim at ensuring diversity in programming teams, auditing the output of AI systems to spot bias at an early stage, and putting in place codes of conduct for programmers that address aspects such as discrimination. If we want technology to serve us in the ways we need it to, we must ensure that it stands on the architecture of diversity.
