
Is AI Doing Gender?

Sogeti Labs
September 10, 2019

Before we engage in a discussion with Alexa or any other AI, let’s recall what happened in the last article. As detailed in my previous post, the current Gender situation has been culturally shaped by heterosexual patriarchs over the last millennia. All the differences we think of between male and female are just stereotypes embedded in human culture. Furthermore, Gender is a component of intersectionality: it is one among various factors that generate different dynamics in a planned society.

If you are a “simple” user of Alexa or “OK Google”, you may think that there is real intelligence inside these little devices. Thanks to Eva Holmquist’s (Sogeti Sweden) and Rik Marselis’s (Sogeti Netherlands) latest post, we have a deeper view of AI’s guts.

Machine Learning

To know whether AI does Gender or not

We must evaluate the involvement of humans in the process of AI integration. Given that machine learning is primarily based on a massive amount of data, let’s have a look at how this data is blended into the asexual AI’s brain.

AI’s procreation goes through three steps:

  1. Data collection: depends on the goal the product owners want to achieve with the machine learning algorithm
  2. Dataset correction: remove outliers and data elements that would wrongly influence the outcome
  3. Data transformation: shape the data into the format expected by the business
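These three steps can be sketched in code. The following is a minimal, hypothetical Python illustration on a toy numeric dataset; the function names, the outlier cutoff, and the min-max scaling choice are my own assumptions, not part of any real Alexa or Google pipeline.

```python
# Hypothetical sketch of the three data-preparation steps on toy data.

def collect_data():
    # 1. Data collection: gather raw samples relevant to the product goal.
    return [4.8, 5.1, 5.0, 97.0, 4.9, 5.2]  # 97.0 is a measurement glitch

def correct_dataset(samples, max_value=10.0):
    # 2. Dataset correction: drop outliers that would skew the outcome.
    return [x for x in samples if x <= max_value]

def transform_data(samples):
    # 3. Data transformation: reshape into the format the model expects,
    #    here a simple min-max scaling to the [0, 1] range.
    lo, hi = min(samples), max(samples)
    return [(x - lo) / (hi - lo) for x in samples]

raw = collect_data()
clean = correct_dataset(raw)
features = transform_data(clean)
print(features)
```

Notice that the human decides at every step: which data to collect, which values count as outliers, and what shape the result should take.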

Moreover, for the machine learning to occur, the system also needs to have feedback on the result. In many cases, this is also done by humans, giving us another human influence on learning. We could say that

The machine learns while the human teaches prejudices
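The human-feedback step described above can be sketched as well. In this hypothetical toy example, a trivial threshold “model” nudges its only parameter whenever a human teacher’s label disagrees with its guess, which is exactly how the teacher’s judgments, prejudices included, flow into the machine.

```python
# Hypothetical sketch of learning from human feedback.

def predict(threshold, value):
    # The "model": classify a value as positive if it meets the threshold.
    return value >= threshold

def learn_from_feedback(threshold, examples, lr=0.1):
    # examples: (value, human_label) pairs supplied by a human teacher.
    for value, label in examples:
        guess = predict(threshold, value)
        if guess != label:
            # Move the threshold toward the human's judgment,
            # whatever biases that judgment may carry.
            threshold += -lr if label else lr
    return threshold

feedback = [(0.4, True), (0.45, True), (0.9, True), (0.1, False)]
t = learn_from_feedback(0.5, feedback)
print(t)
```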

On that note, let’s bring another element into the discussion. Imagine an algorithm that is created to rotate pictures of human portraits. Regardless of the subject (man or woman), each picture will be treated equally by the machine: no difference in treatment between male and female pictures. We can then affirm that, in essence, a Process is devoid of any gender stereotype. That may open real hope for Gender to get help from the Digital world.
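A toy sketch of that point: a hypothetical rotation routine operates on pixel values alone and is blind to who is in the portrait.

```python
# Hypothetical illustration: rotation is pure arithmetic on pixels.

def rotate_90(image):
    # Rotate a 2D grid of pixel values 90 degrees clockwise; the
    # arithmetic never inspects what the pixels depict.
    return [list(row) for row in zip(*image[::-1])]

portrait_a = [[1, 2], [3, 4]]  # stands in for any portrait
portrait_b = [[5, 6], [7, 8]]  # stands in for any other portrait
print(rotate_90(portrait_a))
print(rotate_90(portrait_b))
```

The same transformation is applied identically to both inputs; nothing in the process depends on the subject of the picture.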

But, as we saw at the beginning of the discussion, AI is not only based on process but is made of flesh and data, and that data has been processed by a human. This means that the Gender capacities of AI depend directly on its statisticians.

AI’s Gender position is based on the combination of its teachers’ Gender visions

A combination, not an addition. Even if today we don’t have enough perspective on the issue, we can nevertheless consider that if there is no willingness to actively introduce gender awareness into AI, the stereotypes present in our planned society are likely to pop up out of AI’s mind.

Consequently, decision makers must sponsor efforts to inoculate at least gender equity into machine learning or chatbot projects, to be sure that future AI will participate virtuously in doing Gender. Yet just 24 female CEOs led the companies on the 2018 Fortune 500 (even fewer than the year before), meaning that even though Alexa sounds like a female, its attitude is much more that of a heterosexual patriarch.

About the author

SogetiLabs gathers distinguished technology leaders from around the Sogeti world. It is an initiative explaining not how IT works, but what IT means for business.

    Comments

    One thought on “Is AI Doing Gender?”

    1. Hello,
      This is a very interesting topic.
      My feeling is that we will still have “gender” in AI as long as it is designed by humans. Even if we try to get rid of it, I really think that our unconscious habits will put some “genderized” traits into AI.
      However, in a not-so-distant future, AI will be designed by AI, and since the machines will learn from themselves that they don’t have gender, it will not be inherited by their successors. My conclusion is that “gender-free AI” will come when the machines rise and design themselves.
