Before we dive into why more women should lead AI teams, I want to share a fascinating story I heard from Tania Biland, a third-year student at Lucerne University of Applied Sciences and Arts.

The story as narrated by Tania:

Last semester, our class was split into three groups to develop a safety technology solution for Swiss or German brands:

Group 1: Only women (my group)

Group 2: Only men

Group 3: Four women and one man

After four weeks of work, each team had to present its results.

Group 1, composed of only women, developed a safety solution for women in the dark. As the jury was all male, we decided to tell a story using a persona, music, and videos in order to make them feel what women experience on a daily basis. We also emphasized that everyone has a mother, sister, or wife in their life, and that they probably don’t want her to suffer. In the end, our solution was rather simple technologically — using light to provide safety — but connected with the audience emotionally.

Group 2, composed of only men, presented a more high-tech solution using AI, GPS, and video conferencing. They based their arguments on facts and numbers and pointed out their competitive advantages.

In Group 3, with four women and one man, the outcome didn’t seem finished. The only man in the group could not accept being led by women, so the team spent too much time discussing group dynamics instead of working.

The groups not only produced different outputs but also approached the problem differently. My group (Group 1) started by defining each member’s work preferences and styles in order to distribute responsibilities and keep the hierarchy as flat as possible.

The other two groups, by contrast, elected a team leader. These “leaders” were perceived more as dictators, which led to heavy conflicts: those teams spent hours discussing and arguing while our group simply worked productively.

What science tells us about gender differences

The scientific landscape regarding gender differences and their effects on behavior is still evolving, and it has not yet produced a clear set of explanations for the differences we observe. Compiling most of the research, two main factors influence behavior:

  1. Potential physiological differences between men and women
  2. Social norms and pressures forming different behaviors

In the story above, as told by Tania, the women developed their solution using a collaborative leadership style (an adhocracy culture), rotating the leading position based on the task at hand within an almost flat hierarchy. They built their argumentation by involving all stakeholders (in this case the mothers and wives — the users) and showing empathy for their problems. They saw the bigger picture and built a simpler solution that was actually finished.

Through this story, I was able to connect the dots on why most AI projects never move from the prototype phase to a real-world application.

Why are AI products not adopted?

Based on my experience, there are three main reasons why most AI and Machine Learning (ML) solutions never move from the prototyping phase to the real world:

  1. Lack of trust: One of the biggest difficulties for AI or ML products is a lack of trust. Millions of dollars have been spent on prototyping, but with very little success in real-world launches. Trust is one of the most fundamental values in doing business and providing value to customers, and Artificial Intelligence is the most heavily debated technology when it comes to ethical concerns and the trust issues they raise. Trust comes from involving different opinions and parties throughout development, which rarely happens in the prototype phase.
  2. The complexity of a launch: Building a prototype is easy, but there are dozens of external factors to consider when moving into the real world. Beyond the technical challenges, other areas of focus (such as marketing, design, and sales) need to be integrated with the prototyping work.
  3. AI products often do not take all stakeholders into account: I have heard stories of Alexa and Google Home being used by men against their spouses in instances of domestic violence — turning the music up really loud, or locking them out of their homes. It is possible that in an environment of mostly male engineers, no one thinks about these kinds of scenarios. There are also many documented instances of artificial intelligence and data sensors being biased, sexist, and racist [1].

Interestingly, none of these three points relates to technical challenges, and all of them can be overcome by building the right team.

How can AI be adopted more successfully?

To solve the challenges above and build more successful AI products, we need a more collaborative and community-driven approach — one that takes into account the opinions of different stakeholders, especially those who are under-represented. Below are steps to achieve that:

Step 1. Involve different groups, especially women from the middle of the talent pyramid

In technology, most companies focus on hiring people at the top of the talent pyramid, where, for primarily historical reasons, there are fewer women. For example, most Computer Science classes are less than 10 percent women. However, many talented women sit in the middle of the pyramid, educating themselves through online courses but lacking opportunities and encouragement.