
A Gender Bias Classifier

Nobody wants to think of themselves as biased, let alone sexist. Nevertheless, we all associate certain attributes with certain genders in one way or another. I actually consider myself an open-minded, feminist, and self-critical person, which is exactly why I was curious to reveal my unconscious associations.

The basic idea is to capture my unconscious associations in data and reveal them through a classification model. This experiment was done with the open-source tool Teachable Machine. I would like to invite you to follow these steps yourself and to leave me a comment or an e-mail about your experiences.


Step 1 - Prepare the Dataset

I started this small exploration by classifying random objects in my environment into the binary categories of either 'male' or 'female'. Although this might seem pretty arbitrary, I would like to ask you to just try the task yourself. I was surprised how easily I could tell whether an object felt more semantically 'male' or 'female'. However, explaining why I felt that way about certain objects turned out to be a far more difficult task.


Step 2 - Training

In the next step I used Teachable Machine to train a classifier on my dataset. I had to make sure to include images of my camera background in both categories, and to keep my fingers out of the images so as not to skew the results. I also made sure to have an almost equal number of images per category.
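Teachable Machine hides the training details behind its interface, but the underlying idea is ordinary supervised binary classification on a balanced dataset. As a rough illustration only (not Teachable Machine's actual pipeline, which fine-tunes a pretrained image network), here is a minimal perceptron trained on two balanced classes of toy feature vectors:

```python
# Toy illustration of binary classification, NOT Teachable Machine's
# actual implementation. Each "image" is reduced to a made-up
# 2-feature vector so the example stays self-contained.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches labels (+1/-1)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else -1
            if pred != y:  # update weights only on mistakes
                w[0] += lr * y * x[0]
                w[1] += lr * y * x[1]
                b += lr * y
    return w, b

def classify(x, w, b):
    return "male" if (w[0] * x[0] + w[1] * x[1] + b) > 0 else "female"

# Balanced toy dataset: the same number of samples per category,
# mirroring the roughly equal image counts used in the experiment.
male_samples = [(0.9, 0.1), (0.8, 0.2), (0.95, 0.05)]    # label +1
female_samples = [(0.1, 0.9), (0.2, 0.8), (0.05, 0.95)]  # label -1
samples = male_samples + female_samples
labels = [1, 1, 1, -1, -1, -1]

w, b = train_perceptron(samples, labels)
print(classify((0.85, 0.15), w, b))  # → male
```

Keeping the classes balanced matters here for the same reason it mattered in the experiment: with far more images in one category, the model can score well by simply favoring that category.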


Step 3 - Testing my unconscious associations

With the new individual gender bias lens I had created, I started testing a bunch of different objects from my environment to see what category they would fall into. I also started testing my own designs to see whether they could be clearly classified as either 'male' or 'female'. Some of the classifications were surprising, like the one pictured below. You would probably agree with me when I say 'knitting is female'. Curious to find out why the classifier thought differently, I started to investigate the images in my 'male' category. I found at least three images showing tools of some kind that, taken out of their context of use, actually bore quite a resemblance to the knitting needles.



I experienced the first step of classifying my images and creating my dataset as a sensitizing activity that already started to make me aware of the bias in my thinking.

The testing itself was then very helpful for gaining a deeper understanding of my unconscious associations. I learned that my gender bias manifests itself a lot in color associations. Whereas many of the black objects were classified as 'male', most of the more richly colored objects were seen as 'female'. The same object could thereby be assigned to two different categories depending on which side of the object was facing the camera.
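The color finding can be made concrete with a small sketch. Assuming, purely for illustration, that the classifier keyed on nothing but average color (the real model sees whole images), a nearest-centroid rule over RGB values would already reproduce the observed pattern of dark objects landing in 'male' and richly colored objects in 'female':

```python
# Illustrative nearest-centroid classifier over average RGB color.
# The centroid values below are hypothetical, chosen to mimic the
# "dark vs. richly colored" split observed in the experiment.

def mean_color(pixels):
    """Average an iterable of (r, g, b) tuples into one color."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def classify_by_color(pixels, centroids):
    """Return the category whose color centroid is closest (Euclidean)."""
    avg = mean_color(pixels)
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(avg, centroids[label])) ** 0.5
    return min(centroids, key=dist)

centroids = {
    "male": (40, 40, 40),       # mostly dark/black pixels
    "female": (200, 120, 160),  # saturated, colorful pixels
}

black_object = [(30, 30, 30), (50, 45, 40)]
pink_object = [(220, 130, 170), (210, 110, 150)]
print(classify_by_color(black_object, centroids))  # → male
print(classify_by_color(pink_object, centroids))   # → female
```

A rule this simple also explains the flip described above: turn an object so that a differently colored side faces the camera and its average color, and therefore its nearest centroid, changes.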


Next Steps

Next, I am curious to use the findings and the 'bias lens' I created to spark a dialogue with different people about biases in our designs. In a larger context, I hope to run this experiment with multiple designers, who can then create a safe space for a discussion of their findings in terms of personal biases.
