Just as I wrote in last week's assignment, the relationship between labels and images in a machine learning dataset is deeply influenced by the people who define those labels. People have biases, and so do the labels they create. "Data" itself is socially constructed, as the book Artificial Unintelligence argues. The process of labeling images is therefore not neutral but infused with subjective decisions that can reflect biases related to race, gender, class, and other social categories, and these labels shape how models interpret the world. An image is also hard to define with only a few labels; it carries context and background information that should not be neglected. As the essay "Excavating AI" states, "Images are remarkably slippery things, laden with multiple potential meanings, irresolvable questions, and contradictions" (Crawford, Paglen). Labels embedded with cultural, racial, or gendered assumptions influence how machine learning models make decisions, which can have widespread impacts on society. When people train a model on a dataset with biased labels, the resulting model influences not only personal decisions but also critical decisions in areas like law enforcement, employment, and education.
I decided to build on my previous project, a Flappy Bird game, but I modified the game's appearance: I used an image I drew of my dog, Millie, as the 'bird,' and a bone-shaped object as the 'tube.' In my previous version, Millie was controlled by the mouse. To develop the project further, I decided to use sound to control whether Millie flies. My plan was that if someone clapped, Millie would fly, and with continuous clapping she would fly higher and higher. However, it didn't work, and I suspect the issue was with loading the ml5 library. I therefore decided to work with an image classification model and use images to control Millie instead.
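For reference, here is a minimal sketch of how the clap-controlled version could be wired with ml5's sound classifier. The model URL and the class labels ("Clap", "Background Noise") are placeholders for a hypothetical Teachable Machine audio model, not my actual files, and this assumes the ml5 v0.x API:

```javascript
// Hypothetical Teachable Machine audio model URL -- placeholder, not a real model.
const SOUND_MODEL_URL = "https://teachablemachine.withgoogle.com/models/YOUR_MODEL/model.json";

let soundClassifier;
let millieLift = 0; // upward force applied to Millie each frame

// Pure decision logic, kept separate from ml5 so it can be tested on its own:
// map the top prediction to a lift force (negative y is up in p5.js).
function liftForLabel(label, confidence, threshold = 0.75) {
  if (label === "Clap" && confidence >= threshold) {
    return -5;
  }
  return 0;
}

function preload() {
  // The sound classifier listens to the microphone once classification starts.
  soundClassifier = ml5.soundClassifier(SOUND_MODEL_URL);
}

function setup() {
  createCanvas(400, 600);
  // classify() keeps invoking gotSound with the latest prediction.
  soundClassifier.classify(gotSound);
}

function gotSound(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  // results[0] is the most confident label.
  millieLift = liftForLabel(results[0].label, results[0].confidence);
}
```

The key design choice is keeping `liftForLabel` free of any ml5 or p5 calls, so the game logic can be debugged even when the model fails to load, which was exactly the problem I ran into.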
When I worked with the image model, there weren't any issues with the ml5 setup. I initially tried using my hand to control Millie's jump, but the detection didn't work reliably, so I decided to try objects that could be detected more easily.
I trained a model on two conditions: me without a toy and me holding a toy. When I held the toy, Millie would fly.
https://drive.google.com/file/d/15rd49tHNWHnFiJ3-9QH6jxF7GK3ENF_l/view?usp=sharing
Then I applied the trained model to the Flappy Millie game. Instead of using the mouse to control Millie's flight, I used example code to connect ml5 to p5.js and integrate the trained model into my sketch. Now, when I hold the toy, Millie flies.
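A sketch of that integration might look like the following. The model URL and class names ("with toy" / "no toy") are placeholders standing in for my trained model, and this again assumes the ml5 v0.x image classifier API with a webcam capture from p5.js:

```javascript
// Hypothetical Teachable Machine image model URL -- placeholder, not a real model.
const IMAGE_MODEL_URL = "https://teachablemachine.withgoogle.com/models/YOUR_MODEL/model.json";

let classifier;
let video;
let flying = false;

// Pure helper: decide whether Millie should fly from the top predicted label.
function shouldFly(label) {
  return label === "with toy";
}

function preload() {
  classifier = ml5.imageClassifier(IMAGE_MODEL_URL);
}

function setup() {
  createCanvas(400, 600);
  video = createCapture(VIDEO); // webcam feed for the classifier
  video.hide();                 // draw the game, not the raw video
  classifyVideo();
}

function classifyVideo() {
  classifier.classify(video, gotResult);
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  flying = shouldFly(results[0].label);
  classifyVideo(); // classify the next frame for a continuous loop
}

function draw() {
  // In the game loop, apply an upward velocity while "flying" is true,
  // e.g. millie.velocity = -5; (hypothetical game variable)
}
```

Because `classify()` is called again inside its own callback, the game receives a steady stream of predictions rather than a single one, which is what lets holding the toy keep Millie in the air.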
https://drive.google.com/file/d/1RYNiN_bSwuvYx3jPdBRlvjD3gDnNHy4p/view?usp=sharing
However, using a toy to control the game makes it extremely difficult to play. I believe that if the sound model had worked properly, it would have made the game much easier to control.
Here is my p5.js link: https://editor.p5js.org/Christinawu/full/rFV-MvI-4