I worked on AI and how I could help it learn new things. Basically, I tried to show it different hand gestures and give each of those gestures a certain command that the AI had to follow.
I did this because it's fun and innovative and I'd never tried something like this before. It was very fun but also confusing, and I have to admit there were many difficulties and challenges along the way.
I recorded my hand doing each gesture from different angles and then assigned it a command, like Open Hand = Pause. After I finished recording all the gestures and assigning their commands, I had the AI process the information I had just shown it and learn from it.
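To give an idea of what that training step could look like, here is a minimal sketch in Python, assuming the gesture recordings are saved as images in one folder per gesture. The folder names, image size, and the small Keras network are only an illustration, not necessarily the exact tool that was used.

```python
# A minimal training sketch, assuming a folder layout like
# gestures/open_hand/, gestures/fist/, ... (these names are hypothetical).
import tensorflow as tf

IMAGE_SIZE = (96, 96)

# Load the labeled images; each folder name becomes a gesture label.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "gestures", image_size=IMAGE_SIZE, batch_size=16
)
num_classes = len(train_ds.class_names)

# A small convolutional network is enough for a handful of gestures.
model = tf.keras.Sequential([
    tf.keras.Input(shape=IMAGE_SIZE + (3,)),
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(num_classes),
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train_ds, epochs=10)
model.save("gesture_model.keras")
```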
It works because the AI sees the gestures and connects them with their commands, so what it learns is that a certain gesture means a certain thing. For example, it collected the many images of, let's say, my fist and understood that when I put my fist up it meant to go to the previous track.
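Once the model has learned the gestures, turning a prediction into an action can be as simple as a lookup table. Here is a minimal sketch, assuming the model saved above; the two commands come from the examples mentioned earlier, and the label order and file names are assumptions for illustration.

```python
# A minimal sketch of the gesture-to-command step, assuming the trained
# model from the previous sketch.
import numpy as np
import tensorflow as tf

# Each learned gesture label maps to one command.
GESTURE_TO_COMMAND = {
    "open_hand": "pause",
    "fist": "previous_track",
}

model = tf.keras.models.load_model("gesture_model.keras")
class_names = ["fist", "open_hand"]  # assumed alphabetical, matching the training folders

def predict_command(image_path: str) -> str:
    """Classify one gesture image and return the command it stands for."""
    img = tf.keras.utils.load_img(image_path, target_size=(96, 96))
    batch = np.expand_dims(tf.keras.utils.img_to_array(img), axis=0)
    logits = model.predict(batch)
    gesture = class_names[int(np.argmax(logits[0]))]
    return GESTURE_TO_COMMAND.get(gesture, "do_nothing")

print(predict_command("test_fist.jpg"))  # e.g. "previous_track"
```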
Thanks to this project we can learn more about machines, and I really think it's a new and fun way to teach kids how machines work. The results also varied a lot. Sometimes my gestures were too similar, so the predictions ended up being 50-50 because the machine got confused. Using the results to see what I did wrong and correct it was essential for my project.
I think I could make it better if I somehow managed to connect it to a robot with a camera, so I could make the robot move how I wanted, bring me certain things, or act a certain way depending on my gestures.
I got plenty of help from Mr. Herron. Thank you, Mr. Herron.
I love the practical applications of your project for sign language and how you're embracing the possibilities of working with generative AI. It’s exciting to see technology being used to promote inclusion and communication. Keep up the great work!