Winner of Best AR/VR Hack, and 3rd place overall at HackUMass 2019
Odin is a project that assists people who are deaf or hard of hearing by combining sensors with sound-placement algorithms and projecting the resulting information onto AR glasses. It detects and classifies nearby sounds, then notifies the user of what each sound is and where it is coming from.
This project was created in collaboration with my teammates Hannan Rhodes, Bill Ray, and Keerthan Ekbote. Find the code here.
This project was built with web technologies, socket programming, and ARCore. The sensor itself was connected to a Raspberry Pi, which served as the "brains" of the system. The Pi sent data over a TCP connection to a running Express server, which processed the data and relayed it to the mobile app. At each stage, the data had to be transformed as needed: for example, noisy microphone data from the Pi was high-pass filtered and processed to extract X, Y, and Z vector components for the direction of the sound. In addition, we used audio clips gathered from the microphones with IBM's Audio Classification API to classify the types of noise the sensor was hearing.
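The Pi-to-server hop can be sketched as a plain TCP socket carrying JSON-encoded readings. This is a minimal, self-contained illustration, not the project's actual code: the host, port, and message fields are hypothetical, and the Express listener is stood in for by a local Python accept loop so the example runs on its own.

```python
import json
import socket
import threading

# Hypothetical address for the server's TCP listener; the real project's
# host/port are not documented here.
HOST, PORT = "127.0.0.1", 5050

def send_reading(reading, host=HOST, port=PORT):
    """Pi side: open a TCP connection and ship one JSON-encoded reading."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall(json.dumps(reading).encode())

# Stand-in for the Express server side, so the sketch is self-contained.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind((HOST, PORT))
srv.listen(1)

received = []
def accept_once():
    """Accept a single connection and decode the JSON payload."""
    conn, _ = srv.accept()
    with conn:
        received.append(json.loads(conn.recv(4096).decode()))

t = threading.Thread(target=accept_once)
t.start()
send_reading({"direction": [0.6, 0.0, 0.8], "level_db": -23.4})
t.join()
srv.close()
print(received[0]["direction"])  # → [0.6, 0.0, 0.8]
```

In the real pipeline the receiving end would be the Express server, which reshapes the payload before forwarding it on to the mobile app.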
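The high-pass filtering step can be illustrated with a first-order digital high-pass filter, which attenuates slow drift and low-frequency rumble while passing faster variations through. This is a generic sketch, not the project's actual filter; the sample rate and cutoff below are made-up values.

```python
import math

def high_pass(samples, sample_rate, cutoff_hz):
    """First-order high-pass filter: suppresses content below cutoff_hz
    while passing higher-frequency signal largely intact."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        # y[i] = alpha * (y[i-1] + x[i] - x[i-1])
        out.append(alpha * (out[-1] + samples[i] - samples[i - 1]))
    return out

# A constant offset is 0 Hz content, so the filter drives it toward zero.
dc = [1.0] * 1000
filtered = high_pass(dc, sample_rate=8000, cutoff_hz=200)
print(abs(filtered[-1]) < 0.01)  # → True
```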
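One simple way to turn per-microphone readings into X, Y, Z direction components is to weight each microphone's facing direction by its measured level and normalize the sum. The original writeup does not describe the actual algorithm or array geometry, so both the layout and the weighting scheme below are hypothetical.

```python
import math

# Hypothetical layout: four microphones facing outward along two axes.
MIC_DIRECTIONS = [(1, 0, 0), (-1, 0, 0), (0, 0, 1), (0, 0, -1)]

def estimate_direction(levels):
    """Return a unit vector: each mic's facing direction weighted by its level."""
    x = sum(lv * d[0] for lv, d in zip(levels, MIC_DIRECTIONS))
    y = sum(lv * d[1] for lv, d in zip(levels, MIC_DIRECTIONS))
    z = sum(lv * d[2] for lv, d in zip(levels, MIC_DIRECTIONS))
    norm = math.sqrt(x * x + y * y + z * z) or 1.0
    return (x / norm, y / norm, z / norm)

# Loudest on the +X and +Z mics, so the estimate points between those axes.
direction = estimate_direction([0.9, 0.1, 0.5, 0.1])
print(direction)
```

A unit vector like this is a convenient shape to hand to the AR layer, which can place the notification relative to the wearer's view.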