2020.12.01 - 2021.01.15.
Cochl. X Mercedes Benz
Non-Verbal Interaction In Autonomous Vehicles
During my internship as a Software Engineer at Cochl (a machine learning company specializing in non-verbal sound recognition AI), I built the car cockpit display that integrated Cochl's Sound Recognition AI into the Mercedes-Benz cockpit.
With the rise of smart cars and more interactive technology, Daimler (Mercedes-Benz) was interested in creating a more emotionally aware car. Cochl's sound AI can recognize non-verbal sounds such as sighs, coughs, sirens, and machine malfunctions, allowing the car to infer the driver's emotional state and the surrounding environment.
Among these capabilities, the features shown in the graph below are the ones we planned to add to Mercedes-Benz.
My main job was to create the front-end user display for the Mercedes-Benz cockpit and integrate it with the backend webserver. Messages received from the backend SDK were caught by the webserver and passed on to the frontend to update the display or notify the user. The whole process took about a month, and it was an exciting experience to build user interaction with AI devices on a smart-car display. Below are the features I developed in this software application.
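The message flow above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not Cochl's actual SDK or webserver code: the JSON payload format, the `tag` field, and the handler names are all assumptions made for the sake of the example.

```python
import json

# Hypothetical mapping from a recognized sound tag to a frontend action.
# The real feature set (Harmonizer, Emergency, Baby Cry, ...) would each
# register its own display behavior here.
FEATURE_HANDLERS = {
    "baby_cry": "Show baby-cry alert on the cockpit display",
    "siren": "Show emergency notification",
    "cough": "Suggest cabin air refresh",
}

def handle_sdk_message(raw: str) -> str:
    """Parse a JSON message from the backend SDK and decide what the
    frontend should display. The payload shape is assumed here."""
    event = json.loads(raw)
    tag = event.get("tag")
    # Tags with no registered feature are ignored rather than shown
    # to the driver.
    return FEATURE_HANDLERS.get(tag, "No display change")

# Example payload the webserver might receive from the SDK:
message = json.dumps({"tag": "baby_cry", "confidence": 0.93})
print(handle_sdk_message(message))
```

In the real application the webserver pushed these decisions to the frontend over a persistent connection so the cockpit display updated as soon as a sound was recognized.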
Feature 1 : Harmonizer
The video below is a demonstration of the Harmonizer feature!
Feature 2 : Emergency
Feature 3 : Secret Language
Feature 4 : Baby Cry
Feature 5 : Cough
Feature 6 : Hand Clap
Feature 7 : Sigh
Feature 8 : Dog Bark
Feature 9 : Animal Game
This was a very meaningful experience for me because it first opened my eyes to the challenges of Human Interaction with Autonomous Systems. After this internship, I became interested in studying the development of Human Interaction with Autonomous Technology.