Emotional Analysis For The Visually Impaired

About the project

Visually impaired people cannot easily read the emotions of the person they are talking to. With this project, they can understand the thoughts and feelings of the people they are speaking with. I use the Google Joy Detector software together with the Google Text-to-Speech API.

Project info

Difficulty: Moderate

Platforms: Google, Raspberry Pi, Windows

Estimated time: 4 hours

License: GNU General Public License, version 3 or later (GPL3+)

Items used in this project

Hardware components

Google AIY Voice Kit for Raspberry Pi - Starter Pack x 1
Google AIY Vision Full Kit - Includes Pi Zero WH - V1.1 x 1

Software apps and online services

Adafruit IO MQTT Service - used for communication between the AIY Vision Kit and the AIY Voice Kit.
Google Joy Detector - the AIY Vision Kit's face detection API.
Google AIY Voice - the AIY Voice Kit's Text-to-Speech API.

Hand tools and fabrication machines

Screwdriver x 1

Story

Before getting started, I would like to mention that I had great difficulty developing this project due to health problems. But don't worry: I'm much better now and I keep getting better.

When I saw the theme of the contest, I decided to build this idea. I thought about how difficult my life would be if I were visually impaired. With the camera on the Google AIY Vision Kit and the Google Joy Detector, the other person's emotions can be recognized.

Since the software images on the Raspberry Pi boards in both kits were old, I had to update them first.

I send the emotion data from the Vision Kit to the AIY Voice Kit over MQTT. I chose MQTT, the most widely used protocol for the Internet of Things, for fast communication between the two devices. The Vision Kit transmits the emotion data to the Voice Kit, and the Voice Kit speaks the data it receives.

MQTT needs a broker, so I used the Adafruit IO MQTT service, one of Adafruit's services, since it offers free data transfer for students. For the Vision Kit I wrote an MQTT publisher in Python: it runs alongside the Joy Detector and publishes the emotion analysis to the MQTT broker. On the Voice Kit side I wrote an MQTT subscriber in Python: it uses the AIY kit's Text-to-Speech library to convert the incoming emotion data into speech.
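The publisher side could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the joy-score thresholds, the `emotion` feed name, and the credential placeholders are all my assumptions. Adafruit IO's MQTT broker is `io.adafruit.com` and its topics follow the `username/feeds/feed-key` format.

```python
# Hypothetical sketch: publish a Joy Detector result to Adafruit IO over MQTT.
# Thresholds and feed name are illustrative assumptions, not the project's values.

def classify_joy(score):
    """Map the Joy Detector's 0.0-1.0 joy score to a spoken emotion label."""
    if score >= 0.85:
        return "very happy"
    if score >= 0.5:
        return "happy"
    if score >= 0.2:
        return "neutral"
    return "sad"

def feed_topic(username, feed):
    """Adafruit IO MQTT topic format: <username>/feeds/<feed-key>."""
    return "{}/feeds/{}".format(username, feed)

if __name__ == "__main__":
    # Requires the paho-mqtt package and your Adafruit IO credentials.
    import paho.mqtt.client as mqtt

    ADAFRUIT_USER = "YOUR_AIO_USERNAME"  # placeholder
    ADAFRUIT_KEY = "YOUR_AIO_KEY"        # placeholder

    client = mqtt.Client()
    client.username_pw_set(ADAFRUIT_USER, ADAFRUIT_KEY)
    client.connect("io.adafruit.com", 1883)
    # In the real project this score would come from the Joy Detector callback.
    client.publish(feed_topic(ADAFRUIT_USER, "emotion"), classify_joy(0.9))
    client.disconnect()
```

Keeping the score-to-label mapping in a small pure function makes it easy to tune the thresholds without touching the MQTT plumbing.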

You can see my Python codes on my GitHub Repo.
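For a sense of the subscriber side, here is a hedged sketch of how the Voice Kit could receive labels and speak them. Again, the feed name and credentials are placeholders; `aiy.voice.tts` is the AIY Voice Kit's text-to-speech module, available only on the kit's own image, so the import is kept inside the main block.

```python
# Hypothetical subscriber sketch for the Voice Kit: receive emotion labels
# over MQTT from Adafruit IO and speak them aloud.

def speech_sentence(label):
    """Build the sentence the Voice Kit will speak for an emotion label."""
    return "The person looks {}.".format(label)

if __name__ == "__main__":
    # Requires the paho-mqtt package; aiy.voice.tts ships with the AIY image.
    import paho.mqtt.client as mqtt
    from aiy.voice import tts

    ADAFRUIT_USER = "YOUR_AIO_USERNAME"  # placeholder
    ADAFRUIT_KEY = "YOUR_AIO_KEY"        # placeholder

    def on_connect(client, userdata, flags, rc):
        # Subscribe once the connection is confirmed.
        client.subscribe("{}/feeds/emotion".format(ADAFRUIT_USER))

    def on_message(client, userdata, msg):
        # Speak every emotion label the Vision Kit publishes.
        tts.say(speech_sentence(msg.payload.decode("utf-8")))

    client = mqtt.Client()
    client.username_pw_set(ADAFRUIT_USER, ADAFRUIT_KEY)
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("io.adafruit.com", 1883)
    client.loop_forever()
```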

CAD, enclosures and custom parts

AIY Challenge

Project: Emotional Analysis For The Visually Impaired


Code

AIY Challenge

Project: Emotional Analysis For The Visually Impaired

Credits

fkurt97

I am Furkan, and I'm studying electrical and electronics engineering. I am also a software developer: I have been building mobile applications for about 4 years, and I work on electronics projects as well.
