Emotion detection with Beyond Verbal

Emotion detection was an early proof of concept for Google Glass. It relied on facial analysis to infer a person's emotion — in one demo, the wearer "was rather happy with herself."
Communication UIs and emotion detection have improved significantly over the years. What new possibilities does this open up for wearables?

Could a device detect emotion in the wearer's voice and give feedback that helps control anxiety or reinforce positive emotions? Could it help a person with autism better read a social situation?

Founded in Tel Aviv in 2012, Beyond Verbal has been building on decades of emotion analytics research with its own studies, carried out in conjunction with notable organizations including the Mayo Clinic, the University of Chicago, Scripps, and Hadassah Medical Center. Its data gathering has produced a collection of more than 2.5 million "emotion-tagged voices" across 40 languages. The company's technology doesn't consider the content or context of the spoken word; instead, it looks for signs of anxiety, arousal, anger, and other emotions by examining the intonation in a person's voice.
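Beyond Verbal's actual models are proprietary, but the idea of "examining intonation" can be sketched in a few lines: track the pitch of the voice frame by frame and summarize how it moves. The NumPy sketch below estimates per-frame pitch with a simple autocorrelation method and reports its mean and range — the kind of prosodic features emotion classifiers commonly start from. All function names and parameters here are my own illustration, not anything from the company's API.

```python
import numpy as np

def pitch_autocorr(frame, sr, fmin=75.0, fmax=400.0):
    """Estimate the fundamental frequency of one frame via autocorrelation.

    Searches for the strongest autocorrelation peak between the lags
    corresponding to fmax and fmin (typical speech pitch range).
    """
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(corr[lo:hi])
    return sr / lag

def intonation_features(signal, sr, frame_len=2048, hop=512):
    """Track pitch per frame and summarize its variability --
    a crude stand-in for the intonation cues described above."""
    pitches = []
    for start in range(0, len(signal) - frame_len, hop):
        pitches.append(pitch_autocorr(signal[start:start + frame_len], sr))
    pitches = np.array(pitches)
    return {"mean_f0": float(pitches.mean()),
            "f0_range": float(pitches.max() - pitches.min())}

# Synthetic demo: a tone sweeping from 120 Hz to 240 Hz stands in for
# a voice with rising intonation (real input would be recorded audio).
sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
f0 = 120 + 120 * t
tone = np.sin(2 * np.pi * np.cumsum(f0) / sr)

feats = intonation_features(tone, sr)
print(feats)  # wide f0_range, reflecting the rising pitch contour
```

A real system would of course go further — adding energy, speaking rate, and spectral features, then feeding them to a trained classifier — but the pitch contour is where intonation analysis typically begins.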


Beyond Verbal

Posted in Communication UI.
