Google Glass and smart watches are garnering most of the media attention on wearable computers, but the wrist and face are only two locations for wearable computing. Many researchers and engineers are looking at the ear as a logical location. This is especially promising because we've become accustomed to seeing people with objects in or on the ear, be it earrings, headphones, or Bluetooth devices.
The recent article People Don’t Like Google Glass Because It Makes Them Seem Weak brings attention to the stigma associated with bulky, scary-looking assistive technology. Compare the large assistive communication devices of the past with Proloquo2Go running on an iPad: the user’s technology went from exotic to invisible.
The ear is also conveniently located near several key senses: hearing, sight, and taste. Ear-based computers are likewise well suited to voice recognition and audio output.
The Earclip-type Wearable PC from Hiroshima City University looks like an earring, but allows the user to control its functionality via facial expressions.
The system, which developers are hoping to have ready for Christmas 2015, can be connected to an iPod or other gadget and would allow the user to navigate through software programmes using facial expressions, such as a raised eyebrow, a stuck-out tongue, a wiggle of the nose or by clenching teeth.
The device uses infrared sensors that monitor tiny movements inside the ear, which differ depending on how the eyes and mouth move.
Japan researchers testing tiny ear computer – Phys Org
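To make the idea concrete, here is a minimal sketch of how in-ear infrared readings might be mapped to expressions. The sensor names, thresholds, and expression mapping are illustrative assumptions, not the actual Hiroshima City University implementation.

```python
# Hypothetical sketch: classify facial expressions from in-ear infrared
# distance readings. "front"/"rear" sensors, thresholds, and the
# expression mapping are all assumptions for illustration only.

def classify_expression(baseline, reading, threshold=0.15):
    """Compare IR distance readings against a resting baseline.

    baseline, reading: dicts of per-sensor ear-canal deformation values.
    Returns a guessed expression label.
    """
    front = reading["front"] - baseline["front"]
    rear = reading["rear"] - baseline["rear"]
    if abs(front) < threshold and abs(rear) < threshold:
        return "neutral"
    if front > threshold and rear > threshold:
        return "clench_teeth"    # jaw motion deforms the whole canal
    if front > threshold:
        return "raise_eyebrow"   # only the forward canal wall shifts
    return "stick_out_tongue"    # only the rear wall shifts

baseline = {"front": 1.00, "rear": 1.00}
print(classify_expression(baseline, {"front": 1.00, "rear": 1.02}))  # neutral
print(classify_expression(baseline, {"front": 1.30, "rear": 1.25}))  # clench_teeth
```

A real device would need per-user calibration and a trained classifier rather than fixed thresholds, but the pipeline (baseline, delta, label) would look broadly similar.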
Dash, a project from BRAGI, takes a Bluetooth earbud and combines it with sensors and a bone-conduction microphone to provide the user with clear sound, a wireless microphone, and tracking of pulse and other biometric data.
The iRiver device continues the concept of using the ear as both an audio channel and a physical sensor; however, this device adds a base unit that sits behind the neck.
Olga Kharif’s article Ears follow eyes in $1.84 billion wearable computer boom highlights several products and research projects that focus on the advantages of ear-based computers and devices. She quotes Romulas Pereira’s description of what makes the ear so attractive to developers.
“The ears are a very convenient place, because they are the one part of your body that tilts and pans to where you are looking at,” Pereira said. “Life requires both hands, frequently. Glasses and the wrist have become the poster child for exploring wearability. Behind this poster child is a whole population of things.”
With clear access to biometric feedback, audio, speech, and line of sight, ear-based wearables may provide the ideal location for hands-free, eyes-free products. There are many more possibilities, especially as developers tap into existing cochlear implants, use stereo microphones to detect sound location, and integrate motion sensors to capture the full spectrum of personal data.