Individuals with Parkinson’s and other tremor conditions lose the ability to manage basic tasks that involve gripping, pouring, and eating due to persistent shaking of their hands. Google introduced Liftware, an ingenious spoon that counteracts the hand’s shaking while eating, but it is limited to that single task.
Instead of focusing on a single task, GyroGear is focusing on the hand itself. They’ve added a small yet powerful gyroscope to the back of a glove to increase stability.
Gyroscopes are spinning discs, inspired by bleeding-edge aerospace technology yet no different in principle from a child’s spinning top. Because they conserve angular momentum, gyroscopes do their utmost to stay upright, countering any applied force instantaneously and proportionally.
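The physics behind that stabilizing effect can be sketched in a few lines. This is a minimal illustration of the principle, not GyroGear’s actual control code; the disc size and spin rate below are invented for the example.

```python
# Sketch of the gyroscopic principle: a spinning disc with angular
# momentum L = I * omega_spin resists a tremor's tilt with a reaction
# torque tau = omega_tremor x L. Faster spin -> stronger resistance.

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def gyro_reaction_torque(inertia, spin_rate, tremor_rate):
    """
    inertia:     disc moment of inertia about its spin axis (kg*m^2)
    spin_rate:   disc angular velocity vector (rad/s)
    tremor_rate: hand's angular velocity vector from the tremor (rad/s)
    Returns the gyroscopic reaction torque vector (N*m).
    """
    L = tuple(inertia * w for w in spin_rate)  # angular momentum L = I * omega
    return cross(tremor_rate, L)               # tau = omega_tremor x L

# A hypothetical 20 g, 2 cm-radius disc spun at 1000 rad/s about z,
# disturbed by a 5 rad/s tremor about x:
I = 0.5 * 0.020 * 0.02 ** 2  # solid disc: I = (1/2) m r^2
tau = gyro_reaction_torque(I, (0, 0, 1000), (5, 0, 0))
```

Note how the reaction torque comes out perpendicular to both the spin axis and the tremor axis, and scales linearly with each, which is the “instantaneous and proportional” counter-force described above.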
GyroGlove – GyroGear
GyroGlove was built by Faii Ong after caring for an elderly patient with Parkinson’s while a medical student in London. Ong was told there was nothing that could be done for the patient, which was not an adequate answer for the young student. The glove is still under development, but is showing great promise.
Among Parkinson’s sufferers, the device has generated a significant amount of hope, according to Sarah Webb, founder of the South London Younger Parkinson’s Network. “People with Parkinson’s take a cocktail of drugs daily, which over time won’t be so effective,” she says. “The GyroGlove is an exciting and a completely different concept: something we can wear, something we can feel the benefits of immediately and something which will make our lives easier and allow us to get on with our daily lives.”
Hope in a Glove for Parkinson’s Patients – MIT Technology Review
Wearable devices go beyond the watch and eyeglasses. This presentation introduces the wide variety of devices and how they can make the world more accessible. This article by Smithsonian explains how the pocket watch was a wearable-first design.
Wearable First: Rethinking Accessible Design
They say 2014 is the year of wearable computers and devices. While marketed mostly towards busy professionals and health-conscious athletes, there’s far more to these devices.
This presentation looks at the intersection of wearable computers and accessibility. How can these sensor-filled devices provide alternative displays and gestures? How can they help a blind person see the world, help a person with a mobility impairment explore, track health, and detect traumatic events before they happen?
This presentation was created for the Inclusive Design 24 series of webinars that celebrate Global Accessibility Awareness Day 2014.
Google Glass and smart watches are garnering much of the media’s attention on wearable computers, but the wrist and face are only two possible locations. Many researchers and engineers are looking at the ear as a logical location. This is especially helpful as we’ve become accustomed to seeing people with objects in the ear, be it earrings, headphones, or Bluetooth devices.
The recent article People Don’t Like Google Glass Because It Makes Them Seem Weak brings attention to the stigma associated with bulky, scary-looking assistive technology. Compare the large assistive communication devices that found competition from Proloquo2Go on an iPad: the user’s technology went from exotic to invisible.
Obviously the ear is conveniently located near our key senses: sound, taste, and sight. Ear-based computers are also perfect for voice recognition and audio output.
The Earclip-type Wearable PC from Hiroshima City University looks like an earring, but allows the user to control its functionality via facial expressions.
The system, which developers are hoping to have ready for Christmas 2015, can be connected to an iPod or other gadget and would allow the user to navigate through software programmes using facial expressions, such as a raised eyebrow, a stuck-out tongue, a wiggle of the nose or by clenching teeth.
The device uses infrared sensors that monitor tiny movements inside the ear, which differ depending on how the eyes and mouth move.
Japan researchers testing tiny ear computer – Phys Org
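To make the sensing idea concrete, here is a hedged sketch of how such a device might map infrared readings from inside the ear canal to expressions. This is not the Hiroshima team’s implementation; the sensor layout, units, and calibration templates are invented for illustration. A simple nearest-centroid classifier against per-user calibration data is one plausible approach.

```python
# Hypothetical sketch: classify three in-ear infrared distance readings
# (mm) into facial expressions by finding the closest calibrated
# template. All values below are invented for the example.

import math

TEMPLATES = {
    "neutral":          (2.0, 2.1, 1.9),
    "raised_eyebrow":   (2.6, 2.2, 1.9),
    "clenched_teeth":   (1.4, 1.3, 1.5),
    "stuck_out_tongue": (2.2, 2.8, 2.7),
}

def classify(reading):
    """Return the expression whose calibration template is nearest
    (Euclidean distance) to the current sensor reading."""
    return min(TEMPLATES, key=lambda name: math.dist(reading, TEMPLATES[name]))
```

Each recognized expression would then be mapped to a command on the connected iPod or phone, e.g. `classify((1.4, 1.3, 1.5))` triggering “next track.”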
Dash, a project from BRAGI, combines a Bluetooth earplug with sensors and a bone-conduction microphone to provide the user with clear sound, a wireless microphone, and tracking of pulse and other physical data.
The iRiver device continues the concept of using the ear as both audio and physical sensor. However, this device has a base unit that sits behind the neck.
Olga Kharif’s article Ears follow eyes in $1.84 billion wearable computer boom highlights several products and research projects that have focused on the advantages of ear-based computers and devices. She quotes Romulus Pereira’s description of what makes the ear so attractive to developers.
“The ears are a very convenient place, because they are the one part of your body that tilts and pans to where you are looking at,” Pereira said. “Life requires both hands, frequently. Glasses and the wrist have become the poster child for exploring wearability. Behind this poster child is a whole population of things.”
With clear access to biometric feedback, audio, speech, and line of sight, ear-based wearables may provide the ideal location for hands-free and eyes-free products. There are many more possibilities, especially as developers tap into existing cochlear implants, use stereo microphones to detect sound location, and integrate motion sensors to capture the full spectrum of personal data.
Using your finger to follow along with printed text is a method many people use to keep track of what they are reading. The FingerReader, a prototype from MIT’s Fluid Interfaces Group, incorporates a camera into a ring to follow the finger and announce the printed text. While still a prototype, the following video shows how this can make reading easier for people with low vision, dyslexia, or other print disabilities.
While watching this video, I noticed the user had a patch of fingernail polish on her thumb. This led me to think that Google Glass, or a similar head-based unit, could probably track a marked finger and do the same tracking and word prediction. This would be similar to head-tracking units for onscreen keyboards.
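As a toy illustration of that marker-tracking speculation (my own sketch, not the FingerReader’s method): locate the marked finger in a camera frame by finding the centroid of pixels matching the marker color. The frame format and color test are assumptions; a real system would use a computer-vision library and a proper color space.

```python
# Hypothetical marker tracking: find the centroid of "nail polish"
# colored pixels in a frame represented as a 2D list of (r, g, b) tuples.

def track_marker(frame, is_marker):
    """Return the (row, col) centroid of marker pixels, or None."""
    rows = cols = count = 0
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            if is_marker(pixel):
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)

# A toy 3x3 frame with a red "nail polish" patch in the top-left corner:
red = lambda p: p[0] > 200 and p[1] < 80 and p[2] < 80
frame = [[(255, 0, 0), (250, 10, 5), (30, 30, 30)],
         [(30, 30, 30), (30, 30, 30), (30, 30, 30)],
         [(30, 30, 30), (30, 30, 30), (30, 30, 30)]]
```

Tracking that centroid frame-to-frame would give the head-mounted camera the finger position it needs to look up the word underneath.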
Fleksy started as an alternative keyboard application for blind iOS users. As it matured, the application became a very popular alternative keyboard for sighted users as well, first on Android and now on iOS. Fleksy’s intelligent pattern matching allows a user to type without having to carefully hit specific keys, which makes it especially useful on the limited real estate of a wearable computer.
This video shows Fleksy on an Omate smart watch. I watched them use this last year, long before the Galaxy Gear was released. You’ll see how quickly a user can write in this limited space. Unfortunately, this video doesn’t have audio descriptions: each tapping sound is the user adding a letter, and each swipe sound completes the word.
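The kind of pattern matching involved can be sketched simply (this is a hedged illustration, not Fleksy’s actual algorithm): score each dictionary word by how close the user’s sloppy taps landed to that word’s expected key positions, then pick the best-scoring word.

```python
# Toy tap-to-word matching on a simplified QWERTY grid. Key positions
# are (row, column) centers; the dictionary and taps are examples.

KEYS = {ch: (r, c)
        for r, row in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
        for c, ch in enumerate(row)}

def word_cost(taps, word):
    """Sum of squared distances between taps and the word's keys."""
    if len(taps) != len(word):
        return float("inf")
    return sum((tr - kr) ** 2 + (tc - kc) ** 2
               for (tr, tc), (kr, kc) in zip(taps, (KEYS[ch] for ch in word)))

def best_match(taps, dictionary):
    """Pick the word whose keys best explain the imprecise taps."""
    return min(dictionary, key=lambda w: word_cost(taps, w))

# Sloppy taps aiming for "cat": near c (2,2), a (1,0), t (0,4).
taps = [(2.3, 2.4), (1.2, 0.3), (0.1, 4.2)]
```

On a watch-sized keyboard the taps can miss their keys entirely, yet the intended word still usually has the lowest total cost; a production system would also weight words by language-model frequency.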
The University of Washington’s Mobile Accessibility department has been prototyping applications that use mobile devices to solve particular accessibility problems. Their projects have used multiple sensors within the device, from the accelerometer to the camera. Example projects include detecting alarms for people with hearing loss and using a phone as a virtual cane for navigating open spaces.
The projects are open source, and you can easily download the source code and Android APK files for testing and developing. Many of these projects could easily be integrated into a Glass product. Visit their Projects page for a list of projects. You can also visit the collections on their Google Code page.
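As a sketch of the alarm-detection idea (my own illustration, not the UW project’s code): the Goertzel algorithm measures the energy of a single target frequency in a block of microphone samples, which is enough to flag a steady alarm beep. The roughly 3 kHz target and the sample values here are assumptions for the example.

```python
# Goertzel algorithm: relative power of one frequency in a sample block.
# A phone could run this on microphone audio and vibrate when the
# target frequency's power spikes, alerting a deaf or hard-of-hearing
# user that an alarm is sounding.

import math

def goertzel_power(samples, target_hz, sample_rate):
    """Relative power of target_hz in the sample block."""
    k = round(len(samples) * target_hz / sample_rate)
    w = 2 * math.pi * k / len(samples)
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# A synthetic 3 kHz "alarm" tone versus a 500 Hz background hum,
# both sampled at 16 kHz:
rate, n = 16000, 800
beep = [math.sin(2 * math.pi * 3000 * t / rate) for t in range(n)]
hum = [math.sin(2 * math.pi * 500 * t / rate) for t in range(n)]
```

Running `goertzel_power(beep, 3000, rate)` yields orders of magnitude more energy than the same test on the hum, so a simple threshold separates alarm from background noise.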
Google Glass has triggered significant research into using alternative inputs to enhance accessibility. Many of these researchers and developers are sharing information on Facebook via the Google Glass Accessibility group.
The group was started by Andy Lin, a developer and Google Glass Explorer who launched an Indiegogo campaign to kick off his exploration of Google Glass. Lin is exploring the use of Glass with external switches for those with physical disabilities.
This prototype from the Sarohm team in Israel interprets sign language via an Android phone. This was demonstrated in 2011 and is still a rough draft, but is a good example of using hand held devices to improve communication.
They describe the glove as a poetic communication platform, which hints at expanding the use beyond sign language.
A POETIC COMMUNICATION PLATFORM
A glove that translates sign language into text. The idea was initially conceived as an alternative communication platform for future wearable computers in order to bridge between the virtual textual and offline physical communication.
The use of sign language presents a poetic solution; taking a tool used today by communicatively disabled people and putting it to use in order to enable and enhance the communication skills of the rest of us.
Show and Tell – Sarohm
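To close with a sketch of how such a glove-to-text pipeline could work (an illustration, not Sarohm’s implementation): threshold the glove’s flex-sensor readings into bent/straight finger states, then look the pattern up in a handshape table. The readings and the tiny two-entry table are invented for the example; real fingerspelling recognition needs far more signals (orientation, contact, motion) than finger bend alone.

```python
# Hypothetical glove pipeline: five normalized flex readings -> finger
# bent/straight pattern -> letter lookup. Table entries are invented.

BENT = 0.5  # normalized flex reading above which a finger counts as bent

# (thumb, index, middle, ring, pinky) bent-states -> output letter
HANDSHAPES = {
    (True, False, True, True, True): "d",  # index extended, rest curled
    (True, True, True, True, False): "i",  # pinky extended, rest curled
}

def read_letter(flex_readings):
    """Map one frame of five flex-sensor values to a letter, or '?'."""
    pattern = tuple(v > BENT for v in flex_readings)
    return HANDSHAPES.get(pattern, "?")

# One frame from the glove: only the index finger is straight.
sample = (0.8, 0.1, 0.9, 0.7, 0.6)
```

Streaming recognized letters over Bluetooth to a phone would complete the bridge from physical signing to virtual text that the Sarohm team describes.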