They say 2014 is the year of wearable computers and devices. While marketed mostly toward busy professionals and health-conscious athletes, there’s far more to these devices.
This presentation looks at the intersection of wearable computers and accessibility. How can these sensor-filled devices provide alternative displays and gestures? How can they help a blind person see the world, help a person with a mobility issue explore, track health, and detect traumatic events before they happen?
This presentation was created for the Inclusive Design 24 series of webinars that celebrate Global Accessibility Awareness Day 2014.
History of wearable assistive technology
Facial prostheses, 1919: Anna Coleman Ladd fashioned much-admired face masks for WWI soldiers in the European theater. Facial prosthetics and reconstructive surgery advanced significantly because of the war.
The obstacle detection device from the WLVA (2004) consists of two major components:
- head mounted display (HMD)
- backpack mounted equipment
The HMD incorporates the scanning fiber display and optics mounted in a tube on one side of a spectacle frame, and a video camera with IR light emitting diodes mounted on the other side. The backpack-mounted equipment consists of a laptop computer, an embedded processor, and hardware to drive the scanning fiber display.
The iGlasses have obstacle detection and vibrate when the user gets close to a low-hanging branch or other object.
Google holds a patent on eyeglass-frame-based obstacle detection.
Acceptance and Ubiquity
Complexity in the right place
“What made the Rio and other devices so brain dead was that they were complicated. They had to do things like make playlists, because they weren’t integrated with the jukebox software on your computer. So by owning the iTunes software and the iPod device, that allowed us to make the computer and the device work together, and it allowed us to put the complexity in the right place.” -Steve Jobs
Internet of Things
IPv6 allows every object to have its own IP address. Wearable devices can surround and work within your body and be uniquely addressed. Lamprey-inspired nano-robots are being developed that can migrate through your body for healthcare. photo: Pacific lamprey by USFWS Pacific
- MIT’s Media Lab has been innovating in vision-based wearable computers for many years.
- Eye-based wearables follow the head and the user’s point of interest.
- Eye tracking allows hands-free use. Sensors sit at head level instead of floor level (as with a cane).
Hands-free control is what makes wearable accessibility devices mainstream. Hands-free control works for drivers and people carrying objects, provides alternative gesture support, and helps those with physical disabilities.
The tilt control function is available on GitHub to integrate with other applications. This is just one example of using Glass and other devices to control the user’s environment and computers without their hands. Watch for more work toward hands-free devices at the Google Glass Accessibility Group on Facebook.
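The idea behind tilt control is simple: read the device’s accelerometer and map a sustained head tilt to a command. A minimal sketch of that classification step, assuming raw 3-axis readings in units of g (the function names and threshold here are illustrative, not taken from the GitHub project mentioned above):

```python
import math

# Hypothetical sketch: classify a head tilt from a 3-axis accelerometer
# reading. Axes and the 20-degree threshold are assumptions for illustration.

def tilt_angle_deg(ax, ay, az):
    """Roll of the head around the front-back axis, in degrees."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def classify_tilt(ax, ay, az, threshold_deg=20.0):
    """Map one accelerometer reading to 'left', 'right', or 'neutral'."""
    angle = tilt_angle_deg(ax, ay, az)
    if angle > threshold_deg:
        return "right"
    if angle < -threshold_deg:
        return "left"
    return "neutral"

# Head upright: gravity lies entirely on the z axis.
print(classify_tilt(0.0, 0.0, 1.0))   # neutral
# Head tilted well to one side.
print(classify_tilt(0.6, 0.0, 0.8))   # right
```

A real controller would also debounce: require the tilt to persist for a few hundred milliseconds before firing, so ordinary head movement is not mistaken for a command.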
- Ear-based devices are common, especially after years of Bluetooth headphones.
- Ears provide the same location benefits as eye-based devices.
- Biometric information, such as body temperature and pulse, is available.
- Hands-free, display-free interactions.
- Japanese researchers are developing an ear-based computer.
The Dash is a Kickstarter-funded project to create an ear-based device that provides hands-free/display-free support, biometrics, and audio enhancement. This is just one example of ear-based exploration. Hearing-based devices are display-independent, a paradigm shift for designers.
Dialog is just one example of using biometric sensors for health monitoring. This watch-and-patch combination detects seizures, collects user feedback, and alerts emergency contacts. Biometric tracking has become a key feature of most wearables; watch for Apple’s expected announcement of a health monitoring package.
Cloth can be embedded with conductive threading that transmits electronic signals. A connected shirt can track movement, biometrics, and potentially gestures. This is great for health monitoring.
Prosthetics are no longer static extensions of the body. They contain multiple sensors, processors, and controllers. DARPA has been encouraging prosthetic and exoskeleton designs; the DEKA Arm is a good example.
Detecting a person’s motion enables gesture recognition, task detection, and understanding the difference between a sit-up and a push-up, all hands-free. Lars Asplund, Professor Emeritus in Robotics at MDH, is working on this hand-based device.
Smart watches could help those with memory loss via geolocation/geofencing, reminders, and providing information to rescuers. photo: No, I’m sorry, I don’t know who you are by Neil Moralee
Short Term Memory Loss
People with short-term memory loss require reminders, and the watch is a natural location for them. Motion and biometric sensors could also detect when an activity has been accomplished. This avoids reminding the user to do an already completed task and encourages independence.
Sension is working on facial and gesture recognition with Google Glass.
Text and Object Detection
Captioning on Glass allows a person with a hearing impairment to see what another person is saying. This Georgia Tech project is available as open source. Google also holds a patent on speech-to-text transcription.
The FingerReader project from MIT uses a finger-mounted device to track the sentence a user is reading and convert it to speech. It also guides the person to move up or down if they drift off the line. This should be possible with an eye-based device as well.
The intellect of the wise is like glass; it admits the light of heaven and reflects it.
The opportunity exists for us to build devices that express our hidden selves, explore the world with new senses, and combine reactors and reactions. What could you solve with wearable computing?