Wearable devices go beyond the watch and eyeglasses. This presentation introduces the wide variety of devices and how they can make the world more accessible. A Smithsonian article explains how the pocket watch was an early example of wearable-first design.
Evolution of Design
The user feeds information to the computer; output can be cryptic.
Low bandwidth, small memory, but personal-sized and with instant feedback. Text-based interface and output.
Linked data with images and dynamic interactions. Design was fluid and allowed fast iterations. Monitor sizes were not standardized, which restricted design.
Mobile First introduced the practice of designing for a specific device as a strategy. It required reducing the design to the most important elements and pushing the user to a backup (the full website) for more complicated interactions. Luke Wroblewski's Mobile First popularized this concept.
What does it mean to design for wearable first? What is the interface? What is the input? What are the limitations? What are the advantages?
Time to start over
What is the interface?
Wearables force us to re-evaluate the interface. How does one interact with the device? How do they get feedback? What is the sole purpose of the tool/ app/wearable?
How can a wearable device have no interface? Perhaps the device is less about feedback and more about completing a task or providing input.
The ChiTronic Smart Ring NFC Controller for Android connects to an Android phone via NFC and allows the user to unlock their phone without a password.
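The unlock decision behind an NFC ring can be sketched in a few lines: the phone compares the scanned tag's unique ID against a registered "trusted" ring. The UID values and function names below are hypothetical; real Android unlocking goes through the platform's Smart Lock framework, not app code.

```python
# Minimal sketch of NFC-ring unlocking, assuming the phone can read a
# tag UID. The trusted UID here is made up for illustration.

TRUSTED_RING_UID = "04:A2:2B:9C:11:80:00"  # hypothetical registered ring

def should_unlock(scanned_uid: str) -> bool:
    """Unlock only when the scanned NFC tag matches the trusted ring."""
    return scanned_uid == TRUSTED_RING_UID

print(should_unlock("04:A2:2B:9C:11:80:00"))  # True: the registered ring
print(should_unlock("00:00:00:00:00:00:00"))  # False: any other tag
```

A production system would also guard against UID cloning, but the core idea is this simple comparison triggered by a tap.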
Biometrics: Dialog is just one example of using biometric sensors for health monitoring. This watch-and-patch system detects seizures, provides user feedback, and sends emergency alerts. Biometric tracking has become a key feature of most wearables.
Internet of Things: IPv6 allows every object to have its own IP address. Wearable devices can surround and work within your body and be uniquely addressed. Lamprey-inspired nano-robots are being developed that can migrate through your body for healthcare.
Brillo + Weave Google’s Brillo and Weave projects are defining a standard interface for the Internet of Things. This will make it much easier for a wearable to communicate directly with other smart objects, such as your smoke alarm sending a signal directly to a watch.
Karo Caran tries on a prototype at CSUN 2015 that projects an image into the eye via laser. Karo exclaimed she'd never seen anything as sharp as the image delivered by these glasses. I don't have the product information for this.
- MIT’s Media Lab has been innovating vision-based computing for many years.
- Eye-based wearables follow the head and the user’s point of interest.
- Eye tracking allows hands-free interaction. Sensors are at head level instead of floor level (as with a cane). photo: http://cameraculture.media.mit.edu/
Emotion Recognition: Sension is working on facial and gesture recognition with Google Glass. Sension builds on the state of the art in face tracking to locate 76 landmarks in a user’s face, then applies machine learning to the internal shape parameters of the face to judge engagement, producing accurate user-engagement scores and real-time analytical insights into the content users are interacting with.
Instant Captioning on Glass allows a person with a hearing impairment to see what another person is saying. This Georgia Tech project is available as open source. Google also holds a patent on speech-to-text transcription.
People with short-term memory loss require reminders. The watch is a natural location for reminders. Motion and biometric sensors could also detect when an activity has been accomplished. This avoids reminding the user to do an already-completed task and encourages independence.
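The reminder-suppression idea above can be sketched as a small filter: a reminder fires only if the sensors have not already marked the task done. The task names and the "sensor-detected" set are illustrative assumptions, standing in for whatever activity recognition the device provides.

```python
# Hedged sketch: suppress reminders for tasks the wearable's sensors
# have already observed being completed. All task names are made up.

completed_events = {"took_medication"}  # tasks sensors detected today

def due_reminders(scheduled, completed):
    """Return only the reminders still worth sending."""
    return [task for task in scheduled if task not in completed]

scheduled_today = ["took_medication", "walked_dog"]
print(due_reminders(scheduled_today, completed_events))  # ['walked_dog']
```

The hard part in practice is the activity detection itself; once a task is reliably recognized, the suppression logic is this trivial.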
- Ear-based devices are common, especially after years of Bluetooth headphones.
- Ears provide the same location benefits as eye-based devices.
- Biometric information, such as body temperature and pulse, is available.
- Hands-free, display-free interactions.
- Japanese researchers have developed an ear-based computer.
Reading – The FingerReader project from MIT uses a finger-mounted device to track the sentence a user is reading and convert it to speech. It also guides the person to move up or down if they drift off the line. This should be possible with an eye-based device.
OrCam Reader is an eyeglass-mounted camera with a bone-conduction headphone that uses your finger as the guiding gesture. An intuitive portable device with a smart camera mounted on the frames of your eyeglasses, OrCam harnesses the power of artificial vision to assist the visually impaired. OrCam recognizes text and products and speaks to you through the bone-conduction earpiece.
Horus, View for the Blind:
- Mobility assistance
- Object detection
- Text reading
- Facial detection
- Spoken interface
Clothing: Clothing can be embedded with threads that transmit electronic signals. A connected shirt can track movement, biometrics, and potentially gestures. This is great for health monitoring. Clothing can also be combined with sensory output, such as vibrations, to give the wearer feedback.
Project Jacquard: Google and Levi’s are working together to develop a method to mass-produce cloth with embedded electronics. They’ve created the conductive threads and manufacturing process that allows the clothing to become the controller.
The Navigate Jacket gives vibrational feedback to let the user know if they should turn left or right while walking.
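The jacket's core behavior is a mapping from navigation events to haptic motors on the matching side of the body. The motor names and interface below are hypothetical; a real jacket would drive haptic motors from a paired phone's turn-by-turn directions.

```python
# Hedged sketch of turn-direction-to-vibration mapping, assuming a
# jacket with one haptic motor per shoulder. Names are illustrative.

def vibration_for(turn: str) -> str:
    """Map a navigation instruction to the motor that should buzz."""
    motors = {"left": "left-shoulder motor", "right": "right-shoulder motor"}
    return motors.get(turn, "no vibration")

print(vibration_for("left"))      # 'left-shoulder motor'
print(vibration_for("straight"))  # 'no vibration'
```

The value of the design is that the feedback channel matches the information spatially: a buzz on the left shoulder needs no translation by the wearer.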
Vibrating Shoes: applying vibrations to the feet can provide enough stimulation to cross the sensory threshold in individuals with a limited sense of touch. This could prevent falls.
- Replace joysticks and buttons with wearable switches that detect pressure change
- Embed within shoes and gloves
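A wearable pressure switch of the kind described above reduces to threshold detection on a sensor stream. The sketch below adds hysteresis (a lower release threshold) so that jitter near the press threshold doesn't register as repeated presses; the thresholds and normalized units are assumptions for illustration.

```python
# Sketch of a pressure-switch "button", assuming readings normalized
# to 0..1. Thresholds are illustrative, not from any real device.

PRESS_THRESHOLD = 0.6    # pressure needed to register a press
RELEASE_THRESHOLD = 0.4  # must drop below this before the next press

def detect_presses(readings):
    """Count distinct presses in a stream of pressure readings."""
    presses, pressed = 0, False
    for p in readings:
        if not pressed and p >= PRESS_THRESHOLD:
            presses, pressed = presses + 1, True
        elif pressed and p <= RELEASE_THRESHOLD:
            pressed = False
    return presses

print(detect_presses([0.1, 0.7, 0.65, 0.3, 0.8, 0.2]))  # 2
```

The same debounced-threshold pattern works whether the sensor sits in a glove fingertip or a shoe insole.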
Myo Gestures: primarily for basic gestures, e.g., controlling presentations.
Google Soli: announced at Google I/O, Soli is a hands-free gesture recognizer that uses radar instead of cameras. The latest prototype is tiny enough to fit within a watch or other wearable and lets people control devices by moving their fingers in the air. This solves the problem of small or nonexistent screens.
Acceptance and Ubiquity
Which of these kids is using an assistive technology device? Tablets, phones, and assorted mobile devices are so common that stigma associated with earlier devices has diminished.
Complexity in the right place
“What made the Rio and other devices so brain dead was that they were complicated. They had to do things like make playlists, because they weren’t integrated with the jukebox software on your computer. So by owning the iTunes software and the iPod device, that allowed us to make the computer and the device work together, and it allowed us to put the complexity in the right place.” -Steve Jobs
photo: Seele der Musikindustrie by ken fager https://www.flickr.com/photos/kenfagerdotcom/4398922649/
Wearable as Prototype – Cheaper and Faster
At the Web4All 2015 conference there was a presentation on an application for improving stability in people who have Parkinson’s. The project started with sensors mounted at the feet and hands as a proof of concept. Afterwards, the same functionality was recreated with just a phone’s built-in sensors.
MetaWear Rapid Prototyping Kit, $40 from MbientLab. This kit includes the hardware needed for creating your first wearable. It also includes plugins and schematics to 3D-print enclosures.
Adafruit kit allows you to build your first wearable project for $40
The Apple Watch released with full accessibility and an Accessibility API that closely matches iOS. Users can control the visual display for low vision. Android Wear is improving, but not to the same level. Watches provide a remote microphone for voice dictation and actions.
Apple Pay and Google Pay allow users to make payments with minimal movement.
…as someone with low vision and (mild) cerebral palsy, no longer do I have to fumble around my wallet trying to find my credit card or struggle with swiping my card into the terminal. –Steven Aquino