In their Designing for Wearables presentation at Google I/O 2014, members of Google's Glass and Android Wear teams walked through lessons learned creating wearable UIs over the past few years. Here are my notes from their talk:
- Technology should do the hard work for you so you can get on with your life. Computing needs to disappear, not sit in the foreground of our attention.
- The world is the experience. Software can't compete with the real world. At best you can complement the world with relevant information.
- Wearables are intimate; we wear them on our bodies. They need to be personal and made for the closest people in our lives.
- Micro-interactions: wearables are the rear view mirror, not the windshield. Timely, glance-able information is what makes wearable UI work.
- Phones often distract us and take us out of the world. Wearables provide much more compact experiences: as short and as fast as possible.
- Natural voice: speaking in plain language to a device makes interactions much faster and more appropriate.
- When starting design for Glass, early ideas were too close to the phone interface (apps, home screens, etc.). They weren't appropriate for a device like Glass.
- Make interactions as natural as possible. Singular, focused tasks: do just one thing at a time & see just one thing at a time.
- Carefully design the voice experience. Don't port over a mobile UI; just focus on simple, singular tasks.
- There's a tendency to port over existing design structures to new devices but this has never really worked.
- Sets of cards work much better than a windows, icons, menus, pointer (WIMP) based UI for small-screen interfaces.
- Sensors inside wearables can sense what's happening, match patterns, and reveal the most important cards/information when it is relevant. These signals provide context. Let the technology do the right thing and adapt to what the user is doing.
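The context-matching idea above can be illustrated with a toy sketch. This is not a real wearable API; the card structure, context tags, and ranking rule are all hypothetical, assumed purely to show how sensor-derived signals might surface the most relevant card:

```python
# Toy sketch: rank cards by how well their context tags match the
# signals the device's sensors currently report. All names here are
# hypothetical, not part of any real wearable SDK.

def rank_cards(cards, context):
    """Return only cards matching the current context, most relevant first."""
    def relevance(card):
        # Relevance = number of context signals this card shares
        # with what the sensors currently detect.
        return len(card["contexts"] & context)

    return sorted(
        (c for c in cards if relevance(c) > 0),
        key=relevance,
        reverse=True,
    )

cards = [
    {"title": "Next turn: left on 3rd St", "contexts": {"driving", "navigating"}},
    {"title": "Flight delayed 40 min", "contexts": {"traveling"}},
    {"title": "Heart rate: 142 bpm", "contexts": {"running"}},
]

# Suppose the device infers "driving" from speed and activity signals;
# only the navigation card survives the filter.
print(rank_cards(cards, {"driving"})[0]["title"])
```

The point of the sketch is that the user never asks for the card: the sensors supply the context, and the technology adapts to what the user is doing.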
- Design for voice: Imagine you are the app and you need to communicate with your user by talking to them.
- Design for context: when someone reaches out their hand, what do you put into it? When do you expect people to reach out their hand? Build triggers around these scenarios.
- The UI for most wearable apps is minimal. Even the mock-ups for wearable apps focus more on the background imagery than on the actual application design itself.