In his Buttons are a Hack presentation at An Event Apart in Washington DC, Josh Clark made the case for moving beyond GUI controls on touch devices and the Web. We can do better! Here are my notes from his talk:
- You can't be a futurist without being a historian. History has a big role in the future.
- In the last five years, touch has become mature even though it's been around for many years.
- Touch interactions will help us sweep away buttons and a lot of existing interface debris by moving us closer to the content and away from GUI abstractions.
- Many of our desktop interactions don't translate well to touch screens.
- User interfaces are an illusion. But with touch interfaces we can cut through the illusion and let people interact directly with content.
- Microsoft's investment in touch in Windows 8 is a clear indication that touch is the future. It's now a key part of the largest operating system in the world.
- Touch devices have created all kinds of expectations that the Web does not implement well. Currently the Web is not an equal citizen for interaction design on touch devices. Since the Web only supports simple touch interactions, native OS applications are leading innovation on mobile.
- Real support for touch gestures is needed in the browser. Right now we only have touchstart, touchmove, and touchend, so Web browsers really only support tap and swipe, with limited support for pinch, zoom, and press. We need to push for more complete gesture support in the browser. (A rough sketch of building a swipe out of these raw events follows this list.)
- The browser hems you in. We need to think beyond the browser. The Web is much broader: Web views inside apps, books, and more. For example, Windows 8 encourages you to use Web technologies to build native apps.
- It's good news that we don't have to be constrained by the browser in the future: we can use advanced touch and gesture interactions in places beyond the browser.
- Cut your teeth on tap and swipe in the browser, but keep experimenting.
- But the browser isn't the only home for the Web: Windows 8 apps, ebooks, and Web views embedded inside native apps are all built with Web technologies.
- The Web (browser) is inside every application instead of every application being inside the Web (browser).
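To make the point about limited browser primitives concrete, here's a minimal sketch (mine, not from Josh's talk) of what building even a basic horizontal swipe out of the raw touchstart/touchmove/touchend events looks like. The #gallery element, the 50px threshold, and the logging are illustrative assumptions:

```ts
// Minimal swipe detector built from the raw browser touch events.
// The #gallery element, 50px threshold, and console logging are
// illustrative assumptions, not anything from the talk.
const gallery = document.getElementById('gallery');

let startX = 0;
let startY = 0;
let tracking = false;

gallery?.addEventListener('touchstart', (e: TouchEvent) => {
  const touch = e.touches[0];
  startX = touch.clientX;
  startY = touch.clientY;
  tracking = true;
});

gallery?.addEventListener(
  'touchmove',
  (e: TouchEvent) => {
    // Keep the page from scrolling while we track the gesture.
    if (tracking) e.preventDefault();
  },
  { passive: false }
);

gallery?.addEventListener('touchend', (e: TouchEvent) => {
  if (!tracking) return;
  tracking = false;
  const touch = e.changedTouches[0];
  const dx = touch.clientX - startX;
  const dy = touch.clientY - startY;
  // Count it as a swipe only if horizontal travel dominates.
  if (Math.abs(dx) > 50 && Math.abs(dx) > Math.abs(dy)) {
    console.log(dx > 0 ? 'swipe right' : 'swipe left');
  }
});
```

Anything richer than this (pinch, press, rotate) currently requires the same kind of hand-rolled reconstruction, which is exactly the gap Josh is pointing at.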
No Buttons?
- Buttons are a hack but they are an inspired hack. They operate at a distance by creating abstractions away from content. Do we still need that hack on a touch interface? Can we instead aim for direct interactions with content?
- How do we help people to use software that has no labels or buttons? This is the state of many touch interfaces.
- As we develop a new gesture vocabulary, we’ll need affordances to communicate what’s possible. Buttons aren’t going away right now but we can explore other options.
- The iPad's back button runs afoul of Fitts's Law: the farther away and smaller a target is, the harder it is to hit. Instead, let people be lazy: let them use the whole screen or touch shortcuts.
- On larger touch screens, Fitts’s Law still applies. Cognitive and motor effort is increased when people need to touch small targets across the screen.
- Gestures are the keyboard shortcuts of touch interfaces.
- Let people use the entire screen as the control: swipe and paw instead of tap-tap-tap.
- Big screens invite big gestures. Where can you eliminate buttons and use the whole screen for controls? Let people be lazy in how they interact with content through big targets.
- Button alternatives: pinch to cancel or close a view (like the Reeder app), swipe through history (like Twitter for iPad). (See the pinch sketch after this list.)
- A gesture-driven interface can feel more like playing an instrument than operating a GUI (like the Clear to-do list app). This is a different form of physicality than the current GUI paradigm.
- Touch interfaces can turn the GUI paradigm on its head: each time you want to add a UI control, stop and think about how you could accomplish it with a gesture instead.
- In the TouchUp sketching app, brush sizes don't change and there's no GUI control for them. Instead, you zoom the canvas in and out to change the brush's relative size.
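As a rough illustration of the "pinch to cancel or close a view" pattern mentioned above, here's a sketch (again mine, not from the talk) that watches two touch points and treats a large pinch-in as a close gesture. The #detail-view element, the 0.5 ratio threshold, and the closeView() handler are hypothetical:

```ts
// Rough pinch-in ("pinch to close") detector. The #detail-view element,
// the 0.5 ratio threshold, and closeView() are illustrative assumptions.
const view = document.getElementById('detail-view');

let startDistance = 0;

// Distance between the first two active touch points.
function distance(touches: TouchList): number {
  const dx = touches[0].clientX - touches[1].clientX;
  const dy = touches[0].clientY - touches[1].clientY;
  return Math.hypot(dx, dy);
}

view?.addEventListener('touchstart', (e: TouchEvent) => {
  if (e.touches.length === 2) {
    startDistance = distance(e.touches);
  }
});

view?.addEventListener(
  'touchmove',
  (e: TouchEvent) => {
    if (e.touches.length === 2 && startDistance > 0) {
      e.preventDefault();
      // If the fingers have closed to less than half their starting
      // spread, treat it as "pinch to close".
      if (distance(e.touches) / startDistance < 0.5) {
        startDistance = 0;
        closeView();
      }
    }
  },
  { passive: false }
);

// Hypothetical stand-in for whatever dismisses the current view.
function closeView(): void {
  console.log('view closed');
}
```

The specific numbers don't matter; the point is that the whole screen, not a small button, becomes the control.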
Hidden Gestures
- How do you find gestures? They are hidden, unlabeled, and as a result hard to discover. People will figure things out by trying physical or mouse-based conventions as gestures.
- Rely on visual cues or past experience. The less a gesture resembles a physical action, the harder it is to find.
- Muscle memory allows us to get things done faster than with conscious knowledge. We can start with physical points of reference. Example: turning off an alarm clock.
- Physicality invites touch. Textures and physicality draw people in. Real world metaphors need to meet expectations.
- Nature doesn’t have instructions. Design for nature. Realistic interfaces (if done right) can hint at how the UI works.
- Ensure you follow through on your metaphor: if your interface suggests a gesture, support it. But don’t follow a metaphor too literally or half-heartedly.
- On a touch screen, visuals are also hints about usage. Be careful about what your visual designs imply.
- Who needs a control when you have the content itself? Even better than labels is no labels. Example: clear salt and pepper shakers.
- Let the content be the message. Instead of labels, let people directly interact with content. Content can be the interface.
- The message can be the medium. The content is the control. This means features and actions don't take over the content.
Can Education Help?
- Up-front instruction manuals make things feel more complex than they really are. Few people read the manual because most of us are impatient. A manual should be a reference, not the primary way to learn to use something.
- A manual is a diversion from getting things done.
- Because people don’t read, we often resort to show and tell. But being told how to do something takes a lot of the joy out of using it. You learn to interact with the world through trial and error.
- Toddlers understand the idea of trial and error and physical interactions. Toddlers haven’t been poisoned by 30 years of desktop interactions. Use them to see how approachable your interfaces are. Look at the world with fresh eyes.
- Don’t patronize or dumb down your apps but be patient with people as they come to terms with how to interact with your app.
- Video games are a good model for how to introduce people to new modes of interaction.
- Coaching, leveling up, and powering up. Active participation is the best way to learn. Coaching allows you to practice as you interact with a service in real time. If you can avoid forcing people to read during coaching, even better.
- An important element of coaching is not revealing everything at once. Start small, then build up over time. Ease people into an app one element at a time.
- Use visual cues to communicate where gestures are available.
- The best time to learn a skill is when you need it.
- Levels that require a feature are the best time to teach that feature. The first time through, it's designed to be a guaranteed success.
- Think about your app as levels. How can people move between them as they learn how to use it? Where can you help people level up?
- Power-ups are ways to turbo-boost your game; they provide shortcuts. Shortcut gestures can streamline common actions. Power-ups are useful for everyone but most powerful in the hands of an expert.
- Twitter added unadvertised gesture shortcuts in their new app. Instead, they could have coached in the moment to explain what's possible.
- We are waiting for conventions to emerge but no one is showing leadership. We need to experiment and share what we learn. This will help us move things forward. We need to help each other through this exciting time.