An Event Apart: Buttons are a Hack

February 7, 2012

In his Buttons are a Hack presentation at An Event Apart in Atlanta, GA, Josh Clark made the case for moving beyond GUI controls on touch devices. We can do better! Here are my notes from his talk:

  • Touch interactions will help us sweep away buttons and a lot of existing interface debris by moving us closer to the content and away from GUI abstractions.
  • Touch devices have created all kinds of expectations that the Web does not meet well. Currently the Web is not an equal citizen for interaction design on touch devices. Since the Web only supports simple touch interactions, native OS applications are leading innovation on mobile.
  • Real support for touch gestures is needed in the browser. Right now we only have the low-level touchstart, touchmove, and touchend events, so Web browsers really only support tap and swipe; support for pinch, zoom, and press is limited. We need to push for more complete gesture support in the browser (a rough sketch of stitching a swipe together from these events follows this list).
  • We need to be more creative and collaborative with developing gesture conventions.
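
Since the talk leans on the gap between these low-level events and real gestures, here is a minimal, illustrative sketch of building a horizontal swipe out of touchstart and touchend. The #gallery element and the pixel thresholds are made-up placeholders, not anything from the talk.

```typescript
// Minimal swipe detection built from the browser's low-level touch events.
// SWIPE_DISTANCE and SWIPE_DRIFT are arbitrary illustrative thresholds.
const SWIPE_DISTANCE = 50; // minimum horizontal travel, in px
const SWIPE_DRIFT = 30;    // maximum vertical drift, in px

const el = document.getElementById('gallery'); // hypothetical element
let startX = 0;
let startY = 0;

el?.addEventListener('touchstart', (e: TouchEvent) => {
  const touch = e.touches[0];
  startX = touch.clientX;
  startY = touch.clientY;
});

el?.addEventListener('touchend', (e: TouchEvent) => {
  const touch = e.changedTouches[0];
  const dx = touch.clientX - startX;
  const dy = touch.clientY - startY;

  // Treat mostly-horizontal movement past the threshold as a swipe.
  if (Math.abs(dx) >= SWIPE_DISTANCE && Math.abs(dy) <= SWIPE_DRIFT) {
    console.log(dx > 0 ? 'swipe right' : 'swipe left');
  }
});
```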

No Buttons?

  • Buttons are a hack but they are an inspired hack. They operate at a distance by creating abstractions away from content. Do we still need that hack on a touch interface? Can we instead aim for direct interactions with content?
  • How do we help people to use software that has no labels or buttons? This is the state of many touch interfaces.
  • As we develop a new gesture vocabulary, we’ll need affordances to communicate what’s possible. Buttons aren’t going away right now but we can explore other options.
  • Gestures are the keyboard shortcuts of touch interfaces.
  • Let people use the entire screen as a control: swipe and paw instead of tap-tap-tap.
  • Big screens invite big gestures. Where can you eliminate buttons and use the whole screen for controls? Let people be lazy in how they interact with content through big targets.
  • On larger touch screens, Fitts’s Law still applies: cognitive and motor effort increases when people need to reach small targets across the screen (see the movement-time sketch after this list).
  • Button alternatives: pinch to cancel or close a view (like the Reeder app); swipe through history (like Twitter for iPad).
  • How do you find gestures? They are hidden, unlabeled, and as a result hard to discover. People will figure things out by trying physical or mouse-based conventions as gestures.
  • People rely on visual cues or past experience. The less a gesture resembles a physical action, the harder it is to find.
  • Muscle memory allows us to get things done faster than with conscious knowledge. We can start with physical points of reference. Example: turning off an alarm clock.
  • Physicality invites touch. Textures and physicality draw people in. Real world metaphors need to meet expectations.
  • Nature doesn’t have instructions. Design for nature. Realistic interfaces (if done right) can hint how UI works.
  • Follow through on your metaphor: if your interface suggests a gesture, support it. Don’t follow a metaphor too literally or half-heartedly.
  • On a touch screen, visuals are also hints about usage. Be careful about what your visual designs imply.
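
As a point of reference for the claim about small, distant targets, Fitts’s Law (in its common Shannon formulation) predicts movement time from target distance and width. The sketch below is only illustrative; the a and b constants are placeholders that would normally be fit empirically per device and user.

```typescript
// Fitts's Law, Shannon formulation: MT = a + b * log2(D / W + 1)
// a (intercept) and b (slope) are placeholder values, not measured ones.
function movementTimeMs(distancePx: number, widthPx: number, a = 100, b = 150): number {
  const indexOfDifficulty = Math.log2(distancePx / widthPx + 1); // in bits
  return a + b * indexOfDifficulty;
}

// A small, far-away target costs noticeably more than a large, nearby one.
console.log(movementTimeMs(800, 40).toFixed(0));  // far, small target
console.log(movementTimeMs(200, 120).toFixed(0)); // close, large target
```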

Can Education Help?

  • Who needs a control when you have the content itself? Even better than labels is no labels. Example: clear salt and pepper shakers.
  • Up-front instruction manuals make things feel more complex than they really are. Few people read the manual because most of us are impatient. A manual should be a reference, not the primary way to learn to use something.
  • Because people don’t read, we often resort to show and tell. But being told how to do something takes a lot of the joy out of using it. You learn to interact with the world through trial and error.
  • Toddlers understand the idea of trial and error. Toddlers haven’t been poisoned by 30 years of desktop interactions. Use them to see how approachable your interfaces are.
  • Don’t patronize or dumb down your apps but be patient with people as they come to terms with how to interact with your app.
  • Video games are a good model for how to introduce people to new modes of interaction.
  • Coaching, leveling up, and powering up. Active participation is the best way to learn. Coaching lets you practice as you interact with a service in real time. If you can avoid forcing people to read during coaching, even better.
  • An important element of coaching is not revealing everything at once. You start small and build up over time. Ease people into an app one element at a time.
  • The best time to learn a skill is when you need it.
  • Levels that require a feature are the best time to teach that feature. The first time through, it is a guaranteed success.
  • Think about your app as levels. How can people move between them as they learn how to use it? Where can you help people level up?
  • Power-ups are ways to turbo-boost your game. They provide shortcuts. Shortcut gestures can streamline common actions.
  • Facebook has a custom gesture to get back to the home screen by tapping on the title bar. This is a useful feature, but there’s no affordance to explain it to people. Perhaps the app should instead let people “level up” by introducing the feature after they’ve felt the need for it (after a long browse path, for example); a rough sketch of this pattern follows these notes.
  • Explore multi-touch gestures in your apps. We need to move things forward since conventions don’t exist yet. Multi-touch gestures let people play with applications rather than merely use them.
  • We are waiting for conventions to emerge but no one is showing leadership. We need to experiment and share what we learn. This will help us move things forward. We need to help each other through this exciting time.
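
To make the level-up idea concrete, here is a hedged sketch of surfacing a shortcut tip only after someone has felt the pain it solves. Every name here (navigationDepth, showCoachMark, the threshold) is hypothetical; it is not how Facebook or any specific app actually implements this.

```typescript
// Illustrative "level-up" pattern: reveal a shortcut once its need is evident.
const DEPTH_BEFORE_TIP = 5; // arbitrary threshold for a "long browse path"
let navigationDepth = 0;
let tipShown = false;

function onNavigateDeeper(): void {
  navigationDepth += 1;
  if (!tipShown && navigationDepth >= DEPTH_BEFORE_TIP) {
    tipShown = true;
    showCoachMark('Tip: tap the title bar to jump back to the top.');
  }
}

function onNavigateHome(): void {
  navigationDepth = 0;
}

function showCoachMark(message: string): void {
  // Placeholder: a real app would render a dismissible overlay here.
  console.log(message);
}
```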