In my second video for Intel Software Partners on re-imagining desktop application design, I take a deeper dive into designing touch-based interactions. That is, how large we need to make our application controls and where we should place them on screen in order to optimize for touch.
In addition to general guidelines, I also showcase a before-and-after design that converts a keyboard-and-mouse application to a touch-optimized interface by rethinking navigation, input controls, and more.
Welcome to the Re-imagining Apps for Ultrabook series with Luke Wroblewski. Today we're going to continue looking at how touch forces us to rethink desktop application design with a practical guide to touch targets.
In the first video of this series, we outlined the impact new input methods have had on personal computing and walked through the high-level principles of designing for touch. Now we'll dig a bit deeper into touch target sizes and posture. Touch is all about direct interaction with content, and at a high level that's what we want to push for. But how do we actually allow people to interact with touch? Our fingers aren't nearly as precise as our mice. And without a cursor, making sure people can actually interact with the elements in our applications is an important consideration.
In the Windows design guidelines, Microsoft outlines a couple of minimum thresholds for what makes a control touchable. More precisely, they recommend applications use controls that are at least 23 by 23 pixels in order to be touchable. That translates to about 6mm by 6mm in physical dimensions. This size is important for reducing the number of errors when we try to hit targets with our fingers. To see what this looks like on screen, let's look at a simple dialog window. The choices in the bottom dialog window have been spaced appropriately so they don't conflict with each other when someone tries to touch one of them. The dialog window above it, on the other hand, has a set of controls well suited to keyboard and mouse interactions but too tightly spaced for touch.
Touchable is different from touch-enabled. For touch-enabled controls, we generally want to make things bigger. That is, we're aiming for around 10mm in physical size for our touch targets. This number isn't arbitrary: it comes from studies which found the average human finger pad is 10-14mm across and the average human fingertip is 8-10mm. It's especially important for commonly used controls, and for elements that could result in significant errors if touched erroneously, to hit the 10mm range, sometimes even bigger. For both touchable and touch-enabled controls, we need to consider the spacing between targets as well. A minimum of 2mm of space between targets should do the trick.
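The pixel and millimeter figures above are two views of the same thing, related by screen density. A quick sketch shows the conversion (the function name and the DPI values are illustrative, not from any platform API); at the classic Windows density of 96 DPI, the 6mm touchable minimum works out to roughly 23 pixels, matching the guideline:

```python
# Convert a physical touch-target size in millimeters to pixels for a
# given screen density (dots per inch).
MM_PER_INCH = 25.4

def mm_to_px(mm: float, dpi: float) -> int:
    """Return the pixel size of a physical dimension at a given DPI."""
    return round(mm * dpi / MM_PER_INCH)

if __name__ == "__main__":
    # 6mm touchable minimum, 7mm recommended, 10mm touch-enabled target
    for mm in (6, 7, 10):
        print(f"{mm}mm at 96 DPI -> {mm_to_px(mm, 96)}px, "
              f"at 144 DPI -> {mm_to_px(mm, 144)}px")
```

The takeaway is that a pixel count alone says nothing about touchability: on a denser screen, the same physical target needs proportionally more pixels.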
Why is touch target size such a big deal? Looking at the number of errors that occur per touch target size reveals that when touch targets get too small, that is, when they fall below the 7mm mark, error rates shoot up. And as touch targets get smaller and smaller, error rates climb higher and higher. Obviously we don't want people making errors in our software; that's a recipe for frustration. So sizing controls appropriately for touch is an important consideration.
And before you assume it's just people with large fingers who have problems with touch target sizes, consider that many of our current interfaces are too small even for a baby's finger, much less a basketball player's or the average human index finger, which is 11mm wide.
When converting an existing desktop application or building a new one for a platform like the Ultrabook, you generally have two ways of thinking about your app. One is that it requires the precision of keyboard and mouse interaction, in which case it makes sense to design the app for keyboard and mouse inputs first. The other option is to make touch a first-class citizen and, as a result, a key factor in the design of the application's user interface.
For keyboard and mouse first applications, a minimum control size of 5mm, a recommended control size of 7mm, and a size of 10mm for commonly used or potentially error-prone controls will go a long way toward ensuring an app is touchable. For a touch-first application, we want to push the minimum control size to 7mm and the recommended size to 10mm. For commonly used controls, error-prone controls, or controls near the edge of the screen, we want to go even bigger: controls over 10mm in physical size. In both cases, the minimum spacing rule of 2mm between touch targets still holds true.
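These two sets of guidelines can be captured as a simple lookup, which a design or QA script could check control dimensions against. This is only a sketch: the strategy and role names are my own labels, and the 11mm entry stands in for the "over 10mm" guidance for critical touch-first controls.

```python
# Minimum physical touch-target sizes in millimeters, per design strategy.
# "critical" means commonly used, error-prone, or edge-of-screen controls.
# The 11mm value is a stand-in for "over 10mm"; names are illustrative.
GUIDELINES_MM = {
    "keyboard-mouse-first": {"minimum": 5, "recommended": 7, "critical": 10},
    "touch-first":          {"minimum": 7, "recommended": 10, "critical": 11},
}

MIN_SPACING_MM = 2  # minimum gap between adjacent targets, both strategies

def min_target_size_mm(strategy: str, role: str = "recommended") -> float:
    """Look up the minimum touch-target size for a control."""
    return GUIDELINES_MM[strategy][role]

if __name__ == "__main__":
    print("Touch-first recommended:",
          min_target_size_mm("touch-first"), "mm")
    print("Keyboard/mouse-first minimum:",
          min_target_size_mm("keyboard-mouse-first", "minimum"), "mm")
```

A table like this makes the trade-off explicit: choosing the touch-first column raises every threshold, which is exactly what forces the simplification discussed later in the video.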
So a keyboard and mouse first application could simply take an existing set of controls and enable them to be used with touch interactions. This is what the touch mode feature in Windows 8 can do for an application like Microsoft Outlook. Here you can see the controls get spaced out more in touch mode and, as a result, become more touchable. Note that this is not the same as touch-optimized.
Touch mode can also be turned on automatically when a touch input is detected. This works especially well for in-context actions that are actually triggered by an input. For example if someone attempts to edit text, we can determine they are using a finger to do so and automatically space controls apart to be touchable.
Touchable, though, is quite different from touch-optimized. For example, the new email application in Windows 8 has been redesigned for touch. As you can see, the controls are much larger, as they make use of the touch-first sizing guidelines we discussed earlier.
To illustrate how we can go from an existing desktop application to something rethought for touch capabilities, I created a demo app. Tweester: it's the ultimate social networking tool for the storm chasing community. But what's most important for our purposes is that we'll take an existing desktop application and adapt it to take advantage of new Ultrabook capabilities, in this case touch. And we aren't going to just aim for touchable; we'll design something that's touch-optimized to take full advantage of what we can do with touch-based interactions.
Step one is making sure that our Tweester application uses proper touch target sizes. Looking at the desktop build, we see small targets and a lot of UI chrome used for things like moving between photos and scrolling lists.
Rethinking this from a touch-first perspective gives us something that's closer to the Windows 8 email application we saw earlier. That is, large touch targets designed for fingers first. We also treat the content as the interface instead of falling back on UI chrome for interactions. If we'd like to see the next photo, we simply swipe across it. No buttons required. We saw this interaction in the first Re-imagining Apps for Ultrabook video. Have another look if you missed it.
Where these changes are even more visible is on our Create New Update screen. Here, if someone would like to post a new update on their storm chasing activities, they have a traditional desktop GUI for data entry. Note again that the targets are quite small for touch interactions. In particular, the little spinner inputs are basically unusable with our fingers. The interactions on this form are also very text-centric and not touch-optimized.
Rethinking this new update form for touch-based interactions might result in a design more like this one. Note we're immediately pushing people into input mode. The cursor is set by default to the primary input field and, as a result, the virtual keyboard is open and ready to go. No additional actions are required to start posting an update. We've also made the controls in this form much bigger and reduced the amount of chrome on the screen. But let's say someone wants to annotate this post with a little weather information. They simply tap the appropriately sized storm icon and, instead of the form elements we saw earlier, they get a slider and a series of checkboxes.
Through these input controls, we're reducing the dependence on typing and instead allowing people to get things done using touch interactions. In case there's actually a tornado present, just move the slider over to the F0 value. An F0 tornado is characterized by chimneys, trees, and signs being damaged. Well, what we're seeing is a bit more severe, so let's move that slider over again to the F2 value.
And note that we're now in significant tornado territory. Instead of typing a lot, let's just quickly tap that we saw a roof being damaged and trees being uprooted. If that's really going on around you, touching with your finger might be a little less nerve-racking than trying to type on the keyboard. But this isn't about typing in storms; it's really about using touch interactions. And you'll note what we're doing here is reducing the reliance on keyboard interactions and instead allowing people to get things done using a few simple gestures for accurate data entry.
Your reaction to this redesign may be: "But I have so many things to fit in my app. How can I do that if the touch targets have to be so big?" Frankly, you can't, and quite often that's a good thing. Designing toward touch forces us to simplify and decide what's most important: what needs to stay on the screen.
If we go through that exercise, we ultimately end up with software that's easier to understand and, as a result, more often used. Both good things. Also, while big touch targets can be comfortably used with a mouse (in fact, they'll be easier to hit with a mouse), small mouse-sized targets can't be used easily with touch. So when in doubt, optimizing for touch will make sure things are usable for both mouse and touch users. Of course, there are interactions which require the precision of a mouse, which might push you more toward keyboard and mouse first applications, as we discussed earlier.
But touch targets aren't just about the size of controls; the placement of controls is important as well. To understand why this matters, let's look at how people hold a smartphone. In general, one hand and one thumb is quite popular. So is holding with one hand and using a single finger on the other to tap, or the more involved two hands, two thumbs position for significant typing or complex tasks. In each of these examples, the bias is toward right-handed use, as most people in the world are right-handed.
These common patterns of posture create easy-to-hit and hard-to-reach touch areas. The area toward the bottom of the screen is easy, whereas the upper corners are a bit of a stretch. So in the upper corners we'll put the destructive controls: clear all my data or undo my typing. These are actions we really want to consider as we stretch our finger uncomfortably to reach them. In contrast, the bottom area of a smartphone screen is where we want to put an application's most common and important interactions, where they can be reached quickly and easily.
Similarly, we can look at tablet postures, or how people typically hold tablet computers: two hands along the sides, or typing over the screen in their lap. In these two postures we see a different set of easy, OK, and hard-to-hit touch areas. Once again the bottom corners are easy, but now the top and center are a bit of a stretch, as we need to move our hands to reach them, which might tire them out.
Tablets can also be held with one hand and tapped with the other. In this mode, we once again see that comfortable touch areas are biased toward the right side of the screen, as most people are right-handed.
What about touch-enabled Ultrabooks? What kinds of postures can we expect, and how will they influence where we place touch targets? In a recent Intel study of touchscreen laptop interactions, a number of patterns started to emerge. People got pretty close to the screen and used their two thumbs, yielding easy-to-hit areas in the bottom corners of the screen.
In other positions, they used their index finger to touch the screen. Perhaps that position makes easy-to-hit targets look more like this. Because touch-based Ultrabooks are new to the market, we don't yet know exactly what the human ergonomics of their use will be. But just like the posture of smartphone and tablet users affects where we place controls, so will the posture of touch-enabled Ultrabook users.
Touch will probably be even more important for convertible Ultrabooks that can shift between tablet and laptop form factors by swiveling the screen to the front of the device, putting touch front and center.
To summarize what we've talked about with touch targets: generally, we want to make controls big enough for our fingers and maintain the spacing we need between controls to avoid mistakes. Building toward touch-optimized applications is likely to focus your application designs, as larger UI elements require you to consider what really needs to be on the screen and what can be removed or subsequently revealed. Touch-optimized interfaces are also likely to be more usable across different input types, as touch-sized targets are easy to hit with a mouse, while mouse-sized targets are hard to hit with touch. So when in doubt, lean toward touch. Last but not least, where we place controls matters. Think in terms of human ergonomics and how people will actually be holding and interacting with devices when you consider where to place your controls on the screen.
For more information on developing applications using touch, take a look at the resources in Intel's Ultrabook developer community. There you'll find touch and sensor guides in addition to many other resources. You can find the link in the blog post accompanying this video.
The mainstream adoption of touch by consumers, and increasingly by enterprise users, makes it a key part of re-imagining desktop applications for the Ultrabook platform. The touch target considerations we walked through in this video are an essential part of designing for touch. But we're not done yet. In the next video of this series, we'll take a deep dive into touch gestures: the kinds of interactions touch enables and how we can take advantage of them in our application designs.
I hope you'll join me then. Thanks.
Stay tuned for more videos in the series coming soon...
Disclosure: I am a contracted vendor with Intel. Opinions expressed on this site are my own and do not necessarily represent Intel's position on any issue.