Joseph Fletcher’s Untold Stories of Touch presentation at Microsoft’s MIX09 conference discussed the relationship of touch-based natural user interfaces (NUIs) to other forms of UI and shared lessons learned while working on Microsoft Surface.
- Command Line Interface: an extremely efficient form of input, but it requires training
- Graphical User Interface: lowered the barrier to entry and brought in more users
- Natural User Interface: a new type of input that needed a new system around it; touch and gesture for direct manipulation
- Touch is not good for everything; it is great for specific actions. Touch is a computing evolution, not a revolution.
- Principle-driven design (based on lessons from history):
- CLI: text (recall) – directed, high-low, disconnected, static
- GUI: graphics (recognition) – exploratory, double-medium, indirect
- NUI: objects (intuition) – contextual, fast-few, unmediated
- Resulting principles: Performance Aesthetics, Direct Manipulation, Scaffolding, Contextual Environments, Super Real
Lessons Learned
- When designing applications, the tendency is to put menus at the top of the screen. On touch-screen laptops, tapping the top of the screen wobbles the display, so applications need to be designed differently.
- With a tablet, people hold the device with one hand, so gestures need to work with one hand.
- Capacitive screens: require the touch of a human hand to register input
- Infrared screens: do not need to be touched to register an action; cameras track the inputs
- Technologies differ in how many simultaneous touch points they can accept, from 1 (Wacom) to 52 (Surface). Do you design for the lowest common denominator? (See the capability-detection sketch after this list.)
- In Firefox, swiping left goes backward in history; on the iPhone, swiping left goes forward. Gesture patterns are inconsistent across platforms.
- Orientation issues: horizontal vs. vertical
- Challenge: the cardio effect. People get tired after moving parts of their body for a while.
- To button or not to button: when you add one GUI control, people assume other GUI controls will be present and look for them. Be very careful about when and how you mix the touch and GUI paradigms.
- Challenge: getting people to touch. Exploration: the water application on Surface invites people to interact. Instruction: the iPhone’s “slide to unlock” text.
- Challenge: visual feedback. How do you know the system heard you? Visualize each touch so people know the system registered it (see the feedback sketch below).
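The lowest-common-denominator question above is, in practice, a capability-detection problem. The following is a minimal sketch, not from the talk, assuming a browser environment with Pointer Events: it checks how many touch points the hardware reports and tracks the pointers currently in contact, so an application can decide whether to enable multi-finger gestures or fall back to single-touch behavior.

```typescript
// Sketch: detect hardware multi-touch capability and count active touch pointers.
// Assumes a browser with Pointer Events; the gesture policy at the end is illustrative.

const activePointers = new Set<number>();

function supportsMultiTouch(): boolean {
  // maxTouchPoints is 0 on mouse-only devices, 1+ on touch hardware.
  return navigator.maxTouchPoints > 1;
}

window.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType === "touch") {
    activePointers.add(e.pointerId); // one entry per finger currently down
  }
});

function releasePointer(e: PointerEvent): void {
  activePointers.delete(e.pointerId);
}
window.addEventListener("pointerup", releasePointer);
window.addEventListener("pointercancel", releasePointer);

// Example policy: only offer a two-finger gesture when the hardware can
// report at least two touch points; otherwise expose an equivalent button.
const enableTwoFingerRotate = supportsMultiTouch();
console.log("Two-finger rotate enabled:", enableTwoFingerRotate);
```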
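For the visual-feedback challenge, one common approach is to draw a short-lived "ripple" at every touch point so people immediately see that the system registered their touch. This is a minimal sketch under assumed conditions (a browser with Pointer Events and a canvas element with the hypothetical id "feedback"); it is not the Surface implementation.

```typescript
// Sketch: draw a fading circle wherever a pointer lands, as immediate touch feedback.
// Assumes <canvas id="feedback"> exists in the page; sizes and timings are illustrative.

const canvas = document.getElementById("feedback") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

interface Ripple { x: number; y: number; born: number; }
const ripples: Ripple[] = [];

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  ripples.push({ x: e.offsetX, y: e.offsetY, born: performance.now() });
});

function draw(now: number): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  for (let i = ripples.length - 1; i >= 0; i--) {
    const age = now - ripples[i].born;
    if (age > 400) {           // each ripple lives 400 ms, then is removed
      ripples.splice(i, 1);
      continue;
    }
    const t = age / 400;       // 0 -> 1 over the ripple's lifetime
    ctx.beginPath();
    ctx.arc(ripples[i].x, ripples[i].y, 10 + 30 * t, 0, Math.PI * 2);
    ctx.strokeStyle = `rgba(255, 255, 255, ${1 - t})`; // fade out as it grows
    ctx.lineWidth = 2;
    ctx.stroke();
  }
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```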