Device Motion in Application Design

by Luke Wroblewski on May 14, 2013

In my fifth video for Intel Software Partners on re-imagining desktop application design, I provide an overview of device motion and walk through a few ways we can take advantage of this set of capabilities in the desktop apps we create.

Device motion is made possible by a combination of always-on sensors (typically an accelerometer, a magnetometer, and a gyroscope) that tell us how a computer is moving through the space around it. The ability of these sensors to provide precise information about the movement of a device opens up new design possibilities for applications. From adjusting the user interface based on orientation changes, to using three-dimensional motion as input, to combining device motion with location detection, video cameras, and light sensors, there's no shortage of interesting interface designs made possible by device motion.

Complete Transcript

Welcome back to the re-imagining apps for Ultrabook series with Luke Wroblewski. Today we'll continue to look at how new technologies in Ultrabook computers allow us to rethink desktop application design. In this video I'll provide an overview of device motion and walk through a few ways we can take advantage of this set of capabilities in the desktop apps we create.

Device motion is made possible by a combination of sensors that can tell us how a computer is moving through space. No, not that kind of space. The space all around us. For example, how you move a smartphone in your hand is typically tracked by three sensors: an accelerometer, a magnetometer, and a gyroscope. The ability of these sensors to provide precise information about the movement of a smartphone opened up new design possibilities on mobile devices. Mobile developers and designers made use of these capabilities to create interfaces that relied on shakes or tilts of the phone to get things done. Over time these experiments became commonly used in mobile application design. Now these same sensors are available within Ultrabooks, and developers have the same kind of opportunity to establish new patterns in desktop application design. To better understand what's possible, let's take a look at what each of these device motion sensors does.

The accelerometer is perhaps the most common device motion sensor, as it's been included in smartphones and laptops for quite some time. Essentially, it measures the motion of a device and can be used to determine changes in your computer's position. When you rotate a smartphone from portrait to landscape, the accelerometer tells the software that an orientation change has happened. Many mobile applications take advantage of this change to adjust and improve their interface design for landscape use.
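To make this concrete, here's a rough sketch of how an application might infer orientation directly from the accelerometer's gravity reading. It uses the browser's DeviceMotionEvent as a stand-in for whatever sensor API a given platform exposes; the gravity-axis comparison is the core idea:

```typescript
// Infer portrait vs. landscape from the gravity vector the accelerometer
// reports. With the screen held upright, gravity pulls mostly along the
// device's y axis in portrait and its x axis in landscape.
type Orientation = "portrait" | "landscape";

let current: Orientation | null = null;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const g = e.accelerationIncludingGravity;
  if (!g || g.x === null || g.y === null) return;

  const next: Orientation =
    Math.abs(g.x) > Math.abs(g.y) ? "landscape" : "portrait";

  if (next !== current) {
    current = next;
    document.body.dataset.orientation = next; // let CSS adapt the layout
  }
});
```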

The same kind of orientation change can be detected in Ultrabooks and is quite useful in a world of increasingly hybrid devices: traditional laptop form factors that easily convert to tablets and, as a result, are likely to be rotated from landscape to portrait mode more frequently than smartphones.

One additional consideration unique to Ultrabooks is the location of sensors like the accelerometer. In some devices, the sensors are in the screen. In others, they are present in the body of the computer. Because they are located in different parts of the hardware, they can return different values for device position, and applications have to account for that distinction.

In addition to raw data about their position, some convertible devices also provide events that tell us when they change modes, for example, when being converted from clamshell to tablet mode. As we'll see later, this can be a useful change to watch for and act on.

Accelerometers don't just tell us about orientation changes; they provide a stream of data about how a device is moving. This information can be used to create very natural-feeling interactions. For instance, readers of books could activate a "tilt-scrolling" mode that gradually scrolls through content as a device is tilted forward or backward. The accelerometer tells us the device is changing position, and we can scroll the text someone is reading accordingly. This is a great example of the kind of innovation we've seen in mobile applications that's now possible on Ultrabooks.
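Here's a minimal sketch of what tilt-scrolling could look like, again using the W3C DeviceOrientationEvent as a stand-in. The neutral angle, dead zone, and scroll speed are tuning guesses, not recommended values:

```typescript
// Tilt-scrolling sketch: tilting the device forward or back scrolls the page.
// beta is the front-to-back tilt in degrees; the constants are tuning guesses.
const NEUTRAL_BETA = 40;   // a comfortable reading angle, in degrees
const DEAD_ZONE = 5;       // ignore small hand wobbles
const PX_PER_DEGREE = 0.5; // scroll speed per frame

let tilt = 0;

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  if (e.beta !== null) tilt = e.beta - NEUTRAL_BETA;
});

function step(): void {
  const excess = Math.abs(tilt) - DEAD_ZONE;
  if (excess > 0) {
    window.scrollBy(0, Math.sign(tilt) * excess * PX_PER_DEGREE);
  }
  requestAnimationFrame(step);
}
requestAnimationFrame(step);
```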

In addition to accelerometers, digital compasses (or magnetometers) are increasingly common in modern computing devices. These sensors give us a sense of the direction a device (and thereby its owner) is facing. When coupled with location detection technology, which we discussed in the last video, magnetometers provide us a way of determining someone's current orientation in the real world.
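As a sketch, combining the two might look like this. Note that the deviceorientation event's alpha value is only a true compass bearing when the event is flagged as absolute (that is, magnetometer-backed), so this is illustrative rather than production-ready:

```typescript
// Report where the user is and which way they're facing by pairing the
// magnetometer-backed compass heading with geolocation.
let heading: number | null = null;

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  // alpha is measured counterclockwise; a compass bearing runs clockwise.
  if (e.absolute && e.alpha !== null) heading = (360 - e.alpha) % 360;
});

navigator.geolocation.watchPosition((pos) => {
  if (heading !== null) {
    console.log(
      `At ${pos.coords.latitude}, ${pos.coords.longitude}, ` +
      `facing ${heading.toFixed(0)} degrees`
    );
  }
});
```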

Last but not least is the gyroscope, which measures the rate of change around an axis and generally makes the device motion data we have to work with much more accurate. In fact, it's really the combination of these sensors that gives us the most insightful look at the true motion of a device. And the more data we have, the more interesting things we can do. Consider the mobile service Bump, which allows two people to exchange contact information by "bumping" their phones together. A combination of accelerometer data, geolocation, IP addresses, and more is used to make a match between two phones.

Before you get too nervous about needing to work with lots of data coming from various sensors, rejoice: the Windows 8 team has done a lot of work to combine the information coming from the accelerometer, digital compass, and gyroscope, and to simplify how developers work with this information on Ultrabooks. When all three sensors are present on a device, Windows gives you easy access to "9 axis" motion and orientation sensing.

What that boils down to is the true orientation of a device in real time. So we can detect and make use of any number of rotations, twists, or even shakes of a device to trigger actions and adapt the interface of an application. And that opens up a whole realm of interesting possibilities.
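To give a feel for why fusing sensors beats any one of them alone, here's a minimal complementary filter that estimates pitch by blending the gyroscope's fast-but-drifting rate data with the accelerometer's noisy-but-drift-free gravity reading. The 0.98/0.02 split is a common starting point, and the exact axis and sign conventions vary by platform:

```typescript
// Complementary filter: blend gyroscope-integrated pitch (fast, but drifts)
// with accelerometer-derived pitch (noisy, but anchored to gravity).
const GYRO_WEIGHT = 0.98;
let pitch = 0;                     // estimated pitch, in degrees
let lastTime: number | null = null;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const g = e.accelerationIncludingGravity;
  const rate = e.rotationRate;
  if (!g || !rate || g.y === null || g.z === null || rate.beta === null) return;

  const now = performance.now();
  const dt = lastTime === null ? 0 : (now - lastTime) / 1000; // seconds
  lastTime = now;

  // Pitch implied by gravity alone...
  const accelPitch = Math.atan2(g.y, g.z) * (180 / Math.PI);
  // ...blended with pitch integrated from the gyro's rotation rate (deg/s).
  pitch = GYRO_WEIGHT * (pitch + rate.beta * dt) +
          (1 - GYRO_WEIGHT) * accelPitch;
});
```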

Now that we have a sense of what device motion gives us, let's put it into action. In the previous videos in this series, we took an existing desktop application design and rethought it for touch and location support. Let's continue that work and see how we can take advantage of the device motion capabilities of the Ultrabook platform. As before, we'll work with the Tweester app, designed to be the ultimate social networking tool for the storm chasing community. Tweester allows storm chasers to track and record severe weather conditions in a number of ways that could benefit from device motion. Let's look at some specific examples.

For starters, what kinds of adjustments can we make to the Tweester user interface to account for the kinds of orientation changes we discussed earlier? To answer that, we'll pull the screen out of this convertible Ultrabook.

And pull up the map feature of Tweester. As you can see, this interface features a set of top-level menu options on the left, some contextual actions and information on the right, and a large map in between, which gives us a nice big canvas for hunting storms. But what happens if we rotate the screen into portrait mode?

Due to the aspect ratio of this device, our nice big canvas has been reduced to a narrow sliver. Not only that, but our left and right columns now stretch the full height of the screen and take precious space away from our map, which is the primary purpose of this feature. We can do better.

One option is to move the contextual actions for the map into the main navigation column and open up a lot more room for the map. While we're certainly being smarter about screen space, shifting actions around this dramatically may be a bit much, as people may get disoriented by the now-combined menu system.

So instead we might consider hiding the main navigation list off-screen and giving people an easy way to bring it back into view. With this approach, the menu is available if people need it but not competing with the map when they don't. But there's still more we can do, because the column on the right is still eating up more screen space than it should. We might use that as an opportunity to add more relevant content to the interface or just try another adjustment.

In this design, we've kept the main navigation off-screen and easily accessible. In addition, we've changed the orientation of the contextual actions and information. These elements are now at the bottom of the screen and can be accessed easily by scrolling left and right. Though we've moved things around a bit, the visual style has stayed the same, and we haven't combined two distinct interface elements like we did in our first version. Every element still has its own place; we've just adjusted where that place is to account for the device's vertical orientation. Hopefully these iterations give you a sense of how to account for orientation changes in an application's user interface. Because wide-screen aspect ratios are increasingly common on Ultrabook devices, a fair bit of interface adjustment is often needed between portrait and landscape mode designs.
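The wiring behind this kind of adjustment can be quite small. Here's a sketch that toggles hypothetical layout classes when the orientation changes, leaving the actual repositioning to CSS:

```typescript
// Swap layout treatments when the device rotates. The class names are
// hypothetical; the CSS behind them does the actual repositioning.
const portrait = window.matchMedia("(orientation: portrait)");

function applyLayout(isPortrait: boolean): void {
  document.body.classList.toggle("nav-offcanvas", isPortrait);  // main nav slides off-screen
  document.body.classList.toggle("context-bottom", isPortrait); // contextual panel docks at the bottom
}

applyLayout(portrait.matches);
portrait.addEventListener("change", (e) => applyLayout(e.matches));
```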

The variety of Ultrabook form factors available today provides a broad palette of possibilities. Consider a convertible device that shifts from a clamshell form factor to a tablet. We can detect this change through device motion and again adjust the interface.

For instance, slideshows in Tweester could enter fullscreen mode for lean-back viewing when this type of orientation change happens. We could then make use of an integrated camera to track the motion of someone's hand. When it moves from left to right or right to left, the gallery of images can scroll accordingly. That's two capabilities, the camera and an orientation change, working together to create a unique image viewing experience.

While orientation changes give us an opportunity to create a better interface for people as they use our application in different settings, they’re a relatively simple use of device motion. We can do lots more.

With two hands on the device, we can tip and twist the screen in our hands to move around the map in Tweester. This mode, which users should be able to turn on and off, uses the real-time motion of a device to provide a fluid panning experience, so people can just tip their screen in any direction to explore parts of the map that interest them. In this example, the motion of the device has essentially become the input mechanism. That is, moving the screen in your hands also moves the elements you see on screen.
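A sketch of the tilt-to-pan idea might look like the following. The map.panBy method is a hypothetical stand-in for whatever mapping library is in use, and the gain and dead-zone values are guesses:

```typescript
// Tilt-to-pan: map the screen's tilt onto map movement. map.panBy is a
// hypothetical stand-in for a real mapping library's pan call.
declare const map: { panBy(dx: number, dy: number): void };

const NEUTRAL_BETA = 40; // fore/aft angle treated as "at rest", in degrees
const DEAD_ZONE = 3;     // degrees of tilt to ignore
const GAIN = 1.5;        // pixels of pan per degree of tilt

let panEnabled = true;   // users should be able to switch this mode off

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  if (!panEnabled || e.beta === null || e.gamma === null) return;

  const tiltX = e.gamma;                // left/right tilt
  const tiltY = e.beta - NEUTRAL_BETA;  // forward/back tilt

  const dx = Math.abs(tiltX) > DEAD_ZONE ? tiltX * GAIN : 0;
  const dy = Math.abs(tiltY) > DEAD_ZONE ? tiltY * GAIN : 0;
  if (dx || dy) map.panBy(dx, dy);
});
```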

The same device motion data could be used in combination with other forms of input as well. Let's say we're out storm-chasing with Tweester and our Ultrabook, and an interesting weather pattern is forming near us. We could quickly start to capture some high-definition video of what we are seeing to share it with other storm chasers. But due to the weather, it's likely our hands won't be that steady and the screen will be moving as we hold it. By tracking the motion of our device, we could compensate by adjusting the video to account for the motion of our hands. The Ultrabook is actually a great platform for this type of feature because of its powerful processing capabilities.

Device motion sensors can tell us a lot about how a device is moving in someone's hands, but they can also clue us in to environmental changes. For instance, if I hop into the car to chase down a storm, the car's acceleration can tell us to shift the interface into "travel" mode, or offer the user the opportunity to make this adjustment; after all, we don't want to infer too much about someone's intent from the fact that they entered a moving car. Once in travel mode, the interface can show less, but more important, weather information around us to account for our faster rate of motion.
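One way to sketch this detection: watch for sustained acceleration above hand-held jitter and then prompt rather than switch automatically. The thresholds below are guesses, and promptTravelMode is a hypothetical UI hook:

```typescript
// Suggest (rather than force) travel mode once acceleration stays above
// hand-held jitter for a while. promptTravelMode is a hypothetical UI hook.
declare function promptTravelMode(): void;

const THRESHOLD = 1.5;   // m/s^2, above typical hand-held jitter (a guess)
const WINDOW_MS = 5000;  // how long the motion must persist

let movingSince: number | null = null;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const a = e.acceleration; // gravity already removed
  if (!a || a.x === null || a.y === null || a.z === null) return;

  const magnitude = Math.hypot(a.x, a.y, a.z);
  const now = performance.now();

  if (magnitude > THRESHOLD) {
    movingSince = movingSince ?? now;
    if (now - movingSince > WINDOW_MS) {
      promptTravelMode(); // ask; don't silently infer intent
      movingSince = null;
    }
  } else {
    movingSince = null;
  }
});
```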

For an extra bonus feature, if we actually make it into a storm and things go dark, the ambient light sensor could detect that the amount of light has dropped substantially and invert the map display for us, so we can still see the information we need in these new lighting conditions. Now that's a useful storm chasing companion.
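A sketch of that bonus feature, using the Generic Sensor API's AmbientLightSensor where it's available (support varies, so the whole thing is feature-detected, and the 50-lux threshold is a guess):

```typescript
// Flip the map to a dark theme when ambient light drops. AmbientLightSensor
// comes from the Generic Sensor API and isn't available everywhere, so the
// whole block is feature-detected; the 50-lux threshold is a guess.
if ("AmbientLightSensor" in window) {
  const sensor = new (window as any).AmbientLightSensor({ frequency: 1 });
  sensor.addEventListener("reading", () => {
    const dark = sensor.illuminance < 50; // roughly "a storm just rolled in"
    document.body.classList.toggle("map-dark", dark);
  });
  sensor.start();
}
```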

In looking at these examples in the Tweester application, we've seen that device motion gives us several ways to enhance an existing application. From adjusting the user interface based on orientation changes, to using three-dimensional motion as input, to combining device motion with location detection, video camera, and light sensor capabilities, the Ultrabook platform provides lots of ways to re-imagine desktop application design. And we didn't even touch on how the motion of two or more devices can allow them to interact with each other. But we have to save something for the next video in this series.

Further information on developing applications to take advantage of device motion is available on Intel's Ultrabook developer community site. You can find the link in the blog post accompanying this video.

Thanks for your time, and I'm looking forward to having you join me next time. Until then, thanks for tuning in.

More Soon...

Stay tuned for more videos in the series coming soon...

Disclosure: I am a contracted vendor with Intel. Opinions expressed on this site are my own and do not necessarily represent Intel's position on any issue.