In my fourth video for Intel Software Partners on re-imagining desktop application design, I provide an overview of location detection technologies and how we can use them to enhance desktop applications. I also walk through redesigning an existing app to take advantage of location.
Location is much more than just finding points on a map. Through location inclusion, filtering, interactions between objects, background processing, and more, desktop applications can be enriched and improved. Let's see how:
Welcome to the re-imagining apps for Ultrabook series with Luke Wroblewski. Today we’ll continue our look at the impact of new technical capabilities on desktop application design with an overview of location detection and how we can use it to enhance the apps we create.
Location detection has been a key element of mobile devices for years. Today a typical smartphone can locate itself in several ways: using cell tower triangulation, GPS positioning, or WiFi access point information in dense urban areas or indoor environments. The immediacy and increasing accuracy of these systems has led to an explosion of location-based services on mobile.
Nearly three quarters of all smartphone users in the United States use real-time location-based services regularly on their devices. Facebook alone collects over 2 billion location tags a month from user check-ins. Clearly location is a big deal for highly-portable smartphones ... but what about larger form devices like Ultrabooks?
Today not only can laptops use the same WiFi access point-powered location services used by smartphones but increasingly more accurate location detection technologies like GPS are being included in platforms like the Ultrabook. This creates new opportunities for desktop application designers and developers to integrate location-based information into their apps.
Step one to taking advantage of this opportunity is understanding what’s possible. For that, let’s look at what we can expect from the location detection features one might find in an Ultrabook. GPS gives us very accurate location, down to 10 meters, but it can take some time to establish a location, especially the first time. GPS can also drain battery life and is usually ineffective indoors.
WiFi access point information, on the other hand, takes almost no time to produce a position and places no additional drain on the battery. WiFi location accuracy, however, doesn’t hold up to GPS - just 50 meters or so, and only where the density of WiFi access points is high. Even still, two-thirds to three-fourths of the time a device like the iPhone locates itself, it is using WiFi location look-up services.
And when you compare both GPS and WiFi location detection to what was previously available on the Internet with IP-based detection, you can see how far we’ve come from merely knowing, with 99% accuracy, which country you’re in right now.
Now that we have a sense of what location detection can do, let’s put it into action. In the previous videos in this series, we took an existing desktop application design and rethought it for touch target and gesture support. Let’s build on that work and see how we can take advantage of the location capabilities of the Ultrabook platform. As before, we’ll work with the Tweester app, designed to be the ultimate social networking tool for the storm chasing community. Tweester is very well suited for large screen, high-powered, portable devices with multiple input formats, like the Ultrabook. Storm chasers need to capture lots of accurate data on location, process it quickly, and track what other chasers are seeing where they are as well. So location detection is key to making this app work.
The most basic capability of location detection is finding our devices, and ourselves on a map. Even this simple feature, however, can do a lot for our app.
When browsing updates from fellow storm chasers in the original Tweester application design, you’ll note we can see where an update was created. That’s really useful when you’re sharing storm sightings and data.
But capturing this location data with enough accuracy to make it useful in our original application was painful. We had to ask each user to tell us precisely where they were so we could include detailed location information with their update. This manual data entry is not only painful for users, but could lead to entry errors as well.
In our redesign for the Ultrabook platform and Windows 8, things are much smoother. When our redesigned version of Tweester for this platform is first opened, we simply ask people if we can have access to their location. If they agree, from that point on we can instantly look up their location using the most precise information the device has: WiFi access points, GPS, or anything else that’s available.
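The "most precise source available" idea can be sketched in a few lines of Python. This is a hypothetical illustration, not Tweester's actual code: the source names, the accuracy figures (taken from the rough numbers mentioned earlier in the video), and the `best_location` helper are all assumptions for the sake of the example.

```python
from typing import Optional, Tuple

# Illustrative typical accuracy of each source, in meters
# (roughly the figures mentioned in the video)
SOURCE_ACCURACY_M = {"gps": 10, "wifi": 50, "ip": 100_000}

def best_location(fixes: dict) -> Optional[Tuple[float, float]]:
    """Return the fix from the most precise source that currently has one.

    `fixes` maps a source name ("gps", "wifi", "ip") to a (lat, lon)
    tuple, or to None when that source has no fix right now.
    """
    # Try sources in order of increasing (i.e. better) accuracy figure
    for source in sorted(fixes, key=SOURCE_ACCURACY_M.get):
        fix = fixes[source]
        if fix is not None:
            return fix
    return None  # no source could locate the device
```

So if GPS has no fix indoors but WiFi does, the WiFi position is used automatically, with no involvement from the user.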
This means the process for capturing location when someone creates an update is drastically simplified. In fact, for our user, it’s non-existent. We can simply append their location to an update and give them the option to remove it if they choose. No data entry required, which means we can capture a lot more location information.
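As a rough sketch of this append-by-default, opt-out pattern (hypothetical types and names, not the app's real data model):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Location:
    latitude: float
    longitude: float
    accuracy_m: float  # estimated accuracy of the fix, in meters

@dataclass
class Update:
    text: str
    location: Optional[Location] = None

def create_update(text: str,
                  current_location: Optional[Location],
                  include_location: bool = True) -> Update:
    """Attach the device's current location unless the user opts out."""
    return Update(text, current_location if include_location else None)
```

The default does the right thing with zero effort from the user; removing the location is a single flag rather than the default behavior.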
Moving over to the Maps portion of the Tweester application, we can now see this data in use and how it adds value for anyone using the application. In this example, three storms have been reported by Tweester users. Since the Tweester application has access to our current location, we can also see how far away we are from each of them.
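Those "distance from me" figures come down to a great-circle distance between two coordinates. A standard way to compute it is the haversine formula; a minimal, self-contained version (the video doesn't show how Tweester does this, so this is just one common approach):

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in kilometers."""
    R = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))
```

Run once per reported storm against the device's current fix, this is all the map needs to label each storm icon with its distance from us.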
Let’s say we’re considering taking a closer look at the storm nearest us. We’ll simply tap on its icon to reveal a set of actions and information. Right away you can see updates from the location and about the storm. Because we made it so easy to append a location to each update, we can surface location-relevant information instantly.
But what fun is reading about a storm when you are a storm chaser? So let’s head over to see things first hand by tapping on the Intercept action. Now, we’re starting to see location information used in a richer way. That is, instead of just pinpointing ourselves on a map, we can see our relationship to another object: in this case the storm we want to chase. Note the storm’s predicted path is matched with the best route for us to intercept it. We even get a sense of how long it will take to intercept based on our rate of travel and distance from the storm.
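At its simplest, a time-to-intercept estimate is the distance along the route divided by the speed at which we're closing on the storm. This toy version (my own simplification, not the app's routing logic) assumes we travel straight along the route while the storm moves directly away from us:

```python
from typing import Optional

def intercept_eta_hours(distance_km: float,
                        chaser_speed_kmh: float,
                        storm_speed_kmh: float = 0.0) -> Optional[float]:
    """Rough hours until intercept, or None if we can't close the gap.

    Assumes the storm's speed component is directly away from us, so the
    closing speed is simply our speed minus the storm's speed.
    """
    closing_kmh = chaser_speed_kmh - storm_speed_kmh
    if closing_kmh <= 0:
        return None  # the storm is pulling away; no intercept
    return distance_km / closing_kmh
```

A real implementation would recompute this continuously as both the chaser's position and the storm's predicted path update, which is exactly what the next step shows.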
Now let’s update that information in real time as we drive closer to capture more detailed observations. As this example indicates, location detection isn’t just static. It can provide useful dynamic information as well. When coupled with additional sensors and the processing power of an Ultrabook, we can collect and share lots of storm data as we move closer to our target.
Even when we’re not in hot pursuit of a storm, having access to location detection can be quite useful. Consider that as we’re driving through storm country, our path may be annotated with storm images shared by fellow storm-chasing Tweester users, or even marked up previously with our own information and reminders.
By keeping track of location in the background of Tweester, we can surface these useful bits of information to people using the application as they become relevant. Perhaps with a dialog that lets us know some recent updates were posted near our current location. Or maybe less intrusively with a notification that there are new points of interest on the map.
We can even give our users control of these “geo-fenced” updates by allowing them to select a location on the map and set reminders for themselves the next time they are in the same location: collect new data, check previous readings, or whatever they choose. These reminders will just automatically come up when the same location is visited again.
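A geo-fenced reminder boils down to a stored note, a coordinate, and a trigger radius; each time the device reports a new position, we check which fences it falls inside. A minimal, self-contained sketch (the `Reminder` shape and the 500-meter default radius are my assumptions, not anything shown in the video):

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Reminder:
    note: str
    lat: float
    lon: float
    radius_m: float = 500.0  # hypothetical default trigger radius

def _distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine great-circle distance, in meters."""
    R = 6371000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def due_reminders(reminders: List[Reminder],
                  lat: float, lon: float) -> List[Reminder]:
    """Return the reminders whose geofence contains the current position."""
    return [r for r in reminders
            if _distance_m(r.lat, r.lon, lat, lon) <= r.radius_m]
```

Called from the background location loop, this is enough to pop the reminder automatically the next time the same spot is visited.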
In looking at these examples in the Tweester application, we’ve seen that location is much more than just finding points on a map. The location detection capabilities in the Ultrabook platform provide lots of ways to rethink existing application design. Through location inclusion, filtering, interactions between objects, background processing and more, desktop applications can be enriched and improved through location.
Further information on developing applications to take advantage of location detection is available on Intel's Ultrabook developer community site. You can find the link in the blog post accompanying this video.
Thanks for your time and I’m looking forward to having you join me in the next video in this ongoing series. Until then thanks for tuning in.
Stay tuned for more videos in the series coming soon...
Disclosure: I am a contracted vendor with Intel. Opinions expressed on this site are my own and do not necessarily represent Intel's position on any issue.