The Minority Report Interface

September 23, 2004

Wall-sized displays, gesture recognition, and seamless information convergence: it’s the stuff interface designers’ dreams are made of. It’s also a vision found in Minority Report. So how did Tom Cruise get such a nice setup? It turns out that Microsoft Research, MIT, and several design shops had a say in the interface designs found in the film.


Starfire

Sun Microsystems’ 1992 project (and video prototype) “to develop and promulgate an advanced integrated computing/communication interface” was an early example of a gesture-driven, large-scale information space. The Starfire story was set in November of 2004, but we still have a ways to go before we achieve the principles it laid out.


Microsoft Research

“In July of 1999 Microsoft entered into an exploratory relationship with Fox Studios for the upcoming production of Minority Report. We exchanged ideas and briefs and eventually provided mockups of user interface design for the police helmet, outdoor kiosks, and conceptual virtual information spaces.”


Schematic

“Schematic's creative director, Dale Herigstad, was one of those commissioned by Minority Report director Steven Spielberg for conceptual design for the film; specifically around the area of user interface.”


Katherine Jones

“One of the brainiac designers behind all those futurama interface concepts in Minority Report. After trading a few emails, Kat was kind enough to share some of her early prototype sketches.”


MIT

“John Underkoffler set out to create a 'gestural' method of input, whereby specific hand movements would represent different commands. In doing so, he built on research that has been taking place at the Media Lab for years, including some of his own.”

Bringing it to Life

Today, several Human-Computer Interaction research projects and prototypes are bringing the Minority Report interface vision to life:


  • Riot's System B1 "utilizes simple computer vision techniques to allow a user to control their system using hand and finger motion." (A rough sketch of this kind of vision-based hand tracking follows this list.)
  • Multi-Touch Interaction Research "enables a user to interact with a system with more than one finger at a time, as in chording and bi-manual operations. Such sensing devices are inherently also able to accommodate multiple users simultaneously, which is especially useful for larger interaction scenarios such as interactive walls and tabletops." (A toy chording example also appears after this list.)
  • Physical Motion Interface is "a physical, motion based interface using cheap, common electronics and parts, and readily available software."
  • Sensetable is "a system that wirelessly tracks the positions of multiple objects on a flat display surface quickly and accurately."
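For a feel of what "simple computer vision techniques" for hand and finger control can look like, here is a minimal Python/OpenCV sketch. It is not Riot's actual System B1 code; it just finds the largest skin-colored blob in a webcam frame and treats its centroid as a pointer position. The HSV skin-tone range is an assumed placeholder and would need calibration for real lighting conditions and skin tones.

    import cv2
    import numpy as np

    # Rough, assumed skin-tone range in HSV; calibrate per user and lighting.
    SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
    SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)

    cap = cv2.VideoCapture(0)  # default webcam

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Segment skin-colored pixels and clean up speckle noise.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
        mask = cv2.medianBlur(mask, 7)

        # [-2] picks the contour list under both OpenCV 3.x and 4.x return conventions.
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
        if contours:
            hand = max(contours, key=cv2.contourArea)
            m = cv2.moments(hand)
            if m["m00"] > 0:
                cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
                # The centroid (cx, cy) could drive a cursor or feed a gesture recognizer.
                cv2.circle(frame, (cx, cy), 8, (0, 255, 0), -1)

        cv2.imshow("hand tracking sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()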
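Likewise, "chording and bi-manual operations" boil down to interpreting several simultaneous contact points as one command. Here is a toy Python sketch that turns the change in distance between two touch points into a zoom factor; the coordinates are made up for illustration, since real multi-touch hardware would supply them.

    import math

    def pinch_zoom(p1_start, p2_start, p1_end, p2_end):
        """Scale factor from the change in distance between two touch points."""
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        start = dist(p1_start, p2_start)
        end = dist(p1_end, p2_end)
        return end / start if start else 1.0

    # Two fingers (one per hand) move apart: the targeted object grows by 2x.
    print(pinch_zoom((100, 100), (200, 100), (50, 100), (250, 100)))  # 2.0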