One of the computing trends from the Intel Developer Forum 2011 in San Francisco was the use of Microsoft Kinect as a platform for controlling electronics with gestures. The demo showcased a prototype electronic sign that could serve as an endcap display in a retail shop. The sign was a large LCD panel mounted vertically, showing half a dozen images of drugstore products arranged as if in a high-end magazine ad, making razors, shaving cream, sunblock, and the like look as glamorous as possible. For the demo, I played a shopper seeking information about the air fresheners advertised on the screen.
Getting the sensors to recognize my hand took a couple of tries. The demo staff suggested I shift my feet six inches to the left and modify my wave to something between Queen Elizabeth's and a warm howdy. Hovering my virtual hand over a product image brought it to the foreground, and icons at the bottom of the screen let me choose among price, product information, and reviews. The sensor and software technology will need to evolve before almost everyone can activate the display on the first try.
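The hover-to-select interaction in the demo boils down to two steps: a hit test that maps the tracked hand position to an on-screen product tile, and a dwell timer so a tile is only selected after the hand lingers on it. The sketch below is hypothetical Python, not the demo's actual code; the tile layout, coordinate convention (normalized 0..1 screen coordinates), and dwell threshold are all assumptions, and a real system would get the hand position from a sensor SDK such as Kinect's.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tile:
    """A rectangular product image on the sign (normalized 0..1 coordinates)."""
    name: str
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

def hit_test(hand_x, hand_y, tiles):
    """Return the tile under the hand cursor, or None if the hand is over none."""
    for t in tiles:
        if t.x <= hand_x < t.x + t.w and t.y <= hand_y < t.y + t.h:
            return t
    return None

class DwellSelector:
    """Fire a selection once the hand hovers over the same tile for dwell_s seconds."""
    def __init__(self, dwell_s=1.0):
        self.dwell_s = dwell_s
        self._current = None  # tile currently hovered
        self._since = 0.0     # timestamp when hovering began

    def update(self, tile, now):
        # Restart the timer whenever the hovered tile changes (including to None).
        if tile is not self._current:
            self._current = tile
            self._since = now
            return None
        if tile is not None and now - self._since >= self.dwell_s:
            return tile  # dwell threshold reached: select this tile
        return None
```

In use, each sensor frame would call `hit_test` with the hand coordinates and feed the result to `DwellSelector.update` along with the frame timestamp; the selector returns the tile to bring to the foreground once the dwell time elapses.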
The demo illustrated well how gestures can be useful for interactive advertising and product information in retail. Similar applications include interactive signage and large-screen kiosks that the public can use to look up locations, schedules, events, transit, menus, and so on. In the workplace, using gestures to control computers and electronics can make many tasks easier. Creative prototypes from developers using the Microsoft Kinect SDK show some of the ways gesture interfaces will change how we work. In live presentations, gestures can replace the remote-control "clicker" used to advance slides or zoom in on a chart. In warehouses and factories, guiding heavy equipment or robots by gesture could be more economical, and potentially safer, than push-buttons and levers. Gestures can be especially useful for analyzing large amounts of graphical data, such as manipulating multiple charts, graphs, tables, and images on a large screen for business dashboards or for planning treatment for patients with multiple health issues.
To discuss how new user interfaces are shaping computing, I invite you to join David Willis and me for our webinar on October 27th: “iPad and Beyond: What the Future of Computing Holds.” Here is the link to register.
Thanks for reading my blog.