Thursday, October 20, 2011

OmniTouch turns any surface into a touchscreen interface



Had Shakespeare been born several centuries later, he might have said "All the world's an interface," especially if he'd had a chance to play with the recently developed, wearable OmniTouch system. While interactive interface projectors are far from new, this innovative concept design takes a different approach that promises to turn just about any solid surface into a touch-sensitive input device. Books, tables, walls, hands and other body parts are all fair game.

In its current proof-of-concept iteration, which was prototyped at idea-rich Microsoft Research in Redmond, Washington, by PhD student Chris Harrison and his team, the rough-hewn, shoulder-mounted device resembles a sci-fi prosthetic weapon, but looks can be deceiving.
"We explored and prototyped a powerful alternative approach to mobile interaction that uses a body-worn projection/sensing system to capitalize on the tremendous surface area the real world provides," explains Harrison.

Like the proverbial "better" mousetrap, the concept of mobile interaction seems prone to constant tinkering. The OmniTouch draws on a blend of disciplines to overcome numerous issues that beset similar devices. Some approaches require placing markers on the fingertips but still can't discern whether the fingers are "clicked" (touching the surface) or merely hovering. Others can't "read" surfaces beyond those of the user's own body, or they lack the ability to respond to touch-and-drag motions.

To surmount these hurdles, Harrison and his colleagues combined a PrimeSense short-range depth camera with a Microvision ShowWX+ laser pico-projector. The camera generated a 320 x 240 depth map at a rate of 30 fps, even for objects as close as 8 inches (20 cm). The projector delivered a sharp, focus-free, wide-angle image independent of the surface's distance, a useful property in such applications. Both devices were then linked to a desktop computer.
The OmniTouch gets its edge in finger position detection through a complex series of calculations that begins with the generation of the depth map. The second video below contains a detailed description of the process, which enables the device to determine whether one's fingers are floating above a surface or actually contacting it. The resulting inputs closely approximate those of touchscreens and mice, so the possibilities for the OmniTouch are seemingly endless. Let's hope the wait for a commercial version isn't.
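The core idea behind that click-versus-hover distinction can be sketched in a few lines. This is a simplified illustration, not the team's actual pipeline: it assumes the fingertip has already been segmented from the depth map, and the threshold value and function names are hypothetical.

```python
# Hypothetical sketch of depth-based click vs. hover classification,
# in the spirit of OmniTouch: a fingertip counts as "clicked" when its
# measured depth is within a small tolerance of the surface behind it.

CLICK_THRESHOLD_MM = 10  # assumed tolerance; not taken from the paper

def classify_touch(fingertip_depth_mm, surface_depth_mm,
                   threshold_mm=CLICK_THRESHOLD_MM):
    """Return 'clicked' if the fingertip depth is within threshold_mm
    of the surface depth directly behind it, else 'hovering'."""
    if surface_depth_mm - fingertip_depth_mm <= threshold_mm:
        return "clicked"
    return "hovering"

# Synthetic example: a projected surface 500 mm from the camera.
print(classify_touch(495, 500))  # fingertip nearly flush -> clicked
print(classify_touch(430, 500))  # fingertip 70 mm above -> hovering
```

The real system runs this kind of test per frame across the whole depth map, which is what lets it also track drag motions once a finger is classified as touching.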
The paper, OmniTouch: Wearable Multitouch Interaction Everywhere, by Chris Harrison, Hrvoje Benko and Andy Wilson, was presented in the Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (Santa Barbara, California, October 16 - 19, 2011).
All images courtesy Chris Harrison.
