I recently came across a few articles featuring natural user interface (NUI) design that I found interesting. (Note: I’m currently doing contract work for the Microsoft Surface team.) The first article, in Tech. Rev. Mar 2010, discussed a way of improving the haptic feedback users get when manipulating buttons on small touchscreens, like the ones used in most smartphones. The touchscreen, designed by a company called Immersion, uses a piezoelectric material that vibrates 100 micrometers every millisecond as the user’s finger moves over a button and then stops once the finger is no longer on the button. With this small but fast movement, the user gets the illusion that the flat glass surface has a convex button on it.
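To make that pulse-while-on-a-button idea concrete, here is a minimal sketch of the logic. The driver functions poll_touch() and fire_piezo_pulse() are hypothetical stand-ins, as are the button rectangles; this is not Immersion’s implementation, just the behavior the article describes written out in code.

```python
# Minimal sketch: pulse the piezo layer while the finger sits on a button,
# go quiet as soon as it leaves. poll_touch() and fire_piezo_pulse() are
# hypothetical driver hooks, not a real API.
import time

BUTTONS = {
    "play": (100, 200, 180, 260),   # made-up button rectangles: (x0, y0, x1, y1)
    "stop": (200, 200, 280, 260),
}

def button_under(pos):
    """Return the name of the on-screen button under the finger, if any."""
    if pos is None:                  # no finger on the screen
        return None
    x, y = pos
    for name, (x0, y0, x1, y1) in BUTTONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def haptic_loop(poll_touch, fire_piezo_pulse):
    """Pulse the actuator while the finger is over a button; stop otherwise."""
    while True:
        if button_under(poll_touch()) is not None:
            # A tiny, fast displacement (the article cites ~100 micrometers per
            # millisecond) is what creates the illusion of a raised button.
            fire_piezo_pulse(displacement_um=100, duration_ms=1)
        time.sleep(0.001)            # poll roughly once per millisecond
```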

Improved haptic feedback is an extremely important advance for the design of automobile dashboards, where glancing down at the controls can lead to driver distraction and accidents. The same approach could bring haptic feedback to controls used in factories, power plants, aircraft, hospitals, and other settings where diverting your attention from the work surface to look at a control can be dangerous. Immersion is already licensing its technology to LG for smartphones.

[Image: LG Chocolate BL40 smartphone uses haptic technology]

This is very different from the traditional motorized vibration technology that most firms were pursuing a few years ago to give users feedback when they press an on-screen button correctly. That feedback is coarse, so while it informs the user that the button was pressed, it doesn’t create the illusion that the surface actually has a button. For more about this older style of feedback, see Tech. Rev. May 2007.

[Image: State of the art haptic feedback back in 2007]

****

Microsoft Research is working on a very interesting touchscreen painting application called Project Gustav. It combines texture mapping, color and transparency mixing, multi-touch input, and a 3D paintbrush-bristle deformation algorithm to create a realistic painting simulation. The effect is very impressive and could have lots of commercial applications. It will be interesting to see whether this technology makes its way out of the lab and into retail products soon.
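Microsoft hasn’t published Gustav’s internals, so the snippet below is only a toy stand-in for the “color and transparency mixing” piece: plain alpha-over blending of a semi-transparent brush dab onto a canvas pixel. The texture mapping and bristle deformation parts are left out entirely.

```python
# Toy color/transparency mixing: alpha-over compositing of a brush dab.
# Not Gustav's algorithm, just the simplest version of the idea.

def blend_over(brush_rgb, brush_alpha, canvas_rgb):
    """Composite a semi-transparent brush color over an opaque canvas color."""
    return tuple(
        brush_alpha * b + (1.0 - brush_alpha) * c
        for b, c in zip(brush_rgb, canvas_rgb)
    )

# Example: a 40%-opaque red dab over a white canvas pixel.
print(blend_over((1.0, 0.0, 0.0), 0.4, (1.0, 1.0, 1.0)))  # -> (1.0, 0.6, 0.6)
```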

[Image: An expensive finger painting simulator. Image from Microsoft]

Also from Microsoft Research, in cooperation with Carnegie Mellon University, is a touch interface called Skinput. It projects an image onto your forearm or hand and then detects the user’s finger taps acoustically, using sensors worn on the arm. Somehow it doesn’t seem very practical. Plus it seems a bit creepy. Check out the YouTube video.
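The Skinput team’s actual classification pipeline isn’t reproduced here; the sketch below only shows the generic first step any acoustic tap sensor needs, namely spotting a tap as a short burst of energy above the background level, demonstrated on synthetic data with a simple threshold.

```python
# Generic acoustic tap detection by short-window energy thresholding.
# Illustrative only; not the method used in the Skinput work.
import numpy as np

def find_taps(signal, frame=256, threshold=5.0):
    """Return sample indices of frames whose energy jumps well above the median."""
    n = len(signal) // frame
    energy = np.array([np.sum(signal[i * frame:(i + 1) * frame] ** 2) for i in range(n)])
    baseline = np.median(energy) + 1e-12      # avoid a zero baseline on pure silence
    return [i * frame for i in range(n) if energy[i] > threshold * baseline]

# Synthetic example: quiet background noise plus one sharp burst near sample 4096.
rng = np.random.default_rng(0)
sig = 0.01 * rng.standard_normal(8192)
sig[4096:4160] += 0.5 * rng.standard_normal(64)
print(find_taps(sig))                         # reports the frame starting at 4096
```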

[Image: Tap tap tap. Photo from Carnegie Mellon]

Finally, gamers everywhere are excited about Project Natal, the controller-less interface for the Xbox 360 that is slated to be released for the upcoming holiday season. Although most people have just been amazed that Project Natal works, I wanted to learn more about how it works. That is, how does the vision system convert 2D video of a person in motion into a 3D wire-frame model? Microsoft has been fairly quiet, either because they are still patenting the IP behind it or because they want to focus attention on the games, not the technology. Alas, the only article I could find was Pop. Sci. Jan 2010.

[Image: 3D wireframe generated from a 2D camera. Photo from Pop. Sci.]

[Update: Microsoft has released Project Natal under the trade name Kinect.]