
What is a Natural User Interface?

Summary

Natural User Interfaces (NUIs) allow interaction with displayed data in a way that mimics the physical world. They are enabled by motion- or speech-recognition devices, and when used properly they are more easily understood by novice users.

There's something appealing about the notion of being able to throw away your mouse and keyboard and interact with your computer directly. Memorable scenes in movies like Minority Report and Avatar suggest a future where, instead of laboriously clicking, dragging, and typing, information on a screen behaves like a physical object that can be manipulated with the hands and voice in ways that feel natural, and come naturally to the user. In other words, a Natural User Interface, or NUI.

While the idea itself is old, the technology to support NUIs, and the field of study around them, is relatively young. So there is no formal definition in place, and there are plenty of conflicting opinions. What follows is more or less my understanding of the salient features after doing a considerable amount of design for the iPhone and Android, which, along with the Xbox Kinect, are currently the best places to see Natural User Interface design at work. Not all of my work has been on NUIs, but it's given me plenty to think about.

[Image: Holotable interface from the movie Avatar]

The Problem with Mice

Think about the difference between drawing a picture on paper and drawing a picture in Adobe Photoshop. I think most people would agree that, artistic ability aside, it's harder to learn how to use Photoshop. But why? In Photoshop you can draw with a pencil, just like in the real world. You have an eraser too, and a pen, and as many brushes as you'd ever want. In fact you'll find that many of the tools in Photoshop are exactly the same as the ones you'll find at the art supply store, except in Photoshop these tools aren't real; they're just metaphors for the physical objects whose functions they mimic. They're buttons used to set the current mode for how the cursor affects the canvas. You still have to figure out how to use them even before you can figure out what to use them for.
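
To make that "current mode" idea concrete, here's a minimal sketch in Swift. The names (Tool, Canvas, handleDrag) are my own placeholders, not anything from a real drawing app: the point is only that a tool button never touches the canvas itself, it just changes which branch the next cursor drag takes.

    import CoreGraphics

    // Hypothetical sketch: in a GUI drawing app, a tool button doesn't act on the
    // canvas directly; it only sets a mode that changes what the cursor does next.
    enum Tool {
        case pencil, brush, eraser
    }

    final class Canvas {
        var currentTool: Tool = .pencil   // updated when a toolbar button is clicked

        // Every cursor drag is routed through the current mode.
        func handleDrag(at point: CGPoint) {
            switch currentTool {
            case .pencil: drawStroke(at: point, width: 1)
            case .brush:  drawStroke(at: point, width: 8)
            case .eraser: clear(at: point)
            }
        }

        private func drawStroke(at point: CGPoint, width: CGFloat) { /* paint pixels */ }
        private func clear(at point: CGPoint) { /* erase pixels */ }
    }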

User interfaces have always relied on metaphors. We like to draw comparisons between real-world things or activities and some computer function that serves the same role. For instance, the Windows desktop is a metaphor for a flat physical space on which one expects to find paper documents, and the MacOS Dock is a metaphor for a shelf where interesting objects are pulled out for closer examination. Even the concepts of directories (or folders) and files are relics of path-dependent design approaches from the days when filing cabinets were the height of information storage and retrieval technology. If you stop to think about it, almost everything you interact with in a GUI environment is a metaphor for something you might interact with in the real world.

And on the whole, our use of physical objects as analogies for digital ones has served us fairly well, allowing new computer users to understand more quickly the purpose of the things these icons and interfaces represent.

However, while we've always seen physical objects on our screens, we've not until recently been able to treat them as such. Even if your on-screen document looks like a piece of paper, you have to know what a mouse and scroll bar are for in order to read it. Nobody needs to be taught how to turn the pages of a book, but using a mouse even for simple activities requires some learning.

The lesson is: it's hard for an interface to seem natural if you interact with it in a totally artificial way.

Natural User Interfaces address this by mapping intuitive physical interactions to commands. That means that things in a NUI behave the way you'd expect them to behave given how they are depicted on screen. If you see a ball, it probably rolls when you push it. If you see a stack of photographs, you can expect to be able to rearrange the contents by pulling photos out of the stack and laying them out. The object represented on screen looks like something (a book) which affords some activity (opening) which in turn suggests the proper gestural command (touching the edge of the cover and pulling it open with a swipe).
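
As a rough illustration of that mapping on a touch device, here's a short sketch using UIKit's pan gesture recognizer. The class name and image asset ("PhotoStackViewController", "photo") are placeholders I've made up for the example; what matters is that the photograph simply follows the finger, so the gesture you perform is the same motion you'd use on a real print pulled from a stack.

    import UIKit

    // Sketch: a photo that can be dragged straight out of an on-screen stack.
    // No cursor, no scroll bar; the object moves exactly as far as the finger does.
    final class PhotoStackViewController: UIViewController {
        private let topPhoto = UIImageView(image: UIImage(named: "photo"))

        override func viewDidLoad() {
            super.viewDidLoad()
            topPhoto.isUserInteractionEnabled = true
            view.addSubview(topPhoto)

            // Map the physical action (dragging) directly onto the depicted object.
            let drag = UIPanGestureRecognizer(target: self, action: #selector(dragPhoto(_:)))
            topPhoto.addGestureRecognizer(drag)
        }

        @objc private func dragPhoto(_ gesture: UIPanGestureRecognizer) {
            let translation = gesture.translation(in: view)
            topPhoto.center.x += translation.x
            topPhoto.center.y += translation.y
            gesture.setTranslation(.zero, in: view)   // reset so each callback is incremental
        }
    }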

One last thing: a Natural User Interface isn't simply a traditional GUI that's so good it 'just works', though of course that's what all interfaces should strive for. The distinction between a really good GUI and a NUI is the degree of abstraction in the input. No matter how well-chosen the icons, how suggestive the visual hierarchy, how thoughtful the menu system, you're still moving a cursor around and clicking buttons on a screen, which is (almost certainly) an artificial way of representing whatever actions are supposedly taking place on screen. Likewise, a gestural interface isn't automatically a NUI simply by virtue of being touchable. If an iPad or Android app only uses touch to replace a point-and-click interface, it's not a NUI.

Okay, should I throw away my mouse and keyboard?

Well, not exactly. I don't think NUIs will replace all current methods of interacting with computers. For one thing, if you need to touch something in order to interact with it, things on screen need to be fairly big, so you can't cram a lot of stuff onto a NUI. For another, not all software is made up of faux-real objects; some activities are simply abstract. For this reason, it's hard to imagine NUIs replacing a command-line interface, where high information density and expertise are called for rather than user-friendliness. I do not expect to find my colleague Bill using a NUI to grep for a regular expression any time soon.

Likewise, skilled users will always find a mouse and keyboard faster than a wave of the hand once they've mastered how to use them properly, so the GUI probably has a long future ahead as well.

But novices and intermediate users will benefit by being able to perform relatively simple tasks with, at long last, a relatively simple interface. There are a large number of problems for which NUIs happen to be a preferable solution to what we're used to, and as a designer I'm excited to see the ways this new technology can be put to work.
