My Dr. Dobb’s article on multi-touch, multi-user, gesture-based design

[Infragistics] Joel Eden / Monday, April 27, 2009

A few weeks ago, Dr. Dobb’s published a pretty lengthy (and informative) article of mine titled “Designing for Multi-Touch, Multi-User, and Gesture-Based Systems.”


In my opinion, it’s very easy to get caught up in new technologies, new features, etc., to the point of forgetting that the systems we design need to be, first and foremost, about helping people accomplish things. In other words, any visual or interaction “aesthetics” should be added in a way that improves usefulness and usability, rather than just because it might be fun for the designer or developer to try them out. So in this article I really tried to stress how what we already know about designing systems with great UX (i.e., useful, usable, and desirable) can serve as a strong foundation for new types of systems, including these gesture-based systems that seem to be a new breed.


One major theme of the article (I don’t want to give the whole thing away, though, so do read the article) is that the real world is already multi-touch, multi-user, and gesture-based. Designing systems like these therefore has a lot to do with looking around at seemingly mundane everyday activities, trying to understand how we already accomplish them, and then leveraging what we notice in our designs.

You already hear this all the time in UX anyway (e.g., observe users doing real things, rather than relying on user preferences and what they think they need), but it’s especially important when moving to these different types of systems, because it’s just too easy to get caught up in the excitement of the new technology.


I spent a lot of time in grad school researching distributed cognition and the extended mind (newer views of cognition that basically oppose the idea that cognition and the mind reside only in the physical brain), and I tried to put some of that into the article. I included a little bit about how gestures and material artifacts can actually be thought of as first-order cognitive resources, a role that older views of cognitive science reserve for the brain alone. Here’s a related quote from the article that I think is interesting, not just because I wrote it, but because someone else who read it noticed it and blogged about it:


“If gestures are already so much a part of our cognitive processing then in some ways, the growing excitement around gesture-based systems is a sign that software systems are finally catching up to how we already think and behave, rather than really representing an innovative way of interacting with information.”


Dr. Dobb’s tells me that lots of people have already read the article, so if you haven’t read it yet, please do, and then come back here to let me and others know what you think about it (I don’t think Dr. Dobb’s has a comments section on the actual article page).