Movement-based interaction with handheld devices

Sile O'Modhrain

Queen's University Belfast, Northern Ireland

The current convergence of handheld digital devices and low-cost sensing technologies is creating an opportunity to rethink the way in which such devices are controlled. As motion sensors find their way into low-cost cameras, phones, and PDAs, the challenge for designers is to take advantage of motion and gesture as components of interaction design.

Given that mobile devices are used in a wide variety of contexts, one question that arises is whether the ability to carry out actions requiring fine motor control, such as menu navigation, is affected by simultaneous engagement in body-scale cyclical motion such as walking or running. In a recent study, Crossan et al. (2005) showed that when navigating a menu on a mobile device while walking, people were more accurate and more likely to tap items during the downward phase of the gait cycle. Given this knowledge, can the rhythm of such motion become an integral part of the gestural control of these devices? Lance et al. (2004) suggest that rhythmic gestures may be highly effective for such applications because they are easier for users to reproduce and hence provide more robust data for model-based recognition systems.
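As a concrete illustration, the sketch below estimates the phase of the gait cycle from a vertical accelerometer stream so that selection events could be aligned with the downward phase. It is a minimal Python sketch, not code from the studies cited above: the sampling rate, the smoothing window, and the heuristic of treating above-gravity vertical acceleration as the downward phase are all illustrative assumptions.

    import math

    # Illustrative parameters -- assumed values, not from Crossan et al.
    SAMPLE_RATE_HZ = 50    # assumed accelerometer sampling rate
    SMOOTH_WINDOW = 5      # moving-average window, in samples
    GRAVITY = 9.81         # m/s^2, used to remove the static component

    def moving_average(samples, window):
        """Smooth raw samples with a simple moving average."""
        out = []
        for i in range(len(samples)):
            lo = max(0, i - window + 1)
            out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
        return out

    def downward_phases(vertical_accel):
        """Return one boolean per sample: True while vertical acceleration
        exceeds gravity, taken here (an assumption) as a proxy for the
        downward phase of a step."""
        smoothed = moving_average(vertical_accel, SMOOTH_WINDOW)
        return [a > GRAVITY for a in smoothed]

    if __name__ == "__main__":
        # Synthetic walking signal: a ~2 Hz step cycle on top of gravity.
        t = [i / SAMPLE_RATE_HZ for i in range(200)]
        accel = [GRAVITY + 2.0 * math.sin(2 * math.pi * 2.0 * ti) for ti in t]
        phases = downward_phases(accel)
        # A menu controller could, for example, weight or confirm taps
        # that arrive inside a downward-phase window.
        print(f"{sum(phases)} of {len(phases)} samples in a downward phase")

A controller built on such an estimate could bias target activation toward downward-phase windows, exploiting the moments when users are naturally more accurate rather than fighting the rhythm of walking.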

In the Body Mnemonics project (2003), our approach was to move away from menu-driven interaction and to design an entirely new form of gestural interface for a portable device, in which the body space of the user becomes a means of organizing the information stored on the PDA. Information is stored and retrieved by moving the device to locations in the body space. One open question is to what extent people modify the speed of these gestures when using the system while walking, and to what extent the rhythm of walking further shapes the temporal characteristics of the gestures.
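To make the interaction concrete, the following minimal Python sketch matches a sensed device pose against calibrated signatures for named body locations and retrieves the item stored there. It is not the Body Mnemonics implementation: the location names, the accelerometer signatures, the stored items, and the use of a single static reading (a real system would match whole gesture trajectories) are all illustrative assumptions.

    import math

    # Calibrated mean accelerometer readings (x, y, z in m/s^2) recorded
    # while holding the device at each body location -- invented values
    # for illustration only.
    BODY_LOCATIONS = {
        "hip":      (0.0, -9.8, 0.0),
        "heart":    (0.0, -6.9, 6.9),
        "shoulder": (0.0,  0.0, 9.8),
    }

    # Hypothetical items the user has filed at each location.
    STORED_ITEMS = {
        "hip": "wallet / payment app",
        "heart": "favourite contacts",
        "shoulder": "voice memos",
    }

    def nearest_location(reading):
        """Classify one accelerometer reading by Euclidean distance to
        the calibrated signature of each body location."""
        return min(BODY_LOCATIONS,
                   key=lambda loc: math.dist(reading, BODY_LOCATIONS[loc]))

    if __name__ == "__main__":
        # Simulate bringing the device to roughly chest height.
        reading = (0.3, -6.5, 7.1)
        loc = nearest_location(reading)
        print(f"Device at '{loc}' -> retrieve: {STORED_ITEMS[loc]}")

The nearest-neighbour matching shown here is the simplest possible recognizer; it also makes clear why walking matters for such a system, since gait-induced accelerations would perturb exactly the readings on which the classification depends.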

In summary, the work reviewed here illustrates the potential of movement as a rich source of information in interaction with portable devices. However, it is clear that we are only beginning to understand the complexity of motion, particularly as it interacts with the varying contexts in which our devices are used. In this respect, we have much to learn both from other fields of research, such as perception and robotics, and from practitioners of the performing arts, such as music and dance.

Thus, future paths for this work will include exploration of gestural control in the context of audio mixing as well as further development of movement-based interfaces for portable devices.