As we move toward embodied interaction, we maintain the basic principles of good interaction design: the user initiates interaction with some form of action, and the system responds or changes state as the user intended. However, embodied interaction tends to be designed with very broad and varied ways in which the user is expected to act to initiate interaction, and this iterative action-response loop must be discovered and learned. For example, laptop and touchscreen interactions are ubiquitous enough that there are established guidelines and design patterns that designers adhere to (Norman, 1983). These patterns and guidelines give users certain expectations of how a system might work even before they begin interacting with it. Embodied interaction, by contrast, is relatively new and does not yet have a coherent set of consistent design patterns. We therefore transition from an expectation of consistent and effective interaction design using keyboard, mouse, and display toward novel interactive systems in which the user explores and learns which actions lead to expected changes in the state of the system. We propose that HCI is a cognitive process in which the user's mental model is the basis for their exploration and use of the interactive system: users decide how to interact on the basis of expectation and prior experience, and the affordances of the specific interactive system modify that mental model.
As Dourish (2001) argues, when users approach an embodied interactive system, they must construct a new understanding of how it works on the basis of their physical exploration. Different people have unique experiences and expectations, which affect the way they initially explore a system and, ultimately, the mental model they construct of how it works (Dillon, 2003). Embodied interaction has been used to describe users' interactions with a wide range of interactive technologies, including tangible and gesture-based interfaces.
We posit that good tangible and gesture interaction design depends on an understanding of the cognitive issues associated with these modalities. We organize these issues into four categories: embodiment, affordances, metaphor, and epistemic actions. These four categories can serve as clues that the designer gives the user to aid in understanding how the interactive system is to be operated. If these concepts are integrated into the design process, the user's mental model and knowledge can be activated and extended as they try to use and understand the interactive system. While these cognitive issues require further exploration and empirical validation (Antle et al., 2011), we present specific projects that explore various aspects of embodied HCI.
1.1 EMBODIMENT
Interaction through tangible and gesture-based systems is intrinsically embodied, and therefore decisions about which embodied actions can be recognized by the interactive system are part of the design process. Human gestures are expressive body motions involving physical movements of the fingers, hands, arms, head, face, or body that may convey meaningful information or be performed to interact with the environment (Dan and Mohod, 2014). Designing embodied interaction is not just about designing computational capability, but also about designing for human experience and anticipated human behavior.
Research has shown that gestures play an integral role in human cognition. Psychologists and cognitive scientists have explored the relationship between gesture and thought for several decades. McNeill (1992, 2008) explains that gesture and thought are tightly connected, and he also establishes a categorization of gestures and their role in human cognition and communication. There is evidence that gesturing aids thinking. Several studies have shown that learning to count is facilitated by touching physical objects (Efron, 1941; Kessell and Tversky, 2006; Cook et al., 2008). Kessell and Tversky (2006) show that when people are solving and explaining spatial insight problems, gesturing facilitates finding solutions. Goldin-Meadow and Beilock (2010) summarize the findings: "gesture influences thought by linking it to action" (p. 669), "producing gesture changes thought" (p. 670), and gesturing can "create new knowledge" (p. 671). These studies show that gesture, while originally associated with communication, is also related to thinking. Embodied interaction design creates an environment that is activated by gesture and actions on objects, and therefore induces cognitive effects that traditional user interaction does not.
One challenge of embodied interaction is that, while it is built upon natural actions, it still requires some level of discovery, especially in the case of public displays. Tangible and gesture-based interaction designers consider both the integration of technology and its effects on human experience. The major consideration that has emerged to influence tangible design is the physical embodiment of computing. Interaction design is no longer only screen-based digital interaction. Tangible interaction designers should think about physical, graspable objects that give cues for understanding and provide the basis for interaction. Gesture interaction designers should think about how various human movements can be recognized and interpreted in the context of changing the state and response of the computational system, as the sketch below illustrates. Interactive platforms can be interpreted as spaces to act and move in, and they effectively determine interaction patterns.
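To make this concrete, consider a minimal, hypothetical sketch of how recognized gestures might be interpreted in the context of a system's current state. The gesture names, states, and transitions here are illustrative assumptions, not drawn from any particular toolkit or from the projects described in this book.

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveSystem:
    # Current state of a hypothetical gesture-controlled display.
    state: str = "idle"
    # Each recognized gesture maps to a transition: (required state, next state).
    transitions: dict = field(default_factory=lambda: {
        "wave":  ("idle",     "active"),    # waving wakes the display
        "swipe": ("active",   "browsing"),  # swiping pages through content
        "grasp": ("browsing", "holding"),   # grasping selects an object
    })

    def on_gesture(self, gesture: str) -> str:
        """Interpret a recognized gesture in the context of the current state."""
        required, nxt = self.transitions.get(gesture, (None, None))
        if required == self.state:
            self.state = nxt  # the system responds as the user intended
        # Out-of-context or unrecognized gestures leave the state unchanged;
        # this is exactly what the user must discover through exploration.
        return self.state

system = InteractiveSystem()
print(system.on_gesture("wave"))   # -> "active"
print(system.on_gesture("grasp"))  # -> "active" (no effect in this state)
```

The point of the sketch is that the same gesture can mean different things, or nothing at all, depending on the system's state; the user's mental model of these state-dependent mappings is what must be discovered and learned.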
Dourish (2004) explores the physicality of embodied interaction and its effect on moving human-computer interaction toward more social environments. He describes an approach to embodiment grounded in phenomenology, and claims that any understanding we have of the world is the result of some initial physical exploration. Embodied interaction is about establishing meaning, and it is through embodied interaction that we develop an understanding of the meaning of the system. As users construct their mental model, they are influenced by the phenomena they are experiencing at that moment as well as by their prior experiences and understanding of how technology works.
In this book, we take a cognitive view of embodied interaction design: discovering the interaction model relies on pre-existing mental models derived from physical experiences, and executing intentional physical movements during interaction has an effect on cognition. We demonstrate and elaborate on this view of embodiment through four projects, describing for each the gestures that enable interaction, the design methods, and the usability issues.
1.2 AFFORDANCE
The concept of affordance originated with Gibson (1982) and was introduced to the HCI community by Norman (1988). According to Norman (1988), an affordance is the design aspect of an object that allows people to know how to use it and that gives a clue to its function. Norman discusses affordances as properties of an object that invite specific actions: a handle affords holding and turning, and a button affords pressing; such properties make an object's function clear. Tangible interaction design is arguably more influenced by physical affordances than visual or gesture interaction design is.
TUIs change the way we interact with digital information, with physical affordances that are distinctly different from pointing and keyboard/mouse interaction. According to Wang et al. (2002), there are two advantages to tangible interaction: first, it allows direct, naïve manipulability and intuitive understanding; second, the sense of touch provides an additional dimension. The tactile feedback afforded by TUIs is consistent with the early empiricist argument that kinesthetic information provides us with the ability to construct a spatial map of objects that we touch (Lederman and Klatzky, 1993; Loomis and Lederman, 1986). Fitzmaurice (Fitzmaurice, 1996; Fitzmaurice and Buxton, 1997) demonstrated that having multiple graspable interactive devices encourages two-handed interaction that calls upon everyday coordination skills. Leganchuk et al. (1998) explored the potential benefits of such two-handed input through experimental tasks, finding that bimanual manipulation may bring two types of advantages to HCI: manual and cognitive. Two-handed interaction doubles the degrees of freedom simultaneously available to the user and reduces the cognitive load of input performance.
The potential affordances of TUIs, such as manipulability and physical arrangement, may reduce the cognitive load associated with spatial reasoning, resulting in enhanced spatial cognition and creative cognition. Brereton and McGarry (2000) studied the role of objects