Interaction design is almost always a synthesis of traditional methods and approaches from varied established disciplines. When I write about interaction, most people reading it view it in the context of software or some form of digital technology.
“Interaction” isn’t only about technology or software. Industrial designers are taught to design ‘things’ that engage people and facilitate their relationships with those things.
These days, those things can range from information, objects, and activities to services and systems: a very broad range of activities that involve interaction design. With the growing convergence of physical form and computing, an entirely new style of digital interaction design is emerging, an area called “tangible” interaction design. Here, designers seek to create objects or artifacts that interact with people, react to them, and behave with them. This is not just about computing ability or adding intelligence to objects; the focus is always human experience and behavior. Tangible interaction designers work on the integration of technology and its effects on human experience.
Design has always been about interaction, and interaction is something tangible. Over time, one major factor that has emerged to influence tangible design is the growing physical embodiment of computing. Again, the iPhone is a good example of just such a physical embodiment of computing. It’s really quite simple: as a designer of physical things, you must now decide whether to embed software in the ‘thing’, something that’s very common these days. As designers and writers of software, we must consider and use the limitations and affordances of the real world. New paradigms of interaction are emerging because of the instantiation of such technology. Integrated form and computing will enhance our experiences with objects, systems, and the very places we inhabit. We’ll rapidly be living in a world of adaptive, responsive, real-world physical objects that invite interaction. Objects will be more ‘alive’ than ever, and you may never look at your toothbrush the same way again once it interacts with you.
Embedded computing forces a change in the direction of design, simply by making possible objects that couldn’t have been created a decade ago. Interaction design is not just screen-based digital interaction anymore. Tangible interaction is the physical embodiment of computation. Tangible interaction designers must use traditional interaction design, engineering, computing, and robotics in a mash-up of skills and methods. We must start to think and make in physical form, electronics, and software. We must work across the old disciplinary boundaries of form and computing.
Form is an important element in design, and similarly so in tangible interaction. Form visually communicates and physically represents a ‘thing’s functionality; it gives cues for understanding and provides the basis for interaction. Form connects with computing through sensors and effectors.
Sensors provide input. The simplest sensor is a switch; buttons dominate our interaction with electromechanical products. There are loads of sensors out there: temperature, movement, pressure, force, moisture, chemicals, stretch and strain, and more. Effectors provide output. LEDs, used as indicators, signal an electromechanical device’s state. There are audio indicators too: beeps, bleeps, dings, and buzzes. We could do so much more with sound design. Motors are effectors as well, providing motion and other physical action. For example, vibration motors in cell phones bring a physical quality to digital interaction.
The new element in design is software, a fundamentally abstract and disembodied way to prescribe behavior. Earlier, designers made decisions about physical form and materials to govern interaction with the designed ‘thing’. Now the designer must also script the thing’s interactive behavior. The simplest program relates inputs directly to outputs (“when this lid is opened, the lights go out”). More sophisticated software is a subtler model of the design in use.
Artists, hobbyists, and DIY hackers have built entirely new hardware toolkits and platforms that make it easier to build and program working prototypes of products with embedded electronics. Today there is the Arduino family of microcontroller boards, including the LilyPad, which is engineered for embedding in textiles. Hardware design environments such as Fritzing, and a host of programming environments such as Pd and Funnel, invite designers to play and get their hands dirty with electronics and code. Such toolkits and platforms are critical to moving tangible interaction design forward; we’ll see more of them emerge as these products and experiences become mainstream.
As a learning designer, I’m challenged to imagine a world where every object is capable of providing information about itself, and doing so interactively. This would change the way we design and deliver learning. We’ve always treated learning as separate from the objects and environments it is about: we learn to use something. While we accept that learning is enhanced by interaction, how will tangible interaction change things? What if every object came embedded with the ‘training’ required to use it, easily accessed through simple interaction with the object itself, perhaps via a ‘learning mode’? As learning designers, we need to force ourselves out of imagining interaction as only human-mediated or digital; there’s a new type of interaction we’ll need to leverage soon: tangible interaction.