In my interaction design studies at California College of the Arts, I find myself designing on and for physical screens. I ponder the size of each icon (can a user easily tap it?), the micro-interactions lacing a scroll, pinch, or press-and-hold, the adjustments from screen to screen (responsive design), and more. All of these activities rely on a physicality that seems like nothing more than common sense, and yet, day by day, it also seems to edge closer to becoming a thing of the past.
That’s not to say that alternative forms of interaction (such as gesture design) are anything new. People were experimenting with gesture-based technology as early as the 1980s, with IBM designing a camera-based web interface in the 90s. Thirty years later, in 2010, Microsoft released the Kinect, one of the most responsive motion-sensing devices ever built for a gaming console (though it was short-lived).
Gesture-based technology was a part of my life early on in the form of the video game Just Dance. I played it on the Wii; as I followed the choreography of each song, the system assessed whether my gestures matched those of the character on-screen. I’ve actually experimented with this type of design myself in a college project: a partially hands-free hair tutorial app, aptly named Styler. It’s telling, to me, that the interaction design curriculum here at CCA already includes this type of design.
Looking forward, I believe gesture design (and, by extension, “screen-less interfaces”) will only grow more widespread as time passes, moving beyond niche applications and into everyday life. In Enchanting Everyday Objects, David Rose offers insight into interactions that do not require a screen but still rely on touch:
“By tangible I mean that the interaction between human being and object does not require a screen. Instead, interfaces rely on gesture, tactility, wearables, audio, light and pattern, and haptics—the use of touch…these non-screen-based interfaces will change the fabric of the ways in which we live, work, and play.”
(Rose 91)
In Animism: Living with Social Robots, Rose mentions gestures again as a means of communication between people and robots: “The robot of your imagination understands human speech (and your particular language and accent) as well as the meaning of gestures, and it can speak and gesture in response to you” (65). It’s an interesting thought: robots as an interface. Gestures as language and input. Already we find ourselves inching in this direction: the Metaverse is perhaps the most obvious harbinger of this new way of designing, interacting, and existing. This, I believe, is what awaits us.
Sources:
Gesture Control Technology (University of Birmingham): https://intranet.birmingham.ac.uk/it/innovation/documents/public/Gesture-Control-Technology.pdf