Siri is one of my favorite features on the iPhone and one that I use incredibly frequently. The tool is particularly useful to me when I am driving, when my vision and motor ability are occupied; in that situation, my voice is the only safe and available way to interact with my phone. And I'm not the only one who finds Siri useful. Voice-controlled interfaces have become increasingly popular in recent years, in other forms such as smart home devices. However, such technology inadvertently excludes the deaf and hard of hearing. Spoken language has been built into these interfaces as the standard, and yet gestural language, broadly used and developed by the deaf and hard of hearing, has not. Considering that roughly 20% of Americans have a disability, why does technology continue to exclude individuals of differing abilities? By perpetuating the language of the able-bodied and privileged, technology continually pushes disabled individuals to the margins.
In a recent project, I demonstrated what it would look like for American Sign Language to be integrated into Siri. By utilizing the TrueDepth camera system, technology already developed and used in Apple products, the user could communicate with Siri through gestural language. The system uses sensors, cameras, and a dot projector to create a detailed 3D map of the user's face; the same technology could be extended to recognize hand gestures, much as we already see with hand tracking on the Oculus Quest virtual reality headset. In navigating this project, I found myself pondering what other existing technologies could be used to aid disabled individuals but are not.
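To make the gesture-recognition idea concrete, here is a minimal sketch of how a system might classify a hand pose once a depth camera has produced landmark coordinates. Everything here is hypothetical: the template positions, the gesture names, and the idea of comparing fingertip positions by nearest-template distance are illustrative assumptions, not how Apple's or Oculus's trackers actually work.

```python
import math

# Hypothetical templates: each gesture is five fingertip (x, y)
# positions relative to the wrist, as a hand tracker might report.
GESTURE_TEMPLATES = {
    "open_hand": [(-0.40, 0.90), (-0.15, 1.00), (0.00, 1.05), (0.15, 1.00), (0.35, 0.85)],
    "fist":      [(-0.20, 0.30), (-0.10, 0.35), (0.00, 0.40), (0.10, 0.35), (0.20, 0.30)],
}

def template_distance(a, b):
    """Sum of Euclidean distances between corresponding fingertips."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def classify(landmarks):
    """Return the template gesture closest to the observed landmarks."""
    return min(GESTURE_TEMPLATES,
               key=lambda name: template_distance(landmarks, GESTURE_TEMPLATES[name]))

# A noisy observation close to the open-hand template.
observed = [(-0.38, 0.88), (-0.14, 0.98), (0.01, 1.02), (0.16, 0.99), (0.33, 0.86)]
print(classify(observed))  # open_hand
```

A real ASL interface would need far richer input, including motion over time and both hands, but the core idea is the same: the 3D map already captured for Face ID contains the geometry needed to match gestures against known signs.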
In consideration of other disabilities, I would also redesign Siri to use a haptic feedback system to communicate braille, technology that is already being researched at Bayreuth University in the form of a hands-free display.
I believe that because we are social creatures, we have developed the ability to connect by any means necessary. As Haben Girma says in her talk Universal Benefits of Accessibility, "humans are incredibly creative, we design new ways for each of us to connect and engage and share information" (6:27). She continues by explaining that designing with accessibility in mind is not only beneficial to end users with disabilities, but also to able-bodied users and to those creating the products. Even while writing this post, the transcript provided with the video allowed me to quickly reference quotes and time stamps that were particularly meaningful to me. Accessibility gives users flexibility in how they work, providing multiple pathways to accomplish their tasks and goals and to find information. In doing so, products become usable to a broader audience, benefiting the corporations that produce them.
One point from Girma's talk that I found particularly impactful was that disability is designed. Cities have been especially notorious for actively excluding individuals of differing abilities, because those designing them did so from their own very limited perspectives. This notion is reminiscent of the debate over ethics in artificial intelligence today, wherein the biases reflected in technology are instilled by its creators. I believe that perpetuated biases in design not only exclude those with disabilities, but also inadvertently pigeonhole able-bodied individuals into communicating, connecting, and engaging in a set number of ways. It is important, then, that we acknowledge our responsibility as designers to ensure that the perspectives of all are heard and inform our work.