
When I first found out that Oculus Quest offered hand tracking - though at the time this was limited to the Unity editor - I felt I had to take the opportunity to test what it could do. I wanted to explore its potential for using sign language as input instead of the standard VR controllers. This desire comes from my experience as a Deaf person whose first language is British Sign Language: many game developers assume that adding a basic subtitle system is enough to make a game accessible to Deaf players. Sign language is a standalone language, separate from the surrounding spoken language, with its own grammar and vocabulary that have no 1:1 analogue in written text. I wanted to see if I could push game accessibility for Deaf signers further.

To achieve this, I used InteractML (https://interactml.com/), a machine learning plug-in for Unity, to train the game to recognise specific British Sign Language signs as inputs. Although InteractML was still in alpha and had some rough edges, it proved highly useful and brought me much closer to my goal of integrating sign language into gaming in a meaningful way: not simply animating NPCs or "translating" from spoken language, but using sign language as an input method and, in multiplayer settings, as a means of communication for Deaf users.
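To give a rough sense of how hand data can reach a recogniser, here is a minimal sketch of the kind of feature extraction involved, assuming the Oculus Integration's OVRHand and OVRSkeleton components. InteractML handles the actual training and matching, and its API is not shown here; the HandPoseFeatures class and GetFeatureVector method are illustrative names of my own, not part of either library.

```csharp
// Hypothetical sketch: flatten the tracked hand bone positions into a feature
// vector, expressed relative to the hand root, so a gesture recogniser (such
// as one trained with InteractML) can compare them against recorded BSL signs.
using System.Collections.Generic;
using UnityEngine;

public class HandPoseFeatures : MonoBehaviour
{
    [SerializeField] private OVRHand hand;          // Oculus Integration hand component
    [SerializeField] private OVRSkeleton skeleton;  // exposes the tracked bone transforms

    // Returns bone positions in the hand's local space, or null if tracking
    // data is not available this frame.
    public List<float> GetFeatureVector()
    {
        if (hand == null || skeleton == null || !hand.IsTracked || !skeleton.IsDataValid)
            return null;

        var features = new List<float>();
        foreach (OVRBone bone in skeleton.Bones)
        {
            // Express each bone relative to the hand root so the same sign is
            // recognised regardless of where the player holds their hand.
            Vector3 local = skeleton.transform.InverseTransformPoint(bone.Transform.position);
            features.Add(local.x);
            features.Add(local.y);
            features.Add(local.z);
        }
        return features; // fed to the trained model each frame
    }
}
```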

Because Oculus Quest's hand tracking is playable only in Android device builds or Unity's Play Mode, while InteractML can only be used in desktop builds or Play Mode, I am unable to provide a playable demo for you to experience the game yourself at this time.

Instead, I hope the following video of the demo shows the potential I see for the future of accessibility in gaming.

The video is a recording of my presentation on this prototype, which I devised and built for my MA thesis: