You, Me, and Accessibility in 3D Pt. 3: Designing Inclusive Spatial Interactions and Visual Cues
At CARNEVALE, we aspire to design and develop experiences that diverse people, not an imagined average user, can use equitably. When creating experiences that take place in immersive 3D environments, there are new considerations, challenges, and opportunities to ensure we’re including as many users as possible.
By Caitlyn Haisel, Design Director at CARNEVALE
Visual information now exists in three dimensions, and many elements may exist along axes that extend far beyond what a user can see in a single moment. Users need context and guidance to understand the space they are navigating.
We had conversations with people from diverse backgrounds and asked questions like: How do we design the environment to convey depth and provide spatial information? How can we prompt a user to explore their environment? How can we ensure users who are deaf or hard of hearing are included?
Off-Screen Cues
In an experience with interaction points spanning a potential 360°, there are likely sounds, elements, and actions outside the user’s view at any given moment. Audio, visual, or haptic cues should help the user understand their surroundings.

For anything communicated through audio, there should be a method to receive the same information without audio at all. Translating audio information into visual cues is particularly useful for people who are deaf or hard of hearing, but it also helps users who are in a loud environment, recovering from an ear infection, or living with an auditory processing disorder.
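As a minimal sketch of how such a cue might be driven, the function below checks whether a sound source falls outside an assumed field of view and returns the direction an edge indicator (an arrow, a glow, a pulsing icon) should point. The vector helpers and the 60° half field of view are illustrative assumptions, not any particular engine’s API.

```typescript
// Sketch: decide whether a sound source is off-screen and where an edge
// indicator should point. Vec3 helpers and the FOV value are assumptions.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const normalize = (a: Vec3): Vec3 => {
  const l = Math.sqrt(dot(a, a));
  return { x: a.x / l, y: a.y / l, z: a.z / l };
};

const HALF_FOV_RAD = (60 * Math.PI) / 180; // assumed horizontal half field of view

/** Angle (radians) between the view direction and the source, whether the
 *  source is outside the assumed field of view, and the direction to it. */
function offScreenCue(headPos: Vec3, viewDir: Vec3, sourcePos: Vec3) {
  const toSource = normalize(sub(sourcePos, headPos));
  const cos = Math.min(1, Math.max(-1, dot(normalize(viewDir), toSource)));
  const angle = Math.acos(cos);
  return { angle, isOffScreen: angle > HALF_FOV_RAD, direction: toSource };
}
```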
Visual and Haptic Guided Cues
Alongside visual indicators, non-audio cues like haptics are powerful tools for guiding the user’s attention. For example, an arrow pointing to an interactive object can be paired with haptic feedback to direct the user toward it.

Haptics convey information without further inundating the user with visual detail. Where translating sound into visual cues helps deaf or hard-of-hearing users, haptic cues provide useful information and context to users with visual disabilities.
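To illustrate how an arrow and haptics might be paired, this sketch pulses a controller more strongly the further a target drifts from the user’s view. The pulse() shape follows the Gamepad Haptics API, but the actuator wiring, dead zone, and duration are assumptions.

```typescript
// Sketch: pair the off-screen arrow with a haptic pulse whose intensity grows
// as the target moves further from the view direction. Builds on offScreenCue
// above; the 0.1 dead zone and 100 ms duration are illustrative.
function guideWithHaptics(
  actuator: { pulse: (value: number, durationMs: number) => Promise<boolean> },
  angle: number // radians between view direction and target, from offScreenCue
): void {
  // Map 0..π to 0..1 so a target directly behind the user pulses hardest.
  const intensity = Math.min(1, angle / Math.PI);
  if (intensity > 0.1) {
    void actuator.pulse(intensity, 100);
  }
}
```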
Proximity-Based Sizing
The sizing of elements, like text, could be dynamic. Instead of scaling elements to fit a static environment, sizing could depend on the user’s proximity. For example, text would render larger when viewed from far away so it remains legible, then scale down to an appropriate size as the user approaches.

This also allows for customization and magnification. We could give users control to define whether they need larger visuals (and to what degree), and the environment and content could react in real time to their proximity.
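A minimal sketch of that behavior: scale text in proportion to distance so it keeps a roughly constant apparent size, multiplied by a user-chosen magnification. The reference distance, clamps, and defaults are illustrative assumptions.

```typescript
// Sketch of proximity-based sizing. Constants are illustrative assumptions.
const REFERENCE_DISTANCE_M = 1.5; // distance at which text renders at base size
const MIN_SCALE = 1.0;
const MAX_SCALE = 8.0;

/** Scale factor that keeps text at a roughly constant apparent size,
 *  multiplied by the user's magnification preference. */
function proximityScale(distanceM: number, userMagnification = 1.0): number {
  const scale = (distanceM / REFERENCE_DISTANCE_M) * userMagnification;
  return Math.min(MAX_SCALE, Math.max(MIN_SCALE, scale));
}

// Example: text 6 m away for a user who wants 1.5x magnification.
// proximityScale(6, 1.5) === 6, still within the [1, 8] clamp.
```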
Contextual Captions
Including captions is a pivotal component of making any experience with audio accessible, from traditional products to 3D/Extended Reality (XR) products in the future. Traditionally, captions can be both beneficial and distracting. They convey important information, but they are also one more visual element on top of an already visually dense experience. The problem is exacerbated when the user has to look far away from the source of a sound to read its captions, splitting their attention in too many directions and sacrificing visual communication efficacy.

Contextual captions reduce this overstimulation by appearing close to the source of the sound in the 3D environment. Visual indicators, such as arrows directing the user to the source, further associate the caption with the sound’s location. When the user pans away from the source, the captions stay present, and the visual indicator helps the user understand there is audio even when its source is off-screen.

Further, captions should be synchronized so they appear at approximately the same time as the audio.
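As a sketch of the placement logic, the function below anchors a caption at its sound source when the source is visible and pins it to the screen edge with a directional arrow when it is not. It reuses offScreenCue from earlier; the placement type is an illustrative assumption.

```typescript
// Sketch: anchor a caption at its sound source, or pin it to the screen edge
// with an arrow toward the source when the source is off-screen.
interface CaptionPlacement {
  anchor: "at-source" | "screen-edge";
  arrowDirection?: Vec3; // only set when pinned to the screen edge
}

function placeCaption(headPos: Vec3, viewDir: Vec3, sourcePos: Vec3): CaptionPlacement {
  const cue = offScreenCue(headPos, viewDir, sourcePos);
  return cue.isOffScreen
    ? { anchor: "screen-edge", arrowDirection: cue.direction } // caption stays present
    : { anchor: "at-source" };                                 // render near the source
}
```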
Adjustable Captions
In many cases, it can be helpful to let users adjust the styling and placement of captions to their preference, defining details like text size, text and background color, and position.
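Such preferences might be captured in a simple settings object like the one below; the field names and defaults are assumptions about what a caption settings model could hold.

```typescript
// Sketch of user-adjustable caption preferences. Names and defaults are
// illustrative assumptions, not a standard settings schema.
interface CaptionPreferences {
  fontSizePt: number;      // base size before any proximity scaling
  textColor: string;
  backgroundColor: string; // translucent backgrounds keep contrast readable
  placement: "near-source" | "bottom-of-view" | "user-pinned";
}

const defaultCaptionPreferences: CaptionPreferences = {
  fontSizePt: 18,
  textColor: "#FFFFFF",
  backgroundColor: "rgba(0, 0, 0, 0.75)",
  placement: "near-source",
};
```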
ASL Interpretation
There is no replacement for the value and effectiveness of real people doing real ASL interpretation. From conversations with members of the Deaf and Hard of Hearing community, we learned that ASL interpretation is the most helpful way to communicate with people who are deaf or hard of hearing. While current technology cannot match the expression and value a human ASL interpreter brings, we could find new ways to bring ASL interpretation to immersive experiences.

A human interpreter could record the ASL interpretation for an entire immersive XR experience. That recording could then drive a highly detailed, expressive 3D avatar that follows the user through their activities and interprets written or spoken English into ASL.
Social Impact
Making XR experiences more inclusive doesn’t just create better products. It ensures that as many people as possible have access to immersive experiences, allowing them to connect with others and with our world. Accessibility allows users to contribute and collaborate in creating the future of immersive 3D experiences.

To design experiences that include diverse users, we have to connect with and have conversations with diverse users.
Explore the Series
Part 1: Designing for An Inclusive Future
Part 2: Designing Immersive and Adaptive Experiences
Part 3: Designing Inclusive Spatial Interactions and Visual Cues (current page)
Part 4: Designing for Auditory Engagement
Part 5: Designing New Ways to Navigate
Part 6: Designing Feedback and Learning
Part 7: Designing A Strategic Process