
You, Me, and Accessibility in 3D Pt. 5: Designing New Ways to Navigate

Part 1: Designing for An Inclusive Future

Part 2: Designing Immersive and Adaptive Experiences

Part 3: Designing Inclusive Spatial Interactions and Visual Cues

Part 4: Designing for Auditory Engagement

Part 5: Designing New Ways to Navigate (this article)

Part 6: Designing Feedback and Learning

Part 7: Designing A Strategic Process

 

When navigating the physical world, we’re limited by constraints like gravity, distance, location, and even our own mobility. Extended reality (XR) presents us with an opportunity to redesign how we navigate and experience both our own world and new virtual spaces.


By Caitlyn Haisel, Design Director at CARNEVALE

With powerful new technology, there always comes opportunity. As designers, we can choose to invent new innovative and inclusive methods for users to navigate our XR digital experiences or…we can unintentionally recreate the inaccessibility of our physical world. At CARNEVALE, we’re excited to explore new ways to navigate XR experiences that people from diverse backgrounds, identities, and abilities can use equitably.

Most XR headsets now support multiple input methods — physical controllers, eye tracking, hand tracking, and more — each with its own pros and cons for different users.

Unfortunately, there isn’t a one-size-fits-all accessibility solution for designing navigation in 3D XR environments. However, we can offer users the right methods, tools, and features for their needs and the way they experience the world.


Controller Navigation and Customization

Let’s start with the classic navigation method: the controller.

We use controllers all the time: turning the TV on and off, playing video games, adjusting the climate in our cars, typing this Medium article, talking to Amazon Alexa.

While controllers are a technology we are familiar with, that doesn’t mean they’re automatically accessible.


Many XR headsets are accompanied by 2 controllers that the user interacts with to navigate digital worlds and execute actions, but what happens if someone doesn’t have 2 hands? What if they are left-handed or they don’t have the dexterity in their fingers to reach all of the buttons to complete tasks? What if the weight of the controller is physically taxing to use?

The controller itself could be configured to a user’s preference in areas such as the sensitivity of the thumb-stick, axis response, left or right-handedness, one or two hands, motion tracking sensitivity, etc. All actions that can be performed with both controllers should also be achievable with only 1 controller.
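As a sketch of what such configuration might look like, here is a minimal, hypothetical controller profile in Python. All names and default values are illustrative assumptions, not any headset’s actual API:

```python
from dataclasses import dataclass


@dataclass
class ControllerProfile:
    """Hypothetical per-user controller settings (names are illustrative)."""
    handedness: str = "right"        # "left" or "right"
    single_controller: bool = False  # all actions achievable on one controller
    stick_sensitivity: float = 1.0   # multiplier applied to thumb-stick input
    stick_deadzone: float = 0.15     # ignore drift/tremor below this magnitude


def apply_stick(profile: ControllerProfile, raw_x: float, raw_y: float):
    """Apply the user's deadzone and sensitivity to a raw thumb-stick
    reading in the range -1..1."""
    magnitude = (raw_x ** 2 + raw_y ** 2) ** 0.5
    if magnitude < profile.stick_deadzone:
        return (0.0, 0.0)
    return (raw_x * profile.stick_sensitivity, raw_y * profile.stick_sensitivity)
```

A larger deadzone, for example, lets a user with hand tremors rest a thumb on the stick without drifting, while a higher sensitivity reduces how far the stick must travel.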

Despite the constraints physical controllers present, there are some benefits for users who aren’t able to ambulate freely or have a limited area to move around in the physical world. In virtual reality (VR), thumb-stick navigation can be enabled to allow users to complete all movements using only the thumb-stick on the controller. Users can experience free locomotion without the need to walk or rotate physically.


Gestural-Based Hand Tracking

Some XR headsets allow your hands to become the controllers by detecting specific gestures that align with actions. For example, while operating the new Apple Vision Pro headset, users can perform an action with a simple pinch of 2 fingers. With the barrier of hardware removed, users are able to interact with digital elements in a more natural-feeling way, making experiences feel more immersive.

Hand tracking removes requirements like grip strength, muscle strength to hold the controller, and dexterity to operate it. One-handed gestures can also be used for greater accessibility for users who may only have the use of 1 hand to navigate.

Navigating and performing actions with hand tracking can also be more intuitive. For users with cognitive disabilities who may face difficulty learning the complex functionality of a physical controller, gesture-based actions offer a much more accessible input method.


Eye Tracking and Control

Eye tracking has been used for decades as an assistive technology, offering an alternative input method to a keyboard, mouse, touch screen, or voice control.

In XR experiences, eye tracking can be used in place of controllers or physically moving around a space. Precise eye tracking allows users accuracy in selecting elements, controlling interactions, and performing actions. Often, blinking is used as a selection device with sole eye-tracking navigation. Hand gesture input is also frequently paired with eye tracking for a combined navigation experience.
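To make one of these selection mechanics concrete, here is a hedged sketch of dwell selection, a common eye-tracking alternative to blinking: a target counts as selected once the gaze rests on it for a set number of consecutive frames. The function name and frame counts are illustrative:

```python
def dwell_select(gaze_samples, target_id, required_samples=10):
    """Return True once the gaze rests on `target_id` for `required_samples`
    consecutive frames (dwell selection). `gaze_samples` is the per-frame
    sequence of whatever element the user is currently looking at."""
    held = 0
    for sample in gaze_samples:
        # Reset the counter whenever the gaze leaves the target.
        held = held + 1 if sample == target_id else 0
        if held >= required_samples:
            return True
    return False
```

Making the dwell threshold user-adjustable matters here too: a short dwell risks accidental selections, while a long one fatigues users who rely on eye tracking exclusively.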

Eye tracking has the power to make XR entirely hands-free.


Head Tilting

With just a slight tilt of their head, users can navigate freely in 3D space forward, backward, left, right, and across full 360º rotation. Head tilting could be offered as another hands-free locomotion option that supports users regardless of whether they are seated or standing.
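One way head-tilt locomotion could work is to map tilt angles to a movement vector, with a small deadzone so natural head movement doesn’t trigger motion. A minimal sketch, where all angles and thresholds are illustrative assumptions:

```python
import math


def tilt_to_motion(pitch_deg, roll_deg, deadzone_deg=5.0, max_tilt_deg=20.0):
    """Map head tilt to a normalized (forward, strafe) vector in -1..1.
    Forward tilt (positive pitch) moves forward; right tilt (positive roll)
    strafes right. Tilts inside the deadzone produce no motion."""
    def axis(angle):
        if abs(angle) < deadzone_deg:
            return 0.0
        # Scale linearly between the deadzone and the maximum tilt.
        scaled = (abs(angle) - deadzone_deg) / (max_tilt_deg - deadzone_deg)
        return math.copysign(min(scaled, 1.0), angle)
    return (axis(pitch_deg), axis(roll_deg))
```

Both the deadzone and the maximum tilt should be user-customizable, since comfortable neck range of motion varies widely between users.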


Ambulation

While the most immersive option, ambulation is the least accessible method to navigate 3D extended reality space. Requiring the user to physically ambulate, or walk, to specific locations to interact or perform actions presents accessibility challenges to users who aren’t able to walk or do not have the space in their environment to do so.

Some XR experiences currently offer the option of walking in place. In the future, VR treadmill technology like omnidirectional flooring may make ambulation a more realistic XR navigation method for users who are able to walk but don’t have a large real-world space to roam in.


Voice Control

Enabling users to perform actions, give commands, and navigate using voice control makes XR immeasurably more accessible by eliminating the need for the users to physically move, see a UI, or use a controller.

Voice control alone does not make an experience inclusive to everyone, but providing users a choice to engage with an experience verbally makes our experiences more accessible and adaptable.


Alternate Device Input

The challenge of inaccessible controllers could be reduced by allowing users to bring their own input devices. Many users who have limited mobility, dexterity, or vision are accustomed to keyboard navigation from previous experiences, like navigating a website.

Some VR headsets (such as Meta Quest Pro) can accommodate navigation using a Bluetooth-connected keyboard or other controller types. Providing compatibility with alternate device input allows users to employ tools they are already familiar and comfortable with in new, immersive ways.


Mobile Device Input

Mobile devices with a touchscreen could also be paired with an XR headset. This would allow simple and accessible gestures such as swipes and taps to replace complex VR controller actions. Buttons like the phone’s volume controls could also be leveraged for actions. Complex actions could be listed on the screen and completed with a single tap instead.

Our phones are devices that many users are already familiar with operating and accustomed to holding for extended periods of time. For some users, a phone could be a much more versatile and adaptable input tool than out-of-the-box XR controllers.


Input types help us define what people use to navigate our 3D XR experiences. As creators, we also have the power to design options for how people use those input devices and how our experiences react.


Teleportation

Teleportation is an already widely used — and maybe even the most popular — VR locomotion technique in which the user aims in any given direction and transports to the selected location. This allows the user to navigate a 3D space without needing to physically walk, making the experience more immersive and accessible to users who have limited real-world space available to move around in.
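The core of point-and-teleport can be sketched in a few lines: move the user instantly along a ground-plane aim direction, clamped to a maximum range so a single teleport can’t overshoot. The clamp value and 2D simplification are illustrative assumptions:

```python
def teleport(position, direction, distance, max_distance=5.0):
    """Instantly move a user at `position` (x, z on the ground plane) along
    `direction` by `distance`, clamped to `max_distance` per teleport."""
    dx, dz = direction
    norm = (dx ** 2 + dz ** 2) ** 0.5
    if norm == 0:
        return position  # no aim direction: stay put
    d = min(distance, max_distance)
    return (position[0] + dx / norm * d, position[1] + dz / norm * d)
```

A real implementation would also validate that the destination is on walkable ground and inside the safe zone before committing the move.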


World Rotation

In VR, instead of the user being required to rotate, turn, and walk to navigate the 3D world, the world could be rotated around them. This eliminates the need for a user to have a full 360º range of motion and allows for exploration from a stationary position.
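World rotation is essentially the inverse transform: rather than rotating the user’s view, the scene’s points are rotated around the stationary user. A minimal sketch on the 2D ground plane, with the user assumed to be at the origin:

```python
import math


def rotate_world(points, angle_deg):
    """Rotate scene points (x, z) around a stationary user at the origin,
    so the world turns instead of the user."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - z * s, x * s + z * c) for (x, z) in points]
```

Rotating a point one meter ahead of the user by 90º moves it to their side, exactly as if the user had turned, but with no physical rotation required.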


On Rails

Instead of the user moving, some experiences can be on rails: objects may move toward the user, or the user’s point of view (POV) may move automatically, leading them through the experience. The user does not have any control over the locomotion. Head movement and controller tracking remain independent, but they do not move the user through the space.

While on-rails experiences require minimal physical exertion from the user, the lack of control over the motion can cause motion sickness for some users.


Simplified Action Menu

As an alternative to more complex methods of navigation, users could be provided with a simplified menu of all actions available to them.

In XR experiences, this would provide an alternate interaction method, bypassing the need for users to execute complex gestures, remember button commands, or move to a specific location.


Magnet Hands

Magnet hands enable users to grab objects from far away without needing to navigate close to them. Users can bring an object that requires their interaction to them instead of traveling to it. No walking, crouching, or reaching is required.
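A magnet-hands interaction can be sketched as moving the grabbed object a step toward the user each frame until it is within comfortable reach. The distances below are illustrative assumptions:

```python
def magnet_pull(user_pos, object_pos, stop_distance=0.5, step=0.25):
    """Advance a distant object one frame's step toward the user on the
    (x, z) ground plane, stopping once it is within `stop_distance`."""
    vx = user_pos[0] - object_pos[0]
    vz = user_pos[1] - object_pos[1]
    dist = (vx ** 2 + vz ** 2) ** 0.5
    if dist <= stop_distance:
        return object_pos  # already within reach
    move = min(step, dist - stop_distance)
    return (object_pos[0] + vx / dist * move, object_pos[1] + vz / dist * move)
```

Called once per frame, this animates the object smoothly toward the user and halts at arm’s length rather than colliding with them.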


Grab and Pull

In virtual reality, users could be allowed to “grab and pull” using their hands to navigate around the space. No walking is required, and this can be executed from a seated position. This is similar to the style of navigation many 3D tools currently use such as Unity and Blender. While this method of navigation requires more manual effort, it offers the user total control over the speed and nuance of the motion.


Mobility Options

Of all the possibilities and innovative navigation methods we can design for XR experiences, the most important is granting the user the power to choose between, or even combine, mobility options.

Options should always be offered to immediately adapt the navigation experience to the user’s needs and preferences. All users should have an equitable experience standing with full mobility around the room, standing in a limited space, and seated without a full range of motion. Gestures shouldn’t require walking or complex movement that can’t be done from a seated position. By creating different, selectable, and customizable modes, we could intuitively make navigational adjustments across an experience based on how we know the user experiences the world.


As we design new ways to navigate, we can also expect our users to have new reactions, experiences, and challenges.

Unfortunately, some users report feeling overwhelmed or experiencing motion sickness when using immersive technologies like virtual reality. XR experiences may quickly become overstimulating, particularly for users with vestibular, seizure, or sensory processing disorders.

The key component of designing an effective and engaging motion experience in XR is giving the user the power to customize it.


Incremental Rotations and Snap Turns

For locomotion methods such as thumb-stick navigation or world rotation, it can be helpful to let the user turn in specified increments instead of smooth rotations. Snap turns reduce perceived motion, and with it the feeling of motion sickness for some users. Think of it as rotating like a clock, with the user at the center.
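The logic behind a snap turn is small enough to show in full: the heading jumps by a fixed increment and wraps around 360º. The 30º default is an illustrative choice and should be user-adjustable:

```python
def snap_turn(heading_deg, direction, increment_deg=30):
    """Rotate the user's heading by a fixed increment instead of smoothly.
    `direction` is +1 (turn right) or -1 (turn left); the result wraps
    into the 0-359 range."""
    return (heading_deg + direction * increment_deg) % 360
```

Because the view cuts instantly from one heading to the next, there is no intermediate motion for the vestibular system to object to.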


Motion Reduction, Sequencing, and Control

Users should have the ability to control the motion they are seeing and experiencing in an XR experience. We have the opportunity to make our virtual worlds so much more adaptable than the real world to user needs.

Users should be in full control, able to play, pause, stop, and reduce motion in their 3D immersive environments. Additionally, simultaneous motion from multiple elements should be reduced and played sequentially, with a cool-down period between, when possible.
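Sequential playback with a cool-down can be sketched as a simple scheduler that converts a list of motion durations into non-overlapping (start, end) time windows. The cool-down value is an illustrative assumption:

```python
def sequence_motions(durations, cooldown=0.5):
    """Schedule motions one after another with a cool-down gap between them,
    instead of playing them simultaneously. Returns (start, end) times in
    seconds for each motion, in order."""
    schedule, t = [], 0.0
    for d in durations:
        schedule.append((t, t + d))
        t += d + cooldown
    return schedule
```

Two motions of 1 and 2 seconds, for example, play at 0-1s and 1.5-3.5s rather than at the same time, giving the user a visual rest between them.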


3D Immersion Reduction

XR experiences are often filled with incredibly vivid and immersive visual interactions that can amaze one user and overwhelm another.

Users could have the option to reduce 3D immersion by hiding or fading out non-essential 3D elements that are creating visual noise. Reducing the visual clutter allows the user to focus solely on important interaction points or elements that require their attention.

Reducing immersion and hiding non-essential elements can even alleviate overstimulation or distraction for some users.


As we create immersive experiences where digital components are mixed with reality — or virtual worlds entirely — users are left in a vulnerable state as their real-world environments are partially obscured or completely hidden.

As creators, we have the responsibility to thoroughly analyze the safety risks to our users and design safeguards against them.


Safety Geofencing

In XR, it’s important that the user’s real-world space is clear of obstacles. Before immersing themselves, they should be able to define the boundary of the safe zone in their physical environment. Meta Quest’s Guardian boundary system is an existing example of this strategy.

Within the experience, users should be warned when they are nearing the edge of the safe zone and are in danger of running into real-world obstacles. It would also be beneficial to warn the user if something unexpected enters the zone such as another person or a pet.
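A guardian-style check can be sketched as classifying the user’s position within a rectangular play area as safe, nearing the edge, or outside. The rectangle shape and warning margin are illustrative assumptions; real systems track arbitrary user-drawn boundaries:

```python
def boundary_warning(position, half_width, half_depth, warn_margin=0.3):
    """Classify a user position (x, z) against a rectangular play area
    centered at the origin: 'safe', 'warning' (near an edge), or 'outside'."""
    x, z = position
    if abs(x) > half_width or abs(z) > half_depth:
        return "outside"
    # Distance to the nearest edge of the safe zone.
    margin = min(half_width - abs(x), half_depth - abs(z))
    return "warning" if margin < warn_margin else "safe"
```

The 'warning' state is where the experience would fade in a boundary grid or passthrough view before the user actually leaves the zone.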


Safe Ranges

All elements and interactions that are important to the user or the experience should fall within a specific, height-limited Safe Range, which acts as a starting point.

Elements should be placed in areas that are reachable by a default range, but it could still be greatly beneficial to allow the user to customize their own range that will work best for them.


Object Differentiation

In immersive augmented reality experiences, it’s easy to forget what’s real and what’s digital. When digital elements blend seamlessly with the physical world, it’s important that users not lose awareness of their environment. Designers can mitigate this risk by making digital objects visually distinct from the real world, such as with a glow or supplemental UI elements that mark them as digital.

Making XR engaging is a delicate balance between immersion and safety. Digital objects should also be placed thoughtfully so they don’t occlude too much of the user’s field of view or block important elements in the real world.

By reimagining what it means to navigate space in immersive experiences, we have the opportunity to offer our users so much beyond the limitations of our real world. We can make the intentional choice to ensure as many people as possible are able to use our experiences equitably.

Designing truly accessible and safe navigation in XR can be achieved by providing powerful options that bend our experiences to our users’ needs and preferences.
