In our Summer 2023 issue, I wrote an overview article on game accessibility. That article provided a high-level look at game accessibility as a whole, covering multiple different disabilities. For this piece, I wanted to take a deeper dive into game accessibility for people who are blind or have severe low vision. This is the demographic into which I personally fall, and the one I believe presents the most complex challenges on the spectrum of vision-related game accessibility.

For this piece, I will be taking a closer look at game user interfaces (UI), both the UI used out of game (settings, main menus, etc.) and the UI used during gameplay proper. In addition, we will take a more detailed look at the use of sound cues as alternatives to visual information and at strategies for providing navigation assistance to this audience.

My goal with this article is to provide strategies and solutions for game developers, to showcase games that implement these solutions as examples to follow, and to share games that other gamers who are blind or have low vision might enjoy.

As a note, this piece focuses specifically on games played on consoles or computers. Some of the recommendations detailed here may apply to games on mobile touchscreen devices, but they are not the focus of this article.

UI Accessibility 

When we discuss the accessibility of game user interfaces, there are two distinct aspects to consider. The first is the UI a user interacts with outside of gameplay: main and settings menus, pause screens, and other interfaces similar to those you might find in an app or piece of desktop software. The second is the in-game UI: generally, the text-based information presented to the player during gameplay. This covers information that is always on screen but not directly interactable, as well as UI used during gameplay where time becomes a factor, such as in a competitive card game. For this piece, we will address these two types of UI separately.

As we discussed in the first article, all game text needs to be presented to the user in audio form to make a game accessible. Before we dive into the design of an accessible UI, it is important to discuss how speech can be implemented. In the late 1990s and 2000s, including recorded speech for game text was quite popular among hobbyist game developers in the blind and low-vision community. With the advent of better options for Text-to-Speech (TTS) and screen reader integration, this form of text delivery has become far less common in modern games.

At this time, the most common approach is either to push text content to the user's screen reader or to build TTS into the game directly, in effect creating a screen reader for the game from scratch.

Due to the wide array of programming languages used for game development and the particular complexity of differing game platforms, providing comprehensive technical direction for implementing TTS or screen reader compatibility in video games is outside the scope of this piece. That said, some common example solutions follow. TOLK is a multi-language solution for providing screen reader access to applications on Windows. An Accessibility Plugin also exists for the popular Unity game development environment, with support for Windows, iOS, Android, and Mac games as well as other platforms.
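
Whichever library or platform a developer chooses, the pattern is usually the same: route all game text through one speech function that prefers the player's own screen reader and falls back to built-in TTS. Below is a minimal sketch of that idea in Python. The three engine hooks are hypothetical stand-ins, not the API of TOLK or any specific plugin.

```python
# Hypothetical stand-ins for whatever screen reader/TTS library a game uses.
def screen_reader_available() -> bool:
    # Stand-in: ask the library whether a screen reader is running.
    return False

def push_to_screen_reader(text: str, interrupt: bool) -> None:
    # Stand-in: hand the text to the running screen reader.
    print(f"[screen reader] {text}")

def builtin_tts_speak(text: str, interrupt: bool) -> None:
    # Stand-in: speak through the game's bundled TTS voice.
    print(f"[built-in TTS] {text}")

def speak(text: str, interrupt: bool = True) -> None:
    """Route every menu label, hint, and HUD announcement through one call
    so speech behaves consistently across the whole game."""
    if screen_reader_available():
        push_to_screen_reader(text, interrupt)
    else:
        builtin_tts_speak(text, interrupt)

speak("New Game")  # all game text passes through this single function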

Implementing screen reader support or TTS in console games will depend on the console in question. For example, Microsoft has made it possible to include TTS in Xbox games, seemingly through this API.

Out of Game UI 

This portion of a game's UI most commonly includes menus, along with other on-screen text that is not generally interactable by the user. It is essential that speech can be interrupted, either by a key press or when a user moves from one item to another. When speech must finish with no way to halt it, navigation becomes significantly slower for someone who is blind or has low vision.

Since every game is different, it is helpful to include hints on how to interact with the UI as a user navigates. For example, when an item is focused and read, speech might be programmed to say something like, "Use the Up and Down arrows to navigate through menus and press A to select." Hints can also alert users to UI elements that need to be interacted with differently from the norm. For example, when adjusting settings such as sound effect and background music volume, does the user simply press the Left and Right arrows to adjust a slider, or must they activate the control first before adjusting it? When hints like these are included, it is helpful to provide a setting to turn them off once a user becomes familiar with the interface.
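
As a rough illustration of how the last two points fit together, here is a sketch of a menu focus handler: moving to a new item cuts off any speech still playing, reads the item's label, and then queues an optional hint that the player can disable. The speak() helper and the settings flag are hypothetical, not taken from any particular engine.

```python
def speak(text: str, interrupt: bool = True) -> None:
    # Stand-in for the speech layer sketched earlier.
    prefix = "[interrupting] " if interrupt else ""
    print(prefix + text)

settings = {"speak_hints": True}   # expose this as a normal game option

def on_item_focused(label: str, hint: str = "") -> None:
    speak(label, interrupt=True)          # cut off whatever was speaking
    if hint and settings["speak_hints"]:
        speak(hint, interrupt=False)      # queue the hint after the label

on_item_focused(
    "Music Volume: 80%",
    hint="Press A to edit, then use Left and Right to adjust.",
)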

Much like hints that are spoken after an item is focused, any non-interactable text that appears on screen should be read after it appears, specifically after the focused element that produced it. It can also be helpful to dedicate a key or key combination to repeating this sort of information if a user needs a refresher. Take, for example, assigning stat points in a role-playing game: being able to hear how a given statistic is used on the fly would help a user make decisions.
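
One way to provide such a refresher is simply to store the last piece of contextual text that was spoken so a dedicated key can repeat it. The sketch below assumes the same hypothetical speak() helper as above; the function and key names are illustrative only.

```python
def speak(text: str, interrupt: bool = True) -> None:
    print(text)   # stand-in for the speech layer

last_contextual_text = ""

def announce_context(text: str) -> None:
    # Called whenever non-interactable text (such as a stat description)
    # appears; it is read after the focused element and remembered.
    global last_contextual_text
    last_contextual_text = text
    speak(text, interrupt=False)

def on_repeat_key() -> None:
    # Bound to a dedicated "repeat" key or key combination.
    if last_contextual_text:
        speak(last_contextual_text, interrupt=True)

announce_context("Dexterity affects accuracy with ranged weapons.")
on_repeat_key()   # the player presses the repeat key for a refresher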

In-Game UI 

In this section, we are specifically discussing a game's UI during gameplay, when time is a factor. Time is the key consideration: games where gameplay is not timed may more closely resemble the UI discussed in the previous section. For example, a real-time strategy game in which game time is paused while the player makes decisions about building, assigning jobs, research, and so on would likely fall under the previously discussed criteria.

By in-game UI, we normally mean the sort of information presented in a heads-up display (HUD). This could include health, stamina, ammunition, time remaining, score, or anything else the player may need to access during gameplay. There are several equally valid methods for presenting this information.

One method is to announce information only when it changes in a significant way. When keeping the player informed of their current health, for instance, it might be announced at specific thresholds: 75%, 50%, 25%, 15%, 10%, and so on. Another method, commonly employed by developers who are blind or have low vision, is to dedicate specific commands to speaking information on demand. This is particularly useful when the game is meant to be played on a computer keyboard; common commands include H for health, S for score, and T for time, letters you might naturally associate with specific information. Some developers instead design these announcement commands around hand placement when playing a given game. For a detailed example of this strategy, see our review of the Hearthstone Access Mod.
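
Here is a brief sketch of both strategies side by side: health is announced automatically when it crosses a threshold, and dedicated keys speak values on demand. The class, thresholds, and key choices are illustrative, and speak() again stands in for the game's speech layer.

```python
def speak(text: str) -> None:
    print(text)   # stand-in for the speech layer

HEALTH_THRESHOLDS = [75, 50, 25, 15, 10]

class Hud:
    def __init__(self) -> None:
        self.health = 100
        self.score = 0
        self.time_left = 300

    def set_health(self, new_health: int) -> None:
        # Announce only when a threshold is crossed on the way down.
        for threshold in HEALTH_THRESHOLDS:
            if self.health > threshold >= new_health:
                speak(f"Health {threshold} percent")
                break
        self.health = new_health

    def on_key(self, key: str) -> None:
        # On-demand announcements: H for health, S for score, T for time.
        if key == "h":
            speak(f"Health {self.health} percent")
        elif key == "s":
            speak(f"Score {self.score}")
        elif key == "t":
            speak(f"{self.time_left} seconds remaining")

hud = Hud()
hud.set_health(72)   # crosses 75 percent, so it is announced
hud.on_key("t")      # spoken only when the player asks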

Keyboard real estate is limited, and games that aim to be controller compatible must make good use of the few buttons available. It can therefore be useful to provide a shortcut that reads the most crucial information on the fly, and to place the rest of the HUD information in a menu that can be called up at will and pauses gameplay.
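
A sketch of that compromise might look like the following: one button speaks the single value that cannot wait, while a second opens a paused screen listing the rest of the HUD (in a real game this screen would be navigable item by item like any other menu). The pause and unpause hooks are hypothetical engine calls.

```python
def speak(text: str) -> None:
    print(text)   # stand-in for the speech layer

def pause_game() -> None:
    print("(game paused)")    # stand-in for the engine's pause call

def unpause_game() -> None:
    print("(game resumed)")   # stand-in for the engine's resume call

hud = {"Health": "64%", "Ammo": "12", "Score": "4,250", "Time left": "2:31"}

def on_quick_info_button() -> None:
    speak(f"Health {hud['Health']}")   # the one value that can't wait

def on_hud_menu_button() -> None:
    pause_game()
    for name, value in hud.items():    # the rest, reviewed without time pressure
        speak(f"{name}: {value}")
    unpause_game()

on_quick_info_button()
on_hud_menu_button()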

Sound Cues and Navigation 

There are as many ways of using audio cues and navigation strategies as one can imagine. The recommendations in these sections are based on strategies with a proven track record, or are extrapolated from them.

Using Audio Cues for Visual Objects 

Outside of games that can be made entirely accessible using text (see Hearthstone for an example), it is nearly always necessary to use audio cues as an alternative to certain visual information. Note that extra audio cues need not be provided if an object already makes sound in some way. For example, if there is a fire hazard that a player must navigate around, the sound of the fire itself might provide enough information for the player to avoid it by audio alone. Because judging exact distance from volume alone can be difficult, we may also wish to include an optional sound when the player is one step away from the hazard, conveying the same information a sighted player receives visually. The Vale, an audio-only action role-playing game, uses natural sound and spatial audio to provide all of the information a player needs to interact with the game's world. See this gameplay demonstration for a look at multi-person melee combat, navigation, and a boss battle.
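
The optional "one step away" warning described above amounts to a simple distance check each time the player moves. The sketch below is one way to express it; the distance threshold, sound name, and play_sound() stub are all illustrative assumptions.

```python
import math

def play_sound(name: str) -> None:
    print(f"(sound: {name})")   # stand-in for the game's audio engine

HAZARD_WARNING_DISTANCE = 1.0   # "one step," in game units

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def update_hazard_cue(player_pos, hazard_pos, warnings_enabled=True) -> None:
    # Called whenever the player moves; the fire's own crackle still plays,
    # this extra cue only fires when the player is within one step.
    if warnings_enabled and distance(player_pos, hazard_pos) <= HAZARD_WARNING_DISTANCE:
        play_sound("hazard_warning")

update_hazard_cue((4.2, 1.0), (5.0, 1.0))   # within one step: warning plays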

When designing audio cues, consider all of the attributes of a sound to which you have access. You can communicate a good deal of information using spatial positioning, volume, and pitch. Spatial positioning provides intuitive knowledge of an object's direction, while volume does the same for distance in most cases. Pitch can indicate information that more natural sound might not, such as whether the audio source is above or below the player, or in front of or behind them. This holds for both 2D and 3D games: pitch can indicate audio sources above or below the player in 2D side-scrolling titles like the Super Mario Bros. games as well as in 3D titles such as Super Mario 64 or Super Mario Odyssey. If a game requires the player to know both the vertical position of audio sources and whether sources are in front of or behind them, sounds behind the player can be muffled or otherwise distorted.
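
To make this concrete, here is a small sketch of mapping a source's position relative to the player onto those attributes: stereo pan for left and right, volume for distance, a pitch shift for above and below, and a muffle flag for sources behind the player. The specific numbers and the returned dictionary are illustrative, not values from any particular engine.

```python
def cue_for_source(dx: float, dy: float, dz: float) -> dict:
    """dx: left/right, dy: front/back, dz: up/down, relative to the player."""
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return {
        "pan": max(-1.0, min(1.0, dx / 10.0)),          # -1 = hard left, 1 = hard right
        "volume": max(0.1, 1.0 - dist / 30.0),          # quieter with distance
        "pitch": 1.0 + max(-0.3, min(0.3, dz / 10.0)),  # higher above, lower below
        "muffled": dy < 0,                              # distort sounds behind the player
    }

# A source behind the player, to their left, and above them.
print(cue_for_source(dx=-3.0, dy=-5.0, dz=2.0))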

Artificial sounds (think beeps, chimes, clicks, and others that aren't naturally occurring) are generally used to indicate something that is visual but wouldn't make sound on its own, as well as to indicate distance in situations where relative volume will not suffice. Commonly, artificial sounds indicate the presence of stationary items such as interactable or collectible objects, or signal that something is within a specific, important range. One common use of a proximity sound is when a player is near a ledge or drop-off. In this case, the sound might be something artificial, or something more natural, such as crumbling earth or an audible gasp and stumble from the character.

These sounds need not be artificial. In the Manamon navigation demonstration from our Summer game accessibility piece, notice that artificial sounds are used for the presence of walls and other characters, while natural sounds are used for other objects: beds, doors, bookshelves, and the like.

Navigation Strategies for People Who Are Blind or Have Low Vision 

Before implementing a navigation strategy, it is important to determine whether one is needed at all. In the Manamon demonstration above, notice that the game includes no navigation options aside from sounds indicating the surface the player is walking on and the proximity of walls and objects. Because navigation-based puzzles are a key part of that game, extra navigation aids are deliberately not provided. Compare this to A Hero's Call, where navigation cues let the player explore freely, but a system for providing directions to key landmarks is also included. One important thing to notice about A Hero's Call's navigation strategy is that speech is used in addition to sound to help the player know where they are.

This strategy is also used in the popular audio zombie survival first-person shooter Swamp. In Swamp, your character has multiple radar options to indicate whether the space around them is clear, obstructed, or obstructed by something that can be fired over or through, such as a window or wrecked vehicle. This radar can be triggered to sweep along your left or right side, back and forth in front of you as if you were using a cane, behind you, or through all of these in sequence. In addition, as your character enters different zones (roads, sidewalks, buildings, and so on), they are announced to you. Swamp's maps tend to be large and contain many landmarks, so a beacon system has also been implemented. You can set a beacon on a given landmark on your current map, after which you will be told the direction to its location while a high-pitched beep plays at shorter and shorter intervals as you get closer. For a demonstration of these features, see this video. Note that if the timestamp in the link does not take you to the proper section, the demonstration of Swamp's navigation features begins at 12 minutes and 50 seconds.
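
A beacon of this kind boils down to two pieces: speaking the direction and distance when the beacon is set, and shortening the interval between beeps as the player approaches. The sketch below is written in the spirit of Swamp's system rather than as its actual implementation; the helpers, thresholds, and compass math are assumptions for illustration.

```python
import math

def speak(text: str) -> None:
    print(text)          # stand-in for the speech layer

def play_beep() -> None:
    print("(beep)")      # stand-in for the audio engine

def direction_name(dx: float, dy: float) -> str:
    # Map a relative offset to a compass direction (dy points "north").
    angle = math.degrees(math.atan2(dx, dy)) % 360
    names = ["north", "northeast", "east", "southeast",
             "south", "southwest", "west", "northwest"]
    return names[int((angle + 22.5) // 45) % 8]

def beep_interval(distance: float) -> float:
    # One beep per second when far away, speeding up to 0.1 s when close.
    return max(0.1, min(1.0, distance / 50.0))

def set_beacon(player, landmark) -> None:
    dx, dy = landmark[0] - player[0], landmark[1] - player[1]
    speak(f"Beacon set: {direction_name(dx, dy)}, "
          f"{round(math.hypot(dx, dy))} meters")

set_beacon(player=(10.0, 10.0), landmark=(10.0, 60.0))  # "north, 50 meters"
print(beep_interval(50.0), beep_interval(5.0))          # 1.0 s vs. 0.1 s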

The Bottom Line

One of the best ways to determine how to make an aspect of a game accessible is to seek out real-world examples of accessible versions of that aspect. When it comes to accessibility, uniqueness is not the goal; you are simply looking for the most efficient way to adapt something from a visual medium to a spoken or auditory one. If you find the perfect solution to your problem, use it, even if it is identical to how someone else has implemented accessibility in their own game.

I highly recommend looking through the games at Audiogames.net. The site lists over 800 games completely accessible to people who are blind or have low vision, so it is a treasure trove for anyone seeking examples of how to make their games accessible. If a game has been lost to time, you might still be able to find it in the Audio Games Archive. As mentioned in previous articles, many accessible games over the past 30 years were created by single programmers who were themselves blind or had low vision, and without the archive their work might have been lost.

Author
Aaron Preece
Article Topic
Accessible Gaming