Uncharted 4: An Accessible QTE Case Study

Uncharted 4 is a widely acclaimed PS4 exclusive, a game that many have praised for its story, gameplay and atmosphere. As a gamer without sight, though, I had to call upon sighted assistance to progress in any fashion.

After having an amazing time and completing the game with the aid of a Titan 2 controller converter and a sighted CoPilot in the same room, I began to reflect on one of many memorable moments from my first Uncharted adventure.

Spoiler Warning:

The rest of this page will spoil elements of Uncharted 4's finale. I strongly suggest you turn back now and play the game for yourself if you are able or want to do so.

The Question

The final boss fight is a Hollywood-style sword fight, complete with the almost clichéd ringing of sword-on-sword action. The interesting thing is how the fight is presented: not as a standard combat sequence, but as, essentially, one long Quick Time Event (QTE).

QTEs, as a gamer without sight, are something of a frustration point for me. Mostly, this boils down to the fact that developers, at least to my knowledge, have never implemented accessible versions, at least not at the time of writing. Certain games may offer the option to skip QTEs or puzzle segments, but if I can actually press the buttons, why should I be cut off from part of the experience just because of a lack of sight?

Looking at this long string of button prompts in the final boss fight of Uncharted 4, I had a thought: What if this fight had additional audio cues or prompts for accessibility so that, as a gamer without sight, I could play it start to finish without assistance?

This idea was in my head almost immediately after completing the game. After ages trying to figure out where I'd begin with such an idea, I'd had enough of pondering the concept and reached out on social media to see if I could finally bring it to fruition in some kind of tangible form.

Fortunately, a friend saw my request for assistance and signal boosted it, reaching out to a streamer who had finished Uncharted 4 the day prior.

Thankfully, Jasmine, the streamer in question, showed much interest in the idea and, on my suggestion of a run on the game's hardest difficulty, Crushing, was willing to take on the challenge.

Though both of us entered this challenge with trepidation, the final boss fight (which, fortunately, can be completed separately via chapter select) was completed in about two hours that same day.

After receiving the video and being very pleased with the audio quality, I decided to construct a rough test of how the final sequence of the game (a non-random series of button prompts) would function, just to see if the overall theory would prove sound (pun definitely intended).

Prompts vs Audio Cues

The Theory

Originally, I'd contemplated using sounds that fit the universe/scenario of the fight, but then realised that doing so might be too complicated for a straightforward proof of concept like this. I also realised that, in an ideal scenario, developers would be able to design sounds that fit the game's world as part of the development process, so I decided to prove that the idea worked via spoken prompts instead.

Drawing on my experience with fighting games and combos, I thought numbers would be an effective way to communicate to the player which buttons to press, once they knew which numbers mapped to which buttons on the PS4 controller.

Given that cross/x is never used within the entirety of this sequence, I elected to work with the following:

Square (attack)=1

Triangle=2

Circle=3

Hold Triangle=hold 2
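As a rough illustration, the mapping above could be represented like this in code. This is only a sketch of the scheme described in this article; the names and structure are my own, not taken from any real implementation.

```python
# Spoken-number prompts for the PS4 face buttons used in the sequence.
# Cross/X never appears in this fight, so it has no entry.
BUTTON_TO_NUMBER = {
    "square": "1",    # attack
    "triangle": "2",
    "circle": "3",
}

def spoken_prompt(button: str, hold: bool = False) -> str:
    """Return the cue to speak, e.g. 'hold 2' for Hold Triangle."""
    number = BUTTON_TO_NUMBER[button.lower()]
    return f"hold {number}" if hold else number
```

A prompt such as Hold Triangle would then be spoken as "hold 2", matching the scheme above.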

Why "Hold 2" Instead Of "Tap 2" Or Similar?

Uncharted 4 has a plethora of accessibility options, though unfortunately none directly intended to assist gamers without sight like myself. One of these options did prove useful during my playthrough, however: the ability to hold buttons instead of frantically mashing them.

Though the visual button prompts in the video indicate mashing/tapping triangle repeatedly, I recalled my own experience and, when creating the audio files to edit into the raw gameplay, substituted the tap for a hold. Though this was not an intentional replacement on my part, either spoken prompt would have fit into the available gap and the discrepancy is a relatively small one. If you're just listening to the audio and closing your eyes to emulate the experience of playing the game without sight, there would be no difference regardless of the player's choice of tapping or holding the inputs.

Back To The Process

In creating the first test, the cues were at times only a fraction of a second out. I was happy with this, though, especially as I was working by hand and with only the audio to go on.

Jasmine then supplied the entire first phase of the boss fight as a cue sheet with all the information I'd need (excluding timestamps, as we felt those would be unnecessarily tedious to put together). But it did raise another interesting question.

It was actually something I'd thought of before, but now that I had the sheet and the audio to work with, I had to figure out an answer: how does the player learn which buttons map to which numbers, in this particular example?

The answer, it turned out, was almost staring me in the face: use a button-then-number format. Fortunately, "triangle 2" and "circle 3" were just short enough to slide between the dialogue, almost at least.

Later that day, to my surprise, she sent everything, from the first hit to the final decisive end of the fight, as a cue sheet as well. I started working on the second phase that night and continued the next day.

We realised that, by this point in the game, the player would be accustomed to pressing Square to attack and that the audio would, in part, do the work for us in terms of identifying attack opportunities. It is possible that attack cues could be a toggleable sub-option as well, were such features requested by players.
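Were such a toggle offered, it might sit alongside a game's other accessibility settings. Here is a minimal sketch of the idea; the class and field names are hypothetical, not from any real game.

```python
from dataclasses import dataclass

# Hypothetical accessibility settings for QTE audio cues.
@dataclass
class QTECueSettings:
    defence_cues: bool = True   # cues for blocks/dodges (triangle, circle)
    attack_cues: bool = False   # off by default: by this point the player
                                # already knows Square means attack

def should_announce(settings: QTECueSettings, prompt_kind: str) -> bool:
    """Decide whether a given QTE prompt gets a spoken/audio cue."""
    if prompt_kind == "attack":
        return settings.attack_cues
    return settings.defence_cues
```

With defaults like these, defensive prompts would always be announced, while attack cues would only play for players who opt in.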

The first, and I hoped final, iteration of this proof of concept was sent to Jasmine, who then synced it with the original video footage. We discovered, in putting everything together as one long edit, that I'd placed a few of the cues later than their visual counterparts, as well as mis-cuing part of the sequence.

This meant, I realised, that I had a real problem on my hands: either re-cue everything, or try to replace the sequence I'd incorrectly cued with an accurate version.

Fortunately, I managed to resolve the issue with comparatively few problems. Jasmine lined everything up, put a file together and sent it to me, but we realised that a 60 FPS render of the fight would improve the overall quality. Consequently, one final realignment later, everything was ready for publication.

Here's the end result if you want to take a look for yourself.

Putting this proof of concept together was a really interesting experience for me, as I felt as if I were part of the design process for a feature that might one day actually be part of a mainstream videogame release. Problem solving is one part of my job as an accessibility consultant anyway, but actually being able to take an idea, prototype it to an extent and see it realised, even within a format with its own limitations on interactivity, was certainly very rewarding.

Collaborating with individuals like Jasmine, who not only have the skills to capture the necessary footage and provide additional vital resources, but are also enthusiastic and interested in learning about accessibility, is one of the many reasons I enjoy my job.

Now, when I want to consult with studios who are excited about the possibilities of accessibility and are considering QTEs in their titles, I'm glad there's an example I can provide of how things could work, even if it's only in the format of a video like this one at present.

Jasmine's Thoughts On The Process

As someone who does not need to rely on many accessibility features to play my favorite video games, I found myself frustrated for my fellow gamers who could not have the same level of independent success that I am privileged to have while gaming. I am very grateful that SightlessKombat trusted me to handle the game capture and cue sheet for this case study, especially since this was my first time playing any part of Uncharted 4 on Crushing difficulty.

While lining up the audio cues with the gameplay wasn't the easiest, as we were simply working with game capture, I know it would be an incredibly simple addition for devs to add to their QTEs. It really would not take much to introduce an event at a certain frame in an attack animation, trigger an audio cue to signify the necessary input to block, and give gamers with little or no sight the same opportunity I had to frantically press the correct buttons, prevail over Rafe, and squash him with a pile of gold.
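The frame-event approach Jasmine describes can be sketched in engine-agnostic terms: register a callback on a specific frame of an attack animation, and have that callback play the cue naming the required input. Everything below is illustrative; the class and function names are hypothetical, and a real engine would use its own animation-event and audio systems.

```python
from typing import Callable, Dict, List

class AttackAnimation:
    """Toy stand-in for an engine's animation track with frame events."""
    def __init__(self, name: str, total_frames: int):
        self.name = name
        self.total_frames = total_frames
        self._frame_events: Dict[int, Callable[[], None]] = {}

    def add_frame_event(self, frame: int, callback: Callable[[], None]) -> None:
        """Attach a callback to fire when playback reaches `frame`."""
        self._frame_events[frame] = callback

    def advance_to(self, frame: int) -> None:
        """Simulate playback reaching a frame, firing any registered event."""
        event = self._frame_events.get(frame)
        if event:
            event()

played: List[str] = []

def play_cue(prompt: str) -> None:
    # In a real engine this would trigger audio playback; here we just
    # record the cue so the sketch can be checked.
    played.append(prompt)

# Announce "2" (triangle, block) partway through the swing, well before
# the hit lands, giving the player time to react.
swing = AttackAnimation("enemy_overhead_swing", total_frames=30)
swing.add_frame_event(12, lambda: play_cue("2"))
for frame in range(swing.total_frames + 1):
    swing.advance_to(frame)
```

The key design point is timing: the event fires early enough in the animation that reacting to the cue is as viable as reacting to the visual prompt.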

I hope that this case study will inspire game developers to continue to expand their ever-growing toolbox of accessibility features, because the world always needs more gamers!

Back To The Main TranscribingGames Page