
How Naughty Dog prototyped the interactive guitar in The Last of Us Part II

By admin

Jul 23, 2021


For better or worse, The Last of Us Part II commits to telling a brutal, unrelenting tale of revenge and loss. It means the sequel can be an emotionally draining experience at times, but there are moments of levity that offer phosphorescent glimmers of hope whenever the darkness threatens to overwhelm.

Many of those scenes center on Ellie and her guitar. Players are given the chance to keep her relationship with Joel alive by reciting some of his favorite songs and recalling their lessons together. Given what happens early on in the game, it’s emotional to say the least, with the relationship between Ellie and her instrument reinforcing what she (and players) have lost while also bringing some much-needed warmth to proceedings.

Speaking at GDC 2021, Naughty Dog game designers Mark Burroughs and Grant Hoechst said the sequel was always going to be bookended with the guitar for that reason. Joel talks about his love of the instrument in the first game, so it serves as a natural through line. Beyond that, the guitar is tonally important to the world of The Last of Us as it harkens back to Gustavo Santaolalla’s mesmerizing score and Joel’s Texan country singer roots.

So, Naughty Dog knew from minute one there was going to be a guitar in the game and that Ellie was going to play it. That begged the question: “could they put it on the stick?” Were those moments going to be non-interactive cinematics that would deliver emotional impact and pacing, or something players could actually engage with?

The studio ultimately pursued the latter option because, as explained by Burroughs and Hoechst, it’s more fun. In creating the first prototype, the pair began by asking what the in-game instrument needed to achieve. They ruled out using QTEs (quick time events) or creating some kind of rhythm game because the former is too performative and the latter would be tonally jarring given Ellie is meant to be a beginner trying to recall some basic lessons.

To make it feel real, the duo felt they had to deliver one-to-one input between players’ actions and the resulting guitar sounds. That would allow them to hit their metric for success, which was about pushing players to recall and play the correct notes and chords rather than maintaining a steady rhythm. The DualShock’s touchpad presented an ideal way to achieve that goal because it offered the universal affordance of swiping and would stop players from needing to rely on janky button inputs. Now that Burroughs and Hoechst had their building blocks, it was time to get to work.

“Alright, first thing. Let’s get something, anything, into the game. Let’s start with just the one song we have. We have Ashley Johnson (Ellie) playing ‘Take on Me’ from a recorded cinematic, so we have something that will show up in the final game that we can use as a target,” says Burroughs, recalling how the team kicked off the prototyping process.

“Let’s grab some temp notes. I think Grant recorded some MIDI notes from his phone for this step. We’ll place those notes and divide the touchpad into six string regions. The player can strum through those to play a chord, or touch one region at a time in order to play one note. At this point, just playing the desired note when a region is pressed will be enough. We can worry about how it sounds later on.”
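To make that setup a little more concrete, here is a rough sketch of how a touchpad can be carved into six string regions, with a stationary touch sounding a single note and a swipe strumming every string it crosses. This is purely illustrative and not Naughty Dog’s code: the normalized coordinates, MIDI note values, and function names are all assumptions.

```cpp
// Hypothetical sketch (not the game's implementation): a DualShock-style
// touchpad divided into six horizontal string regions, as described above.
#include <array>
#include <cstdio>

constexpr int kNumStrings = 6;
constexpr float kPadHeight = 1.0f;   // assume touchpad Y is normalized to [0, 1]

// Map a normalized touch position to a string index (0 = low E ... 5 = high E).
int StringRegionFromTouch(float normalizedY) {
    int region = static_cast<int>(normalizedY / (kPadHeight / kNumStrings));
    if (region < 0) region = 0;
    if (region >= kNumStrings) region = kNumStrings - 1;
    return region;
}

struct TouchSample {
    bool  down;        // is a finger on the pad this frame?
    float normalizedY; // vertical position, 0 at the top edge, 1 at the bottom
};

// Per-frame update: a stationary touch plays one note; a touch that crosses
// into a new region "strums" each string it passes over, in order.
void UpdateGuitarInput(const TouchSample& touch,
                       const std::array<int, kNumStrings>& chordNotes, // MIDI note per string
                       int& lastRegion /* -1 when no finger was down */) {
    if (!touch.down) { lastRegion = -1; return; }

    const int region = StringRegionFromTouch(touch.normalizedY);
    if (lastRegion < 0) {
        // New touch: sound just the string under the finger.
        std::printf("play note %d\n", chordNotes[region]);
    } else if (region != lastRegion) {
        // Swipe: sound every string crossed since the last frame.
        const int step = (region > lastRegion) ? 1 : -1;
        for (int s = lastRegion + step; s != region + step; s += step) {
            std::printf("play note %d\n", chordNotes[s]);
        }
    }
    lastRegion = region;
}

int main() {
    // Open E-major voicing in MIDI (E2 B2 E3 G#3 B3 E4), assumed for illustration.
    std::array<int, kNumStrings> eMajor = {40, 47, 52, 56, 59, 64};
    int lastRegion = -1;
    // Simulate a downward swipe across the pad: all six strings should sound.
    for (float y = 0.05f; y < 1.0f; y += 0.18f) {
        UpdateGuitarInput({true, y}, eMajor, lastRegion);
    }
    UpdateGuitarInput({false, 0.0f}, eMajor, lastRegion);
}
```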

Burroughs noted there are five chords in ‘Take On Me,’ which meant the initial prototype had to allow the player to choose a chord from a list of five. “If we put [those chords] on a list connected to face buttons, however, players will have to either check or remember how the buttons are paired to the chords,” he continues.

“That creates a mental leap players have to overcome. Even when the buttons are placed on-screen with their corresponding chords, for people unfamiliar with the layout, they’ll have to check where a button is, creating a diversion of that player’s attention. So the simplest affordance is probably going to be a wheel menu on the analog stick. That way the player can deflect in a corresponding direction without having to look down.”
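As a rough illustration of that kind of wheel menu, the sketch below maps analog-stick deflection to one of five equal wedges so a chord can be highlighted without looking down or pressing a button. The chord labels, dead zone value, and wedge layout are assumptions rather than details from the talk.

```cpp
// Hypothetical sketch of a five-wedge chord wheel driven by stick deflection.
#include <array>
#include <cmath>
#include <cstdio>
#include <string>

constexpr int   kNumChords = 5;       // 'Take On Me' uses five chords
constexpr float kDeadZone  = 0.35f;   // ignore small stick deflections
constexpr float kPi        = 3.14159265f;
const std::array<std::string, kNumChords> kChords = {  // placeholder labels
    "A", "Bm", "D", "E", "F#m"};

// Returns the index of the highlighted chord, or -1 if the stick is centered.
// Deflecting the stick only *highlights* a chord; nothing is committed until
// the player actually strums on the touchpad.
int HighlightedChord(float stickX, float stickY) {
    const float magnitude = std::sqrt(stickX * stickX + stickY * stickY);
    if (magnitude < kDeadZone) return -1;

    // Angle measured clockwise from straight up, wrapped into [0, 2*pi).
    float angle = std::atan2(stickX, stickY);
    if (angle < 0.0f) angle += 2.0f * kPi;

    // Equal wedges, offset by half a wedge so the first chord is centered on "up".
    const float wedge = 2.0f * kPi / kNumChords;
    return static_cast<int>(std::fmod(angle + wedge * 0.5f, 2.0f * kPi) / wedge);
}

int main() {
    // Deflecting roughly up-right should land in the second wedge.
    const int idx = HighlightedChord(0.7f, 0.7f);
    if (idx >= 0) std::printf("highlighted chord: %s\n", kChords[idx].c_str());
}
```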

That solution was a double-win because it meant players could stay focused on what’s happening on-screen, while also letting them highlight a chord selection without committing to it with a button press. This was important because, as Burroughs points out, chord progression is key.

“When you’re learning a song on guitar you first learn the chord progression and then you speed up faster and faster until you can play in tempo. Let’s focus on that first half where the chord progression is key,” he explains. “For now, I’ll place a yellow square near the edge of the next chord to tell the player what to play next. When the player strums with a chord selected, it’ll either flash green for correct or red for wrong. This feedback is going to be important to let a player know when they’re making progress.”
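The feedback loop Burroughs describes can be sketched as a tiny progression tracker: prompt the next chord, check each strum against it, and hand the UI a “green” or “red” result to flash. Again, this is an illustrative approximation rather than the shipped implementation; the class and enum names are invented.

```cpp
// Hypothetical sketch of the progression-and-feedback loop described above.
#include <cstddef>
#include <cstdio>
#include <utility>
#include <vector>

enum class Feedback { None, Correct, Wrong };

class ChordProgression {
public:
    explicit ChordProgression(std::vector<int> chords) : chords_(std::move(chords)) {}

    // The chord the UI should prompt next (the "yellow square").
    int NextChord() const { return chords_[position_]; }
    bool Finished() const { return position_ >= chords_.size(); }

    // Called when the player strums with a chord highlighted on the wheel.
    // A correct chord advances the progression; a wrong one still plays
    // (failure stays audible, and fun), but doesn't advance.
    Feedback OnStrum(int highlightedChord) {
        if (Finished()) return Feedback::None;
        if (highlightedChord == chords_[position_]) {
            ++position_;
            return Feedback::Correct;  // flash the wedge green
        }
        return Feedback::Wrong;        // flash the wedge red
    }

private:
    std::vector<int> chords_;
    std::size_t position_ = 0;
};

int main() {
    // Indices into the five-chord wheel; the order is a placeholder progression.
    ChordProgression song({0, 2, 3, 0, 1});
    std::printf("%s\n", song.OnStrum(0) == Feedback::Correct ? "green" : "red");
    std::printf("%s\n", song.OnStrum(4) == Feedback::Correct ? "green" : "red");
    std::printf("next chord to prompt: %d\n", song.NextChord());
}
```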

All of those design decisions fed into the very first prototype. At this stage, the UI was made with debug lines so it was fairly ramshackle, but that allowed the dev team to make changes quickly, altering the division of the wheel, what text would appear in each wedge, the colors of the wedges themselves, and so on.

In Burroughs’ own words, that first iteration of the guitar “kind of worked,” and featured a lot of elements that made it into the final product. It allowed players to feel like they were consciously choosing a chord, clearly explained when they got something right or made an error, and even delivered fun through failure by letting players hear whatever wrong chords they played.
