
Monday, 30 July 2012

Different Keys: How Sound Works In Games

The games industry is a very broad artistic church. From architecture to sound engineering, almost every artform is represented in some way, each at its own stage of evolution. And while many games can’t manage to tell a story more complicated than Roger Red Hat, gaming’s orchestral scores and electronic soundtracks are held in the highest regard.

It’s fair to say that gaming’s contribution to music is one of its biggest success stories. These days, videogame music is played in concert halls around the world, and its creators are some of the most respected people in the industry. But what exactly is music’s contribution to gaming? After all, music is not a fundamental component in the development of a game, insofar as the game will continue to function without it.

Yet the pervasiveness, quality and success of videogame music indicates it is more vital than may initially be apparent. So what can we learn from music’s influence on gaming? How does it affect the player’s experience? And to what extent can it influence the way games are created? To answer these questions, I spoke to three members of the games industry who have tackled the relationship between music and videogames in very different ways.

Jack Wall is one of the game industry’s best known composers, having written music for the first two Mass Effect games and the likes of Unreal II and the Myst series prior to that. He is also the co-founder of Video Games Live – the touring concert event which had its debut in 2005. Currently working on the score for Black Ops 2, Jack believes that music plays a crucial role in the development of storytelling in games. “Music is the unseen character. It's the emotion behind the actions of the player. It's gently there to show the game designer's intention. It's totally collaborative with the developer.”


This is the relationship between music and games that we are probably most familiar with. Music can be used to accentuate the actions of the player, to provide emotional cues and to communicate the tone of the current level or scene. In some ways, scoring a game like this is similar to writing music for a film, designed to run alongside the events playing out on screen.

But even the most directed games must take into account one massive variable: the player. With a complex game like Mass Effect, this can include how long a player is in a particular area, transitions between peaceful and combat situations and the choices the player can make. “It's really not until I start to see gameplay that I truly know what to do with the music,” Jack says. “As soon as I see a decent rendering of a level I can get a beat on what the music should do.”
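In practice, adaptive scores like this are often built from layered stems that the game crossfades between as the situation changes. The sketch below is a minimal Python illustration of that idea, assuming a hypothetical audio engine with a set_volume() call; it is not how Mass Effect's own audio middleware works, just the general shape of the technique.

```python
# Minimal sketch of adaptive music layering. The "engine" object and its
# set_volume(track, gain) method are hypothetical stand-ins for whatever
# audio middleware a game actually uses.

EXPLORE, COMBAT = "explore", "combat"

class AdaptiveScore:
    def __init__(self, engine, fade_seconds=2.0):
        self.engine = engine              # hypothetical audio engine wrapper
        self.fade_seconds = fade_seconds  # how long a full crossfade takes
        self.state = EXPLORE
        self.blend = 0.0                  # 0.0 = all explore stem, 1.0 = all combat stem

    def set_state(self, state):
        """Called by gameplay code when the situation changes."""
        self.state = state

    def update(self, dt):
        """Called every frame; nudges the mix toward the current state."""
        target = 1.0 if self.state == COMBAT else 0.0
        step = dt / self.fade_seconds
        if target > self.blend:
            self.blend = min(target, self.blend + step)
        else:
            self.blend = max(target, self.blend - step)
        self.engine.set_volume("explore_stem", 1.0 - self.blend)
        self.engine.set_volume("combat_stem", self.blend)
```

Because the blend is driven by game state rather than a fixed timeline, the same two stems can cover a fight that lasts ten seconds or ten minutes, which is exactly the variable Jack describes having to write around.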

Interestingly though, music does not always follow the pace of the game. Sometimes the opposite is true. An example of this is the end-run [Minor Spoilers] from Mass Effect 2, composed by Jack before this part of the game was developed:

“Casey Hudson came to me fairly early and said, ‘I'd like you to start by writing the end music for the game. [Spoiler] There's going to be this suicide mission and I want it to feel like you're taking your team and rushing in to save the universe’. He was giving me permission to write a truly kick-ass piece of music. No visuals and no timing. He wanted to get the music done so that when he was piecing the ending together, he'd be listening to that piece of music.”

When music begins to directly influence not just how a game is experienced, but how a game is actually created, things become really intriguing. Sometimes music can be the basis upon which entire games are built. Since the advent of home and commercial computing, it has been possible to break down digitised music into its component parts. The resulting information can be used in the creation of levels and environments – and the most successful example of such a game is 2008’s musical rollercoaster Audiosurf.


“A big part of it was just wanting a better music visualiser,” says Dylan Fitterer, Audiosurf’s creator. “There were a few I enjoyed, but they quickly became boring and my mind would wander. I wanted one that could create a better listening experience by focusing my attention on the music.”

The path to creating a more engaging music visualiser involved making it interactive, compelling the player to react to something directly linked to the music. Audiosurf further differentiates itself from standard music visualisers in more fundamental ways, as Dylan explains. “It analyses the entire song before the player starts listening. This way Audiosurf can visualise not only the music as it happens, but also the music's future. Players can see the music coming to heighten their feeling of anticipation.”
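Audiosurf's actual analysis is proprietary, but the general approach Dylan describes, scanning the whole song up front and turning its intensity over time into level data, can be sketched roughly in Python with the librosa library. The function name and the mapping to track steepness below are illustrative assumptions, not Audiosurf's algorithm.

```python
# Illustrative "analyse the whole song before playback" sketch.
# Requires the librosa and numpy libraries; track_profile() is invented here.
import numpy as np
import librosa

def track_profile(path, segment_seconds=1.0):
    """Return one 0..1 intensity value per segment of the song, computed
    before playback starts, so the 'future' of the music is already known."""
    y, sr = librosa.load(path, mono=True)
    # Onset strength is a rough proxy for how "busy" the music is.
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    frames_per_segment = int(segment_seconds * sr / 512)  # default hop length is 512
    segments = [
        float(np.mean(onset_env[i:i + frames_per_segment]))
        for i in range(0, len(onset_env), frames_per_segment)
    ]
    peak = max(segments, default=1.0) or 1.0
    return [s / peak for s in segments]

# A game could then map each value to track shape: intense passages become
# downhill rushes, quiet passages become slow uphill climbs.
```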

Audiosurf approaches music from the opposite direction to Mass Effect. Whereas Jack creates music for games, Dylan has created a game for music. At the same time, however, both games share a striking commonality, which Dylan goes some way to explaining. “I'm excited to have found a tight relationship between gameplay and music. Because it's a game, Audiosurf is a better music visualiser. Gameplay goals serve to focus the player completely on the music. Because it's a music visualiser, Audiosurf is a better game. Replayability is my most important ideal in game creation, and music gives Audiosurf unlimited replay.”

Here we have a conceptual feedback loop in which the music and the game complement each other. Music becomes more powerful, more evocative, when the listener is involved in an activity linked directly to it. In turn, those actions that become associated with the music are themselves heightened. This applies equally to Mass Effect and Audiosurf, even though these games implement music in very different ways.


One man who has put considerable thought into the theoretical similarities between gaming and music is David Kanaga, co-developer of the upcoming maverick indie title Proteus.

“Many of us are not accustomed to perceiving the time-structures (rhythms/forms) of videogames as musical,” David says. “They are often freely or procedurally rhythmic and can have strange formal structures, all emerging from the player’s inclinations. These rhythms and forms tend to more closely resemble free jazz and other improvised music than they do the film scores or pop forms that a lot of game music takes its inspiration from.”

Proteus strips away many of the conventions we have traditionally come to associate with videogames, foremost amongst which are a predetermined challenge or objective, and the ability to directly interact with the environment. Interaction is only possible through the game’s music, which changes dynamically depending on the player’s location, the time of day, the season, and so on. Every tree, animal, and building emits an individual noise, the sounds gradually layering themselves as the player explores the game’s islands.

“Music in videogames can be so interesting because you’re creating a musical space, a field of possibilities, rather than a musical script. This allows the composition process to be infused with improvisation/play at all times, even as a final product. And because of how the computer is good at handling data, and manipulating it, it’s possible to delineate the boundaries of the spaces in dynamic ways that haven’t really been possible until now.”
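As a rough illustration of what such a “musical space” might look like under the hood, here is a small Python sketch in which each world object owns a sound layer whose volume is driven by the player's distance to it and a time-of-day factor. The names, falloff rule and day/night scaling are hypothetical; Proteus's own implementation is not public.

```python
# Sketch of a "musical space": every object contributes a sound layer,
# and the mix emerges from where the player happens to wander.
import math

class SoundEmitter:
    def __init__(self, name, position, audible_range=30.0):
        self.name = name
        self.position = position          # (x, y) world coordinates
        self.audible_range = audible_range

    def gain_for(self, player_pos):
        dx = self.position[0] - player_pos[0]
        dy = self.position[1] - player_pos[1]
        distance = math.hypot(dx, dy)
        # Linear falloff: full volume at the object, silent beyond its range.
        return max(0.0, 1.0 - distance / self.audible_range)

def mix(emitters, player_pos, time_of_day_gain=1.0):
    """Return {layer name: volume}, scaled by a day/night factor."""
    return {e.name: e.gain_for(player_pos) * time_of_day_gain for e in emitters}

# Example: two trees and an animal scattered around the player at the origin.
layers = mix(
    [SoundEmitter("tree_chime", (5, 0)),
     SoundEmitter("tree_chime_2", (40, 40)),
     SoundEmitter("animal_trill", (0, 10), audible_range=15.0)],
    player_pos=(0, 0),
)
```

There is no score being followed here: the “composition” is just the field of emitters, and the player's path through it decides what is actually heard.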

Proteus sits somewhere between Mass Effect and Audiosurf in terms of how it plays with music. Like Mass Effect, the music alters depending on the actions of the player, but at a layered, note-by-note level rather than in specifically composed chunks. And like Audiosurf, the music is directly related to the layout of the level, except the relationship is inverted: the movement of the player dictates how the music evolves. The result is a curious psychological effect in which the player is encouraged to explore and experiment without any direct instruction.


Source: ign[dot]com

Monday, 23 July 2012

University of Southern California is Making a Real-Life Holodeck




Project Holodeck is exactly what it sounds like: the University of Southern California’s schools of engineering and cinematic arts have joined forces to develop a virtual reality gaming platform, built almost entirely from widely available technology. 


Players’ body and head movements are tracked by a Razer Hydra and a PlayStation Move respectively. Variable-speed fans (the prototype’s are Arduino-controlled) simulate wind for tactile feedback. The team is building its software on the Unity3D engine.
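The article doesn’t describe how the wind fans are driven, but one plausible arrangement, given that the prototype’s fans are Arduino-controlled, is a host-side script that streams a speed value to the board over serial. The Python sketch below assumes pyserial, a 9600-baud link and a one-byte duty-cycle protocol; the port name and protocol are guesses for illustration, not Project Holodeck’s actual setup.

```python
# Hypothetical host-side fan control: map a normalised in-game wind value
# to a single 0-255 duty-cycle byte and send it to an Arduino over serial.
import serial  # pyserial

def send_wind_speed(port, wind_normalised):
    """Send one fan duty-cycle byte for a wind value in the range [0, 1]."""
    duty = int(max(0.0, min(1.0, wind_normalised)) * 255)
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(bytes([duty]))

# e.g. send_wind_speed("/dev/ttyACM0", 0.6)  # port name is an assumption
```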



Project Holodeck’s main development snag for the moment is its head-mounted display. The Oculus RIFT, designed by Project Holodeck’s Lead Hardware Engineer Palmer Luckey, will have a resolution of 640x800 per eye, with a field of view wide enough to simulate peripheral vision. “This isn’t like watching a floating television,” the project’s hardware page explains. But pending a Kickstarter campaign, which will price the HMD at $500, the RIFT exists only as a prototype.



The $500 price point will certainly make the RIFT a competitor to Sony’s HMZ-T1, but it’s hard to imagine something with such a limited range of uses appealing to anyone except serious gaming hobbyists. “We want to make the dream of a VR play space a reality, and at an affordable cost,” the team has stated. But unless you already own all the required components (at least one version of the platform featured four networked Kinects), setting up a living-room VR system looks like it would be prohibitively expensive for average consumers.


The USC team is reportedly developing its own game, Wild Skies, to showcase the platform. No footage or even a screenshot has been posted, so the game likely doesn’t yet exist in any playable form. Still, a recent Project Holodeck video, which combines gameplay clips from “Skies of Arcadia” with (we’re assuming) staged footage of player actions, makes the concept of a sense-immersive flight-sim/adventure game look surprisingly feasible.





Project Holodeck’s hints at a partnership with Disney Imagineering make full commercial availability of the system seem even less likely, at least for the next few years. But maybe that’s a good thing. VR gaming is hardly a sure thing for developers. (Remember Nintendo’s Virtual Boy? Most people don’t.) But current technology would certainly make developing VR games easier and more affordable than it’s been in the past.


If there were a consumer VR platform, would you buy it? More importantly, how little would it have to cost? Let us know in the comments.



Source: ign[dot]com