Friday, February 24, 2017

Acoustics in Games

As defined by the American National Standards Institute, acoustics is the science of sound; sound can be modelled as mechanical waves in an elastic medium using the acoustic wave equation. A fundamental concept in acoustics is frequency. "Frequency is defined as the number of times in a given period a repetitive phenomenon repeats; in audio, the unit frequency is hertz, which measures repetitions per second" (Somberg, 2017, pg. 5). Sound surrounds us, and games use sounds that mimic reality and create immersion for the players. Human hearing has a dynamic range of about 120 decibels (dB). Also, "humans have the ability to tell the direction that a sound is coming from, primarily due to geometry of the head and ears" (Somberg, 2017, pg. 6). Spatial hearing helps to create immersion in gameplay. In the past, the soundscape mainly provided acoustic feedback to the player, and music was not the main audio element. Consumer expectations have since changed: sound effects, music, and dialogue are now expected to form one cohesive audio vision.
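
To make those two numbers concrete, here is a minimal sketch (my own illustration, not from the sources) that generates one second of a 440 Hz sine tone, whose waveform repeats 440 times per second, and converts the 120 dB dynamic range into a linear amplitude ratio:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;
    const double sampleRate = 48000.0; // samples per second
    const double frequency  = 440.0;   // cycles per second (hertz)

    // One second of audio: this sine waveform repeats 440 times.
    std::vector<double> samples(static_cast<std::size_t>(sampleRate));
    for (std::size_t n = 0; n < samples.size(); ++n)
        samples[n] = std::sin(2.0 * kPi * frequency * n / sampleRate);

    // Decibels are logarithmic: a 120 dB dynamic range corresponds to a
    // 1,000,000 : 1 amplitude ratio, since dB = 20 * log10(ratio).
    const double quietest = std::pow(10.0, -120.0 / 20.0);
    std::printf("Amplitude at -120 dB below full scale: %g\n", quietest);
    return 0;
}
```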

To create a sense of space in games, audio designers have to model 3D audio. This is done by enabling spatialisation in the audio middleware. Game designers can experiment with different acoustic modelling techniques such as diffraction, occlusion, reflection, attenuation, and auralisation. An obstruction between the listener and the emitter results in a quieter sound with less high-frequency content; this is known as diffraction modelling. Diffraction allows sound to be heard even when the emitter is not in the listener's line of sight. "Diffraction can cause sound to bend past edges, thus allowing one to hear sound through portals such as doors or past objects" (Bengtsson, 2009, pg. 6).
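
At runtime this is often approximated with a gain cut plus a low-pass filter on the obstructed emitter, so high frequencies fall away faster than low ones. The sketch below is a minimal illustration, not any engine's actual code; the mapping constants are my own assumptions, and the obstruction amount (0 = clear line of sight, 1 = fully blocked) is presumed to come from the game's raycast query.

```cpp
#include <cmath>

class ObstructionFilter {
public:
    // Map obstruction [0, 1] to a gain and a low-pass cutoff. The
    // constants are illustrative, not standard values.
    void setObstruction(float obstruction, float sampleRate) {
        gain_ = 1.0f - 0.7f * obstruction;                  // quieter overall
        float cutoffHz = 20000.0f - 18000.0f * obstruction; // darker timbre
        // One-pole low-pass coefficient: y[n] = y[n-1] + a * (x[n] - y[n-1])
        a_ = 1.0f - std::exp(-2.0f * 3.14159265f * cutoffHz / sampleRate);
    }

    // Process one sample of the emitter's signal.
    float process(float input) {
        state_ += a_ * (input * gain_ - state_);
        return state_;
    }

private:
    float gain_ = 1.0f, a_ = 1.0f, state_ = 0.0f;
};
```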

Occlusion modelling is used when the sound emitter is placed in a different space from the listener. This acoustic model is frequently used when a door or window separates the two spaces. When a sound is confined in another space, it loses some of its high-frequency content and is attenuated.



An example of occlusion and attenuation is shown in the Counter-Strike (Valve Corporation, 2000) gameplay video above. The sound of the distant bomb is occluded and low-pass filtered [3:11]. Furthermore, it is attenuated because of its distance from the player.
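
Engines commonly treat obstruction and occlusion as two separate values driving an emitter's mix. The sketch below follows that widespread middleware convention, but it is my own illustration (the field names and scaling are assumptions), not code from Counter-Strike: obstruction muffles only the direct (dry) path, because reflected sound can still reach the listener around the obstacle, while occlusion muffles both the dry path and the reverb (wet) send, because the emitter is sealed in another space.

```cpp
// Gains applied to an emitter's two paths into the listener's mix.
struct EmitterSends {
    float dryGain = 1.0f; // direct path to the listener
    float wetGain = 1.0f; // send into the listener's reverb
};

EmitterSends applyFiltering(float obstruction, float occlusion) {
    EmitterSends s;
    s.dryGain = (1.0f - obstruction) * (1.0f - occlusion);
    s.wetGain = 1.0f - occlusion; // reflections survive obstruction only
    return s;
}
```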

Reflection occurs when sound bounces off a surface or strikes an object; the reflected sounds may reach the listener at different strengths, times, and pitches. An example is the echo one experiences near a cliff or a large building. Apart from echo, reflection also causes early reverberation and late reverberation. The audio designer will need to build a sound propagation system to cover more than one reverb zone for a single game object. Sound propagation comprises the direct sound (emitted directly from the sound source), the early reflections (the first echoes of a sound that reach the player after the direct sound arrives), and lastly the late reverberation (the last component heard by the player).

The video above shows reverb added to the space: the gunshot has a reverb tail that indicates the spatial quality of the environment [2:20].
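
To make the three propagation components concrete, here is a minimal per-sample sketch. It is only an illustration of the structure described above: the delay times and gains are arbitrary values I chose, and a real game would use a proper reverb algorithm instead of the single feedback delay line standing in for the late reverberation.

```cpp
#include <cstddef>
#include <vector>

class PropagationModel {
public:
    explicit PropagationModel(float sampleRate)
        : early_{{0.021f, 0.5f}, {0.034f, 0.35f}, {0.047f, 0.25f}},
          buffer_(static_cast<std::size_t>(sampleRate * 0.5f), 0.0f),
          sampleRate_(sampleRate) {}

    float process(float direct) {
        float out = direct; // direct sound arrives first, unmodified

        // Early reflections: delayed, attenuated copies of the source.
        for (const auto& tap : early_)
            out += tap.gain * readDelayed(tap.seconds);

        // Late reverberation: a feedback delay line whose repeats
        // blur into a dense tail.
        float late = readDelayed(0.085f);
        out += 0.3f * late;
        buffer_[writePos_] = direct + 0.6f * late; // feedback
        writePos_ = (writePos_ + 1) % buffer_.size();
        return out;
    }

private:
    struct Tap { float seconds, gain; };

    float readDelayed(float seconds) const {
        std::size_t d = static_cast<std::size_t>(seconds * sampleRate_);
        return buffer_[(writePos_ + buffer_.size() - d) % buffer_.size()];
    }

    std::vector<Tap> early_;
    std::vector<float> buffer_;
    std::size_t writePos_ = 0;
    float sampleRate_;
};
```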

Attenuation emphasises the placement of the sound source relative to the listener. "The strength of the sound source decreases with distance, from air absorption caused by the reflecting surfaces" (Bengtsson, 2009, pg. 6). The closer the listener is to the sound source, the louder it sounds.
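
A common starting point for modelling this is the inverse-distance law, where the gain halves (drops about 6 dB) for every doubling of distance. The sketch below uses assumed, tunable reference and maximum distances rather than any engine's exact curve:

```cpp
#include <algorithm>

float distanceGain(float distance, float referenceDistance = 1.0f,
                   float maxDistance = 100.0f) {
    distance = std::clamp(distance, referenceDistance, maxDistance);
    // Inverse-distance law: the gain halves (-6 dB) each time the
    // distance from the source doubles.
    return referenceDistance / distance;
}
```

At the reference distance the sound plays at full volume; at 2 m this returns 0.5, at 4 m 0.25, and so on until the gain is clamped at the maximum distance.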

"Auralisation models the listener as two point 'microphones' in the virtual world yields a rather unconvincing result" (Bengtsson, 2009, pg. 6). The sound designer could either mix the audio in surround systems (5.1 or 7.1 systems) or manipulate
a Head-Related Transfer Function (HRTF). "Head-related transfer function is a function used in acoustics that characterises how a particular ear receive sound from a point in space" (Potisk, 2015, pg.1).
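
As a rough sketch of what applying an HRTF involves: the mono source is convolved with a left-ear and a right-ear head-related impulse response (HRIR) measured for the sound's direction. The code below is only illustrative; it assumes the HRIRs come from some measured data set, and a real engine would interpolate between measured directions and convolve in the frequency domain for speed.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct BinauralFrame { float left, right; };

// Direct-form convolution of a mono signal with per-ear HRIRs.
std::vector<BinauralFrame> renderBinaural(const std::vector<float>& mono,
                                          const std::vector<float>& hrirLeft,
                                          const std::vector<float>& hrirRight) {
    const std::size_t taps = std::min(hrirLeft.size(), hrirRight.size());
    std::vector<BinauralFrame> out(mono.size(), {0.0f, 0.0f});
    for (std::size_t n = 0; n < mono.size(); ++n) {
        for (std::size_t k = 0; k < taps && k <= n; ++k) {
            out[n].left  += hrirLeft[k]  * mono[n - k];
            out[n].right += hrirRight[k] * mono[n - k];
        }
    }
    return out;
}
```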

Bibliography


Bengtsson, J. (2009) Real-time Acoustics Modeling in Games. thesis. Available from: [Accessed 23 February 2017]. 

T. (2012) Counter-Strike: Global Offensive Gameplay PC HD [Internet]. Available from: [Accessed 23 February 2017]. 

Anon (2011) GSound: Interactive Sound Propagation for Games. Conference paper. AES 41st Conference: Audio for Games. Available from: [Accessed 23 February 2017]. 

Guay, J.-F. (2012) Real-time Sound Propagation in Video Games. In: Game Developers Conference 2012. California, Ubisoft Montreal. Available from: [Accessed 23 February 2017]. 

Howard, D.M. & Angus, J. (2009) Acoustics and psychoacoustics. 4th ed. Oxford, UK, Focal Press. 

Potisk, T. (2015) Head-Related Transfer Function. thesis. Available from: [Accessed 23 February 2017]. 

Somberg, G. (2017) Game audio programming: principles and practices. Boca Raton, FL, CRC Press Taylor & Francis Group. 

Valve Corporation (2000) Counter-Strike, computer game, Microsoft Windows, Valve Corporation America.




MY VIDEO!