Wave Field Synthesis & Spherical Harmonics
I went to the 2014 edition of the Manifeste festival at IRCAM and attended Fundamental Forces, an audiovisual projection by Robert Henke and Tarik Barri. The piece focuses on accelerating motion and on the attraction between visual and sonic structures: complex shapes and sound textures emerge from repeated applications of simple mathematical operations, then are transformed and thrown around in real time to form an abstract world of objects floating in deep space. The visual component is based on Tarik Barri’s ‘Versum’ computer animation engine, while the audio is created by Robert Henke using real-time synthesis modules built in Max/MSP and embedded in a Max for Live environment.
Adapted for the venue, the premiere of Fundamental Forces benefited from Wave Field Synthesis (WFS) and Ambisonics, two techniques that reconstruct the physical properties of a sound field.
Based on Huygens’ Principle, WFS makes it possible to synthesize “sound holograms” by simulating the acoustic waves produced by virtual sound sources. A large number of regularly spaced loudspeakers are driven together, each with its own delay and gain, to preserve the fidelity of the spatial image.
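The per-speaker delay and gain idea can be sketched in a few lines: each loudspeaker is delayed by its distance to the virtual source divided by the speed of sound, and attenuated with distance to mimic spherical spreading. This is a minimal illustration, not IRCAM's implementation; the function and variable names are my own.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature

def wfs_driving_parameters(source_pos, speaker_positions):
    """Return (delays in seconds, gains) for each loudspeaker so that
    the array jointly re-creates the wavefront of a virtual point source
    located at source_pos (behind the array)."""
    source = np.asarray(source_pos, dtype=float)
    speakers = np.asarray(speaker_positions, dtype=float)
    distances = np.linalg.norm(speakers - source, axis=1)
    delays = distances / SPEED_OF_SOUND          # farther speakers fire later
    gains = 1.0 / np.maximum(distances, 1e-6)    # ~1/r spherical spreading
    return delays, gains

# Example: 8 speakers spaced 0.5 m apart along x, virtual source 2 m behind
speakers = [(0.5 * i, 0.0) for i in range(8)]
delays, gains = wfs_driving_parameters((1.75, -2.0), speakers)
```

Feeding each speaker the source signal delayed by `delays[i]` and scaled by `gains[i]` approximates, per Huygens' Principle, the wavefront the real source would have produced.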
Ambisonics is a method of recording and reproducing 3D sound that represents the spatial dependence of an acoustic field as a combination of basic spatial patterns. The sound is encoded into an ordered series of components whose directivities are given by the spherical harmonics.
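For a first-order example, the four B-format components W, X, Y, Z correspond to the zeroth- and first-order spherical harmonics (an omnidirectional pattern plus three orthogonal figure-of-eight patterns). A minimal mono encoder, using the traditional FuMa weighting for W, might look like this; the function name is illustrative, not from any particular library.

```python
import math

def encode_bformat(sample, azimuth, elevation):
    """Encode a mono sample arriving from (azimuth, elevation), in
    radians, into first-order B-format components W, X, Y, Z."""
    w = sample * (1.0 / math.sqrt(2.0))                   # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z

# A source straight ahead (azimuth 0, elevation 0) excites only W and X
w, x, y, z = encode_bformat(1.0, 0.0, 0.0)
```

Higher-order Ambisonics extends the same idea with more spherical-harmonic components, sharpening the spatial resolution at the cost of more channels.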
I also had a glimpse of the research going on in the studios, from voice synthesis to the IRCAMAX audio plugins and gesture-sound interaction. I discussed the work done internally on web-based sound visualization and generation using the low-level Web Audio API, and had a look at the open source code made available on the IRCAM GitHub repo.