This team of scientists and programmers has created some of the most advanced physical modelling synthesis algorithms in existence, so far ahead of their time that they have to be run on a remote, custom supercomputer. These systems can synthesise incredibly realistic approximations of acoustic sounds, but their experimental nature makes them far from accessible. The researchers therefore approached Gadi, known to them as a synthesis-focused composer from his earlier electronic work, to explore their potential for real-world use in music and sound design, and to find ways of making these tools accessible to musicians and sound artists with little background in coding, maths or physics. Gadi quickly realised that his rudimentary understanding of these fields fell short of what was needed to navigate the complex NESS parameter space creatively: he had to establish a deep dialogue with the scientists who wrote the code in order to even attempt using it. What began as technical mentoring, however, soon turned into a creative conversation about the fringe areas of the models' operation, where by delicately breaking the laws of physics it is possible to create otherworldly sounds, something of a sophisticated evolution of the concept of glitch. In this process, Gadi found a tool he had long been looking for: a bridge between real and abstract sonic spaces, with hyper-realistic sounds spectrally morphing into strange textures and vice versa. By mixing these algorithmic sounds with modular synths and live strings in his studio, Gadi was able to realise a vision in which tangible acoustic sounds and abstract synthesis are brought together in a new textural domain that exhibits traits of both.
So the music of the Multiverse album was born. The title reflects the fact that these pieces were made by simulating impossible geometries under bent rules of physics, creating instruments that do not belong in our universe: mile-long trumpets blown by air twice the temperature of Mercury, lattices of friction-free rattling metal objects, giant guitars fretted by needle-shaped fingers, infinitely bouncing masses and many more absurd devices.
The music on Multiverse exists both as stereo mixes and as an immersive multichannel experience: because the algorithmic geometries can be listened to from more than one point simultaneously, their output is naturally multichannel. These surreal sonic spaces have been performed at IRCAM, CCRMA, the Sound Forms Festival in Hong Kong and, most recently, with live octophonic synthesis and visuals at NIME 2018. The performances have been universally praised as unique experiences that put the listener in an altered state, and have inspired VR creators at the Tribeca Film Festival to turn them into virtual art and even a game, already in pre-production.
Multiverse involved years of ongoing work from a dozen people, including computer scientists, mathematicians, physicists, photographers, film makers, VR creators and the composer, working across four different countries. They are:
Gadi Sassoon - Composer, creative director, performer
Stefan Bilbao - Research, coding, NESS project leader
Michele Ducceschi - Research, coding, string models
Craig Webb - Research, coding, plug-in prototyping, UI
Reg Harris - Research, coding, brass models
Luigi Ziliani - Photography, phantom camera director
Giulia Ghedini - Video director, editor
Adrian Davies - Animation
Jessica Brillhart - VR design
David Gochfeld - VR design
Multiverse has also been made possible by the generous technical sponsorship of Physical Audio and iZotope.