// A Brenton Costa collaboration

Introduction

Synes was a UI experiment in mixed-reality audio mixing. We wanted to see if we could replace traditional digital audio workstation (DAW) controls with virtual hands and acoustically modeled spaces, creating mixes in a way that mirrors real-life studio processes. To produce a proof of concept, we focused on three main parts of the prototype:

  • A Live Room with Acoustic Modeling
  • Frequency Analyzer with Bandpass Audition
  • Record Box Project Selection

Live Room

We needed the space to be acoustically modeled so that the sound of each audio source would actually change depending on its position in the room. To accomplish this, I auditioned several audio engines in Unity, ultimately deciding on Steam Audio by Valve. I found that it produced the most realistic acoustic experience, including occlusion and reflection.
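Steam Audio derives these effects from the room geometry itself, but the core idea, that each source's level depends on how far away and how occluded it is, can be sketched in a few lines. The function below and its hand-set occlusion factor are illustrative assumptions, not Steam Audio's API:

```python
import math

def source_gain(source_pos, listener_pos, occlusion=1.0, ref_distance=1.0):
    """Toy model of per-source level in an acoustically modeled room:
    inverse-distance attenuation scaled by an occlusion factor in [0, 1].
    Steam Audio computes the occlusion term from actual room geometry;
    here it is simply passed in for illustration."""
    deltas = [s - l for s, l in zip(source_pos, listener_pos)]
    distance = math.sqrt(sum(d * d for d in deltas))
    distance_gain = ref_distance / max(distance, ref_distance)
    return distance_gain * occlusion

# A source 4 m away, half-occluded by a wall, reaches the listener at 1/8 gain.
gain = source_gain((4.0, 0.0, 0.0), (0.0, 0.0, 0.0), occlusion=0.5)
```

Moving the source around the virtual room changes both terms continuously, which is what makes placement itself a mixing gesture.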

Frequency Analyzer

We envisioned a completely different, more experiential way of listening to and auditioning frequencies by having them waft over the listener. This was accomplished through a combination of audio analysis, spawned prefab cubes, and a particle system to create the river of sound we were aiming for. The experiment also included hand tracking through Leap Motion: I affixed a rudimentary palm UI to the left hand that scaled the bandwidth of the notch filter (Q) to the physical scale of the frequency visualizer and provided a solo listening function.
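The relationship the palm UI leans on is that a filter's bandwidth is its center frequency divided by Q, so widening or narrowing the hand gesture directly widens or narrows the audible band. A minimal sketch of that mapping, where the Q range endpoints are illustrative assumptions rather than the values Synes used:

```python
def notch_bandwidth(center_hz, q):
    """For a notch (or bandpass) filter, bandwidth in Hz is f_c / Q."""
    return center_hz / q

def palm_scale_to_q(scale, q_min=0.5, q_max=20.0):
    """Map a normalized palm-UI value (0..1) onto a Q range.
    The endpoints here are assumed for illustration."""
    scale = min(max(scale, 0.0), 1.0)
    return q_min + scale * (q_max - q_min)

q = palm_scale_to_q(0.5)         # mid-gesture -> Q of 10.25
bw = notch_bandwidth(1000.0, q)  # roughly a 98 Hz band around 1 kHz
```

In the prototype this Q value would drive the audio filter while the same scale factor resizes the visualizer, keeping what you hear and what you see in step.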

Behind the Scenes