Components
- Software Companion for the Project North Star open-source AR headset, allowing Unity scenes to be developed with MRTK/Leap Motion assets.
- LibPdIntegration: a wrapper for libpd that allows Pure Data patches to be embedded in Unity.
- Automatonism: a library of Pure Data Vanilla patches that emulate the modules of a modular synthesizer.
- A set of example scripts and scenes that use the above components to demonstrate possible interactions between head/hand tracking and patch parameters in Pd, with the chief aim of creating a set of expressive multisensory AR instruments / experiences.
- Six degrees-of-freedom (3D position / orientation) head tracking via Intel T261
- 90 fps, 170° hand tracking via Ultraleap
- Single-piece optical combiner allowing for up to 110° horizontal FoV
- 2x 120Hz displays (one per eye) for a combined resolution of 2880x1600
- 2x 3-metre cables (1x miniDP, 1x USB-A 3.1)
- Spatial audio AR (the ability to hear localised virtual sound whilst still hearing your real audio environment) via Unity3D and Aftershokz Aeropex bone conduction headphones.
- The ability to create 3D scenes containing 'GameObjects', which can have visual attributes such as 3D meshes, material colours, and textures; physical attributes such as edges, position, mass, and velocity; and real-time parameterisation of these attributes via C# scripting.
- Thanks to the Software Companion, the headset is represented as a GameObject with real-time position and orientation.
- Thanks to Leap Motion, hands (all the way down to individual finger joints) are represented as GameObjects with real-time position and orientation relative to the headset.
- LibPdIntegration uses native Unity3D audio spatialisation. This is great because it means that a GameObject can output the signal of a Pd patch whilst moving, rotating, and scaling, and the effect of these can be perceived in real time because the AudioListener is anchored to the real-time headset position. For example, the volume of a Pd patch whose signal is emitted from a GameObject located in space is automatically scaled depending on its distance from the participant's head (quieter as it moves further away, louder as it is brought closer); see the spatialisation sketch after this list.
- LibPdIntegration can 'instance' Pd patches, meaning it can use one patch on multiple GameObjects while maintaining processes like randomness independently within each, as they are technically different 'instances', or versions, of the patch (see the instancing sketch after this list).
- Pure Data allows extended audio techniques through an extensive library of algorithmic 'objects' that can create and manipulate audio signals.
- LibPdIntegration allows real-time control, from Unity, of the parameters of any object in a Pd patch via "receive" objects and a corresponding C# send method (see the parameter-control sketch after this list).
- The combination of "Play Mode" toggling in Unity and the quick visual patching style of Pure Data means that audio-visual interactions can be prototyped very rapidly.
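
A minimal sketch of the spatialisation setup described above, assuming a GameObject that also carries an AudioSource and LibPdIntegration's LibPdInstance component (with the Pd patch assigned in the Inspector); the rolloff values here are illustrative.

```csharp
using UnityEngine;

// Attach to a GameObject that also carries an AudioSource and a LibPdInstance
// (the Pd patch itself is assigned on the LibPdInstance in the Inspector).
// Setting spatialBlend to 1 makes the patch's output fully 3D, so its volume and
// panning follow the GameObject's distance from the headset-anchored AudioListener.
[RequireComponent(typeof(AudioSource))]
public class SpatialPdEmitter : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // fully spatialised (0 = 2D, 1 = 3D)
        source.rolloffMode = AudioRolloffMode.Logarithmic; // quieter as it moves away
        source.maxDistance = 10f;                          // illustrative: inaudible beyond ~10 m
    }
}
```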
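
A sketch of the instancing idea: a prefab carrying an AudioSource and a LibPdInstance configured with one patch is spawned several times, and each copy runs as its own instance of that patch, so per-instance processes such as randomness evolve independently. The prefab reference, count, and spacing are illustrative.

```csharp
using UnityEngine;

// Spawns several copies of a prefab that carries an AudioSource + LibPdInstance
// configured with the same Pd patch. Each copy is a separate patch instance,
// so processes such as [random] inside the patch run independently per copy.
public class PdPatchSpawner : MonoBehaviour
{
    public GameObject pdEmitterPrefab; // illustrative: prefab with AudioSource + LibPdInstance
    public int count = 3;

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            // Spread the instances 1.5 m apart, 2 m in front of this GameObject.
            Vector3 offset = new Vector3((i - (count - 1) * 0.5f) * 1.5f, 0f, 2f);
            Instantiate(pdEmitterPrefab, transform.position + offset, Quaternion.identity);
        }
    }
}
```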
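
A sketch of the parameter-control pattern: the patch contains a [receive brightness] object, and Unity drives it through LibPdInstance's SendFloat(receiver, value) method as shown in the LibPdIntegration examples. The "brightness" receiver name, the head Transform reference, and the 0–5 m mapping are purely illustrative.

```csharp
using UnityEngine;

// Maps the distance between the participant's head and this GameObject onto a
// [receive brightness] object in the attached Pd patch, once per frame.
// Assumes LibPdIntegration's LibPdInstance component and its SendFloat(receiver, value)
// method; the "brightness" receiver and the 0-5 m range are illustrative.
[RequireComponent(typeof(LibPdInstance))]
public class HeadDistanceToPd : MonoBehaviour
{
    public Transform head; // e.g. the headset GameObject provided by the Software Companion
    private LibPdInstance pd;

    void Start()
    {
        pd = GetComponent<LibPdInstance>();
    }

    void Update()
    {
        float distance = Vector3.Distance(head.position, transform.position);
        // Normalise 0-5 m to 0-1 and send to the patch's [r brightness].
        pd.SendFloat("brightness", Mathf.Clamp01(distance / 5f));
    }
}
```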