This page collects some of my creative experiments and prototypes with XR technologies.

 

Sound-based Interaction for Mobile Augmented Reality

The microphone, which is included in every smartphone, can be an incredibly powerful input device for creating new and unexpected interactions with AR content.

I have created several prototypes to explore this underestimated sensor.
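The core trick behind the two "blow" prototypes below can be sketched in a few lines. This is not the actual Unity/ARCore code, just a platform-agnostic illustration: blowing into a microphone produces sustained broadband noise, so measuring the energy (RMS) of each short audio frame against a threshold is enough for a demo. The frame size and threshold here are made-up values.

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def is_blowing(frame, threshold=0.2):
    """Blowing into the mic shows up as sustained, loud broadband noise,
    so a simple RMS threshold per frame is a workable first detector."""
    return rms(frame) > threshold
```

In the real apps the detector's output would then drive the AR content (petals flying off, chimes swinging).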

1. How to Blow Virtual Flowers  

Device: SONY Xperia XZ1

OS: Android

Made with: ARCore / Unity ARFoundation

 

2. How to Blow Virtual Chimes

Device: SONY Xperia XZ1

OS: Android

Made with: ARCore / Unity ARFoundation

 

3. “Knock Test”   

Device: SONY Xperia XZ1

OS: Android

Browser: Google Chrome

Made with: WebXR (three.js) / Tensorflow.js

This is a personal take on the famous (famous for AR developers!) “hit test”. A “hit test” is a function (common to several AR platforms, such as ARKit or, in this case, WebXR) that lets you point at an arbitrary surface in the real world and get an accurate 3D position of the point you are aiming at. Usually, once the position is retrieved, you tap the corresponding point on the screen and a virtual character or object appears there…here we don’t hit the screen, we knock directly on the surface we are looking at!
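Under the hood, a hit test boils down to intersecting a ray (from the camera through the point you aim at) with a surface the AR system has detected. A minimal, library-free sketch of that geometry, my own illustration rather than the WebXR API:

```python
def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal, eps=1e-6):
    """Return the 3D intersection of a ray with a plane, or None if the ray
    is parallel to it or the plane lies behind the ray origin. This is the
    geometric core of what AR 'hit test' APIs compute against detected planes."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < eps:
        return None  # ray is parallel to the plane
    t = dot([p - o for p, o in zip(plane_point, ray_origin)], plane_normal) / denom
    if t < 0:
        return None  # plane is behind the ray origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

For example, a ray cast straight down from one meter above a floor plane hits the floor at the point directly below the camera.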

The microphone can be used to capture specific sound events that can then be classified and recognized in real time. One of these is the sound of knocking your hand on a table, which can be used to summon virtual characters onto the table itself. The interaction becomes extremely natural, and the whole experience feels realistic and present.
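In the actual prototype a TensorFlow.js classifier recognizes the knock; as a much simpler stand-in, a percussive event like a knock can also be caught as a sudden jump in frame energy. A hypothetical onset detector along those lines (the ratio and floor values are illustrative):

```python
def detect_knocks(frame_energies, ratio=4.0, floor=0.01):
    """Flag frames whose energy jumps sharply above the previous frame:
    a crude onset detector for percussive events like a knock on a table."""
    knocks = []
    prev = floor
    for i, e in enumerate(frame_energies):
        if e > floor and e > ratio * prev:
            knocks.append(i)
        prev = max(e, floor)
    return knocks
```

A learned classifier is more robust (it can tell a knock from a hand clap), but the energy spike is the signal both approaches are keying on.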

 

Walking AR app for Smart Glasses

Digital pedometers and apps that track and analyze human walking are becoming increasingly popular. They can provide the user with a lot of information, such as how many steps they have walked, how many calories they have burned, and how long a walking session lasted.

However, to consult such apps you need to take your smartphone out of your pocket or look at your smartwatch.

By using AR glasses (such as the ones from Nreal) we can overlay this information onto our environment and have it displayed the whole time…or only for as long as we want.
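The step counting itself is classic signal processing; a simplified sketch of a pedometer core (not the app's actual code, and the threshold is an illustrative value): each step produces a bump in the magnitude of the 3-axis accelerometer signal above resting gravity (~9.81 m/s²), so counting upward threshold crossings counts steps.

```python
import math

def count_steps(accel, threshold=11.0):
    """Count steps as upward crossings of a threshold on the magnitude
    of (x, y, z) accelerometer samples, in m/s^2. The 'above' latch
    ensures each bump above the threshold is counted only once."""
    steps = 0
    above = False
    for (x, y, z) in accel:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps
```

Real pedometers add filtering and debouncing on top, but the threshold-crossing idea is the same.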

Device: Nreal Dev Kit

OS: Android

Made with: Unity / NRSDK

 

Musical Augmented Reality

This is one of my favorite areas of experimentation, and I don’t understand why it’s still so underdeveloped!

Over the years I have developed several iterations of AR musical instruments using different devices and platforms.

 

AR Theremin

Not everyone can own a Theremin. Especially a really old (and rare!) RCA Theremin.

In the past, I developed several AR and VR Theremins, and this is the latest one.

You can play a virtual version of the RCA Theremin directly in your house, or anywhere you want. This version can be experienced with the Nreal Light glasses, using the hand-tracking feature provided by the most recent Nreal SDK.

(the audio is generated and recorded in real-time, except for the rhythm track)
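The essential Theremin mapping is distance to pitch: the closer your hand gets to the (virtual) pitch antenna, the higher the note. A hedged sketch of how that mapping might look, with made-up ranges; the mapping is exponential so that equal hand movements feel like equal musical intervals:

```python
def hand_to_pitch(distance_m, f_min=100.0, f_max=2000.0, max_dist=0.6):
    """Map hand-to-antenna distance (meters) to a frequency (Hz).
    Closer hand = higher pitch, as on a real Theremin; the exponential
    curve makes the pitch response feel musically even."""
    d = min(max(distance_m, 0.0), max_dist)
    t = 1.0 - d / max_dist          # 1.0 at the antenna, 0.0 at max distance
    return f_min * (f_max / f_min) ** t
```

In the glasses version, the hand-tracking SDK would supply the hand position that feeds this distance.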

Device: Nreal Dev Kit

OS: Android

Made with: Unity / NRSDK

 

AR FM Synthesizer

Analog and digital synthesizers are nice, but interacting with their knobs (virtual or physical) can be extremely tricky and tedious…also they can be pricey and tend to occupy a lot of space.

This is an iteration on an AR sound synthesizer (using FM synthesis) that can be displayed anywhere. Moreover, you can interact with the instrument using the controller that comes with the Nreal glasses. The controller is composed of a touchpad and some motion sensors (gyroscope and accelerometer). Instead of replicating existing controls (such as knobs, buttons, and sliders), I developed a way to control an FM synthesizer that feels more embodied, direct, and rough. You are going to move and sweat.
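For readers unfamiliar with FM synthesis: a modulator oscillator wobbles the phase of a carrier oscillator, and the modulation index controls how bright and metallic the result sounds. A minimal two-operator sketch (plain Python for illustration, not the Unity audio code; in the prototype, controller motion would drive parameters like these):

```python
import math

def fm_sample(t, carrier_hz=220.0, mod_hz=110.0, index=3.0):
    """Classic two-operator FM synthesis:
    y(t) = sin(2*pi*fc*t + I * sin(2*pi*fm*t)),
    where fc is the carrier, fm the modulator, and I the modulation index."""
    return math.sin(2 * math.pi * carrier_hz * t
                    + index * math.sin(2 * math.pi * mod_hz * t))
```

Sweeping `index` with a gyroscope gesture, for instance, morphs the timbre continuously from a pure tone to a harsh, bell-like one.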

Device: Nreal Dev Kit

OS: Android

Made with: Unity / NRSDK

 

HoloStage: a Mixed-Reality Environment for Music Production and Performance

Digital music production software will eventually end up in the mixed-reality space. I don’t get why it isn’t already there!

HoloStage is a prototype for a very simple but powerful music production environment, where every instrument is presented as a virtual element. It can be used in your house or taken outside (as I did).

This is not just software, it’s an environment. It can be experienced by walking into it: each track becomes an object in space. The software can be experienced with our bodies; it is habitable.

The environment is composed of two main instruments: an 8-step sequencer and a Kaoss Pad-like controller for sound generation. With the former you can create and modify a rhythmic structure, while with the latter you can play some “melodies” on top.
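The sequencer logic itself is tiny; a sketch in plain Python (not the Unity implementation, and the tempo values are illustrative): an on/off pattern becomes a list of trigger times, which the audio engine then schedules.

```python
def sequencer_hits(pattern, bpm=120.0, steps_per_beat=2):
    """Turn an 8-step on/off pattern into trigger times in seconds.
    Each step lasts one half-beat here, so 8 steps fill one 4/4 bar."""
    step_dur = 60.0 / bpm / steps_per_beat
    return [i * step_dur for i, on in enumerate(pattern) if on]
```

In HoloStage each of those steps is also a physical object in space you can walk up to and toggle.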

In some of the videos below you will (sometimes) see a 3D avatar dancing…even if this environment is used primarily by musicians and producers, it can easily be turned into a performative environment where audience members, from different locations, participate with their avatars.

Device: HoloLens

Made with: Unity / MRTK

 

Sequencer

Hand Menu

Synth