One of my interests is the development of haptic devices that help humans physically touch virtual environments and digital data. I feel the need to find new ways to bring the digital world closer to our bodies and away from omnipresent screens and other pixel-based displays.

On this page, I'm collecting several prototypes and systems I designed and developed for different clients and artists. They range from haptic displays to haptic interfaces for audio-visual virtual environments.

 

1. Physical Haptic Display

2. Force-feedback: contact and sound

3. Force-feedback: spring and sound

4. Other haptic-sound tests

 

Physical Haptic Display

 

Two-way Haptic Display

The physical artifact displays a motion pattern. When a person touches its top, the motion stops and can be directly modified through touch. The artifact exhibits a spring-like behavior that produces perceivable force feedback, which can be used to render different types of materials and physical properties of virtual materials. When the person releases their finger, the motion pattern restarts.
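The touch-dependent behavior described above can be sketched as a simple control loop: track the motion pattern while untouched, and switch to a Hooke's-law spring when touch is detected. This is a minimal illustration, not the device's actual firmware; all names and gains (`k`, `rest_pos`, `control_step`) are assumptions.

```python
# Hypothetical sketch of the display's control loop: follow a motion
# pattern until touch is detected, then render a virtual spring.
# The gain values are illustrative, not measured from the device.

def spring_force(position, rest_pos, k=0.8):
    """Hooke's-law restoring force pulling the surface back to rest_pos."""
    return -k * (position - rest_pos)

def control_step(touched, position, pattern_target, rest_pos):
    """Return the actuator command for one update of the loop."""
    if touched:
        # Touch input: pause the pattern and render spring-like feedback.
        return spring_force(position, rest_pos)
    # No touch: track the current sample of the motion pattern.
    return 0.5 * (pattern_target - position)
```

Varying `k` is one way such a display could render different virtual materials: a stiffer spring reads as a harder surface under the finger.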

 

Physical Output 

The physical artifact plays back different motion patterns.

 

 

 

Here, the motion is manually controlled through a GUI slider.
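Driving the artifact from a GUI slider amounts to clamping the slider value and scaling it to the actuator's travel. A minimal sketch, assuming a 0..1 slider range and a hypothetical travel in millimetres:

```python
# Assumed mechanical travel of the display; the real value is unknown.
TRAVEL_MM = 30.0

def slider_to_position(slider_value):
    """Clamp the slider to [0, 1] and scale it to actuator travel."""
    clamped = min(max(slider_value, 0.0), 1.0)
    return clamped * TRAVEL_MM
```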

 

 

 

Force-feedback: contact and sound


Force-feedback: spring and sound


Other haptic-sound tests

A work-in-progress collection of haptic devices for Virtual Musical Instruments.

 

A force-feedback interface for interacting with a virtual reed instrument: a hybrid instrument whose sound lies between a clarinet and a saxophone. The sound model is developed with Modalys.

 

Plucking a virtual string. The sound model is developed with Modalys.
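The actual sound model here is built in Modalys, but as a self-contained illustration of what "plucking a virtual string" means computationally, here is the classic Karplus-Strong algorithm: a noise burst (the pluck) circulates in a delay line whose lowpass feedback damps high frequencies over time, much like a real string. This is a generic sketch, not the author's model.

```python
import random

def karplus_strong(freq=220.0, sample_rate=44100, duration=0.05):
    """Classic Karplus-Strong plucked-string synthesis (illustrative,
    not the Modalys model shown in the video)."""
    period = int(sample_rate / freq)  # delay-line length sets the pitch
    # A burst of noise models the broadband energy of the pluck.
    line = [random.uniform(-1.0, 1.0) for _ in range(period)]
    out = []
    for _ in range(int(sample_rate * duration)):
        # Averaging two adjacent samples is a lowpass in the feedback
        # loop: high frequencies decay faster, as on a real string.
        sample = 0.5 * (line[0] + line[1])
        out.append(sample)
        line = line[1:] + [sample]
    return out
```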

 

This application lets the user control a virtual bow that interacts with different types of virtual metal plates. The sound model is developed with Modalys.
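Modalys builds its instruments from modal models: a vibrating object such as a metal plate is described as a bank of resonant modes, each with its own frequency, damping, and amplitude. A toy stand-in, summing a few damped sinusoids as the plate's impulse response; the mode values are invented for illustration and real Modalys models are far richer:

```python
import math

def modal_response(modes, sample_rate=44100, duration=0.01):
    """Impulse response of a mode bank, each mode given as
    (freq_hz, decay_per_second, amplitude)."""
    n = int(sample_rate * duration)
    out = [0.0] * n
    for freq, decay, amp in modes:
        for i in range(n):
            t = i / sample_rate
            # Each mode is an exponentially decaying sinusoid.
            out[i] += amp * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
    return out

# Three hypothetical plate modes; their inharmonic spacing is what
# gives metal plates their characteristic clangorous sound.
plate = [(220.0, 6.0, 1.0), (507.0, 9.0, 0.6), (914.0, 14.0, 0.4)]
```

In a bowed interaction like the one shown, the excitation would be a sustained friction force rather than an impulse, but the plate's modal description stays the same.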