TacTile App on iOS

This article describes an experimental setup for studying the effects of different sensory modalities on the perception of touch. Using a psychophysical approach, we examine the role different sensory inputs play in the ability to distinguish textured surfaces by touch. Using motion capture of the hand and fingers, together with force sensors on the textured objects measuring surface reaction forces, we capture a rich set of behavioural data as well as user responses to the tactile task.

Tactile feedback is one of the least studied areas in the design of physical interactions with virtual environments. Whether through the screen of a phone or in virtual and augmented reality, vibro-tactile feedback is often treated as an optional feature rather than as essential to the experience.

However, in the last five years there have been several notable developments towards more advanced vibro-tactile stimuli, such as the Oculus Rift Touch Controllers and Apple's Taptic Engine (Force Touch). These systems use linear actuators rather than rotary motors; linear actuators have shorter ramp-up and ramp-down phases and can therefore deliver effective vibro-tactile stimuli with lower latency.
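The latency difference can be illustrated with a simple first-order rise-time model: an actuator's output amplitude approaches its maximum as 1 − exp(−t/τ), so a smaller time constant τ means the stimulus reaches perceptible amplitude sooner. The sketch below is purely illustrative; the time constants are hypothetical placeholders, not measured values for any particular motor or actuator.

```python
import math

def time_to_90_percent(tau_ms):
    """Time (ms) for a first-order system a(t) = 1 - exp(-t/tau)
    to reach 90% of its final amplitude: tau * ln(10)."""
    return tau_ms * math.log(10)

# Hypothetical time constants for comparison only:
erm_tau_ms = 30.0  # rotary (eccentric rotating mass) motor
lra_tau_ms = 5.0   # linear resonant actuator

print(f"Rotary motor reaches 90% amplitude in ~{time_to_90_percent(erm_tau_ms):.1f} ms")
print(f"Linear actuator reaches 90% amplitude in ~{time_to_90_percent(lra_tau_ms):.1f} ms")
```

With these illustrative values the linear actuator crosses the 90% mark roughly six times sooner, which is why such actuators can render short, crisp tactile events that a slowly spinning-up rotary motor cannot.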

In this project I present TacTile, an iOS app I have been developing to study multi-sensory roughness perception: the perception of surface roughness through vision, audition and touch.

The idea is that the perceived roughness or smoothness of a surface depends, to different degrees, on the afferent visual, auditory and vibratory neural signals reaching the brain.

Through the app we can control the level of each signal modality in order to “make” a surface feel smoother or rougher (diagram 1).
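One way to make this idea concrete is a simple weighted cue-combination model, a standard framing in psychophysics: the overall roughness percept is a reliability-weighted average of the per-modality estimates. The sketch below is purely illustrative and is not the app's actual model; the modality names, estimates and weights are hypothetical stand-ins for the per-modality signal levels the app manipulates.

```python
def perceived_roughness(estimates, weights):
    """Weighted cue combination: each modality's roughness estimate
    contributes in proportion to its (normalised) weight."""
    total = sum(weights.values())
    return sum(estimates[m] * weights[m] / total for m in estimates)

# Hypothetical per-modality roughness estimates on a 0-1 scale:
estimates = {"visual": 0.3, "auditory": 0.5, "tactile": 0.9}

# When the tactile channel is weighted heavily, the percept sits near
# the tactile estimate; attenuating that channel shifts the percept
# toward what vision and audition report.
full = perceived_roughness(estimates, {"visual": 1.0, "auditory": 1.0, "tactile": 2.0})
damped = perceived_roughness(estimates, {"visual": 1.0, "auditory": 1.0, "tactile": 0.5})

print(f"Tactile emphasised: {full:.2f}")
print(f"Tactile attenuated: {damped:.2f}")
```

Under this toy model, turning a modality's level down in the app corresponds to lowering its weight, which pulls the combined percept toward the remaining channels, exactly the kind of "making a surface feel smoother or rougher" the experiment is designed to probe.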