SonicTexting is a way of writing text - "texting" - using touch and sound. It explores people's hand-ear coordination and their ability to use sound as information. As with musical instruments and everyday mechanical objects, sound in SonicTexting is synchronous and responsive to actions. SonicTexting proposes that, through touch and sound, interacting with digital devices can become an experience on the border between a functional task, an instrument and a game.

 


The SonicTexting components and design process

 

The Keybong:

A one-handed joystick that naturally supports the SonicTexting gesture patterns. The Keybong also contains a push-button and a small eccentric motor that gives vibrotactile feedback during writing.


 

The Gesture Map:

A static visual representation of the gesture path for each letter, used for learning the gestures. It is read as follows: to write an 'axis' letter (a letter on one of the main axes), the controller is moved in that direction and then returned to the center. To write a nested letter (any other letter), first move in the axis direction...
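To make the axis-letter rule concrete, here is a minimal sketch of how such a gesture could be decoded from joystick samples. The letter layout, dead-zone threshold and function names are invented for illustration; SonicTexting's actual gesture map and implementation differ.

```python
import math

# Hypothetical layout: one letter per main axis direction
# (SonicTexting's real gesture map differs; this is only a sketch).
AXIS_LETTERS = {0: 'e', 90: 'a', 180: 't', 270: 'o'}  # degrees -> letter

def decode_axis_gesture(path, threshold=0.5):
    """Decode a single axis-letter gesture from a joystick path.

    `path` is a list of (x, y) joystick samples, each in [-1, 1].
    The gesture counts only if the stick left the center (beyond
    `threshold`) and then returned, as the gesture map prescribes.
    """
    # Find the sample of maximum displacement from the center.
    peak = max(path, key=lambda p: math.hypot(*p))
    if math.hypot(*peak) < threshold:
        return None  # stick never left the dead zone
    if math.hypot(*path[-1]) > threshold:
        return None  # gesture incomplete: stick did not return to center
    # Snap the peak direction to the nearest 90-degree axis.
    angle = math.degrees(math.atan2(peak[1], peak[0])) % 360
    nearest = min(AXIS_LETTERS,
                  key=lambda a: min(abs(angle - a), 360 - abs(angle - a)))
    return AXIS_LETTERS[nearest]

# A push to the right and back to center writes the 0-degree letter.
print(decode_axis_gesture([(0, 0), (0.4, 0.1), (0.9, 0.0), (0.3, 0.0), (0.0, 0.0)]))  # 'e'
```

A nested letter would extend this by examining where the path bends after the initial axis move.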


 

Sound:

Sound provides continuous feedback for navigation during the movement: an interactive sonification of the gesture path. There are two sound modes: beginner mode, which optimizes memorization and learning, and expert mode, which optimizes speed.
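As a rough illustration of continuous sonification, one simple scheme maps the stick's displacement from the center to pitch. The base frequency, range and function names below are invented for this sketch; the actual SonicTexting sound design is described in the paper.

```python
import math

# Hypothetical pitch mapping (invented for illustration; the real
# SonicTexting sonification is documented in the project's paper).
BASE_HZ = 220.0    # pitch at the center position
SPAN_OCTAVES = 2   # pitch rises as the stick moves outward

def sonify(x, y):
    """Map a joystick position in [-1, 1]^2 to a synthesis frequency.

    Displacement from the center controls pitch continuously, so the
    ear can track the gesture path while the eyes stay free.
    """
    r = min(math.hypot(x, y), 1.0)
    return BASE_HZ * 2 ** (SPAN_OCTAVES * r)

print(round(sonify(0.0, 0.0)))  # 220 at rest
print(round(sonify(1.0, 0.0)))  # 880 at full deflection
```

A beginner mode could slow or exaggerate such a mapping to aid memorization, while an expert mode could compress it for speed.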

 
 

SonicTexting enables writing an SMS from your pocket using only sound.
It has been exhibited and presented at the Triennale Museum in Milan, the Victoria and Albert Museum in London, the CHI conference, and more.
More about the sounds, sights and ideas behind SonicTexting can be found in the paper.

 

SonicTexting was created by Michal Rinott as part of her Master's thesis, 'Audio-Tactile', at Interaction Design Institute Ivrea.
Contact michalrin [at] gmail [dot] com.