TUIO 1.0 Protocol Specification


Since the publication of the TUIO [1] protocol specification to the public domain, and its initial implementation within reacTIVision [2], the protocol has been adopted by several other projects related to tangible and multi-touch interaction, such as those of the NUI Group community and various other tangible interaction platforms. This document provides an online resource for the public protocol specification, correcting a few errors from the original document. The protocol's widespread adoption suggests that it is a useful tool, and it has proved versatile and stable enough for its main purpose: the transmission of object and finger parameters for interactive surfaces.

General Observations

This protocol definition is an attempt to provide a general and versatile communication interface between tangible table-top controller interfaces and underlying application layers. It was designed to meet the needs of table-top interactive multi-touch surfaces, where the user is able to manipulate a set of objects and draw gestures onto the table surface with the fingertips. The objects are tracked by a sensor system and can be identified and located in position and orientation on the table surface. An additional cursor object, representing for example finger touches, does not have a unique ID and does not provide rotation information.

Implementation Details

The TUIO protocol defines two main classes of messages: set messages and alive messages. Set messages communicate information about an object's state, such as its position, orientation, and other recognized properties. Alive messages indicate the current set of objects present on the surface using a list of unique session IDs.

To avoid possible errors arising from packet loss, no explicit add or remove messages are included in the TUIO protocol. The receiver deduces object lifetimes by examining the difference between sequential ALIVE messages.
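
The diffing step described above can be sketched as a pair of set differences. This is an illustrative Python fragment, not part of the specification; session IDs are modeled as plain integers:

```python
def diff_alive(previous, current):
    """Derive add/remove events from two consecutive ALIVE messages.

    Returns (added, removed) session IDs as sets."""
    prev, curr = set(previous), set(current)
    added = curr - prev      # sessions that appeared since the last ALIVE
    removed = prev - curr    # sessions that vanished since the last ALIVE
    return added, removed
```

For example, receiving ALIVE lists [1, 2, 3] and then [2, 3, 4] means session 4 was added and session 1 was removed.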

In addition to SET and ALIVE messages, FSEQ messages are defined to tag each update step with a unique frame sequence ID.

Efficiency & Reliability

In order to provide low-latency communication, our implementation of the TUIO protocol uses UDP transport. With UDP, some packets may be lost. Therefore, our implementation of the TUIO protocol includes redundant information to compensate for possible packet loss while maintaining efficient usage of the channel. An alternative TCP connection would guarantee reliable transport, but at the cost of higher latency.

For efficiency reasons, set messages are packed into a bundle to make full use of the space provided by a UDP packet. Each bundle also includes a redundant alive message to allow for the possibility of packet loss. For larger object sets, a series of packets, each including an alive message, is transmitted. When the surface is quiescent, alive messages are sent at a fixed rate dependent on the channel quality, for example once per second, to ensure that the receiver eventually acquires a consistent view of the set of alive objects.

The state of each alive but unchanged object is periodically resent with additional set messages. This redundant information is resent at a lower rate, and includes only a subset of the unchanged objects at each update. The subset is continuously cycled so that each object is periodically addressed.
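
The specification does not mandate a particular cycling scheme; one possible round-robin sketch in Python, where each low-rate update resends a fixed-size chunk of the unchanged session IDs, might look like:

```python
def redundancy_cycle(session_ids, chunk_size):
    """Yield successive subsets of alive-but-unchanged session IDs,
    cycling through the whole set so that every object is periodically
    readdressed by a redundant set message."""
    idx = 0
    while True:
        if not session_ids:
            yield []
            continue
        n = min(chunk_size, len(session_ids))
        # wrap around the list so the cycle never skips an object
        chunk = [session_ids[(idx + k) % len(session_ids)] for k in range(n)]
        idx = (idx + n) % len(session_ids)
        yield chunk
```

With four unchanged objects and a chunk size of two, successive updates would resend [1, 2], then [3, 4], then [1, 2] again.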

Finally, each packet is marked with a frame sequence ID (fseq) message: an increasing number that is the same for all packets containing data acquired at the same time. This allows the client to maintain consistency by identifying and dropping out-of-order packets.
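
The specification leaves the client-side bookkeeping open; a minimal Python sketch of the drop logic (ignoring wrap-around of the int32 fseq counter) could be:

```python
class FrameFilter:
    """Drops bundles whose frame sequence ID is not newer than the
    last one processed, keeping the client state consistent."""

    def __init__(self):
        self.last_fseq = -1

    def accept(self, fseq):
        if fseq > self.last_fseq:
            self.last_fseq = fseq
            return True
        return False  # out-of-order or duplicate packet: drop it
```

A client would call accept() on every incoming bundle and process only those for which it returns True.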

Message Format

Since TUIO is implemented using Open Sound Control (OSC) [4], it follows OSC's general syntax. An implementation therefore has to use an appropriate OSC library, such as oscpack, and has to listen for the following message types:

/tuio/[profileName] alive [list of active sessionIDs]

/tuio/[profileName] set [sessionID parameterList]

/tuio/[profileName] fseq [int32]
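
The three message types above can be modeled structurally before any OSC encoding takes place. The following Python sketch is illustrative only: it represents each message as a plain (address, argument-list) tuple rather than using a real OSC library such as oscpack:

```python
def alive_message(profile, session_ids):
    """Build an ALIVE message listing the currently active session IDs."""
    return ("/tuio/%s" % profile, ["alive"] + list(session_ids))

def set_message(profile, session_id, params):
    """Build a SET message carrying one object's parameter list."""
    return ("/tuio/%s" % profile, ["set", session_id] + list(params))

def fseq_message(profile, frame_id):
    """Build an FSEQ message tagging the current update frame."""
    return ("/tuio/%s" % profile, ["fseq", frame_id])
```

An actual implementation would serialize these tuples into OSC messages and pack them into a bundle, as described in the previous section.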


The parameters defined in this section reflect the object properties we considered important for an interactive surface interface. Some of these parameters (id, position and angle) are retrieved directly by the sensor. Others (speed, acceleration) are derived from these primary parameters using timing information. Computing these parameters on the low-level side of a tangible user interface system allows more efficient computation, since the necessary timing information does not need to be transferred to clients.

Table 1: semantic types of set messages
s        sessionID (temporary object ID), int32
i        classID (e.g. marker ID), int32
x, y, z  position, float32, range 0..1
a, b, c  angle, float32, range 0..2PI
X, Y, Z  movement vector (motion speed & direction), float32
A, B, C  rotation vector (rotation speed & direction), float32
m        motion acceleration, float32
r        rotation acceleration, float32
P        free parameter, type defined by OSC packet header

A session ID number is assigned to each object. This is necessary to uniquely identify untagged objects across successive frames, and to distinguish multiple objects tagged with the same classID that are simultaneously present on the surface. The semantic types allowed in a set message are shown in Table 1.

Parameter Computation

The TUIO coordinate system is normalized for each axis and is represented by floating point numbers in the range 0.0f to 1.0f. To compute the x and y coordinates for the 2D profiles, a TUIO tracker implementation divides the raw sensor coordinates by the actual sensor dimensions, while a TUIO client implementation can then scale these values back to the actual screen dimensions.

x = sensor_x / sensor_width
y = sensor_y / sensor_height 
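
The two formulas above, together with the client-side inverse, can be written as a small Python sketch (function names are illustrative, not part of the specification):

```python
def normalize(sensor_x, sensor_y, sensor_width, sensor_height):
    """Tracker side: map raw sensor coordinates to the TUIO range 0..1."""
    return sensor_x / sensor_width, sensor_y / sensor_height

def denormalize(x, y, screen_width, screen_height):
    """Client side: scale normalized TUIO coordinates back to pixels."""
    return x * screen_width, y * screen_height
```

For a 640x480 sensor, the point (320, 240) normalizes to (0.5, 0.5), which a client with a 1920x1080 screen scales back to (960, 540).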

The movement velocity unit is defined as a movement over the full length of an axis within one second. As an example, moving a finger horizontally from left to right across the full surface within one second represents a movement velocity vector of (1.0 0.0). The motion speed values are computed from the normalized distance between two positions divided by the time between the two samples in seconds. The acceleration value is the speed change divided by the time between the two samples in seconds.

X = (sensor_dx / sensor_width) / dt
Y = (sensor_dy / sensor_height) / dt
m = (speed - last_speed) / dt
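
The three formulas above can be combined into one Python sketch; the speed magnitude used for the acceleration term is assumed here to be the Euclidean norm of the movement vector, and the function name is illustrative:

```python
import math

def motion_state(dx, dy, dt, sensor_width, sensor_height, last_speed):
    """Compute the TUIO movement vector (X, Y) and motion acceleration m
    from the raw displacement between two samples, dt seconds apart."""
    X = (dx / sensor_width) / dt
    Y = (dy / sensor_height) / dt
    speed = math.hypot(X, Y)       # magnitude of the movement vector
    m = (speed - last_speed) / dt
    return X, Y, m, speed
```

Moving a finger across the full sensor width within one second, starting from rest, yields X = 1.0, Y = 0.0, a speed of 1.0 and an acceleration m of 1.0.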

The rotation velocity unit is defined as one full rotation per second. Therefore, performing one full object rotation within one second represents a rotation velocity value of 1.0. The rotation velocity values are computed from the normalized angle change divided by the time between the two samples in seconds. The corresponding rotation acceleration value is calculated from the rotation speed change divided by the time between the two frames in seconds.

A = ((a - last_a) / (2*PI)) / dt
r = (A - last_A) / dt
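
As a Python sketch of the two formulas (angle wrap-around at 2*PI is deliberately ignored here, and the function name is illustrative):

```python
import math

def rotation_state(a, last_a, last_A, dt):
    """Compute rotation velocity A (full turns per second) and rotation
    acceleration r from two sampled angles in radians, dt seconds apart."""
    A = ((a - last_a) / (2 * math.pi)) / dt
    r = (A - last_A) / dt
    return A, r
```

One full rotation (an angle change of 2*PI) within one second, starting from rest, yields A = 1.0 and r = 1.0.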


We define a set of profiles that apply to most table-style tangible user interfaces. These allow the tracking of objects and cursors on two-dimensional surfaces and, in special cases, also in the 3D space above the table surface. If none of these predefined profiles meets a system's requirements, one can also implement free-form custom profiles, which allow a user-defined set of parameters to be transmitted.

2D Interactive Surface
/tuio/2Dobj set s i x y a X Y A m r
/tuio/2Dcur set s x y X Y m
2.5D Interactive Surface
/tuio/25Dobj set s i x y z a X Y Z A m r
/tuio/25Dcur set s x y z X Y Z m
3D Interactive Surface
/tuio/3Dobj set s i x y z a b c X Y Z A B C m r
/tuio/3Dcur set s x y z X Y Z m
custom profile
/tuio/_sixyP set s i x y 0.57

For the last profile, the parameters of the set message can be in a user-defined format. The custom profile carries the message format within its name, similar to the OSC header.
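
A client receiving a custom profile can recover the parameter layout directly from the address. This Python sketch (function names are illustrative; the argument list is assumed to contain only the parameters following the "set" keyword) pairs each argument with its semantic type letter from Table 1:

```python
def parse_custom_profile(address):
    """Extract the per-parameter type letters from a custom profile
    address such as '/tuio/_sixyP'."""
    name = address.rsplit("/", 1)[-1]
    if not name.startswith("_"):
        raise ValueError("not a custom profile: %s" % address)
    return list(name[1:])  # e.g. ['s', 'i', 'x', 'y', 'P']

def label_set_args(address, args):
    """Pair each set-message argument with its semantic type letter."""
    return list(zip(parse_custom_profile(address), args))
```

For the example above, '/tuio/_sixyP' decodes to a sessionID, a classID, an x/y position, and one free parameter (0.57).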


[1] Kaltenbrunner, M., Bovermann, T., Bencina, R., Costanza, E.: "TUIO - A Protocol for Table-Top Tangible User Interfaces". Proceedings of the 6th International Workshop on Gesture in Human-Computer Interaction and Simulation (GW 2005), Vannes, France, 2005.

[2] Kaltenbrunner, M., Bencina, R.: "reacTIVision: A Computer-Vision Framework for Table-Based Tangible Interaction". Proceedings of the First International Conference on Tangible and Embedded Interaction (TEI07), Baton Rouge, Louisiana, 2007.

[3] Jorda, S., Geiger, G., Alonso, A., Kaltenbrunner, M.: "The reacTable: Exploring the Synergy between Live Music Performance and Tabletop Tangible Interfaces". Proceedings of the First International Conference on Tangible and Embedded Interaction (TEI07), Baton Rouge, Louisiana, 2007.

[4] Wright, M., Freed, A., Momeni, A.: "OpenSound Control: State of the Art 2003". Proceedings of the 3rd Conference on New Instruments for Musical Expression (NIME 03), Montreal, Canada, 2003.