Mixed Interaction Space is developed by Thomas Riisgaard, Eva Eriksson and me. MIXIS (Mixed Interaction Space) is a new approach to gesture-based interaction on mobile devices. MIXIS uses the camera in the mobile device to track a fixed point and thereby establishes a three-dimensional interaction space in which the position and rotation of the device can be tracked. The fixed point can be drawn with a pen, creating a flexible way of establishing the mixed space between the physical and the digital.
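To give an impression of the idea, here is a minimal sketch of the tracking step: detect the drawn circle in a camera frame and turn its position and apparent size into an (x, y, z) estimate. This is an illustrative reconstruction, not our actual phone implementation; the function and constant names (track_fixed_point, DEPTH_SCALE) are made up for the example, and OpenCV's Hough circle detector stands in for whatever lightweight tracker would run on the device.

```python
import cv2
import numpy as np

DEPTH_SCALE = 100.0  # assumed constant relating apparent radius to distance


def track_fixed_point(frame):
    """Return an (x, y, z) estimate of the device relative to a drawn circle.

    x and y follow from where the circle sits in the image (device moved
    sideways or up/down); z is inferred from the apparent radius (device
    moved closer to or further from the fixed point).
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=40,
                               minRadius=10, maxRadius=200)
    if circles is None:
        return None  # fixed point not visible in this frame
    cx, cy, radius = circles[0][0]
    h, w = gray.shape
    x = (cx - w / 2) / w      # normalized horizontal offset
    y = (cy - h / 2) / h      # normalized vertical offset
    z = DEPTH_SCALE / radius  # larger circle means the device is closer
    return x, y, z
```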

We have developed three applications on top of the MIXIS concept - ImageZoomViewer, DrawMe and DRoZo - and we are still working on new improvements. These include mapping newly drawn patterns to applications on the device itself or on remote appliances, as well as multi-user applications with identity for public displays.

See the movie explaining the concept and the three applications here (7 MB).

After further development by all three of us at Thomas' place in Berkeley, we are moving towards multi-user applications. There is still a lot of development left, but a small video shows our proof of concept. In the video, two people draw a track point (a circle) and connect to a display via Bluetooth. After each device connects, a cursor/graphic representation appears and moves around the display according to the MIXIS gestures - see the video here (13 MB). A small sketch of the display-side logic follows below.
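The sketch below shows how the display side of the proof of concept could keep one cursor per connected phone and move it from the gestures that phone reports. It is only a sketch under assumptions: the message format and the class names (GestureUpdate, Cursor, PublicDisplay) are invented for illustration, and the Bluetooth transport is left out.

```python
from dataclasses import dataclass


@dataclass
class GestureUpdate:
    device_id: str  # identifies which phone sent the update
    x: float        # normalized horizontal offset in the interaction space
    y: float        # normalized vertical offset
    z: float        # distance to the drawn track point


@dataclass
class Cursor:
    x: float = 0.5
    y: float = 0.5
    size: float = 1.0


class PublicDisplay:
    """Keeps one cursor per connected device and updates it from gestures."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.cursors = {}  # device_id -> Cursor

    def on_connect(self, device_id):
        # A cursor/graphic representation appears when a device connects.
        self.cursors[device_id] = Cursor()

    def on_gesture(self, update: GestureUpdate):
        # Move this device's cursor according to its MIXIS gesture.
        cursor = self.cursors[update.device_id]
        cursor.x = 0.5 + update.x               # sideways motion moves the cursor
        cursor.y = 0.5 + update.y
        cursor.size = 1.0 / max(update.z, 0.1)  # closer device, bigger cursor
        return cursor
```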

In March 2006 we developed the multi-user concept further and created a concept for true multi-user interaction in public settings. Screenshots from the video can be seen to the right, and a 3-minute video explaining our concept and showing the system in use can be seen here (26 MB).

Read a paper on MIXIS here.