Making of

The technology is a bespoke C++ application built with openFrameworks and several add-ons from the community (such as ofxTimeline, ofxGui, ofxOsc and mapamok); some shaders are remixed from Shadertoy, and point-cloud processing relies on PCL.

The robotic arm we used is a KUKA KR 6/2 with a Windows 95-based KRC2 controller, generously lent to us by Dipl.-Ing. Siegfried Müller Druckgießerei. Communication with the robot and its old controller was made possible by openShowVar, an application made by Massimiliano Fago.
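openShowVar reads and writes KRL variables on the controller over TCP; a position query such as $POS_ACT comes back as a KRL structure string. As an illustration (not the project's actual code), here is a minimal parser for such a reply; the exact reply layout can vary by controller, so treat the sample string and field handling as assumptions:

```cpp
#include <map>
#include <sstream>
#include <string>

// A Cartesian robot pose as found in a KRL E6POS structure:
// X/Y/Z in millimetres, A/B/C Euler angles in degrees.
struct E6Pos { double x, y, z, a, b, c; };

// Parse a reply like
// "{E6POS: X 1025.0, Y -200.5, Z 958.3, A 90.0, B 0.0, C 180.0, S 2, T 35}"
// into an E6Pos, ignoring fields we do not use (S, T, E1..E6).
static E6Pos parseE6Pos(const std::string& reply) {
    E6Pos p{};
    std::map<std::string, double*> fields{
        {"X", &p.x}, {"Y", &p.y}, {"Z", &p.z},
        {"A", &p.a}, {"B", &p.b}, {"C", &p.c}};
    // Drop everything up to the "E6POS:" prefix, then read "KEY value" pairs.
    std::string body = reply.substr(reply.find(':') + 1);
    for (char& ch : body)
        if (ch == ',' || ch == '{' || ch == '}') ch = ' ';
    std::istringstream in(body);
    std::string key;
    double value;
    while (in >> key >> value) {
        auto it = fields.find(key);
        if (it != fields.end()) *it->second = value;
    }
    return p;
}
```

Polling a variable like this each frame is what lets the graphics application follow the arm.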

After arranging the space and adding the set elements, we scanned the space with a FARO laser scanner and turned the resulting point cloud into a mesh for the mapping application. A model was filmed with multiple Kinect sensors so that her point cloud could be replayed in the space using projection.
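Replaying a recorded point-cloud performance means mapping the playback clock onto the recorded frames. This is a minimal sketch of that lookup, assuming frames were captured with timestamps (the real application also streams the data from disk):

```cpp
#include <algorithm>
#include <vector>

// One recorded Kinect frame: capture time plus packed XYZ points.
struct Point3 { float x, y, z; };
struct CloudFrame {
    double time;                 // seconds since the start of the recording
    std::vector<Point3> points;
};

// Return the recorded frame whose timestamp is closest to the playback
// clock t, so the performance replays at its original speed.
// Assumes `frames` is non-empty and sorted by time.
static const CloudFrame& frameAt(const std::vector<CloudFrame>& frames, double t) {
    auto it = std::lower_bound(frames.begin(), frames.end(), t,
        [](const CloudFrame& f, double v) { return f.time < v; });
    if (it == frames.begin()) return *it;          // before the first frame
    if (it == frames.end()) return frames.back();  // past the last frame
    auto prev = it - 1;
    return (t - prev->time <= it->time - t) ? *prev : *it;
}
```

The selected frame's points are then rendered from the projector's point of view, described below.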

Visual effects (mostly shaders) were then wrapped as textures onto the virtual mesh. The projector acts as a camera whose position is updated according to the robot's motion and position. If you would like to know more about the technology, please watch our making-of video or get in touch with us. All code used for the project will be shared shortly.
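Treating the projector as a camera boils down to inverting the robot pose: a world-space point is transformed into projector space by the inverse of the flange transform. A minimal sketch, assuming the pose arrives as XYZ plus KUKA's A/B/C Euler angles (A about Z, then B about Y, then C about X); this is illustrative math, not the project's actual code:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };
struct Mat3 { double m[3][3]; };

// Rotation matrix for KUKA A/B/C angles in degrees: R = Rz(A) * Ry(B) * Rx(C).
static Mat3 rotationFromABC(double a, double b, double c) {
    const double d = 3.14159265358979323846 / 180.0;
    double ca = std::cos(a*d), sa = std::sin(a*d);
    double cb = std::cos(b*d), sb = std::sin(b*d);
    double cc = std::cos(c*d), sc = std::sin(c*d);
    return Mat3{{
        {ca*cb, ca*sb*sc - sa*cc, ca*sb*cc + sa*sc},
        {sa*cb, sa*sb*sc + ca*cc, sa*sb*cc - ca*sc},
        {-sb,   cb*sc,            cb*cc}}};
}

// World point -> projector (camera) space: the inverse of the pose (R, t)
// is R^T * (p - t), since R is orthonormal.
static Vec3 worldToCamera(const Vec3& p, const Vec3& t, const Mat3& R) {
    Vec3 d{p.x - t.x, p.y - t.y, p.z - t.z};
    return Vec3{
        R.m[0][0]*d.x + R.m[1][0]*d.y + R.m[2][0]*d.z,
        R.m[0][1]*d.x + R.m[1][1]*d.y + R.m[2][1]*d.z,
        R.m[0][2]*d.x + R.m[1][2]*d.y + R.m[2][2]*d.z};
}
```

Feeding this view transform (plus a calibrated projection matrix) to the renderer each frame is what keeps the projected texture locked onto the physical set while the arm moves.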

Photography : Robert Klebenow
Model : Jana Sachse from No Toys
Make-up Artist : Anna Borho from Nicola Weidemann
Sound design : Omri Azaria
KUKA robot expert and help : Massimiliano Fago and Davide Rosa
3D modelling : Anne-May Abel
Project Assistants : Isabel Webb and Eli Block
Robot, location and help : Dipl.-Ing. Siegfried Müller Druckgießerei