As part of our work towards the mid-term project we explored the possibility of using a light-sensor interface to control the motion and visual output of a Kinect. The visual output was then projected on a custom-made screen.
Here is a short video of the setup without the interface (a box containing the Arduino board with the photoresistor circuit), and with a computer monitor, rather than the custom-made screen, displaying the generated image:
The Arduino board controlled the tilting behavior of the Kinect via serial communication with a Processing sketch running on an iMac. The sketch was based on Daniel Shiffman's example code for the Open Kinect library used in this experiment (URL: http://shiffman.net/p5/kinect/).
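On the Processing side, the serial link boils down to reading newline-terminated analog readings (0 to 1023) sent by the Arduino and parsing each line. Here is a minimal sketch of that parsing step, written as plain Java (Processing is Java-based); the message format and all names are my assumptions, not the actual project code:

```java
// Hypothetical parser for the values an Arduino might send over serial.
// Assumes one analog reading (0-1023) per newline-terminated line.
public class SerialParser {
    // Parse a raw serial line into an analog reading, clamped to 0-1023.
    // Returns -1 for malformed input so the sketch can skip bad frames.
    public static int parseReading(String line) {
        try {
            int value = Integer.parseInt(line.trim());
            return Math.max(0, Math.min(1023, value));
        } catch (NumberFormatException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseReading("512\n"));   // 512
        System.out.println(parseReading("2000"));    // clamped to 1023
        System.out.println(parseReading("garbage")); // -1
    }
}
```

Clamping and rejecting malformed frames keeps occasional serial noise from producing wild tilt commands.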
Changes in voltage on the board resulting from varying light intensity were communicated to the Processing sketch and mapped to a tilt angle (0 to 30 degrees) that controlled the tilting of the Kinect. Changes in voltage from the Arduino were also used as input to the Processing sketch and mapped to the size of the rectangles drawn on the screen. The image shown is simply a grid in which rectangles represent pixels with gray values associated with the distance to the sensor (brighter pixels representing closer distance of body parts in relation to the Kinect sensor).
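Both mappings are simple linear rescalings, which Processing's built-in map() handles. The sketch below shows them as pure functions in plain Java: the analog range (0 to 1023) and tilt range (0 to 30 degrees) come from the description above, while the depth range (roughly 500 to 4000 mm for the Kinect) and all names are assumptions for illustration:

```java
// Linear mappings in the spirit of the Processing sketch, as plain Java.
// The analog range (0-1023) and tilt range (0-30 degrees) follow the text;
// the depth range (about 500-4000 mm for a Kinect) is an assumption.
public class KinectMappings {
    // Re-implementation of Processing's map(): linearly rescale value
    // from [inMin, inMax] to [outMin, outMax].
    public static float map(float value, float inMin, float inMax,
                            float outMin, float outMax) {
        return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
    }

    // Sensor reading (0-1023) -> Kinect tilt angle (0-30 degrees).
    public static int readingToTilt(int reading) {
        return Math.round(map(reading, 0, 1023, 0, 30));
    }

    // Depth in mm -> gray value (0-255); closer objects map to brighter pixels.
    public static int depthToGray(int depthMm) {
        float g = map(depthMm, 500, 4000, 255, 0);
        return Math.round(Math.max(0, Math.min(255, g)));
    }

    public static void main(String[] args) {
        System.out.println(readingToTilt(0));    // 0 degrees
        System.out.println(readingToTilt(1023)); // 30 degrees
        System.out.println(depthToGray(500));    // 255 (closest -> brightest)
    }
}
```

Inverting the output range in depthToGray (255 down to 0) is what makes nearer body parts render brighter, as in the grid image described above.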
The Arduino code used can be accessed HERE; the Processing sketch based on Shiffman's code can be accessed HERE.
This experiment mainly focused on getting the technological side of the installation working, as a first step prior to working on the interaction side of the project. Although many interactive projects have relied on the Kinect in the past, what's novel here (to the best of my knowledge) is the use of light to manipulate the behavior of the hardware that tracks the movement of the people who will later interact with the system.
Possible uses of a light-controlled Kinect
One possible use would be the creation of a light-modulated kinetic sculpture, in which the mechanical elements of the sculpture would be driven by the Kinect as it senses changes in light intensity and motion.
A second possibility would be to use a wearable Arduino board carrying the light-sensor circuit, together with a fixed lamp illuminating a subject: as the subject's body moves, the relation of the sensor to the light source changes, and with it the tilting behavior and visual output of the Kinect.