week_2: creating a movie playback system (first experiment)
Based on the short-videos palette provided as starting material for this week's assignment, I decided to focus on the contrast between my hand touching a plastic work of art (created by me) and my hand touching a technological tool, in this case the iPhone. The piece is intended to convey a reflective stance between creator and created; between the inner world of humans and the inner world of electronic tools.
When creating the video work, I took advantage of the 'jit.xfade' object to hybridize two different video sources into one. Using the 'jit.noise' object, I added noise to each of the videos before fading them into a single moving image.
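Outside of Max, the same two steps (noise a frame, then crossfade two frames) can be sketched numerically. This is only my rough assumption about what the jit.noise stage of the patch amounts to, written with NumPy rather than Jitter objects; the `add_noise` function and `amount` parameter are my own names, not part of the patch:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(frame, amount=0.2):
    """Roughly what a jit.noise-based chain might do (my assumption):
    generate a random matrix the same size as the frame and mix it
    into the pixels, clipping back to the 8-bit range."""
    noise = rng.integers(0, 256, size=frame.shape)
    out = (1 - amount) * frame.astype(float) + amount * noise
    return np.clip(out, 0, 255).astype(np.uint8)

# a fake 2x2 grayscale "frame"
frame = np.full((2, 2), 128, dtype=np.uint8)
noisy = add_noise(frame)
print(noisy.shape)  # output has the same size as the input frame
```

In the actual patch this happens per video frame before the two streams reach 'jit.xfade'.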
The patch created can be viewed in Figure 1 and the resulting moving image in Video 1.
Figure 1. This patch was created by combining elements of previous patches provided as class resources by the instructor, Matt Romein. I observed that placing elements at different positions within the patch altered the visual results, because of differences in how data flows through the program. Note that although 'jit.xfade' expects an argument between 0 and 1, passing a negative number radically altered the image properties of the video (I don't understand why, though).
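A possible explanation for the negative-argument behavior (my guess, not something I have verified against the jit.xfade documentation): a crossfade is usually a linear interpolation, out = (1 - x) * A + x * B. With x below 0 (or above 1) this becomes an extrapolation, pushing pixel values outside the valid 8-bit range, and the clipping back into range would distort the image dramatically. A minimal NumPy sketch of that idea:

```python
import numpy as np

def xfade(a, b, x):
    """Linear crossfade of two 8-bit frames, analogous to an xfade
    value. x in [0, 1] blends normally; x outside that range
    extrapolates, and the result is clipped back to 0-255."""
    out = (1.0 - x) * a.astype(float) + x * b.astype(float)
    return np.clip(out, 0, 255).astype(np.uint8)

# two fake one-pixel "frames"
a = np.array([[100]], dtype=np.uint8)
b = np.array([[200]], dtype=np.uint8)

print(xfade(a, b, 0.5))   # ordinary blend: 0.5*100 + 0.5*200 = 150
print(xfade(a, b, -0.5))  # extrapolation: 1.5*100 - 0.5*200 = 50
```

With real frames, channels that extrapolate past 0 or 255 get flattened by the clip, which could account for the radical color changes I saw.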
Video 1. The resulting work was recorded using the 'screen recording' function of QuickTime Player on a Mac, and the output file was imported into Adobe Premiere for further editing. I tried to record within Max-MSP-Jitter itself and watched a couple of tutorials in which Syphon was used for recording, but I found it a bit too advanced to get done as part of this homework.