For the past month or so my work in Max For Live has been purely audio. For my upcoming final project performance, I felt it worth looking into live reactive visuals: visuals that change and move with the audio they are fed. To begin, I started exploring the Vizzable modules within Max 7, allowing myself to get comfortable working with video in Ableton. What I found is that it is not dissimilar to working with audio; it's just a different medium. I also found that, as with audio, the possibilities are endless.
So my final result today was not necessarily reactive, but it could easily be made so. The bulk of today's experimentation was combining different visual stimuli to create unique and expressive visuals that, once manipulated by audio, could be very powerful and interactive.
As you can see in the screenshot above, I used multiple visual inputs, such as the GRABER, PATTENMAPPR and PRIMR, and combined them using multiple CHROMAKEYR objects. Although it cannot be seen in the picture, I also used a WANDR to automate the movements of the patterns generated by the PATTENMAPPR. Combining all of the visual stimuli produced a glitchy, physical image of myself. In later experiments, I want to explore changing certain parameters of multiple devices using the AUDIO2VIZZLE device. If this proves successful, it could become a key component of my planned interactive performance.
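The core idea behind driving visuals from audio is converting an audio signal into slow-moving control data, typically by measuring the loudness (RMS) of each block of samples and scaling it onto a device parameter's range. Here is a minimal Python sketch of that idea; the function names and the exponential scaling curve are my own illustration, not anything from Vizzable itself:

```python
import math

def rms(samples):
    """Root-mean-square loudness of a block of audio samples (-1.0 to 1.0)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def map_to_param(level, lo=0.0, hi=1.0, curve=2.0):
    """Scale a loudness level (0..1) onto a visual parameter range.

    An exponential curve (curve > 1) keeps quiet passages from
    swamping the visual, so only louder moments push the parameter hard.
    """
    level = min(max(level, 0.0), 1.0)
    return lo + (hi - lo) * (level ** curve)

# A loud block of audio drives the parameter harder than a quiet one.
quiet = [0.1 * math.sin(2 * math.pi * i / 64) for i in range(64)]
loud = [0.9 * math.sin(2 * math.pi * i / 64) for i in range(64)]
print(map_to_param(rms(quiet)))  # small parameter value
print(map_to_param(rms(loud)))   # much larger parameter value
```

In a live patch this mapping would run continuously, with the smoothed output feeding whichever parameter you want to animate, such as the key threshold of a chroma keyer or the speed of a pattern generator.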