Joe Beedles reflects on the residency to date
Since starting the Conversation Series II Residency with Venture Arts back in October, I have mostly been developing ideas around computer-animated human models, aiming to provoke a more engaging, empathetic response in audiences within the context of my live performance. In November I debuted a new live performance at Cafe Oto in London, comprising all-new audio and visual material that displayed these advancements.
The previous night I presented a special audiovisual performance for Venture Arts’ Thursday Late event at The Whitworth, incorporating videos of Venture Arts artists’ work manipulated live in an audio-reactive fashion. The performance included all-new compositions: an edited underwater recording of a ‘Singing in the Rain’ hand-crank music box that the rest of the group took a shine to, a recording of Amy singing along to Take That, and a video we had all been involved in creating during the Tuesday meet-up sessions, which is a likely precursor to what we will showcase at the end of the residency.
Recently I have been researching ‘datamoshing’ glitch techniques and how they might be achieved in real time, as opposed to being edited on a timeline and then rendered offline. As with the majority of my work, I have been using the software Max to realise this. As far as I understand, the technique works by reading an incoming video file or live webcam feed, then setting multiple variables and thresholds that decide whether or not the last frame of the feed is deleted, resulting in trailing effects and overlapping frames, sometimes similar to the green-screen effect that John shared with Fran and me early on. So far I have been feeding the datamoshing patch with videos I have taken in the local park where we eat lunch, and other footage of macro bubble formation that I have been capturing as part of another collaboration with Bradford-based artist Cat Scott.
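The hold-or-delete decision described above can be sketched roughly as follows. This is a NumPy approximation of my understanding of the effect, not the actual Max patch; the function names, thresholds, and the maximum-blend used for the overlap are all illustrative choices:

```python
import numpy as np

def mosh_step(held, frame, reset_threshold=0.5, pixel_threshold=30):
    """One step of a crude datamosh sketch: the incoming frame is
    composited onto a held frame so bright regions overlap and trail;
    the held frame is only 'deleted' (replaced outright) when the
    fraction of changed pixels crosses a threshold, e.g. on a hard cut."""
    changed = np.abs(frame.astype(int) - held.astype(int)) > pixel_threshold
    if changed.mean() > reset_threshold:
        return frame.copy()            # big change: drop the held frame
    return np.maximum(held, frame)     # small change: keep the trails

# Toy feed: an 8x8 bright square drifting across a 32x32 black frame.
def square_frame(x):
    f = np.zeros((32, 32), dtype=np.uint8)
    f[12:20, x:x + 8] = 255
    return f

trail = square_frame(0)
for x in (10, 20):
    trail = mosh_step(trail, square_frame(x))
# All three past positions of the square remain lit: the trailing effect.
print(int(trail[16, 4]), int(trail[16, 14]), int(trail[16, 24]))  # 255 255 255

# A hard cut to a full-white frame changes most pixels, so the held
# frame is deleted and replaced.
held = mosh_step(trail, np.full((32, 32), 255, dtype=np.uint8))
print(int(held.min()))  # 255
```

In a live setting the same step would simply run once per incoming webcam frame, with the two thresholds exposed as the performable variables.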
Using the generous materials budget provided by Venture Arts for the residency, I have invested in a controllable RGB and UV club light. Utilising its UV capabilities, we are in the process of figuring out how to conceal some of James’ coded messages throughout our exhibition, to be revealed only when viewed under UV light. I also hope to contribute some slow generative light sequences that will feature in our collaborative video and in each of the physical gallery spaces as part of our group exhibition, which begins its tour of the North in March!