Sarah Hutchinson
AVP Development Blog
!! FLASHING LIGHTS AND VIDEO THROUGHOUT !!
My audiovisual performance will be based around an existing musical alter ego of mine called Buildercore, which already has a strong visual identity and aesthetic that I can build upon. The outcome of this audiovisual performance will be visuals that react to the audio of a track. Below is a compilation of the videos I have put out under Buildercore so far.
RESEARCH STAGE:
Software:
During the research stage of development, my main objective was to find software that I was comfortable using and programming a performance in. I had previously tried the Resolume suite and found it difficult to use for what I wanted to do. I then came across TouchDesigner and immediately took to the program; its workflow is similar to Blender, which I use often for creating visuals. To get more comfortable with the program I followed some tutorials on its basic functions and the visuals that can be created with it.

Previous Performance Research:
In terms of visual aesthetic, my main influences are music videos by bands such as Black Midi and Otta. In previous Buildercore videos I have played with setting video editing parameters to their maximum values or purposely 'breaking' the source video material. My favourite technique is using the colour key tool to key out a prominent colour, which leaves lots of holes in the video. Another video can then be layered underneath to give the result a glitchy, broken look. Similar techniques have been used in the video for Crow's Perch by Black Midi. In this video, created by susan_creamcheese (unfortunately I can't find their current portfolio of work under that username anymore), the video material and editing are manic and over the top.
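As a rough illustration of that keying-and-layering idea (a minimal sketch, not my actual Premiere Pro workflow; the file names and the keyed colour range are placeholders), the same effect can be mocked up with OpenCV and NumPy:

```python
import cv2
import numpy as np

# Load one frame from each source video (placeholder file names)
top = cv2.VideoCapture("buildercore_top.mp4")
under = cv2.VideoCapture("buildercore_under.mp4")
ok1, top_frame = top.read()
ok2, under_frame = under.read()
assert ok1 and ok2

# Key out a prominent colour (here an arbitrary broad green range in HSV)
hsv = cv2.cvtColor(top_frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))

# Wherever the colour was keyed out, the video underneath shows through,
# giving the layered, 'broken' look described above
under_frame = cv2.resize(under_frame, (top_frame.shape[1], top_frame.shape[0]))
composite = np.where(mask[..., None] > 0, under_frame, top_frame)

cv2.imwrite("composite.png", composite)
```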
In another example, the music video for Three Of Us by Otta, the approach is similar to Black Midi's but stripped back and much simpler. In particular, at 0:55 in the video, the way the lyrics are typed out in a word-processing document and screen-recorded has had a direct influence on previous Buildercore output.
In terms of actual performative and audio-reactive visuals, the influence here was the visuals created by WeirdCore for Aphex Twin's live sets. These visuals take a live video feed of the audience and pipe it into a program that incorporates the footage into the visuals displayed on the screens around the stage.
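As a toy version of that live-feed idea (purely my own sketch, nothing to do with WeirdCore's actual pipeline), a camera feed can be pulled into Python with OpenCV and pushed through a simple stand-in effect before display:

```python
import cv2

# Open the default camera (index 0) as the "audience" feed
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # A stand-in effect: posterise the feed by crushing the bit depth,
    # then invert it, so the audience appears inside the visuals
    frame = 255 - (frame // 64) * 64
    cv2.imshow("audience feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```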
PROTOTYPES AND DEVISING:
During this period I made various live visual manipulation patches in TouchDesigner. I made these patches with the help of YouTube tutorials to get my head around TouchDesigner's workflow, but used my own video sources and parameter changes to get the effects I wanted. The aim was to build up a selection of generative and audio-reactive patching skills to build into my own audiovisual setup.

Initial visual tests and patches built within TouchDesigner: 
#1: Audio reactive generative shapes: The volume of the audio input is analysed and converted into an on/off signal depending on whether it crosses a threshold; this signal triggers when the shapes show up on screen. I set the threshold fairly low so the visuals appear more frequently. The shapes are generated from basic circle geometry, which goes through visual processing to become liquid-like. The movement of these visuals is tied to the time parameter in TouchDesigner: the Python expression 'absTime.seconds' is connected to the displacement node, which moves the visuals around the screen. (A rough sketch of this trigger logic appears after this list.)
#2: Geometry movement and manually controlled visuals: This patch was made to learn about moving geometry around randomly (almost like a screensaver). To make it more interesting, a user-controlled blending/blurring of the generated cubes can be triggered by pressing the '1' key on the keyboard.
#3: Slitscreen: This patch can be used more as background material for a visual performance. It takes a movie file as input and maps it onto two rectangles that then blend outwards. Building this involved learning about feedback loops to get the glitchy effect. It currently has no audio-reactive features, but they could be built in at a later date.
#4: VHS-style circuit-bent visuals: For this patch I did follow a tutorial, but added custom parameters to glitch out the visuals to a more extreme level. This again would make good background material, but could have some audio-reactive features built in.
#5: Blender object import and audio reactivity: This patch involved a lot of trial and error to import a custom geometry into TouchDesigner in a way that let it be edited and manipulated like an inbuilt geometry. Eventually I came across an addon for Blender that converts an object's mesh into code, which can then be imported into TouchDesigner as Python and run to generate a functional geometry. After getting a basic geometry import working (which happened to be a go-kart I made recently), I used the audio analysis learnt previously to visually manipulate the import. I'm most looking forward to building this into my visual performance using custom-made Blender objects.
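To give a flavour of patch #1's trigger logic, below is a minimal sketch of the kind of Python that runs inside TouchDesigner. It assumes an Analyze CHOP ('analyze1') measuring the RMS level of an Audio Device In CHOP, with the callback placed in a CHOP Execute DAT; the operator names and the threshold value are placeholders rather than my actual patch.

```python
# Runs inside TouchDesigner, e.g. from a CHOP Execute DAT watching 'analyze1'.
# 'analyze1' is assumed to be an Analyze CHOP measuring the RMS level of an
# Audio Device In CHOP; 'shapes_geo' is a placeholder Geometry COMP name.

THRESHOLD = 0.05  # kept fairly low so the shapes appear frequently

def onValueChange(channel, sampleIndex, val, prev):
    # Convert the continuous volume into an on/off signal
    visible = 1 if val > THRESHOLD else 0
    # Drive the render flag of the shape geometry with that signal
    op('shapes_geo').par.render = visible
    return

# Separately, the displacement node's offset is driven by the expression
#   absTime.seconds
# typed into its parameter field, which keeps the liquid shapes drifting
# around the screen over time.
```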

The next step is to start consolidating these skills and building a multifunctional TouchDesigner patch that can be used as a visual performance tool.
Main Patch:
Step 1: Video Material
As the basis of my TouchDesigner patch will be an audio-reactive video controller, I first need to record and source video material in the style of Buildercore to feed into the patch. I have quite a lot of unused footage from previous Buildercore projects that I'll be able to use.
Step 2: Blender Objects
During the Blender stage of this project, I originally wanted to have a Buildercore logo that would react to the music being input into the program. I decided against this, as I didn't like the idea from an aesthetic perspective for audio-reactive visuals. Instead I got into modelling in Blender, with the aim of creating two models: one of me and one of my skeleton (his name is Chad). For the modelling I found a great addon for Blender called Keentools FaceBuilder. Below are videos of me starting to use this addon:
(video features old Buildercore demos)
I tried to model myself and Chad the skeleton a few times: me with a builder hat and sunglasses, me as normal (with my hair out), Chad the skeleton on his own, and me with a beanie on. As shown in the video, the first model came out super broken, and I liked this. The second didn't turn out great because I didn't spend the extra time modelling the hair, and I wouldn't use one without the builder hat anyway, as that is the core of the Buildercore look. The third model, Chad the skeleton, also turned out pretty glitched, but with some extra modelling time he could look more like he does in real life. The fourth model (me with a beanie) was also fairly glitched out, but I settled on exporting this over to TouchDesigner to see if it would look good reacting to audio.
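The export step itself relies on the addon mentioned in prototype #5, but the underlying idea can be sketched in a few lines of Blender's Python API (this is my own illustrative version, run from Blender's scripting tab, and not the addon's actual code):

```python
import bpy

# Grab the currently selected object's mesh (e.g. the beanie model)
obj = bpy.context.active_object
mesh = obj.data

# Flatten the vertex positions and face indices into plain Python lists so
# they can be pasted into TouchDesigner and rebuilt there as geometry
verts = [tuple(v.co) for v in mesh.vertices]
faces = [tuple(p.vertices) for p in mesh.polygons]

print("verts =", verts[:5], "...")   # preview the generated data
print("faces =", faces[:5], "...")
```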
The next step is to combine the previous TouchDesigner work and the new Blender techniques to create an audiovisual patch.
Final Patch Explanation:
User Guide:
Summary of Ideas and Techniques:
- Continuing development of the visual aesthetic for Buildercore, with a focus on digital recreation of Buildercore characters in Blender and video manipulation in TouchDesigner.
- Creation of a bespoke TouchDesigner patch to create both live visuals and music video visuals within the Buildercore aesthetic.
- Research and development of skills in using a new program (TouchDesigner) for creating audio-reactive visuals.
- Continued development of visual design techniques within Blender.
Development of Practical Work:
Before working with TouchDesigner I had never made visuals that reacted to music, so this is the biggest development in my practical work. At the beginning of this project I was using Premiere Pro to manually manipulate video footage into glitchy compositions of media.
Throughout this project I have learnt how to use TouchDesigner from scratch, and how to use its tools to analyse an audio input and translate it into data that can manipulate video footage. I also learnt more about Blender along the way, and about the various file types and methods for using a computer-modelled visual object in an audio-reactive context.
The work I have produced during this project has been a good start to further exploring audio-reactive techniques within my visual and musical practice.
Audiovisual Performance:
Patch Output:
Demonstration of the type of visuals that could be created for future Buildercore musical output.