Processing vs Star Wars Uncut

March 18, 2013

I was recently inspired by the realtime VJing visuals that Weirdcore has been doing for Aphex Twin's live shows. Technology has advanced a long way since my days VJing – I would love to program some procedural animations that react to music, or find ways of manipulating video streams in realtime. I was looking into manipulating 3D data feeds from a Kinect when I came across Dan Shiffman. He is one of the core contributors to Processing – a coding framework I had known about for some time but never had the chance to learn. He has a series of amazing Processing video tutorials that teach physics and mathematical concepts by applying them to animation and motion graphics (they form part of his Nature of Code book, which I highly recommend).

I thought this would be a great way of learning two things at once, but I also needed a project where I could apply these experiments to video footage. It just so happened that the awesome Star Wars Uncut project had re-opened, and they are now working on The Empire Strikes Back. I signed up and claimed my clip. This was perfect, as it gave me only 30 days to learn Processing from scratch and create 15 seconds of footage. My clip had 6 shots in it, so that meant I could spend no more than 5 days on each one (and I would only be doing this in the evenings and at weekends).

EXPERIMENT 01 – SPARKLER

The first thing to note was how quick Processing is to pick up and use. It's designed to take away the hardcore side of coding so that artists can experiment quickly – you really can bash things out with quite messy code. I started with the tutorials on vector maths, as this was an area I thought I had never really understood. Well, it turns out I had actually been using vector maths for years when building particle generators in games without realising. I'd written my own functions for things like mag(), normalize() and angleBetween() without knowing these were common mathematical operations. Thankfully Processing has the PVector class, which has all of these methods built in. So I started by importing the footage with the PImage class and creating a particle emitter that sampled its colour from the underlying image. This is basically the same as the Pointillism demo, but the points are given a velocity on creation. A pretty basic start, but it covered techniques that I would use repeatedly later. Time to move on.
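
Here's a minimal sketch of the idea, rather than the finished effect: each frame the emitter spawns particles at random points, samples their colour from the underlying footage, and gives them a random velocity on creation. The "frame.png" file is my own stand-in for a still from the clip, assumed to match the sketch size.

PImage footage;
ArrayList<Particle> particles = new ArrayList<Particle>();

void setup() {
  size(640, 360); // assumes frame.png is the same size
  footage = loadImage("frame.png");
}

void draw() {
  background(0);
  // Spawn a few new particles each frame
  for (int n = 0; n < 10; n++) {
    particles.add(new Particle(random(width), random(height)));
  }
  // Update and draw, removing particles that have faded out
  for (int i = particles.size() - 1; i >= 0; i--) {
    Particle p = particles.get(i);
    p.update();
    p.display();
    if (p.isDead()) particles.remove(i);
  }
}

class Particle {
  PVector pos, vel;
  color c;
  float life = 255;

  Particle(float x, float y) {
    pos = new PVector(x, y);
    vel = new PVector(random(-2, 2), random(-2, 2)); // velocity set on creation
    c = footage.get((int) x, (int) y);               // colour sampled from the footage
  }

  void update() {
    pos.add(vel);
    life -= 5;
  }

  void display() {
    noStroke();
    fill(c, life); // fade as the particle ages
    ellipse(pos.x, pos.y, 3, 3);
  }

  boolean isDead() {
    return life < 0;
  }
}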

EXPERIMENT 02 – FLOWFIELD

Now I wanted to do something more interesting with the motion of the particles than giving them a random velocity. Some of the tutorials on manipulating pixels covered creating a convolution matrix. So in this version I weighted the particle emitter to generate the greatest number of particles in the brightest parts of the image. Then on each frame they sampled their surroundings using a convolution matrix and moved towards the darkest area. This created a flowfield effect, with pixels moving from light to dark.
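
As a rough sketch of that sampling step – simplified here to a 3×3 neighbourhood rather than a proper weighted kernel – each particle can pick its next move like this:

PVector darkestNeighbour(PImage footage, PVector pos) {
  float darkest = 256; // brightness() tops out at 255
  PVector target = new PVector(pos.x, pos.y);
  for (int dx = -1; dx <= 1; dx++) {
    for (int dy = -1; dy <= 1; dy++) {
      if (dx == 0 && dy == 0) continue;
      int x = constrain((int) pos.x + dx, 0, footage.width - 1);
      int y = constrain((int) pos.y + dy, 0, footage.height - 1);
      float b = brightness(footage.get(x, y));
      if (b < darkest) {
        darkest = b;
        target.set(x, y, 0);
      }
    }
  }
  return target; // the particle steers towards this point each frame
}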

EXPERIMENT 03 – COLOR MATRIX

This next experiment came to me in my sleep (this is why I don't do programming full time any more – I tend to dream in code when I have a project on). When I woke up I scribbled notes down, but when I came to create it I realised it looked remarkably like the Processing artwork – so I guess my subconscious was less original than I thought. I created a matrix of points on the screen whose motion was driven by Perlin noise. If a point came within a certain distance of any other point it would draw a connecting line to it, and brighter points had a further reach for their connections. The tricky bit here was drawing the lines with a colour gradient fill – as far as I can tell there isn't a native Processing method for doing this, so I wrote my own.

void gradientLine(float _x1, float _y1, float _x2, float _y2, color _c1, color _c2, float _w) {

  // Number of segments the line is divided into
  float steps = 20;
  strokeWeight(_w);

  // v1 walks along the line; v2 is one step's worth of the full line vector
  PVector v1 = new PVector(_x1, _y1);
  PVector v2 = new PVector(_x2 - _x1, _y2 - _y1);
  v2.div(steps);

  for (int j = 0; j < steps; j++) {
    // Interpolate the stroke colour along the length of the line
    float t = j / steps;
    stroke(lerpColor(_c1, _c2, t));

    // Draw this segment, then step v1 forward to the next one
    float vX1 = v1.x;
    float vY1 = v1.y;
    v1.add(v2);
    line(vX1, vY1, v1.x, v1.y);
  }

}
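
For reference, calling it looks like this – a 4-pixel-wide line fading from red to blue:

gradientLine(50, 50, 250, 150, color(255, 0, 0), color(0, 0, 255), 4);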

EXPERIMENT 04 – FRACTAL SABERS

Next I had a play with fractals. I thought it would be interesting to track the light sabers and use them as the trunk for fractal branches. This article gave me the jumping-off point for how they work, but I wanted to break up the recursive loop that generates the entire tree in one go. Instead I wanted each tree to “grow” frame by frame, so that meant creating a Branch class and adding each instance to a control array. Then each frame every Branch would display itself and spawn a number of child objects. This was probably my favourite result of all these experiments – every version I rendered had some new interesting detail in it.
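
A stripped-down version of that growth logic, minus the tracking, might look something like this – the trunk position and angle here are hard-coded stand-ins for the saber track:

ArrayList<Branch> tree = new ArrayList<Branch>();

void setup() {
  size(640, 360);
  // The trunk; in the actual clip its position and angle came from the saber tracking data
  tree.add(new Branch(new PVector(width / 2, height), -HALF_PI, 80, 6));
}

void draw() {
  background(0);
  stroke(255);
  // Iterate backwards by index, since branches append children to the list as we go
  for (int i = tree.size() - 1; i >= 0; i--) {
    tree.get(i).grow();
  }
  for (Branch b : tree) b.display();
}

class Branch {
  PVector start;
  float angle, maxLen, len = 0;
  int depth;
  boolean spawned = false;

  Branch(PVector start, float angle, float maxLen, int depth) {
    this.start = start;
    this.angle = angle;
    this.maxLen = maxLen;
    this.depth = depth;
  }

  void grow() {
    if (len < maxLen) {
      len += 2; // the "growing" part: extend a little each frame
    } else if (!spawned && depth > 0) {
      spawned = true;
      PVector tip = end();
      // Once fully grown, spawn two shorter children splayed either side of the parent
      tree.add(new Branch(tip, angle - 0.4, maxLen * 0.67, depth - 1));
      tree.add(new Branch(tip, angle + 0.4, maxLen * 0.67, depth - 1));
    }
  }

  void display() {
    PVector tip = end();
    line(start.x, start.y, tip.x, tip.y);
  }

  PVector end() {
    return new PVector(start.x + cos(angle) * len, start.y + sin(angle) * len);
  }
}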

EXPERIMENT 05 – HALFTONE

Tracking the light sabers in Nuke and then using the tracking data in Processing was giving some interesting results. I returned to the particle generator and wanted to create a version where the saber location was used as the emitter. There is an intro-to-Cinder tutorial that creates a halftone image effect using particles that repel each other. Raven Kwok had re-written this in Processing, which I grabbed and tweaked. Instead of the mouse location being the emitter, the tracking data location is used. I also changed the particles to sample the underlying colour. The result was OK – I thought I could return to it later to make something more interesting though.
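
The substitution itself is simple. Assuming the track is exported from Nuke as plain "frame,x,y" rows (the file name and format here are my own convention, not a Nuke default), something like this stands in for mouseX/mouseY:

PVector[] track;

void loadTrack() {
  // One row per frame: frame number, x position, y position
  String[] rows = loadStrings("saber_track.csv");
  track = new PVector[rows.length];
  for (int i = 0; i < rows.length; i++) {
    String[] cols = split(rows[i], ',');
    track[i] = new PVector(float(cols[1]), float(cols[2]));
  }
}

PVector emitterForFrame(int frame) {
  // Clamp to the last tracked frame rather than running off the end
  return track[min(frame, track.length - 1)];
}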

EXPERIMENT 06a – GLITCH (ABANDONED)

I hit a bit of a dead end with this one. I wanted an effect similar to datamoshing, where video compression is deliberately broken to create interesting effects. MPEG compression works by storing two sets of data – the colour information in the image and the motion information (I'm over-simplifying here for the sake of brevity). The colour data is only updated on keyframes – the in-between frames use the motion vectors to move the colour around. I generated motion vectors using the Furnace plugins in Nuke. The colour data was created using a series of blocks filled with the average underlying colour; they used a convolution matrix to become smaller, and therefore more detailed, in areas of high contrast. They were then smeared using the motion vectors, with some deliberate glitches in the code to break the image. Well, the result just looked a bit ugly and crap (glitch art always treads a very fine line), so I ditched it and moved on.
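
For what it's worth, the block-mosaic half of the effect – without the motion-vector smear or the contrast-driven subdivision – boils down to something like this:

void mosaic(PImage footage, int tile) {
  noStroke();
  for (int bx = 0; bx < footage.width; bx += tile) {
    for (int by = 0; by < footage.height; by += tile) {
      // Average the colour of every pixel under this tile
      float r = 0, g = 0, b = 0;
      int count = 0;
      for (int x = bx; x < min(bx + tile, footage.width); x++) {
        for (int y = by; y < min(by + tile, footage.height); y++) {
          color c = footage.get(x, y);
          r += red(c);
          g += green(c);
          b += blue(c);
          count++;
        }
      }
      fill(r / count, g / count, b / count);
      rect(bx, by, tile, tile);
    }
  }
}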

EXPERIMENT 06 – FLUID SABERS

Here I was trying out fluid dynamics simulations. Daniel Erickson has some great Processing sketches that I used as a starting point. Again, where this would normally be driven by the mouse movement, I substituted the tracking data from the sabers. Whereas Daniel's demo is monochrome, I tweaked it to sample the hue information from the background image and bleed through to this target colour. I must admit that no matter how I tweaked this code I never fully got to grips with the underlying maths, so the end results (which take forever to render at full resolution) were always unpredictable. I was fairly happy with the result, but the days wasted on the glitch experiment meant I had simply run out of time and had to conclude it there.
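
The colour-bleed step is easier to show than describe. This is my own rough version rather than Daniel's code – each cell's colour eases towards the hue of the footage pixel beneath it, so the fluid gradually takes on the image's palette:

color bleed(color current, color target, float amount) {
  colorMode(HSB, 360, 100, 100);
  // Naive hue lerp (ignores the 360/0 wrap-around), keeping the
  // current saturation and brightness intact
  float h = lerp(hue(current), hue(target), amount);
  color out = color(h, saturation(current), brightness(current));
  colorMode(RGB, 255); // restore the default colour mode
  return out;
}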

AUDIO

Now the rules of Star Wars Uncut state that you have to create your own audio for the piece. One thing I certainly am not is a sound designer. I knew I wanted something quite lo-fi and distorted. One of my favourite bands, Nine Inch Nails, have given away the source files for several of their albums and encourage fans to use and remix them under a Creative Commons licence. There are loads of great drones and messed-up guitar sounds in their tracks, so I plundered these for the audio mix.

END RESULT

So there you have it. Hardly the most ground-breaking motion graphics ever created, but that was never the point. I've now got to grips with Processing enough to delve deeper into it. I hadn't done any mograph work in over a year, so it was nice to mess around with a brief that was pretty loose. I need to continue with Shiffman's great video tutorial series and see where this ends up.
