I’ve been somewhat busy installing new work, doing a video mapping performance in France, and giving talks, but development has continued apace on Fugio. Most recently I’ve added Oculus Rift virtual reality support, which means it’s very easy to turn an existing patch into an Oculus-enabled one by adding one extra node.
I used Fugio to animate this exploding Stanford Bunny as a test for the 3D model loading and OpenGL shader code:
I’ve been adding some basic audio analysis nodes, so now there’s an FFT (Fast Fourier Transform) node and an initial power spectrum node (it works, but needs cleaning up):
I think we’re in the end game. I think that Learning to Code is a last-ditch cry to wrestle control and we’re too late. I think that the cost of major software is spiralling to zero and, just like the music industry, the focus is now on mass adoption with revenues generated around the product, not from it. It happened on the web and it happened on your phone and in your home. They’re coming for open-source, they’re coming for us lone developers, they’re controlling the formation of culture, and they have a leash for us all. If you’re going to code then code to disrupt, code to disseminate, code with fire and bile and fury. No one may thank you but stand against the wave in the best way you can and hold fast. Coding is a manifestation of imagination and will; whose do you choose?
Fugio is almost out of alpha development, which means that I’ve completed all the features I have planned for the first release, and tested it until I’m confident about its general stability and performance.
Now it’s time for you to try it!
I’ve done a call out for beta testers who will be the first people to get their hands on the software and will be instrumental in the next stage of its development.
If you are interested in applying, please see the online form.
For the past two weeks I’ve been chasing down bugs and writing documentation for Fugio, neither of which are my favourite pastimes.
I’ve done 152 code commits, added a great many helper features like a menu of examples that show what each node does in a nice, friendly way, and am generally trying to standardise what things are called and how they operate, to make it as easy as possible to get past the initial hurdle of approaching a new piece of software.
It’s amazing how much time that side of things takes. While I’m most excited about the timeline system (and the upcoming audio, video, and OpenGL plugins), I’ve spent almost as long working on making sure you can undo and redo things, that copy and paste works, and that MIDI synchronisation is good and tight.
I’m having to temper my enthusiasm for wanting to show it to the world with the knowledge that if it’s not documented and tested as best that I can, people just aren’t going to bother with it.
I’ve still got about half the documentation to write, which I want to finish over the next week.
Fugio (and Painting With Light) are both written in C++ and built using the Qt framework, mainly because it offers a (mostly) consistent API across multiple platforms. It provides a wide range of low- and high-level functionality, which is often great fun to play with.
Take the Qt Multimedia module, for example. It’s so high level that I couldn’t resist adding in a couple of nodes that interface with it, so above we have the new SoundEffect node that can load and quickly play WAV audio files when triggered.
You can also see the new Filename Node, which is another small helper: click the button and a file open dialog appears.
And here we have the Multimedia Player node, which can play back more complex media formats such as MP3s, and even video!
While I’ve been putting a lot of development time into an FFmpeg-based, timeline-controlled media playback node, sometimes you just need a simple way to play media, and these new nodes fit the bill nicely.
As I’m working on streamlining the MIDI control workflow, I hooked Fugio up to Resolume to see how easy it was to get them talking.
Here I’m using a colour timeline, breaking it into its RGB components and passing them through one of the new MIDI Helper nodes, which takes floating point values from 0.0 to 1.0, converts them to MIDI values from 0 to 127, and outputs them to Resolume to control its RGB controls.
Everything works, but mapping controls was more fiddly than it should be: Fugio couldn’t send a single CC value at a time, so I couldn’t take full advantage of Resolume’s MIDI mapping listen functionality. I’m thinking about how to add a way of doing that…
MIDI listen is a nice, fast way to map controls, so I added it to Fugio’s MIDI Input node, which now automatically adds pins on receipt of MIDI messages.
Today I’ve been looking at synchronising Fugio with various applications over MIDI. First, using the ever useful MIDIOX and loopMIDI, I was able to get Fugio synchronised to MIDI Time Code.
I also got MIDI clock working, synchronising Fugio to Ableton Live. The main difference is that Fugio works in time, not in measures/beats, so I had to set the BPM value manually to match the setting in Live; then it all matched up.
My plan for this is to feed the MIDI clock into a grid track, allowing direct and flexible translation between song positions and time.
I still have to finish off sending clock/MTC from Fugio, but it’s almost there.
I’ve been away for a couple of days running a Painting With Light video mapping workshop at Bournemouth University, so today I managed to do a little Fugio coding: I added a keyboard node that catches any keyboard sequence, from simply pressing R to combinations like CTRL+7, and generates a trigger.
In this patch pressing R generates a new random number.
This morning I added timeline track data recording into Fugio.
For now it can record numeric values and also colours (I have an idea for this) over time and then play them back. I need to add some punch in/out control, and I’m considering loop recording support that would incorporate the functionality I was aiming for in my old app MIDILoop.
At some point I guess I need to do raw MIDI and OSC data tracks too for fine control. There now exists the possibility of recording data from one source into a timeline, outputting that through other processing stages, and re-recording it all in-app.
Apologies for the horrible timeline colours – was testing some stylesheet stuff… 🙂
I managed to get the colour timeline controls working pretty well (still some finessing to do) so I thought I’d try a little experiment and feed the Hue, Saturation, and Lightness from the colour being generated in the colour timeline to a MIDI output, creating musical notes depending on the levels. There is a grand tradition of linking colours and musical pitch (see Isaac Newton’s 1704 book Opticks) so this provides a way of playing about with this data.