Be a Fugio beta tester!


Fugio is almost out of alpha development, which means that I’ve completed all the features I have planned for the first release, and tested it until I’m confident about its general stability and performance.

Now it’s time for you to try it!

I’ve put out a call for beta testers, who will be the first people to get their hands on the software and will be instrumental in the next stage of its development.

If you are interested in applying, please see the online form.

http://goo.gl/forms/67CHDOo1KL

 

Fugio Update Feb 13

For the past two weeks I’ve been chasing down bugs and writing documentation for Fugio, neither of which are my favourite pastimes.

I’ve done 152 code commits and added a great deal of helper features, like a menu of examples that show what each node does in a nice, friendly way. I’m also generally trying to standardise what things are called and how they operate, to make getting past the initial hurdle of approaching a new piece of software as easy as it can be.

It’s amazing how much time that side of things takes.  While I’m most excited about the timeline system (and the upcoming audio, video, and OpenGL plugins), I’ve spent almost as long working on making sure you can undo and redo things, that copy and paste works, and that MIDI synchronisation is good and tight.

I’m having to temper my enthusiasm for wanting to show it to the world with the knowledge that if it’s not documented and tested as well as I can manage, people just aren’t going to bother with it.

I’ve still got about half the documentation to write, which I want to have done over the next week.

Qt Multimedia in Fugio


Fugio (and Painting With Light) are both written in C++ and built using the Qt Project, mainly because it offers a (mostly) consistent API across multiple platforms. It offers a wide range of low- and high-level functionality, which is often great fun to play with.

Take the Qt Multimedia module, for example.  It’s so high level that I couldn’t resist adding in a couple of nodes that interface with it, so above we have the new SoundEffect node that can load and quickly play WAV audio files when triggered.
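
Under the hood, that kind of node is presumably little more than a thin layer over Qt Multimedia’s QSoundEffect class.  A minimal sketch of the sort of call involved (the filename and volume here are just placeholders, not Fugio’s actual code):

#include <QCoreApplication>
#include <QSoundEffect>
#include <QUrl>

int main( int argc, char *argv[] )
{
    QCoreApplication app( argc, argv );

    QSoundEffect effect;

    effect.setSource( QUrl::fromLocalFile( "hit.wav" ) );   // QSoundEffect is designed for low-latency WAV playback
    effect.setVolume( 0.75 );
    effect.play();                                          // fire and forget, much like triggering the node

    return app.exec();
}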

Taking MIDI input (or any of a multitude of other options, a Makey Makey for instance) and linking it up to SoundEffect nodes would rapidly create a simple sound board, but, y’know, one triggered by touching various fruits.

You can also see the new Filename Node, which is another small helper: click the button and a file open dialog appears.
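
That one is presumably just wrapping Qt’s standard file dialog; a sketch of what the button click might do (not Fugio’s actual code):

#include <QFileDialog>
#include <QString>

void onFilenameButtonClicked()
{
    // Show the standard file open dialog and grab the chosen path
    QString fileName = QFileDialog::getOpenFileName( nullptr, "Select File" );

    if( !fileName.isEmpty() )
    {
        // ...pass the chosen path out of the node's filename pin
    }
}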


And here we have the Multimedia Player Node, which can play back more complex media formats such as MP3s, and even video!

While I’ve been putting a lot of development time into an FFmpeg-based, timeline-controlled media playback node, sometimes you just need a simple way to play media, and these new nodes fit the bill nicely.
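
For the simple case, Qt Multimedia does nearly all of the work.  A rough sketch of playing back a video file with QMediaPlayer (Qt 5 API; the path is just an example):

#include <QApplication>
#include <QMediaPlayer>
#include <QVideoWidget>
#include <QUrl>

int main( int argc, char *argv[] )
{
    QApplication app( argc, argv );

    QMediaPlayer player;
    QVideoWidget videoWidget;       // only needed when the media contains video

    player.setVideoOutput( &videoWidget );
    player.setMedia( QUrl::fromLocalFile( "/path/to/clip.mp4" ) );

    videoWidget.show();
    player.play();

    return app.exec();
}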

Sequencing Resolume Arena with Fugio


As I’m working on streamlining the MIDI control workflow, I hooked Fugio up to Resolume to see how easy it was to get them talking.

Here I’m using a colour timeline, breaking it into RGB components and passing them through one of the new MIDI Helper nodes, which takes floating point values from 0.0 to 1.0 and converts them to MIDI values of 0-127, then outputs these values to Resolume to control its RGB controls.
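
The conversion such a helper performs is simple enough; here’s a sketch of the idea (my illustration, not the node’s actual source):

#include <algorithm>

// Map a normalised float (0.0 - 1.0) onto a 7-bit MIDI controller value (0 - 127)
int floatToMidiCC( float value )
{
    value = std::min( 1.0f, std::max( 0.0f, value ) );      // clamp out-of-range input

    return int( value * 127.0f + 0.5f );                    // round to the nearest step
}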

Everything works, but the process of mapping controls was more fiddly than it should be, because I couldn’t send a single CC value at a time and so couldn’t take full advantage of Resolume’s MIDI mapping listen functionality.  I’m thinking about how to add a way of doing that…

MIDI listen is a nice, fast way to map controls so I added it into Fugio’s MIDI Input node to automatically add pins on receipt of MIDI messages.

Fugio MIDI synchronisation


Today I’ve been looking at synchronising Fugio with various applications over MIDI.  First, using the ever-useful MIDI-OX and loopMIDI, I was able to get Fugio synchronised to MIDI Time Code.

I also got MIDI clock working, synchronising Fugio to Ableton Live.  The main difference is that Fugio works in time, not measures/beats, so I had to manually set the BPM value to match the setting in Live; then it all matched up.
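
The beats-to-time conversion itself is straightforward once the BPM is known: MIDI clock sends 24 pulses per quarter note, so each incoming tick advances the transport by a fixed slice of time.  A quick sketch:

// MIDI clock runs at 24 pulses per quarter note (PPQN), so with a known
// BPM every clock tick corresponds to a fixed number of seconds.
double secondsPerMidiClockTick( double bpm )
{
    const double pulsesPerQuarterNote = 24.0;

    return 60.0 / ( bpm * pulsesPerQuarterNote );   // e.g. 120 BPM -> ~0.0208s per tick
}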

My plan for this is to feed the MIDI clock into a grid track, allowing direct and flexible translation between song positions and time.

I still have to finish off sending clock/MTC from Fugio, but it’s almost there.

 

Fugio: Timeline Recording

This morning I added timeline track data recording into Fugio.


For now it can record numeric values and also colours (I have an idea for this) over time and then play them back. I need to add some punch in/out control, and am thinking of putting in loop recording support that would incorporate the functionality I was aiming for in my old app MIDILoop.

At some point I guess I need to do raw MIDI and OSC data tracks too for fine control. There now exists the possibility of recording data from one source into a timeline, outputting that through other processing stages, and re-recording it all in-app.

Apologies for the horrible timeline colours – was testing some stylesheet stuff… :-)

 

Fugio: Colour to MIDI notes

I managed to get the colour timeline controls working pretty well (still some finessing to do), so I thought I’d try a little experiment: feed the Hue, Saturation, and Lightness from the colour being generated in the colour timeline to a MIDI output, creating musical notes depending on the levels. There is a grand tradition of linking colours and musical pitch (see Isaac Newton’s 1704 book Opticks), so this provides a way of playing about with the idea.
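
As a rough illustration of the idea (the actual scaling and note range in Fugio may well differ), mapping a normalised colour channel onto a MIDI note could look like this:

#include <algorithm>

// Sketch: map a normalised colour channel (0.0 - 1.0) onto a MIDI note number.
// The two-octave range either side of middle C (60) is just an illustrative choice.
int channelToMidiNote( float channel )
{
    const int lowNote  = 36;
    const int highNote = 84;

    channel = std::min( 1.0f, std::max( 0.0f, channel ) );

    return lowNote + int( channel * float( highNote - lowNote ) + 0.5f );
}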

Christmas Goodies

I used this Christmas as an excuse to get a few things relating to the various software tools I’m writing at the moment.

The C++ Programming Language, 4th Edition by Bjarne Stroustrup replaces my well-loved, dog-eared 2nd Edition that I’ve had for many years. I haven’t felt like I’ve fully got my head around the new language features of C++11, and I enjoy Bjarne’s no-nonsense description of them. He did create C++, after all… It’s probably not a book for the absolute beginner, but it’s one that I refer to often, always picking up new tricks or refreshing some of the less-used techniques.

I’ve been working on my new software called Fugio (pictured above) for over a year now and I want to make it support a variety of hardware, so I got a couple of new things to try it with:

The last game controller I had was an ancient Logitech one that was quite nice until batteries kept leaking inside of it.  I upgraded to the Xbox 360 controller for Windows and wrote a node for Fugio to read all the various parameters from it.  It’s very simple to do with the Microsoft XInput API, although obviously Windows only.
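
Polling the pad really is only a handful of lines with XInput; roughly (a sketch, not the node’s actual code):

#include <windows.h>
#include <XInput.h>     // link against XInput.lib (or XInput9_1_0.lib)

// Poll controller 0 and read the normalised left thumbstick X axis
bool readLeftStickX( float &x )
{
    XINPUT_STATE state;

    ZeroMemory( &state, sizeof( XINPUT_STATE ) );

    if( XInputGetState( 0, &state ) != ERROR_SUCCESS )
    {
        return false;                                   // controller not connected
    }

    x = float( state.Gamepad.sThumbLX ) / 32767.0f;     // roughly -1.0 .. 1.0

    return true;
}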

While I love my original Korg nanoKontrol for MIDI control, I felt like I needed something a bit more ‘hitty’, so I plumped for the Akai MPD18 Compact Pad Controller so I can experiment with triggering events within Fugio.  I’ve got most of the controls mapped in using PortMidi and am just sorting out a small bug in the MIDI clock code, so I’ll be able to use the note repeat controls on the MPD18 in sync with Fugio playback.
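
Reading the pads with PortMidi boils down to polling the input stream for note-on messages; a rough sketch (the buffer size and mapping logic here are mine, not Fugio’s):

#include <portmidi.h>

// Poll an already-opened PortMidi input stream and pick out pad hits (note-on messages)
void pollPads( PortMidiStream *stream )
{
    PmEvent buffer[ 64 ];

    int count = Pm_Read( stream, buffer, 64 );

    for( int i = 0; i < count; i++ )
    {
        int status   = Pm_MessageStatus( buffer[ i ].message );
        int note     = Pm_MessageData1( buffer[ i ].message );
        int velocity = Pm_MessageData2( buffer[ i ].message );

        if( ( status & 0xF0 ) == 0x90 && velocity > 0 )  // note-on = a pad has been hit
        {
            // ...trigger whatever event 'note' is mapped to in Fugio, scaled by 'velocity'
        }
    }
}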

And while not related to software development, and much more related to the process of creating visuals and art, I’m very much enjoying reading Sculpting in Time: Reflections on the Cinema by the late, great film director Andrei Tarkovsky.  It’s a no-holds-barred personal rant about his views and experiences of making films and is full of insightful comments that are giving me much food for thought.

And with that Amazon Affiliate link-laden post done, I will wish you all a very happy New Year, and I’m looking forward to bringing you some exciting new tools in 2015.

Atmospheric Scattering Shader


Just found this rather nice Atmospheric Scattering rendering C++ code over at scratchapixel.com and thought I’d do a quick conversion to a GLSL shader as a test for the Timeline software I’m working on. Works rather nicely…


My (non-optimised) fragment shader conversion is:

#version 150

#define M_PI 3.1415926535897932384626433832795

uniform float TimeOfDay; // range 0.0 -> 1.0 (0.0 = Midnight, 0.5 = Midday, etc)

const float RADIUS_EARTH = 6360e3;
const float RADIUS_ATMOSPHERE = 6420e3;
const float RAYLEIGH_SCALE_HEIGHT = 7994;
const float MIE_SCALE_HEIGHT = 1200;
const float SUN_INTENSITY = 20;

const float g = 0.76;

const vec3 betaR = vec3( 5.5e-6, 13.0e-6, 22.4e-6 );    // Rayleigh scattering coefficients at sea level
const vec3 betaM = vec3( 21e-6 );                       // Mie scattering coefficients at sea level

vec3 sunDirection = vec3( 0, 1, 0 );

const int numSamples = 16;
const int numSamplesLight = 8;

struct Ray
{
    vec3 o; //origin
    vec3 d; //direction (should always be normalized)
};

struct Sphere
{
    vec3 pos;   //center of sphere position
    float rad;  //radius
};

const Sphere SPHERE_EARTH      = Sphere( vec3( 0 ), RADIUS_EARTH );
const Sphere SPHERE_ATMOSPHERE = Sphere( vec3( 0 ), RADIUS_ATMOSPHERE );

bool intersect( in Ray ray, in Sphere sphere, out float t0, out float t1 )
{
    vec3 oc = ray.o - sphere.pos;
    float b = 2.0 * dot(ray.d, oc);
    float c = dot(oc, oc) - sphere.rad*sphere.rad;
    float disc = b * b - 4.0 * c;

    if (disc < 0.0)
        return false;

    float q;

    if (b < 0.0)
        q = (-b - sqrt(disc)) / 2.0;
    else
        q = (-b + sqrt(disc)) / 2.0;

    t0 = q;
    t1 = c / q;

    // make sure t0 is smaller than t1
    if (t0 > t1) {
        // if t0 is bigger than t1 swap them around
        float temp = t0;
        t0 = t1;
        t1 = temp;
    }

    // if t1 is less than zero, the object is in the ray's negative direction
    // and consequently the ray misses the sphere
    if (t1 < 0.0)
        return false;

    if( t0 < 0.0 )
    {
        t0 = 0;
    }

    return( true );
}

vec3 computeIncidentLight( in Ray r )
{
    float       t0, t1;

    if( !intersect( r, SPHERE_ATMOSPHERE, t0, t1 ) )
    {
        return vec3( 1 );
    }

    float segmentLength = ( t1 - t0 ) / numSamples;
    float tCurrent = t0;

    vec3 sumR = vec3( 0 );
    vec3 sumM = vec3( 0 );

    float opticalDepthR = 0;
    float opticalDepthM = 0;

    float mu = dot( r.d, sunDirection );
    float phaseR = 3 / ( 16 * M_PI ) * ( 1 + mu * mu );
    float phaseM = 3 / (  8 * M_PI ) * ( ( 1 - g * g ) * ( 1 + mu * mu ) ) / ( ( 2 + g * g ) * pow( 1 + g * g - 2 * g * mu, 1.5 ) );

    for( int i = 0; i < numSamples ; i++ )
    {
        vec3    samplePosition = r.o + r.d * ( tCurrent + 0.5 * segmentLength );
        float   height = length( samplePosition ) - RADIUS_EARTH;

        // compute optical depth for light

        float hr = exp( -height / RAYLEIGH_SCALE_HEIGHT ) * segmentLength;
        float hm = exp( -height / MIE_SCALE_HEIGHT      ) * segmentLength;

        opticalDepthR += hr;
        opticalDepthM += hm;

        // light optical depth

        Ray lightRay = Ray( samplePosition, sunDirection );

        float lmin, lmax;

        intersect( lightRay, SPHERE_ATMOSPHERE, lmin, lmax );

        float segmentLengthLight = lmax / numSamplesLight;
        float tCurrentLight = 0;
        float opticalDepthLightR = 0;
        float opticalDepthLightM = 0;
        
        int j = 0;

        for( ; j < numSamplesLight ; j++ )
        {
            vec3 samplePositionLight = lightRay.o + lightRay.d * ( tCurrentLight + 0.5 * segmentLengthLight );

            float heightLight = length( samplePositionLight ) - RADIUS_EARTH;

            if( heightLight < 0 )
            {
                break;
            }

            opticalDepthLightR += exp( -heightLight / RAYLEIGH_SCALE_HEIGHT ) * segmentLengthLight;
            opticalDepthLightM += exp( -heightLight / MIE_SCALE_HEIGHT      ) * segmentLengthLight;

            tCurrentLight += segmentLengthLight;
        }

        if( j == numSamplesLight )
        {
            vec3 tau = betaR * ( opticalDepthR + opticalDepthLightR ) + betaM * 1.1 * ( opticalDepthM + opticalDepthLightM );
            vec3 attenuation = exp( -tau );

            sumR += hr * attenuation;
            sumM += hm * attenuation;
        }

        tCurrent += segmentLength;
    }

    return( SUN_INTENSITY * ( sumR * phaseR * betaR + sumM * phaseM * betaM ) );
}

void main()
{
    const int width = 512;
    const int height = 512;

    float a = mod( TimeOfDay - 0.5, 1 ) * 2.0 * M_PI;

    sunDirection = normalize( vec3( 0, cos( a ), sin( a ) ) );

    float x = 2 * ( gl_FragCoord.x + 0.5 ) / ( width  - 1 ) - 1;
    float y = 2 * ( gl_FragCoord.y + 0.5 ) / ( height - 1 ) - 1;

    float z2 = x * x + y * y; 

    if( z2 <= 1 )
    {
        float phi   = atan( y, x );
        float theta = acos( 1 - z2 );

        vec3 dir = vec3( sin( theta ) * cos( phi ), cos( theta ), sin( theta ) * sin( phi ) );
        vec3 pos = vec3( 0, RADIUS_EARTH + 1, 0 );

        gl_FragColor = vec4( computeIncidentLight( Ray( pos, normalize( dir ) ) ), 1 );
    }
    else
    {
        gl_FragColor = vec4( 0 );
    }
}