fugScreenCapture is a new utility that allows you to capture all or part of the window of an application running on your computer, and send it as a video stream to other applications that support the bigfug video streaming system.
NOTE: fugScreenCapture is currently in development, which means:
It’s currently only available for Windows – OSX and Linux versions are planned
It uses only the most basic window capture method, so:
Capture may not be particularly fast on your system
It might not be able to capture all windows
You can download a demo version to try the capture performance out on your system and see if it meets your requirements.
Laurent Smadja and Scott Baker informed me of a couple of problems with fugFeedbackGL, namely that alpha wasn’t working quite as expected, and that on some Apple computers, the resulting image was stretched incorrectly.
There is now a 1.1 update for fugFeedbackGL, for both Windows and Apple OSX, that fixes these issues.
Existing customers can download the update by visiting the My Account page.
Michael DeMattia sent me a couple of images from the recent VJ gig he did at Echostage, Washington DC’s largest concert venue, where he’s been using my fugScopeGL and fugPowerGL FFGL plugins.
Eight years ago I first had the idea for a real-time video streaming system that would enable me to take a live stream from within my VJ software of choice at the time – Visual Jockey – and send it, either locally to another application running on the same computer, or across a network to another application running on a different computer altogether.
The original idea was to externalise not just video, but audio, MIDI, and any other streams of data. I came up with a piece of software called PatchBox, the idea being that it would operate like a patch bay, allowing data to be routed from multiple sources to multiple destinations, with some allowance for processing along the way.
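The patch-bay idea can be sketched as a minimal many-to-many router. To be clear, this is purely an illustrative sketch – the class and method names here are hypothetical and don’t reflect PatchBox’s actual design or API:

```python
from collections import defaultdict

class PatchBay:
    """Hypothetical sketch of a patch bay: named sources feed named
    destinations, with optional processing along each patch."""

    def __init__(self):
        # destination name -> list of (source name, processor) patches
        self._patches = defaultdict(list)
        # source name -> latest published value (e.g. a video frame)
        self._sources = {}

    def set_source(self, name, value):
        # Publish the latest frame/value for a named source.
        self._sources[name] = value

    def connect(self, source, dest, process=None):
        # Route a source to a destination, optionally transforming
        # the data on the way through.
        self._patches[dest].append((source, process or (lambda x: x)))

    def read(self, dest):
        # Collect the processed values from every source patched
        # into this destination, skipping sources with no data yet.
        return [proc(self._sources[src])
                for src, proc in self._patches[dest]
                if src in self._sources]
```

A destination simply calls `read()` each frame and receives whatever its connected sources last published, already run through any per-patch processing – which is essentially the routing behaviour a hardware patch bay gives you, generalised to arbitrary data.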
My vision was to leverage the processing abilities of all software on all platforms for the purposes of real-time creativity and art. Why have such limitations and incompatibilities?
Of course, from testing the initial release, I soon found that streaming live video in real-time was fraught with complications: it required a fair amount of processing and careful management of images – especially across a network – to maintain a consistent, high-quality stream that would be usable.
Despite initial interest, it seemed too complicated a task to maintain, although I’ve tried several times since then.
The latest release of my real-time video streaming system is the fourth incarnation (that I can remember) and can finally be considered usable. It is also the first version that has fulfilled my original goal of running on both Windows and OSX.
This initial release focuses on implementing sending and receiving plugins for the FreeFrame specification, which is used now, as it was back then, as a standard for VJ software.
I hope to incorporate the system into more software over time, including my own video mapping software Painting With Light, further advancing my goal of combining real-time improvised video art performance with video mapping.
A couple of example patches running simultaneously in my art software, which I’m slowly developing as time allows. Its interface is somewhat along the lines of vvvv or Max/MSP, but I’m hoping to sculpt it into something clean and usable for dynamic VJ performances.
A strangely profound moment today when I printed a 3D “Utah Teapot”, an old and very common model often used as a simple test. It was created in 1975, and I’ve seen images of it and played with it for my whole life. To finally hold a physical manifestation of this wholly virtual object, feeling the faces of the polygons between my fingers for the first time, was quite fascinating. I can’t put it down now!