Live Performance

How would I go about using Processing to generate images in a live setting, as in accompanying a DJ or other performer in real time? Is this even possible, or would it be too much of a pain to do with this software?

I am trying to learn how to do live visuals similar to what Jonathan Singer and Android Jones are known for. I know Android does his shows entirely live and on the fly, but the generation and variety of images, as well as the speed at which he does it all, are astonishing. I know he uses some Processing, but I have no idea what he uses for live performances.

If Processing is simply not the way to go for live performances, then feel free to point me in another direction.

For reference: Phadroid - Singer


Comments

  • No one is coding this stuff on the fly: it will all be pre-coded, and the artists will be switching between different sketches (or possibly even pre-rendered videos if the content is non-interactive) that generate the different visuals. With interactive sketches they'll probably be adjusting settings too, most likely through a GUI rather than by manually editing variables in the code. (A minimal scene-switching example follows at the end of this comment.)

    So I guess the real question is how to switch between sketches, videos, etc. in a live environment. I don't know the answer, but I'd suspect a utility that allows switching video output between different windows, just as a DJ can switch between different audio channels. You'd then be able to queue up sketches and videos and fade between them.
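
    To give an idea of the pre-coded approach, here is a minimal, purely illustrative sketch: each visual is just a function, and the number keys switch between "scenes". The scene contents are made up; the structure is the point.

        // Two pre-coded "scenes" selected live with the number keys.
        int currentScene = 0;

        void setup() {
          size(800, 600);
          noStroke();
        }

        void draw() {
          background(0);
          if (currentScene == 0) {
            pulsingCircle();
          } else {
            scanLines();
          }
        }

        void pulsingCircle() {
          float d = 150 + 100 * sin(frameCount * 0.05);
          fill(255, 0, 120);
          ellipse(width/2, height/2, d, d);
        }

        void scanLines() {
          fill(0, 200, 255);
          for (int y = 0; y < height; y += 20) {
            rect(0, (y + frameCount * 2) % height, width, 4);
          }
        }

        void keyPressed() {
          // '1' and '2' pick a scene; a MIDI or OSC message could do the same job.
          if (key == '1') currentScene = 0;
          if (key == '2') currentScene = 1;
        }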

  • If I were you, I'd check out Syphon.

    I made Processing visuals for a club that has a big wall of projections. What they used to run the videos on those projections was Resolume, one of the main VJ applications available. To connect my sketch to Resolume, we used Syphon. There is a Syphon library for Processing (see the library page).

    So, basically, you import the Syphon library, call it in your sketch, run the sketch and then connect Resolume to Syphon. Resolume will then have the visual output of your sketch alongside the other video playbacks and feeds, and you'll be able to display it and switch from one visual to the other. VJ software like that also has fancy transitions and opacity settings, so you can combine your interactive sketch with pre-recorded stuff and live feeds. (There's a minimal Syphon example at the end of this comment.)

    Syphon is integrated with an impressive number of VJ and multimedia applications, so look around to see what the best option is for you. Maybe you could check with the venue where you want to perform, see what their system is, and start from there.

    About the sketch and the code itself: as blindfish said, no one really codes live. The best strategy, in my opinion, is to pre-code visuals and implement an input system (keyboard, mouse, game controller, Arduino sensor, camera, etc.) that you'll use to modify and change the visuals during the performance. I've even heard of people wandering through the crowd with a Bluetooth keyboard, controlling the visuals from the dancefloor. (The two sketches below show the Syphon hookup and a toy input system.)
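
    Here is roughly what the Syphon side looks like, based on the library's SendScreen example. The server name is arbitrary, and Syphon itself is macOS-only.

        // Publish a Processing sketch as a Syphon source so Resolume (or any
        // Syphon-aware VJ app) can pick it up. Based on the library's SendScreen example.
        import codeanticode.syphon.*;

        SyphonServer server;

        void setup() {
          size(640, 360, P3D);  // Syphon needs an OpenGL renderer (P2D or P3D)
          server = new SyphonServer(this, "Processing Syphon");
        }

        void draw() {
          background(0);
          lights();
          translate(width/2, height/2);
          rotateY(frameCount * 0.02);
          box(150);
          server.sendScreen();  // send the current frame; it shows up as a Syphon source
        }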
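
    And a toy example of the kind of input system I mean: mouse position drives speed and colour, and the spacebar toggles a strobe. The parameters are invented, but the same pattern works with a game controller, MIDI, or an Arduino over serial.

        // Live parameter control from the mouse and keyboard.
        float angle = 0;
        boolean strobe = false;

        void setup() {
          size(800, 600);
          rectMode(CENTER);
          colorMode(HSB, 255);
        }

        void draw() {
          float speed = map(mouseX, 0, width, 0.005, 0.2);  // horizontal mouse = rotation speed
          float hue   = map(mouseY, 0, height, 0, 255);     // vertical mouse = colour
          background((strobe && frameCount % 4 < 2) ? 255 : 0);
          translate(width/2, height/2);
          angle += speed;
          rotate(angle);
          fill(hue, 200, 255);
          rect(0, 0, 250, 250);
        }

        void keyPressed() {
          if (key == ' ') strobe = !strobe;  // swap in MIDI, OSC or serial input the same way
        }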
