After several months of work, a new alpha release of Processing 2.0 is finally out. This 2.0a5 release is significant because it includes many important changes (the detailed list is
here), among them a completely overhauled OpenGL renderer. Since this renderer is brand new, not all of its functionality is implemented yet (PDF recording, for example), and it probably contains several bugs that we will try to iron out in upcoming releases. I just wrote a
blog entry describing the main aspects of this renderer.
I'm currently working on the new core video library for Processing 2.0. It is a simplified derivative of
GSVideo, and like GSVideo it is based on
GStreamer in order to ensure cross-platform compatibility. Performance-wise, it should also be faster than the current built-in video library, which is based on QuickTime, at least in most situations.
However, it requires bundling the GStreamer native libraries for each platform (actually, only for Windows and OS X, since it is safe to assume that GStreamer will be available on a Linux system). To minimize the impact on the final size of the Processing package, I'm trying to trim away the GStreamer plugins that support esoteric codecs and functionality.
To make sure that I don't remove anything important, I'd like to know which codecs (h264, xvid, theora, etc.) and containers (mov, avi, etc.) are most used by people doing video with Processing.
Please leave your comments here. If the subsequent discussion becomes too chaotic, I could also organize an online poll to get more accurate numbers and base the decision on them.
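For context, sketches written against the new library should look much like those using the current video library, since the plan is to keep the same Movie API on top of the GStreamer backend. A minimal playback sketch (where "clip.mov" is just a placeholder file name in the sketch's data folder) would be:

```java
// Minimal Processing playback sketch. Movie, movieEvent() and loop()
// are the standard Processing video API, which the new GStreamer-based
// library is assumed to keep; "clip.mov" is a placeholder file name.
import processing.video.*;

Movie mov;

void setup() {
  size(640, 360);
  mov = new Movie(this, "clip.mov");
  mov.loop();  // play the movie repeatedly
}

// Called by the library whenever a new frame is available.
void movieEvent(Movie m) {
  m.read();
}

void draw() {
  // Draw the most recently decoded frame, scaled to the sketch window.
  image(mov, 0, 0, width, height);
}
```

Whether a given file plays in a sketch like this is exactly what depends on the bundled plugins, which is why knowing the codecs and containers people actually use matters.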
I'm using the dlibs_freenect library in a tracking setup with two Kinects running simultaneously. The library supports multiple Kinect devices, but I'm having a hard time getting both working at the same time (most of the time, only one gets properly initialized and sends data, while the second cannot be opened). I would like to know if there is any advice or recommendation for a configuration like this. The operating system is Windows 7, 32-bit.