The code: https://github.com/i-make-robots/LEDWall/blob/64x36-wall/Processing/screenCapture/screenCapture.pde
The goal: I've built a wall of 64x36 WS2812B LEDs driven by an Octo shield on a Teensy 3.1 (https://www.pjrc.com/teensy/td_libs_OctoWS2811.html). The sketch samples a corner of the screen on my laptop and sends it to the Octo, which then displays the video. All of which works! It's even going to be at Maker Faire NYC 2015.
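For context, the capture loop boils down to a java.awt.Robot grab of a screen region, downsampled to the 64x36 grid. This is a simplified sketch of that idea with placeholder region and preview sizes, not the exact code from the repo linked above:

import java.awt.AWTException;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;

Robot robot;
int gridW = 64;  // LED columns
int gridH = 36;  // LED rows

void setup() {
  size(640, 360);
  try {
    robot = new Robot();
  } catch (AWTException e) {
    e.printStackTrace();
  }
}

void draw() {
  // grab the top-left corner of the screen
  BufferedImage grab = robot.createScreenCapture(new Rectangle(0, 0, 640, 360));
  PImage frame = new PImage(grab);
  // downsample to the LED grid and show the preview
  frame.resize(gridW, gridH);
  image(frame, 0, 0, width, height);
  // the pixel data then gets packed and written to the Teensy over serial
}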
So what's the issue: There's about a 1s delay between the video playing on my laptop (2011 17" MacBook Pro, OS X 10.10.1) and the preview visible in the Processing window. I'd like to reduce this delay as much as possible so the audio better matches the image on the LED wall.
I'm comfortable writing my own Java apps. I'd like to stay in an environment that's widely adopted so install headaches are someone else's problem. I've been trying to optimize the heck out of the code to no avail.
Thoughts?
Please & thank you!
Answers
Hello,
The Robot API is meant for testing purposes, not really for realtime applications.
On OS X you can have a look at Syphon, on Windows there is Spout, and on Linux FFmpeg with x11grab.
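If you go the Syphon route, the receiving side in Processing is small. Something like the sketch below (written from memory of the Syphon library's receive example, so treat it as a rough guide; it only works on OS X and needs an application that actually publishes frames to a Syphon server):

import codeanticode.syphon.*;

PGraphics canvas;
SyphonClient client;

void setup() {
  size(640, 360, P3D);
  // connect to the first Syphon server it finds
  client = new SyphonClient(this);
}

void draw() {
  if (client.newFrame()) {
    canvas = client.getGraphics(canvas);
    image(canvas, 0, 0, width, height);
  }
}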
Cheers,
Jeremy.
I don't think Syphon will be of any use for what you want. Also, Robot should be no problem for the size you want; that should be possible at 60 fps.
I think you should measure what is taking so long.
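A quick way to do that: wrap each stage of draw() with System.nanoTime() and print the timings. Something like this minimal sketch, where the serial write is left as a commented placeholder since I don't know what your send routine is called:

import java.awt.AWTException;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;

Robot robot;

void setup() {
  size(640, 360);
  try {
    robot = new Robot();
  } catch (AWTException e) {
    e.printStackTrace();
  }
}

void draw() {
  long t0 = System.nanoTime();
  BufferedImage grab = robot.createScreenCapture(new Rectangle(0, 0, 640, 360));
  long t1 = System.nanoTime();
  PImage frame = new PImage(grab);
  frame.resize(64, 36);
  long t2 = System.nanoTime();
  // sendFrame(frame);  // put your serial write here to time it as well
  long t3 = System.nanoTime();
  println("capture " + (t1 - t0) / 1e6 + " ms, "
        + "resize "  + (t2 - t1) / 1e6 + " ms, "
        + "send "    + (t3 - t2) / 1e6 + " ms");
}

If the capture itself already eats most of a frame, the delay is in createScreenCapture() and tweaking the rest of the sketch won't help; if capture is fast, look at the serial side instead.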