I have been reading up about fisheye removal and digital lens trickery in Processing.
I am curious to find a good method to remove (or add) the fisheye effect in live video (camera, USB, etc.), assuming the lens and camera optics are something you cannot change. This seems like a really useful problem to nail down, since correction is a prerequisite for pixel-perfect tracking and other camera fun.
It seems the shader method is the best. I have found examples here: https://forum.processing.org/two/discussion/14788/fisheye-lens-shader
Questions arise such as:
1) Can this shader be applied to live video?
2) Is this a computationally efficient way to remove distortion, or is there a better approach?
Answers
Shaders can be applied to anything that draws on the screen - every line or pixel can be moved or recoloured by a shader. So yes.
And yes, if you have a semi-decent graphics card. In a fragment shader, each fragment (pixel) is handled by a separate GPU core, and some of the newer cards have thousands of these dedicated cores.
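To make the per-fragment idea concrete, here is the kind of per-pixel math such a correction shader would run, sketched on the CPU in plain Java for readability. The single-coefficient "division" distortion model, the coefficient value, and the name undistort() are illustrative assumptions, not the model from the linked shader - a real lens needs calibrated coefficients (e.g. from a camera-calibration tool such as OpenCV's).

```java
// Per-pixel radial undistortion, as a fragment shader would compute it.
// Assumption: a simple one-coefficient "division" model; real lenses need
// calibrated distortion coefficients.
public class FisheyeMath {
    // Maps a pixel position to its corrected position.
    // Coordinates are normalised to [-1, 1] with (0, 0) at the image centre.
    // k > 0 removes barrel (fisheye) distortion; k < 0 would add it.
    static double[] undistort(double x, double y, double k) {
        double r2 = x * x + y * y;           // squared radius from the centre
        double scale = 1.0 / (1.0 + k * r2); // division model: one coefficient
        return new double[] { x * scale, y * scale };
    }

    public static void main(String[] args) {
        double k = 0.25;
        // The centre pixel is unaffected by radial distortion.
        double[] centre = undistort(0.0, 0.0, k);
        System.out.println(centre[0] + ", " + centre[1]);
        // A pixel at the edge is pulled towards the centre when
        // barrel distortion is removed.
        double[] edge = undistort(1.0, 0.0, k);
        System.out.println(edge[0]);
    }
}
```

In a GLSL fragment shader this same mapping runs once per output pixel, reading the source video texture at the remapped coordinate - the GPU simply does millions of these lookups in parallel.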
Rock on that sounds promising!
Say I wanted to scale this down to a cheap Windows PC, or maybe even a Raspberry Pi, for the lens correction/distortion.
Would this still be an appropriate technique, or would the computational overhead be too much and require a different approach? Or maybe there is a "resolution" knob on such a shader which could scale down the work for smaller systems?
Many thanks,
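On the "resolution" idea: fragment-shader cost scales with the number of pixels shaded per frame, so rendering the correction into a smaller offscreen buffer and upscaling the result is one way to trade sharpness for speed on weaker hardware. A back-of-envelope sketch of that arithmetic (the resolutions and frame rate are just illustrative numbers, not benchmarks of any particular device):

```java
// Rough cost model for a full-screen fragment shader: the work per frame is
// proportional to the number of fragments shaded, so halving the buffer's
// width and height quarters the load. Numbers here are illustrative only.
public class ShaderBudget {
    // Fragments the GPU must shade per second at a given resolution and rate.
    static long fragmentsPerSecond(int width, int height, int fps) {
        return (long) width * height * fps;
    }

    public static void main(String[] args) {
        // Full 720p at 30 fps...
        System.out.println(fragmentsPerSecond(1280, 720, 30));
        // ...versus a half-resolution offscreen buffer: a quarter of the work.
        System.out.println(fragmentsPerSecond(640, 360, 30));
    }
}
```

In Processing terms, that would mean drawing the shaded video into a smaller PGraphics and then drawing that buffer scaled up to the window, rather than shading every screen pixel directly.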