Return min / max from Shader


I wrote my first shader (a Mandelbrot set) by hacking the Conway and Nebula examples. It works, but after zooming in a little the image loses contrast. This is not a code problem; it is a problem of not increasing the contrast algorithmically.

To increase the contrast I would need to find the max (and ideally the min as well) of the pixel values. Is it possible to get the max from the shader somehow in Processing, or do I have to scan the pixels after the sketch displays them and find it that way?

Answers

  • You will have to step through each pixel after the scene has been rendered to find the darkest and brightest values in the image. If you are worried about the framerate dropping because of the number of operations, experiment with skipping 10 or 100 pixels at a time when traversing the pixels[] array. Hell, even randomly sampling enough pixels to find the min and max values will do the job. Something like the sketch below.
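
    A minimal sketch of that idea in Processing (the stride value and the use of brightness() as the metric are assumptions, not something from the post):

        // Estimate the darkest and brightest values by sampling every
        // Nth entry of pixels[] instead of all of them.
        float minB, maxB;

        void findMinMax(int stride) {
          loadPixels();
          minB = 255;
          maxB = 0;
          for (int i = 0; i < pixels.length; i += stride) {
            float b = brightness(pixels[i]);  // brightness of the sample, 0..255
            if (b < minB) minB = b;
            if (b > maxB) maxB = b;
          }
        }

    Calling findMinMax(1) scans every pixel; findMinMax(100) trades accuracy for speed, as suggested above.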

  • Well, that's a bummer. The skipping approach sounds like a good idea; I'll try it out.

  • If you want to leverage the GPU for this operation, you would be performing a kind of map/reduce job: write a shader that samples a number of pixels from a texture (set as a uniform in the shader) and writes the smallest and largest pixel values it finds to a new, smaller texture. That operation would be repeated enough times that reading the pixels of the final texture takes very little time.

    Does this make sense? I did something similar a while back to find the average brightness of a texture without having to traverse the entire pixels array.

    Step 1: 512x512 -> 256x256
    Step 2: 256x256 -> 128x128
    Step 3: 128x128 -> 64x64
    Step 4: 64x64 -> 32x32
    Step 5: 32x32 -> 16x16

    At each step the shader would take the previous generation as input, average up 4 pixels, and write the value to the texture that was the render target.

    Then I would just sum up the values of the 16x16 texture. It was a real pain in the ass, and a total nightmare to debug.
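
    A rough Processing-side driver for that ping-pong loop, assuming a square power-of-two source and a hypothetical reduce.glsl fragment shader that samples the 2x2 input block behind each output fragment (via Processing's built-in texture and texOffset uniforms) and writes the running min to the red channel and the running max to the green channel:

        // Repeatedly halve the texture down to 16x16; each pass renders the
        // previous generation through the (hypothetical) reduce shader.
        PImage reduceToSmall(PImage src) {
          PShader reduce = loadShader("reduce.glsl");
          PImage current = src;
          for (int s = src.width / 2; s >= 16; s /= 2) {
            PGraphics next = createGraphics(s, s, P2D);
            next.beginDraw();
            next.shader(reduce);
            next.image(current, 0, 0, s, s);  // downsample by half
            next.endDraw();
            current = next;
          }
          return current;  // 16x16: only 256 pixels left to read on the CPU
        }

    The first pass would seed both channels with the same brightness value; after that, each generation keeps the min of the red channels and the max of the green channels, so reading back the final 16x16 texture is cheap.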

  • I think I understand how what you're explaining would be done on a CPU, but I will have to look up some GLSL tutorials on how to sample multiple pixels.

    I am testing on my computer at the moment, but eventually it will run on a wall, so I'd prefer not to use something that requires specific pixel dimensions (powers of 2). I imagine that reducing every column of pixels (once I figure out how to do that) might work. Basically I would just write the min / max to a texture 2 pixels tall and w pixels wide, right?
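
    A sketch of what the Processing side of that column idea might look like, assuming a hypothetical colminmax.glsl fragment shader in which each output fragment loops over its own column of the input texture, writing the column min to row 0 and the column max to row 1:

        PShader colMinMax;
        PGraphics strip;  // w pixels wide, 2 pixels tall

        void setup() {
          size(640, 480, P2D);
          colMinMax = loadShader("colminmax.glsl");  // hypothetical shader
          strip = createGraphics(width, 2, P2D);
        }

        void updateStrip(PImage src) {
          strip.beginDraw();
          strip.shader(colMinMax);
          strip.image(src, 0, 0, strip.width, strip.height);
          strip.endDraw();
          strip.loadPixels();  // only 2 * width pixels to scan on the CPU
        }

    No power-of-two sizes required, at the cost of a longer loop inside the shader.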
