Why can't I set colors using full alpha?

We have 32 bits for a color, so for ARGB we have 8 bits for each channel.

So in theory I should be able to make a color with, for example:

red 255 green 0 blue 0 alpha 0

So a full red color that you can't see because it's totally transparent... In practice it becomes 0, 0, 0, 0, which makes sense because it's a lot faster to draw the result.
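
Just to show the layout I mean: Processing stores a color as one 32-bit int, with alpha in the highest byte, then red, green and blue. A quick check in a sketch:

color c = color(255, 0, 0, 0);  // red 255, green 0, blue 0, alpha 0
println(hex(c));                // prints 00FF0000 -- the red bits are still in the int

So the value I build does contain the red bits; it's only once it gets drawn that I end up with 0, 0, 0, 0.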

Then, going a bit further: if we have a color like:

vec4(1, 0, 0, 0.5);

Then in Processing we end up with 128, 0, 0, 128.

I would like that to be 255, 0, 0, 128. Is this possible in any way? The reason is that I use bitflags to store data, and as it stands now it's not possible to use the alpha channel, because it messes up all the bitflags in the RGB area.
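
For example, the packing I have in mind looks something like this (the function names and flag layout are just an illustration):

// Pack up to 32 on/off flags into a single ARGB int, one bit per flag.
int packFlags(boolean[] flags) {
  int packed = 0;
  for (int i = 0; i < flags.length && i < 32; i++) {
    if (flags[i]) packed |= 1 << i;
  }
  return packed;
}

// Read one flag back out of the packed int.
boolean flagAt(int packed, int i) {
  return (packed & (1 << i)) != 0;
}

The top 8 bits of that int land in the alpha channel, which is exactly the part I can't use right now.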

Hope someone can follow me :) (I want to make a tutorial about it soon for the Processing site, because I think really neat things can be done with it.)

Answers

  • Ok, I'm a bit further. https://www.opengl.org/wiki/Image_Format

    There is, for example:

    GL_RGB10_A2

    That format uses 10 bits each for r, g and b, and the remaining 2 bits for alpha. However, I have no idea whether there's any way to use this format in Processing. I might check later once I have my original idea working. If someone knows, please tell.
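
    Just to make the bit budget concrete, the split would be something like this (the bit order here is only illustrative; the real in-memory layout depends on the pixel transfer format):

    // Pack 10-bit r, g, b (0-1023) and 2-bit a (0-3) into one 32-bit int.
    int packRGB10A2(int r, int g, int b, int a) {
      return (a << 30) | (r << 20) | (g << 10) | b;
    }

    int red10(int packed)   { return (packed >> 20) & 0x3FF; }
    int green10(int packed) { return (packed >> 10) & 0x3FF; }
    int blue10(int packed)  { return packed & 0x3FF; }
    int alpha2(int packed)  { return (packed >>> 30) & 0x3; }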

  • Answer ✓

    By default Processing (OpenGL) uses the fragment's alpha value to blend/mix it with other fragments "behind" the current one (at least if alpha is smaller than 1). In other words, the default is blendMode(BLEND);.

    In case you don't want the graphics card to perform blending, just call blendMode(REPLACE); before using shader() or filter(). This way you can use the alpha channel without screwing up red, green and blue; see the sketch below.

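    A minimal sketch of that setup, rendering into a PGraphics ("flags.glsl" is just a placeholder name for your own fragment shader):

    PShader flagShader;
    PGraphics data;

    void setup() {
      size(400, 400, P2D);
      // Placeholder shader file -- swap in your own fragment shader.
      flagShader = loadShader("flags.glsl");
      data = createGraphics(400, 400, P2D);
    }

    void draw() {
      data.beginDraw();
      // REPLACE writes the shader's RGBA output as-is, with no blending,
      // so the alpha value no longer gets mixed into red, green and blue.
      data.blendMode(REPLACE);
      data.filter(flagShader);
      data.endDraw();

      // The raw 32-bit ARGB values come back with the bitflags intact.
      data.loadPixels();
      println(hex(data.pixels[0]));

      image(data, 0, 0);
    }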

  • Thank you so much!! This is going to be so nice :) Now I can store 32 states in 1 pixel!
