How do I use a sampler2D array in Processing?

edited February 2018 in GLSL / Shaders

I would like to keep a frame buffer in the shader (say, the last 1 second of video).

To do that I tried uniform sampler2D tex[30]; in the shader, and from Processing I try writing into that array with shader.set("tex[" + F + "]", frame);

Somehow it doesn't work. It behaves as if there is only one frame and the frame index is completely ignored. It always displays the last received video frame, no matter which indices I use for reading or writing.

I thought about having 30 sampler2D variables, but then I could not access them by index... Any ideas on how to achieve this?

Answers

  • I tested the approach of setting arrays and it works fine with floats.

    This creates 3 horizontal bars that I can control independently by pressing 'r', 'g' and 'b'.

    // frag.glsl
    uniform float rgb[3];
    void main() {
      vec2 imgSize = vec2(640.0, 480.0);
      vec2 pos = gl_FragCoord.xy/imgSize;
      float c = pos.x > rgb[int(pos.y*3.0)] ? 1.0 : 0.0;
      gl_FragColor = vec4(c, c, c, 1.0);
    }
    // .pde
    void keyPressed() {
      if(key == 'r') fx.set("rgb[0]", random(1));
      if(key == 'g') fx.set("rgb[1]", random(1));
      if(key == 'b') fx.set("rgb[2]", random(1));
    }
    

    So the issue might be related to webcam references / PImage / textures.

  • I leave here a full minimal program you can try:

    // fragment shader
    uniform sampler2D tex[30]; 
    uniform int show;
    varying vec4 vertTexCoord;
    void main() {
      gl_FragColor = texture2D(tex[show], vertTexCoord.st);
    }
    
    // processing
    import processing.video.*;

    PShader fx;
    Capture cam;
    //PImage tex[] = new PImage[30];
    int frame = 0;

    void setup() {
      size(640, 480, P2D);
      //for(int i=0; i<tex.length; i++) {
      //  tex[i] = createImage(640, 480, ARGB);
      //}
      fx = loadShader("shaders/frag.glsl");
      cam = new Capture(this, 640, 480, Capture.list()[0], 30);
      cam.start();
    }

    void draw() {
      if (cam.available() && frame < 30) {
        cam.read();

        //A: keep a copy on the Processing side
        //tex[frame] = cam.copy();
        //fx.set("tex[" + frame + "]", tex[frame]);

        //B: send the frame straight to the shader
        fx.set("tex[" + frame + "]", cam.copy());
        frame++;
      }

      fx.set("show", 1); // it should be stuck on frame 1
      shader(fx);
      rect(0, 0, width, height);
    }
    

    The program sends the first 30 frames to the shader. With the uniform show we tell the shader to display only frame 1. Instead, it shows the live webcam feed, ignoring that value.

    I tried keeping a copy of the images on the Processing side (commented out) but the result was the same.
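The intended "last second of video" buffer is a ring buffer: after N frames the next write wraps back to slot 0, so the slot just past the newest frame always holds the oldest one. A minimal sketch of that index math in plain Java (RingIndex and its methods are illustrative names, not Processing API):

```java
// Sketch of the ring-buffer index math behind a 30-slot frame buffer.
// RingIndex and its methods are hypothetical helpers, not Processing API.
public class RingIndex {
  // Slot the next write goes to after `frameCount` frames have been stored.
  static int writeSlot(int frameCount, int slots) {
    return frameCount % slots;
  }

  // Slot holding the frame captured k frames ago (k = 0 is the newest).
  static int readSlot(int frameCount, int slots, int k) {
    return ((frameCount - 1 - k) % slots + slots) % slots;
  }

  public static void main(String[] args) {
    System.out.println(writeSlot(35, 30));    // after 35 frames, next write hits slot 5
    System.out.println(readSlot(35, 30, 0));  // newest frame sits in slot 4
    System.out.println(readSlot(35, 30, 29)); // oldest frame sits in slot 5
  }
}
```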

  • edited February 2018

    @hamoid
    Your code is working correctly on my Windows 10 x64 machine with Processing 3.3.6 and Processing Video 1.0.1.

    Make sure that your Capture.list()[0] is capable of doing 30fps.

    // list available capture devices with their index
    import processing.video.Capture;
    int i = 0;
    for (String name : Capture.list())
      println(name + " [" + i++ + "]"); // : )
    exit();
    
  • edited February 2018

    @nabr Thank you! Do you mean that when you run the program it successfully shows a static image from the webcam? If it does, then it might somehow be related to my integrated Intel graphics.

    How would it matter whether my webcam is capable of doing 30fps? I do see the webcam. The problem is that it doesn't show just one frame; it keeps showing the frames as they arrive.

    The output of running your test program is this:

    Microsoft® LifeCam HD-3000: Mi[0]
    FJ Camera: FJ Camera[1]
    

    With Processing 3.3.6 and video 1.0.2 on ArchLinux.

    ps. shorter version of your program:

    import processing.video.Capture;
    printArray(Capture.list());
    exit();
    
  • edited February 2018

    @hamoid Thanks for the shorter version. I'm not the Java guy here. The output on my system looks like this:

    [0] "name=DroidCam Source 3,size=640x480,fps=0"
    [1] "name=DroidCam Source 3,size=640x480,fps=30"
    [2] "name=EasyCamera,size=1280x720,fps=30"
    [3] "name=EasyCamera,size=160x120,fps=30"

    So DroidCam is a virtual camera app I need; I forgot about it.

    I can run your sketch with
    Capture.list()[2] // EasyCamera,size=1280x720,fps=30

    Yes, it successfully shows a static image from the webcam.

    I don't know, e.g. you check
    if (cam.available() && frame < 30) { frame++;

    which means a cam delivering only 22fps would take longer than 1 second to supply the 30 frames that start up the shader. The shader is running at 60fps, so a single shader cycle is more than twice as short as your frame++ loop.

    Ergo: it could be that you are constantly feeding the stream, or that you just don't notice the effect, because many of those 22fps frames look the same if you break them down, e.g. via ffmpeg. I don't want to go deeply into this, but motion starts at about 16fps.
    Ergo: there could be inconsistent behavior in general.
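The timing argument above can be put into rough numbers. A minimal sketch (FillTime is a hypothetical name; the fps figures come from this discussion):

```java
// Rough arithmetic behind the startup-timing concern: a slower webcam takes
// longer than one second to deliver 30 frames, while the 60 fps draw() loop
// keeps rendering the partly-filled buffer in the meantime.
// FillTime is an illustrative name, not part of the sketch above.
public class FillTime {
  // Seconds a camera at `cameraFps` needs to deliver `frames` frames.
  static double secondsToFill(int frames, double cameraFps) {
    return frames / cameraFps;
  }

  public static void main(String[] args) {
    System.out.println(secondsToFill(30, 30.0)); // 1.0 s at the nominal 30 fps
    System.out.println(secondsToFill(30, 22.0)); // ~1.36 s at 22 fps
    // Number of draw() calls a 60 fps sketch makes in that time:
    System.out.println(Math.round(60 * secondsToFill(30, 22.0))); // ~82
  }
}
```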

    http://nuigroup.com/forums/viewthread/12405/

    Sorry man, I can't help you any further, I'm only here in passing.

    EDIT: I also put weird numbers in the if statement, like 23, and it works.

    Hm, maybe a shader bug. Try to upgrade your shader version. In Processing:
    PJOGL.profile = 3;
    and in the shader
    #version 150

    EDIT 2: Also, I'm on Windows, which means every loop gets unrolled (tex[0], tex[1], etc.) before being translated from GLSL to HLSL by ANGLE. If you are on Linux, look up via "your favorite search engine" something like texture array loop linux glsl bug.
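For context, the "unrolling" refers to the restriction that, before GLSL 4.00 (and in GLSL ES 1.00), a sampler array may only be indexed with constant expressions, so compilers effectively rewrite tex[show] into constant-index accesses. A hand-written equivalent, sketched here for 4 slots:

```glsl
// Manual "unroll" of tex[show]: every sampler access uses a constant
// index, which stays legal on GLSL versions that reject dynamic
// indexing of sampler arrays. Sketch for 4 slots; extend to 30.
uniform sampler2D tex[4];
uniform int show;
varying vec4 vertTexCoord;

vec4 pick(vec2 uv) {
  if (show == 0) return texture2D(tex[0], uv);
  if (show == 1) return texture2D(tex[1], uv);
  if (show == 2) return texture2D(tex[2], uv);
  return texture2D(tex[3], uv);
}

void main() {
  gl_FragColor = pick(vertTexCoord.st);
}
```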

    @hamoid

    Did some webcam tests; my webcam is working correctly : ) Very happy about it.

    // fragment shader see above
    
    // processing
    import processing.video.*;
    
    PShader fx;
    Capture cam;
    void setup() {
      size(640, 480, P2D);
      fx = loadShader("shaders/frag.glsl");
      cam = new Capture(this, 640, 480, Capture.list()[2], 30);
      cam.start();
      frameRate(30);
    }
    int frame = 0;
    void draw() {
      if (cam.available()) {
        if (frame % 30 == 0) frame = 0; // wrap back to the first slot
        cam.read();
        fx.set("tex[" + frame + "]", cam.copy());
        frame++;
      }

      fx.set("show", 1);
      shader(fx);
      rect(0, 0, width, height);
    }
    

    // gif at 33fps; you can see the millis jumping around 60 milliseconds.


  • Thank you. Your comment gives me some pointers. These seem relevant:

    https://stackoverflow.com/questions/12030711/glsl-array-of-textures-of-differing-size
    https://www.opengl.org/discussion_boards/showthread.php/171328-Issues-with-sampler-arrays-and-unrolling

    I also discovered texture3D. I need to look into that... Maybe these things are easier in openFrameworks, where I can use OpenGL 4.5.

    I'll post if I make any progress.

    Basically my goal is to have an array of textures on the GPU, so I can quickly sample video frames and achieve something like this in real time:

    Creative Code Jam v2 - Berlin, February 2018 from Abe Pazos on Vimeo.
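As an aside, the texture3D idea would look roughly like this on the shader side. This is only a sketch: it assumes the 30 frames have been uploaded as depth slices of one 3D texture, and since Processing's PShader.set() binds PImage to sampler2D uniforms, the 3D upload itself would have to go through JOGL directly.

```glsl
// Sketch: 30 video frames stored as depth slices of one 3D texture.
// Assumes the host program has filled a GL_TEXTURE_3D with 30 layers.
uniform sampler3D frames;
uniform float show;          // frame index, 0..29
varying vec4 vertTexCoord;

void main() {
  // pick the center of slice number `show` along z
  float z = (show + 0.5) / 30.0;
  gl_FragColor = texture3D(frames, vec3(vertTexCoord.st, z));
}
```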

  • edited February 2018

    @hamoid

    https://stackoverflow.com/a/12031821

    GLSL 1.20 and below do not allow sampler arrays.

    Yes, try to use a higher shader version. What is the default? #version 100 if you're on mobile, and 110 or 120 on desktop.

    test

    void settings() {
      size(640, 360, P3D);
      PJOGL.profile = 4;
    }
    
    PShader shdr;
    void setup() {
    
      // will resolve to the current version, e.g. #version 460
      final String version ="#version "+PGraphicsOpenGL.GLSL_VERSION.substring(0, 4).replace(".", ""); 
    
      final String[] vs={version
        , "in vec4 position;"
        , "void main() {"
        , "gl_Position = vec4(position.xy*.5-1.,0.,1.);"
        , "}"
      };
    
      final String[] fs={version
        , "out vec4 fragColor;"
       // , "uniform vec2 res;"
        , "void main() {"
        , "vec2 p = gl_FragCoord.xy/ivec2("+width+","+height+");"
        , "fragColor = vec4(vec3(p.x>=p.y),1.);"
        , "}"
      };
    
      shdr=new PShader(this, vs, fs);
     // shdr.set("res", float(width), float(height));
    
      print(version);
    } 
    void draw() {
      filter(shdr);
    }
    

    // output

    Alt-text

  • Thank you very much for that. I didn't know it was so easy to jump to version 4 in Processing.

    Unfortunately it does not help on my laptop! I'll try the same program on my nVidia computer and see if I'm lucky there.

  • I tried the same program on Ubuntu and Windows with a GTX 1060 and it shows the same wrong behaviour: when the program starts it shows 30 different frames (live webcam) when it should show only one frame repeatedly. I will try without using an array.

  • I figured it out after a hundred tests and variations. Super tricky.

    The thing that breaks it is: using the shader before the sampler2D array is fully populated.

    So I should not call rect() before sending all 30 textures to the shader. If I do, the sketch behaves in a completely different, broken way and never recovers.

    If I wait and make sure enough textures have been sent, then it works as expected. Now I can scrub through my webcam frame buffer at 60 fps :)
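For anyone landing here later, the fix amounts to gating draw() like this (an untested sketch based on the earlier program, not the exact code the author ran):

```processing
// Untested sketch of the fix: never bind the shader while the sampler2D
// array still has unset slots, otherwise rendering breaks and never recovers.
void draw() {
  if (cam.available() && frame < 30) {
    cam.read();
    fx.set("tex[" + frame + "]", cam.copy());
    frame++;
  }
  if (frame < 30) return; // wait until all 30 textures have been sent

  fx.set("show", 1); // now this really shows one static frame
  shader(fx);
  rect(0, 0, width, height);
}
```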

  • edited February 2018

    @hamoid There is actually a GL_TEXTURE_2D_ARRAY in OpenGL/JOGL. You're trying to emulate it, interesting :)

    Yes, I read it somewhere, and multiple times on the web: on Linux you need to initialize your variables. Sometimes a shader will fail when you simply write:

    vec3 col;               // uninitialized!
    // (maybe a for-loop here)
    col += value;
    gl_FragColor = vec4(col, 1.0);
    

    and because you're running the shader in the draw loop, at run time the col value is NaN, so it fails on Linux.

    Maybe you can check for a NaN value in the shader itself: https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/isnan.xhtml

    vec4 texel = texture2D(tex[0], vertTexCoord.st);
    if (isnan(texel.r))
      gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // black fallback
    else
      gl_FragColor = texel;
    

    not sure if it works.

    Maybe it's also an error in GLSL optimizations:

    #version [shaderversion]
    #pragma optimize(off)

    I also see some people (on WebGL) passing a default texture while the picture(s) are loading.

    import com.jogamp.opengl.GL4;
    import com.jogamp.opengl.util.GLBuffers;
    import java.nio.ByteBuffer;

    // 1x1 red placeholder texture
    ByteBuffer defaulttex = GLBuffers.newDirectByteBuffer(
      new byte[]{ (byte)255, (byte)0, (byte)0, (byte)255 });
    gl4.glTexImage2D(GL4.GL_TEXTURE_2D, 0, GL4.GL_RGBA, 1, 1, 0,
      GL4.GL_RGBA, GL4.GL_UNSIGNED_BYTE, defaulttex);
    

    So in Processing you could load a default texture:

    void setup() {
      PImage img = loadImage("placeholder.jpg");
      myshader.set("tex[0]", img);
    }
    
  • Thank you.

    I've had uninitialized variables before, but I thought they were just producing nice glitches, not breaking the program :)

    I saw GL_TEXTURE_2D_ARRAYs but how would you populate it from Processing? Calling JOGL code directly?

    I tried initializing the textures inside setup() with something like this (I also tried storing the images on the Processing side) and it failed to work:

    for(int i=0; i<30; i++) {
      myshader.set("tex[" + i + "]", createImage(640, 480, ARGB));
    }
    

    But since I found out what the issue was, and a fix, all is good.
