Implementing laser graffiti using Processing

edited November 2017 in Library Questions

Hello, I am trying to implement laser graffiti in Processing, but I am having trouble drawing a line as the laser point moves. I need to implement it so that, as the laser light moves, it draws a line along with it. There is a simple continuous-line example on the Processing site ( https://processing.org/examples/continuouslines.html ); I need the same thing, but driven by laser-light tracking. Please help me with this.

Answers

  • @krishna_18 -- re:

    but with laser light tracking

    What does this mean? Are you using a laptop / webcam? A special camera like a Kinect or a Wii sensor bar? Will this be a literal laser-pointer -- and if so what size / at what distance?

    Lasers are so bright that you could probably do this just by running a high-pass filter on your camera -- check each camera pixel and draw only the ones brighter than some threshold n to the screen.
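    That high-pass idea boils down to a few lines. Here is a sketch of it in plain Java (the host language of Processing); `HighPass` and the threshold value are illustrative choices, and brightness is computed as the maximum of R, G and B, which is what Processing's brightness() returns for an RGB color:

    ```java
    public class HighPass {
      // HSB brightness (value) of a packed 0xAARRGGBB pixel, matching
      // Processing's brightness(): the maximum of R, G and B.
      static int brightness(int argb) {
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return Math.max(r, Math.max(g, b));
      }

      // Keep only pixels brighter than the threshold n; everything else goes black.
      static int[] filter(int[] pixels, int n) {
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
          out[i] = brightness(pixels[i]) > n ? pixels[i] : 0xFF000000;
        }
        return out;
      }

      public static void main(String[] args) {
        int[] scene = {0xFF202020, 0xFFFFFFFF, 0xFF404040}; // one very bright "laser" pixel
        int[] filtered = filter(scene, 200);
        System.out.println(filtered[1] == 0xFFFFFFFF && filtered[0] == 0xFF000000);
        // prints: true
      }
    }
    ```

    In a sketch you would call loadPixels(), run the filter over pixels[], and write the result back before updatePixels().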

    But you could also use computer vision. This sounds like it might be an extremely simple case of blob detection for the OpenCV or BoofCV libraries.

  • If your image is static, you can use static background subtraction. Check the examples under File >> Examples in the Processing IDE, then go to Libraries >> Video >> Capture; there should be a background subtraction example there. The concept is as follows:

    1. Start by capturing the background image.

    2. Then check each new image against the background image. If you find your laser pointer in the image using image differencing (either a pixel-by-pixel comparison or the blob library), store those pixels' x,y coordinates, or store the center of the blob.

    3. Then display the background image.

    4. Finally, display the x,y coordinates of the blob either using the pixels stored before or by using a predefined shape like an ellipse.
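    As a rough illustration of steps 2 and 4, here is the comparison-and-centroid part in plain Java (the class name and threshold are my own; in a sketch you would feed it video.pixels and the saved background frame):

    ```java
    public class BackgroundSubtraction {
      // Step 2 above: compare each new frame against the stored background and
      // collect the (x, y) positions of pixels that changed by more than a
      // per-channel threshold. The centroid of those pixels is a cheap
      // stand-in for the blob center.
      static int[] blobCenter(int[] background, int[] frame, int width, int threshold) {
        long sumX = 0, sumY = 0;
        int count = 0;
        for (int i = 0; i < frame.length; i++) {
          if (channelDiff(background[i], frame[i]) > threshold) {
            sumX += i % width;
            sumY += i / width;
            count++;
          }
        }
        if (count == 0) return null;               // nothing detected
        return new int[] { (int)(sumX / count), (int)(sumY / count) };
      }

      // Largest absolute difference across the R, G and B channels.
      static int channelDiff(int a, int b) {
        int d = 0;
        for (int shift = 0; shift <= 16; shift += 8) {
          d = Math.max(d, Math.abs(((a >> shift) & 0xFF) - ((b >> shift) & 0xFF)));
        }
        return d;
      }

      public static void main(String[] args) {
        int w = 4;
        int[] bg = new int[16];                    // all-black 4x4 background
        int[] frame = bg.clone();
        frame[1 + 2 * w] = 0xFFFFFFFF;             // bright spot at (1, 2)
        int[] c = blobCenter(bg, frame, w, 50);
        System.out.println(c[0] + "," + c[1]);     // prints: 1,2
      }
    }
    ```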

    For the blob library, you need to install it using the library manager in the Processing IDE. Then you can go to the example section and explore the provided examples that come with the library.

    Kf

  • edited October 2017

    @jeremydouglass

    I am using a Logitech webcam with a resolution of 1280x720. It should detect the laser light from a distance of 20-30 meters.

  • edited October 2017

    @krishna_18 --

    You could use the OpenCV, BoofCV, or blob / blobdetector libraries to detect your point, then feed those coordinates into the continuous line drawing sketch.

    This presumes that your laser is the brightest thing on camera -- i.e. that it is brighter than whatever you are drawing, if you are projecting the drawing into the same scene as the laser.

    If that is the case, you could also use an extremely simple brightest point detector.

    I don't have a laser pointer and camera setup handy, but here is a demo of a brightest point detector using a fake mouse-driven "laser":

    /**
     * LaserPointerDetector
     * Mocks up a fake "laser pointer" under the mouse.
     * It then draws a box around the brightest point in the scene.
     * Assumes that the laser pointer will contain the brightest point!
     * 2017-10-16 Jeremy Douglass - Processing 3.3.6
     * forum.processing.org/two/discussion/24516/implementing-laser-graffiti-using-processing
     */
    PImage bg;
    
    void setup() {
      colorMode(HSB, 255);
      frameRate(10);
      bg = loadImage("https://processing.org/img/processing3-logo.png");
      bg.resize(width,height);
    }
    
    void draw() {
      background(0);
      fakeScene();
      fakePointer(mouseX, mouseY);
      PVector pv = brightPointDetector();
      drawDetectBox(pv);
    }
    
    PVector brightPointDetector() {
      int idx = 0;
      float max = 0;
      loadPixels();
      for (int i = 0; i < pixels.length; i++) {
        float pb = brightness(pixels[i]);
        if ( pb >= max) {
          max = pb;
          idx = i;
        }
      }
      int x = idx%width;
      int y = (idx-x)/width;
      return new PVector(x, y);
    }
    
    void fakeScene(){
      pushStyle();
      tint(255,128);
      image(bg, 0, 0);
      popStyle();
    }
    
    void fakePointer(float x, float y) {
      pushStyle();
      fill(255, 255, 255, 16);
      noStroke();
      ellipseMode(CENTER);
      for (int i=0; i<20; i++) {
        pushMatrix();
        translate(random(-5, 5), random(-5, 5));
        ellipse(x, y, random(10, 30), random(10, 30));
        popMatrix();
      }
      popStyle();
    }
    
    void drawDetectBox(PVector pv) {
      pushStyle();
      noFill();
      stroke(255);
      rectMode(CENTER);
      rect(pv.x, pv.y, 30, 30);
      popStyle();
    }
    

    As you can see, it works pretty well.

    [screenshot: LaserPointerDetector]

    Now, how to use those accurate coordinates and draw a continuous line with them...?

  • edited November 2017

    @jeremydouglass

    Here you are tracking mouseX and mouseY. I need to do the same using a laser pointer detected by the camera, and as the pointer moves it should draw a line along with it.

  • @krishna_18 -- that is not true. Look at the code for brightPointDetector again.

    It is not tracking the mouse -- the mouse is only drawing a bright blob, and the sketch is tracking that bright color. It has no idea where mouseX and mouseY are. The same code could be run on a movie frame, or on video with a real laser pointer, or anything else.
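    Stripped of the drawing code, the detector is just a scan over the pixel array followed by the index-to-coordinate conversion (x = idx % width, y = idx / width). Here it is as a standalone plain-Java method (the class name is illustrative), which makes it clear that no mouse variables are involved:

    ```java
    public class BrightestPoint {
      // Scan a packed-pixel frame for the brightest pixel and convert its flat
      // array index back to (x, y).
      static int[] find(int[] pixels, int width) {
        int best = 0;
        int bestBrightness = -1;
        for (int i = 0; i < pixels.length; i++) {
          int p = pixels[i];
          int b = Math.max((p >> 16) & 0xFF, Math.max((p >> 8) & 0xFF, p & 0xFF));
          if (b > bestBrightness) {
            bestBrightness = b;
            best = i;
          }
        }
        return new int[] { best % width, best / width };
      }

      public static void main(String[] args) {
        int[] frame = new int[12];          // 4x3 all-black frame
        frame[7] = 0xFFFFFFFF;              // white pixel at index 7 -> (3, 1)
        int[] xy = find(frame, 4);
        System.out.println(xy[0] + "," + xy[1]);   // prints: 3,1
      }
    }
    ```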

  • edited November 2017

    @jeremydouglass

    I have the code for light tracking... now how can I draw a continuous line as the laser point moves?

    import processing.video.*;
    
    Capture cam;
    boolean sin = true;
    int x, y;
    
    void setup() {
      size(640, 480);
      background(0);
      cam = new Capture(this, 640, 480);
      cam.start();
    }
    
    void draw() {
      if (cam.available()) {
        cam.read();
        cam.loadPixels();
        float maxBri = 0;
        int theBrightPixel = 0;
        stroke(255);
        for (int i = 0; i < cam.pixels.length; i++) {
          if (brightness(cam.pixels[i]) > maxBri) {
            image(cam, 640, 480);
            maxBri = brightness(cam.pixels[i]);
            theBrightPixel = i;
          }
        }
        x = theBrightPixel % cam.width;
        y = theBrightPixel / cam.width;
      }
      if (sin == true) {
        ellipse(x, y, 5, 5);
      }
      println("framerate" + frameRate);
    }

    link:

    This is how I need to implement it...

  • Check out the continuous lines example:

    https://processing.org/examples/continuouslines.html

    It uses the built-in pmouseX (previous mouse position) which is cached automatically. Instead, you will need to cache the previous position manually, then draw a line segment between the previous and current position.

    Once that is working, simulate a mouse-down and up event so that you know when to start and stop drawing -- otherwise you will always have a jagged line or two jumping to the 0,0 corner.

  • edited November 2017

    I tried that...

    import processing.video.*;
    
    float pct = 0.0;  // Percentage traveled (0.0 to 1.0)
    Capture video;
    color trackColor;
    
    void setup() {
      size(1280, 720);
      noStroke();
      background(0);
      video = new Capture(this, width, height);
      video.start();
      trackColor = color(255);
    }
    
    void captureEvent(Capture video) {
      video.read();
    }
    
    void draw() {
      video.loadPixels();
      image(video, 1280, 720);
      float worldRecord = 100000;
      int closestX = 0;
      int closestY = 0;
      for (int x = 0; x < video.width; x++) {
        for (int y = 0; y < video.height; y++) {
          int loc = x + y * video.width;
          color currentColor = video.pixels[loc];
          float r1 = red(currentColor);
          float g1 = green(currentColor);
          float b1 = blue(currentColor);
          float r2 = red(trackColor);
          float g2 = green(trackColor);
          float b2 = blue(trackColor);
          float d = dist(r1, g1, b1, r2, g2, b2);
          if (d < worldRecord) {
            worldRecord = d;
            closestX = x;
            closestY = y;
          }
        }
      }
      if (worldRecord < 10) {
        fill(255);
        stroke(0, 0, 255);
        strokeWeight(2);
        ellipse(closestX, closestY, 12, 12);
        mouseX = closestX;
        mouseY = closestY;
        stroke(255, 0, 0);
        line(mouseX, mouseY, pmouseX, pmouseY);
        ellipse(closestX, closestY, 12, 12);
      }
      println("framerate:" + frameRate);
    }
    
    void keyPressed() {
      if (key >= 'A') {
        background(0);
      }
    }

  • You tried that... and.... ?

  • You are trying to save values to the built-in system variable mouseX. That won't work.

    Instead, you need to create two variables in addition to closestX and closestY -- previousClosestX (or pclosestX) and pclosestY. At the end of draw(), save your current closestX and closestY into the p versions. Then you can draw a line:

    line(pclosestX, pclosestY, closestX, closestY);
    
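    One way to package that caching, with a pen-down flag so the first detected point does not connect back to a stale one, is a small helper like this (plain Java; LineTracker and its method names are made up for illustration -- in draw() you would call line() on the returned segment):

    ```java
    public class LineTracker {
      // Cache the previous detected point manually (Processing only does this
      // for the mouse via pmouseX/pmouseY). penDown mimics a mouse press so
      // the first point after the laser reappears starts a fresh line.
      int pclosestX, pclosestY;
      boolean penDown = false;

      // Returns the segment {x1, y1, x2, y2} to draw, or null for the first
      // point after the pen goes down.
      int[] update(int closestX, int closestY) {
        int[] segment = null;
        if (penDown) {
          segment = new int[] { pclosestX, pclosestY, closestX, closestY };
        }
        pclosestX = closestX;   // save current as previous for the next frame
        pclosestY = closestY;
        penDown = true;
        return segment;
      }

      void penUp() { penDown = false; }   // call when no laser is detected

      public static void main(String[] args) {
        LineTracker t = new LineTracker();
        System.out.println(t.update(10, 10) == null);      // first point: no line
        int[] seg = t.update(20, 15);
        System.out.println(seg[0] + "," + seg[1] + "," + seg[2] + "," + seg[3]);
        // prints: true then 10,10,20,15
      }
    }
    ```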
  • I am getting the lines, but they start from (0,0). And when I switch the laser off and on, it starts tracking from the last tracked point.

  • How and where should I initialize pclosestX and pclosestY?

  • This is another example using your latest code. I believe it is similar to what @jeremydouglass showed already. In the video, people click the handheld laser to start drawing. In this example, clicking the laser (activating it) is done by a mouse action. When you release the left mouse button, two things happen: the laser stops (as in the video) and the laser-sensor routine stops running. Technically, you could keep the laser-sensor routine running all the time and just enable/disable the green ellipse acting as the laser pointer in my code below.

    Notice I draw lines. I used a trick/hack to be able to differentiate between different drawn objects. See the comments in mouseReleased.

    Kf

     //REFERENCES: https://forum.processing.org/two/discussion/24516/implementing-laser-graffiti-using-processing#latest
    
    
    //INSTRUCTIONS:
    //         *-- The mouse acts as a laser drawing on top of the current video frame.
    //         *-- This laser is green. When it is detected, it stores the current 
    //         *-- position into a container. A line is drawn based on those stored
    //         *-- positions.
    //         *--
    //         *-- Pressed and hold left click to activate laser sensor
    //         *-- Released left click to deactivate laser sensor
    //         *-- Press the space bar to show or hide all those detected points
    //         *-- Press any key [A..Z] to reset the sensor container
    
    
    //===========================================================================
    // IMPORTS:
    import processing.video.*;
    
    //===========================================================================
    // FINAL FIELDS:
    
    
    //===========================================================================
    // GLOBAL VARIABLES:
    float pct = 0.0;      // Percentage traveled (0.0 to 1.0)
    Capture video;
    color trackColor; 
    
    float worldRecord = 10; 
    ArrayList<PVector> fig;
    
    boolean activateSensor=false;
    boolean showFig=true;
    
    float r2;
    float g2;
    float b2;
    
    
    //===========================================================================
    // PROCESSING DEFAULT FUNCTIONS:
    
    
    void setup() 
    {
      size(640, 480);
      noStroke();
      background(0);
      video = new Capture(this, width, height);
      video.start();
      trackColor = color(0, 255, 0);
      fig = new ArrayList<PVector>();
    
      r2 = red(trackColor);
      g2 = green(trackColor);
      b2 = blue(trackColor);
    }
    
    
    void draw() 
    {
    
      image(video, 0, 0);
    
      noStroke();
      fill(trackColor);
      ellipse(mouseX, mouseY, 5, 5);
    
      if (activateSensor==false) {
        surface.setTitle("Laser sensor NOT active");
      } else {
    
        surface.setTitle("Laser sensor running");
        loadPixels();
    
        int closestX = 0;
        int closestY = 0;
    
        for (int x = 0; x < video.width; x ++ ) {
          for (int y = 0; y < video.height; y ++ ) {
    
            int loc = x + y*video.width;
            color currentColor = pixels[loc];
    
            float r1 = red(currentColor);
            float g1 = green(currentColor);
            float b1 = blue(currentColor);
    
            float d = dist(r1, g1, b1, r2, g2, b2);
            if (d < worldRecord) 
            {
              //worldRecord = d;
              closestX = x;
              closestY = y;
              fig.add(new PVector(x, y));
            }
          }
        }
      }
    
      if (showFig==true) {
        stroke(255);
        strokeWeight(3);
        boolean skipNext=false;
    
        for (int i=0; i<fig.size()-1; i++) {  //Size - 1  to avoid out of bound exception
          PVector v1=fig.get(i);
          PVector v2=fig.get(i+1);
    
          if (skipNext==true) {  //See mouse release comments
            skipNext=false;
            continue;
          }
    
          if (v1.dist(v2)<1) {
            skipNext=true;  //See mouse release comments
            continue;
          }
    
          line(v1.x, v1.y, v2.x, v2.y);
        }
      }
    }
    
    void mousePressed() {
      activateSensor=true;
    }
    
    void mouseReleased() {
      activateSensor=false;
      //This next line adds a copy of the last point inserted in the container.
      //This is used to tell the drawing routine when to stop drawing a line.
      //This allows creating different shapes in the same sketch.
      fig.add(((PVector)fig.get(fig.size()-1)).copy()); 
    }
    
    void keyPressed() 
    { 
      if (keyCode==' ')
        showFig=!showFig;
    
      if (key >= 'A' && key <= 'Z') 
      {
        background(0);
        fig.clear();
      }
    }
    
    
    
    //===========================================================================
    // OTHER FUNCTIONS:
    
    void captureEvent(Capture video) 
    {
      video.read();
    }
    
  • Along a related line: I am trying to track a (green) laser spot in the scene, but my camera is so cheap it maxes out to white (255,255,255) on the spot itself. The fringes of the spot might be green enough? Has anybody tried looking at the ratios green/red, green/blue, or even (green**2)/(red*blue)?

  • @aldobranti -- the easiest thing to do is to run a preview filter on the scene and return white only for pixels that meet the criteria. Then you can visually inspect the screen and see if it is working or not for a given location / time of day / etc.

    If your filter takes parameters -- e.g. lowpass / highpass levels, etc. -- you can then adjust the parameters.

    What is the problem with maxed out white on the spot itself -- are there other sources of max white in the image?

    If your criteria are very bright green pixels e.g. 230-240, 250+, 230-240 -- then those are your criteria. But the signal may be very unstable, especially with a cheap camera...
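    The green-ratio idea from the question can be prototyped the same way, as a preview filter. A minimal plain-Java sketch of the green**2/(red*blue) score (the class name and the +1 guard against division by zero are my own choices):

    ```java
    public class GreenDominance {
      // Score a pixel by green^2 / (red * blue). Maxed-out white scores
      // roughly 1; a greenish fringe pixel scores well above 1.
      static double score(int argb) {
        double r = ((argb >> 16) & 0xFF) + 1;   // +1 avoids division by zero
        double g = ((argb >> 8) & 0xFF) + 1;
        double b = (argb & 0xFF) + 1;
        return (g * g) / (r * b);
      }

      public static void main(String[] args) {
        System.out.println(score(0xFFFFFFFF));          // maxed-out white: 1.0
        System.out.println(score(0xFF40FF40) > 4.0);    // green fringe: true
      }
    }
    ```

    A preview pass would then paint white only where the score clears some tuned cutoff, so you can judge by eye whether the fringe is stable enough to track.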

  • I wouldn't be surprised if different cameras have different responses to different laser wavelengths and intensities. One thing you could do is train your sketch to track the detected laser blob in the image. This will work in the video you provided above, as it is a dark environment and the laser is the only color in your scene. However, if you are trying to do this in a scene with different lighting, and your laser turns out to be detected as white by your camera, then you could potentially pick up anything white in the scene.

    If you want more code samples or comments related to your graffiti project, you should check other posts via Google: "Processing color tracking". Don't use the search engine in this forum, as it will not work for this type of search.

    Kf
