<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom">
	<channel>
      <title>Tagged with #video - Processing 2.x and 3.x Forum</title>
      <link>https://forum.processing.org/two/discussions/tagged/feed.rss?Tag=%23video</link>
      <pubDate>Sun, 08 Aug 2021 18:00:36 +0000</pubDate>
         <description>Tagged with #video - Processing 2.x and 3.x Forum</description>
   <language>en-CA</language>
   <atom:link href="/two/discussions/tagged%23video/feed.rss" rel="self" type="application/rss+xml" />
   <item>
      <title>VIDEO - best codec for minimising processing during playback?</title>
      <link>https://forum.processing.org/two/discussion/10824/video-best-codec-for-minimising-processing-during-playback</link>
      <pubDate>Fri, 15 May 2015 11:16:44 +0000</pubDate>
      <dc:creator>rmaskey</dc:creator>
      <guid isPermaLink="false">10824@/two/discussions</guid>
<description><![CDATA[<p>I'm currently using H.264 in a .mov wrapper at 10 Mb/s and have problems with the videos freezing after about 10 minutes.</p>

<p>These are being used for a light visualisation, so they need to play for around 10 hours without problems. I have a collection of videos that can be overlaid to achieve different effects, so I'm hoping to find a better codec, but I cannot find any information online.</p>

<p>I'm going to try an uncompressed format, but any experience or advice would be great, as there is a lot of trial and error (and waiting for renders) ahead!</p>

<p>Cheers
Rich</p>
]]></description>
   </item>
   <item>
      <title>Capture images at 720p and 60fps with RealSense SR300</title>
      <link>https://forum.processing.org/two/discussion/23754/capture-images-at-720p-and-60fps-with-realsense-sr300</link>
      <pubDate>Wed, 09 Aug 2017 04:39:13 +0000</pubDate>
      <dc:creator>gellpro</dc:creator>
      <guid isPermaLink="false">23754@/two/discussions</guid>
      <description><![CDATA[<p>Hi</p>

<p>I want to obtain 720p images at 60 fps.
I couldn't capture 720p images with the C922, even though Capture.list() reported 1280x720 at 60 fps.
<a href="https://forum.processing.org/two/discussion/23752/capture-video-at-720p-and-60fps-with-c922#latest" target="_blank" rel="nofollow">https://forum.processing.org/two/discussion/23752/capture-video-at-720p-and-60fps-with-c922#latest</a></p>

<p>Furthermore, I read the thread below and found that I probably can't obtain 720p at 60 fps because the C922 delivers that mode as MPEG.
<a href="https://forum.processing.org/two/discussion/23692/camera-capture-not-working-with-mpeg-streams#latest" target="_blank" rel="nofollow">https://forum.processing.org/two/discussion/23692/camera-capture-not-working-with-mpeg-streams#latest</a></p>

<p>So I then tried to get 720p at 60 fps with a RealSense SR300, using the same source code as in my post above.
In this case, however, Capture.list() didn't report 1280x720 at 60 fps, although the SR300 can capture that resolution and frame rate in other applications, and its format is YUY2.
FYI, I could capture images at 960x540 and 60 fps, which Capture.list() did report.</p>

<p>Why can't Capture.list() detect size=1280x720 and fps=60?
Please tell me what to do.</p>

<p>Best regards,
gellpro</p>
]]></description>
   </item>
   <item>
      <title>about Video Library</title>
      <link>https://forum.processing.org/two/discussion/23662/about-video-library</link>
      <pubDate>Tue, 01 Aug 2017 12:32:14 +0000</pubDate>
      <dc:creator>hiddenlines</dc:creator>
      <guid isPermaLink="false">23662@/two/discussions</guid>
      <description><![CDATA[<p>Hello,</p>

<p>I'm using the video library to loop a video. The video plays well, but I'm getting this warning:</p>

<p>2017-08-01 14:22:45.855 java[1431:871990] 14:22:45.855 WARNING:  140: This application, or a library it uses, is using the deprecated Carbon Component Manager for hosting Audio Units. Support for this will be removed in a future release.</p>

<p>Also, this makes the host incompatible with version 3 audio units. Please transition to the API's in AudioComponent.h.</p>

<p>Thanks,
Ali</p>
]]></description>
   </item>
   <item>
      <title>Current state of video library support</title>
      <link>https://forum.processing.org/two/discussion/22645/current-state-of-video-library-support</link>
      <pubDate>Wed, 17 May 2017 21:54:18 +0000</pubDate>
      <dc:creator>grauwald</dc:creator>
      <guid isPermaLink="false">22645@/two/discussions</guid>
      <description><![CDATA[<p>Anyone have any knowledge on video support for Android mode?</p>

<p>This library seems to be the thing:</p>

<p><a href="https://github.com/omerjerk/processing-video-android" target="_blank" rel="nofollow">https://github.com/omerjerk/processing-video-android</a></p>

<p>But it hasn't been maintained since 2015. I attempted to use it and am having difficulty getting the Processing IDE to recognise it as a library. (So maybe someone knows how to do that? I'm a newbie to Android mode.)</p>

<p>Also, confusingly, in Android mode (4.x pre-release), when adding libraries via the "Sketch" dropdown menu there are two "native" options: the "vr" lib, which is for Cardboard support (and works great!), and a "video" lib, suggesting there is now a native video library for Android mode.</p>

<p>Any help would be greatly appreciated.</p>
]]></description>
   </item>
   <item>
      <title>Real time video export from Processing</title>
      <link>https://forum.processing.org/two/discussion/8994/real-time-video-export-from-processing</link>
      <pubDate>Mon, 12 Jan 2015 07:08:56 +0000</pubDate>
      <dc:creator>hamoid</dc:creator>
      <guid isPermaLink="false">8994@/two/discussions</guid>
      <description><![CDATA[<p>This morning I started thinking about how to help someone who had trouble exporting video from Processing.
One hour later I had this proof of concept working:</p>

<p><a href="https://github.com/hamoid/Fun-Programming/tree/master/processing/ideas/2015/01/streamToFFMPEG" target="_blank" rel="nofollow">https://github.com/hamoid/Fun-Programming/tree/master/processing/ideas/2015/01/streamToFFMPEG</a></p>

<p>It allows you to build a video by adding frames one by one. No need to save all frames first and then merge them into a video.
It uses ffmpeg and pipes. In this demo, you can hold the mouse down to record. You don't need to record everything as one long segment: you can observe and when you see something interesting happening, hold down record (ok, I have to admit you won't see much exciting happening in this demo :-).</p>

<p>It will have to be adapted to Windows and OS X, and it should be configurable (bitrate, video format, etc).</p>

<p>But maybe it can be useful to someone already.</p>
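<p>For anyone curious how the piping works before opening the repo: the sketch writes each frame's raw pixels to ffmpeg's standard input, and ffmpeg encodes on the fly. A minimal plain-Java sketch of the same idea; the ffmpeg flags and file names here are illustrative, not taken from the linked code:</p>

```java
import java.io.OutputStream;
import java.util.List;

public class FrameStreamer {
    // Build an ffmpeg command that reads raw RGB frames from stdin ("-i -")
    // and encodes them to H.264. Typical flags, not the linked demo's exact ones.
    static List<String> command(int w, int h, int fps, String out) {
        return List.of("ffmpeg", "-y",
            "-f", "rawvideo", "-pix_fmt", "rgb24",
            "-s", w + "x" + h, "-r", String.valueOf(fps),
            "-i", "-",
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out);
    }

    public static void main(String[] args) {
        try {
            Process ffmpeg = new ProcessBuilder(command(640, 360, 30, "out.mp4"))
                .redirectErrorStream(true).start();
            // Each write is one frame; a sketch would send its pixel buffer here.
            try (OutputStream frames = ffmpeg.getOutputStream()) {
                byte[] frame = new byte[640 * 360 * 3]; // one black RGB frame
                for (int i = 0; i < 30; i++) frames.write(frame); // ~1 s of video
            }
            ffmpeg.waitFor();
        } catch (Exception e) {
            System.out.println("ffmpeg not available: " + e.getMessage());
        }
    }
}
```

<p>In a real sketch the frame bytes would come from the sketch's pixel buffer rather than a constant black frame; recording only while the mouse is held is then just a matter of skipping the write.</p>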
]]></description>
   </item>
   <item>
      <title>How to use the Video library with eclipse ?</title>
      <link>https://forum.processing.org/two/discussion/10034/how-to-use-the-video-library-with-eclipse</link>
      <pubDate>Wed, 25 Mar 2015 19:56:33 +0000</pubDate>
      <dc:creator>tlecoz</dc:creator>
      <guid isPermaLink="false">10034@/two/discussions</guid>
      <description><![CDATA[<p>Hello !</p>

<p>The Movie object works fine in the Processing IDE, but I can't get it working with Eclipse, and I really don't know what else to try...</p>

<p>I copied the jars and DLL folders into my project,
and added the jars to the build path using the "Add JARs" button, targeting the jars located in my project.</p>

<p>But it doesn't work, and I don't know why...</p>

<p>Here is a screen capture with my file-structure and the error message.</p>

<p><img src="http://beginfill.com/processing/videoEclipse.jpg" alt="" /></p>

<p>I already searched the internet for a solution. I found a lot of people who had issues with Movie &amp; Eclipse, but nobody had exactly my error message, and I haven't found a solution yet...</p>

<p>Thank you !</p>

<p>(I'm using Windows 7 64bits)</p>
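<p>One likely culprit, assuming the video library's GStreamer DLLs are involved: the Processing IDE puts their folder on java.library.path automatically, while Eclipse does not. A quick plain-Java check of what the JVM actually sees:</p>

```java
public class NativePathCheck {
    public static void main(String[] args) {
        // The folder containing the video library's native DLLs should appear
        // here; if not, add -Djava.library.path=... (pointing at that folder)
        // to the VM arguments of the Eclipse run configuration.
        String path = System.getProperty("java.library.path");
        System.out.println(path);
    }
}
```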
]]></description>
   </item>
   <item>
      <title>Is there a default max size for the canvas or video using the IPcapture library?</title>
      <link>https://forum.processing.org/two/discussion/17853/is-there-a-default-max-size-for-the-canvas-or-video-using-the-ipcapture-library</link>
      <pubDate>Tue, 16 Aug 2016 01:34:26 +0000</pubDate>
      <dc:creator>vivianatroya</dc:creator>
      <guid isPermaLink="false">17853@/two/discussions</guid>
<description><![CDATA[<p>I've been trying to use this library, but I keep getting an error: "Frame resize: from 800x600 to 640x480". I'm trying to load a video and a live stream at the same time, using both IPcapture and the video library from Processing. Can someone please help me?</p>
]]></description>
   </item>
   <item>
      <title>Videos triggered by color tracking?</title>
      <link>https://forum.processing.org/two/discussion/15186/videos-triggered-by-color-tracking</link>
      <pubDate>Mon, 29 Feb 2016 06:37:12 +0000</pubDate>
      <dc:creator>witchrogue</dc:creator>
      <guid isPermaLink="false">15186@/two/discussions</guid>
<description><![CDATA[<p>EDIT TO PREVIOUS QUESTION: Alright, so I've sort of found an answer to my previous question, but now a new one has come up.  First off, I am in no way familiar with coding.  I'm just an artist desperately trying to get an installation piece prepared for a show.  I'm very sorry that my code is probably an inelegant hodge-podge.  Secondly, I'm using Processing 2.2.1.</p>

<p>For this project, I have Processing create two windows.  In one window, it searches all the colors found in a webcam display for certain colors.</p>

<p>Based on whether it finds the specific colors, different videos will play in the second window.  (For instance, if the webcam sees the color red, the video Rune1 will play in the second window).</p>

<p>Now I've gotten the code to work before, but it just layers the videos on top of one another.  What I want is to try to have the videos switch out instead (One video stops playing when another starts).  I've tried some code I found in the forums, but I'm such a newbie and my code is so convoluted that I can't get it to work.  Would anyone be willing to take a look at this mess?</p>

<p>Code:</p>

<pre><code>import java.util.HashMap; 
import java.util.ArrayList; 
import java.io.File; 
import java.io.BufferedReader; 
import java.io.PrintWriter; 
import java.io.InputStream; 
import java.io.OutputStream; 
import java.io.IOException; 
import processing.core.*; 
import processing.data.*; 
import processing.event.*; 
import processing.opengl.*; 


import javax.swing.*; 
SecondApplet s;


import processing.video.*;


int VideoPlaying;

boolean Vid1 = false;
boolean Vid2 = false;


Movie Rune1;
Movie Rune2;

Capture video;

// Colors being tracked
color trackColor2 = color (136,33,27);
color trackColor3 = color (55,68,100);
  float worldRecord1 = 500;
  float worldRecord2 = 500;
  float worldRecord3 = 500;
  float worldRecord4 = 500;
  float worldRecord5 = 500;
  float worldRecord6 = 500;

public void setup() {

       Rune1 = new Movie (this, "C:/Users/I/Desktop/WORKING3/data/Video1.avi");
       Rune2 = new Movie (this, "C:/Users/I/Desktop/WORKING3/data/Video2.avi");



  size(1280, 720);
  frameRate(200);


  PFrame f = new PFrame(width, height);
  frame.setTitle("first window");
  f.setTitle("second window");


//Changes WebCam source


  video = new Capture(this, "name=Logitech HD Webcam C615,size=1280x720,fps=30");


 video.start(); 
trackColor2 = color (136,33,27);
trackColor3 = color (39,123,165);
 smooth();
}
public void draw() {


 if (video.available()) {
   video.read();
}
 video.loadPixels();
 image(video,0,0);


float worldRecord0 = 500;

 int closestX = 0;
 int closestY = 0;

 for (int x = 0; x &lt; video.width; x ++ ) {
   for (int y = 0; y &lt; video.height; y ++ ) {
     int loc = x + y*video.width;
     // What is current color
     color currentColor = video.pixels[loc];
      float r1 = red(currentColor);
      float g1 = green(currentColor);
      float b1 = blue(currentColor);


      float r3 = red(trackColor2);
      float g3 = green(trackColor2);
      float b3 = blue(trackColor2);

      float r4 = red(trackColor3);
      float g4 = green(trackColor3);
      float b4 = blue(trackColor3);    


     float d2 = dist(r1,g1,b1,r3,g3,b3); 

     float d3 = dist(r1,g1,b1,r4,g4,b4);

     if (d2 &lt; worldRecord1) {
        worldRecord1 = d2;
      }

     if (d3 &lt; worldRecord2) {
        worldRecord2 = d3;
      }

   }
 }

 // Decide which video should play, based on how closely
 // a pixel matched each tracked color this frame.
 if (worldRecord1 &lt; 50) {
   VideoPlaying = 1;
   println(VideoPlaying);
 }

 if (worldRecord2 &lt; 50) {
   VideoPlaying = 2;
   println(VideoPlaying);
 }

 // Reset the records so the match is re-evaluated every frame.
 worldRecord1 = 500;
 worldRecord2 = 500;
}

public class PFrame extends JFrame {
  public PFrame(int width, int height) {
    setBounds(100, 100, width, height);
    s = new SecondApplet();
    add(s);
    s.init();
    show();
  }
}

public class SecondApplet extends PApplet {
  int ghostX, ghostY;

  public void setup() {
    background(0);
    noStroke();
  }

  public void draw() {
    toggle();

    if (Vid1) {
      println("Rune1");
      Rune2.stop();
      background(0);
      Rune1.play();
      image(Rune1, 0, 0, width, height);
    }

    if (Vid2) {
      println("Rune2");
      Rune1.stop();
      background(0);
      Rune2.play();
      image(Rune2, 0, 0, width, height);
    }
  }
}

void movieEvent(Movie m) {
  m.read();
}


void toggle() {

  if (VideoPlaying == 1) {
    Vid1 = true;
    Vid2 = false;

  }

  if (VideoPlaying == 2) {
    Vid1 = false;
    Vid2 = true;

  }

  }
</code></pre>
]]></description>
   </item>
   <item>
      <title>Prevent video in fullscreen to reduce frame rate i.e. get choppy</title>
      <link>https://forum.processing.org/two/discussion/15169/prevent-video-in-fullscreen-to-reduce-frame-rate-i-e-get-choppy</link>
      <pubDate>Sun, 28 Feb 2016 05:10:02 +0000</pubDate>
      <dc:creator>sg1</dc:creator>
      <guid isPermaLink="false">15169@/two/discussions</guid>
      <description><![CDATA[<p>If I play a video (size 640x360) in fullscreen (using the Processing Video library) the video playback is extremely choppy as if the framerate is reduced to 12fps or less. If I play the video in size(640,360) or equal the original video, the quality is fine.</p>

<p>I've tried raising and lowering the frame rate, but it didn't solve the issue. I'm new to Processing and have no clue what causes it.</p>

<p>I'm using the example sketch "Loop" from the video library. You can reproduce the issue by replacing size() with fullScreen().</p>

<p>Does anyone encounter the same issues and can help me?</p>
]]></description>
   </item>
   <item>
      <title>i want to play video on the base of motion tracking</title>
      <link>https://forum.processing.org/two/discussion/14613/i-want-to-play-video-on-the-base-of-motion-tracking</link>
      <pubDate>Mon, 25 Jan 2016 13:33:26 +0000</pubDate>
      <dc:creator>geekself108</dc:creator>
      <guid isPermaLink="false">14613@/two/discussions</guid>
<description><![CDATA[<p>I want to play a video based on motion tracking, but somehow I can't get it to play.</p>

<pre lang="java">
import processing.video.*;
Movie mov;
// Variable for capture device
Capture video;
// Previous Frame
PImage prevFrame;

// How different must a pixel be to be a "motion" pixel
float threshold = 50;

int wid = 70; // for rect 
// for counting motion in the rect 
// and playing movie in reverse 
float count; 
float abc;

void setup() {
  size(640, 480);
  // Using the default capture device
  video = new Capture(this, 640, 480);
  video.start();

  mov = new Movie(this, "T_rex.mov");
  mov.play();
  // Create an empty image the same size as the video
  prevFrame = createImage(video.width, video.height, RGB);
}

// New frame available from camera
void captureEvent(Capture video) {
  // Save previous frame for motion detection!!
  prevFrame.copy(video, 0, 0, video.width, video.height, 0, 0, video.width, video.height);
  prevFrame.updatePixels();  
  video.read();
}

void movieEvent(Movie m ) {
  m.read();
}

void draw() {
  background(125, 240, 200);

  // You don't need to display it to analyze it!
  image(mov, 0, 0);
  tint(255, 126);
  image(video, 0, 0);

  // rect
  fill(255, 126);
  rect(width/6.5, 0, wid, height);
  rect(width/1.5, 0, wid, height);
  rect(width/2.5, 0, wid, height);


  loadPixels();
  video.loadPixels();
  prevFrame.loadPixels();

  // These are the variables we'll need to find the average X and Y
  float sumX = 0;
  float sumY = 0;
  int motionCount = 0; 
  int totalMotion = 0;

  float sumX2 = 0;
  float sumY2 = 0;
  int motionCount2 = 0; 
  int totalMotion2 = 0;

  float sumX3 = 0;
  float sumY3 = 0;
  int motionCount3 = 0; 
  int totalMotion3 = 0;



  // ----------------------- rect 1 -------------------------- //

  // Begin loop to walk through every pixel
  for (int x = 100; x &lt; 170; x++ ) {
    for (int y = 0; y &lt; video.height; y++ ) {
      // What is the current color
      color current = video.pixels[x+y*video.width];

      // What is the previous color
      color previous = prevFrame.pixels[x+y*video.width];

      // Step 4, compare colors (previous vs. current)
      float r1 = red(current); 
      float g1 = green(current);
      float b1 = blue(current);
      float r2 = red(previous); 
      float g2 = green(previous);
      float b2 = blue(previous);

      // Motion for an individual pixel is the difference between the previous color and current color.
      float diff = dist(r1, g1, b1, r2, g2, b2);
      // totalMotion is the sum of all color differences. 
      totalMotion += diff;
      // If it's a motion pixel add up the x's and the y's
      if (diff &gt; threshold) {
        sumX += x;
        sumY += y;
        motionCount++;
      }
    }
  }

  // average location is total location divided by the number of motion pixels.
  float avgX = sumX / motionCount; 
  float avgY = sumY / motionCount; 

  float avg = totalMotion / video.pixels.length;



  // ----------------------- rect 2 -------------------------- //

  // Begin loop to walk through every pixel
  for (int x = 230; x &lt; 300; x++ ) {
    for (int y = 0; y &lt; video.height; y++ ) {
      // What is the current color
      color current = video.pixels[x+y*video.width];

      // What is the previous color
      color previous = prevFrame.pixels[x+y*video.width];

      // Step 4, compare colors (previous vs. current)
      float r1 = red(current); 
      float g1 = green(current);
      float b1 = blue(current);
      float r2 = red(previous); 
      float g2 = green(previous);
      float b2 = blue(previous);

      // Motion for an individual pixel is the difference between the previous color and current color.
      float diff = dist(r1, g1, b1, r2, g2, b2);
      // totalMotion is the sum of all color differences. 
      totalMotion2 += diff;
      // If it's a motion pixel add up the x's and the y's
      if (diff &gt; threshold) {
        sumX2 += x;
        sumY2 += y;
        motionCount2++;
      }
    }
  }

  // average location is total location divided by the number of motion pixels.
  float avgX2 = sumX2 / motionCount2; 
  float avgY2 = sumY2 / motionCount2; 
  float avg2 = totalMotion2 / video.pixels.length;


  // ----------------------- rect 3 -------------------------- //

  // Begin loop to walk through every pixel
  for (int x = 430; x &lt; 500; x++ ) {
    for (int y = 0; y &lt; video.height; y++ ) {
      // What is the current color
      color current = video.pixels[x+y*video.width];

      // What is the previous color
      color previous = prevFrame.pixels[x+y*video.width];

      // Step 4, compare colors (previous vs. current)
      float r1 = red(current); 
      float g1 = green(current);
      float b1 = blue(current);
      float r2 = red(previous); 
      float g2 = green(previous);
      float b2 = blue(previous);

      // Motion for an individual pixel is the difference between the previous color and current color.
      float diff = dist(r1, g1, b1, r2, g2, b2);
      // totalMotion is the sum of all color differences. 
      totalMotion3 += diff;
      // If it's a motion pixel add up the x's and the y's
      if (diff &gt; threshold) {
        sumX3 += x;
        sumY3 += y;
        motionCount3++;
      }
    }
  }

  // average location is total location divided by the number of motion pixels.
  float avgX3 = sumX3 / motionCount3; 
  float avgY3 = sumY3 / motionCount3; 
  float avg3 = totalMotion3 / video.pixels.length;



  // --------------  play video --------------------- //


  //  playing movie reverse
  if (count == 1) {
    mov.speed(-1);
    println("movie reverse");
  }

// testing reverse direction only in rect 2 
  if (avg2 &gt; 1) {
    mov.play();
    println("movin' playing");
    abc = 1;
  } else {
    abc= 0;
  }

  if (avg &gt;1 || avg3 &gt; 1) {
     mov.play();
  }
  
  // pause movie if no motion
  if (avg3 == 0 &amp;&amp; avg2 == 0 &amp;&amp; avg == 0) {
    mov.pause();
    println("pause");
  } 


  if (mov.time() == 0 ) {
    mov.play();
    println(" replaying movie ");
  }

  // for testing 
  if (mov.time() &gt;11) {
    mov.jump(0);
    mov.play();
    println(" re start playing movie  ");
  }


  // Draw a circle based on average motion
  smooth();
  noStroke();
  fill(0, 0, 255);
  ellipse(avgX, avgY, 15, 15);
  ellipse(avgX2, avgY2, 15, 15);
  ellipse(avgX3, avgY3, 15, 15);

  textSize(30);
  text(" T : " + mov.time(), width/6, height - 50);

  // giving value 1 to count so that if 
  // that i got again motion on that same rect movie 
  //play in reverse direction  
  count = abc;
  println("count " + count );


  fill(255, 0, 0);
  text("avg", width/6.5, 100);
  text(avg, width/6.5, 130);

  text("avg3", width/1.5, 100);
  text(avg3, width/1.5, 130);

  text("avg2", width/2.5, 100);
  text(avg2, width/2.5, 130);
}

</pre>
]]></description>
   </item>
   <item>
      <title>Most Efficient Video Playback</title>
      <link>https://forum.processing.org/two/discussion/14351/most-efficient-video-playback</link>
      <pubDate>Fri, 08 Jan 2016 22:32:38 +0000</pubDate>
      <dc:creator>sgrigg</dc:creator>
      <guid isPermaLink="false">14351@/two/discussions</guid>
<description><![CDATA[<p>Hi everyone, thanks in advance for the help. I am trying to run videos (multiple different videos for multiple different projects) in the background of a simple Processing animation, and I need them to loop seamlessly. I have tried multiple approaches with multiple codecs. I just want to see if anyone has tips on making Processing play well with video. Codecs I have tried without getting a seamless loop: MPEG-2, Photo-JPEG, Animation, H.264, ProRes.</p>

<pre><code>import processing.video.*;

static final String FILE = "video", EXT = ".mov";
static final String RENDERER = JAVA2D;
static final int FPS = 30;
Movie vid;


void settings() {

  size(1920, 1080, RENDERER);
  noSmooth();
}

void setup() {
  frameRate(FPS);
  println(width, height);
  vid = new Movie(this, FILE + EXT);
  vid.loop();
}

void draw() {
  image(vid, 0, 0);
}

void movieEvent(Movie m) {
  m.read();
}
</code></pre>

<p>I have also tried using frame images and playing them one at a time, this works well for a 128 frame video but not for a 1800 frame video</p>

<pre><code>static final String FILE = "videoFrame-", EXT = ".png";
static final String RENDERER = JAVA2D;
static final int FPS = 30;
PImage[] fireworx = new PImage[128];
int currentImage = 0;


void settings() {
  fullScreen(RENDERER);  // size() and fullScreen() are mutually exclusive
  noSmooth();
}

void setup() {
  frameRate(FPS);
  println(width, height);
  // Load the frames here: loadImage() is not available yet in settings()
  for (int i = 0; i &lt;= 127; i++) {
    fireworx[i] = loadImage(FILE + i + EXT);
  }
}

void draw() {
  image(fireworx[currentImage], 0, 0);
  if(currentImage &lt; 127){
    currentImage++;
  }else{
    currentImage = 0;
  }
}
</code></pre>
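<p>A middle ground between the two approaches above, for sequences too long to preload, is to decode frames on a background thread into a small bounded buffer while draw() consumes them, so memory use stays constant regardless of length. A plain-Java sketch of the idea (integers stand in for decoded PImage frames; the names are made up):</p>

```java
import java.util.concurrent.ArrayBlockingQueue;

public class FramePrefetcher {
    // Push totalFrames frames through a buffer of the given capacity,
    // returning how many frames were "displayed".
    static int run(int totalFrames, int capacity) {
        ArrayBlockingQueue<Integer> buffer = new ArrayBlockingQueue<>(capacity);

        // Loader thread: decodes ahead, blocking when the buffer is full.
        Thread loader = new Thread(() -> {
            for (int i = 0; i < totalFrames; i++) {
                try {
                    buffer.put(i); // stand-in for loadImage(FILE + i + EXT)
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        loader.start();

        int shown = 0;
        try {
            while (shown < totalFrames) {
                buffer.take(); // draw() would display this decoded frame
                shown++;
            }
            loader.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return shown;
    }

    public static void main(String[] args) {
        System.out.println("displayed " + run(1800, 8) + " frames");
    }
}
```

<p>Whether decoding keeps up with 30 fps depends on image size and disk speed, so this is only worth trying if preloading all 1800 frames is the actual bottleneck.</p>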

<p>Any advice on looping video seamlessly with Processing
 is super welcome! Also if I am just barking up the wrong solution tree
 let me know! Again thanks for any help!</p>
]]></description>
   </item>
   <item>
      <title>java error on ubuntu</title>
      <link>https://forum.processing.org/two/discussion/12421/java-error-on-ubuntu</link>
      <pubDate>Mon, 07 Sep 2015 06:47:25 +0000</pubDate>
      <dc:creator>singmj</dc:creator>
      <guid isPermaLink="false">12421@/two/discussions</guid>
<description><![CDATA[<p>My host:</p>

<p>hardware: pcDuino (similar to a Raspberry Pi Model B)</p>

<p>Ubuntu 12.04, ARM JDK 7, Oracle Java</p>

<p>When I run the Processing example
Libraries-&gt;video-&gt;GettingStartedCapture.pde:</p>

<pre><code>import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);

  // This call fails with:
  // NoClassDefFoundError: could not initialize class com.sun.jna.Native
  String[] cameras = Capture.list();
}
</code></pre>

<p>Other Processing demos are OK (ones that don't include the video library).</p>

<p>How can I fix this?
Thanks.</p>
]]></description>
   </item>
   <item>
      <title>Movie / Capture not working (Linux, Processing 2.2.1, 3.0a5, 3.0a10)</title>
      <link>https://forum.processing.org/two/discussion/11321/movie-capture-not-working-linux-processing-2-2-1-3-0a5-3-0a10</link>
      <pubDate>Sun, 14 Jun 2015 17:06:42 +0000</pubDate>
      <dc:creator>hamoid</dc:creator>
      <guid isPermaLink="false">11321@/two/discussions</guid>
      <description><![CDATA[<p>Hi hi,</p>

<p>If I try the example sketches for Movie and Capture they don't work.</p>

<p>Capture complains on Capture.list() with <code>IllegalArgumentException: No such Gstreamer factory: v4l2src</code>. gstreamer seems to be installed.</p>

<p>If I run <code>fswebcam</code> on the command line (a webcam capture command line program) it works fine:</p>

<pre><code>--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
--- Capturing frame...
Captured frame in 0.00 seconds.
--- Processing captured image...
</code></pre>

<p>The Movie loop example does nothing. No error, just black screen. I also tried with .mp4 instead of .mov, which didn't help. I can play video fine on web pages and in VLC.</p>

<p>I'm on 64bit ArchLinux. Other Processing features work fine (2D and 3D graphics for instance).</p>

<p>Any suggestions?</p>
]]></description>
   </item>
   <item>
      <title>Audio Video</title>
      <link>https://forum.processing.org/two/discussion/13403/audio-video</link>
      <pubDate>Wed, 04 Nov 2015 12:15:15 +0000</pubDate>
      <dc:creator>lviensen</dc:creator>
      <guid isPermaLink="false">13403@/two/discussions</guid>
      <description><![CDATA[<p>Can I use the audio from a video to generate frequency data?</p>
]]></description>
   </item>
   <item>
      <title>How to export Video in good quality in Point Cloud mode?</title>
      <link>https://forum.processing.org/two/discussion/12865/how-to-export-video-in-good-quality-in-point-cloud-mode</link>
      <pubDate>Wed, 07 Oct 2015 02:31:43 +0000</pubDate>
      <dc:creator>intentionalfallacy</dc:creator>
      <guid isPermaLink="false">12865@/two/discussions</guid>
<description><![CDATA[<p>Hi, I am new to Processing and Kinect, but I have played a little with the Point Cloud sketch.
I would like to export this point cloud preview as a video. I tried to find a tutorial on YouTube and Google, but I couldn't find a solution. What code should I write to export that preview? The original sketch is:</p>

<pre><code>        /*
        Copyright (C) 2014  Thomas Sanchez Lengeling.
         KinectPV2, Kinect for Windows v2 library for processing
        */

        import java.nio.FloatBuffer;

        import KinectPV2.*;
        import javax.media.opengl.GL2;

        private KinectPV2 kinect;

        float a = 0;
        int zval = 50;
        float scaleVal = 260;

        //Distance Threshold
        float maxD = 4.0f; //meters
        float minD = 1.0f;

        public void setup() {
          size(1366, 768, P3D);

          kinect = new KinectPV2(this);
          kinect.enableDepthImg(true);
          kinect.enablePointCloud(true);
          kinect.activateRawDepth(true);

          kinect.setLowThresholdPC(minD);
          kinect.setHighThresholdPC(maxD);

          kinect.init();
        }

        public void draw() {
          background(0);

          //image(kinect.getDepthImage(), 0, 0, 320, 240);

          //Threshold of the point cloud.
          kinect.setLowThresholdPC(minD);
          kinect.setHighThresholdPC(maxD);

          FloatBuffer pointCloudBuffer = kinect.getPointCloudDepthPos();

          PJOGL pgl = (PJOGL)beginPGL();
          GL2 gl2 = pgl.gl.getGL2();

          gl2.glEnable( GL2.GL_BLEND );
          gl2.glEnable(GL2.GL_POINT_SMOOTH);      

          gl2.glEnableClientState(GL2.GL_VERTEX_ARRAY);
          gl2.glVertexPointer(3, GL2.GL_FLOAT, 0, pointCloudBuffer);

          gl2.glTranslatef(width/2, height/2, zval);
          gl2.glScalef(scaleVal, -1*scaleVal, scaleVal);
          gl2.glRotatef(a, 0.0f, 1.0f, 0.0f);

          gl2.glDrawArrays(GL2.GL_POINTS, 0, kinect.WIDTHDepth * kinect.HEIGHTDepth);
          gl2.glDisableClientState(GL2.GL_VERTEX_ARRAY);
          gl2.glDisable(GL2.GL_BLEND);
          endPGL();

          stroke(255, 0, 0);
          text(frameRate, 50, height- 50);
        }

        public void mousePressed() {

          println(frameRate);
         // saveFrame();
        }

        public void keyPressed() {
          if (key == 'a') {
            zval +=1;
            println(zval);
          }
          if (key == 's') {
            zval -= 1;
            println(zval);
          }

          if (key == 'z') {
            scaleVal += 0.1;
            println(scaleVal);
          }
          if (key == 'x') {
            scaleVal -= 0.1;
            println(scaleVal);
          }

          if (key == 'q') {
            a += 1;
            println(a);
          }
          if (key == 'w') {
            a -= 1;
            println(a);
          }

          if (key == '1') {
            minD += 0.01;
            println("Change min: "+minD);
          }

          if (key == '2') {
            minD -= 0.01;
            println("Change min: "+minD);
          }

          if (key == '3') {
            maxD += 0.01;
            println("Change max: "+maxD);
          }

          if (key == '4') {
            maxD -= 0.01;
            println("Change max: "+maxD);
          }

        }
</code></pre>
]]></description>
   </item>
   <item>
      <title>Windows 8.1 video</title>
      <link>https://forum.processing.org/two/discussion/12543/windows-8-1-video</link>
      <pubDate>Wed, 16 Sep 2015 15:11:05 +0000</pubDate>
      <dc:creator>lviensen</dc:creator>
      <guid isPermaLink="false">12543@/two/discussions</guid>
      <description><![CDATA[<p>Is there any way to play a video on Windows 8.1?</p>
]]></description>
   </item>
   <item>
      <title>My webcam video is not flipping!</title>
      <link>https://forum.processing.org/two/discussion/12440/my-webcam-video-is-not-flipping</link>
      <pubDate>Tue, 08 Sep 2015 19:46:53 +0000</pubDate>
      <dc:creator>Jackie_ds</dc:creator>
      <guid isPermaLink="false">12440@/two/discussions</guid>
      <description><![CDATA[<p>Hello guys!</p>

<p>I have a little color-tracking game which uses a webcam. My only problem now is that the video is not flipping as it should.
I tried all the code examples here in the forum, but nothing worked! Sometimes the video did flip correctly, but then I couldn't track the color anymore... I really don't know why.</p>

<p>So, first things first, here is my code:</p>

<pre><code>import processing.video.*;
Capture webcam;

color trackColor;
float trackR;
float trackG;
float trackB;
int topLeftX;
int topLeftY;
int bottomRightX;
int bottomRightY;
int maxColorDifference;

//CHERRY
PImage cherryPic;
Cherry[] cherry;
int count = 5;
int speed = 1;
int cherryX = -40;
int cherryY = -40;
int score = 0;
boolean caught;
boolean gameOver;

//basket
PImage basketPic;
int basketWidth = 103;
int basketHeight = 82;

//screen states
final int INACTIVE = 0;
final int ACTIVE = 1;
int state = INACTIVE;

void setup()
{
  size( 640, 480 );
  webcam = new Capture( this, width, height );

  trackColor = color( 255, 128, 64 );
  trackR = red(trackColor); //(trackColor &gt;&gt; 16) &amp; 0xff;
  trackG = green(trackColor); //(trackColor &gt;&gt; 8) &amp; 0xff;
  trackB = blue(trackColor); //trackColor &amp; 0xff;
  maxColorDifference = 40;
  topLeftX = width;
  topLeftY = height;
  bottomRightX = 0;
  bottomRightY = 0;

  //CHERRY
  cherryPic = loadImage("cherry.png");
  cherry = new Cherry[count];

  for (int i=0; i&lt;count; i++) {
    cherry[i]=new Cherry(cherryPic, cherryX, cherryY);
  }

  //BASKET
  basketPic = loadImage("basket.png");

  webcam.start();
}

void draw()
{

  //WEBCAM
  if ( webcam.available() ) {

    webcam.read();

    image( webcam, 0, 0 );    

    loadPixels();

    int counter = 0;
    for ( int j = 0; j &lt; webcam.height; j++ ) {
      for ( int i = 0; i &lt; webcam.width; i++ ) {        

        color c = webcam.pixels[counter];
        float r = red(c);
        float g = green(c);
        float b = blue(c);
        float colorDifference = dist( r, g, b, trackR, trackG, trackB );
        if ( colorDifference &lt; maxColorDifference ) {
          if ( i &lt; topLeftX ) {
            topLeftX = i;
          }
          if ( j &lt; topLeftY ) {
            topLeftY = j;
          }
          if ( i &gt; bottomRightX ) {
            bottomRightX = i;
          }
          if ( j &gt; bottomRightY ) {
            bottomRightY = j;
          }
        }
        counter++;
      }
    }

    updatePixels();

    //show basket
    image(basketPic, topLeftX, topLeftY);

    //CHERRIES
    for (int i = 0; i&lt;count; i++) {
      cherry[i].drawCherry();
    }  

    //start game with click on cherry
    if (state == INACTIVE) {     
      fill(255);
      textSize(48);
      text("Catch It", 240, 100);

      String text = "Click Your Tracking Point To Start";
      textSize(25);
      text(text, 120, 140);
    }
    //if state == ACTIVE
    else {

      caught = false;
      gameOver = false;

      for (int i=0; i&lt;count; i++ ) {
        cherry[i].speedItUp(speed);
        if (cherry[i].caught(topLeftX, topLeftY, basketWidth, basketHeight)) {
          caught = true;
          break;    //if one cherry is caught, end the loop
        }

        if (cherry[i].posY &gt; height) {
          gameOver = true;
        }
      }

      if (caught) {
        score+=10;
      }

      //show score
      textSize(18);
      text("Score: " + score, 15, 460);

      //increase speed when score &gt; 100
      if (score == 100) {
        speed++;
      }

      println(score + " speed " + speed);
    }
  }


  //increase speed when the seconds counter hits 40
  int sec = second();
  if (sec == 40) {
    speed++;
    println(sec + "speed " + speed);
  }


  //"Game Over" Screen at negative points
  if (gameOver) {
    background(0);
    textSize(90);    
    text("GAME OVER", 70, 200);

    textSize(35);
    text("Your Score: " + score, 200, 270);

    //textSize(20);
    //text("Press ENTER To Start A New Game", 160, 380);
  }

  // reset tracking points
  topLeftX = width;
  topLeftY = height;
  bottomRightX = 0;
  bottomRightY = 0;
}


void mousePressed()
{
  if (state == INACTIVE) {

    trackColor = webcam.get( mouseX, mouseY );
    trackR = red(trackColor);
    trackG = green(trackColor);
    trackB = blue(trackColor);

    state = ACTIVE;

    for (int i=0; i&lt;count; i++ ) {
      cherry[i].setCord();
    }
  }
}

//The Enter key restarts only on Game Over or after winning the game
/*void keyPressed()
 {
 if (gameOver &amp;&amp; key==ENTER) {
 state = ACTIVE;
 }
 } */
</code></pre>

<p>I worked with this bit of code, which I think should flip the video:</p>

<pre><code>  pushMatrix();
  scale(-1, 1);
  translate(-webcam.width, 0);
  image(webcam, 0, 0); 
  popMatrix();
</code></pre>

<p>But wherever I put it, it doesn't work correctly: either it doesn't flip the image, or it does but the tracking function stops working!
Can you please tell me where I have to put this code? Or which other code would do the same? Or have I missed something else?</p>
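<p>For what it's worth, one way to keep the tracking intact is to leave the pixel loop running on the un-flipped frame and mirror only the x coordinates when drawing the basket; a minimal sketch of the coordinate math (plain Java, the class and numbers are made up for illustration):</p>

```java
public class MirrorX {
    // Mirror an x coordinate across a canvas of the given width,
    // so tracking can stay on the un-flipped webcam pixels while
    // the drawing is shown flipped.
    static int mirrorX(int x, int width) {
        return width - 1 - x;
    }

    public static void main(String[] args) {
        int width = 640;
        // A tracked box from topLeftX=100 to bottomRightX=200:
        // after mirroring, it runs from mirrorX(bottomRightX) to mirrorX(topLeftX).
        System.out.println(mirrorX(200, width) + " to " + mirrorX(100, width)); // 439 to 539
    }
}
```

<p>Applied to the sketch, that would mean keeping image(webcam, 0, 0) inside the scale(-1, 1) / translate(-webcam.width, 0) pair and drawing the basket at mirrorX(bottomRightX, width) rather than at topLeftX.</p>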

<p>Thanks in advance.</p>

<ul>
<li>Jackie</li>
</ul>
]]></description>
   </item>
   <item>
      <title>Hexagonal Video Pixel grid question</title>
      <link>https://forum.processing.org/two/discussion/12400/hexagonal-video-pixel-grid-question</link>
      <pubDate>Fri, 04 Sep 2015 17:07:30 +0000</pubDate>
      <dc:creator>scottreinhard</dc:creator>
      <guid isPermaLink="false">12400@/two/discussions</guid>
      <description><![CDATA[<p>I'm working to produce a honeycomb grid of hexagons in which each hexagon takes the color of the video pixel beneath it, very similar to Dan Shiffman's pixel grid. I'm running into an issue where the furthest-left column of hexagons captures the color data for its pixels, but all of the other hexagons turn out to be the same color. Where am I going wrong with this? I've included a screenshot of the problem.</p>


<pre><code>    var w = 10;
        var cols, rows, value;


        function setup() {
            createCanvas(500, 500);
            devicePixelScaling(false);
            video = createCapture(VIDEO);
            video.size(width, height);
            video.hide();
        }

        function draw() {
            video.loadPixels();

            //background(102);

            for (var x = 0; x &lt; width + w; x += (w * cos(PI / 6) * 2)) { 
                index = 0;
                for (var y = 0; y &lt; height + w; y += w * (1 + cos(PI / 3))) { 

                    var loc = (y + x * width) * 4;

        // The functions red(), green(), and blue() pull out the three color components from a pixel.
                    var r = video.pixels[loc];
                    var g = video.pixels[loc + 1];
                    var b = video.pixels[loc + 2];

                    fill(r, g, b);

                    push(); {
                        translate(x - ((w * cos(PI / 6)) * (index % 2)), y);
                        rotate(radians(30));
                        polygon(0, 0, 10, 6);
                    }
                    pop();

                    index++;
                    console.log(loc);

                }
            }

        }

        function polygon(x, y, radius, npoints) {
            var angle = TWO_PI / npoints;
            beginShape();
            for (var a = 0; a &lt; TWO_PI; a += angle) {
                var sx = x + cos(a) * radius;
                var sy = y + sin(a) * radius;
                vertex(sx, sy);
            }
            endShape(CLOSE);
        }
</code></pre>

<p><a rel="nofollow" href="http://forum.processing.org/two/uploads/imageupload/653/ZZDSSEXW1IRF.png" title="Screen Shot 2015-09-04 at 1.01.46 PM">Screen Shot 2015-09-04 at 1.01.46 PM</a></p>
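<p>A likely suspect in the sketch above (an educated guess from reading the loop, not a verified fix): pixel arrays are laid out row-major, so the 1-D RGBA index is (x + y * width) * 4, whereas the loop computes (y + x * width) * 4, which only agrees with the row-major index along the first column. A tiny plain-Java sketch of the index math:</p>

```java
public class PixelIndex {
    // Row-major RGBA pixel index, the layout p5.js uses for video.pixels.
    static int rgbaIndex(int x, int y, int width) {
        return (x + y * width) * 4;
    }

    public static void main(String[] args) {
        int width = 500;
        System.out.println(rgbaIndex(0, 0, width)); // 0: first pixel
        System.out.println(rgbaIndex(1, 0, width)); // 4: next pixel in the same row
        System.out.println(rgbaIndex(0, 1, width)); // 2000: first pixel of row 1
    }
}
```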
]]></description>
   </item>
   <item>
      <title>What is a good library to use for video scrubbing interactivity (like with a mouse location)?</title>
      <link>https://forum.processing.org/two/discussion/12394/what-is-a-good-library-to-use-for-video-scrubbing-interactivity-like-with-a-mouse-location</link>
      <pubDate>Fri, 04 Sep 2015 00:23:43 +0000</pubDate>
      <dc:creator>wbandel</dc:creator>
      <guid isPermaLink="false">12394@/two/discussions</guid>
      <description><![CDATA[<p>I am trying to find a library in which I can scrub through video depending on where my mouse is located. I have found a couple, jmcvideo and gsvideo, and have had issues using them on a PC (though they work fine on my Mac). Does anyone know of any other libraries that allow for more advanced video control?</p>
]]></description>
   </item>
   <item>
      <title>glitch pixels with sound</title>
      <link>https://forum.processing.org/two/discussion/12175/glitch-pixels-with-sound</link>
      <pubDate>Thu, 20 Aug 2015 15:36:49 +0000</pubDate>
      <dc:creator>Patricia_Brsk</dc:creator>
      <guid isPermaLink="false">12175@/two/discussions</guid>
      <description><![CDATA[<p>Hello everyone!</p>

<p>I am desperately trying to find out how to properly merge Shiffman's video pixelation sketch with some Minim to make rects react to mic input and glitch.</p>

<p>Any idea? Any Minim thing for mic input?</p>

<p>Thank you all in advance.</p>
]]></description>
   </item>
   <item>
      <title>VIDEO: What codecs are supported by the video library ?</title>
      <link>https://forum.processing.org/two/discussion/12037/video-what-codecs-are-supported-by-the-video-library</link>
      <pubDate>Tue, 11 Aug 2015 19:52:58 +0000</pubDate>
      <dc:creator>mark_orion</dc:creator>
      <guid isPermaLink="false">12037@/two/discussions</guid>
      <description><![CDATA[<p>As Quicktime (mov) is a container format I am interested to know what codecs (video and audio) are supported by Processing.</p>
]]></description>
   </item>
   <item>
      <title>send output in /dev/video1</title>
      <link>https://forum.processing.org/two/discussion/11980/send-output-in-dev-video1</link>
      <pubDate>Wed, 05 Aug 2015 16:55:50 +0000</pubDate>
      <dc:creator>matthieu</dc:creator>
      <guid isPermaLink="false">11980@/two/discussions</guid>
      <description><![CDATA[<p>Hi,
I'm trying to send the video output of my sketch into /dev/video1.
Is it possible?
I would then like to pick it up in another piece of software through <a rel="nofollow" href="https://github.com/umlaeute/v4l2loopback">https://github.com/umlaeute/v4l2loopback</a></p>

<p>Thanks</p>
]]></description>
   </item>
   <item>
      <title>Video, motion sensors, and arduino</title>
      <link>https://forum.processing.org/two/discussion/11415/video-motion-sensors-and-arduino</link>
      <pubDate>Tue, 23 Jun 2015 14:59:49 +0000</pubDate>
      <dc:creator>kinesthtiaSonja</dc:creator>
      <guid isPermaLink="false">11415@/two/discussions</guid>
      <description><![CDATA[<p>I am trying to create a project that uses processing and Arduino with motion sensors. Basically I am trying to set up video that fades in and out or cross fades in relation to people's movement in a room. Does anyone know how to do this? i am relatively new to processing and still trying to figure stuff out!</p>
]]></description>
   </item>
   <item>
      <title>[video library] movies with alpha channel supported?</title>
      <link>https://forum.processing.org/two/discussion/11218/video-library-movies-with-alpha-channel-supported</link>
      <pubDate>Mon, 08 Jun 2015 20:45:08 +0000</pubDate>
      <dc:creator>kasperkamperman</dc:creator>
      <guid isPermaLink="false">11218@/two/discussions</guid>
      <description><![CDATA[<p>I'd like to use video with an alpha channel. I now have a Quicktime movie with the animation codec (that supports an alpha channel).</p>

<p>Is the video library able to read the alpha channel? I came across a solution with the GSVideo library, but I think that library is now integrated into Processing 2. Daniel Shiffman suggests using two movie files and then masking, but that makes the application rather heavy.</p>

<p>Another solution I considered was a chroma-key shader (I have actually implemented that already), but using the alpha information directly would of course be much nicer.</p>
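<p>For reference, the chroma-key route mentioned above boils down to deriving alpha from a pixel's distance to the key color; a minimal sketch of that math (plain Java; the key color and tolerance are arbitrary illustration values):</p>

```java
public class ChromaKey {
    // Simple chroma-key alpha: fully transparent near the key color,
    // fully opaque beyond a tolerance (hard cut; real shaders ramp this).
    static int alphaFor(int r, int g, int b, int keyR, int keyG, int keyB, float tol) {
        float dr = r - keyR, dg = g - keyG, db = b - keyB;
        float d = (float) Math.sqrt(dr * dr + dg * dg + db * db);
        return d < tol ? 0 : 255;
    }

    public static void main(String[] args) {
        // Key out green-screen pixels, key color (0, 255, 0), tolerance 60.
        System.out.println(alphaFor(10, 250, 5, 0, 255, 0, 60));  // 0: keyed out
        System.out.println(alphaFor(200, 50, 50, 0, 255, 0, 60)); // 255: kept
    }
}
```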
]]></description>
   </item>
   <item>
      <title>Cannot load MP4 video</title>
      <link>https://forum.processing.org/two/discussion/7971/cannot-load-mp4-video</link>
      <pubDate>Wed, 05 Nov 2014 11:38:07 +0000</pubDate>
      <dc:creator>rrymc</dc:creator>
      <guid isPermaLink="false">7971@/two/discussions</guid>
      <description><![CDATA[<p>Hello.</p>

<p>Started using Processing today, I want to use it to play with video.</p>

<p>Processing 2.2.1 x64 / Win 8.1 x64</p>

<p>Perhaps someone can tell me why the below code won't load the video? It is in the sketch directory, I have tried altering size() to half and quarter values, I have tried .mp4 and .MP4 - but Processing tells me that it "cannot load striplight.MP4". Thanks in advance.</p>

<pre><code> import processing.video.*;

    Movie test;

    void setup() {
      size(1920, 1080);
      background(0);
      test = new Movie(this, "striplight.MP4");
      test.loop();
      test.volume(0);
    }

    void movieEvent(Movie m) {
      m.read();
    }

    void draw() {
      image(test, 0, 0, width, height);
    }
</code></pre>
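<p>One thing worth ruling out (an assumption about the setup, not a confirmed diagnosis): Processing's loaders look for media in the sketch's data/ subfolder, and file names are case-sensitive on some platforms. A plain-Java sketch of the existence check (all paths below are hypothetical):</p>

```java
import java.io.File;

public class FindMovie {
    // Return the first candidate path that exists on disk, or null if none does.
    static String firstExisting(String... candidates) {
        for (String path : candidates) {
            if (new File(path).exists()) return path;
        }
        return null;
    }

    public static void main(String[] args) {
        // Hypothetical sketch layout: check the sketch folder, data/, and both cases.
        String found = firstExisting(
            "mysketch/striplight.MP4",
            "mysketch/data/striplight.MP4",
            "mysketch/data/striplight.mp4");
        System.out.println(found == null ? "not found" : found);
    }
}
```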
]]></description>
   </item>
   <item>
      <title>Video not showing up?</title>
      <link>https://forum.processing.org/two/discussion/10768/video-not-showing-up</link>
      <pubDate>Tue, 12 May 2015 07:39:17 +0000</pubDate>
      <dc:creator>dashrsp</dc:creator>
      <guid isPermaLink="false">10768@/two/discussions</guid>
      <description><![CDATA[<p>I'm making a little interactive doo-hickey for a class, and it's built around having little videos playing on each page.
I've only gotten far enough to implement one, but it doesn't show up. The video is called in "void driveOff()" (at the bottom). Any idea what's wrong?</p>

<p>EDIT: I figured it out.</p>

<pre><code>import processing.video.*;

// Project #3, Art Assignment

PFont regular;
PFont light;
PFont small;
PFont big;

Movie sequence1;
PImage startOff;

boolean clutch = false;

color buttonText = color(40);
color titleText = color(250);
color textHover = color(250);
color buttonShadow = color(250);
color buttonColor = color(237, 227, 54);
color buttonHover = color(87, 74, 184);

color mainBG = color(28, 15, 120);

int page = 0;

// go Left/right button
int lrSize = 100;

int leftX = 230;
int lrY = 150;
// go RIGHT button
int rightX = 1050;
// go STRAIGHT button
int straightX = 490;
int straightY = 100;

// go Straight Button size
int straightW = 300;
int straightH = 100;

// Yes &amp; No
int yesX = leftX+120;
int yesnoY = lrY+100;
int noX = rightX-115;

// button corners
int buttonCorner = 10;
// Start Over button
int startOverSize = 200;
int startOverX = 640;
int startOverY = 600;

// START Button
int startButtonSize = 500;
int startButtonX = 640;
int startButtonY = 612;


void setup() {
  size (1280, 1024);
  regular = loadFont("Sansation_Regular-40.vlw");
  light = loadFont("Sansation_Light-40.vlw");
  small = loadFont("Sansation_Light-24.vlw");
  big = loadFont("Sansation_Light-100.vlw");
  startOff = loadImage("startOff.png");

  sequence1 = new Movie(this, "sequence01.mov");
  sequence1.noLoop();

  frameRate(60);
}

void draw() {    
  // HOME PAGE
  if (page==0) {
    startPage();
  } else if (page==1) {
    driveOrWalk();
  } else if (page==2) {
    pagetwo();
  } else if (page==3) {
    walkBoring();
  } else if (page==4) {
    clutch();
  } else if (page==5) {
    driveOff();
  }

  println("Page: " + page);
}

void mousePressed() {
  if (page==0 &amp;&amp; ellipseHover (startButtonX, startButtonY, startButtonSize) == true) {
    page=1;
  }
  if (page!=1 &amp;&amp; ellipseHover (startOverX, startOverY, startOverSize) == true) {
    page=1;
  }
  // Drive?
  if (page==1 &amp;&amp; ellipseHover (yesX, yesnoY, lrSize) == true) {
    page=2;
  }
  if (page==1 &amp;&amp; ellipseHover (noX, yesnoY, lrSize) == true) {
    page=3;
  }

  // Start
  if (page==2 &amp;&amp; clutch==false &amp;&amp; rectHover (straightX, straightY, straightW, straightH) == true) {
    page=4;
  } else if (page==2 &amp;&amp; clutch==true &amp;&amp; rectHover (straightX, straightY, straightW, straightH) == true) {
    page=5;
  } // Clutch
  if (page==4 &amp;&amp; rectHover (straightX, straightY+200, straightW, straightH) == true) {
    clutch = true;
    page=2;
  }
}

// START Page
void startPage() {
  background(mainBG);

  textAlign(CENTER, TOP);
  fill(titleText);
  textSize(100);
  textFont(big);
  text("'find the quietest place.'", 650, 150);

  startButton();
}


// BUTTONS
boolean rectHover (int x, int y, int width, int height) {
  if (mouseX &gt; x &amp;&amp; mouseX &lt; x+width &amp;&amp; mouseY &gt; y &amp;&amp; mouseY &lt; y+height) {
    return true;
  } else {
    return false;
  }
}

boolean ellipseHover (int x, int y, int diameter) {
  float disX = x - mouseX;
  float disY = y - mouseY;
  if (sqrt(sq(disX) + sq(disY)) &lt; diameter/2 ) {
    return true;
  } else {
    return false;
  }
}

void startButton() {
  textAlign(CENTER, CENTER);
  textSize(100);
  textFont(big);
  strokeWeight(3);
  if (ellipseHover (startButtonX, startButtonY, startButtonSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(startButtonX, startButtonY, startButtonSize, startButtonSize);
    fill(titleText);
    text("START", startButtonX-1, startButtonY-8);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(startButtonX, startButtonY, startButtonSize, startButtonSize);
    fill(buttonText);
    text("START", startButtonX-1, startButtonY-8);
  }
}

void startOver() {
  textAlign(CENTER, CENTER);
  textSize(40);
  textFont(regular);
  if (ellipseHover (startOverX, startOverY, startOverSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(startOverX, startOverY, startOverSize, startOverSize);
    fill(titleText);
    text("start", startOverX, startOverY-20);
    text("over", startOverX, startOverY+17);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(startOverX, startOverY, startOverSize, startOverSize);
    fill(buttonText);
    text("start", startOverX, startOverY-20);
    text("over", startOverX, startOverY+17);
  }
}

void pageButtons() {
  textAlign(CENTER, CENTER);
  textSize(40);
  textFont(regular);

  // GO LEFT
  if (ellipseHover (leftX, lrY, lrSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(leftX, lrY, lrSize, lrSize);
    fill(titleText);
    text("left", leftX, lrY-3);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(leftX, lrY, lrSize, lrSize);
    fill(buttonText);
    text("left", leftX, lrY-3);
  }

  // GO RIGHT
  if (ellipseHover (rightX, lrY, lrSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(rightX, lrY, lrSize, lrSize);
    fill(titleText);
    text("right", rightX, lrY-3);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(rightX, lrY, lrSize, lrSize);
    fill(buttonText);
    text("right", rightX, lrY-3);
  }

  // GO STRAIGHT
  if (rectHover (straightX, straightY, straightW, straightH) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    rect(straightX, straightY, straightW, straightH, buttonCorner);
    fill(titleText);
    text("straight", straightX+150, straightY+47);
  } else {
    noStroke();
    fill(buttonColor);
    rect(straightX, straightY, straightW, straightH, buttonCorner);
    fill(buttonText);
    text("straight", straightX+150, straightY+47);
  }
}


// PAGES

void driveOrWalk() {
  background(mainBG);
  textAlign(CENTER, TOP);
  fill(titleText);
  textSize(100);
  textFont(big);
  text("drive?", 650, 150);

  textAlign(CENTER, CENTER);
  textSize(40);
  textFont(regular);

  if (ellipseHover (yesX, yesnoY, lrSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(yesX, yesnoY, lrSize, lrSize);
    fill(titleText);
    text("yes", yesX, yesnoY-3);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(yesX, yesnoY, lrSize, lrSize);
    fill(buttonText);
    text("yes", yesX, yesnoY-3);
  }

  if (ellipseHover (noX, yesnoY, lrSize) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    ellipse(noX, yesnoY, lrSize, lrSize);
    fill(titleText);
    text("no", noX, yesnoY-3);
  } else {
    noStroke();
    fill(buttonColor);
    ellipse(noX, yesnoY, lrSize, lrSize);
    fill(buttonText);
    text("no", noX, yesnoY-3);
  }
}

void pagetwo() {
  background(mainBG); 
  image(startOff, 0, 260);
  if (rectHover (straightX, straightY, straightW, straightH) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    rect(straightX, straightY, straightW, straightH, buttonCorner);
    fill(titleText);
    text("go", straightX+150, straightY+47);
  } else {
    noStroke();
    fill(buttonColor);
    rect(straightX, straightY, straightW, straightH, buttonCorner);
    fill(buttonText);
    text("go", straightX+150, straightY+47);
  }
}

void walkBoring() {
  background(mainBG);
  textAlign(CENTER, TOP);
  fill(titleText);
  textSize(100);
  textFont(big);
  text("boring", 650, 150);
  text("try again.", 650, 320);

  startOver();
}

void clutch() {
  background(mainBG);
  textAlign(CENTER, TOP);
  fill(titleText);
  textSize(40);
  textFont(regular);
  text("you forgot to press the clutch.", 650, 200);

  textAlign(CENTER, CENTER);
  textSize(40);
  textFont(regular);
  if (rectHover (straightX, straightY+200, straightW, straightH) == true) {
    stroke(buttonShadow);
    fill(buttonHover);
    rect(straightX, straightY+200, straightW, straightH, buttonCorner);
    fill(titleText);
    text("press clutch", straightX+150, straightY+247);
  } else {
    noStroke();
    fill(buttonColor);
    rect(straightX, straightY+200, straightW, straightH, buttonCorner);
    fill(buttonText);
    text("press clutch", straightX+150, straightY+247);
  }
}

void driveOff() {
  background(mainBG);

  image(sequence1, 0, 260);

  pageButtons();
}

void movieEvent(Movie m) {
  m.read();
}
</code></pre>
]]></description>
   </item>
   <item>
      <title>Hi,  How about setup camera communication?</title>
      <link>https://forum.processing.org/two/discussion/10764/hi-how-about-setup-camera-communication</link>
      <pubDate>Tue, 12 May 2015 06:42:39 +0000</pubDate>
      <dc:creator>SeongJongGwak</dc:creator>
      <guid isPermaLink="false">10764@/two/discussions</guid>
      <description><![CDATA[<p>Hi, Nice to meet you</p>

<p>I have a question for you.</p>

<p>I want to use Processing (processing.org) to operate a camera and show the camera data on the Processing display:
Camera data → Processing display</p>

<p>How do I configure this?
What products should I buy?
How should I write the code?</p>

<p>Please help me. Thank you.</p>
]]></description>
   </item>
   <item>
      <title>Flying Particles / Music Player (HD 720p)</title>
      <link>https://forum.processing.org/two/discussion/10780/flying-particles-music-player-hd-720p</link>
      <pubDate>Tue, 12 May 2015 19:22:22 +0000</pubDate>
      <dc:creator>atoro</dc:creator>
      <guid isPermaLink="false">10780@/two/discussions</guid>
      <description><![CDATA[<p>Made with Processing 
Flying Particles / Music Player (HD 720p) // gp 001 01 svsa
<span class="VideoWrap"><span class="Video YouTube" id="youtube-09j67sQ6WfQ"><span class="VideoPreview"><a href="http://youtube.com/watch?v=09j67sQ6WfQ"><img src="http://img.youtube.com/vi/09j67sQ6WfQ/0.jpg" width="640" height="385" border="0" /></a></span><span class="VideoPlayer"></span></span></span></p>
]]></description>
   </item>
   <item>
      <title>Calculating difference between colors</title>
      <link>https://forum.processing.org/two/discussion/10667/calculating-difference-between-colors</link>
      <pubDate>Wed, 06 May 2015 08:22:57 +0000</pubDate>
      <dc:creator>grumo</dc:creator>
      <guid isPermaLink="false">10667@/two/discussions</guid>
      <description><![CDATA[<p>I wrote a script that analyzes every frame of a video and draws a square with the average color of that frame. I need to trigger a function when the difference between the current and the previous color is bigger than some value x. How can I achieve this? I tried hex()-ing and unhex()-ing the colors, but the difference detection isn't working and usually shows 0 all the time, even if the color changes a lot. This is urgent (it's homework I need to present in a few hours). Thanks for any help, I really appreciate it.</p>
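<p>Rather than round-tripping through hex strings, comparing the raw channels directly tends to be simpler; a minimal sketch of the distance test (plain Java; the threshold is an arbitrary value to tune):</p>

```java
public class ColorDiff {
    // Euclidean distance between two RGB colors (channels 0..255),
    // the same quantity Processing's dist(r1,g1,b1,r2,g2,b2) computes.
    static float colorDist(int r1, int g1, int b1, int r2, int g2, int b2) {
        float dr = r1 - r2, dg = g1 - g2, db = b1 - b2;
        return (float) Math.sqrt(dr * dr + dg * dg + db * db);
    }

    public static void main(String[] args) {
        int threshold = 100; // tune to taste
        float d = colorDist(255, 0, 0, 0, 0, 255); // red vs. blue
        System.out.println(d > threshold); // true: trigger the function
    }
}
```

<p>In Processing, red(c), green(c) and blue(c) extract the channels from a color, and dist(r1, g1, b1, r2, g2, b2) computes the same distance.</p>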
]]></description>
   </item>
   <item>
      <title>Full screen? Help please!</title>
      <link>https://forum.processing.org/two/discussion/10516/full-screen-help-please</link>
      <pubDate>Sat, 25 Apr 2015 23:53:16 +0000</pubDate>
      <dc:creator>jarmr</dc:creator>
      <guid isPermaLink="false">10516@/two/discussions</guid>
      <description><![CDATA[<p>Hey guys, I'm pretty new to Processing. Can someone help me make this code go full screen? Thanks!</p>

<pre><code>//WHITE w/ blk

import processing.video.*;
// Variable for capture device
Capture video;
// Previous frame
PImage prevFrame;
// How different must a pixel be to count as a "motion" pixel
float threshold = 40;

void setup() {
  size(640, 480);
  video = new Capture(this, width, height, 30);
  // Create an empty image the same size as the video
  prevFrame = createImage(video.width, video.height, RGB);
  video.start();
}

void draw() {
  // Capture video
  if (video.available()) {
    // Before reading the new frame, save the previous frame for comparison
    prevFrame.copy(video, 0, 0, video.width, video.height, 0, 0, video.width, video.height);
    prevFrame.updatePixels();
    video.read();
  }

  loadPixels();
  video.loadPixels();
  prevFrame.loadPixels();

  // Walk through every pixel
  for (int x = 0; x &lt; video.width; x++) {
    for (int y = 0; y &lt; video.height; y++) {
      int loc = (video.width - x - 1) + y*video.width; // Step 1: the 1D pixel location (mirrored)
      color current = video.pixels[loc];               // Step 2: the current color
      color previous = prevFrame.pixels[loc];          // Step 3: the previous color

      // Step 4: compare colors (previous vs. current)
      float r1 = red(current);  float g1 = green(current);  float b1 = blue(current);
      float r2 = red(previous); float g2 = green(previous); float b2 = blue(previous);
      float diff = dist(r1, g1, b1, r2, g2, b2);

      // Step 5: if the color at that pixel has changed, there is motion there
      if (diff &gt; threshold) {
        pixels[loc] = color(0);   // motion: black
      } else {
        pixels[loc] = color(250); // no motion: white
      }
    }
  }
  updatePixels();
}
</code></pre>
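<p>In Processing 3 the usual route is replacing size(640, 480) with fullScreen(), and in Processing 2 running the sketch via Sketch → Present; either way the 640x480 camera frame then needs scaling up to the display. The aspect-preserving scale factor is simple math (plain Java sketch; the display size below is hypothetical):</p>

```java
public class FitToScreen {
    // Compute the uniform scale that fits a source frame inside a display
    // while preserving aspect ratio (letterboxing).
    static float fitScale(int srcW, int srcH, int dstW, int dstH) {
        return Math.min((float) dstW / srcW, (float) dstH / srcH);
    }

    public static void main(String[] args) {
        // 640x480 camera frame on a hypothetical 1920x1080 display
        float s = fitScale(640, 480, 1920, 1080);
        System.out.println(s);                       // 2.25
        System.out.println(640 * s + "x" + 480 * s); // 1440.0x1080.0
    }
}
```

<p>In the sketch, that scale would feed something like image(video, 0, 0, 640 * s, 480 * s), with the pixel loop left at the camera's own resolution.</p>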
]]></description>
   </item>
   </channel>
</rss>