Hi everyone, I'm trying to use this code to apply a ripple effect to the video from a webcam. The code creates a ripple when you drag your mouse across the Processing window. I want to make it so a single ripple appears in a random location (as if a single drop of water fell) when a motion sensor detects motion. Does anyone know how to do this? Or can you at least tell me how to change it so a ripple appears when you click the mouse instead of dragging it?
Thank you in advance for your help!
/**
 * Frame Differencing
 * by Golan Levin.
 *
 * Quantify the amount of movement in the video frame using frame-differencing.
 */
import processing.video.*;

Ripple ripple;
int numPixels;
int[] previousFrame;
Capture video;

void setup() {
  size(640, 480); // Change size to 320 x 240 if too slow at 640 x 480
  // Uses the default video input; see the reference if this causes an error
  video = new Capture(this, width, height, 24);
  video.start(); // Required by the video library in Processing 2 and later
  numPixels = video.width * video.height;
  // Create an array to store the previously captured frame
  previousFrame = new int[numPixels];
  ripple = new Ripple();
  frameRate(60);
}

void draw() {
  if (video.available()) {
    // When using video to manipulate the screen, call video.available() and
    // video.read() inside draw() so that it's safe to draw to the screen
    video.read();        // Read the new frame from the camera
    video.loadPixels();  // Make its pixels[] array available
    loadPixels();
    int movementSum = 0; // Amount of movement in the frame
    for (int i = 0; i < numPixels; i++) { // For each pixel in the video frame...
      color currColor = video.pixels[i];
      color prevColor = previousFrame[i];
      // Extract the red, green, and blue components of the current pixel
      int currR = (currColor >> 16) & 0xFF; // Like red(), but faster
      int currG = (currColor >> 8) & 0xFF;
      int currB = currColor & 0xFF;
      // Extract the red, green, and blue components of the previous pixel
      int prevR = (prevColor >> 16) & 0xFF;
      int prevG = (prevColor >> 8) & 0xFF;
      int prevB = prevColor & 0xFF;
      // Compute the difference of the red, green, and blue values
      int diffR = abs(currR - prevR);
      int diffG = abs(currG - prevG);
      int diffB = abs(currB - prevB);
      // Add these differences to the running tally
      movementSum += diffR + diffG + diffB;
      // Render the ripple's color buffer to the screen
      pixels[i] = ripple.col[i];
      // Save the current color into the 'previous' buffer
      previousFrame[i] = currColor;
    }
    // Only update the screen if something actually moved
    if (movementSum > 0) {
      updatePixels();
    }
  }
}
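For the click-instead-of-drag part: move the trigger out of `mouseDragged()` and into Processing's `mousePressed()` callback, which fires once per click. This is a sketch rather than drop-in code, because your `Ripple` class isn't shown — it assumes the class exposes (or you add) a `drop(x, y)` method that disturbs the water surface at a single point; that method name is hypothetical, so adapt it to whatever your ripple code actually calls.

```
// Fires once per click, not continuously while dragging
void mousePressed() {
  ripple.drop(mouseX, mouseY); // hypothetical method - adapt to your Ripple class
}

// Call this whenever your motion sensor reports movement:
// one drop at a random point, as if a single raindrop fell
void singleRandomDrop() {
  int x = int(random(width));
  int y = int(random(height));
  ripple.drop(x, y);
}
```

Note that the frame-differencing code itself already gives you a software motion sensor: `movementSum` grows with the amount of movement between frames, so something like `if (movementSum > threshold) singleRandomDrop();` at the end of `draw()` would trigger random drops on camera motion without any extra hardware (pick the threshold by printing `movementSum` and watching typical values).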
So I have a motion sensor connected to an Arduino and 5 videos in Processing. I want one of the videos to play depending on someone's proximity to the motion sensor (each video corresponds to a different distance range). I have never connected Arduino to Processing before. Could anyone give me a general code framework to work from and tell me what each part does? This is probably asking for a lot, but I would really appreciate any help!
Thank you !!
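The usual way to do this is over serial: the Arduino prints one sensor reading per line, and Processing's Serial library parses each line and switches videos. Here is a minimal sketch of that framework — everything specific in it is a placeholder and assumption: an analog proximity sensor on pin A0, video files named `video0.mov` through `video4.mov` in the sketch's data folder, serial port index 0, and made-up range thresholds you will need to tune.

On the Arduino side:

```
// Arduino side: read the sensor and send one newline-terminated value per loop
const int sensorPin = A0; // assumes an analog proximity sensor

void setup() {
  Serial.begin(9600);
}

void loop() {
  int distance = analogRead(sensorPin); // 0-1023
  Serial.println(distance);             // one value per line, easy to parse
  delay(100);                           // ~10 readings per second is plenty
}
```

On the Processing side:

```
import processing.video.*;
import processing.serial.*;

Serial port;
Movie[] movies = new Movie[5];
int current = 0;

void setup() {
  size(640, 480);
  // printArray(Serial.list()) shows available ports; index 0 is a guess - check yours
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n'); // serialEvent() fires once per complete line
  for (int i = 0; i < 5; i++) {
    movies[i] = new Movie(this, "video" + i + ".mov"); // placeholder filenames
  }
  movies[current].loop();
}

void serialEvent(Serial p) {
  String line = trim(p.readStringUntil('\n'));
  if (line == null || line.length() == 0) return;
  int distance = int(line);
  // Map the 0-1023 reading into five equal ranges (1024 / 5 is about 205);
  // these thresholds are made up - tune them to your sensor and space
  int next = constrain(distance / 205, 0, 4);
  if (next != current) {
    movies[current].stop();
    current = next;
    movies[current].loop();
  }
}

void movieEvent(Movie m) {
  m.read(); // copy each new frame as it becomes available
}

void draw() {
  image(movies[current], 0, 0, width, height);
}
```

The moving parts: `Serial.println()` on the Arduino sends readings, `bufferUntil('\n')` makes Processing call `serialEvent()` once per full reading, and that handler converts the distance into an index 0-4 and swaps which movie is looping. `movieEvent()` and the `image()` call in `draw()` are the standard video-playback boilerplate.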
I want to play a video in Processing, but when I run the code, the window is blank and it just plays the sound from the video. I've used video in Processing before and it worked, so I think there is something wrong with the format of the video file. I edited it in Final Cut Pro X and exported it as a .mov file. I thought maybe the file was too big, so I made it smaller (21 MB), but it still didn't work. Any suggestions?
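A blank window with working audio usually means the video track can't be decoded — typically a codec problem rather than file size (21 MB is fine). Re-exporting from Final Cut with H.264 video in a .mov or .mp4 container is worth trying first. It's also worth double-checking the playback boilerplate, since a missing `movieEvent()` callback or `image()` call produces exactly this symptom: sound plays but no frames are ever drawn. A minimal working pattern, with a placeholder filename:

```
import processing.video.*;

Movie movie;

void setup() {
  size(640, 480);
  movie = new Movie(this, "myvideo.mov"); // placeholder - file must be in the data folder
  movie.loop();
}

// Without this callback, frames are never read and the window stays blank
void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(movie, 0, 0, width, height); // actually draw the current frame
}
```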