I'm currently working on a system with Ubuntu 10.04 and want to use the flob library to detect gestures. I read up on the forum to find a way to capture video from the webcam (since QuickTime doesn't run on Linux) and found a few links mentioning the GSVideo library.
So I installed the library and made a few changes to the hello_flob example (changed Capture to GSCapture), but the sketch fails with an "ArrayIndexOutOfBounds" error at the line
blobs = flob.calc(flob.binarize(video));
The capture window then shows very garbled output.
Is this an error in my code, or is flob not meant to work with GSVideo? I'm very new to Processing, so please pardon the ignorance.
I'm trying to hook up Processing with Blender for a gesture-recognition module. The user can control objects within the Blender Game Engine using gestures. I've managed to use the flob library to detect hand motion and want to move a cube in response to that gesture.
My question is: how do I write the motion data to a socket so that Blender can retrieve it? Should I use a socket or a port?
I'm very new to Processing, so any help will be hugely appreciated.
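Not an answer from the original thread, but on the "socket or port" question: a socket is the connection endpoint itself, and a port is just the number that identifies it on a host, so you use a socket bound to an agreed-upon port. Below is a minimal plain-Java sketch of the pattern (Processing sketches are Java, so `java.net.Socket` works directly inside one). The loopback address, the choice of a free port, and the `"x y"` line format are my assumptions; in a real setup the server role would be played by a Python script inside Blender using its `socket` module.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class MotionSocketDemo {

    // Sends one "x y" line over a loopback TCP connection and returns
    // what the receiving end read, so the round trip can be verified.
    static String exchange(float x, float y) throws Exception {
        // Stand-in for the Blender side: a TCP server on a free port.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();

            // "Processing" side: connect and write one line of motion data.
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(x + " " + y);
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            sender.start();

            // "Blender" side: accept the connection and read the line.
            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()))) {
                String line = in.readLine();
                sender.join();
                return line;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("received: " + exchange(0.42f, 0.77f));
    }
}
```

In a sketch you would typically open the `Socket` once in `setup()` and write one line per frame from `draw()`, rather than reconnecting each time as this self-contained demo does.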
I'm totally new to Processing, so some of my questions might be very trivial; please forgive that. I'm currently using the flob library. What I want to achieve is to have an image swipe across the screen based on the movement of my hand as captured by the webcam. So I suppose I've got to extract the largest blob and reposition the image at the position of that blob. I've played around with the examples in flob and am trying to modify the code to suit my needs, but it doesn't exactly work. Can someone please help me out? Thanks :-)
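The "extract the largest blob" step is plain list logic, independent of flob itself. Here is a hedged plain-Java sketch of it: the `Blob` class below is a hypothetical stand-in for whatever blob objects flob's tracking call returns (the real class and its field names differ, so treat them as assumptions), and `toPixels` shows how a normalized centroid could be mapped to screen coordinates for drawing the image.

```java
import java.util.Arrays;
import java.util.List;

public class LargestBlobDemo {

    // Hypothetical stand-in for a tracked blob: a normalized centroid
    // (0..1) plus an area. flob's own blob objects expose similar values,
    // but under different names, so this is illustrative only.
    static class Blob {
        final float cx, cy, area;
        Blob(float cx, float cy, float area) {
            this.cx = cx; this.cy = cy; this.area = area;
        }
    }

    // Returns the blob with the largest area, or null for an empty list.
    static Blob largest(List<Blob> blobs) {
        Blob best = null;
        for (Blob b : blobs) {
            if (best == null || b.area > best.area) best = b;
        }
        return best;
    }

    // Maps a normalized centroid to pixel coordinates, so the image can
    // be drawn there, e.g. image(b, x - b.width/2, y - b.height/2).
    static float[] toPixels(Blob b, int screenW, int screenH) {
        return new float[] { b.cx * screenW, b.cy * screenH };
    }

    public static void main(String[] args) {
        List<Blob> blobs = Arrays.asList(
                new Blob(0.2f, 0.3f, 40f),
                new Blob(0.6f, 0.5f, 120f),   // largest: presumably the hand
                new Blob(0.8f, 0.1f, 15f));
        Blob hand = largest(blobs);
        float[] p = toPixels(hand, 640, 480);
        System.out.println(p[0] + ", " + p[1]);
    }
}
```

Inside `draw()` you would run this selection over the list returned by the tracking call each frame, then draw the image at the resulting coordinates instead of at a fixed position.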
Here's the code:
import processing.opengl.*;
import processing.video.*;
import s373.flob.*;
Capture video;
Flob flob;

int videores = 128;
int fps = 60;
PFont font;
//Bola bolas[];
PImage b;

float maxArea = 0;
float xPos = 0, yPos = 0;
float area = 0;
boolean showcamera = true;
boolean om = true, omset = false;
float velmult = 10000.0f;
int vtex = 0;

void setup() {
  // workaround for bug 882 in Processing 1.0.1
  try {
    quicktime.QTSession.open();
  }
  catch (quicktime.QTException qte) {
    qte.printStackTrace();
  }

  size(640, 480, OPENGL);
  frameRate(fps);

  String[] devices = Capture.list();
  println(devices);
  video = new Capture(this, videores, videores, devices[1], fps);

  flob = new Flob(video, width, height);
  flob.setMirror(true, false);
  flob.setThresh(12);
  flob.setFade(25);
  flob.setMinNumPixels(100);
  flob.setImage(vtex);

  b = loadImage("dummy.jpg");
  image(b, 0, 0);

  //bolas = new Bola[10];
  //for (int i = 0; i < bolas.length; i++) {
  //  bolas[i] = new Bola();
  //}

  // createFont must run after size(), so the font is created here
  // rather than in the field initializer
  font = createFont("arial", 10);
  textFont(font);
}
void draw() {
  background(102);

  if (video.available()) {
    if (!omset) {
      if (om)
        flob.setOm(flob.CONTINUOUS_DIFFERENCE);
      else
        flob.setOm(flob.STATIC_DIFFERENCE);
      omset = true;
    }
    video.read();
    // this is where the tracking method is chosen: calc, calcsimple, or tracksimple
    // tracksimple is more precise but heavier than calcsimple