I am trying to convert some lines of code written for an old version of SimpleOpenNI so that they work with SimpleOpenNI 1.96.
The aim of this program is to have the brightness of two LEDs driven by the position of your hand (a simple Arduino-and-Kinect example project). Here's the code:
a. Arduino code
int val, xVal, yVal;

void setup() {
  Serial.begin(9600); // initialize the serial communication
  pinMode(10, OUTPUT);
  pinMode(11, OUTPUT);
}

void loop() {
  // check if enough data has been sent from the computer:
  if (Serial.available() > 2) {
    // Read the first value. This indicates the beginning of the communication.
    val = Serial.read();
    // If the value is the event trigger character 'S'
    if (val == 'S') {
      // read the most recent byte, which is the x-value
      xVal = Serial.read();
      // Then read the y-value
      yVal = Serial.read();
    }
  }
  // And send those to the LEDs!
  analogWrite(10, xVal);
  analogWrite(11, yVal);
}
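To check the Arduino side on its own, I drive it from the mouse instead of the Kinect. This is only a minimal test sketch, and it assumes the Arduino shows up as the first serial port on the machine:

import processing.serial.*;

Serial myPort;

void setup() {
  size(256, 256);
  // assuming the Arduino is the first serial port; change the index if not
  myPort = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  background(0);
  // same 3-byte packet the Kinect sketch sends: marker 'S', then x and y as 0-255
  myPort.write('S');
  myPort.write(int(255.0 * mouseX / width));
  myPort.write(int(255.0 * mouseY / height));
}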
b. Processing code
The old code:
import SimpleOpenNI.*;
import processing.serial.*;

SimpleOpenNI kinect;
Serial myPort;

PVector handVec = new PVector();
PVector mapHandVec = new PVector();
color handPointCol = color(255, 0, 0);

void setup() {
  kinect = new SimpleOpenNI(this);
  // enable mirror
  kinect.setMirror(true);
  // enable depthMap generation, hands and gestures
  kinect.enableDepth();
  kinect.enableGesture();
  kinect.enableHands();
  // add focus gesture to initialise tracking
  kinect.addGesture("Wave");
  size(kinect.depthWidth(), kinect.depthHeight());
  String portName = Serial.list()[0]; // This gets the first port on your computer.
  myPort = new Serial(this, portName, 9600);
}

void draw() {
  kinect.update();
  kinect.convertRealWorldToProjective(handVec, mapHandVec);
  // draw depthImageMap
  image(kinect.depthImage(), 0, 0);
  strokeWeight(10);
  stroke(handPointCol);
  point(mapHandVec.x, mapHandVec.y);
  // Send a marker to indicate the beginning of the communication
  myPort.write('S');
  // Send the hand's x-position, scaled to 0-255
  myPort.write(int(255 * mapHandVec.x / width));
  // Send the hand's y-position, scaled to 0-255
  myPort.write(int(255 * mapHandVec.y / height));
}

void onCreateHands(int handId, PVector pos, float time)
{
  println("onCreateHands - handId: " + handId + ", pos: " + pos + ", time: " + time);
  handVec = pos;
  handPointCol = color(0, 255, 0);
}

void onUpdateHands(int handId, PVector pos, float time)
{
  println("onUpdateHandsCb - handId: " + handId + ", pos: " + pos + ", time: " + time);
  handVec = pos;
}

void onRecognizeGesture(String strGesture, PVector idPosition, PVector endPosition)
{
  println("onRecognizeGesture - strGesture: " + strGesture + ", idPosition: " + idPosition + ", endPosition: " + endPosition);
  kinect.removeGesture(strGesture);
  kinect.startTrackingHands(endPosition);
}
I have changed some lines, mainly the hand and gesture calls.
The result:
import SimpleOpenNI.*;
import processing.serial.*;

SimpleOpenNI context;
Serial myPort;

PVector handVec = new PVector();
PVector mapHandVec = new PVector();
color handPointCol = color(255, 0, 0);

void setup() {
  context = new SimpleOpenNI(this);
  // enable mirror
  context.setMirror(true);
  // enable depthMap generation, hands and gestures
  context.enableDepth();
  context.enableHand();
  context.startGesture(SimpleOpenNI.GESTURE_WAVE);
  size(context.depthWidth(), context.depthHeight());
  String portName = Serial.list()[0]; // This gets the first port on your computer.
  myPort = new Serial(this, portName, 9600);
}

void draw() {
  context.update();
  context.convertRealWorldToProjective(handVec, mapHandVec);
  // draw depthImageMap
  image(context.depthImage(), 0, 0);
  strokeWeight(10);
  stroke(handPointCol);
  point(mapHandVec.x, mapHandVec.y);
  // Send a marker to indicate the beginning of the communication
  myPort.write('S');
  // Send the hand's x-position, scaled to 0-255
  myPort.write(int(255 * mapHandVec.x / width));
  // Send the hand's y-position, scaled to 0-255
  myPort.write(int(255 * mapHandVec.y / height));
}

void onCreateHands(int handId, PVector pos, float time)
{
  println("onCreateHands - handId: " + handId + ", pos: " + pos + ", time: " + time);
  handVec = pos;
  handPointCol = color(0, 255, 0);
}

void onUpdateHands(int handId, PVector pos, float time)
{
  println("onUpdateHandsCb - handId: " + handId + ", pos: " + pos + ", time: " + time);
  handVec = pos;
}

void onRecognizeGesture(String strGesture, PVector idPosition, PVector endPosition)
{
  println("onRecognizeGesture - strGesture: " + strGesture + ", idPosition: " + idPosition + ", endPosition: " + endPosition);
  context.endGesture(SimpleOpenNI.GESTURE_WAVE);
  context.startTrackingHand(endPosition);
}
There is no error, but the Kinect doesn't track my hand.
Thank you for your attention.
Answers
Try drawing an ellipse instead of a point, with fill and a slightly bigger radius, and wave your hand in front of the Kinect (slightly faster).
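Something like this in draw(), in place of the point() call (the size is just a guess):

fill(handPointCol);
noStroke();
ellipse(mapHandVec.x, mapHandVec.y, 20, 20);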
I don't see any problem in your code, but you can try adding enableUser() in setup().
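For example (no guarantee that it helps, it is normally used for skeleton tracking):

context.enableDepth();
context.enableHand();
context.enableUser();   // added
context.startGesture(SimpleOpenNI.GESTURE_WAVE);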
I have tried your suggestion but it didn't work.
Just for information (I forgot to mention it): there is no problem when I run the Hands example that comes with SimpleOpenNI (Hands test).
Do you have any idea about this?
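For reference, the hand and gesture callbacks in that working Hands example look roughly like this (written down from memory, so the exact signatures may be slightly off); they are named differently from the onCreateHands / onUpdateHands / onRecognizeGesture callbacks I kept in my converted sketch:

// hand events as used in the SimpleOpenNI 1.96 Hands example (approximate)
void onNewHand(SimpleOpenNI curContext, int handId, PVector pos)
{
  println("onNewHand - handId: " + handId + ", pos: " + pos);
}

void onTrackedHand(SimpleOpenNI curContext, int handId, PVector pos)
{
  println("onTrackedHand - handId: " + handId + ", pos: " + pos);
}

void onLostHand(SimpleOpenNI curContext, int handId)
{
  println("onLostHand - handId: " + handId);
}

// gesture events
void onCompletedGesture(SimpleOpenNI curContext, int gestureType, PVector pos)
{
  println("onCompletedGesture - gestureType: " + gestureType + ", pos: " + pos);
  curContext.startTrackingHand(pos);
}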
Give me some details:
1. Windows version?
2. SimpleOpenNI version?
3. Kinect SDK?
4. Post the example code that you run on your system.
The original one is the old code (from the Arduino and Kinect Projects book by Enrique Ramos Melgar); the code that I run on my system is the result code above.
I also want to ask you about other changes in the new library (such as kinect.startPoseDetection("Psi", userId); etc.). I am still trying to find information about all of this. Could you give me some information? The SimpleOpenNI documentation doesn't explain it well.
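From what I can see in the User example that ships with 1.96, it looks like the pose detection step is gone and skeleton tracking is started directly from the user callback, roughly like this (I am not sure this is the whole story):

// user events as used in the SimpleOpenNI 1.96 User example (approximate)
void onNewUser(SimpleOpenNI curContext, int userId)
{
  println("onNewUser - userId: " + userId);
  // no startPoseDetection("Psi", userId) any more
  curContext.startTrackingSkeleton(userId);
}

void onLostUser(SimpleOpenNI curContext, int userId)
{
  println("onLostUser - userId: " + userId);
}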
Thank you for your attention.