SimpleOpenNI multiple users, any help?

edited October 2013 in Kinect

Hi, I found this interesting example of an open/close palm gesture using SimpleOpenNI in this LINK. The example only works for a single user. Can anyone suggest how to make it work for multiple users? :)

          import SimpleOpenNI.*;
          SimpleOpenNI context;
          float        zoomF =0.5f;
          float        rotX = radians(180);                                      
          float        rotY = radians(0);
          //the hand tracking part
          boolean      handsTrackFlag = false;//if kinect is tracking hand or not
          PVector      handVec = new PVector();//the latest/most up to date hand point
          ArrayList    handVecList = new ArrayList();//the previous points in a list
          int          handVecListSize = 30;//the number of previous points to be remembered 
          String       lastGesture = "";//used to keep track of gestures

          PVector      handMin = new PVector();
          PVector      handMax = new PVector();
          float        handThresh = 95;
          float        openThresh = 200;

          void setup()
          {
            size(1024, 768, P3D);  
            context = new SimpleOpenNI(this);
            context.setMirror(true);
            if (context.enableDepth() == false)
            {
              println("Can't open the depthMap, maybe the camera is not connected!"); 
              exit();
              return;
            }

            // enable hands + gesture generation
            context.enableGesture();
            context.enableHands();
            // add focus gestures / on my Mac only RaiseHand is recognized reliably, maybe a CPU performance issue?
            context.addGesture("Wave");
            context.addGesture("Click");
            context.addGesture("RaiseHand");
            stroke(255, 255, 255);
            smooth();
          }

          void draw()
          {
            // update the cam
            context.update();
            background(0, 0, 0);
            // set the scene pos
            translate(width/2, height/2, 0);
            rotateX(rotX);
            rotateY(rotY);
            if (handsTrackFlag)  
            {
              //update hand from point cloud
              handMin = handVec.get();
              handMax = handVec.get();
              // draw the 3d point depth map
              int[]   depthMap = context.depthMap();
              int     steps   = 3;  // to speed up the drawing, draw every third point
              int     index;
              PVector realWorldPoint;
              for (int y=0;y < context.depthHeight();y+=steps)
              {
                for (int x=0;x < context.depthWidth();x+=steps)
                {
                  index = x + y * context.depthWidth();
                  if (depthMap[index] > 0)
                  { 
                    // draw the projected point
                    realWorldPoint = context.depthMapRealWorld()[index];
                    if (realWorldPoint.dist(handVec) < handThresh) {
                      point(realWorldPoint.x, realWorldPoint.y, realWorldPoint.z); 
                      if (realWorldPoint.x < handMin.x) handMin.x = realWorldPoint.x;
                      if (realWorldPoint.y < handMin.y) handMin.y = realWorldPoint.y;
                      if (realWorldPoint.z < handMin.z) handMin.z = realWorldPoint.z;
                      if (realWorldPoint.x > handMax.x) handMax.x = realWorldPoint.x;
                      if (realWorldPoint.y > handMax.y) handMax.y = realWorldPoint.y;
                      if (realWorldPoint.z > handMax.z) handMax.z = realWorldPoint.z;
                    }
                  }
                }
              }
              line(handMin.x, handMin.y, handMin.z, handMax.x, handMax.y, handMax.z);
              float hDist = handMin.dist(handMax);
              if (hDist > openThresh) println("palm open, dist: " + hDist);
              else                   println("palm close, dist: " + hDist);
              pushStyle();
              stroke(255, 0, 0, 200);
              noFill();
              beginShape();
              for (int i = 0 ; i < handVecList.size(); i++) {
                PVector p = (PVector) handVecList.get(i);
                vertex(p.x, p.y, p.z);
              }
              endShape();   

              stroke(255, 0, 0);
              strokeWeight(4);
              point(handVec.x, handVec.y, handVec.z);
              popStyle();
            }
            // draw the kinect cam
            context.drawCamFrustum();
          }
          // -----------------------------------------------------------------
          // hand events

          void onCreateHands(int handId, PVector pos, float time)
          {
            println("onCreateHands - handId: " + handId + ", pos: " + pos + ", time:" + time);
            handsTrackFlag = true;
            handVec = pos;
            handVecList.clear();
            handVecList.add(pos);
          }

          void onUpdateHands(int handId, PVector pos, float time)
          {
            //println("onUpdateHandsCb - handId: " + handId + ", pos: " + pos + ", time:" + time);
            handVec = pos;
            handVecList.add(0, pos);
            if (handVecList.size() >= handVecListSize)
            { // remove the last point 
              handVecList.remove(handVecList.size()-1);
            }
          }

          void onDestroyHands(int handId, float time)
          {
            println("onDestroyHandsCb - handId: " + handId + ", time:" + time);
            handsTrackFlag = false;
            context.addGesture(lastGesture);
          }

          // -----------------------------------------------------------------
          // gesture events

          void onRecognizeGesture(String strGesture, PVector idPosition, PVector endPosition)
          {
            println("onRecognizeGesture - strGesture: " + strGesture + ", idPosition: " + idPosition + ", endPosition:" + endPosition);
            lastGesture = strGesture;
            context.removeGesture(strGesture); 
            context.startTrackingHands(endPosition);
          }

          void onProgressGesture(String strGesture, PVector position, float progress)
          {
            //println("onProgressGesture - strGesture: " + strGesture + ", position: " + position + ", progress:" + progress);
          }

Answers

  • I don't know Kinect / SimpleOpenNI, so as a start, I wonder how you distinguish several users.

  • edited October 2013

    SimpleOpenNI assigns each user a userId, but you need to call the relevant library functions to use it. There is also a getCOM() function which gives a user's center-of-mass position.

            int[] depthValues = kinect.depthMap();
            int[] userMap = null;
            int userCount = kinect.getNumberOfUsers(); 
            if (userCount > 0) { // check if number of user is more than zero
              userMap = kinect.getUsersPixels(SimpleOpenNI.USERS_ALL); // get user pixels of all the users
            }
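As a library-free illustration of how the `userMap` distinguishes users: `getUsersPixels` fills it with one label per depth pixel (the user id, or 0 for background), so per-user statistics such as pixel counts fall out of a single pass. The 3x3 "frame" below is a made-up toy example:

```java
import java.util.HashMap;
import java.util.Map;

public class UserMapStats {
    /** Count how many depth pixels belong to each user id (0 = no user). */
    static Map<Integer, Integer> pixelsPerUser(int[] userMap) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int label : userMap) {
            if (label == 0) continue;             // skip background pixels
            counts.merge(label, 1, Integer::sum); // increment this user's count
        }
        return counts;
    }

    public static void main(String[] args) {
        // 3x3 toy "frame": user 1 on the left column, user 2 bottom-right
        int[] userMap = {
            1, 0, 0,
            1, 0, 2,
            1, 0, 2
        };
        System.out.println(pixelsPerUser(userMap)); // {1=3, 2=2}
    }
}
```

The same pass could accumulate x/y sums per label to get a rough per-user centroid, which is essentially what the center-of-mass function gives you.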
    

    I don't fully understand how the hand functions work in SimpleOpenNI, but I am sure you can distinguish hands. A hand can be tracked in either of two ways:

    • Using skeleton data you can track the hand position, but it is inaccurate and unreliable
    • Using the hand and gesture functions/events

          // assumes these sketch-level fields exist elsewhere:
          //   HashMap<Integer, ArrayList<PVector>> handPathList = new HashMap<Integer, ArrayList<PVector>>();
          //   ArrayList<PVector> splashVector; PVector endPosClick; int losthand;
          void onCreateHands(int handId, PVector position, float time) {
            println("onNewHand - handId: " + handId + ", pos: " + position);
            ArrayList<PVector> vecList = new ArrayList<PVector>();
            vecList.add(position);
            handsTrackFlag = true;
            handVec = position;
            handPathList.put(handId, vecList);
          }

          void onUpdateHands(int handId, PVector position, float time) {
            //println("onTrackedHand - handId: " + handId + ", pos: " + position);
            ArrayList<PVector> vecList = handPathList.get(handId);
            if (vecList != null)
            {
              vecList.add(0, position);
              // optionally trim the trail:
              // vecList.remove(vecList.size()-1);
            }
            handVec = position;
          }

          void onDestroyHands(int handId, float time) {
            println("onLostHand - handId: " + handId);
            handPathList.remove(handId);
            losthand++;
            // re-arm the focus gesture so a new hand can be picked up
            context.addGesture("RaiseHand");
          }

          // -----------------------------------------------------------------
          // gesture events

          void onRecognizeGesture(String strGesture, PVector idPosition, PVector endPosition) {
            if (strGesture.equals("RaiseHand")) {
              context.startTrackingHands(endPosition);
              context.removeGesture("RaiseHand");
            }
            context.convertRealWorldToProjective(endPosition, endPosition);
            if (strGesture.equals("Click")) {
              endPosClick = endPosition;
              println("Click gesture executed");
            }
            if (strGesture.equals("Wave")) {
              splashVector.clear();
              println("Wave gesture executed");
            }
          }
      
  • edited October 2013

    Can we create a class for the grab gesture and instantiate it per user, keyed by handId or userId? I am not sure it is possible, but if anyone can help me, please do...
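One way to sketch that idea in plain Java (none of these names are SimpleOpenNI API; this is an assumed design, not a tested integration): keep one small detector object per handId in a HashMap, feed each one only the depth points near its own hand, and run the same bounding-box open/close test per hand. The `onCreateHands`/`onDestroyHands` callbacks would add and remove entries:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical per-hand grab detector: one HandBox per handId.
public class GrabTracker {
    static final float OPEN_THRESH = 200; // same cutoff as the single-hand sketch

    static class HandBox {
        float[] min, max; // axis-aligned bounding box of this hand's points
        void update(float[] p) {
            if (min == null) { min = p.clone(); max = p.clone(); return; }
            for (int i = 0; i < 3; i++) {
                min[i] = Math.min(min[i], p[i]);
                max[i] = Math.max(max[i], p[i]);
            }
        }
        boolean isOpen() {
            if (min == null) return false;
            float dx = max[0] - min[0], dy = max[1] - min[1], dz = max[2] - min[2];
            return Math.sqrt(dx * dx + dy * dy + dz * dz) > OPEN_THRESH;
        }
    }

    final Map<Integer, HandBox> hands = new HashMap<>();

    void addPoint(int handId, float[] p) {               // from the depth-map scan
        hands.computeIfAbsent(handId, id -> new HandBox()).update(p);
    }
    void destroy(int handId) { hands.remove(handId); }   // from onDestroyHands
    boolean isOpen(int handId) {
        HandBox b = hands.get(handId);
        return b != null && b.isOpen();
    }

    public static void main(String[] args) {
        GrabTracker t = new GrabTracker();
        t.addPoint(1, new float[]{-120, -120, 900});  // hand 1: spread points
        t.addPoint(1, new float[]{120, 120, 1100});
        t.addPoint(2, new float[]{10, 10, 1000});     // hand 2: tight cluster
        System.out.println(t.isOpen(1)); // true
        System.out.println(t.isOpen(2)); // false
    }
}
```

In a real sketch each HandBox would be reset at the start of every draw() frame before re-scanning the depth map, so the box reflects only the current frame's points, and the scan would assign each point to the nearest tracked handVec.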

  • Hey, can anyone help me with this, please? Here is the code for multiple-hand tracking, and I have posted the hand-grab gesture code in the post above, but I don't know how to integrate the two.

    Any help would be highly appreciated.

          // multiple hands
          import SimpleOpenNI.*;
          import java.util.*;
          SimpleOpenNI      context;
          // NITE
          XnVSessionManager sessionManager;
          XnVFlowRouter     flowRouter;
          PointDrawer       pointDrawer;
    
          void setup()
          {
            context = new SimpleOpenNI(this);
            context.setMirror(true);
            context.enableDepth();
            context.enableGesture();
            context.enableHands();
            sessionManager = context.createSessionManager("Click,Wave", "RaiseHand");
            pointDrawer = new PointDrawer();
            flowRouter = new XnVFlowRouter();
            flowRouter.SetActive(pointDrawer);
            sessionManager.AddListener(flowRouter);
            size(context.depthWidth(), context.depthHeight()); 
            smooth();
          }
    
          void draw()
          {
            background(200, 0, 0);
            context.update();
            context.update(sessionManager);
            image(context.depthImage(), 0, 0);
            pointDrawer.draw();
          }
    
          // session callbacks
    
          void onStartSession(PVector pos)
          {
            println("onStartSession: " + pos);
          }
    
          void onEndSession()
          {
            println("onEndSession: ");
          }
    
          void onFocusSession(String strFocus, PVector pos, float progress)
          {
            println("onFocusSession: focus=" + strFocus + ",pos=" + pos + ",progress=" + progress);
          }
    
    
          /////////////////////////////////////////////////////////////////////////////////////////////////////
          // PointDrawer keeps track of the handpoints
    
          class PointDrawer extends XnVPointControl
          {
            HashMap    _pointLists;
            int        _maxPoints;
            color[]    _colorList = { 
              color(255, 0, 0), color(0, 255, 0), color(0, 0, 255), color(255, 255, 0)
            };
    
            public PointDrawer()
            {
              _maxPoints = 30;
              _pointLists = new HashMap();
            }
    
            public void OnPointCreate(XnVHandPointContext cxt)
            {
              // create a new list
              addPoint(cxt.getNID(), new PVector(cxt.getPtPosition().getX(), cxt.getPtPosition().getY(), cxt.getPtPosition().getZ()));
              println("OnPointCreate, handId: " + cxt.getNID());
            }
    
            public void OnPointUpdate(XnVHandPointContext cxt)
            {
              //println("OnPointUpdate " + cxt.getPtPosition());   
              addPoint(cxt.getNID(), new PVector(cxt.getPtPosition().getX(), cxt.getPtPosition().getY(), cxt.getPtPosition().getZ()));
            }
    
            public void OnPointDestroy(long nID)
            {
              println("OnPointDestroy, handId: " + nID);
              // remove list
              if (_pointLists.containsKey(nID))
                _pointLists.remove(nID);
            }
    
            public ArrayList getPointList(long handId)
            {
              ArrayList curList;
              if (_pointLists.containsKey(handId))
                curList = (ArrayList)_pointLists.get(handId);
              else
              {
                curList = new ArrayList(_maxPoints);
                _pointLists.put(handId, curList);
              }
              return curList;
            }
    
            public void addPoint(long handId, PVector handPoint)
            {
              ArrayList curList = getPointList(handId);
    
              curList.add(0, handPoint);      
              if (curList.size() > _maxPoints)
                curList.remove(curList.size() - 1);
            }
    
            public void draw()
            {
              if (_pointLists.size() <= 0)
                return;
    
              pushStyle();
              noFill();
              PVector vec;
              PVector firstVec;
              PVector screenPos = new PVector();
              int colorIndex=0;
              // draw the hand lists
              Iterator<Map.Entry> itrList = _pointLists.entrySet().iterator();
              while (itrList.hasNext ()) 
              {
                strokeWeight(2);
                stroke(_colorList[colorIndex % _colorList.length]); // cycle through all colors
                ArrayList curList = (ArrayList)itrList.next().getValue();     
                // draw line
                firstVec = null;
                Iterator<PVector> itr = curList.iterator();
                beginShape();
                while (itr.hasNext ()) 
                {
                  vec = itr.next();
                  if (firstVec == null)
                    firstVec = vec;
                  // calc the screen pos
                  context.convertRealWorldToProjective(vec, screenPos);
                  vertex(screenPos.x, screenPos.y);
                } 
                endShape();   
    
                // draw current pos of the hand
                if (firstVec != null)
                {
                  strokeWeight(8);
                  context.convertRealWorldToProjective(firstVec, screenPos);
                  point(screenPos.x, screenPos.y);
                }
                colorIndex++;
              }
              popStyle();
            }
          }
    
  • edited October 2013

    @PhiLho: Can you suggest a way to integrate these two pieces of code so that I can perform the grab gesture (open/close palm) for multiple users?

    Please help me. Thanks!

  • Answer ✓

    I don't know if this is any help: http://stackoverflow.com/questions/14742261/simple-openni-getuserpixels

    It looks like neither @PhiLho nor I own one, so it's not that easy to comment on Kinect code. Anyone donating Kinects out there? :)

  • edited October 2013

    @hamoid: I already tried the method you linked, but it isn't useful in this context, so can you give me another hint if possible? :P BTW thanks again, hamoid, for the help :)

  • Maybe somebody can help me: I can't manage to adapt my children's game to the Kinect. They have to catch the ball on the floor.

          import SimpleOpenNI.*;
          SimpleOpenNI kinect;

          float x = 120;
          float y = 60;
          int radius = 12;

          void setup() {
            kinect = new SimpleOpenNI(this);
            kinect.enableDepth();
            size(kinect.depthWidth(), kinect.depthHeight());
            smooth();
            ellipseMode(RADIUS);
          }

          void draw() {
            kinect.update();
            background(0);
            // TODO: replace mouseX/mouseY with a tracked hand or foot position
            float d = dist(mouseX, mouseY, x, y);
            if (d < radius) {
              radius++;
              fill(0);
            }
            else {
              radius = 12;
              if (frameCount % 150 == 0) {
                x = random(500);
                y = random(500);
              }
              fill(255);
            }
            ellipse(x, y, radius, radius);
          }
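Pulled out of Processing into plain Java so the game logic can be tested in isolation (the hand or foot position would come from Kinect tracking instead of the mouse; the class and method names here are made up for illustration):

```java
public class BallGame {
    float x = 120, y = 60;  // ball position
    int radius = 12;        // current radius
    int frames = 0;

    /** One animation step; (hx, hy) is the tracked player position in screen coords. */
    boolean step(float hx, float hy) {
        frames++;
        float d = (float) Math.hypot(hx - x, hy - y);
        if (d < radius) {        // player touches the ball: it "inflates"
            radius++;
            return true;         // caught this frame
        }
        radius = 12;             // otherwise reset size
        if (frames % 150 == 0) { // every 150 frames, jump somewhere new
            x = (float) (Math.random() * 500);
            y = (float) (Math.random() * 500);
        }
        return false;
    }

    public static void main(String[] args) {
        BallGame g = new BallGame();
        System.out.println(g.step(121, 61));  // player on the ball -> true
        System.out.println(g.step(500, 500)); // player far away    -> false
    }
}
```

In the Processing sketch, draw() would call step() once per frame with a hand position from hand tracking (or a foot position derived from the user's lowest skeleton joint, since the ball is on the floor) and then draw the ellipse.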
