Kinect Gesture Control for Unity 1st Person Character Controller
Hello!
First off, I'm a beginner to programming with only a super basic understanding and no formal training, so you might as well consider me brand new to this.
My project uses Unity to create a simple virtual environment, and I want to integrate Kinect gesture control to navigate the scene in place of the built-in keyboard/mouse character controls. I only have very simple gestures in mind: for example, raising a hand, or tracking a left or right lean in the spine. I only need to control forward motion in the direction the camera is pointing and allow side-to-side camera rotation; up-and-down rotation is not required.
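To make the gestures concrete, here's roughly what I'm picturing in UnityScript (just a sketch; the joint variables and the `leanThreshold` value are placeholders for whatever the OpenNI skeleton data actually provides, not a real API):

```
// Sketch only: I'm assuming the skeleton wrapper can hand me
// joint positions as Vector3s each frame. These variable names
// (rightHandPos, headPos, etc.) are placeholders, not a real API.
var rightHandPos : Vector3;
var headPos : Vector3;
var neckPos : Vector3;
var torsoPos : Vector3;
var leanThreshold : float = 0.15; // sideways offset, in metres

function HandRaised() : boolean {
    // Count the hand as "raised" once it's above the head.
    return rightHandPos.y > headPos.y;
}

function LeanDirection() : float {
    // -1 = leaning left, +1 = leaning right, 0 = upright,
    // measured as the neck's sideways offset from the torso.
    var offset : float = neckPos.x - torsoPos.x;
    if (Mathf.Abs(offset) < leanThreshold) {
        return 0.0;
    }
    return Mathf.Sign(offset);
}
```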
I have Processing, OpenNI, NITE, and the PrimeSense drivers all installed and working. From the Codasign site I have working code for skeleton tracking, identifying people in the scene, and calculating the distance between joints. The Unity code (if you have it and want to look) is the standard first-person controller scripts: CharacterMotor, MouseLook, and FPSInputController.
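For reference, I assume the distance-between-joints idea from the Codasign examples maps straight onto Unity's vector maths, something like (a sketch of the same idea, not their actual code):

```
// Distance between two joints, Unity-style (a sketch of the
// same idea as the Codasign Processing example, not their code).
function JointDistance(a : Vector3, b : Vector3) : float {
    return Vector3.Distance(a, b);
}

// e.g. a "hand near head" check, which feels like the same idea
// as the hot-spot proximity tutorial mentioned below:
function HandNearHead(hand : Vector3, head : Vector3) : boolean {
    return JointDistance(hand, head) < 0.2; // 0.2 m is a guess
}
```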
I'm hoping to do all of this in JavaScript (UnityScript), although most Kinect code is C#, and the MouseLook script is C# as well. All of the Codasign code is JavaScript.
It was going great until I ran across an unfinished tutorial on that site, "Triggering Events Based On Proximity to Hot Spots", which is unfortunately EXACTLY what I need. I have code to track the skeleton and joints, but I don't know how to put it together to recognize specific gestures and then trigger the character-control movement from within the Unity scripts.
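In my head, the glue I'm missing looks something like this per-frame script (using the helper functions from my sketch above; `CharacterController.SimpleMove` and `Transform.Rotate` are real Unity calls, everything else is a placeholder):

```
// Rough idea of the glue I'm missing: each frame, read the
// gesture state and drive the first-person controller with it.
var controller : CharacterController;
var moveSpeed : float = 3.0;   // metres per second
var turnSpeed : float = 60.0;  // degrees per second

function Update() {
    if (HandRaised()) {
        // Walk in whatever direction the camera currently faces.
        var forward : Vector3 = transform.forward;
        forward.y = 0.0; // stay level; no flying
        controller.SimpleMove(forward.normalized * moveSpeed);
    }
    // Leaning left/right rotates the view; no up/down needed.
    transform.Rotate(0.0, LeanDirection() * turnSpeed * Time.deltaTime, 0.0);
}
```

Is this roughly the right shape, or am I thinking about it wrong?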
Any help or direction towards tutorials would be GREATLY appreciated. Thanks in advance. And please let me know if you need any other information. I tried to be as thorough as possible.