Hello, I am new to Processing. I am working on a project with a 9-axis sensor: I hold the sensor in my hand, and it continuously sends data to the computer over serial. Now I want to visualize the human hand. I saw a video on YouTube (FreeIMU) in which they visualized a skeletal human hand, and I want to do the same. How can I do that in Processing? Any references, or pointers on how to start? Any help is appreciated.
Answers
Can you post your code?
Do you already receive the serial signal?
Do you get nine 3D positions or just nine angles?
http://openprocessing.org/sketch/8274
I am using the Pololu MinIMU-9 v2 board and their open-source code: https://www.pololu.com/product/1268 . I am new to this (sorry about that), so I don't know much about what data I need. I do know how to read serial data into Processing (whatever the data may be). What I don't know is what comes next: what data I require, how to draw the shape, and how to make that shape move in real time according to the serial data.
Try to break the problem down into smaller steps. What are these small steps?
1.
Download their code and make it run.
Can you move the Pololu MinIMU-9 v2 board in space and see the red bars etc. move on the screen? I see a red bar on their website. Can you get that far, and move it?
2.
When you have that then look at the data you receive.
Understand the code.
3.
Then, in a new program, draw a hand and experiment with it.
4.
Then, later, join the two pieces of code.
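For steps 1 and 2, a minimal sketch like this verifies the serial link before any visualization. It assumes the board streams one comma-separated line of readings per sample; the port index and the baud rate of 115200 are assumptions you may have to adjust for your setup.

```processing
// Minimal serial check: print whatever the board sends, one line at a time.
import processing.serial.*;

Serial port;

void setup() {
  size(200, 200);
  println(Serial.list());                        // list available ports first
  port = new Serial(this, Serial.list()[0], 115200);  // pick yours from the list
  port.bufferUntil('\n');                        // call serialEvent() once per line
}

void draw() {
  background(0);
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    float[] values = float(split(trim(line), ','));  // parse "a,b,c,..." into floats
    println(values);                                 // inspect what the board sends
  }
}
```

Once you see sensible numbers scrolling in the console, you know exactly which values arrive and in which order, which is what step 2 asks for.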
Question
Do you want the hand to be stiff and just move in 3D space on the screen (e.g. to click something), or do you want to move the fingers of the hand (bend a finger) / make different gestures?
;-)
https://processing.org/examples/reach1.html
https://processing.org/examples/reach2.html
https://processing.org/examples/reach3.html
I downloaded the code, and I also got their other code for the visualization. When I run it, I can see in real time what is shown on their website (a blue rectangle, a yellow bar, and a green arrow); if I move the sensor, the blue rectangle moves too. I went through the code: it is in Python, and they take only roll, yaw, and pitch for the visualization. I have understood how they create the box, the window, and the arrow, and how they read the data from the serial port, but I can't understand how they process/convert those angles into movement. I just want a hand (no fingers, no bending of fingers): if I move the sensor up, the hand on screen should also point upwards.
https://processing.org/examples/reach1.html — I found this one interesting. This is enough for what I want, but it moves based on the mouse pointer position, and I want the movement based on my sensor values.
Yes. ;-)
You need to take the code from reach1 and put it into your program.
Then replace mouseX and mouseY with your data.
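The substitution looks roughly like this. The names targetX/targetY and the 0..5000 input range are assumptions (adjust to your sensor's actual range); the reach1 arm-segment code itself is not reproduced here, only the point where the mouse coordinates get swapped out.

```processing
// Sketch of the idea: drive reach1's target from serial data instead of the mouse.
import processing.serial.*;

Serial port;
float targetX, targetY;   // these replace mouseX and mouseY

void setup() {
  size(640, 360);
  port = new Serial(this, Serial.list()[0], 115200);  // adjust port/baud to your setup
  port.bufferUntil('\n');
}

void draw() {
  background(226);
  // In reach1, replace every mouseX/mouseY with targetX/targetY.
  // Placeholder so this sketch runs on its own:
  ellipse(targetX, targetY, 10, 10);
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  float[] v = float(split(trim(line), ','));
  if (v.length >= 2) {
    targetX = map(v[0], 0, 5000, 0, width);   // assumed sensor range 0..5000
    targetY = map(v[1], 0, 5000, 0, height);
  }
}
```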
That is the main problem: what data do I need to put in place of mouseX and mouseY? I have an accelerometer, and it gives 3 values in m/s^2. Also, the code in reach1 is 2D, but I want a 3D visualization.
The code above is for a simple hand.
When you have three values (what is their range?),
you could apply them to the three rotation axes:
rotateX
rotateY
rotateZ
The sketch for rotation is above.
Just use map() to apply your serial values to these three rotations:
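A minimal sketch of the idea, in 3D (P3D): three angles drive rotateX/rotateY/rotateZ around a box. Here placeholder mouse values stand in for the sensor; swap in your mapped serial values (the 0..5000 input range is an assumption) for ax, ay, az.

```processing
// Rotate a box with three mapped angle values.
float ax, ay, az;   // rotation angles in radians

void setup() {
  size(400, 400, P3D);
}

void draw() {
  background(0);
  translate(width/2, height/2);   // rotate around the center of the screen
  rotateX(ax);
  rotateY(ay);
  rotateZ(az);
  box(100);                       // stand-in for your hand shape

  // Placeholder data; replace with your mapped serial values:
  ax = map(mouseX, 0, width, 0, TWO_PI);
  ay = map(mouseY, 0, height, 0, TWO_PI);
}
```

The same translate/rotate calls work unchanged once a hand shape replaces the box, since the transformations apply to whatever you draw afterwards.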
thank you very much
The 3 values will be in m/s^2 or 'g' (gravity) units: in m/s^2 they range from 0 to 5000 (it depends), while in g they range from -3.9 to +3.9.
Use map() as shown in my sketch.
Map 0 to 5000 onto the range 0 to TWO_PI.
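For reference, Processing's map(value, inMin, inMax, outMin, outMax) is plain linear interpolation, so the mid-range reading maps to the mid-range angle. A standalone Java version (MapDemo is just an illustrative name) shows the arithmetic:

```java
// Standalone version of Processing's map(): linear interpolation
// from the range [inMin, inMax] onto the range [outMin, outMax].
public class MapDemo {
    static float map(float value, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        // A sensor reading of 2500 out of 0..5000 lands halfway into 0..TWO_PI.
        float angle = map(2500, 0, 5000, 0, (float) (2 * Math.PI));
        System.out.println(angle);   // roughly PI, i.e. half a turn
    }
}
```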