I'm utilising the new release of Proscene for a multitouch application I am working on for a uni project (University of Sydney represent!) which is controlled on a 42" TUIO-enabled multitouch surface.
The new HIDevice class is fantastic and just what I needed, as I can 'feed' the TUIO events directly into camera navigation.
However, what I need now is the ability to simulate the CameraProfile mouse controls via TUIO events. For example, if one TUIO cursor touches the surface, that simulates a mouse click; and by pressing and dragging over an InteractiveFrame object, you can move it around just as if you were using a mouse.
I have not found an easy way to do this, and the HIDevice class at the moment only offers methods to feed translation and rotation variables.
Can someone suggest a way for me to access the mouse-specific behaviours (selecting/dragging an InteractiveFrame, ZOOM_ON_REGION, etc.) via TUIO events?
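To show what I mean: one workaround I've been considering is translating TUIO cursor events into synthetic mouse events (e.g. dispatched via java.awt.Robot or AWT events), so Proscene's existing CameraProfile bindings fire unchanged. TUIO cursors report normalized [0,1] coordinates, so the core would be a coordinate mapping plus a press/drag/release state machine. A minimal sketch of that logic — everything here except the TUIO coordinate convention is my own naming, not any library API:

```java
import java.awt.Point;

// Hypothetical helper: call these from the TUIO client's addTuioCursor /
// updateTuioCursor / removeTuioCursor callbacks, then dispatch the returned
// point as a synthetic mouse press/drag/release (e.g. via java.awt.Robot).
class TuioMouseBridge {
    final int width, height;     // sketch window size in pixels
    boolean pressed = false;

    TuioMouseBridge(int width, int height) {
        this.width = width;
        this.height = height;
    }

    // TUIO reports normalized coordinates in [0,1]; map them to window pixels.
    Point toScreen(float nx, float ny) {
        return new Point(Math.round(nx * width), Math.round(ny * height));
    }

    Point cursorAdded(float nx, float ny) {    // -> simulate mousePressed
        pressed = true;
        return toScreen(nx, ny);
    }

    Point cursorUpdated(float nx, float ny) {  // -> simulate mouseDragged
        return pressed ? toScreen(nx, ny) : null;
    }

    Point cursorRemoved(float nx, float ny) {  // -> simulate mouseReleased
        pressed = false;
        return toScreen(nx, ny);
    }
}
```

With only one cursor treated as "the mouse", a second cursor could stay reserved for the multitouch gestures I'm already feeding into HIDevice. But maybe there's a cleaner, Proscene-native way?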
p.s. I love Proscene so much, it's made my love for Processing even stronger than ever =)
I am beginning a project where there will be a few planets floating about in a 3D space. I intend to create the spheres using toxiclibs and its mesh classes/methods, as I want to use the deforming and subdividing methods.
The user will be able to morph and shape the planets by deforming the mesh, and also to modify the texture of each planet itself (perhaps even an atmospheric alpha-texture that floats slightly above the planet).
Now I figure that the built-in OPENGL renderer won't be able to render what I envision fast enough, so I've spent all day today researching OpenGL and the GLGraphics library. The included example, a port of toxi's NoiseSurface sketch, was an eye-opening experience!
And so I am experimenting with this process:
1. Create a WETriangleMesh and send it to the GPU as a GLModel via getMeshAsVertexArray() (as in the GLGraphics Integration example)
2. Perhaps modify the mesh while it sits in the GLModel
3. When I want to perform some toxiclibs-specific methods again, pull the vertex data out of the GLModel and convert it back to a WETriangleMesh
4. Perform those methods on it
5. Send it back to the GPU via a GLModel
Of course, the draw() method only calls renderer.model() on the GLModel, so displaying the mesh once it's finished computing is smooth as silk.
A roadblock I've hit, however, is step 3. The only way I've found in the GLGraphics javadocs to extract vertex information from a GLModel is the saveVertices(String s) method, which saves the vertex information to an external file. I was hoping for a counterpart to toxiclibs' getMeshAsVertexArray(), but I haven't found one. I have tried creating a mesh in toxiclibs from that binary file, but from the docs it seems toxiclibs can only import binary .STL files. I'm not sure what format GLGraphics writes, but it doesn't appear to be .STL.
So is there a way to easily pull a GLModel mesh into a toxiclibs mesh?
This is my first time toying around with Processing in 3D and OpenGL, so I'd be very grateful for any other advice.
Also, is it fairly straightforward to modify/deform a mesh once it's inside a GLModel? I would only be looking at relatively simple deformations, such as expanding or crushing a sphere. That would be much more ideal than jumping back and forth between GLGraphics and toxiclibs, as there would be much less CPU stuttering.
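For context, the deformations I mean are just per-vertex math, which I imagine could run straight on the flat vertex array before re-uploading it (if GLModel has an update method for that — again, the javadocs would confirm). Uniform expanding/crushing, for instance, is radial scaling about the centre. A sketch assuming 3 floats per vertex, with all names my own:

```java
class Deform {
    // Scale every vertex radially about a centre point (cx,cy,cz):
    // factor > 1 expands the sphere, factor < 1 crushes it inward.
    static void radialScale(float[] verts, float cx, float cy, float cz,
                            float factor) {
        for (int i = 0; i < verts.length; i += 3) {
            verts[i]     = cx + (verts[i]     - cx) * factor;
            verts[i + 1] = cy + (verts[i + 1] - cy) * factor;
            verts[i + 2] = cz + (verts[i + 2] - cz) * factor;
        }
    }
}
```

If that kind of loop can be applied to a GLModel's vertices in place, I could skip the toxiclibs round trip entirely for the simple cases.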
And finally, I plan on each planet having its own dynamic texture on the mesh. That is to say, the texture will be a swirl of colours, with the colours modifiable at runtime, and hopefully other effects too, like the texture moving around the planet to simulate clouds. How best should I go about coding that?
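To make that question concrete, here is the kind of pattern I have in mind, generated into an ARGB pixel array that I'd hope to push into a texture each frame (does GLTexture support updating from a pixel array?). Advancing t per frame would make the swirl drift like clouds. All names here are my own sketch, not any library API:

```java
class SwirlTexture {
    // Fill a w*h ARGB pixel array with a colour swirl; 't' animates it.
    static int[] swirl(int w, int h, float t) {
        int[] px = new int[w * h];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                float dx = x - w / 2f, dy = y - h / 2f;
                float angle = (float) Math.atan2(dy, dx);
                float r = (float) Math.sqrt(dx * dx + dy * dy);
                // Swirl: brightness follows angle plus a radius twist plus time.
                float v = (float) Math.sin(angle * 3 + r * 0.05f + t);
                int c = (int) ((v * 0.5f + 0.5f) * 255);           // 0..255
                px[y * w + x] = 0xFF000000 | (c << 16) | (c << 8) | 255;
            }
        }
        return px;
    }
}
```

Is regenerating pixels on the CPU like this sensible, or is this the point where I should be looking at GLSL shaders instead?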