I needed some basic gestures for a project and I found the GestureDetector class from the SDK to be unreliable. Also, it only detects pinches. There are probably better solutions out there but I just wanted something with no frills and not a big modular framework.
This code will:
- keep a list of active points for you (like what ak_eric has posted)
- trigger events on: pinch, rotate, drag, tap, double-tap and flick

For pinch, rotate and drag you also get the number of fingers involved, so for instance you could program different behaviors for a 2-finger pinch vs a 4-finger pinch.
Here's the code in case anyone is interested. I have tested it on the Galaxy Tab and the GTablet. I'd like to make a proper library out of it in a week or so when my project is done. In the meantime, comments or suggestions are welcome!
This could also work easily with other multi-touch input sources.
dimavj: you should add the android.jar from the SDK folder
For me, on OS X, it's in /Developer/android-sdk-mac_x86-1.6/platforms/android-12/android.jar
The platform you get the android.jar from should match your build target so that you have a consistent set of Android features available. (For example, if you target pre-3.x, you won't be able to use the Honeycomb animation features.) The MotionEvent class you are looking for is inside this jar.
Hi, I've finally updated my SDK to API 12, and the sketch compiles and executes OK... But it only shows 2 touch areas, so a third or fourth finger is not processed. Well, for the moment that's fine! I'm on an HTC Desire device. Great! A.
Can this code report the size of the contact surface between each finger and the device?
I have been playing a bit with a Magic TrackPad from Apple and this device can report this information but it is coded with a poor resolution (8 or 16 levels only, I don't remember) and I am thinking about an Android device to replace it. Theoretically the Android SDK makes reporting this information possible but so far I haven't heard of any application using this feature.
I do not know which actual Android devices can report this information (it probably varies with the touch controller) or at what resolution. Getting the information as a number of pixels would be great, but is it possible?
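For what it's worth, the Android MotionEvent API does expose this per pointer: getSize() gives a normalized value, and getTouchMajor()/getTouchMinor() (API 9+) approximate the contact ellipse in pixels. How finely these are quantized depends entirely on the device's touch controller, so you'd have to test on real hardware. A rough sketch of where you'd read them (Android-only, won't compile on the desktop):

```java
// Reading contact size per pointer inside Processing's surfaceTouchEvent().
// getSize() is a normalized contact size; getTouchMajor()/getTouchMinor()
// (API 9+) approximate the contact ellipse axes in pixels.
public boolean surfaceTouchEvent(android.view.MotionEvent event) {
  for (int i = 0; i < event.getPointerCount(); i++) {
    float size  = event.getSize(i);        // normalized contact size
    float major = event.getTouchMajor(i);  // ellipse major axis, pixels
    float minor = event.getTouchMinor(i);  // ellipse minor axis, pixels
    // use the values here...
  }
  return super.surfaceTouchEvent(event);
}
```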
Also, I'm trying to use the multi-touch code in a project I sometimes run on my PC. Is it possible to enable the multi-touch code only when running on Android? Even better, it would be amazing to still have the mouse trigger the single-touch events so that I could test my app on the PC.
True, a link to the wiki is probably a good idea :-)
Regarding the PC / Android issue, maybe just structure your code so that both mousePressed() and touchEvent() call the same function containing all the important code? Since mousePressed() on Android is also triggered by touch events, you might have to rely on a single boolean or something simple like that to decide whether you are running on Android (i.e., the Android app could just ignore the calls to mousePressed(), while the PC app would forward them to your function).
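The dispatch pattern above could look something like this as plain Java (the class and method names here are hypothetical, just to show the shape):

```java
// Both input paths call one shared handler; a single flag decides whether
// mouse events are ignored (on Android they duplicate the touch events).
class GestureInput {
  final boolean runningOnAndroid; // set once, e.g. from a constant you flip
  int pressCount = 0;             // how many presses the shared handler saw

  GestureInput(boolean runningOnAndroid) {
    this.runningOnAndroid = runningOnAndroid;
  }

  // Called from Processing's mousePressed(); on Android this also fires for
  // touches, so we ignore it there to avoid double-counting.
  void onMousePressed(int x, int y) {
    if (runningOnAndroid) return;
    handlePress(x, y);
  }

  // Called from the touch event handler on Android.
  void onTouchDown(int x, int y) {
    handlePress(x, y);
  }

  // All the important code lives here, shared by both platforms.
  void handlePress(int x, int y) {
    pressCount++;
  }
}
```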
That would work for generic touches, but is there an easy way to get it to recognize flick gestures from my mouse for example?
Also, do you know how to get your MultiTouch project to run in target>Standard mode on the PC (as opposed to target>Android)? When I hit run I get the error "Cannot find a class or type named "MotionEvent"", which I'm assuming is some Android-specific class.
If you want the code to run in standard mode, you'll want to replace surfaceTouchEvent(), which comes from the Android SDK, with mousePressed(), mouseReleased() and mouseDragged() feeding the right information into the touch processor (pointDown, pointUp and pointMoved, respectively). Just make it so that the ID is always 1, since you'll only have one point.
Now I realize there won't be an easy way to compile for both platforms without having to first comment/uncomment some code. But I think it should be limited to swapping the aforementioned functions.
I also had to manually add a size() call in the setup() because the window was appearing really small.
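The swap described above looks roughly like this; the MinimalTouchProcessor here is a stand-in I wrote for illustration (the thread's touch processor has pointDown, pointUp and pointMoved, but its internals aren't shown), and only the adapter pattern matters. Mouse events always use pointer ID 1, since the mouse is the only "finger".

```java
// Desktop adapters feeding mouse events into a touch processor.
class MinimalTouchProcessor {
  // last known point positions by ID, just enough to show the calls arriving
  java.util.Map<Integer, double[]> points = new java.util.HashMap<>();

  void pointDown(int id, double x, double y)  { points.put(id, new double[]{x, y}); }
  void pointMoved(int id, double x, double y) { points.put(id, new double[]{x, y}); }
  void pointUp(int id)                        { points.remove(id); }

  // Call these from mousePressed(), mouseDragged() and mouseReleased()
  // in the sketch; the ID is always 1.
  void mousePressedAdapter(double x, double y) { pointDown(1, x, y); }
  void mouseDraggedAdapter(double x, double y) { pointMoved(1, x, y); }
  void mouseReleasedAdapter()                  { pointUp(1); }
}
```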
For extra points, I added the ability to get multi-drag if you hold down shift when dragging the mouse, and the ability to rotate or pinch/zoom when pressing the alt/option key, using the iOS-simulator-style reflection about the initial mouse-down point. Feel free to add this into your awesome core code, David.
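The reflection trick mentioned above boils down to mirroring the mouse position through the initial mouse-down point, so the "second finger" moves symmetrically opposite the real one. A tiny helper (hypothetical name, just the math):

```java
// Second simulated finger = mouse position mirrored through the anchor point
// where the mouse first went down, iOS-simulator style.
class MirrorFinger {
  final double anchorX, anchorY; // initial mouse-down point

  MirrorFinger(double anchorX, double anchorY) {
    this.anchorX = anchorX;
    this.anchorY = anchorY;
  }

  // Given the real mouse position, return the mirrored second touch point.
  double[] mirrored(double mouseX, double mouseY) {
    return new double[] { 2 * anchorX - mouseX, 2 * anchorY - mouseY };
  }
}
```

Dragging away from the anchor then reads as a symmetric pinch, and orbiting it reads as a rotation.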
My fork of PeasyCam will run on Android, along with additional features, but I haven't yet added any gestures. Would love it if you wanted to contribute. :) I just haven't had the time to spend on it as it's not anything I need at the moment.
I know the thread is about a year old… but I wanted to say thank you for that nice piece of code. I tested it on my Samsung Galaxy S Plus and it works like a charm. I'm gonna use it in my next Android project (with big credits to you). ;)