Hey everyone. After about two weeks fiddling around with the SDK I've finally rolled out my first Android app created with Processing. It's a generative music app based on some particle interactions and physical forces, with multitouch support.
It's a beta version, but so far it seems to work well on a lot of devices (it hasn't crashed violently on anything yet). If you discover a bug, please feel free to post about it. You can read the usage instructions on the market page, but basically: hold a finger down to create a particle, drag to set its launch direction, and release to launch it. All the particles attract and chase each other, and when they collide they play notes based on a repeating electronic chord sequence. It's a first version, but I'm going to update it regularly with more features until I decide to call it a full release.
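If anyone's curious, the hold-drag-release mechanic boils down to very simple vector arithmetic. A minimal sketch in plain Java (the class name and the "power" scaling factor are made up for illustration, not taken from the app):

```java
// Hypothetical sketch of the hold-drag-release launch mechanic:
// the velocity at release is just the drag vector (press point to
// release point), scaled by an arbitrary "power" factor.
public class Launch {
    // Returns {vx, vy} for a particle released after a drag.
    public static float[] launchVelocity(float pressX, float pressY,
                                         float releaseX, float releaseY,
                                         float power) {
        return new float[] {
            (releaseX - pressX) * power,
            (releaseY - pressY) * power
        };
    }
}
```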
Also, I'd like to thank all the people on the forum, because the stuff I learned here was invaluable, and I finally managed to move on today and start compiling Processing stuff for Android in eclipse, and also got libpd working which was a bit of a party-time moment for me. So cheers!
I know what you mean about the ad. I wanted to test the AdMob SDK to learn how to do it (it's literally my first ever Android app), but it's going away in future versions, or at the very least I'm going to make it less intrusive. Also, I haven't been able to test this myself yet, but I suspect the ad doesn't scale down properly on phones, because on my tablet it's a tiny rectangle.
I've gotten reports of it lagging a bit on phones, but that's going to come down to optimizing when I update it. Thanks for checking it out.
I really enjoyed playing with this app! As a suggestion for future versions: if the creatures aren't destroyed outright when they collide, and they have some more movement options, such as flocking and periodic cycles, then the variety of the sound patterns would be much larger.
Great work! (I'm running it on a 10.1 tablet, and it looks very good with the "Stretch to fill screen" option.)
Just out of curiosity, what renderer did you use? P2D, P3D or direct OpenGL?
I removed the ads completely, and there is an option to turn off tails, which greatly improves performance.
Thanks for the ideas, Andres. I'm also planning to update it with more interactions and forces, including particles that repel, merge, or flock together, but my main goal at the moment is to add different sounds. I managed to get Pure Data working properly on Android with libpd, and I'm excited to do some kind of live audio synthesis based on the interactions, which should also help reduce the download size if I use fewer samples. Right now I'm working on a "Continuum-like" microtonal synth to learn how to use it first.
I used A3D (which, as far as I understand, is the Android equivalent of the OPENGL and P3D renderers). I was kind of hoping OpenGL mode would speed up rendering the tails (which are basically lines between points), but for some reason drawing lines is really slow, even if I use beginShape/endShape.
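For reference, batching all the tail segments into a single beginShape(LINES)/endShape() call looks roughly like this (a simplified Processing sketch fragment, not my actual code; the tail arrays are made-up names):

```java
// Simplified Processing fragment: draw every tail segment in one
// batched shape instead of many individual line() calls.
beginShape(LINES);
for (int i = 1; i < tailX.length; i++) {
  vertex(tailX[i - 1], tailY[i - 1]);
  vertex(tailX[i], tailY[i]);
}
endShape();
```

In principle this should cut down on per-call overhead, though in A3D line rendering still seemed slow either way.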
Just did an update fixing some bugs, including one that caused a crash. I also added a slider that lets you scale the force between the particles, and even reverse it (to get repelling particles). I promised really regular updates, but I'm pretty tied up coding some other things that I actually make a living from, and I'm also starting a new degree in a couple of days, so it might get a little hectic. I'm still planning some stuff for this one, though; the next step is to add some real-time audio synthesis.
I didn't really use any libraries at all, just Processing. I wrote my own physics: really simple vector stuff. Each particle has a velocity vector and a vector pointing towards each nearby object; you just add the components and scale them with the inverse square law. The audio is just using the Android MediaPlayer class.
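If it helps anyone, the per-particle update described above can be sketched in plain Java like this (class and field names are mine for illustration, not the app's actual code):

```java
// Minimal sketch of the force accumulation described above: each
// particle sums unit vectors toward its neighbours, scaled by the
// inverse square of the distance, and adds the result to its velocity.
public class Particle {
    float x, y, vx, vy;

    Particle(float x, float y) { this.x = x; this.y = y; }

    // Accumulate attraction toward every other particle.
    void attract(Particle[] others, float strength) {
        for (Particle o : others) {
            if (o == this) continue;
            float dx = o.x - this.x;
            float dy = o.y - this.y;
            float distSq = dx * dx + dy * dy;
            if (distSq < 1e-6f) continue;      // avoid divide-by-zero
            float dist = (float) Math.sqrt(distSq);
            // Unit vector (dx/dist, dy/dist) scaled by strength/distSq.
            vx += strength * dx / (dist * distSq);
            vy += strength * dy / (dist * distSq);
        }
    }

    // Euler step: move by the current velocity.
    void step() { x += vx; y += vy; }
}
```

A negative `strength` gives you repelling particles for free, which is essentially what the force slider in the update does.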
Interesting. Did you have any problems getting the audio to work? I must say after reading about the latency in Android devices I was pleasantly surprised at how responsive the sound is in your app. Do you begin buffering the audio when the objects collide, or do you have to do some kind of pre-buffering and just trigger the sound when the objects collide?
Also, if you have any general tips or thread/website recommendations for developing an Android app with Processing, then I'd be immensely grateful. I've been wanting to do something along these lines for a while, and your app has gone a long way to convincing me that it might be worthwhile.
Oh, and one last question - the menu buttons. Did you write them from scratch? Or are there some good base classes that can be exploited? They always seem such a small part of an app, but are surprisingly tricky to get right (especially, from what I read, with Android & Processing)
Many thanks in advance. And well done again on this excellent creation.
Joesteve51: I disabled screen rotating (in the manifest I think).
petejonze: Hey, sorry I didn't answer your question; I've only just noticed that people have commented on this thread.
"Did you have any problems getting the audio to work? I must say after
reading about the latency in Android devices I was pleasantly surprised
at how responsive the sound is in your app. Do you begin buffering the
audio when the objects collide, or do you have to do some kind of
pre-buffering and just trigger the sound when the objects collide?"
I do it very simply. I just load all the samples into SoundPool, and simply 'play' them when objects collide. It's probably not the best way to do it, but it works. They're all pre-buffered in that regard. The latency is probably high, but it's not noticeable since you're not actually playing anything directly.
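The pattern is roughly this (a simplified sketch, not my actual code; the resource name and stream count are made up):

```java
// Simplified sketch of the SoundPool approach: load every sample up
// front, then just play() the preloaded id on each collision.
SoundPool pool = new SoundPool(8, AudioManager.STREAM_MUSIC, 0);
int noteId = pool.load(context, R.raw.note_c, 1);  // R.raw.note_c is a placeholder

// Later, when two particles collide:
pool.play(noteId, 1.0f, 1.0f, 1, 0, 1.0f);  // L/R volume, priority, no loop, normal rate
```

Because loading happens once at startup, the per-collision cost is just the play() call, which is why the triggering feels responsive despite Android's audio latency.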
"Also if you have any general tips or thread/website recommendations for
developing an Android app with processing then I'd be immensely
grateful. I've been wanting to do something along these lines for a
while, and your app has gone a longway to convincing me that it might be
Assuming you know how to use Processing, it was really simple for me. I know there is now a really good Eclipse plug-in that essentially turns it into a Processing editor, so you can just code as you would in Processing. There is almost nothing different about the way you code your app for Android; it's still just like writing a normal Processing sketch (which is probably what makes it cool). I did it by exporting the Java project from Processing and then loading it in Eclipse.
"Oh, and one last question - the menu buttons. Did you write them from
scratch? Or are there some good base classes that can be exploited? They
always seem such a small part of an app, but are surprisingly tricky to
get right (especially, from what I read, with Android & Processing)"
The menu buttons I wrote myself, although the sliders are taken from the Processing tutorial and adapted to multitouch using my own multitouch handler, which you can find here:
Thanks metabog, I'm looking at this now. It has to be more than just android:screenOrientation="landscape", as this hasn't stopped the app from restarting when I rotate the phone. I think it's because I'm loading a TSV data set and the data is being reinterpreted into attributes that map to width & height. So I'm trying to understand the options available in AndroidManifest.
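For reference, a commonly suggested approach is to declare that the activity handles these configuration changes itself, so Android doesn't restart it on rotation. A manifest fragment (the activity name is a placeholder):

```xml
<!-- In AndroidManifest.xml. With configChanges declared, a rotation
     triggers onConfigurationChanged() instead of recreating the
     activity, so loaded data survives. -->
<activity android:name=".MainActivity"
          android:screenOrientation="landscape"
          android:configChanges="orientation|keyboardHidden" />
```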