Here is a simple program that I use to teach computational physics to non-programmers. It's an n-body code that uses a vortex kernel in 2D to draw lines (each particle leaves a trail of small, translucent circles). The initial positions and strengths are set to generate a fuzzy, growing, spirograph-like wreath in the middle of the screen.
// use this many particles
int num = 400;
// this variable scales how fast everything moves
float dt = 2.0;

// these arrays store the parameters for each particle
float[] x = new float[num];
float[] y = new float[num];
float[] s = new float[num];

// this gets run only once, before any iteration begins
void setup() {
  size(1280, 960);

  for (int i=0; i<num; i=i+1) {
    // create the particles randomly in the middle of the screen
    //x[i] = random(100, 540);
    //y[i] = random(100, 380);
    // or, create them along a circle
    float rad = height/3.5;
    float theta = i*2.0*3.1415927/num;
    x[i] = width/2.0 + rad*sin(theta);
    y[i] = height/2.0 + rad*cos(theta);
    // here we set the strength, centered on zero
    s[i] = random(-1, 1);
  }

  frameRate(60);
  background(50);
}

// this gets called once for each frame
void draw() {

  // turn off outlines
  noStroke();

  // set the color for all particles
  // four numbers are: red, green, blue, alpha/transparency
  // each covers a range from 0 to 255
  fill(255, 255, 255, 12);

  // loop over the vortex particles
  for (int i=0; i<num; i=i+1) {

    // compute the new velocity of each particle
    float velx = 0.0;
    float vely = 0.0;
    for (int j=0; j<num; j=j+1) {
      // particle j affects particle i's speed based on
      // a function of the distance and the direction
      // between the two particles
      float dx = x[j] - x[i];
      float dy = y[j] - y[i];
      // the 1.0 here is a smoothing parameter
      float distsq = dx*dx + dy*dy + 1.0;
      // this kernel looks like attraction/repulsion
      //velx = velx + s[j]*dx/distsq;
      //vely = vely + s[j]*dy/distsq;
      // this kernel looks like fluid dynamics
      velx = velx - s[j]*dy/distsq;
      vely = vely + s[j]*dx/distsq;
    }

    // move each particle according to its new velocity
    x[i] = x[i] + dt*velx;
    y[i] = y[i] + dt*vely;

    // and draw each particle as a 2x2 pixel circle
    ellipse(x[i], y[i], 2.0, 2.0);
  }
}
I am helping some middle-school kids learn math, physics, and programming on Raspberry Pi computers (they love them!), and I would like them to be able to hit 'p' during execution to drop a frame to a file. This is Processing 2.0b8 on Raspbian. Here is the requisite code:
void keyPressed() {
if (key == 'p') {
saveFrame();
//saveFrame("img_vortex_#####.jpg"); // just as unsuccessful
}
}
But this is the error that I get:
Exception in thread "Animation Thread" java.lang.ClassCastException: [I cannot be cast to [S
at sun.awt.image.ShortInterleavedRaster.getDataElements(ShortInterleavedRaster.java:293)
at processing.core.PGraphicsJava2D.loadPixels(PGraphicsJava2D.java:2308)
at processing.core.PImage.save(PImage.java:3209)
at processing.core.PApplet.saveFrame(PApplet.java:4063)
at vortex_advanced.keyPressed(vortex_advanced.java:95)
at processing.core.PApplet.keyPressed(PApplet.java:3339)
at processing.core.PApplet.handleKeyEvent(PApplet.java:3157)
at processing.core.PApplet.dequeueEvents(PApplet.java:2606)
at processing.core.PApplet.handleDraw(PApplet.java:2277)
at processing.core.PGraphicsJava2D.requestDraw(PGraphicsJava2D.java:243)
at processing.core.PApplet.run(PApplet.java:2140)
at java.lang.Thread.run(Thread.java:722)
Any ideas how I can successfully trigger a frame dump to a file?
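One idea I haven't tried yet is to render into an offscreen createGraphics() buffer and save that instead of the screen, on the guess that the offscreen raster avoids whatever the Pi's 16-bit screen surface trips over. A minimal, untested sketch of the idea, with a placeholder ellipse where the vortex drawing would go:

// untested workaround sketch: draw into an offscreen buffer, save the buffer
PGraphics pg;

void setup() {
  size(1280, 960);
  pg = createGraphics(width, height, JAVA2D);  // offscreen Java2D buffer
  pg.beginDraw();
  pg.background(50);
  pg.endDraw();
}

void draw() {
  pg.beginDraw();
  pg.noStroke();
  pg.fill(255, 255, 255, 12);
  // placeholder drawing; the vortex loop would call pg.ellipse(...) here
  pg.ellipse(random(width), random(height), 2.0, 2.0);
  pg.endDraw();
  // show the buffer on screen
  image(pg, 0, 0);
}

void keyPressed() {
  if (key == 'p') {
    // save the buffer rather than the screen surface
    pg.save("img_vortex_" + frameCount + ".png");
  }
}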
I used Processing to implement Adam Frank's "Performer," now running 10am-8pm every day in Anita's Way, between 42nd and 43rd Streets, just east of Broadway and Times Square. It will run until late November.
It uses GSVideo to capture and process a live video stream, converts the amount of foreground and motion to a series of numbers, and triggers one of 100 sound files to play. At its core it is a simple program, though making the sound files trigger realistically and mixing the results took some care. The result is that "performing" in the spotlight earns a potentially enthusiastic response of applause and cheering.
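Roughly, the motion-measuring core boils down to frame differencing. Here is a stripped-down illustration only (the GSCapture setup follows the standard capture examples, the scaling constants are arbitrary, and triggerSound() is a placeholder; the real triggering and mixing goes through sox and is considerably fussier):

import codeanticode.gsvideo.*;

GSCapture cam;
int[] prevFrame;

void setup() {
  size(640, 480);
  cam = new GSCapture(this, width, height);
  // cam.start();  // uncomment on GSVideo versions where capture doesn't start automatically
  prevFrame = new int[width*height];
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();

    // sum the brightness change against the previous frame
    float motion = 0;
    for (int i = 0; i < cam.pixels.length; i++) {
      motion += abs(brightness(cam.pixels[i]) - brightness(prevFrame[i]));
      prevFrame[i] = cam.pixels[i];
    }

    // map the total motion onto one of the 100 sound files
    // (the threshold and the 5000000.0 full-scale value are just illustrative)
    if (motion > 100000.0) {
      int index = int(constrain(map(motion, 0, 5000000.0, 0, 99), 0, 99));
      triggerSound(index);
    }

    image(cam, 0, 0);
  }
}

void triggerSound(int index) {
  // placeholder: the real program hands this off to the sox-based playback
  println("would trigger sound " + index);
}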
Thanks must go to a number of people: the developers of Processing, GSVideo, and sox; forum users bustup, akiersky, and especially andres.
I am on OS X 10.6.8 with the latest stable Processing and (32-bit) GSVideo. My sketch works from within Processing (video capture and processing, sound output), but when I export an application and run it separately, there is no video, and the Console shows the following error:
10/4/11 2:19:53 PM[0x0-0x2b02b].performer[279]Exception in thread "Animation Thread" java.lang.UnsatisfiedLinkError: Unable to load library 'gstreamer-0.10': dlopen(libgstreamer-0.10.dylib, 9): image not found
I would like, eventually, to run this program at startup, but I need a standalone application in order to do that. Does anyone know what the problem could be?
[Processing 1.5.1 on Fedora 15 Linux, using Oracle's 64-bit Java libraries]
When I run the Minim examples, I hear no sound. For example, FrequencyEnergy returns the error:
and appears to run, but no sound emerges from the speakers. The same goes for my own program. On the other hand, when I use GSVideo for sound (the example program GSVideo->Player->Audio), I can hear sound just fine.
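For reference, the Minim code that stays silent for me is just the standard pattern, something like this (the file name is only an example, taken from the library's own examples):

import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  // any audio file in the sketch's data folder
  player = minim.loadFile("groove.mp3");
  player.play();
}

void draw() {
  background(0);
}

void stop() {
  // clean up the audio resources when the sketch closes
  player.close();
  minim.stop();
  super.stop();
}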
I'll eventually need to play and overlap 8-14 sound clips, so I really hope GSVideo is up to the task, and I hope it also works well on OSX (the production machine).
Shouldn't Minim, being the default sound library, work on Linux if GSVideo does? And why would it not work?
I would like to develop a Processing program on my Linux laptop but have it run on a Mac Mini. Unfortunately, the Linux version only supports video processing through GSVideo. I am not thrilled about using a non-standard video library on the Mac as well, and I don't want to maintain two different programs. To solve this, I would like to place preprocessor directives in my Processing code to swap between GSVideo calls when the program runs on Linux and QuickTime calls when it runs on OSX. Is this possible?
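If not (Processing sketches compile down to plain Java, which has no preprocessor), my fallback idea is a runtime check on the platform constant that Processing provides, along these lines, though I assume both libraries' imports and jars would still have to be present on each machine, since the imports themselves can't be made conditional:

// sketch of a runtime platform check instead of a preprocessor directive;
// platform, LINUX, and MACOSX are constants provided by Processing
String videoBackend;

void setup() {
  size(640, 480);

  if (platform == LINUX) {
    videoBackend = "GSVideo";     // would construct the GSCapture here
  } else if (platform == MACOSX) {
    videoBackend = "QuickTime";   // would construct the standard video Capture here
  } else {
    videoBackend = "unsupported";
  }

  println("video backend: " + videoBackend);
}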