Can someone let me know why this library is not working with Processing 3.3.5 while it works on 3.3.4?

The project is a large LED panel installation across 40 panels. The code will run through Processing on a Raspberry Pi, so I am using this Pi-compatible version someone helped me find. I have one video which runs almost successfully on these 40 panels with a bit of lag (which isn't the worst for the project), but it is not the video I intend to use! Comparing it to the video I do want to use, I am not sure what the difference is. I made the first video, which works, a few months ago, so I can't remember if I did anything different, but both videos were made in Final Cut (MOV) and then converted to MP4 online. In Final Cut I made the dimensions of both videos 640 x 360, and both were H.264. The first video moves a bit slower and looks more lo-res, but I believe those to be qualities of the original source before I imported and edited it in Final Cut.

I believe the problem is related to the amount/size of data being processed. I should mention that both videos will play according to Processing and be recognized by the panels, but the issue is that the video I want to use does not play consistently. One Teensy may send its part of the video to the LEDs mostly correctly, while another Teensy's LEDs only start to pick up after 10 seconds and then flicker through a spectrum of random colors. None of the Teensys seem to be synced up with this second video (though all Teensys are mounted on the OctoBoards and connected to each other to sync).
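For what it's worth, a quick back-of-envelope calculation of the serial data volume helps put "amount of data" in concrete terms. The panel size below is made up for illustration; the real per-Teensy dimensions come from each board's VideoDisplay config reply:

```java
public class LedBandwidth {
    public static void main(String[] args) {
        // Hypothetical per-Teensy LED image size; the real values come from
        // each Teensy's VideoDisplay config reply, not from this sketch.
        int w = 60, h = 32;
        int fps = 30;

        // movie2serial sends (width * height * 3) + 3 bytes per port per frame
        int bytesPerFrame = (w * h * 3) + 3;
        int bytesPerSecond = bytesPerFrame * fps;

        System.out.println("bytes/frame:  " + bytesPerFrame);   // 5763
        System.out.println("bytes/second: " + bytesPerSecond);  // 172890
    }
}
```

If the per-port figure gets anywhere near what the serial link can actually sustain, desync and flicker of exactly the kind described become plausible.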

I'm wondering if anyone has any advice for what to do to the video to make it play correctly.

I have tried using HandBrake to change some qualities of the video, but I have no clue what I'm doing, really, and all of my tests have yielded no results.

I also noticed that the flickering happens more when there's supposed to be a bigger change in color.

I suppose the problem could relate to how many rows I have, I'm not sure, but the other video does seem to be working mostly properly. I am inserting some photos that show the project; some of the panels still need some work, so that's why one is pink (wrong RGB-ordered lights) and the all-white ones just have no data input yet.

Anyone have any ideas?

I hope I entered this code correctly...

```
/* OctoWS2811 movie2serial.pde - Transmit video data to 1 or more
Teensy 3.0 boards running OctoWS2811 VideoDisplay.ino
http://www.pjrc.com/teensy/td_libs_OctoWS2811.html
Copyright (c) 2013 Paul Stoffregen, PJRC.COM, LLC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
*/
// To configure this program, edit the following sections:
//
// 1: change myMovie to open a video file of your choice ;-)
//
// 2: edit the serialConfigure() lines in setup() for your
// serial device names (Mac, Linux) or COM ports (Windows)
//
// 3: if your LED strips have unusual color configuration,
// edit colorWiring(). Nearly all strips have GRB wiring,
// so normally you can leave this as-is.
//
// 4: if playing 50 or 60 Hz progressive video (or faster),
// edit framerate in movieEvent().
//import processing.video.*;
import gohai.glvideo.*;
import processing.serial.*;
import java.awt.Rectangle;
GLMovie myMovie;
float gamma = 1.7;
int numPorts=0; // the number of serial ports in use
int maxPorts=24; // maximum number of serial ports
Serial[] ledSerial = new Serial[maxPorts]; // each port's actual Serial port
Rectangle[] ledArea = new Rectangle[maxPorts]; // the area of the movie each port gets, in % (0-100)
boolean[] ledLayout = new boolean[maxPorts]; // layout of rows, true = even is left->right
PImage[] ledImage = new PImage[maxPorts]; // image sent to each port
int[] gammatable = new int[256];
int errorCount=0;
float framerate=0;
void setup() {
String[] list = Serial.list();
delay(20);
println("Serial Ports List:");
println(list);
//serialConfigure("/dev/ttyACM1"); // change these to your port names
//serialConfigure("/dev/cu.usbmodem3550481");
serialConfigure("/dev/tty.usbmodem3999691");
//serialConfigure("/dev/cu.usbmodem3550481");
//serialConfigure("/dev/ttyACM0");
serialConfigure("/dev/tty.usbmodem3645941");
serialConfigure("/dev/tty.usbmodem3694501");
serialConfigure("/dev/tty.usbmodem3766451");
if (errorCount > 0) exit();
for (int i=0; i < 256; i++) {
gammatable[i] = (int)(pow((float)i / 255.0, gamma) * 255.0 + 0.5);
}
size(480, 400, P2D); // create the window
String mpath = sketchPath() + "/../../../media/x.mp4";
println(mpath);
myMovie = new GLMovie(this, mpath);
myMovie.loop(); // start the movie :-)
}
// movieEvent runs for each new frame of movie data
void movieEvent(GLMovie m) {
// read the movie's next frame
m.read();
//if (framerate == 0) framerate = m.getSourceFrameRate();
framerate = 30; // TODO, how to read the frame rate???
for (int i=0; i < numPorts; i++) {
// copy a portion of the movie's image to the LED image
int xoffset = percentage(m.width, ledArea[i].x);
int yoffset = percentage(m.height, ledArea[i].y);
int xwidth = percentage(m.width, ledArea[i].width);
int yheight = percentage(m.height, ledArea[i].height);
ledImage[i].copy(m, xoffset, yoffset, xwidth, yheight,
0, 0, ledImage[i].width, ledImage[i].height);
// convert the LED image to raw data
byte[] ledData = new byte[(ledImage[i].width * ledImage[i].height * 3) + 3];
image2data(ledImage[i], ledData, ledLayout[i]);
if (i == 0) {
ledData[0] = '*'; // first Teensy is the frame sync master
int usec = (int)((1000000.0 / framerate) * 0.75);
ledData[1] = (byte)(usec); // request the frame sync pulse
ledData[2] = (byte)(usec >> 8); // at 75% of the frame time
} else {
ledData[0] = '%'; // others sync to the master board
ledData[1] = 0;
ledData[2] = 0;
}
// send the raw data to the LEDs :-)
ledSerial[i].write(ledData);
}
}
// image2data converts an image to OctoWS2811's raw data format.
// The number of vertical pixels in the image must be a multiple
// of 8. The data array must be the proper size for the image.
void image2data(PImage image, byte[] data, boolean layout) {
int offset = 3;
int x, y, xbegin, xend, xinc, mask;
int linesPerPin = image.height / 8;
int pixel[] = new int[8];
for (y = 0; y < linesPerPin; y++) {
if ((y & 1) == (layout ? 0 : 1)) {
// even numbered rows are left to right
xbegin = 0;
xend = image.width;
xinc = 1;
} else {
// odd numbered rows are right to left
xbegin = image.width - 1;
xend = -1;
xinc = -1;
}
for (x = xbegin; x != xend; x += xinc) {
for (int i=0; i < 8; i++) {
// fetch 8 pixels from the image, 1 for each pin
pixel[i] = image.pixels[x + (y + linesPerPin * i) * image.width];
pixel[i] = colorWiring(pixel[i]);
}
// convert 8 pixels to 24 bytes
for (mask = 0x800000; mask != 0; mask >>= 1) {
byte b = 0;
for (int i=0; i < 8; i++) {
if ((pixel[i] & mask) != 0) b |= (1 << i);
}
data[offset++] = b;
}
}
}
}
// translate the 24 bit color from RGB to the actual
// order used by the LED wiring. GRB is the most common.
int colorWiring(int c) {
int red = (c & 0xFF0000) >> 16;
int green = (c & 0x00FF00) >> 8;
int blue = (c & 0x0000FF);
red = gammatable[red];
green = gammatable[green];
blue = gammatable[blue];
return (green << 16) | (red << 8) | (blue); // GRB - most common wiring
}
// ask a Teensy board for its LED configuration, and set up the info for it.
void serialConfigure(String portName) {
if (numPorts >= maxPorts) {
println("too many serial ports, please increase maxPorts");
errorCount++;
return;
}
try {
ledSerial[numPorts] = new Serial(this, portName);
if (ledSerial[numPorts] == null) throw new NullPointerException();
ledSerial[numPorts].write('?');
} catch (Throwable e) {
println("Serial port " + portName + " does not exist or is non-functional");
errorCount++;
return;
}
delay(250);
String line = ledSerial[numPorts].readStringUntil(10);
if (line == null) {
println("Serial port " + portName + " is not responding.");
println("Is it really a Teensy 3.0 running VideoDisplay?");
errorCount++;
return;
}
String param[] = line.split(",");
if (param.length != 12) {
println("Error: port " + portName + " did not respond to LED config query");
errorCount++;
return;
}
// only store the info and increase numPorts if Teensy responds properly
ledImage[numPorts] = new PImage(Integer.parseInt(param[0]), Integer.parseInt(param[1]), RGB);
ledArea[numPorts] = new Rectangle(Integer.parseInt(param[5]), Integer.parseInt(param[6]),
Integer.parseInt(param[7]), Integer.parseInt(param[8]));
ledLayout[numPorts] = (Integer.parseInt(param[5]) == 0);
numPorts++;
}
// draw runs every time the screen is redrawn - show the movie...
void draw() {
if (myMovie.available()) {
movieEvent(myMovie);
}
// show the original video
image(myMovie, 0, 80);
// then try to show what was most recently sent to the LEDs
// by displaying all the images for each port.
for (int i=0; i < numPorts; i++) {
// compute the intended size of the entire LED array
int xsize = percentageInverse(ledImage[i].width, ledArea[i].width);
int ysize = percentageInverse(ledImage[i].height, ledArea[i].height);
// computer this image's position within it
int xloc = percentage(xsize, ledArea[i].x);
int yloc = percentage(ysize, ledArea[i].y);
// show what should appear on the LEDs
image(ledImage[i], 240 - xsize / 2 + xloc, 10 + yloc);
}
}
// respond to mouse clicks as pause/play
boolean isPlaying = true;
void mousePressed() {
if (isPlaying) {
myMovie.pause();
isPlaying = false;
} else {
myMovie.play();
isPlaying = true;
}
}
// scale a number by a percentage, from 0 to 100
int percentage(int num, int percent) {
double mult = percentageFloat(percent);
double output = num * mult;
return (int)output;
}
// scale a number by the inverse of a percentage, from 0 to 100
int percentageInverse(int num, int percent) {
double div = percentageFloat(percent);
double output = num / div;
return (int)output;
}
// convert an integer from 0 to 100 to a float percentage
// from 0.0 to 1.0. Special cases for 1/3, 1/6, 1/7, etc
// are handled automatically to fix integer rounding.
double percentageFloat(int percent) {
if (percent == 33) return 1.0 / 3.0;
if (percent == 17) return 1.0 / 6.0;
if (percent == 14) return 1.0 / 7.0;
if (percent == 13) return 1.0 / 8.0;
if (percent == 11) return 1.0 / 9.0;
if (percent == 9) return 1.0 / 11.0;
if (percent == 8) return 1.0 / 12.0;
return (double)percent / 100.0;
}
```
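For readers trying to follow the sketch above, here is a stand-alone Java illustration of what colorWiring() and image2data() actually compute. The sample pixel values are made up; the GRB example skips the gamma step so the result is easy to check by eye:

```java
public class OctoDataDemo {
    public static void main(String[] args) {
        // Rebuild the sketch's gamma table (gamma = 1.7)
        int[] gammatable = new int[256];
        for (int i = 0; i < 256; i++) {
            gammatable[i] = (int) (Math.pow(i / 255.0, 1.7) * 255.0 + 0.5);
        }
        // Endpoints are preserved; midtones come out darker
        System.out.println(gammatable[0]);    // 0
        System.out.println(gammatable[255]);  // 255

        // colorWiring: RGB in, GRB out (gamma omitted here for clarity)
        int c = 0x112233;
        int red = (c & 0xFF0000) >> 16, green = (c & 0x00FF00) >> 8, blue = c & 0xFF;
        int grb = (green << 16) | (red << 8) | blue;
        System.out.printf("0x%06X%n", grb);   // 0x221133

        // image2data's inner loop: one mask step packs the same bit of
        // 8 pixels (one per OctoWS2811 pin) into a single output byte
        int[] pixel = new int[8];
        pixel[0] = 0x800000;                  // only pin 0 has this bit set
        byte b = 0;
        for (int i = 0; i < 8; i++) {
            if ((pixel[i] & 0x800000) != 0) b |= (1 << i);
        }
        System.out.println(b);                // 1
    }
}
```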


I have a commercial 3.5″ screen for the Raspberry Pi; the resolution is 480x320.

When designing I set size(480, 320, P2D), but the size of the sketch window that appears does not correspond to the real dimensions of the screen. I need to be able to get an idea of the final result while designing on my laptop or my desktop PC, and I need the real dimensions when I run the Processing sketch.

Regards,

jcduino.

I am having trouble getting sound to play through my USB headset using Processing for Raspberry Pi. I wrote a very simple piece of code. When I plug headphones into the 3.5mm jack, I hear the sound. When I plug in my USB headset, I hear nothing. I have switched the default sound to my Logitech USB headset in Menu → Audio Device Settings. I hear sound from the web, so I know the headset is working. But alas, Processing seems unable to find it.

Thank you for your ideas! Here's my code:

```
import ddf.minim.*;
import java.io.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(100, 100);
  minim = new Minim(this);
  player = minim.loadFile("operator.mp3");
  player.play();
}

void draw() {
  background(0);
  stroke(255);
  player.play();
}
```

Thank you for your time! Vangelis

I haven't got horizontal keystone correction on my projector, and I want to project onto a surface that is not directly in front of the projector.

My idea is to go through a Processing sketch running on a Raspberry Pi: basically, output a corner-pinned version of the desktop to my projector connected through HDMI, using the Keystone library.

I just need to be able to grab the screen in some way.

Is there any library for capturing the screen and streaming it?

Thanks

What is the problem there? Thanks

I'm trying to make a colour screen that changes by gradient when physical sensors are touched. The goal is to have 8 different colours and an object with 8 sensors that change the colour of the projection. I have an Arduino and a Raspberry Pi, and ideally it would work from just these, with the output going to a projector (I'm going to be leaving it in an exhibition space, so I can't leave my computer there).
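For the gradient part, Processing's built-in lerpColor() is the natural tool; the underlying per-channel interpolation, sketched here in plain Java with hypothetical colours, is just:

```java
public class GradientDemo {
    // Linear interpolation between two packed 0xRRGGBB colours,
    // with t in [0, 1] -- the value you'd drive from a sensor reading.
    static int lerpRGB(int c1, int c2, float t) {
        int r = (int) (((c1 >> 16) & 0xFF) + ((((c2 >> 16) & 0xFF) - ((c1 >> 16) & 0xFF)) * t));
        int g = (int) (((c1 >> 8) & 0xFF) + ((((c2 >> 8) & 0xFF) - ((c1 >> 8) & 0xFF)) * t));
        int b = (int) ((c1 & 0xFF) + (((c2 & 0xFF) - (c1 & 0xFF)) * t));
        return (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        int red = 0xFF0000, blue = 0x0000FF;
        // Halfway between red and blue
        System.out.printf("0x%06X%n", lerpRGB(red, blue, 0.5f)); // 0x7F007F
    }
}
```

Each of the 8 sensors could simply pick a target colour, with the sketch easing t toward 1 while the sensor is touched.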

Cheers, Des

Here is the problematic sketch:

```
import gohai.glvideo.GLMovie;
import gohai.glvideo.PerspectiveTransform;
import gohai.glvideo.WarpPerspective;
import java.awt.geom.Point2D;

GLMovie movie, movie2, source;
PShape sh;
float xRot = 0;

void setup() {
  movie = new GLMovie(this, "/home/pi/sketchbook/libraries/glvideo/examples/SingleVideo/data/launch1.mp4");
  movie.loop();
  movie2 = new GLMovie(this, "/home/pi/sketchbook/libraries/glvideo/examples/SingleVideo/data/launch1.mp4");
  movie2.loop();
  source = movie;
  frameRate(15);
  size(640, 480, P3D);
}

void draw() {
  if (movie.available()) {
    movie.read();
  }
  if (movie2.available()) {
    movie2.read();
  }
  background(0);
  createLayer();
  sh.rotateX(xRot);
  shape(sh);
}

void keyPressed() {
  if (key == 'm') {
    if (source == movie)
      source = movie2;
    else
      source = movie;
  }
  if (key == 'x') {
    xRot += .1;
  }
}

void createLayer() {
  sh = createShape();
  sh.beginShape();
  sh.texture(source);
  sh.vertex(0, 0, 0, 0);
  sh.vertex(source.width, 0, source.width, 0);
  sh.vertex(source.width, source.height, source.width, source.height);
  sh.vertex(0, source.height, 0, source.height);
  sh.endShape();
}
```

I'm using an Arduino as an ADC, which sends data through UART to a Raspberry Pi Model B, which draws a graph on a 7″ display connected via HDMI.

So I've got a strange problem: randomly, straight lines appear from the current measuring point to the top of the screen.

I was thinking about mistakes, zero values, NaN values or something like that. But that's not it. :-)

The lines are drawn like this:

`line(xPos-1, height-lastfValue, xPos, height - fValue);`

I was trying to filter it out by using such a construction:

```
if (fValue <= 0 || (height - fValue) > height || ((fValue - lastfValue)*(fValue - lastfValue)/2) > lastfValue + 5) {
  fValue = lastfValue;
}
```

It helped a little, but it didn't become a solution to my problem. So I tried to figure out the problem, added PDF export of what I see on the screen and logged the values of fValue to the console, and found out:

The values:

```
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
146.04106
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
145.7478
```

and the graph:

As you can see, there is only one different value, yet two peaks appear on the graph.

Does anyone know how to deal with it and get a correct graph?

This code behaves the same way on my notebook.

Here is the code I use for Processing on the Raspberry Pi:

```
import processing.serial.*;
import processing.pdf.*;

Serial myPort;     // the serial port
int xPos = 1;      // horizontal position of the graph
float fValue;
float lastfValue;
boolean newVal = false;

void setup() {
  size(1000, 300);
  // fullScreen();
  println(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 115200);
  myPort.bufferUntil('\n');
  background(28, 28, 28);
  beginRecord(PDF, "rawData.pdf");
}

void draw() {
  // if (newVal) {
  stroke(255, 16, 0);
  // size(100, 100, P3D);
  // noSmooth();
  //ellipse(xPos, height - fValue, 1, 1);
  line(xPos - 1, height - lastfValue, xPos, height - fValue);
  println(fValue);
  lastfValue = fValue;
  //fValue = (fValue - 3);
  //stroke(28, 28, 28);
  //point(xPos, height - fValue);
  if (++xPos >= width) {
    xPos = 0;
    background(28, 28, 28);
    // }
    // newVal = false;
  }
}

void serialEvent(Serial myPort) {
  String inString = myPort.readStringUntil('\n');
  if (inString != null) {
    inString = trim(inString);
    fValue = float(inString);
    fValue = map(fValue, 0, 1023, 0, height);
    //if (fValue <= 0 || (height - fValue) > height || ((fValue - lastfValue)*(fValue - lastfValue)/2) > lastfValue + 5) {
    //  fValue = lastfValue;
    //}
  }
  // newVal = true;
}

void keyPressed() {
  if (key == ' ') {
    endRecord();
    exit();
  }
}
```

Here is the code for the Arduino Uno that I use as the ADC:

```
// the setup routine runs once when you press reset:
void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
}

// the loop routine runs over and over again forever:
void loop() {
  // read the input on analog pin 0:
  int sensorValue = analogRead(A0);
  // print out the value you read:
  Serial.println(sensorValue);
  delay(1); // delay in between reads for stability
}
```
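One defensive trick worth trying while debugging (a sketch of the idea, not a diagnosis of the root cause): pass each sample through a median-of-three filter before plotting, so a single bad reading can never draw a line to the top of the screen, while genuine level changes still pass through with one sample of delay. In plain Java:

```java
public class MedianFilter {
    // Median of three values without sorting: a lone outlier among
    // three consecutive samples is discarded.
    static float median3(float a, float b, float c) {
        return Math.max(Math.min(a, b), Math.min(Math.max(a, b), c));
    }

    public static void main(String[] args) {
        float prev2 = 145.7f, prev1 = 145.7f;
        float spike = 300.0f; // a single bad reading
        System.out.println(median3(prev2, prev1, spike)); // prints 145.7
    }
}
```

In the sketch, one would keep the last two raw samples and plot median3(sample[n-2], sample[n-1], sample[n]) instead of the raw value.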

Here's that piece of code so far:

```
import processing.io.*;
I2C compass;
void setup() {
compass = new I2C(I2C.list()[0]);
compass.beginTransmission(0x51);
compass.write(0x00);
compass.write(0x00);
compass.endTransmission();
delay(100);
compass.beginTransmission(0x51);
compass.write(0x00);
byte[] in = compass.read(1);
println(in);
}
```

And the result I get from this is 0x30 instead of 0x00...

Any help would be much appreciated! Vangelis

P.S. Yes, i2cdetect shows me the EEPROM on address 0x51. :D
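A side note on interpreting that result: Java bytes are signed, and println() on a byte array is easy to misread, so it helps to dump each byte as hex explicitly. Note also that 0x30 happens to be the ASCII code for the character '0', which may or may not be a coincidence worth ruling out here:

```java
public class ByteDump {
    public static void main(String[] args) {
        byte[] in = { 0x30 };  // the value read back in the post
        for (byte b : in) {
            // mask with 0xFF so negative bytes don't sign-extend
            System.out.printf("0x%02X ", b & 0xFF);
        }
        System.out.println();
        // 0x30 is also the ASCII code for '0'
        System.out.println((char) 0x30);  // prints 0
    }
}
```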

I'm a beginner with my Raspberry Pi 3 Model B, and I have a question: I have the latest versions of Raspbian and Processing, and my frame rate with Processing is very low, about 17 fps with an empty sketch like this:

```
void setup() {
  size(1600, 800);
}

void draw() {
  background(0);
  println(frameRate);
}
```

My question: is this normal?

With even a minimally complex sketch the fps is ridiculous. I tried a bunch of things and nothing worked. Using OpenGL (P2D) I get a few more fps, but also a big mouse-movement lag. I'm starting to think that my Pi is defective, I don't know!

Can anyone help me, please?

Is there any way to adapt these features from that library? Or is there a way I can code those features myself?

I've managed to make the mask (the depth-to-RGB image), as you can see in this forum thread: https://forum.processing.org/two/discussion/25414/how-align-textures-using-glsl-for-depth-and-rgb-of-kinect#latest

But the mask is very, very noisy, and apart from that I would really need to use skeleton tracking in my app.

The Pi is booting into the GUI. I'm using a simple sketch with no dependencies as a tester, PenroseTile from the examples, and can't get it to work. I'm not great with Linux, so maybe I'm missing something basic. I'm trying to get this command to execute from a systemd .service file (the rc.local approach didn't work either):

```
DISPLAY=:0 /usr/local/bin/processing-3.3.3/processing-java \
  --sketch=/home/pi/Desktop/PenroseTile/ --run > /home/pi/startup.log 2>&1 &
```

It works from the command line over SSH from my computer and boots up the sketch, but as `ExecStart=` in the .service file I get nothing, not even the log file I'm trying to generate.

I've replaced the ExecStart string in the .service file with one that targets a Python script, and that works, so I think the .service file is set up correctly (`/usr/bin/python /home/pi/Desktop/enviroFields/enviro3.py` works). I've also noticed it doesn't seem to work if I try to execute a shell script from the .service file, i.e. `/bin/sh /home/pi/Desktop/enviro.sh` won't run this way either.

I'm at a loss.
Is it that a Processing sketch cannot be run by the Pi at startup?
Should I perhaps export it as an application? And how would I address it then?

Am I missing something in the command line construction?

Any pointers to existing documentation? I can't find anything that addresses this directly.

I just noted in a forum post that Processing has a Jessie image. I'm not using that. I started with '2017-04-10-raspbian-jessie' set up with a Kuman 5" display (HDMI). I installed Processing 3.3.3 directly.

Any help appreciated. Thanks

My enviroFields.service file:

```
[Unit]
Description=Start EnviroFields
After=multi-user.target
[Service]
Type=idle
ExecStart=DISPLAY=:0 /usr/local/bin/processing-3.3.3/processing-java --sketch=/home/pi/Desktop/PenroseTile --run > /home/pi/Desktop/enviroBoot.log 2>&1
[Install]
WantedBy=multi-user.target
```
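One detail worth checking (a sketch of a possible fix, not a verified one): systemd's `ExecStart=` executes the binary directly, without a shell, so the `DISPLAY=:0` prefix and the `>` redirection are not interpreted the way they are on the command line. The usual pattern is to move the variable into `Environment=` and wrap the redirection in an explicit shell:

```ini
[Unit]
Description=Start EnviroFields
After=multi-user.target

[Service]
Type=idle
Environment=DISPLAY=:0
ExecStart=/bin/sh -c '/usr/local/bin/processing-3.3.3/processing-java --sketch=/home/pi/Desktop/PenroseTile --run > /home/pi/Desktop/enviroBoot.log 2>&1'

[Install]
WantedBy=multi-user.target
```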

First-time poster. Sorry, it's a long one.

I am attempting to port a sketch that I originally built on my Mac (1.6 GHz Core i5, 8 GB RAM) over to a RasPi2, and I am experiencing an unexpectedly *dramatic* loss in video performance. I'm looking for expectations, opinions, and any advice to get this thing working smoothly on a RasPi2. I get that the RasPi is obviously a much, much less powerful computer than my laptop, but gohai's SimpleCapture example worked so smoothly that I hoped an OpenCV layer on top would also run smoothly. I tinkered with allocating more GPU memory and overclocking the Pi, but without any noticeable improvement.

The original sketch tracks faces and animates eyes to "watch" the viewer via a webcam. My original code is below, using the Video and OpenCV libraries.

```
import gab.opencv.*;
import processing.video.*;
import java.awt.*;
Capture video;
OpenCV opencv;
float mouthLength = 50;
float mouthX = 120;
float mouthY = 175;
float leftPupilX;
float leftPupilY;
float rightPupilX;
float rightPupilY;
int radius = 40; // Radius of white eyeball ellipse
float pupilSize = 20;
PVector leftEye = new PVector(100, 100);
PVector rightEye = new PVector(200, 100);
int x, y = 120;
float easing = 0.2;
int scaleFactor = 3;
int counter;
void setup() {
size(960, 720);
smooth();
video = new Capture(this, 960/scaleFactor, 720/scaleFactor);
opencv = new OpenCV(this, 960/scaleFactor, 720/scaleFactor);
opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
//opencv.loadCascade(OpenCV.CASCADE_PROFILEFACE);
//opencv.loadCascade(OpenCV.CASCADE_EYE);
video.start();
frameRate(24);
}
void draw() {
background(255, 255, 0); // Yellow
scale(scaleFactor);
opencv.loadImage(video);
opencv.flip(OpenCV.HORIZONTAL); // flip horizontally
Rectangle[] faces = opencv.detect();
println(faces.length);
strokeWeight(3);
leftPupilX = leftPupilX + (100 - leftPupilX) * easing;
rightPupilX = rightPupilX + (200 - rightPupilX) * easing;
leftPupilY = rightPupilY = leftPupilY + (100 - leftPupilY) * easing;
for (int i = 0; i < faces.length; i++) {
//println(faces[i].x + "," + faces[i].y);
noFill();
stroke(0, 255, 0); // face detection rectangle color
rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
if (faces[i].x < 80 ) {
leftPupilX = (leftPupilX + (faces[i].x - leftPupilX) * easing);// + (faces[i].width * 0.2);
rightPupilX = leftPupilX + 100;
}
if ( faces[i].x > 175) {
rightPupilX = rightPupilX + (faces[i].x - rightPupilX) * easing;// + (faces[i].width * 0.2);
leftPupilX = rightPupilX - 100;
}
if ( (faces[i].y > 120) || (faces[i].y < 30) ) {
leftPupilY = leftPupilY + (faces[i].y - leftPupilY) * easing;
rightPupilY = rightPupilY + (faces[i].y - rightPupilY) * easing;
}
}
// Mouth
noFill();
stroke(0);
line(mouthX, mouthY, mouthX + mouthLength, mouthY);
arc(mouthX-15, mouthY, 30, 30, radians(-30), radians(30)); // left cheek
arc(mouthX+65, mouthY, 30, 30, radians(145), radians(205)); // right cheek
// Eyes
fill(255); // white
ellipse(leftEye.x, leftEye.y, radius+25, radius + 25); // left eyeball ellipse
ellipse(rightEye.x, rightEye.y, radius+25, radius + 25); // left eyeball ellipse
PVector leftPupil = new PVector(leftPupilX, leftPupilY);
if (dist(leftPupil.x, leftPupil.y, leftEye.x, leftEye.y) > radius/2) {
leftPupil.sub(leftEye);
leftPupil.normalize();
leftPupil.mult(radius/2);
leftPupil.add(leftEye);
}
PVector rightPupil = new PVector(rightPupilX, rightPupilY);
if (dist(rightPupil.x, rightPupil.y, rightEye.x, rightEye.y) > radius/2) {
rightPupil.sub(rightEye);
rightPupil.normalize();
rightPupil.mult(radius/2);
rightPupil.add(rightEye);
}
// Actually draw the pupils
noStroke();
fill(0); // black pupil color
ellipse(leftPupil.x, leftPupil.y, pupilSize, pupilSize); // new left pupil
ellipse(rightPupil.x, rightPupil.y, pupilSize, pupilSize); // new right pupil
counter ++;
println(counter);
if (counter > 195) {
counter = 0;
}
if (counter >= 190 && counter < 195) {
blink();
}
}
void captureEvent(Capture c) {
c.read();
}
void blink() {
fill(255, 255, 0); // Yellow
stroke(255, 255, 0);
ellipse(leftEye.x, leftEye.y, radius+26, radius + 26); // left eyeball ellipse
ellipse(rightEye.x, rightEye.y, radius+26, radius + 26);
stroke(0);
noFill();
line(67, leftEye.y, 133, leftEye.y);
translate(100, 0);
line(67, leftEye.y, 133, leftEye.y);
}
```

This is the version I modified for the RasPi2 with gohai's GLVideo library. I can get the sketch to run, but the tracking and/or responsive animation are incredibly slow. Unfortunately, to the point that it ruins the interactive nature of the work.

```
import gab.opencv.*;
import gohai.glvideo.*;
import java.awt.*;
GLCapture video;
OpenCV opencv;
float mouthLength = 50;
float mouthX = 120;
float mouthY = 175;
float leftPupilX;
float leftPupilY;
float rightPupilX;
float rightPupilY;
int radius = 40; // Radius of white eyeball ellipse
float pupilSize = 20;
PVector leftEye = new PVector(100, 100);
PVector rightEye = new PVector(200, 100);
int x, y = 120;
float easing = 0.2;
int scaleFactor = 3;
int counter;
void setup() {
size(960, 720, P2D);
smooth();
String[] devices = GLCapture.list();
println("Devices:");
printArray(devices);
if (0 < devices.length) {
String[] configs = GLCapture.configs(devices[0]);
println("Configs:");
printArray(configs);
}
video = new GLCapture(this, devices[0], 960/scaleFactor, 720/scaleFactor);
opencv = new OpenCV(this, 960/scaleFactor, 720/scaleFactor);
opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
//opencv.loadCascade(OpenCV.CASCADE_PROFILEFACE);
//opencv.loadCascade(OpenCV.CASCADE_EYE);
video.start();
frameRate(24);
}
void draw() {
background(255, 255, 0); // Yellow
scale(scaleFactor);
if (video.available()) {
video.read();
opencv.loadImage(video);
opencv.flip(OpenCV.HORIZONTAL); // flip horizontally
Rectangle[] faces = opencv.detect();
//println(faces.length);
strokeWeight(3);
leftPupilX = leftPupilX + (100 - leftPupilX) * easing;
rightPupilX = rightPupilX + (200 - rightPupilX) * easing;
leftPupilY = rightPupilY = leftPupilY + (100 - leftPupilY) * easing;
for (int i = 0; i < faces.length; i++) {
//println(faces[i].x + "," + faces[i].y);
//noFill();
//stroke(0, 255, 0); // face detection rectangle color
//rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
if (faces[i].x < 80 ) {
leftPupilX = (leftPupilX + (faces[i].x - leftPupilX) * easing);// + (faces[i].width * 0.2);
rightPupilX = leftPupilX + 100;
}
if ( faces[i].x > 175) {
rightPupilX = rightPupilX + (faces[i].x - rightPupilX) * easing;// + (faces[i].width * 0.2);
leftPupilX = rightPupilX - 100;
}
if ( (faces[i].y > 120) || (faces[i].y < 30) ) {
leftPupilY = leftPupilY + (faces[i].y - leftPupilY) * easing;
rightPupilY = rightPupilY + (faces[i].y - rightPupilY) * easing;
}
}
// Mouth
noFill();
stroke(0);
line(mouthX, mouthY, mouthX + mouthLength, mouthY);
arc(mouthX-15, mouthY, 30, 30, radians(-30), radians(30)); // left cheek
arc(mouthX+65, mouthY, 30, 30, radians(145), radians(205)); // right cheek
// Eyes
fill(255); // white
ellipse(leftEye.x, leftEye.y, radius+25, radius + 25); // left eyeball ellipse
ellipse(rightEye.x, rightEye.y, radius+25, radius + 25); // left eyeball ellipse
PVector leftPupil = new PVector(leftPupilX, leftPupilY);
if (dist(leftPupil.x, leftPupil.y, leftEye.x, leftEye.y) > radius/2) {
leftPupil.sub(leftEye);
leftPupil.normalize();
leftPupil.mult(radius/2);
leftPupil.add(leftEye);
}
PVector rightPupil = new PVector(rightPupilX, rightPupilY);
if (dist(rightPupil.x, rightPupil.y, rightEye.x, rightEye.y) > radius/2) {
rightPupil.sub(rightEye);
rightPupil.normalize();
rightPupil.mult(radius/2);
rightPupil.add(rightEye);
}
// Actually draw the pupils
noStroke();
fill(0); // black pupil color
ellipse(leftPupil.x, leftPupil.y, pupilSize, pupilSize); // new left pupil
ellipse(rightPupil.x, rightPupil.y, pupilSize, pupilSize); // new right pupil
counter ++;
println(counter);
if (counter > 195) {
counter = 0;
}
if (counter >= 190 && counter < 195) {
blink();
}
}
}
void captureEvent(GLCapture c) {
c.read();
}
void blink() {
fill(255, 255, 0); // Yellow
stroke(255, 255, 0);
ellipse(leftEye.x, leftEye.y, radius+26, radius + 26); // left eyeball ellipse
ellipse(rightEye.x, rightEye.y, radius+26, radius + 26);
stroke(0);
noFill();
line(67, leftEye.y, 133, leftEye.y);
translate(100, 0);
line(67, leftEye.y, 133, leftEye.y);
}
```

The animation is super smooth on my Mac, but achingly slow and jerky on my RasPi2.

Any advice is greatly appreciated. Is the sketch just too much for a RasPi2 to run smoothly? Is my code just too inefficient? Is OpenCV an issue here? Ultimately, I want to make this a standalone gallery installation with a monitor and RasPi subtly attached, so I don't have to run it off an expensive laptop left alone in a gallery space.
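One common mitigation for this kind of load (an idea to try, not a guaranteed fix for this particular setup): run the expensive detect() only every few frames and animate the eyes from the last known rectangles in between. The throttling skeleton, in plain Java with a dummy stand-in for opencv.detect():

```java
public class DetectThrottle {
    static int detectCalls = 0;

    // Stand-in for opencv.detect(); returns a dummy face position
    static int[] detect() {
        detectCalls++;
        return new int[] { 100, 50 };
    }

    public static void main(String[] args) {
        final int EVERY = 8;          // detect once per 8 frames (tunable)
        int[] lastFaces = new int[0];
        for (int frame = 0; frame < 24; frame++) {
            if (frame % EVERY == 0) {
                lastFaces = detect(); // expensive call, throttled
            }
            // ...animate eyes from lastFaces on every frame...
        }
        System.out.println(detectCalls); // 3 detections over 24 frames
    }
}
```

The easing already in the sketch smooths over the stale frames, so the eyes keep moving fluidly even when the detector only fires a few times per second.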

I have problems loading several videos with GLVideo. I need to load them and play them continuously, one by one (a big video, and jump(), is not an option).

With myMovie.close() I free the CPU RAM. No problems there; it works perfectly.

The problem is the GPU RAM. It fills up until the program crashes. With

`sudo vcdbg reloc`

I can see that:

```
total space allocated is 592M, with 590M relocatable, 2.3M legacy and 0 offline
1 legacy blocks of size 2359296
free list at 0x3da82260
580M free memory in 37 free block(s)
```

Then 560M free memory, then 500, 400, ... until the program crashes.

I have tried everything, but there seems to be no way to free the GPU RAM.

This is a simplification of my code (not my actual code, just a simplification):

```
import gohai.glvideo.*;

GLMovie myMovie;
boolean movieLoaded = false;

void setup() {
  size(400, 240, P2D);
}

void draw() {
  background(0);
  if (movieLoaded) {
    if (myMovie.available()) {
      myMovie.read();
    }
    image(myMovie, 0, 0, width, height);
  }
  if (frameCount % 900 == 0) {
    if (movieLoaded) {
      clearMovie();
    }
  } else if (frameCount % 900 == 30) {
    if (!movieLoaded) {
      myMovie = new GLMovie(this, "launch1.mp4");
      movieLoaded = true;
      myMovie.play();
    }
  }
}

public void clearMovie() {
  myMovie.close();
  movieLoaded = false;
}
```

Any help would be more than welcome! f.-

In the glvideo library I get:
`No capture devices found`

In the video library I get:
`There are no cameras available for capture`

I am able to access the camera and take pictures via the terminal with `raspistill -o output.jpg`, so the camera is working.

Any tips?

]]>Raspberry Pi 3
Raspberry Pi Camera 5 MP
Arduino Uno
6V Brushed DC Motors
The Pi Camera will draw its power from the Pi. The Arduino is to be connected to the Pi via USB. The Arduino will control 4 x DC motors for the wheels with an LM2984CT IC (datasheet: **http://www.kynix.com/uploadfiles/pdf2286/LM2984CT.pdf**) from TI. The Pi will draw its power from the batteries through a 5V step-down converter. The Arduino and the Pi Cam will draw their power from the Pi's USB port and CSI interface respectively, and the motor voltage pin of the LM2984CT will be connected directly to the batteries.

I believe a battery pack with a 2 A output will be sufficient. I am planning to use two protected 18650 3000 mAh batteries in series. This will give an output of 8.4 V at full charge, which is suitable for my application.

1) How do I charge two 18650 batteries with a single charger?

I found this Li-Ion charger module based on the TP4056.

With an output of 4.2 V at 1 A, it can easily charge a single-cell 18650 battery. However, I need to charge two cells simultaneously.

I came across this design on the Adafruit website for LiPo batteries and a different charging circuit:

Here, the two cells are connected to the charging circuit in parallel during charge mode. When the relay is switched, the charging circuit is disconnected and the batteries are connected in series.

I wish to use this design with a Lithium Ion 18650 and a TP4056 charging circuit. Is a TP4056 charging circuit capable of safely charging two batteries connected to its output terminals in parallel?

2) Are the batteries safe to be used in series configuration?

Since each battery has its own protection circuitry, will the protection circuitry work as it should when the two batteries are connected in series during usage?

3) Which step-down converter should I use for the Raspberry Pi?

I believe I need something that can take a 7.4 V input and output 5 V at a minimum of 2 A.
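As a rough sanity check on the converter rating, here is my back-of-the-envelope arithmetic (the 85% efficiency figure is an assumption, not from any particular module's datasheet):

```python
# Back-of-the-envelope check for the 5 V step-down converter.
# Assumptions: 2S Li-Ion pack, 85% buck converter efficiency.
CELL_FULL = 4.2      # V, fully charged Li-Ion cell
CELL_NOMINAL = 3.7   # V, nominal Li-Ion cell voltage
EFFICIENCY = 0.85    # assumed converter efficiency

pack_full = 2 * CELL_FULL        # 8.4 V at full charge
pack_nominal = 2 * CELL_NOMINAL  # 7.4 V nominal

p_out = 5.0 * 2.0                         # 10 W delivered to the Pi at 5 V / 2 A
i_in = p_out / EFFICIENCY / pack_nominal  # current drawn from the pack

print(pack_full)       # 8.4
print(round(i_in, 2))  # 1.59 -> about 1.6 A from the pack at nominal voltage
```

So the pack sees roughly 1.6 A at nominal voltage, comfortably within a 2 A rating.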

I hope my questions are clear. Thank you for taking the time to read.

]]>I am trying to use a USB webcam with Processing 3 on a Raspberry PI 2.

This is the preinstalled version from the Processing Raspbian distribution (processing-3.0.1-linux-raspbian.zip, 2015-09-24-raspbian-jessie.img).

Using the Contribution Manager I've installed the video library, but I am getting an error using a USB webcam:

```
No such Gstreamer factory: v4l2src
```

I can see the camera showing up at `/dev/video0`

and after installing gstreamer1.0 and gstreamer1.0-tools I can open the camera:

```
gst-launch-1.0 v4l2src ! videoconvert ! ximagesink
```
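For what it's worth, `v4l2src` is provided by the "good" GStreamer plugin set, which can be checked with the system tools (note this only inspects the system GStreamer; the Processing video library may be using its own bundled GStreamer, so treat this as a diagnostic step rather than a guaranteed fix):

```shell
# Verify the system GStreamer can find the V4L2 source element.
gst-inspect-1.0 v4l2src

# If it reports "No such element or plugin", install the set that provides it.
sudo apt-get install gstreamer1.0-plugins-good
```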

Running `v4l2-ctl --list-formats-ext` yields:

```
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'YUYV'
Name : YUV 4:2:2 (YUYV)
Size: Discrete 640x480
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 160x120
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 176x144
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 320x176
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 320x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 352x288
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 432x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 544x288
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 640x360
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 752x416
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 800x448
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 800x600
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 864x480
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 960x544
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 960x720
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1024x576
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1184x656
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.133s (7.500 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1280x960
Interval: Discrete 0.133s (7.500 fps)
Interval: Discrete 0.200s (5.000 fps)
Index : 1
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : MJPEG
Size: Discrete 640x480
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 160x120
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 176x144
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 320x176
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 320x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 352x288
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 432x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 544x288
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 640x360
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 752x416
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 800x448
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 800x600
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 864x480
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 960x544
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 960x720
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1024x576
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1184x656
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1280x960
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
```

How can I access the USB camera from Processing on Raspbian?

Thank you, George

]]>Thanks for any help you might have. I'm trying to switch between two videos based on a value read over serial:

```
import processing.serial.*;
import gohai.glvideo.*;

Serial myPort;
int val = 0;
GLMovie[] video = new GLMovie[2];
int i = 0;

void setup() {
  fullScreen(P2D);
  String portName = Serial.list()[0];
  myPort = new Serial(this, portName, 115200);
  video[0] = new GLMovie(this, "low.mp4");
  video[1] = new GLMovie(this, "high.mp4");
  video[i].loop();
}

void draw() {
  if (video[i].available()) {
    video[i].read();
  }
  image(video[i], 0, 0, width, height);
}

void serialEvent(Serial p) {
  if (myPort.available() > 0) {
    val = myPort.read();
    println(val);
  }
  int next = (val == 0) ? 1 : 0;
  if (next != i) {
    video[i].pause();  // stop the movie we are leaving
    i = next;
    video[i].loop();   // start the newly selected movie; only video[0] was started in setup()
  }
}
```

]]>I am working on a simple interactive installation running off a Pi. This is part of a mobile projection-mapping project. Programming question: I want to combine two Processing book examples so that they overlap. I would like your opinion on how to approach this. Box2D? I have basic programming skills, so a direction is very welcome.

Installation: a 2 m x 2 m whiteboard with three mirrors of various sizes on it. The bouncing-ball effect looks good on the white background. However, adding a mouse function for aligning the rectangles to the physical mirrors (mapping with the mouse) would make it more fun. A PIR sensor monitors people's movement and makes the bouncing faster or slower.

I read on forums that Processing is not the best solution for the Pi. Do you suggest alternatives?
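For the PIR-driven speed, the core idea is just scaling the ball's velocity by an activity factor each frame. A minimal, language-agnostic sketch of the update step (Python here; `activity` is a hypothetical normalized PIR reading, and the wall bounds and speed constants are made-up values):

```python
# Minimal bouncing-ball update: position advances by velocity scaled
# by an activity factor (0.0 = room idle, 1.0 = lots of PIR motion).
def step(pos, vel, activity, lo=0.0, hi=400.0, base=2.0, boost=6.0):
    speed = base + boost * activity  # more PIR activity -> faster ball
    pos = pos + vel * speed
    if pos < lo or pos > hi:         # bounce off either wall
        vel = -vel
        pos = max(lo, min(hi, pos))
    return pos, vel

pos, vel = 395.0, 1.0
pos, vel = step(pos, vel, activity=1.0)
print(pos, vel)  # 400.0 -1.0  (hit the wall at full speed and bounced)
```

The same per-frame update drops straight into a Processing `draw()` loop, with one axis per coordinate.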

]]>The problem I'm having is that I get a MESA-LOADER failure when trying to run a P3D sketch. It works on one of my other Pis; however, two other Pis get the same failure. It seems as if there was a package I installed on the working machine that makes it run. Any ideas what I am missing on my other installs?

The error I get from Processing:

```
40 S1948 E1947 fmul t1948, t1923, t1947
40 S1934 E1925 fmul t1934, t1925, t1931
40 S1933 E1924 fmul t1933, t1924, t1931
40 S1932 E1923, E1931 fmul t1932, t1923, t1931
39 fsub.sf null, 0, u333
39 S1891 mov t1891, 0
40 mov.ns t1891, -1
40 fsub.sf null, 0, u334
40 S1894 mov t1894, 0
41 mov.ns t1894, -1
41 S1899 E1891, E1894 or t1899, t1891, t1894
40 fsub.sf null, 0, u335
40 S1897 mov t1897, 0
41 mov.ns t1897, -1
41 S1900 E1897, E1899 or t1900, t1899, t1897
40 S1884 fmul t1884, u336, t1842
41 S1887 E1884 fadd t1887, t1664, t1884
41 S1902 E1664 mov t1902, t1664
41 S1901 E1900 and.sf t1901, t1800, t1900
41 E1887 mov.ns t1902, t1887
40 S1885 fmul t1885, u337, t1842
41 S1888 E1885 fadd t1888, t1666, t1885
41 S1904 E1666 mov t1904, t1666
41 mov.sf null, t136
41 S 12 E1902 mov.zs t12, t1902
41 mov.sf null, t1901
41 E1888 mov.ns t1904, t1888
40 mov.sf null, t136
40 S 13 E1904 mov.zs t13, t1904
40 E1901 mov.sf null, t1901
39 S1886 E1842 fmul t1886, u338, t1842
39 S1889 E1886 fadd t1889, t1668, t1886
39 S1906 E1668 mov t1906, t1668
39 E1889 mov.ns t1906, t1889
38 mov.sf null, t136
38 S 14 E1906 mov.zs t14, t1906
38 fsub.sf null, 0, u339
38 S1909 mov t1909, 0
39 mov.ns t1909, -1
39 fsub.sf null, 0, u340
39 S1912 mov t1912, 0
40 mov.ns t1912, -1
40 S1917 E1909, E1912 or t1917, t1909, t1912
39 fsub.sf null, 0, u341
39 S1915 mov t1915, 0
40 mov.ns t1915, -1
40 S1918 E1915, E1917 or t1918, t1917, t1915
39 S1935 E1932 fadd t1935, t1700, t1932
39 S1938 E1700 mov t1938, t1700
39 S1919 E1918 and t1919, t1800, t1918
39 mov.sf null, t1919
39 E1935 mov.ns t1938, t1935
38 S1936 E1933 fadd t1936, t1702, t1933
38 S1940 E1702 mov t1940, t1702
38 mov.sf null, t136
38 S 15 E1938 mov.zs t15, t1938
38 mov.sf null, t1919
38 E1936 mov.ns t1940, t1936
37 S1937 E1934 fadd t1937, t1704, t1934
37 S1942 E1704 mov t1942, t1704
37 mov.sf null, t136
37 S 16 E1940 mov.zs t16, t1940
37 mov.sf null, t1919
37 E1937 mov.ns t1942, t1937
36 S1951 E1948 fadd t1951, t1716, t1948
36 S1954 E1716 mov t1954, t1716
36 mov.sf null, t136
36 S 17 E1942 mov.zs t17, t1942
36 mov.sf null, t1919
36 E1951 mov.ns t1954, t1951
35 S1952 E1949 fadd t1952, t1718, t1949
35 S1956 E1718 mov t1956, t1718
35 mov.sf null, t136
35 S 18 E1954 mov.zs t18, t1954
35 mov.sf null, t1919
35 E1952 mov.ns t1956, t1952
34 mov.sf null, t136
34 S 19 E1956 mov.zs t19, t1956
34 E1919 mov.sf null, t1919
33 S1953 E1950 fadd t1953, t1720, t1950
33 S1958 E1720 mov t1958, t1720
33 E1953 mov.ns t1958, t1953
32 mov.sf null, t136
32 S 20 E1958 mov.zs t20, t1958
32 fsub.sf null, 0, u342
32 S1961 mov t1961, 0
33 mov.ns t1961, -1
33 fsub.sf null, 0, u343
33 S1964 mov t1964, 0
34 mov.ns t1964, -1
34 S1969 E1961, E1964 or t1969, t1961, t1964
33 fsub.sf null, 0, u344
33 S1967 mov t1967, 0
34 mov.ns t1967, -1
34 S1970 E1967, E1969 or t1970, t1969, t1967
33 S1971 E1800, E1970 and t1971, t1800, t1970
32 S2007 E2004 fadd t2007, t1772, t2004
32 S2010 E1772 mov t2010, t1772
32 mov.sf null, t1971
32 E2007 mov.ns t2010, t2007
31 S2008 E2005 fadd t2008, t1774, t2005
31 S2012 E1774 mov t2012, t1774
31 mov.sf null, t136
31 S 21 E2010 mov.zs t21, t2010
31 mov.sf null, t1971
31 E2008 mov.ns t2012, t2008
30 S2009 E2006 fadd t2009, t1776, t2006
30 S2014 E1776 mov t2014, t1776
30 mov.sf null, t136
30 S 22 E2012 mov.zs t22, t2012
30 mov.sf null, t1971
30 E2009 mov.ns t2014, t2009
29 S2026 E2023 fadd t2026, t1791, t2023
29 S2029 E1791 mov t2029, t1791
29 mov.sf null, t136
29 S 23 E2014 mov.zs t23, t2014
29 mov.sf null, t1971
29 E2026 mov.ns t2029, t2026
28 S2027 E2024 fadd t2027, t1793, t2024
28 S2031 E1793 mov t2031, t1793
28 mov.sf null, t136
28 S 24 E2029 mov.zs t24, t2029
28 mov.sf null, t1971
28 E2027 mov.ns t2031, t2027
27 mov.sf null, t136
27 S 25 E2031 mov.zs t25, t2031
27 E1971 mov.sf null, t1971
26 S2028 E2025 fadd t2028, t1795, t2025
26 S2033 E1795 mov t2033, t1795
26 E2028 mov.ns t2033, t2028
25 mov.sf null, t136
25 S 26 E2033 mov.zs t26, t2033
25 mov.zs t136, 2
25 sub.sf null, t136, 2
25 branch.all_zs
-> BLOCK 2, 3
BLOCK 3:
25 sub.sf null, t136, 3
25 mov.zs t136, 0
25 E 136 mov.sf null, t136
24 mov.zs t14, 0
24 mov.zs t13, t14
24 mov.zs t12, t13
24 mov.zs t17, t12
24 mov.zs t16, t17
24 mov.zs t15, t16
24 mov.zs t23, t15
24 mov.zs t22, t23
24 mov.zs t21, t22
24 mov.zs t20, t21
24 mov.zs t19, t20
24 mov.zs t18, t19
24 mov.zs t26, t18
24 mov.zs t25, t26
24 mov.zs t24, t25
-> BLOCK 2
BLOCK 2:
24 S2111 load_imm t2111, 0x00000568 (0.000000)
25 E2111 uniforms_reset null, t2111, u345
24 S2063 E 23 fmul t2063, t23, t9.8c
24 S2062 E 22 fmul t2062, t22, t9.8b
24 S2061 E 21 fmul t2061, t21, t9.8a
24 S2078 E 26 fmul t2078, t26, t9.8c
24 S2077 E 25 fmul t2077, t25, t9.8b
24 S2076 E 9, E 24 fmul t2076, t24, t9.8a
23 S2070 E 18 fmul t2070, t18, t4.8a
23 S2055 E 15 fmul t2055, t15, t4.8a
23 S2071 E 19 fmul t2071, t19, t4.8b
23 S2056 E 16 fmul t2056, t16, t4.8b
23 S2072 E 20 fmul t2072, t20, t4.8c
23 S2057 E 4, E 17 fmul t2057, t17, t4.8c
22 S2052 E 12 fmul t2052, t12, t8.8a
22 S2053 E 13 fmul t2053, t13, t8.8b
22 S2054 E 8, E 14 fmul t2054, t14, t8.8c
21 S2060 E2057 fadd t2060, t2054, t2057
21 S2075 E2054, E2072 fadd t2075, t2054, t2072
20 S2059 E2056 fadd t2059, t2053, t2056
20 S2074 E2053, E2071 fadd t2074, t2053, t2071
19 S2058 E2055 fadd t2058, t2052, t2055
19 S2073 E2052, E2070 fadd t2073, t2052, t2070
18 S2079 E2073, E2076 fadd t2079, t2073, t2076
17 S2080 E2074, E2077 fadd t2080, t2074, t2077
16 S2081 E2075, E2078 fadd t2081, t2075, t2078
15 S2064 E2058, E2061 fadd t2064, t2058, t2061
14 S2065 E2059, E2062 fadd t2065, t2059, t2062
13 S2066 E2060, E2063 fadd t2066, t2060, t2063
12 S2069 E2066 fadd t2069, t2066, t10.8c
12 S2068 E2065 fadd t2068, t2065, t10.8b
12 S2067 E2064 fadd t2067, t2064, t10.8a
12 S2098 rcp t2098, t82
13 S2084 E2081 fadd t2084, t2081, t10.8c
13 S2083 E2080 fadd t2083, t2080, t10.8b
13 S2082 E 10, E2079 fadd t2082, t2079, t10.8a
12 S2099 E 82 fmul t2099, t82, t2098
12 S2100 E2099 fsub t2100, 2.000000, t2099
12 S2101 E2098, E2100 fmul t2101, t2098, t2100
11 S2103 E 79 fmul t2103, t79, u346
11 S2105 E 80 fmul t2105, t80, u347
11 S2106 E2105 fmul t2106, t2105, t2101
11 S2104 E2103 fmul t2104, t2103, t2101
11 S2102 E2104 ftoi t2102.16a, t2104
11 E2106 ftoi t2102.16b, t2106
10 E2102 mov vpm, t2102
9 S2107 E 81 fmul t2107, t81, u348
9 S2108 E2107 fmul t2108, t2107, t2101
9 S2109 E2108 fadd t2109, t2108, u349
9 E2109 mov vpm, t2109
8 E2101 mov vpm, t2101
7 E2082 mov vpm, t2082
6 E2083 mov vpm, t2083
5 E2084 mov vpm, t2084
4 mov vpm, t50
4 E2067 mov vpm, t2067
3 E2068 mov vpm, t2068
2 E2069 mov vpm, t2069
1 E 50 mov vpm, t50
X11Util.Display: Shutdown (JVM shutdown: true, open (no close attempt): 2/2, reusable (open, marked uncloseable): 0, pending (open in creation order): 2)
X11Util: Open X11 Display Connections: 2
X11Util: Open[0]: NamedX11Display[:0.0, 0x652027b0, refCount 1, unCloseable false]
X11Util: Open[1]: NamedX11Display[:0.0, 0x65210e58, refCount 1, unCloseable false]
```

]]>