Hi,
First time poster. Sorry it's a long one.
I am attempting to port a sketch that I originally built on my Mac (1.6 GHz Core i5, 8 GB RAM) over to a RasPi2, and I am experiencing an unexpectedly dramatic loss in video performance. I'm looking for expectations, opinions, and any advice to get this thing working smoothly on a RasPi2. I get that the RasPi is obviously a much, much less powerful computer than my laptop, but gohai's SimpleCapture example worked so smoothly that I hoped an OpenCV layer on top would also run smoothly. I tinkered with allocating more GPU memory and overclocking the Pi, but without any noticeable improvement.
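(For reference, by "allocating more GPU memory and overclocking" I mean edits along these lines in /boot/config.txt; the exact values here are approximate, not a recommendation:)

gpu_mem=256       # give the GPU a bigger share of the Pi's RAM
arm_freq=1000     # modest overclock of the ARM cores
over_voltage=2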
The original sketch tracks faces via a webcam and animates a pair of eyes to "watch" the viewer. My original code is below, using the Video and OpenCV libraries:
import gab.opencv.*;
import processing.video.*;
import java.awt.*;

Capture video;
OpenCV opencv;

float mouthLength = 50;
float mouthX = 120;
float mouthY = 175;
float leftPupilX;
float leftPupilY;
float rightPupilX;
float rightPupilY;
int radius = 40; // Radius of white eyeball ellipse
float pupilSize = 20;
PVector leftEye = new PVector(100, 100);
PVector rightEye = new PVector(200, 100);
int x, y = 120;
float easing = 0.2;
int scaleFactor = 3;
int counter;

void setup() {
  size(960, 720);
  smooth();
  video = new Capture(this, 960/scaleFactor, 720/scaleFactor);
  opencv = new OpenCV(this, 960/scaleFactor, 720/scaleFactor);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  //opencv.loadCascade(OpenCV.CASCADE_PROFILEFACE);
  //opencv.loadCascade(OpenCV.CASCADE_EYE);
  video.start();
  frameRate(24);
}
void draw() {
  background(255, 255, 0); // Yellow
  scale(scaleFactor);      // draw at 3x so the small capture fills the 960x720 window
  opencv.loadImage(video);
  opencv.flip(OpenCV.HORIZONTAL); // flip horizontally
  Rectangle[] faces = opencv.detect();
  println(faces.length);
  strokeWeight(3);
  // Ease the pupils back toward their resting positions
  leftPupilX = leftPupilX + (100 - leftPupilX) * easing;
  rightPupilX = rightPupilX + (200 - rightPupilX) * easing;
  leftPupilY = rightPupilY = leftPupilY + (100 - leftPupilY) * easing;
  for (int i = 0; i < faces.length; i++) {
    //println(faces[i].x + "," + faces[i].y);
    noFill();
    stroke(0, 255, 0); // face detection rectangle color
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
    if (faces[i].x < 80) {
      leftPupilX = (leftPupilX + (faces[i].x - leftPupilX) * easing);// + (faces[i].width * 0.2);
      rightPupilX = leftPupilX + 100;
    }
    if (faces[i].x > 175) {
      rightPupilX = rightPupilX + (faces[i].x - rightPupilX) * easing;// + (faces[i].width * 0.2);
      leftPupilX = rightPupilX - 100;
    }
    if ((faces[i].y > 120) || (faces[i].y < 30)) {
      leftPupilY = leftPupilY + (faces[i].y - leftPupilY) * easing;
      rightPupilY = rightPupilY + (faces[i].y - rightPupilY) * easing;
    }
  }
  // Mouth
  noFill();
  stroke(0);
  line(mouthX, mouthY, mouthX + mouthLength, mouthY);
  arc(mouthX-15, mouthY, 30, 30, radians(-30), radians(30)); // left cheek
  arc(mouthX+65, mouthY, 30, 30, radians(145), radians(205)); // right cheek
  // Eyes
  fill(255); // white
  ellipse(leftEye.x, leftEye.y, radius+25, radius+25);   // left eyeball ellipse
  ellipse(rightEye.x, rightEye.y, radius+25, radius+25); // right eyeball ellipse
  // Clamp each pupil so it stays inside its eyeball
  PVector leftPupil = new PVector(leftPupilX, leftPupilY);
  if (dist(leftPupil.x, leftPupil.y, leftEye.x, leftEye.y) > radius/2) {
    leftPupil.sub(leftEye);
    leftPupil.normalize();
    leftPupil.mult(radius/2);
    leftPupil.add(leftEye);
  }
  PVector rightPupil = new PVector(rightPupilX, rightPupilY);
  if (dist(rightPupil.x, rightPupil.y, rightEye.x, rightEye.y) > radius/2) {
    rightPupil.sub(rightEye);
    rightPupil.normalize();
    rightPupil.mult(radius/2);
    rightPupil.add(rightEye);
  }
  // Actually draw the pupils
  noStroke();
  fill(0); // black pupil color
  ellipse(leftPupil.x, leftPupil.y, pupilSize, pupilSize);   // new left pupil
  ellipse(rightPupil.x, rightPupil.y, pupilSize, pupilSize); // new right pupil
  // Blink for a few frames out of every ~195
  counter++;
  println(counter);
  if (counter > 195) {
    counter = 0;
  }
  if (counter >= 190 && counter < 195) {
    blink();
  }
}
void captureEvent(Capture c) {
  c.read();
}

void blink() {
  fill(255, 255, 0); // Yellow, same as the background, to cover the eyeballs
  stroke(255, 255, 0);
  ellipse(leftEye.x, leftEye.y, radius+26, radius+26);   // cover left eyeball
  ellipse(rightEye.x, rightEye.y, radius+26, radius+26); // cover right eyeball
  stroke(0);
  noFill();
  line(67, leftEye.y, 133, leftEye.y); // closed left eyelid
  translate(100, 0);
  line(67, leftEye.y, 133, leftEye.y); // closed right eyelid
}
Below is the version I modified for the RasPi2 using gohai's GLVideo library. I can get the sketch to run, but the tracking and/or the responsive animation are incredibly slow, to the point that it ruins the interactive nature of the work.
import gab.opencv.*;
import gohai.glvideo.*;
import java.awt.*;

GLCapture video;
OpenCV opencv;

float mouthLength = 50;
float mouthX = 120;
float mouthY = 175;
float leftPupilX;
float leftPupilY;
float rightPupilX;
float rightPupilY;
int radius = 40; // Radius of white eyeball ellipse
float pupilSize = 20;
PVector leftEye = new PVector(100, 100);
PVector rightEye = new PVector(200, 100);
int x, y = 120;
float easing = 0.2;
int scaleFactor = 3;
int counter;

void setup() {
  size(960, 720, P2D);
  smooth();
  String[] devices = GLCapture.list();
  println("Devices:");
  printArray(devices);
  if (0 < devices.length) {
    String[] configs = GLCapture.configs(devices[0]);
    println("Configs:");
    printArray(configs);
  }
  video = new GLCapture(this, devices[0], 960/scaleFactor, 720/scaleFactor);
  opencv = new OpenCV(this, 960/scaleFactor, 720/scaleFactor);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  //opencv.loadCascade(OpenCV.CASCADE_PROFILEFACE);
  //opencv.loadCascade(OpenCV.CASCADE_EYE);
  video.start();
  frameRate(24);
}
void draw() {
  background(255, 255, 0); // Yellow
  scale(scaleFactor);      // draw at 3x so the small capture fills the 960x720 window
  if (video.available()) { // note: everything below, face included, only draws when a new frame arrives
    video.read();
    opencv.loadImage(video);
    opencv.flip(OpenCV.HORIZONTAL); // flip horizontally
    Rectangle[] faces = opencv.detect();
    //println(faces.length);
    strokeWeight(3);
    // Ease the pupils back toward their resting positions
    leftPupilX = leftPupilX + (100 - leftPupilX) * easing;
    rightPupilX = rightPupilX + (200 - rightPupilX) * easing;
    leftPupilY = rightPupilY = leftPupilY + (100 - leftPupilY) * easing;
    for (int i = 0; i < faces.length; i++) {
      //println(faces[i].x + "," + faces[i].y);
      //noFill();
      //stroke(0, 255, 0); // face detection rectangle color
      //rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
      if (faces[i].x < 80) {
        leftPupilX = (leftPupilX + (faces[i].x - leftPupilX) * easing);// + (faces[i].width * 0.2);
        rightPupilX = leftPupilX + 100;
      }
      if (faces[i].x > 175) {
        rightPupilX = rightPupilX + (faces[i].x - rightPupilX) * easing;// + (faces[i].width * 0.2);
        leftPupilX = rightPupilX - 100;
      }
      if ((faces[i].y > 120) || (faces[i].y < 30)) {
        leftPupilY = leftPupilY + (faces[i].y - leftPupilY) * easing;
        rightPupilY = rightPupilY + (faces[i].y - rightPupilY) * easing;
      }
    }
    // Mouth
    noFill();
    stroke(0);
    line(mouthX, mouthY, mouthX + mouthLength, mouthY);
    arc(mouthX-15, mouthY, 30, 30, radians(-30), radians(30)); // left cheek
    arc(mouthX+65, mouthY, 30, 30, radians(145), radians(205)); // right cheek
    // Eyes
    fill(255); // white
    ellipse(leftEye.x, leftEye.y, radius+25, radius+25);   // left eyeball ellipse
    ellipse(rightEye.x, rightEye.y, radius+25, radius+25); // right eyeball ellipse
    // Clamp each pupil so it stays inside its eyeball
    PVector leftPupil = new PVector(leftPupilX, leftPupilY);
    if (dist(leftPupil.x, leftPupil.y, leftEye.x, leftEye.y) > radius/2) {
      leftPupil.sub(leftEye);
      leftPupil.normalize();
      leftPupil.mult(radius/2);
      leftPupil.add(leftEye);
    }
    PVector rightPupil = new PVector(rightPupilX, rightPupilY);
    if (dist(rightPupil.x, rightPupil.y, rightEye.x, rightEye.y) > radius/2) {
      rightPupil.sub(rightEye);
      rightPupil.normalize();
      rightPupil.mult(radius/2);
      rightPupil.add(rightEye);
    }
    // Actually draw the pupils
    noStroke();
    fill(0); // black pupil color
    ellipse(leftPupil.x, leftPupil.y, pupilSize, pupilSize);   // new left pupil
    ellipse(rightPupil.x, rightPupil.y, pupilSize, pupilSize); // new right pupil
    // Blink for a few frames out of every ~195
    counter++;
    println(counter);
    if (counter > 195) {
      counter = 0;
    }
    if (counter >= 190 && counter < 195) {
      blink();
    }
  }
}
void captureEvent(GLCapture c) { // note: GLVideo is polled via available()/read() in draw(), so this callback may never fire
  c.read();
}

void blink() {
  fill(255, 255, 0); // Yellow, same as the background, to cover the eyeballs
  stroke(255, 255, 0);
  ellipse(leftEye.x, leftEye.y, radius+26, radius+26);   // cover left eyeball
  ellipse(rightEye.x, rightEye.y, radius+26, radius+26); // cover right eyeball
  stroke(0);
  noFill();
  line(67, leftEye.y, 133, leftEye.y); // closed left eyelid
  translate(100, 0);
  line(67, leftEye.y, 133, leftEye.y); // closed right eyelid
}
The animation is super smooth on my Mac, but achingly slow and jerky on my RasPi2.
Any advice is greatly appreciated. Is the sketch just too much for a RasPi2 to run smoothly? Is my code just too inefficient? Is OpenCV an issue here? Ultimately, I want to make this a standalone gallery installation with a monitor and RasPi subtly attached, so I don't have to run it off an expensive laptop left alone in a gallery space.
Answers
Do you see an improvement in speed when you reduce the size of the image being processed? That shouldn't affect the algorithm. Additionally, I notice the only things that move are the pupils and the green square, so I would suggest drawing all the static elements of your sketch into a single graphics buffer (PGraphics) in setup(). In draw(), you draw that buffer and then draw the pupils on top of it, something like the sketch below. However, I don't think you will gain much, as the problem resides in the library; processing a smaller image should nevertheless show some improvement.
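A minimal sketch of the buffer idea (the coordinates here are placeholders, not your actual face geometry):

PGraphics face;  // everything that never moves: background, eyeballs, mouth

void setup() {
  size(960, 720);
  face = createGraphics(width, height);
  face.beginDraw();
  face.background(255, 255, 0);        // yellow backdrop
  face.fill(255);
  face.ellipse(300, 300, 195, 195);    // static left eyeball
  face.ellipse(600, 300, 195, 195);    // static right eyeball
  // ...mouth, cheeks, etc. go here as well...
  face.endDraw();
}

void draw() {
  image(face, 0, 0);                   // one image() call instead of many shape calls
  fill(0);
  ellipse(mouseX, mouseY, 60, 60);     // the moving pupils get drawn on top each frame
}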
Kf
The Pi 3 should also be quite a bit faster than the 2.
Hi and thanks for the helpful feedback. I did borrow a RasPi3 from a friend, but did not notice any better performance. Next, I altered the program so it no longer scales up the processed image; I brought the capture down to 320x240, also without any noticeable improvement in performance.
Here's that trimmed code for testing:
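In essence, the only changes from the Pi sketch above are the capture/detect size and removing the scale() call, i.e. something like:

video = new GLCapture(this, devices[0], 320, 240);
opencv = new OpenCV(this, 320, 240);
opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
// ...and no scale(scaleFactor) in draw(), since nothing gets scaled up anymore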
I didn't draw any elements into a buffer because I don't have experience with that, so I'll need time to teach myself. Thanks though.
To test my theory that OpenCV for Processing on a RasPi is just not yet ready for primetime, I took gohai's Simple Capture example (which uses the GLCapture library for the Pi) and added OpenCV face detection to it. I noticed the same lag on the live video processing: it makes it look like the capture has slowed down to around 1 frame per second (just an estimate). Try it out if you are curious:
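Roughly, the test boils down to this (the detection rectangles are drawn straight onto the live preview):

import gab.opencv.*;
import gohai.glvideo.*;
import java.awt.*;

GLCapture video;
OpenCV opencv;

void setup() {
  size(320, 240, P2D);
  String[] devices = GLCapture.list();
  printArray(devices);
  video = new GLCapture(this, devices[0], 320, 240);
  opencv = new OpenCV(this, 320, 240);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  video.start();
}

void draw() {
  background(0);
  if (video.available()) {
    video.read();                      // grab the latest frame when one is ready
  }
  image(video, 0, 0);                  // show the live preview
  opencv.loadImage(video);
  Rectangle[] faces = opencv.detect(); // this is the slow part on the Pi
  noFill();
  stroke(0, 255, 0);
  strokeWeight(2);
  for (int i = 0; i < faces.length; i++) {
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
  }
}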
Of course, gohai's Simple Capture without the OpenCV detection is as smooth and "real time" as you'd need (many thanks, BTW!), so that's not the issue. This might just be a dead end with Processing 3 + OpenCV + RPi. Thankful to anyone who can lead me out of the woods here!