I solved my earlier question about getting an audio file to play when someone comes within a certain range of a Kinect. What I would like to do now is play all the files in a folder, one after another. It's basically a scripted dialogue that I need to trigger when someone walks up to a certain spot. I can't seem to get the Minim AudioPlayer to play all the files; it seems geared towards playing just one.
Has anyone any idea how to do this? I was thinking of maybe using loadSample() and specifying a sample for each file, but there are 42 of them, so it would get messy. Ideally I would like to loop through the files until the end, or until the person moves out of range.
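Something like this is roughly what I'm imagining, just as a rough sketch (I'm assuming the files can simply be named 1.wav through 42.wav in the data folder so they can be loaded in a loop instead of 42 separate calls; players and current are just names I've made up):

import ddf.minim.*;

Minim minim;
AudioPlayer[] players = new AudioPlayer[42];
int current = 0;

void setup() {
  size(640, 480);
  minim = new Minim(this);
  // load each numbered file once, up front
  for (int i = 0; i < players.length; i++) {
    players[i] = minim.loadFile((i + 1) + ".wav", 2048);
  }
}

This is what I actually have at the moment (it only ever plays 1.wav once):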
void play() {
  int[] depth = kinect.getRawDepth();
  tracker.track();
  int t = tracker.getThreshold();
  minim = new Minim(this);
  // load a file, give the AudioPlayer buffers that are 2048 samples long
  player = minim.loadFile("1.wav", 2048);
  // play the file
  player.play();
}
void stop() {
  tracker.quit();
  // always close Minim audio classes when you are done with them
  player.close();
  // always stop Minim before exiting
  minim.stop();
  super.stop();
}
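Continuing the sketch above, I imagine draw() ending up something like this (personInRange is a hypothetical flag I would set from the tracker each frame, see the idea after the class below; isPlaying(), position(), pause() and rewind() are my guess at how to tell when a file has finished and how to reset the dialogue):

boolean personInRange = false;  // hypothetical: would be set from the tracker

void draw() {
  tracker.track();
  // personInRange = ... (based on the tracker, see below)

  if (personInRange) {
    // nothing is playing: either we haven't started yet, or the current file just finished
    if (!players[current].isPlaying()) {
      if (players[current].position() == 0) {
        // first time through (or after a reset): start the current file
        players[current].play();
      } else if (current < players.length - 1) {
        // the current file has finished, move on to the next one
        current++;
        players[current].play();
      }
      // when the last file finishes, nothing more is queued
    }
  } else {
    // person walked away: stop and rewind so the dialogue starts from the top next time
    players[current].pause();
    for (int i = 0; i <= current; i++) {
      players[i].rewind();
    }
    current = 0;
  }
}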
What would be the best way to do this? I am relatively new to Processing, so any help is appreciated. I added in the audio-playing stuff. The code also uses this Kinect tracker class:
class KinectTracker {
  // Size of kinect image
  int kw = 640;
  int kh = 480;
  int threshold = 745;

  // Raw and smoothed tracked locations
  PVector loc;
  PVector lerpedLoc;
  // Raw depth data and the image we build from it
  int[] depth;
  PImage display;

  KinectTracker() {
    // We could skip processing the grayscale image for efficiency
    // but this example is just demonstrating everything
    kinect.processDepthImage(true);
    display = createImage(kw, kh, PConstants.RGB);
    loc = new PVector(0, 0);
    lerpedLoc = new PVector(0, 0);
  }
  void track() {
    // Get the raw depth as array of integers
    depth = kinect.getRawDepth();
    // Being overly cautious here
    if (depth == null) return;

    float sumX = 0;
    float sumY = 0;
    float count = 0;
    for (int x = 0; x < kw; x++) {
      for (int y = 0; y < kh; y++) {
        // Mirroring the image
        int offset = kw - x - 1 + y * kw;
        // Grabbing the raw depth
        int rawDepth = depth[offset];
        // Testing against threshold
        if (rawDepth < threshold) {
          sumX += x;
          sumY += y;
          count++;
        }
      }
    }

    // As long as we found something
    if (count != 0) {
      loc = new PVector(sumX / count, sumY / count);
    }

    // Interpolating the location, doing it arbitrarily for now
    lerpedLoc.x = PApplet.lerp(lerpedLoc.x, loc.x, 0.3f);
    lerpedLoc.y = PApplet.lerp(lerpedLoc.y, loc.y, 0.3f);
  }
  void display() {
    // Grab the depth image from the kinect so out-of-threshold pixels can be drawn as-is
    PImage img = kinect.getDepthImage();

    // Being overly cautious here
    if (depth == null || img == null) return;

    // Going to rewrite the depth image to show which pixels are in threshold
    // A lot of this is redundant, but this is just for demonstration purposes
    display.loadPixels();
    for (int x = 0; x < kw; x++) {
      for (int y = 0; y < kh; y++) {
        // mirroring image
        int offset = kw - x - 1 + y * kw;
        // Raw depth
        int rawDepth = depth[offset];
        int pix = x + y * display.width;
        if (rawDepth < threshold) {
          // A red color instead
          display.pixels[pix] = color(150, 50, 50);
        }
        else {
          display.pixels[pix] = img.pixels[offset];
        }
      }
    }
    display.updatePixels();
  }

  // getThreshold(), quit(), etc. are unchanged from the example, so I've left them out here
}
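To set that personInRange flag, I was thinking the tracker could simply remember whether the last track() call saw anything under the threshold. Something like this rough addition (found and isSomeoneClose() are names I'm making up):

  // new field on KinectTracker
  boolean found = false;

  // at the end of track(), right after the counting loops:
  //   found = (count != 0);

  boolean isSomeoneClose() {
    return found;
  }

and then in draw() it would just be personInRange = tracker.isSomeoneClose();. Does that sound like a reasonable way to structure it?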