
  • How to use web links in processing sketch?

    Please paste code and format it by indenting four spaces, or by highlighting it and pressing Ctrl-o -- don't post code screenshots. As it stands, I can't cut-paste and edit your code.

    Use saveStrings(): https://processing.org/reference/saveStrings_.html

    Inside the for loop, call saveStrings once for each line. Name the file something: for starters, you could name the file str(i)+".txt" -- so 1.txt, 2.txt, 3.txt -- or you could use split() and name it with part of the string value, drum.txt, snare.txt etc.
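    A minimal sketch of the split() naming idea, assuming the lines are already loaded into a String[] called lines (a made-up name):

    for (int i = 0; i < lines.length; i++) {
      String[] parts = split(lines[i], "/");                  // break the URL on slashes
      String name = split(parts[parts.length - 1], ".")[0];   // "kick" from ".../kick.mp3"
      saveStrings(name + ".txt", new String[] { lines[i] });  // one file per line
    }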

  • How to use web links in processing sketch?

    Maybe don't start with XML -- start with a plain old text file with one URL per line:

    http://someserver.com/kick.mp3
    http://someserver.com/snare.mp3
    

    Now you need to load this over the Internet into your sketch, then have your sketch write each line out to a file. To start, you can just call those files 1.txt, 2.txt, etc. -- once you get that working, you can add tab-separated labels and so on.

    The next step is the loading process.
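    loadStrings() accepts a URL, so the loading step can be as small as this sketch (the address below is just a placeholder):

    String[] lines = loadStrings("http://someserver.com/sounds.txt");
    for (int i = 0; i < lines.length; i++) {
      saveStrings(str(i + 1) + ".txt", new String[] { lines[i] });  // 1.txt, 2.txt, ...
    }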

  • How to use web links in processing sketch?

    Oh, haha, my HTML example actually got processed in that comment. I didn't think about that. Let's try commenting it out in the spirit of Processing:

    <?xml version="1.0" encoding="UTF-8"?>

    //

    // //<a href="http://...../kick.mp3>Kick //

    // //<a href="http://...../snare.mp3>Snare //

    //

    Edit: Nope, that didn't work all the way either.

  • How to use web links in processing sketch?

    Oh, that sounds like it could be it. I have absolutely no knowledge of what you just suggested, haha. I have googled and youtubed for some hours now, and I kind of get what you mean and what it's about. But I can't find any info for dummies on how to even get started. Feel very free to correct me if I am wrong here. My guess at how the XML could look for this:

    <?xml version="1.0" encoding="UTF-8"?>

    Kick

    Snare

    But there is probably more to it than that. Are you by any chance able to explain this process, or maybe point me towards something that explains it from absolute scratch? If so, you will forever be my number 1 MVP.

  • Processing-3D shapes

    import ddf.minim.*;
    import peasy.*;
    import peasy.org.apache.commons.math.*;
    import peasy.org.apache.commons.math.geometry.*;

    Minim minim;
    AudioSample kick;
    AudioSample snare;

    PeasyCam cam;

    float kickHeight = 100;

    void setup() {
      size(600, 600, P3D);
      cam = new PeasyCam(this, 500);
      minim = new Minim(this);
      kick = minim.loadSample("tragoudi1.mp3", 1024); // buffer size
    }

    void draw() {
      background(0);
      fill(200);
      lights();

      // sphere of points
      float r = 200;
      int total = 50;
      for (int i = 0; i < total; i++) {
        float lon = map(i, 0, total, -PI, PI);
        for (int j = 0; j < total; j++) {
          float lat = map(j, 0, total, -HALF_PI, HALF_PI);
          float x = r * sin(lon) * cos(lat);
          float y = r * sin(lon) * sin(lat);
          float z = r * cos(lon);
          stroke(255);
          point(x, y, z);
        }
      }

      // waveform of the loaded sample, scaled by kickHeight
      for (int i = 0; i < kick.bufferSize() - 1; i++) {
        float x1 = map(i, 0, kick.bufferSize(), 0, width);
        float x2 = map(i + 1, 0, kick.bufferSize(), 0, width);
        float y1 = kick.mix.get(i) * kickHeight;
        float y2 = kick.mix.get(i + 1) * kickHeight;
        stroke(255, 85, 0);
        line(x1, 50 - kick.mix.get(i) * kickHeight, x2, 50 - kick.mix.get(i + 1) * kickHeight);
        rect(x1, y1, 20, 40);
      }
    }

    void keyPressed() {
      if (key == 'k') kick.trigger();
    }

  • flip video capture horizontally

    Hello! I have been trying to solve this but I haven't been able to. That has discouraged me, and I haven't done much in the last months because of it. Can someone help me? I guess this shouldn't be difficult: I want to flip the image capture horizontally.

    Thanks!

    import processing.video.*;
    import ddf.minim.*;

    Capture video;

    // Minim and the audio samples triggered by the boxes below
    Minim minim;
    AudioSample kick, snare, chick, chack, chuck, check, tac, tuc, tic, toc;

    float boxSize = 80;
    float boxSize2 = 40;
    float boxX, boxY;
    float boxX1, boxY1;
    float boxX2, boxY2;
    float boxX3, boxY3;
    float boxX4, boxY4;
    float boxX5, boxY5;
    float boxX6, boxY6;
    float boxX7, boxY7;
    float boxX8, boxY8;
    float boxX9, boxY9;

    void setup() {
      size(1024, 768);

      boxX = width/2;
      boxY = height/2;
      boxX1 = 230;  boxY1 = 300;
      boxX2 = 220;  boxY2 = 500;
      boxX3 = 600;  boxY3 = 50;
      boxX4 = 700;  boxY4 = 600;
      boxX5 = 900;  boxY5 = 200;
      boxX6 = 170;  boxY6 = 120;
      boxX7 = 280;  boxY7 = 120;
      boxX8 = 700;  boxY8 = 400;
      boxX9 = 700;  boxY9 = 300;

      rectMode(RADIUS);

      video = new Capture(this, width, height);
      video.start();

      noStroke();
      smooth();

      minim = new Minim(this);

      kick = minim.loadSample("notes 1.wav", 512); // filename, buffer size
      snare = minim.loadSample("notes 2.wav", 512);
      if (snare == null) println("Didn't get snare!");
      chick = minim.loadSample("notes 3.wav", 512);
      if (chick == null) println("Didn't get chick!");
      chack = minim.loadSample("notes 4.wav", 512);
      if (chack == null) println("Didn't get chack!");
      chuck = minim.loadSample("notes 5.wav", 512);
      if (chuck == null) println("Didn't get chuck!");
      check = minim.loadSample("notes 6.wav", 512);
      if (check == null) println("Didn't get check!");
      tac = minim.loadSample("notes 7.wav", 512);
      if (tac == null) println("Didn't get tac!");
      tuc = minim.loadSample("notes 8.wav", 512);
      if (tuc == null) println("Didn't get tuc!");
      tic = minim.loadSample("notes 9.wav", 512);
      if (tic == null) println("Didn't get tic!");
      toc = minim.loadSample("notes 10.wav", 512);
      if (toc == null) println("Didn't get toc!");
    }

    // Uses the default video input, see the reference if this causes an error

    void draw() {
      if (video.available()) {
        video.read();

    image(video, 0, 0, width, height); // Draw the webcam video onto the screen
    int brightestX = 0; // X-coordinate of the brightest video pixel
    int brightestY = 0; // Y-coordinate of the brightest video pixel
    float brightestValue = 0; // Brightness of the brightest video pixel
    // Search for the brightest pixel: For each row of pixels in the video image and
    // for each pixel in the yth row, compute each pixel's index in the video
    video.loadPixels();
    int index = 0;
    for (int y = 0; y < video.height; y++) {
    
      for (int x = 0; x < video.width; x++) {
        // Get the color stored in the pixel
        int pixelValue = video.pixels[index];
        // Determine the brightness of the pixel
        float pixelBrightness = brightness(pixelValue);
        // If that value is brighter than any previous, then store the
        // brightness of that pixel, as well as its (x,y) location
        if (pixelBrightness > brightestValue) {
          brightestValue = pixelBrightness;
          brightestY = y;
          brightestX = x;
    
        }
        index++;
      }
    }
    // Draw a large, yellow circle at the brightest pixel
    

    if (brightestX > boxX-boxSize && brightestX < boxX+boxSize && brightestY > boxY-boxSize && brightestY < boxY+boxSize) snare.trigger();
    if (brightestX > boxX-boxSize && brightestX < boxX+boxSize && brightestY > boxY-boxSize && brightestY < boxY+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX, boxY, boxSize, boxSize);

    noFill();
    if (brightestX > boxX1-boxSize && brightestX < boxX1+boxSize && brightestY > boxY1-boxSize && brightestY < boxY1+boxSize) kick.trigger();
    if (brightestX > boxX1-boxSize && brightestX < boxX1+boxSize && brightestY > boxY1-boxSize && brightestY < boxY1+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX1, boxY1, boxSize, boxSize);

    noFill();
    if (brightestX > boxX2-boxSize && brightestX < boxX2+boxSize && brightestY > boxY2-boxSize && brightestY < boxY2+boxSize) chick.trigger();
    if (brightestX > boxX2-boxSize && brightestX < boxX2+boxSize && brightestY > boxY2-boxSize && brightestY < boxY2+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX2, boxY2, boxSize, boxSize);

    noFill();
    if (brightestX > boxX3-boxSize && brightestX < boxX3+boxSize && brightestY > boxY3-boxSize && brightestY < boxY3+boxSize) chack.trigger();
    if (brightestX > boxX3-boxSize && brightestX < boxX3+boxSize && brightestY > boxY3-boxSize && brightestY < boxY3+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX3, boxY3, boxSize, boxSize);

    noFill();
    if (brightestX > boxX4-boxSize && brightestX < boxX4+boxSize && brightestY > boxY4-boxSize && brightestY < boxY4+boxSize) chuck.trigger();
    if (brightestX > boxX4-boxSize && brightestX < boxX4+boxSize && brightestY > boxY4-boxSize && brightestY < boxY4+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX4, boxY4, boxSize, boxSize);

    noFill();
    if (brightestX > boxX5-boxSize && brightestX < boxX5+boxSize && brightestY > boxY5-boxSize && brightestY < boxY5+boxSize) check.trigger();
    if (brightestX > boxX5-boxSize && brightestX < boxX5+boxSize && brightestY > boxY5-boxSize && brightestY < boxY5+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX5, boxY5, boxSize, boxSize);

    noFill();
    if (brightestX > boxX6-boxSize && brightestX < boxX6+boxSize && brightestY > boxY6-boxSize && brightestY < boxY6+boxSize) tac.trigger();
    if (brightestX > boxX6-boxSize && brightestX < boxX6+boxSize && brightestY > boxY6-boxSize && brightestY < boxY6+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX6, boxY6, boxSize2, boxSize2);

    noFill();
    if (brightestX > boxX7-boxSize && brightestX < boxX7+boxSize && brightestY > boxY7-boxSize && brightestY < boxY7+boxSize) tuc.trigger();
    if (brightestX > boxX7-boxSize && brightestX < boxX7+boxSize && brightestY > boxY7-boxSize && brightestY < boxY7+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX7, boxY7, boxSize2, boxSize2);

    noFill();
    if (brightestX > boxX8-boxSize && brightestX < boxX8+boxSize && brightestY > boxY8-boxSize && brightestY < boxY8+boxSize) tic.trigger();
    if (brightestX > boxX8-boxSize && brightestX < boxX8+boxSize && brightestY > boxY8-boxSize && brightestY < boxY8+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX8, boxY8, boxSize2, boxSize2);

    noFill();
    if (brightestX > boxX9-boxSize && brightestX < boxX9+boxSize && brightestY > boxY9-boxSize && brightestY < boxY9+boxSize) toc.trigger();
    if (brightestX > boxX9-boxSize && brightestX < boxX9+boxSize && brightestY > boxY9-boxSize && brightestY < boxY9+boxSize) { fill(200, 200); } else { noFill(); }
    rect(boxX9, boxY9, boxSize2, boxSize2);

    ellipse(brightestX, brightestY, 20, 30);
                  stroke(100);
                  noFill();
    

    }

    }
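    For the mirroring itself, one common approach is to flip the x axis just for the image() call rather than changing the pixel data. A minimal sketch of that idea (the mirroredX value is only an illustration for any coordinates you draw on top of the flipped image):

    // draw the webcam image mirrored horizontally
    pushMatrix();
    translate(width, 0);   // move the origin to the right edge
    scale(-1, 1);          // flip the x axis
    image(video, 0, 0, width, height);
    popMatrix();

    // the pixel search still runs on the un-flipped video.pixels,
    // so mirror any x coordinate before drawing with it:
    int mirroredX = width - 1 - brightestX;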

  • Exporting PDF using a timer in an animation

    I've managed to export a .tiff from a timer from this animation code below. But I cannot work out how to do the same with a PDF. I've looked up the PDF ref... any help with code appreciated. This uses the hype framework (http://www.hypeframework.org/)

    // sketch set up
    
    import hype.*;
    import hype.extended.layout.HGridLayout;
    import hype.extended.behavior.HTimer;
    
    
    int           myStageW         = 700;
    int           myStageH         = 700;
    
    color         clrBG            = #FF3300;
    
    String        pathDATA         = "../../data/";
    
    // ************************************************************************************************************
    
    // sound set up
    
    import ddf.minim.*;
    import ddf.minim.analysis.*;
    
    Minim           minim;
    AudioInput      myAudio; // to get the audio input - either a line in or a mic etc
    //AudioPlayer       myAudio; // this one is to get audio in through the mp3...
    FFT             myAudioFFT;
    
    boolean         showVisualizer      = false;    
    
    int             myAudioRange        = 11;
    int             myAudioMax          = 100;
    
    float           myAudioAmp          = 40.0;
    float           myAudioIndex        = 0.2;
    float           myAudioIndexAmp     = myAudioIndex;
    float           myAudioIndexStep    = 0.35;
    
    float[]         myAudioData         = new float[myAudioRange];
    
    // ************************************************************************************************************
    
    HDrawablePool   pool;
    int             poolCols            = 5;
    int             poolRows            = 5;
    int             poolDepth           = 5;
    
    //              picks are made from the ranIndex
    //                                      v bass (orange)             v snare (blue)
    color[]         palette             = { #000000, #666666, #666666,  #FFFFFF,   #666666, #666666, #666666, #666666, #666666, #666666, #666666 };
    
    int             rotateNumX          = 0;
    int             rotateNumY          = 0;
    int             rotateNumZ          = 0;
    
    // ************************************************************************************************************
    
    HTimer timer;
    
    // ************************************************************************************************************
    
    void settings() {
        size(myStageW,myStageH,P3D);
    }
    
    void setup() {
        H.init(this).background(clrBG).use3D(true).autoClear(true);
    
        minim   = new Minim(this);
        myAudio = minim.getLineIn(Minim.MONO);
    
        // myAudio = minim.loadFile(pathDATA + "HECQ_With_Angels_Trifonic_Remix.wav");
        // myAudio.loop();
    
        myAudioFFT = new FFT(myAudio.bufferSize(), myAudio.sampleRate()); // buffer size = 1024 and sample rate = 44100.0 come from the audio input if not specified
        myAudioFFT.linAverages(myAudioRange); // averages the spectrum by grouping the frequency bands linearly into 'myAudioRange' bands
        //myAudioFFT.window(FFT.GAUSS);
    
        pool = new HDrawablePool(poolCols*poolRows*poolDepth);
        pool.autoAddToStage()
            .add(new HSphere())
            .layout (new HGridLayout().startX(-300).startY(-300).startZ(-300).spacing(150, 150, 150).rows(poolRows).cols(poolCols))
            .onCreate ( 
                new HCallback() {
                    public void run(Object obj) {
                        int ranIndex = (int)random(myAudioRange);
    
                        HSphere d = (HSphere) obj;
                        d
                            .size(10)
                            .strokeWeight(0)
                            .noStroke()
                            .fill(palette[ranIndex], 225)
                            .anchorAt(H.CENTER)
                            .extras( new HBundle().num("i", ranIndex) )
                        ;
                    }
                }
            )
            .requestAll()
        ;
    
    // ************************************************************************************************************
    
    // Timer - prints a frame every second (numCycles is in there to stop it printing tonnes whilst testing)
    
        timer = new HTimer()
            .numCycles(10)
            .interval(1000)
            .callback(
                new HCallback() {
                    public void run(Object obj){
                        //Output
                        saveFrame("../frames/######.tif");
                    }
                }
    
            )
        ;
    
    }
    
    void draw() {
        myAudioFFT.forward(myAudio.mix);
        myAudioDataUpdate();
    
        lights();
        sphereDetail(20);
    
        // do the rotation in the push/pop matrix
    
        pushMatrix();
            translate(width/2, height/2, -900);
    
            rotateX( map(rotateNumX, 0, myAudioMax, -(TWO_PI / 20), TWO_PI / 20) );
            rotateY( map(rotateNumY, 0, myAudioMax, -(TWO_PI / 20), TWO_PI / 20) );
            rotateZ( map(rotateNumZ, 0, myAudioMax, -(TWO_PI / 20), TWO_PI / 20) ) ;
    
            int fftRotateX = (int)map(myAudioData[0], 0, myAudioMax, -1, 20); // controlled by [0] = bass
            int fftRotateY = (int)map(myAudioData[3], 0, myAudioMax, -1, 20); // controlled by [3] = snare
            int fftRotateZ = (int)map(myAudioData[5], 0, myAudioMax, 1, -20); // controlled by [5] = can choose a random one here, incl. another snare/bass
    
            rotateNumX += fftRotateX;
            rotateNumY += fftRotateY;
            rotateNumZ += fftRotateZ;
    
            H.drawStage();
    
            // draw the translucent box
    
            //stroke(#333333); fill(#242424, 50); box(600); noStroke(); noFill();
    
        popMatrix();
    
        for (HDrawable d : pool) {
            HBundle tempExtra = d.extras();
            int i = (int)tempExtra.num("i");
    
            int fftSize;
            if (i==0)       fftSize = (int)map(myAudioData[i], 0, myAudioMax, -200, 350); // bass
            else if (i==3)  fftSize = (int)map(myAudioData[i], 0, myAudioMax, 50, 350); // snare
            else            fftSize = (int)map(myAudioData[i], 0, myAudioMax, 2, 150); // snare
    
            d.size(fftSize);
        }
    
        //CALL TO WIDGET SHOULD BE THE LAST ITEM IN THE DRAW() FUNCTION, SO IT APPEARS ABOVE ALL OTHER VISUAL ASSETS
        if (showVisualizer) myAudioDataWidget();
    }
    
    void myAudioDataUpdate() {
        for(int i = 0; i < myAudioRange; ++i) {
            float tempIndexAvg = (myAudioFFT.getAvg(i) * myAudioAmp) * myAudioIndexAmp;
            float tempIndexCon = constrain(tempIndexAvg, 0, myAudioMax);
            myAudioData[i] = tempIndexCon;
            myAudioIndexAmp += myAudioIndexStep;
        }
        myAudioIndexAmp = myAudioIndex;
    
    }
    
    // ************************************************************************************************************
    
    void myAudioDataWidget() {
        noLights();
        hint(DISABLE_DEPTH_TEST);
        noStroke(); fill(0, 200); rect(0, height-112, width, 102);
    
            for(int i = 0; i < myAudioRange; ++i) {
                fill(#CCCCCC);
                rect(10 + (i * 5), (height - myAudioData[i]) - 11, 4, myAudioData[i]);
            }
            hint(ENABLE_DEPTH_TEST);
    }
    
    // ************************************************************************************************************
    
    
    void stop(){
        myAudio.close();
        minim.stop();
        super.stop();
    }
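    For the PDF itself, one route worth trying (a sketch, not tested against Hype) is the processing.pdf library. Since this sketch uses P3D, the geometry has to be captured with beginRaw()/endRaw() rather than beginRecord(), and the capture has to bracket the drawing inside draw(), so the timer callback would only raise a flag instead of calling saveFrame():

    import processing.pdf.*;

    boolean savePDF = false;  // set to true from the HTimer callback

    void draw() {
        if (savePDF) beginRaw(PDF, "../frames/" + nf(frameCount, 6) + ".pdf");

        // ... existing drawing code ...

        if (savePDF) {
            endRaw();
            savePDF = false;
        }
    }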
    
  • How can i imitate this project? (draw rectangles with audio beat)

    Please don't post duplicates. You have two identical questions and both have answers. Which one do we reply to? Confusing.

    Anyway, I'm deleting the other, but here's akenaton's post from there:

    akenaton 5:43PM edited 5:45PM

    You can do that (and much more...) using Minim + FFT + BeatDetect, indexing variables from the volume you can get. Frequency (let us say) indexes the width or height of your rects, volume drives the other dimension, and beat detection (let us say snare, but it can be kick or hat) drives the timeline.
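    A rough sketch of that idea with Minim (the variable names and the mappings are made up, not akenaton's code):

    import ddf.minim.*;
    import ddf.minim.analysis.*;

    Minim minim;
    AudioInput in;
    FFT fft;
    BeatDetect beat;

    void setup() {
      size(600, 200);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.STEREO, 1024);
      fft = new FFT(in.bufferSize(), in.sampleRate());
      beat = new BeatDetect(in.bufferSize(), in.sampleRate());
    }

    void draw() {
      fft.forward(in.mix);
      beat.detect(in.mix);
      if (beat.isSnare()) {                              // the beat drives the timeline
        float w = map(in.mix.level(), 0, 1, 10, 200);    // volume -> width
        float h = map(fft.getBand(10), 0, 50, 10, 180);  // one frequency band -> height
        rect(random(width - w), height - h, w, h);       // rects accumulate over time
      }
    }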

  • Recognize a snare drum sound ?

    This is more of a technical question about drums and their frequencies. Maybe some other forum goer can provide some of their experience. I will suggest exploring previous posts:

    https://forum.processing.org/two/search?Search=drums
    https://forum.processing.org/two/search?Search=snare

    Or do some research on Google... I am sure somebody has had the same problem. I wouldn't use the low part of the spectrum to identify the instrument, as almost anything can trigger that part of the spectrum. However, if you are only ever playing one instrument, then instead of a frequency trigger you could just focus on amplitude.

    Kf
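    For the amplitude route, a minimal sketch using Minim's line-in level (the threshold value is a guess you would tune by ear):

    import ddf.minim.*;

    Minim minim;
    AudioInput in;
    float threshold = 0.3;  // tune for your mic and drum

    void setup() {
      size(200, 200);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.MONO, 512);
    }

    void draw() {
      background(0);
      if (in.mix.level() > threshold) {  // loud enough: count it as a hit
        fill(255, 0, 0);
        ellipse(width/2, height/2, 100, 100);
      }
    }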

  • Recognize a snare drum sound ?

    Hello guys,

    I am doing a project where I am displaying information on a snare drum only (not a whole drum kit!). My goal is to give simple feedback to the user, using a microphone as the detection tool. My projection looks like a clock, and when the main hand reaches a point, the player should hit the drum. I would like to detect whether he hits the drum on time, and whether he hits the drum at all. My projection is in yellow on a black background; it will highlight in blue if he didn't hit on time and in red if he didn't hit the drum (or if he hits anything that is not the drum).

    I tried Minim with BeatDetect, but unfortunately isSnare() doesn't work as it should... It returns true even if I am whistling or hitting my desk.

    Should I go for an FFT analysis? I am open to any advice... I don't really know where to start. Should I record the drum sound and then test in real time whether the hit matches the recorded sound?

    Here is my code for now:

    import ddf.minim.*;
    import ddf.minim.analysis.*;
    
    Minim minim;
    BeatDetect beat;
    AudioInput in;
    
    
    float eRadius;
    
    void setup()
    {
      size(200, 200, P3D);
      minim = new Minim(this);
    
      in = minim.getLineIn(Minim.STEREO, 512);
      // a beat detection object in FREQ_ENERGY mode, based on my mic input
      beat = new BeatDetect(in.bufferSize(), in.sampleRate());
    
    }
    
    void draw()
    {
      background(0);
      beat.detect(in.mix);
    
      println("isSnare: "+ beat.isSnare());
      //println("isHat: "+ beat.isHat());
      //println("isKick: " + beat.isKick());
      //println("isOnset: "+ beat.isOnset());
    
    }
    
  • Error message : java.lang.RuntimeException: java.lang.IllegalAccessError: tried to access class

    Hi, I have code trying to light LEDs with Processing, but I keep getting an error message:

    java.lang.RuntimeException: java.lang.IllegalAccessError: tried to access class processing.core.PApplet$RegisteredMethods from class cc.arduino.Arduino$SerialProxy
        at processing.opengl.PSurfaceJOGL$2.run(PSurfaceJOGL.java:461)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.IllegalAccessError: tried to access class processing.core.PApplet$RegisteredMethods from class cc.arduino.Arduino$SerialProxy
        at cc.arduino.Arduino$SerialProxy.<init>(Arduino.java:99)
        at cc.arduino.Arduino.<init>(Arduino.java:148)
        at BeatWrite.setup(BeatWrite.java:56)
        at processing.core.PApplet.handleDraw(PApplet.java:2387)
        at processing.opengl.PSurfaceJOGL$DrawListener.display(PSurfaceJOGL.java:871)
        at jogamp.opengl.GLDrawableHelper.displayImpl(GLDrawableHelper.java:692)
        at jogamp.opengl.GLDrawableHelper.display(GLDrawableHelper.java:674)
        at jogamp.opengl.GLAutoDrawableBase$2.run(GLAutoDrawableBase.java:443)
        at jogamp.opengl.GLDrawableHelper.invokeGLImpl(GLDrawableHelper.java:1293)
        at jogamp.opengl.GLDrawableHelper.invokeGL(GLDrawableHelper.java:1147)
        at com.jogamp.newt.opengl.GLWindow.display(GLWindow.java:759)
        at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:81)
        at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:452)
        at com.jogamp.opengl.util.FPSAnimator$MainTask.run(FPSAnimator.java:178)
        at java.util.TimerThread.mainLoop(Timer.java:555)
        at java.util.TimerThread.run(Timer.java:505)


    import processing.serial.*;
    import ddf.minim.*;
    import ddf.minim.analysis.*;
    import cc.arduino.*;

    Minim minim;
    AudioPlayer song;
    BeatDetect beat;
    BeatListener bl;
    Arduino arduino;

    int ledPin = 12;  // LED connected to digital pin 12
    int ledPin2 = 8;  // LED connected to digital pin 8
    int ledPin3 = 2;  // LED connected to digital pin 2

    float kickSize, snareSize, hatSize;

    void setup() {
      size(512, 200, P3D);

      minim = new Minim(this);
      arduino = new Arduino(this, Arduino.list()[1], 57600);

      song = minim.loadFile("envremix.mp3");
      song.play();
      // a beat detection object in FREQ_ENERGY mode that expects buffers the length of
      // the song's buffer size and samples captured at the song's sample rate
      beat = new BeatDetect(song.bufferSize(), song.sampleRate());
      // set the sensitivity in milliseconds: after a beat has been detected, the algorithm
      // waits this long before allowing another beat to be reported. Use this to dampen the
      // algorithm if it gives too many false positives. The default is 10, which is essentially
      // no damping; a negative value is reported as an error and reset to 10.
      beat.setSensitivity(100);
      kickSize = snareSize = hatSize = 16;
      // make a new beat listener, so that we won't miss any buffers for the analysis
      bl = new BeatListener(beat, song);
      textFont(createFont("Helvetica", 16));
      textAlign(CENTER);

      arduino.pinMode(ledPin, Arduino.OUTPUT);
      arduino.pinMode(ledPin2, Arduino.OUTPUT);
      arduino.pinMode(ledPin3, Arduino.OUTPUT);
    }

    void draw() {
      background(0);
      fill(255);
      if (beat.isKick()) {
        arduino.digitalWrite(ledPin, Arduino.HIGH);   // set the LED on
        kickSize = 32;
      }
      if (beat.isSnare()) {
        arduino.digitalWrite(ledPin2, Arduino.HIGH);  // set the LED on
        snareSize = 32;
      }
      if (beat.isHat()) {
        arduino.digitalWrite(ledPin3, Arduino.HIGH);  // set the LED on
        hatSize = 32;
      }
      arduino.digitalWrite(ledPin, Arduino.LOW);      // set the LEDs off
      arduino.digitalWrite(ledPin2, Arduino.LOW);
      arduino.digitalWrite(ledPin3, Arduino.LOW);
      textSize(kickSize);
      text("KICK", width/4, height/2);
      textSize(snareSize);
      text("SNARE", width/2, height/2);
      textSize(hatSize);
      text("HAT", 3*width/4, height/2);
      kickSize = constrain(kickSize * 0.95, 16, 32);
      snareSize = constrain(snareSize * 0.95, 16, 32);
      hatSize = constrain(hatSize * 0.95, 16, 32);
    }

    void stop() {
      // always close Minim audio classes when you are finished with them
      song.close();
      // always stop Minim before exiting
      minim.stop();
      // this closes the sketch
      super.stop();
    }

  • What do beat and FFT mean?

    FFT stands for fast Fourier transform. It computes the frequency-domain representation of a wave.

    Mathematically, any wave, like a sound wave, is a sum of a lot of sine waves, with different frequencies. A frequency is basically how fast the wave oscillates.

    So, a low frequency is a really long wave and in sound, this would be a low bass tone. High frequencies are short waves (the peaks are close together) and they sound high.

    An FFT will tell you exactly how much of each frequency is present in a wave. You give it a frequency and it will tell you the amplitude (magnitude) corresponding to that frequency. If the amplitude is 0, then that frequency is not present in the wave.

    As you can see in this animation, the green wave is composed of the sum of two "pure" sine waves, one at 5 Hz (meaning it goes up and down in 1/5 = 0.2 seconds) and another at 15 Hz (one period in 1/15 = 0.066 seconds, the wave with the smaller amplitude). So, if you were to plug the green wave into an FFT algorithm, it would detect these two waves and give you the blue spectrum.

    Music is a lot of frequencies together. This can result in quite a complex spectrum.

    In this image there is a peak at 1 kHz, which tells you there is a pronounced tone at that frequency. It's not a single sharp line either; the tone is a bit "smeared out". This is because the tone is not a pure sine wave, and because of the "window". Only a pure sine wave would give a single line in the spectrum, but for that you would need to measure infinitely long. Nobody has time for that, so you measure only for a selectable amount of time: that is the window. Since stopping the measurement prematurely essentially modifies the sine wave, it distorts the spectrum a bit, and you get a distribution around the original frequency of the sine instead of a line. That's advanced stuff already.

    Mathematically speaking, a beat arises when you add two sine waves together:

    They add up and at some places they cancel each other out and at others they amplify each other. The periodic structure that arises is called "beat".

    But in music, a beat usually refers to the basic unit of time in a measure. If you have a 4/4 time signature, you have four beats in one measure.

    Usually such a beat (in popular music) is accompanied with percussion such as a bass drum or snare drum hit. A bass drum would show up in the lower frequencies of the spectrum as a sudden onset in amplitude. An algorithm can detect this to try and find the beats in a piece of music.
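    A small Minim sketch of the spectrum idea, drawing one bar per FFT band of the line-in (the bar scaling is arbitrary); a bass drum hit shows up as a sudden jump in the leftmost bars:

    import ddf.minim.*;
    import ddf.minim.analysis.*;

    Minim minim;
    AudioInput in;
    FFT fft;

    void setup() {
      size(512, 200);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.STEREO, 1024);
      fft = new FFT(in.bufferSize(), in.sampleRate());
    }

    void draw() {
      background(0);
      fft.forward(in.mix);  // compute the frequency domain of the current buffer
      stroke(255);
      for (int i = 0; i < fft.specSize(); i++) {
        line(i, height, i, height - fft.getBand(i) * 4);  // low frequencies on the left
      }
    }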

  • Multiple songs + multiple beatlisteners

    Hi everyone! I'm learning Processing for the first time for one of my modules at university. The task is essentially to create something with Processing, and I made a small equalizer reminiscent of the old Windows Media Player with the weird effects. I keep getting a NullPointerException on line 38 when I add multiple BeatListeners, since they are part of what randomizes the graphics.

    I managed to get the multiple files playing, but with only one BeatListener working, so when I changed to the other equalizer screen it didn't react to the beat.

    I've tried multiple things, but can't find the solution. Can anyone spot where the problem lies?

    (The initial problem is fixed, I was missing the .mp3 in the song description)

    I am now back to my other problem after managing to get multiple songs to load yesterday: the BeatListener only "listens and beats" to one of the songs; in the current code it only does it for playlist[2]. When I choose the other songs, the color stops changing and the graphics stop animating.

    Is there any way to make the BeatListener read all three songs?

    Thank you!

    Main window

                                       import ddf.minim.*;
                        import ddf.minim.analysis.*;
    
                        Minim minim;
                        AudioPlayer[] playlist;
                        AudioPlayer player;
                        AudioInput input;
                        BeatDetect beat;
                        BeatListener bl;
    
                        float unit, theta;
                        float kickSize, snareSize, hatSize;
                        float r = random(0, 500);
                        int pageNumber = 1;
                        int num = 50, frames=180;
                        int radius = 40; 
                        int sides = 10;
                        int depth = 0; 
                        PWindow win;
    
                        public void settings() {
                          size(500, 500);
                        }
    
                        void setup() {
                          win = new PWindow();
                          minim = new Minim(this);
                          unit = width/num; 
                          noStroke();
    
    
                          playlist = new AudioPlayer [3];
                          playlist[0] = minim.loadFile("the_trees.mp3");
                          playlist[1] = minim.loadFile("marcus_kellis_theme.mp3");
                          playlist[2] = minim.loadFile("eternal_snowflake.mp3");
    
    
                          beat = new BeatDetect(playlist[1].bufferSize(), playlist[1].sampleRate());
                          beat.setSensitivity(50);  
                          kickSize = snareSize = hatSize = 1600;
                          bl = new BeatListener(beat, playlist[1]);
    
                            beat = new BeatDetect(playlist[0].bufferSize(), playlist[0].sampleRate());
                          beat.setSensitivity(50);  
                          kickSize = snareSize = hatSize = 1600;
                          bl = new BeatListener(beat, playlist[0]);
    
                          beat = new BeatDetect(playlist[2].bufferSize(), playlist[2].sampleRate());
                          beat.setSensitivity(50);  
                          kickSize = snareSize = hatSize = 1600;
                          bl = new BeatListener(beat, playlist[2]);
    
    
    
                        }
                        void draw() {
    
                          if (keyPressed) {
                            if (key == 'j')
                              playlist[0].play();
                            else
                              playlist[0].pause();
    
                            if (keyPressed) 
                              if (key == 'k')
                                playlist[1].play();
                              else
                                playlist[1].pause();
    
                            if (keyPressed) 
                              if (key == 'l')
                                playlist[2].play();
                              else
                                playlist[2].pause();
                          }
                          if (pageNumber == 1) {
                            background(0);
                            for (int y=0; y<=num; y++) {
                              for (int x=0; x<=num; x++) {
    
    
                                if (keyPressed) {
                                  if (key == 'r')
                                    fill(255, 0, 0); //sphere colour
                                }
    
                                if (keyPressed) {
                                  if (key == 'g')
                                    fill(0, 255, 0);
                                }  
    
                                if (keyPressed) {
                                  if (key == 'b')
                                    fill(0, 0, 255);
                                }
                                if (beat.isHat()) {
                                  fill(random(0, 255), random(0, 255), random(0, 255));
                                  radius = int(random(1, 100)); // randomly choose radius for sphere
                                  depth = int(random(1, 100)); // randomly set forward/backward translation distance of sphere
                                  // test if beat is snare
    
                                  if (beat.isSnare()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(10, 200)); // randomly choose radius for sphere
                                    depth = int(random(10, 100)); // randomly set forward/backward translation distance of sphere
                                  }  
                                  // test id beat is Hat
                                  if (beat.isKick()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(10, 500)); // randomly choose radius for sphere
                                    depth = int(random(25, 220)); // randomly set forward/backward translation distance of sphere
                                  }
                                }
    
                                pushMatrix();
                                float distance = dist(width/2, height/2, x*unit, y*unit);
                                float offSet = map(distance, 56, sqrt(sq(width/2)+sq(height/17)), 0, TWO_PI);
                                float sz = map(sin(theta+distance), 1, 10, unit*.2, unit*.1);
                                float angle = atan2(y*unit-height/11, x*unit-width/2);
                                float px = map(sin(angle+offSet+theta), 110, 56, 18, 159);
                                translate(251, -115);
                                rotate(random(271)); //rotates to a random angle
                                ellipse(random(-209, 730), random(-767, 105), -2, random(-59, 35)); //godcode
                                popMatrix();
                              }
    
    
                              theta -= TWO_PI/frames;
                            }
                          }
    
    
                          if (pageNumber == 2) {
                            background(0);
                            for (int y=50; y<=num; y++) {
                              for (int x=41; x<=num; x++) {
    
                                if (keyPressed) {
                                  if (key == 'r')
                                    fill(255, 0, 0); //sphere colour
                                }
    
                                if (keyPressed) {
                                  if (key == 'g')
                                    fill(0, 255, 0);
                                }  
    
                                if (keyPressed) {
                                  if (key == 'b')
                                    fill(0, 0, 255);
                                }
                                if (beat.isHat()) {
                                  fill(random(0, 255), random(0, 255), random(0, 255));
                                  radius = int(random(1, 50)); // randomly choose radius for sphere
                                  depth = int(random(1, 505)); // randomly set forward/backward translation distance of sphere
                                  // test if beat is snare
    
                                  if (beat.isSnare()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(0, 500)); // randomly choose radius for sphere
                                    depth = int(random(0, 100)); // randomly set forward/backward translation distance of sphere
                                  }  
                                  // test id beat is Hat
                                  if (beat.isKick()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(1, 500)); // randomly choose radius for sphere
                                    depth = int(random(23, 220)); // randomly set forward/backward translation distance of sphere
                                  }
                                }
    
                                pushMatrix();
                                float distance = dist(width/2, height/2, x*unit, y*unit);
                                float offSet = map(distance, 53, sqrt(sq(width/2)+sq(height/17)), 0, TWO_PI);
                                float sz = map(sin(theta+distance), 1, 10, unit*.2, unit*.1);
                                float angle = atan2(y*unit-height/2, x*unit-width/2);
                                float px = map(sin(angle+offSet+theta), 2, 29, 50, 152);
                                translate(245, 245);
                                rotate(random(59)); //rotates to a random angle
                                ellipse(random(102, 74), random(31, 20), 4, random(599, 234)); //godcode
                                popMatrix();
                              }
    
    
                              theta -= TWO_PI/frames;
                            }
                          }
    
                          if (pageNumber == 3) {
                            background(0);
                            for (int y=0; y<=num; y++) {
                              for (int x=0; x<=num; x++) {
    
                                if (keyPressed) {
                                  if (key == 'r')
                                    fill(255, 0, 0); //sphere colour
                                }
    
                                if (keyPressed) {
                                  if (key == 'g')
                                    fill(0, 255, 0);
                                }  
    
                                if (keyPressed) {
                                  if (key == 'b')
                                    fill(0, 0, 255);
                                }
                                if (beat.isHat()) {
                                  fill(random(0, 255), random(0, 255), random(0, 255));
                                  radius = int(random(1, 500)); // randomly choose radius for sphere
                                  depth = int(random(1, 500)); // randomly set forward/backward translation distance of sphere
                                  // test if beat is snare
    
                                  if (beat.isSnare()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(1, 500)); // randomly choose radius for sphere
                                    depth = int(random(1, 500)); // randomly set forward/backward translation distance of sphere
                                  }  
                                  // test id beat is Hat
                                  if (beat.isKick()) {
                                    fill(random(0, 255), random(0, 255), random(0, 255));
                                    radius = int(random(1, 500)); // randomly choose radius for sphere
                                    depth = int(random(1, 500)); // randomly set forward/backward translation distance of sphere
                                  }
                                }
    
                                pushMatrix();
                                float distance = dist(width/2, height/2, x*unit, y*unit);
                                float offSet = map(distance, 10, sqrt(sq(width/2)+sq(height/2)), 0, TWO_PI);
                                float sz = map(sin(theta+distance), 1, 10, unit*.2, unit*.1);
                                float angle = atan2(y*unit-height/2, x*unit-width/2);
                                float px = map(sin(angle+offSet+theta), -1, 10, 0, 100);
                                translate(x*unit, y*unit);
                                rotate(r*angle); //rotates to a random angle
                                ellipse(px, 0, sz, sz); //godcode
                                popMatrix();
                              }
    
    
                              theta -= TWO_PI/frames;
                            }
                          }
                        }
    
    
                        void keyPressed() {
                          if (key == '1') {
                            redraw();
                            pageNumber = 1;
                          }
                          if (key == '2') {
                            redraw();
                            pageNumber = 2;
                          }
                          if (key == '3') {
                            redraw();
                            pageNumber = 3;
                          }
                        }
                        void mousePressed() {
                          println("mousePressed in primary window");
                        }  
    

    BeatListener window

                class BeatListener implements AudioListener
                {
                  private BeatDetect beat;
                  private AudioPlayer source;
    
                  BeatListener(BeatDetect beat, AudioPlayer source)
                  {
                    this.source = source;
                    this.source.addListener(this);
                    this.beat = beat;
                  }
    
                  void samples(float[] samps)
                  {
                    beat.detect(source.mix);
                  }
    
                  void samples(float[] sampsL, float[] sampsR)
                  {
                    beat.detect(source.mix);
    
                  }
                }
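    A sketch of one way to give every song its own detector, so whichever track is playing drives the graphics. The names beats, listeners, setupBeatListeners, and anyKick are made up; it assumes the playlist array from the code above:

    BeatDetect[] beats = new BeatDetect[3];
    BeatListener[] listeners = new BeatListener[3];

    // call this once in setup(), after the playlist is loaded
    void setupBeatListeners() {
      for (int i = 0; i < playlist.length; i++) {
        beats[i] = new BeatDetect(playlist[i].bufferSize(), playlist[i].sampleRate());
        beats[i].setSensitivity(50);
        listeners[i] = new BeatListener(beats[i], playlist[i]);
      }
    }

    // in draw(), ask the detector of whichever song is currently playing, e.g.:
    boolean anyKick() {
      for (int i = 0; i < beats.length; i++) {
        if (playlist[i].isPlaying() && beats[i].isKick()) return true;
      }
      return false;
    }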
    
  • faster trigger audio clips for 'drum roll'
    Hi Everybody
    My professor (Kate Hartman) has generously helped my team and me develop the following code, trying to get a single 'drum tap' thing going. However, there's an issue.

    I'd like to be able to play the sound quickly, like a drum roll, but right now I'm unable to. I believe I can currently only re-trigger the sound once it has stopped playing, so I can't play it more than once every quarter second or so.

    I think I may need to add 'restart' for the play mode somewhere, but I'm not sure where. Can anyone assist me or point me in the right direction?
    
    Much much thanks.
    
    .aS.
    
    
    
    function preload() {
      soundFormats('wav');
      sample = loadSound('snare.wav');
    }
    
    function setup() {
      createCanvas(displayWidth, displayHeight);
      //fullscreen(true);
      sample.play();
    }
    
    function draw() {
    
      if (sample.isPlaying()) {
        background('white');
        fill('blue');
        stroke('none');
        ellipse(windowWidth / 2, windowHeight / 2, 175, 175);
      } else {
        background('blue');
        noStroke();
        fill('white');
        ellipse(windowWidth / 2, windowHeight / 2, 175, 175);
      }
    
    }
    
    function mousePressed() {
      if (sample.isPlaying()) {
        //sample.pause();
        sample.playMode('restart');
        //do nothing!
      } else {
        sample.play();
    }
    }
    
    
  • Creating an array of audio in beads.

    Then the sounds class...

    class Sound {
      //object variables 
      float xPos, yPos;
      String soundName;
    
    
      //Audio Files
      int numSamples = 0;
      int sampleWith = 0;
      String [] sourceFile;
    
    
      String sourceFile1, sourceFile2, sourceFile3, sourceFile4, sourceFile5, sourceFile6, 
      sourceFile7, sourceFile8, sourceFile9, sourceFile10, sourceFile11, sourceFile12, sourceFile13, 
      sourceFile14, sourceFile15, sourceFile16; // this will hold the path to our audio file
    
      SamplePlayer sp1; 
      SamplePlayer sp2;
      SamplePlayer sp3;
      SamplePlayer sp4;
      SamplePlayer sp5;
      SamplePlayer sp6;
      SamplePlayer sp7;
      SamplePlayer sp8;
      SamplePlayer sp9;
      SamplePlayer sp10;
      SamplePlayer sp11; 
      SamplePlayer sp12;
      SamplePlayer sp13;
      SamplePlayer sp14;
      SamplePlayer sp15;
      SamplePlayer sp16;
    
    
      //Gain
      Gain g;
      Glide gainValue;
      //Reverb
      Reverb r; // our Reverberation unit generator
    
    
        Sound (float _Xpos, float _Ypos, String _SoundName) {
        xPos = _Xpos;
        yPos = _Ypos;
        soundName = "";
      }
    
    
    
      void audioSetup() {
        sourceFile1 = dataPath("clap-1.mp3");
        sourceFile2 = dataPath("snare-1.mp3");
        sourceFile3 = dataPath("mid-2.mp3");
        sourceFile4 = dataPath("crash-4.mp3");
        sourceFile5 = dataPath("crash-1.mp3");
        sourceFile6 = dataPath("clap-2.mp3");
        sourceFile7 = dataPath("mid-1.mp3");
        sourceFile8 = dataPath("clap-4.mp3");
        sourceFile9 = dataPath("crash-3.mp3");
        sourceFile10 = dataPath("clap-3.mp3");
        sourceFile11 = dataPath("mid-3.mp3");
        sourceFile12 = dataPath("mid-4.mp3");
        sourceFile13 = dataPath("crash-2.mp3");
        sourceFile14 = dataPath("snare-2.mp3");
        sourceFile15 = dataPath("snare-3.mp3");
        sourceFile16 = dataPath("snare-4.mp3");
    
        try {  
          sp1 = new SamplePlayer(ac, new Sample(sourceFile1));
          sp2 = new SamplePlayer(ac, new Sample(sourceFile2));
          sp3 = new SamplePlayer(ac, new Sample(sourceFile3));
          sp4 = new SamplePlayer(ac, new Sample(sourceFile4));
          sp5 = new SamplePlayer(ac, new Sample(sourceFile5));
          sp6 = new SamplePlayer(ac, new Sample(sourceFile6));
          sp7 = new SamplePlayer(ac, new Sample(sourceFile7));
          sp8 = new SamplePlayer(ac, new Sample(sourceFile8));
          sp9 = new SamplePlayer(ac, new Sample(sourceFile9));
          sp10 = new SamplePlayer(ac, new Sample(sourceFile10));
          sp11 = new SamplePlayer(ac, new Sample(sourceFile11));
          sp12 = new SamplePlayer(ac, new Sample(sourceFile12));
          sp13 = new SamplePlayer(ac, new Sample(sourceFile13));
          sp14 = new SamplePlayer(ac, new Sample(sourceFile14));
          sp15 = new SamplePlayer(ac, new Sample(sourceFile15));
          sp16 = new SamplePlayer(ac, new Sample(sourceFile16));
        }
        catch(Exception e)
        {
          println("Exception while at_ting to load sample!");
          e.printStackTrace(); 
          exit();
        }
        sp1.setKillOnEnd(false);
        sp2.setKillOnEnd(false);
        sp3.setKillOnEnd(false);
        sp4.setKillOnEnd(false);
        sp5.setKillOnEnd(false);
        sp6.setKillOnEnd(false);
        sp7.setKillOnEnd(false);
        sp8.setKillOnEnd(false);
        sp9.setKillOnEnd(false);
        sp10.setKillOnEnd(false);
        sp11.setKillOnEnd(false);
        sp12.setKillOnEnd(false);
        sp13.setKillOnEnd(false);
        sp14.setKillOnEnd(false);
        sp15.setKillOnEnd(false);
        sp16.setKillOnEnd(false);
    
        Gain g = new Gain(ac, 2, 0.2);
        g.addInput(sp1);
        g.addInput(sp2);
        g.addInput(sp3);
        g.addInput(sp4);
        g.addInput(sp5);
        g.addInput(sp6);
        g.addInput(sp7);
        g.addInput(sp8);
        g.addInput(sp9);
        g.addInput(sp10);
        g.addInput(sp11);
        g.addInput(sp12);
        g.addInput(sp13);
        g.addInput(sp14);
        g.addInput(sp15);
        g.addInput(sp16);
    
        ac.out.addInput(g);
      }
    
      void audioPlay() {
    
        if (play1) {
          sp1.start(); // ply the audio file
          sp1.setToLoopStart();
          play1 = false;
        }
        if (play2) {
          sp2.start(); // ply the audio file
          sp2.setToLoopStart();
          play2 = false;
        }
        if (play3) {
          sp3.start(); // ply the audio file
          sp3.setToLoopStart();
          play3 = false;
        }
        if (play4) {
          sp4.start(); // ply the audio file
          sp4.setToLoopStart();
          play4 = false;
        }
        if (play5) {
          sp5.start(); // ply the audio file
          sp5.setToLoopStart();
          play5 = false;
        }
        if (play6) {
          sp6.start(); // ply the audio file
          sp6.setToLoopStart();
          play6 = false;
        }
        if (play7) {
          sp7.start(); // ply the audio file
          sp7.setToLoopStart();
          play7 = false;
        }
        if (play8) {
          sp8.start(); // ply the audio file
          sp8.setToLoopStart();
          play8 = false;
        }
        if (play9) {
          sp9.start(); // ply the audio file
          sp9.setToLoopStart();
          play9 = false;
        }
        if (play10) {
          sp10.start(); // ply the audio file
          sp10.setToLoopStart();
          play10 = false;
        }
        if (play11) {
          sp11.start(); // ply the audio file
          sp11.setToLoopStart();
          play11 = false;
        }
        if (play12) {
          sp12.start(); // ply the audio file
          sp12.setToLoopStart();
          play12 = false;
        }
        if (play13) {
          sp13.start(); // ply the audio file
          sp13.setToLoopStart();
          play13 = false;
        }
        if (play14) {
          sp14.start(); // ply the audio file
          sp14.setToLoopStart();
          play14 = false;
        }
        if (play15) {
          sp15.start(); // ply the audio file
          sp15.setToLoopStart();
          play15 = false;
        }
        if (play16) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play16 = false;
        }
        if (play17) {
          sp13.start(); // ply the audio file
          sp13.setToLoopStart();
          play17 = false;
        }
        if (play18) {
          sp14.start(); // ply the audio file
          sp14.setToLoopStart();
          play18 = false;
        }
        if (play19) {
          sp15.start(); // ply the audio file
          sp15.setToLoopStart();
          play19 = false;
        }
        if (play20) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play20 = false;
        }
        if (play21) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play21 = false;
        }
        if (play22) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play22 = false;
        }
        if (play23) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play23 = false;
        }
        if (play24) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play24 = false;
        }
        if (play25) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play25 = false;
        }
        if (play26) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play26 = false;
        }
        if (play27) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play27 = false;
        }
        if (play28) {
          sp16.start(); // ply the audio file
          sp16.setToLoopStart();
          play28 = false;
        }
      }
    }
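    The sixteen near-identical SamplePlayer fields above are a good candidate for arrays. A sketch of that idea, assuming the AudioContext ac from the main sketch (files, players, audioSetupArrays, and play are made-up names):

    String[] files = { "clap-1.mp3", "snare-1.mp3", "mid-2.mp3" /* ...and so on */ };
    SamplePlayer[] players = new SamplePlayer[files.length];

    void audioSetupArrays() {
      Gain g = new Gain(ac, 2, 0.2);
      for (int i = 0; i < files.length; i++) {
        try {
          players[i] = new SamplePlayer(ac, new Sample(dataPath(files[i])));
        }
        catch (Exception e) {
          println("Exception while attempting to load sample!");
          e.printStackTrace();
        }
        players[i].setKillOnEnd(false);
        g.addInput(players[i]);
      }
      ac.out.addInput(g);
    }

    // a single method then replaces the play1..play28 flags:
    void play(int i) {
      players[i].setToLoopStart();
      players[i].start();
    }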
    
  • Creating an array of audio in beads.

    Hi, I'm trying to create an array of audio files in Beads so that they play independently when each SMT zone is touched. The problem I am having is how to initialize and link a different audio file to each zone.

    The hope is to be able to use a Kinect to play audio files independently within zones. Please let me know if you have any ideas for improving the code, or for how to use arrays to reduce the footwork of the code.

    Thanks

        //Set for 6' height from ground 15'6" distance
    
        import vialab.SMT.*;
    
        import beads.*;
    
        //Array of Zones
        int arrayNum = 56;
        TouchtouchZone[] touchZone = new TouchtouchZone[arrayNum];
        boolean isEven;
        boolean isOdd;
    
        //Array of Sounds
        int soundsNum = 23;
        Sound[] sounds = new Sound[soundsNum];
        AudioContext ac;
    
        //Playback Variables
        int playDelay = 0;
        boolean canPlay = true;
    
        //Grid Line Variables 
        int x, y;
        float areaW, areaH;
        int num, num2;
        float spacing, spacing2;
        float xVal, yVal;
    
        //Aspect Ratio Variables
        Float aspectX, aspectY;
        int buttonSize;
        float edgeSpace;
        int gridStartX, gridStartY;
    
        //Setup Display INFO
        boolean window_fullscreen = false;
        int window_width = 1200;
        int window_height = 800;
        int window_halfWidth;
        int window_halfHeight;
        int fps_limit = 60;
    
        void setup() {
    
          //Display setup
          if ( window_fullscreen) {
            window_width = displayWidth;
            window_height = displayHeight;
          }
          window_halfWidth = window_width / 2;
          window_halfHeight = window_height / 2;
          //processing window setup
          frameRate( fps_limit);
          size( window_width, window_height, SMT.RENDERER);
          SMT.init( this, TouchSource.AUTOMATIC);
    
          //Audio Setup
          ac = new AudioContext(); 
    
          //Aspect Ratio Variables
          edgeSpace = 20.0;
          //  aspectX = 640.0;
          //  aspectY = 480.0;
          aspectX = (float)width;
          aspectY = (float)height - edgeSpace*2;
    
          //  THIS IS NOT PERFECT YET
          gridStartX = (int)(aspectX-aspectY)/2;
          gridStartY = (int)edgeSpace;
    
          //Grid Line Variables 
          //  X
          num = 8;
          areaW=aspectY;
          //  Y
          num2 = 7;
          areaH=aspectY;
          buttonSize = (int)(aspectY/num2);
    
          //  ARRAY MAKES BUTTONS IN CHECKERBOARD STYLE
          for (int i=0; i<arrayNum; i++) {
            if ((i<=7)&&(i % 2 == 0)) {
              x = gridStartX+(i*buttonSize);
              y = gridStartY;
            }
            if (((i>7)&&(i<=15))&&(i % 2 != 0)) {
              x = gridStartX+((i-8)*buttonSize);
              y = gridStartY+buttonSize;
            }
            if (((i>15)&&(i<=23))&&(i % 2 == 0)) {
              x = gridStartX+((i-16)*buttonSize);
              y = gridStartY+(2*buttonSize);
            }
            if (((i>23)&&(i<=31))&&(i % 2 != 0)) {
              x = gridStartX+((i-24)*buttonSize);
              y = gridStartY+(3*buttonSize);
            }
            if (((i>31)&&(i<=39))&&(i % 2 == 0)) {
              x = gridStartX+((i-32)*buttonSize);
              y = gridStartY+(4*buttonSize);
            }
            if (((i>39)&&(i<=47))&&(i % 2 != 0)) {
              x = gridStartX+((i-40)*buttonSize);
              y = gridStartY+(5*buttonSize);
            }
            if (((i>47)&&(i<=56))&&(i % 2 == 0)) {
              x = gridStartX+((i-48)*buttonSize);
              y = gridStartY+(6*buttonSize);
            }
            touchZone[i] = new TouchtouchZone(x, y, buttonSize, buttonSize, 100, 100, 150, 200);
            SMT.add(touchZone[i]);
          }
    
          //  ARRAY INITIALIZES AUDIO SETUP  
          for (int i=0; i<soundsNum; i++) {
            sounds[i] = new Sound(0, 0);
            sounds[i].audioSetup();
          }
          ac.start();
        }
    
    
        void draw() { 
          background(0);
          fill(30);
    
          spacing = buttonSize;
    
          //  fill(255);
    
          playDelay++;
          if (playDelay >= 15) {
            canPlay = true;
          } else {
            canPlay = false;
          }
    
          text("Play Delay: "+playDelay, width-100, height-20);
    
          //FOR GRID DEBUGGING
          //  rect(0, gridStartY, aspectX, aspectY);
          for (int m = 0; m < num; m++) {
            for (int n = 0; n < num2; n++) {
              stroke(125);
              strokeWeight(3);
              x = gridStartX+(m*buttonSize); 
              y = gridStartY+(n*buttonSize);
              rect(x, y, buttonSize, buttonSize);
            }
          }
        }
    
        public void drawFrameRate() {
          float fps = this.frameRate;
          String fps_text = String.format( "fps: %.0f", fps);
          pushStyle();
          fill( 240, 240, 240, 180);
          textAlign( RIGHT, TOP);
          textMode( MODEL);
          textSize( 32);
          text( fps_text, window_width - 10, 10);
          popStyle();
        }
    
        private class touchZone extends Zone {
          protected int colour_red;
          protected int colour_green;
          protected int colour_blue;
          protected int colour_alpha;
          public touchZone( int x, int y, int width, int height, 
          int colour_red, int colour_green, int colour_blue, int colour_alpha) {
            super( x, y, width, height);
            this.colour_red = colour_red;
            this.colour_green = colour_green;
            this.colour_blue = colour_blue;
            this.colour_alpha = colour_alpha;
            this.setCaptureTouches( false);
          }
          //draw method
          public void draw() {
            pushStyle();
            noStroke();
            fill( colour_red, colour_green, colour_blue, colour_alpha);
            rect( 0, 0, this.getWidth(), this.getHeight(), 5);
            popStyle();
          }
          public void touch() {
          }
          //we define the press method so that touches will be unassigned when they 'exit' the zone.
          public void press( Touch touch) {
          }
        }
    
        private class TouchtouchZone extends touchZone {
          public TouchtouchZone( int x, int y, int width, int height, 
          int colour_red, int colour_green, int colour_blue, int colour_alpha) {
            super( x, y, width, height, 
            colour_red, colour_green, colour_blue, colour_alpha);
          }
          //touch method
          public void touch() {
            Touch touch = getActiveTouch( 0);
            touch.setTint(
            colour_red, colour_green, colour_blue, colour_alpha);
    
            for (int i=0; i<soundsNum; i++) {
              if (canPlay) {
                sounds[i].audioPlay();
              }
              playDelay=0;
            }
          }
        }
    
        static final boolean isEven(int n) {
          return (n & 1) == 0;
        }
    
        static final boolean isOdd(int n) {
          return !isEven(n);
        }
    
      class Sound {
      //object variables 
      float xPos, yPos;
    
      String sourceFile1, sourceFile2, sourceFile3, sourceFile4, sourceFile5, sourceFile6, 
      sourceFile7, sourceFile8, sourceFile9, sourceFile10, sourceFile11, sourceFile12, sourceFile13, 
      sourceFile14, sourceFile15, sourceFile16; // this will hold the path to our audio file
      SamplePlayer sp1; 
      SamplePlayer sp2;
      SamplePlayer sp3;
      SamplePlayer sp4;
      SamplePlayer sp5;
      SamplePlayer sp6;
      SamplePlayer sp7;
      SamplePlayer sp8;
      SamplePlayer sp9;
      SamplePlayer sp10;
      SamplePlayer sp11; 
      SamplePlayer sp12;
      SamplePlayer sp13;
      SamplePlayer sp14;
      SamplePlayer sp15;
      SamplePlayer sp16;
    
      //Gain
      Gain g;
      Glide gainValue;
      //Reverb
      Reverb r; // our Reverberation unit generator
    
        Sound (float _Xpos, float _Ypos) {
        xPos = _Xpos;
        yPos = _Ypos;
      } 
    
      void audioSetup() {
        //    sourceFile = dataPath("0.mp3");
        //    sourceFile = dataPath("1.mp3");
        sourceFile1 = dataPath("clap-1.mp3");
        sourceFile2 = dataPath("snare-1.mp3");
        sourceFile3 = dataPath("clap-3.mp3");
        sourceFile4 = dataPath("clap-4.mp3");
        sourceFile5 = dataPath("crash-1.mp3");
        sourceFile6 = dataPath("crash-2.mp3");
        sourceFile7 = dataPath("crash-3.mp3");
        sourceFile8 = dataPath("crash-4.mp3");
        sourceFile9 = dataPath("mid-1.mp3");
        sourceFile10 = dataPath("mid-2.mp3");
        sourceFile11 = dataPath("mid-3.mp3");
        sourceFile12 = dataPath("mid-4.mp3");
        sourceFile13 = dataPath("clap-2.mp3");
        sourceFile14 = dataPath("snare-2.mp3");
        sourceFile15 = dataPath("snare-3.mp3");
        sourceFile16 = dataPath("snare-4.mp3");
    
        try {  
          sp1 = new SamplePlayer(ac, new Sample(sourceFile1));
          sp2 = new SamplePlayer(ac, new Sample(sourceFile2));
          //      sp3 = new SamplePlayer(ac, new Sample(sourceFile3));
          //      sp4 = new SamplePlayer(ac, new Sample(sourceFile4));
          //      sp5 = new SamplePlayer(ac, new Sample(sourceFile5));
          //      sp6 = new SamplePlayer(ac, new Sample(sourceFile6));
          //      sp7 = new SamplePlayer(ac, new Sample(sourceFile7));
          //      sp8 = new SamplePlayer(ac, new Sample(sourceFile8));
          //      sp9 = new SamplePlayer(ac, new Sample(sourceFile9));
          //      sp10 = new SamplePlayer(ac, new Sample(sourceFile10));
          //      sp12 = new SamplePlayer(ac, new Sample(sourceFile11));
          //      sp12 = new SamplePlayer(ac, new Sample(sourceFile12));
          //      sp13 = new SamplePlayer(ac, new Sample(sourceFile13));
          //      sp14 = new SamplePlayer(ac, new Sample(sourceFile14));
          //      sp15 = new SamplePlayer(ac, new Sample(sourceFile15));
          //      sp16 = new SamplePlayer(ac, new Sample(sourceFile16));
        }
        catch(Exception e)
        {
          println("Exception while at_ting to load sample!");
          e.printStackTrace(); 
          exit();
        }
        sp1.setKillOnEnd(false);
        sp2.setKillOnEnd(false);
        //    sp3.setKillOnEnd(false);
        //    sp4.setKillOnEnd(false);
        //    sp5.setKillOnEnd(false);
        //    sp6.setKillOnEnd(false);
        //    sp7.setKillOnEnd(false);
        //    sp8.setKillOnEnd(false);
        //    sp9.setKillOnEnd(false);
        //    sp10.setKillOnEnd(false);
        //    sp11.setKillOnEnd(false);
        //    sp12.setKillOnEnd(false);
        //    sp13.setKillOnEnd(false);
        //    sp14.setKillOnEnd(false);
        //    sp15.setKillOnEnd(false);
        //    sp16.setKillOnEnd(false);
    
        g = new Gain(ac, 2, 0.2); // assign the class-level Gain (the original redeclared a local g here)
        g.addInput(sp1);
        g.addInput(sp2);
        //    g.addInput(sp3);
        //    g.addInput(sp4);
        //    g.addInput(sp5);
        //    g.addInput(sp6);
        //    g.addInput(sp7);
        //    g.addInput(sp8);
        //    g.addInput(sp9);
        //    g.addInput(sp10);
        //    g.addInput(sp11);
        //    g.addInput(sp12);
        //    g.addInput(sp13);
        //    g.addInput(sp14);
        //    g.addInput(sp15);
        //    g.addInput(sp16);
    
        ac.out.addInput(g);
      }
    
      void audioPlay() {
    
        for (int i=0; i<soundsNum; i++) {
          if (i == 1) {
        sp1.start(); // play the audio file
            sp1.setToLoopStart();
          }
          if (i == 2) {
        sp2.start(); // play the audio file
            sp2.setToLoopStart();
          }
        }
      }
    
      void textDisplay() {
      }
    }
    

    Screen Shot 2016-07-14 at 11.20.12 AM
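
    One way to cut the footwork (a minimal sketch, not tested against this project, assuming the same beads objects as above: the global AudioContext ac and a Gain g that feeds ac.out): keep the file names in a String[] and the players in a SamplePlayer[], so the sixteen sourceFileN / spN fields and their per-player setup lines collapse into one loop.

        // hypothetical array-based loading; file names copied from the sketch above
        String[] sampleNames = {
          "clap-1.mp3", "snare-1.mp3", "clap-3.mp3", "clap-4.mp3",
          "crash-1.mp3", "crash-2.mp3", "crash-3.mp3", "crash-4.mp3",
          "mid-1.mp3", "mid-2.mp3", "mid-3.mp3", "mid-4.mp3",
          "clap-2.mp3", "snare-2.mp3", "snare-3.mp3", "snare-4.mp3"
        };
        SamplePlayer[] players = new SamplePlayer[sampleNames.length];

        void loadSamples() {
          for (int i = 0; i < sampleNames.length; i++) {
            try {
              players[i] = new SamplePlayer(ac, new Sample(dataPath(sampleNames[i])));
              players[i].setKillOnEnd(false);  // keep the player alive so it can be retriggered
              g.addInput(players[i]);
            }
            catch (Exception e) {
              println("Could not load " + sampleNames[i]);
              e.printStackTrace();
            }
          }
        }

        // trigger one sample by index instead of a chain of if (i == n) blocks
        void playSample(int index) {
          players[index].setToLoopStart();
          players[index].start();
        }

    The checkerboard placement can shrink the same way: with row = i / 8 and col = i % 8, the seven if blocks reduce to x = gridStartX + col*buttonSize and y = gridStartY + row*buttonSize, guarded by a single parity check on row + col.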

  • Simple openni and minim libraries

    Hey! I need help with my project. I'm on Windows and using a Kinect v1. The purpose is to trigger a sound every time a RightHand or a LeftHand touches one of the ellipses. That part is functional, but I can't add more sounds (I only have 2) without getting the error: "IllegalArgumentException: unsupported bit depth, use either 8 or 16".

    I'll share the code below; a small sketch of an alternative loading setup follows it. Any help is welcome. Thanks :)

    float theta1;
    float theta2;
    float yval;
    float xval;
    
    
    float raio = 50;
    float fila1 = 300;
    float fila2 = 600;
    
    float valA = 150;
    float valB = 300;
    float valC = 450;
    
    
    import ddf.minim.*;
    Minim minim;
    AudioSample player;
    AudioInput in;
    AudioRecorder recorder;
    
    
    import SimpleOpenNI.*;
    
    SimpleOpenNI  context;
    color[]       userClr = new color[]{ color(random(255), random(255), random(255)),
                                         color(random(255), random(255), random(255)),
                                         color(random(255), random(255), random(255)),
                                         color(random(255), random(255), random(255)),
                                         color(random(255), random(255), random(255))
    
                                       };
    //PVector com = new PVector();                                  
    //PVector com2d = new PVector();                                  
    
    void setup()
    {
      size(960,540);
    
      context = new SimpleOpenNI(this);
    
      context.setMirror(true);
    
      context.enableDepth();
    
      context.enableUser();
    
      background(255,255,255);
    
      stroke(0,255,0);
      strokeWeight(5);
      smooth(); 
    
    
    
    }
    
    void draw()
    {
      context.update();
      background(255,255,255);
       textSize(20);
    
        // if I move the ellipses further down, they only appear once the skeleton shows up... I can put the setup "title" here instead (it will always stay)...
     ellipse(fila1,valA,raio,raio); // ball 1
     ellipse(fila1,valB,raio,raio); // ball 2
     ellipse(fila1,valC,raio,raio); // ball 3
     ellipse(fila2,valA,raio,raio); // ball 4
     ellipse(fila2,valB,raio,raio); // ball 5
     ellipse(fila2,valC,raio,raio); // ball 6
    
      // draw the skeleton if it's available
      int[] userList = context.getUsers();
      for(int i=0;i<userList.length;i++)
      {
        if(context.isTrackingSkeleton(userList[i]))
        {
          stroke(userClr[ (userList[i] - 1) % userClr.length ] );
          drawSkeleton(userList[i]);
        }     
    
      }   
    }
    
    // draw the skeleton with the selected joints
    void drawSkeleton(int userId)
    {
      // to get the 3d joint data
      /*
      PVector jointPos = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_NECK,jointPos);
      println(jointPos);
      */
    
      PVector torso = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_TORSO,torso);
      PVector convertedTorso = new PVector();
      context.convertRealWorldToProjective(torso, convertedTorso);
    
    
      PVector rightHand = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_HAND,rightHand);
      PVector convertedRightHand = new PVector();
      context.convertRealWorldToProjective(rightHand, convertedRightHand);
      //float rightEllipseSize = map(convertedRightHand.z, 700, 2500,  50, 1);
      ellipse(convertedRightHand.x, convertedRightHand.y, 10, 10);
      //text("hand: " + convertedRightHand.x + " " + convertedRightHand.y, 10, 50);
    //  yval = -(convertedRightHand.y-height/2);
        xval = (convertedRightHand.x-convertedTorso.x);
      //yval = map(convertedRightHand.y,0,height,1,-1);
      //xval = map(convertedRightHand.x,0,width,1,-1);
    //  if (xval>=0){
    //  theta1 = acos(yval/sqrt(sq(xval)+sq(yval)));
    //  }
    //  else{
    //  theta1 = -acos(yval/sqrt(sq(xval)+sq(yval)));
    //  }
      theta1 = PVector.angleBetween(new PVector(convertedRightHand.x-convertedTorso.x,convertedRightHand.y-convertedTorso.y,0.0),new PVector(0,convertedTorso.y-height,0.0));
      if (xval<0){
        theta1*= -1;
      }
    
      PVector leftHand = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_HAND,leftHand);
      PVector convertedLeftHand = new PVector();
      context.convertRealWorldToProjective(leftHand, convertedLeftHand);
      //float leftEllipseSize = map(convertedLeftHand.z, 700, 2500,  50, 1);
      ellipse(convertedLeftHand.x, convertedLeftHand.y, 10, 10);
      //yval = -(convertedLeftHand.y-height/2);
        xval = (convertedLeftHand.x-convertedTorso.x);
      //yval = map(convertedLeftHand.y,0,height,1,-1);
      //xval = map(convertedLeftHand.x,0,width,1,-1);
      theta2 = PVector.angleBetween(new PVector(convertedLeftHand.x-convertedTorso.x,convertedLeftHand.y-convertedTorso.y,0.0),new PVector(0,convertedTorso.y-height,0.0));
      if (xval<0){
        theta2*= -1;
      }
    
        PVector head = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_HEAD,head);
      PVector convertedHead = new PVector();
      context.convertRealWorldToProjective(head, convertedHead);
      ellipse(convertedHead.x, convertedHead.y, 60, 60);
      fill(random(255), random(255), random(255));
    
     PVector leftFoot = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_FOOT,leftFoot);
      PVector convertedLeftFoot = new PVector();
      context.convertRealWorldToProjective(leftFoot, convertedLeftFoot);
    
      ellipse(convertedLeftFoot.x, convertedLeftFoot.y, 10, 10);
    
       PVector rightFoot = new PVector();
      context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_FOOT,rightFoot);
      PVector convertedRightFoot = new PVector();
      context.convertRealWorldToProjective(rightFoot, convertedRightFoot);
      ellipse(convertedRightFoot.x, convertedRightFoot.y, 10, 10);
    
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
      context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
      context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
      context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
      context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
      context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
      context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
      context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);
    
      context.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
      context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
      context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT); 
    
    
     // left hand touches ball 1
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila1,valA)<raio){
       println("Tocaste-me" + convertedRightHand.y);
     minim = new Minim(this);
      player = minim.loadSample("02.wav", 500);
      player.trigger();
    
     }
    
     // left hand touches ball 2
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila1,valB)<raio){
       println("LINDO" + convertedRightHand.y);
        minim = new Minim(this);
      player = minim.loadSample("01.wav", 500);
      player.trigger();
    
     }
    
     // left hand touches ball 3
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila1,valC)<raio){
       println("OPA" + convertedRightHand.y);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.STEREO,896);
      player = minim.loadSample("78.wav", 500);
      player.trigger();
     }
    
     // left hand touches ball 4
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila2,valA)<raio){
       println("Tocaste-me AGAIN" + convertedRightHand.y);
     }
    
     // left hand touches ball 5
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila2,valB)<raio){
       println("LINDO AGAIN" + convertedRightHand.y);
     }
    
     // left hand touches ball 6
     if(getDistancia(convertedRightHand.x,convertedRightHand.y,fila2,valC)<raio){
       println("OPA AGAIN" + convertedRightHand.y);
     }
     // right hand touches ball 1
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila1,valA)<raio){
       println("DIR TOCOU" + convertedLeftHand.y);
    
     }
    
     // right hand touches ball 2
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila1,valB)<raio){
       println("Tocou 2x" + convertedLeftHand.y);
     }
    
     // right hand touches ball 3
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila1,valC)<raio){
       println("E consegiu 3" + convertedLeftHand.y);
     }
    
     // right hand touches ball 4
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila2,valA)<raio){
       println("DIR TOCOU AGAIN" + convertedLeftHand.y);
     }
    
     // right hand touches ball 5
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila2,valB)<raio){
       println("Tocou 2x AGAIN" + convertedLeftHand.y);
     }
    
     // right hand touches ball 6
     if(getDistancia(convertedLeftHand.x,convertedLeftHand.y,fila2,valC)<raio){
       println("E consegiu 3 AGAIN" + convertedLeftHand.y);
     }
     // left foot touches ball 1
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila1,valA)<raio){
       println("Tocaste-me com o pé" + convertedRightFoot.y);
     }
    
     // left foot touches ball 2
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila1,valB)<raio){
       println("LINDO com o pé" + convertedRightFoot.y);
     }
    
     // left foot touches ball 3
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila1,valC)<raio){
       println("OPA com o pé" + convertedRightFoot.y);
     }
    
     // left foot touches ball 4
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila2,valA)<raio){
       println("Tocaste-me AGAIN com o pé" + convertedRightFoot.y);
     }
    
     // left foot touches ball 5
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila2,valB)<raio){
       println("LINDO AGAIN com o pé" + convertedRightFoot.y);
     }
    
     // left foot touches ball 6
     if(getDistancia(convertedRightFoot.x,convertedRightFoot.y,fila2,valC)<raio){
       println("OPA AGAIN com o pé" + convertedRightFoot.y);
     }
    
     // right foot touches ball 1
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila1,valA)<raio){
       println("Tocaste-me com a PATA" + convertedLeftFoot.y);
     }
    
     // right foot touches ball 2
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila1,valB)<raio){
       println("LINDO com a PATA" + convertedLeftFoot.y);
     }
    
     // right foot touches ball 3
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila1,valC)<raio){
       println("OPA com a PATA" + convertedLeftFoot.y);
     }
    
     // right foot touches ball 4
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila2,valA)<raio){
       println("Tocaste-me AGAIN com a PATA" + convertedLeftFoot.y);
     }
    
     // right foot touches ball 5
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila2,valB)<raio){
       println("LINDO AGAIN com a PATA" + convertedLeftFoot.y);
     }
    
     // right foot touches ball 6
     if(getDistancia(convertedLeftFoot.x,convertedLeftFoot.y,fila2,valC)<raio){
       println("OPA AGAIN com a PATA" + convertedLeftFoot.y);
     }
     // head touches ball 1
     if(getDistancia(convertedHead.x,convertedHead.y,fila1,valA)<raio){
       println("Tocaste-me na cabeça" + convertedHead.y);
     }
     // head touches ball 2
     if(getDistancia(convertedHead.x,convertedHead.y,fila1,valB)<raio){
       println("LINDO na cabeça" + convertedHead.y);
     }
     // head touches ball 4
     if(getDistancia(convertedHead.x,convertedHead.y,fila2,valA)<raio){
       println("Tocaste-me na cabeça... Não posso" + convertedHead.y);
     }
     // head touches ball 5
     if(getDistancia(convertedHead.x,convertedHead.y,fila2,valB)<raio){
       println("LINDO na cabeça... Não posso" + convertedHead.y);
     }
     // head touches ball 6
     if(getDistancia(convertedHead.x,convertedHead.y,fila2,valC)<raio){
       println("OPA na cabeça... Não posso" + convertedHead.y);
     }
     // head touches ball 3
     if(getDistancia(convertedHead.x,convertedHead.y,fila1,valC)<raio){
       println("OPA na cabeça... Não posso" + convertedHead.y);
     }
    
    
      translate(convertedTorso.x+320, height);
      stroke(0);
    
    
    }
    
    
    float getDistancia(float x1, float y1, float x0, float y0){
    
    return sqrt((x1-x0)*(x1-x0) + (y1-y0)*(y1-y0));
    
    }
    
    
    
    // -----------------------------------------------------------------
    // SimpleOpenNI events
    
    void onNewUser(SimpleOpenNI curContext, int userId)
    {
      println("onNewUser - userId: " + userId);
      println("\tstart tracking skeleton");
    
      curContext.startTrackingSkeleton(userId);
    
    }
    
    void onLostUser(SimpleOpenNI curContext, int userId)
    {
      println("onLostUser - userId: " + userId);
    }
    
    void onVisibleUser(SimpleOpenNI curContext, int userId)
    {
      //println("onVisibleUser - userId: " + userId);
    }
    
     void stop() {
    
     //out.close();
     //minim.stop();
     //kick.close();
     //snare.close();
    // hat.close();
     //crash.close();
    
    //player.close(); 
     super.stop();
    }
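
    One note on structure (a minimal sketch, assuming the same Minim objects declared above): loading each sample once in setup() and only calling trigger() inside the collision checks avoids re-creating Minim and reloading the file on every frame the hand overlaps a ball. The "unsupported bit depth" message also suggests the extra files themselves may not be 8- or 16-bit WAVs, so re-exporting them at 16 bits is worth trying; that part is a guess from the error text, not verified here.

        // hypothetical restructuring: one Minim instance and pre-loaded samples
        AudioSample sample01, sample02;

        void setupAudio() {          // call once from setup()
          minim = new Minim(this);
          sample01 = minim.loadSample("01.wav", 512);
          sample02 = minim.loadSample("02.wav", 512);
        }

        // then inside drawSkeleton(), e.g. for ball 1:
        // if (getDistancia(convertedRightHand.x, convertedRightHand.y, fila1, valA) < raio) {
        //   sample01.trigger();
        // }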
    
  • Defining Frequency Bands w/ Minim

    You're welcome. One last suggestion: try analyzing some drum sounds in Audacity, which has a decent spectrum analysis tool. You should be able to discern the shifts in spectrum between different percussive instruments that way. The spectrogram uses a log scale, so you can identify the upper and lower frequency limits. Copy and modify the snare function, which I think would cover most of the toms, congas, etc., and vary the parameters in your code. I think you'll get close. It's then just a matter of naming the new function isTom() or whatever you're working on. Cheers and good luck.
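
    As a rough illustration of that suggestion, here is a hypothetical isTom() in the style of the BeatDetect methods quoted further down the thread; the band limits and threshold are placeholders to tune against your own spectrum analysis, not values from the Minim source.

        public boolean isTom()
        {
            if (algorithm == SOUND_ENERGY)
            {
                return false;
            }
            // toms sit lower than a snare: start a few averages above the kick band
            // (4 and 10 are guesses to adjust after looking at the spectrogram)
            int lower = 4 >= fft.avgSize() ? fft.avgSize() : 4;
            int upper = 10 >= fft.avgSize() ? fft.avgSize() - 1 : 10;
            int thresh = (upper - lower) / 3 + 1;
            return isRange(lower, upper, thresh);
        }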

  • Defining Frequency Bands w/ Minim

    While I don't use Minim yet, I do compose electronic music and use percussion samples. Every instrument has a particular sound "profile": an envelope and a frequency domain. The envelope is commonly known as an ADSR envelope (Attack, Decay, Sustain, Release), and each instrument also has a set of dominant frequencies. A hi-hat's envelope and frequency content are different from a snare's, a kick's, etc. Seeing as the first three are already profiled, I'm assuming the software is tuned to recognize an instrument based on its FFT analysis and a set of predetermined parameters. For you to recognize more instruments using isRange(), you'll probably need to isolate a waveform of the percussive instrument you want to identify and adjust the Minim parameters accordingly. Audacity (http://audacityteam.org/) has a wide range of tools to record, play back and analyze sounds and waveforms. It's also free and well recommended. You'll have to map the results you get back to Minim. Hope this makes sense. Cheers.

  • Defining Frequency Bands w/ Minim

    I'm using Minim's BeatDetect object to analyze the incoming microphone signal. BeatDetect uses FFT. In the BeatDetect class there are 4 functions of interest: isHat(), isKick(), isSnare() and isRange(int, int, int). The first three are customized versions of isRange(). What I'm trying to do is recognize more than just hat, kick and snare drums. To do this, I need to understand the math in the methods isHat(), isKick() and isSnare(). I'm hoping someone here can help me. Here is the code for the 4 functions.


    public boolean isKick()
    {
        if (algorithm == SOUND_ENERGY)
        {
            return false;
        }
        int upper = 6 >= fft.avgSize() ? fft.avgSize() : 6;
        return isRange(1, upper, 2);
    }
    
    public boolean isSnare()
    {
        if (algorithm == SOUND_ENERGY)
        {
            return false;
        }
        int lower = 8 >= fft.avgSize() ? fft.avgSize() : 8;
        int upper = fft.avgSize() - 1;
        int thresh = (upper - lower) / 3 + 1;
        return isRange(lower, upper, thresh);
    }
    
    public boolean isHat()
    {
        if (algorithm == SOUND_ENERGY)
        {
            return false;
        }
        int lower = fft.avgSize() - 7 < 0 ? 0 : fft.avgSize() - 7;
        int upper = fft.avgSize() - 1;
        return isRange(lower, upper, 1);
    }
    
    public boolean isRange(int low, int high, int threshold)
    {
        if (algorithm == SOUND_ENERGY)
        {
            return false;
        }
        int num = 0;
        for (int i = low; i < high + 1; i++)
        {
            if (isOnset(i))
            {
                num++;
            }
        }
        return num >= threshold;
    }
    

    I want to be able to recognize beats accurately across a range of instruments by manipulating the methods above. Can anybody help me understand what I need to know? Currently, I understand that the functions return true if a beat is detected within a specified range of frequency bands. What I don't understand is why the specific values for the parameters [low, high, threshold] correlate to particular instruments. Thanks for reading, and please respond.
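
    For a concrete reading of the code above (with a hypothetical fft.avgSize() of 27, just to have numbers): isKick() calls isRange(1, 6, 2), so it returns true when at least 2 of the lowest averages (indices 1 to 6) register an onset in the same frame; isSnare() computes lower = 8, upper = 26 and thresh = (26 - 8) / 3 + 1 = 7, so it needs onsets in at least 7 of the mid-to-high averages; isHat() watches only the top 7 averages (20 to 26) and fires on a single onset. Each method just picks the slice of the spectrum where that instrument carries most of its energy and decides how many bands must fire at once, which is what you would tune when adding a new instrument.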