
  • Ketai bluetooth to PC

    Ketai gives you access to the sensor data on your phone. You can use oscP5 to establish a connection between your PC and your device and transfer data. I suggest you become familiar with Ketai first and then with oscP5 before you merge them; after that it should be easy to do. There are many examples in the forum related to either oscP5 or Ketai, btw. Also check both libraries' websites and their provided examples.

    Kf
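    For context, oscP5 moves those sensor values between devices as UDP datagrams. A minimal sketch of the underlying transport in plain Java (loopback address and an OS-assigned port chosen here just for illustration; oscP5 layers the OSC binary format on top of exactly this kind of send/receive):

    ```java
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class UdpEcho {
        public static void main(String[] args) throws Exception {
            // Open a socket on an OS-assigned port and send a datagram to ourselves.
            DatagramSocket socket = new DatagramSocket();
            byte[] payload = "hello".getBytes("UTF-8");
            DatagramPacket out = new DatagramPacket(
                payload, payload.length,
                InetAddress.getByName("127.0.0.1"), socket.getLocalPort());
            socket.send(out);

            // Receive it back; an OSC library does the same thing, with typed
            // arguments encoded into the datagram's bytes.
            byte[] buf = new byte[1024];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            socket.receive(in);
            System.out.println(new String(in.getData(), 0, in.getLength(), "UTF-8"));
            socket.close();
        }
    }
    ```

    On the Processing side, the same idea is wrapped as `new OscP5(this, port)` for listening and `oscP5.send(message, remote)` for sending.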

  • Arrays. Getting separate arrays of bubbles to emit at x,y coordinates of array of moving objects.

    Hi again.

    I have worked a bit further on my initial patch by adding an array of swimming fish and some OSC connectivity for audio-visual reactions. After some effort, the OSC panning and other effects I added seem to work. Each orb now vibrates to its own separate audio loop playing from a Max/MSP patch.

    I have been attempting two things with the array of fish: #1 to make them vibrate and #2 to change their color when either or both orbs plunge beneath the water. Again I seem to have hit a wall here. The fish only react to one orb, not both at once. I can choose which orb by referring to its index number. I figured this should be pretty simple, given everything else I have working similarly. I have tried the method that BGADII suggested for my first query, and several others, but to no avail.

    The array for the two orbs is proving problematic. It might seem a bit of overkill, but the reason for that array is that I will eventually add many more orbs (sound sources). The script overall is becoming quite confusing. I have pulled it apart several times to solve issues.

    I'd be really grateful if anyone would share some tips on my fish issue... and also general ways to clean up this sketch?

    I have been looking at the native object/class examples in the processing examples folder, but I'm not sure what methods work best for classes with values that interconnect. I am worried that if the patch gets any bigger it'll become wildly confusing (to me;).

    Many thanks!

    import ddf.minim.*;
    Minim minim;
    AudioPlayer[] popsound; 
    import oscP5.*;
    import netP5.*;
    OscP5 oscP5;
    NetAddress myRemoteLocation;   // ************************************
    
    int numberOfOrbs = 2;
    
    float[] incomingKickOSC = new float[numberOfOrbs];  // declare OSC message variable
    
    Orb_Class[] many_Orbs = new Orb_Class[numberOfOrbs];
    ArrayList<Bubble_Class> many_Bubbles = new ArrayList<Bubble_Class>();
    Fish_Class[] many_Fish = new Fish_Class[30];
    
    // Constant Environmental Variables
    float spaceHeight;
    float waterSurfaceHeight;
    float seaFloor;
    
    float orbFishProximityBounces = 0;
    float orbSpaceProximityBounces = 0; 
    
    //Fish Parameters
    float fishColor = 0;
    float fishHeight; 
    
    float radius;   
    
    // ------------------------------------------------------------------------------------------------ SETUP
    void setup() {
      size(1000, 800, P3D);
      noStroke();
      smooth(); 
    
      oscP5 = new OscP5(this,11100);                         // start oscP5, listening for incoming messages at port 11100
      myRemoteLocation = new NetAddress("127.0.0.1",11000);  // SEND OSC messages to Max MSP   
    
      spaceHeight = (height/3);
      waterSurfaceHeight = (height/3)*2;   
      seaFloor = height;  
      fishHeight = ((height/6)*5);    
      popsound = new AudioPlayer[numberOfOrbs];  // ************************************
      minim = new Minim(this); 
    
      for (int i=0; i<many_Orbs.length; i++) {  
        many_Orbs[i] = new Orb_Class(64, random(4, 6), random(2, 4));
        popsound[i] = minim.loadFile("splash.mp3");
      }//for
    
      for (int i=0; i<many_Fish.length; i++) {
        many_Fish[i] = new Fish_Class(0, random(waterSurfaceHeight, seaFloor), random(1, 3)); 
      }//for      
    }//setup
    
    
    
    // ------------------------------------------------------------------------------------------------ DRAW
    void draw() {
      background(0, 0, 255);  // clear the window to blue
      lights();  
      line(0, waterSurfaceHeight, width, waterSurfaceHeight);
    
      for (int j=0; j<many_Orbs.length; j++) {   // display red orbs 
        many_Orbs[j].display(j);
        many_Orbs[j].move(j);
        many_Orbs[j].fishBounceAndColoredOrNot();
        many_Orbs[j].skyReverberatesSound();
        many_Orbs[j].underwaterFiltersSound();
        many_Orbs[j].panSoundsInMax();
      }//for
    
      for (int i=0; i<many_Fish.length; i++) {
        many_Fish[i].moveFish();
        many_Fish[i].displayFish();
      }//for 
    
      for (Bubble_Class b : many_Bubbles) { // move through ArrayList
        b.ascend();
        b.burstAtSurface();
        b.display();
      }//for 
    
      for (int i=many_Bubbles.size()-1; i>=0; i--) {  // to remove we must move backwards through the ArrayList
        Bubble_Class b = many_Bubbles.get(i); 
        //popsound[i] = minim.loadFile("pop.mp3");   
        if (b.isDead){
          many_Bubbles.remove(i);
        }//if
      }//for
    
      tint(255, 127);  // Display at half opacity  
      fill(0, 41, 158, 200);
      noStroke();
      rect(0, height-height/3, width, height/3);  // The Ocean  
    
    
      // ---------------------- SENDING OSC MESSAGES TO MAX MSP ----------------------------------------
      OscMessage myMessage = new OscMessage("/oscFromProcToMax");       
      myMessage.add(many_Orbs[0].mappedFreq); 
      myMessage.add(many_Orbs[1].mappedFreq); 
      myMessage.add(many_Orbs[0].mappedVerbLevel);                                         /// *****************!!!!!!!!!
      myMessage.add(many_Orbs[1].mappedVerbLevel); 
      myMessage.add(many_Orbs[0].panningLeftRight);                                                         
      myMessage.add(many_Orbs[1].panningLeftRight);  
      oscP5.send(myMessage, myRemoteLocation);   /* send the message */  
    }//Draw
    
    
     // ---------------------- RECEIVING OSC MESSAGES FROM MAX ----------------------------------------
     // Technique found at: http://www.technopagan.net/blog/tech/processing-osc/
    
    void oscEvent(OscMessage theOscMessage1) {
      if (theOscMessage1.addrPattern().equals("/oscAmp")) {
        if(theOscMessage1.checkTypetag("ff") == true){
          //println(theOscMessage1.typetag());
          for(int i=0; i < numberOfOrbs; i++)       {
            incomingKickOSC[i] = theOscMessage1.get(i).floatValue();    
          }//for
        }//if
      }//if
    }//oscEvent
    
    
    
    
    // -------------------------------------------------------------------------------------------------- ORB CLASS
    
    class Orb_Class {
      float mappedFreq;        //OSC variables
      float mappedVerbLevel;   //OSC variables
      float panningLeftRight;  //OSC variables
    
      float orbx; 
      float orby = random(300, height-100); // location
      float xspeed, yspeed; // speed  
      float xtime;  
      float ytime; 
      int timer;
      int duration= int(random(900, 1600));  
    
      // Constructor
      Orb_Class(float tempR, float tempXt, float tempYt) {
        radius = tempR;
    
        xspeed = tempXt;
        if (random(100)<50){
          xspeed *= -1;
        }//if
    
        yspeed = tempYt;
        if (random(100)<50){
          yspeed *= -1;
        }//if
      }
    
    
      void move(int arrayIndex){ 
        orbx += xspeed; // Increment x
        orby += yspeed; // Increment y
    
        if (orbx > width || orbx < 0) { // Check horizontal edges 
          xspeed *= -1;   
        }//if
    
        if (orby > height || orby < 0) { // Check vertical edges
          yspeed *= -1;
        }//if 
    
        if (orby > height/2){
          if (millis() - timer >= duration) {
            int upperBound = int(random(30)); 
            for (int i=0; i<upperBound; i++) {
              many_Bubbles.add( new Bubble_Class(orbx, orby, 10, 1) );
            }//for 
            timer = millis();
            duration=int(random(100, 300));
          }//if
        }//if
    
        if (orby+radius <= waterSurfaceHeight){  //Splash sound as orbs breach surface
          popsound[arrayIndex].rewind();
          popsound[arrayIndex].play();      
        }//if     
      }//void move
    
    
      void display(int arrayIndex){  
        pushMatrix();
        translate(orbx, orby, 0);
        fill(255, 0, 0); 
        noStroke();  
        sphere(radius/2+(incomingKickOSC[arrayIndex]*200));
        popMatrix();
      }
    
    
      void panSoundsInMax(){
         panningLeftRight = map(orbx, 0, width, 0.2, 0.8);
      }
    
    
      void fishBounceAndColoredOrNot(){
        if (   ( orby > (waterSurfaceHeight+radius/2) )   ){
          fishColor = 255; 
          orbFishProximityBounces = 1; 
        }//if 
        else{
          fishColor = 0;
          orbFishProximityBounces = 0.01; 
        }//else
      }
    
    
      void underwaterFiltersSound(){  
        if (   (orby < (waterSurfaceHeight+radius/2)) && (orby > (waterSurfaceHeight-radius/2))   ) {  
          mappedFreq = map(orby, (waterSurfaceHeight+(radius/2)), (waterSurfaceHeight-(radius/2)), 0, 20000);
        }//if
      }
    
    
      void skyReverberatesSound(){
        if (   (orby > (spaceHeight-radius/2)) && (orby < (spaceHeight+radius/2))   )  {  
          mappedVerbLevel = map(orby, (spaceHeight-(radius/2)), (spaceHeight+(radius/2)), 0, -72.6);        
          //orbSpaceProximityBounces = map(orby, (spaceHeight-(radius/2)), (spaceHeight+(radius/2)), 1, 0); 
        }//if
      }
    }
    
    
    
    
    // -------------------------------------------------------------------------------------------------- BUBBLE CLASS 
    class Bubble_Class {
      float diameter;
      float riseSpeed;
      float bubblex, bubbley;
      color color1; 
      boolean isDead=false; 
    
      Bubble_Class(float tempX, float tempY, float tempD, float tempRs) {
        bubblex = tempX+random(-12, 12);
        bubbley = tempY+random(-12, 12);
        diameter = tempD;
        riseSpeed = tempRs + random(1, 3);
        color1=color(random(127, 256), random(70, 256));
      }
    
      void ascend() {
        bubbley = bubbley - riseSpeed;
        if (bubbley<-30){
          isDead=true;
        }//if
      }
    
      void burstAtSurface() {             // Burst the bubbles as they reach the surface
        if (bubbley < ((height/3)*2)+diameter) {
          isDead=true;
        }//if
      }  
    
      void display() {
        fill(color1);
        ellipse(bubblex, bubbley, diameter, diameter);
      }
    }
    
    
    
    
    // -------------------------------------------------------------------------------------------------- FISH CLASS
    class Fish_Class
    {
      float fishx;
      float fishHeightConstrained;
      float fishSpeed;
      float fishy;
    
      // Constructor
      Fish_Class(float tempXpos, float tempYpos,  float tempSpeed) {
        fishx = tempXpos; 
        fishSpeed = tempSpeed;
        fishy = tempYpos;
      }
    
    
      void moveFish() {
        fishx = fishx + fishSpeed;
    
        for (int i=0; i<numberOfOrbs; i++) {
          fishy = fishy + (random(-10,10)*(incomingKickOSC[i]*5*orbFishProximityBounces) );    /// *****************!!!!!!!!!
        }//for
    
        fishHeightConstrained = constrain(fishy, 600, 900);// constrain(value, min, max) // + (random(-10,10)*orbProximityToFish);
    
        if (fishx > width) {
          fishx = 0;
        }//if
        if (fishy < waterSurfaceHeight) {
          fishy = 900; 
        }//if
      }//moveFish
    
    
      void displayFish() {
        fill(fishColor);
        triangle(fishx, fishy, fishx-20, fishy+8, fishx-20, fishy-5);   
        ellipse(fishx, fishy, random(20, 30), random(10, 15)) ;
      }
    }
    
    //
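    A possible cause of the fish reacting to only one orb: draw() calls fishBounceAndColoredOrNot() once per orb, and each call overwrites the shared fishColor and orbFishProximityBounces globals, so only the last orb's state survives the frame. One fix is to accumulate the underwater test across the whole array before touching the shared state. A minimal sketch of that accumulation in plain Java, with illustrative names and values (the water line of 533 is just an example):

    ```java
    public class OrbInfluence {
        // Hypothetical stand-in for the sketch's per-orb state: the y positions
        // of the orbs and the water line. Names are illustrative only.
        static boolean anyOrbUnderwater(float[] orbY, float waterSurface) {
            boolean any = false;
            for (float y : orbY) {
                any = any || (y > waterSurface);  // OR-accumulate instead of overwrite
            }
            return any;
        }

        public static void main(String[] args) {
            float water = 533f;
            // One orb above the surface, one below: the fish should still react.
            System.out.println(anyOrbUnderwater(new float[]{100f, 700f}, water));
            // Both orbs above the surface: no reaction.
            System.out.println(anyOrbUnderwater(new float[]{100f, 200f}, water));
        }
    }
    ```

    In the sketch this would mean computing the combined flag once per frame (e.g. before the orb loop) and setting fishColor/orbFishProximityBounces from it, rather than per orb.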
    
  • GazeTrack: A Processing library for eye-tracking

    @AugustoEst can this library work with the EyeLink system?

    We're working at the university with an EyeLink 1000 Plus.

    The camera is the same as in the picture.

    When we connect to Processing we use oscP5 over a UDP connection.

  • Problems with Buffer (EyeTracker)

    I'm receiving data from Matlab (1000 fps) of a subject looking at a picture, captured with EyeTracker hardware. [Both on different computers.]

    Connection is OK, UDP is fine, OSC is working thanks to oscP5.

    With the data, I draw circles (positioned by the X and Y coordinates from the EyeTracker device) on a canvas.

    Now the problem is that I don't know how to build a proper buffer to receive all this massive amount of information and then erase it after a few milliseconds. I want to see the eyes moving in real time on my canvas, drawn as circles, but I only want to see one circle at a time, not all of them; even though I don't need to save all this information.

    If you don't understand something, please ask me, and I'll try to explain better:

    I have this code; feel free to ask and say whatever you want:

    /**
     * oscP5parsing by andreas schlegel
     * example shows how to parse incoming osc messages "by hand".
     * it is recommended to take a look at oscP5plug for an
     * alternative and more convenient way to parse messages.
     * oscP5 website at http://www.sojamo.de/oscP5
     */
    import java.util.*;
    
    import oscP5.*;
    import netP5.*;
    
    OscP5 oscP5;
    NetAddress myRemoteLocation;
    
    ArrayList <Posiciones> buffer  = new ArrayList<Posiciones>();
    
    //Iterator <Posiciones> itr = new ArrayList<Posiciones>();
    
    
    long tiempoX = 1000;
    long tiempoActual = millis(); 
    
    float x1, y1, x2, y2;
    float diamt = 20;
    
    float ancho = 400;
    float alto = 400;
    
    void setup() {
      size(400, 400); 
      smooth();
      noStroke();
      /* start oscP5, listening for incoming messages at port 8002 */
      oscP5 = new OscP5(this, 8002); // set this port in the MatLab experiment
    
      /* myRemoteLocation is a NetAddress. a NetAddress takes 2 parameters,
       * an ip address and a port number. myRemoteLocation is used as parameter in
       * oscP5.send() when sending osc packets to another computer, device, 
       * application. usage see below. for testing purposes the listening port
       * and the port of the remote location address are the same, hence you will
       * send messages back to this sketch.
       */
      //myRemoteLocation = new NetAddress("127.0.0.1", 4002);
    }
    
    void draw() {
    
      escupirData();
      dibujarCirculos();
    }
    
    void oscEvent(OscMessage theOscMessage) {
      /* check if theOscMessage has the address pattern we are looking for. */
    
      if (theOscMessage.checkAddrPattern("/test")==true) {
        /* check if the typetag is the right one. */
        //println(theOscMessage);
        if (theOscMessage.checkTypetag("s")) {
          /* parse theOscMessage and extract the values from the osc message arguments. */
          String stringValue = theOscMessage.get(0).stringValue();
          println(" values: " + stringValue);
    
          int[] datos = int(split(stringValue, "-")); // split the data and store it temporarily
          Posiciones crearPosiciones = new Posiciones(); // create an object to hold the data
          crearPosiciones.x1 = datos[0]; // store the data
          crearPosiciones.y1 = datos[1];
          crearPosiciones.x2 = datos[2];
          crearPosiciones.y2 = datos[3];
          buffer.add(crearPosiciones); // add an element to the ArrayList/buffer
    
          return;
        }
      }
    }
    
    void escupirData() {
    
      long currentMillis = millis();
    
      if (millis() - tiempoActual >= tiempoX ) {
        //Pixel newPixel = pixeles.get(i); 
    
        if (buffer.size() > 0) { // check whether the buffer has any elements
          Posiciones posicionActual = buffer.get(0); // take the first object from the buffer
    
          x1 = posicionActual.x1; // store the data for use
          y1 = posicionActual.y1;
          x2 = posicionActual.x2;
          y2 = posicionActual.y2;
    
    
          posicionActual.borrar = true; // once drawn, raise the flag so it gets removed

          println(" removed: " + posicionActual.borrar);
    
          tiempoActual = millis();
        }
      }
    
      // use an Iterator so positions already drawn (flag set) can be removed from the ArrayList
      Iterator<Posiciones> itr = buffer.iterator();
      while (itr.hasNext()) {
        Posiciones posicionActual = itr.next();
        if (posicionActual.borrar) {
          itr.remove(); // remove the position that is no longer shown
          println("removed 1, " + buffer.size() + " positions left");
        }
      }
    }
    
    void dibujarCirculos() {
    
      x1 = map(x1, 0, ancho, 0, width); 
      y1 = map(y1, 0, alto, 0, height);
      x2 = map(x2, 0, ancho, 0, width); 
      y2 = map(y2, 0, alto, 0, height);
    
      fill(255, 0, 0);
      ellipse(x1, y1, diamt, diamt);
      fill(0, 0, 255);
      ellipse(x2, y2, diamt, diamt);
    }
    
    
    And this class:
    
    class Posiciones {
      int x1, y1, x2, y2;
      boolean borrar = false;
    
    
      Posiciones() {
      }
    
      void vacio() {
      }
    
    }
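    Since only the newest gaze sample needs to be visible and old ones can be discarded, one alternative to the flag-and-iterator scheme is a small bounded buffer that simply evicts the oldest entry on every add. A sketch in plain Java, assuming the same "x1-y1-x2-y2" message format the oscEvent above parses:

    ```java
    import java.util.ArrayDeque;

    public class LatestSampleBuffer {
        // Parse a "x1-y1-x2-y2" string, as the sketch's oscEvent does with split().
        static int[] parse(String s) {
            String[] parts = s.split("-");
            int[] v = new int[parts.length];
            for (int i = 0; i < parts.length; i++) v[i] = Integer.parseInt(parts[i]);
            return v;
        }

        public static void main(String[] args) {
            // Capacity-one buffer: keep only the newest sample, drop everything older.
            final int CAPACITY = 1;
            ArrayDeque<int[]> buffer = new ArrayDeque<>();
            String[] incoming = {"10-20-30-40", "11-21-31-41", "12-22-32-42"};
            for (String msg : incoming) {
                if (buffer.size() == CAPACITY) buffer.removeFirst(); // evict oldest
                buffer.addLast(parse(msg));
            }
            int[] latest = buffer.peekLast();
            System.out.println(latest[0] + "," + latest[1] + "," + latest[2] + "," + latest[3]);
        }
    }
    ```

    With this shape, draw() only ever reads the newest sample, and the 1000 fps stream can never grow the buffer without bound; raising CAPACITY gives a short trail of circles instead of one.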
    
  • Minim sequencing play and record functions

    Hi Folks!

    I've been working on a script where, in very simple terms, I would like a random beep to sound (AudioPlayer beep, working!) and, after that beep, a sequence of audio tracks to play (currently arranged in the array audio) and eventually audio recordings to be made (not set up yet).

    My current issue is that

    audio[0].play(); 

    is working perfectly, but the second audio track does not then play. I've tried setting this up as a separate function (

    void playrecord()

    ) and have found this thread very helpful to get the crux of the code from @GoToLoop: https://forum.processing.org/two/discussion/12966/how-to-play-an-audio-file-sequentially.

    Ideally I would like to sequence it as audio 0 play, record audio input 0 through microphone, audio 1 play, record audio input 1 through microphone. All before the next "beep" sound is made.

    Can anyone suggest what is going wrong?


    import ddf.minim.*;
    import ddf.minim.AudioPlayer;
    import oscP5.*;
    import netP5.*;
    
    OscP5 oscP5;
    NetAddress otherSketch;
    
    Minim minim;
    AudioPlayer beep;
    
    static final int AUDIO = 2;
    final AudioPlayer[] audio = new AudioPlayer [AUDIO];
    AudioInput daydream;
    AudioRecorder[] recorder = new AudioRecorder [1];
    
    PFont f;
    int [] beeps = new int [5];
    int ms;
    int start = -millis();
    int totalBeeps;
    int beeptime;
    boolean pressEnter;
    
    int current=-1;
    int count=0;
     
    void setup() {
      size (512, 200, P3D);
      f= createFont ("Georgia", 16);
     
      pressEnter = false;  // no 'boolean' here: redeclaring would shadow the field above
      totalBeeps = 0;
     
      oscP5 = new OscP5(this,8001); /* start oscP5, listening for incoming messages at port 8001 */
      otherSketch = new NetAddress("xxxxxx",8000); /* Start listening at (IP Address, Port Number) */
      
     
      int fac=1000;
      beeps [0] = int(random(10, 60))*fac;//these numbers aren't right but give an earlier beep!
      beeps [1] = int(random(1260, 1739))*fac;
      beeps [2] = int(random(1860, 2339))*fac;
      beeps [3] = int(random(2460, 2939))*fac;
      beeps [4] = int(random(3060, 3539))*fac;
     
      printArray (beeps);
      
      minim = new Minim(this);
      audio [0] = minim.loadFile ("Audio_01.mp3");
      audio [1] = minim.loadFile ("Audio_02.mp3");
      
      minim = new Minim(this);
      daydream = minim.getLineIn();
     // recorder [0] = minim.createRecorder(daydream, "audio01.mp3");
      
      minim = new Minim(this);
      beep = minim.loadFile ("ping.mp3");
    }
     
    void keyPressed() { //boolean controlling the start screen.
      if (keyCode == ENTER) { 
        start = millis();
        pressEnter = true;
      }
    }
     
    void draw () {
     
      background (255);
      textFont (f, 16);
      fill (0);
     
      int ms = millis()-start;
     
      println(ms, start);
      
      startScreen();
    
      for (int i=0; i<beeps.length; i++) {
        if (ms > beeps[i] && i>current) {
          current=i;
          beeptime=millis();
          beep.rewind();
          beep.play();
          totalBeeps =totalBeeps+1;
          OscMessage myMessage = new OscMessage("time in milliseconds");
          myMessage.add(ms); 
          myMessage.add(millis()); 
          myMessage.add(beeptime); 
          myMessage.add(start); 
          oscP5.send(myMessage, otherSketch);       
          playrecord();
        }
      } 
    }
    
    void startScreen(){
      
      if (pressEnter)//this boolean controls the start screen and initiates the timer -resetting millis to 0 when ENTER is pressed.
      {
        text("The experiment has begun and these are the random beep times:", 10, 40);
        text(beeps[0], 10, 70);
        text("milliseconds", 80, 70);
        text(beeps[1], 10, 90);
        text("milliseconds", 80, 90);
        text(beeps[2], 10, 110);
        text("milliseconds", 80, 110);
        text(beeps[3], 10, 130);
        text("milliseconds", 80, 130);
        text(beeps[4], 10, 150);
        text("milliseconds", 80, 150);
      } else {
        text("Press Enter to begin", 10, 100);
      }
     
      if (!pressEnter)
        return;
    }
    
    void playrecord(){
         audio[0].play();
         if (!audio[count].isPlaying()) audio[count=(count+1)% AUDIO].play(); 
    }
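    A likely reason the second track never plays: playrecord() runs only at the moment of the beep, so the isPlaying() check is evaluated a single time instead of being polled every frame, and audio[0].play() restarts track 0 on every call. The polling pattern can be sketched like this in plain Java, using a fake track object since Minim itself isn't available outside Processing:

    ```java
    public class Sequencer {
        // Minimal stand-in for an audio track: it "plays" for a fixed number of
        // polls. Purely illustrative; Minim's real AudioPlayer API differs.
        static class FakeTrack {
            int remaining;
            FakeTrack(int frames) { remaining = frames; }
            boolean isPlaying() { return remaining > 0; }
            void tick() { if (remaining > 0) remaining--; }
        }

        public static void main(String[] args) {
            FakeTrack[] audio = { new FakeTrack(3), new FakeTrack(2) };
            int current = 0;
            StringBuilder log = new StringBuilder();
            // Poll once per "frame", as draw() would: advance to the next track
            // only when the current one has finished, instead of restarting it.
            for (int frame = 0; frame < 8 && current < audio.length; frame++) {
                if (!audio[current].isPlaying()) {
                    current++;
                    if (current >= audio.length) break;
                }
                audio[current].tick();
                log.append(current);  // record which track was active this frame
            }
            System.out.println(log);
        }
    }
    ```

    In the sketch, that would mean calling the advance check from draw() every frame after the beep has fired, rather than once from inside the beep branch.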
    
  • How to play a minim track at a specific time using millis()

    I am attempting to write a sketch which sets off a beep at controlled but random intervals using millis() and Minim. I can't understand why the beeps are not playing using the code below. Can anyone help?

    I am very new to processing!

    Thank you,

    Amy


     import ddf.minim.*;
        import ddf.minim.AudioPlayer;
    
        import oscP5.*;
        import netP5.*;
    
        OscP5 oscP5;
        NetAddress otherSketch;
    
        Minim minim;
        AudioPlayer beep;
    
        PFont f;
        int [] beeps = new int [5];
        int ms;
        int start;
        int totalBeeps;
        boolean pressEnter;
    
        void setup(){
          size (512,200,P3D);
          f= createFont ("Georgia",16);
    
          pressEnter = false;  // no 'boolean' here: redeclaring would shadow the field above
          totalBeeps = 0;
          oscP5 = new OscP5(this,8001); /* start oscP5, listening for incoming messages at port 8001 */
          otherSketch = new NetAddress("127.0.0.1",8000);
    
        beeps [0] = int(random(10,60))*1000;//these numbers aren't right but give an earlier beep!
        beeps [1] = int(random(1260,1739))*1000;
        beeps [2] = int(random(1860,2339))*1000;
        beeps [3] = int(random(2460,2939))*1000;
        beeps [4] = int(random(3060,3539))*1000;
    
        printArray (beeps);
    
        minim = new Minim(this);
        beep = minim.loadFile ("ping.wav");
    
        }
    
        void keyPressed() { //boolean controlling the start screen.
          if (keyCode == ENTER) { 
            start = millis();
            pressEnter = true;
          }
        }
    
        void draw (){
    
        background (255);
        textFont (f,16);
        fill (0);
    
        int ms = millis()-start;
    
        println(ms);
    
          if (pressEnter)//this boolean controls the start screen and initiates the timer -resetting millis to 0 when ENTER is pressed.
          {
            text("The experiment has begun and these are the random beep times:",10,40);
            text(beeps[0],10,70);
            text("milliseconds",80,70);
            text(beeps[1],10,90);
            text("milliseconds",80,90);
            text(beeps[2],10,110);
            text("milliseconds",80,110);
            text(beeps[3],10,130);
            text("milliseconds",80,130);
            text(beeps[4],10,150);
            text("milliseconds",80,150);
    
            OscMessage myMessage = new OscMessage("/time in milliseconds");//this isn't the correct place but a test
            myMessage.add(ms); 
            oscP5.send(myMessage, otherSketch); 
          }else {
            text("Press Enter to begin",10,100);
          }
    
        for (int i=0;i<beeps.length;i++){ //this for loop should initiate the beeps in the array to sound if their value matches int ms (which is millis - millis when ENTER pressed) 
        if (beeps[i] == ms) 
             {
          beep.play();
          totalBeeps =totalBeeps+1;
            }
          else if ( beep.position() == beep.length() )
          {
            beep.rewind();
           }
          }
        }
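    The usual culprit with this kind of timer is the equality test: draw() runs at roughly 60 fps, so ms advances in steps of about 16 and almost never lands exactly on beeps[i]. Comparing with >= and remembering which beeps have already fired is more robust. A self-contained simulation in plain Java:

    ```java
    public class BeepTimer {
        public static void main(String[] args) {
            int[] beeps = {500, 1200};
            boolean[] fired = new boolean[beeps.length];
            int count = 0;
            // Simulate draw() running roughly every 16 ms: ms rarely lands exactly
            // on a scheduled time, so test with >= plus a fired flag, not ==.
            for (int ms = 0; ms <= 2000; ms += 16) {
                for (int i = 0; i < beeps.length; i++) {
                    if (!fired[i] && ms >= beeps[i]) {   // not: ms == beeps[i]
                        fired[i] = true;
                        count++;                          // beep.play() would go here
                    }
                }
            }
            System.out.println(count);  // both beeps fire exactly once
        }
    }
    ```

    With `==` in that loop, a beep scheduled at 500 ms would only fire if a frame happened to land on exactly 500, which almost never occurs.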
    

  • stitching multiple sketches together

    I'm struggling to stitch my sketches together so that one of a set of capture-cam sketches loads at random. Can someone help? I've been using the approach below, but I can't get past the first two sketches in the list; I don't know if it's getting too complicated for Processing. I've included the first bit of code.

    import processing.video.*; 
    
    final int numSketches=5;
    int w=640;
    int h=480;
    
    int sketchToDo;
    Capture cam;
    
    boolean bright = true;
    boolean greyScale;
    int shiftAmount = 4;
    int grid = 1;
    
    //GlobalVariables allSketchesGlobalVariables;
    
    void setup() {
      size(640, 480);
      sketchToDo = int( random( numSketches ) );
      switch( sketchToDo ) {
      case 0:
        setupSketch0(); // No size() calls!
        break;
      case 1:
        setupSketch1();
        break;
      }
    }
    
    void draw() {
      switch( sketchToDo ) {
      case 0:
        drawSketch0();
        break;
      case 1:
        drawSketch1();
        break;
      }
    }
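    As the number of sub-sketches grows, the switch in both setup() and draw() gets harder to maintain. One alternative is to give each sub-sketch a common interface and index into an array, so adding a sketch means adding one array entry. A rough plain-Java sketch of the idea (the MiniSketch interface and its names are hypothetical, not part of Processing):

    ```java
    public class SketchDispatch {
        // Hypothetical interface each sub-sketch would implement; in Processing
        // these could be inner classes sharing one Capture object.
        interface MiniSketch {
            String name();
            void drawFrame();
        }

        public static void main(String[] args) {
            MiniSketch[] sketches = {
                new MiniSketch() {
                    public String name() { return "glitch1"; }
                    public void drawFrame() { /* pixel-shift pass would go here */ }
                },
                new MiniSketch() {
                    public String name() { return "glitch2"; }
                    public void drawFrame() { /* ellipse-grid pass would go here */ }
                }
            };
            // One random pick replaces the growing switch; draw() then just
            // calls sketches[chosen].drawFrame() every frame.
            int chosen = new java.util.Random().nextInt(sketches.length);
            sketches[chosen].drawFrame();
            System.out.println(sketches[chosen].name());
        }
    }
    ```

    The other practical constraint from the original code still applies: only the main setup() may call size(), so each sub-sketch's setup must be a plain method.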
    

    Glitch 1

    import processing.video.*;
    Capture video;

    PImage img1;
    int w=640, h=480;
    
    boolean bright = true;
    boolean greyScale;
    int shiftAmount = 4;
    int grid = 1;
    
    
    void setup() {
      size(640, 480);
      video = new Capture(this, 640, 480); 
      video.start();
    }
    
    void draw() { 
      loadPixels(); // Fills pixelarray
      float mouseMap = (int) map(mouseX, 0, width, 0, 255*3); // Brightness threshold mapped to mouse coordinates
    
      if (shiftAmount > 24 || shiftAmount < 0) {
        shiftAmount = 0;
      }
    
      for (int y = 0; y< h; y++)
      {
        for (int x = 0; x< w; x++)
        {
          color c = video.pixels[y*video.width+x]; 
    
          int a = (c >> 24) & 0xFF;
          int r = (c >> 16) & 0xFF;  
          int g = (c >> 8) & 0xFF;  
          int b = c & 0xFF; 
    
          if (y %grid == 0) {
    
            if (bright)
            {
              if (r+g+b > mouseMap) {
                pixels[y*w+x] = c << shiftAmount; // Bit-shift based on shift amount
              }
            }
    
            if (!bright)
            {
              if (r+g+b < mouseMap) {
                pixels[y*w+x] = c << shiftAmount; // Bit-shift based on shift amount
              }
            }
          }
        }
      }
      updatePixels();
    
      if (greyScale) {
        filter(GRAY);
      }
    
      println("Shift amount: " + shiftAmount + " Frame rate: " + (int) frameRate + " Greyscale: " + greyScale) ;
    }
    
    void keyPressed()
    // Keyboard controls
    {
      switch(keyCode) {
      case UP:
        shiftAmount++;
        break;
      case DOWN:
        shiftAmount--;
        break;
      case LEFT:
        if (grid > 1) {
          grid--;
        }    
        break;
      case RIGHT:
        grid++;    
        break;
      case TAB:
        bright = !bright;  // toggle; the original pair of sequential ifs always ended up true
        break;
      case ENTER:
        greyScale = !greyScale;
        break;
      }
    }
    
    void captureEvent(Capture c) { 
      c.read();
    }
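    The bit-shift-and-mask lines in the loop above unpack the four 8-bit ARGB channels from Processing's packed 32-bit color int. A self-contained check in plain Java:

    ```java
    public class ChannelExtract {
        public static void main(String[] args) {
            // A Processing color is a packed 32-bit ARGB int; shifting and masking
            // pulls out each 8-bit channel, exactly as the glitch loop does.
            int c = 0xFF336699;              // alpha=0xFF, r=0x33, g=0x66, b=0x99
            int a = (c >> 24) & 0xFF;
            int r = (c >> 16) & 0xFF;
            int g = (c >> 8) & 0xFF;
            int b = c & 0xFF;
            System.out.println(a + " " + r + " " + g + " " + b);  // 255 51 102 153
        }
    }
    ```

    This is also why `c << shiftAmount` glitches the image: shifting the packed int slides bits between channels (and into alpha) rather than scaling brightness.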
    

    glitch 2

    import processing.video.*;
    
    Capture cam;
    
    void setup() {
      size(640, 480);
    
      String[] cameras = Capture.list();
      cam = new Capture(this, width, height);
      cam.start();
      ellipseMode(CENTER);
    }
    
    void draw() {
      noStroke();
      background(255);
      for (int i = 0; i < width; i = i+20) {
        for (int j = 0; j < height; j = j+20) {
          fill(cam.get(i, j) * 4);
          ellipse(i, j, 20, 20);
        }
      }
    
      if (cam.available() == true) {
        cam.read();
      }
      //image(cam, 0, 0);
    }
    

    Glitch 3

        import processing.opengl.*;
        import processing.video.*;
        import com.jogamp.opengl.GL;
        import com.jogamp.opengl.GL2ES2;
    
        Capture video;
    
        PJOGL pgl;
        GL2ES2 gl;
    
        boolean rotar = false;
    
        void setup() {
            size(640, 480, OPENGL);
    
            video = new Capture(this, width, height);
    
            video.start();  
    
        }
    
        void draw() {
    
            noFill();
            lights();
    
            strokeWeight(3);
            background(0);
    
            pgl = (PJOGL) beginPGL();  
              gl = pgl.gl.getGL2ES2();
    
            if (video.available()) {
              video.read();
              video.loadPixels();
              background(0);
    
             gl.glBlendColor(0.0,0.0,1,0.8);
    
              gl.glDisable(GL.GL_DEPTH_TEST);
              gl.glEnable(GL.GL_BLEND);      
              gl.glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);
    
              pushMatrix();
    
              if (rotar){
                translate(width/2, height/2, 0);
                rotateZ(radians((mouseX-(width))));
    
                //rotateZ(radians(mouseX));
                rotateY(radians(-(mouseY-(height)))); //rotateY -(mouseY
                //rotateY(radians(mouseX/TWO_PI));
                translate(-width/2, -height/2, 0);
              }  
    
    
              for (int y = 0; y < video.height; y+=5) {  //antes 5
    
                  beginShape(POINTS);
                    for (int x = 0; x < video.width; x++) {
    
                            int pixelValue = video.pixels[x+(y*video.width)];
    
                            stroke(red(pixelValue*2), green(pixelValue*2), blue(pixelValue*3), 255); //255
    
                            vertex (x*2, y*2, -(brightness(pixelValue)*2));  //100                                          
                    }
                  endShape();   
              }
    
              popMatrix();
              endPGL();
              saveFrame("img_###.png");
            }
        }
    
        void mouseClicked(){
          rotar = !rotar;
        }
    

    Glitch 4

    import processing.video.*; 
    Capture video;
    
    int w=640/2, h=480/2;
    int pixelDensity = 1;
    
    void setup() {
      size(800, 600, OPENGL);
      background(255);
      video = new Capture(this, w, h); 
      video.start();
      camera(100, height/2, (height/2) / tan(PI/6), width/2, height/2, 0, 0, 1, 1); // Adjust camera
    }
    
    void draw() {
      noCursor();  
      background(255);
      camera(mouseX*2, height/2, (height/2) / tan(PI/6), width/2, height/2, 0, 0, 1, 1); // Adjust camera
      translate(width/2-w/2, height/2-h/2, 100); 
      video.loadPixels(); // Load webcam feed to pixel array
    
      for (int y = 0; y< h; y++)
      {
        if (y%pixelDensity==0) {
          for (int x = 0; x< w; x++)
          {
            if (x%pixelDensity==0) {
              // Color extraction using bit-shifting
              color c= video.pixels[y*video.width+x];
              int r = c >> 16 & 0xFF; // Red
              int g = c >> 8 & 0xFF; // Green
              int b = c & 0xFF; // Blue
              stroke(c);
              point(30 + x, 30 + y, (r + g + b) / 3 + mouseY); // average brightness sets depth
            }
          }
        }
      }
    }
    
    void captureEvent(Capture c) { 
      c.read();
    }
    
    void keyPressed()
    {
      if (keyCode == LEFT && pixelDensity >1)
      {
        pixelDensity--;
      }
      if (keyCode == RIGHT)
      {
        pixelDensity++;
      }
    }
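
    The inner loop above unpacks the packed ARGB int by bit-shifting, which avoids the per-pixel overhead of Processing's red()/green()/blue() calls. Isolated as plain Java:

```java
public class UnpackColor {
    // Processing stores each pixel as a packed 32-bit ARGB int.
    static int alpha(int c) { return (c >>> 24) & 0xFF; }
    static int red(int c)   { return (c >> 16) & 0xFF; }
    static int green(int c) { return (c >> 8) & 0xFF; }
    static int blue(int c)  { return c & 0xFF; }

    public static void main(String[] args) {
        int c = 0xFF336699; // opaque ARGB color as found in pixels[]

        // The sketch offsets each point's z by the average channel value:
        int gray = (red(c) + green(c) + blue(c)) / 3;

        System.out.println(alpha(c) + " " + red(c) + " " + green(c)
                + " " + blue(c) + " " + gray);
    }
}
```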
    

    Glitch 5

    import processing.video.*;
    import controlP5.*;
    import oscP5.*;
    import netP5.*;
    
    OscP5 oscP5;
    NetAddress dest;
    ControlP5 cp5;
    Capture cam;
    
    float minSat = 100;
    float minBri = 170;
    float hueRange = 5;
    int blur = 0;
    boolean orignalVideo = true;
    int[] selectedHue = {0, 75, 150};
    int currentColor = 0;
    int currentX = 0;
    PGraphics filterColorImg;
    
    void setup() {
      size(640, 360);
    
      /* start oscP5, listening for incoming messages at port 9000 */
      oscP5 = new OscP5(this, 9000);
      //dest = new NetAddress("127.0.0.1",6448);
      dest = new NetAddress("192.168.43.62", 6448);
    
      cp5 = new ControlP5(this);
      cp5.addSlider("currentColor", 0, 2).setNumberOfTickMarks(3).linebreak();
      cp5.addSlider("minSat", 0, 255).linebreak();
      cp5.addSlider("minBri", 0, 255).linebreak();
      cp5.addSlider("hueRange", 0, 100).linebreak();
      cp5.addSlider("blur", 0, 30).linebreak();
      cp5.addToggle("orignalVideo");
    
      String[] cameras = Capture.list();
      cam = new Capture(this, cameras[0]);
      cam.start();
      filterColorImg = createGraphics(width, height);
      colorMode(HSB);
    }
    
    void draw() {
      if (cam.available() == true) {
        cam.read();
      }
      fastblur(cam, blur); //Apply fastblur
    
      pushMatrix();
      scale(-1, 1);
      image(cam, - (width), 0, width, height);
      loadPixels(); 
      popMatrix();
    
      filterColorImg.beginDraw();
      filterColorImg.background(0);
      OscMessage msg = new OscMessage("/test");
    
      for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
          int loc = x + y*width;
          color c = get(x, y); // fetch once instead of three times
          float h = hue(c);
          float s = saturation(c);
          float b = brightness(c);
    
          boolean matched = false;
          for (int i = 0; i < 3; i++) {
            if ((h < selectedHue[i] + hueRange && h > selectedHue[i] - hueRange) && s > minSat && b > minBri) {
              matched = true;
              if (x == currentX) {
                msg.add((float) 1);
                msg.add(map(s + b, 0, 255 + 255, 0, 1));
              }
            } else if (x == currentX) {
              msg.add((float) 0);
            }
          }
          // Write the pixel once, after all three hues are checked, so a later
          // non-matching hue can't overwrite an earlier match with black.
          if (matched) {
            pixels[loc] = color(h, s, b);
            filterColorImg.set(x, y, color(h, s, b));
          } else {
            pixels[loc] = color(0);
          }
        }
      }
      filterColorImg.endDraw();
    
      //oscP5.send(msg, dest);
      //println(msg);
    
      //updatePixels();
      if (orignalVideo) {
        pushMatrix();
        scale(-1, 1);
        image(cam, - (width), 0, width, height);
        popMatrix();
      } else {
       image(filterColorImg, 0, 0);
      }
    
      drawUI();
      drawTimer();
      currentX+=1;
    
      if (currentX>width) currentX = 0;
    }
    
    void mousePressed() {
      if (mouseX>180 || mouseY >180)  
        selectedHue[currentColor] = (int)hue(get(mouseX, mouseY));
    }
    
    void drawUI() {
      int h = 180;
      int w = 180;
      for (int i = 0; i<selectedHue.length; i++) {
        fill(selectedHue[i], 255, 255);
        if (currentColor == i) {
          stroke(255);
        } else {
          noStroke();
        } 
        rect(5 + i * 17, 5, 13, 13);
      }
      stroke(0);
      line(w, 0, w, h);
      line( 0, h, w, h);
    }
    
    void drawTimer() {
      stroke(255);
      line(currentX, 0, currentX, height);
    }
    
    void fastblur(PImage img, int radius)
    {
      if (radius<1) {
        return;
      }
      int w=img.width;
      int h=img.height;
      int wm=w-1;
      int hm=h-1;
      int wh=w*h;
      int div=radius+radius+1;
      int r[]=new int[wh];
      int g[]=new int[wh];
      int b[]=new int[wh];
      int rsum, gsum, bsum, x, y, i, p, p1, p2, yp, yi, yw;
      int vmin[] = new int[max(w, h)];
      int vmax[] = new int[max(w, h)];
      int[] pix=img.pixels;
      int dv[]=new int[256*div];
      for (i=0; i<256*div; i++) {
        dv[i]=(i/div);
      }
    
      yw=yi=0;
    
      for (y=0; y<h; y++) {
        rsum=gsum=bsum=0;
        for (i=-radius; i<=radius; i++) {
          p=pix[yi+min(wm, max(i, 0))];
          rsum+=(p & 0xff0000)>>16;
          gsum+=(p & 0x00ff00)>>8;
          bsum+= p & 0x0000ff;
        }
        for (x=0; x<w; x++) {
    
          r[yi]=dv[rsum];
          g[yi]=dv[gsum];
          b[yi]=dv[bsum];
    
          if (y==0) {
            vmin[x]=min(x+radius+1, wm);
            vmax[x]=max(x-radius, 0);
          }
          p1=pix[yw+vmin[x]];
          p2=pix[yw+vmax[x]];
    
          rsum+=((p1 & 0xff0000)-(p2 & 0xff0000))>>16;
          gsum+=((p1 & 0x00ff00)-(p2 & 0x00ff00))>>8;
          bsum+= (p1 & 0x0000ff)-(p2 & 0x0000ff);
          yi++;
        }
        yw+=w;
      }
    
      for (x=0; x<w; x++) {
        rsum=gsum=bsum=0;
        yp=-radius*w;
        for (i=-radius; i<=radius; i++) {
          yi=max(0, yp)+x;
          rsum+=r[yi];
          gsum+=g[yi];
          bsum+=b[yi];
          yp+=w;
        }
        yi=x;
        for (y=0; y<h; y++) {
          pix[yi]=0xff000000 | (dv[rsum]<<16) | (dv[gsum]<<8) | dv[bsum];
          if (x==0) {
            vmin[y]=min(y+radius+1, hm)*w;
            vmax[y]=max(y-radius, 0)*w;
          }
          p1=x+vmin[y];
          p2=x+vmax[y];
    
          rsum+=r[p1]-r[p2];
          gsum+=g[p1]-g[p2];
          bsum+=b[p1]-b[p2];
    
          yi+=w;
        }
      }
    }
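
    For readers new to fastblur(): it is a two-pass box blur that keeps a running sum of the 2·radius+1 samples in the window and replaces the per-pixel division with a precomputed lookup table dv[]. A minimal one-dimensional sketch of the same idea (my own rewrite, not the original code):

```java
public class BoxBlur1D {
    // Blur a row with window size 2*radius+1, clamping at the edges,
    // updating the running sum incrementally instead of re-summing per pixel.
    static int[] blur(int[] src, int radius) {
        int n = src.length, div = 2 * radius + 1;
        int[] dst = new int[n];

        // Precomputed division table, as in the sketch's dv[].
        int[] dv = new int[256 * div];
        for (int i = 0; i < dv.length; i++) dv[i] = i / div;

        // Prime the sum with the (clamped) window centered on index 0.
        int sum = 0;
        for (int i = -radius; i <= radius; i++)
            sum += src[Math.min(n - 1, Math.max(i, 0))];

        for (int x = 0; x < n; x++) {
            dst[x] = dv[sum];                            // average via table lookup
            int in  = src[Math.min(x + radius + 1, n - 1)]; // sample entering window
            int out = src[Math.max(x - radius, 0)];         // sample leaving window
            sum += in - out;
        }
        return dst;
    }

    public static void main(String[] args) {
        int[] row = {0, 0, 255, 0, 0};
        for (int v : blur(row, 1)) System.out.print(v + " "); // 0 85 85 85 0
    }
}
```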
    
    
    import processing.video.*;
    import controlP5.*;
    
    int numPixels;
    Capture video;
    int keyColor = 0xff000000;
    int keyR = (keyColor >> 16) & 0xFF;
    int keyG = (keyColor >> 8) & 0xFF;
    int keyB = keyColor & 0xFF;
    PVector chromaAreaStart;
    PVector chromaAreaEnd;
    boolean setChromaArea = false;
    int thresh = 20; // tolerance for matching the key color
    ControlP5 cp5;
    
    void setup() {
      size(960, 720); 
    
      video = new Capture(this, width, height);
      numPixels = video.width * video.height;
    
      video.start(); 
      addSlider();
    
      chromaAreaStart = new PVector(0, 0);
      chromaAreaEnd = new PVector(width, height);
    }
    
    void slider(int theThreshold) {
      thresh = theThreshold;
      println("slider event: setting threshold to " + theThreshold);
    }
    
    void addSlider() {
      noStroke();
      cp5 = new ControlP5(this);
    
      // add a vertical slider
      cp5.addSlider("slider")
        .setPosition(10, 10)
        .setSize(200, 20)
        .setRange(0, 255)   
        .setValue(thresh)
        ;
    }
    
    void draw() {
      if (video.available()) {
        background(0xFFFFFF);
        loadPixels();    
        video.read(); // Read a new video frame
        video.loadPixels(); // Make the pixels of video available
    
        for (int i = 0; i < numPixels; i++) { // For each pixel in the video frame...
          // Fetch the current color in that location
          color currColor = video.pixels[i];
          int currR = (currColor >> 16) & 0xFF; // apparently this is faster than using red(currColor);
          int currG = (currColor >> 8) & 0xFF;
          int currB = currColor & 0xFF;
    
          // Compute the difference of the red, green, and blue values
          int diffR = abs(currR - keyR);
          int diffG = abs(currG - keyG);
          int diffB = abs(currB - keyB);
    
          // Keep only the pixels that are not close to the keyColor
          int pixelX = i % width;
          int pixelY = i / width; // integer division; floor() not needed
    
          if ((diffR + diffG + diffB)> thresh && isWithinChromeArea(pixelX, pixelY)) {
            pixels[i] = color(currR, currG, currB);
          }
        }
        updatePixels();
    
        if (setChromaArea) {
          fill(255, 0, 255, 60);
          noStroke();
          int rectW = abs(round(chromaAreaEnd.x - chromaAreaStart.x));
          int rectH = abs(round(chromaAreaEnd.y - chromaAreaStart.y));
          int startX = min(int(chromaAreaStart.x), int(chromaAreaEnd.x));
          int startY = min(int(chromaAreaStart.y), int(chromaAreaEnd.y));
          rect(startX, startY, rectW, rectH);
        }
      }
    }
    
    boolean isWithinChromeArea(int xPos, int yPos) {
      boolean withinChromaArea = true;
    
      int startX = min(int(chromaAreaStart.x), int(chromaAreaEnd.x));
      int startY = min(int(chromaAreaStart.y), int(chromaAreaEnd.y));
      int endX = max(int(chromaAreaStart.x), int(chromaAreaEnd.x));
      int endY = max(int(chromaAreaStart.y), int(chromaAreaEnd.y));
    
      withinChromaArea = xPos > startX && xPos < endX;
    
      if (withinChromaArea) {
        withinChromaArea = yPos > startY && yPos < endY;
      }
    
      return withinChromaArea;
    }
    
    void mousePressed() {
      if (keyPressed) {
        chromaAreaStart = new PVector(mouseX, mouseY);
      } else if (mouseY > 100) {
        setChromaColour();
      }
    }
    
    void mouseDragged() {
      if (keyPressed) {
        setChromaArea = true;
        chromaAreaEnd = new PVector(mouseX, mouseY);
      }
    }
    
    void mouseReleased() {
      setChromaArea = false;
    }
    
    void setChromaColour() {
      keyColor = get(mouseX, mouseY);
      keyR = (keyColor >> 16) & 0xFF;
      keyG = (keyColor >> 8) & 0xFF;
      keyB = keyColor & 0xFF;
    }
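
    The keying test in draw() above is a Manhattan distance in RGB: a pixel survives only if the sum of its absolute per-channel differences from keyColor exceeds the slider threshold. Stand-alone, with helper names of my own choosing:

```java
public class ChromaDistance {
    // Sum of absolute channel differences between two packed RGB ints.
    static int colorDiff(int c1, int c2) {
        int dr = Math.abs(((c1 >> 16) & 0xFF) - ((c2 >> 16) & 0xFF));
        int dg = Math.abs(((c1 >> 8) & 0xFF) - ((c2 >> 8) & 0xFF));
        int db = Math.abs((c1 & 0xFF) - (c2 & 0xFF));
        return dr + dg + db;
    }

    // Keep the pixel (draw it over the background) when far enough from the key.
    static boolean keep(int pixel, int key, int thresh) {
        return colorDiff(pixel, key) > thresh;
    }

    public static void main(String[] args) {
        int key = 0x000000;                          // sketch default: key on black
        System.out.println(keep(0x0A0A0A, key, 20)); // 10+10+10 = 30 > 20 -> kept
        System.out.println(keep(0x050505, key, 20)); // 15 <= 20 -> keyed out
    }
}
```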
    
  • Using oscP5 over the internet

    Hello all, I'm currently trying to figure out how to use the oscP5 library to communicate over the internet. I have made two simple client and server sketches which act as a simple shared drawing canvas. I'm not posting the code in this thread to avoid clutter, as each one is about 70 lines even after I removed all the non-essential code. They can be found here:

    Server: https://pastebin.com/NgsRaLJP
    Client: https://pastebin.com/L4UqD7Uw

    Now the curious thing is that my code works when both are run simultaneously on my computer and the server IP in the client is set to '127.0.0.1'. When I replace that with my actual IP (IPv6) and run both on the same computer, a strange effect happens: the server can draw on itself and the client, while the client can only draw on itself. If I export the client to my Android phone and run it on a separate network from the server, neither client nor server can draw on the other. I have triple-checked that the server's listening port is port-forwarded through my router and that the client, when run on Android, has the 'CHANGE WIFI MULTICAST STATE' permission enabled.

    What am I doing terribly wrong? Thanks in advance!

  • WiFi Communication between PC and Android Problem

    I found the working code. Maybe someone will need it. Processing 3.3.6, Android Mode 4.0. Phone: Android 7.0.

    https://github.com/gratefulfrog/Car/wiki/Processing-Android-mode-and-Ketai-Library#wifi

    I deleted this part: `public void settings(){ fullScreen(); }`

    and in setup() used: `orientation(PORTRAIT); // or LANDSCAPE`

    Working code below:

    For the PC, Java mode:

    
    import oscP5.*;
    import netP5.*;
    
    OscP5 oscP5;
    NetAddress remoteLocation;
    float accelerometerX, accelerometerY, accelerometerZ;
    
    void setup() {
      size(480, 480);
      oscP5 = new OscP5(this, 18000);
      remoteLocation = new NetAddress("192.168.1.34", 18000); // Customize: enter your Android device's IP
      textAlign(CENTER, CENTER);
      textSize(24);
    }
    void draw() {
      background(78, 93, 75);
      text( "Remote Accelerometer Info: " + "\n" +
            "x: "+ nfp(accelerometerX, 1, 3) + "\n" +
            "y: "+ nfp(accelerometerY, 1, 3) + "\n" +
            "z: "+ nfp(accelerometerZ, 1, 3) + "\n" +
            "Local Info: \n" +
            "mousePressed: " + mousePressed, width/2, height/2);
    
      OscMessage myMessage = new OscMessage("mouseStatus");
      myMessage.add(mouseX);
      myMessage.add(mouseY);
      myMessage.add(int(mousePressed));
      oscP5.send(myMessage, remoteLocation);
    }
    
    void oscEvent(OscMessage theOscMessage) {
      if (theOscMessage.checkTypetag("fff")) {
        accelerometerX = theOscMessage.get(0).floatValue();
        accelerometerY = theOscMessage.get(1).floatValue();
        accelerometerZ = theOscMessage.get(2).floatValue();
      }
    }
    

    For the Android device, Android mode:

    
    /* Wifi to pc from Nexus 4
     * works PERFECTLY !!!
     */
    
    
    import netP5.*;
    import oscP5.*;  
    import ketai.net.*;
    import ketai.sensors.*;
    OscP5 oscP5;
    KetaiSensor sensor;
    
    
    NetAddress remoteLocation;
    float myAccelerometerX, myAccelerometerY, myAccelerometerZ;
    int x, y, p;
    String myIPAddress;
    String remoteAddress = "192.168.1.2"; // Customize: enter your PC's IP
    
    
    
    void setup() {
      sensor = new KetaiSensor(this);
      orientation(PORTRAIT);
      textAlign(CENTER, CENTER);
      textSize(36);
      initNetworkConnection();
      sensor.start();
    }
    
    void draw() {
      background(78, 93, 75);
      
      text( "Remote Mouse Info: \n" +
            "mouseX: " + x + "\n" +
            "mouseY: " + y + "\n" +
            "mousePressed: " + p + "\n\n" +
            "Local Accelerometer Data: \n" +
            "x: " + nfp(myAccelerometerX, 1, 3) + "\n" +
            "y: " + nfp(myAccelerometerY, 1, 3) + "\n" +
            "z: " + nfp(myAccelerometerZ, 1, 3) + "\n\n" +
            "Local IP Address: \n" + myIPAddress + "\n\n" +
            "Remote IP Address: \n" + remoteAddress, width/2, height/2);
      
      OscMessage myMessage = new OscMessage("accelerometerData");
      myMessage.add(myAccelerometerX);
      myMessage.add(myAccelerometerY);
      myMessage.add(myAccelerometerZ);
      oscP5.send(myMessage, remoteLocation);
    }
    
    void oscEvent(OscMessage theOscMessage) {
      if (theOscMessage.checkTypetag("iii")){
        x = theOscMessage.get(0).intValue();
        y = theOscMessage.get(1).intValue();
        p = theOscMessage.get(2).intValue();
      }
    }
    
    void onAccelerometerEvent(float x, float y, float z){
      myAccelerometerX = x;
      myAccelerometerY = y;
      myAccelerometerZ = z;
    }
    void initNetworkConnection(){
      oscP5 = new OscP5(this, 18000);
      remoteLocation = new NetAddress(remoteAddress, 18000);
      myIPAddress = KetaiNet.getIP();
    }
    
  • Processing Sketches Have Messed Up Keyboards

    Hi @hudson_m4000,

    The whole project would, I think, be a lot to post here, and considering the keyboard is only ever affected in the place I described, I'm not sure it's fully necessary (let me know if I'm just being ignorant though, apologies if I am :) ). The problem isn't very persistent; the times at which the bug occurs appear to be completely random. When I first began experimenting with the Android Processing API and used the keyboard functions, everything worked fine for quite a while. Then one day an app that I had installed on my phone during testing, which had previously never shown any issues, started getting the error. I should mention again that it happened across ALL Android Processing sketches (apps installed that were developed through Processing), even one app which was the exact example from the reference linked. And all other apps like Discord, Twitter, Messenger etc. had the normal keyboard. But somehow all other devices installed the EXACT same APK package and didn't have any errors.

    Here are the imports I'm using if it helps in any way:

    import oscP5.*;
    import netP5.*;
    import android.view.inputmethod.InputMethodManager;
    import android.content.Context;
    import android.content.SharedPreferences;
    import android.app.Activity;
    import android.preference.PreferenceManager;

    Kind regards, Night. :)

    Side note: even though InputMethodManager is never used or referenced in the code, is it possible that just importing it could mess up the keyboard? I don't understand how that would affect all Processing applications installed on the phone though... most don't have the import at all. :-/

  • communicate between two macs in processing

    @Faffie -- I'd suggest starting out by installing the library and trying to run some of the examples that come with it.

    • Processing > Sketch > Import Library... > Add Library... > oscP5
    • Processing > File > Examples > Contributed Libraries > oscP5

    The easiest way is to begin by testing sending OSC messages between two programs on the same computer -- then move one of the programs to another computer and see if they can hear each other over the network. That helps isolate a common category of why-won't-it-work problems.
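
    Since OSC messages ride on plain UDP, a quick sanity check before debugging oscP5 itself is to confirm that raw UDP packets get through on the loopback address. A minimal check in stock Java (no oscP5; the port number 12000 is an arbitrary choice):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpLoopback {
    // Send one UDP datagram to 127.0.0.1:port and return what arrives.
    static String loopback(int port) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(port);
             DatagramSocket sender = new DatagramSocket()) {

            byte[] payload = "hello".getBytes("UTF-8");
            sender.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName("127.0.0.1"), port));

            byte[] buf = new byte[64];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            receiver.setSoTimeout(2000); // fail fast instead of hanging forever
            receiver.receive(in);
            return new String(in.getData(), 0, in.getLength(), "UTF-8");
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loopback(12000)); // prints "hello" if UDP loopback works
    }
}
```

    If this works but oscP5 doesn't, the problem is in the sketch or library configuration rather than the network path.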

  • communicate between two macs in processing

    @Faffie -- it sounds like you want network communication.

    In addition to the network library, there are lots of options in the libraries list under 'data' and 'io' depending on what you are trying to do.

    Specifically, if your two macs are on the same local subnet, you could use OSC message passing. That is a common way of orchestrating multiple machines to react to one another -- OSC is popular in music / VJ-ing.

  • How could I use a function from different class?

    Sorry, I don't have the hardware, nor do I know much about the oscP5 library. :-<
    My answers were simply about what was obvious from what you had posted so far. :-\"

  • How could I use a function from different class?
    • You can also pre-process the received OscMessage inside PApplet::oscEvent() by calling the OscMessage::addrPattern() method:
      http://Sojamo.de/libraries/oscp5/examples/oscP5message/oscP5message.pde
    • Given the possible receivable patterns are: "/output_1", "/output_2" & "/output_3".
    • You can instead just grab the last number character from the acquired String: "1", "2" & "3".
    • And pass that parsed value, rather than the whole OscMessage, to a method from Particle: *-:)

    // Forum.Processing.org/two/discussion/25814/
    // how-could-i-use-a-function-from-different-class#Item_8
    
    // GoToLoop (2018-Jan-04)
    
    import oscP5.*;
    import java.util.List;
    
    final List<Particle> particles = new ArrayList<Particle>();
    
    void oscEvent(final OscMessage msg) {
      final String addr = msg.addrPattern();
      if (addr == null || addr.isEmpty())  return;
    
      final int act = addr.charAt(addr.length() - 1) - '0';
      for (final Particle p : particles)  p.action(act);
    }
    
    class Particle {
      void action(final int action) {
        println("Action:", action);
      }
    }
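
    The parsing trick in the snippet above, addr.charAt(addr.length() - 1) - '0', works because the digit characters are contiguous in ASCII/Unicode, so subtracting '0' from a digit character yields its numeric value. Isolated (the helper name is hypothetical):

```java
public class AddrPatternDigit {
    // Mirrors the snippet: take the last character of the OSC address
    // pattern and convert it from '1'..'3' to the int 1..3.
    static int lastDigit(String addr) {
        return addr.charAt(addr.length() - 1) - '0';
    }

    public static void main(String[] args) {
        System.out.println(lastDigit("/output_1")); // 1
        System.out.println(lastDigit("/output_3")); // 3
    }
}
```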
    
  • How could I use a function from different class?
    • If that's your whole Particle class, right off the bat I can see you don't initialize its 2 fields:
      position & velocity!
    • At least do this: final PVector position = new PVector(), velocity = new PVector();.
    • Another bug is that you expect the oscP5 library's instance to callback Particle::oscEvent() method.
    • However, the oscP5 library expects the oscEvent() method to belong to the PApplet's instance we pass as its argument when we instantiate it via new.
  • WiFi Communication between PC and Android Problem

    Solved!

    I found another discussion with a similar problem:

    https://forum.processing.org/two/discussion/18194/receive-osc-using-oscp5-library-android-mode-4-0-beta2

    It seems that oscP5 is not compatible with Android Mode 4.0, so I changed the code to use the UDP library, and now it works.

  • WiFi Communication between PC and Android Problem

    Hi all,

    I am trying to establish bidirectional communication over WiFi between an Android device and a PC running Windows. I have followed Daniel Sauter's book "Rapid Android Development", Chapter 6: Networking Devices with WiFi (https://www.mobileprocessing.org/networking.html), and am trying the first example of the chapter.

    On the Android side, the code can be found at:

    https://github.com/danielsauter/rapid-android-development/blob/master/code/Networking/WiFiDataExchangeAndroid/WiFiDataExchangeAndroid.pde

    On the PC side, the code can be found at:

    https://github.com/danielsauter/rapid-android-development/blob/master/code/Networking/WiFiDataExchangePC/WiFiDataExchangePC.pde

    When running both sketches, the Android device gets the messages from the PC, but the PC doesn't get the messages sent by the Android device. Is there any solution? If not, is there another approach to establishing the communication using Processing?

    I am using:

    PC: Windows 10 v.1709, Processing 3.3.6, Android Mode 4.0, Ketai 14, oscP5 0.9.9

    Android device: Samsung S5 Mini (SM-G800F), Android 6.0.1

    Thanks in advance!

  • Showing IP address of my device in the network?

    Hello,

    I want to send data from one device to another using the oscP5 library. Is there a function that returns the IP address of my device on the network? For example, the address of my notebook is 192.168.178.25.

    Thank you for your help!
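
    On the desktop side (Processing sketches can call plain Java directly), one common approach is to scan the network interfaces for a site-local address; on Android, Ketai's KetaiNet.getIP() shown elsewhere on this page serves the same purpose. A sketch under those assumptions (the helper name is mine):

```java
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class LocalIp {
    // Returns the first non-loopback site-local IPv4 address (e.g. 192.168.x.x),
    // or null if the machine has none.
    static String siteLocalIp() throws Exception {
        for (NetworkInterface ni : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress addr : Collections.list(ni.getInetAddresses())) {
                if (addr.isSiteLocalAddress() && !addr.isLoopbackAddress()) {
                    return addr.getHostAddress();
                }
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(siteLocalIp());
    }
}
```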

  • How can I control Ableton Live 9 with Processing?

    Is it still possible to use LiveOSC to control Ableton Live 9?

    I've tried using LiveOSC and LiveOSC2 with oscP5 and Live 9.7.5 on Mac Sierra and Windows 10 with no luck.

    What are you guys/gals using to control Ableton Live these days?