
  • Is it possible to sync Processing's frameRate/draw function with an incoming MIDI signal?

    Hi everyone, today I had this idea and tried to write some code, but then got stuck.

    I'm thinking about syncing Processing's frameRate/draw function with an incoming MIDI signal because I would love to create a visualization in sync with music coming from Ableton.

    With the MidiBus library and the IAC Driver on my Mac, I get the MIDI Sync/Clock messages from Ableton into Processing, but I haven't been able to link the two together.

    This is my code so far but I'm not sure it's working properly. Any help would be greatly appreciated.

    import themidibus.*; //Import the library
    import javax.sound.midi.MidiMessage; //Import the MidiMessage classes http://java.sun.com/j2se/1.5.0/docs/api/javax/sound/midi/MidiMessage.html
    import javax.sound.midi.SysexMessage;
    import javax.sound.midi.ShortMessage;
    
    MidiBus myBus; // The MidiBus
    
    int i = 1;
    
    void setup() {
      size(400, 400);
      background(0);
      frameRate(60);
      MidiBus.list();
      myBus = new MidiBus(this, 0, 1);
    }
    
    void draw() {
      if (frameCount % 60 == 0) {
        println("min");
        i = 1;
      }
    }
    
    void midiMessage(MidiMessage message) {
      print(i + " ");
      i++;
    }
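    MIDI clock runs at 24 ticks per quarter note (status byte 0xF8, exposed as ShortMessage.TIMING_CLOCK in javax.sound.midi), so one way to link the two is to count ticks inside midiMessage() and advance the visualization on every 24th tick, rather than relying on frameRate(). A minimal sketch of that counting logic, written as plain Java so it can run outside Processing (ClockCounter is a hypothetical helper name):

```java
// Tick counter for the MIDI clock standard: Ableton emits 24
// Timing Clock messages (status byte 0xF8) per quarter note.
class ClockCounter {
    static final int TICKS_PER_QUARTER = 24;
    private int ticks = 0;
    private int beats = 0;

    // Call this from midiMessage() whenever a 0xF8 status byte arrives.
    // Returns true exactly once per quarter note, i.e. on the beat.
    boolean tick() {
        ticks++;
        if (ticks >= TICKS_PER_QUARTER) {
            ticks = 0;
            beats++;
            return true;
        }
        return false;
    }

    int beatCount() { return beats; }

    // Estimate tempo from the average interval between ticks, in ms.
    static double bpm(double msPerTick) {
        return 60000.0 / (TICKS_PER_QUARTER * msPerTick);
    }
}
```

    In the sketch you would check message.getStatus() == 0xF8 before counting, since note and controller messages also arrive through midiMessage().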
    
  • Processing - Kinect/ gesture - music interaction (Max7?)

    Hello everyone!

    I am using the Kinect with Windows library (https://github.com/ThomasLengeling/KinectPV2) in Processing and detecting all the skeleton joints and positions. What I want to do is combine sounds and music with the movements of the skeleton joints (e.g. control the pitch or volume of a sound according to the x-position of the left hand). Something like this ().

    As I don't have any experience with music software, I would like to ask which method you recommend for implementing this. A possible approach I've found to combine with Processing is Max7 (). But I'm not sure Max is the most suitable approach, as I'm aware there are many other options (Ableton, virtual MIDI controllers). Regarding Max7, I'm thinking I could control a slider with a skeleton joint position (e.g. as the hand moves right, the slider value increases), but I don't know how to assign the position values from Processing to a Max slider. Any guidelines would be appreciated!

    Michael
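    One common pattern, whichever host you choose, is to normalize each joint coordinate into the 0-127 range a MIDI controller value expects. A plain-Java sketch of that mapping (positionToCc is a hypothetical helper name; in Processing you could pass the result to TheMidiBus's sendControllerChange()):

```java
// Hypothetical helper: scale a joint coordinate (e.g. left-hand x)
// into the 0-127 range of a MIDI control change value.
class CcMapper {
    // Clamp x to [0, width], then scale linearly to 0..127.
    static int positionToCc(float x, float width) {
        float clamped = Math.max(0, Math.min(width, x));
        return Math.round(clamped / width * 127);
    }
}
```

    In a sketch that would look something like midi.sendControllerChange(0, 1, CcMapper.positionToCc(handX, width)); on the Max side the same scaling is what the scale object does.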

  • Night:Code

    Very proud to share this first project for my masters. I wrote a piece of code that motion-tracks webcam movement, converts it to MIDI data and then sends it out to Ableton to create a soundscape.

    I present to you, Night:Code

  • Still getting lag when sending midi from Processing to Ableton.

    Also, I don't think it's a problem with Ableton; I think it's a problem with the code. I'm not sure whether Processing stops sending MIDI or Ableton stops receiving it, but under normal circumstances Ableton runs perfectly for me. My Mac is pretty well specced and only a month old.

  • Still getting lag when sending midi from Processing to Ableton.

    Hello all.

    So my code is now advancing, but it's still laggy. Ableton receives the data, but it's very temperamental. I've avoided using delay() as it delays the entire sketch. If anyone has any suggestions or can see any anomalies, it would be much appreciated! Thank you :)

    import gab.opencv.*;
    import processing.video.*;
    import ipcapture.*;
    import org.opencv.core.Rect;
    import themidibus.*; //Import the library
    
    
    OpenCV opencv;
    IPCapture cam;
    MidiBus midi; 
    int lastMillis;
    int [] timestamps = new int [128];
    
    void setup() {
      size(640, 480);
      //video = new Movie(this, "street.mov"); MUTED
      cam = new IPCapture(this, "http://153.201.66.43:60001/mjpg/video.mjpg?resolution=640x480", "", "");
      cam.start();
      opencv = new OpenCV(this, 640, 480);
    
      opencv.startBackgroundSubtraction(5, 3, 0.5);
      MidiBus.list();
      midi = new MidiBus(this, -1, "Bus 1");
    
      //video.loop();
      //video.play();
    }
    
    void draw() {
      if (cam.isAvailable()) {
        cam.read();
        image(cam, 0, 0);  
        opencv.loadImage(cam);
        opencv.updateBackground();
    
        opencv.dilate();
        opencv.erode();
    
        noFill();
        stroke(255);
        strokeWeight(1);
    
        int rx = 300; 
        int ry = 150;
        int rw = 30;
        int rh = 15;
        rect(rx, ry, rw, rh); // rect() takes width and height, not corners
    
        int rx2 = 10; 
        int ry2 = 10;
        int rw2 = 15;
        int rh2 = 15;
        rect(rx2, ry2, rw2, rh2);
    
    
        stroke(255, 0, 0);
        strokeWeight(10);
        for (Contour contour : opencv.findContours()) {
          int x = contour.getBoundingBox().x;
          int y = contour.getBoundingBox().y;
          println(x, y);
          contour.draw();

          // Send the note-on once when the contour enters the region;
          // timestamps[] remembers when each note was triggered.
          if ((x > rx && x < rx + rw) && (y > ry && y < ry + rh)) {
            if (timestamps[62] == 0) {
              midi.sendNoteOn(0, 62, 127);
              timestamps[62] = millis();
            }
          }
        }

        // Release the note 40 ms later without blocking draw().
        if (timestamps[62] != 0 && millis() > timestamps[62] + 40) {
          midi.sendNoteOff(0, 62, 127);
          timestamps[62] = 0;
        }
      }
    }
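    The note-off timing is the fragile part here: releasing notes on a single shared timer drops or prolongs notes unpredictably. One way to structure it, shown as a plain-Java sketch with hypothetical names, is a small scheduler that remembers when each pitch was triggered and reports which pitches are due for release; draw() then sends the corresponding note-offs without ever calling delay():

```java
import java.util.Arrays;

// A minimal per-note scheduler: each pitch is held for a fixed time
// and released later without blocking the draw loop.
class NoteScheduler {
    private final long[] onSince = new long[128]; // 0 = not sounding
    private final long holdMs;

    NoteScheduler(long holdMs) { this.holdMs = holdMs; }

    // Returns true if a note-on should be sent (note not already held).
    boolean noteOn(int pitch, long now) {
        if (onSince[pitch] != 0) return false;
        onSince[pitch] = now;
        return true;
    }

    // Returns pitches whose hold time has elapsed; the caller sends
    // a note-off for each, and the pitch becomes retriggerable.
    int[] due(long now) {
        int n = 0;
        int[] tmp = new int[128];
        for (int p = 0; p < 128; p++) {
            if (onSince[p] != 0 && now - onSince[p] >= holdMs) {
                onSince[p] = 0;
                tmp[n++] = p;
            }
        }
        return Arrays.copyOf(tmp, n);
    }
}
```

    In the sketch, noteOn(62, millis()) would guard midi.sendNoteOn(), and each pitch returned by due(millis()) would get a midi.sendNoteOff().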
    
  • MIDI signal from Processing to Ableton is very glitchy, any ideas on how to help this issue?

    Hey all.

    Finally cracked my new project of triggering MIDI in Ableton via webcam movement, but the MIDI signal is very glitchy and intermittent, despite my console telling me it is sending out signals. I feel it might be an issue with buffer size, as it works okay when I'm sending a small amount of data. Anything bigger and it glitches.

    If you need a screen recording of what is happening, let me know. Otherwise I've posted the code underneath. Thanks!

    import gab.opencv.*;
    import processing.video.*;
    import ipcapture.*;
    import org.opencv.core.Rect;
    import themidibus.*; //Import the library
    
    OpenCV opencv;
    IPCapture cam;
    MidiBus midi; 
    
    
    void setup() {
      size(640, 480);
      //video = new Movie(this, "street.mov"); MUTED
      cam = new IPCapture(this, "http://" + "100.38.83.153:8081/mjpg/video.mjpg?resolution=640x480", "", "");
      cam.start();
      opencv = new OpenCV(this, 640, 480);
    
      opencv.startBackgroundSubtraction(5, 3, 0.5);
      MidiBus.list();
      midi = new MidiBus(this, -1, "IAC Bus 1");
    
      //video.loop();
      //video.play();
    }
    
    void draw() {
      if (cam.isAvailable()) {
        cam.read();
        image(cam, 0, 0);  
        opencv.loadImage(cam);
        opencv.updateBackground();
    
        opencv.dilate();
        opencv.erode();
    
        noFill();
        stroke(255, 0, 0);
        strokeWeight(5);
        for (Contour contour : opencv.findContours()) {
    
    
          println(contour.getBoundingBox().x);
          println(contour.getBoundingBox().y);
          int x = contour.getBoundingBox().x;
          int y = contour.getBoundingBox().y;
    
          contour.draw();
    
          // Map x and y into 50 px bands; each band triggers one note,
          // and the braces ensure a note only fires inside its band.
          if (x > 1 && x < 700) {
            int note = 60 + (699 - x) / 50; // 650-699 -> 60, 600-649 -> 61, ...
            println("x" + (note - 59));
            midi.sendNoteOn(0, note, 127);
            midi.sendNoteOff(0, note, 127);
          }

          if (y > 1 && y < 700) {
            int note = 60 + (699 - y) / 50;
            println("y" + (note - 59));
            midi.sendNoteOn(1, note, 127);
            midi.sendNoteOff(1, note, 127);
          }
        }
    
    
    
        // void movieEvent(Movie m) {
        //  m.read();
      }
    }
    

    EDIT. Sorry, being stupid and can't get the code quoted properly! Edit 2, Got it ;)
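    Since each 50-pixel band of x (and y) selects the next note up, the fourteen if blocks can be collapsed into a single arithmetic mapping, which also removes the risk of a missing brace firing every note on every frame. A plain-Java sketch of that mapping (xToNote is a hypothetical helper; note numbers follow the 60-and-up sequence used in the sketch):

```java
// Hypothetical helper: map an x coordinate into 50 px bands, with
// band 650-699 giving note 60, 600-649 giving note 61, and so on.
class BandMapper {
    // Returns the MIDI note for x in (1, 700), or -1 when out of range.
    static int xToNote(int x) {
        if (x <= 1 || x >= 700) return -1;
        return 60 + (699 - x) / 50;
    }
}
```

    Sending far fewer (and deduplicated) messages per frame is usually the first thing to try when the MIDI stream glitches under load.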

  • How do I use the contour x y location data to send to a MidiBus

    Hello all.

    I'm currently trying to create a project where Processing reads movements from a webcam and sends data to Ableton to trigger sound clips. I have managed to merge two sets of code and am now able to track movement (using IPCapture and OpenCV), but I'm struggling to find the correct code to deal with the x/y data. I shall post the code I have and what I want to add. Thank you!

    What I have:

    import themidibus.*;
    
    import gab.opencv.*;
    import processing.video.*;
    import ipcapture.*;
    import org.opencv.core.Rect;
    
    
    
    OpenCV opencv;
    IPCapture cam;
    
    
    void setup() {
      size(640, 480);
      //video = new Movie(this, "street.mov"); MUTED
      cam = new IPCapture(this, "http://153.201.66.43:60001/axis-cgi/mjpg/video.cgi?resolution=640x480", "", "");
      cam.start();
      opencv = new OpenCV(this, 640, 480);
    
      opencv.startBackgroundSubtraction(5, 3, 0.5);
    
      //video.loop();
      //video.play();
    }
    
    void draw() {
      if (cam.isAvailable()) {
        cam.read();
        image(cam, 0, 0);  
        opencv.loadImage(cam);
    
        opencv.updateBackground();
    
        opencv.dilate();
        opencv.erode();
    
        noFill();
        stroke(255, 0, 0);
        strokeWeight(10);
        for (Contour contour : opencv.findContours()) {
          print(contour.getBoundingBox());
          contour.draw();
        }
      }
    }

    What I want to add:

    void onGoLCellChange(boolean state, int x, int y) {
      // mapping from the GoL grid to MIDI messages
      int channel, note, velocity;
      if (y < 10) {
        channel = 0;
        note = x + 20;
        velocity = (y * 5) + 30;
      } else {
        channel = 1;
        note = x + 40;
        velocity = y + 30;
      }

      if (state == true) {
        midi.sendNoteOn(channel, note, velocity);
      } else {
        midi.sendNoteOff(channel, note, velocity);
      }
    }
    

    The second part is from the Game of Life code. I figure this is close to what I want but cannot get it to work. I have tried void onBoundingBox(boolean state, int x, int y) and void onContour(boolean state, int x, int y) but keep getting error messages.

    Thank you for your time!
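    For what it's worth, onGoLCellChange() is not a library callback: the Game of Life sketch defines and calls it itself, which is why renaming it to onBoundingBox() or similar only produces errors. With OpenCV contours there is no callback to hook into; instead you can call a mapping helper directly inside the for (Contour ...) loop. A plain-Java sketch of such a helper (the names and the half-frame split at y = 240 are assumptions, adapting the GoL mapping to a 640x480 frame):

```java
// Hypothetical helper adapting the GoL channel/note/velocity mapping
// to contour bounding-box pixel coordinates in a 640x480 frame.
class ContourNote {
    // Returns {channel, note, velocity} for a bounding-box position.
    static int[] fromPosition(int x, int y) {
        int channel, note, velocity;
        if (y < 240) {                 // top half of the frame
            channel = 0;
            note = clamp(x / 10 + 20);
            velocity = clamp(y / 4 + 30);
        } else {                       // bottom half
            channel = 1;
            note = clamp(x / 10 + 40);
            velocity = clamp((y - 240) / 4 + 30);
        }
        return new int[] { channel, note, velocity };
    }

    // Keep values inside the legal MIDI range 0..127.
    static int clamp(int v) { return Math.max(0, Math.min(127, v)); }
}
```

    Inside the contour loop you would then do something like int[] m = ContourNote.fromPosition(x, y); midi.sendNoteOn(m[0], m[1], m[2]); after declaring MidiBus midi as in your other sketches.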

  • Quickie regarding laptop choice - any and all help appreciated.

    Hey guys. I'm looking at setting something up in which I'm essentially running a pretty weighty Ableton project and sending live audio into Processing for visualisations, which I'll then project onto myself.

    I'm just wondering what I should be looking for in a laptop, mainly in terms of CPU/GPU as I don't want to blow a grand if I should have spent two in the first place!

    Currently looking at this - https://www.gearbest.com/laptops/pp_683487.html?lkid=11265755

    Specs are as follows

    CPU: Core i7-7700HQ (Kaby Lake, quad core (does Processing utilise this?), 2.8 GHz base, up to 3.8 GHz boost).

    GPU: GTX 1050 Ti (4 GB).

    RAM: 8 GB DDR4 (expandable).

    I'll probably throw in my own SSD.

    Battery life will probably be rubbish, but that's no issue as it'll be plugged in most of the time. I have a desktop; I just need something that I'll be able to gig with down the line, take to group meets, etc. I won't be gaming on it or anything, and I'll likely not even connect it to the internet. It's literally just to make music/visualisations on (I'm learning to code with Processing).

    Also - I'm aware this question probably gets asked a lot by people with unrealistic budgets so I just want to say that I have more money if people think this laptop won't cut the mustard. I'm not made of unlimited money though so I want bang per buck so to speak, not anything super top of the range!

    One final note: I don't expect people to do my shopping for me :-) Just any tips regarding specs for my desired goals would be HUGELY appreciated! I'm aware that it's difficult, because sketches and audio work take as much CPU power as you essentially tell them to, and you guys have no idea how many stupidly hungry plugins I'll be using, etc. But I guess there are ways of keeping resource usage to a minimum (bouncing effects onto tracks and whatnot).

    Many thanks for the help guys, it is truly appreciated. I've been reading for over a week now but the sea of laptops is pretty large so just a couple of starting points would be incredibly useful!

  • Wondering if Processing is suitable for a few ideas I've had.

    Lots of Processing users have worked with Ableton:

    including DJ visualization work, e.g.

    See also node-based IDEs:

    in particular, if you are interested in the MaxMSP/Jitter paradigm you might want to try PraxisLive:

    From the announcement:

    • Fork components on-the-fly, or build new ones from scratch, with embedded compiler and editor for live-coding Java/Processing and GLSL.
    • All components are effectively Processing sketches, with no limits on the numbers of individual components or component graphs.
  • Wondering if Processing is suitable for a few ideas I've had.

    Good afternoon all - hope everybody is doing well.

    A quick run down - I took a module in Processing at uni years ago and thoroughly enjoyed my time with it. I haven't done any sort of programming since but really want to get back into creative coding.

    Essentially, the ultimate goal is to create live abstract visualisations of audio (the audio being part pre-programmed and part live) and project it onto myself whilst 'gigging'. Alongside learning Processing, I'll be learning Ableton as I'm bored of Cubase and from what I've heard Ableton is a lot more flexible.

    So I guess the main question is this... Is it possible/relatively painless to route multiple tracks of audio from Ableton into Processing and then use data from those tracks to trigger visuals? I'm guessing I'd have to get some sort of virtual I/O device and send the audio to both the speakers and this virtual device, which Processing would pick up and analyse from there? I know it's possible to do this with MIDI data, so I assume it should be the same for audio?

    Also to note, I'm aware that as I'll be learning Ableton I could probably just use MaxMSP/Jitter and make life easier for myself. The aim, though, is to teach myself to code; I don't mind jumping through a few hoops to get it up and running, as long as my CPU doesn't explode under the load...

    Many thanks all anyway, have a good evening!

  • Does Processing 3.3.5 support skeleton tracking via Kinect?

    Hey, I have an installation where I need to trigger live sound from Ableton 9 and some lights, using the Kinect as a trigger. I have downloaded the oscP5 library and the Kinect libraries. Where can I find code that can help me? Thanks

  • How to make Kinect detect Ableton 9 for live tracking?

    https://forum.processing.org/one/topic/communication-between-processing-and-ableton-live-with-osc-opensoundcontrol.html

    LiveOSC is an interface for Ableton v8 and up: http://livecontrol.q3f.org/ableton-liveapi/liveosc/

    I am guessing that if you use LiveOSC and oscP5, you can get Processing and Live talking.

    On the other hand, you need to set up a sketch that interacts with the Kinect by itself. You need to figure out which Kinect code to use based on your device, and understand its performance.

    For the Kinect, here are some previous posts:

    https://forum.processing.org/two/discussion/21412/kinect-projection-masking-array-out-of-bounds-error#latest

    https://forum.processing.org/two/discussion/21990/get-depth-value-for-each-pixel#latest

    http://shiffman.net/p5/kinect/

    I hope this helps,

    Kf

  • How to make Kinect detect Ableton 9 for live tracking?

    Hi, I want to understand how to connect the oscP5 library to the Kinect library. For this project I want sounds, which are being made live in Ableton 9, to be triggered when a person crosses... Any links on where to read more about this, or code leads, are needed. Thanks!!!

  • Text controlled MIDI through Ableton

    Sounds possible.

    • What does this sequence of notes look like? Is it like C3 = a, C#3 = b, D3 = c and so on (sequential), or is it something like c d e = a, d f# a = b, a e c# = c, ... (chord equals letter)?
    • To clarify: the MIDI notes need to be recorded in Processing and displayed on screen as a sentence?

    For MIDI I tend to use TheMidiBus library, which lets you listen to a MIDI device and run a callback function when a note is received. Your code can be something really simple, like:

    String sentence = "";
    MidiBus bus;
    
    void setup()
    {
        size(800,600);
        //select the first available device (probably not the one you want)
        bus = new MidiBus(this, 0, -1);
        fill(255);
    }    
    
    void draw()
    {
        background(0);
        text(sentence, 10,10);
    }
    
    //The MidiBus library will call this automatically when a new note is received
    void noteOn(int channel, int pitch, int velocity)
    {
        sentence = sentence + getLetterFromNote(pitch);
    }
    
    String getLetterFromNote(int note)
    {
        //convert a MIDI note 0-127 to a letter
        return ""; // placeholder so the sketch compiles
    }
    

    You will need a MIDI device that links Ableton to Processing. If you're running Ableton and Processing on the same computer, you need a virtual MIDI device. I use LoopBe1, which is free for one device. I think OS X has a native virtual MIDI device (but I'm no expert).
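    To fill in the stub, here is a minimal sketch of getLetterFromNote(), assuming the sequential scheme from the first question (C3 = 'a', C#3 = 'b', and so on). MIDI numbering for C3 varies by convention; 48 is assumed here, so adjust BASE_NOTE to match however the alphabet was drawn in Ableton:

```java
// Sequential note-to-letter mapping: BASE_NOTE maps to 'a', the next
// semitone up to 'b', ... 25 semitones above BASE_NOTE maps to 'z'.
class NoteLetters {
    static final int BASE_NOTE = 48; // assumed to be C3

    static String getLetterFromNote(int note) {
        int offset = note - BASE_NOTE;
        if (offset < 0 || offset > 25) return ""; // outside a..z
        return String.valueOf((char) ('a' + offset));
    }
}
```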

  • Text controlled MIDI through Ableton

    Hello, I have made an alphabet by drawing it into the Ableton MIDI keyboard, so I have a sequence of MIDI notes for each letter of the alphabet. I want to write code that lets the user type in a sentence; a MIDI sequence will then be created by combining the sequences corresponding to each letter and playing them in a row. I have never worked with MIDI in Processing, so I was wondering if this is possible, and could you give me some general guidelines on which direction to take?

  • Arduino -> Live / Live -> Processing

    Yes -- Ableton Connection Kit has OSC Send and OSC MIDI Send. You can pass values to Processing -- specifically, OSC can pass messages.

    "OSC MIDI Send – sends MIDI note and velocity data as OSC messages to a Processing sketch." -- https://www.ableton.com/en/packs/connection-kit/

    I haven't actually used it to pass arbitrary argument strings, so if you don't want to map a huge list of values (or need free-form values) you would need to test the OSC pipeline the patch allows to see whether that is possible. It might not be, but I expect it is, since OSC messages use an address pattern.

  • Arduino -> Live / Live -> Processing

    What I'm trying to work out is: if I use the Ableton Connection Kit, is there a way to send it our <.. .. > message and have it pass that value to Processing via OSC or something?

  • Arduino -> Live / Live -> Processing

    @jameswest --

    If I'm understanding you right (?), this isn't really an Arduino question: you are asking whether Ableton Live can control a Processing sketch using OSC or MIDI. Yes, it can. You can use an OSC or MIDI library in your Processing sketch (such as oscP5) to listen for the messages:

    ... and look at previous discussions of Ableton Live sketches for recipes: