teich004
Re: Capturing from Two webcams on MacBook
Reply #1 - Feb 19th, 2009, 4:15pm
Well, I did end up figuring this one out. I thought it might be useful to post the solution since it wasn't obvious to me at first, so I hope this is helpful to others.

As listed above, I expected to be able to easily select the USB webcam devices by picking one of the items in Capture.list(). At least in Mac OS X 10.5, the actual devices are not exactly what gets listed. Instead, the list basically shows the hardware controller on which one might find a capture device:

[0] DV Video
[1] IIDC FireWire Video
[2] USB Video Class Video

My guess would be 0 = FireWire 400 devices, 1 = FireWire 800 devices, 2 = USB devices. Moreover, my iSight camera is actually hardwired into the USB controller, so with two other USB webcams connected I had a total of three capture devices to choose from, but I could not find a way of distinguishing them since my only choice was "USB Video Class Video".

After looking at the source for processing.video.Capture, I read this comment:

"Unfortunately, Apple's QuickTime API uses the name to select devices, and in some cases there might be cameras with the same name on a machine. If you ask for a camera of the same name in sequence, you might see if it just does the right thing and grabs each separate camera in succession. If that doesn't work, you might try calling settings(), which will bring up the prompt where you can select a capture device."

And that does indeed work. It lets me pick each camera individually from a QuickTime dialog, selecting what I want to use for my right/left stereo images. Now, for some reason doing this seems to disable captureEvent(), but I am still able to read in draw(), which for now will suffice.
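If you just want to see what shows up on your own machine, a minimal sketch like the one below prints the device list to the console. This is only meant to illustrate the point above; on my MacBook it prints the three controller-level entries rather than the individual cameras.

import processing.video.*;

void setup() {
  // Print whatever capture devices QuickTime reports.
  // On 10.5 this lists controllers, not individual cameras.
  String[] devices = Capture.list();
  for (int i = 0; i < devices.length; i++) {
    println("[" + i + "] " + devices[i]);
  }
}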
Here is some sample code (note the calls to .settings() in the setup() method):

import processing.video.*;

Capture rightCamera;
Capture leftCamera;
PImage redImage;
PImage blueImage;
PImage depthMap;   // declared here so the sketch compiles; unused in this example
boolean _isSetup;
int mode = 0;

void setup() {
  _isSetup = false;
  size(640, 480);

  // instantiate capture source for right stereo image
  rightCamera = new Capture(this, width, height, 30);
  // select webcam from QuickTime dialog
  rightCamera.settings();

  // instantiate capture source for left stereo image
  leftCamera = new Capture(this, width, height, 30);
  // select webcam from QuickTime dialog
  leftCamera.settings();

  redImage = new PImage(width, height);
  blueImage = new PImage(width, height);
  depthMap = new PImage(width, height);

  _isSetup = true;
}

/*
 * interpret user action.
 * s   = export stereo images
 * 1,2 = pick camera feed
 * *   = show anaglyph feed
 */
void keyReleased() {
  if (key != 's') {
    updateMode();
  } else {
    exportFrame();
  }
}

/*
 * save out a .png image from both cameras
 */
void exportFrame() {
  showLeft();
  save("left.png");
  showRight();
  save("right.png");
}

/*
 * based on key pressed define draw mode.
 */
void updateMode() {
  if (key == '1') {
    mode = 1;
  } else if (key == '2') {
    mode = 2;
  } else {
    mode = 0;
  }
}

/*
 * based on mode show one camera feed or the combined feed.
 */
void draw() {
  if (_isSetup) {
    switch(mode) {
      case 1:
        showRight();
        break;
      case 2:
        showLeft();
        break;
      case 0:
        showAnaglyph();
        break;
    }
  }
}

void showRight() {
  if (rightCamera.available()) {
    rightCamera.read();
    image(rightCamera, 0, 0);
  }
}

void showLeft() {
  if (leftCamera.available()) {
    leftCamera.read();
    image(leftCamera, 0, 0);
  }
}

/*
 * create colorized images for both cameras
 * and display a merged image
 */
void showAnaglyph() {
  createRightImage();
  createLeftImage();
  mergeImages();
}

/*
 * Remove blue from the right camera feed
 * and store the result in a PImage
 */
void createRightImage() {
  if (rightCamera.available()) {
    // right = red
    rightCamera.read();
    rightCamera.loadPixels();
    redImage.copy(rightCamera, 0, 0, width, height, 0, 0, width, height);
    for (int i = 0; i < redImage.pixels.length; i++) {
      float r = red(redImage.pixels[i]);
      float g = green(redImage.pixels[i]);
      redImage.pixels[i] = color(r, g, 0);
    }
    redImage.updatePixels();  // push the modified pixel array back to the image
  }
}

/*
 * Remove red and green from the left camera feed
 * and store the result in a PImage
 */
void createLeftImage() {
  if (leftCamera.available()) {
    // left = blue
    leftCamera.read();
    leftCamera.loadPixels();
    blueImage.copy(leftCamera, 0, 0, width, height, 0, 0, width, height);
    for (int i = 0; i < blueImage.pixels.length; i++) {
      float b = blue(blueImage.pixels[i]);
      blueImage.pixels[i] = color(0, 0, b);
    }
    blueImage.updatePixels();  // push the modified pixel array back to the image
  }
}

/*
 * Layer the left and right images, blending the top
 * image with SCREEN mode. Write the result to the display.
 */
void mergeImages() {
  if (leftCamera.available() && rightCamera.available()) {
    image(blueImage, 0, 0);
    blend(redImage, 0, 0, width, height, 0, 0, width, height, SCREEN);
    // these two calls fall outside the 640x480 canvas as written
    image(blueImage, 0, height);
    image(redImage, width, height);
  }
}
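For reference, this is roughly what the event-driven version would look like. It uses the standard captureEvent() callback from the video library, and it's the approach that stopped firing for me once I called settings(), so treat it as a sketch for comparison rather than something I have working with two cameras:

// Standard captureEvent() callback from the video library.
// After calling settings() on my machine this never fires,
// which is why the code above polls available()/read() in draw() instead.
void captureEvent(Capture c) {
  c.read();  // read whichever camera has a new frame ready
}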