Broadcasting a byte[] array (socket connectivity?)

Hello again! I recently came here to ask how to convert PImages to byte[] arrays, and this is a continuation of that question. I'm not currently in an environment where I can work by trial and error, so I'm hoping to get some help before it comes to that. I've got a function in my code that continuously packages PImages from a webcam into byte[] arrays; all I have left to do is broadcast them. The project is meant to emulate an IP camera on a robot, which I don't presently have access to. My question is whether I should be using Socket or ServerSocket to do the actual sending (I've seen server examples that use one or the other). In case it's important, this is my code:

import processing.video.*;
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;

import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

// Capture object
Capture cam;
Socket soc;
void setup() {
  size(320, 240);

  // Setting up the Socket, requires try/catch
  try {
    //...?
    soc = new Socket("10.32.16.2", 1180);
  } catch (IOException e) {
    e.printStackTrace();
  }

  // Initialize camera
  cam = new Capture(this, width, height, 30);
  cam.start();
}

void captureEvent(Capture c) {
  c.read();
  // Whenever it gets a new image, send it!
  broadcast(c);
}

void draw() {
  image(cam, 0, 0);
}

void broadcast(PImage img) {

  // We need a BufferedImage to do the JPG encoding
  BufferedImage bimg = new BufferedImage(img.width, img.height, BufferedImage.TYPE_INT_RGB);

  // Transfer pixels from the PImage to the BufferedImage
  img.loadPixels();
  bimg.setRGB(0, 0, img.width, img.height, img.pixels, 0, img.width);

  // Need these output streams to get the image as bytes for TCP communication
  ByteArrayOutputStream baStream = new ByteArrayOutputStream();
  BufferedOutputStream bos = new BufferedOutputStream(baStream);
  // Turn the BufferedImage into a JPG and put it in the BufferedOutputStream
  // Requires try/catch
  try {
    ImageIO.write(bimg, "jpg", bos);
  } 
  catch (IOException e) {
    e.printStackTrace();
  }

  // Get the byte array, which it will send out via TCP!
  byte[] packet = baStream.toByteArray();

  // Send JPEG data as a byte[] packet
  println("Sending packet with " + packet.length + " bytes");
  /*try {
    OutputStream outStream = soc.getOutputStream();
    outStream.write(packet);
    outStream.flush();
  } 
  catch (Exception e) {
    e.printStackTrace();
  }*/
}

I've commented out the code that actually sends the byte[] array, because it throws an exception every time if the socket hasn't been created properly. If I can just get this part (the socket creation) working, I can probably figure out the rest of the issues myself... I would really appreciate any help! Thanks!
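For what it's worth, since the sketch is emulating the IP camera, it's the camera side that should listen: create a ServerSocket in setup(), accept() the incoming connection from the robot/driver station, and then write each JPEG to the accepted Socket's OutputStream. Here's a minimal, self-contained sketch of that shape in plain Java (class name is hypothetical, and it uses port 0 instead of 1180 plus a loopback "viewer" so it runs anywhere). It also adds a 4-byte length prefix per frame, since TCP is a byte stream and the receiver otherwise has no way to tell where one JPEG ends and the next begins:

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class CameraServerDemo {

    // Send one frame from the ServerSocket side to a connected client,
    // length-prefixed, and return what the client read back.
    static byte[] roundTrip(byte[] frame) {
        // The sketch plays the part of the IP camera, so it listens.
        // Port 0 means "any free port" to keep the demo self-contained;
        // the real sketch would use new ServerSocket(1180).
        try (ServerSocket server = new ServerSocket(0);
             // Stand-in for the robot/viewer connecting to the camera.
             Socket viewer = new Socket("127.0.0.1", server.getLocalPort());
             Socket client = server.accept()) {

            // 4-byte length prefix: the receiver reads this first,
            // then reads exactly that many bytes before decoding.
            DataOutputStream out = new DataOutputStream(client.getOutputStream());
            out.writeInt(frame.length);
            out.write(frame);
            out.flush();

            // Receiver side: length prefix, then the frame itself.
            DataInputStream in = new DataInputStream(viewer.getInputStream());
            byte[] received = new byte[in.readInt()];
            in.readFully(received);
            return received;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        byte[] frame = {10, 20, 30, 40};
        byte[] echoed = roundTrip(frame);
        System.out.println("frame survived the trip: "
            + java.util.Arrays.equals(frame, echoed));
    }
}
```

In your sketch, the ServerSocket and accept() call would go where the `new Socket(...)` currently is, and broadcast() would write the length prefix followed by `packet` to the accepted socket's stream.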

P.S. Here's the error...

java.net.ConnectException: Connection timed out: connect
  at java.net.PlainSocketImpl.socketConnect(Native Method)
  at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
  at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
  at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
  at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
  at java.net.Socket.connect(Socket.java:529)
  at java.net.Socket.connect(Socket.java:478)
  at java.net.Socket.<init>(Socket.java:375)
  at java.net.Socket.<init>(Socket.java:189)
  at VideoSender.setup(VideoSender.java:63)
  at processing.core.PApplet.handleDraw(PApplet.java:2280)
  at processing.core.PGraphicsJava2D.requestDraw(PGraphicsJava2D.java:243)
  at processing.core.PApplet.run(PApplet.java:2176)
  at java.lang.Thread.run(Thread.java:662)
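That ConnectException just means connect() waited and nothing at 10.32.16.2:1180 ever answered, which is expected while the robot is off the network: a plain `new Socket(host, port)` insists on reaching a listener before it returns. If you do keep the client approach for testing, you can at least fail fast with an explicit connect timeout instead of stalling setup() for the OS default. A hedged sketch (the class name and the 2-second value are my own choices):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ConnectCheckDemo {

    // Try to open a TCP connection, giving up after timeoutMs
    // instead of hanging for the OS default (often tens of seconds).
    static boolean tryConnect(String host, int port, int timeoutMs) {
        try (Socket soc = new Socket()) {
            soc.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // ConnectException and SocketTimeoutException both land here
            // when nothing is listening at host:port.
            return false;
        }
    }

    public static void main(String[] args) {
        // With the robot off the network, this should print false
        // within about two seconds rather than blocking setup().
        System.out.println(tryConnect("10.32.16.2", 1180, 2000));
    }
}
```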