Hi,
I'm trying to use Processing to transmit a wav file to an Arduino with an SD card reader at 115200 bps. However, it takes a really, really long time - on the order of 20 minutes per megabyte!
Here's my code; basically, it opens the file, sends 120 bytes, and waits for a "(" from the Arduino to confirm that it's ready to receive the next chunk.
Code:

import processing.serial.*;

int IN_BUF_SIZE = 120;

// The serial port:
Serial myPort;
byte[] b;
boolean sent = false; // Guard so draw() only transmits the file once

void setup()
{
  // List all the available serial ports:
  println(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 115200);
  //myPort.flush();
  // Open a file and read its binary data
  b = loadBytes("Layla.wav");
  println("Loaded file!");
}

void draw() {
  if (millis() < 5000 || sent) return; // Wait for the Arduino bootloader to finish, then send once
  for (int i = 0; i < b.length; i += IN_BUF_SIZE) // Until the end of the file
  {
    for (int j = i; j < i + IN_BUF_SIZE && j < b.length; j++) // Write one chunk, byte by byte
    {
      myPort.write(b[j]);
    }
    while (myPort.read() != '('); // Busy-wait for the ack from the Arduino
    println(i + " bytes sent!");
  }
  println("File sent!");
  sent = true;
}
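One thing that might be worth timing: Processing's Serial.write() is also overloaded to take a byte[], so each 120-byte chunk could go out in a single call instead of 120 per-byte calls. Here's a minimal Java sketch of the chunking logic (the helper name toChunks is mine, not part of Processing); each resulting chunk would then be one myPort.write(chunk) call:

```java
import java.util.Arrays;

public class ChunkDemo {
    static final int IN_BUF_SIZE = 120;

    // Split a byte array into IN_BUF_SIZE-sized chunks, mirroring the
    // nested loop in the sketch above. The last chunk may be shorter.
    static byte[][] toChunks(byte[] b) {
        int n = (b.length + IN_BUF_SIZE - 1) / IN_BUF_SIZE; // ceiling division
        byte[][] chunks = new byte[n][];
        for (int i = 0; i < n; i++) {
            int start = i * IN_BUF_SIZE;
            int end = Math.min(start + IN_BUF_SIZE, b.length);
            chunks[i] = Arrays.copyOfRange(b, start, end);
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] file = new byte[1000]; // stand-in for the wav data
        byte[][] chunks = toChunks(file);
        System.out.println(chunks.length);    // 9 chunks
        System.out.println(chunks[8].length); // last chunk: 1000 - 8*120 = 40 bytes
    }
}
```

If the per-call overhead of write() is what's dominating, this change alone could make a large difference.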
I've been probing the signals with my logic analyzer and discovered that each 120-byte chunk takes 0.118 seconds just to leave the computer. This means that even if I completely optimized away the processing lag on the Arduino side and had the computer transmitting continuously, a 1 Mbyte file would still take
(1 000 000 bytes / 120 bytes per chunk * 0.118 seconds per chunk / 60 seconds per minute)
~16 minutes to transmit, which is far longer than the
(1 000 000 bytes / 11 520 bytes per second)
~87 seconds it should take at 115200 bps with 8N1 framing (10 bits on the wire per byte).
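For reference, the arithmetic above as a small Java check, assuming 10 bits on the wire per byte (start bit + 8 data bits + stop bit for 8N1):

```java
public class SerialMath {
    public static void main(String[] args) {
        double fileBytes = 1_000_000;
        double chunkBytes = 120;
        double secPerChunk = 0.118; // measured with the logic analyzer

        // Observed rate: 0.118 s for every 120-byte chunk
        double measuredMinutes = fileBytes / chunkBytes * secPerChunk / 60;
        System.out.println(measuredMinutes); // ~16.4 minutes

        // Theoretical rate: 115200 bps over 10-bit frames
        double bytesPerSecond = 115200.0 / 10; // 11,520 bytes/s
        System.out.println(fileBytes / bytesPerSecond); // ~86.8 seconds
    }
}
```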
Thoughts?