serialEvent() bugs

edited May 2016 in Arduino

Hi,

I'm pulling my hair out with the buggy serial library in Processing. I'll try to explain without pasting reams of code here.

In my setup, I'm sending a node discovery command to an XBee co-ordinator (AT command ND), running in API mode. In my serialEvent, I used to wait for the XBee delimiter (7E) using bufferUntil(0x7E), then do a port.read() to get the MSB of the packet length, then another port.read() for the LSB. I would then do a loop of [LSB] iterations to read in the packet.

port.buffer(1) must be set, btw, otherwise all kinds of weird behaviour results. Using different values truncated the packets, which are of varying length, naturally, so you have to trigger on each separate byte arrival.
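
For reference, this is the shape of that byte-at-a-time approach, stripped right down (the state machine is illustrative, and I'm ignoring the trailing checksum byte):

    import processing.serial.*;

    Serial myPort;
    int state = 0;   // 0 = hunting for 7E, 1 = length MSB, 2 = length LSB, 3 = payload
    int len = 0;     // packet length taken from the two length bytes
    int[] packet;    // payload of the packet currently being assembled
    int idx = 0;

    void setup() {
      myPort = new Serial(this, Serial.list()[3], 19200);
      myPort.buffer(1);                            // fire serialEvent() on every single byte
    }

    void draw() { }

    void serialEvent(Serial p) {
      int b = p.read();
      switch (state) {
        case 0: if (b == 0x7E) state = 1; break;   // start delimiter
        case 1: len = b << 8; state = 2; break;    // length MSB
        case 2: len |= b; packet = new int[len]; idx = 0; state = 3; break;  // length LSB
        case 3:
          packet[idx++] = b;
          if (idx == len) {                        // whole payload is in
            // ... parse the packet here ...
            state = 0;                             // hunt for the next 7E
          }
          break;
      }
    }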

The XBee replies (in my two device setup) with two packets, one containing the name of the co-ordinator, and one containing the name of the end device. However, it became apparent that bufferUntil(0x7E) would produce the delimiter the first time, and the 00 of the MSB the next, skipping the 7E. Why?

After trying numerous workarounds, I gave up, and started writing a ring buffer. Initially, I could watch the two XBee packets hit it, totalling the correct length, so I went back to the Arduino code that I'm using to send data to Processing.

The data packets are hitting the ring buffer, which is now filling up nicely. However, NOW, if I stop the Processing sketch, and let the Arduino continue to send packets, THEN stop the Arduino sketch, THEN start the Processing sketch, I get 130 bytes, always 130 bytes, sitting in the buffer, even though it's inside a class, and I'm instantiating it in my setup. Obviously, the serialEvent is firing when the sketch is started, and all attempts to clear it down are failing miserably.

I've tried port.clear(), and a loop of port.read() inside a while(port.available() > 0), and nothing is working. Obviously, I need everything to be initialised when I stop and then run the Processing sketch, as eventually data will be coming in from numerous devices, and I have to have a zeroed ring buffer from which to start extracting valid packets.
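
For the record, my clear-down attempts look like this at the top of setup (the RingBuffer class is only an outline of my own, not the real thing):

    import processing.serial.*;

    Serial myPort;
    RingBuffer ring;

    void setup() {
      myPort = new Serial(this, Serial.list()[3], 19200);
      ring = new RingBuffer(512);          // freshly zeroed buffer, every run

      myPort.clear();                      // attempt 1: dump whatever the port has queued

      while (myPort.available() > 0) {     // attempt 2: drain it byte by byte
        myPort.read();
      }

      myPort.buffer(1);
    }

    void draw() { }

    void serialEvent(Serial p) {
      while (p.available() > 0) {
        ring.put(p.read());                // yet 130 bytes still show up here on restart
      }
    }

    // bare-bones ring buffer, just enough to show the shape of it
    class RingBuffer {
      int[] data;
      int head, count;
      RingBuffer(int size) { data = new int[size]; }
      void put(int b) {
        data[head] = b;
        head = (head + 1) % data.length;
        if (count < data.length) count++;
      }
    }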

If the bufferUntil() method wasn't so flaky, I wouldn't need to be faffing around with a ring buffer anyway. I hope I've explained this clearly. Before anybody says 'post some code snippets', you have to realise that all my workaround stuff is relevant, and has been cut and replaced with numerous different attempts at curing this issue.

I could do with some help here, before I uninstall Processing and look for a more robust piece of s/w for my project. I simply cannot be the only person experiencing these issues.

Thanks.

Max

Answers

  • edited May 2016

    In order for bufferUntil() to work correctly, use readString() inside serialEvent().

    And on the Arduino side, just use print() for sending out each individual value.
    Use println() for the last value.

    Never use write() though. It's for buffer(), not bufferUntil()!
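
    Roughly like this on the Processing side, assuming the Arduino print()s comma-separated values & ends each batch w/ println() (port index & baud are placeholders):

    import processing.serial.*;

    Serial myPort;

    void setup() {
      myPort = new Serial(this, Serial.list()[3], 9600);
      myPort.bufferUntil('\n');            // buffer a whole line, then fire serialEvent()
    }

    void draw() { }

    void serialEvent(Serial p) {
      String line = p.readString();        // grab everything buffered, in 1 go
      if (line != null) {
        int[] values = int(splitTokens(trim(line), ","));
        println(values);
      }
    }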

    https://forum.Processing.org/two/discussion/14988/drawing-of-graphs-from-i2c-imu

  • edited May 2016

    Hi,

    Thanks for the reply.

    Initially, the Arduino is not relevant. The XBee sends its node information wirelessly, independently of any attached device, to the co-ordinator, which is plugged into my USB port. This is a packet of information with a format that starts with a delimiter, 7E. Thus, I don't have control over how the data is sent and, besides, it's just bytes arriving regardless.

    I was using this simple code in setup:

    myPort = new Serial(this, Serial.list()[3], 19200);
    myPort.buffer(1);
    myPort.bufferUntil(0x7E);
    txPacket(AT_ND);

    (The last line calls a simple method that sends a remote AT command to all XBees, asking them to broadcast their identification settings.)

    Inside serialEvent, I was merely doing this, to see why things weren't working:

    print(hex(myPort.read(),2));

    This produces the output:

    7E00

    The 7E arrives when the co-ordinator sends its node information, then the 00 appears when the end device sends its information. As far as I'm concerned, using bufferUntil(0x7E) should produce:

    7E7E

    OR... If the 7E is being discarded, then:

    0000

    Unless somebody wants to explain why it differs...

    If I do NOT use bufferUntil(0x7E), I can see both packets arrive, perfectly formed, BOTH beginning with 7E, thus:

    7E001E88014E440000000013A200409351CF4D415354455200FFFE0000C105101E7F
    7E001D88014E4400CB860013A200409351AF534C4156450000000200C105101E9A

    I won't go into the details of the XBee packet structure, as it's not relevant. Suffice to say the device names ('MASTER','SLAVE') are in there, along with packet length info, checksums, etc.

    If I just use myPort.read() inside serialEvent(), and add each byte to an array, everything is perfect, and in sync, with no issues, as one would expect.
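
    i.e. nothing fancier than this (array size arbitrary):

    import processing.serial.*;

    Serial myPort;
    int[] raw = new int[1024];
    int rawCount = 0;

    void setup() {
      myPort = new Serial(this, Serial.list()[3], 19200);
      myPort.buffer(1);
    }

    void draw() { }

    void serialEvent(Serial p) {
      while (p.available() > 0 && rawCount < raw.length) {
        raw[rawCount++] = p.read();        // bytes land in order, both 7E delimiters intact
      }
    }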

    Again, thanks for the reply, but it doesn't explain what's going on.

    Max

  • edited May 2016

    If I just use myPort.read() inside serialEvent(), ...

    Read my 1st reply at the 1st line:

    In order for bufferUntil() to work correctly, use readString() inside serialEvent().

    So you had already started wrongly there w/ read()! :-@

    Another important piece: bufferUntil() works better w/ ending values, not starting 1s.

    W/ bufferUntil(0x7E), you're gonna have to await the next transmission in order to get the previous one! :-\

    https://Processing.org/reference/libraries/serial/index.html
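
    To see exactly what bufferUntil(0x7E) hands over each time, dump the whole buffer instead of doing 1 read() (readBytes() empties it):

    import processing.serial.*;

    Serial myPort;

    void setup() {
      myPort = new Serial(this, Serial.list()[3], 19200);
      myPort.bufferUntil(0x7E);            // serialEvent() fires when a 0x7E arrives...
    }

    void draw() { }

    void serialEvent(Serial p) {
      byte[] buf = p.readBytes();          // ...so the buffer *ends* w/ that 0x7E, holding
      if (buf != null) {                   // the tail of whatever came before it
        for (int i = 0; i < buf.length; i++) print(hex(buf[i]));
        println();
      }
    }

    1st event: buf is just the 7E. 2nd event: buf is the rest of packet #1 plus the 7E of packet #2. That's why a lone read() shows 7E, then 00!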

  • Dude,

    I'm not 'starting wrongly' with myPort.read(), and repeating yourself and putting up a childish gif doesn't make you right, I'm afraid. Nor does sidetracking with a pointer to how I should be presenting my code. I don't give a flying f*** what the forums 'expect' of me; I'm only interested in why this library is wack.

    I'm interested in a byte, value 126, period. I don't need to use readString(), it's entirely irrelevant. Either bufferUntil(0x7E) waits until 126 arrives or it does not.

    Then, either port.read() gathers it correctly, every time, or it mucks it up every time, not one way, then the other. What do you think readString() is going to produce, when faced with 126 coming in? I need to parse packets of bytes, not strings, and I sure as Hell don't need to be converting stuff from one form to another.

    "Works better with ending values, not starting 1s", is the most bizarre 'answer' I've ever heard!

    Also, it's entirely incorrect to say that I have to wait until the next transmission. With buffer(1) set, every single byte that hits the port triggers a serialEvent(). With buffer(8), for example, it takes 8 bytes (obviously) to arrive before serialEvent() fires.

    The XBee sends a packet, of [n] bytes, starting with a delimiter, 126. There are many scenarios in which that's it, there's no more to come. myPort.read() DOES do exactly as it should, no need for any string anything.

    I don't want to get embroiled in petty arguments here, just looking for a definitive answer. Thanks for your input, but you've either not grasped what I'm talking about, or you don't know what you're talking about. No need to reply, thanks.

    Max
