Unusual behavior with Serial Ports in 4.2 RC5 on ND and ND+ (Losing data…)
I have a project that requires two XBee/ZigBee modems to communicate and transfer large (for Zigbee) data packets from 1k to about 100k in size.
My plan is to use SD cards to buffer data as it comes in and goes out of the modems, stitching the packets together as they arrive. Simple enough….
I am using the Grommet drivers (Copyright © 2009, http://grommet.codeplex.com) for the low-level networking. I really like them because they started out very light-weight, and I have made them lighter still by removing some methods I will not need.
The project is too large to post the entire code base in this thread, but I plan to post later on as an extension of these drivers.
Here is my trouble: I am losing data somewhere between the receiving XBee handing it to the ND/ND+ and it being read out of the serial buffer. Attached is the logic-analyzer trace showing data leaving XBee #1 and being passed from XBee #2 into the ND. The data loss happens at different places at different times. If I slow the transmissions way down (one message every two seconds) I get nearly zero lost bytes. As soon as I remove this delay, I lose data.
Here is the setup:
- ND with XBee1 and SD shield on SPI
- ND+ with XBee2 and SD card installed
- Both ND/ND+ share common XBee drivers codebase (All XBee and serial port code is shared)
- I have tried hardware flow control on both the ND and ND+, and it is enabled on the XBees.
- (As a footnote, CTS never de-asserts on either XBee; it always remains low, and it never goes low on the ND.)
- I have SerialErrorReceivedEventHandler enabled, and it never fires.
- Everything is running at 9600 baud, 8N1. (I want to try faster speeds to see if that makes a difference, but CTS flow control should handle buffering for me, shouldn't it?)
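For reference, here is roughly how I am opening the port on both boards (sketch only: the COM name and the event hookup are simplified for this post). On .NETMF 4.2, hardware handshaking is requested through the SerialPort.Handshake property:

using System.IO.Ports;

// Sketch of the port setup used on both boards: 9600 baud, 8N1,
// with RTS/CTS hardware flow control requested. Port name is an
// assumption; it depends on which UART the XBee is wired to.
SerialPort port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
port.Handshake = Handshake.RequestToSend;   // hardware (RTS/CTS) flow control
port.Open();
port.DataReceived += Serial_DataReceived;

If the driver honors Handshake, the board should drop its flow-control line when the driver buffer fills; since CTS never de-asserts in my traces, probing that line directly would confirm whether the handshake is actually wired through.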
Attached is a zoomed-out view of the data in the LA. The left side shows the modem setup exchange, and the right side shows the data packets: you can see the packets leaving one XBee and arriving at the other… These are two messages with 200-byte payloads.
Also attaching a zoomed-in view to compare with what I am reading out of the serial port buffer.
Here is the code that reads the buffer. Fairly simple. It is called by the SerialPort.DataReceived event:
private void Serial_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    //One at a time please....
    lock (readLock)
    {
        Debug.Print("Read unlocked with " + SerialPort.BytesToRead + " bytes to read");
        //FIXIT - May not need, but for some reason the event fires with nothing to read....
        int timeout = 100;
        while (SerialPort.BytesToRead <= 0 && timeout-- > 0) Thread.Sleep(10);
        byte[] buf = new byte[1];
        if (SerialPort.BytesToRead > 0)
        {
            try
            {
                while (SerialPort.BytesToRead > 0)
                {
                    //If there is data, an incomplete frame, and a complete header, read payload
                    while ((SerialPort.BytesToRead > 0) &&
                           (!frameBuilder.IsComplete) &&
                           (frameBuilder.HeaderIsComplete))
                    {
                        //Don't forget about the checksum byte... The + 1
                        byte[] buffer = new byte[frameBuilder.dataNeeded + 1];
                        //Read the data frame and the checksum byte
                        int bytesRead = SerialPort.Read(buffer, 0, frameBuilder.dataNeeded + 1);
                        Debug.Print("Read chunk of " + bytesRead.ToString() + " f: " + buffer[0].ToString() + " l: " + buffer[bytesRead - 1].ToString() + " need " + (frameBuilder.dataNeeded + 1).ToString());
                        //Add the newly read data to the frameBuilder buffer
                        frameBuilder.Append(buffer, bytesRead);
                    }
                    if (frameBuilder.IsComplete)
                    {
                        //Got a frame, do something
                        ReceivedApiFrame(frameBuilder.GetApiFrame());
                        //Create a thread, launch fire event, etc
                        frameBuilder.Reset();
                    }
                    //Check to see if a header needs to be read (first three bytes)
                    if ((!frameBuilder.HeaderIsComplete) && (SerialPort.BytesToRead > 0))
                    {
                        SerialPort.Read(buf, 0, 1);
                        Debug.Print("Read byte for header " + buf[0].ToString());
                        frameBuilder.Append(buf, 1);
                    }
                    //Don't beat a hasty retreat after reading the frame...
                    timeout = 100;
                    while (SerialPort.BytesToRead <= 0 && timeout-- > 0) Thread.Sleep(10);
                }
            }
            catch (Exception ex)
            {
                Debug.Print("Error reading data from serial " + ex.Message);
            }
        }
        else Debug.Print("Nothing to read"); //FIXIT - Get rid of this if() in production, ignore
        Debug.Print("Ending read");
    }
}
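One thing I may try next: doing far less work inside the event handler. The Debug.Print calls and Thread.Sleep loops in the hot path could be slow enough (even at 9600 baud) for the driver buffer to fill while the handler holds the lock. Here is an untested sketch (same names as above, not a drop-in replacement) that drains everything available in one Read per pass and leaves all parsing to the frame builder:

private void Serial_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    lock (readLock)
    {
        int avail = SerialPort.BytesToRead;
        while (avail > 0)
        {
            //One Read for everything the driver has, no per-byte calls
            byte[] chunk = new byte[avail];
            int bytesRead = SerialPort.Read(chunk, 0, avail);
            //frameBuilder consumes header, payload, and checksum itself
            frameBuilder.Append(chunk, bytesRead);
            if (frameBuilder.IsComplete)
            {
                ReceivedApiFrame(frameBuilder.GetApiFrame());
                frameBuilder.Reset();
            }
            avail = SerialPort.BytesToRead;
        }
    }
}

This assumes frameBuilder.Append can accept more bytes than the current frame needs and carry the remainder over; mine cannot yet, so that change would come first. It would also narrow down whether the loss is in the driver or in my parsing.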
Here is the relevant debug output, to try and convince myself I am not going nuts... The data really is not there:
Read byte for header 126
<- good
Read byte for header 0
<- good
Read byte for header 212
<- good
Allocated 212 bytes to API Frame
<- reading 200 bytes plus Zigbee frame
Read chunk of 16 f: 144 l: 52 need 213
<- extra room for checksum byte
Read chunk of 16 f: 34 l: 109 need 197
Read chunk of 15 f: 101 l: 109 need 181
Read chunk of 14 f: 32 l: 50 need 166
Read chunk of 15 f: 48 l: 87 need 152
Read chunk of 15 f: 97 l: 100 need 137
Read chunk of 14 f: 60 l: 48 need 122
Read chunk of 15 f: 54 l: 48 need 108
Read chunk of 15 f: 58 l: 99 need 93
Read chunk of 14 f: 114 l: 100 need 78
Read chunk of 15 f: 97 l: 49 need 64
Read chunk of 14 f: 32 l: 97 need 49
Read chunk of 15 f: 114 l: 105 need 35
Read chunk of 17 f: 99 l: 100 need 20
Read chunk of 3 f: 97 l: 110 need 3
<- finished, without checksum error!
Resetting frame
Starting processing thread…
<- thread off to process the packet
(Here is where it goes horribly wrong….)
Read byte for header 126
<- good, found header start byte
Read byte for header 162
<- BAD! This is NOT a valid high-byte length
Error in length: 162
Resetting frame
Read byte for header 0
ERROR!!! Expected start byte: 0
Read byte for header 64
ERROR!!! Expected start byte: 64
Read byte for header 141
ERROR!!! Expected start byte: 141
Read byte for header 28
ERROR!!! Expected start byte: 28
Read byte for header 195
ERROR!!! Expected start byte: 195
…
The serial buffer contained the following bytes: (Refer to the attached LA image)
126(~), 162, 0, 64(@), 141, 28, 195…
But the XBee sent the following to the ND:
126(~),
0, 212, 144, 0, 19, 162, 0, 64(@), 141, 28, 195…
MISSING BYTES
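For anyone decoding along: an XBee API frame is the start delimiter 0x7E, a two-byte big-endian length, the frame data, and a one-byte checksum. So the good header 126, 0, 212 above is a 212-byte frame; 144 (0x90) is the ZigBee Receive Packet frame type, and the following 0, 19, 162, 0, 64, 141, 28, 195 (00 13 A2 00 40 8D 1C C3) is the 64-bit source address, with Digi's 00 13 A2 OUI visible. The five missing bytes are therefore the length field, the frame type, and the first two address bytes: the parser re-synced in the middle of the sender's address. A minimal checksum helper matching the spec's 0xFF-minus-sum rule (class name is mine, pure logic, no hardware):

```csharp
using System;

// Minimal XBee API checksum helper (sketch).
// Checksum = 0xFF - (sum of all frame-data bytes, i.e. everything
// between the length field and the checksum, modulo 256).
static class XBeeChecksum
{
    public static byte Compute(byte[] frameData)
    {
        int sum = 0;
        foreach (byte b in frameData)
            sum += b;
        return (byte)(0xFF - (sum & 0xFF));
    }

    // A received frame is valid when its trailing checksum byte
    // matches the value computed over the frame data.
    public static bool IsValid(byte[] frameData, byte checksum)
    {
        return Compute(frameData) == checksum;
    }
}
```

With that rule, a re-sync in the wrong place will usually (255 times out of 256) fail the checksum, which is why validating before accepting a frame matters here.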
Any ideas are welcome. I need to get some sleep!