I am sending a digital bitstream from LabVIEW through an XBee USB adaptor to an XBee-Pro module. When the bitstream from the PC includes three zeros in a row, sometimes only two zeros in a row appear at the Dout pin of the receiving XBee module. This glitch happens about one time in twenty.
Which is the bigger problem? Should I avoid sending digital bitstreams that include runs of zeros? Or is the problem the 115,200-to-111,111 mismatch in the XBee USB adaptor? In other words, should I solve this by dropping the PC baud to 57,600, or by avoiding raw digital data, or both?
I can’t tell what the exact problem is, but if I were in your position I’d first try dropping the baud rate to 57,600.
If the problem then goes away, it’s probably the 115,200/111,111 issue.
I can’t recall any posts here where runs of zeroes have been a problem.
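To put some rough numbers on the 115,200/111,111 question: the sketch below works out the clock mismatch and how far a UART's sampling point drifts over one frame. The 0.5-bit limit is the usual rule of thumb for asynchronous serial, and the framing (start + 8 data + 1 stop) is my assumption about the setup described here.

```python
# Back-of-the-envelope check of the 115,200 vs 111,111 mismatch
# described in this thread.
pc_baud = 115_200      # what the PC side transmits at
xbee_baud = 111_111    # what the XBee actually runs at

# Relative clock error between the two sides.
error = (pc_baud - xbee_baud) / xbee_baud
print(f"clock error: {error:.2%}")                    # about 3.68%

# A UART resynchronizes on each start-bit edge, then free-runs for the
# rest of the frame (start + 8 data + 1 stop = 10 bits, assuming 8N1).
# By the last bit the sampling point has drifted by roughly:
drift_bits = error * 10
print(f"drift over one frame: {drift_bits:.2f} bit periods")  # about 0.37

# Much past 0.5 bit periods means sampling the wrong bit, so 3.68%
# leaves very little margin.  A 0x00 byte is the worst case: the line
# sits low for 9 consecutive bit times, so the receiver gets no
# mid-frame edges and only the single high stop bit to recover on.
```

That last point may be why runs of zeros look like the trigger even if the root cause is the baud mismatch: 0x00 bytes simply give the receiver the fewest edges to resynchronize on.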
Thanks. I’ll spare you the long explanation of why 57,600 would make my life difficult (I’ll give it if I have to). Instead, I replaced the zeros with other values, ran it at 115,200, and 100 of 100 packets came through with perfect CRCs. With the original packet, which had runs of two and three zeros, 27 of 100 packets failed, and every failure was a missing 00 byte at the receiving XBee.
Dropping the baud rate as a temporary measure, if you can, would tell you whether 115,200 was the issue, which would be useful information.
Other things you could try:
Research whether a rate closer to 111,111 is achievable on the PC. If it’s possible at all, it may only be possible with certain hardware.
See what happens if you set the PC to use two stop bits. That’s the suggestion I make in the pinned post at the top of this forum, though I should admit it was a theoretical suggestion not backed up by testing.
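A rough sketch of why the two-stop-bit idea might help, under the assumption that only the PC transmitter changes to 8N2 while the XBee keeps receiving 8N1. The idea is that the extra stop bit buys the slower receiver time to finish its frame before the next start edge arrives; the exact resync behavior is implementation-dependent, so treat this as a timing argument, not a guarantee.

```python
# Timing sketch for the two-stop-bit suggestion (my model, not tested
# hardware behavior).  The XBee's clock is slow relative to the PC's.
error = (115_200 - 111_111) / 111_111   # ~3.68% mismatch

rx_frame = 1 + 8 + 1                    # XBee still expects start + 8 data + 1 stop
rx_ready = rx_frame * (1 + error)       # time (in PC bit periods) until the
                                        # XBee can look for the next start edge

for tx_stop_bits in (1, 2):
    tx_frame = 1 + 8 + tx_stop_bits     # PC frame length in PC bit periods
    margin = tx_frame - rx_ready        # > 0 means the start edge isn't missed
    print(f"{tx_stop_bits} stop bit(s): margin = {margin:+.2f} PC bit periods")
```

With one stop bit the margin comes out negative for back-to-back frames, which matches the symptom of bytes vanishing in runs of 00s; with two stop bits it turns positive.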
Question for the community: how many people are using 115,200 successfully, and what tricks made it work?
Thanks. Digi tech support came back saying the baud rate is the issue, not the digital zeros. I can buy that; it’s just that my testing indicates the digital zeros aggravate the baud-rate problem.
But I’m still badly in need of faster interface rates. Has anyone tested XBee-Pro modules interfaced to a micro where the micro is also talking at 111,111? I can put an 8 MHz crystal on my micro and get 111,111. I’m hoping that’s a robust solution.
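For what it’s worth, the 8 MHz crystal arithmetic does come out almost exactly on 111,111 with a common UART divider scheme. The sketch below assumes an AVR-style UART in double-speed (U2X) mode, where baud = F_CPU / (8 × (UBRR + 1)); that formula is my assumption about the micro, since the poster didn’t name one.

```python
# Sanity check of the 8 MHz crystal idea, assuming an AVR-style UART
# in double-speed (U2X) mode: baud = F_CPU / (8 * (UBRR + 1)).
F_CPU = 8_000_000
target = 111_111

ubrr = round(F_CPU / (8 * target)) - 1   # divisor register value
actual = F_CPU / (8 * (ubrr + 1))        # 8e6 / 72 = 111,111.1 baud
error = (actual - target) / target
print(f"UBRR={ubrr}, actual baud={actual:.1f}, error={error:.4%}")
```

An error that small is negligible next to the ~3.7% mismatch a 115,200 PC port produces, which is why matching the micro’s clock to the XBee’s actual rate looks attractive.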
Update for those curious: downshifting to 57,600 cleaned things up nicely, as predicted by Digi tech support. But this is a big hit to my battery life. I plan to build next gen stuff with 8M xtals so I can try all this again with micros that talk to the XBee at 111,111. I’ll let you know how that goes.
I still suspect that translating my digital data to ASCII (which gets rid of the 00s) would hold the 115,200-to-111,111 link together, but that doubles the length of my data, so effectively the same loss of battery capacity as halving the baud. Halving the baud comes out slightly ahead because it avoids spending battery power on the translation.
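For anyone weighing the same trade-off, here is one way the ASCII translation could look: hex-encoding the payload guarantees no 0x00 bytes ever reach the UART, at exactly the 2× length cost described above. The choice of `binascii.hexlify` is mine; the poster didn’t say which encoding they had in mind.

```python
# Hex-ASCII encoding as one example of "translating to ASCII":
# every output byte is in '0'-'9'/'a'-'f', so no 0x00 bytes appear
# on the wire, but the payload doubles in length.
import binascii

payload = bytes([0x12, 0x00, 0x00, 0x00, 0xAB])   # has a run of three zeros
encoded = binascii.hexlify(payload)               # b'12000000ab' (ASCII chars)

assert b"\x00" not in encoded             # no zero bytes on the wire
assert len(encoded) == 2 * len(payload)   # exactly double the airtime

decoded = binascii.unhexlify(encoded)     # receiver reverses it losslessly
assert decoded == payload
print(encoded)
```

Note the '0' characters in the encoded stream are ASCII 0x30, which still has edges within the byte, unlike a raw 0x00 that holds the line low for nine bit times.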