All good thoughts – thanks. I have about 10,000 lights this year, all computer controlled using a Windows app called Vixen (www.vixenlights.com). I’m one of thousands of hobbyists around the world who mess with this stuff, and we all seem to converge at www.doityourselfchristmas.com, where we trade ideas, stories, problems, etc. I’ve been working on designing a wireless adapter for some of our light controllers, and am trying XBee because one of our goals is to keep the hobby affordable. One of the popular light controller systems we use is called the “Renard” system. Renard controllers use multiple cascading PICs to decode the serial signals and assign them to the designated “channels.” The design allows dimming, blinking, and daisy-chaining from one controller to the next, so that potentially hundreds of different channels can be controlled individually. A single channel could be one light, a single string of lights, or 5 or 6 strings of lights, etc.
The Renard uses a serial protocol (8,N,1) run through Cat5 wire, and while space doesn’t allow a complete explanation of how it works, the key point is this: to be able to fade (dim) lights up and down slowly or quickly, flash them, etc., each channel gets refreshed many times per second, hence the need for the 57,600 or 115,200 bps rate. Because the lights are also usually synchronized to music, the refresh rate really needs to be timely. When designing light displays in the Vixen software, a common timing interval is 50 ms, which means that for a single string of lights (e.g. one channel) to be lit for one second, that channel needs to be refreshed 20 times. Multiply that out by a couple hundred channels and, well, it’s a lot of continuous data…
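To put rough numbers on it, here’s a back-of-envelope sketch. It assumes roughly one byte per channel per refresh, which is a simplification of the actual Renard framing, but it shows the scale:

```python
# Back-of-envelope serial throughput estimate for a Renard-style stream.
# Assumption (for illustration only): about one byte per channel per
# refresh; 8N1 framing puts 10 bits on the wire per byte (start + 8 + stop).
CHANNELS = 200
REFRESH_INTERVAL_S = 0.050                       # the common 50 ms Vixen interval

refreshes_per_second = 1 / REFRESH_INTERVAL_S    # 20 refreshes/s
payload_bytes_per_second = CHANNELS * refreshes_per_second
bits_per_second = payload_bytes_per_second * 10  # 8N1 wire overhead

print(f"{payload_bytes_per_second:.0f} bytes/s -> {bits_per_second:.0f} bps on the wire")
```

So even the simplified version of a 200-channel stream eats a big chunk of the 57,600 bps line rate, continuously, with no natural pauses.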
The Renard’s protocol works fine over standard RS-232 or RS-485 (for long cable runs) because the PICs on the controllers are plenty fast enough to keep up with the data stream even at 115,200. Consequently there’s no need for flow control, and the Renards don’t use any. Therein lies the problem. The transmit/receive mechanism built into the XBee radios just doesn’t seem to be able to handle the throughput – or at least, I haven’t figured it out yet. I firmly believe it’s a buffer overflow issue, causing dropped packets at (probably) the transmitting radio, but perhaps at the receiving end too. While the baud rate is programmable in the assembly code the Renard PICs use, going slower than 57,600 severely limits the functionality and smoothness of the light effects and is really not acceptable.
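For a sense of scale, here’s a quick sketch of how fast a small serial buffer fills when the sender never pauses and there’s no flow control. The 100-byte buffer size is just an assumption for illustration, not a figure from Digi’s documentation:

```python
# How long can a small radio-side serial buffer absorb a continuous
# stream with no flow control? BUFFER_BYTES is an assumed value for
# illustration, not a documented XBee buffer size.
BUFFER_BYTES = 100
BAUD = 57600
BITS_PER_BYTE = 10  # 8N1: start bit + 8 data bits + stop bit

fill_time_ms = BUFFER_BYTES * BITS_PER_BYTE / BAUD * 1000
print(f"Buffer fills in about {fill_time_ms:.1f} ms of continuous data")
```

In other words, if the radio ever stalls for even a few tens of milliseconds (retries, channel contention, etc.) while the PC keeps transmitting, bytes have nowhere to go and get dropped, which matches the flickering symptoms.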
I’ve considered overclocking the processor on the XBee, but I haven’t yet asked Digi how (or if) that can be done. The clock may be preset in hardware rather than set by a multiplier in the microcode, and I really know very little about the processor the XBee uses.
I have experimented with the RO (packetization timeout) setting – tried various values from zero all the way up to 20. It makes no real difference – that tiny XBee buffer just has to be overflowing something fierce. I’ve tried all manner of different settings, but the typical result is lost packets: lights that blink or flicker when they’re not supposed to, or don’t come on at all anywhere near the exact moment they’re supposed to.
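For what it’s worth, RO is measured in character times (how long the UART must sit idle before the radio packetizes whatever it has buffered), so even the largest value I tried only amounts to a few milliseconds. With a continuous 57,600 bps stream there are essentially no idle gaps that long, which may be why changing it had no visible effect. A quick sketch of the math:

```python
# RO (packetization timeout) is expressed in character times: the idle
# time on the UART before buffered bytes are sent as a packet.
BAUD = 57600
BITS_PER_CHAR = 10  # 8N1 framing

char_time_ms = BITS_PER_CHAR / BAUD * 1000  # one character time in ms

for ro in (0, 3, 20):
    print(f"RO={ro}: ~{ro * char_time_ms:.2f} ms of idle time before send")
```

At 57,600 bps a character time is under 0.2 ms, so RO=20 waits only about 3.5 ms – and a back-to-back stream never gives it even that much quiet time.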
Message was edited by: dirknerkle