I’m using two first-generation XBP24 modules as an example (I’m trying to build a “star” network with multiple microcontroller-based end nodes talking to a central PC-based server, but two devices are enough to reproduce the problem).
I’ve updated both modules to firmware 10E8 (the current version as of this writing).
One of them (the server) is initialized with the following string:
ATID=42,DH=0,DL=1,MY=0,CE=1,A2=6,PL=4,AP=1,CN
I’m setting DH to 0 and DL to 1 only as a precaution (see below); it makes no difference either way, I also tried leaving them at their defaults. As you can see, the module is configured as a coordinator (CE=1) in API mode (AP=1) with 16-bit address 0. It initializes fine and waits for associations.
The second device (uC-based) has the following init string:
ATID=42,DH=0,DL=0,MY=2,CE=0,A1=6,PL=4,AP=0,CN
Its 16-bit address is 2, its default destination is the coordinator (DH=0, DL=0), and it uses transparent mode (AP=0) to keep things simple on this end.
On startup it initializes, gets the expected number of OK responses, and starts transmitting test strings - and here’s the problem: the server receives the data NOT as 16-bit packets (frame type 0x81) but as 64-bit packets (frame type 0x80). Surprisingly enough, these packets contain the correct data. Sending to address “2” from the server does not work at all.
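For reference, here’s roughly how the server tells the two frame types apart - a minimal sketch, where the helper names are my own but the byte layout is the standard Series 1 API framing (0x7E delimiter, 2-byte length, then the API identifier):

```c
#include <stdint.h>
#include <stddef.h>

/* Return the API identifier of a received frame, or -1 if it's malformed.
   0x80 = RX packet with 64-bit source address (what I actually get),
   0x81 = RX packet with 16-bit source address (what I expect to get). */
static int rx_frame_type(const uint8_t *buf, size_t len)
{
    if (len < 4 || buf[0] != 0x7E)  /* 0x7E is the API start delimiter */
        return -1;
    return buf[3];                  /* byte right after the 2-byte length */
}

/* In an 0x81 frame the 16-bit source address immediately follows
   the API identifier. */
static uint16_t rx16_source(const uint8_t *buf)
{
    return (uint16_t)((buf[4] << 8) | buf[5]);
}
```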
Next I implemented API mode on the client, thinking that maybe the API and transparent addressing modes are somehow not quite compatible - and tried that. Surprise, surprise: I send out 16-bit TX frames (frame type 0x01), but they still arrive as 64-bit ones.
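For what it’s worth, the 16-bit TX request I build on the client looks like this - a sketch of the Series 1 framing as I understand it; build_tx16 is my own helper, and the checksum is 0xFF minus the low byte of the sum of everything after the length field:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Build a 16-bit-address TX Request (API identifier 0x01).
   Returns the total number of bytes written to out. */
static size_t build_tx16(uint8_t *out, uint8_t frame_id,
                         uint16_t dest, const uint8_t *data, uint8_t n)
{
    uint16_t len = 5 + n;   /* API id + frame id + dest(2) + options + payload */
    size_t i = 0;
    out[i++] = 0x7E;                    /* start delimiter */
    out[i++] = (uint8_t)(len >> 8);
    out[i++] = (uint8_t)(len & 0xFF);
    out[i++] = 0x01;                    /* TX Request, 16-bit destination */
    out[i++] = frame_id;                /* nonzero, so I get a TX status back */
    out[i++] = (uint8_t)(dest >> 8);
    out[i++] = (uint8_t)(dest & 0xFF);
    out[i++] = 0x00;                    /* options: none */
    memcpy(out + i, data, n);
    i += n;
    uint8_t sum = 0;
    for (size_t k = 3; k < i; k++)      /* checksum covers bytes after length */
        sum += out[k];
    out[i++] = (uint8_t)(0xFF - sum);
    return i;
}
```

Sending the frame is then just a matter of pushing those bytes out the UART.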
After banging my head on the table for a few days I noticed that if I reset the controller without power-cycling the module, it suddenly switches to 16-bit packets and keeps working!
I tried to reproduce this without resets - and yes, a “double init” does the trick. Here’s the client code, redacted for brevity:
//delay
printf("+++");
//delay
printf("ATID=42,DH=0,DL=0,MY=%s,CE=0,A1=6,PL=4,CN\r", MYADDRESS); //MYADDRESS is "2" here
//delay
printf("started association\r"); //this string does NOT reach the coordinator, but at exactly this point the signal-level LEDs on the server light up - looks like a successful association to me
//delay - a few seconds
printf("association done\r"); //now, this text does arrive at the server, but in a 64-bit packet
//delay
printf("+++");
//delay
printf("ATID=42,DH=0,DL=0,MY=%s,CE=0,A1=6,PL=4,CN\r", MYADDRESS);
After this I DO receive proper 16-bit frames with the correct data, and sending from the server to client “2” works as expected.
Could someone explain what I’m doing wrong? It seems “MY” somehow gets reset at the moment of successful association… I have already tried saving this configuration (with the WR command) after everything is initialized and working correctly, but nothing changed.