16-bit addressing cannot be set before association

I’m using two XBP24 modules, first generation, as an example (I’m trying to build a “star” system with multiple microcontroller-based end nodes communicating with a central PC-based server, but two devices seem to be enough to reproduce the problem).
I’ve updated both modules to firmware 10E8 (the current version as of today).

One of them (the server) is initialized with the following string:
ATID=42,DH=0,DL=1,MY=0,CE=1,A2=6,PL=4,AP=1,CN
I’m setting DH to 0 and DL to 1 just as a precaution (see below) - it doesn’t change anything either way; I also tried leaving them alone. As you can see, it’s set up as a Coordinator (CE=1) in API mode (AP=1), with 16-bit address 0. Fine - it gets initialized and awaits associations.
The second device (uC-based) has the following init string:
ATID=42,DH=0,DL=0,MY=2,CE=0,A1=6,PL=4,AP=0,CN
Its 16-bit address is 2, default destination is the coordinator (DH=0, DL=0), and it’s using the transparent mode (AP=0) to simplify things on this end.

On startup it initializes, gets the appropriate number of OK responses, and starts transmitting test strings - and here’s the problem: the server receives the data NOT as 16-bit packets (frame type 0x81), but as 64-bit packets (frame type 0x80). Surprisingly enough, these packets contain the correct data. Sending to address “2” from the server does not work.
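
For reference, the two RX frame types differ in the API identifier byte and in the size of the source address that follows it. Here’s a sketch of how I tell them apart on the server side (the helper name is mine; the frame-type values are from the manual):

```c
#include <stdint.h>

/* API frame types from the 802.15.4 XBee manual */
#define RX_64BIT 0x80  /* RX packet with 64-bit source address */
#define RX_16BIT 0x81  /* RX packet with 16-bit source address */

/* Returns the number of source-address bytes in a received API frame,
   or -1 if it isn't an RX packet at all.
   `frame` points at the API identifier byte (i.e. just past the 0x7E
   start delimiter and the two length bytes). */
int rx_addr_len(const uint8_t *frame)
{
    switch (frame[0]) {
    case RX_64BIT: return 8;  /* sender's SH+SL follow */
    case RX_16BIT: return 2;  /* sender's 16-bit MY follows */
    default:       return -1;
    }
}
```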

Next I implemented API mode on the client, thinking that maybe the API and transparent addressing modes are somehow incompatible - and tried that. Surprise, surprise: I send out 16-bit frames (frame type 0x01), but they arrive as 64-bit ones.
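
For completeness, this is roughly how I assemble the 16-bit TX request (API ID 0x01) on the client - a sketch assuming the standard API framing (start byte 0x7E, big-endian length, checksum = 0xFF minus the low byte of the sum of the bytes between the length field and the checksum); the function name is mine:

```c
#include <stdint.h>
#include <string.h>

/* Build an API TX request with a 16-bit destination (frame type 0x01).
   Layout: 0x7E, length MSB, length LSB, 0x01, frame ID,
   dest MSB, dest LSB, options, payload..., checksum.
   Returns the total frame size written to `out`. */
size_t build_tx16(uint8_t *out, uint8_t frame_id, uint16_t dest,
                  const uint8_t *payload, size_t len)
{
    size_t n = 0;
    uint16_t flen = (uint16_t)(5 + len);   /* API ID through payload */
    out[n++] = 0x7E;                       /* start delimiter */
    out[n++] = (uint8_t)(flen >> 8);
    out[n++] = (uint8_t)(flen & 0xFF);
    out[n++] = 0x01;                       /* TX request, 16-bit address */
    out[n++] = frame_id;
    out[n++] = (uint8_t)(dest >> 8);
    out[n++] = (uint8_t)(dest & 0xFF);
    out[n++] = 0x00;                       /* options: none */
    memcpy(out + n, payload, len);
    n += len;
    uint8_t sum = 0;
    for (size_t i = 3; i < n; i++)         /* sum bytes after the length */
        sum += out[i];
    out[n++] = (uint8_t)(0xFF - sum);      /* checksum */
    return n;
}
```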

After banging my head on the table for a few days, I noticed that if I reset the controller without power-cycling the module, it suddenly switches to 16-bit packets and keeps on working!
I tried to reproduce this without resets - and yes, a “double-init” does the trick. Here’s the client code, redacted for brevity:

//delay
printf("+++");
//delay
printf("ATID=42,DH=0,DL=0,MY=%s,CE=0,A1=6,PL=4,CN\r", MYADDRESS); //MYADDRESS is "2" here
//delay
printf("started association\r"); //this string does NOT get to the coordinator, but at exactly this point the signal-level LEDs on the server light up - looks like a successful association to me
//delay - a few seconds
printf("association done\r"); //now, this text DOES arrive at the server, but in a 64-bit packet
//delay
printf("+++");
//delay
printf("ATID=42,DH=0,DL=0,MY=%s,CE=0,A1=6,PL=4,CN\r", MYADDRESS);

After this I DO receive proper 16-bit frames with the correct data, and sending from the server to client “2” works as expected.

Could someone explain what I’m doing wrong? It seems “MY” somehow gets reset at the moment of successful association… I have already tried to save this configuration (using the WR command) after everything is initialized and works correctly, but nothing changed.

I don’t know what you’re doing wrong as it appears I was doing the same thing. Thanks to your posting I changed my code to set the MY address after association and I finally have 16-bit addressing working. Thanks for the tip.

Dave

I don’t know the answer either, but I can offer a suggestion for further investigation. (I can see you’ve spent some time working on the problem, so forgive me if you’ve already tried this.)

You’ve established that sending the whole init string again makes a difference, and the implication is that the MY parameter is somehow the problem. To prove or disprove that, maybe you could try something like this:

  1. Before the second init (i.e. just before the last line in your sample code), issue commands to read back the values of the parameters mentioned in the init. This is local communication with the local XBee, so the fact that it isn’t associated yet is not a problem.
  2. Store the values in the uC memory.
  3. Issue the second init (last line of your code).
  4. Now you have communication, so send the saved parameter values to the host in a data packet and see which ones were unexpected and exactly what their values were.
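
To make step 1 concrete: in command mode, sending a parameter name with no value makes the module report its current value. A sketch of building those queries (the helper and the parameter list are just illustrations):

```c
#include <stdio.h>

/* Parameters from the original init string, worth reading back in step 1. */
static const char *params[] = { "ID", "DH", "DL", "MY", "CE", "A1", "PL" };

/* Build the command-mode query for one parameter: "AT<name>\r" with no
   '=' and no value asks the module to report the current setting.
   Returns the number of characters written (excluding the NUL). */
int build_query(char *out, size_t outsz, const char *param)
{
    return snprintf(out, outsz, "AT%s\r", param);
}
```

You would send each query in turn (with a delay), collect the replies into uC memory, and dump them to the host after the second init.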

I promise nothing, but it might reveal something.

A follow-up to my own post. Not good practice. Sometimes though, you hit the send button and then a new thought comes through.

XBees are designed to be used in large networks as well as small ones. Let’s imagine some sort of world-wide network (maybe for a travel company): coordinators everywhere, and clients (end devices) that people carry with them around the world. Any client needs at any time to be able to communicate with its nearest coordinator.

No coordinator can cope if two of its clients want to use 16-bit addressing and they both insist on using the same 16-bit address.

So when coordinators are in use, it seems likely that the coordinator will want to allocate 16-bit addresses itself rather than letting clients choose their own.

Could that be what’s happening here?

If so, then the fix would be just to read MY after association and use that value until the next association.

I have just re-read your original post, and I can see that this answer doesn’t address all the points in it. Still, could there be a clue here?

Hi,
the behavior of the module is correct.
When association takes place, the node is switched to 64-bit addressing mode and its MY parameter is set to 0xFFFE.

Look at page 20 of the user manual:

"When an End Device associates to a Coordinator, its MY parameter is set to 0xFFFE to enable 64-bit addressing. The 64-bit address of the module is stored as SH and SL parameters. To send a packet to a specific module, the Destination Address (DL + DH) on the sender must match the Source Address (SL + SH) of the desired receiver."

So after association, setting MY again switches the module back to 16-bit addressing mode.
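
In other words, the practical fix is just to re-set MY once association has completed - something like this sketch (the helper name is mine; it only builds the command string, and assumes the module has already been put back into command mode with "+++"):

```c
#include <stdio.h>

/* Build the command that restores 16-bit addressing after association
   has reset MY to 0xFFFE. `my_addr` is the desired 16-bit address as a
   string, e.g. "2"; CN exits command mode afterwards.
   Returns the number of characters written (excluding the NUL). */
int build_restore_my(char *out, size_t outsz, const char *my_addr)
{
    return snprintf(out, outsz, "ATMY=%s,CN\r", my_addr);
}
```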