ASCII Conversion in Python

Can anybody show me how I can convert my ASCII data to decimal in Python, please? I’ve got a device connected to an XBee dev. board via RS232, and I used the Embedded Kit Gateway viewer to view the data coming in on the serial port. The attachment shows what I’m getting. The data is in ASCII format, and I want to know how I can go about converting it to decimal so that it makes sense to me. Can someone give me some advice or point me in the right direction, please? I’ve looked at a couple of places, but I’m finding it hard to understand.

My assumption is that you have a Python string of binary bytes, correct?

The “ord” and “chr” functions are inverse operations.

ord(x) expects a character argument, and returns the ASCII value as an integer

chr(x) expects an integral ASCII value and returns the corresponding character

It sounds as if you would like to apply the “ord” operation to individual bytes in your data stream.
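For example (Python 3 syntax shown here; the same calls work on one-character strings in Python 2):

```python
# ord() maps a single character to its integer ASCII/code-point value;
# chr() maps an integer value back to the corresponding character.
print(ord('A'))       # 65
print(chr(65))        # A
print(chr(ord('A')))  # round trip gives back 'A'
```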


I can help you with the first part of this problem: taking a string of characters and reformatting it as a new string in which all of the non-printable characters are replaced with some sort of hexadecimal representation.

Thankfully this is quite easy to do in Python. There are many ways to accomplish this remapping but I’ll offer you two options which accomplish the same thing:

  1. Create a new list and use a loop to process an input string “s” character-by-character using a variable “c”, add a formatted representation of “c” to the list, and then convert the list to a string using “join()” and print it.

  2. Use a list comprehension in a single line to accomplish the above.

I’ve attached a screenshot of developing both routines in the interactive Python interpreter (since the formatting for Python is easy to skew on these forums). I will, however, post the one-liner (#2, above) here as well since it requires no indentation:

>>> s = "\x03My D\x04T\x04 P\x04CKET\x05"
>>> print ''.join([ ("%02x" % ord(c), c)[ord(c) >= 0x20 and ord(c) <= 0x7e] for c in s])
03My D04T04 P04CKET05
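For completeness, here is a sketch of the loop form (#1, above) in Python 3 syntax; it produces the same output as the one-liner:

```python
# Replace non-printable bytes in a string with their two-digit hex codes.
s = "\x03My D\x04T\x04 P\x04CKET\x05"

parts = []
for c in s:
    if 0x20 <= ord(c) <= 0x7e:        # printable ASCII: keep the character
        parts.append(c)
    else:                             # non-printable: show its hex value
        parts.append("%02x" % ord(c))
print(''.join(parts))                 # 03My D04T04 P04CKET05
```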


Hi again

I’m still a little stuck with this. I’ve attached screenshots of some of my tests and results, and here are some of the questions I have.

Attachment 1 -> Actual data when the sensor is connected to the serial port of a PC, using HyperTerminal. Using a binary viewer like waltr showed me helped me understand/decode the data according to the sensor protocol info sheet I have, and it’s correct.

Attachment 2 -> Result when the sensor is connected to an XBee dev. board via RS232 (XBee ZNet 2.5, AT mode). This is an error message that keeps coming up frequently.

Attachment 3 -> Some more data after some time. Although I do get the error messages as in attachment 2, if I press OK, it’s still able to receive data.

I’m using the PC companion application to view the serial data when testing with the Xbee dev. board.


  1. Attachment 3 -> Where is the rest of the data? How does ZigBee process incoming data? When I compare this with its hex value, the results don’t match the hex values from the HyperTerminal test (which are correct). The HyperTerminal results show 10 characters/bytes on every reading, but the XBee test only shows 6. Here is a link to an earlier post on how the data from the HyperTerminal test was decoded.,787#2881

  2. How can I alter the program in order to decode my data properly? I’ve looked through the source code and found some Python scripts in the control folder, but I think I should be looking at the folder that’s uploaded to the gateway (together with the other Python scripts), because it seems like this is the main file that handles most of the data reading, etc. There’s a folder called “encodings” in the file that gives me some clues about decoding/encoding data, but I have no idea where to start looking, as there’s a whole lot of scripts there.

Can someone please help me!!!

Sorry, I forgot to attach the screenshot.

From the screenshot the data looks to be raw binary (hex). The terminal can only display characters defined by ASCII (0x41 is displayed as an ‘A’, but an extended byte such as 0xAA is displayed as ‘ª’).
Google ‘ASCII table’ for the 7-bit and 8-bit displayed characters.

To convert raw binary data to an ASCII hex string in Python I use:

import binascii

x = read_data(device) #raw binary data read from the XBee
s = binascii.b2a_hex(x) #two hex digits per input byte
print s #output ASCII hex string

This is explained here:

This works for me, but I’m sure there are other ways to do this conversion.
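As a self-contained illustration (Python 3 shown, where `binascii.b2a_hex` takes and returns `bytes`), using a literal payload in place of the device read:

```python
import binascii

# Hypothetical raw payload, as if it had been read from the serial port.
raw = b"\x03My D\x04T\x04 P\x04CKET\x05"

# b2a_hex (also available as binascii.hexlify) renders each
# input byte as two hex digits.
hex_str = binascii.b2a_hex(raw)
print(hex_str)  # b'034d792044045404205004434b455405'
```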


Hi meme,
I have never looked at the GatewayKit code, but I have dug into the Digi Dia code.
First, Python is a very different language from the ones I’ve used, and I’m just starting to learn it. A good ‘debugging’ technique is inserting ‘print’ statements to output a value.
When you run the GatewayKit code, are you starting from a command-line prompt where you type ‘python’?
If so, then any print statement within the code will output to this screen.

I know this isn’t much help but I hope it gets you started.

In the Digi Dia code there are parsing routines that convert the data payload from the XBee into other formats. There the code expects the data to be formatted in a specific way. There is most likely a parsing routine in the GatewayKit that assumes a data format (which is different from your sensor data) and thus mangles it. Once you find that routine you can add code to properly parse your sensor’s data.
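As a sketch only (the field layout below is invented for illustration, not your sensor’s actual protocol), such a parsing routine in Python might use the `struct` module to unpack a fixed-length payload into named fields:

```python
import struct

def parse_packet(payload):
    """Parse a hypothetical 10-byte packet: 1 start byte, a 2-byte
    big-endian reading, 6 bytes of ASCII label, and 1 end byte."""
    if len(payload) != 10:
        raise ValueError("expected 10 bytes, got %d" % len(payload))
    # ">BH6sB" = big-endian: uint8, uint16, 6 raw bytes, uint8
    start, reading, label, end = struct.unpack(">BH6sB", payload)
    return {"start": start,
            "reading": reading,
            "label": label.decode("ascii", "replace"),
            "end": end}

pkt = parse_packet(b"\x02\x01\x90SENSOR\x03")
print(pkt["reading"])  # 0x0190 == 400
```

Once you know the real format from your sensor’s protocol sheet, you would adjust the `struct` format string to match.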

Good luck and hope a Digi person will actually answer your questions.

Thanks Waltr!

Yes, I am able to use print statements to print the time on each output I get, but the output from the sensors is still mangled up and I have no idea how to reformat it. Thanks for your help, though. I appreciate it!

Is anyone able to show me how the code for the PC companion application formats data, please? Or what it expects the data to look like?

I’ve gone through all the main code scripts but can’t find anywhere where it looks like some form of encoding is happening. The only thing that gives me a clue is the “encodings” folder within the file. This has numerous scripts for decoding/encoding data from binary to ASCII, etc., but the “encodings” folder doesn’t seem to be used/imported anywhere in the main scripts that are used to run the PC companion application.

Maybe a better question: how do I alter the program so that it doesn’t try to decode my data? I thought that when the XBee modules are in AT mode, whatever is sent is exactly what’s received, but in this case, AT mode doesn’t seem to work that way???

Please help!! I really need to get this going by the end of this week! Hope someone can help me, please!
