Low Level Serial Problem.

Hi All,
Has anyone seen a problem with serial comms on V7.0 of NET+OS?

I have a setup that is not using DMA, just the standard API functions read() and write(). The serial link operates at one of two speeds, 9600 or 115200 baud. I have BSP_SERIAL_FAST_INTERRUPT set to TRUE, so the OS should be using internal interrupts to send and receive.

I find that if the line is idle for more than a minute, then when I come to send a new telegram only the last 4 bytes are transmitted. I have done numerous tests and have found that if the data is a fairly constant stream then all is good and no data is lost.

If the sending of data is halted for more than a minute, then for some reason only the last 4 bytes of a 16-byte telegram are transmitted. I haven't yet worked out whether the characters 0x02 and 0x03 have any significance to the timeout; the problem is we use 0x02 and 0x03 as the STX and ETX framing characters.

I have managed to get a fudge working for now by forcing the sending of 0x00 0xFF as 2 extra bytes before the telegram. This seems to wake the interface up, and no data is lost.

It just seems strange to me that the port should enter some sleep mode that causes data to be lost.

I hope someone has the same problem or knows what I need to do to fix it.

How are you verifying that the data was lost? Are you sure that the first 12 bytes of the data didn't go out immediately and then the last 4 bytes came out later with additional data?

The Digi is sending data to the target, which runs under an emulator, and all incoming data is observed via the receive interrupt. The tests have shown that if there isn't a long time gap between telegrams then all data is sent.

It has all been verified with the target emulator; I have also been using a terminal session in Windows to confirm what I have seen with the emulator.

We are using an ASCII protocol, so it is valid to use a terminal session to monitor the data.

You should contact Digi Support. They have a beta serial driver that contains some bug fixes.