Has anyone seen a problem with serial comms on V7.0 of NET+OS?
I have a setup that is not using DMA, just the standard API functions read() and write(). The serial link operates at one of two speeds, 9600 or 115200 baud. I have BSP_SERIAL_FAST_INTERRUPT set to TRUE, so the OS should be using internal interrupts to send and receive.
I find that if the transmit side is quiet for more than a minute, then when I come to send a new telegram only the last 4 bytes are sent. I have done numerous tests and have found that if the data is a fairly constant stream then all is good and no data is lost.
If sending is halted for more than a minute, then for some reason only the last 4 bytes of a 16-byte telegram are transmitted. I haven't yet worked out whether the characters 0x02 and 0x03 have any significance to the timeout; the problem is we use 0x02 and 0x03 as the STX and ETX framing characters.
I have managed to get a fudge working for now: before each telegram I force out 0x00 and 0xFF as 2 extra bytes. This seems to wake the interface up, and no data is lost.
It just seems strange to me that it would enter a sleep mode that causes data to be lost.
I hope someone has seen the same problem or knows what I need to do to fix it.