Packet loss at the Receiver Side

My problem statement is simple. I have an Arduino Uno and an Arduino Mega, each with a Zigbee shield mounted on it. The Uno works as the transmitter and the Mega as the receiver.

Code for Tx:

    // Transmitter: alternately send "High" and "Low" every 200 ms
    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      Serial.println("High");
      delay(200);
      Serial.println("Low");
      delay(200);
    }

Code for Rx:

    char msg;
    const int led = 13;  // LED on pin 13 (declared but not used yet)

    void setup() {
      Serial.begin(9600);  // the baud rate must match on both Arduinos
      pinMode(led, OUTPUT);
    }

    void loop() {
      while (Serial.available()) {
        msg = Serial.read();
        if (msg == 'H') {
          Serial.println("Message High");
        }
        if (msg == 'L') {
          Serial.println("Message Low");
        }
        delay(200);  // note: this delay runs once per received character,
                     // inside the read loop, so the buffer drains slowly
      }
    }

On the Tx side, the packets are sent serially as alternating High/Low:

    High
    Low
    High
    Low
    High
    Low
    High
    Low
    High
    Low
    High

However, on the receiver side some packets go missing. The output looks like this:

    Message High
    Message High
    Message Low
    Message High
    Message High
    Message High
    Message Low
    Message Low
    Message Low
    Message Low
    Message Low
    Message Low
    Message Low
    Message Low
    Message Low

I would expect it to print:

    Message High
    Message Low
    Message High
    Message Low

How can I receive packets synchronously, and how can I detect any packet loss on the Rx side?

Thank you for your suggestions, corrections, comments!

First, add a delay between the packets. Next, don't send the next packet until the remote side has sent back an ACK.
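A minimal sketch of that ACK scheme, assuming the XBee link is plain transparent serial on the hardware Serial port (as in the question) and a hypothetical one-byte protocol: the transmitter sends 'H' or 'L' and resends until the receiver answers with 'A'. The transmitter side could look like this:

    // Transmitter (Uno): send one packet, then wait for an ACK before
    // sending the next one; resend if no ACK arrives in time.
    const unsigned long ACK_TIMEOUT = 500;  // ms to wait for an ACK (assumed value)

    void setup() {
      Serial.begin(9600);
    }

    bool sendPacket(char packet) {
      Serial.write(packet);
      unsigned long start = millis();
      while (millis() - start < ACK_TIMEOUT) {
        if (Serial.available() && Serial.read() == 'A') {
          return true;   // receiver confirmed this packet
        }
      }
      return false;      // the packet (or its ACK) was lost
    }

    void loop() {
      while (!sendPacket('H')) {}  // retry until acknowledged
      delay(200);
      while (!sendPacket('L')) {}
      delay(200);
    }

And the matching receiver, which drains the serial buffer without delaying and acknowledges every packet:

    // Receiver (Mega): read each packet as soon as it arrives and ACK it.
    // There is no delay inside the read loop, so the 64-byte serial
    // buffer is drained promptly and nothing is dropped on this side.
    const int led = 13;

    void setup() {
      Serial.begin(9600);  // same baud rate as the transmitter
      pinMode(led, OUTPUT);
    }

    void loop() {
      if (Serial.available()) {
        char msg = Serial.read();
        if (msg == 'H' || msg == 'L') {
          Serial.write('A');                           // ACK back over the link
          digitalWrite(led, msg == 'H' ? HIGH : LOW);  // show the state on the LED
        }
      }
    }

The receiver signals on the LED instead of printing, since in this setup its Serial port is shared with the XBee link. With this scheme every lost packet shows up on the Tx side as a missing ACK and gets resent; if duplicates matter (an ACK itself can be lost), a sequence-number byte could be added to each packet.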