Getting timestamp from input switch

My current setup is an RCM6700 along with a Maxim 13362 simulation board.

The primary goal of this project was to read data from the switches on the Maxim board and have the Rabbit perform serial communication with the server. This, I have achieved.

The problem is the timestamps. I see a difference of at least ~30 ms when I press two switches concurrently. I don't expect the same stamp for both, but I am aiming for a delay of at most 10 ms.

I have tried commenting out printf statements, which brought the difference down to ~30 ms from about 60 ms before.

My program is quite long, so here are the snippets related to the timestamp calculations (what I have provided is still fairly lengthy, so I appreciate your time looking):


// These are the various configurations for the NIST time servers.
#define NIST_SERVER_1   			""	// NIST, Boulder, Colorado
#define NIST_SERVER_2   			""   // NIST, Boulder, Colorado
#define NIST_SERVER_3   			""   // NIST, Boulder, Colorado
#define NIST_SERVER_4   			""  // University of Colorado, Boulder
#define NIST_SERVER_5   			""     // NIST, Gaithersburg, Maryland
#define NIST_SERVER_6   			""     // NIST, Gaithersburg, Maryland

void nistTime()
{
   auto int rtcStatus;
   auto int status;
   auto longword ip;
   auto int i, dst, health;
   auto struct tm t;
   auto unsigned long longsec;
   auto char time[50];     // NIST daytime replies run ~50 bytes; 30 was too small for the reads below
   auto int done = 0;
   auto long difference;
   auto int ipChoice = 1;
   auto int secInc;
   static tcp_Socket socket;

   // Continue looping through all IP choices until we establish a valid
   // connection to a server
   while (!done)
   {
      // Convert the chosen server's IP address to a longword address to
      // communicate with the various NIST servers
      switch (ipChoice)
      {
         case 1: ip = resolve(NIST_SERVER_1); break;
         case 2: ip = resolve(NIST_SERVER_2); break;
         case 3: ip = resolve(NIST_SERVER_3); break;
         case 4: ip = resolve(NIST_SERVER_4); break;
         case 5: ip = resolve(NIST_SERVER_5); break;
         case 6: ip = resolve(NIST_SERVER_6); break;
         default:
            // Set the RTC status flag to -1 to indicate we failed to establish
            // a connection to any of the NIST time servers
            rtcStatus = -1;
            return;
      }

      // Reset the socket in memory so that it can be re-initialized upon use
      memset(&socket, 0, sizeof(socket));

      // Open a TCP socket to the NIST server
      tcp_open(&socket, 0, ip, NIST_PORT, NULL);

      // Tell the socket to wait for the specified timeout duration before
      // aborting the connection
      sock_wait_established(&socket, NIST_TIMEOUT, NULL, &status);

      // Set the socket mode to ASCII only
      sock_mode(&socket, TCP_MODE_ASCII);

      // Loop as long as the socket has valid network data coming in
      while (tcp_tick(&socket))
      {
         sock_wait_input(&socket, NIST_TIMEOUT, NULL, &status);
         // Store the data coming in from the socket into the time buffer
         sock_gets(&socket, time, sizeof(time));
      }

      // Catch the socket error if one should occur
      if (status != 1)
      {
         // Fill in with debugging code if necessary
      }

      // Close down the socket connection to the NIST server
      sock_close(&socket);

      // Determine the health of the NIST time by taking the ASCII digit at
      // index 27 and subtracting ASCII '0' from it.
      health = time[27] - '0';
      switch (health)
      {
         // Case 0: RTC status is good, and we're done
         case 0:
            rtcStatus = 0;
            done = 1;
            break;

         // Case 1: RTC status is close, and we're done
         case 1:
            rtcStatus = 1;
            done = 1;
            break;

         // Case 2: RTC status is bad and we need to try a different server.
         // Once all six choices are exhausted, the default case above gives
         // up so we can try to gather the NIST time at a later time.
         case 2:
            ipChoice++;
            break;
      }
   }
   // Set the rtc_time values based on the data returned from NIST
   t.tm_year = 100 + 10 * (time[6] - '0') + (time[7] - '0');
   t.tm_mon  = month2tm_mon(10 * (time[9] - '0') + (time[10] - '0'));
   t.tm_mday = 10 * (time[12] - '0') + (time[13] - '0');
   t.tm_hour = 10 * (time[15] - '0') + (time[16] - '0');
   t.tm_min  = 10 * (time[18] - '0') + (time[19] - '0');
   t.tm_sec  = 10 * (time[21] - '0') + (time[22] - '0');
   dst       = 10 * (time[24] - '0') + (time[25] - '0');

   // Convert the rtc_time information into a long number
   longsec = mktime(&t);

   // Add (3600 seconds * TIMEZONE) to adjust for the current timezone
   longsec += 3600ul * TIMEZONE;

   // If the daylight savings time value is between 1 and 50, add an hour to the current time
   dst = (dst >= 1 && dst <= 50 );
   if (dst)
      longsec += 3600ul;   // DST is in effect

	// Update the RTC time based on timezone & daylight savings time values
   mktm(&t, longsec);

   // Calculate the difference between our calculated time and the Rabbit SEC_TIMER
   difference = (SEC_TIMER-longsec);

   // If our health status is less than 2 and the difference is within
   // tolerance, write the new RTC value
   if (health < 2 && difference <= 10)
      write_rtc(longsec);
   // If our health status is less than 2 but the calculated difference is
   // greater than 10, indicate we have an RTC time calculation error.
   else if (health < 2 && difference > 10)
      rtcStatus = 3;

   // Set the done flag to 0.
   done = 0;
   // Read the current RTC time from the rabbit and store it into rtc
   mktm( &rtc, read_rtc());

   // If the RTC's seconds value is 59, wrap secInc around to 0;
   // otherwise set it to one second ahead of the current second value.
   if (rtc.tm_sec == 59)
      secInc = 0;
   else
      secInc = rtc.tm_sec + 1;

   // While we are not done calculating the millisecond values, keep trying
   while (!done)
   {
      // Read the current RTC time value and store it into rtc
      mktm(&rtc, read_rtc());

      // If the secInc value equals the seconds value from the RTC, we have
      // just crossed a second boundary and can calculate our millisecond value
      if (secInc == rtc.tm_sec)
      {
         // Capture the Rabbit's free-running MS_TIMER at the second boundary
         MSoffset = MS_TIMER;
         //printf("This MSoffset %lx\n", MSoffset);

         // Calculate the total time of day stored in the RTC, in seconds
         offset = (long)rtc.tm_hour * 3600 + (long)rtc.tm_min * 60 + (long)rtc.tm_sec;

         // Calculate the number of milliseconds into the day
         //printf("This MSoffset 2 %lx\n", MSoffset);
         msDay = MS_TIMER - MSoffset + offset * 1000;
         //msDay = offset * 1000;
         //printf("The time is offset %lx\n", offset);
         //printf("The time is msDay %lx\n", msDay);

         // Indicate that we are done with the time
         done = 1;
      }
   }
}

Are you doing these measurements in debug mode or in standalone mode? Use standalone mode for a true measurement.

I can't seem to determine which mode I am currently in. How can I change to standalone mode?

See page 38 of the RCM6700 user manual which describes putting the board into standalone mode:

The RCM5700/RCM6700 must be programmed via the Interface Board or via a similar arrangement on a customer-supplied board. Once the MiniCore has been programmed successfully, reset the MiniCore. The MiniCore may be reset by cycling power off/on or by pressing the RESET button on the Interface Board. The jumper across pins 1–2 on header JP1 on the Interface Board must be removed in order for the MiniCore to operate in the Run Mode (standalone mode) after it is reset.


Thank you very much for your help on this issue. It's not 100% consistent, but my delays are now <= 10 ms.