I’m trying to get the maximum resolution from the ADC on the RCM4200. I should get an LSB resolution of 1.3mV when measuring the range 0 - 5.56V, but instead I get half of that resolution: a 2.7mV step.
The strangest piece of evidence I’ve come across is that when I run AD_CAL_CHAN.C, I notice the calibrations are based on 0V and 5.56V corresponding to raw values of 0 and 2047. The raw value corresponding to 5.56V should be 4095 (12-bit) instead of 2047 (11-bit) to get the full resolution… So how do I get the full 12-bit resolution? This calibration program says it’s running in single-ended mode, and that’s what I need.
I looked at ADC_ADS7870.lib and found this in the anaIn() header comment:
It says the function returns “0-2047 for 11-bit conversions (bit 12 for sign)”, and an overflow value otherwise. 11 bits regardless of whether you’re in differential or single-ended mode? So you can never get the quoted 12-bit resolution?
I’m going to dig deeper, but if anyone can help me with this it would be appreciated.