ADC bipolar range error

Hi all,

I’m using a BL5S220 and have a problem when trying to calibrate any of the bipolar ADC gains.

I set up the voltage source as indicated by the calibration program. I set the voltage source for the positive point and then swap the PS leads, but when I vary the PS over the whole range, I notice that only the positive range shows correct values. In the negative range, only erratic values are displayed.

I know that for bipolar ranges, some DAC channels are used as voltage references, so I checked DAC channels 0 & 1, and they’re working just fine.

Any ideas? Thanks.

What range are you varying the voltage over?

The bipolar mode requires the use of 2 inputs and shows the difference between the 2, which can be negative in a relative sense (i.e. if input 1 is greater than input 2), but the voltages applied to the pins must always be greater than 0. This mode does not allow the measurement of voltages less than 0 relative to the analog ground.


Thank you very much for your quick response.

Well, in fact I have tried calibrating all the different ranges in SE bipolar mode, and the results are the same.

Regarding the use of two inputs, this is something I had thought about, and it sounds logical. But I’m a little confused: I thought differential mode required two inputs, while single-ended mode required only one.

I have tried differential mode and that works fine. What I’m trying to calibrate now is the bipolar single-ended range. I’m using the ADC_CAL_SE_BIPOLAR.C program to do the calibration, and according to the instructions there, there is no need for a second input. My common sense told me it would simply behave like a binary-offset reading.

Could you clarify this a little further?

Thanks again.


I was confusing bipolar mode with differential mode. Bipolar mode should allow you to provide a negative input, as it floats the potential divider to offset the input voltage into the positive range before passing it to the ADC.

Does your power supply output a true negative voltage relative to ground?



According to the instructions in the ADC calibration program, the channel is calibrated by taking readings at two points: at 80% of the positive range and at 80% of the negative range. When I set a power supply to 80% of the positive range and then swap the leads for the negative point, the reading errors appear after the swap.

What I did was create the two voltage references (-80%, +80%) from a common reference, instead of one reference plus a lead swap. Now, instead of swapping the leads, I connect the channel to the negative reference for the second point, and there it is, it just works!!!

Could someone explain why this happens?


If the controller and the power supply share a common ground, then swapping the leads will not produce a negative voltage on the input; it just shorts the input to that shared ground through the supply.

Swapping the leads would only work if the power supply were truly floating relative to the BL5S220 ground (a battery-powered supply, for example).


Oh yes,

I’m using different power supplies, and they’re isolated from each other.