I’m trying to measure the voltage of an external power supply that varies between 12 V and 15 V. To do this, I’m using a resistor between the positive terminal of the power supply and the “In” terminals of one of the telemetry card’s ports.
Since the card uses 4-20 mA current loops, I’m expecting 4 mA when the power supply is set to 12 V and 20 mA when it’s set to 15 V. What I’m actually getting is 15.58 mA at 12 V and 18.23 mA at 15 V.
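In other words, assuming the scaling is linear over that range (and writing $V_{\text{supply}}$ for the supply voltage, which is just my notation), the reading I’d expect from the card is:

$$ I_{\text{expected}} = 4\,\text{mA} + \frac{V_{\text{supply}} - 12\,\text{V}}{15\,\text{V} - 12\,\text{V}} \times (20\,\text{mA} - 4\,\text{mA}) $$

i.e. roughly 5.33 mA per volt above 12 V, which is clearly not what I’m measuring.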