1 Reply Latest reply on Jun 18, 2014 1:24 PM by ArvindS_76

    ADC raw value is not saturated.

      adc_readVoltage() has about a 10% error without calibration and 3% with it.

      I hoped to reduce the error, so I used adc_readSampleRaw() instead of adc_readVoltage().

       

      But when I applied 3.6V to the ADC port, the 16-bit ADC value was around 25700. I expected 65535.

      I set the input range to 3.6V:    adc_SetInputRange(ADC_RANGE_0_3P6V)
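For reference, the sequence described above can be sketched as follows. adc_SetInputRange, ADC_RANGE_0_3P6V, and adc_readSampleRaw are the driver names used in this post, but the stub bodies below are placeholders that just mimic the reported behavior (they are not the real WICED driver), and the channel id P1 is an assumption:

```c
#include <stdint.h>

/* Stand-ins for the WICED ADC driver calls named in this post. The
 * bodies are placeholders that echo the reported behavior, not the
 * real driver; P1 as a channel id is an assumption. */
enum { ADC_RANGE_0_3P6V = 0, P1 = 1 };

static void adc_SetInputRange(int range)
{
    (void)range; /* real driver: select the 0..3.6 V input range */
}

static uint16_t adc_readSampleRaw(int channel)
{
    (void)channel;
    return 25700; /* placeholder: the raw value reported at 3.6 V input */
}

static uint16_t read_raw_at_full_scale(void)
{
    adc_SetInputRange(ADC_RANGE_0_3P6V); /* 0..3.6 V input range */
    return adc_readSampleRaw(P1);        /* raw ADC code, not a voltage */
}
```

Note that the value returned is a raw ADC code; converting it to a voltage requires the calibration discussed in the reply below.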

       

      My first question is: why is the ADC raw value not saturated even though the voltage on the ADC port is at the maximum of the range?

       

      My second question is how to use adc_adcCalibrate().

      If I measure the ADC supply voltage as 3.2V, should I pass adc_adcCalibrate(32000000, P1)?

      And what effect does this calibration have? That is, I'd like to know what calibration this function actually performs.

       

      Thank you.

        • 1. Re: ADC raw value is not saturated.
          ArvindS_76

          > But when I applied 3.6V to the ADC port, the 16-bit ADC value was around 25700. I expected 65535.

          65535 is only the maximum value that fits in a 16-bit variable; it does not relate in any way to what the ADC hardware generates. Also make sure that if you are trying to sample an input with a 0-3.6V range, your Vio is at least 3.2V (in other words, don't let the ADC input go more than ~10% over the Vio supply of the chip).

           

          > My first question is: why is the ADC raw value not saturated even though the voltage on the ADC port is at the maximum of the range?

          I don't understand the question. You don't want the ADC to saturate, because then the calibration will be totally off. The raw ADC output varies with process, chip, and temperature, which is why it needs to be calibrated. We want to keep the ADC in its linear operating range so that calibration reduces to simple linear extrapolation/interpolation of the voltages being measured.
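The linear interpolation/extrapolation mentioned above can be sketched like this. The function name, the calibration point, and the arithmetic are illustrative assumptions about what a calibrated conversion looks like, not the driver's actual internals:

```c
#include <stdint.h>

/* Sketch of a calibrated linear raw-to-voltage conversion (assumed,
 * not the WICED driver's code). raw_cal is the raw sample taken while
 * a known reference voltage (vref_mv, in millivolts) was applied;
 * afterwards, any raw sample in the linear range converts by simple
 * proportionality through that calibration point. */
static int32_t adc_raw_to_mv(int32_t raw, int32_t vref_mv, int32_t raw_cal)
{
    /* Valid only while the ADC stays in its linear range; a saturated
     * raw code would break this proportionality, which is why the ADC
     * is kept away from its saturation point. */
    return (int32_t)(((int64_t)raw * vref_mv) / raw_cal);
}
```

With an assumed calibration point of raw code 21845 at 1200 mV, the raw reading of ~25700 from the question would map to roughly 1.4 V rather than full scale.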

           

          > how to use adc_adcCalibrate.

          I think there is a typo in the function documentation. The reference voltage should be in millivolts and *not* in microvolts. The default calibration voltage the driver uses is the internal 1.2V LDO input (so it is regulated and known to be good). If there is a more accurate voltage reference in your design, you can use that to calibrate the ADC: connect it to one of the ADC input channels (P1 in your example), and after you initialize the ADC driver with adc_config(), call adc_adcCalibrate(3200, P1), assuming your 3.2V reference is connected to P1.
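The calibration sequence described above, as a minimal sketch. adc_config and adc_adcCalibrate are the driver names from this thread, but the stub bodies below only record the arguments so the call order and units can be checked; the real driver talks to hardware, and P1 as a channel id is an assumption:

```c
/* Stand-ins for the WICED ADC driver entry points named in this
 * thread. The bodies are placeholders that record the arguments; the
 * real driver programs the hardware. P1 as a channel id is assumed. */
enum { P1 = 1 };

static int g_cal_mv = 0;
static int g_cal_channel = -1;

static void adc_config(void)
{
    /* real driver: initialize the ADC block */
}

static void adc_adcCalibrate(int ref_mv, int channel)
{
    /* Reference voltage is in millivolts (3.2 V -> 3200), not microvolts. */
    g_cal_mv = ref_mv;
    g_cal_channel = channel;
}

static void calibrate_against_external_ref(void)
{
    adc_config();               /* 1. initialize the ADC driver first     */
    adc_adcCalibrate(3200, P1); /* 2. 3.2 V reference connected to P1     */
    /* 3. subsequent voltage reads use the new calibration point */
}
```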
