> But when I applied 3.6V into adc port, the 16-bit adc value displayed around 25700. I expected 65535.
65535 is only the maximum value that fits in a 16-bit variable; it does not relate in any way to what the ADC hardware actually generates. Also, if you are trying to sample an input with a 0-3.6V range, make sure your Vio is at least 3.2V (in other words, don't let the ADC input go more than ~10% over the Vio supply of the chip).
> My first question is why the adc raw value is not saturated although the voltage on adc port is maximum value.
I don't understand the question. You don't want the ADC to saturate, because then the calibration would be completely off. The raw ADC output varies with process, chip, and temperature, which is why it needs to be calibrated. We want to keep the ADC in its linear operating range so that calibration reduces to simple linear interpolation/extrapolation of the voltages being measured.
> how to use adc_adcCalibrate.
I think there is a typo in the function documentation: the reference voltage should be in millivolts, *not* microvolts. By default the driver calibrates against the internal 1.2V LDO input (which is regulated and known to be good). If there is a more accurate voltage reference in your design, you can use that to calibrate the ADC instead: connect it to one of the ADC input channels (P1 in your example), initialize the ADC driver with adc_confi(), and then call adc_adcCalibrate(3200, P1), assuming your reference voltage is 3.2V and connected to P1.