Hi,
I'm trying to run a simple experiment using the Delta-Sigma ADC on a CY8CKIT-059. For some reason, the ADC measures a constant offset of about 18 mV.
Even when I short the ADC input to GND on the board, I still read the 18 mV offset. I tried the same experiment with a SAR ADC and it behaves correctly: 0.0 V at the input produces ~0 counts at the output. What am I missing here? My configuration is listed below, along with a sketch of the read loop I'm using.
Thanks in advance, Boris.
ADC Configuration:
- Single ended
- Single sample
- 14-bit resolution
- Conversion rate: 1000 SPS
- Input range: Vssa to 2.048 V
- Buffer Mode: Bypass
- Reference: Internal 1.024 V
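For reference, I take the readings with a minimal loop along these lines (just a sketch, assuming the DelSig component instance is named ADC and the standard generated API):

#include <project.h>

int main(void)
{
    int32 counts;
    int32 millivolts;

    CyGlobalIntEnable;
    ADC_Start();                                   /* power up and configure the DelSig ADC */

    for(;;)
    {
        ADC_StartConvert();                        /* trigger one conversion (single-sample mode) */
        ADC_IsEndConversion(ADC_WAIT_FOR_RESULT);  /* block until the result is ready */
        counts = ADC_GetResult32();                /* raw 14-bit result */
        millivolts = ADC_CountsTo_mVolts(counts);  /* shows ~18 even with the input shorted to GND */
    }
}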
It looks to me like the offset correction for the decimator is being calculated incorrectly for certain resolutions. For 14 bits, adding the following lines after ADC_Start() fixes the problem, but this is just a workaround.
ADC_DEC_OCOR_REG = 0x44;   /* decimator offset correction, low byte */
ADC_DEC_OCORM_REG = 0x1E;  /* decimator offset correction, middle byte */
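For clarity, the writes go right between ADC_Start() and ADC_StartConvert() in main(), roughly like this (a sketch assuming the component instance is named ADC):

ADC_Start();
/* Workaround: overwrite the decimator offset correction value that the
   component computes incorrectly for the 14-bit, Vssa-to-2*Vref range */
ADC_DEC_OCOR_REG  = 0x44;
ADC_DEC_OCORM_REG = 0x1E;
ADC_StartConvert();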
If you use the Vss-to-Vref range, everything is fine as well, because the offset correction register will always be zero. The way the Vss to 2*Vref (single-ended) range works is that the minus input is connected to the reference voltage. To make the numbers come out correct, an offset of 1/2 full range * n is added in the decimator, where n compensates for the decimator gain. If this weren't done, the ADC would read -1.024 V at Vss, 0 V at Vref, and 1.024 V at 2*Vref. Adding 1.024 V to each of these inputs gives 0 V (0 counts), 1.024 V (8192 counts), and 2.048 V (16384 counts) for Vss, Vref, and 2*Vref respectively, assuming Vref is 1.024 V.
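To make the arithmetic concrete, with the correction in place the 14-bit result is a straight line from 0 counts at Vss to 16384 counts at 2*Vref. A sketch of that mapping (plain arithmetic, not the component's CountsTo_Volts() API):

/* Counts-to-volts mapping described above: 14-bit result,
   Vssa-to-2*Vref single-ended range, Vref = 1.024 V */
static float counts_to_volts_14bit(long counts)
{
    return ((float)counts * 2.048f) / 16384.0f;  /* 0 -> 0 V, 8192 -> 1.024 V, 16384 -> 2.048 V */
}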
I will post a bug report on this issue immediately. I found the same problem for 12, 13, and 15 bits as well.
Mark