In a few designs I have noticed a glitch in the ADC that causes it to return weird, off-scale values in certain low-voltage ranges (0.08 to 0.09 V in this case; there is another range a bit higher up). This glitch affects all readings from the SAR ADC if any one input is in this range.
Edit: This happens on the CY8C5888CTI-LP097 / CY8CKIT-059.
Today I took the time to find and map those voltage ranges, using only the Delta-Sigma ADC. The choice of reference changes the glitch slightly, but it stays within the same voltage range. I will try to hook up a 16-bit voltage source to capture the other range where this happens (it is much smaller).
As a simple strategy for testing this phenomenon, I connected an 8-bit DAC to a 16-bit Delta-Sigma converter. The glitch does also happen with an external input, though.
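For anyone reproducing this, here is a quick sketch (Python, for illustration only) of which 8-bit DAC codes should land in the suspect 0.08-0.09 V window, assuming the VDAC8's 1.020 V range gives an ideal 4 mV per LSB (1.020 / 255):

```python
# Which 8-bit DAC codes land in the suspect 0.08-0.09 V window?
# Assumption: ideal VDAC8 transfer in the 1.020 V range, 4 mV per LSB.
LSB = 1.020 / 255  # volts per DAC code (~4 mV)

suspect = [code for code in range(256)
           if 0.08 <= code * LSB <= 0.09]
print(suspect)  # DAC codes whose ideal output falls in the glitch window
```

So only a handful of DAC codes sweep through the window, which is why the 16-bit external source mentioned above should resolve it much more finely.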
The ADC data and control data for the DAC were sent over USB using the CDC driver.
The DAC is set up for 0-1.020 V, low-speed mode.
The ADC is set up for a Vssa-to-2.048 V range, internal 1.024 V VREF, bypassed buffer. I have also mapped the other buffer modes and one run with a different reference.
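To relate the CSV voltages back to raw codes, here is the idealized straight-line conversion I assume for this config (on the device, the generated `ADC_DelSig_CountsTo_Volts()` API does the real conversion, including calibration), plus the glitch window expressed in raw 16-bit counts:

```python
def counts_to_volts(counts, vfs=2.048, bits=16):
    """Idealized transfer for the Vssa-to-2.048 V, 16-bit config:
    code 0 -> 0 V, full code -> vfs. Assumption for analysis only;
    the on-device API includes gain/offset calibration."""
    return counts * vfs / ((1 << bits) - 1)

# The 0.08-0.09 V glitch window in raw 16-bit counts:
lo = round(0.08 * 65535 / 2.048)   # ~2560
hi = round(0.09 * 65535 / 2.048)   # ~2880
print(lo, hi)
```

Note that the window straddles codes right around 0x0A00, i.e. where only a couple of high bits are set, which may matter for the bit-error theory below.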
Attached are the raw CSV file (voltage vs. time), the source code, and images of the graphed data.
Edit: Added data for a 2-channel SAR ADC with a 14-bit signal fed into the channels. It looks like a possible bit error. This also shows how the glitch affects the whole device.
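A quick sanity check on the bit-error theory: if a single bit in the SAR result flips, the reading jumps by 2^k counts. Assuming a 12-bit SAR result over a 0-2.048 V range (my assumption, matching the Delta-Sigma range above), the jump sizes look like this:

```python
def bit_flip_jump_mv(k, vfs=2.048, bits=12):
    """Size (in mV) of the step caused by flipping bit k of the ADC
    result, assuming an ideal 12-bit SAR over 0-2.048 V."""
    return (1 << k) * vfs / ((1 << bits) - 1) * 1000

# Flipping a high bit (e.g. bit 11) would move the reading by ~1 V,
# which would look exactly like the off-scale values in the plots.
for k in range(12):
    print(f"bit {k:2d}: jump of {bit_flip_jump_mv(k):7.1f} mV")
```

An upper-bit flip of ~1 V would be consistent with readings going wildly off scale, while a mid-bit flip would produce the smaller excursions in the second range.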
Thanks, can't wait to hear your feedback.