6 Replies Latest reply on Apr 23, 2014 3:35 PM by user_14586677

    Delta Sigma ADC Accuracy

    kevin.fries

  I am working on a project that measures the analog output of a solar radiation sensor. I am a novice and am confused by the various settings for the Delta Sigma ADC. Depending on the buffer mode, I get wildly different readings, even with the gain set to 1. Bypassing the buffer gives me the most reasonable readings, but they are still not accurate: with a voltmeter I measure around 1500 microvolts, yet the ADC reports only about 600 microvolts. Changing the conversion mode made no difference to the readings, and dropping the sampling rate of course only slowed the update interval.

         

      Has anyone run into issues like this when using the Delta Sigma ADC at low amplitudes? The ADC works just fine when I connect the pins to a much larger external voltage source (1.5 V instead of a millivolt or so), so I suspect the ADC is struggling with the low magnitude of the signal I'm measuring.

      I've attached a screenshot of my configuration, and my sampling code is below; it's not much different from the example project in Creator.

         

      int32 output = 0;
      uint8 i;

      ADC_DelSig_Start();
      ADC_DelSig_SelectConfiguration(ADC_DelSig_CFG1, 1u);
      ADC_DelSig_StartConvert();
      CyDelay(100u);

      for (i = 0; i < 100; i++)
      {
          /* WAIT_FOR_RESULT blocks until the current conversion completes */
          if (ADC_DelSig_IsEndConversion(ADC_DelSig_WAIT_FOR_RESULT))
          {
              output = ADC_DelSig_CountsTo_uVolts(ADC_DelSig_GetResult32());
              break;
          }
          CyDelay(5u);
      }

      ADC_DelSig_StopConvert();