9 Replies Latest reply on Jun 14, 2019 4:50 AM by MiCo_4221156

    ADC SAR & ADC Delsig reading deviation

    MiCo_4221156

      Hi there, I am reading a voltage from a potentiometer and feeding it to both an ADC SAR (12-bit resolution) and an ADC DelSig (20-bit resolution).

      • When I compare the values to my multimeter I see a deviation of 0.2 V (I'm using a Fluke 287). I am not sure if I used the APIs correctly, or if I made some other mistake.


      #include "project.h"

      unsigned int a = 0; // a holds the SAR result in mV (global)
      unsigned int b = 0; // b holds the DelSig result in mV (global)

      int main(void)
      {
          CyGlobalIntEnable; /* Enable global interrupts. */

          LCD_Start();
          ADC_SAR_Start();
          ADC_DS_Start();
          ADC_DS_StartConvert();
          ADC_SAR_StartConvert();

          LCD_Position(0u, 0u);
          LCD_PrintString("V SAR:");
          LCD_Position(1u, 0u);
          LCD_PrintString("V DS:");

          for(;;)
          {
              LCD_Position(0u, 13u);
              LCD_PrintString("mV");
              LCD_Position(1u, 13u);
              LCD_PrintString("mV");

              if (ADC_SAR_IsEndConversion(ADC_SAR_RETURN_STATUS)) // conversion finished?
              {
                  a = ADC_SAR_CountsTo_mVolts(ADC_SAR_GetResult16());

                  LCD_Position(0u, 8u);
                  LCD_PrintNumber(a);
              }

              if (ADC_DS_IsEndConversion(ADC_DS_RETURN_STATUS)) // conversion finished?
              {
                  b = ADC_DS_CountsTo_mVolts(ADC_DS_GetResult32());

                  LCD_Position(1u, 6u);
                  LCD_PrintNumber(b);
              }
          }
      }

       

       

      thank you,

       

      Michael

        • 1. Re: ADC SAR & ADC Delsig reading deviation
          LePo_1062026

          Michael,

           

          What is the deviation between the SAR and DS ADC mV values?

           

          Note: The x_CountsTo_mVolts() conversions for the SAR and DS ADCs compensate for the Vdda value set in the System DWR panel if your Vref for the ADC is Vdda-based.  Vrefs based on the internal 1.024 V reference effectively use simple constant calculations.

           

          There are x_SetGain() and x_SetOffset() API calls that can be used to apply gain and offset compensation to the calculations in the x_CountsTo_mVolts() API call.  I believe the default gain = 1 and offset = 0.  Using SetGain() and SetOffset() allows for a one-time calibration phase (the values are then stored in and read back from EEPROM) or, if you want to get really tricky, a run-time calibration phase to further compensate for temperature effects.
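          For reference, a counts-to-millivolts conversion against a fixed reference reduces to a constant scale factor. The sketch below is illustrative only, not the code PSoC Creator generates (which also folds in the component's gain and offset registers); vref_mv and resolution are example parameters, not component settings:

          ```c
          #include <stdint.h>

          /* Illustrative sketch of the conversion math:
           * mV = counts * Vref(mV) / full-scale counts. */
          static int32_t counts_to_mvolts(int32_t counts, int32_t vref_mv, uint8_t resolution)
          {
              int32_t full_scale = ((int32_t)1 << resolution) - 1;
              /* 64-bit intermediate avoids overflow at 20-bit resolution */
              return (int32_t)(((int64_t)counts * vref_mv) / full_scale);
          }
          ```

          The point is that a gain or offset error anywhere in the signal path scales straight through this formula, which is why the SetGain()/SetOffset() hooks exist.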

           

          In general, I believe what you might be seeing is a gain and/or offset issue.  However, you might also want to check your multimeter: when was your meter last calibrated?  You're only as good as your reference.

           

          Len

          • 2. Re: ADC SAR & ADC Delsig reading deviation
            MiCo_4221156

            Hi Len! Again, thank you for the quick reply!

             

            The deviation between the SAR and DS ADCs is around 0.03-0.05 V.

             

            What does "one-time calibration" mean?

             

            And I will run another test with a better scope when I'm at the lab again!

             

            thank you

             

            Michael

            • 3. Re: ADC SAR & ADC Delsig reading deviation
              LePo_1062026

              Michael,

               

              A "one-time calibration" is a calibration cycle usually performed manually, by a technician trained in NIST best practices, when the device is first assembled and tested.  Manual calibrations can also be repeated periodically (NIST recommends at least once a year) to ensure the calibration has not drifted too much due to component aging or stress.  For this type of calibration, good reference equipment must be used.  Your calibration is only as good as your reference.

               

              A run-time calibration is much more difficult to design.  For one thing, your references now need to exist on-board, and even those references might drift over time and need periodic calibration.  The advantage of a run-time calibration is that the equipment can self-calibrate from the internal references as needed.  For example, much high-end test equipment will self-calibrate at power-up and/or after a specified time once the equipment temperature stabilizes.

               

              Run-time calibrations are not common because of the time it takes to perform the cycle, especially if the equipment is actively in use.

               

              Len

              • 4. Re: ADC SAR & ADC Delsig reading deviation
                MiCo_4221156

                Thank you for your answer, Len!

                 

                I have not gotten into the lab yet because we are starting finals this semester, which is why I have not yet tried the info you gave me. I'll try it soon.

                 

                thank you,

                 

                Michael

                • 5. Re: ADC SAR & ADC Delsig reading deviation
                  MiCo_4221156

                  Hi Len, sorry for bugging you again.

                   

                  I tried to use the functions you recommended, but I ran into a couple of problems.

                  • The first one - SetOffset() only accepts integers, while I need 0.2.
                  • For the SetGain() API the datasheet says: "It should only be used to further calibrate the ADC with a known input or if the ADC is using an external reference." Neither of those conditions applies in my case.

                   

                  • The second one - when I measure between 0 V and 1 V I get numbers in 10^(-4); for example, when the multimeter shows 0.37 I get 3862 on the PSoC instead of 0368.

                   

                  • The third one - I just noticed that the deviation gets bigger as the voltage rises, as shown in the table below, which makes the calibration more complex.

                   

                   

                  Capture.PNG

                   

                   

                  (I am using the same code, and I did check the calibration of my multimeter, as you suggested.)

                  I got a little bit lost with these problems, especially the third one.

                   

                  thank you!

                  Michael

                  • 6. Re: ADC SAR & ADC Delsig reading deviation
                    LePo_1062026

                    Michael,

                     

                    I'll try to unravel your many questions.  Let's start with 3).

                    3) The deviation gets larger when voltage gets bigger.

                    The issue is that you need to adjust for the gain of your input circuit path.  This could be due to the tolerances of your external components, or to Opamps or input gain on the ADC if you are using them.

                     

                    In the data above, if you assume your multimeter is your voltage reference, then the PSoC SAR value needs to be multiplied by 0.9502 and the PSoC DS value by 0.9549.

                     

                    If you are familiar with using the SetGain() function, you can achieve this.  I usually prefer to use the gain correction afterwards and store this gain factor in EEPROM.
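                    One way to apply such a factor after the conversion is shown below. This is a minimal sketch; apply_gain_mv is my own example name, not a PSoC API, and the gain would normally come from EEPROM rather than a literal:

                    ```c
                    #include <stdint.h>

                    /* Post-conversion gain correction sketch. The ~0.95 factors come from
                     * comparing the multimeter and PSoC readings above; in a real design
                     * the factor is determined once at calibration time. */
                    static int32_t apply_gain_mv(int32_t raw_mv, float gain)
                    {
                        return (int32_t)((float)raw_mv * gain + 0.5f); /* round to nearest mV */
                    }
                    ```

                    For example, apply_gain_mv(ADC_SAR_CountsTo_mVolts(ADC_SAR_GetResult16()), 0.9502f) would apply the SAR correction above.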

                     

                    2) I'm confused.  You state: "when i measuring between 0V-1V i get the numbers in 10^(-4) for example - when Multimeter shows 0.37 i get 3862 on the Psoc instead of 0368."  How are you getting PSoC values with 10^(-4)?  GetResult32() provides values in ADC counts; ADC_DS_CountsTo_mVolts() gives mV [10^(-3)] and ADC_DS_CountsTo_uVolts() gives uV [10^(-6)].

                     

                    For your ADC configuration are you using Alignment = Left or = Right?  I normally recommend Alignment = Right.

                     

                    1) SetOffset() and SetGain() only take integers because they operate on ADC counts, before the conversion to mV or uV.  I tend to prefer post-conversion gain and offset calculations instead.

                     

                    To perform an offset calibration, you only need one point.  Assuming the response is a straight line across the full-scale range, you can pick any point; some people use a 0.0 V (grounded) input reading.  The result is the offset.

                     

                    To perform a gain calibration, you need two points.  Preferably, readings near the minimum and maximum of the full-scale range (FSR) are used.

                     

                    I used your supplied data to calculate the gain corrections for the SAR and DS and have attached a .xlsx file for you.
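                    The one-point offset and two-point gain procedure described above can be sketched in plain C. This is my own example, not code from the thread: raw_lo/raw_hi stand for uncompensated ADC readings in mV and ref_lo/ref_hi for the corresponding multimeter readings:

                    ```c
                    #include <stdint.h>

                    typedef struct {
                        float gain;   /* multiplicative correction */
                        float offset; /* additive correction, in mV */
                    } cal_t;

                    /* Two-point calibration: fit a line through (raw_lo, ref_lo)
                     * and (raw_hi, ref_hi). */
                    static cal_t calibrate_two_point(int32_t raw_lo, int32_t ref_lo,
                                                     int32_t raw_hi, int32_t ref_hi)
                    {
                        cal_t c;
                        c.gain   = (float)(ref_hi - ref_lo) / (float)(raw_hi - raw_lo);
                        c.offset = (float)ref_lo - c.gain * (float)raw_lo;
                        return c;
                    }

                    /* Apply the correction to a raw mV reading. */
                    static int32_t cal_apply(const cal_t *c, int32_t raw_mv)
                    {
                        return (int32_t)(c->gain * (float)raw_mv + c->offset + 0.5f);
                    }
                    ```

                    With only a gain error (offset near zero), this reduces to the single multiplication factors mentioned above.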

                     

                    Len

                    • 7. Re: ADC SAR & ADC Delsig reading deviation
                      MiCo_4221156

                      Hi Len thank you for your elaborate answer!

                       

                      1. I used the numbers that you calculated, and indeed they solved my main problem. I have a couple of questions about that method:

                       

                      • I assume that for every circuit I connect to the ADC I will have to calibrate it the same way you did. Is the only way to do it manually (write down some values and find the best scaling factor)? My reference will be a multimeter and the measurement is an analog voltage. (Measuring relative to ground will not be accurate, since the deviation grows with voltage.)

                       

                      • I am not familiar with using SetGain() but I'll check it out, as well as using the EEPROM. For now, I declare a float and multiply it by the ADC value. Is this a bad way to do it (memory efficiency, etc.)?

                      2. 

                      I am using Right alignment; I am confused too. Here is a photo that may clear up what I mean: IMG_20190605_120546.jpg

                         The multimeter shows 0.57 V.

                       

                      When the voltage is greater than 1 V the reading is good (using the scale factor you mentioned).

                       

                       

                      again, thank you very much!

                       

                      Michael

                      • 8. Re: ADC SAR & ADC Delsig reading deviation
                        LePo_1062026

                        Michael,

                         

                        Let me answer question 2 first:  The problem is in your LCD_Position() calls.  You are left-aligning your display.  Try LCD_Position(0u, 1u); and LCD_Position(1u, 1u); instead.
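                        As an aside (my own suggestion, not something stated in the thread): if, as I believe, LCD_PrintNumber() prints without leading zeros, a short reading printed over a longer stale one can leave old digits on screen (0368 shown as 3862). Formatting into a fixed-width, zero-padded buffer and printing that with LCD_PrintString() avoids it:

                        ```c
                        #include <stdio.h>

                        /* Format a millivolt reading into a 4-character, zero-padded field
                         * so shorter numbers fully overwrite longer ones on the LCD. */
                        static void format_mv(char *buf, size_t len, unsigned int mv)
                        {
                            snprintf(buf, len, "%04u", mv); /* 368 -> "0368", 3862 -> "3862" */
                        }
                        ```

                        Usage: format_mv(buf, sizeof buf, a); then LCD_PrintString(buf) instead of LCD_PrintNumber(a).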

                         

                        Question 1:

                        If you perform a manual gain calculation, you can hardcode the gain compensation factor in FLASH and don't need EEPROM.  However, if you are going to build many of these designs, recompiling with new FLASH compensation factors for every board could be a problem.

                        In the past, I've performed gain and offset compensation by having a special code path that enters a calibration mode. Here is the general sequence of operations:

                        • I forced a known minimum voltage value at the input.  (near the Bottom end of the range to be measured.)
                        • I entered the value I received on my reference measurement tool (in this case a multimeter) into the communication to the PSoC UART.
                        • Both the minimum reference value entered and the uncompensated raw ADC value were stored in RAM.
                        • I forced a known maximum voltage value at the input. (near the Top end of the range to be measured.)
                        • I entered the value I received on my reference measurement tool (in this case a multimeter) into the communication to the PSoC UART.
                        • Both the maximum reference value entered and the uncompensated raw ADC value were stored in RAM.
                        • I then performed in SW the gain and offset calculations similar to those performed in the spreadsheet I sent you.
                        • I stored the gain and offset values (as floats) in EEPROM.
                        • Further ADC readings now use the EEPROM stored gain and offset.
                        • If a reset occurs, the gain and offset is reloaded from EEPROM into RAM.

                        The above method does not use the SetGain() function; your SW compensates in an external function.  SetGain() and SetOffset() are intended to be called once at initialization time, so that the CountsTo_mVolts() functions automatically perform the gain and offset compensation.  Note: to use SetGain() and SetOffset() you still need to perform the sequence shown above to determine and store the compensation values in EEPROM.
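                        The sequence above might be sketched as follows. This is my own illustration: the eeprom_save()/eeprom_load() helpers are placeholders standing in for the PSoC EEPROM component API, and here they just copy to a RAM buffer so the sketch stays self-contained:

                        ```c
                        #include <stdint.h>
                        #include <string.h>

                        typedef struct {
                            float gain;
                            float offset;
                        } cal_store_t;

                        /* Placeholder "EEPROM": a RAM buffer for this sketch. */
                        static uint8_t fake_eeprom[sizeof(cal_store_t)];

                        static void eeprom_save(const cal_store_t *c)
                        {
                            memcpy(fake_eeprom, c, sizeof(*c));
                        }

                        static void eeprom_load(cal_store_t *c)
                        {
                            memcpy(c, fake_eeprom, sizeof(*c));
                        }

                        /* Calibration mode: take the two reference points (raw ADC mV and
                         * multimeter mV), compute gain/offset, and persist them. */
                        static void calibration_cycle(int32_t raw_lo, int32_t ref_lo,
                                                      int32_t raw_hi, int32_t ref_hi)
                        {
                            cal_store_t c;
                            c.gain   = (float)(ref_hi - ref_lo) / (float)(raw_hi - raw_lo);
                            c.offset = (float)ref_lo - c.gain * (float)raw_lo;
                            eeprom_save(&c);
                        }

                        /* Normal operation (and after any reset): reload the stored
                         * values and compensate each reading. */
                        static int32_t read_compensated(int32_t raw_mv)
                        {
                            cal_store_t c;
                            eeprom_load(&c);
                            return (int32_t)(c.gain * (float)raw_mv + c.offset + 0.5f);
                        }
                        ```

                        In a real PSoC design the two helpers would call the EEPROM component's API instead of memcpy, and eeprom_load() would typically run once at startup rather than on every reading.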

                         

                        Len

                        • 9. Re: ADC SAR & ADC Delsig reading deviation
                          MiCo_4221156

                          Thank you very much, Len!