I would not initialize the AMux on every pass through the loop - just do that once at startup.
Then allow a little settling time after switching channels, or just throw away
the first sample taken.
Also, you are not using the API to check for conversion complete before
you get the result. Something like this -
ADC_DelSig_1_IsEndConversion(ADC_DelSig_1_WAIT_FOR_RESULT);
output = ADC_DelSig_1_GetResult16();
Or call IsEndConversion() with ADC_DelSig_1_RETURN_STATUS as the test in a while loop, exit as soon as it has completed, and read.
If you are buffering the AMux output through an opamp you also have to allow
for the opamp's slew rate, which you can calculate for a full-scale step.
For noise considerations -
http://www.cypress.com/?rID=39677 AN57821 - PSoC® 3, PSoC 4, and PSoC 5LP Mixed Signal Circuit Board Layout Considerations
http://www.cypress.com/?rID=40247 AN58827 - PSoC® 3 and PSoC 5LP Internal Analog Routing Considerations
http://www.cypress.com/?rID=39974 AN58304 - PSoC® 3 and PSoC 5LP – Pin Selection for Analog Designs
http://www.cypress.com/?id=4&rID=49491 Power measurements for low power modes of PSoC 3/5 on CY8CKIT-001
Thanks for the links. I have done some digging and it seems the ADC_SAR_Seq should do what I am trying to, all in one component.
My question is: since I am trying to sample 8 channels each main loop, is it better to run it continuously or to use the software call - do the conversion, manipulate the data, then stop the unit (put it in low-power mode) until the next pass?
I suppose this implementation takes care of the delays and makes sure each channel is sampled properly.
I tend to use continuous mode and throw away a sample, e.g. take 2
samples per channel and discard the first.
Starting and stopping the DelSig causes the problem of having to flush the
decimator; from the datasheet -
All four ADC modes fully flush the decimator when the ADC initially starts conversions. This ensures that the first reading from the ADC is valid as long as the input voltage is stable before starting conversions with either the ADC_StartConvert() API or when triggered by the “soc” input. Although all modes reset the decimator when starting the ADC, only the continuous mode does not reset the decimator between readings. Because of this, the first reading in continuous mode takes four times longer than the subsequent readings. When using an analog mux to scan between multiple inputs, make sure that the ADC is not running while the input switches are changing. To switch input between samples when using modes other than continuous, use the analog hardware mux.
So the muxing problem alluded to above can be handled by throwing away the first sample.
Now for the hard part. If you are running at high resolution you should do a settling-time calculation.
Search the forum - I think the input model of the DelSig has been discussed, so you can
make that calculation through the entire signal path. You do this to make sure
that the signal settles to within some number of LSBs of error at your target resolution.