16-bit ADC working, switched to 18-bit and it stopped?

sccac_1236541

Hi all,

I previously had a perfectly working 16-bit ADC-to-Filter-to-RAM chain via DMA. I increased it to 18 bits, and now it doesn't work.

Originally, upon switching, the output of the filter was constantly high, as if the HOLDA and HOLDB PTRs weren't being read by the DMA. Then I changed my final array size from 1024 to 512 and it magically started clocking with the ADC's EOC pin. (I am aware of the DMA transfer limit, but if you look at my schematic you'll see that shouldn't have an effect, at least I don't see how.)
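
To be concrete, the filter-to-RAM leg I'm describing looks roughly like this. This is only a minimal sketch, not my actual project: the instance names (DMA_1, Filter), the 3-byte burst, and the byte destination buffer are assumptions; the exact burst size and source pointer for a given bit depth come from the Filter datasheet tables.

    #include <project.h>

    #define SAMPLES 512u

    /* An 18-bit result occupies 3 bytes, packed into a plain byte buffer */
    static uint8 buffer[SAMPLES * 3u];

    static void FilterToRamDmaInit(void)
    {
        uint8 chan;
        uint8 td;

        /* 3 bytes per burst, one request per burst, so each data-ready
           event from the filter moves exactly one complete result */
        chan = DMA_1_DmaInitialize(3u, 1u,
                HI16(CYDEV_PERIPH_BASE), HI16(CYDEV_SRAM_BASE));
        td = CyDmaTdAllocate();

        /* A single TD moves at most 4095 bytes; 512 * 3 = 1536 fits.
           The TD loops on itself so the buffer refills continuously. */
        (void)CyDmaTdSetConfiguration(td, SAMPLES * 3u, td, TD_INC_DST_ADR);
        (void)CyDmaTdSetAddress(td, LO16((uint32)Filter_HOLDA_PTR),
                LO16((uint32)buffer));
        (void)CyDmaChSetInitialTd(chan, td);
        (void)CyDmaChEnable(chan, 1u);
    }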

The problem is that all the values are constant. I have seen this problem before, and I don't remember what I did to make it go away, except that it was something that seemed completely irrelevant, and I think it was.

For this reason, could someone try running this code on their system if they get a chance? I have a feeling it may be my machine, since I'm using VMware on a MacBook Pro.

Any other help is greatly appreciated.

Thanks,

scarlson

odissey1

scarlson, if you use the same DMA configuration for 18-bit as for 16-bit, it won't work.
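
The result width sets the bytes per sample, and the burst size, transfer count, and coherency/dalign settings all follow from it. Roughly (hypothetical instance name DMA_1):

    /* 16-bit: 2 bytes per result */
    chan = DMA_1_DmaInitialize(2u, 1u, HI16(CYDEV_PERIPH_BASE), HI16(CYDEV_SRAM_BASE));

    /* 17- to 23-bit: 3 bytes per result, and the transfer count scales by 3, not 2 */
    chan = DMA_1_DmaInitialize(3u, 1u, HI16(CYDEV_PERIPH_BASE), HI16(CYDEV_SRAM_BASE));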

sccac_1236541

@odissey1,

I thought I did swap the DMA configurations over as per the Filter's datasheet, though. The datasheet seems pretty straightforward: it literally gives you the configuration settings you need for each bit depth, and for whatever you want to do with the data (ADC-to-filter, filter-to-RAM, filter-to-DAC).

I'm pretty certain I reconfigured this properly, since I had configured the 16-bit version correctly using the info from the Filter's datasheet, but I could be wrong.

odissey1

scarlson,

Check the "coherency" settings for the DelSig ADC, and see the Cypress example "20-Bit ADC Data Buffering Using DMA":

http://www.cypress.com/file/134681/download
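
The point of coherency, roughly: the result register is wider than one bus read, so the hardware guards it with a "key" byte and only unlocks the register when that byte is accessed. Set the key to the byte read last, so the value can't change mid-transfer. A minimal sketch, assuming your instance is named ADC_DelSig_1:

    ADC_DelSig_1_Start();
    /* The high byte is read last by the DMA, so make it the coherency key */
    ADC_DelSig_1_SetCoherency(ADC_DelSig_1_COHER_HIGH);
    ADC_DelSig_1_StartConvert();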

sccac_1236541

I'm pretty sure I am doing this. Below is my code:

    /* Use the high byte as the coherency key and disable data alignment
       so the DMA can read the full-width results */
    ADC_DelSig_1_SetCoherency(ADC_DelSig_1_COHER_HIGH);

    /* Channel A */
    Filter_SetDalign(Filter_STAGEA_DALIGN, Filter_DISABLED);
    Filter_SetDalign(Filter_HOLDA_DALIGN, Filter_DISABLED);
    Filter_SetCoherency(Filter_STAGEA_COHER, Filter_KEY_HIGH);
    Filter_SetCoherency(Filter_HOLDA_COHER, Filter_KEY_HIGH);
    Filter_SetCoherency(Filter_CHANNEL_A, Filter_KEY_HIGH);

    /* Channel B */
    Filter_SetDalign(Filter_STAGEB_DALIGN, Filter_DISABLED);
    Filter_SetDalign(Filter_HOLDB_DALIGN, Filter_DISABLED);
    Filter_SetCoherency(Filter_STAGEB_COHER, Filter_KEY_HIGH);
    Filter_SetCoherency(Filter_HOLDB_COHER, Filter_KEY_HIGH);
    Filter_SetCoherency(Filter_CHANNEL_B, Filter_KEY_HIGH);

I had similar commands for the 16-bit case, except with Filter_KEY_MID, Dalign enabled, and ADC_DelSig_1_COHER_HIGH.

Although the example gives a slightly different command:

    ADC_DelSig_DEC_COHER_REG |= ADC_DelSig_DEC_SAMP_KEY_HIGH;

I guess I can give this one a try this afternoon. I would imagine they do the same thing, but obviously I don't know.
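
If they really are equivalent, the two forms would sit side by side like this; my guess is the API call is just a wrapper over the same DEC_COHER register, but I haven't checked the generated source:

    /* API form (instance-prefixed) */
    ADC_DelSig_1_SetCoherency(ADC_DelSig_1_COHER_HIGH);

    /* Direct-register form from the example */
    ADC_DelSig_DEC_COHER_REG |= ADC_DelSig_DEC_SAMP_KEY_HIGH;

One difference I can see: the |= only sets bits, so any previously selected key bits stay set, whereas a wrapper that masks the field first would replace them.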

Thanks,

scarlson

sccac_1236541

So I found out what the problem was... although I would love it if someone could explain it to me.

I wasn't including 

    ADC_DelSig_1_IRQ_Disable();

Now at least I'm getting data... it's not at all correct, but at least it's not constant anymore.

I didn't need it in the 16-bit case, but now I do I guess?
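
My best guess at why, for anyone who can confirm it: the component's internal ISR also touches the result registers (that's presumably how GetResult32() and IsEndConversion() get their data), so with the IRQ enabled both the CPU and the DMA were reading the coherency-keyed registers and the key was being consumed by the wrong reader. Disabling the component IRQ leaves the DMA as the only reader, i.e., something like:

    ADC_DelSig_1_Start();
    ADC_DelSig_1_IRQ_Disable();   /* keep the internal ISR off the result registers */
    ADC_DelSig_1_StartConvert();  /* the DMA alone drains the decimator output */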

One problem down, 99 more to go.
