I previously had a perfectly working 16-bit ADC-to-filter-to-RAM path via DMA. I increased the resolution to 18 bits, and now it doesn't work.
Originally, after switching, the output of the filter was constantly high, as if the HOLDA and HOLDB pointers weren't being read by the DMA. Then I changed my final array size from 1024 to 512 and it magically started clocking with the ADC's EOC pin (I am aware of the DMA transfer limit, but if you look at my schematic you'll see that shouldn't have an effect; at least I don't see how).
The problem now is that all the values are constant. I have seen this problem before, and I don't remember what I did to make it go away, except that it seemed completely irrelevant, and I think it was.
For this reason, could someone try running this code on their system if they get a chance? I have a feeling it may be my machine, since I'm running VMware on a MacBook Pro.
I thought I had swapped the DMA configurations around as per the filter's datasheet, though. The datasheet seems pretty straightforward: it literally gives you the configuration settings you need for each bit depth and for what you want to do with the data (ADC-to-filter, filter-to-RAM, filter-to-DAC).
I'm pretty certain I reconfigured this properly, since I had configured the 16-bit case correctly using the info from the filter's datasheet, but I could be wrong.