Well first, between your two projects, the VDACs are configured differently. This accounts for the difference in the amplitude of the signals.
If the issue is the sample rate, you could slow down i_L and both should behave the same. That said, I suspect there's something else going on here.
So I suspect you're getting cross-sample contamination. If you look, the actual samples per second are very similar between the two converters.
I think you need to set this to multi-sample mode and call StopConvert() in an interrupt tied to the EOS (end of sample) output of the DelSig.
I haven't tested this yet.
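For what it's worth, here's a rough sketch of what I mean, assuming a DelSig instance named ADC and an interrupt component isr_eos wired to the ADC's eos terminal (those names are placeholders; use whatever your schematic has). Untested, same caveat as above:

```c
/* Sketch: stop/restart the DelSig on each EOS interrupt so the
 * modulator and filters reset cleanly between samples.
 * "ADC" and "isr_eos" are assumed instance names from the schematic. */
#include "project.h"   /* PSoC Creator generated API */

volatile int16 latest_sample;

CY_ISR(eos_handler)
{
    ADC_StopConvert();                 /* halt so state resets before next sample */
    latest_sample = ADC_GetResult16(); /* grab the completed conversion */
    ADC_StartConvert();                /* kick off the next one */
}

int main(void)
{
    CyGlobalIntEnable;
    isr_eos_StartEx(eos_handler);      /* register the EOS handler */
    ADC_Start();                       /* multi-sample mode set in the component */
    ADC_StartConvert();
    for(;;)
    {
        /* control loop reads latest_sample */
    }
}
```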
I will have access to my setup again tomorrow, so I will try some things then. As for the frequency of both signals: the exact value is not critical for now, but they do need to have the same frequency, as my project requires this.
Regarding the StopConvert, I would really like to keep the sampling in hardware. In the end, my processor only has to calculate some variables, and the control itself is executed in hardware. If I start adding stuff in software, I am afraid delays will prevent me from getting the most out of my project. So, do you think it is possible to perform a similar task in hardware?
I understand; I was just going by the documentation:
Multi-sample mode captures single samples back to back, resetting itself and the modulator between each sample automatically. This mode is useful when the input is switched between multiple signals. The filters are flushed between each sample so previous samples do not affect the current conversion.
Note: Take care when switching signals between ADC conversions. Either switch the input quickly between conversions with hardware control, or stop the ADC conversion (ADC_StopConvert()) while switching the input. Then restart the ADC conversion (ADC_StartConvert()) after the new signal has been connected to the ADC. Failure to do this may result in contamination between signals in the ADC results.
I guess since you're not really changing signals (just relying on the flushing behavior) you probably don't need to call StopConvert and StartConvert.
Oh wait, I forgot something: you'll need to call StopConvert() to prevent the DMA from updating your DAC while you switch.
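Something like this is the pattern the datasheet describes, I believe. A minimal sketch, assuming the DelSig instance is named ADC and the input switching happens in a hypothetical helper (the helper name and the mux details are not from your project, just placeholders):

```c
/* Sketch of the documented switch pattern: stop conversions (which also
 * stops the EOC-triggered DMA from feeding the DAC), change the input,
 * then restart. "ADC" and select_next_input() are assumed names. */
ADC_StopConvert();        /* no new results -> no DMA updates to the DAC */
select_next_input();      /* placeholder: reconnect/mux the new signal here */
ADC_StartConvert();       /* resume once the new signal has settled */
```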
I found the cause of my issues, though how it produced the sampling problems is still unclear to me. Once I changed the output of the VDAC to the 4 V range and the clock to 6 MHz, I was able to measure the Vc signal based on the comparator operation on the i_L signal. So, somehow the clock speed of the comparator and the edge detector determines the performance of the DelSig, while for the SAR the clock speed does not influence the performance. I will post some scope plots later.