I am just starting with the CY8CKIT-050 Development Kit.
I am using two SAR ADCs (sampling frequency 100 kHz) and do some time-critical calculation in a high-priority interrupt service routine (priority 0). Everything works fine until I add an additional interrupt service (rising edge at 10 Hz, priority 7) that sets a flag for further processing. This further processing is done in the main loop and sends some data to the display.
When the low-level interrupt is enabled, I measure jitter (about 1.3 us) in the signal processing in the high-priority interrupt service routine.
I measure this by toggling a bit on a digital output at the end of the routine. The service routine is triggered by the EOC (end of conversion) of one ADC and needs about 4 us for the signal processing (without code optimization).
The reason for this jitter is the code in the main loop (LCD functions):
    /* In the 10 Hz ISR: */
    Disp_Flag = 1;

    int main(void)
    {
        ...
        if (Disp_Flag == 1)
        {
            // Doing nothing for testing!
            phi_disp = (phi_LUT_int * (uint64)360000) >> 16; // for P = 2
            sprintf(displayStr, "%4d %4d ", (int)voltCount_sin, (int)voltCount_cos);
            result = sprintf(displayStr, "%6d mGrad", (int)phi_disp);
            Disp_Flag = 0;
        }
        ...
    }
There is also a Jitter in the signal processing, depending on the input signal values and the adjustment of code optimization. But with constant input values (ignoring the ADC-Data) this jitter is not existing.
Could you give me some hints for solving this problem?