I am using a standard TIMER block to take samples of an AC waveform at set time intervals for an RMS voltage measurement, so I need to ensure I take exactly 32 samples over one half cycle of the incoming AC signal; the TIMER sets the sample period. Depending on the frequency of the input AC signal (either 50Hz or 60Hz), I need to change the value of the Period buffer in my timer - at 50Hz the samples must be taken slightly further apart than at 60Hz.
In my code, at first power-on, I calculate the frequency of the incoming AC signal, then I write an appropriate value (FreqVal) into the Period buffer of my timer using the command Timer_WritePeriod(FreqVal). This appears to work correctly.
Then, when I need to start my timer, I use the command Timer_Start(). That all seemed to work...or so I thought.
The issue, it seems, is that rather than using the value of FreqVal that I wrote into the Period buffer earlier, the first call to Timer_Start() loads the default Period value from the component customizer, effectively overwriting my earlier Timer_WritePeriod(FreqVal) call.
In reading the datasheet for this component, that is actually what it is supposed to do:
Description: Initializes the TCPWM with default customizer values when called the first time and enables the TCPWM. For subsequent calls the configuration is left unchanged and the component is simply enabled
But I don't want it to do that. I want it to use the FreqVal value I wrote in there earlier in my code each time I start it (which I do about every 10 seconds during operation). But because I start the Timer for the first time after I have written the updated value of FreqVal into the Timer Period, it just reverts to the preset value and my RMS calcs are all wrong!
I did try changing all references to Timer_Start() to Timer_Enable(), as it appears this latter command doesn't load the default Period value from the customizer. However, my code started to behave erratically - basically it timed out on the Watchdog timer, so I suspect my Timer was never actually triggering an interrupt. I didn't look into this much further.
The only way I have been able to get things to work correctly is to call Timer_Start() first and then immediately call Timer_Stop(), so that the initial set-up using the default value is completed. Then I call Timer_WritePeriod(FreqVal) to set the Timer Period, and all subsequent Timer_Start() calls run the timer using FreqVal for the period.
This seems a bit messy to me, so I was wondering what the correct way to initialise a Timer is, so that whatever value I write into the Period buffer will be the value it uses to set its period.