I've googled and found lots of questions from people trying to measure frequency. Every answer starts with "what frequencies?", and the next request is "include your archive". OK, so the archive is attached. In the event more than one project is in it, I'm using "ADC_Differential_Preamplifier01.cydsm" and no other projects in the workspace.
Platform is CY8CKIT-049-42xx. I successfully implemented the example project ADC_DiffPreamplifier.pdf:
Here's the circuit implemented from the example:
The difference is that I implemented Rg as the round gray variable resistor to the right of the breadboard. The input is a cheap electret condenser microphone from Amazon. My R1 and R2 are 99xx ohms, so very close to 10k. Anecdotally, Rg is somewhere near 3k.
Notice on the o-scope, there's a 500 mV p-p signal. Yeah, that's 60 Hz, with an additional signal riding on it that has about a 1.25-second period and roughly half its amplitude. Thankfully, the microphone seems perfectly suited to this dual op-amp differential amplifier (referred to as the first half of an instrumentation amplifier).
So what frequencies?
80 to 120 Hz with great resolution. 65 to 140 Hz is the total range I care about, and definitely nothing below 65... I desire to rid this world of 60 Hz. I'd really like to know when the primary frequency heard is near 100 or 110 Hz, and to be able to differentiate to within 1 Hz all around there.
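From what I've read (and please correct me if I have this wrong), FFT frequency resolution is the sample rate divided by the number of points, so hitting 1 Hz resolution pins down how many samples I'd need. The sample rate below is just a number I picked for the arithmetic, not anything from the example project:

```python
# Assumption: FFT bin spacing = fs / N.
# For 1 Hz resolution, that forces N >= fs.
fs = 1000  # assumed sample rate in Hz, comfortably above 2 * 140 Hz (Nyquist)
for n in (64, 256, 1024):
    print(f"N={n:5d}  bin spacing = {fs / n:.3f} Hz")
```

If that's right, a "dozens of points" FFT at any reasonable sample rate gives bins far coarser than 1 Hz, which matches my suspicion below.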
From all that I've read, it would seem most people are focused on higher frequencies, and the sampling in the example seems MUCH too frequent. Most software FFTs seem to use dozens of points, while it would seem I need hundreds if not thousands of points to effectively resolve frequencies around 100 Hz.
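One thing I ran across while researching (not from the example project, and the sample rate and function below are my own guesses at a sketch) is the Goertzel algorithm, which computes the power at a handful of target frequencies without running a full FFT. Since I only care about roughly 65-140 Hz in 1 Hz steps, that might sidestep the need for a huge FFT on a little PSoC 4:

```python
import math

def goertzel_power(samples, fs, f_target):
    """Power at one frequency bin via the Goertzel algorithm.
    fs and f_target are in Hz; f_target is snapped to the nearest fs/N bin."""
    n = len(samples)
    k = round(n * f_target / fs)                  # nearest integer bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2          # 2nd-order IIR resonator
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

# Fake a 100 Hz tone sampled at 1024 Hz for one second (N = 1024 -> 1 Hz bins)
fs, n = 1024, 1024
tone = [math.sin(2.0 * math.pi * 100.0 * i / fs) for i in range(n)]
for f in (60.0, 99.0, 100.0, 101.0):
    print(f"{f:5.0f} Hz -> power {goertzel_power(tone, fs, f):.3f}")
```

The appeal, if I understand it, is that each target frequency costs one multiply-add per sample, so sweeping the 75-ish bins I care about stays cheap compared to a 1024-point FFT, and anything near 60 Hz can simply be ignored. No idea yet whether this is the right tool; it's just where my reading has led.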
I'm a total noob to this stuff, but I can't suck too bad if I've managed to coax a CY8CKIT-049-42xx to do this much. I'd appreciate any and all advice, pointers, and clues about how to accomplish this mission via software, logic in the PSoC, or both. I'll keep researching...