Hello Cypress community!
I recently started experimenting with the CY8CKIT-042-BLE (CY8C4247LQI-BLE, to be specific) for a small R&D project in the office. Without going into too much detail, we're trying to accurately measure the resistance of a variable resistor. In particular, I'm measuring the resistance of 9 separate resistors in this application.
I've attached the project file I've managed to cobble together, which seems to work at the moment (please excuse the poorly written code and badly configured diagrams; I'm still new here). I've declared the analog pins that one end of each sensor connects to, while the grounded ends of all the sensors are tied together to GND on J1. The TopDesign.cysch routes the sensors through an AMux; the IDAC injects a small, known current into the selected analog pin, and the ADC measures the voltage drop across that resistor. From the known current and measured voltage, I can use Ohm's law (V = IR) to calculate the resistance of the sensor at that point in time.
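For reference, the per-channel math is just Ohm's law. A minimal sketch of the conversion I'm doing (the 3.3 V reference, 12-bit full scale, and 200 µA IDAC current here are illustrative placeholders, not necessarily the values in the attached project):

```c
#include <stdint.h>

#define VREF_VOLTS      3.3     /* assumed ADC reference (VDDA)      */
#define ADC_FULL_SCALE  4096.0  /* assumed 12-bit single-ended range */
#define IDAC_AMPS       200e-6  /* assumed IDAC output current       */

/* Convert a raw ADC reading to the voltage at the pin. */
static double counts_to_volts(int32_t counts)
{
    return (double)counts * VREF_VOLTS / ADC_FULL_SCALE;
}

/* R = V / I: resistance of the sensor on the selected AMux channel. */
static double counts_to_ohms(int32_t counts)
{
    return counts_to_volts(counts) / IDAC_AMPS;
}
```

So a mid-scale reading of 2048 counts would come out as 1.65 V across the sensor, or 8.25 kΩ with the assumed 200 µA source.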
There's a little more to the design file, but most of it exists to 1) accurately timestamp the collected data, and 2) provide a UART connection so I can print the values to a terminal and capture them to a text file on my PC.
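The UART output is just plain text, one reading per line, roughly this shape (the field order and the millisecond timestamp are my own choices, not anything from a Cypress API):

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Format one reading as a CSV line for the terminal capture:
 * timestamp_ms,channel,raw_counts,resistance_ohms
 * Returns the number of characters written (excluding the NUL). */
static int format_reading(char *buf, size_t len,
                          uint32_t timestamp_ms, int channel,
                          int32_t counts, double ohms)
{
    return snprintf(buf, len, "%lu,%d,%ld,%.1f\r\n",
                    (unsigned long)timestamp_ms, channel,
                    (long)counts, ohms);
}
```

Each formatted line then goes out over the UART component's string-write call and lands in the capture file on the PC side.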
Okay, a couple of important questions:
1) What is the maximum voltage I should expect at the ADC input? The ADC block has Vref connected to VDDA, and the single-ended negative input is also set to Vref. This sets the "Single ended mode range" to "Vdda (3.3 V) to 0.0", yet when I disconnect the sensor entirely from the board (open connection), I see 3.301643 volts printed to the terminal. The two values are close but not identical; is this discrepancy normal?
2) I've got the ADC configured for 12-bit resolution, which should give 4096 counts full scale. Yet when I increase the resistance of my sensor, I can see readings as high as ~8200 counts, which just exceeds 13 bits. Do I have my ADC configured incorrectly?
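In case it helps clarify what I mean, this is the sanity check I'd expect to hold for a 12-bit single-ended result (my own check, not anything from the component API), and it's what the ~8200-count readings fail:

```c
#include <stdbool.h>
#include <stdint.h>

#define ADC_RESOLUTION_BITS 12  /* assumed configured resolution */

/* A 12-bit single-ended conversion should land in [0, 4095]. */
static bool counts_in_range(int32_t counts)
{
    return counts >= 0 && counts < (1L << ADC_RESOLUTION_BITS);
}
```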
3) I've noticed that occasionally some of the voltages the ADC reports for an individual sensor will "blip" up by a few thousand counts before settling back to the nominal level of the other sensors. I reached out to the sensor vendor, and after confirming that I had connected a battery to the kit to stabilize the board voltage (it had been powered from the USB cable alone before), they commented that something could be going on with the inherent capacitance of the ADC relative to the timing of my ADC readings. Does anyone have any insight into how I can quantify this and adjust my timing to accommodate it?
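To put a number on how often the blips happen, I was thinking of post-processing the captured log with a simple outlier counter against a sliding-window median, something like this sketch (the window size and threshold are arbitrary guesses on my part):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define WINDOW 5  /* assumed sliding-window length (odd) */

static int cmp_i32(const void *a, const void *b)
{
    int32_t x = *(const int32_t *)a, y = *(const int32_t *)b;
    return (x > y) - (x < y);
}

/* Median of WINDOW consecutive samples. */
static int32_t window_median(const int32_t *samples)
{
    int32_t sorted[WINDOW];
    memcpy(sorted, samples, sizeof sorted);
    qsort(sorted, WINDOW, sizeof sorted[0], cmp_i32);
    return sorted[WINDOW / 2];
}

/* Count samples that deviate from their window's median by more than
 * `threshold` counts: a rough measure of how many "blips" occurred. */
static int count_blips(const int32_t *samples, int n, int32_t threshold)
{
    int blips = 0;
    for (int i = 0; i + WINDOW <= n; i++) {
        int32_t m = window_median(&samples[i]);
        int32_t center = samples[i + WINDOW / 2];
        if (labs((long)(center - m)) > threshold)
            blips++;
    }
    return blips;
}
```

Running something like this per channel over the captured text file would at least tell me the blip rate, and whether it correlates with my sampling interval.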
Apologies for the long read; any comments are greatly appreciated! If this is posted in the wrong section, please point me in the right direction.
Sandbox.cywrk.Archive04.zip 2.7 MB