Q1). We invert the raw counts in CSX operation to get a finger-touch response similar to CSD operation. That is, when you touch a button in CSD operation the raw count increases, but in CSX operation it decreases, so we invert the raw count in CSX. The actual (pre-inversion) calibration level of CSX is therefore (100% - 40%) = 60%. Why did we select 60% (40% after inversion, with a risk of flat spots) instead of 85% (15% after inversion)? 85% would be too high for many applications, so we lowered it in the CSX firmware. We suggest using the SSC clock in most applications; if flat spots occur in direct clock mode, you can set the calibration level back to 85% (15% after inversion).
Q2). The advantages and disadvantages are similar to CSD. A higher calibration level (lower after inversion) gives a better raw-count-vs-Cf gain.
The disadvantage is a risk of signal saturation. A lower calibration level (higher after inversion) has no risk of signal saturation, but does risk flat spots.
Thank you for your response.
>>> 85% would be too high for many applications, so we lowered it in the CSX firmware.
Is this the same for CSD?
If yes, is there a possibility that the CSD default calibration target will change in the future?
Sincere apologies for the late reply.
Yes, it is the same for CSD. The default calibration level may change in the future, but we don't have a concrete schedule yet...
Thank you for your reply.