The RSSI scale is most likely derived from a hardware reference (some voltage being read internally). If your custom board uses different resistors, capacitors, or trim values around the radio, that reference would be off, and the reported RSSI values/scale would be wrong.
Since you are comparing a custom board with the dev board, there is most likely a discrepancy in that hardware reference (software tuning, changed component values, an unconnected pin, etc.).
If you can find in the documentation where the RSSI value is read and what it is referenced against, that should be a step in the right direction, I think.
Thanks for your reply and the hint.
Although I took care to align both hardware environments, this could have been a relevant direction to investigate.
But even then, the "scale" of the RSSI should at least be stable.
I've just implemented a dumb BLE service that sends the RSSI measured by CyBle_GetRssi().
In 20% of the cases I get values under -90, whatever the distance between the EZ-BLE and the bonded device.
In 70% of the cases I get values above +90, again whatever the distance.
In 10% of the cases I get values between -85 and +5, with a correct relationship between distance and value.
So a massive share of the values is clearly incoherent when using the 214009 on its own.
Whereas when using the 214009 Eval board on the Pioneer Kit, only about 5% of the values are incoherent.
I understood there had been a bug in CyBle_GetRssi(), but supposedly not affecting 256k chips (and the 214009 is a 256k chip).
And that would not explain why it works in the Eval + Pioneer Kit environment.
Really driving me nuts!
90% random values does seem really high, I agree. Keep in mind that wireless interference can raise or lower signal measurements depending on the room's physical makeup (see the Wikipedia article "Interference (wave propagation)" for an explanation).
This could be aggravated by board layout, but the cyble modules are usually pretty self-contained in this respect.
One test you could run is adding metallic shielding to see whether it changes the ratio of good to bad readings. Otherwise, assuming you are running the same software on both, it must be a hardware difference.
Once again, thanks for your reply.
Actually, I found the explanation.
In the thread "CyBle_GetRssi() returned value doesn't make sense", Cypress previously stated that the CyBle_GetRssi() bug did not affect 256k chips.
But the bug was actually corrected only in 256k BLE 4.2 chips.
So the 214009 is still affected by the incoherent RSSI readings.
Since we don't have details on this bug, it could simply be "less visible" on an Eval Board + Pioneer Kit. Following your idea, that could be hardware-related (typically some capacitance somewhere on the Eval Board and/or the Kit that stabilizes whatever is misbehaving).
Anyway, Cypress should state clearly in the BLE component datasheet (PSoC Creator) which of its chips return coherent measurements from the CyBle_GetRssi() API.
I'm closing the topic.