3 Replies Latest reply on May 12, 2017 7:05 AM by epr_1639216

    Bad data from CyBle_GetRssi()

      I'm trying to use RSSI as a rough measure of distance between a CYBLE-214009-00 and a connected Android phone. To do this, I've set up a thread in an Android application to continually poll the RSSI value, then use the CYBLE-214009-00 to print the value over the UART TX port and send the RSSI back to the phone to display it there. However, if I sample the RSSI faster than every 200 ms, I typically see values like 127 dBm appear. If I do NOT send the RSSI back to the phone (and only print to the UART port), I can sample as fast as I want. Is there any particular reason why this is happening?


      Thank you,



        • 1. Re: Bad data from CyBle_GetRssi()

          More than likely, the act of transmission is affecting the RSSI value. Transmitting over BLE uses the same radio that measures the RSSI ;)


          Try testing it with various EMI effects (like shielding, reflection, etc).


          Try reducing the polling rate, and see how the timing affects it as well.


          I suspect the RX and TX paths run through the same or nearby circuits, causing crosstalk between the RSSI measurement and BLE transmissions.
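
          One firmware-side workaround is to treat readings taken around a transmission as invalid and filter them out before use. This is only a sketch: the 127 sentinel is based on the bad values reported above, and `average_valid_rssi` is a hypothetical helper, not part of the Cypress API:

```c
/* Hypothetical sentinel: 127 matches the bogus readings reported above
 * when sampling too fast; treat it as "RSSI not available". */
#define RSSI_INVALID 127

/* Average only the valid samples in a buffer of raw RSSI readings,
 * discarding the 127 sentinel. Returns RSSI_INVALID if none are valid. */
static int average_valid_rssi(const signed char *samples, int count)
{
    long sum = 0;
    int valid = 0;
    for (int i = 0; i < count; i++) {
        if (samples[i] != RSSI_INVALID) {
            sum += samples[i];
            valid++;
        }
    }
    return (valid > 0) ? (int)(sum / valid) : RSSI_INVALID;
}
```

          Feeding each `CyBle_GetRssi()` reading into a small buffer and averaging the survivors also smooths the normal sample-to-sample jitter.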

          • 2. Re: Bad data from CyBle_GetRssi()

            That's a good point, thanks for the information. Looking around online, it seems that RSSI is unreliable during transmission, which is unfortunate, as I was hoping to use it to calculate distance to within 10 cm.


            Also, changing the polling rate definitely helps, but the way to really kill the bad data is by not sending it back to the phone.

            • 3. Re: Bad data from CyBle_GetRssi()

              Yeah, it's too bad RSSI isn't easier to use. You could use it for long-term location calculation (over intervals greater than half a second, I would think), but that would involve setting up the radio to give an accurate RSSI reading. So, depending on your polling rate, it may not be so useful.