According to this specifciation document: http://www.etsi.org/deliver/etsi_en/300300_300399/300328/01.09.01_60/en_300328v010901p.pdf
The Receiver Blocking:
"This requirement does not apply to non-adaptive equipment or adaptive equipment operating in a non-adaptive mode. In addition, this requirement does not apply for equipment with a maximum declared RF Output power level of less than 10 dBm e.i.r.p. or for equipment when operating in a mode where the RF Output power is less than 10 dBm e.i.r.p."
If I recall correctly, the Cypress chips I've looked at all output 3 dBm or less, so the requirement wouldn't apply unless the output were above 10 dBm?
I don't see that exclusion in V2.1.1. My lab indicates that it is required as per the RED directive.
You are correct; looking at v2.1.1, it is required. I'm not sure I would be able to provide any more help on the subject :)
But you can always open a support case with Cypress to get direct help with the testing if need be.
If you can simplify what exactly you need to test to fulfill the requirement, I can help out with the technical side of setting up the module, but looking at the ETSI document makes my eyes go blurry...
I imagine if you send sequential packets over the radio, each stamped with a counter that increments per packet, you can simply compare the number of packets received against the counter values stamped on them to measure the error rate.
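A minimal receiver-side tally along those lines might look like this (a sketch, assuming each packet carries an incrementing 8-bit sequence number; `per_tally_t` and its helpers are made-up names, not Cypress APIs):

```c
#include <stdint.h>

/* Hypothetical receiver-side tally: lost packets show up as gaps in the
 * 8-bit sequence number stamped on each packet by the transmitter. */
typedef struct {
    uint8_t  last_seq;
    int      have_first;
    uint32_t received;   /* packets actually seen (after the first) */
    uint32_t expected;   /* packets the counter says were sent */
} per_tally_t;

void per_tally_packet(per_tally_t *t, uint8_t seq)
{
    if (t->have_first) {
        /* uint8_t subtraction wraps, so the gap math survives rollover */
        t->expected += (uint8_t)(seq - t->last_seq);
        t->received += 1;
    } else {
        t->have_first = 1;
    }
    t->last_seq = seq;
}

double per_tally_rate(const per_tally_t *t)
{
    if (t->expected == 0)
        return 0.0;
    return (double)(t->expected - t->received) / (double)t->expected;
}
```

On the DUT you would call `per_tally_packet()` from the receive handler and print `per_tally_rate()` out the UART periodically.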
I appreciate your help. I have modified some of the Cypress MTK code to transmit a constant stream of packets with random binary payloads. I think I need code in the DUT to calculate the PER during the transfer and send the result out the UART.
I need guidance on how to detect lost/retried packets in code... I have opened a support case.
Let me know how it goes with the support case :)
As far as calculating a PER:
Using the BLE notification method, each packet is sent as a single attempt. Using a single byte to keep track of the packet number (incremented on each notification) will allow you to count the received packets on the receiver side. I do think you will need code on the DUT in order to implement the PER calculation.
That might be a solution! A simple host that notifies a characteristic at x Hz. The peripheral samples n samples and calculates the PER. I suppose you would increment the characteristic and then the peripheral can detect missing packets.
Are notifications by default always "one-attempt"?
Hmmmm, looking online, BLE notifications might be retried at the link layer. Bummer, I was hoping that would be a simple solution :(
I think you would have to delve into the HAL, L2CAP, or Link Layer software to figure out how to disable the retry/ack packet exchanges.
The notification only sends the data without expecting an Attribute Protocol layer acknowledgement, but it might still be retrying at the physical layer. Dang, I'll have to look into the physical layer more and see what it is doing specifically. Hopefully Cypress Support can give better details than I can :(
Here is another thought. The Cypress MTK GUI and firmware are set up to send n packets and calculate the PER at the end. This won't work for the blocking test, as the PER has to be read in "real-time".
Perhaps I can modify this so that I send, say, 1000 packets, update a PER readout, and repeat. The packet stream will not be continuous, but I can update the PER at a high enough rate that it gives a pseudo-real-time readout.
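Roughly, the batch-and-report loop I have in mind would look like this (a sketch with made-up names, not actual MTK code; the 1000-packet batch size is just the figure above):

```c
#include <stdint.h>

#define BATCH_SIZE 1000  /* packets per PER readout */

typedef struct {
    uint32_t seen;  /* expected packets in the current batch */
    uint32_t lost;  /* of those, how many never arrived */
} per_batch_t;

/* Record one expected packet. Returns the batch PER in percent when the
 * batch completes (ready to print on the UART), or -1.0 while filling. */
double per_batch_record(per_batch_t *b, int was_lost)
{
    b->seen += 1;
    if (was_lost)
        b->lost += 1;
    if (b->seen < BATCH_SIZE)
        return -1.0;
    double per = 100.0 * (double)b->lost / (double)b->seen;
    b->seen = 0;   /* start a fresh batch */
    b->lost = 0;
    return per;
}
```

Shrinking `BATCH_SIZE` would raise the readout rate at the cost of a noisier PER figure.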
Sounds reasonable; by tweaking the number of packets and the method of PER calculation, you might be able to get a higher PER sampling rate as well.
Do you need the PER result during the test? I would think having a post-log of it should be valid still.
Ideally, the number of packets would be as large as you can make it to approach the continuity of a constant packet rate, but that will affect your PER update rate as well.
Is the PER calculation merely counting the number of returned packets, or is it based on timeouts?
I'm asking to determine whether you could set up a sliding window for calculating the PER while packets are still being sent, or whether you must wait until all n packets are sent before the PER is valid.
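If the firmware can classify each expected packet as received or lost as it happens, the sliding window could be as simple as a ring buffer over the last N outcomes, giving a PER that updates on every packet instead of every batch (a sketch; the window size and names are made up):

```c
#include <stdint.h>

#define PER_WINDOW 256  /* number of most-recent packets considered */

typedef struct {
    uint8_t  outcome[PER_WINDOW]; /* 1 = lost, 0 = received */
    uint32_t head;                /* next slot to overwrite */
    uint32_t count;               /* valid entries so far (<= PER_WINDOW) */
    uint32_t lost;                /* lost packets currently in the window */
} per_window_t;

void per_window_record(per_window_t *w, int was_lost)
{
    if (w->count == PER_WINDOW)
        w->lost -= w->outcome[w->head];  /* evict the oldest outcome */
    else
        w->count += 1;
    w->outcome[w->head] = was_lost ? 1 : 0;
    w->lost += w->outcome[w->head];
    w->head = (w->head + 1) % PER_WINDOW;
}

double per_window_rate(const per_window_t *w)
{
    return w->count ? (double)w->lost / (double)w->count : 0.0;
}
```

If the PER is instead derived from timeouts, the same window works; the timeout handler just records the packet as lost.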
Now I'm stuck at the receiver blocking test in the EN 300 328 V2.1.1 ETSI document.
Does anyone have a solution for it?
It looks like you set the wanted signal strength to the lowest level that still gives you the required PER, then introduce the CW blocking signal and verify the performance? I'm not sure what specifically you would be stuck on here.