Smart Bluetooth Forum Discussions
With the BCM20737 and Android 4.4.4 we are running into an issue where, in some cases, pairing succeeds but all subsequent connection attempts fail.
Comparing the BLE packet traffic, we noticed that in the failure case the slave's (20737) link layer rejects the encryption request with LL_REJECT_IND, reason "PIN or Key Missing".
Success case:
The message sequence chart in the attached encryption-success.png matches Fig. 6.7 of the Bluetooth 4.1 spec (Vol 6 -> Part D Message sequence charts -> 6 Connection state -> 6.6 Start encryption). In frame 1826, the slave returns a valid LL_ENC_RSP and the master continues with the three-way handshake.
Failure case:
The message sequence chart in the attached encryption-failed.png matches Fig. 6.8 of the Bluetooth 4.1 spec (Vol 6 -> Part D Message sequence charts -> 6 Connection state -> 6.7 Start encryption without long term key). In frame 6588, the slave returns an LL_REJECT_IND with reason "PIN or Key Missing", and the connection is terminated.
In the Bluetooth 4.1 spec, Volume 3 -> Part H Security Manager Specification -> 2 Security manager -> 2.4.4 Encrypted Session Setup, we noticed that the slave's host provides the Long Term Key (LTK) to the slave's Link Layer for setting up encryption, and that this LTK is looked up using the EDIV and Rand that the slave distributed during pairing.
Is it possible that the slave forgot an EDIV/Rand or LTK that it ought to remember?
We noticed a similar symptom in the BlueZ stack acting as a slave, and a corresponding fix at:
https://lists.ubuntu.com/archives/kernel-team/2014-April/042060.html
Can you please let us know whether we need any fixes to encrypt connections between the BCM20737 and Android 4.4.4?
When I use the WICED Sense tag (the red one), I build the wiced_sense demo app in WICED IDE 2.2 and download it to the sensor tag, but my phone can't find the device.
If I push and hold the app button, then apply power and release the button, the device can be found, but it can't pair with the mobile phone.
Why?
Hello,
I have created a simple BLE application called demo using the WICED IDE wizard, with one characteristic: 15 bytes, with notify/read/write properties.
This is the code that the wizard generated:
<pre>
CHARACTERISTIC_UUID128_WRITABLE (HDLC_DEMO_CHREGISTERHS,
    HDLC_DEMO_CHREGISTERHS_VALUE,
    __UUID_DEMO_CHREGISTERHS,
    LEGATTDB_CHAR_PROP_READ | LEGATTDB_CHAR_PROP_WRITE,
    LEGATTDB_PERM_READABLE | LEGATTDB_PERM_WRITE_REQ,
    15),
    0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,

/* Characteristic User Description Descriptor */
//<UserDescription>
//<Value>Host To Sensor</Value>
CHAR_DESCRIPTOR_UUID16 (HDLD_DEMO_CHREGISTERHS_USER_DESCRIPTION,
    UUID_DESCRIPTOR_CHARACTERISTIC_USER_DESCRIPTION,
    LEGATTDB_PERM_READABLE, 14),
    //UTF-8 <User Description> Host To Sensor
    'H','o','s','t',' ','T','o',' ','S','e','n','s','o','r',
</pre>
Here is the generated store function, with my additions commented out for now:
<pre>
BOOL store_in_db_demo_chregisterhs(UINT8* p_value, UINT8 value_len)
{
    BLEPROFILE_DB_PDU db_pdu;

    // memset(&db_pdu.pdu[0], 0x0, 15);  // Ensure we wipe the data first!
    // db_pdu.len = 15;
    // bleprofile_WriteHandle(HDLC_DEMO_CHREGISTERHS_VALUE, &db_pdu);

    // Write value to the GATT DB
    ble_trace2("write len:%d handle:%02x", value_len, HDLC_DEMO_CHREGISTERHS_VALUE);
    memcpy(&db_pdu.pdu[0], p_value, value_len);
    db_pdu.len = value_len;
    bleprofile_WriteHandle(HDLC_DEMO_CHREGISTERHS_VALUE, &db_pdu);

    return TRUE;
}
</pre>
While testing with LightBlue on iOS, I can see that writing 'A','B','C','D','E','F','G','H','I','J','K' (all 11 bytes) gets stored and verified correctly.
On the next update of the characteristic, I write 'Z','Y','X','W','V','U','T','S','R' (9 bytes).
What surprised me was that the actual value stored was this:
'Z','Y','X','W','V','U','T','S','R','K'
One would expect that uncommenting the wipe code shown above would cleanly reset the value to zero bytes first, but I get the same results.
Can anyone shed light on this?
The attached application shows an example of using the ADC hardware built into the 20736/20737/S chips.
To use the application, load it onto a tag board that has a push button on P0. Watch the traces from the board: initially you will see ~0 V, and when you press the button you will see the voltage of the supply rail (~3300 mV).
There are three steps needed to get the ADC working, as shown in the application:
Include the header:
#include "adc.h"
Initialize it in the application create function:
adc_config();
And poll the ADC using the function:
adc_readVoltage(adc_convertGPIOtoADCInput(gpio_number));
* Calibration happens inside the initialization function. However, it may be useful for your application to calibrate the ADC against a known external voltage for extra confidence, once or periodically throughout the application's runtime. To do so, call the calibration function: parameter one is the known voltage of the reference source in mV, and parameter two is the location of that reference voltage. The location can be set to many different buses and to all GPIOs (for external sources). Ctrl+click the second parameter in the source code to see the full list of locations that can be used as a voltage reference.
Dear support,
I have to estimate the lifetime of a product powered by a standard CR3032 coin cell.
The product will be based on a BCM20732, an external power amplifier capable of providing up to 10 dBm, and several sensors (accelerometer, gyro and magnetometer). I will use the Proximity/Find Me profile.
Reading the datasheet I found only the typical and maximum power consumption of the BLE radio, but I need more detailed information, such as power consumption during sleep, wake-up and processing, and also the typical RX and TX durations of a single connection event.
Could you provide that information?
As a first analysis, the device I have to build is more or less similar to your WICED Sense but with an external power amplifier, so a current-consumption-versus-time profile of the WICED Sense could be enough.
Thanks a lot,
BR,
Giordano
Dear BCM,
We found one serious issue when the BCM20736 works in simultaneous master/slave mode.
Working as master, it has its own connection-event anchor point; call it Anchor_point_1.
Working as slave, its master has a connection-event anchor point; call it Anchor_point_2.
At the beginning, Anchor_point_1 and Anchor_point_2 are far apart, about 6~7 ms, and data transfer between this BCM20736 and its master is correct.
But due to crystal accuracy (10 ppm) or other influences, Anchor_point_1 and Anchor_point_2 drift and can get closer and closer. In our tests, within one 30 ms interval, Anchor_point_1 and Anchor_point_2 move about 250 ns closer to each other. They keep converging until they overlap, and data transmission starts failing once Anchor_point_1 and Anchor_point_2 are within about +/- 2 ms of each other.
Although the connection does not drop (because we set the supervision timeout as long as 5 s), in practice data can hardly be transmitted between this BCM20736 and its master.
Of course, data transmission recovers once Anchor_point_1 and Anchor_point_2 are more than 2 ms apart again, but that takes nearly 10 minutes, and the problem recurs after several hours.
How can we solve this issue, or is there any good suggestion?
Many thanks...
Hi
I am trying to communicate with the BCM20737 tag board over the peripheral UART. I initialised the PUART using the function tw_puart_init() from the puart_control sample.
What I send to the PC using puart_print displays correctly, so I think the setup is correct.
The problem is that the byte value obtained from puart_read() is not what I am sending from my PC. What can be the reason for this behaviour?
Thanks in advance.
Hi, I've read everything I can find on sleep here, including "How can I make bcm20732s sleep and wake it up by GPIO interrupt?", and the question I have is still not answered. The closest to an answer I have found is the statement:
"When advertising (ADV) or in connection, the device will not go into deep sleep. However, it will go into other low power modes like sleep and pause based on the connection/ADV interval and other activities."
1. I understand that the application does not get to control the low power state but I would like some guidelines on what configuration will enable the lowest power state possible short of deep sleep. For example, if my fine timer or connection interval is less than 64 milliseconds, would that somehow inhibit low power states? i.e. is there a threshold of time which plays a role in determining sleep level and what is that threshold(s)?
2. I have an interrupt that fires every 50 ms ... but it's from an external device and the BLE firmware has no way of knowing about this (unless it were to keep a log of interrupts, which I doubt), so my interrupt should not affect ability of the BLE firmware to make the decision to sleep. Is this a correct assumption? (certainly the interrupt will terminate any sleeping that might be going on... that is expected, but I also want to be sure that by interrupting any sleep at this frequency I am not introducing some kind of thrashing in the system that will end up costing me more power than I am saving by sleeping for 48 milliseconds between interrupts).
3. Can you give a short list of current draw for the various invisible low power states (which I do not have control of but would still like to know about so I can calculate my expected battery life).
4. For my application, I never want to go into HIDOFF deep sleep... but I do want the battery to last as long as possible. So in my callback function for the low Power Query, I return a non-zero number, like 200000. I understand this is like telling the firmware "When you make your decision about whether to sleep or not, understand that I don't want you to sleep any longer than 200000 microseconds". Is this the correct interpretation? And the 64 dollar question: Does this number that I return from the low power query callback function in any way influence how deeply the not-deep-sleep low power mode will go? For example would I potentially get a lower power state if I return 200000 than if I return 64000? What if I return 50000?
For some context on our app: we have the fine timer at 200 ms and the connection interval at slightly less than 200 ms. We are always in connection when possible and send data samples to the host continuously during the connection. The 200 ms interval is somewhat arbitrary; for example, we might consider buffering data samples and sending less often if that meant we could spend more time in some lower power state. This is why I want to understand what the firmware is doing even though I don't have explicit control over it. I think if I have the answers to some or all of the above, I can arrive at the optimal design.
Thanks,
eric
Hi Sir:
I want to do a max-throughput test. I use the speed_test app as the peripheral and the puart_control app as the central.
I use a terminal to send commands that enable notifications from the peripheral.
Step 1 (connect to the peripheral):
Terminal: C3 07 00 XX XX XX XX XX XX (six bytes of the peripheral address)
Step 2 (enable notifications):
Terminal: CB 06 41 00 2C 00 01 00
2C is the handle for setting the speed_test data_client_configuration to 1.
Step 3 (start the notification timer):
Terminal: CB 06 41 00 2E 00 01 00
2E is the handle that enables the notification timer.
Now the peripheral sends notifications to the central.
But after a few seconds, the peripheral's connection drops on its own ("connection down").
Why does the connection go down automatically?
Is the problem in the central (puart_control app) or the peripheral (speed_test app)?
-- Thank you --