Hi all,
I hope this is the right forum for this question, since it is really about the GATT implementation on a BLE module. Please point me to the right place if I have posted in the wrong spot.
I am working on a project where I read six ADC channels, each returning a 10-bit value, on an ATmega324PB, and then send the resulting bytes to an RN4871 BLE device configured as a GATT server. The ADC read is working fine. I have the following structure, where I assign the ADC results into a char array through bit fields:
typedef struct {
    uint8_t volatile in_status;
    uint8_t volatile out_status;
    union {
        volatile char adc_data[8];
        struct {
            volatile uint64_t adc_ch0 : 10;
            volatile uint64_t adc_ch1 : 10;
            volatile uint64_t adc_ch2 : 10;
            volatile uint64_t adc_ch3 : 10;
            volatile uint64_t adc_ch4 : 10;
            volatile uint64_t adc_ch5 : 10;
            volatile uint64_t pad_bits : 4;
        } data_s;
    } data_u;
} COMM_PACKET_t;

COMM_PACKET_t packet;
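For reference, this is roughly how the channels get packed (a minimal sketch; adc_read() stands in for my actual ADC routine and just returns the 10-bit conversion result for a channel):

    packet.data_u.data_s.adc_ch0 = adc_read(0);
    packet.data_u.data_s.adc_ch1 = adc_read(1);
    packet.data_u.data_s.adc_ch2 = adc_read(2);
    packet.data_u.data_s.adc_ch3 = adc_read(3);
    packet.data_u.data_s.adc_ch4 = adc_read(4);
    packet.data_u.data_s.adc_ch5 = adc_read(5);
    packet.data_u.data_s.pad_bits = 0;
    // After this, packet.data_u.adc_data holds the same 64 bits as 8 raw
    // bytes (the exact byte order follows avr-gcc's bit-field layout).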
Now, when I try to write these bytes to a BLE characteristic of 8-byte size, the module goes wild and starts sending an err message as the response. Just to mention: I am converting the bytes to a hex string before sending them to the BLE module over UART.
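To make the sizes concrete: 8 data bytes should come out as exactly 16 hex characters, so, assuming my WRITE_LOCAL_CHARACT constant expands to the RN4871's SHW command, the module should receive something like the line below (the handle 0072 is only an example) and answer with AOK:

    SHW,0072,0123456789ABCDEF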
Following is my conversion and write code:
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <util/delay.h>

char *str_from_hex_array(volatile unsigned char *src, size_t len);
int BLE_Module_Write_Char(char charHandle[], char charValue[], char charSize[]);

int main(void)
{
    // system initialization
    // module initialization

    while (1) {
        _delay_ms(10); // tried with and without this delay

        char *str = str_from_hex_array((volatile unsigned char *)packet.data_u.adc_data,
                                       sizeof(packet.data_u.adc_data));
        if (str) {
            BLE_Module_Write_Char(OUT_VOLTS_HANDLE, str, OUT_VOLTS_SIZE);
            free(str); // the string is malloc'd, so free it every pass
        }
    }
}

// FUNCTION DEFINITIONS

char *str_from_hex_array(volatile unsigned char *src, size_t len)
{
    char *outstr = malloc(2 * len + 1);
    if (!outstr) return outstr;
    char *p = outstr;
    for (size_t i = 0; i < len; i++) {
        p += sprintf(p, "%02X", src[i]);
    }
    return outstr;
}

int BLE_Module_Write_Char(char charHandle[], char charValue[], char charSize[])
{
    _delay_ms(DELAY_BEFORE_CMD);

    // convert the size from bytes in hex to an int
    int max = (int)strtol(charSize, NULL, 16);
    max = max * 2; // max is the maximum number of characters, and there are two characters per byte

    // check if the data is longer than the allowed size
    if (strlen(charValue) > (size_t)max) {
        return 0;
    }

    // check if the data is valid hexadecimal (0-9, A-F)
    for (size_t i = 0; i < strlen(charValue); i++) {
        if (!isxdigit((unsigned char)charValue[i])) {
            return 0;
        }
    }

    // send the command
    Ring_Buffer_Clear(&RXBuffer);
    USART_Transmit_String(WRITE_LOCAL_CHARACT, 0);
    USART_Transmit_String(charHandle, 0);
    USART_Transmit_Char(',');
    USART_Transmit_String(charValue, 1);

    // wait until the expected characters have been received or the timeout triggers
    timeout_start();
    while (RXBuffer.count < strlen(AOK_RESP) && timeout == 0) {}

    if (strstr(RXBuffer.data, AOK_RESP) != NULL) {
        return 1;
    }
    return 0;
}
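Side note: since the ATmega324PB only has 2 KB of SRAM, I would rather avoid malloc in a tight loop, so I am also considering a heap-free variant with a caller-supplied buffer (just a sketch; hex_from_bytes is a hypothetical helper name, not in my project):

    #include <stdio.h>

    // Heap-free variant: the caller supplies dst, which must hold
    // 2*len + 1 bytes (two hex digits per byte plus the terminator).
    static void hex_from_bytes(char *dst, const volatile unsigned char *src, size_t len)
    {
        for (size_t i = 0; i < len; i++) {
            dst += sprintf(dst, "%02X", src[i]);
        }
    }

    // Usage inside the main loop:
    //     char hexbuf[2 * sizeof(packet.data_u.adc_data) + 1];
    //     hex_from_bytes(hexbuf, packet.data_u.adc_data, sizeof(packet.data_u.adc_data));
    //     BLE_Module_Write_Char(OUT_VOLTS_HANDLE, hexbuf, OUT_VOLTS_SIZE);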
With everything looking good, I still don't have a clue what is wrong. My assumption is that it has something to do with the length of the string I send to update the characteristic, but on paper it looks correct.
Any help would be a big favor.
Regards,