Crypto block - Unable to get same CRC result as PDL


user_1669321

I'm trying to use the Crypto block to compute a CRC with the CRC-16/XMODEM algorithm. According to the PDL, for the string "123456789" with a seed of 0x0000 and the NUL terminator excluded (though I have not tried including it), I should get a result of 0x31C3 (written as 0x31C30000 in a uint32). However, I get 0x70A79F31.
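For reference, the expected value is easy to reproduce in plain C, independent of the PSoC hardware. A minimal bitwise sketch of CRC-16/XMODEM (polynomial 0x1021, seed 0x0000, no input/output reflection, no final XOR):

```c
#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-16/XMODEM: polynomial 0x1021, seed 0x0000,
 * no input/output reflection, no final XOR. */
static uint16_t crc16_xmodem(const uint8_t *data, size_t len)
{
    uint16_t crc = 0x0000;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)(data[i] << 8);            /* feed byte into the top */
        for (int bit = 0; bit < 8; bit++) {
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
        }
    }
    return crc;
}
```

For "123456789" (9 bytes, no NUL) this returns 0x31C3, the check value quoted in the PDL.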

Here are the functions I call:

#include <project.h>
#include <string.h>

cy_stc_crypto_context_t cryptoScratch;
cy_stc_crypto_server_context_t cryptoServerContext;
cy_stc_crypto_context_crc_t crcCtx;

void Crypto_Init(void) {
    Cy_Crypto_Server_Start(&localCryptoConfig, &cryptoServerContext);
    Cy_Crypto_Init(&localCryptoConfig, &cryptoScratch);
    Cy_Crypto_Enable();
    Cy_Crypto_Sync(CY_CRYPTO_SYNC_BLOCKING);
}

cy_en_crypto_status_t Crypto_CrcInit(void) {
    return Cy_Crypto_Crc_Init(0x1021, 0, 0, 0, 0x0000, &crcCtx);
}

uint16_t Crypto_ComputeCrc(uint16_t seed, uint8_t* data, size_t len) {
    CY_ALIGN(4) uint32_t crc;
    cy_en_crypto_status_t result;

    while ((result = Cy_Crypto_Crc_Run((void*) data, len, &crc, seed, &crcCtx)) != CY_CRYPTO_SUCCESS);
    Cy_Crypto_Sync(CY_CRYPTO_SYNC_BLOCKING);

    return crc >> 8;
}

int main(void) {
    SystemInit();
    __enable_irq();

    Crypto_Init();
    Crypto_CrcInit();

    uint8_t str[] = "123456789";
    uint16_t crc = (uint16_t)Crypto_ComputeCrc(0x0000, str, strlen(str));
    (void)crc;

    while (1);
}

1 Solution

Good news! By using the polynomial 0x10210000 instead of 0x1021, Cy_Crypto_Crc_Run outputs a CRC of 0x31C30000, so I obtain the correct 0x31C3 by shifting the result right by 16 bits.

Thanks for the help, Len!
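For future readers: the fix boils down to MSB-aligning each 16-bit parameter in its 32-bit argument, then shifting the result back down. A sketch based on the calls already shown in this thread (not compiled here, as it needs the PSoC Crypto hardware):

```c
/* CRC-16/XMODEM via the PDL client-server API: polynomial, seed and
 * remainder XOR are passed MSB-aligned in their 32-bit arguments. */
cy_en_crypto_status_t Crypto_CrcInit(void)
{
    return Cy_Crypto_Crc_Init(0x1021UL << 16, /* 0x10210000, MSB-aligned */
                              0, 0, 0,        /* no data/remainder reversal or data XOR */
                              0x0000UL << 16, /* remainder XOR, also MSB-aligned */
                              &crcCtx);
}

/* After Cy_Crypto_Crc_Run() succeeds, the hardware result is 0x31C30000;
 * the 16-bit CRC is recovered as: uint16_t crc16 = (uint16_t)(crc >> 16); */
```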


10 Replies
Len_CONSULTRON

Len
"Engineering is an Art. The Art of Compromise."

Hi Len,

Sorry, I forgot localCryptoConfig; I've pasted it below. As for the bitshift: the PDL says the expected output of the operation is 0x31C30000, while the actual CRC value is 0x31C3, hence the bitshift (which I now realize should be 16 instead of 8).

/* Macros to configure the Crypto block */

/* IPC data channel for the Crypto */
#define CHAN_CRYPTO         (uint32_t)(3u)

/* IPC interrupt structure for the Crypto server */
#define INTR_CRYPTO_SRV     (uint32_t)(1u)

/* IPC interrupt structure for the Crypto client */
#define INTR_CRYPTO_CLI     (uint32_t)(2u)

/* CM0+ IPC interrupt mux number for the Crypto server */
#define INTR_CRYPTO_SRV_MUX (IRQn_Type)(2u)

/* CM0+ IPC interrupt mux number for the Crypto client */
#define INTR_CRYPTO_CLI_MUX (IRQn_Type)(3u)

/* CM0+ ERROR interrupt mux number for the Crypto server */
#define INTR_CRYPTO_ERR_MUX (IRQn_Type)(4u)

/* Crypto configuration structure */
const cy_stc_crypto_config_t localCryptoConfig =
{
    /* .ipcChannel             */ CHAN_CRYPTO,
    /* .acquireNotifierChannel */ INTR_CRYPTO_SRV,
    /* .releaseNotifierChannel */ INTR_CRYPTO_CLI,
    /* .releaseNotifierConfig  */ {
#if (CY_CPU_CORTEX_M0P)
        /* .intrSrc      */ INTR_CRYPTO_CLI_MUX,
        /* .cm0pSrc      */ cpuss_interrupts_ipc_2_IRQn, /* depends on selected releaseNotifierChannel value */
#else
        /* .intrSrc      */ cpuss_interrupts_ipc_2_IRQn, /* depends on selected releaseNotifierChannel value */
#endif
        /* .intrPriority */ 2u,
    },
    /* .userCompleteCallback   */ NULL,
    /* .userGetDataHandler     */ NULL,
    /* .userErrorHandler       */ NULL,
    /* .acquireNotifierConfig  */ {
#if (CY_CPU_CORTEX_M0P)
        /* .intrSrc      */ INTR_CRYPTO_SRV_MUX,         /* for DeepSleep mode this should be in a DeepSleep-capable muxer's range */
        /* .cm0pSrc      */ cpuss_interrupts_ipc_1_IRQn, /* depends on selected acquireNotifierChannel value */
#else
        /* .intrSrc      */ cpuss_interrupts_ipc_1_IRQn, /* depends on selected acquireNotifierChannel value */
#endif
        /* .intrPriority */ 2u,
    },
    /* .cryptoErrorIntrConfig  */ {
#if (CY_CPU_CORTEX_M0P)
        /* .intrSrc      */ INTR_CRYPTO_ERR_MUX,
        /* .cm0pSrc      */ cpuss_interrupt_crypto_IRQn,
#else
        /* .intrSrc      */ cpuss_interrupt_crypto_IRQn,
#endif
        /* .intrPriority */ 2u,
    }
};


I'm still looking into it. However, there are two concerns:

  1. Cypress never provided an example of using the PSoC6 CRC block. One would be useful to check whether any steps are missing.
  2. The CRC result from Cy_Crypto_Crc_Run() comes back as 32-bit. This concerns me, since the result you need is 16-bit and the CRC HW engine needs to know when to finish the polynomial. Since you are using the CCITT16-XMODEM polynomial, it should not extend the CRC output beyond bit 15.

Len
"Engineering is an Art. The Art of Compromise."

Right, the PDL documentation does indicate what parameters to use for each CRC polynomial, but an actual example would be desirable.

Thank you for looking into it.


crc_algorithm_t my_algorithm = {
    .width         = CRC_BITLEN,    /* Sets the CRC computation length */
    .polynomial    = CRC_POLYNOMIAL,
    .lfsrInitState = CRC_LFSR_SEED,
    .dataReverse   = CRC_DATA_REVERSE,
    .dataXor       = CRC_DATA_XOR,
    .remReverse    = CRC_REM_REVERSE,
    .remXor        = CRC_REM_XOR
};

In your code I could not see any configuration variable that sets the CRC computation length.

In the struct cy_stc_crypto_context_crc_t used by Cy_Crypto_Crc_Init() there doesn't appear to be a member for setting the CRC computation length. It could be that the Crypto Server always performs 32-bit CRC computations, and therefore there is no setting for anything smaller.

Len
"Engineering is an Art. The Art of Compromise."

Yes, that's the big difference that I saw as well.

I just realized that I was looking at the PDL 3.0.4 documentation; notes were added in PDL 3.1.1, including:

"The polynomial, initial seed and remainder XOR values are always provided as MSB aligned (so the actual highest bit should be located in the 31st bit of the parameter value)."

I'll try providing a polynomial of 0x10210000 instead of 0x1021 and get back with the results.
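That MSB-aligned behavior can be sanity-checked in software. A sketch (my model, not Cypress code) of a 32-bit LFSR with the polynomial and seed MSB-aligned and each data byte fed in at bit 31:

```c
#include <stdint.h>
#include <stddef.h>

/* Model of an MSB-aligned CRC engine: a 16-bit CRC run in a 32-bit
 * register with polynomial and seed shifted into the top 16 bits. */
static uint32_t crc32_msb_aligned(uint32_t poly32, uint32_t seed32,
                                  const uint8_t *data, size_t len)
{
    uint32_t crc = seed32;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint32_t)data[i] << 24;        /* data byte enters at bit 31 */
        for (int bit = 0; bit < 8; bit++) {
            crc = (crc & 0x80000000u) ? (crc << 1) ^ poly32 : (crc << 1);
        }
    }
    return crc;                                /* 16-bit CRC sits in the top half */
}
```

With poly32 = 0x10210000 and a zero seed, "123456789" produces 0x31C30000, and shifting right by 16 recovers the documented 0x31C3.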




Apparently the Crypto Server forces a 32-bit CRC calculation.

By MSB-aligning the polynomial as 0x10210000 to satisfy the 32-bit calculation, and by keeping the XOR specs at 0x00, the computation works for you.

Thanks for the update.

Maybe this should be a KBA for others to refer to who might be having a similar issue.
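One caveat worth noting, which follows from the same MSB-alignment rule (my extrapolation, not tested on hardware): a nonzero seed would need to be MSB-aligned too. For example, CRC-16/CCITT-FALSE uses a seed of 0xFFFF, which would become 0xFFFF0000. A software model of the 32-bit register:

```c
#include <stdint.h>
#include <stddef.h>

/* CRC-16/CCITT-FALSE run in a 32-bit MSB-aligned register:
 * polynomial 0x1021 and seed 0xFFFF both shifted into the top 16 bits. */
static uint32_t crc_ccitt_false_32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFu << 16;              /* seed 0xFFFF, MSB-aligned */
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint32_t)data[i] << 24;        /* data byte enters at bit 31 */
        for (int bit = 0; bit < 8; bit++) {
            crc = (crc & 0x80000000u) ? (crc << 1) ^ (0x1021u << 16)
                                      : (crc << 1);
        }
    }
    return crc;                                /* 16-bit CRC sits in the top half */
}
```

The top half of the result for "123456789" is 0x29B1, the standard CRC-16/CCITT-FALSE check value.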

Len
"Engineering is an Art. The Art of Compromise."

Hi,

I don't know if this will help. I found HAL implementation example code for the CRC calculation. It uses different calls, and the closest listed reference to CRC-16/XMODEM is CRC-16 CCITT-0, which uses a seed of 0xFFFF.

Hardware Abstraction Layer (HAL)

I tried their HAL implementation with the following changes:

  • Changed #define CRC_LFSR_SEED from 0xFFFF to 0x0000
  • The input vector in the example happens to be EXACTLY your input vector.  NO CHANGE HERE.

The CRC output = 0x31C3. This works and is very simple.

Len
"Engineering is an Art. The Art of Compromise."

Thank you, Len. However, I want to use the PDL client-server API, as there is other code based around it that I don't want to change.

0 Likes