Are you trying to transfer data from two cameras on two endpoints simultaneously?
If so, that is not possible.
But you can receive data on both endpoints in a time-division-multiplexed manner. You can have a single DMA socket and then use CyU3PUsbMapStream to change the endpoint mapped to that socket.
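This is not from an official example, just a minimal sketch of what that remapping could look like, assuming two bulk IN endpoints (0x81 and 0x82) sharing one UIB consumer socket; the endpoint/socket numbers are placeholders, and you should verify the CyU3PUsbMapStream prototype and its streamId semantics against cyu3usb.h in your SDK version.

#include "cyu3types.h"
#include "cyu3usb.h"
#include "cyu3error.h"

#define EP_CAM1        0x81   /* placeholder: bulk IN endpoint for camera 1 */
#define EP_CAM2        0x82   /* placeholder: bulk IN endpoint for camera 2 */
#define SHARED_SOCKET  1      /* placeholder: UIB consumer socket shared by both EPs */

/* Re-point the shared consumer socket at whichever endpoint should send
 * data next, so the two cameras are serviced in a time-division
 * multiplexed fashion over a single DMA channel. */
static CyU3PReturnStatus_t
SelectActiveCamera (CyBool_t useCam2)
{
    uint8_t ep = useCam2 ? EP_CAM2 : EP_CAM1;

    /* streamId = 0 here because these are plain bulk endpoints rather than
     * USB 3.0 bulk streams; confirm this usage in the FX3 API guide. */
    return CyU3PUsbMapStream (ep, SHARED_SOCKET, 0);
}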
Did you ever get this working? I'm trying to do something similar with 2 separate cameras that I'd like to stream simultaneously with an FX3. I have started modifying the bulkuvcinmem project to de-risk this approach, and have managed to get the device enumerated properly as 2 /dev/video devices. Unfortunately, I am having some issues after that. Did you have to create 2 separate UVC VC (video control) interfaces to do this? That's the only way I have been able to get Linux to recognize the 2 inputs.
Our eventual hope is to feed the image data from the 2 sensors into the FX3 GPIF II via an FPGA and then stream them as independent outputs (unlike Section 9 of AN75779, which describes interleaving the data). Our reason for this is that the two sensors have different resolutions and frame rates.
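For concreteness, this is roughly the FX3-side structure I have in mind for the two independent streams. The socket IDs, endpoint mapping, and buffer sizing below are placeholders, error checking is omitted, and the UVC payload headers that AN75779 adds via a manual DMA channel with a commit callback are left out for brevity.

#include "cyu3system.h"
#include "cyu3utils.h"
#include "cyu3dma.h"
#include "cyu3error.h"

static CyU3PDmaChannel glChCam1;   /* sensor 1: GPIF II socket 0 -> EP 0x81 */
static CyU3PDmaChannel glChCam2;   /* sensor 2: GPIF II socket 1 -> EP 0x82 */

static void
CreateIndependentChannels (void)
{
    CyU3PDmaChannelConfig_t cfg;
    CyU3PMemSet ((uint8_t *)&cfg, 0, sizeof (cfg));

    cfg.size    = 16384;               /* placeholder buffer size */
    cfg.count   = 4;                   /* placeholder buffer count */
    cfg.dmaMode = CY_U3P_DMA_MODE_BYTE;

    /* Channel 1: GPIF II producer socket 0 to USB consumer socket 1. */
    cfg.prodSckId = CY_U3P_PIB_SOCKET_0;
    cfg.consSckId = CY_U3P_UIB_SOCKET_CONS_1;
    CyU3PDmaChannelCreate (&glChCam1, CY_U3P_DMA_TYPE_AUTO, &cfg);

    /* Channel 2: GPIF II producer socket 1 to USB consumer socket 2. */
    cfg.prodSckId = CY_U3P_PIB_SOCKET_1;
    cfg.consSckId = CY_U3P_UIB_SOCKET_CONS_2;
    CyU3PDmaChannelCreate (&glChCam2, CY_U3P_DMA_TYPE_AUTO, &cfg);

    /* Start infinite transfers on both pipes. */
    CyU3PDmaChannelSetXfer (&glChCam1, 0);
    CyU3PDmaChannelSetXfer (&glChCam2, 0);
}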
Is there a good example project that makes use of the CyU3PUsbMapStream function that you called out?