Multiple USB UVC endpoints (multiple webcams)

Anonymous
Not applicable

I am trying to modify the See3CAM_CX3RDK_e-CAM59CX3 example found here: http://www.cypress.com/cx3/ and add another USB streaming endpoint. The MIPI device I have attached is actually sending frames from multiple different cameras. These frames have a header in the first line of the frame that tells me which camera they came from.
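
To illustrate what I mean, here is a minimal sketch of the kind of header check I have in mind. The one-byte camera ID at a fixed offset is purely an assumption for illustration, not the bridge's actual header layout:

/* Hypothetical sketch: read a camera ID out of the first line of a frame.
 * CAM_ID_OFFSET and the one-byte ID field are assumptions; substitute the
 * real header format produced by the MIPI bridge. */
#include "cyu3types.h"

#define CAM_ID_OFFSET    (4)    /* assumed byte position of the camera ID */

static uint8_t
GetCameraId (const uint8_t *frameFirstLine)
{
    return frameFirstLine[CAM_ID_OFFSET];    /* e.g. 0 = first camera, 1 = second */
}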

I have successfully modified the USB descriptor so that Linux sees two separate UVC cameras (/dev/video0 and /dev/video1). Originally, the application uses this socket for video:

dmaCfg.consSckId[0]         = CY_U3P_UIB_SOCKET_CONS_3;


If I change this to CY_U3P_UIB_SOCKET_CONS_5, I am able to switch the video stream to /dev/video1. However, I am not able to stream to both video endpoints at once.
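
To make the socket choice concrete, this is roughly what I am doing now. The SelectStreamingSocket wrapper and the useSecondCamera flag are illustrative names only; the consSckId assignment is the single line I change relative to the stock DMA setup:

#include "cyu3types.h"
#include "cyu3dma.h"

/* Illustrative fragment of the application's DMA setup; all other dmaCfg
 * fields stay exactly as in the unmodified example. */
static void
SelectStreamingSocket (CyU3PDmaMultiChannelConfig_t *dmaCfg, CyBool_t useSecondCamera)
{
    if (useSecondCamera)
        dmaCfg->consSckId[0] = CY_U3P_UIB_SOCKET_CONS_5;   /* streams to /dev/video1 */
    else
        dmaCfg->consSckId[0] = CY_U3P_UIB_SOCKET_CONS_3;   /* streams to /dev/video0 */
}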


Reading cyu3dma.h, I can see that there is no many-to-many CyU3PDmaMultiType_t, which is what I think I want. The existing application uses CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE; for reference, its channel setup is sketched below.
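
This is only my paraphrase of the stock many-to-one channel creation, pieced together from memory of similar CX3/FX3 UVC examples, so the buffer size/count macros, callback name and handle name are assumptions rather than the exact identifiers from the project:

#include "cyu3types.h"
#include "cyu3dma.h"

/* Assumed names and values; the real project uses its own macros, callback and handle. */
#define STREAM_BUF_SIZE     (0x4000)
#define STREAM_BUF_COUNT    (4)

extern void UvcAppDmaCallback (CyU3PDmaMultiChannel *handle, CyU3PDmaCbType_t type,
                               CyU3PDmaCBInput_t *input);

CyU3PDmaMultiChannel glChHandleUVCStream;

static CyU3PReturnStatus_t
CreateStreamingChannel (void)
{
    CyU3PDmaMultiChannelConfig_t dmaCfg = {0};

    dmaCfg.size          = STREAM_BUF_SIZE;
    dmaCfg.count         = STREAM_BUF_COUNT;
    dmaCfg.validSckCount = 2;                          /* two GPIF/PIB producer sockets */
    dmaCfg.prodSckId[0]  = CY_U3P_PIB_SOCKET_0;
    dmaCfg.prodSckId[1]  = CY_U3P_PIB_SOCKET_1;
    dmaCfg.consSckId[0]  = CY_U3P_UIB_SOCKET_CONS_3;   /* the single USB consumer */
    dmaCfg.dmaMode       = CY_U3P_DMA_MODE_BYTE;
    dmaCfg.notification  = CY_U3P_DMA_CB_PROD_EVENT;
    dmaCfg.cb            = UvcAppDmaCallback;

    return CyU3PDmaMultiChannelCreate (&glChHandleUVCStream,
                                       CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE,
                                       &dmaCfg);
}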


Is it possible to have the DMA switch endpoints during its callback? I would like to change consSckId on the fly so that I can stream frames to either video endpoint at will.
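
The only approach I can think of myself is the hedged sketch below: tear the channel down and rebuild it pointing at the other USB socket. I assume this would have to run from the application thread between frames rather than from inside the channel's own DMA callback, and SwitchStreamingSocket, glChHandleUVCStream and glDmaCfg are illustrative names, not identifiers from the project:

#include "cyu3types.h"
#include "cyu3dma.h"
#include "cyu3error.h"

extern CyU3PDmaMultiChannel          glChHandleUVCStream;  /* streaming channel handle */
extern CyU3PDmaMultiChannelConfig_t  glDmaCfg;             /* config kept around for rebuilds */

CyU3PReturnStatus_t
SwitchStreamingSocket (CyU3PDmaSocketId_t newConsSck)
{
    CyU3PReturnStatus_t status;

    /* Stop and destroy the running channel. */
    CyU3PDmaMultiChannelReset (&glChHandleUVCStream);
    CyU3PDmaMultiChannelDestroy (&glChHandleUVCStream);

    /* Rebuild it with the other consumer socket and restart the transfer. */
    glDmaCfg.consSckId[0] = newConsSck;
    status = CyU3PDmaMultiChannelCreate (&glChHandleUVCStream,
                                         CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE,
                                         &glDmaCfg);
    if (status == CY_U3P_SUCCESS)
    {
        status = CyU3PDmaMultiChannelSetXfer (&glChHandleUVCStream, 0, 0);
    }
    return status;
}

If there is a lighter-weight way to retarget the consumer without destroying the channel, that is exactly what I am hoping to learn.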


Would CyU3PUsbMapStream or CyU3PUsbChangeMapping help me accomplish this? I cannot find an example of either function anywhere on the web. I've read the documentation, but I don't understand exactly how I should use them.

Attached is the unmodified example project. The only change I've made is duplicating the bytes in the esUVCUSBSSConfigDscr descriptor; the rest of the application and DMA setup is untouched.
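
For anyone trying to reproduce the enumeration part, the key piece of the duplication is giving the second video function its own interface association descriptor. The eight bytes below are only a hedged illustration (the interface numbers and the array name are placeholders), not a copy of my actual descriptor:

#include "cyu3types.h"

/* Illustrative IAD for the second UVC function; the interface numbers must
 * match the duplicated VideoControl/VideoStreaming interfaces that follow it. */
const uint8_t SecondCamIADDscr[] =
{
    0x08,                       /* Descriptor size */
    0x0B,                       /* Interface Association Descriptor type */
    0x02,                       /* First interface number of the second function */
    0x02,                       /* Interface count: VideoControl + VideoStreaming */
    0x0E,                       /* CC_VIDEO class code */
    0x03,                       /* SC_VIDEO_INTERFACE_COLLECTION subclass */
    0x00,                       /* Protocol code: not used */
    0x00                        /* iFunction string index: none */
};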


Thanks for any tips in advance.

Regards,

Ben

Anonymous
Not applicable

Are you trying to transfer data from the two cameras on the two endpoints simultaneously?

If so, that is not possible.

But you can receive data on both endpoints in a time-division-multiplexed manner. You can keep a single DMA socket and then use CyU3PUsbMapStream to change the endpoint mapped to that socket.

Regards,

-Madhu Sudhan

Anonymous
Not applicable

Did you ever get this working? I'm trying to do something similar with two separate cameras that I'd like to stream simultaneously with an FX3. I have started modifying the bulkuvcinmem project to de-risk this ability, and have managed to get the device enumerated properly as two /dev/video devices. Unfortunately, I am having some issues after that. Did you have to create two separate UVC VICs (video interface collections) to do this? That's the only way I have been able to get Linux to recognize the two inputs.

Our eventual hope is to feed the image data from the two sensors into the FX3 GPIF II via an FPGA and then stream it as two independent outputs (unlike Section 9 of AN75779, which talks about interleaving the data). Our reason for this is that the two sensors have different resolutions and frame rates.

Madhu Sudhan,

Is there a good example project that makes use of the CyU3PUsbMapStream function that you called out?

Thanks,

-Brian
