What are the sockets that you are using as producer and consumer in the DMA multi-channel?
What other sockets have you used in the application?
FOR FPGA, these sockets are used for configuring the FPGA.
The above 3 sockets are valid.
FOR UVC, these sockets are used for controlling and streaming UVC video:
(CY_U3P_UIB_SOCKET_CONS_0 | CY_FX_EP_VIDEO_CONS_SOCKET) (a bitwise OR is applied)
I have verified that there is no conflict and the same sockets are not used anywhere else.
Any socket other than the FPGA ones is invalid, and I am not able to stream the image.
These UVC sockets are used to create a DMA channel, and the channel is created successfully. Even though the sockets are supposedly invalid, the DMA channel gets created and the DMA transfers also complete successfully.
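For reference, this is roughly how such a many-to-one channel is created with the FX3 SDK (a sketch only, not your exact code: the buffer size, count, and callback name are placeholders, and CY_FX_EP_VIDEO_CONS_SOCKET is assumed to be defined in the project):

```c
/* Sketch against the FX3 SDK (cyu3dma.h); values marked "placeholder"
 * are assumptions, not taken from the original post. */
CyU3PDmaMultiChannelConfig_t dmaCfg;
CyU3PDmaMultiChannel glChHandleUVCStream;
CyU3PReturnStatus_t status;

CyU3PMemSet ((uint8_t *)&dmaCfg, 0, sizeof (dmaCfg));
dmaCfg.size          = 16384;                  /* DMA buffer size (placeholder) */
dmaCfg.count         = 4;                      /* buffers per socket (placeholder) */
dmaCfg.validSckCount = 2;                      /* two GPIF producer sockets */
dmaCfg.prodSckId[0]  = CY_U3P_PIB_SOCKET_0;    /* GPIF thread 0 */
dmaCfg.prodSckId[1]  = CY_U3P_PIB_SOCKET_1;    /* GPIF thread 1 */
dmaCfg.consSckId[0]  = (CyU3PDmaSocketId_t)
        (CY_U3P_UIB_SOCKET_CONS_0 | CY_FX_EP_VIDEO_CONS_SOCKET);
dmaCfg.prodHeader    = 12;                     /* room for the 12-byte UVC header */
dmaCfg.prodFooter    = 4;
dmaCfg.dmaMode       = CY_U3P_DMA_MODE_BYTE;
dmaCfg.notification  = CY_U3P_DMA_CB_PROD_EVENT;
dmaCfg.cb            = CyFxUvcAppDmaCallback;  /* your commit-buffer callback (placeholder name) */

status = CyU3PDmaMultiChannelCreate (&glChHandleUVCStream,
        CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE, &dmaCfg);
```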
How can I go about solving this issue? What debug points can I add to check the flow?
I understand that you are not able to stream the video.
But you said that the data transfer is working successfully. If the data transfer is successful, how can the sockets be invalid?
The reason for not streaming the video may be something else, such as the frame format or the probe control settings.
Please check whether the following debug steps help you.
A UVC implementation requires the camera device to always send exactly the number of bytes per frame that was declared. Anything more or less leads to the frame being dropped. If this happens, you will see a black screen in the application.
Follow the below mentioned steps to debug this issue:
1. Does the camera send out exactly as much data as you configured it for? Probe the FV and LV lines and make sure that the pulse widths are exactly what you expect, i.e.
LV high duration = (number of pixels per line) * (bytes per pixel) / (PCLK frequency)
FV high duration = (LV high duration + line blanking time) * (number of lines per frame)
Certain cameras insert dead bands (blank lines) into the frame, which are also read by FX3. This leads to a larger frame size than what was reported to the PC.
2. Make sure that the vertical blanking period is at least 500 us to ensure a constant frame rate. At the end of a frame, the GPIF state machine interrupts the CPU (INTR_CPU). Once the firmware has committed the partial buffer, the GPIF state machine switches to the other socket. The vertical blanking period should be large enough to allow this switch to complete, so that no data is lost in the next frame. When the frame size does not match what was reported to the PC, the frame will be dropped.
3. Make sure that the PCLK frequency set by the image sensor is less than 100 MHz.
4. Make sure that the GPIF lines are terminated with 22 Ω series resistors. Also make sure that these lines are length-matched to within 500 mils.
5. Are the following fields in the descriptor file cyfxuvcdscr.c, correctly set to what the camera actually gives out?
a. Frame width and height in VS Frame Descriptor
b. Bytes per pixel in VS Format Descriptor
c. Frame rate in VS Frame Descriptor
d. Video format as set by the GUID in VS Format Descriptor
6. Make sure that the maximum video frame size set in the probe control structure (glProbeCtrl) is equal to or greater than the number of bytes you send in one frame. Also make sure that the maximum payload size set in the probe control structure is at least as large as what you send in one payload (which is typically the DMA buffer size).
7. Is the “Error in CyU3PDmaMultiChannelCommitBuffer: code 71” debug message displayed on the UART terminal when streaming video through FX3? If yes, implement the modifications mentioned in the following Knowledge Base article: https://community.cypress.com/docs/DOC-10463
8. You can take a USB trace to make sure that the bmHeaderInfo field of the UVC payload header toggles between 0x8C and 0x8D in subsequent frames.