Microsoft provides an extension to the UVC driver that supports more formats than those specified in the UVC 1.1/1.5 specification. Using this feature, raw RGB images can be streamed directly without an intermediate converter. Sample descriptor implementations, along with firmware, are available in the SDK 1.3 firmware folder:
RGB24 (RGB888): ..\1.3\EZ-USB FX3 SDK\1.3\firmware\cx3_examples\cycx3_rgb24
RGB16 (RGB565): ..\1.3\EZ-USB FX3 SDK\1.3\firmware\cx3_examples\cycx3_rgb16
Refer to the following link for the supported formats: http://msdn.microsoft.com/en-us/library/windows/desktop/dd757532(v=vs.85).aspx
In the CX3 chip you cannot convert the format from RGB to YUY2.
It simply accepts the MIPI signal and converts it into parallel signals.
I am using the Aptina AR0132 in my application. It has Bayer12 output; how can I stream it through UVC?
Waiting for your valuable reply.
It is not possible to transmit a Bayer12 format stream over UVC.
You need to convert it into a YUV/RGB format.
Or you can just transmit the Bayer12 stream over a vendor class interface and develop your own host application to read and display the stream.
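If you go the vendor-class route and receive the raw Bayer12 stream on the host, the conversion to RGB happens in your own application. A minimal nearest-neighbour demosaic sketch is below; the RGGB pattern and 12-bit depth are assumptions that must be checked against your AR0132 register configuration.

```c
#include <stdint.h>
#include <stddef.h>

/* Nearest-neighbour demosaic of a 12-bit RGGB Bayer frame into packed
 * RGB24. Illustration only: the pattern (assumed RGGB here) and bit
 * depth must match the sensor setup; width and height must be even. */
void bayer12_rggb_to_rgb24(const uint16_t *bayer, uint8_t *rgb,
                           size_t width, size_t height)
{
    for (size_t y = 0; y < height; y++) {
        size_t by = y & ~(size_t)1;          /* top row of the 2x2 cell */
        for (size_t x = 0; x < width; x++) {
            size_t bx = x & ~(size_t)1;      /* left column of the cell */
            uint16_t r  = bayer[by * width + bx];
            uint16_t g1 = bayer[by * width + bx + 1];
            uint16_t g2 = bayer[(by + 1) * width + bx];
            uint16_t b  = bayer[(by + 1) * width + bx + 1];
            uint8_t *out = &rgb[(y * width + x) * 3];
            out[0] = (uint8_t)(r >> 4);               /* 12-bit -> 8-bit */
            out[1] = (uint8_t)(((g1 + g2) / 2) >> 4); /* average greens  */
            out[2] = (uint8_t)(b >> 4);
        }
    }
}
```

A real viewer would use a proper interpolating demosaic (OpenCV's `cvtColor` with a Bayer conversion code, for example), but this shows the data layout involved.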
Can you please elaborate on how I can stream over a vendor class interface?
Waiting for your valuable reply.
You can use the GPIF II and DMA design available in the UVC example code to fetch the data from your sensor.
You can remove the header-addition part of the code since you wouldn't be using UVC anymore, and hence make this an AUTO DMA channel if need be.
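To illustrate the AUTO-channel change: once firmware no longer inserts the UVC payload header into each buffer, the channel between GPIF II and the USB endpoint can be created as AUTO so the CPU never touches the data. This is a configuration sketch based on the FX3 SDK DMA API; the socket IDs, buffer sizing, and variable names are placeholders from the example projects and must match your own design.

```c
/* Sketch of an AUTO DMA channel from GPIF II to a USB bulk endpoint.
 * Socket IDs and buffer sizing below are illustrative assumptions. */
CyU3PDmaChannel        glDmaChHandle;
CyU3PDmaChannelConfig_t dmaCfg;
CyU3PReturnStatus_t    apiRetStatus;

CyU3PMemSet((uint8_t *)&dmaCfg, 0, sizeof(dmaCfg));
dmaCfg.size         = 16384;                    /* DMA buffer size (bytes) */
dmaCfg.count        = 4;                        /* number of buffers       */
dmaCfg.prodSckId    = CY_U3P_PIB_SOCKET_0;      /* GPIF II producer        */
dmaCfg.consSckId    = CY_U3P_UIB_SOCKET_CONS_3; /* USB consumer socket     */
dmaCfg.dmaMode      = CY_U3P_DMA_MODE_BYTE;
dmaCfg.notification = 0;                        /* no per-buffer callbacks */
dmaCfg.cb           = NULL;

apiRetStatus = CyU3PDmaChannelCreate(&glDmaChHandle,
                                     CY_U3P_DMA_TYPE_AUTO, &dmaCfg);
```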
The major changes would happen in the descriptor file. You can remove all UVC-specific descriptors and only retain the config, interface and endpoint descriptors. Look at the descriptor structure of cyfxbulksrcsink for reference.
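As a rough guide to what "only config, interface and endpoint descriptors" looks like, here is a minimal high-speed configuration descriptor for a single vendor-class bulk-IN interface, modelled on the cyfxbulksrcsink layout. The endpoint address and packet size are placeholders for your design.

```c
#include <stdint.h>

/* Illustrative high-speed configuration descriptor: one vendor-class
 * interface with one bulk-IN endpoint (addresses/sizes are placeholders). */
const uint8_t VendorConfigDscr[] = {
    /* Configuration descriptor */
    0x09,        /* bLength */
    0x02,        /* bDescriptorType: CONFIGURATION */
    0x19, 0x00,  /* wTotalLength: 9 + 9 + 7 = 25 bytes */
    0x01,        /* bNumInterfaces */
    0x01,        /* bConfigurationValue */
    0x00,        /* iConfiguration */
    0x80,        /* bmAttributes: bus powered */
    0x32,        /* bMaxPower: 100 mA */

    /* Interface descriptor */
    0x09,        /* bLength */
    0x04,        /* bDescriptorType: INTERFACE */
    0x00,        /* bInterfaceNumber */
    0x00,        /* bAlternateSetting */
    0x01,        /* bNumEndpoints */
    0xFF,        /* bInterfaceClass: vendor specific */
    0x00,        /* bInterfaceSubClass */
    0x00,        /* bInterfaceProtocol */
    0x00,        /* iInterface */

    /* Endpoint descriptor (bulk IN) */
    0x07,        /* bLength */
    0x05,        /* bDescriptorType: ENDPOINT */
    0x81,        /* bEndpointAddress: EP1 IN */
    0x02,        /* bmAttributes: bulk */
    0x00, 0x02,  /* wMaxPacketSize: 512 */
    0x00         /* bInterval */
};
```

Compare this against the full descriptor file in cyfxbulksrcsink; the SuperSpeed version additionally needs endpoint companion descriptors.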
As a result, you can also remove the code that handles all UVC control requests (in the uvc.c file).
On the host side, you would need to develop an application (like the Control Center or Streamer applications provided by Cypress) that would talk to the cyusb3.sys driver. Your application should be coded to fetch the data from the device (using the Cypress APIs) and then display the received video frames using Microsoft APIs. All this only holds true if you are going to use a Windows host platform.
For Linux/Mac, you'd need to use a third-party driver (like libusb) and code your application on top of it (since Cypress does not provide drivers for these platforms).
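Whichever driver you use, the host-side receive loop has the same shape: keep queuing bulk reads and append each completed transfer to a frame buffer until a full frame's worth of bytes has arrived, then hand the frame off for display. A hypothetical sketch of that accumulation step (the struct and function names are mine, not from any Cypress API):

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical host-side helper: append one completed bulk transfer to
 * the current frame buffer. Returns 1 when the frame is complete, so
 * the caller can pass the full frame to a convert/display routine. */
typedef struct {
    uint8_t *frame;      /* frame-sized destination buffer       */
    size_t   frame_size; /* e.g. width * height * 3 for RGB24    */
    size_t   filled;     /* bytes accumulated so far             */
} FrameAccumulator;

int frame_accumulate(FrameAccumulator *acc, const uint8_t *xfer, size_t len)
{
    size_t space = acc->frame_size - acc->filled;
    size_t take  = len < space ? len : space;  /* don't overrun the frame */
    memcpy(acc->frame + acc->filled, xfer, take);
    acc->filled += take;
    if (acc->filled == acc->frame_size) {
        acc->filled = 0;   /* frame complete; reset for the next one */
        return 1;
    }
    return 0;
}
```

In a real application the transfer buffers would come from `libusb_bulk_transfer` (or the Cypress API on Windows); without a frame header you also need some out-of-band way to find frame boundaries, such as a fixed frame size plus a resynchronization scheme.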
I was wondering whether the example for an RGB implementation that Gayathri posted earlier works for the FX3 as well. It doesn't really seem like it to me, but we have a little bit of hope left.
Doesn't the FX3 contain an ARM9 core? Wouldn't it be possible to use this for data conversion? I just have a really hard time believing that RGB and UVC can't be merged together on the FX3.
Or how about this idea: what would actually happen if I just sent my RGB data to the FX3 and let UVC transfer that data to the host? Sure, I wouldn't have a proper picture using standard software like VLC, but I could grab the pixels with OpenCV and convert them there, couldn't I?
Thanks in advance!
Have you been able to transfer RGB to a player like VLC or VirtualDub using the FX3 UVC sample code? I have struggled with it for more than two months and no video has shown up yet. If it is not feasible, I would just give it up and look for other solutions. Thank you very much.