Blank image from FX3

AlNi_1311921

To the Support,

I have been going around in circles trying to make a 400x400 RAW8 sensor work. Here is the list of modifications I have made to the UVC_AN75779 project:

1) Changed the resolution to 400x400 (0x90, 0x01,).

2) Modified the sensor.h and sensor.c files for the new sensor (8 MHz PCLK, LV = 50 us pulse, and FV = 33.3 ms pulse that stays low for 32 us).

3) cyfxuvcdscr.c file: Maximum video or still frame size is 150K bytes (0x00,0x71,0x02,0x00,)

4) Number of bits per pixel = 8 (8-bit RAW).

5) X and Y dimensions of the picture aspect ratio are 0x04 and 0x04 (also tried 01,01; 08,08; and 00,00).

6) uvc.c file: max video frame size in bytes = 160K = 0x00, 0x71, 0x02, 0x00,

Below is the output from Tera Term after loading the .img file and running the AMCap application. AMCap showed a black screen, and under the Options property it showed 0 frames played.

How can I get image data from FX3?

Tera Term:

UsbEventCB: Detected SS USB Connection

USBSetupCB:In SET_FTR 0::1

USBSetupCB:In SET_FTR 0::1

USBSetupCB:In SET_FTR 0::1

UsbEventCB: SUSPEND encountered...

Entering USB Suspend Mode

After starting the AMCap application:

USBSetupCB:In SET_FTR 0::1

Leaving Suspend Mode

Application Started


15 Replies
JayakrishnaT_76
Moderator

Hello,

Please find my comments below:

1. In the cyfxuvcdscr.c file, you have set the maximum video or still frame size as 150K bytes. But for your resolution, the frame size will be 400 * 400 bytes = 160K bytes. Please modify this part of the descriptor to 160K bytes.

2. The UVC driver does not support streaming the RAW8 format. If you need to stream RAW8, you need to develop your own host application for it.

Best Regards,
Jayakrishna

Hi,

Sorry, my mistake: 0x027100 = 160K.

Below is the output with DEBUG_PRINT_FRAME_COUNT enabled:

Application Started

UVC: Completed 0 frames and 0 buffers

UVC: Completed 29 frames and 13 buffers

UVC: Completed 59 frames and 2 buffers

UVC: Completed 88 frames and 8 buffers

UVC: Completed 118 frames and 0 buffers

UVC: Completed 147 frames and 4 buffers

UVC: Completed 176 frames and 10 buffers

UVC: Completed 206 frames and 0 buffers

UVC: Completed 235 frames and 5 buffers

UVC: Completed 264 frames and 11 buffers

UVC: Completed 294 frames and 1 buffers


Hello,

Please let me know the following:

1. What is the host application that you are using for streaming the video?

2. Have you made any changes to the GPIF state machine in the AN75779 project?

Best Regards,
Jayakrishna

Hi,

I used AMCap, GraphEdit, and VLC player. I did not modify the GPIF state machine.

Although I am sending RAW8, the MediaSubtype GUID is still set to YUY2 but with 8-bit data (I changed the number of bits per pixel from 16 to 8). Will this cause trouble in the streaming?

Someone posted that a UVC player can play RAW image data. Can you suggest such a player?

Lastly, I only need to show some image and pass it to someone who can debayer it. What is the best way to do this?

Best Regards,

Alex


Hello,

There are no UVC applications that can stream the RAW8 format. Also, when you use the GUID of the YUY2 format, you cannot change the pixel depth (number of bits per pixel) to 8, because the pixel depth for YUY2 is 16.

To stream RAW8 as YUY2 format using FX3, you need to do the following:

1. In the GPIF state machine, change the data bus width from 8 bits to 16 bits. As the sensor sends out only 8 bits of data, you can either pull up or pull down the remaining 8 lines.

2. When the data bus width used in the GPIF state machine is changed, the counters used for tracking the amount of data in the DMA buffer also need to be changed. You can use the following formula from AN75779 to calculate the count:

Count = [(Producer_Buffer_size (in bits))/data_bus_width] – 1

The above two modifications will make FX3 read and transmit 16 bits of information per pixel. Now you need to change the descriptors so that the host application understands that each pixel is 16 bits. For this, please follow the steps below:

3. Change the pixel depth from 8 to 16. This is required anyway, as the YUY2 format uses 16 bits.

4. Change the min and max bit rate by multiplying the width and height of your frame, the fps, and 16 (bit rate = width * height * fps * pixel depth).

5. Change the still frame size by multiplying the width and height of your frame by 2 (frame size = width * height * pixel depth in bytes).

After doing the above steps, if you try to stream the video using UVC applications such as AMCap, e-CAMView, etc., you will see a distorted video. This is because each pixel carries 8 bits of unwanted information. To stream the video properly, you need to develop your own host application that removes these unwanted 8 bits per pixel.

Please Refer to the following KBA for better understanding:

UVC Troubleshooting Guide – KBA226722

Best Regards,
Jayakrishna

Hi,

Thank you for your guidance and suggestions. Below are the changes I made:

1) Set to 16 bits data bus in GPIF

2) For the count in GPIF, I set the ADDR and DATA count to 8183. Are they correct?

3) Changed to 0x10 bits per pixel in the cyfxuvcdscr.c file

4) Min and max bit rate to 0x00, 0xE0, 0x93, 0x04 in the cyfxuvcdscr.c file

5) Changed the frame size to 0x00, 0xE2, 0x04, 0x00 in the cyfxuvcdscr.c file

After generating a new cyfxgpif2config.h file and building and loading the image onto FX3, I still see a black screen in AMCap with DEBUG_PRINT_FRAME_COUNT turned off.

Can I set the aspect ratio to 04 for both X and Y?

Per AN75779, end of section 2.3.3, how do I modify the sensor data to output 0x80 for the U and V values? This would give me a monochrome image.

How do I set up FX3 to enumerate as a non-UVC camera so that the GraphEdit application can recognize the device? Maybe I can try its debayer filter.

Sincerely Yours,

Alex


Hello,

Please find my comments below:

>> For the count in GPIF, I set the ADDR and DATA count to 8183. Are they correct?

The DMA buffer size used in AN75779 is 16 KB, which is 16384 bytes. So the count should be:

Count = ((16384 * 8) / 16) - 1 = 8191

You need to change the count to 8191.

>> After generating a new cyfxgpif2config.h file and building and loading the image onto FX3, I still see a black screen in AMCap with DEBUG_PRINT_FRAME_COUNT turned off.

1. After modifying the state machine in GPIF Designer, did you build the GPIF project into the workspace where the AN75779 project is saved? Did you change the build settings so that the correct workspace path was chosen?

2. Please turn on DEBUG_PRINT_FRAME_COUNT and send the UART debug logs.

>> Can I set the aspect ratio to 04 for both X and Y?

Yes, you can change the aspect ratio on X and Y.

>> Per AN75779, end of section 2.3.3, how do I modify the sensor data to output 0x80 for the U and V values? This would give me a monochrome image.

To do this, the image sensor registers need to be updated. Please contact the image sensor manufacturer about this.

Please refer to this thread to enumerate FX3 as a non-UVC device and stream the video data using the Control Center and Streamer applications:

FX3 / CX3 Firmware for Streaming RAW Image Data using Cypress Driver

Best Regards,
Jayakrishna

Thank you for your help.

After updating the GPIF header file, I still get a black image in AMCap. Below is the debug frame-count output:

Leaving Suspend Mode

UVC: Completed 0 frames and 0 buffers

Application Started

UVC: Completed 0 frames and 0 buffers

UVC: Completed 29 frames and 4 buffers

UVC: Completed 58 frames and 9 buffers

UVC: Completed 88 frames and 0 buffers

UVC: Completed 117 frames and 5 buffers

UVC: Completed 146 frames and 11 buffers

UVC: Completed 176 frames and 0 buffers

UVC: Completed 205 frames and 6 buffers

UVC: Completed 234 frames and 12 buffers

UVC: Completed 264 frames and 2 buffers

Clear feature request detected...

Application Stopped

P.S. Can you clarify why I need to change both GPIF and the UVC descriptor to 16 bits per pixel for the YUY2 format? In the unmodified AN75779 project, GPIF is set to 8 bits with the YUY2 format while the UVC descriptor is set to 16 bits per pixel. So I assumed that within one descriptor pixel the sensor sends 2 bytes (Y and U/V data) as two separate bytes over the physical 8-bit data bus, one per PCLK cycle. When does GPIF read the second byte after the first? Also, how do the UVC device and the host interact when I change both GPIF and the UVC descriptor to 16 bits with the YUY2 format?

Best Regards,

Alex


Hello,

You are right: the example project in AN75779 uses an 8-bit bus width for GPIF II, but in the descriptor it reports that each pixel is 16 bits. The image sensor used in AN75779 has only an 8-bit output, so the sensor sends out one pixel over 2 PCLK cycles. As the descriptor reports that each pixel is 16 bits, the host application reads 16 bits and treats them as a single pixel.

In your case, one pixel is only 8 bits. But UVC applications cannot stream RAW formats, so we try to stream the RAW video in YUY2 format, which requires the pixel depth in the descriptors to be 16 bits per pixel. If you do not change the GPIF bus width, the host application will read 16 bits that actually contain the information of 2 pixels (as one pixel is 8 bits). To avoid this, we make the GPIF II data bus width 16 bits, then pull up or pull down the unused 8 pins; the sensor sends RAW8 data on the remaining 8 pins of GPIF II. The host application will then read 16 bits of data per pixel, but to view the video properly you need to discard the unwanted 8 bits that were added in the GPIF II block. For this, you need to develop your own custom application.

From the debug logs, I find that the video is being streamed properly. To view the video, please develop your own host application.

Best Regards,
Jayakrishna

Hi Jayakrishna,

Thank you for your detailed explanation; it helps me a lot. Can I verify that the streaming data is valid with the Streamer app or another app?

Also, why are the buffer numbers in the debug log random rather than constant? This new debug log also seems the same as the previous one.


Hello,

Yes, you can use the Streamer application to check whether the streaming data is valid.

>> Also, why are the buffer numbers in the debug log random rather than constant? This new debug log also seems the same as the previous one.

This debug print is correct; it is not random. If you check uvc.c in the AN75779 project folder, you can see the following debug print:

CyU3PDebugPrint (4, "UVC: Completed %d frames and %d buffers\r\n", glFrameCount, (glDmaDone != 0) ? (glDmaDone - 1) : 0);

This is what you see in Tera Term. glFrameCount is the number of frames that have been transferred; it is incremented each time one complete frame is sent out and cleared (i.e., glFrameCount = 0) only when the application is started or stopped, so its value keeps increasing. glDmaDone is the global variable used to track the number of DMA buffers sent out in addition to the complete frames; it is cleared when one complete frame is transmitted and incremented when individual buffers are committed.

The debug prints you receive seem correct, as the number of frames transferred keeps increasing.

Best Regards,
Jayakrishna

Hi Jayakrishna,

Thank you for your expertise. I still don't understand the glDmaDone value. Why does the debug log give random glDmaDone values?

Also, I tried running Streamer, but I did not see an FX3 device for it, only the debug port. When I selected the debug port as the connected device, I got zero successes on both endpoints (Bulk-In and Interrupt). Please help!

Best Regards,

Alex


Hello,

glDmaDone is a global variable that stores the number of DMA buffers committed after a full frame is sent out. The printed values are not random: they are the additional buffers committed after a frame is sent out, and this may not be the same every time. Also, the debug print that outputs this information is called in the UVC application thread (UVCAppThread_Entry). There may be callbacks triggered between statements in UVCAppThread_Entry which delay the print, while the count (glDmaDone) keeps increasing in the meantime.

You will not find the device in the Streamer application if you load your code to FX3. This is because Streamer is designed to communicate with the Cypress driver, not the UVC driver. By loading your firmware, FX3 is bound to the UVC driver, so you cannot see FX3 in the Streamer application. If you still want to verify that the data is streaming properly using Streamer, you need to bind FX3 to the Cypress driver and then start and stop streaming by means of vendor commands. This is discussed in the following thread:

FX3 / CX3 Firmware for Streaming RAW Image Data using Cypress Driver

Best Regards,
Jayakrishna

Hi Jayakrishna,

Thank you very much for all your help and support. I will try the RAW image data link.

Happy Holidays!!!

Alex


Hello,

Thank you for the update. Happy Holidays!

Best Regards,
Jayakrishna