Hi,
I am facing an issue with SDK 1.3.4 when using multiple threads. I ported a project to SDK 1.3.4 that I had already been using with SDK 1.3.1. I did not change anything in the source code; I just rebuilt it with 1.3.4.
In the firmware we receive commands in the HID callback. When a command arrives in the callback, I set an event; another thread waits for that event and, once it receives it, performs the work. This simple scheme does not work if I build the project with SDK 1.3.3 or SDK 1.3.4. Please help us resolve this issue at the earliest.
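For reference, the handshake described above maps onto the FX3 RTOS event-group APIs. A minimal sketch of the pattern (it needs the FX3 SDK headers to build; the event-group name `glHidEvent` and the flag name are hypothetical, not taken from the original source):

```
#include "cyu3os.h"

#define HID_CMD_EVENT (1 << 0)

static CyU3PEvent glHidEvent;  /* hypothetical event group              */
                               /* created once at init with:            */
                               /*   CyU3PEventCreate (&glHidEvent);     */

/* Called from the HID callback context: signal the worker thread. */
void HidCommandReceived (void)
{
    CyU3PEventSet (&glHidEvent, HID_CMD_EVENT, CYU3P_EVENT_OR);
}

/* Worker thread: block until the flag is set, then handle the command. */
void WorkerThreadLoop (void)
{
    uint32_t flags;
    for (;;)
    {
        if (CyU3PEventGet (&glHidEvent, HID_CMD_EVENT, CYU3P_EVENT_OR_CLEAR,
                           &flags, CYU3P_WAIT_FOREVER) == CY_U3P_SUCCESS)
        {
            /* process the command captured in the callback */
        }
    }
}
```

If this exact pattern behaves differently across SDK versions, comparing the return codes of `CyU3PEventSet`/`CyU3PEventGet` between builds may help narrow down where it breaks.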
Thanks and Regards,
Vignesh Kumar R.
Hi,
I am sending data from a bulk endpoint to the GPIF (Bulk OUT). At the same time, I am sending/receiving data through the UART (register mode) using Tera Term. Now I need to use Control Center (CC) instead of Tera Term; that is, I want to change the firmware (and perhaps the Visual C project) so that the Bulk OUT and UART traffic can run concurrently and both be accessible from Control Center.
I am researching a good way to carry the UART data to CC without corrupting the Bulk OUT data, and vice versa. According to KBA92475 (which, of course, covers Bulk IN), bulk transfers can corrupt control endpoint data, causing errors on the control pipe.
So my question is: is there any suggestion for this goal? Which kind of endpoint would be more reliable for carrying UART traffic to/from CC while the Bulk OUT endpoint is in use, to avoid corruption? Is there any example that could help me with this?
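One possible direction (an assumption on my part, not a confirmed fix): keep the GPIF Bulk OUT path untouched and bridge the UART to a separate pair of USB endpoints through AUTO DMA channels, which Control Center can then open independently of the GPIF pipe. Note that this requires the UART in DMA mode rather than register mode. A sketch of the UART-to-USB direction against the FX3 SDK; the channel name, buffer sizing, and endpoint/socket choice are illustrative, and endpoint configuration plus error handling are omitted:

```
#include "cyu3dma.h"
#include "cyu3uart.h"
#include "cyu3usb.h"

static CyU3PDmaChannel glUartToUsb;   /* hypothetical channel handle */

void SetupUartToUsbChannel (void)
{
    CyU3PDmaChannelConfig_t cfg;
    CyU3PMemSet ((uint8_t *)&cfg, 0, sizeof (cfg));

    cfg.size      = 1024;                         /* DMA buffer size       */
    cfg.count     = 4;                            /* number of buffers     */
    cfg.prodSckId = CY_U3P_LPP_SOCKET_UART_PROD;  /* UART RX produces data */
    cfg.consSckId = CY_U3P_UIB_SOCKET_CONS_2;     /* dedicated IN endpoint */
    cfg.dmaMode   = CY_U3P_DMA_MODE_BYTE;

    /* AUTO channel: data moves without CPU involvement, so the GPIF
       bulk path is unaffected. */
    CyU3PDmaChannelCreate (&glUartToUsb, CY_U3P_DMA_TYPE_AUTO, &cfg);
    CyU3PDmaChannelSetXfer (&glUartToUsb, 0);     /* infinite transfer     */
}
```

The USB-UART example shipped with the FX3 SDK uses this general structure and may be a useful starting point.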
Thanks
Hi, I've referred to the AN75779 document to accomplish this for our project.
Now, we want to change the transmission mechanism:
Dummy state machine:
From the Getting Started document, I understand the mechanism by which the dual sockets yield to each other:
The normal step-by-step behavior when using dual sockets should look like this:
However, each HREF may now carry a different type of data, and I worry this may cause a socket linked-list conflict...
Ex: HREF1 – image, HREF2 – info, HREF3 – image, HREF4 – image
My inference step behavior:
In step 1, DMA Descriptor1 is loaded on socket1 (thread1). However, the next image data is received by socket0, not the expected socket1.
Q1: Will this cause the conflict shown in the following figure and crash the USB transmission?
Q2: If the answer to Q1 is yes, can I use the following commands in the firmware to reset the socket linked list and avoid the conflict?

CyU3PDmaMultiChannelReset(&glChHandleSlFifoPtoU);
CyU3PDmaMultiChannelSetXfer(&glChHandleSlFifoPtoU, 0, 0);
CyU3PDmaMultiChannelSetXfer(&glChHandleSlFifoPtoU, 0, 1);
Thanks for your patience to read it. Any help will be highly appreciated!
Hi
Are there any design examples or documentation on how to use the EZ-USB® GX3 as a USB port replicator?
For example, I want to connect a remote USB device over an Ethernet cable (point to point) to a PC, so that the PC thinks a USB device was connected and loads the driver for the remote device.
Best Regards
Alex
Hello,
We are sending AR1335 image sensor RAW8 data in CY_U3P_CSI_DF_YUV422_8_2 (16-bit) format from the CX3 to the PC over the standard UVC protocol. A green image is streamed on the PC. Now we are trying to build a new filter to recover the actual image on the PC.
From the explanation in the example project and community discussions, we understand that:
* Data goes out of the CX3 (over USB) in the order {Y1,U1},{Y2,V1},{Y3,U3},{Y4,V3}...
* RAW8 to YUV422 is just a format conversion, not a data conversion.
Do we first need to parse the RAW8 back out of the YUV422 format and then convert the actual RAW8 to YUY2 format?
* If yes, could you briefly explain how exactly the RAW8 pixels are mapped into Y1, U1, V1, etc. in the CX3?
* Or is there any way to convert directly from CY_U3P_CSI_DF_YUV422_8_2 to real YUY2 format?
It would be very helpful if you could share any technical document, forum discussion, or reference details/formulas/code on this.
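For what it's worth, my understanding of the data-order note above is that the format setting only changes how the bytes are labeled, not their values: the bytes arriving at the host are still the RAW8 pixels in order, and the green image appears because a YUY2 renderer treats every second byte as chroma. A host-side sketch of one way to fix the display (a hypothetical helper, not from the example project) that expands each RAW8 pixel into a YUY2 pair with neutral chroma:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical host-side helper: expand a RAW8 grayscale buffer into
 * YUY2 (YUV 4:2:2) with neutral chroma, so a standard UVC renderer
 * shows the grayscale image instead of green. out must hold 2*n bytes. */
void raw8_to_yuy2 (const uint8_t *raw, uint8_t *out, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
    {
        out[2 * i]     = raw[i]; /* Y byte carries the pixel value     */
        out[2 * i + 1] = 0x80;   /* U/V byte: 0x80 means no color cast */
    }
}
```

Note that this doubles the byte count, so it belongs in the host filter, not on the CX3.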
With Regards
channabasappa
Hi,
I made a custom board (FX3 plus a camera module).
I connected it and wrote my own source code.
AMCap shows only a black screen, so I checked the debug traffic with Wireshark.
Wireshark confirms that data is being output.
The resolution is 340 (W) x 260 (H) at 30 fps with 12-bit data output.
You can see the data in the Wireshark log.
I tried changing the DMA buffer size (CY_FX_EP_BULK_VIDEO_PKT_SIZE and CY_FX_EP_BULK_VIDEO_PKTS_COUNT) in uvc.h.
If I change the buffer size, the buffers in the data packets come out differently.
Regards,
Jay
Dear all, I have been stuck for many months on my project: the design of a 16-bit grayscale camera. I got the SuperSpeed FX3 kit and started with the example code in AN75779. After multiple trials and errors I see no progress at all; in fact, I am more confused than before. My ultimate aim is to get a 16-bit grayscale image from a sensor via an FPGA, feed it to the FX3 kit, stream it to a PC, and acquire it with a LabVIEW program for post-processing. Since I got lost among the many community posts and do not have much programming experience, I would like to request your help to at least make a successful first step.
1. I wanted to use AN75779 as-is (except for changing the LV, FV, and nRST pin numbers in GPIF) so that I do not add any errors to the original program. I interfaced it with a DE0-Nano FPGA board through a custom interface board and changed the pin numbers in GPIF accordingly (please see the attached GPIF interface definition picture). I use VLC to try to get some video. I wrote a test program in FPGA/Verilog to generate PCLK, LV, FV, and DATA. I started with 8-bit DATA, as the GPIF in the original program is 8-bit (some SignalTap plots of FV and LV are attached). I set it up for 1280 x 720 at 30 fps using a 48 MHz PCLK, and added frame and line blanking so that the final frame rate is 30 fps. I uncommented DEBUG_PRINT_FRAME_COUNT in uvc.h as suggested in other posts, but I always get a frame timer overflow and 0 frames / 0 buffers in the UART debug output, as attached. Is my idea/attempt wrong? I tried both longer and shorter blanking times while keeping the frame rate at roughly 30 fps. If I manage to make this work, I hope I can make some progress toward my actual project goal.
2. I am not using any I2C lines for now. Will this work without I2C connected? I assume so.
If I get this FV/LV/DATA test signal working correctly, I will implement the actual readout with the same sequence and timing and hope it works. Later I will also have to change the resolution, as I have one QVGA and one VGA sensor to interface individually.
Thank you for your kind help.
Regards
Hi,
I understand that the FX3 supports bulk, isochronous, interrupt, and control data transfers.
Does the CX3 also support all of these transfer types?
Best Regards,
Naoaki Morimoto
Hi, I am using a UVC tool to send commands that set the camera image contrast and saturation, but the callback registered with CyU3PUsbRegisterSetupCallback(CyCx3AppUSBSetupCB, CyTrue) does not respond; a log statement added at the start of the function is never printed over the UART. The same UVC tool works fine against an ordinary USB camera. How should the FX3 register its callback so that it responds to such commands?
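For context, on the FX3/CX3 the callback registered with CyU3PUsbRegisterSetupCallback() is invoked for class and vendor requests that the library does not handle itself (the CyTrue flag only offloads standard requests to the library), and UVC contrast/saturation commands arrive as class requests (SET_CUR/GET_CUR) aimed at the VideoControl interface. A minimal sketch of decoding such a request in the callback, following the pattern in the CX3 UVC examples (FX3 SDK headers required; the actual handling is elided):

```
#include "cyu3usb.h"

CyBool_t CyCx3AppUSBSetupCB (uint32_t setupdat0, uint32_t setupdat1)
{
    uint8_t  bmReqType = (uint8_t)(setupdat0 & 0xFF);
    uint8_t  bRequest  = (uint8_t)((setupdat0 >> 8) & 0xFF);
    uint16_t wValue    = (uint16_t)((setupdat0 >> 16) & 0xFFFF);
    uint16_t wIndex    = (uint16_t)(setupdat1 & 0xFFFF);

    (void)bRequest;
    /* UVC contrast/saturation arrive as class requests: wValue's high
       byte is the control selector (e.g. PU_CONTRAST_CONTROL), wIndex's
       high byte the entity ID, low byte the interface number. */
    if ((bmReqType & CY_U3P_USB_TYPE_MASK) == CY_U3P_USB_TYPE_CLASS)
    {
        (void)wValue; (void)wIndex;
        /* ... handle SET_CUR / GET_CUR here ... */
        return CyTrue;    /* request was handled by the application  */
    }
    return CyFalse;       /* let the library handle everything else  */
}
```

Also worth verifying: the callback must be registered before CyU3PConnectState() is called, and the device's UVC descriptors must actually expose the processing-unit controls the tool is trying to set, or the host may never send the request.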
Hi,
We intend to use a camera module (sensor + FPGA) with the FX3 and UVC to the Host like in AN75779.
The module outputs VGA@60fps but has a pretty low pixel clock (20MHz).
I’m trying to check if the module timing is compatible with the GPIF state machine as described in AN75779.
I’m struggling to find information in the spec/appnote/community on the shortest horizontal and vertical blanking times or clock cycles required.
As far as I understand the AN75779 GPIF state machine:
Is my understanding correct?
Is there a minimum time (for a given CPU clock) I should allow for the vertical blanking?
Thanks in advance
JC