I am working with the CY7C68013A Cypress EZ-USB USB 2.0 device controller. The controller is interfaced with a CMOS image sensor that delivers 320x240 images at approximately 4-5 frames per second. This entire system (Cypress controller + CMOS image sensor) is to be connected to a 32-bit ARM9-based board with a USB 1.1 host, running the Linux operating system. The system works in asynchronous slave-FIFO mode, with the signals interfaced as follows:
CMOS camera PCLK interfaced with SLWR
CMOS camera HSYNC interfaced with SLCS#
CMOS camera VSYNC interfaced to the PA0 bit (Port A)
CMOS data lines interfaced with FD0-FD8
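For context, on the FX2LP side this wiring is normally backed by slave-FIFO setup in TD_Init(). Since the original firmware is not shown, here is only a minimal sketch under my own assumptions: an 8-bit asynchronous slave FIFO feeding bulk IN endpoint EP2 from the internal 48 MHz clock. The endpoint choice and register values are assumptions, not taken from the actual firmware:

```c
#include "fx2regs.h"   /* FX2LP register declarations (Cypress firmware framework) */
#include "syncdly.h"   /* SYNCDELAY macro: settle time between back-to-back SFR writes */

/* Inside TD_Init(): */
IFCONFIG = 0xCB; SYNCDELAY;      /* internal 48 MHz IFCLK, ASYNC = 1, slave-FIFO mode */
EP2CFG   = 0xA0; SYNCDELAY;      /* EP2: valid, IN, bulk, quad-buffered */

FIFORESET = 0x80; SYNCDELAY;     /* NAK all transfers while resetting */
FIFORESET = 0x02; SYNCDELAY;     /* reset the EP2 FIFO */
FIFORESET = 0x00; SYNCDELAY;     /* back to normal operation */

EP2FIFOCFG    = 0x08; SYNCDELAY; /* AUTOIN = 1, WORDWIDE = 0 (8-bit FIFO data bus) */
EP2AUTOINLENH = 0x00; SYNCDELAY;
EP2AUTOINLENL = 0x40; SYNCDELAY; /* auto-commit 64-byte packets (full-speed bulk max) */
```

If the actual firmware differs (for example, manual commit instead of AUTOIN), the comparison against this sketch may help narrow down where frames are being lost.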
I use a Windows-based PC with a USB 2.0 high-speed host for initial development; deployment will be on the USB 1.1 ARM system. Hence, the following code was added to the Cypress device firmware to enumerate the device as USB 1.1 (full speed):
USBCS |= 0x08;      /* DISCON = 1: disconnect from the bus */
CT1 |= 0x02;        /* disable the high-speed chirp so the device enumerates at full speed */
EZUSB_Delay(1500);  /* hold the disconnect long enough for the host to notice */
With the above code segment added in TD_Init(), the Cypress USB device controller reports a 64-byte maximum packet size on the bulk endpoint even when connected to a USB 2.0 high-speed host, which confirms that it now enumerates as a USB 1.1 (full-speed) device.

Now, regarding image display from the CMOS image sensor: I get a clean image when the controller, enumerated as USB 1.1, is connected to the PC, but the image gets distorted when the same setup is connected to the ARM board. On the PC, a 75 KB frame is read every 300 ms; on the ARM it takes 320 ms to read one frame. Is the time difference causing the distortion? Is there any standard code for testing the throughput of the Cypress controller when interfaced with any host?
Regarding the USB line (D+, D-) connections: there are no termination resistors or pull-ups added as per the USB 1.1 specification. Does the Cypress controller require these termination resistors for impedance matching? Are these factors serious enough to cause major data-integrity issues?