Please find my responses to your questions below:
1. It is possible to use two sensors simultaneously, each with a pixel depth of 16 bits. To do so, change the Data Bus Width setting in the Interface Definition tab of GPIF Designer to 32 bits, as shown below:
When you change the data bus width, you must also update the data and address counter limits in the State Machine tab of GPIF Designer using the formula given in section 3.5 of AN75779.
Please refer to section 9 of AN75779 to learn more about adding a second sensor. Section 9.1.2 describes the use of two sensors, each with a pixel depth of 2 bytes. Please go through the entire section 9 of the application note to understand the changes and checklists involved in using an additional sensor.
2. We do not have an example project that streams simultaneously from two 16-bit sensors connected to FX3.
Please let me know if you have more queries on this.
Thank you for your reply. If I change the GPIF to 32 bits, how can I get the data from "CyFxUVCAppInDmaCallback"? I see that the count member of "CyU3PDmaBuffer_t" is 16 bits, so how can I get the 32-bit data? I do not fully understand the relationship between the GPIF bus width and the DMA.
In the project, a many-to-one DMA channel is created. This channel has two producer sockets (both on the P port) and one consumer socket (on the U port). Eight DMA buffers of 16 KB each are allocated to this channel.
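For reference, the channel setup described above looks roughly like the following fragment. The socket IDs, buffer sizes, and names are taken from the AN75779 firmware as I recall it, so treat this as an illustrative sketch rather than a drop-in replacement; check them against your copy of the project.

```c
#include "cyu3system.h"
#include "cyu3dma.h"
#include "cyu3error.h"

/* Sketch of the many-to-one streaming channel (illustrative names) */
CyU3PDmaMultiChannel glChHandleUVCStream;

CyU3PReturnStatus_t CreateUvcStreamChannelSketch(CyU3PDmaMultiCallback_t cb)
{
    CyU3PDmaMultiChannelConfig_t dmaCfg;

    CyU3PMemSet((uint8_t *)&dmaCfg, 0, sizeof(dmaCfg));
    dmaCfg.size          = 16384;                    /* 16 KB per DMA buffer   */
    dmaCfg.count         = 8;                        /* 8 buffers total        */
    dmaCfg.validSckCount = 2;                        /* two producer sockets   */
    dmaCfg.prodSckId[0]  = CY_U3P_PIB_SOCKET_0;      /* GPIF II thread 0       */
    dmaCfg.prodSckId[1]  = CY_U3P_PIB_SOCKET_1;      /* GPIF II thread 1       */
    dmaCfg.consSckId[0]  = CY_U3P_UIB_SOCKET_CONS_3; /* USB streaming endpoint */
    dmaCfg.dmaMode       = CY_U3P_DMA_MODE_BYTE;
    dmaCfg.notification  = CY_U3P_DMA_CB_PROD_EVENT; /* callback per filled buffer */
    dmaCfg.cb            = cb;

    return CyU3PDmaMultiChannelCreate(&glChHandleUVCStream,
            CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE, &dmaCfg);
}
```

The MANUAL channel type is what lets the firmware inspect each buffer and add the UVC header before committing it to the USB side.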
For streaming video with this project, one condition must be met: each video frame must end with a partial buffer.
Whenever a producer socket fills a DMA buffer, or when the last partial buffer of a video frame is received, the callback function is invoked. Your understanding that the count parameter is 16 bits is correct, but this parameter only indicates the number of valid data bytes in the DMA buffer. Its value is therefore either the DMA buffer size itself (full buffer) or less than the DMA buffer size (partial buffer).
At each PCLK, the GPIF II block samples 32 bits (4 bytes) of data and writes them into the DMA buffer associated with the P port socket. This continues until the GPIF II block fills that DMA buffer (16 KB), which invokes the callback function. For the last partial buffer, the state machine is designed such that an interrupt signal is given to the firmware, and the callback function is invoked forcibly by using the API CyU3PDmaSocketSetWrapUp().
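Putting the above together, the producer-event handling in the callback can be sketched as below. The buffer-size macro and function name are illustrative, and the UVC header handling done by the AN75779 firmware is reduced to comments:

```c
#include "cyu3dma.h"
#include "cyu3error.h"

#define UVC_STREAM_BUF_SIZE (16384)  /* must match the channel's buffer size */

/* Simplified producer-event handling: the 16-bit count field reports how
 * many bytes the GPIF II block actually wrote, regardless of bus width. */
void UvcDmaCallbackSketch(CyU3PDmaMultiChannel *chHandle,
        CyU3PDmaCbType_t type, CyU3PDmaCBInput_t *input)
{
    if (type == CY_U3P_DMA_CB_PROD_EVENT)
    {
        CyU3PDmaBuffer_t dmaBuf;

        /* Fetch the buffer just produced; dmaBuf.count is the valid byte count */
        if (CyU3PDmaMultiChannelGetBuffer(chHandle, &dmaBuf, CYU3P_NO_WAIT)
                == CY_U3P_SUCCESS)
        {
            if (dmaBuf.count == UVC_STREAM_BUF_SIZE)
            {
                /* Full buffer: intermediate data within the current frame */
            }
            else
            {
                /* Partial buffer: marks the end of the video frame */
            }
            /* Add the UVC header and commit the buffer toward USB here,
             * as the AN75779 firmware does */
        }
    }
}
```

Note that the 16-bit count never limits the data width: it counts bytes per buffer (at most 16384 here), while the 32-bit bus only determines how many bytes arrive per PCLK.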
Please let me know if you have further queries on this.
Thank you for your reply. When I use dual cameras, I have another problem.
For example, one camera connects to bits 15:0 and the other connects to bits 31:16.
Condition 1: The 32-bit GPIF must receive 32 bits from the two cameras at the same time. That means both cameras must output data simultaneously.
Condition 2: The two cameras output data at different times. One transfer takes data from bits 15:0 and the next takes data from bits 31:16. How can I control this case in firmware?
Which condition is correct? And is there any other condition?
As you may know, when you change the GPIF II data bus width to 32 in the AN75779 project, the GPIOs used are GPIO[0:15], GPIO[33:44], and GPIO[46:49]. This is because the GPIOs in between are used for control signals or have other functions. Please see the following snapshot to confirm this.
You can route the sensor data from the first sensor to GPIO[0:15] and the second sensor's data to GPIO[33:44] and GPIO[46:49].
Also, to use this setup, both sensors must be synchronized: they should have the same PCLK, the same LV and FV transitions, and the same pixel timing (your Condition 1). This is already documented under Figure 51 of AN75779; please refer to the second assumption below the figure. If the sensors are different, then an FPGA would be required between the image sensors and FX3.