I've been developing a camera interface using Madhu Sudhan's RAW data driver firmware (FX3 / CX3 Firmware for Streaming RAW Image Data using Cypress Driver) and have been making some progress. I have the OV5647 sensor streaming 5Mp, 1080p, and now 720p RAW10 data to a PC running an application I've written using CyAPI and OpenCV. The PC app collects the streaming data and converts it to RGB format for display. I believe I have a post describing this in more detail.
I have some questions I'm hoping you can help me answer. I'll provide some background.
Of these three resolutions, I see two different behaviors. At 5Mp, the system (firmware and PC app) runs for a random period of time before CyAPI reports error 0x1F (31). Once this error occurs, firmware must be reloaded before any more image data can be transferred. Transfers typically proceed for many tens of minutes before there's an error, and there are only a few detected DMA timeouts (error 0x3E3/995) at the onset of streaming, until the stream is established. This thread discusses the issue in detail: Streaming RAW image data using Cypress Driver part 2, and I've accepted the probable cause proposed there: the app isn't grabbing the data quickly enough at times.
For all resolutions there are DMA timeouts (995) when the stream starts, but for 1080p and 720p these errors continue during streaming. The result is very confusing: for 1080p there seems to be no effect, and the frame rate nearly matches the theoretical rate from the sensor; but for 720p, the frame rate is seriously degraded by the constant errors and firmware recovery. These errors made me wonder whether the frames the CX3 is receiving from the sensor are corrupted. Since it's very difficult to probe the sensor's MIPI signals, I began monitoring the HSYNC test point to count the number of scan lines in each frame. Using a deep-memory scope, I can capture tens or hundreds of frames with enough resolution to see and count the individual horizontal transfer pulses. This has led me to the inconsistency I mentioned in the title.
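For anyone repeating the measurement: rather than counting pulses by eye, the scope's exported sample data can be processed programmatically. A sketch of the idea, assuming the export is a series of digitized HSYNC levels and treating any pulse-free gap longer than a threshold as vertical blanking (the threshold value is an assumption to tune against the actual capture):

```cpp
#include <cstddef>
#include <vector>

// Count HSYNC pulses (rising edges) per frame in a digitized scope trace.
// A gap of more than `blankingSamples` samples with no pulse is treated as
// the vertical blanking interval separating frames.
std::vector<int> linesPerFrame(const std::vector<int>& samples, size_t blankingSamples)
{
    std::vector<int> counts;
    int lines = 0;
    size_t sinceEdge = 0;
    bool prev = false;
    for (int s : samples) {
        const bool cur = (s != 0);
        if (cur && !prev) {                          // rising edge = one scan line
            if (sinceEdge > blankingSamples && lines > 0) {
                counts.push_back(lines);             // long gap: frame boundary crossed
                lines = 0;
            }
            ++lines;
            sinceEdge = 0;
        } else {
            ++sinceEdge;
        }
        prev = cur;
    }
    if (lines > 0)
        counts.push_back(lines);                     // final (possibly partial) frame
    return counts;
}
```

Comparing the returned counts against the nominal line count (e.g. 1944 for 5Mp) gives the fraction of bad frames directly.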
Given the behavior of the system at different resolutions, I'd expect to see perfect frames at 5Mp while the stream is running (since there are no errors, I expect 1944 lines/frame). Conversely, for 1080p and 720p, perhaps incorrect numbers of scan lines at these resolutions cause firmware to lose sync and time out. Since 1080p behaves better than 720p, I'd expect its horizontal transfer counts to deviate from 1080 less often than 720p's deviate from 720. This isn't what I found.
At all three resolutions, I measure that 10 to 30% of the frames contain the wrong number of scan lines. This makes me wonder about the integrity of this test point. Is it reporting the wrong number of lines per frame? Does it not match what firmware "sees"? If this high a percentage of frames has the wrong line count, why don't ALL resolutions continually trigger error 995?
Can anyone explain what's wrong with my reasoning for this apparent paradox?