Hi,
We have a project to transfer BT.656 signals to the NVIDIA Jetson TX2 via USB, and we are considering using the FX3 for this. Has anyone tried connecting BT.656 to the FX3? Thanks.
Solved!
Hi,
Unlike BT.601, BT.656 uses SAV and EAV codes (timing reference signals) embedded in the data stream, rather than dedicated sync lines, to convey HV and FV. This is done to reduce the number of interface signals between the camera sensor and the controller. I can suggest two methods to stream BT.656 video:
1. GPIF II provides a compare_data action. This action compares the latest sampled data word against a fixed value and uses the result to decide the next state transition. You can use this cmp_data action to determine whether the current word is active video, a blanking period, or a timing reference signal, and branch the state machine accordingly.
2. You can stream the raw data from the camera sensor and commit it to the host. A custom-built host application then decodes the SAV and EAV codes to distinguish active video, blanking periods, and timing reference signals. In this case, use a vendor-specific driver, because the UVC driver will drop packets that are not in the proper format.
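For the second method, a minimal sketch of the host-side decoding might look like the following. This assumes raw BT.656 bytes have been committed to a buffer on the host; the function and type names here are illustrative, not part of any FX3 or Cypress API. It relies only on the standard BT.656 timing reference format: each code is the sequence FF 00 00 XY, where the XY byte carries the F (field), V (vertical blanking), and H (0 = SAV, 1 = EAV) bits.

```c
/* Hypothetical host-side parser for BT.656 timing reference codes in a
 * raw byte stream received from the FX3. Names are illustrative only. */
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

typedef struct {
    bool field2;  /* F bit: 0 = field 1, 1 = field 2      */
    bool vblank;  /* V bit: 1 = line is in vertical blanking */
    bool eav;     /* H bit: 0 = SAV code, 1 = EAV code     */
} bt656_code;

/* Try to parse a timing reference code (FF 00 00 XY) at offset pos.
 * XY is formatted as 1 F V H P3 P2 P1 P0 (P bits are protection bits).
 * Returns true and fills *out if a valid code starts at pos. */
static bool bt656_parse_code(const uint8_t *buf, size_t len, size_t pos,
                             bt656_code *out)
{
    if (pos + 4 > len)
        return false;
    if (buf[pos] != 0xFF || buf[pos + 1] != 0x00 || buf[pos + 2] != 0x00)
        return false;

    uint8_t xy = buf[pos + 3];
    if (!(xy & 0x80))          /* MSB of XY is always 1 */
        return false;

    out->field2 = (xy >> 6) & 1;
    out->vblank = (xy >> 5) & 1;
    out->eav    = (xy >> 4) & 1;
    return true;
}
```

The application would scan each committed buffer for these codes; bytes between an SAV and the following EAV on a non-blanking line are active video samples. A real implementation should also verify the P3..P0 protection bits to reject corrupted codes.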
Thanks & Regards
Abhinav Garg
Hi,
Could you please mention which interfaces the NVIDIA Jetson TX2 provides for transferring data into it? FX3 has a parallel GPIF interface, which can transfer data in one direction at a time.
Thanks & Regards
Abhinav