Synchronizing multiple cameras is crucial for applications like 3D reconstruction, stereo vision, and panoramic video stitching. In this project, we explore how to achieve frame synchronization with sub-millisecond precision using Raspberry Pi and external triggers.
Key challenges addressed:
- Hardware trigger signal generation
- Kernel latency optimization
- Network Time Protocol (NTP) vs Precision Time Protocol (PTP)
By carefully tuning the system, we achieved a synchronization offset as low as 100 µs between frames captured by different cameras.
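One way to verify a figure like this is to compare per-frame capture timestamps from each camera (for example, monotonic-clock timestamps recorded at trigger time) and compute the pairwise offset. The sketch below is illustrative only; the function name and the timestamp values are hypothetical, not taken from our actual tooling:

```python
from statistics import mean

def sync_offsets_us(ts_cam_a, ts_cam_b):
    """Pairwise offset in microseconds between matching frame timestamps.

    ts_cam_a / ts_cam_b: per-frame capture timestamps in nanoseconds,
    e.g. read from the kernel's monotonic clock when the trigger fires.
    """
    return [abs(a - b) / 1_000 for a, b in zip(ts_cam_a, ts_cam_b)]

# Two cameras triggered at a nominal 30 fps (~33.33 ms period);
# camera B lags by a small, varying amount (hypothetical values, ns).
a = [0, 33_333_333, 66_666_666]
b = [80_000, 33_400_000, 66_750_000]
offsets = sync_offsets_us(a, b)
print(f"worst-case offset: {max(offsets):.0f} µs")
print(f"mean offset:       {mean(offsets):.0f} µs")
```

Collecting these offsets over a long run (rather than a few frames) is what justifies quoting a worst-case number.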
Video Transport Protocols
Efficiently transporting the synchronized video streams is just as critical as the capture process. We evolved our approach through two major iterations:
- Version 1: RTP (Real-time Transport Protocol) - Our initial implementation relied on standard RTP. While widely supported, it struggled with strict timing guarantees under network load, occasionally leading to jitter that affected the multi-camera alignment.
- Version 2: AVTP (IEEE 1722) - To address the limitations of RTP, we migrated to the Audio Video Transport Protocol (AVTP). As part of the TSN (Time-Sensitive Networking) suite, AVTP provided the deterministic low-latency transport required for professional-grade synchronization, ensuring that the precise capture timing was preserved throughout the network transmission.
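To make the comparison concrete: RTP carries only a media-clock timestamp in its fixed 12-byte header (RFC 3550), with no bounded-latency guarantee, whereas AVTP adds an explicit presentation time derived from the gPTP network clock. A minimal sketch of packing that RTP header is shown below; payload type 96 is just a common dynamic value for video, and this is a format illustration, not our production packetizer:

```python
import struct

def rtp_header(seq, timestamp, ssrc, payload_type=96, marker=False):
    """Pack a minimal 12-byte RTP fixed header (RFC 3550).

    No CSRC list, no extension, no padding. The timestamp is in media
    clock units (e.g. a 90 kHz clock for video), not wall-clock time --
    which is why RTP alone cannot bound network delivery latency.
    """
    v_p_x_cc = 2 << 6                       # version 2, P=0, X=0, CC=0
    m_pt = (int(marker) << 7) | payload_type
    return struct.pack("!BBHII", v_p_x_cc, m_pt,
                       seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)

hdr = rtp_header(seq=1, timestamp=90_000, ssrc=0x1234ABCD)
print(len(hdr), hex(hdr[0]))  # 12-byte header, first byte 0x80
```

Under AVTP, the equivalent per-packet metadata includes a 64-bit stream ID and a presentation timestamp that receivers compare against the shared gPTP clock, which is what preserves the capture-time relationships end to end.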
