• DepthAI-v2
  • Syncing the host clock with a USB-connected device

According to the documentation at

https://docs.luxonis.com/software/depthai-components/device/#Device-Clock-Host%20clock%20syncing

I understand the process of synchronizing the device clock with the host clock for the OAK device. However, I am unclear about how to replicate this test on my USB-connected OAK camera. Could you kindly provide detailed instructions regarding the testing procedure and the required cable connections?

    Hi aiyangsky
    You need at least two devices; then connect the FSIN lines together, as is done on the PoE devices. All devices should then be triggered at the same time (with one of the devices acting as the master clock).

    Any drift should then be accounted for by the sync function and should be clearly visible in the logged timestamps.
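
    In code, the FSIN roles map onto per-camera frame-sync modes (a minimal sketch, assuming DepthAI v2's dai::CameraControl::FrameSyncMode, which also appears in the test code further down; masterCam and slaveCam are hypothetical MonoCamera node handles):

    // One camera on the master device generates the frame-sync pulse on FSIN
    masterCam->initialControl.setFrameSyncMode(dai::CameraControl::FrameSyncMode::OUTPUT);
    // Every other camera wired to the same FSIN line is triggered by that pulse
    slaveCam->initialControl.setFrameSyncMode(dai::CameraControl::FrameSyncMode::INPUT);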

    Thanks,
    Jaka

      jakaskerl

      Connect their FSIN lines? Does that mean all cameras will receive the trigger signal from the master device? How should I understand the time-drift calibration you mentioned? In this scenario, will the timestamp increments between frames remain consistent across the two devices?

      We will conduct testing later as per your description.

      In that case, this differs from our test results. We configured the OAK device to sample at 10 Hz and recorded the timestamp returned with each image. Simultaneously, we ran a timer at the same frequency on the host driving the device, logging a timestamp whenever the timer fired, and compared it with the image timestamp. We observed significant drift in the difference between the two.

      I can provide our test code for reference early tomorrow, around 11:00 Beijing Time (UTC+8).

      Thanks,

      Sky

        aiyangsky

        Here is my test code.

        #include "depthai/depthai.hpp"

        #include <chrono>

        #include <opencv2/opencv.hpp>

        #include <time.h>

        #include <iostream>

        #include <thread>

        #include <iostream> // std::cout

        #include <thread> // std::thread

        #include <mutex> // std::mutex, std::unique_lock

        #include <condition_variable> // std::condition_variable

        #include <time.h>

        std::mutex timerMutex;

        std::condition_variable_any timerMutexEmpty;

        struct timespec current_time_s;

        static void timerCallback(int signo, siginfo_t *si, void *data)

        {

        std::lock_guard<std::mutex> locker(timerMutex);

        clock_gettime(CLOCK_REALTIME, &current_time_s);

        timerMutexEmpty.notify_all();

        }

        void getSystemTime()

        {

        std::ofstream File("systime.csv", std::ios::app);

        File << "system_time" << std::endl;

        struct sigevent evp;

        struct itimerspec ts;

        timer_t timer;

        int ret;

        evp.sigev_value.sival_ptr = &timer;

        evp.sigev_notify = SIGEV_SIGNAL;

        evp.sigev_signo = SIGALRM;

        struct sigaction sa;

        sa.sa_flags = SA_SIGINFO;

        sa.sa_sigaction = timerCallback;

        sigaction(SIGALRM, &sa, NULL);

        ret = timer_create(CLOCK_REALTIME, &evp, &timer);

        ts.it_interval.tv_sec = 0;

        ts.it_interval.tv_nsec = 100000000;

        ts.it_value.tv_sec = 0;

        ts.it_value.tv_nsec = 100000000;

        ret = timer_settime(timer, 0, &ts, NULL);

        while (true)

        {

        std::lock_guard<std::mutex> locker(timerMutex);

        timerMutexEmpty.wait(timerMutex);

        uint64_t current_utc_time = current_time_s.tv_sec * 1000000000ULL + current_time_s.tv_nsec;

        File << std::fixed << std::setprecision(9) << current_utc_time / 1e9 << std::endl;

        }

        timer_delete(timer);

        return;

        }

        int main()

        {

        std::ofstream File("OAK_time.csv", std::ios::app);

        File << "CameraC_time" << std::endl;

        // Create pipeline

        dai::Pipeline pipeline;

        // Define sources and outputs

        // cam_b

        auto camB = pipeline.create<dai::node::MonoCamera>();

        camB->setBoardSocket(dai::CameraBoardSocket::CAM_B);

        camB->setResolution(dai::MonoCameraProperties::SensorResolution::THE_400_P);

        camB->setFps(10);

        auto BOut = pipeline.create<dai::node::XLinkOut>();

        BOut->setStreamName("CAM_B");

        camB->out.link(BOut->input);

        dai::Device device(pipeline);

        std::thread thread_systemtime(&getSystemTime);

        cv::Mat BMat;

        auto imageB = device.getOutputQueue("CAM_B", 1, false);

        bool is_firt_frame = true;

        uint64_t boot_utc_time;

        while (true)

        {

        auto B = imageB->get<dai::ImgFrame>();

        BMat = B->getCvFrame();

        auto tsB = B->getTimestamp().time_since_epoch().count();

        if (is_firt_frame)

        {

        is_firt_frame = true;

        struct timespec boot_time;

        struct timespec current_time;

        // 获取系统启动后的时间

        if (clock_gettime(CLOCK_BOOTTIME, &boot_time) != 0)

        {

        throw std::runtime_error("Failed to get boot time");

        }

        // 获取当前时间

        if (clock_gettime(CLOCK_REALTIME, &current_time) != 0)

        {

        throw std::runtime_error("Failed to get current time");

        }

        boot_utc_time = current_time.tv_sec * 1000000000ULL + current_time.tv_nsec - (boot_time.tv_sec * 1000000000ULL + boot_time.tv_nsec);

        }

        uint64_t oak_utc_timestamp = boot_utc_time + tsB;

        File << std::fixed << std::setprecision(9) << oak_utc_timestamp / 1e9 << std::endl;

        }

        return 0;

        }
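
        For reference, dai::ImgFrame also exposes getTimestampDevice() (the raw device clock) alongside getTimestamp() (the device clock mapped onto the host clock), so their per-frame difference shows the correction the sync algorithm applies, independent of any host-side timer jitter. A minimal sketch, assuming a recent DepthAI v2 release and reusing the imageB queue from the code above:

        auto B = imageB->get<dai::ImgFrame>();
        // Host-synchronized timestamp vs. raw device-clock timestamp
        auto tsHost = B->getTimestamp().time_since_epoch();
        auto tsDev = B->getTimestampDevice().time_since_epoch();
        double offsetMs = std::chrono::duration<double, std::milli>(tsHost - tsDev).count();
        printf("host-sync offset: %.3f ms\n", offsetMs);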

        6 days later

        jakaskerl

        Hello, following your description, our hardware connection is as shown in the figure below: the onboard OAK device is the master clock, and the OAK-FFC-4P connected via USB is the slave device.

        We connected the FSIN line of the onboard OAK device to the USB-connected camera and measured the time difference between the two OAK devices. It seems they are not synchronized?

        This is my test code:

        
        #include <cstdio>
        #include <ctime>
        #include <csignal>
        #include <atomic>
        #include <mutex>
        #include <map>
        #include <vector>
        #include <chrono>
        #include <fstream>
        
        #include <depthai/depthai.hpp>
        
        
        std::atomic<bool> is_system_running;
        
        
        static void sigintHandler(int signal)
        {
        
            if (signal == SIGINT)
            {
                is_system_running.store(false);
            } 
        }
        
        
        class TicToc
        {
        public:
            TicToc()
            {
                tic();
            }
        
            void tic()
            {
                start_ = std::chrono::system_clock::now();
            }
        
            double toc()
            {
                end_ = std::chrono::system_clock::now();
                std::chrono::duration<double> elapsed_seconds = end_ - start_;
                return elapsed_seconds.count() * 1000;
            }
        
        private:
            std::chrono::time_point<std::chrono::system_clock> start_, end_;
        };
        
        
        class OAKDevice
        {
        public:
            OAKDevice() = default;
            ~OAKDevice()
            {
                imu_queue.reset();
                img_queue.clear();
                depth_queue.clear();
            }
        
            // Create the depthai pipeline
            void createPipeline(dai::Pipeline& pipeline)
            {
                pipeline.setXLinkChunkSize(0);  // disable chunking of data sent from the device
                pipeline.setSippBufferSize(0);
        
           
                // 参考文档: https://docs.luxonis.com/projects/api/en/latest/components/nodes/stereo_depth/
        
                std::map<std::string, std::shared_ptr<dai::node::MonoCamera>> camera_node;
                std::map<std::string, std::shared_ptr<dai::node::XLinkOut>> xout_img_node;
                // std::map<std::string, std::shared_ptr<dai::node::StereoDepth>> stereo_node;
                // std::map<std::string, std::shared_ptr<dai::node::XLinkOut>> xout_depth_node;
                std::shared_ptr<dai::node::Sync> sync_node = pipeline.create<dai::node::Sync>();
                std::shared_ptr<dai::node::MessageDemux> demux_node =
                    pipeline.create<dai::node::MessageDemux>();
                sync_node->out.link(demux_node->input);
        
        
                std::map<std::string, dai::CameraBoardSocket> cam_socket_opts = {
                    {"CAM_A", dai::CameraBoardSocket::CAM_A},
                    {"CAM_B", dai::CameraBoardSocket::CAM_B},
                    {"CAM_C", dai::CameraBoardSocket::CAM_C},
                    {"CAM_D", dai::CameraBoardSocket::CAM_D}};
        
        
                for (const auto& cam_name : cam_list)
                {
                    camera_node[cam_name] = pipeline.create<dai::node::MonoCamera>();
                    camera_node[cam_name]->setBoardSocket(cam_socket_opts[cam_name]);
                    camera_node[cam_name]->setResolution(
                        dai::MonoCameraProperties::SensorResolution::THE_400_P);
                    camera_node[cam_name]->setFps(10);
                    
                    // With four cameras (or the A/D pair), socket A drives the frame sync
                    camera_node[cam_name]->out.link(sync_node->inputs[cam_name]);
                    camera_node[cam_name]->initialControl.setFrameSyncMode(cam_name == "CAM_A" ?
                        dai::CameraControl::FrameSyncMode::OUTPUT :
                        dai::CameraControl::FrameSyncMode::INPUT);
                    
                    xout_img_node[cam_name] = pipeline.create<dai::node::XLinkOut>();
                    xout_img_node[cam_name]->setStreamName(cam_name);
                    demux_node->outputs[cam_name].link(xout_img_node[cam_name]->input);
                    // camera_node[cam_name]->out.link(xout_img_node[cam_name]->input);
        
                    /*if(cam_name == "CAM_A" || cam_name == "CAM_C")
                    {
                        stereo_node[cam_name] = pipeline.create<dai::node::StereoDepth>();
                        setStereoProperties(stereo_node[cam_name]);
                        stereo_node[cam_name]->setDepthAlign(cam_socket_opts[cam_name]);
        
                        xout_depth_node[cam_name] = pipeline.create<dai::node::XLinkOut>();
                        xout_depth_node[cam_name]->setStreamName(cam_name + "_depth");
        
                        if(cam_name == "CAM_A")
                        {
                            demux_node->outputs["CAM_A"].link(stereo_node[cam_name]->left);
                            demux_node->outputs["CAM_D"].link(stereo_node[cam_name]->right);
                            stereo_node[cam_name]->depth.link(xout_depth_node[cam_name]->input);
                            stereo_node[cam_name]->rectifiedLeft.link(xout_img_node["CAM_A"]->input);
                            stereo_node[cam_name]->rectifiedRight.link(xout_img_node["CAM_D"]->input);
                        }
                        else
                        {// CAM_C is the front-facing left camera
                            demux_node->outputs["CAM_C"].link(stereo_node[cam_name]->left);
                            demux_node->outputs["CAM_B"].link(stereo_node[cam_name]->right);
                            stereo_node[cam_name]->depth.link(xout_depth_node[cam_name]->input);
                            stereo_node[cam_name]->rectifiedLeft.link(xout_img_node["CAM_C"]->input);
                            stereo_node[cam_name]->rectifiedRight.link(xout_img_node["CAM_B"]->input);
                        }
                    }*/
                }
            }
        
        
            void setStereoProperties(std::shared_ptr<dai::node::StereoDepth>& stereo_node)
            {
                stereo_node->setDefaultProfilePreset(
                    dai::node::StereoDepth::PresetMode::HIGH_DENSITY);
                stereo_node->initialConfig.setConfidenceThreshold(200);    // Known to be best
                stereo_node->setRectifyEdgeFillColor(0);                   // black, to better see the cutout
                stereo_node->initialConfig.setLeftRightCheckThreshold(5);  // Known to be best
                stereo_node->setLeftRightCheck(true);
                stereo_node->setExtendedDisparity(false);
                stereo_node->setSubpixel(true); 
            }
        
        
            const std::vector<std::string> cam_list{"CAM_C"};
        
        
            std::shared_ptr<dai::DataOutputQueue> imu_queue;
            std::map<std::string, std::shared_ptr<dai::DataOutputQueue>> img_queue;
            std::map<std::string, std::shared_ptr<dai::DataOutputQueue>> depth_queue;
        };
        
        
        
        void device2Thread()
        {
            OAKDevice oak_device;
        
            TicToc timer_pipe;
            dai::Pipeline pipeline;
            oak_device.createPipeline(pipeline);
            printf("--Start pipeline2 time: %f ms\n", timer_pipe.toc());
        
            TicToc timer_device;
            auto device_info = dai::DeviceInfo("14442C1061FD91D700");
            dai::Device device(pipeline, device_info, dai::UsbSpeed::SUPER_PLUS);
            printf("--Start device2 time: %f ms\n", timer_deivce.toc());
            
            device.setTimesync(true);
        
            switch (static_cast<int32_t>(device.getUsbSpeed()))
            {
                case 5:
                    printf("--Usb speed: SUPER_PLUS\n");
                    break;
                case 4:
                    printf("--Usb speed: SUPER\n");
                    break;
                default:
                    printf("WARNING: Usb speed is not SUPER!\n");
                    break;
            }
            printf("--Device2 name: %s\n", device.getDeviceName().c_str());
        
            for (const auto& cam_name : oak_device.cam_list)
            {
                oak_device.img_queue[cam_name] =
                    device.getOutputQueue(cam_name, 20, false);
        
            }
            
            std::ofstream File2("oak2.csv", std::ios::app);
            File2 << "#oak2_time" <<std::endl;
            File2 << std::fixed<< std::setprecision(6);
        
            while (is_system_running)
            {
                for (const auto& cam_name : oak_device.cam_list)
                {
                    std::shared_ptr<dai::ImgFrame> img_data =  oak_device.img_queue[cam_name]->get<dai::ImgFrame>();
                    if (img_data == nullptr)
                    {
                        std::chrono::milliseconds dura(1);
                        std::this_thread::sleep_for(dura);
                        continue;
                    }
        
                    auto oak_timestamp = img_data->getTimestamp().time_since_epoch().count();
                    cv::Mat cvImg = img_data->getCvFrame();
        
                    printf("Received device2 %s : %f\n", cam_name.c_str(), 1e-9 * oak_timestamp);
        
                    if (cam_name == "CAM_C")
                    {
                        File2 << 1e-9 * oak_timestamp << std::endl;
        
                    }
                    else if (cam_name == "CAM_B")
                    {
                        //cv::imshow(cam_name, cvImg);
                        // File <<std::fixed<< std::setprecision(9) << oak_utc_timestamp/1e9<<",";
                    }
                    else if (cam_name == "CAM_A")
                    {
                        //cv::imshow(cam_name, cvImg);
                        // File <<std::fixed<< std::setprecision(9) << oak_utc_timestamp/1e9<<","<< current_utc_time/1e9 << std::endl;
                    }
                    else
                    {
                        //cv::imshow(cam_name, cvImg);
                    }
                    //cv::waitKey(1);
                }
            }
            
            return;
        }
        
        
        int main(int argc, char** argv)
        {
            std::signal(SIGINT, &sigintHandler);
            is_system_running.store(true);
            
        
            std::thread thread_device2(&device2Thread);
        
            OAKDevice oak_device;
        
        
            TicToc timer_pipe;
            dai::Pipeline pipeline;
            oak_device.createPipeline(pipeline);
            printf("--Start pipeline time: %f ms\n", timer_pipe.toc());
        
            TicToc timer_device;
            auto device_info = dai::DeviceInfo("14442C1021B243D200");
            dai::Device device(pipeline, device_info, dai::UsbSpeed::SUPER_PLUS);
            printf("--Start device time: %f ms\n", timer_deivce.toc());
            
            device.setTimesync(true);
        
            switch (static_cast<int32_t>(device.getUsbSpeed()))
            {
                case 5:
                    printf("--Usb speed: SUPER_PLUS\n");
                    break;
                case 4:
                    printf("--Usb speed: SUPER\n");
                    break;
                default:
                    printf("WARNING: Usb speed is not SUPER!\n");
                    break;
            }
            printf("--Device name: %s\n", device.getDeviceName().c_str());
        
            // Fetch the sensor output queues
            // oak_device.imu_queue = device.getOutputQueue("imu", 10, false);
            for (const auto& cam_name : oak_device.cam_list)
            {
                oak_device.img_queue[cam_name] = 
                    device.getOutputQueue(cam_name, 20, false);
                // if(cam_name == "CAM_A" || cam_name == "CAM_C")
                // {
                //     oak_device.depth_queue[cam_name] = 
                //         device.getOutputQueue(cam_name + "_depth", 20, false);
                // }
            }
        
        
            std::ofstream File1("oak1.csv", std::ios::app);
            File1 << "#oak1_time" <<std::endl;
            File1 << std::fixed<< std::setprecision(6);
            
            while (is_system_running)
            {
                for (const auto& cam_name : oak_device.cam_list)
                {
                    std::shared_ptr<dai::ImgFrame> img_data =  oak_device.img_queue[cam_name]->get<dai::ImgFrame>();
                    if (img_data == nullptr)
                    {
                        std::chrono::milliseconds dura(1);
                        std::this_thread::sleep_for(dura);
                        continue;
                    }
        
                    auto oak_timestamp = img_data->getTimestamp().time_since_epoch().count();
                    cv::Mat cvImg = img_data->getCvFrame();
        
                    printf("Received %s : %f\n", cam_name.c_str(), 1e-9 * oak_timestamp);
        
                    if (cam_name == "CAM_C")
                    {
                        File1 << 1e-9 * oak_timestamp << std::endl;
                        //cv::imshow(cam_name, cvImg);
                        
                        // printf("OAK UTC timestamp: %.9lf seconds\n", oak_utc_timestamp/1e9);
                        // printf("Current UTC tiemstamp: %9lf seconds\n", current_utc_time/1e9);
        
                    }
                    else if (cam_name == "CAM_B")
                    {
                        //cv::imshow(cam_name, cvImg);
                        // File <<std::fixed<< std::setprecision(9) << oak_utc_timestamp/1e9<<",";
                    }
                    else if (cam_name == "CAM_A")
                    {
                        //cv::imshow(cam_name, cvImg);
                        // File <<std::fixed<< std::setprecision(9) << oak_utc_timestamp/1e9<<","<< current_utc_time/1e9 << std::endl;
                    }
                    else
                    {
                        //cv::imshow(cam_name, cvImg);
                    }
                    //cv::waitKey(1);
                }
            }
        
            thread_device2.join();
        
            return 0;
        }
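
        For reference, a minimal offline sketch for quantifying the drift between the two logs, pairing each oak1.csv entry with the nearest oak2.csv entry (the file names come from the code above; the nearest-match rule and output format are assumptions, loosely mirroring the "Matched:" lines further down):

        #include <cmath>
        #include <cstdio>
        #include <fstream>
        #include <string>
        #include <vector>

        // Read one seconds-value per line, skipping the header (e.g. "#oak1_time")
        static std::vector<double> loadColumn(const std::string& path)
        {
            std::vector<double> v;
            std::ifstream in(path);
            std::string line;
            std::getline(in, line);
            while (std::getline(in, line))
                if (!line.empty()) v.push_back(std::stod(line));
            return v;
        }

        int main()
        {
            auto t1 = loadColumn("oak1.csv");
            auto t2 = loadColumn("oak2.csv");
            std::size_t j = 0;
            for (double a : t1)
            {
                // Advance j to the timestamp in t2 closest to a (both logs are ascending)
                while (j + 1 < t2.size() && std::fabs(t2[j + 1] - a) < std::fabs(t2[j] - a)) ++j;
                if (j < t2.size())
                    printf("Matched: Main = %.6f s, Device2 = %.6f s, Difference = %.6f s\n",
                           a, t2[j], a - t2[j]);
            }
            return 0;
        }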

          Amovlab-matnav
          Can you confirm the triggering happens at the same time? For example, have both cameras capture a still image of an external precise timer. Then we can debug further.

          Thanks,
          Jaka

            6 days later

            Amovlab-matnav
            I don't think your triggering happens at the same time. Please point all cameras at a common precise clock and record an image simultaneously. I think your logic for setting INPUT/OUTPUT is wrong; merely changing that seems to solve the issue for me (see the sketch below).
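
            A minimal sketch of what that change amounts to in the pipeline loop above (is_master is a hypothetical per-device flag; only the master device may drive the shared FSIN line):

            // Only CAM_A on the master device outputs the frame-sync pulse; every
            // other camera, including all cameras on the slave device, listens on FSIN.
            bool drives_fsin = is_master && (cam_name == "CAM_A");
            camera_node[cam_name]->initialControl.setFrameSyncMode(drives_fsin
                ? dai::CameraControl::FrameSyncMode::OUTPUT
                : dai::CameraControl::FrameSyncMode::INPUT);

            With that change, the two devices line up to within roughly 20 µs: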

            Matched: Main = 964733.014101 s, Device2 = 964733.014083 s, Difference = 0.000018 s
            Main CAM_A: current timestamp = 964733.014101 s
            Main CAM_A: current timestamp = 964733.114101 s
            Device2 CAM_A: current timestamp = 964733.114083 s
            Matched: Main = 964733.114101 s, Device2 = 964733.114083 s, Difference = 0.000017 s
            Main CAM_A: current timestamp = 964733.214100 s
            Device2 CAM_A: current timestamp = 964733.214083 s
            Matched: Main = 964733.214100 s, Device2 = 964733.214083 s, Difference = 0.000017 s
            Main CAM_A: current timestamp = 964733.314101 s
            Device2 CAM_A: current timestamp = 964733.314083 s
            Matched: Main = 964733.314101 s, Device2 = 964733.314083 s, Difference = 0.000017 s
            Main CAM_A: current timestamp = 964733.414100 s
            Device2 CAM_A: current timestamp = 964733.414083 s
            Device2 CAM_A: current timestamp = 964733.514083 s
            Matched: Main = 964733.414100 s, Device2 = 964733.414083 s, Difference = 0.000017 s
            Matched: Main = 964733.514100 s, Device2 = 964733.514083 s, Difference = 0.000017 s

            Thanks,
            Jaka