Clevy

  • Aug 18, 2023
  • Hello, it seems that the code there only has the capability to sync one device as the 'self' argument; are you recommending adding a second camera argument to that code?

    My ideal solution is to use an external FSYNC signal to adjust the frame capture within the 33 ms capture window. I would like to use the frame sync mode `camera.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)` (a rough sketch of what I mean is at the end of this post).

    There is documentation at https://docs.luxonis.com/projects/hardware/en/latest/pages/guides/sync_frames/ which mentions using the sync mode to sync multiple devices, but I am having difficulties.

    Is there example code using this mode to sync two cameras during recording? All I have found is code that detects a trigger in order to take a snapshot, which is not very helpful in my case because I must record and use FSYNC to sync frames at the same time.

    The FSYNC code that I have found uses the older DepthAI SDK, where a new queue is made and checked for an element. From what I understand, the queue should receive an element for each frame, so checking `wait(lambda: oakQueue.has() == True)` should always return true while recording. Is there another way to access the information from the FSYNC?

    Thank you!
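
    Here is a rough sketch of what I have in mind, in case it helps clarify the question; the two mono cameras, the board sockets, and 30 fps are placeholders rather than my actual code:

    ```python
    import depthai as dai

    pipeline = dai.Pipeline()

    # The sync mode is set per camera node, so every sensor that should follow
    # the external FSYNC pulse gets the same initialControl call; nothing is
    # passed between cameras as a second argument.
    for socket in (dai.CameraBoardSocket.CAM_B, dai.CameraBoardSocket.CAM_C):
        cam = pipeline.create(dai.node.MonoCamera)
        cam.setBoardSocket(socket)
        cam.setFps(30)
        cam.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)

    # The same pipeline would then be started on the second device as well, so
    # that one external pulse drives frame capture on both devices.
    ```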

  • I would like to use the frame sync mode INPUT as a 'trigger action' in the SDK. In older versions of the SDK the pipeline and queues are configured explicitly in the main code rather than inside OakCamera. When this was the case, `lambda: queues[val].has() == bool` was used to check whether the trigger was sent to FSYNC. How might I set up and check the status of the FSYNC when it cannot be accessed directly?

    thanks!
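
    For reference, this is roughly the queue-polling pattern I mean, written with plain depthai rather than OakCamera; it does not read the FSYNC pin itself, only the frames that arrive as a result of the pulses, and the single mono camera and stream name are placeholders:

    ```python
    import depthai as dai

    pipeline = dai.Pipeline()
    cam = pipeline.create(dai.node.MonoCamera)
    cam.setBoardSocket(dai.CameraBoardSocket.CAM_B)
    xout = pipeline.create(dai.node.XLinkOut)
    xout.setStreamName("mono")
    cam.out.link(xout.input)

    with dai.Device(pipeline) as device:
        q = device.getOutputQueue("mono", maxSize=8, blocking=False)
        while True:
            msg = q.tryGet()    # non-blocking: None when nothing has arrived yet
            if msg is None:
                continue        # the recording loop keeps running instead of waiting
            # Logging the sequence number and timestamp shows whether one frame
            # arrives per FSYNC pulse.
            print(msg.getSequenceNum(), msg.getTimestamp())
    ```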

    • Hello erik

      Thank you for taking this on. Could I get an update on the project?

      • erik replied to this.
      • Hello,

        I am using the FSYNC port and recording, and I would like to be able to use trigger actions. To record, we need the encoding functionality in depthai-sdk version 1.2.3, but the trigger actions are not available in this version. The newer versions, which have the trigger actions, do not have encoding. I tried copying the folders from a newer version into the 1.2.3 folder in my Python libraries, but this did not resolve the issue.

        Any solutions?

        Thank you!

        • No, it is only syncing the cameras on one device, not across the two devices.

          As you can see from the millisecond timestamps displayed in VLC, there is a difference between the two video streams.

          This is the full terminal output:

          • erik replied to this.
          • Hello,

            That code was helpful to look at and gave me ideas on how to edit my working code. However, I was not able to get the cameras to sync and record. The recording code cycles through one device at a time and saves the frame, using the checkSynch function to verify that all the frames from one device are in sync. I have tried saving all the queues to an array and passing that to the checkSynch function, but I am getting an error I don't entirely understand in this context (a sketch of what I am attempting is at the end of this post).

            Could you show an example of how https://github.com/luxonis/depthai-experiments/blob/master/gen2-syncing/host-multiple-OAK-sync.py and https://github.com/luxonis/depthai/blob/main/apps/record/main.py could be combined?

            Thank you for your support
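
            To make the question concrete, this is roughly what I am attempting with the array of queues, loosely following the check_sync idea in host-multiple-OAK-sync.py; the 33 ms threshold and the way state is kept between calls are my own assumptions:

            ```python
            from datetime import timedelta

            THRESHOLD = timedelta(milliseconds=33)   # one frame period at 30 fps

            def checkSynch(queues, latest):
                """Poll every queue; return a stream-name -> frame dict once all
                streams hold a frame whose host-synchronized timestamps fall
                within THRESHOLD of each other.

                `queues` is a flat list of dai.DataOutputQueue objects collected
                from every device; `latest` is a dict the caller keeps between calls.
                """
                for q in queues:
                    msg = q.tryGet()                  # non-blocking
                    if msg is not None:
                        latest[q.getName()] = msg     # keep only the newest frame per stream
                if len(latest) < len(queues):
                    return None                       # still waiting on some streams
                stamps = [f.getTimestamp() for f in latest.values()]
                if max(stamps) - min(stamps) <= THRESHOLD:
                    synced = dict(latest)
                    latest.clear()                    # start collecting the next set
                    return synced                     # one time-aligned frame per stream
                # Otherwise drop the oldest frame so that stream can catch up next call.
                oldest = min(latest, key=lambda k: latest[k].getTimestamp())
                del latest[oldest]
                return None

            # Inside the recording loop (sketch):
            # pending = {}
            # frames = checkSynch(all_queues, pending)
            # if frames is not None:
            #     save(frames)   # write one frame per stream to its file
            ```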

            • erik replied to this.
            • Thank you for your response. I would like to use timestamp syncing. I assume this should be implemented at the beginning of the recording to align the start of the videos, and then continue monitoring throughout the recording. If so, how would the additional frames be discarded?

              clevy
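
              To illustrate what I mean by discarding frames, this is the shape I imagine the start-alignment step taking; the 33 ms window and the blocking q.get() calls are assumptions on my part:

              ```python
              from datetime import timedelta

              WINDOW = timedelta(milliseconds=33)

              def align_start(queues):
                  """Read from each queue and throw away the leading frames of the
                  streams that started early, until every stream's current frame
                  falls inside WINDOW; returns the first synchronized set."""
                  current = {q.getName(): q.get() for q in queues}   # one frame per stream
                  while True:
                      stamps = {n: f.getTimestamp() for n, f in current.items()}
                      newest = max(stamps.values())
                      lagging = [n for n, t in stamps.items() if newest - t > WINDOW]
                      if not lagging:
                          return current                   # aligned: start writing files here
                      for q in queues:
                          if q.getName() in lagging:
                              current[q.getName()] = q.get()   # discard old frame, fetch next
              ```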

              • erik replied to this.
              • Hello,

                I have two OAK-D-PRO-POE cameras which I am recording from, saving the six video streams. We need the recordings to be synced with each other to below 33 ms of variation. I have been able to use the FSYNC to trigger the start of recording, but it is not lining the recordings up: there is a disparity of between 40 and 500 ms between the streams coming from the two devices. The way the code works, the cameras must disconnect and reconnect after the sync signal is received. Ideally the cameras would be synced with a pulse sent through the FSYNC, but once the recording has started I am no longer able to read the sync port. This is the code I am currently running. Is there a way I could record and sync with the FSYNC simultaneously? (A rough sketch of what I have in mind is at the end of this post.)

                Code used: synch Code

                Thanks!!
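
                For context, this is roughly the shape I imagine a combined pipeline taking, with the cameras in INPUT frame-sync mode feeding VideoEncoder nodes so that encoding keeps running while capture stays locked to the external pulse; the sockets, 30 fps, H.265 profile, and file handling are placeholders rather than my actual code:

                ```python
                import depthai as dai

                pipeline = dai.Pipeline()
                names = []
                for socket, name in [(dai.CameraBoardSocket.CAM_B, "left"),
                                     (dai.CameraBoardSocket.CAM_C, "right")]:
                    cam = pipeline.create(dai.node.MonoCamera)
                    cam.setBoardSocket(socket)
                    cam.setFps(30)
                    # Capture stays locked to the external FSYNC pulse for the whole
                    # recording instead of the pulse being a one-off start trigger.
                    cam.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)

                    enc = pipeline.create(dai.node.VideoEncoder)
                    enc.setDefaultProfilePreset(30, dai.VideoEncoderProperties.Profile.H265_MAIN)
                    cam.out.link(enc.input)

                    xout = pipeline.create(dai.node.XLinkOut)
                    xout.setStreamName(name)
                    enc.bitstream.link(xout.input)
                    names.append(name)

                # The same pipeline is started on each device; every device records
                # while its sensors follow the shared FSYNC signal.
                with dai.Device(pipeline) as device:
                    queues = {n: device.getOutputQueue(n, maxSize=30, blocking=True)
                              for n in names}
                    files = {n: open(n + ".h265", "wb") for n in names}
                    try:
                        while True:
                            for n, q in queues.items():
                                pkt = q.tryGet()                   # non-blocking, so neither stream stalls
                                if pkt is not None:
                                    files[n].write(pkt.getData())  # raw H.265 bitstream
                    except KeyboardInterrupt:
                        pass
                    finally:
                        for f in files.values():
                            f.close()
                ```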