BearrGod

  • 16 days ago
  • Joined Apr 11, 2024
  • 0 best answers
  • I'm using the R7M1E7. Here are pictures of it in case I'm mistaken:

    • Hello everyone.

      I'm working with an OAK-FFC 4P SoM, and here is my setup:

      • OV9282 on socket A
      • OV9282 on socket B
      • OV9282 on socket C
      • IMX477 on socket D

      I'm having some trouble because the SoM just won't detect the IMX477 on socket D, despite it being a 4-lane socket. But when I swap sockets A and D (on the Luxonis carrier board), it works fine. Due to hardware constraints, I'd prefer not to modify my hardware.

      Is there a way to fix this or an option somewhere?

      Thank you.

      • Hello everyone. I'm working with the OAK SoM and turned the incoming encoded bitstream into an IP camera using GStreamer (like it was done in depthai-experiments). I essentially ran into the same problem as described here: Github Issue . I was wondering if it is possible to manually change the bitrate without restarting the pipeline.

      • jakaskerl I asked on the Jetson forums and got no answer, but on the Jetson the device clearly enumerates as USB 3.2 Gen 2 (SuperSpeedPlus): after running dmesg --follow, here is what I found.

        [  749.253884] usb 2-2: new SuperSpeedPlus Gen 2 USB device number 4 using tegra-xusb
        [  749.284330] usb 2-2: New USB device found, idVendor=03e7, idProduct=f63b, bcdDevice= 1.00
        [  749.284340] usb 2-2: New USB device strings: Mfr=1, Product=2, SerialNumber=3
        [  749.284346] usb 2-2: Product: Luxonis Device
        [  749.284350] usb 2-2: Manufacturer: Intel Corporation
        [  749.284354] usb 2-2: SerialNumber: 14442C109142AFCF00

        Is it possible that the SoM is not delivering the bandwidth it should? Would you propose a setup to test that?

        • Hello everyone. I'm using a setup consisting of an OAK-FFC 4P SoM connected to a Jetson Orin NX (NVIDIA hardware). I tested the bandwidth of the system, with the USB speed set to Usb::SUPER_PLUS, using the oak_bandwidth_test.py script from depthai-experiments.

          • The Jetson Orin NX accepts up to 10 Gb/s (USB 3.2 Gen 2) in host mode and 5 Gb/s (USB 3.2 Gen 1) in device mode. Since the SoM is the device, I assume the Jetson is able to consume all incoming data. The cable is also USB 3.2 Gen 2 compliant.
          • As you can see in the code, the USB speed was set to SUPER_PLUS.

          I get the following results from the script:

          Downlink: 2414.5 Mb/s
          Uplink: 2229.0 Mb/s

          So the bandwidth is clearly stuck at 5 Gb/s.

          Can anyone help ?
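          As a sanity check, here is some back-of-the-envelope arithmetic comparing the measured figures above with the theoretical payload ceilings of Gen 1 and Gen 2 links (the encoding-overhead factors are the standard USB ones; this is my own arithmetic, not output from the script):

```python
# USB 3.2 Gen 1 signals at 5 Gb/s with 8b/10b encoding, so the raw
# payload ceiling is 4 Gb/s before protocol overhead. Gen 2 signals at
# 10 Gb/s with 128b/132b encoding, giving roughly 9.7 Gb/s.
GEN1_LINE_RATE_MBPS = 5000
GEN2_LINE_RATE_MBPS = 10000

gen1_payload_ceiling = GEN1_LINE_RATE_MBPS * 8 / 10     # 4000 Mb/s
gen2_payload_ceiling = GEN2_LINE_RATE_MBPS * 128 / 132  # ~9697 Mb/s

# Measured values from the oak_bandwidth_test.py run above.
downlink = 2414.5
uplink = 2229.0

print(f"Gen 1 payload ceiling: {gen1_payload_ceiling:.0f} Mb/s")
print(f"Gen 2 payload ceiling: {gen2_payload_ceiling:.0f} Mb/s")
print(f"Measured downlink:     {downlink:.1f} Mb/s")
print(f"Measured uplink:       {uplink:.1f} Mb/s")
```

          Both measured directions sit well under even the Gen 1 payload ceiling, which is at least consistent with the link having trained at Gen 1 speed rather than Gen 2.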

          • Hello everyone. I'm running into quite an issue with my setup. I'm using the OAK SoM Pro for a robotics application; it is connected to an onboard computer through a USB connection. I connected two monochrome OV9282 cameras and a color IMX477. I'm getting:

            • Two OV9282 streams at 800p @ 20 fps for a SLAM application
            • Encoding of the 4K data stream for video recording
            • A color preview image at 1080p

            My SLAM application needs high-rate IMU data (about 200 samples per second), and I'm getting latency and bandwidth issues, which is expected given all the data I'm pulling, but the BNO086 (IMU) data quality is just perfect for my application. So I would like to ask whether Luxonis has an SPI Linux driver for reading the IMU data, so that I can connect the IMU directly to my onboard computer.
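            For what it's worth, a quick estimate suggests a 200 Hz accel + gyro stream is tiny compared with even a slow SPI clock (the per-sample layout and the overhead factor below are my own assumptions, not the BNO086's actual SHTP framing):

```python
# Rough estimate of the raw data rate a 200 Hz IMU stream needs.
SAMPLE_RATE_HZ = 200                 # accel + gyro report rate
BYTES_PER_SAMPLE = 2 * (3 * 4 + 8)   # accel + gyro: 3 float32 axes + 8-byte timestamp each

raw_rate_bps = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * 8
print(f"Raw IMU data rate: {raw_rate_bps / 1000:.1f} kb/s")

# Even with a generous 4x allowance for protocol framing, this is far
# below what a modest 1 MHz SPI clock can carry.
SPI_CLOCK_HZ = 1_000_000
assumed_overhead = 4
print("Fits on 1 MHz SPI:", raw_rate_bps * assumed_overhead < SPI_CLOCK_HZ)
```

            So bandwidth on the SPI side would not be the bottleneck; the open question is purely whether a Linux driver for the sensor exists.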

            • erik I don't think there is one, but it's not a problem since I'm making my own PCB. Thank you.

            • Hello everyone. I browsed through the supported sensors here. I was wondering if there is any support for the OV9281 like there is for the OV9282, since it is essentially the same sensor. Thank you.

            • Hello everyone.
              I'm having trouble with the OAK-FFC 4P regarding auto-exposure.
              Here is my setup:
              • socket CAM_A : IMX577 4056 x 3040 focus:fixed - COLOR
              • socket CAM_B : OV9282 1280 x  800 focus:fixed - MONO
              • socket CAM_C : OV9282 1280 x  800 focus:fixed - MONO
              • socket CAM_D : OV9282 1280 x  800 focus:fixed - MONO

              When I cover the camera on socket B (with black tape), thus turning its image all dark, the image on socket C turns all white.

              The issue is easily reproducible using the cam_test.py script from the depthai-python repository.
              I would like to get some advice or help on how to solve this problem.
              Best regards

            • Hello everyone. I've been browsing through the depthai examples for quite some time now, and I've realized that in all of them the pipeline is constructed before the device is instantiated and is not modified afterwards. So I was wondering if I could insert new elements into my pipeline after the device is created: for example, start encoding video on the fly, or branch the pipeline to do an image encoding and then unlink the unnecessary elements once I'm done.

            • Hello. I'm planning on using data queue callbacks with my IMU data queue, just like here: https://docs.luxonis.com/projects/api/en/latest/samples/host_side/queue_add_callback/ . I made the relevant adaptations to the minimal example provided by depthai-core, but I seem to be doing something wrong. Here is my code:

              #include <iostream>
              #include <opencv4/opencv2/highgui.hpp>
              // Includes common necessary includes for development using depthai library
              #include <depthai/depthai.hpp>
              bool firstTs = false;
              auto baseTs = std::chrono::time_point<std::chrono::steady_clock, std::chrono::steady_clock::duration>();
              
              void imu_cb_dai(std::shared_ptr<dai::ADatatype> callback)
              {
                  using namespace std;
                  using namespace std::chrono;
                  // Use dynamic_pointer_cast so the IMUData pointer shares ownership
                  // with the incoming shared_ptr; wrapping callback.get() in a fresh
                  // shared_ptr would lead to a double free.
                  auto imuData = std::dynamic_pointer_cast<dai::IMUData>(callback);
                  if(imuData)
                  {
                      auto imuPackets = imuData->packets;
                      for(auto& imuPacket : imuPackets) {
                          auto& acceleroValues = imuPacket.acceleroMeter;
                          auto& gyroValues = imuPacket.gyroscope;

                          auto acceleroTs1 = acceleroValues.getTimestampDevice();
                          auto gyroTs1 = gyroValues.getTimestampDevice();
                          if(!firstTs) {
                              baseTs = std::min(acceleroTs1, gyroTs1);
                              firstTs = true;
                          }

                          auto acceleroTs = acceleroTs1 - baseTs;
                          auto gyroTs = gyroTs1 - baseTs;

                          printf("Accelerometer timestamp: %ld ms\n", static_cast<long>(duration_cast<milliseconds>(acceleroTs).count()));
                          // printf("Accelerometer [m/s^2]: x: %.3f y: %.3f z: %.3f \n", acceleroValues.x, acceleroValues.y, acceleroValues.z);
                          // printf("Gyroscope timestamp: %ld ms\n", static_cast<long>(duration_cast<milliseconds>(gyroTs).count()));
                          // printf("Gyroscope [rad/s]: x: %.3f y: %.3f z: %.3f \n", gyroValues.x, gyroValues.y, gyroValues.z);
                      }
                  }
                  else
                      std::cout << "Imu ptr returned null\n";
              }
              
              
              int main() {
              
                  // Create pipeline
                  dai::Pipeline pipeline;
              
                  // Define sources and outputs
                  auto imu = pipeline.create<dai::node::IMU>();
                  auto xlinkOut = pipeline.create<dai::node::XLinkOut>();
              
                  xlinkOut->setStreamName("imu");
              
                  // enable ACCELEROMETER_RAW at 500 hz rate
                  imu->enableIMUSensor(dai::IMUSensor::ACCELEROMETER_RAW, 500);
                  // enable GYROSCOPE_RAW at 400 hz rate
                  imu->enableIMUSensor(dai::IMUSensor::GYROSCOPE_RAW, 400);
                  // it's recommended to set both setBatchReportThreshold and setMaxBatchReports to 20 when integrating in a pipeline with a lot of input/output connections
                  // above this threshold packets will be sent in batch of X, if the host is not blocked and USB bandwidth is available
                  imu->setBatchReportThreshold(1);
                  // maximum number of IMU packets in a batch, if it's reached device will block sending until host can receive it
                  // if lower or equal to batchReportThreshold then the sending is always blocking on device
                  // useful to reduce device's CPU load  and number of lost packets, if CPU load is high on device side due to multiple nodes
                  imu->setMaxBatchReports(10);
              
                  // Link plugins IMU -> XLINK
                  imu->out.link(xlinkOut->input);
              
                  // Pipeline is defined, now we can connect to the device
                  dai::Device d(pipeline);
              
                  auto imuQueue = d.getOutputQueue("imu", 50, false);
                  baseTs = std::chrono::time_point<std::chrono::steady_clock, std::chrono::steady_clock::duration>();
                  imuQueue->addCallback(imu_cb_dai);
                  
              
                  // cv::waitKey only delivers key events while a HighGUI window has
                  // focus; with no window shown it returns -1 immediately and this
                  // loop would spin. Block on stdin instead to keep the callback alive.
                  std::cin.get();
              
                  return 0;
              }
            • jakaskerl The CM4 itself does not have a USB 3 controller; it has USB 2.0 only. So I couldn't reach high bandwidth, and thus I was looking for a way to get the data from the OAK-FFC 4P over a PCIe connection (I'm making a custom board). So I wanted to know how to do that. For example, taking a close look at the OAK-FFC 4P, there are two PCIe ports: which one should I use? Is there a special firmware / software setup?

            • Hello everyone. I'm currently working with a Compute Module 4 and an OAK-FFC 4P. Unfortunately, I couldn't get USB 3 to work on the CM4. I'm wondering how one would transfer images and other data from the OAK-FFC 4P board through a PCIe connection. What would the hardware and software setup be? Thank you.