Hi, I have a problem with controlling the camera.
I created a pipeline which encodes video from the camera to H.264, and that works fine.
But when I try to control the camera via the control queue, I receive this error message:
Communication exception - possible device error/misconfiguration. Original message 'Couldn't write data to stream: 'ctrl' (X_LINK_ERROR)'

https://gist.github.com/oto313/0576344e8bdad89f9bd68821c6b80ba5
The pipeline.cpp file shows how I initialize the pipeline.
pipeline.json contains the log messages from creating the pipeline.
controlMessage.json contains the logged message from sending the control request.

Code to send the control message:
try {
    dai::CameraControl ctrl;
    ctrl.setAutoWhiteBalanceMode(mode);
    control_queue_in->send(ctrl);
} catch(const std::exception& e) {
    gst_print("%s", e.what());
    return false;
}
I am on the most recent commit on the main branch (3507400) of depthai-core.
Am I initializing the pipeline wrong, or is there a bug in DepthAI?

    oto313

    Sorry about the trouble, and thanks for the report and the details (and sorry about the delay). I'm not immediately sure what is happening here, but I'm not deep into the code, so it would have to be fairly obvious for me to catch it.

    @Luxonis-Csaba - would you be able to look into this? I know you are deep into the C++ side of the API now.

    oto313 - I'll discuss this offline in our Slack as well.

    Best,
    Brandon

    Hey!
    I will check it out.

    oto313 ctrl.setAutoWhiteBalanceMode(mode);

    I guess you are using mode as dai::CameraControl::AutoWhiteBalanceMode::MODE_HERE, right?

    Only manual focus / autofocus modes can be handled through initialControl for now.
    Here is an example of how it works:

    #include <atomic>
    #include <csignal>
    #include <cstdio>
    #include <fstream>
    #include <iostream>
    
    // Includes common necessary includes for development using depthai library
    #include "depthai/depthai.hpp"
    
    // Flag cleared by the Ctrl+C handler below to stop the main loop
    static std::atomic<bool> alive{true};
    static void sigintHandler(int signum) {
        alive = false;
    }
    
    int main(int argc, char** argv) {
        // Detect signal interrupts (Ctrl + C)
        std::signal(SIGINT, &sigintHandler);
    
        dai::Pipeline pipeline;
    
        // Define a source - color camera
        auto colorCam = pipeline.create<dai::node::ColorCamera>();
        auto xout = pipeline.create<dai::node::XLinkOut>();
        auto encoder = pipeline.create<dai::node::VideoEncoder>();
        auto control = pipeline.create<dai::node::XLinkIn>();
    
        control->setStreamName("ctrl");
        xout->setStreamName("h264");
        colorCam->setInterleaved(true);
        colorCam->setImageOrientation(dai::CameraImageOrientation::ROTATE_180_DEG);
        colorCam->setBoardSocket(dai::CameraBoardSocket::RGB);
        colorCam->setResolution(dai::ColorCameraProperties::SensorResolution::THE_4_K);
        //colorCam->initialControl.setAutoWhiteBalanceMode(dai::CameraControl::AutoWhiteBalanceMode::SHADE);
    
        encoder->setDefaultProfilePreset(3840, 2160, 30, dai::VideoEncoderProperties::Profile::H264_MAIN);
    
        // Create outputs
        colorCam->video.link(encoder->input);
        control->out.link(colorCam->inputControl);
        encoder->bitstream.link(xout->input);
    
        // Pipeline is defined, now we can connect to the device
        dai::Device device(pipeline);
        // Start pipeline
        device.startPipeline();
    
        // Output queue will be used to get the encoded data from the output defined above
        auto camQueue = device.getOutputQueue("h264");
        auto controlQueue = device.getInputQueue("ctrl");
    
        // The .h264 file is a raw stream file (not playable yet)
        auto videoFile = std::ofstream("video.h264", std::ios::binary);
        std::cout << "Press Ctrl+C to stop encoding..." << std::endl;
    
        while(alive) {
            dai::CameraControl ctrl;
            ctrl.setAutoWhiteBalanceMode(dai::CameraControl::AutoWhiteBalanceMode::TWILIGHT);
            controlQueue->send(ctrl);

            auto h264Packet = camQueue->get<dai::ImgFrame>();
            videoFile.write((char*)(h264Packet->getData().data()), h264Packet->getData().size());
        }
    
        std::cout << "To view the encoded data, convert the stream file (.h264) into a video file (.mp4) using a command below:" << std::endl;
        std::cout << "ffmpeg -framerate 30 -i video.h264 -c copy video.mp4" << std::endl;
    
        return 0;
    }

    As you can see,
    colorCam->initialControl.setAutoWhiteBalanceMode(dai::CameraControl::AutoWhiteBalanceMode::SHADE);
    is commented out; the white balance mode has to be sent at runtime through the control queue instead.
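
    For reference, a minimal sketch of what initialControl does cover at build time today (manual focus / autofocus; the exact enum values depend on your depthai-core version):

    // Focus is the kind of control initialControl currently supports at pipeline
    // build time; other controls such as white balance go through the "ctrl"
    // input queue at runtime, as in the example above.
    colorCam->initialControl.setAutoFocusMode(dai::CameraControl::AutoFocusMode::CONTINUOUS_VIDEO);
    // ...or lock the lens to a fixed position instead of autofocus:
    // colorCam->initialControl.setManualFocus(130);  // lens position in the 0..255 range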


    Luxonis-Alex Nice, great support. But it only partially solved my problem. I need to close and open the Luxonis device on demand, so there is a problem with object lifetimes. If I open and close the device twice in a row, I can no longer set parameters, but video still works. If I only open the device once, then the parameter control works.

    There is a class which I implemented as a wrapper around depthai, and I use this object as a singleton, as sketched below.

    Do you know how I should properly handle the object lifetime, or just how to fix this weird behavior?
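
    For context, here is a minimal, simplified sketch of the wrapper idea (just the open/close and control parts; the names are simplified):

    #include <memory>

    #include "depthai/depthai.hpp"

    class CameraWrapper {
       public:
        static CameraWrapper& instance() {
            static CameraWrapper self;  // used as a singleton, as described above
            return self;
        }

        void open(const dai::Pipeline& pipeline) {
            close();  // tear down any previous session completely before reopening
            device = std::make_unique<dai::Device>(pipeline);
            device->startPipeline();  // same pattern as in the example above
            controlQueue = device->getInputQueue("ctrl");
            videoQueue = device->getOutputQueue("h264");
        }

        void close() {
            // Release the queues before the device so nothing outlives it
            controlQueue.reset();
            videoQueue.reset();
            device.reset();
        }

        bool setAutoWhiteBalanceMode(dai::CameraControl::AutoWhiteBalanceMode mode) {
            if(!controlQueue) return false;
            dai::CameraControl ctrl;
            ctrl.setAutoWhiteBalanceMode(mode);
            controlQueue->send(ctrl);
            return true;
        }

       private:
        CameraWrapper() = default;

        std::unique_ptr<dai::Device> device;
        std::shared_ptr<dai::DataInputQueue> controlQueue;
        std::shared_ptr<dai::DataOutputQueue> videoQueue;
    };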