Howdy!
I've been working on a C++ application to stream video from the DAI camera over RTSP. I am currently using the Oak-D USB camera, which I have confirmed does not show these same issues from Python. After getting the streaming framework itself written and tested, I was hoping to integrate a simple, unprocessed ColorCamera feed. However, on launching the server and connecting with a client, the camera provides 20 frames of video (counted using system prints) before the client connection times out.

I suspected that the problem was not with the streaming itself, but with the camera providing media to the server. I was able to confirm this by removing the streaming from the loop entirely and just attempting to display the ColorCamera feed with OpenCV. Code snippet follows:
// RGB camera with XLink outputs for both the full video stream and the preview stream
auto camRgb = pipeline.create<node::ColorCamera>();
auto xoutVideo = pipeline.create<node::XLinkOut>();
auto xoutPreview = pipeline.create<node::XLinkOut>();

xoutVideo->setStreamName("video");
xoutPreview->setStreamName("preview");

camRgb->setPreviewSize(1920, 1080);
camRgb->setBoardSocket(CameraBoardSocket::RGB);
camRgb->setResolution(ColorCameraProperties::SensorResolution::THE_1080_P);
camRgb->setColorOrder(ColorCameraProperties::ColorOrder::BGR);

camRgb->video.link(xoutVideo->input);
camRgb->preview.link(xoutPreview->input);
// shared_device = make_shared<Device> (pipeline);
device_ = new Device(pipeline);
// output queue for the "video" stream: max size 64, non-blocking
auto video = device_->getOutputQueue("video", 64, false);

liveCtx* ctx_ = new liveCtx;
ctx_->frame_queue = video;
while (true) {
    // blocking call: waits for the next frame from the device queue
    auto videoFrame = ctx_->frame_queue->get<ImgFrame>();
    imshow("video", videoFrame->getCvFrame());
    int key = waitKey(1);
    if (key == 'q' || key == 'Q') { return 0; }
}
where liveCtx is defined as follows:
typedef struct {
    std::shared_ptr<DataOutputQueue> frame_queue;
    GstClockTime timestamp;
} liveCtx;
Frames display on program startup, through the camera's autofocus, before the feed freezes. No errors are thrown and the program continues to run, but the camera does not seem to provide any more frames. Watching dmesg -wH, nothing out of the ordinary appears on the hardware side. I have also tried creating the Device with usb2Mode=true, which did not seem to change anything. I have to assume there is something in my code that is keeping the camera from providing frames, or that fails to keep requesting frames from the camera, but I cannot identify where that would be happening. Any help would be appreciated!
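For reference, the usb2Mode attempt looked roughly like this (from memory, so the exact constructor overload may differ with the depthai-core version; newer releases take a UsbSpeed argument instead of the bool):

// Force USB2 in case the freeze is bandwidth related (made no difference here)
device_ = new Device(pipeline, true);  // usb2Mode = true
// On newer depthai-core versions the equivalent would be:
// device_ = new Device(pipeline, UsbSpeed::HIGH);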