Is there any guidance for testing a vision pipeline and applications without hardware in the loop?

For example, if we have a test environment in Isaac Sim or Habitat or what have you it would be useful to have a virtual camera model we could use so we can iterate across a growing set of benchmarks rather than do manual validation with physical hardware.

    MichaelMentele changed the title to Synthetic Cam Data for No-Hardware-in-the-Loop Automated Tests / Simulation.

    MichaelMentele
    There's no way to do that currently, since the main point of running the pipeline is to run it on the device's VPU. The host-device transfer and the VPU itself are where most issues arise, so testing without the device would remove exactly the issues you'd want to catch.

    Thanks,
    Jaka

    16 days later

    I suppose I can generalize this to simulating an aligned RGB + depth stream, or to using pre-recorded streams. I'll just do that!
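A minimal sketch of that idea: a replay source that yields pre-recorded, aligned RGB + depth frame pairs so the rest of the pipeline can be exercised in tests without a camera attached. The class name, the `.npz` storage layout, and the `rgb`/`depth` array keys are my assumptions here, not anything from an existing library.

```python
import numpy as np
from pathlib import Path
from typing import Iterator, List, Tuple

class RecordedRGBDSource:
    """Replays pre-recorded, aligned RGB + depth frame pairs in place of a
    physical camera. Hypothetical format: each .npz file holds an 'rgb'
    array (H, W, 3, uint8) and a 'depth' array (H, W, uint16)."""

    def __init__(self, frames: List[Tuple[np.ndarray, np.ndarray]]):
        self._frames = list(frames)

    @classmethod
    def from_directory(cls, directory: Path) -> "RecordedRGBDSource":
        # Load frame pairs in filename order so replay is deterministic.
        frames = []
        for path in sorted(directory.glob("*.npz")):
            data = np.load(path)
            frames.append((data["rgb"], data["depth"]))
        return cls(frames)

    def stream(self) -> Iterator[Tuple[np.ndarray, np.ndarray]]:
        for rgb, depth in self._frames:
            # Aligned streams share the same spatial resolution.
            assert rgb.shape[:2] == depth.shape, "RGB and depth must be aligned"
            yield rgb, depth


# Example: synthesize a few frames in place of real recordings.
frames = [
    (np.zeros((480, 640, 3), np.uint8),
     np.full((480, 640), 1000, np.uint16))
    for _ in range(3)
]
source = RecordedRGBDSource(frames)
count = sum(1 for _ in source.stream())
print(count)  # → 3
```

The same interface could later be backed by frames rendered from Isaac Sim or Habitat instead of disk recordings, which keeps the benchmark harness independent of where the frames come from.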