Visual SLAM (Simultaneous Localization and Mapping) traditionally relies on visual features and visual odometry. But in some mobile robotics use cases—particularly where visual odometry may be unreliable due to motion blur, low texture, or computational constraints—it is beneficial to fall back to wheel odometry alone. In this guide, we’ll explore how to build a Visual SLAM system without visual odometry, relying entirely on wheel odometry for robot motion estimation. We’ll use the Luxonis OAK-D Pro for stereo vision, iRobot Create3 for mobility and odometry, and an NVIDIA Jetson Xavier NX as the main compute platform—all running on ROS 2 Humble.
This minimal yet functional setup is ideal for 2D indoor navigation where the environment is flat and wheel slippage is minimal.
System Configuration Overview
- Compute: NVIDIA Jetson Xavier NX with JetPack 6.0 (Ubuntu 22.04 Tegra)
- Robot Base: iRobot Create® 3, flashed with firmware I.0.0 (ROS 2 Iron)
- Camera: Luxonis OAK-D Pro (stereo + IMU)
- SLAM: rtabmap_ros with visual odometry disabled
- Middleware: ROS 2 Humble
Hookup Instructions:
For hardware wiring and the basic ROS 2 setup between the Create 3 and the Jetson, follow the official hookup guide: Jetson + Create 3 Hookup Guide. We used Ethernet over USB to communicate between the Jetson and the Create 3.
The OAK-D Pro was connected to the Jetson via USB-C. An additional power bank was added to supply extra power to the camera (the Jetson can't power the camera alone).
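To confirm the Ethernet-over-USB link before going further, a quick connectivity check from the Jetson helps. This is a minimal sketch that assumes the Create 3's default Ethernet-over-USB address of 192.168.186.2; adjust if your network configuration differs:
ip addr show                                      # the Ethernet-over-USB interface should be up with a 192.168.186.x address
ping -c 3 192.168.186.2                           # the Create 3's default address on that link
ros2 topic list | grep -E "odom|battery_state"    # Create 3 topics should be visible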

Mechanical setup
The camera was mounted on the Create 3, pointing straight ahead, roughly 10 cm above the wheels.

Objective
We want to build a lightweight Visual SLAM system that:
- Uses wheel odometry from the Create 3.
- Uses RGB-D images (stereo depth + RGB) from the OAK-D Pro.
- Disables visual odometry (VO) in RTAB-Map.
- Builds a 2D (3DoF) map using loop closure + image features.
Software Setup on Jetson
On the Jetson, install ROS 2 Humble by following the official Create 3 guide, with rmw_fastrtps_cpp as the RMW implementation.
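If you set the RMW explicitly, it should match across the Create 3, the Jetson, and any external PC used for visualization. A minimal sketch, assuming a bash shell:
echo 'export RMW_IMPLEMENTATION=rmw_fastrtps_cpp' >> ~/.bashrc   # make the RMW choice persistent for new shells
source ~/.bashrc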
Install depthai_ros_driver with:
sudo apt install ros-$ROS_DISTRO-depthai-ros
Install rtabmap_ros with:
sudo apt install ros-$ROS_DISTRO-rtabmap-ros
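As a quick sanity check that both packages are visible to ROS 2 (package names taken from the apt packages above):
ros2 pkg prefix depthai_ros_driver
ros2 pkg prefix rtabmap_launch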
NTP time synchronization between the Create 3 and the Jetson was necessary; if the clocks drift apart, the wheel odometry and camera messages no longer fall within RTAB-Map's synchronization window.
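One way to do this is to run chrony as an NTP server on the Jetson and point the Create 3 at it (the Create 3 web interface lets you edit its ntp.conf). A minimal sketch, assuming the default Ethernet-over-USB subnet 192.168.186.0/24:
sudo apt install chrony
echo "allow 192.168.186.0/24" | sudo tee -a /etc/chrony/chrony.conf   # let the Create 3 query the Jetson's clock
sudo systemctl restart chrony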
USB device (udev) rules for the OAK-D also need to be installed so the camera can be accessed without root privileges.
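The rule below is the one suggested in the Luxonis documentation (Movidius USB vendor ID 03e7); check the current depthai docs if your device still isn't detected:
echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="03e7", MODE="0666"' | sudo tee /etc/udev/rules.d/80-movidius.rules
sudo udevadm control --reload-rules && sudo udevadm trigger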
Step-by-Step Startup
1. Launch the Camera Node
Use the following launch command to start the Luxonis OAK-D Pro:
ros2 launch depthai_ros_driver camera.launch.py \
pass_tf_args_as_params:=true \
parent_frame:=base_link \
cam_pos_z:=0.1 \
imu_from_descr:=true \
pointcloud.enable:=false
Explanation of Parameters:
- parent_frame:=base_link: Ensures the camera TF is correctly tied to your robot base.
- cam_pos_z:=0.1: Positions the camera 10 cm above the robot base.
- imu_from_descr:=true: Uses the onboard IMU from the OAK.
- pointcloud.enable:=false: Disables the point cloud if you only need RGB-D for SLAM (helps with performance).
Make sure the camera is physically oriented to match the expected axes (+Z up, +X forward).
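Before moving on, it's worth confirming that the camera topics and TF are actually being published. A quick check (topic names assume the default oak camera name used throughout this guide):
ros2 topic list | grep oak          # RGB, stereo and IMU topics should appear
ros2 topic hz /oak/rgb/image_raw    # should report a steady frame rate
ros2 run tf2_tools view_frames      # writes a PDF of the TF tree; the camera frames should hang off base_link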
2. Launch RTAB-Map with Visual Odometry disabled
Start RTAB-Map with the following command:
ros2 launch rtabmap_launch rtabmap.launch.py \
rtabmap_viz:=false \
rgbd_sync:=true \
depth_topic:=/oak/stereo/image_raw \
camera_info_topic:=/oak/rgb/camera_info \
rgb_topic:=/oak/rgb/image_raw \
visual_odometry:=false \
approx_sync:=true \
approx_rgbd_sync:=true \
approx_sync_max_interval:=0.1 \
odom_topic:=odom \
publish_tf_odom:=false \
namespace:=/ \
qos:=2 \
odom_log_level:=warn \
rtabmap_args:="--delete_db_on_start --Reg/Force3DoF true" \
subscribe_rgbd:=false
Key Parameters:
- visual_odometry:=false: Disables RTAB-Map's internal visual odometry.
- odom_topic:=odom: Uses the Create 3's wheel odometry.
- --Reg/Force3DoF true: Limits the pose graph to 2D mapping, suitable for flat environments.
- rgbd_sync, approx_sync: Ensures stereo + RGB data are timestamp-aligned even with slight latency.
- publish_tf_odom:=false: Prevents conflict with the Create 3's odometry TF broadcast.
Make sure that the Create 3's /odom topic is available and being published at a reliable rate (~10 Hz or more is ideal). It can take a while after the Create 3 boots before odometry becomes available.
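A quick way to confirm the rate and that the values actually react to motion (assuming the default /odom topic name):
ros2 topic hz /odom             # should report a steady rate of ~10 Hz or more
ros2 topic echo /odom --once    # inspect one message; drive the robot and repeat to see the pose change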
Testing and Validation
On an external PC connected to the same network as the Jetson, use rviz2 to visualize the SLAM map (/map) and robot pose (/odom, /tf):
Add topics like:
- /map
- /tf
- /oak/rgb/image_raw
- /oak/stereo/image_raw
- /rtabmap/grid_map
Then drive the robot around using the teleoperation node:
ros2 run teleop_twist_keyboard teleop_twist_keyboard
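Once loop closures appear and the occupancy grid looks consistent, the map can optionally be exported with nav2's map saver. This is a sketch assuming the nav2_map_server package is installed and that /map is the grid published by RTAB-Map (as configured above):
sudo apt install ros-$ROS_DISTRO-nav2-map-server
ros2 run nav2_map_server map_saver_cli -f ~/my_map    # writes my_map.pgm and my_map.yaml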
Troubleshooting Tips
If SLAM isn't working, check:
- That /odom is publishing properly and reflects movement.
- That the TF between base_link and camera_link is correctly configured. Use ros2 run tf2_tools view_frames to confirm the transform tree.
- That RGB and depth images are arriving in sync (ros2 topic hz, ros2 topic echo).
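For the sync check in particular, compare the publish rates of the image topics used in the launch command above; rates that differ significantly, or that drop near zero, usually explain missing map updates:
ros2 topic hz /oak/rgb/image_raw
ros2 topic hz /oak/stereo/image_raw
ros2 topic hz /oak/rgb/camera_info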
If the external PC can't detect topics from the Jetson, make sure both machines use the same ROS_DOMAIN_ID and RMW implementation, and that the network between them allows multicast discovery.
If the robot or the camera driver is behaving strangely, first make sure both batteries (the power bank and the Create 3 battery) are charged.
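A quick way to test ROS 2 discovery between the two machines (ros2 multicast is part of the standard ros2 CLI; start the receive side first):
ros2 multicast receive    # on the Jetson
ros2 multicast send       # on the external PC; the Jetson should print the received packet
echo $ROS_DOMAIN_ID       # run on both machines; the values must match (empty means the default domain 0)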
Conclusion
Using the Luxonis OAK-D Pro with only wheel odometry offers a reliable and performant SLAM setup for indoor robots. It's especially effective for low-speed applications or structured environments where wheel slip is minimal. Combined with RTAB-Map's loop closure capabilities, even non-visual odometry can yield globally consistent maps.
The quality of the produced map is still very dependent on wheel odometry accuracy, which is prone to drift over time. This is why fusing with an absolute localization source (e.g. GPS, LPS, LiDAR localization, ...) should be considered for more reliable and accurate operation.