Autonomous Rover: Realsense R200 sensor

I was looking for a visual sensor for my autonomous rover project and the Intel Realsense R200 might be a good fit.

Realsense R200

Looking inside

This tiny sensor pairs two near-infrared cameras with an infrared laser projector, combining stereo vision and structured light to extract depth information from a scene. This allows it to work outdoors as well as indoors.

The infrared cameras have VGA (640x480) resolution and come with a global shutter. There is also a 1080p RGB camera with a rolling shutter. The expected depth range is 0.7m to 3.5m indoors and 0.7m to 8m outdoors.

There is an ASIC inside handling image capture synchronisation and data transfer through USB 3.0.

Realsense R200 internals

Setting up the drivers and SDK

The Realsense SDK (librealsense) is well documented and provides a cross-platform API that runs on Windows, Mac and Linux.

A detailed installation guide is provided on their repository. I wrote a Salt state to deploy it as I already use SaltStack for the configuration of my laptop.

Once compiled, the library comes with examples (see the compiled executables in the bin folder) that let you try out the sensor: capture image streams, play with the configuration and even view point clouds.

Librealsense cpp-capture GUI

This is what it looks like when you run the cpp-capture example. The top left stream is the RGB camera. The top right stream is the disparity map. The bottom streams are the left and right infrared cameras, respectively. You can clearly see the laser projector's pattern on the wall in the infrared streams.
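If you want to grab the same streams from your own program, here is a minimal sketch using the librealsense C++ API (rs.hpp). It is not one of the SDK examples; the stream modes and the centre-pixel read-out are my own assumptions and may need adjusting for your setup.

#include <librealsense/rs.hpp>
#include <cstdint>
#include <cstdio>

int main() try
{
    rs::context ctx;
    if (ctx.get_device_count() == 0) { std::printf("No Realsense device found\n"); return 1; }
    rs::device * dev = ctx.get_device(0);
    std::printf("Using %s, serial %s\n", dev->get_name(), dev->get_serial());
    // Enable the streams shown by cpp-capture: depth, colour and both IR imagers
    dev->enable_stream(rs::stream::depth, 640, 480, rs::format::z16, 30);
    dev->enable_stream(rs::stream::color, 1920, 1080, rs::format::rgb8, 30);
    dev->enable_stream(rs::stream::infrared, 640, 480, rs::format::y8, 30);
    dev->enable_stream(rs::stream::infrared2, 640, 480, rs::format::y8, 30);
    dev->start();
    for (int i = 0; i < 30; ++i)  // grab a few frames to let the auto-exposure settle
    {
        dev->wait_for_frames();
        const uint16_t * depth = (const uint16_t *)dev->get_frame_data(rs::stream::depth);
        // Depth values are in device units; convert to metres with the depth scale
        float dist = depth[240 * 640 + 320] * dev->get_depth_scale();
        std::printf("Distance at image centre: %.2f m\n", dist);
    }
    return 0;
}
catch (const rs::error & e)
{
    std::printf("librealsense error: %s\n", e.what());
    return 1;
}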

ROS support

The ROS support for the Realsense sensors has come a long way since the announcement at ROSCon 2015.

The package now supports the Realsense R200 as the primary sensor targeted at robotics applications, and there is support for the F200 and SR300 too.

To get the package up and running, you just need to clone the repo inside your catkin workspace and build it. Then plug in your R200 sensor and launch the driver:

roslaunch realsense_camera r200_nodelet_default.launch
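The driver publishes the streams as standard ROS image topics, so you can also consume them from your own node. Below is a minimal roscpp subscriber sketch; the topic name /camera/color/image_raw is an assumption based on the package defaults, so check rostopic list if yours differs.

#include <ros/ros.h>
#include <sensor_msgs/Image.h>

// Print the resolution and encoding of each colour frame coming from the driver
void imageCallback(const sensor_msgs::ImageConstPtr & msg)
{
    ROS_INFO("Got %ux%u image, encoding %s", msg->width, msg->height, msg->encoding.c_str());
}

int main(int argc, char ** argv)
{
    ros::init(argc, argv, "r200_listener");
    ros::NodeHandle nh;
    // Topic name assumed from the realsense_camera defaults
    ros::Subscriber sub = nh.subscribe("/camera/color/image_raw", 1, imageCallback);
    ros::spin();
    return 0;
}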

You can then run RQT to display the image streams alongside the dynamic reconfigure interface and tweak the various camera parameters.

RQT

Another possibility is to run the RGBD processing nodes to compute an RGBD point cloud from the sensor streams and visualize it in Rviz:

roslaunch realsense_camera r200_nodelet_rgbd.launch

# in a second terminal
rviz -d realsense_camera/rviz/realsenseRvizConfiguration1.rviz

Rviz

And this is what it should look like.
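The registered point cloud is published as a sensor_msgs/PointCloud2, so you can also process it in your own node. Here is a small sketch that simply reports the size of each incoming cloud; the topic name /camera/depth_registered/points is an assumption based on the rgbd launch file and may differ on your system.

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>

// Report how many points each registered cloud contains
void cloudCallback(const sensor_msgs::PointCloud2ConstPtr & cloud)
{
    ROS_INFO("Cloud of %u x %u points (%s)",
             cloud->width, cloud->height,
             cloud->is_dense ? "dense" : "contains invalid points");
}

int main(int argc, char ** argv)
{
    ros::init(argc, argv, "r200_cloud_listener");
    ros::NodeHandle nh;
    // Topic name assumed; verify with `rostopic list`
    ros::Subscriber sub = nh.subscribe("/camera/depth_registered/points", 1, cloudCallback);
    ros::spin();
    return 0;
}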

Mapping software

Now that we’ve got our depth sensor running on ROS, finding a ROS package to run SLAM on RGBD sensor streams is quite easy. One option is RTAB-Map, which is easy to install: on Ubuntu with ROS Indigo, a quick apt-get install ros-indigo-rtabmap-ros does the trick.

In order to run the mapping software, we first launch our sensor driver to output RGBD data:

roslaunch realsense_camera r200_nodelet_rgbd.launch

In a new terminal, we start the mapping nodes:

roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start" depth_registered_topic:=/camera/depth_registered/sw_registered/image_rect_raw

You should now see the RTAB-Map visualiser.

RTAB viz

When the background turns red, the odometry is lost (yellow is a warning); move back to the last position before it turned red so the algorithm can recover its position.

Once you’re done, you can export the point cloud. You’ll get a .ply file saved in ~/.ros/ that you can convert to a .pcd file and display.

For that you first need to install the Point Cloud Library packages: sudo apt-get install ros-indigo-pcl-ros. Then run:

pcl_ply2pcd input.ply output.pcd
pcl_viewer output.pcd
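If you would rather do the conversion in your own code, the same step can be sketched with the PCL C++ API. This assumes the exported cloud carries colour; use pcl::PointXYZ instead if it does not.

#include <pcl/io/ply_io.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <iostream>

int main(int argc, char ** argv)
{
    if (argc < 3)
    {
        std::cerr << "usage: ply2pcd input.ply output.pcd" << std::endl;
        return 1;
    }
    // RTAB-Map exports coloured points, so load XYZ + RGB
    pcl::PointCloud<pcl::PointXYZRGB> cloud;
    if (pcl::io::loadPLYFile(argv[1], cloud) < 0)
    {
        std::cerr << "could not read " << argv[1] << std::endl;
        return 1;
    }
    std::cout << "Loaded " << cloud.size() << " points" << std::endl;
    // Binary PCD is much smaller and faster to load than ASCII
    pcl::io::savePCDFileBinary(argv[2], cloud);
    return 0;
}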

With this I was able to generate the following point cloud of the Eurobot game table at CVRA.

Point cloud of CVRA's Eurobot table

Wrapping up

The R200 is a promising sensor on paper and it does deliver. It is cheap, compact and feature-rich. The combination of stereo and laser projection is very interesting and enables indoor as well as outdoor applications.

Standard tools and interfaces are really nice to have. With ROS established as a standard middleware in robotics, it has become very easy to play around with sensors and run state-of-the-art solutions to robotics problems (mapping, localisation, navigation). What an exciting time to be a roboticist!

Next, I would like to perform more tests on the R200 outdoors, mainly range measurements. And once my rover is assembled, I will fit the sensor on the rover and record field data.

I also have an SR300 (a close-range, indoor-only Realsense sensor) that I might play with for object recognition.