Indoor autonomous flight with ArduCopter, ROS and Aruco Boards Detection

This wiki page describes how to set up a system capable of indoor autonomous flight. The system is based on a quadcopter with a Raspberry Pi 3 and a Raspberry Pi Camera Module v2. Images from the camera are used to compute pose estimates on the Raspberry Pi, and the results are sent as MAVLink messages to the Flight Controller. The camera looks downward, and on the floor there is an Aruco board like this:


The system uses ROS for all of its tasks. The images from the Raspberry Pi Camera are captured by raspicam_node, the pose estimates are calculated by a modified version of aruco_gridboard, and the relevant messages are sent to the Flight Controller using mavros. All these ROS packages, and others we will see later, run on the Raspberry Pi 3.

The ROS node raspicam_node publishes the camera/image and camera/camera_info topics, the node aruco_gridboard subscribes to these topics and publishes a camera_pose message to the mavros/vision_pose/pose topic, and mavros translates the ROS messages into MAVLink messages and sends them to the Flight Controller.

The SET_GPS_GLOBAL_ORIGIN and SET_HOME_POSITION messages are sent with a script before starting to use the system.
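The script itself is not reproduced on this page. As a minimal sketch of what it could look like, assuming pymavlink is used (the actual script may differ, and the coordinates in the usage example below are placeholders):

```python
# Hedged sketch: send SET_GPS_GLOBAL_ORIGIN and SET_HOME_POSITION once at
# startup, so the EKF has a global origin even though no GPS is present.
# This is an illustration, not the script used by the wiki author.

def to_mavlink_degrees(deg):
    """MAVLink encodes latitude/longitude as int32, degrees * 1e7."""
    return int(deg * 1e7)

def to_mavlink_mm(meters):
    """MAVLink encodes the altitude of these messages as int32 millimeters."""
    return int(meters * 1000)

def send_origin_and_home(conn, lat_deg, lon_deg, alt_m):
    """Send both messages over an open pymavlink connection."""
    lat = to_mavlink_degrees(lat_deg)
    lon = to_mavlink_degrees(lon_deg)
    alt = to_mavlink_mm(alt_m)
    conn.mav.set_gps_global_origin_send(conn.target_system, lat, lon, alt)
    conn.mav.set_home_position_send(
        conn.target_system, lat, lon, alt,
        0, 0, 0,             # local x, y, z
        [1, 0, 0, 0],        # attitude quaternion (w, x, y, z)
        0, 0, 0)             # approach vector x, y, z
```

It would be called with something like `send_origin_and_home(conn, 45.0, 9.0, 100.0)` after opening the connection with `mavutil.mavlink_connection(...)` and waiting for a heartbeat with `conn.wait_heartbeat()`.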

The Flight Controller and the Raspberry Pi 3 on the quadcopter are connected via serial port, whereas the Raspberry Pi 3 and the desktop PC are connected via WiFi. The desktop PC is used only for configuration and visualization purposes; rviz from ROS is used for visualization on the PC.

Components of the system

  • A little quadcopter (160 mm) with a Revolution Flight Controller running ArduCopter 3.7-dev and the following relevant parameters:

SERIAL1_BAUD 921   (the serial port used to connect to Raspberry Pi)
SYSID_MYGCS 1   (to accept control from mavros)
  • On the quadcopter there is a Raspberry Pi 3 (connected to the Flight Controller via serial port) and a Raspberry Pi Camera

  • On the Raspberry Pi there is ROS Kinetic with the raspicam_node, aruco_gridboard and mavros packages.

Instructions to reproduce the system

On the Raspberry Pi 3 on quadcopter

  • Install Ubuntu 16.04 and ROS Kinetic using the Ubiquity Robotics Raspberry Pi images

  • Edit /boot/config.txt to get a higher serial speed on /dev/ttyAMA0 (the connection runs at 921600 baud):

find the row with #init_uart_clock=3000000 and change it in this way:
at the end of the file, comment out all the lines after # Allow UART and Bluetooth ...
then add the line:
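The exact lines are not reproduced above. On a Raspberry Pi 3 the usual way to make /dev/ttyAMA0 usable at 921600 baud is to raise the UART clock and give the PL011 UART back from Bluetooth; a typical /boot/config.txt fragment (these are the commonly used values for this fix, stated here as an assumption rather than taken from this wiki) looks like:

```
# raise the UART clock so high baud rates (921600) are possible
init_uart_clock=16000000

# Allow UART and Bluetooth ... (the original lines here are commented out)

# route the PL011 UART to /dev/ttyAMA0 instead of Bluetooth
enable_uart=1
dtoverlay=pi3-disable-bt
```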
  • Connect the serial port with one telemetry port on the Flight Controller

  • Connect to the PC using WiFi following the instructions on Ubiquity Robotics site

  • Edit the mavros configuration file apm_config.yaml to synchronize the flight controller and companion computer (Raspberry Pi) clocks using MAVLink's SYSTEM_TIME and TIMESYNC messages as in this wiki
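In mavros the relevant keys live in apm_config.yaml under the conn section; a fragment with typical non-zero rates (the exact values here are an assumption, not necessarily the ones used by the author) looks like:

```
conn:
  heartbeat_rate: 1.0
  timeout: 10.0
  timesync_rate: 10.0    # send TIMESYNC at 10 Hz; 0.0 disables it
  system_time_rate: 1.0  # send SYSTEM_TIME at 1 Hz
```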

  • Calibrate the camera following the instructions in this wiki

  • Clone this fork of aruco_gridboard in ~/catkin_ws/src

  • Build everything:

cd ~/catkin_ws
catkin_make

On the desktop PC

  • Install ROS Kinetic on Ubuntu 16.04 (newer versions may work as well, but they have not been tested)

  • Install ros-kinetic-joy-teleop (sudo apt install ros-kinetic-joy-teleop) and configure it for your joystick. We use a joystick instead of an RC transmitter because a 2.4 GHz RC link disturbs the WiFi video streaming. mavros includes a configuration file for the Logitech F710 joystick; in the aruco_gridboard package we added a configuration file for the Xbox One joystick.

  • Install mavros (sudo apt install ros-kinetic-mavros*)

  • If you are not familiar with ROS follow the tutorials

  • Edit ~/.bashrc and append the following line:

export ROS_MASTER_URI="http://ubiquityrobot.local:11311"
  • Create a catkin workspace (on the Raspberry Pi this is not necessary because one is already in the Ubiquity Robotics image)

cd $HOME
mkdir -p catkin_ws/src
cd ~/catkin_ws
catkin_make

On the PC you also have to run a GCS of your choice to configure the quadcopter, see telemetry data, use the MAVLink inspector, set flight modes and give commands. All of these things can also be done via ROS messages and services, but a GCS can be easier.

Starting all the ROS nodes

Now, to start all the nodes the system needs, give the following commands in different terminals (or tabs, opened with CTRL+SHIFT+T). In this example galileo is the PC and ubiquityrobot is the Raspberry Pi on the quadcopter.


ssh ubuntu@ubiquityrobot
ubuntu@ubiquityrobot:~/catkin_ws$ roslaunch aruco_gridboard detection_rpicam.launch


ssh ubuntu@ubiquityrobot
ubuntu@ubiquityrobot:~/catkin_ws$ roslaunch mavros apm.launch fcu_url:=/dev/ttyAMA0:921600 gcs_url:=tcp-l://


ssh ubuntu@ubiquityrobot
ubuntu@ubiquityrobot:~/catkin_ws$ rosrun aruco_gridboard (only after receiving EK2 ...)


andrea@galileo:~/catkin_ws$ rosrun rqt_reconfigure rqt_reconfigure (for setting camera params then exit)
andrea@galileo:~/catkin_ws$ roslaunch mavros_extras teleop.launch


andrea@galileo:~/catkin_ws$ rosrun rviz rviz -d catkin_ws/src/aruco_gridboard/data/aruco_grid.rviz

At this point it should be possible to see /mavros/vision_pose/pose and /mavros/local_position/pose, represented as 3 axes, in rviz; moving the quadcopter with its camera towards the Aruco board, you should see the two poses move close to each other. Connecting the GCS to the quadcopter (tcp 2000) it should be possible to see the quadcopter on the map, set the flight mode and give commands.

If this last point is OK, a first test can be done by arming the quadcopter in Loiter mode, taking off and hovering over the Aruco board with the joystick, then landing.

The last step (for now) is to test a fully autonomous flight using one of the included scripts; to do this, open another terminal or tab:

ssh ubuntu@ubiquityrobot
ubuntu@ubiquityrobot:~/catkin_ws$ rosrun aruco_gridboard

You should see the quadcopter arm, take off, fly along the square and land, as shown in the video at the beginning of this page.
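The included script is not shown on this page. A minimal sketch of how such a square mission could look with rospy and mavros in GUIDED mode (the helper function, side length, altitude and timings below are illustrative assumptions, not the author's script):

```python
# Hedged sketch of an autonomous square flight via mavros; the script shipped
# with aruco_gridboard may differ.

def square_waypoints(side, alt):
    """Corners of a square at constant altitude, returning to the start."""
    return [(0.0, 0.0, alt),
            (side, 0.0, alt),
            (side, side, alt),
            (0.0, side, alt),
            (0.0, 0.0, alt)]

def fly_square(side=1.0, alt=1.0):
    """Arm, take off, fly the square and land (needs a running ROS master)."""
    import rospy
    from geometry_msgs.msg import PoseStamped
    from mavros_msgs.srv import CommandBool, CommandTOL, SetMode

    rospy.init_node("square_flight")
    setpoint = rospy.Publisher("/mavros/setpoint_position/local",
                               PoseStamped, queue_size=1)
    rospy.wait_for_service("/mavros/set_mode")
    set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
    arm = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
    takeoff = rospy.ServiceProxy("/mavros/cmd/takeoff", CommandTOL)

    set_mode(custom_mode="GUIDED")
    arm(True)
    takeoff(altitude=alt)
    rospy.sleep(5.0)                 # crude wait for the takeoff to finish

    rate = rospy.Rate(10)
    for x, y, z in square_waypoints(side, alt):
        pose = PoseStamped()
        pose.pose.position.x = x
        pose.pose.position.y = y
        pose.pose.position.z = z
        for _ in range(50):          # stream each setpoint for ~5 s
            pose.header.stamp = rospy.Time.now()
            setpoint.publish(pose)
            rate.sleep()

    set_mode(custom_mode="LAND")
```

Streaming the same setpoint repeatedly, instead of publishing it once, is deliberate: position setpoints are consumed continuously, and the loop also gives the quadcopter time to reach each corner before the next one is commanded.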