Simulate mmWave radar based drone control in Gazebo with ROS2 and PX4
Tested with:
- Ubuntu 20.04.3 LTS
- ROS2 Foxy
- Gazebo 11.9.0
- px4_ros_com commit 90538d841a383fe9631b7046096f9aa808a43121
- px4_msgs commit 7f89976091235579633935b7ccaab68b2debbe19
- PX4 Autopilot commit d7a962b4269d3ca3d2dcae44da7a37177af1d8cd
Specific commits from pull request #1
Install ROS 2 Foxy (see https://docs.ros.org/en/foxy/Installation.html and https://docs.ros.org/en/foxy/Installation/Ubuntu-Install-Debians.html):
- Set up sources:
sudo apt update && sudo apt install curl gnupg2 lsb-release
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
- Install ROS 2 packages:
sudo apt update
sudo apt install ros-foxy-desktop
sudo apt install python3-colcon-common-extensions
Install Gazebo 11 and the ROS 2 integration (see http://gazebosim.org/tutorials?tut=install_ubuntu&cat=install and http://gazebosim.org/tutorials?tut=ros2_installing&cat=connect_ros):
- Default Gazebo installation:
cd ~
curl -sSL http://get.gazebosim.org | sh
- Install gazebo_ros_pkgs
sudo apt install ros-foxy-gazebo-ros-pkgs
Install the PX4 ROS 2 bridge dependencies (see https://docs.px4.io/master/en/ros/ros2_comm.html):
- Foonathan memory:
cd ~
git clone https://github.com/eProsima/foonathan_memory_vendor.git
cd foonathan_memory_vendor
mkdir build && cd build
cmake ..
sudo cmake --build . --target install
- Fast-RTPS (DDS)
cd ~
git clone --recursive https://github.com/eProsima/Fast-DDS.git -b v2.0.0 ~/FastDDS-2.0.0
cd ~/FastDDS-2.0.0
mkdir build && cd build
cmake -DTHIRDPARTY=ON -DSECURITY=ON ..
make -j$(nproc --all)
sudo make install
- Fast-RTPS-Gen
cd ~
git clone --recursive https://github.com/eProsima/Fast-DDS-Gen.git -b v1.0.4 ~/Fast-RTPS-Gen \
&& cd ~/Fast-RTPS-Gen \
&& ./gradlew assemble \
&& sudo ./gradlew install
- Check install with
which fastrtpsgen
- Download the PX4 source code and run ubuntu.sh with no arguments:
cd ~
git clone -n https://github.com/PX4/PX4-Autopilot.git
cd PX4-Autopilot/
git checkout d7a962b4269d3ca3d2dcae44da7a37177af1d8cd
git submodule update --init --recursive
bash ./Tools/setup/ubuntu.sh
- Log out and back in (or reboot) before attempting to build NuttX targets
- Set up the ROS 2 workspace:
cd ~
mkdir -p ~/px4_ros_com_ros2/src
- Clone the ROS 2 bridge packages px4_ros_com and px4_msgs into ~/px4_ros_com_ros2/src:
cd ~/px4_ros_com_ros2/src
git clone -n https://github.com/PX4/px4_ros_com.git
cd px4_ros_com/
git checkout 90538d841a383fe9631b7046096f9aa808a43121
cd ..
git clone -n https://github.com/PX4/px4_msgs.git
cd px4_msgs/
git checkout 7f89976091235579633935b7ccaab68b2debbe19
- Update uorb message definitions:
cd ~/PX4-Autopilot/msg/tools/
./uorb_to_ros_msgs.py ~/PX4-Autopilot/msg/ ~/px4_ros_com_ros2/src/px4_msgs/msg/
- Run the px4_ros_com ROS 2 workspace build script in verbose mode to catch any errors:
cd ~/px4_ros_com_ros2/src/px4_ros_com/scripts/
./build_ros2_workspace.bash --verbose
(I had a Python import error (pyros-genmsg) that did not show up without the --verbose flag.)
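If you hit the same pyros-genmsg import error, installing the missing Python package with pip may resolve it (a common fix, not verified for every setup):
pip3 install --user pyros-genmsg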
Install QGroundControl (see https://docs.qgroundcontrol.com/master/en/getting_started/download_and_install.html#ubuntu); download QGroundControl.AppImage to ~/Downloads/:
sudo usermod -a -G dialout $USER
sudo apt-get remove modemmanager -y
sudo apt install gstreamer1.0-plugins-bad gstreamer1.0-libav gstreamer1.0-gl -y
cd ~/Downloads/
chmod +x ./QGroundControl.AppImage
./QGroundControl.AppImage (or double click)
The install.sh script builds the ROS 2 workspace and downloads and installs the powerline test setup worlds and models from https://drive.google.com/file/d/1wMf4hJXjVBkhR41Do0fGaT0t_GC8mKW_. If the download within the script fails, download it manually from this link and install the files where the script would.
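If installing manually, a minimal sketch, assuming the archive contains worlds/ and models/ folders and that the destinations follow the default PX4 sitl_gazebo layout (check install.sh for the exact paths it uses):
# Hypothetical source/destination paths - verify against install.sh before copying
cp -r ~/Downloads/worlds/* ~/PX4-Autopilot/Tools/sitl_gazebo/worlds/
cp -r ~/Downloads/models/* ~/PX4-Autopilot/Tools/sitl_gazebo/models/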
cd ~/mmWave_ROS2_PX4_Gazebo/
chmod +x ./install.sh
Execute the install script (from the script directory). If PX4 and px4_ros_com_ros2 are installed in the home directory:
./install.sh ~/PX4-Autopilot/ ~/px4_ros_com_ros2/
Sanity check the installation (see https://docs.px4.io/master/en/ros/ros2_comm.html#sanity-check-the-installation):
- Open a new terminal in the root of the PX4-Autopilot project, and then start a PX4 Gazebo simulation using:
cd ~/PX4-Autopilot/
make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup
or, for an empty world (if no additional worlds/models are installed):
make px4_sitl_rtps gazebo
(Syntax: make <target> <simulator>_<vehiclemodel>__<world>. Prepend HEADLESS=1 to launch without the Gazebo GUI, or PX4_NO_FOLLOW_MODE=1 to launch without the camera following the drone.)
This should build and open the PX4 console in the same terminal, as well as a Gazebo window with the chosen model and world.
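For example, to launch the pylon world without the GUI and without follow mode (combining both variables; either can also be used alone):
HEADLESS=1 PX4_NO_FOLLOW_MODE=1 make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup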
- On a new terminal, source the ROS 2 workspace and then start the micrortps_agent daemon with UDP as the transport protocol:
source ~/px4_ros_com_ros2/install/setup.bash
micrortps_agent -t UDP
- On the original terminal (PX4 console), start the micrortps_client daemon with UDP:
pxh> micrortps_client start -t UDP
(may already be running if the following message is generated: INFO [micrortps_client] Already running Command 'micrortps_client' failed, returned -1.)
- Open a new terminal and start a "listener" using the provided launch file:
source ~/px4_ros_com_ros2/install/setup.bash
ros2 launch px4_ros_com sensor_combined_listener.launch.py
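If the listener prints nothing, a quick sanity check is to list the bridged topics (standard ros2 CLI; assumes the default /fmu namespace used by this px4_ros_com version):
ros2 topic list | grep fmu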
- Optionally, open QGroundControl, which will connect with PX4. From here it is possible to set waypoints and execute missions.
Offboard control (see https://docs.px4.io/master/en/ros/ros2_offboard_control.html and https://github.com/PX4/px4_ros_com/blob/master/src/examples/offboard/offboard_control.cpp):
- If offboard_control.cpp or other files have been edited, re-run the install.sh script (add any new files to the script and to CMakeLists.txt):
cd ~/mmWave_ROS2_PX4_Gazebo/
(chmod +x ./install.sh)
./install.sh
If using the same PX4 and px4_ros_com_ros2 roots as before:
./install.sh ~/PX4-Autopilot/ ~/px4_ros_com_ros2/
- Launch PX4 SITL:
cd ~/PX4-Autopilot/
make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup
Without Gazebo GUI:
HEADLESS=1 make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup
Without drone following:
PX4_NO_FOLLOW_MODE=1 make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup
After PX4 SITL has fully launched, you might need to manually start the microRTPS client in the same terminal:
micrortps_client start -t UDP
It will fail and return -1 if already running.
- Open QGroundControl
- In a new terminal, start the microRTPS agent and offboard control:
source ~/px4_ros_com_ros2/install/setup.bash
micrortps_agent -t UDP &
ros2 run px4_ros_com offboard_control
- In another terminal, start the velocity vector advertiser, lidar-to-mmWave converter, and 3D-to-2D projection nodes:
source ~/px4_ros_com_ros2/install/setup.bash
ros2 launch ~/mmWave_ROS2_PX4_Gazebo/launch/simulate_pointcloud_control_launch.py
- The simulated drone in Gazebo should arm and take off. You may need to restart the vel_ctrl_vec_pub and offboard_control ros2 runs.
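A restart would look like the following; offboard_control is installed into the px4_ros_com package, and the same package is assumed (not verified) for vel_ctrl_vec_pub:
# Stop the running node with Ctrl+C, then in a sourced terminal:
source ~/px4_ros_com_ros2/install/setup.bash
ros2 run px4_ros_com offboard_control
# and in another sourced terminal (package name for vel_ctrl_vec_pub is an assumption):
ros2 run px4_ros_com vel_ctrl_vec_pub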
- Visualize simulated data in rviz2:
rviz2 ~/mmWave_ROS2_PX4_Gazebo/3d_and_2d_pointcloud_rgb.rviz
To enhance the realism of the simulation, it is possible to add wind to the virtual environment. This is done by adding and customizing the wind plugin in the .world file. Below is an example which can be added to the hca_full_pylon_setup.world file; it introduces a mean wind of 3 m/s, a max wind velocity of 6 m/s, and a typical wind direction along the y-axis:
<plugin name='wind_plugin' filename='libgazebo_wind_plugin.so'>
<frameId>base_link</frameId>
<robotNamespace/>
<windVelocityMean>3.0</windVelocityMean>
<windVelocityMax>6.0</windVelocityMax>
<windVelocityVariance>0.25</windVelocityVariance>
<windDirectionMean>0 1 0</windDirectionMean>
<windDirectionVariance>0.25</windDirectionVariance>
<windGustStart>0</windGustStart>
<windGustDuration>0</windGustDuration>
<windGustVelocityMean>0</windGustVelocityMean>
<windGustVelocityMax>20.0</windGustVelocityMax>
<windGustVelocityVariance>0</windGustVelocityVariance>
<windGustDirectionMean>1 0 0</windGustDirectionMean>
<windGustDirectionVariance>0</windGustDirectionVariance>
<windPubTopic>world_wind</windPubTopic>
</plugin>
- General tips on PX4+Gazebo simulation (e.g. wind, vehicle spawn location): https://docs.px4.io/main/en/simulation/gazebo.html
- Trajectory setpoint message: https://github.com/PX4/px4_msgs/blob/ros2/msg/TrajectorySetpoint.msg
- Changed parameters (to fix the "Failsafe enabled: No manual control stick input" warning and the drone not taking off):
pxh> param set NAV_RCL_ACT 0
pxh> param set COM_RCL_EXCEPT 4
NAV_RCL_ACT: curr: 2 -> new: 0
- Local positioning? https://github.com/PX4/px4_msgs/blob/ros2/msg/VehicleLocalPositionSetpoint.msg No; calculate positions in the drone frame and transform them to the world frame.
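For the yaw-only case this is the standard 2D rotation: with drone yaw ψ, a vector (x_b, y_b, z_b) in the drone (body) frame maps to the world frame as
x_w = x_b·cos(ψ) − y_b·sin(ψ)
y_w = x_b·sin(ψ) + y_b·cos(ψ)
z_w = z_b
(add the drone's world position afterwards if an absolute setpoint is needed; full roll/pitch handling requires the complete rotation matrix or quaternion).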
- Add any new ROS2 files to ~/px4_ros_com_ros2/src/px4_ros_com/CMakeLists.txt
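For example, a new node typically needs an add_executable() entry, an ament_target_dependencies() call listing its ROS 2 dependencies, and an entry in the install(TARGETS ...) section, mirroring how the existing example executables (e.g. offboard_control) are declared.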
- Check if the drone is armed? https://github.com/PX4/px4_msgs/blob/ros2/msg/ActuatorArmed.msg No; subscribe to the /fmu/vehicle_status/out topic and monitor arming_state.
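A quick manual check (standard ros2 CLI; topic and field names as above) is to echo the status topic and watch the arming_state field, comparing it against the ARMING_STATE_ARMED constant defined in VehicleStatus.msg:
source ~/px4_ros_com_ros2/install/setup.bash
ros2 topic echo /fmu/vehicle_status/out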
- libignition-common3 error (after a software update?): copy the closest existing libignition-common3 library file and rename the copy to match the missing file.
- If Gazebo does not open, try running gazebo --verbose to troubleshoot. killall gzserver should kill any Gazebo instances. Restart the PC if all else fails.
- Include both iris.sdf and iris.sdf.jinja?
- Implemented laser scanner with Gazebo and ROS2: https://github.com/chapulina/dolly
- Make custom sensor plugin: http://gazebosim.org/tutorials?cat=guided_i&tut=guided_i5
- In ~/px4_ros_com_ros2/src/px4_ros_com/CMakeLists.txt add sensor_msgs under ament_target_dependencies
- After running ./build_ros2_workspace, restart all affected executables (micrortps_agent, offboard_control, vel_vec_ctrl_pub). Gazebo PX4 SITL can be left running.
- iris.sdf (or other models) can be edited to include sensors, like a 2D lidar.
- Display the simulated camera feed either with rviz2 or with:
source ~/px4_ros_com_ros2/install/setup.bash
ros2 run image_tools showimage image:=/cable_camera/image_raw
- Add new worlds/models to ~/PX4-Autopilot/platforms/posix/cmake/sitl_target.cmake
- See local packages, and their msgs, with:
ros2 interface packages
and e.g.
ros2 interface package px4_msgs
- Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera, since this class stitches images from 6 different cameras to achieve a wide field of view). The focal lengths can be computed as focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180). (http://sdformat.org/spec?ver=1.7&elem=sensor#lens_intrinsics)
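For example (hypothetical numbers), an 800 px wide image with a 60° horizontal field of view gives focal_length_in_pixels = (800 * 0.5) / tan(60 * 0.5 * PI/180) = 400 / tan(30°) ≈ 693.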
- Drone spawn coordinates set in ~/PX4-Autopilot/Tools/sitl_run.sh ?
- *** No rule to make target '/opt/ros/foxy/lib/libfastrtps.so.2.0.2', needed by 'libpx4_msgs__rosidl_typesupport_fastrtps_cpp.so'. Stop.
Fixed by renaming the closest libfastrtps.so.x.y.z to libfastrtps.so.2.0.2.
- Dependency errors with PX4, like
ninja: error: '/usr/lib/x86_64-linux-gnu/libsdformat9.so.9.6.1', needed by 'libmav_msgs.so', missing and no known rule to make it
may be solved by a PX4 reinstall (remember that worlds, models, cmake files etc. must also be reinstalled into the new PX4).
- If the drone enters failsafe when starting offboard_control, param set COM_RCL_EXCEPT 4 in the PX4 console may solve this. Otherwise, try manually publishing a few setpoints to fmu/manual_control_setpoint/in and then starting offboard mode (a hedged example of such a publication is sketched below).
- Showing videos in the readme: just drag and drop your image/video from your local PC into the GitHub readme in editable mode.
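Sketch of publishing such a setpoint from the command line; the topic name is taken from above, but the message fields (x, y, z, r) are an assumption, so check px4_msgs/msg/ManualControlSetpoint.msg for the actual layout:
# Field names are assumed; adjust to the actual ManualControlSetpoint definition
ros2 topic pub -r 10 /fmu/manual_control_setpoint/in px4_msgs/msg/ManualControlSetpoint "{x: 0.0, y: 0.0, z: 0.5, r: 0.0}"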
- If gradle not working, might have to downgrade Java (JDK) to 11: https://askubuntu.com/questions/1133216/downgrading-java-11-to-java-8
- May have to set unused non-velocity parameters to NAN in TrajectorySetpoint message: https://discuss.px4.io/t/offboard-control-using-ros2-how-to-achieve-velocity-control/21875
- Customize GPS noise within .sdf of vehicle model in the gps_plugin section. E.g. gpsXYRandomWalk of 0.02 and gpsZRandomWalk of 0.04 if simulating RTK accuracy.
Progress:
- 🟢 Install tools
- 🟢 Figure out how to control drone via offboard_control.cpp
- 🟢 Make ROS2 advertiser that generates control input for offboard_control.cpp for more advanced control
- 🟢 Figure out how to use simulated depth sensors
- 🟢 Implement depth data into ROS2 advertiser for even more advanced control
- 🟢 Control drone towards overhead cable
- 🟡 More tightly integrate with PX4 to optimize control based on e.g. drone state
  - get pose of drone to mitigate sideways motion when rotated around x or y.
  - use GPS positioning to counteract drift
- 🟢 Use drone mounted simulated camera to get images of overhead cable
- 🟢 Visualize depth data in camera feed
- 🟢 Investigate occasional drone control loss
- 🟢 Make module that turns 2d lidar data into noisy pointcloud to prepare for mmwave integration
- 🟡 Tracking of points in pointcloud (kalman?)
- 🟡 Implement cable detection AI to filter depth data and align drone yaw wrt. cable