Playing around with ROS and Gazebo: a ball-chasing robot.
In this project, the ROS AMCL (Adaptive Monte Carlo Localization) package is used to localize a mobile robot inside a map in a Gazebo simulation environment. The main aims of this project are as follows:
- Create a ROS package that launches a custom robot model in a custom Gazebo world
- Utilize the ROS AMCL package and the Tele-Operation / Navigation Stack to localize the robot
- Explore, add, and tune specific parameters corresponding to each package to achieve the best possible localization results
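AMCL localizes the robot with a particle filter. As a toy illustration of the predict-weight-resample cycle it performs (a 1-D sketch for intuition only, not the ROS `amcl` implementation; all names and values here are made up for the example):

```python
import math
import random

def mcl_step(particles, control, measurement, world, noise=0.1):
    """One predict-weight-resample cycle of Monte Carlo localization.

    particles: list of floats (candidate robot positions on a 1-D track)
    control: commanded motion since the last step
    measurement: observed distance to the nearest landmark
    world: list of landmark positions
    """
    # 1. Predict: move every particle by the control input plus motion noise
    moved = [p + control + random.gauss(0, noise) for p in particles]

    # 2. Weight: particles whose predicted measurement matches the real one
    #    get a higher weight (simple Gaussian likelihood, with a tiny floor
    #    so no weight is exactly zero)
    def likelihood(p):
        predicted = min(abs(p - lm) for lm in world)
        d = predicted - measurement
        return math.exp(-d * d / (2 * noise * noise)) + 1e-12

    weights = [likelihood(p) for p in moved]

    # 3. Resample: draw a new particle set proportional to the weights
    return random.choices(moved, weights=weights, k=len(particles))

# Toy usage: robot starts at 0 and moves +1 each step; one landmark at 5.
random.seed(0)
world = [5.0]
particles = [random.uniform(0, 10) for _ in range(500)]
true_pos = 0.0
for _ in range(4):
    true_pos += 1.0
    particles = mcl_step(particles, 1.0, abs(true_pos - world[0]), world)
estimate = sum(particles) / len(particles)  # particles cluster near true_pos
```

The real AMCL additionally adapts the particle count on the fly and uses a full 2-D pose (x, y, yaw) with laser scans against the occupancy-grid map, but the cycle is the same.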
ball_chasing_robot_AMCL_Video.mp4
- Gazebo >= 7.0
- ROS Kinetic
- ROS navigation package
sudo apt-get install ros-kinetic-navigation
- ROS map_server package
sudo apt-get install ros-kinetic-map-server
- ROS move_base package
sudo apt-get install ros-kinetic-move-base
- ROS amcl package
sudo apt-get install ros-kinetic-amcl
- make >= 4.1(mac, linux), 3.81(Windows)
- Linux: make is installed by default on most Linux distros
- Mac: install Xcode command line tools to get make
- Windows: Click here for installation instructions
- gcc/g++ >= 5.4
- Linux: gcc / g++ is installed by default on most Linux distros
- Mac: same deal as make - install Xcode command line tools
- Windows: recommend using MinGW
- Meet the Prerequisites/Dependencies above
- Open Ubuntu Bash and clone the project repository
- On the command line execute
sudo apt-get update && sudo apt-get upgrade -y
- Build and run your code.
- drive_bot.cpp: ROS service C++ script; commands the robot with specified speeds.
- process_images.cpp: ROS service C++ script; processes the camera image and requests the appropriate speeds.
- robot_description.launch: Creates the robot model in the Gazebo world.
- hokuyo.dae: Hokuyo LiDAR sensor mesh model.
- my_robot.gazebo: Define my_robot URDF model plugins.
- my_robot.xacro: Define my_robot URDF model.
- amcl.launch: Launches the AMCL node
- map.pgm: Generated myoffice map
- map.yaml: Metadata for the myoffice map
- default.rviz: Default RViz configuration
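For reference, a `map.yaml` produced by `map_server` follows a small fixed format; the sketch below shows its typical fields (the image name and values here are illustrative assumptions, not this project's actual values):

```yaml
# Metadata for the occupancy-grid map (values are illustrative)
image: map.pgm               # map image file, relative to this yaml
resolution: 0.050000         # meters per pixel
origin: [-10.0, -10.0, 0.0]  # [x, y, yaw] of the lower-left pixel in the map frame
negate: 0                    # 0 = white is free, black is occupied
occupied_thresh: 0.65        # pixels darker than this are treated as occupied
free_thresh: 0.196           # pixels lighter than this are treated as free
```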
- Open the repository in a terminal and build the workspace
catkin_make
- Launch my_robot in Gazebo to load both the world and plugins
roslaunch my_robot world.launch
- Launch the AMCL node
roslaunch my_robot amcl.launch
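For orientation, an `amcl.launch` for this kind of setup usually wires together `map_server`, `amcl`, and `move_base`. The sketch below is a minimal assumed example; the file paths, frame ids, and topic names are assumptions and must match your own robot:

```xml
<launch>
  <!-- Serve the pre-built map (path is an assumption) -->
  <node name="map_server" pkg="map_server" type="map_server"
        args="$(find my_robot)/maps/map.yaml"/>

  <!-- AMCL localization -->
  <node name="amcl" pkg="amcl" type="amcl" output="screen">
    <!-- remap amcl's expected scan topic to the robot's laser topic -->
    <remap from="scan" to="my_robot/laser/scan"/>
    <param name="odom_frame_id" value="odom"/>
    <param name="base_frame_id" value="robot_footprint"/>
    <param name="global_frame_id" value="map"/>
  </node>

  <!-- move_base handles navigation goals sent from RViz -->
  <node name="move_base" pkg="move_base" type="move_base" output="screen"/>
</launch>
```

In practice `move_base` also needs costmap and planner parameter files loaded via `rosparam`, which are omitted here for brevity.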
- Testing
You have two options to control your robot while it localizes itself:
- Send a navigation goal via RViz.
- Send move commands via the teleop package.
Navigate your robot, observe its performance, and tune your AMCL parameters accordingly.
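Typical AMCL parameters to tune are set inside the `amcl` node in `amcl.launch`. The values below are common starting points (assumptions, not this project's tuned values):

```xml
<!-- Particle filter size: more particles = better accuracy, more CPU -->
<param name="min_particles" value="100"/>
<param name="max_particles" value="1000"/>
<!-- Update thresholds: translate/rotate this far before each filter update -->
<param name="update_min_d" value="0.1"/>
<param name="update_min_a" value="0.1"/>
<!-- Odometry noise model (higher values = trust odometry less) -->
<param name="odom_alpha1" value="0.005"/>
<param name="odom_alpha2" value="0.005"/>
<param name="odom_alpha3" value="0.010"/>
<param name="odom_alpha4" value="0.005"/>
```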
Option 1: Send 2D Navigation Goal
Your first option is to send a 2D Nav Goal from RViz. The move_base node will try to navigate your robot based on its current localization estimate; with each new laser observation and odometry reading, the robot further refines its localization.
Click the 2D Nav Goal button in the toolbar, then click and drag on the map to send the goal to the robot. It will start moving and localize itself in the process. If you would like to give the amcl node a nudge, you can give the robot an initial position estimate on the map using 2D Pose Estimate.
Option 2: Use the teleop Node
You can also use the teleop node to control your robot and observe it localize itself in the environment. Open another terminal and launch the teleop_twist_keyboard script:
rosrun teleop_twist_keyboard teleop_twist_keyboard.py
You can now control your robot with keyboard commands.
- It's recommended to update and upgrade your environment before running the code.
sudo apt-get update && sudo apt-get upgrade -y
- If you get an error when launching amcl.launch, check that the topics in the launch file are remapped to the ones your robot actually publishes. Note that in a remap, from is the name the node expects and to is the actual topic, so the scan remap should read:
<remap from="scan" to="my_robot/laser/scan"/>