May 24, 2019 Montreal, Canada
In this lab, you will be implementing a wall follower on a simulated version of the racecar. Your goal is to make an autonomous controller that drives the racecar forwards while maintaining a constant distance from a wall on either its left or right (chosen on the fly). It should also be robust to uneven surfaces and small errors in the LIDAR data, and it should be able to recover from small deviations from the desired state: being too far, too close, or at too great an angle.
This lab is to be done individually. In the next lab, you will join your team to get your wall follower working on the real racecar. You will be reusing the code you write for this lab so make sure it is clean and presentable to your teammates!
We have made a series of tests to evaluate the performance of your wall follower. In order to test your code properly you must start out with the template starter code. The template is still very sparse and you have plenty of freedom to implement any algorithm you’d like so long as the inputs and outputs are the same.
Clone this repository into your catkin workspace:
cd ~/racecar_ws/src
git clone https://github.com/mit-rss/wall_follower_sim.git
Your directory structure should now look like
[YOUR_WORKSPACE]/src/
CMakeLists.txt
racecar_simulator/
wall_follower_sim/
Then rebuild your workspace with catkin_make:
cd ~/racecar_ws
catkin_make
source devel/setup.bash
First take a look at the simulator. You can launch it by running:
roslaunch racecar_simulator simulate.launch
To see the simulated car, you will need to open rviz by typing:
rviz
If you are using the Docker image, rviz should already be configured properly!
At the bottom of the left panel, click the “Add” button, and then in the “By display type” tab click “RobotModel”. You should see a small blue car appear. Then click “Add” again, and in the “By topic” tab add the “/map” topic; repeat to add the laser scan topic. Under the added LaserScan dropdown menu there should be a field called “Size (m)”. Change this to 0.1 so you can see the laser scan more clearly. The checkboxes turn display types on and off, which may be useful as you add topics to visualize.
You should see a car in a map (walls are black, empty space is grey) and points on that map representing the points hit by the car’s lidar.
You can move the car around by plugging in a joystick into your computer, or by clicking the “2D Pose Estimate” button on top of rviz and dragging somewhere on the map.
Note that the simulator does not include collision detection.
In order to make the car drive autonomously you will need to publish messages of type AckermannDriveStamped to the /drive topic. Import the AckermannDriveStamped type like this in your wall_follower.py file:
from ackermann_msgs.msg import AckermannDriveStamped
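A minimal sketch of building and publishing such a message (this assumes a running ROS node; the publisher variable and the speed/steering values are illustrative, not from the starter code):

```python
import rospy
from ackermann_msgs.msg import AckermannDriveStamped

# Illustrative publisher; in your node you would typically create
# this once, e.g. in __init__, rather than per message.
drive_pub = rospy.Publisher("/drive", AckermannDriveStamped, queue_size=1)

msg = AckermannDriveStamped()
msg.header.stamp = rospy.Time.now()
msg.drive.speed = 1.0            # forward speed in meters per second
msg.drive.steering_angle = 0.1   # radians; positive steers left
drive_pub.publish(msg)
```

The header timestamp is not strictly required by the simulator, but stamping messages is good practice and will matter on the real car.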
The racecar (and its simulation) is equipped with a LIDAR sensor that measures the distance from the racecar to its surroundings with high accuracy. All of the LIDAR data is published to the /scan topic. The data is of type LaserScan. You can import the type in Python using:
from sensor_msgs.msg import LaserScan
The ranges data entry in the LaserScan message is an array of the distances from the lidar sensor to the nearest obstacle. The measurements are taken at regular intervals, angle_increment, from the angle angle_min to the angle angle_max.
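For example, the angle at which each entry of ranges was measured can be recovered from these fields (a sketch in plain numpy; the scan parameters below are made up for illustration, not the racecar's actual values):

```python
import numpy as np

# Illustrative scan parameters: a sweep from -pi/2 to +pi/2 in
# steps of pi/4 (the real lidar is much denser).
angle_min = -np.pi / 2
angle_max = np.pi / 2
angle_increment = np.pi / 4

ranges = np.array([2.0, 1.5, 1.0, 1.5, 2.0])

# The i-th range was measured at angle_min + i * angle_increment.
angles = angle_min + np.arange(len(ranges)) * angle_increment
```

Here angles[0] equals angle_min and the last entry equals angle_max, so each range is paired with the direction it was measured in.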
The rainbow points in the image below are the laser scan as visualized in rviz. The color corresponds to the intensity of the scan. In the simulator this is simply the distance, but on the actual lidar it gives you an indication of how reflective the scanned object is. Note that there is no data in the quadrant behind the car because on the actual car that area is occluded.
How you implement the wall follower is entirely up to you. However, the following may get you started in the right direction:

- Slice the ranges data into more useful pieces. A majority of the data won’t be useful to you if you only care about a wall to one side.
- When you can, try to use numpy operations rather than for loops in your code. Multidimensional slicing and broadcasting can make your code cleaner and much more efficient. You can turn any array into a numpy array with np.array, or you can integrate it directly with ROS like in this tutorial.

We have some starter code for you here:
src/wall_follower.py
However, if you want to add more Python files to keep your code organized, feel free to do so.
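As an example of that kind of slicing, a boolean mask over the angle array picks out the rays pointing toward one side of the car without any Python loop (a sketch with made-up scan values; the side variable uses the lab's +1 = left, -1 = right convention):

```python
import numpy as np

# Made-up scan: a 180-degree sweep with 9 rays.
angles = np.linspace(-np.pi / 2, np.pi / 2, 9)
ranges = np.array([1.0, 1.1, 1.3, 2.0, 3.0, 2.0, 1.3, 1.1, 1.0])

side = 1  # +1 = left wall, -1 = right wall

# Positive angles point to the car's left, negative to its right,
# so this keeps only the rays on the chosen side.
mask = (side * angles) > 0
side_ranges = ranges[mask]

# A crude wall-distance estimate: the closest return on that side.
wall_distance = side_ranges.min()
```

A single min is a very rough estimate; fitting a line through the sliced points gives a far more stable distance and angle to the wall.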
The template code specifies a couple of useful parameters which you can use to make your code more flexible. You must use the following ROS parameters in your follower:

- desired_distance: The distance, in meters, the racecar should maintain from the wall.
- velocity: The speed the racecar should move at, in meters per second.
- side: The side the wall is on, represented as an integer. +1 represents the left wall and -1 represents the right wall. We chose this convention because we typically assume the car is pointing in the positive x direction. That means the left side of the car points toward the positive y axis and the right side toward the negative y axis.

To run your wall follower, launch:
roslaunch wall_follower wall_follower.launch
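One convenience of the +1/-1 side convention is that a single signed error term works for either wall (a minimal sketch; the helper function is illustrative and not part of the starter code):

```python
def steering_error(side, desired_distance, measured_distance):
    """Signed distance error for a simple proportional controller.

    side: +1 when following the left wall, -1 for the right wall.
    Returns a positive value when the car should steer left
    (toward +y) and a negative value when it should steer right.
    """
    return side * (measured_distance - desired_distance)

# Too far from the left wall  -> positive error -> steer left.
# Too far from the right wall -> negative error -> steer right.
```

Multiplying this error by a gain gives a basic proportional steering command; in practice you will likely also want a derivative term to damp oscillation.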