Usage#
Configuration#
All the planner configuration files are located in cmr_nav/nav_conf; look through them if you need to adjust anything. If you are confused by any of the variable names, the definitions can be found here, and a reference forum post that was used to build the general structure of the YAML files can be found within the YAML files for common, global, and local params. To extend this system with additional sensors, add an additional layer definition within common_costmap_params and use that definition in local_costmap_params, as sketched below.
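As a rough illustration, a new sensor layer might look like the sketch below, which follows the stock costmap_2d plugin conventions; the layer, topic, and frame names are hypothetical placeholders, not the actual values in cmr_nav:

```yaml
# common_costmap_params (sketch): define a layer for the new sensor.
extra_sensor_layer:
  observation_sources: extra_scan
  extra_scan:
    sensor_frame: extra_sensor_link   # frame the sensor publishes in (hypothetical)
    data_type: LaserScan              # or PointCloud2, depending on the sensor
    topic: /extra_sensor/scan         # hypothetical topic name
    marking: true                     # mark obstacles into the costmap
    clearing: true                    # ray-trace free space

# local_costmap_params (sketch): add the layer to the costmap's plugin list.
local_costmap:
  plugins:
    - {name: obstacle_layer,     type: "costmap_2d::ObstacleLayer"}
    - {name: extra_sensor_layer, type: "costmap_2d::ObstacleLayer"}
    - {name: inflation_layer,    type: "costmap_2d::InflationLayer"}
```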
Camera#
To enable computer vision, make sure the enable_vision configuration parameter for the autonomous control node (in auto.yaml) is set to true. This tells the autonomous control node that this autonomous traversal will use CV to find the AR tag.
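For reference, the flag would look something like the sketch below; the exact nesting inside auto.yaml is an assumption, so check the real file:

```yaml
# auto.yaml (sketch): flag read by the autonomous control node.
enable_vision: true   # use CV to find the AR tag during this traversal
```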
Rviz#
Run cmr navview to open an RViz window configured to display the current plan, the current goal, and the rover's current position. This is especially useful when debugging, as it shows the planned path, the current costmap and LIDAR information, and a general view of the rover's pose relative to the overall map. The ROS path planner is generally robust, but be sure to check here, as well as the logs, to catch any potential issues.
Pre-task Checks#
Before starting the task, verify that all systems work as expected.
Before driving the rover (or using drives at any point), check that the screws on the wheel hubs are securely tightened; if they are loose, the motors can be severely damaged! Also make sure that the IMU, GPS, and LIDAR sensors (and the ZED camera if testing vision) are properly mounted on the rover: if they aren't, missing quaternions will prevent navigation from occurring, or missing pose estimates will prevent vision from working.
The IMU mounting orientation is marked on the board itself: one side of the board shows the +y direction for the IMU. Orient it so that the +y arrow faces up on the board and points to the left of the rover; this mounts the IMU properly with respect to the rover. The IMU must also be plugged into the sensor board rather than connected via USB: double-check with ECE that it is properly set up.
The GPS has a red light near the base and connects via USB. If the red light is solid, the GPS is powered; if it is blinking, the GPS has a "fix" (it can get coordinates).
"Front" with respect to the LIDAR is the direction the wire comes out of the unit. This can be easily checked against the RPLidar specification sheet.
The ZED camera is easy to mount properly: just make sure it is right-side up and the cameras face the front of the rover.
- Deploy and start the rover: cmr deploy && cmr --remote rover
- Start the base station with: cmr base
- Ensure that drive and base/drive are disabled initially. If you need to drive the rover to an isolated area to spin it for cmr autocal, do so now, before disabling drive and base/drive.
- Enable the dd node via cmr enable dd.
- Calibrate the compass with cmr autocal. It will prompt you through the steps: simply follow the instructions. Warning: cmr autocal will spin the rover! It will explain the process and ask you to hit enter before doing anything, however. Make sure that the yaw_offset for the IMU (in the imu.yaml configuration file) is set to the correct magnetic declination in radians for the current location (Hanksville, Ithaca, etc.); a sketch of this setting follows after this list. The declination calibration is especially important: if the declination is more than a couple of degrees off, the rover will take a very curvy path, because it thinks it is heading in one direction, the GPS then reports that it was off, it corrects, and the cycle repeats. The magnetic declination for a specific location can be looked up manually. Overall: in Utah, true north is 10.66 degrees west of magnetic north (west is negative and east is positive).
- Enable the autonomous system by running cmr enable auto. This will enable the IMU node, the odometry node, the autonomy control node, the differential drive node, the GPS node, the ball detector node, and any needed dependencies (most likely HAL and drive).
- Test the sensors and verify that everything is working as expected:
  - GPS: run rostopic echo /gps/fix and verify that the right GPS coordinates are being printed. There will be a bit of drift early on, so if GPS is being used for localization, make sure it has fully settled before starting the task.
  - IMU: run rostopic echo /imu/reading and verify that readings are being printed and appear sensible. The angular velocity should be close to 0 if the rover is not moving, and the z acceleration should be roughly 9.8 m/s^2 (gravity).
  - Odometry: enable the base station (cmr manual) and run cmr distcheck. Hit enter to collect the first data point, drive the rover forward a couple of meters, stop, and hit enter again to collect the second data point. The script will print the total distance traveled; measure the actual distance with a tape measure and divide the script's distance by the measured distance. The result should be 1.0, the current scale factor (for example, if the script reports 5.4 m but the tape measure reads 3.0 m, the new scale factor is 1.8; note that the scale factor used to be 1.8 and may change due to changes the electrical team makes). If it is not 1.0, adjust the scale factor at line 49 of odom_filter.py to the new value: that line is divided by the scale factor. Warning: having to adjust the scale factor is probably indicative of a larger problem, so check the microcontrollers. In a pinch, if you are sure the scale factor is consistent, just adjust it.
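As referenced in the calibration step above, the declination setting would look something like the sketch below; the yaw_offset key comes from the text above, but the exact structure of imu.yaml is an assumption:

```yaml
# imu.yaml (sketch): yaw_offset holds the local magnetic declination in radians.
# In Utah, true north is 10.66 degrees west of magnetic north, and west is
# negative, so: yaw_offset = -10.66 * pi / 180 ≈ -0.186 rad.
yaw_offset: -0.186
```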
If all the sensors are functioning properly, start the task. If not, here are some additional debugging steps to try to isolate the problem:
- Can the rover plan locally? Run cmr move and request 5 m in the x direction, with the identity quaternion (w is 1, the rest are 0). Measure how far the rover actually goes; if it does not go 5 m, there is probably an issue with odometry. The goal tolerance is set to 1 m (see the sketch after this list), so if you ask for a move smaller than a meter, the rover won't move at all.
- Can the rover handle moves with turns? Request a local 5 meter move in the y direction; this should move the rover 5 meters to its left. If it cannot do this, check the compass calibration.
- Can the rover handle GPS moves? Run cmr move and provide a goal in the utm frame, using the cmr utm tool to convert from GPS coordinates to UTM coordinates. Set the altitude to 0. If this does not work but the previous tests did, there may be an issue with the GPS or compass calibration. Check the compass first.
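The 1 m goal tolerance mentioned above typically lives in the local planner configuration. A minimal sketch, assuming the stock ROS base_local_planner parameter names (the actual file and values in cmr_nav/nav_conf may differ):

```yaml
# Local planner goal tolerances (sketch, using stock base_local_planner names).
TrajectoryPlannerROS:
  xy_goal_tolerance: 1.0    # meters; goals within this distance count as reached
  yaw_goal_tolerance: 0.2   # radians; illustrative heading tolerance at the goal
```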
Task#
For the task, run: ac /auto/plan
This will open a window that takes a frame, a 'latitude', and a 'longitude'. Given GPS coordinates, run cmr utm in a new terminal, input the latitude and longitude, and put the resulting x and y values in the window's latitude and longitude fields, respectively (the altitude doesn't matter, so you can put anything you want). Set the frame to utm and start the goal to navigate the rover directly to the given coordinates.
Common Issues#
The rover constantly starts and stops while moving autonomously, or doesn't move at all: ensure that base/drive is disabled (both the auto and base station nodes publish movement messages to drive) and restart auto by disabling and re-enabling the auto node.
Sensor readings are completely unchanging: this isn't a problem for the GPS, but it is definitely a problem for the LIDAR and IMU, since a little noise is expected. If this is happening on the IMU (check by echoing /imu/heading), the sensor board may need to be reflashed: consult ECE.
Some of the wheels on the rover aren't driving: this is probably emblematic of a larger ECE/drives problem. Consult both subteams.
Paths look wonky → make sure that wheel odometry is used for local planning, while wheel AND GPS odometry is used for global planning (see the sketch below).
Paths look like they go through or ignore obstacles → check the LIDAR data, ideally "cart test" (run the mock rover, or run the rover on a cart indoors), and use RViz to see how the rover reacts to obstacles it has detected.
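The local/global odometry split above is commonly implemented with two state estimators. A sketch of the idea, assuming the standard ROS robot_localization package and hypothetical topic names (this may not match how cmr_nav actually wires things up):

```yaml
# Local planning (odom frame): fuse wheel odometry only (sketch).
ekf_local:
  world_frame: odom
  odom0: /wheel/odometry              # hypothetical wheel-odometry topic
  odom0_config: [true,  true,  false, # fuse x, y position...
                 false, false, true,  # ...and yaw
                 false, false, false,
                 false, false, false,
                 false, false, false]

# Global planning (map frame): fuse wheel odometry AND GPS (sketch).
ekf_global:
  world_frame: map
  odom0: /wheel/odometry
  odom0_config: [true,  true,  false,
                 false, false, true,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  odom1: /odometry/gps                # GPS converted to odometry (e.g., by navsat_transform)
  odom1_config: [true,  true,  false, # GPS contributes x, y position only
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
```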