This exercise involved learning the basic skills and knowledge required to operate a Duckiebot.
Technologies used: Python, DuckieTown Shell, Docker
The first step was to set up the Duckiebot. We were provided an assembled Duckiebot with a pre-flashed SD card. Our Duckiebot is named csc22926. Before we could connect to the robot and operate it, however, we needed to install the required software dependencies.
This involved installing Docker, Duckietown Shell (dts), and creating a Duckietown account.
Next, we turned on the Duckiebot and ensured we could discover it on the network using the following command:
dts fleet discover
We were able to see our Duckiebot's name csc22926 in the list of discovered Duckiebots.
Our Duckiebot was now ready to be operated!
Fig 1.1 Duckiebot CSC22926
Before we began moving the robot, we first logged into the Duckietown dashboard at csc22926.local
using our Duckietown account token.
We then used the following command to connect to the Duckiebot over WiFi and control it using our keyboard:
dts duckiebot keyboard_control csc22926
On the dashboard, we were able to see its camera feed and motor signals as it moved.
Fig 2.1 Duckietown Dashboard
The image contains the motor speed, linear speed, and angular speed. At the time of taking the screenshot, we had just started driving the bot in a straight line at full power, hence the high linear speed and zero angular speed.
We can also see the Duckiebot's camera POV. I hope csc22926 had a nice view of me...
The Duckiebot camera's POV is not necessarily an accurate representation of the world from an outside reference. It may distort the image and have an inaccurate pixel projection. Intrinsic calibration focuses on the camera's internal characteristics and how it forms images. It accounts for the camera's lens distortion, focal length, and sensor properties. The calibration process generates parameters that help correct image distortions and ensure accurate pixel measurements, which is crucial for the robot to perceive distances and shapes correctly.
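As a toy illustration of what intrinsic calibration corrects, the sketch below projects a 3D point through a simple pinhole model with a single radial-distortion coefficient. All numbers are made up for illustration and are not csc22926's actual calibration values:

```python
def project_pinhole(point_3d, fx, fy, cx, cy, k1=0.0):
    """Project a 3D camera-frame point to pixel coordinates,
    optionally applying one radial-distortion term (k1)."""
    X, Y, Z = point_3d
    # Normalized image coordinates (ideal pinhole)
    x, y = X / Z, Y / Z
    # Simple radial distortion: coordinates shift by a factor of (1 + k1 * r^2)
    r2 = x * x + y * y
    x_d = x * (1 + k1 * r2)
    y_d = y * (1 + k1 * r2)
    # Apply focal lengths and principal point to get pixel coordinates
    return fx * x_d + cx, fy * y_d + cy

# Illustrative numbers only: the same 3D point lands on different pixels
# depending on the lens distortion, which is what calibration must undo.
ideal = project_pinhole((0.2, 0.1, 1.0), fx=300, fy=300, cx=320, cy=240)
distorted = project_pinhole((0.2, 0.1, 1.0), fx=300, fy=300, cx=320, cy=240, k1=-0.3)
```

Calibration estimates the distortion coefficients, so the robot can invert this shift and recover accurate pixel positions.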
We next performed intrinsic calibration, using a rigid checkerboard of known size as a reference and the following command:
dts duckiebot calibrate_intrinsics csc22926
We then moved the Duckiebot camera around the checkerboard in different axes to perform the calibration. Finally, we generated the intrinsic calibration parameters required to correct the camera's POV.
Fig 3.1 Intrinsic Calibration Parameters
Images are essentially matrices, and the robot is receiving an input matrix from its camera. The .yaml file contains parameters that are used in various matrix operations to correct the input matrix to represent an accurate view of the world.
For example, the projection_matrix describes how 3D points are projected onto the 2D image plane, and the rectification_matrix corrects for camera tilt.
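For instance, projecting a 3D point through a projection_matrix is a straightforward matrix operation. The matrix values below are illustrative stand-ins, not the ones from our .yaml file:

```python
import numpy as np

# Illustrative 3x4 projection matrix, shaped like the one in the
# calibration .yaml (these numbers are made up, not csc22926's)
P = np.array([
    [300.0,   0.0, 320.0, 0.0],
    [  0.0, 300.0, 240.0, 0.0],
    [  0.0,   0.0,   1.0, 0.0],
])

def project(P, point_3d):
    """Map a 3D point (camera frame) to 2D pixel coordinates."""
    X = np.append(point_3d, 1.0)   # homogeneous coordinates
    u, v, w = P @ X                # matrix-vector product
    return u / w, v / w            # perspective divide

pixel = project(P, np.array([0.2, 0.1, 1.0]))
```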
Extrinsic calibration deals with the camera's position and orientation relative to the Duckiebot's environment (like the ground plane). It determines how to transform between the camera's view and real-world coordinates. This is essential for the robot to understand where it is in space and navigate accurately in Duckietown.
So, we performed extrinsic calibration next, which involved placing the Duckiebot at the right position and angle with respect to the calibration checkerboard. Then we ran the calibration using the following command:
dts duckiebot calibrate_extrinsics csc22926
The required extrinsic calibration parameters were generated.
Fig 4.1 Extrinsic Calibration Parameters
The homography parameters likewise represent a transformation matrix that helps the robot understand its position relative to the ground: it maps where the ground plane lies in the camera's point of view (POV).
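As a rough illustration of how such a mapping works (with made-up numbers, not our actual homography), a pixel is mapped to ground-plane coordinates by a 3x3 matrix followed by a perspective divide:

```python
import numpy as np

# Illustrative 3x3 homography, playing the role of the extrinsic
# calibration's homography parameter (values are made up)
H = np.array([
    [1.0e-3, 0.0,    -0.3],
    [0.0,    1.0e-3, -0.2],
    [0.0,    2.0e-3,  1.0],
])

def pixel_to_ground(H, u, v):
    """Map an image pixel (u, v) to ground-plane coordinates (x, y)."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w   # perspective divide

ground_xy = pixel_to_ground(H, 320, 240)
```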
The last bit of calibration has to do with the kinematics of the Duckiebot. Even slight differences between the left and right motors, wheel sizes, etc. can prevent the bot from moving in a straight line. To compensate, we performed a trim calibration. Trim acts as a bias correction between the left and right wheels: if the robot veers left, a positive trim adds more power to the left wheel to counteract the leftward rotation, and a negative trim works similarly for the right wheel.
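The bias-correction idea can be sketched in a few lines (the real Duckietown kinematics node also applies a gain and per-motor constants; this is just the intuition):

```python
def wheel_commands(v, trim):
    """Split a forward command v into left/right wheel powers.
    A positive trim boosts the left wheel (and cuts the right)
    to counteract a leftward veer; a negative trim does the opposite."""
    left = v * (1.0 + trim)
    right = v * (1.0 - trim)
    return left, right

# With our trim of 0.01, the left wheel gets a 1% boost
left, right = wheel_commands(1.0, 0.01)
```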
Duckietown provides many tools for moving the Duckiebot and analyzing its motion. To use these tools, we run the following command:
dts start_gui_tools csc22926
Then, we used ROS to modify motion-related parameters. In our case, we set the trim parameter to 0.01 using the following command:
rosparam set /csc22926/kinematics_node/trim 0.01
We then saved the kinematics calibration settings.
Fig 5.1 Kinematics Calibration Parameters
These represent the parameters used by ROS to manipulate how the Duckiebot moves. We can see the trim value of 0.01. We also notice parameters like omega_max (max angular speed) and v_max (max linear speed).
We were now able to move the Duckiebot in a near straight line.
Video 5.1: Straight line motion
The Duckiebot was initially steering to the left, so we added a positive trim of 0.01; in the video, it now moves in a nearly straight line. It starts veering slightly to the right at the end, but factors like the road material, the tape, and small calibration errors can cause this.
We then made the Duckiebot move in a rectangular lane. Again, we ran the keyboard control command and pressed a to start lane following.
Fig 5.2: Duckiebot Lane Following
As you can see, the Duckiebot moves around the lane, detecting the white tape (lane borders) with its camera. The Duckiebot uses computer vision to detect lane markings and stay within the boundaries, tracking the lane's edges, adjusting its speed, and making decisions at corners. The left turn at intersections is part of its path-following algorithm for safe navigation.
For now, we used the default configuration and algorithms built into the Duckiebot to perform this demonstration.
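To give a rough sense of the computer vision involved, here is a minimal sketch (my own simplification, not the Duckiebot's actual lane-following code) of thresholding a grayscale image for white tape and measuring the tape's offset from the image centre, which a steering controller could then drive toward a setpoint:

```python
import numpy as np

def white_mask(gray_image, threshold=200):
    """Mark pixels bright enough to be white tape (grayscale 0-255)."""
    return gray_image >= threshold

def lane_offset(mask):
    """Signed horizontal offset of the detected tape from the image centre."""
    cols = np.nonzero(mask.any(axis=0))[0]
    if cols.size == 0:
        return None  # no tape in view
    centre = mask.shape[1] / 2.0
    return float(cols.mean() - centre)

# Tiny synthetic 4x8 "image" with a bright stripe in columns 5-6
img = np.zeros((4, 8), dtype=np.uint8)
img[:, 5:7] = 255
offset = lane_offset(white_mask(img))
```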
We learned the basics of making a program that runs on the Duckiebot. First, we cloned a Duckietown development template repository. We modified the Dockerfile and wrote a simple Hello World program in Python. Then we built the container image for the program and loaded it onto the Duckiebot. Afterward, we ran the program on the Duckiebot.
Fig 6.1: Duckiebot csc22926 says Hello!
The program runs on the Duckiebot. Our program specifically used an environment variable containing the name of the Duckiebot, which is not hardcoded anywhere in our code; only the Duckiebot knows the name stored in that variable. The program correctly retrieves the name and prints: Hello from csc22926!
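The core of the program can be sketched as follows (VEHICLE_NAME is my stand-in name for the environment variable; the actual template may name it differently):

```python
import os

def greeting():
    """Read the robot's name from the environment (set on the robot,
    never hardcoded in the program) and build the greeting."""
    name = os.environ.get("VEHICLE_NAME", "unknown")
    return f"Hello from {name}!"

# On the robot, the container sets the variable, so csc22926
# greets us with its own name.
print(greeting())
```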
Through this exercise, I learned a lot about how Duckietown works, particularly in terms of Docker's role when programming robots. I also learned the importance of calibration and how crucial it is to ensure the robot moves correctly before deploying any algorithms to perform specific tasks.
One of the biggest challenges was patience! Sometimes getting a response from the robot could take a while, especially during the calibration process. The calibration itself was time-consuming: I had to move the robot around several times before it could generate appropriate calibration parameters. Another challenge was adjusting the trim value for straight-line motion. Initially, I used values that were too large, but after experimenting, I settled on a trim value of 0.01.
I also faced challenges with installing Ubuntu, which led me to ultimately use the desktop in the lab to perform all my tasks. Overall, I had a lot of fun and I have so much to learn. I look forward to the next exercises!
The exercise was completed in collaboration with Basia Ofovwe.
I would also like to acknowledge the LI Adam Parker and the TAs Monta and Dikshant for their assistance with troubleshooting Duckiebot commands and explaining calibration concepts, which significantly improved my understanding and helped me complete this write-up.