Technologies used: Python, ROS, OpenCV, Duckietown
The final project for CMPUT 412 was to autonomously complete a 4-stage course, each stage with its own task.
Fig 0: Diagram of Duckietown with 4 stages
For the first stage, we were required to tail a duckiebot driving in front of our bot around part of the town.
Fig 1.1 The back of the leading bot
The leading bot followed one of two paths around the town. Our bot was required to follow it while dynamically adjusting speed.
Our solution: We detected the 7×3 black circle grid on the back of the leading bot using OpenCV's cv2.SimpleBlobDetector and cv2.findCirclesGrid. A proportional controller adjusted our bot's linear velocity based on the perceived size of the pattern (a larger pattern means the leading bot is closer). A second proportional controller centered the pattern in the image, adjusting the angular velocity accordingly.
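To make this concrete, here is a minimal sketch of the detection and control step, assuming a BGR frame from the camera; the gains and the target pattern width are illustrative placeholders, not our exact tuned values.

```python
import cv2

# The default blob detector works for the dark circles on the back plate
detector = cv2.SimpleBlobDetector_create(cv2.SimpleBlobDetector_Params())
GRID = (7, 3)  # the 7x3 circle grid on the back of the leading bot

def tailing_command(bgr, target_width=120.0, kp_lin=0.005, kp_ang=0.004):
    """Return (linear_v, angular_v), or None if the grid is not visible."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    found, centers = cv2.findCirclesGrid(
        gray, GRID, flags=cv2.CALIB_CB_SYMMETRIC_GRID, blobDetector=detector)
    if not found:
        return None
    pts = centers.reshape(-1, 2)
    # The pattern's pixel width grows as we get closer to the leading bot
    width = pts[:, 0].max() - pts[:, 0].min()
    linear_v = kp_lin * (target_width - width)      # too close -> slow down
    # Horizontal offset of the pattern centroid from the image center
    err_x = bgr.shape[1] / 2.0 - pts[:, 0].mean()
    angular_v = kp_ang * err_x                      # steer to center the pattern
    return linear_v, angular_v
```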
Note: We also ran lane following in the background to ensure we stayed on the road (see Exercise 3 for how we implemented lane following).
The leading bot would stop at red line intersections and could proceed straight, left, or right.
Our solution: We determined the HSV range of the red intersection lines and used OpenCV's cv2.findContours to detect red contours. Based on the size of the detected contour, we know how close we are to the intersection and stop.
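A minimal sketch of this check; the HSV bounds and the area threshold are illustrative, not our exact tuned values.

```python
import cv2
import numpy as np

RED_LOWER = np.array([0, 120, 100])    # illustrative HSV bounds for red
RED_UPPER = np.array([10, 255, 255])
STOP_AREA = 4000                       # contour area at which we stop

def red_line_close(bgr):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, RED_LOWER, RED_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # The red line's apparent size grows as we approach the intersection
    return any(cv2.contourArea(c) > STOP_AREA for c in contours)
```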
Further, to determine which direction the leading bot is going, we use the fact that the leading bot's body is blue. We found its HSV range and tracked the last position of its contour relative to the center of our camera's view: if the contour was last seen on the left, we know the leading bot is going left, and so on.
Once our bot has stopped for 3 seconds at the red intersection, we use dead reckoning and odometry to move our bot in the direction the leading bot went. We then lane follow until the leading bot is back in view and resume tailing.
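A sketch of the direction check, with illustrative HSV bounds and pixel margins:

```python
import cv2
import numpy as np

BLUE_LOWER = np.array([100, 120, 50])  # illustrative HSV bounds for the blue body
BLUE_UPPER = np.array([130, 255, 255])

def leader_direction(bgr, margin=50):
    """Return 'left', 'right', or 'straight' from the blue contour's position."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BLUE_LOWER, BLUE_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    cx = m["m10"] / m["m00"]           # centroid x of the largest blue contour
    center = bgr.shape[1] / 2.0
    if cx < center - margin:
        return "left"
    if cx > center + margin:
        return "right"
    return "straight"
```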
Our bot needed to ensure it never collided with the leading bot during tailing.
Our solution: If the circle pattern on the back of the leading bot ever becomes too large, we know our bot needs to stop. We also used the TOF sensor on the front of our bot as a failsafe to detect if we are too close to the leading bot. If either of these conditions is met, we stop our bot.
(A TOF, or time-of-flight, sensor uses infrared light to measure distance: it emits a pulse of light and measures the time the light takes to bounce back to the sensor, which lets it calculate the distance to an object with high accuracy.)
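A sketch of the failsafe as a ROS subscriber; the topic name and the distance threshold here are assumptions, not the exact values we used.

```python
import rospy
from sensor_msgs.msg import Range

TOO_CLOSE_M = 0.2  # assumed threshold in meters

class TofFailsafe:
    def __init__(self):
        self.too_close = False
        # Topic name is a placeholder for the Duckiebot's TOF driver topic
        rospy.Subscriber("~front_center_tof_driver_node/range", Range, self.cb)

    def cb(self, msg):
        # Trip the stop flag whenever the measured distance drops below threshold
        self.too_close = msg.range < TOO_CLOSE_M
```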
After the leading bot completed its path, it was taken off course and our bot transitioned to stage 2. We determined that stage 2 had begun once 3 or more red intersections had been detected and stopped at, since we knew that the leading bot in stage 1 would stop at at most 3 red intersections on either of its paths.
In this stage, we were required to detect 2 apriltags placed at 2 red intersections and, based on the apriltag ID, move either left or right.
In between these apriltags, we were required to lane follow.
Fig 2.1 Apriltag placed near a red intersection
Our bot continues to lane follow but, based on the ID of the apriltag detected at a red intersection, turns left or right.
Our solution: We utilized the dt_apriltags Python library and used its Detector class to identify apriltags of the "tag36h11" family.
We would attempt to detect an apriltag on every image callback (i.e. the bot subscribes to the camera feed and processes each image in the image callback function at 10 frames per second). If an apriltag is detected, it is saved as the last_seen_apriltag.
Then, at the red intersection, we determine which direction to turn based on the ID of the last seen apriltag. Dead reckoning is used to turn the bot in the correct direction based on the known road dimensions, after which lane following resumes.
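In sketch form, with the image callback simplified down to the tag-detection step:

```python
import cv2
from dt_apriltags import Detector

detector = Detector(families="tag36h11")
last_seen_apriltag = None

def image_callback(bgr):
    global last_seen_apriltag
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(gray)
    if detections:
        # Remember the most recent tag; its ID decides the turn
        # (e.g. one ID means turn left, the other means turn right)
        last_seen_apriltag = detections[0].tag_id
```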
The end condition for stage 2 is when the bot has stopped at 2 red intersections since the beginning of stage 2 (we know beforehand that there are only 2 apriltags in stage 2, each placed at a red intersection, with no red intersections between these 2 tags).
The bot continues to lane follow around Duckietown after stage 2 has ended, and for stage 3 it must safely navigate the road while avoiding obstacles.
Fig 3.1 Two Crosswalks with potential pedestrians (ducks) crossing and a broken bot on the road between these crosswalks
Crosswalks are marked by two blue strips separated by a small gap, and ducks may be crossing within this gap. These ducks are a darker yellow than the lane markings, so they have their own HSV color range.
Our bot must always stop at a crosswalk for a moment, even if there are no peduckstrians crossing. If there are ducks crossing, our bot must keep waiting until the ducks have crossed (are no longer there).
If the ducks are not there, then the bot continues driving past the crosswalk and resumes lane following.
Our solution: Blue crosswalks were detected using regular contour detection, and if one was detected, our bot would stop for a second. At the same time, in our image callback, we attempt to detect ducks using contour detection for their particular colour.
If ducks are detected, our bot stays stationary (we publish a velocity of zero). Once ducks are no longer detected, we wait for a cooldown period during which we continue to look for ducks.
If no ducks were detected during the cooldown period, we resume lane following.
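A sketch of the wait-with-cooldown logic; the duck HSV bounds, area threshold, and cooldown length are illustrative, and get_frame/publish_stop stand in for our camera subscription and velocity publisher.

```python
import time
import cv2
import numpy as np

DUCK_LOWER = np.array([20, 150, 150])  # darker yellow than the lane markings
DUCK_UPPER = np.array([30, 255, 255])
COOLDOWN_S = 2.0

def ducks_present(bgr):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, DUCK_LOWER, DUCK_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > 300 for c in contours)

def wait_at_crosswalk(get_frame, publish_stop):
    """Block until no ducks have been seen for a full cooldown period."""
    last_seen = time.time()
    while time.time() - last_seen < COOLDOWN_S:
        publish_stop()                  # keep publishing zero velocity
        if ducks_present(get_frame()):
            last_seen = time.time()     # a duck reappeared: reset the cooldown
        time.sleep(0.1)                 # roughly match the 10 fps camera feed
```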
In between the two crosswalks in stage 3 is a broken down bot that is also slightly yawed on the road. Our bot must detect this broken bot and stop.
It must then switch lanes and drive past the bot without colliding with it. Then it must return to the original lane and continue lane following.
Our solution: Since the broken bot is tilted, we cannot rely on detecting the circle pattern on its back, so we resorted to simply detecting its blue hue using contour detection. In order not to confuse the broken bot with a blue crosswalk, we kept count of the number of blue contours detected in stage 3: based on our logic and prior knowledge of the course, the second blue contour detected could only be the broken bot.
Once the broken bot is detected, we stop for 5 seconds. We then use dead reckoning to rotate our bot counterclockwise by 90 degrees, drive straight for 0.4 m, rotate clockwise by 90 degrees, drive straight for 1.5 m, rotate clockwise by 90 degrees, drive for 0.4 m, rotate counterclockwise by 90 degrees, and resume lane following.
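Expressed as a sketch, with rotate, drive_straight, stop, and sleep standing in for our odometry-based motion helpers:

```python
import math

def avoid_broken_bot(rotate, drive_straight, stop, sleep):
    stop()
    sleep(5.0)                   # pause in front of the broken bot
    rotate(+math.pi / 2)         # counterclockwise 90 degrees
    drive_straight(0.4)          # cross into the opposite lane
    rotate(-math.pi / 2)         # clockwise 90 degrees
    drive_straight(1.5)          # drive past the broken bot
    rotate(-math.pi / 2)
    drive_straight(0.4)          # cross back toward the original lane
    rotate(+math.pi / 2)         # face down the lane and resume lane following
```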
This method of using dead reckoning (hardcoded kinematics) is a bit unreliable and needed quite a bit of tuning, since our bot had poor wheel calibration and was very sensitive to the friction of the road. A better solution would have been to use dead reckoning to nudge the bot toward the opposite lane, lane follow with the lane colors swapped for a short period, and then use a similar technique to return to the original lane.
Stage 3 ends when the second crosswalk has been passed (i.e. the third blue contour has been detected since stage 3 began).
For stage 4, a parking lot with 4 spots is present in Duckietown. Each spot is a rectangle with yellow borders and an apriltag with a fixed, unique ID centered at the short edge of the rectangle.
Fig 4.1 The parking lot with 4 spots
Our bot must stop at the red intersection before entering the parking lot. Our bot is given the parking spot ID at the beginning of the course and is required to park and stop at the correct spot with its wheels inside the yellow borders.
Our solution: Knowing the parking ID, we know the position of the parking spot in the lot. We used dead reckoning to move the bot approximately close to the spot it needs to park at.
We then start rotating the bot in place slowly to look for the apriltag with the parking ID. Once the apriltag is detected, we draw a bounding box around it.
We then use a proportional controller to move the bot closer to this apriltag's bounding box until it is within a close enough distance of the box.
We use another, higher-priority proportional controller to keep the bounding box at the center of our bot's field of view (within a small threshold). This ensures that the bot stays within the borders of the parking spot, since we know that the apriltag is centered on the rectangle.
Once our bot is close enough to the apriltag and the apriltag is appropriately centered in our bot's field of view, we know we have parked correctly and stop.
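A sketch of one iteration of this parking loop; the gains and thresholds are illustrative, and detect_tag_box stands in for the apriltag detection, returning the tag bounding box's center x and pixel width.

```python
def parking_step(bgr, detect_tag_box, publish_cmd,
                 kp_ang=0.005, kp_lin=0.003,
                 target_width=80.0, center_tol=15.0):
    """Run one control step; return True once parked."""
    box = detect_tag_box(bgr)           # (center_x, width) or None
    if box is None:
        publish_cmd(0.0, 0.3)           # rotate in place to find the tag
        return False
    cx, width = box
    err_x = bgr.shape[1] / 2.0 - cx
    if abs(err_x) > center_tol:
        # Higher-priority controller: center the tag in the field of view
        publish_cmd(0.0, kp_ang * err_x)
        return False
    if width < target_width:
        # Tag centered but still far away: drive toward it
        publish_cmd(kp_lin * (target_width - width), 0.0)
        return False
    publish_cmd(0.0, 0.0)               # close enough and centered: stop
    return True
```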
Note: 5 extra points were awarded for reverse parking, but due to time constraints, we did not implement this.
We were given 3 rounds to attempt the entire 4 stage course and the best round would be chosen as our final score.
Each round was out of 125 points, and any score of 100 or above earned a 100% on the final project.
We did pretty well in round 1 up until stage 4. For some reason, after reaching the parking lot, our bot would just continue to lane follow and never attempt parking.
Our bot also had some issues with lane following, where it drifted too close to the white line on certain turns.
Also, in stage 1, the leading bot was controlled via keyboard. However, there was a delay between the keyboard input and the bot's movement, and the leading bot moved unreliably. This hindered our tailing at points where our bot could not see the back of the leading bot, and it almost collided.
The TA controlling the leading bot decided to switch to moving it physically with their hands for the subsequent rounds.
We scored a 75/125 for round 1.
Round 2 was a repeat of round 1, but even worse. The same issue with the final stage happened again: our bot did not even attempt to park.
To make things worse, we had increased the gain values of our PD controller for lane following, which caused the bot to start oscillating at points during the first stage; it even detected and stopped at red intersections when it was not supposed to.
Since our code relied on keeping track of the number of red stops to know which stage (or which part of a stage) we were in, this messed with its navigation.
We scored a 65/125 for round 2.
With only 1 more attempt at the course and our final grade on the line, we finally figured out the issue with the final stage. It turned out to be a single if condition that checked the number of red stops to know when to start parking. This condition needed to check that more than 5 red stops had been detected; instead, it was checking whether exactly 5 stops had been detected. (Yes, our code could have been more robust to prevent a situation like this from happening, but we were very short on time by this point.)
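As a hypothetical reconstruction (not our literal code), the bug boiled down to something like this:

```python
# Buggy: the parking check only fires when the counter is exactly 5
if num_red_stops == 5:
    begin_stage_4()

# Fixed: fires once the counter has passed the threshold
if num_red_stops > 5:
    begin_stage_4()
```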
Changing this single condition allowed stage 4 to be initiated. The downside was that we only had one chance for our stage 4 code to work.
After some final tuning of our PD controllers and some small touchups, we built the program on our bot and gave it a go.
We scored a 105/125 for round 3!! This secured us a full 100% for the final project.
We did pretty much perfectly on stages 1 and 2. For stage 3, we only messed up the maneuver around the broken bot once. In stage 4, our bot had to be picked up once because it got stuck while rotating to look for the apriltag with the parking ID, but it eventually found the tag and parked in the right spot.
Unfortunately, this was the only round that we did NOT take a video of and only the people present in the room got to witness it.
WE GOT AN A!