This is a set of three tasks given as part of recruitment to CSD Robocon NITK.
- Milestone-1: Make the bot move in circles of three random radii.
- Milestone-2: Implement localization without using the API functions `getObjectPosition()` and `getObjectOrientation()`.
- Milestone-3: Implement a control system and navigate through the waypoints.
This project uses OpenCV-Python to analyze the visual sensor data and determine the position, orientation, and hence the path progress of the bot. With the locations of the waypoints fed to the bot, it uses a P-system (proportional-only control) on the angular error to orient itself towards the next waypoint.
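As a rough illustration of the localization step, here is a minimal sketch assuming OpenCV >= 4.7 (the `cv2.aruco` module) and a single 4x4 marker mounted flat on the bot; the dictionary choice and the marker mounting are assumptions, not the exact setup used:

```python
import math

import cv2

# Assumed setup: one ArUco marker from DICT_4X4_50, mounted flat on top of the
# bot with its top edge pointing along the bot's heading.
detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters(),
)

def locate_bot(frame):
    """Return (x, y, yaw_deg) of the bot in image coordinates, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    c = corners[0].reshape(4, 2)            # corner order: TL, TR, BR, BL
    x, y = c.mean(axis=0)                   # marker centre approximates the bot position
    dx, dy = c[1] - c[0]                    # top edge vector = heading direction
    yaw = math.degrees(math.atan2(dy, dx))  # image y grows downward: clockwise-positive
    return float(x), float(y), yaw
```

Since image y grows downward, the `atan2` angle here increases with clockwise rotation, which matches the convention described below.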
Axes:

Orientation is 0 degrees when the bot faces exactly opposite to its default starting orientation, so the starting orientation is 180 degrees. The angle increases with clockwise rotation; for example, a 90-degree clockwise turn from the start leaves the bot at 270 degrees.
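Angular arithmetic under this convention needs wrapping so the bot always takes the shorter turn; a small hypothetical helper:

```python
def angle_error(target_deg, current_deg):
    """Shortest signed difference in [-180, 180); positive means "turn clockwise",
    the direction in which angles increase under this convention."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

# E.g. the 90-degree clockwise turn from the starting pose described above:
assert angle_error(270.0, 180.0) == 90.0
```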
Unfortunately, I couldn't complete the implementation within the given time frame. Here is the logic I would proceed with, given about a day or so more:
- The waypoints are passed to the robot as a list.
- The robot has its own orientation, o1, given by the ArUco tracking system, and the orientation needed to reach the next waypoint, o2, calculated from the list of waypoints.
- The robot uses PID or a similar damped error-correction system to match o1 with o2 while driving.
- Upon reaching a waypoint (within a given tolerance), o2 is updated and the previous step repeats, except when the waypoint is the goal. A sketch of this loop follows the list.
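A minimal sketch of that loop, assuming a hypothetical `locate_bot()` returning the bot's pose (x, y, o1) from the ArUco tracker and a hypothetical `set_wheel_speeds(left, right)` motor command; only a proportional term is shown, and all constants are placeholders:

```python
import math

KP = 0.02          # proportional gain (assumed; needs tuning)
BASE_SPEED = 2.0   # cruise speed (assumed units)
TOLERANCE = 10.0   # arrival radius around a waypoint (assumed, pixels)

def navigate(waypoints, locate_bot, set_wheel_speeds):
    """Drive through waypoints, steering by proportional angular-error correction."""
    for wx, wy in waypoints:
        while True:
            x, y, o1 = locate_bot()                    # pose from the ArUco tracking system
            if math.hypot(wx - x, wy - y) < TOLERANCE:
                break                                  # waypoint reached: take the next one
            # o2: heading needed to face the current waypoint (clockwise-positive frame)
            o2 = math.degrees(math.atan2(wy - y, wx - x))
            error = (o2 - o1 + 180.0) % 360.0 - 180.0  # shortest signed turn
            turn = KP * error
            # Positive turn -> left wheel faster -> clockwise rotation (assumed layout).
            set_wheel_speeds(BASE_SPEED + turn, BASE_SPEED - turn)
    set_wheel_speeds(0.0, 0.0)                         # stop at the goal
```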
The error-correction system has some detail to it, which I couldn't manage to implement due to the shortage of time.
- I finally got the control system figured out. It is a P-system: it uses only a proportional term for error correction (see the sketch after this list).
- The orientation system has been changed. The starting orientation is now 90 degrees, with 0 degrees defined as the bot facing waypoint 2 from waypoint 1.
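Tying the two notes together, here is a minimal sketch of the final proportional step under the new convention; the frame conversion and the gain are assumptions, not the exact submitted code:

```python
import math

KP = 0.02  # proportional gain (assumed; needs tuning)

def heading_in_course_frame(yaw_img_deg, wp1, wp2):
    """Re-express an image-frame yaw so 0 degrees means facing waypoint 2
    from waypoint 1 (assumed conversion)."""
    base = math.degrees(math.atan2(wp2[1] - wp1[1], wp2[0] - wp1[0]))
    return (yaw_img_deg - base) % 360.0

def p_turn_command(o1_deg, o2_deg):
    """Proportional-only control: turn rate scales with the wrapped error."""
    error = (o2_deg - o1_deg + 180.0) % 360.0 - 180.0  # shortest signed turn
    return KP * error
```

With this conversion, the starting pose should read as 90 degrees in the course frame, consistent with the note above.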