For the final group project in my Embedded Systems class this year, we were tasked with building a robot that could navigate through a simple arena.
Our robot, Raven (Robotic Arduino Vehicle with Enviable Nerves), had to start at one end of the arena and reach the other while another robot did the same. The arena had a black line running down the middle and obstacles blocking the way. To sense the environment, we used two IR light sensors, one proximity sensor, and two bump sensors, with two continuous-rotation servos for drive and an Arduino to tie it all together.
Here is a video demonstrating Raven’s logic.
Raven navigated the arena by following the line until one of her bump sensors was hit. She would then use her proximity sensor to steer around the block and get back on the line, repeating the cycle until she got stuck or made it out of the arena.
Raven operated quite well given the simplicity of her programming and came in second place in the final competition. We couldn’t be more proud of her.
Special thanks to Trudy and Patrick for the great partnership.
For my embedded systems class this semester, our first project was to create a line following robot using an Arduino, two servos, and two IR sensors.
Demonstration:
Stability:
In the videos above, I reset the robot so the light calibration would be accurate, let it scan the line to refine that calibration, and then pressed the button to enter mode 1, which starts the motors. The first video shows a working demonstration, and the second shows how hard it is to make the robot lose its place. While I did not show the robot failing to get back on the line, it is still possible: knock it hard enough and the line ends up outside its turn radius. That said, the robot should not need to deal with those conditions in practice.
The robot itself is simply two continuous-rotation servos, two IR sensors, a button, and an Arduino; most of the project's complexity lives in the Arduino sketch, which can be downloaded below. The program runs four modules. First, a settings module watches for button presses and lets the other modules respond to the modes of operation set by the button. The second module is a thresholder: it averages the analog sensor readings, tracks their minimum and maximum, and converts each reading to a '1' if the sensor detects the line and '0' otherwise. The third is the transition module, which watches how the sensor values change and tells the other modules whether the robot is moving left or right with respect to the line. The last module controls the motors, using the binary sensor values and the left/right value to decide which way to drive. The logic is designed to go forward when the left sensor is on the line and the right sensor is off it; any other combination makes the robot drive toward the line.
Most line-following robots use a multitude of sensors to improve their ability to detect the line. As an Honors project, I will be using a camera instead: the OV7670 camera module, which streams pixels over a parallel data port and exposes its settings through a protocol built on I2C (TWI). At the current stage of development, the camera runs at 2.5 fps at a resolution of 176 x 144, giving me the equivalent of 25,344 IR sensors. The hardware I2C implementation (the Wire library) does not communicate properly with the camera (its 100 kHz clock is too fast), so I created my own library that runs the I2C communication manually at a scalable speed. Using this library, the Arduino can communicate with the camera smoothly. Soon I will create an Arduino sketch to save the camera data to an SD card, and a Java program to interpret the raw data and format it into JPEG or BMP.