Autonomous Model Vehicle
(Elgoo Vehicle)
GitHub Project & Documentation: HERE
Why I Chose the AMV Project
In my Artificial Intelligence class, we were tasked with developing a project that involved a "learning" algorithm. The project had no set boundaries; instead, my professor gave us the freedom to pursue anything we were passionate about. My inspiration came from a Google video posted on YouTube titled "Self-Driving Car Test: Steve Mahan." As of this writing, autonomous vehicles from manufacturers including Tesla, Acura, BMW, and Ford have become "normalized," and these companies continue to improve their AI technology through rigorous testing while working to meet stringent regulations. When I first watched that Google video, though, I was fascinated by the idea that one day we would have self-driving cars. It's funny, because back then the notion of autonomous vehicles (or even electric vehicles) becoming "standardized" seemed farfetched. Ever since then, I have wondered how these vehicles are able to move on their own. In college, that became a question of how working code interacts with mechanical components. Hence, I wanted to build a model vehicle that simulated the movements of a self-driving car. That said, this project is in no way close to emulating a working smart car. It did, however, deepen my knowledge of Convolutional Neural Networks, Arduino circuit-board logic, and how C++ code works with hardware.
CNN Preview
Convolutional Neural Networks, or CNNs, specialize in detecting patterns and making sense of them. This pattern detection is especially useful for image analysis. Deeper within the network are hidden layers (convolutional layers), each of which follows the same protocol: receive an input, transform it, and pass the transformed output to the next layer. Each layer contains filters, which are responsible for detecting patterns. In image analysis, these patterns include edges, shapes, textures, objects, and more! For example, one filter might detect edges, making it an edge detector; others may detect corners, circles, or squares. These simple filters are what we typically see at the start of the network. The deeper the network goes, the more sophisticated the filters become. In later layers, filters can detect more complete features like eyes, ears, hair, fur, feathers, scales, beaks, and more! This continues even further in the deepest layers, where filters can recognize full dogs, cats, humans, lizards, birds, and of course much more!
How the Arduino Worked with the AMV
The Arduino Uno played an important role: it served as the brain of the vehicle. The Arduino comes with its own integrated development environment, available as a free download from Arduino’s main website, which also hosts a forum where Arduino enthusiasts can upload their sketches/programs for the community to reuse. The IDE accepts both C and C++ as programming languages. The Uno itself provides 32 kilobytes of flash memory and 2 kilobytes of RAM.
The AMV Finished Product
The finished product is an autonomous model vehicle with two modes: object detection (obstacle avoidance) in one, and line detection in the other. The modes can be switched via two programmed buttons on a remote control.
The critical piece of hardware enabling obstacle avoidance is the HC-SR04 ultrasonic sensor, which uses sonar to determine the distance to an object. The trigger pin fires a high-frequency pulse from the transmitter; if there is an object or obstacle in its path, the pulse bounces back and is received by the echo pin. The time between transmission and reception allows us to calculate the distance to the object. If an object is detected, the Servo Motor is activated, sweeping through angles from 0 to 180 degrees.
The other piece of hardware, which drives the line-detection mode, is the infrared module located underneath the vehicle. The line-detection sensor works by emitting light from its own infrared LED and measuring how much of that infrared light is reflected back. The car traverses its pre-made track as follows (assuming the car starts with the black line running between its two sensors): if the left sensor detects the black line, the car turns left; if the right sensor detects the black line, the car turns right; and if neither sensor detects the black line, the car continues moving forward. It keeps re-adjusting itself indefinitely.
Final Thoughts
While the project is nowhere near a fully optimized autonomous car, it does teach the basics of AI technology and promotes Arduino projects. The only real-world applications I can see for this car are something like an automatic roaming vacuum, a project for a student (unintended irony), or, with some logic changes, an RC vehicle. I encourage anyone to research and develop projects with an Arduino microcontroller or a Raspberry Pi; both give you hands-on experience working with code and hardware.
Thanks for Reading! All the documentation I compiled consists of papers I wrote for other classes :) Feel free to read them or use any of the information for your own work!