Demo Video: https://www.youtube.com/watch?v=S3nTK3ZTKOM
GitHub Code: https://github.com/jyjblrd/SelfDrivingCar
3D Files: https://www.thingiverse.com/thing:4124546

How It Works

From the beginning of this project, I knew that I wanted to make the self-driving car as small as possible. This meant that all the hard computational work had to be done by my macbook, and the computer on the car would only be used to transmit a live video stream and to receive driving commands (accelerate, left, right).

A general flow of events is as follows:

  1. Car sends a message to my macbook asking for new directions
  2. Macbook processes newest frame from the webstream
  3. Macbook sends driving commands to the car
  4. Car receives the driving commands and drives the motors accordingly
  5. Car sends a message to my macbook asking for new directions

This cycle continues until either I stop the program or one of the programs crashes (it is normally the latter).

 

Hardware

My first plan for the self-driving car was to use an esp32-cam microcontroller to send images to my macbook. It seemed like the perfect computer for this project, as it is tiny, only costs a few dollars, and draws very little current. However, the esp32-cam module was plagued with bugs, and I could never get it to operate properly. Even at the lowest resolution, the frame rate was far too low (around 2fps) and the latency was unacceptable, making it unusable.

The next computer I tried was an orange pi i96, which is similar to a raspberry pi zero but a bit cheaper and easier to get. This computer was also very buggy, however, and was almost impossible to get working. After hours of fiddling and countless linux re-installations, I could only barely get a live video stream to work, and even then the wifi randomly disconnected every few minutes, making it almost unusable.

Eventually I decided that buying cheap, badly supported hardware was not worth the extra time and effort required to get it working properly, so I bought a raspberry pi zero w. In comparison to the esp32-cam and i96, the raspberry pi was an absolute joy to work with. Everything immediately worked the way it was supposed to, and within a few hours I had the video stream running reliably. Learn from my mistakes and just buy a raspberry pi; it will save you weeks of frustration and only costs $15 more.

Now that I had the computer figured out, I had to get something to drive it around. My original plan was to use a coke can rc car, but I soon found out that these tiny cars have terrible steering systems. The front wheels have a very small range of motion, making sharp turns impossible. The steering control is also binary in these cars: a little electromagnet turns the front wheels, so you only have three options for steering: hard left, straight, or hard right. Lastly, the cars were geared for very high speeds, which is not what I wanted for my self-driving car. The combination of the bad steering and gearing made the car almost impossible to control, so I ended up making my own drive system. I bought two 120rpm motors and 3D printed a mount for them. The raspberry pi is hooked up to an H-bridge motor driver board, which controls the voltage going to each motor. This new design allowed for much better control of speed and steering, making everything much easier.

In the end I settled on my custom drive system with a raspberry pi zero w and a fisheye camera. I used a 3S 1000mAh turnigy lipo which I had lying around to power the car. I also threw in an arduino mini and an oled display to show the battery voltage. I never properly timed the battery life, but it is somewhere around 45 minutes, which is plenty for testing.

 

Car Software

As stated before, the program running on the car only has to:

  1. Stream a video to the macbook
  2. Receive commands from the macbook
  3. Control the movement of the car

Video Streaming

The car uses the linux program motion to stream the camera's video to the web. I changed a few settings in the config file to lower the stream resolution, increase the frame rate, and convert the video to black and white, which decreased the latency to acceptable levels. It still isn't perfect, but I think it is the best I am going to get, considering that the video has to be compressed, streamed over an http connection, and then decompressed.
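
On the macbook side, reading that stream is nearly a one-liner in opencv. Here is a minimal sketch; the hostname and port are assumptions (8081 is motion's default stream port), so check your own config:

```python
import cv2

# the pi's hostname and motion's default stream port are assumptions here
STREAM_URL = "http://raspberrypi.local:8081"

cap = cv2.VideoCapture(STREAM_URL)   # opencv can decode motion's mjpeg stream
while True:
    ok, frame = cap.read()           # grab the newest frame
    if not ok:
        break
    cv2.imshow("car camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```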

Receiving Commands

The car and macbook communicate in real time over a socket.io connection. I had used socket.io previously in whitebox, which was written entirely in javascript. One thing that really stood out to me was how much easier it is to use socket.io in javascript than in python. I guess socket.io was built to be used by asynchronous programming languages.
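
For a flavour of the car side, here is a minimal sketch using the python-socketio client. The event names, payload shape, and server address are my assumptions for illustration, not the actual protocol from the repo:

```python
import socketio

sio = socketio.Client()

def set_motor_speeds(speed, steer):
    # placeholder: the real code would drive the h-bridge here
    print(f"speed={speed} steer={steer}")

@sio.on("drive")
def on_drive(data):
    # payload shape is assumed, e.g. {"speed": 0.5, "steer": -0.2}
    set_motor_speeds(data["speed"], data["steer"])
    sio.emit("request_directions")        # ask for the next command

sio.connect("http://macbook.local:5000")  # server address is an assumption
sio.emit("request_directions")            # kick off the request/reply cycle
sio.wait()
```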

Controlling The Car

As mentioned in the hardware section, the car has two motors to control its movement. The car receives a target speed and a direction, and has to turn them into a speed for each motor. I just use a little formula to do this (see the sketch below); it is all in the github code.
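
The formula amounts to a standard differential-drive mix. A sketch of the idea, where the gpiozero library and the gpio pin numbers are my assumptions rather than what the repo actually uses:

```python
from gpiozero import Motor

# pin numbers are assumptions; wire them to match your h-bridge inputs
left_motor = Motor(forward=17, backward=18)
right_motor = Motor(forward=22, backward=23)

def drive(speed, steer):
    """speed in [-1, 1], steer in [-1, 1] (negative = left).

    Differential drive: add the steering term to one side,
    subtract it from the other, then clamp to [-1, 1].
    """
    left = max(-1.0, min(1.0, speed + steer))
    right = max(-1.0, min(1.0, speed - steer))
    left_motor.value = left     # gpiozero maps [-1, 1] to pwm + direction
    right_motor.value = right

drive(0.5, -0.2)  # gentle left turn at half speed
```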

 

Macbook Software

Now for the fun stuff: the macbook software. Writing the software to turn images into driving commands was by far the most time-consuming part of this project. I used python with opencv to do all of the image processing. I'll break it up into 4 steps:

  1. Calculating speed of car
  2. Manipulating image into something easy to process
  3. Finding lane lines & center of the lane
  4. Drawing image to macbook screen

How the lane is found

Calculating Speed of Car

I used optical flow to calculate how fast the car is moving. The calculate_speed function takes in the previous frame and the current frame, and uses opencv's built-in optical flow function to calculate the average magnitude and direction of the motion between them, which gives an approximation of the speed of the car. If the program sees that the car is not moving, it tells the car to accelerate more so it can get moving again. This prevents the car from getting stuck on something when it is moving at low speeds.
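
The measurement can be done with opencv's dense optical flow, something like this sketch; the farneback parameters here are the standard example values, not necessarily the ones in my code:

```python
import cv2
import numpy as np

def calculate_speed(prev_gray, gray):
    """Approximate the car's speed as the mean optical-flow magnitude."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        0.5, 3, 15, 3, 5, 1.2, 0)        # standard farneback parameters
    mag, _ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return float(np.mean(mag))           # roughly pixels per frame

# if the speed stays near zero while the motors are being driven,
# the macbook bumps the throttle so the car does not stay stuck
```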

Manipulating Image

There are a few simple manipulations done to the image to turn it into something that is easier to process. The image is first blurred to reduce noise. The adaptiveThreshold opencv function then converts the grayscale image into a 1-bit black and white image, so the whole image is black apart from the lane lines, which are white.
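
The two calls look roughly like this; the kernel size, block size, and threshold constant are illustrative placeholders, not tuned values from the repo:

```python
import cv2

def preprocess(gray):
    """Blur to reduce noise, then threshold so the lane lines come out white."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    binary = cv2.adaptiveThreshold(
        blurred, 255,
        cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
        cv2.THRESH_BINARY_INV,  # invert if the lines are darker than the floor
        11, 2)                  # block size and constant are placeholders
    return binary
```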

Finding Lane Lines and Lane Center

The program takes in a black and white image of the lane lines and returns the direction the car has to turn in order to stay in the lane. It first finds the location of the two lane lines. It then finds the line of best fit for each lane line, which gives two functions that represent (approximately) where the lane lines are. The line of best fit also does a good job of extrapolating where the lane lines go, even when the camera can't see them. These two functions are then used to find the middle of the lane, which is then used to figure out which direction the car should steer to get to the center of the lane. This is a very, very broad overview of what the program actually does, so go check out the code if you want to see exactly how it works. There are tons of comments in there which explain everything.
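
The core of the idea fits in a few lines of numpy. This sketch assumes the lane pixels have already been separated into left and right sets, which glosses over a lot of what the real code does:

```python
import numpy as np

def lane_center_offset(left_pts, right_pts, frame_width, y_eval):
    """Fit a line x = m*y + b to each lane, then steer toward the midpoint.

    left_pts / right_pts are arrays of (y, x) pixel coordinates for each
    lane line; y_eval is the row (near the bottom of the frame) to measure at.
    """
    # line of best fit for each lane, x as a function of y
    left_fit = np.polyfit(left_pts[:, 0], left_pts[:, 1], 1)
    right_fit = np.polyfit(right_pts[:, 0], right_pts[:, 1], 1)

    # evaluating the fits also extrapolates past where the pixels end
    left_x = np.polyval(left_fit, y_eval)
    right_x = np.polyval(right_fit, y_eval)

    lane_center = (left_x + right_x) / 2
    return lane_center - frame_width / 2   # negative = steer left
```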

Drawing Image

This step is purely for debugging. It visualises all of the things I talked about above, so I can see what my program is seeing and get a better sense of what is happening. I used opencv to show all of the videos on my screen.
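
The overlay is just opencv drawing primitives, something along these lines, with the exact markers being illustrative:

```python
import cv2

def draw_debug(frame, left_x, right_x, center_x, y_eval):
    """Overlay the fitted lane positions and the lane center on a frame."""
    h = frame.shape[0]
    cv2.circle(frame, (int(left_x), y_eval), 5, (255, 0, 0), -1)   # left lane
    cv2.circle(frame, (int(right_x), y_eval), 5, (255, 0, 0), -1)  # right lane
    cv2.line(frame, (int(center_x), h - 1), (int(center_x), y_eval),
             (0, 255, 0), 2)                                       # lane center
    cv2.imshow("debug", frame)
    cv2.waitKey(1)
```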

Conclusion

This project was the first time I have used opencv, which I have now come to love. In terms of hardware, this project has taught me to always use well-supported (if more expensive) hardware when prototyping, since the money saved on cheap hardware is not worth the time and effort. If I had simply used a raspberry pi from the beginning, I would have saved hours of frustration. Now that I have succeeded in making a self-driving car with (relatively) expensive hardware, I might go and do it again with the esp32-cam, since it would be really cool to have an army of $10 self-driving cars. Go check out the github code if you want more details on how everything works, and the 3D files are available on thingiverse if you want to print them yourself.