
How to make an autonomous car (code included)


[Photo: the finished autonomous car]

I’ve just finished a side project with my friend Kendrick. (his GitHub)  We built an autonomous car that you can teach how to drive, and it then drives around by itself.  I did all of the hardware/Arduino software and Kendrick did all of the machine learning software.  He called his project Suiron and it’s available on GitHub here.  The code running on the Arduino is called car-controller and is available on my GitHub here.

Now that you’ve got the links, feel free to go and have a look.  Work through the code and try to figure out how it works.  I’ll briefly cover the software here, but my main focus will be on the hardware.  One thing to note is the open source licences: all my stuff is GPL and Kendrick’s Suiron project is MIT.

This post is intended more as an overview of how the whole thing works.  If I get time I might turn it into a tutorial on how to get it working yourself.

Before we begin, here is a short video of it in action.

Now onto the fun stuff!  How does it work?

The Hardware

These are the main components used.

1) Remote control car – we used this car (link), but anything of a similar size will work as long as it has a standard ESC and steering servo.  It comes with a remote control, battery and charger to start with, but I recommend buying a new radio control system. (see item 6 below)

[Photo: the remote control car]

2) Intel NUC – the Raspberry Pi doesn’t really have enough power and is ARM-based.  An x86-based processor like the i5 in our NUC is much easier to use for machine learning purposes.  The exact one you use doesn’t matter.

[Photo: the Intel NUC]

3) Battery for NUC – a standard laptop power bank was used to power it.  This one from Kogan gives about 6-10 hours of runtime.  (link)

[Photo: the power bank]

4) Logitech C920 webcam – buy from any reputable retailer.  You could probably use a cheaper model, but we had better luck with the C920. (link)

[Photo: the Logitech C920 webcam]

5) Lens filters – if you are operating in any sunlight, you will want a polarising and an ND (neutral density) filter.  The camera just can’t cope with harsh sunlight and shadows, so these filters bring the conditions back to something much more manageable.  A variable ND is great as it lets you adjust the “darkness” level.

[Photo: the lens filters]

6) Radio control system – if you intend on doing lots of this stuff, get an FrSky Taranis.  You won’t be disappointed.  Otherwise, a Turnigy 9XR will work just as well.  Make sure you get a receiver too if it isn’t included.

[Photo: the FrSky Taranis]

7) You’ll also need an Arduino.  I like the Arduino Nano because it’s super cheap and has onboard USB.

[Photo: the Arduino Nano]

If you want some specific help choosing components, just leave a comment.

I won’t go into detail on how to wire everything, as this isn’t a tutorial.  However, if you need some help, drop a comment below.  I suggest you learn how an ESC (electronic speed controller) works together with a motor, receiver, servo and battery; this is the standard setup on normal remote control cars.  Once you understand that, look at Arduinos and how to use them to blink lights and read inputs.  Read through the Arduino code and the wiring should be pretty self-explanatory.

How it all fits together

It’s up to you how you put everything together.  I recommend keeping everything as low as possible for better stability when driving.  The webcam needs to be mounted up high so it has a better chance of seeing the lane that it’s in.  I just used a square bit of balsa wood, as it’s really light and strong, and glued the webcam to it.  Instead of explaining exactly how I mounted everything, I’ll dump a few pictures here.  All the white parts are 3D printed, but you could easily do it without a 3D printer.

[Photos: the assembled car from several angles]

Polarising/ND Filter

The importance of a polarising filter cannot be overstated.  It reduces reflections and the harsh glare sometimes encountered.  In the image below (credit) you can see how much of a difference a polarising filter can make.  Water is a bit of an extreme example, but I chose that picture because it makes the difference easy to see.  In reality, where we’re operating, the difference won’t be so obvious.

[Image: the same scene with and without a polarising filter]

The neutral density filter is just as important as the polarising filter, if not more so.  The ND filter is basically sunglasses for the webcam.  The webcam doesn’t like really harsh light, so the ND filter reduces its intensity without interfering with the image too much.  The picture below (credit Wikipedia) shows how much better the right ND filter can make an image in harsh light.

[Image: the effect of an ND filter in harsh light]

I suggest making the lens filters removable, as they make the image too dark in low-light situations.  For example, the setup was perfect at midday but much too dark a few hours later, just before dusk.  I made a simple mount that uses an alligator clip to hold the filters in place.  The filters are glued together, then onto a small 3D-printed right-angle mount.

[Photos: the filter mount with alligator clip]

The Arduino

The diagram below shows how everything is hooked up.  Basically, the Arduino is the “brains” of the hardware.  It reads in the values from the R/C receiver (bottom left) and then decides what to do based on the mode channel.  Dig through the Arduino code (link) to see exactly how.  There are three modes: manual, autonomous and emergency stop.

In manual mode the Arduino reads in the steering and motor values and passes them straight through to the motor and steering servo.  In this mode, with the right flag enabled, it also sends those values back over UART every time it receives a character.  (Sending only on request prevents the serial buffer from filling up and “lagging”.)  In autonomous mode the Arduino reads inputs over UART from the NUC.  In this mode it receives two messages, steer,x and motor,x, where x is the value you want to set.  It then writes those outputs to the steering servo or motor.  Finally, emergency stop kills the motor output and straightens the steering servo.  Emergency stop overrides any manual or autonomous control.

[Diagram: wiring between the R/C receiver, Arduino, NUC, ESC and steering servo]
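To make that protocol concrete, here is a minimal sketch of the NUC side of the UART conversation, using pyserial.  The port name, baud rate and exact reply format are my assumptions for illustration; check car-controller and Suiron on GitHub for the real implementation.

```python
# Hypothetical sketch of the NUC side of the UART protocol described
# above. Port, baud rate and reply format are assumptions.
import serial

ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=0.1)

def read_manual_values():
    """Manual mode: send any character and the Arduino replies with
    the current human-set steering and motor values."""
    ser.write(b'?')                          # any byte triggers a reply
    reply = ser.readline().decode().strip()  # e.g. "steer,90 motor,100" (assumed format)
    parts = dict(p.split(',') for p in reply.split())
    return int(parts['steer']), int(parts['motor'])

def set_outputs(steer, motor):
    """Autonomous mode: send the steer,x and motor,x messages."""
    ser.write('steer,{}\n'.format(steer).encode())
    ser.write('motor,{}\n'.format(motor).encode())
```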

The Machine Learning Part

This isn’t my area of expertise, so I’ll briefly summarise what it’s doing.  (not really how it’s doing it, no one really knows)  We used a library called TensorFlow, an open source machine learning library published by Google under an Apache licence.  It has a nice Python API and a “no nonsense” C++ API.

Collecting data

This is a really short summary of the whole process.  Each time a video frame is recorded, Suiron (the software on the NUC) asks car-controller (the software on the Arduino) what the human operator is doing.  Remember, in manual mode the human operator is driving the car around.  Car-controller responds by sending the current steering and motor values back to Suiron.  Suiron takes these values and saves them along with a processed version of the frame.
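In code, the collection loop might look something like the sketch below.  The frame size, file layout and helper names are illustrative assumptions, not Suiron’s actual code; read_manual_values() is the serial helper from the earlier sketch.

```python
# Illustrative data-collection loop: grab a frame, ask the Arduino
# what the human is doing, save both. Not Suiron's actual code.
import csv
import time
import cv2

cap = cv2.VideoCapture(0)                          # the C920 webcam
log = csv.writer(open('training_log.csv', 'w', newline=''))

frame_id = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    steer, motor = read_manual_values()            # query over UART (see sketch above)
    small = cv2.resize(frame, (320, 120))          # assumed preprocessing
    cv2.imwrite('frames/%06d.jpg' % frame_id, small)
    log.writerow([frame_id, steer, motor])
    frame_id += 1
    time.sleep(1 / 30.0)                           # roughly the 30 Hz described below
```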

This process happens at about 30 Hz (30 times per second) for as long as you record data.  In the final model, we used about 20 minutes worth of training data.  That is 20 minutes of continuously driving around the track, or roughly 36,000 frames.  It may not seem like a lot, but it gets repetitive very quickly. 😉  In reality, 20 minutes is nowhere near enough data.  It works great on this particular track under similar lighting conditions, but would likely fail if the conditions changed too much.

Training the model

Again, I’m not an expert at this, but I’ll try to briefly explain how the training works.  Convolutional Neural Networks (CNNs) are weird in the way they work; it’s practically impossible to know exactly how or why a trained network does what it does.  Basically, we’re giving TensorFlow the frame and two numbers (steering and motor) and asking it to work out how the frame relates to those two numbers.  After seeing tens of thousands of example frames, it can try to generalise a model.
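For a rough idea of what such a model looks like, here is a minimal sketch using the modern tf.keras API (the original project used an older TensorFlow interface).  The layer sizes and input shape are illustrative guesses, not Suiron’s actual architecture.

```python
# Minimal CNN regression sketch: frame in, [steering, throttle] out.
# Layer sizes and input shape are assumptions, not Suiron's architecture.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(120, 320, 3)),                       # processed camera frame
    tf.keras.layers.Conv2D(24, 5, strides=2, activation='relu'),
    tf.keras.layers.Conv2D(36, 5, strides=2, activation='relu'),
    tf.keras.layers.Conv2D(48, 3, strides=2, activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(100, activation='relu'),
    tf.keras.layers.Dense(2),                                  # [steering, throttle]
])
model.compile(optimizer='adam', loss='mse')    # regression, so mean squared error
# model.fit(frames, labels, epochs=..., batch_size=...)  # train on the recorded data
```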

Because of the amount of computing power required, it takes a very long time to train a good model.  Due to the type of calculations involved, TensorFlow runs much faster on a dedicated GPU.  Even with only 20 minutes of data, our model took half a day to train properly.  The training took place on a desktop with a borrowed GTX 980, a GPU towards the higher end of consumer graphics cards.

Using the model

You can see it in action in the gif below.  The blue line is what the model thinks it should do; the green line is what I actually did when I was steering.  Note that this data was not included in the training set, to check that the model generalises to data it hasn’t seen.

[Animated GIF: predicted steering (blue) vs. human steering (green)]

Once it has been trained we can then use the model.  Basically, what happens is we collect just a frame from the webcam.  Then we pass it to TensorFlow and ask it to run it through the model.  The model then spits out what it thinks our two values should be, one for steering and one for throttle.  At the moment the throttle output is unused and the car runs at a constant speed.  However, we thought we’d include it in case we want to use it in the future.
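Putting the earlier sketches together, the autonomous-mode loop might look like this.  It reuses the hypothetical cap, model and set_outputs() from the sketches above; the fixed motor value and the normalisation are assumptions.

```python
# Illustrative autonomous loop: one frame in, two numbers out,
# steering sent to the Arduino. FIXED_MOTOR is an assumed constant.
FIXED_MOTOR = 100                                   # constant speed, as described above

while True:
    ok, frame = cap.read()
    if not ok:
        break
    small = cv2.resize(frame, (320, 120))
    batch = small[None].astype('float32') / 255.0   # add batch dim, normalise
    steer, _throttle = model.predict(batch)[0]      # throttle output currently unused
    set_outputs(int(steer), FIXED_MOTOR)            # steer,x / motor,x over UART
```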

Update: Clive from hobbyhelp.com reached out to me after seeing this. He’s got a pretty cool “Ultimate beginners guide to RC cars” article on his website here. I recommend checking it out if you want to get started doing something similar to this project.

56 replies on “How to make an autonomous car (code included)”

[…] A self-driving car may seem as though it is beyond the abilities of a Hackaday reader, but while it might be difficult to produce safe collision avoidance of a full-sized car on public roads it’s certainly not impossible to produce something with a little more modest capabilities. [Jaimyn Mayer] and [Kendrick Tan] have done just that, creating a self-driving R/C car that can follow a complex road pattern without human intervention. […]

So real-time convnets on an Arduino or Raspberry Pi are still not feasible? Kind of sucks you were forced to use a NUC.

Not really; we had to use the Intel NUC because we couldn’t get TensorFlow to run on the ARM-based Raspberry Pi. The Intel NUC has an i5, which is x86-based.


Great job, it is good to see TensorFlow working on a practical application, not just identifying flowers!

Would it be possible to have more detail on how you matched the video frames with the steering commands? And how you passed this info into the training process, and finally how the inference code runs live? Best would be making this available on GitHub so we can replicate and learn… human learning this time 😉


I think so, I can’t see why not. When we collect the data it’s at about 30 FPS so that’s where that number came from.

The big questions are how it handles changes in shadow, and whether the animated GIF was using the polarising filter. Quite a bit of glare still shows up.

It depends on the training data. We didn’t have much of it, so it struggled a little with the highlights. (we trained it mostly in shadow) That gif is with the polarising and neutral density filters on the lens; it was quite bad without them.

This is excellent. I think I speak for everyone when I ask that you guys make a tutorial for this fantastic project. Thanks again!!

Thanks for the kind words. I am swamped with university assessment and work at the moment so it won’t be for a little while. I’ll try to make a tutorial within the next month but no promises! 😉

Could you please give me links or specific models for the lens filters, and the specific model of Arduino with a link to buy it?
Thank you

Hi Steven, the lens filters were just a generic polarising and neutral density (ND) filter. The exact ones don’t matter; we just bought the cheapest ones we could find on eBay. Any Arduino will work; we used an Arduino Nano because it’s tiny and has USB on board. These can also be found on eBay.

This is one of the coolest things I have read online.
Planning to do one on my own.
Thanks for sharing it. It’s awesome.

Thank you for that, but I want to do something similar with just the software, without the hardware, and I can’t afford a good GPU. What would you advise?

You can’t really do it without the hardware, and it’s impossible to test without the car setup. You don’t have to use a GPU, but it will take days to train your model each time you change something.

Nice project as a deep learning introduction 🙂

“Basically, what happens is we collect just a frame from the webcam. Then we pass it to TensorFlow and ask it to run it through the model.”

How much time does this step take?

I have no idea; the closest I can say is under a second. 😉 It operates at around 30 FPS, as I mention in the post.

So cool. I am planning to make one that wirelessly transfers video to a server and sends the control data back to the car over WiFi.

Thanks for sharing.

Thank you for sharing great work.
I am duplicating the work to understand how to use this technology. Some questions on your project:
1) There is no servo movement for throttle and steering while collecting data. Do you have any conditions on those outputs?
2) What is the meaning of the LED on D13 of the Arduino Nano?

1) We are collecting the human-set values for both steering and throttle while collecting data.
2) I just had a look at my code and apparently it blinks every single loop! I’ll have to fix that, as it should only blink every time it receives a value to set. 😉

Very nice!! Thanks for sharing.

If the car is self-contained with its own brain (the NUC), what was the purpose of the radio control system?

Because it isn’t perfect, and we also need some way of collecting training data. Plus, it’s fun to drive around manually 😉

Why didn’t you consider a WiFi or other wireless webcam, sending real-time control commands through the wireless controller the car already has?

It would be much simpler and much cheaper than modifying the controller of the car and purchasing an expensive i5 NUC.

The only hardware needed would be an unmodified RC car with a wireless webcam attached.

I’m a little confused, as what you suggest isn’t possible. First off, the car doesn’t have a “controller”, so I didn’t actually modify it; I simply replaced the normal receiver with outputs from an Arduino. The i5 NUC was a little overkill; an i3/Atom would suffice. You don’t have to use a NUC, you just need an x86-based processor, as TensorFlow won’t run properly otherwise. A NUC happens to be the smallest/most practical option available.

The latency involved with a WiFi webcam is way, way too high. The best case I’ve seen was with my custom-built webcam on a Raspberry Pi, and at 200 ms even that was too slow! We are doing real-time control of a car, so it needs to be quick; if it were half a second behind (the best case with a normal WiFi webcam), think how hard it would be to steer. 😉

Nothing is “real time” except for the Arduino. The NUC just talks over serial and the Arduino handles all the “real time” stuff like PWM output etc.

With an IP camera like this:
https://ryanmessina.wordpress.com/2013/04/30/wireless-webcam-for-opencv/

the delay was around 66 ms, but I don’t know about the needs of your project. Is that too much?

Another option, I think, is using an analog camera with an analog transmitter and digitising the analog video feed with a cheap TV tuner card. The analog world has far less delay.

Of course, your way of doing this project is perfect too, and you didn’t have to deal with any delay. But you know, there is this compromise in engineering: is it worth including more expensive parts and more engineering work, or is it better to deal with the less optimal solution?

If the 66 ms delay is not too much, the project could be done with much less effort. Buy an RC car and use it as-is: its original radio controller is used, plus the transmitter is modified to connect to the PC somehow (e.g. using an Arduino placed beside the PC and the transmitter, which is more convenient than placing it inside the car), and the wireless IP camera sends the video feed to the PC. The PC deals with all the neural network stuff. This sounds much simpler; however, the devil is in the details and I would never know until trying it out.

What do you think?

Hi, really great post dude. Two years ago, when self-driving cars were hot in the news, I tried searching for this same topic on the internet but couldn’t find anything; now this helps me very much. Thanks a lot for this tutorial.
But is there any way to get it working with a Raspberry Pi, as it is cheaper than an Intel NUC and I have a board available with me? I have an Intel Edison as well; would that work? The NUC seems four times costlier than the Edison or Raspberry Pi. Please suggest some cheaper alternatives. Thank you again!

Hi Vinay, thanks for your comment; we’re glad it helped. Unfortunately, I don’t think it will work on an Intel Edison – there isn’t enough processing power. You might be able to get something working on a Raspberry Pi 2 or 3, but at the time support wasn’t great for the libraries we used.

I’m truly inspired by this project and am trying to duplicate it. Could you be kind enough to post a photo of the connections for the Arduino and the receiver, steering servo, and ESC? I’m having trouble wiring it up and am new to the electronics/Arduino hardware side of things.

Which operating system and which framework are you using to run TensorFlow on the Intel NUC?
