
How to add OVMS (open vehicles) to Home Assistant

OVMS (Open Vehicles) is a great hardware module that connects to my Nissan LEAF and allows me to perform remote functions like turning on the climate control, checking charge status, range, etc. I’m also a big fan of Home Assistant, and have almost everything in my house hooked up to it. There is no official (or unofficial) integration for OVMS with Home Assistant. However, OVMS has an HTTP API, and Home Assistant supports generic RESTful sensors.

Read on to find out how to hook up your OVMS module to Home Assistant!

Getting Started

Firstly, you’ll need your OVMS module to be hooked up, configured correctly, and working with the default OVMS app. Once your app is connected to your OVMS module and you can see live data coming through, it’s time to move on.

Generating an API Token

You’ll need to generate an API token from the openvehicles.com API. To do this, open up the Terminal on your computer. Once there, type the command below and hit enter, replacing <USERNAME> and <PASSWORD> with the OVMS username/password you use to log in to openvehicles.com.

curl --location --request POST 'http://api.openvehicles.com:6868/api/token?username=<USERNAME>&password=<PASSWORD>'

After you run that command, you will see output similar to the example below. Copy your API token (the value of the token field) somewhere safe.

{"application":"notspecified","owner":"<YOUR_USERNAME>","permit":"auth","purpose":"notspecified","token":"RiVINShnbS0wNG5tJUlNYUZJbUNeR1NcYSdwM0l7aDpWOyE2QkQxSCwrLWh8Ow"}

Find your list of metrics

Now you need to find a list of all the metrics that you want available in Home Assistant. There are three main collections of metrics that the OVMS API makes available: Status, Charging, and Location. If you have a 2012-era Nissan LEAF like I do, skip below and copy my config file. Otherwise, read on.

To find all of the available metrics, run the following commands in your terminal. Make a note of all the metrics that you want available in Home Assistant. Be sure to replace <USERNAME> with your username, <VEHICLE_ID> with your OVMS vehicle ID, and <YOUR_API_TOKEN> with the token you retrieved earlier.

Status

curl --location --request GET 'http://api.openvehicles.com:6868/api/status/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>'

Charging

curl --location --request GET 'http://api.openvehicles.com:6868/api/charge/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>'

Location

curl --location --request GET 'http://api.openvehicles.com:6868/api/location/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>'

Home Assistant Configuration

Now you’ll need to configure Home Assistant to retrieve data from the OVMS API and pull out the metrics that you want. Add a configuration like the one below to the sensor section of your configuration.yaml. You can set the scan_interval to whatever you’d like, but be considerate: don’t poll more often than your OVMS module sends updates, and keep it to at least every 60 seconds.

sensor:
  - platform: rest
    scan_interval: 120
    name: car_status
    resource: http://api.openvehicles.com:6868/api/status/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
    value_template: "{{ value_json.soc }}"
    json_attributes:
      - soh
      - soc
      - etc...

  - platform: rest
    scan_interval: 120
    name: car_location
    resource: http://api.openvehicles.com:6868/api/location/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
    value_template: "{{ value_json.longitude }},{{ value_json.latitude }}"
    json_attributes:
      - longitude
      - latitude
      - etc...

  - platform: rest
    scan_interval: 60
    name: car_charging
    resource: http://api.openvehicles.com:6868/api/status/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
    value_template: "{{ value_json.chargestate }}"
    json_attributes:
      - battvoltage
      - cac100
      - carawake
      - caron
      - etc...

Save and validate

Save the new configuration and use the handy “Check Configuration” button on the Configuration > Server Controls page. If there are no errors, restart your Home Assistant server.

When the server starts back up, you should see some new entities called sensor.car_status, sensor.car_location, and sensor.car_charging. Use these in your automations or expose them via HomeKit like I did! Check below for my full configuration, including HomeKit-friendly template sensors (although HomeKit does not support this very well as it has no native EV support).
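As an example of the automation side, here’s a minimal sketch (not from my actual setup) that sends a notification once the state of charge passes 80%. The notify service name is a placeholder – swap in your own:

automation:
  - alias: "Notify when the car reaches 80% charge"
    trigger:
      - platform: numeric_state
        entity_id: sensor.car_status   # its state is the soc value from the rest sensor
        above: 80
    action:
      - service: notify.mobile_app_my_phone   # placeholder notify service
        data:
          message: "The LEAF is at {{ states('sensor.car_status') }}% charge."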

HomeKit Example

You can see what this looks like in the screenshot from my iPhone below. I’ve configured the SoC as a humidity sensor so it reads as a percentage, and the range as an illuminance sensor. Unfortunately, HomeKit lacks an EV entity type, so this is the best I could come up with.

If you name the sensors something appropriate, you can even ask Siri to tell you the state of charge or range, if you can put up with the slightly odd responses since Siri thinks they’re different types of sensors. Hopefully Apple adds an EV entity to HomeKit in the future, so a proper integration can be made!
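If you go the HomeKit route via Home Assistant’s HomeKit integration, a filter along these lines (a minimal sketch – adjust the entity list to taste) exposes just the HomeKit-friendly sensors from the config below:

homekit:
  filter:
    include_entities:
      - sensor.car_soc_homekit
      - sensor.car_range_homekit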

Example config for a 2012 Nissan LEAF

- sensor:
    - name: car_soc_homekit
      state: "{{ state_attr('sensor.car_status', 'soc') }}"
      icon: "mdi:car-electric"
      device_class: humidity
      unit_of_measurement: "%"

    - name: car_range_homekit
      state: '{{ (float(state_attr("sensor.car_status", "estimatedrange")) * 1.25) | int }}'
      icon: "mdi:speedometer-slow"
      device_class: illuminance

    - name: car_soc
      state: "{[ state_attr('sensor.car_status', 'soc') }}"
      icon: "mdi:car-electric"
      device_class: battery

    - name: car_range
      state: '{{ (float(state_attr("sensor.car_status", "estimatedrange")) * 1.25) | int }}'
      icon: "mdi:speedometer-slow"

- binary_sensor:
    - name: car_charging
      state: "{{ state_attr('sensor.car_status', 'charging') }}"
      icon: "mdi:ev-station"

- platform: rest
  scan_interval: 60
  name: car_status
  resource: http://api.openvehicles.com:6868/api/status/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
  value_template: "{{ value_json.soc }}"
  json_attributes:
    - alarmsounding
    - bt_open
    - cac100
    - carawake
    - carlocked
    - caron
    - chargestate
    - charging
    - charging_12v
    - cooldown_active
    - cp_dooropen
    - estimatedrange
    - fl_dooropen
    - fr_dooropen
    - handbrake
    - idealrange
    - idealrange_max
    - mode
    - odometer
    - parkingtimer
    - pilotpresent
    - soc
    - soh
    - speed
    - staleambient
    - staletemps
    - temperature_ambient
    - temperature_battery
    - temperature_charger
    - temperature_motor
    - temperature_pem
    - tr_open
    - tripmeter
    - units
    - valetmode
    - vehicle12v
    - vehicle12v_current
    - vehicle12v_ref

- platform: rest
  scan_interval: 60
  name: car_location
  resource: http://api.openvehicles.com:6868/api/location/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
  value_template: "{{ value_json.longitude }},{{ value_json.latitude }}"
  json_attributes:
    - altitude
    - direction
    - drivemode
    - energyrecd
    - energyused
    - gpslock
    - invefficiency
    - invpower
    - latitude
    - longitude
    - power
    - speed
    - stalegps
    - tripmeter

- platform: rest
  scan_interval: 30
  name: car_charging
  resource: http://api.openvehicles.com:6868/api/status/<VEHICLE_ID>?username=<USERNAME>&password=<YOUR_API_TOKEN>
  value_template: "{{ value_json.chargestate }}"
  json_attributes:
    - battvoltage
    - cac100
    - carawake
    - caron
    - charge_estimate
    - charge_etr_full
    - charge_etr_limit
    - charge_etr_range
    - charge_etr_soc
    - charge_limit_range
    - charge_limit_soc
    - chargeb4
    - chargecurrent
    - chargeduration
    - chargekwh
    - chargelimit
    - chargepower
    - chargepowerinput
    - chargerefficiency
    - chargestarttime
    - chargestate
    - chargesubstate
    - chargetimermode
    - chargetimerstale
    - chargetype
    - charging
    - charging_12v
    - cooldown_active
    - cooldown_tbattery
    - cooldown_timelimit
    - cp_dooropen
    - estimatedrange
    - idealrange
    - idealrange_max
    - linevoltage
    - mode
    - pilotpresent
    - soc
    - soh
    - staleambient
    - staletemps
    - temperature_ambient
    - temperature_battery
    - temperature_charger
    - temperature_motor
    - temperature_pem
    - units
    - vehicle12v
    - vehicle12v_current
    - vehicle12v_ref

I’ve launched an online store!

I’ve been enjoying my hardware related projects recently. I’ve even had a few people ask about buying some of them so I decided to launch an online store. Being a sucker for cute animals and a proud Australian, I wanted to name my store after a cute, native Australian animal. Hence, the name Koala Creations was born. You can access my online store by clicking here.

I’ve also set up a store on Tindie if you’d prefer to purchase things through a marketplace-style site (like Etsy). Click here to go to my Tindie store.

I don’t have a huge range of products available at the moment, but stay tuned as my awesome 18650 battery tester is in its final stages of development, and my 12v power distribution board for electric vehicle (EV) conversions is coming along nicely.

Please check it out and let me know if there’s anything else you’d like to see me make! 😄


Update to my electric motorcycle project

For those who don’t know, I’ve been blogging about my electric motorcycle conversion over at https://ebandit.bike. I’ve gotten back into the swing of things now that a lot of places are opening back up after COVID-mandated shutdowns. I’ve put together a really detailed post explaining all about the BMS, the motor I selected and the ELV system. You can read all about it on my other blog right here.


My smart home system

I’ve been an eager smart home (or home automation) enthusiast for a number of years now. My end goal is always changing, but it’s generally been to automate as many things as possible and make it as convenient as possible to control all of my lighting and appliances. My smart home system has grown to be quite complex so I’ve started documenting it.

To start with, I’ve put together a system diagram showcasing all of the different components and how everything is connected.

A diagram of my smart home system as of January 2020.

Here is a quick summary of all the different protocols and the different components that rely on them.

ZigBee – CC2531 Dongle (zigbee2mqtt)

Zigbee2mqtt is a fantastic open source project. It aims to bring together all the products from various companies so they can all use a single hub. Currently most vendors have a proprietary hub and there’s little compatibility between them, which is surprising given ZigBee is an open standard just like WiFi. Zigbee2mqtt has a set of converters that allow you to add support for almost any device and expose a control/status API over MQTT.
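To give you a feel for that API, here’s a rough sketch of turning a bulb on with paho-mqtt. The zigbee2mqtt base topic is the default one, “living_room_light” is a made-up friendly name, and the broker address is a placeholder:

import paho.mqtt.publish as publish

# zigbee2mqtt listens on <base_topic>/<friendly_name>/set for commands.
publish.single(
    "zigbee2mqtt/living_room_light/set",   # made-up friendly name
    payload='{"state": "ON"}',
    hostname="localhost",                  # your MQTT broker
)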

ZigBee is “created on IEEE’s 802.15.4 using the 2.4GHz band and a self-healing true mesh network”. It’s especially ideal for sensor and IoT networks as it is a true mesh network that re-organises itself and relays messages between nodes. It’s also extremely low powered which makes it great for tiny battery powered sensors.

I’m slowly moving all of my ZigBee devices onto this network. This lets me benefit from having fewer hubs and a bigger, more reliable ZigBee network. Most of my fixed LED downlights have ZigBee light switches that act as repeaters, as they’re always powered.

ZigBee – Philips Hue

Although they’re great, I’m migrating away from the Philips Hue lineup to the IKEA range for consistency reasons. Otherwise, the Hue bulbs are the best-quality, best-functioning smart bulbs I’ve used.

ZigBee – IKEA TRÅDFRI

IKEA’s range of smart lighting products is absolutely fantastic. They are incredibly good value, great quality and work well. You can get a dimmable smart bulb for about $15 AUD!

IPv4 – WiFi/Ethernet

All of the ZigBee hubs connect back to hass.io (or Home Assistant) over a standard IP network using WiFi or Ethernet. There are also various other devices like my air purifier, robot vacuum cleaner, smart thermostat and a couple of WiFi based relays (sonoff). I try to avoid adding WiFi based IoT devices as a lot of them have serious security vulnerabilities. ZigBee is generally much more secure, as a breach of a ZigBee device usually can’t give access to the entire IP network.

Raspberry Pi 4

Home Assistant (or hass.io) runs on my Raspberry Pi 4 and exposes all of the devices in my smart home system to HomeBridge. This makes everything available to the Apple ecosystem via HomeKit, which allows me to use Siri or the Home app on my watch, iPhone, MacBook, iPad or HomePod to control everything that’s part of my smart home system. This is really convenient, and Siri is now the primary way I interact with my smart home system and control lights and other devices.

The Raspberry Pi 4 also hosts several services such as a Plex media server and a download server. It’s mapped to our NAS, which has 8TB of network-accessible storage for media, backups and other files.

Conclusion

My smart home system is a lot of fun to build and maintain but it’s not for everyone. Hopefully this post has given you some ideas on how to get started or improve your own smart home.


Untangling the mess that is USB Type C and video outputs

USB Type C is meant to be the answer to all of our problems and be this magic, universal port, right? Well, in terms of charging things it’s pretty good. We’ve got the USB Type C PD (power delivery) spec, which means my Apple charger will work on my MacBook Pro, my Samsung S9+, Samsung Gear IconX, Nintendo Switch, and most things with a Type C port. In general I’ve had a good experience with USB C being a truly universal solution for charging devices. However, getting a video signal out of a USB Type C port is another story.

I recently purchased a 2018 MacBook Pro (MBP) 15″ and I’ve been trying to work out how to set up my desk. I started investigating different docking stations, USB Type C adapters and cables, etc. I quickly learned that the world of USB Type C/Thunderbolt 3 docks and video adapters is complex and full of confusion. What’s the difference between Thunderbolt 3, USB 3/3.1 and “Thunderbolt 3 compatible” devices? Why do some only support mirroring on macOS but extended displays on Windows? And what is USB Type C alternate mode (“alt mode”)?

I found myself asking so many questions. As a result I quickly fell into a rabbit hole of trying to understand all the different options that are available on the market. I’m going to attempt to summarise everything I’ve learnt, so that you don’t have to go through the same pain.

Thunderbolt 3 vs USB 3/3.1 vs Thunderbolt 3 “Compatible”


I quickly discovered that there are three main types of docks: proper Thunderbolt 3 ones, USB 3/3.1 ones and Thunderbolt 3 “compatible” adapters/docking stations. The Thunderbolt 3 options seemed far more expensive than their USB 3.1 and “compatible” alternatives. So what gives? The main difference is the way they communicate with your device, whether that’s a laptop like my MBP or a phone like my S9+.

Thunderbolt 3

Thunderbolt 3 is a standard that’s been developed by Intel to allow you to connect high bandwidth peripherals such as displays and storage devices. However, because of the high amounts of available bandwidth, it’s also used in many docks or “port replicators”. In fact, with the 40Gbps of bandwidth it has, you can drive two 4k displays at 60hz and still have room leftover for other peripherals.

USB 3/3.1

USB 3/3.1, on the other hand, is just the latest revision of the USB (Universal Serial Bus) protocol that has been around for a long time. Thunderbolt 3 “compatible” devices seem to be just a marketing ploy to make people think they support Thunderbolt 3. Really, they just use the normal USB protocol that Thunderbolt 3 automatically falls back to. USB 3.1 only has 10Gbps of bandwidth compared to Thunderbolt 3’s 40Gbps, which means it doesn’t even have enough for a single uncompressed 4k 60hz display signal (3840 × 2160 pixels at 60hz and 24 bits per pixel works out to roughly 12Gbps before blanking and overhead). However, USB 3.1 over Type C has a nice trick up its sleeve which I’ll explain later.

Thunderbolt 3 ports are often accompanied by a small lightning icon to signify this. However, my MBP and some other devices don’t always do this. Thunderbolt 3 ports will normally fall back to USB 3/3.1 if that’s the only protocol the device (such as a dock or adapter) supports.

USB Type C Display Output Methods

There are many different ways that USB Type C devices (laptops and docks etc.) output and interpret display signals. I’ll explain the common ones below.

USB 3/3.1 Over Type C With DisplayLink Chip

USB 3/3.1 over Type C docks normally rely on a chip manufactured by a company called DisplayLink (or something similar). These chips use software to encode, compress and send a display signal over the lower bandwidth USB 3/3.1 protocol. However, because these chips are software driven, they don’t perform well in demanding applications such as gaming or video editing. They might even struggle with playing some videos. Anything besides general office use is asking for trouble.

DisplayPort/HDMI Over Type C With Alternate Mode

Most cheap USB Type C dongles/adapters rely on a neat trick called USB C alternate mode. Basically, a dongle/adapter/dock can ask a compatible device like a laptop or smartphone to output a non-USB signal at the same time over some unused wires. Some examples of these non-USB signals include HDMI and DisplayPort. Yep, the standard protocol that an HDMI or DisplayPort cable carries can also be carried by the humble USB Type C port.

The way this works is that the dongle/dock asks the output device if it’s able to support HDMI/DisplayPort etc. via alternate mode. If it can, the device starts to output a native HDMI/DisplayPort signal straight from the GPU – no software to get in the way like a DisplayLink chip. These cheap adapters are completely passive, basically just joining the correct wires from the Type C connector to the right places on the HDMI/DisplayPort connector. They don’t manipulate or process the signal.

DisplayPort MST

Part of the DisplayPort standard includes MST – Multi Stream Transport. This handy feature allows you to daisy chain displays, use multiple outputs to drive a high res/refresh rate display, or carry multiple signals to different monitors as a “splitter” from a hub. A lot of docking stations and adapters that support more than one display out rely on MST, which is fine for the most part. However, Apple does not properly support MST in macOS. The only part of MST that’s supported is driving one larger screen from two DisplayPort outputs.

Unfortunately this means a lot of docking stations that work flawlessly in Windows or Linux show a “mirrored” image on both outputs instead of separate images for each. There’s nothing that can be done as a workaround as the problem is macOS fundamentally not supporting it. What this practically means is that some docking stations with multiple display outputs will only show up as a single one in macOS and output the same image on each one.

A Mixture of the Above

You’d think that adapters and dongles would probably pick one of the above methods and stick with it. However, from what I’ve seen most docks that advertise 2 or more outputs rely on some crazy combination of the methods above. Some will have one DisplayPort driven via USB Type C alt mode, and another two with a DisplayLink chip, or two with DisplayPort and MST via USB C Alt mode. This crazy mishmash of implementations and lack of information on product data sheets means it’s difficult for even a tech savvy consumer to work out if something is compatible with their device.

For example, I found this great looking Dell dock for the reasonable price of $200. I was about to buy it when I saw a review saying it only supports one display output on macOS. After looking into this I figured out it was due to the lack of MST support in macOS. I then found a more expensive one for $300 from Lenovo, and thought sweet, this is it. Apparently it uses DisplayPort via alt mode for one connector and a DisplayLink chip over USB for the other two. This means you get one output with “good” performance and the other two are severely restricted in comparison with CPU rendering.


Passive Dongles/Adapters and Cables

I didn’t spend too much time researching this, but there are still a few problems here. Whilst not ideal, someone should be able to plug a USB C to HDMI adapter into a HDMI to DisplayPort adapter then use it with their screen, right? Well, not quite. Because USB C video outputs are so varied and inconsistent, it’s unlikely you’ll be able to find the right combination of adapters that will work. It ends up just being easier to buy a new USB Type C adapter for every single type of output you need rather than chaining old ones onto a single Type C to HDMI adapter.


You’d also think that all USB Type C cables are the same right? Well, only certain cables support Thunderbolt 3, and only some cables are rated for higher amounts of power. How do you know? It’s impossible to tell. USB Type C enabled devices are developing into an ecosystem where you have to plug something in and cross your fingers that it all works. This isn’t the way it was meant to be.

Conclusion

Most manufacturers don’t tell you what ungodly mess they’ve got going on inside their products. Because of this complete mishmash, some display outputs will be severely limited in their performance, while the one next to it might be fine. Some docks and adapters may work fine with Windows machines but not with macOS. On top of that, sometimes you can’t tell if a USB Type C port, cable or device is USB 3/3.1, Thunderbolt 3, or DisplayPort/HDMI over alt mode compatible. It used to be that if a cable fit, the device and cable were compatible, but that’s no longer the case.

Consumers shouldn’t need to spend hours researching how an adapter or dock is implemented to work out if it’s going to be compatible with their use case and performance needs. This inconsistency and lack of information from manufacturers is a massive problem and is dragging down an otherwise great standard that should be universal and consistent.

P.S. If I’ve left anything out or made any mistakes please let me know in the comments. My head is still spinning from the huge amount of information I’ve processed over the last day while trying to write this.


MQTT Status Codes (Connack Return Codes)

If you’ve ever played with MQTT, then you’ve probably had issues connecting to your broker. Whether it’s one you’ve set up yourself or a third-party provider like AWS, they should all follow the MQTT protocol. This is mainly for my own reference because I can never find it, but below is a list of the standard CONNACK codes that can be returned when you try to connect.

Note these have been directly copied from the official specification. You can see the original by clicking here.

Table 3.1 – Connect Return code values

Value | Return Code | Response | Description
0 | 0x00 | Connection Accepted | Connection accepted
1 | 0x01 | Connection Refused, unacceptable protocol version | The Server does not support the level of the MQTT protocol requested by the Client
2 | 0x02 | Connection Refused, identifier rejected | The Client identifier is correct UTF-8 but not allowed by the Server
3 | 0x03 | Connection Refused, Server unavailable | The Network Connection has been made but the MQTT service is unavailable
4 | 0x04 | Connection Refused, bad user name or password | The data in the user name or password is malformed
5 | 0x05 | Connection Refused, not authorized | The Client is not authorized to connect
6-255 | – | Reserved for future use | –
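If you’re debugging a connection from code, the easiest way to see these return codes is in paho-mqtt’s on_connect callback. A minimal sketch, assuming the paho-mqtt 1.x callback signature and a placeholder broker address:

import paho.mqtt.client as mqtt

# Human readable versions of the CONNACK return codes from the table above.
CONNACK = {
    0: "Connection accepted",
    1: "Connection refused, unacceptable protocol version",
    2: "Connection refused, identifier rejected",
    3: "Connection refused, server unavailable",
    4: "Connection refused, bad user name or password",
    5: "Connection refused, not authorized",
}

def on_connect(client, userdata, flags, rc):
    # rc is the CONNACK return code sent back by the broker
    print(f"rc={rc}: {CONNACK.get(rc, 'Reserved for future use')}")

client = mqtt.Client()
client.on_connect = on_connect
client.connect("broker.example.com", 1883, 60)  # placeholder broker address
client.loop_forever()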

Fix for slow tab auto completion on ubuntu (bash)

For days I’ve struggled with this new Linux install on a virtual machine on my local network. The SSH connection has been super unreliable, and every time I pressed tab for an auto completion the whole thing seemed to lock up for ~30 seconds. Turns out the autocompletion problem was the simplest fix ever! After scouring the internet for ages I found this command.

sudo updatedb

It’s simple – all it does is update the auto completion database (according to the forum I found it on). What was probably happening is the database got really big and was taking ages to scan through. It beats me why a fresh Ubuntu install had this problem, but at least it’s solved, for now.


3DR Solo DSLR & 1/4″ Tripod Mount Adaptor

If you’ve seen my 3DR Solo xtra large leg extenders post, you might be wondering what I used to attach my Sony a5100 to my Solo. Well, I used the “pretty” face plate thing that comes with the Solo (for use without a gimbal). It has a hard mounted GoPro adaptor on it, and for now this will suffice. The printed part is basically a little right angle GoPro to 1/4″ tripod mount adaptor with an offset 1/4″ mount to roughly centre the a5100 and keep everything as small as possible.

You can download my STL file for the print via the link at the end of this post. You can see this mount in action in the photo below. I highly recommend taking it slow and easy – the “rubber stoppers” that help combat jello/vibrations are designed to take a ~85g GoPro, not a ~400g compact DSLR. I also recommend tethering the Sony to the Solo just in case the mount fails.

I recommend using leg extenders like these for some extra clearance!

It’s important to print the mount so that when you look down on the print bed from above you see an L shape. This ensures the layers aren’t parallel to the camera body – printed the other way, it’s extremely weak.

Download the 3DR Solo 1/4″ Tripod Mount Adaptor STL file


sonoff WiFi relays

The sonoff WiFi relays have arrived. I ended up buying ten of them and 3 motion sensors. My first impression is that they’re tiny and solid. They’re much smaller than I thought, which is a good thing! The case they come in is perfect for mounting inline with something and neatly hides the exposed wires. For comparison, you can see my old LG G4 phone next to it.

On the inside

On the inside, they look pretty good. The soldering is done well and the gaps between the mains traces are reassuring. As you can also see from the picture below, there are a few header pins. These are the programming pins. Itead has been nice enough to break out the programming pins into headers to make it easier to reprogram the device with your own code.

Reliability

I’ve currently had one set up on my desk lamp for the last couple of days. It has been rock solid and hasn’t experienced any drop outs or glitches. This was running their stock firmware, which allowed me to connect it to their app. Although I have no intention of continuing to use their app, it is miles ahead of the Belkin system. For example, switching it on or off via the internet happens almost instantaneously. However, the Belkin system sometimes takes 10 seconds!



How to make an autonomous car (code included)



I’ve just finished a recent side project with my friend Kendrick (his GitHub). We built an autonomous car that you can teach how to drive, and then it drives around by itself. I did all of the hardware/Arduino software and Kendrick did all of the machine learning software. He called his project Suiron and it’s available on GitHub here. The code running on the Arduino is called car-controller and is available on my GitHub here.

Now that you’ve got the links, feel free to go and have a look. Work through the code and try to figure out how it works. I’ll try to briefly cover the software here but my main focus will be on the hardware. One thing to note is the open source licenses: all my stuff is GPL and Kendrick’s Suiron project is MIT.

This post is more intended as an overview of how the whole thing works.  If I get time I might turn it into a tutorial on how to get it working yourself.

Before we begin here is a short video of it in action.

Now onto the fun stuff!  How does it work?

The Hardware

These are the main components used.

1) Remote Control Car – we used this car (link), but anything of a similar size will work as long as it has a standard ESC and steering servo. It comes with a remote control, battery and charger to start with. I recommend buying a new remote control system (see item 6 below).


2) Intel NUC – the Raspberry Pi doesn’t really have enough power and is ARM based. An x86 based processor like the i5 in our NUC is much easier to use for machine learning purposes. The exact one you use doesn’t matter.


3) Battery for NUC – a standard laptop battery bank was used to power it. This one from Kogan gives about 6-10 hours of runtime. (link)


4) Logitech C920 Webcam – buy from any reputable retailer. You could probably use a cheaper model but we had better luck with the C920. (link)


5) Lens filters – if you are operating in any sunlight, you will want a polarising and an ND (Neutral Density) filter. The camera just can’t cope with the harsh sunlight and shadows, so these filters help bring the conditions back to something much more manageable. A variable ND is great as it lets you adjust the “darkness” level.


6) Radio control system – if you intend on doing lots of this stuff then get an FrSky Taranis. You won’t be disappointed. Otherwise, a Turnigy 9XR will work just as well. Make sure you get a receiver too if it isn’t included.


7) You’ll also need an Arduino. I like the Arduino Nano because they’re super cheap and have onboard USB.

If you want some specific help choosing components, just leave a comment.

I won’t go into details on how to wire everything as this isn’t a tutorial. However, if you need some help, drop a comment below. I suggest you learn how an ESC (electronic speed controller) works together with a motor, receiver, servo and battery – this is the standard setup on normal remote control cars. Once you understand that, look at Arduinos and how to use them to blink lights and read inputs. Read through the Arduino code and the wiring should be pretty self explanatory.

How it all fits together

It’s up to you how you put everything together. I recommend trying to keep everything as low as possible for better stability when driving. The webcam needs to be mounted up high so it has a better chance of seeing the lane that it’s in. I just used a square bit of balsa wood as it’s really light and strong, then glued the webcam to it. Instead of explaining exactly how I mounted everything I’ll dump a few pictures here. All the white things are 3D printed, but you could easily do it without a 3D printer.


Polarising/ND Filter

The importance of a polarising filter cannot be overstated. It reduces reflections and the harsh glare sometimes encountered. In the image below (credit) you can see how much of a difference a polarising filter can make. Now water is a bit of an extreme example, but I chose that picture so it’s easier to demonstrate the difference. In reality, where we’re operating, the difference won’t be so obvious.


The neutral density filter is equally as important as the polarising filter, if not more so. The ND filter is basically like sunglasses for the webcam. The webcam doesn’t like really harsh light, so the filter reduces the intensity without interfering with the image too much. The picture below (credit: Wikipedia) shows how much better the right ND filter can make an image in harsh light.

I suggest making the lens filters removable as they will make the image too dark in lower lighting situations. For example, it was perfect at midday but much too dark a few hours later just before dusk. I made a simple mount that just uses an alligator clip to hold the filters in place. The filters are both glued together then onto a small 3D printed right angle mount.


The Arduino

The diagram below shows how everything is hooked up. Basically, the Arduino is the “brains of the hardware”. It reads in the values from the R/C receiver (bottom left) and then decides what to do based on the mode channel. Dig through the Arduino code (link) and see exactly how. There are 3 modes: manual, autonomous and emergency stop.

In manual mode the Arduino reads in the steering and motor values and passes them straight to the motor and steering servo. In this mode, with the right flag enabled, it also sends those values back over UART every time it receives a character (replying only when it receives a character stops the serial buffer from filling up and “lagging”). In autonomous mode the Arduino reads inputs over UART from the NUC. In this mode it receives two messages: steer,x and motor,x, where x is the value you want to set. It then writes those outputs to the steering servo or motor. Finally, emergency stop kills the motor output and straightens the steering servo. This emergency stop overrides any sort of manual or autonomous control.
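To make the protocol a bit more concrete, here’s a rough sketch of what the NUC side could look like using pyserial. The port, baud rate, newline framing and poll character are assumptions on my part – check car-controller for the real details:

import serial

# Port and baud rate are assumptions -- match them to car-controller's
# Serial.begin() and your machine's device name.
ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)

def drive(steer, motor):
    # Autonomous mode: the Arduino expects "steer,x" and "motor,x" messages.
    ser.write(f"steer,{steer}\n".encode())
    ser.write(f"motor,{motor}\n".encode())

def read_manual_values():
    # Manual mode: the Arduino reports the current steering/motor values
    # every time it receives a character.
    ser.write(b"?")
    return ser.readline().decode().strip()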


The Machine Learning Part

This isn’t my expertise so I’ll briefly summarise what it’s doing (not really how it’s doing it – no one really knows). We used a library called TensorFlow. It’s an open source machine learning library published by Google, released under an Apache license. It has a nice Python API and a “no nonsense” C++ API.

Collecting data

This is a really short summary of the whole process. Each time a video frame is recorded, Suiron (the software on the NUC) asks car-controller (the software on the Arduino) what the human operator is doing. Remember, in manual mode the human operator is driving the car around. Car-controller responds by sending the current steering and motor values back to Suiron. Suiron takes these values and saves them along with a processed version of the frame.

This process happens at about 30Hz (or 30 times per second) for as long as you record data. In the final model, we used about 20 minutes worth of training data. That is 20 minutes of continuously driving around the track. It may not seem like a lot but it gets repetitive very quickly. 😉 In reality, 20 minutes is nowhere near enough data. It works great on this particular track with similar lighting conditions but would likely fail if the conditions changed too much.

Training data

Again, I’m not an expert at this but I’ll try to briefly explain how the training works. Convolutional Neural Networks (CNNs) are weird in the way they work – it’s impossible to know exactly how or why a CNN works. Basically, we’re giving TensorFlow the frame and two numbers (steering and motor). Then we’re asking it to work out how the frame relates to those two numbers. After giving it hundreds of thousands of examples (frames) it can try to generalise a model.

Because of the amount of computing power required, it takes a very long time to train a good model. Due to the type of calculations it has to do, TensorFlow runs much faster on a dedicated GPU. With only 20 minutes of data, our model took half a day to train properly. The training took place on a desktop with a borrowed GTX 980, a GPU that’s towards the higher end of consumer graphics cards.

Using the model

You can see it in action in the gif below. The blue line is what the model thinks it should do; the green line is what I actually did when I was steering it. Note that this data was not included in the training set – this is to ensure the model works with data it hasn’t seen before.


Once it has been trained we can then use the model. Basically, what happens is we grab a frame from the webcam, pass it to TensorFlow and ask it to run it through the model. The model then spits out what it thinks our two values should be, one for steering and one for throttle. At the moment the throttle is unused and the car runs at a constant speed. However, we thought we’d include it just in case we wanted to use it in the future.

Update: Clive from hobbyhelp.com reached out to me after seeing this. He’s got a pretty cool “Ultimate beginners guide to RC cars” article on his website here. I recommend checking it out if you want to get started doing something similar to this project.