Many automakers build driver assistance features into their vehicles. A few examples are Volvo’s Pilot Assist, VW’s Travel Assist, and BMW’s driver assistance suite.
Tesla’s offering is called Autopilot and is widely regarded as one of the most capable systems in any mass-production vehicle.
After being announced in 2013, it was integrated into the Model S fleet in 2014. Shortly after release, Musk said the following:
“Autopilot is a good thing to have in planes, and we should have it in cars.”
Initially, the feature was developed with the Israeli company Mobileye, and it offered basic lane-keeping assistance on freeways and in stop-and-go traffic. This early version of the software was refined over the following years, with notable releases including Autopilot 8.0 in 2016, which processed radar signals to aid visibility in rain, fog, and other low-visibility conditions.
Currently, all vehicles built from 2014 onwards are capable of some form of Autopilot. Tesla also said that all vehicles produced after October 2016 would have the hardware necessary for future full self-driving. This is called Hardware 2, or HW2 for short.
Going further, since March 2019, Tesla has been fitting what it calls the world’s first full self-driving computer to its vehicles, which, according to Tesla, will facilitate full self-driving in the future. There’ll be more information on the hardware that Autopilot uses later in this article.
Finally, in 2019, Navigate on Autopilot was released. Additional functionality to the semi-autonomous driving software was unlocked, including the ability to perform lane changes on the freeway automatically, as well as handling on and off-ramps.
The latest edition of Autopilot with full self-driving has added the ability to stop at stop signs. However, at the time of writing, this feature is still in beta.
What Can Autopilot Currently Do?
At present, every new Tesla purchased, including those on pre-order like the Cybertruck, comes with a base version of Autopilot as standard.
However, Tesla ships two versions of Autopilot: base Autopilot and Full Self-Driving Capability. Note that the latter is just the capability of full self-driving: the car cannot actually drive itself at all times yet, so it’s more of a future investment. Nonetheless, the full self-driving option does come with a plethora of its own features that make it worth purchasing for some people.
So, what are the features of each version of Autopilot? According to Tesla, the base version of Autopilot, which is installed as standard on all new cars, has the following features:
- Automatic steering and braking within its lane
- Automatic emergency braking
- Side collision warning
- Front collision warning
- Traffic-aware cruise control (speed limit signs don’t appear to be detected; GPS is used to find the speed limit)
This free version of Autopilot focuses on making the car safer and more enjoyable to drive over long journeys, rather than allowing it to drive fully autonomously.
The more advanced Full Self-Driving Capability includes the following features in addition to the standard ones listed above. At the time of writing, it costs £5,800 in the UK, but prices are rising all the time as more features become available and the system matures.
- Automatic lane changes.
- Navigate on Autopilot (the vehicle can fully navigate to a destination provided the journey is on the freeway; it picks the correct lanes and handles on- and off-ramps automatically.)
- Summon (and smart summon) allows the owner to summon their car to their phone’s location automatically from another location nearby. For example, a Tesla can come out of a parking space to a hotel door.
- Traffic light and stop sign control (ability to stop automatically at stop signs and traffic lights. Currently in beta.)
- Automatic city driving (not yet determined what this will do).
How Does Autopilot Work?
I’m going to split this section into two parts: hardware and software. Since both are developed entirely in-house by Tesla, they complement each other very well.
Dotted around the car is a mixture of cameras, radar, and ultrasonic sensors.
In total, there are eight cameras. Together, they provide a 360° view around the vehicle, and three of the eight face forwards.
The narrow forward camera has a range of around 250m and is most useful for high-speed operation. The main forward camera has a range of 150m and handles general-purpose tasks. Finally, the wide forward camera uses a 120-degree fisheye lens to capture traffic lights, obstacles cutting into the path of travel, and objects at close range. Tesla says this is particularly useful in urban, low-speed maneuvering.
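To pull those figures together, here is a small Python sketch of the forward camera layout. The specs come from the article (the wide camera’s range isn’t given, so it is left as `None`); the speed thresholds in `primary_camera()` are purely my own illustration of how feeds *might* be prioritised — Tesla has not published any such logic.

```python
# Forward camera specs as described above; selection logic is hypothetical.
FORWARD_CAMERAS = {
    "narrow": {"range_m": 250, "role": "high-speed operation"},
    "main":   {"range_m": 150, "role": "general-purpose tasks"},
    "wide":   {"range_m": None, "fov_deg": 120, "role": "close-range, urban"},
}

def primary_camera(speed_kph):
    """Pick which forward feed matters most at a given speed (illustrative)."""
    if speed_kph >= 90:
        return "narrow"   # long range needed at freeway speeds
    if speed_kph >= 40:
        return "main"     # general driving
    return "wide"         # fisheye for low-speed urban maneuvering
```

The thresholds (90 and 40 km/h) are arbitrary; in practice all feeds would be processed simultaneously.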
Next are the side cameras. There are two sets of these: one around the A-pillar area and one on the B-pillar.
The A-pillar cameras look for cars that may unexpectedly enter your lane and aid the system at intersections. Meanwhile, the B-pillar cameras monitor blind spots to ensure that it is safe to change lanes. The driver can also use these cameras to check their blind spot themselves.
Finally, there’s the rearview camera. Its main purpose is to aid the Autopark functionality where the vehicle can find a parking space and park itself. As is typical of any reversing camera, it can be monitored by the driver when they are parking themselves.
Radar and Ultrasonic Sensors
All Teslas manufactured in the past few years are equipped with a radar behind a panel on the front of the car. The radar was added to address a flaw in a purely camera-based vision system. Radar is somewhat similar to LiDAR; however, it sends out radio waves rather than light and listens for their echoes.
Cameras, which use the visible light spectrum (what we can see), are very similar to the human eye. You may be thinking that’s a good thing: humans are capable of driving cars, so why wouldn’t that be sufficient for a robot?
Human eyes aren’t perfect. For example, it is very difficult for a human to drive safely in thick fog and poor weather conditions. Hence, it is very difficult for a computer to drive safely in poor weather conditions when taking data solely from cameras.
However, radar is able to penetrate thick fog and rain, meaning it is almost completely unfazed by these conditions. This makes the car dramatically safer when, for example, it is driving in thick fog and a car is stopping ahead: the cameras can’t see it, but the radar can. Another benefit of radar is that its signals can pass under cars, meaning it can detect things that the cameras cannot see. With a range of 160m, the radar can also see further than the vast majority of cameras on the car.
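The intuition above can be sketched in a few lines of Python: discount the camera when visibility is poor, and keep the radar’s weight constant. This is only a toy weighted average to illustrate *why* radar helps — it is in no way Tesla’s actual sensor fusion algorithm.

```python
# Toy sensor fusion: combine camera and radar confidence for one
# detection, discounting the camera as visibility drops. Illustrative
# only -- not Tesla's fusion logic.

def fused_confidence(camera_conf, radar_conf, visibility):
    """visibility in [0, 1]: 1.0 = clear day, 0.0 = dense fog.
    Radar is largely unaffected by fog, so its weight stays constant."""
    camera_weight = visibility   # cameras degrade with visibility
    radar_weight = 1.0           # radar penetrates fog and rain
    total = camera_weight + radar_weight
    return (camera_conf * camera_weight + radar_conf * radar_weight) / total

# Clear weather: both sensors contribute equally -> 0.85
clear = fused_confidence(0.9, 0.8, visibility=1.0)
# Dense fog: the camera sees nothing, but the fused estimate
# still leans on the radar -> 0.8
foggy = fused_confidence(0.0, 0.8, visibility=0.0)
```

In the fog case a camera-only system would report zero confidence in the stopped car ahead; the radar keeps the detection alive.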
In addition to radar, Tesla’s cars are fitted with twelve ultrasonic sensors, which use sound at frequencies above human hearing. These measure distance very accurately using ultrasonic waves.
Their accuracy makes them extremely good at measuring small distances between vehicles in adjacent lanes. Furthermore, they are useful for parking for these same reasons.
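The distance measurement itself is simple time-of-flight arithmetic: the sensor emits a pulse, times the echo, and halves the round trip. A minimal sketch, assuming the textbook speed of sound in air at room temperature:

```python
# Ultrasonic ranging: distance = (speed of sound x round-trip time) / 2
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

def distance_m(echo_round_trip_s):
    """Distance to an obstacle from an ultrasonic echo time.
    The pulse travels out and back, so halve the round trip."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2

# An echo returning after 10 ms corresponds to about 1.7 m
```

At parking distances of a metre or two, echo times are a few milliseconds, which is why these sensors can update so quickly.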
Vision vs. LiDAR
I plan to write a full blog post purely on this subject, so look out for that in the future. For now, I’ll give a brief rundown of these two different approaches.
Tesla’s focus is primarily on vision, using data from the cameras dotted around the car to drive it.
By contrast, dedicated self-driving companies like Waymo are taking the approach of LiDAR and other more expensive instruments. LiDAR sensors use light to scan the environment and produce a detailed map of it in real time.
Both of these approaches could work in one way or another. Furthermore, the approach each company has chosen is strongly tied to the type of company it is. Although both work on self-driving, Tesla is an automaker, meaning it has millions of vehicles on the road. Conversely, Waymo is purely a self-driving company, meaning it doesn’t have as many cars on the road.
Vision systems require enormous amounts of training. Despite Tesla having over one million Autopilot-equipped vehicles on the road, their neural network still requires much more training.
Waymo’s approach still uses a neural network but requires less training. The company’s aim is to provide a network of robotaxis in cities. Currently, the taxis only operate in Phoenix, Arizona. Waymo has carefully mapped out this area so that its vehicles are capable of driving on those roads.
As you can probably imagine, this approach is not particularly scalable. A Waymo taxi would be unable to start driving itself if placed on an arbitrary road anywhere in the world, or anywhere else in the United States for that matter. Meanwhile, Tesla’s aim is for a vehicle to be able to start driving autonomously on almost any road in the world, even if the system has never seen it before. Essentially, just like a human.
Self Driving Computer
The self-driving computer is a core component of Tesla’s self-driving system. Essentially, it takes all the inputs from the cameras and sensors, processes them, and produces outputs that tell the vehicle what to do.
The computer, which was custom-designed in-house by Tesla, was announced at the Autonomy Investor Day last year. Tesla claims it has around 40 times the computing power of the previous-generation computer, which was powered by NVIDIA.
A core principle Tesla focused on when building this computer is redundancy. After all, you wouldn’t want the computer driving the car to suddenly fail, especially in a future with no manual controls like steering wheels.
Consequently, there is redundancy everywhere, including two chips that run simultaneously. Hence, Tesla claims that the probability of the computer failing is considerably less than the probability of a driver falling unconscious.
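The dual-chip idea can be sketched as follows: run the same computation on both independent units and only act when they agree. Tesla has not published its arbitration logic, so the fallback behaviour and function names here are entirely hypothetical.

```python
# Hypothetical sketch of dual-chip redundancy: both chips compute a
# driving plan from the same inputs; a mismatch triggers a safe
# fallback rather than trusting either output blindly.

def redundant_plan(chip_a, chip_b, inputs):
    plan_a = chip_a(inputs)
    plan_b = chip_b(inputs)
    if plan_a == plan_b:
        return plan_a
    # Disagreement suggests a fault on one chip.
    return "hold_lane_and_alert_driver"

healthy = lambda inputs: ("steer", inputs["curvature"])
faulty = lambda inputs: ("steer", 999.0)   # simulated failed chip

# Agreement -> act on the plan; disagreement -> safe fallback
```

The point of the design is that a single-chip fault degrades gracefully instead of silently producing a bad command.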
Finally, each chip is equipped with 3 billion transistors, giving a total of 6 billion across the two chips.
Software
Software is arguably the most important part of Autopilot. The hardware aspect of self-driving has already been solved: there are several vehicle models already equipped with the hardware necessary to facilitate full self-driving in the future.
Tesla’s software has many elements, but in this article I’m going to be talking about the artificial neural network inside the car.
Put simply, an artificial neural network is a computing system loosely modeled on the networks of neurons in the brain. Much like the human brain, a neural network is capable of learning from inputs. This is the core concept behind Tesla’s self-driving approach.
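To make “learning from inputs” concrete, here is a toy single neuron in Python that learns the logical OR function from examples using the classic perceptron rule. It is orders of magnitude simpler than anything in Autopilot, but the principle is the same: errors nudge the weights until the outputs match the examples.

```python
# A minimal artificial neuron that learns logical OR from examples.
# Purely illustrative -- not Tesla's network.

def train_neuron(samples, epochs=20, lr=0.1):
    """Perceptron rule: nudge weights toward the correct output."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

or_samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(or_samples)
# After training, predict(w, b, ...) reproduces OR for all four inputs
```

Tesla’s networks have millions of weights and learn from camera frames rather than four hand-written examples, but the feedback loop of predict, compare, and adjust is the same idea.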
To demonstrate the capability of a neural network, I’d like to introduce you to Talk to Transformer, which is based on OpenAI’s GPT-2 algorithm for text prediction and generation. Here it is: https://talktotransformer.com/
Given the input of ‘What is your name?’, the transformer produced the following:
You know me as a programmer. I have a few years of professional programming experience and I want to start teaching and working with the most recent technology that is available in this area. I would like to help others who may be interested in learning about machine learning and artificial intelligence. My primary field is Artificial Intelligence and Machine Learning but I enjoy working with machine learning algorithms as well.
As you can tell, it’s quite effective. Yes, it’s a little repetitive at times, but it really isn’t too bad considering it’s a robot.
This particular neural network was trained on 40 GB of data collected from the web, including the top 8 million upvoted links from Reddit.
Put simply, this AI used data from the web to teach itself how to write.
Tesla’s neural network works in a similar way. All the cameras, radar, and other sensors around a Tesla automatically monitor Autopilot’s performance, collecting sparse but valuable data.
The data collected can be used to help Tesla train the neural network. Furthermore, the type of data being collected can be changed depending on the problem Tesla is trying to solve. For example, if the team is working on improving stopping at traffic lights and stop signs, they can make all the Teslas on the road using Autopilot send back data about what the vehicles are doing in those situations.
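That campaign-style collection can be sketched as a simple filter: the car only uploads a compact snapshot when an event matches what engineers currently need. The campaign names and snapshot fields below are invented for illustration; Tesla has not published the actual trigger format.

```python
# Hypothetical sketch of campaign-driven fleet data collection.
# Campaign names and event fields are illustrative, not Tesla's.

ACTIVE_CAMPAIGNS = {"stop_sign_approach", "traffic_light_stop"}

def maybe_upload(event):
    """Return a compact snapshot if the event matches an active campaign."""
    if event["type"] not in ACTIVE_CAMPAIGNS:
        return None   # the vast majority of driving data stays in the car
    return {
        "type": event["type"],
        "speed_kph": event["speed_kph"],
        "autopilot_engaged": event["autopilot_engaged"],
    }
```

The key property is that only a tiny, targeted slice of the fleet’s driving ever leaves the car, which matches the article’s point about sparse but valuable data.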
Below is a video that Tesla featured on its Autopilot AI webpage. It shows the kind of analysis being undertaken by the system whilst driving. The majority of this data is not sent back to Tesla, only small components of it.
There are more than one million Autopilot-equipped Teslas on the road collecting data that is essentially free to Tesla. Compare the business models: Waymo pays to build and equip its cars, then pays drivers to sit in them whilst they drive around and collect data. Meanwhile, customers pay Tesla for their cars, then proceed to collect free data for Tesla whilst using a feature that is convenient for them.
You can probably see why Tesla’s solution is more scalable.
Tesla claims that “A full build of Autopilot neural networks involves 48 networks that take 70,000 GPU hours to train. Together, they output 1,000 distinct tensors (predictions) at each timestep.” Those are some impressive stats.
Just this quarter, even with the coronavirus causing factory shutdowns, Tesla managed to deliver over 90,000 Autopilot-equipped vehicles. Those will go out onto the roads and help train the neural network.
What May Future Versions of Autopilot Include?
What can you, as a Tesla owner, expect from your car in the coming months and years?
As I’m sure you’re aware, Tesla frequently releases software updates that improve the car each time. Their pace of innovation is only going to increase in the future.
In summary, the features that must come in the future to facilitate full self-driving, or that are likely to come, are the following:
- Turning and Roundabouts
- Navigation on Small Roads
- Smarter and Reverse Summon
- Navigate on Autopilot for all roads
- Worldwide Speed Limit Recognition
First off, there’s something quite important going on with Autopilot at the moment: it’s undergoing a fundamental rewrite. Musk says this rewrite will be pushed to vehicles within the next few months, and that it will then be possible to enable several big new features. However, he also says that Tesla will only release the features that are deemed safe to release.
Turning and Roundabouts
One of the most anticipated releases is the ability for the car to turn itself and drive around roundabouts without driver intervention.
Don’t get me wrong, the car is currently capable of keeping to its lane. Hence, if that lane curves round a bend, the car will follow it. In that sense, it will turn. However, at a junction, the car will not pull out and turn by itself.
Furthermore, turning would most likely allow the car to navigate roundabouts unaided. There have been some instances of a Tesla handling a roundabout by itself, but these are most likely flukes, as they are in no way repeatable.
Navigation on Small Roads
Navigation on small roads is a fundamental problem that Tesla has to solve before vehicles will be able to do full point to point self-driving, especially in the UK where smaller, unmarked roads are more common.
At present, Autopilot has all kinds of strange behavior on small, unmarked roads in the UK. For example, it will often hug the right-hand side of the road, where the correct side of the road to drive on is the left. In summary, positioning on small roads with no lines needs to be improved.
Smarter and Reverse Summon
Tesla cars already come with Summon as a feature of the full self-driving package; however, Musk has hinted that more advanced versions of this feature are in the works.
First off, the ‘smart summon’ feature certainly has room for improvement, specifically in terms of accuracy and speed. However, a new summon mode could be in the works.
Although it’s unclear what it will be called, it will most likely be something along the lines of ‘reverse summon’, which certainly describes the feature.
Put simply, instead of your car coming from a parking space to your location like in smart summon, the car will be able to go from your location to a parking space of its choice.
If executed well, reverse summon could have all kinds of useful benefits. For example, if you were to go to the supermarket, the car could drop you off at the door, and then go and find itself a parking space.
Worldwide Speed Limit Recognition
Presently, Tesla uses GPS and maps to display the speed limit of the road to the driver, and uses this for Autopilot. The main problem with this approach is that it’s slow to update. For instance, when new roads are added or roadworks are put in place, it may take months for the car to get the correct speed limit. This would be a problem for full self-driving, as the car wouldn’t know the correct speed limit.
One solution to this would be using the car’s cameras to read speed limit signs and use those, in conjunction with maps and GPS, to display a more accurate speed. Many other car manufacturers are already doing this, and it has proven to work quite well.
Furthermore, Tesla could use the map data to train the speed limit recognition. On roads where the map data is known to be correct, the car could detect a speed sign and compare it to the map data. If they match, that’s great; if not, the mismatch could be kept and used as training data.
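That cross-checking loop can be sketched in a few lines: trusted map data grades the camera’s reading, and disagreements become training examples. This is my own illustration of the idea, not anything Tesla has published.

```python
# Hypothetical sketch: use trusted map data as ground truth to grade
# the camera's speed-sign reading. Labels are illustrative.

def label_sign_reading(camera_kph, map_kph, map_trusted):
    if not map_trusted:
        return "ignore"          # no ground truth to compare against
    if camera_kph == map_kph:
        return "confirmed"       # vision agrees with the map
    return "training_example"    # disagreement: keep for retraining
```

Over a fleet of a million cars, even a small disagreement rate would yield a steady stream of labelled examples without any manual annotation.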
Components of speed limit sign recognition are already visible in current Tesla software versions, as pointed out by @greentheonly on Twitter.
Conclusion: A Fully Driverless Future?
Elon continually makes predictions about when Autopilot will be ‘feature complete’. In 2015, he stated that the problem of autonomy was essentially already solved. However, these deadlines come and go.
Given how much progress Tesla has made on this over the past few years, and with an ever-expanding fleet of vehicles, it wouldn’t surprise me if point to point autopilot became available within the next five years. That is, a Tesla could pick you up anywhere, and drive you to another destination of your choice anywhere. And I mean anywhere, not just in cities.
However, many think this will take more than 5 years, and many think it will take less, so it’s hard to know. What we do know is that the future is autonomous, and Tesla seems to be leading the charge.