
Where are they?

26 August 2021

We were promised autonomous vehicles by now

When Australian Automotive first wrote about autonomous vehicles there were five levels of autonomy. Now we’re up to six, but who cares? Where are the cars? Perhaps level seven will be defined by fully autonomous cars that are actually available. Maybe level eight will be fully autonomous, available and registerable. Perhaps level nine will stop killing people when they fall asleep at the wheel. We can’t wait, although we’ve certainly had to. What’s been the hold-up?

Some sources suggested that AVs would have general regulatory approval by now. However, governments haven’t framed general regulations to govern such vehicles in mixed company with human-controlled vehicles. They haven’t had to because, despite optimistic predictions, there is no car ready to safely carry a sleeping driver to his destination. So far, governments have only enacted provisions that relate to the testing of autonomous vehicles.

Law and its amendment are always complex, but as far as autonomous vehicles are concerned it doesn’t necessarily have to be that bad. In the section of the Road Safety Act 1986 shown below, the terms ‘person’ and ‘person driving a motor vehicle’ could be broadened to include the entity that takes responsibility for the vehicle control system. This will be the developer and owner of the algorithms, effectively the car manufacturer.

17A Obligations of road users
(1) A person who drives a motor vehicle on a highway must drive in a safe manner having regard to all the relevant factors    
(2) A road user other than a person driving a motor vehicle must use a highway in a safe manner having regard to all the relevant factors
(2A) For the purposes of subsections (1) and (2) and without limiting their generality, the relevant factors include the following:

    a) the physical characteristics of the road
    b) the prevailing weather conditions
    c) the level of visibility
    d) the condition of any vehicle the person is driving or riding on the highway
    e) the prevailing traffic conditions
    f) the relevant road laws and advisory signs
    g) the physical and mental condition of the driver.

All of the things listed in subsection (2A) are pretty much the factors an autonomous control system has to account for, just as a human driver does. So, getting the Road Safety Act and Road Safety Rules ready for the arrival of autonomous vehicles might not be the ordeal (or barrier) imagined. This is not to trivialise the task, just to suggest that it is doable within the framework of existing legislation.

The Road Safety Amendment (Automated Vehicles) Bill 2017 suggests this approach is reasonable. Clause 14 amends the Crimes Act 1958 in relation to culpable or dangerous driving, so that an ADS (automated driving system) permit holder can be the party prosecuted under that Act. That’s a very serious change to one of our most important Acts. So, legislation doesn’t necessarily have to hinder the adoption of autonomous vehicles. The real difficulty is getting fully autonomous vehicles to comply with legal expectations to a degree commensurate with human performance.

Tesla suggests that its autonomous control system, Autopilot, already exceeds human performance, yet two Model S occupants were killed in a crash in Texas in April this year. Incidents like this tend to overshadow the realities of autonomous technology. The details surrounding this accident are really quite bizarre, and Tesla says Autopilot wasn’t even activated and therefore can’t be blamed. Still, Tesla makes some of the biggest predictions for autonomous vehicles and therefore garners the strongest reactions when things do go wrong. Tesla publishes safety data each quarter and compares it with statistics for conventional driving.

From the first-quarter report for 2021: “In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.”
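
To see how those figures stack up against each other, here’s a minimal sketch in Python. The numbers are simply the ones quoted above; the arithmetic (accidents per million miles, and distance between crashes relative to the NHTSA baseline) is ours.

    # Figures quoted in Tesla's Q1 2021 safety report: miles driven between accidents.
    MILES_PER_ACCIDENT = {
        "Autopilot engaged": 4_190_000,
        "Active safety features only": 2_050_000,
        "No Autopilot, no active safety": 978_000,
        "US average (NHTSA)": 484_000,
    }

    baseline = MILES_PER_ACCIDENT["US average (NHTSA)"]

    for mode, miles in MILES_PER_ACCIDENT.items():
        accidents_per_million_miles = 1_000_000 / miles
        times_average = miles / baseline
        print(f"{mode:31s} {accidents_per_million_miles:.2f} accidents per million miles "
              f"({times_average:.1f}x the average distance between crashes)")

On Tesla’s figures, a car with Autopilot engaged covers roughly 8.7 times the US average distance between crashes.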

Other autonomous vehicle developers also suffer accidents. Everyone remembers the 2018 Uber accident in Arizona in which a woman wheeling her bicycle across the road was struck and killed by a test vehicle that recognised the situation seconds too late for mitigation. This was the first pedestrian death caused by an autonomous car. Still, Uber persisted and at the beginning of 2020 the company painted a bright picture of its plans for the future of ride-hailing services. It planned to dominate the segment. By the end of the same year it announced the sale of its self-driving division to a specialist company in the field, after investing about one billion dollars.
 
Making a fully autonomous vehicle is extremely difficult. The AI algorithms that control autonomous vehicles are extremely good and getting better all the time. Because of its leadership in the field of connected vehicles, Tesla has by far the greatest trove of driving data. Additionally, Tesla’s latest in-car hardware is very impressive. But, as the majors make their presence felt in the connected EV market, they’ll start to catch up. All of the data gathered will push autonomous technology to ever greater reliability. But no matter how great the data set, the real world will always throw up exceptions. Such exceptions are one of the barriers to mainstream adoption of autonomous vehicles.

The 2020 Tesla crash on a freeway in Taiwan is a good example of complex real-world phenomena rendering an autonomous system impotent. Video coverage of the incident is readily available on YouTube and shows the car ploughing into the roof of a truck overturned across two lanes. The driver of the truck is standing on the inside edge of the freeway fast lane, against the concrete median barrier some distance from the truck. As the car approaches it seems to react to the presence of the driver and displays a puff of tyre smoke as the brakes are rapidly applied. Then the car proceeds into the truck at speed.

There’s a slew of clips on YouTube showing TeslaCam videos of both accidents and of Autopilot saves. Many of the accidents were unavoidable and the fault of others, while the saves are often quite impressive. Autopilot works extremely well, but some accidents just can’t be avoided. Unfortunately, even when crashes involving autonomous vehicles are the fault of other vehicles, the statistics will simply record the involvement of an autonomous vehicle. This will colour some people’s perceptions and certainly won’t help with acceptance of the technology.

How data is perceived can also matter. Even if a fully autonomous vehicle fleet is proven to prevent, say, 90 percent of existing road fatalities, it could be difficult for legislators to accept. This is because road deaths in conventional cars are seen as accidents. However, signing off on a system that promises fewer fatalities could also be seen as signing off on a technology that guarantees a certain number of deaths, albeit a lower number. It doesn’t take a radically different way of seeing things to come up with this view.

Pulling real-world data from autonomous vehicles and using it to train and perfect AI-based autonomous control systems is obviously essential, but it’s not enough. Unanticipated situations will always occur. Rather than passively waiting for field data describing unprecedented scenarios to come in, developers use simulations to create situations that aren’t yet available from real-world data. Use of simulation software in this way has been growing in recent years and will be a cornerstone of future autonomous vehicle development, particularly as AI is used to generate the scenarios used in the simulations. Simulation testing may also provide a more level playing field for companies without the advantage of huge real-world data sets.
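
As a purely illustrative sketch (the parameter names, ranges and placeholder simulator below are our own assumptions, not any developer’s actual tooling), this kind of scenario generation amounts to sampling combinations of conditions that real-world logs haven’t yet covered and replaying them against the driving stack:

    import random

    # Hypothetical scenario parameters a simulator might randomise. The names and
    # ranges are illustrative assumptions only, not any vendor's actual API.
    def sample_scenario(rng):
        return {
            "time_of_day_hours": rng.uniform(0, 24),
            "rain_intensity": rng.uniform(0, 1),  # 0 = dry, 1 = downpour
            "obstacle": rng.choice(["overturned truck", "pedestrian", "debris", "none"]),
            "obstacle_distance_m": rng.uniform(20, 200),
            "ego_speed_kmh": rng.uniform(40, 120),
        }

    def run_in_simulator(scenario):
        # Placeholder for handing the scenario to a simulator and the driving stack
        # under test; a real pipeline would launch the simulator and score the run.
        if scenario["obstacle"] == "none":
            return True  # nothing to hit
        # Crude stand-in rule: faster approaches need more clear distance ahead.
        return scenario["obstacle_distance_m"] > scenario["ego_speed_kmh"]

    rng = random.Random(42)
    scenarios = [sample_scenario(rng) for _ in range(10_000)]
    failures = [s for s in scenarios if not run_in_simulator(s)]
    print(f"{len(failures)} of {len(scenarios)} sampled scenarios ended in a collision")

The scenarios that end in a collision become new training and regression cases, which is why simulation scales in a way that waiting for rare real-world events never can.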

Predictions for autonomous vehicles have ranged from the sublime to the ridiculous. MIT autonomous vehicle researcher Lex Fridman suggests Elon Musk is one of the more optimistic proponents of the technology. In 2017 Musk suggested that in 10 years “...it will be very unusual for cars to be built that are not fully autonomous.” At the other end of the expectation spectrum, Fridman summarises the views of Rodney Brooks, another major industry player. Brooks suggests that perhaps one major city will ban manually driven cars from a significant portion of its streets, but not until 2031. He further suggests that beyond 2045 the majority of US cities will have introduced such bans. We assume that the prediction holds for other major cities around the world. Brooks suggests that this is how autonomous vehicles will eventually gain acceptance.

Fridman suggests that autonomous vehicles will not be adopted because they are safer, although they will be. Nor will they gain acceptance because they’re faster. In fact, until the majority of the fleet is fully autonomous they will be over-cautious and slower. And they will not be accepted because they’re cheaper. Economic indications are that this probably won’t be the case for some time. One caveat on the economics concerns road transport and delivery. Removing the cost of human participation in those sectors will deliver significant economic benefits.

As far as autonomous cars are concerned, Fridman suggests that the most important reason they will be accepted is because they will create a better experience for the passengers. The RethinkX report we reviewed a few years ago suggested the same and noted that releasing drivers from the burden of the attention-sapping, peak-hour crawl will be enough to generate acceptance.

Data analysis company Gartner developed a technology mapping indicator called the Hype Cycle. It places new technologies on a curve that indicates their current state of development. It features a peak of inflated expectations (probably reached a few years ago for autonomous vehicles) followed by a trough of disillusionment, in which, arguably, the technology currently sits. From there it ascends the slope of enlightenment on its way to the plateau of productivity, once the difficulties (both real and perceived) have been tamed. The real bottleneck with autonomous driving is the technology itself. But we’re working on it and edging ever closer to readiness for real roads in real conditions. Meanwhile, we wait with somewhat less bated breath than we might have had in the past.


Words: Paul Tuzson. As featured in Australian Automotive June 2021.

