Thatcham Research has highlighted the importance of accurately defining what self-driving, or autonomous mode, actually means to real-world consumers. A court in Munich has ruled that Tesla exaggerated promises about its so-called ‘self-driving’ technology, with the carmaker’s use of the term ‘Autopilot’ coming under strong scrutiny.
As The New York Times reports: “Tesla’s use of the brand name Autopilot for its software, as well as claims the company made on its German website about the software’s function, create the false impression that the car can drive itself, a Bavarian state court ruled. In fact, the court said, Autopilot is a driver-assistance system that requires human intervention.”
Matthew Avery, research director at Thatcham Research, comments:
“We have long warned of the pitfalls of the Autopilot system. Its seemingly competent performance can encourage drivers to hand too much control to the vehicle and lose sight of their responsibilities behind the wheel.
“This is a progressive process that begins when motorists are marketed the ‘self-driving’ experience. Autopilot is not a self-driving system. It is there to provide driver assistance, not become an invisible chauffeur.”
Yesterday on Twitter, Elon Musk made clear he was unhappy with the ruling, tweeting: “Tesla Autopilot was literally named after the term used in aviation. Also, what about Autobahn!?”
The Tesla boss also hinted earlier in July that updated semi-autonomous software would be released in autumn 2020 to enhance the capabilities of Tesla cars.
Naming is key
“We support the German court’s ruling. Naming is key, and Autopilot is an especially misleading term.
“How many times have movies depicted an airline captain disengaging completely when switching on autopilot – leaning back in their chair, reaching for a cup of coffee or even leaving the cockpit entirely?
“But it’s not just the name of the system. Tesla marketing frequently suggests the car is capable of ‘full self-driving’. Just recently, some UK customers received an email stating:
Our records indicate that you haven’t upgraded your Model S… to Full Self-Driving Capability. You can upgrade now at a reduced price of £2,200.
“The outcomes of driver over-reliance on the Autopilot system can be catastrophic. Reports of accidents with Autopilot engaged have become all too familiar. Many are fatal, and we don’t know whether drivers were ‘taking a chance’ or, worse still, genuinely believed their Autopilot system was fully capable of driving the car itself.
“When marketed and used sensibly, systems like this will ultimately benefit road safety. However, without a safety-first principle enshrined in new technology adoption, our roads will become more dangerous and it will take longer to reap the societal benefits new systems have the potential to bring.
Spotlight on driver assistance
“We’re therefore continuing to shine a spotlight on driver assistance systems by testing and evaluating their performance. As a member of Euro NCAP, we have developed new protocols and testing methodologies, and we’re launching the results of these tests later this year to show how these systems should be used and how effective they are at offering the right level of assistance to drivers.
“The resulting ratings will also consider carmaker marketing, encouraging carmakers to be prudent in their performance claims.
“Although the case in Germany focused on Tesla’s Autopilot, we believe it should serve as a brightly flashing hazard light on misleading marketing for all carmakers.
“If the warning is heeded, we look forward to a future where these exciting technologies can truly deliver on their promise.”
Thatcham Research is at the forefront of vehicle testing and a champion for the safe adoption of new vehicle technologies. In recent years its views on the dangers of overselling the so-called ‘self-driving’ capability of current driver assistance systems have been covered widely in the media.
In September 2019, Thatcham Research also launched 12 guidelines to minimise bumps in the road on the journey towards fully Automated Driving. Developed with the ABI, the guidelines came as part of Thatcham Research’s work with international regulators on designing new rules which will eventually allow Automated Driving Systems onto motorways.
Follow the link to watch the video and download the full ‘Defining Safe Automation’ report:
Tesla is blazing a trail in car manufacturing, and there is huge opposition in the established car industry to that disruption. History has shown many times that when something innovative comes along to disrupt a cosy cartel of parts suppliers and big-name manufacturers in any sector, the result is conflict. The battle for the electric car market is a lucrative one, as politicians fall over each other to dish out part-exchange (PX) subsidies for buyers, Vehicle Excise Duty (VED) breaks, free parking and probably a pack of M&Ms on top.
It is inevitable that the race to win the prize of creating a true self-driving car will be a bitter one.
To be fair to Tesla, the info on their website states quite clearly that Autopilot is for use by a fully alert driver, ready to take control at any time. Not someone eating a Krispy Kreme in the back seat. Here’s the link.
From an insurer’s point of view, the crucial matter here is the driver’s (and policyholder’s) understanding of what the term Autopilot means, should an accident happen as a result of over-reliance on the technology, or the failure of that tech, even if only for a second. Let’s imagine a scenario: if a driver believed the car was capable of detecting a hazard and applying the brakes, or steering around it, but the car did not, was that the fault of the driver or of the Autopilot?
That is the question which may be tested in court one day.