On Thursday, the Insurance Institute for Highway Safety announced that it is creating a rating system for hands-free advanced driver-assistance systems like Tesla’s Autopilot and General Motors’ Super Cruise. Later this year, IIHS will issue its first set of ratings, with grades of good, acceptable, marginal, or poor. Having a robust driver-monitoring system will be vital to earning a good grade.
And the institute is not alone. Also on Thursday, Consumer Reports revealed that it, too, will consider the safety of such tech features, adding points if there’s a good driver-monitoring system. CR says that so far, only Super Cruise and Ford’s BlueCruise are safe enough to get those extra points. Meanwhile, starting with model year 2024, CR will begin subtracting points for cars that offer partial automation without proper driver monitoring.
“Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” says IIHS President David Harkey. “In fact, the opposite may be the case if systems lack adequate safeguards.”
Indeed, what data we have had in past years is untrustworthy. For several years, Tesla and NHTSA repeatedly cited a statistic claiming that the electric automaker’s lane-keeping system made its cars substantially safer, but that claim crumbled when a third party actually dug into the data.
These are not driverless cars
Quickly, let’s define what we’re talking about, which is partially automated vehicles. Specifically, these are vehicles that combine an adaptive cruise control function (which reacts to vehicles in the path ahead and slows to match their speed) and a lane-keeping function (which tracks the lane markers on the roadway and keeps the car centered via the steering).
Importantly, the human driver is entirely responsible for situational awareness at all times—hence the requirement for a driver-monitoring system that ensures that’s actually happening.
Or, as the Partners for Automated Vehicle Education puts it: “When you use partial automation, the human stops actively driving and begins what is called a ‘vigilance task’: overseeing automation and waiting for a failure. We are actually worse at these tasks than actively driving, as the risk of inattention is very high.” (In fact, this whole thread from PAVE on this topic is worth a read.)
Unfortunately, the general public is experiencing a severe degree of confusion thanks to a combination of misleading marketing by one automaker and a classification scheme meant for automotive engineers, never for the public.
Leaving aside Tesla’s problematic branding, it’s time for us to move past the SAE automation levels and instead think of these “partially automated” (or “conditionally automated”) options as a set of technologies distinct from autonomous vehicles. These hands-free driver assists require the human behind the wheel to pay attention at all times, which is very much not the case for driverless autonomous vehicles like robotaxis, low-speed shuttles, and delivery bots.
How do you get a good score?
IIHS says that to earn a good rating, a partially automated driving system should “use multiple types of alerts to quickly remind the driver to look at the road and return their hands to the wheel when they’ve looked elsewhere or left the steering unattended for too long.”
Research shows that the more ways you alert a driver, the more likely they are to respond, so expect a combination of audio, haptic, and visual cues—something Super Cruise already does, for example. (It uses sound alerts together with seat vibrations and warnings on the main instrument panel.)
If the driver ignores the increasingly urgent cascade of alerts, a good-ranked system will slow the vehicle to a stop (or a crawl) and notify emergency services (via an OEM concierge like OnStar). If such an event does take place, the partially automated system should not re-engage unless the car has been turned off and then on again.
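To make the escalation logic concrete, here is a minimal, purely illustrative sketch in Python of the kind of attention-reminder cascade IIHS describes. The stage names, the specific time thresholds, and the `DriverMonitor` class are assumptions for illustration only; they do not reflect any automaker’s actual implementation or IIHS’s exact criteria.

```python
class DriverMonitor:
    """Hypothetical model of a 'good'-rated attention-reminder cascade.

    Stages escalate with seconds of driver inattention; the final stage
    slows the car and locks the system out until an ignition cycle,
    mirroring the behavior IIHS says a good-rated system should have.
    """

    # Illustrative escalation stages: (seconds of inattention, action).
    STAGES = [
        (3, "visual alert on instrument panel"),
        (6, "audio chime plus seat vibration"),
        (10, "slow to a stop and notify emergency services"),
    ]

    def __init__(self):
        self.locked_out = False  # cleared only by an ignition cycle

    def check(self, seconds_inattentive):
        """Return the list of actions triggered so far for this span of inattention."""
        if self.locked_out:
            return ["system unavailable until the car is turned off and on again"]
        actions = [action for threshold, action in self.STAGES
                   if seconds_inattentive >= threshold]
        # Reaching the final stage disables re-engagement for this drive cycle.
        if seconds_inattentive >= self.STAGES[-1][0]:
            self.locked_out = True
        return actions

    def ignition_cycle(self):
        """Turning the car off and on again restores availability."""
        self.locked_out = False
```

For example, seven seconds of inattention would trigger the first two alert stages, while twelve seconds would also trigger the slowdown and leave the system locked out until the car is restarted.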
Interestingly, IIHS also has some other requirements for a good grade. Automatic lane changes must be initiated or approved by the human driver; making lane changes fully automatic is an easy way to convince people that a vehicle is much more autonomous than it truly is. Adaptive cruise control should not automatically resume if the vehicle has been stopped for too long or if the driver is not paying attention to the road ahead. And IIHS wants systems to require the driver to wear their seatbelt and to enable automatic emergency braking in order for the partial automation to operate.
“Nobody knows when we’ll have true self-driving cars, if ever. As automakers add partial automation to more and more vehicles, it’s imperative that they include effective safeguards that help drivers keep their heads in the game,” says Harkey.