(NEW YORK) — In the early morning hours, California Highway Patrol chased a grey Tesla Model S for an unfathomable seven miles down Highway 101 as the driver slept, police said.
Redwood City Area CHP officers said they observed Alexander Joseph Samek, a local Los Altos politician, driving at around 3:30 a.m. PST on Nov. 30. Police followed Samek with lights and siren on, but he remained “unresponsive,” and “appeared to be asleep at the wheel,” according to the arrest report.
Assuming that the car was on Autopilot, police drove in front of Samek and “began slowing directly in front of the Tesla in hopes that the ‘driver assist’ feature had been activated and the Tesla would slow to a stop as the patrol vehicle came to a stop,” the arrest report said. Samek was charged on suspicion of driving under the influence.
But what is puzzling transportation analysts and Tesla watchers is how the pursuit could go on for that long. Tesla’s “Autopilot” feature requires the driver to apply slight pressure to the steering wheel periodically, roughly once a minute; if no input is detected, the system warns the driver and gradually brings the car to a stop. In this case, experts say, either Autopilot did not work as designed or the driver somehow defeated that safeguard.
Tesla declined to comment on the incident or to confirm that the car was in Autopilot mode. But on Sunday night, Tesla CEO Elon Musk tweeted: “Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here.”
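The behavior Musk describes amounts to a simple timeout-and-escalation rule: the longer the system goes without detecting driver input, the more insistent it becomes, until it brakes to a stop and switches on the hazard lights. The Python sketch below is purely illustrative; the stage names and time thresholds are hypothetical assumptions, not Tesla's actual values or code.

from enum import Enum, auto


class AlertLevel(Enum):
    """Escalation stages for a hands-on-wheel reminder, as described in the article."""
    NONE = auto()
    VISUAL_WARNING = auto()
    AUDIBLE_WARNING = auto()
    SLOW_TO_STOP = auto()  # gradually brake and turn on hazard lights


# Thresholds are illustrative only; they are not Tesla's actual values.
WARN_AFTER_S = 60.0       # roughly the "every minute" reminder the article mentions
ESCALATE_AFTER_S = 90.0
STOP_AFTER_S = 120.0


def escalation_for(seconds_without_input: float) -> AlertLevel:
    """Map time since the last detected steering-wheel input to an alert stage."""
    if seconds_without_input < WARN_AFTER_S:
        return AlertLevel.NONE
    if seconds_without_input < ESCALATE_AFTER_S:
        return AlertLevel.VISUAL_WARNING
    if seconds_without_input < STOP_AFTER_S:
        return AlertLevel.AUDIBLE_WARNING
    return AlertLevel.SLOW_TO_STOP


for t in (10, 65, 95, 130):
    print(f"{t:>3}s without input -> {escalation_for(t).name}")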
In a follow-up tweet, Musk said that Autopilot could not distinguish between different types of emergency vehicles, but that it would be able to in the near future. “We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months,” he wrote.
Redwood City CHP is familiar with the Tesla Autopilot feature in part because of a fatal crash the agency investigated in March. A 38-year-old Apple engineer died after he did not place his hands on the wheel in time while the car was in Autopilot mode, Tesla said.
The March crash is being investigated by the National Transportation Safety Board.
Dan Edmunds, director of vehicle testing at Edmunds, an automotive research firm, has been reviewing partially automated vehicles and called “Autopilot” a misleading name for what he described as an “overhyped automated cruise control system.” He said it was difficult to come up with an explanation for such a long pursuit, and that it underscored shortcomings in Tesla’s safety features.
“Certainly somebody could defeat the one-minute timeout that allows you to put your hands on the wheel and the car could go longer,” Edmunds told ABC News. “Cadillac’s Super Cruise system would not have allowed you to behave this way because Super Cruise does something that Tesla doesn’t do and should do. It has sensors that look at your head to see which way it’s pointed to make sure your chin’s up and not down against your shirt, and also looks at your eyeballs to see where they’re looking. So even if your head’s up, and you look off to the side, it will warn you and eventually disengage.”
“The fact that it doesn’t monitor the driver’s head position and line of sight is really a major shortcoming,” Edmunds said. “Just because somebody has their hands on the wheel, maybe the guy’s leaning on it, passed out, with just enough force to make it think that he’s got his hands on the wheel. The car isn’t really sure what the driver’s looking at. It doesn’t matter if you have your hands on the wheel or not, it matters if you’re looking out the windshield at the cars ahead.”
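Edmunds is drawing a distinction between two ways of deciding whether a driver is paying attention: inferring it from pressure on the steering wheel, as Tesla does, or watching the driver’s head and eyes with a camera, as Super Cruise does. The sketch below illustrates that difference; the field names and thresholds are hypothetical, not taken from either company’s software.

from dataclasses import dataclass


@dataclass
class DriverState:
    """Illustrative sensor readings; the fields and values are hypothetical."""
    wheel_torque_nm: float   # pressure the driver applies to the steering wheel
    head_pitch_deg: float    # 0 = upright; large positive = chin down against the chest
    gaze_on_road: bool       # whether an eye tracker sees the driver watching the road


def torque_only_attentive(state: DriverState, min_torque_nm: float = 0.3) -> bool:
    """Hands-on-wheel check: any sustained pressure counts as 'attentive',
    even if the driver is merely slumped against the wheel."""
    return state.wheel_torque_nm >= min_torque_nm


def camera_based_attentive(state: DriverState, max_pitch_deg: float = 20.0) -> bool:
    """Driver-monitoring check of the kind Edmunds describes for Super Cruise:
    require a head that is up and eyes that are on the road."""
    return state.head_pitch_deg <= max_pitch_deg and state.gaze_on_road


# A driver leaning on the wheel while asleep satisfies the pressure check
# but fails the camera check.
asleep = DriverState(wheel_torque_nm=0.5, head_pitch_deg=45.0, gaze_on_road=False)
print(torque_only_attentive(asleep))   # True  -> wheel-pressure system is satisfied
print(camera_based_attentive(asleep))  # False -> camera system would warn and disengage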
Copyright © 2018, ABC Radio. All rights reserved.