Feds escalate Tesla Autopilot investigation into emergency vehicle crashes

U.S. safety regulators have upgraded their investigation into Tesla’s advanced driver assistance system (ADAS), Autopilot. NHTSA has now opened an engineering analysis into the safety of the feature.

The agency first began looking into the issue in August 2021, after a series of accidents in which Teslas struck emergency vehicles stopped on the side of the road. At that point, it opened a preliminary evaluation (PE) of the system. During a PE, the agency takes information submitted by drivers, automakers and other sources and uses it to determine whether a recall or an engineering analysis is warranted, or whether the case should simply be dropped.

However, the agency found that the “crash review identified patterns in system performance and associated driver behavior that enabled the agency to identify areas of engineering inquiry across a variety of conditions,” warranting an upgrade of the preliminary evaluation to an Engineering Analysis (EA), according to NHTSA documents.

Read more: US safety regulators launch formal investigation into Tesla Autopilot after crashes

In addition to the 16 Tesla crashes involving first responders that prompted the investigation, NHTSA has examined 191 other collisions involving Autopilot. Of these, 85 were excluded because of external factors (such as the actions of other vehicles). In about half of the remaining 106 cases, the driver’s operation of the system (responding too late to a prompt, or responding incorrectly) was found to be a primary cause of the accident.

In another quarter of the accidents, drivers were using Autopilot in an environment where, according to Tesla’s owner’s manual, the system may have limitations. Where detailed crash data was available, it showed the driver’s hands were on the wheel in the final second before the accident.

Similarly, in the 16 first-responder accidents, only two of the drivers received a driver-engagement prompt within the five minutes before the crash. This suggests, NHTSA says, that drivers were generally not intentionally misusing Autopilot or asleep at the wheel.

“A driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner, does not necessarily preclude a system defect,” NHTSA wrote. “For systems labeled as SAE Level 2 ADAS, important design considerations include the ways in which a driver may interact with the system and the foreseeable ranges of driver behavior, whether intended or unintended, while such a system is in operation.”

The regulator added that finding effective ways to keep drivers engaged and ensure they are carrying out their supervisory driving task properly is an “important safety consideration.”

From here, NHTSA will review more data, evaluate vehicles, and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

An engineering analysis (which NHTSA typically aims to complete within a year) ends either when the regulator determines there is no safety concern or when the automaker initiates a recall.
