Tesla scored a rare victory with the National Highway Traffic Safety Administration (NHTSA) on Monday after the agency's Office of Defects Investigation (ODI) decided to close its investigation into Tesla's Actually Smart Summon tech, which allows drivers to summon their parked vehicles to them using just their phones.
The ODI opened an investigation into multiple reported crashes on January 6, 2025. After analyzing incident data provided by Tesla along with consumer complaints, the agency concluded that, since most of the reported crashes involved only minor property damage and no serious injuries, it would end its inquiry.
The NHTSA has been busy auditing Tesla lately, including opening an “engineering analysis” into the company last month over its other autonomous driving tech, Full Self-Driving (Supervised).
Autonomous driving in general is a hot topic, as thousands of autonomous vehicles operate in major cities like San Francisco, Austin, and Miami every day.
The NHTSA opened a Preliminary Evaluation in October covering an estimated 2,000 Waymo vehicles equipped with the company's 5th-generation automated driving system, following a Georgia media report that revealed the vehicles did not stop for crossing schoolchildren despite Waymo's repeated assertions that it had fixed the issue.
While those systems operate at different levels of autonomy than the one in the investigation that just ended with Tesla, the technology is built on the same principles, and the fact that no one has been injured by Actually Smart Summon yet doesn't mean no one will be in the future.
NHTSA clears Tesla autonomous tech, citing the lack of injuries so far
On Monday, NHTSA announced that it is closing its investigation into Tesla’s Actually Smart Summon tech, which acts as a short-distance autonomous system, after more than a year.
The agency says that out of millions of sessions, only a fraction of 1% resulted in an adverse incident, and none of them were fatal.
“Almost all those incidents took place where, typically early in a Summon session, the system or person using the app failed to fully detect or respond appropriately to vehicle surroundings, resulting in minor impacts. Incidents took place when app users did not have a complete 360-degree view of the surroundings in the app to assess situational awareness,” ODI said.
“This limited the app user’s ability to determine whether an impact was imminent during initial vehicle maneuvers such as reversing in close proximity to an obstacle or a curb. ODI found that the impacts most often occurred with parking gates, adjacently parked vehicles, and short parking bollards.”
Two of the crashes ODI investigated were related to camera blockages. In both cases, the vehicles were trying to navigate a snowy parking lot “with snow partially or fully obstructing the forward-facing cameras.”
The technology didn’t detect the camera blockage, and the vehicles collided with unoccupied parked vehicles.
Tesla issued an over-the-air software update to implement camera blockage detection, so users are alerted when a camera's view is obstructed; however, another recent investigation suggests that Tesla's camera blockage issue is recurring.
NHTSA opens ‘engineering analysis’ into Tesla FSD cameras
In March, the ODI announced it was escalating its investigation into Tesla FSD, opening an “engineering analysis” to evaluate Tesla Vision’s “degradation detection system.”
The NHTSA is investigating how much camera visibility is degraded by roadway conditions such as glare and airborne obstructions, and whether Tesla FSD (Supervised) can detect and adjust to the resulting degradation to still work safely.
“Available incident data raise concerns that Tesla’s degradation detection system, both as originally deployed and later updated, fails to detect and/or warn the driver appropriately under degraded visibility conditions,” the NHTSA said.
The agency has identified nine crashes in which it says Tesla FSD’s degradation systems may not have been functioning properly. It says FSD “did not detect common roadway conditions that impaired its visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”
And there could be many more instances that the agency does not know about, because its review of Tesla's responses to its request for more information revealed “additional crashes that occurred in similar environments and where the system either did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react.”
In each of those crashes, FSD lost track of or failed to detect a lead vehicle. Moreover, ODI says, Tesla has described “internal data and labeling limitations” that have prevented the company from being sure it has caught every crash caused by degraded vision, so there could be more such incidents than are currently known.
Tesla did not respond to a request for comment.
Tesla says it does not need LiDAR
While the NHTSA directly lists the lack of radar as a possible factor in these crashes, Tesla maintains that additional sensors are unnecessary.
Most experts consider light detection and ranging (LiDAR) to be the state-of-the-art sensing technology for driver assistance. Tesla competitors like Toyota offer LiDAR in addition to the kind of camera-based system that Tesla FSD relies on.
But Tesla CEO Elon Musk has called LiDAR an “expensive and unnecessary” fool’s errand, just “expensive hardware that’s worthless on the car.”
Recently, Tesla said that its FSD system has driven a cumulative total of 3.6 billion miles, nearly triple the 1.3 billion miles it reported a year ago.