The National Highway Traffic Safety Administration (NHTSA) is investigating Elon Musk’s Tesla over incidents in which its vehicles drove through red lights and other instances of dangerous driving while using the company’s partial-automation software known as Full Self-Driving (FSD).
Bloomberg reports that the investigation, which covers an estimated 2.9 million Tesla vehicles, was opened after the agency became aware of 58 incidents involving the driver-assistance system.
According to the NHTSA filing, the reported incidents include cases where Tesla vehicles drove through red lights and traveled in the wrong direction on roads. Several of these incidents resulted in crashes and injuries, although no fatalities have been reported within the scope of the current investigation. The agency specifically cited six reports in which a Tesla vehicle with FSD engaged “approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash” with other vehicles. The agency also noted that multiple incidents occurred at the same intersection in Maryland, prompting Tesla to take action to address the issue at that specific location.
Breitbart News reported in September that Tesla drivers have described FSD struggling with train crossings, including near-miss collisions with trains:
Italo Frigoli, a Tesla owner from North Texas, narrowly avoided a potentially catastrophic collision when his Tesla, operating in FSD mode, failed to recognize an approaching train at a railroad crossing. “It felt like it was going to run through the arms,” Frigoli recounted. “So obviously I just slammed on the brakes.” Video evidence from the car’s cameras appears to corroborate his account, and a subsequent test at the same crossing yielded similar results, with the Tesla’s software failing to detect the oncoming train.
Frigoli’s experience is not an isolated incident. At least six Tesla drivers who use FSD have reported problems with the technology at rail crossings, with four providing video evidence. Additionally, numerous online posts on Tesla forums and social media platforms describe similar mishaps, with some dating back to June 2023.
The FSD system is a key component of CEO Elon Musk’s vision for fully automated driving, as he pushes Tesla to embrace advanced technology. This investigation adds to the growing scrutiny of Tesla’s driver-assistance technology, alongside other ongoing probes into the automaker’s doors, Autopilot system, and the timeliness of its crash reporting.
Read more at Bloomberg here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.