Tesla’s Full Self-Driving (FSD) software has come under scrutiny as multiple Tesla owners report incidents of their vehicles failing to recognize and respond appropriately to oncoming trains at railroad crossings.
NBC News reports that Tesla’s much-touted Full Self-Driving software, an add-on package of driver-assistance features, has raised concerns among owners and experts due to its apparent inability to safely navigate railroad crossings. Despite Elon Musk’s claims that FSD is “the future of transport” and capable of driving “almost anywhere with your active supervision,” a growing number of Tesla drivers have reported alarming incidents in which their vehicles failed to detect oncoming trains or respond appropriately to flashing lights and descending gate arms at rail crossings.
Italo Frigoli, a Tesla owner from North Texas, narrowly avoided a potentially catastrophic collision when his Tesla, operating in FSD mode, failed to recognize an approaching train at a railroad crossing. “It felt like it was going to run through the arms,” Frigoli recounted. “So obviously I just slammed on the brakes.” Video evidence from the car’s cameras appears to corroborate his account, and a subsequent test at the same crossing yielded similar results, with the Tesla’s software failing to detect the oncoming train.
Frigoli’s experience is not an isolated incident. At least six Tesla drivers who use FSD have reported problems with the technology at rail crossings, with four providing video evidence. Additionally, numerous online posts on Tesla forums and social media platforms describe similar mishaps, with some dating back to June 2023.
The National Highway Traffic Safety Administration (NHTSA) has acknowledged the issue, stating that they are “aware of the incidents and have been in communication with the manufacturer.” The agency emphasizes its commitment to enforcing the law and prioritizing the safety of all road users.
Experts warn that the consequences of these software failures could be disastrous. Phil Koopman, an associate professor emeritus of engineering at Carnegie Mellon University, cautions, “If it’s having trouble stopping at rail crossings, it’s an accident waiting to happen. It’s just a matter of which driver gets caught at the wrong time.”
The rail industry has long expressed concerns about the potential dangers posed by autonomous vehicles at railroad crossings. In 2018, the Association of American Railroads stressed the complexity of the issue, noting that self-driving cars must recognize various signals, including locomotive headlights, horns, and bells, as not all crossings have gates and flashing lights.
Despite these warnings, Tesla CEO Elon Musk has continued to make bold claims about the capabilities of FSD, asserting that Tesla vehicles “can drive themselves.” However, experts argue that such claims are not supported by evidence, and Tesla itself classifies FSD as a “Level 2” system, requiring constant human supervision.
The root of the problem, according to experts, may lie in the training data used to develop Tesla’s FSD software. Koopman suggests that Tesla engineers may not have included sufficient examples of train crossings in the videos used to train the software, leading to its inability to handle these scenarios effectively.
Read more at NBC News.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.