Woman Sues Tesla After Cybertruck Tries to Drive Her Off Bridge
A recent lawsuit has cast a stark spotlight on the contentious claims surrounding Tesla’s “Full Self-Driving” (FSD) capabilities, as a Houston woman alleges her Cybertruck, operating under the advanced driver-assistance system, attempted to drive her off a bridge. The incident adds to a growing list of concerns from regulators and critics, and to a new wave of legal challenges scrutinizing the safety and reliability of Tesla’s autonomous driving technology, which CEO Elon Musk has long championed as being on the cusp of full autonomy.
The lawsuit, filed by Justine Saint Amour, details a harrowing experience that occurred in August 2025. According to reports from *Chron* and the *Austin American-Statesman*, Saint Amour was driving her futuristic Cybertruck when its “Full Self-Driving” feature allegedly malfunctioned, leading to a dangerous collision. The unorthodox pickup truck, known for its distinctive angular design, “suddenly and without warning” accelerated towards the edge of a Houston overpass, threatening to plunge the vehicle and its occupant onto the freeway below.
Dashcam footage, publicly available and central to the lawsuit, captures the terrifying sequence of events. The video shows the Cybertruck ascending an overpass ramp with what appears to be increasing speed. As it approached a critical curve in the road, leading to a Y-shaped interchange on the 69 Eastex Freeway, the vehicle failed to adjust its trajectory or speed. Instead of following the curve, the Cybertruck barreled straight ahead, tearing through a line of traffic cones that delineated the lanes, and violently slammed into a concrete sidewall. The impact was severe, causing the truck to spin wildly and sending pieces of its hood flying across the road, a chilling testament to the force of the collision. Saint Amour’s lawsuit explicitly states that the Cybertruck “attempted to drive straight ahead into the concrete barrier and the freeway below,” underscoring the gravity of the system’s alleged failure.
A Houston driver was in a Cybertruck in August 2025 when the Autopilot-controlled vehicle drove straight into a concrete barrier on a Y-shaped overpass on 69 Eastex Freeway. The vehicle was expected to follow a curve to the right, but when it failed to do so, she disengaged the… pic.twitter.com/LopI3y5elg
— Austin Statesman (@statesman) March 6, 2026
In the moments leading up to the crash, Saint Amour recounted a desperate attempt to regain control. Realizing the vehicle was accelerating too rapidly up the ramp and not adhering to the road’s curve, she tried to disengage FSD and intervene manually. However, the speed and suddenness of the malfunction left her precious little time to react. The impact inflicted “substantial” injuries, leaving her in severe pain and requiring extensive medical attention. She was diagnosed with two herniated discs in her lower back and another in her neck, along with sprained tendons in her wrist and numbness and weakness in her right hand, injuries that highlight the violent nature of the collision and the lasting physical toll it has taken.
This incident is far from isolated; it represents the latest in a series of highly publicized mishaps and regulatory challenges that plague Tesla’s self-driving systems. The capabilities of these systems, which include both Autopilot and the more advanced FSD, have regularly drawn intense scrutiny from government regulators worldwide, particularly the National Highway Traffic Safety Administration (NHTSA) in the United States. Critics argue that Tesla’s naming conventions – “Autopilot” and “Full Self-Driving” – are inherently misleading, fostering a false sense of security and encouraging drivers to over-rely on systems that are, by Tesla’s own admission in their disclaimers, merely Level 2 driver-assistance features requiring constant human supervision.
The tech frequently causes mishaps and accidents, some of them deadly. Tesla was found partially responsible for the death of a 22-year-old woman who was struck by a Tesla running Autopilot, the precursor to FSD, and a jury ordered the company to pay $243 million to the woman’s family, setting a significant precedent for liability in autonomous vehicle incidents. Such rulings underscore the immense legal and ethical complexities surrounding these technologies. Last year, NHTSA launched a probe into the automaker after a Tesla running FSD struck and killed an elderly pedestrian on the side of the road. Disturbing dashcam footage from that incident showed that the vehicle’s front camera, critical to its “vision-only” system, had been blinded by direct sunlight just before the collision, raising serious questions about the system’s robustness in varying environmental conditions.
These crashes, particularly the one involving the Cybertruck, have reinvigorated criticisms of Tesla CEO Elon Musk’s unwavering insistence on a “vision-only” approach to self-driving technology. Musk has famously dismissed the use of additional sensors like LiDAR – which detects surroundings using lasers and creates highly accurate 3D maps – as an expensive “crutch.” Instead, Tesla relies almost exclusively on an array of cameras combined with sophisticated artificial intelligence to interpret the world. While this approach has the advantage of being potentially less costly and more scalable, its limitations, especially in adverse weather, low light, or complex scenarios like the Y-shaped interchange, are becoming increasingly apparent and dangerous. Competitors like Waymo and Cruise, as well as many traditional automakers, employ a sensor suite that often includes LiDAR, radar, and ultrasonic sensors in addition to cameras, believing this redundancy offers a more robust and safer perception system.
The latest lawsuit from Saint Amour directly echoes these fundamental criticisms of Tesla’s technological choices. “While engineers at Tesla recommended the super-human vision of LiDAR be included for self-driving vehicles, and competitors like Waymo and Cruise relied heavily on LiDAR, Musk chose instead to rely only upon cheap video cameras,” the lawsuit states. This legal challenge squarely places the blame on a design philosophy that, according to the plaintiff, prioritizes cost-saving and an ambitious, unproven vision over established safety redundancies. The case will likely delve deep into the technical merits of Tesla’s FSD system, comparing its performance and limitations against industry best practices and the expectations set by the company’s marketing.
The suit also accuses Tesla of misrepresenting Full Self-Driving’s true capabilities. Despite its evocative name, the mode is incapable of fully driving itself and explicitly requires a motorist to maintain vigilance and be prepared to take over at a moment’s notice. That critical distinction is often blurred by Elon Musk’s frequent, optimistic pronouncements that Tesla cars can already drive themselves, and by his consistent hype that full autonomy is “right around the corner” or coming “next year.” Such rhetoric, critics argue, creates a dangerous dissonance between perceived capability and actual performance, lulling drivers into a false sense of security. The California Department of Motor Vehicles (DMV) notably sued Tesla for false advertising precisely because of FSD’s misleading branding. Under mounting pressure, Tesla quietly modified the branding last year to “Full Self-Driving (Supervised),” a tacit acknowledgment of the system’s current limitations. Tesla has, in turn, retaliated by filing a counter-suit against the state’s DMV, signaling an escalating legal battle over the very definition and marketing of autonomous capabilities.
Bob Hilliard, Saint Amour’s lawyer, articulated the core of their legal argument in a statement to *Chron*: “This company wants drivers to believe and trust their life on a lie: that the vehicle can self-drive and that it can do so safely. It can’t, and it doesn’t.” This blunt assessment encapsulates the growing frustration among consumers and regulators alike, who feel that Tesla’s marketing outpaces its engineering reality. The Cybertruck incident serves as a potent, if terrifying, reminder of the real-world consequences when advanced technology fails to live up to its ambitious promises, particularly in a domain as critical as automotive safety. The outcome of this lawsuit could have significant implications for how autonomous driving systems are developed, marketed, and regulated moving forward, potentially forcing a reevaluation of the balance between innovation and public safety.
More on Tesla: Tesla Robotaxis Crashing Vastly More Often Than Human Drivers