Latest News 18-10-2025 13:03

Tesla’s self-driving cars under fire again

The U.S. National Highway Traffic Safety Administration (NHTSA) has opened a new investigation into 2.88 million Tesla vehicles running ‘Full Self-Driving’ (FSD). Officials say the system may be breaking traffic laws and, worse, causing accidents. According to Reuters, 58 reports describe Teslas blowing through red lights, drifting into the wrong lanes and even crashing at intersections. Fourteen of those cases involved actual crashes, and 23 resulted in injuries.

Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join my CYBERGUY.COM newsletter.

Red lights, train tracks and trouble ahead

In one striking pattern, six Tesla vehicles reportedly ran red lights before colliding with other cars. One driver in Houston complained that FSD ‘is not recognizing traffic signals,’ saying the car stopped at green lights but ran through reds. The driver even said Tesla saw the issue firsthand during a test drive, but refused to fix it. The agency is also reviewing new reports that some Teslas using FSD failed to handle railroad crossings safely, with one case involving a near-collision with an oncoming train.

Mounting legal and safety scrutiny

This is far from Tesla’s first brush with regulators. The company is already facing several investigations tied to both its Autopilot and FSD systems. In one high-profile case, a California jury ordered Tesla to pay $329 million after an Autopilot-related crash killed a woman. Another investigation is looking into Tesla’s limited Robotaxi service in Austin, Texas, where passengers reported erratic driving and speeding — even with human safety drivers on board. Meanwhile, Tesla is still fighting a false advertising lawsuit from California’s DMV. Regulators say calling the software ‘Full Self-Driving’ is misleading since it requires constant driver supervision. Tesla recently changed the name to ‘Full Self-Driving (Supervised)’ to reflect that reality.

Regulators say more crashes may come

Tesla’s latest FSD software update arrived just days before the investigation began. But the NHTSA says the system has already ‘induced vehicle behavior that violated traffic safety laws.’ This investigation, now in its early stages, could lead to a recall if the agency finds Tesla’s self-driving software poses a safety risk.

What this means for you

If you drive a Tesla with FSD enabled, stay alert. The system isn’t fully autonomous, no matter what the name suggests. You should:

- Keep your hands on the wheel and eyes on the road at all times.
- Manually override the system when approaching intersections, crosswalks or railroad tracks.
- Check for Tesla software updates regularly — they may include critical safety fixes.
- Report any unsafe FSD behavior to NHTSA.

For everyone else, this investigation is a reminder that ‘self-driving’ still means supervised driving.

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com.

Kurt’s key takeaways

Tesla’s dream of a fully autonomous future keeps hitting speed bumps. With safety regulators circling and lawsuits piling up, the company’s next moves will shape public trust in AI-driven transportation. Still, the push toward automation isn’t slowing down; it’s just under heavier watch.

How much control would you give an AI behind the wheel? Let us know by writing to us at Cyberguy.com.

Copyright 2025 CyberGuy.com.  All rights reserved.

This post appeared first on FOX NEWS