Prime Highlights
- Tesla’s robotaxi pilot in Austin tested with short routes, set prices, and onboard safety monitors.
- Early user footage exposes alarming driving patterns, fueling regulatory probes and public outcry.
Key Facts
- Tesla’s robotaxi service runs from 6 a.m. to midnight in a geofenced zone of downtown Austin, using a fleet of about 10 Model Y vehicles.
- Footage shows cars swerving across lanes, speeding, and reacting erratically near police vehicles.
- U.S. regulators have contacted Tesla, raising concerns over safety and the company’s push to keep test data private.
Key Background
Tesla recently began a limited rollout of its long-anticipated robotaxi service in Austin, Texas. The launch marks a significant step in Elon Musk’s decade-long vision of autonomous ride-hailing powered by Tesla’s proprietary Full Self-Driving (FSD) software. The pilot program runs on a small fleet of about 10 Model Y cars and is currently invite-only. The robotaxis are geofenced to certain areas of downtown Austin and operate daily from 6 a.m. to midnight. Riders pay a fixed fare of $4.20 per trip.
Despite the hype surrounding the rollout, the service has already raised serious safety questions. Videos posted by early riders and onlookers show the vehicles performing risky maneuvers, including veering into oncoming lanes, speeding, and braking abruptly, particularly when approaching intersections or police vehicles. Although each robotaxi carries a human safety operator in the front seat, the public and regulators remain deeply concerned about the software’s performance.
Tesla’s FSD relies solely on cameras and neural networks, rather than the radar and LiDAR that rivals like Waymo add for a broader sensing paradigm. Skeptics argue that Tesla’s vision-only approach lacks redundancy and may be less reliable in poor weather or at night.
Compounding the uproar, advocacy groups have staged independent demonstrations in which Tesla vehicles repeatedly failed to detect and stop for child-sized mannequins crossing in front of stopped school buses. These tests, replicated under controlled conditions, point to significant deficiencies in Tesla’s pedestrian detection capabilities.
In response, the National Highway Traffic Safety Administration (NHTSA) has contacted Tesla regarding potential safety violations and prior accidents involving FSD software, including a fatal crash in 2023. Texas legislators have also weighed in, suggesting that Tesla pause further deployment until stricter oversight is in place. As Tesla looks to expand robotaxis nationwide, experts warn that public acceptance and safety verification are major hurdles that must be cleared before large-scale deployment.