A new report reveals Tesla may be programming its Autopilot system to disregard specific traffic signs. Business Insider found that the company’s driver assistance technology is not only being trained to recognize and respond to road markings and signs but, in some cases, to ignore them altogether.
Inside Tesla’s Autopilot Training Process
Tesla’s Autopilot is a Level 2 driver-assistance system that combines adaptive cruise control with lane-keeping assist. To enable these features, Tesla fits its vehicles with strategically placed cameras that monitor the surrounding environment and assess road conditions. The footage is fed into the car’s onboard computer, which analyzes it so the vehicle can react to features such as road signs and lane markings.
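In broad strokes, that pipeline maps perceived road features to driving actions. The toy sketch below is purely illustrative and assumes nothing about Tesla’s actual software; the feature names and actions are invented for demonstration:

```python
# Purely illustrative sketch (not Tesla's code): how a Level 2 system
# might map detected road features to driving actions.

# Hypothetical labels a perception stack could emit, and matching actions.
SIGN_ACTIONS = {
    "stop_sign": "brake_to_stop",
    "speed_limit_40": "cap_speed_40",
    "lane_marking_solid": "hold_lane",
}

def plan_action(detected_feature: str) -> str:
    """Return the driving action for a detected road feature,
    defaulting to alerting the driver when the feature is unrecognized."""
    return SIGN_ACTIONS.get(detected_feature, "alert_driver")
```

The key property of a Level 2 system is that fallback: anything the mapping cannot handle is handed back to the human driver.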
Tesla employs a team of researchers to help train Autopilot, teaching the system how to respond to various road situations. One might assume the goal is to ensure Autopilot strictly follows all traffic laws. However, according to Business Insider, the training process sometimes involves instructing Autopilot to overlook certain regulations.
A former Tesla employee said his daily job involved reviewing five to six hours of footage captured by Tesla’s onboard Vision cameras. His task was to “label” objects such as road signs and lane markings so the AI could learn how to interpret them. Surprisingly, he was occasionally asked to teach Autopilot to disregard specific signs, such as “No Turn on Red” or “No U-Turn.” A “No Turn on Red” sign indicates that vehicles must wait for a green light before turning right, to avoid conflicts with pedestrians or oncoming traffic; ignoring it could increase safety risks.
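To make the labeling workflow concrete, here is a hypothetical example of what a single annotation might look like. The field names and values are assumptions for illustration only, not Tesla’s actual schema; the `policy` field captures the report’s claim that some signs were labeled to be disregarded:

```python
# Hypothetical annotation record from a video-labeling workflow.
# Field names are invented for illustration, not Tesla's schema.
annotation = {
    "frame_id": 1042,
    "label": "no_turn_on_red",
    "bbox": [312, 88, 360, 140],   # x1, y1, x2, y2 in pixels
    "policy": "ignore",            # per the report: some signs marked to be disregarded
}

def should_obey(record: dict) -> bool:
    """A labeled sign influences planning only if its policy is 'obey'."""
    return record.get("policy") == "obey"
```

Under this sketch, a model trained on such records would treat a “No Turn on Red” sign as visual clutter rather than a rule to follow, which is exactly the behavior the former employee described.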
The former employee explained that, rather than being trained to strictly obey these signs, Autopilot was sometimes given instructions that allowed it to ignore them altogether. The rationale was never explicitly communicated, but it seemed that Tesla aimed to build a system that thinks more like a human driver, interpreting road conditions rather than rigidly following every rule.
Employee Reactions and Concerns Over Ethical Standards
For some employees, this training approach raised ethical concerns. “That was something that made me and my coworkers uncomfortable,” said the former employee. “Sometimes they listened, but most of the time their response was to do their job and get paid.”
The idea of a self-driving car ignoring specific traffic regulations runs counter to conventional safety principles and may deepen a perceived lack of transparency in Tesla’s self-driving program. One employee expressed discomfort with teaching a system to make judgment calls that might violate traffic rules, a process that could produce unpredictable behavior on the road.
Workplace Conditions and Employee Monitoring at Tesla
In addition to these ethical concerns, employees working on Tesla’s Autopilot team reportedly face challenging working conditions. One current employee described the workplace environment as “hell,” expressing frustration over Tesla’s stringent monitoring and productivity tracking.
Tesla reportedly uses a software program called Flide Time, which monitors employees’ keystrokes and logs periods of inactivity, effectively timing how long employees are away from their keyboards. Employees are allowed a 15-minute break and a 30-minute lunch, but are reportedly disciplined if they exceed these allotted times or take “too long” on bathroom breaks.
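The core mechanism the report describes, timing gaps between keystrokes, is simple to sketch. The function below is an illustrative assumption about how such a tool could work, not the actual software; the threshold is invented:

```python
# Toy sketch of keystroke-gap monitoring as described in the report.
# Names and the 300-second threshold are illustrative assumptions.
def flag_idle_periods(keystroke_times, threshold_s=300):
    """Return (start_time, gap_length) for every gap between consecutive
    keystroke timestamps longer than threshold_s seconds."""
    gaps = []
    for prev, curr in zip(keystroke_times, keystroke_times[1:]):
        if curr - prev > threshold_s:
            gaps.append((prev, curr - prev))
    return gaps
```

Even this toy version shows why such tracking feels invasive: a single long gap, whatever its cause, is recorded and attributable to an individual.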
This level of surveillance has contributed to a high-pressure work environment that some employees find unbearable. While many initially saw working at Tesla as a promising opportunity for career advancement, these conditions have reportedly left them feeling disillusioned and exhausted.
Safety and Ethical Implications for Tesla’s Future
Tesla’s alleged approach to training Autopilot raises questions about safety and ethics in developing autonomous vehicles. A self-driving system designed to ignore traffic regulations could create real safety risks, undercutting the company’s stated mission of building safe, accessible, and innovative transport solutions.
As Tesla continues to push boundaries with its Autopilot technology, these reported concerns highlight the importance of balancing innovation with transparency and ethical responsibility.