Tesla has recently been rolling out a beta version of its 'Full Self-Driving' Autopilot update to a small group of users. The 'Full Self-Driving' beta program has been running on public streets since October last year, as the company has expanded the rollout of the prototype software to small groups of Tesla owners, with a further expansion coming this April. However, we are not sure that's a good idea, considering a new video has surfaced on the internet showing the system in action, and it's laughable to say the least.
This video comprehensively shows how technologically limited and potentially dangerous Tesla Autopilot's "Full Self-Driving" beta program still is. In the 13-minute video posted on YouTube by user 'AI Addict', we can see a Tesla Model 3 running Full Self-Driving Beta 8.2 fumbling its way around Oakland. It must be noted that Tesla's Autopilot has never been a fully autonomous system; it has always required constant human supervision and split-second intervention.
While the video begins with embarrassing but minor mistakes, such as hesitancy and confusion, it quickly moves on to extremely risky and potentially harmful driving. The Tesla Model 3 can be seen struggling to pick a lane, stopping well short of the line at traffic signals and even getting stuck behind parked cars. It seems particularly confused at intersections, often ignoring road signs and traffic lights. It can even be seen disengaging abruptly in the middle of the road, posing a hazard to other road users.
What's really concerning is that there were a couple of near collisions: once when crossing an intersection with no stop sign for cross traffic, and another moment where it seemed as though the car wanted to drive straight through a fence. There were several instances where the driver had to take control manually, either to continue the drive or to avoid a crash. This is, of course, still a beta version of the software, and no one expects it to perform perfectly; that is why it needs to be tested, so the issues can be found and ironed out. What's appalling, however, is that Tesla has already rolled out this incomplete software to its customers, even charging a substantial premium for it.
When the technology is still this nascent, it should ideally have been kept in the hands of Tesla employees. Otherwise, it only leads to early adopters carrying out uncontrolled tests on public roads, potentially putting all other road users at risk. If a driver fails to catch the system's mistakes in time, the consequences could be serious. What is really happening is that all this testing ultimately benefits the world's most valuable carmaker at essentially zero cost: every Tesla owner who purchases "Full Self-Driving" is conducting unpaid research on Tesla's behalf.