
Photo: Justin Sullivan/Getty Images
On Wednesday, Tesla announced that it would recall more than 2 million cars – nearly all of its vehicles on the road in the U.S. – over issues with its “Autopilot” system, one of the electric carmaker’s central features. The recall was spurred by a yearslong investigation by the National Highway Traffic Safety Administration, which found that Autopilot, which helps drivers steer and brake, has been involved in a number of accidents. (Tesla disputes that characterization.) The recall is a blow for the Elon Musk–led company, which has been struggling with production, sales, and other issues lately. Below is a rundown of the known safety issues with Autopilot and what the recall will mean.
According to U.S. regulators at the National Highway Traffic Safety Administration, the recall is meant to fix defects in how Teslas make sure drivers are paying attention while using the vehicles’ Autopilot advanced driver-assistance system. It comes after meetings between the NHTSA and Tesla starting in October, during which investigators at the agency detailed the safety issues they believed needed to be fixed. Tesla then agreed to a voluntary recall implementing the software changes, though it did not agree with the NHTSA’s analysis. The recall follows a two-year investigation by the agency into crashes – including some with fatalities – involving Tesla vehicles that occurred while the Autopilot system was engaged.
The recall covers virtually all of the Tesla vehicles currently on the road in the U.S. – Models Y, S, 3, and X produced since October 2012 – with the only exceptions being vehicles that don’t support the company’s Autopilot system.
Tesla’s Autopilot is an advanced “hands-on” driver-assistance system that includes the Autosteer and Traffic-Aware Cruise Control features. The system, which comes standard on all the company’s vehicles, uses cameras and other sensors to enable semi-autonomous driving.
Tesla owners can also pay extra for Enhanced Autopilot, which adds additional features like automatic navigation, assisted lane changes, and self-parking functions. Autopilot is not the same thing as the company’s Full Self-Driving capability, which costs more and includes all of the above plus features that allow the car to “drive itself almost anywhere with minimal driver intervention.”
The recall is just an automatic over-the-air software update pushed out to the vehicles. Tesla has already begun rolling out the update, and it will come at no cost to Tesla owners.
Simply put, the software update will add more driver monitoring to Tesla’s Autopilot system, which means new driver-engagement checks and alerts aimed at preventing Tesla drivers from neglecting their driving while using Autopilot. For instance, the updated software will alert drivers if they try to use Autosteer in driving conditions it wasn’t designed for (i.e., anywhere other than limited-access highways and expressways that have on-off ramps, a center divider, and no cross traffic). It will also disable Autosteer under certain circumstances, including, per the NHTSA, “if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged.” Driver engagement is monitored using hand-detecting pressure sensors on the steering wheel and a camera inside the vehicle that tracks the driver’s head movements.
A recent Washington Post investigation that combed through NHTSA data found that since 2019, Teslas in Autopilot mode have been involved in 736 crashes. The Post reported that the number of accidents has surged over the last four years, which reflects “the hazards associated with increasing use of Tesla’s driver-assistance technology as well as the growing presence of the cars on the nation’s roadways.” Autopilot appears to be particularly hazardous for motorcyclists; four motorcyclists have been killed since 2019 in Autopilot-related crashes, out of 17 fatal collisions overall.
Autopilot has also come under scrutiny from the federal government before over a feature that allowed drivers to take their hands off the wheel.
The problem with the question of overall safety is one of comparison. Tesla claims Autopilot prevents many accidents that would otherwise occur, and that drivers get into more accidents when they aren’t using it. But as the New York Times noted in 2022, that statistic is misleading, since Autopilot is generally used in highway driving, which tends to be safer than driving in suburban or rural areas. And there’s a serious dearth of data comparing Autopilot accident statistics to those of other similar systems:
Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Neither have other carmakers that offer similar systems. Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scant. American drivers – whether using these systems or sharing the road with them – are effectively guinea pigs in an experiment whose results have not yet been revealed.
Bloomberg notes that the recall could prompt and/or bolster lawsuits alleging that the use of Tesla’s Autopilot led to crashes:
Half a dozen lawsuits headed to trial in the next year in Florida, California, and Texas allege that Tesla allowed Autopilot to be used on roads for which it wasn’t designed and that the technology failed to provide sufficient warnings when drivers became distracted. Lawyers leading the cases say these issues are mirrored in the recall.
As the Wall Street Journal reports, Tesla has also been accused of overpromising what Autopilot is capable of:
Tesla, which didn’t respond to requests for comment, has previously argued in a court filing that its statements about its driver-assistance technology are protected forecasts, truthful opinions, or inactionable corporate puffery. …
Tesla’s promotion of Autopilot has for years sparked criticism that the company has provided drivers with an inflated sense of the technology’s capabilities and created confusion over what constitutes proper use. The U.S. Justice Department and Securities and Exchange Commission have opened investigations into whether Tesla misled the public in how it marketed Autopilot. Neither has brought any enforcement actions against Tesla in connection with the investigations. In a continuing private case in Florida involving a 2019 fatal crash, a judge ruled in November that the plaintiff could seek damages, saying there is evidence Tesla overstated Autopilot’s capabilities.
Critics say it won’t fix the underlying problem, though some safety experts say it’s a start, at least. As the Verge’s Andrew J. Hawkins notes, the software update will make it harder to misuse Autopilot, but not impossible:
“It’s progress,” said Mary “Missy” Cummings, a robotics expert who wrote a 2020 paper evaluating the risks of Tesla’s Autopilot system, “but minimal progress.” Cummings said the National Highway Traffic Safety Administration missed an opportunity to force the company to address concerns around owners using Autopilot on roads where it wasn’t intended to work. … “It’s very vague,” she said.
Another expert told the Verge that Tesla drivers will still be able to fool their car’s monitoring system if they want to:
Allowing Tesla to push an over-the-air software update ignores many of the structural defects with Autopilot, said Sam Abuelsamid, principal research analyst at Guidehouse Insights. The torque sensors are prone to false positives, such as when drivers try to trick the system by adding weight to the steering wheel that counteracts automatic movements, and false negatives, like when the wheel fails to detect a driver’s hands if they are holding it steady. …
Meanwhile, the camera, which only came into use for Autopilot driver monitoring in 2021, doesn’t work in low-light conditions, he noted. Other automakers use infrared sensors that can detect depth and work in low-light situations. Consumer Reports recently demonstrated that Tesla’s cameras could be tricked into thinking there was someone in the driver’s seat when there wasn’t.
“This absolutely could have gone another way,” Abuelsamid said. “NHTSA could do its job and actually force Tesla to do a recall and install driver eye and hands monitoring and true geofencing of the system, or disable it altogether if they cannot.”
The Washington Post spoke to some disappointed critics as well:
“What a missed opportunity,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “I have yet to see Tesla, or anyone defending Tesla, come up with an argument for why we should be letting people use [Autopilot] on roads where it could encounter cross traffic.”
And some believe the Biden administration may be going easy on Tesla:
Officials and lawmakers expressed concern that NHTSA may have been reluctant to come down harder on the automaker, which has a cultlike following among consumers and enormous influence over the country’s transition to electric vehicles – a priority for the Biden administration. However, NHTSA said its investigation into Autopilot remains open, and some Tesla critics held out hope that the recall may not be NHTSA’s final action.
This post has been updated.