Tesla Sells ‘Full Self-Driving,’ but What Is It Really? - The New York Times

As federal investigators escalate their scrutiny of Tesla’s driver-assistance technology, another problem is emerging for the electric carmaker: complaints among customers that they have been sold an additional driver-assistance option that doesn’t operate as advertised.

Over the years, Tesla owners have paid as much as $10,000 for the package, called Full Self-Driving. F.S.D., which can be purchased as an extra on Tesla cars, is a collection of services that add to Tesla’s Autopilot, the driver-assistance technology that government investigators are taking a look at after a string of crashes.

Critics say F.S.D. hasn’t lived up to its name since its debut more than two years ago. It can help a car navigate off one highway and onto another, and respond to traffic lights and stop signs. It also includes a service for summoning a car out of a parking space or parking lot with a mobile app. But full self-driving? Not quite.

When Joel M. Young paid $6,000 for F.S.D. in 2019, he assumed he would receive a system that could drive anywhere on its own by year’s end. Two years later, that remains beyond the system’s abilities. Mr. Young, a lawyer, writer and car enthusiast living in Placitas, N.M., recently asked Tesla to refund his money, and it declined. On Wednesday, he sued the company, accusing it of fraud and breach of contract, among other complaints.

“Tesla has not delivered what it promised,” he said.

Mr. Young’s suit is most likely the second from a customer aimed at the F.S.D. add-on feature. Two brothers in Southern California have filed a suit that raises similar complaints. And as many enthusiasts on social media platforms like Reddit question whether they have paid for something that does not exist, the California Department of Motor Vehicles recently said it was reviewing Tesla’s use of the term Full Self-Driving.

Also on Wednesday, Senators Richard Blumenthal of Connecticut and Edward J. Markey of Massachusetts, both Democrats, sent the chair of the Federal Trade Commission a letter calling on the agency to investigate the marketing and advertising of Autopilot and F.S.D.

Tesla privately acknowledges the limitations of the technology. As the public advocacy website PlainSite recently revealed after a public records request, Tesla officials have told California regulators that the company is unlikely to offer technology that can drive in any situation on its own by the end of 2021.

Tesla’s factory in Fremont, Calif., in 2018. “Tesla has not delivered what it promised,” said an owner who paid $6,000 for Full Self-Driving.
Justin Kaneps for The New York Times

“If we can’t trust Tesla when they say their vehicles are full self-driving, how can we trust the company when it says they are safe?” said Bryant Walker Smith, an associate professor in the Schools of Law and Engineering at the University of South Carolina who specializes in autonomous vehicles.

Tesla did not respond to several requests for comment.

Complaints about the F.S.D. package may pale in comparison with the concerns that people are being killed by misuse of or glitches in Tesla’s driver-assistance technology. But they point to a common thread in Tesla’s approach to driving automation: The company is making promises that other carmakers shrink from, and its customers think their cars can do more on their own than they really can.

“One of the downsides of automated technology can be overreliance — people relying on something it may not be able to do,” said Jason K. Levine, executive director of the Center for Auto Safety, a nonprofit that has monitored the industry since the early 1970s.

Other automakers are being considerably more conservative when it comes to automation. The likes of General Motors and Toyota offer driver-assistance technologies akin to Autopilot and F.S.D., but they do not market them as self-driving systems.

Backed by billions of dollars from major automakers and tech giants, companies like Argo, Cruise and Waymo have been developing and testing autonomous vehicles for years. But in the near term, they have no intention of selling the technology to consumers. They are designing vehicles they hope to deploy in certain cities as ride-hailing services. Think Uber without the drivers.

In each city, they begin by building a detailed, three-dimensional map. First they equip ordinary cars with lidar sensors — “light detection and ranging” devices that measure distances using pulses of light. As company workers drive these cars around the city, the sensors collect all the information needed to generate the map, pinpointing the distance to every curb, median and roadside tree.
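To make that mapping step concrete, here is a minimal sketch, not drawn from any company’s actual software, of how individual lidar returns (an angle and a distance per pulse) could be accumulated into a shared map as a survey car drives. The function and variable names are illustrative assumptions, and the example is simplified to two dimensions.

```python
import math

def lidar_returns_to_map_points(car_x, car_y, car_heading, returns):
    """Convert one lidar sweep into map points in world coordinates.

    `returns` is a list of (angle_radians, distance_meters) pairs measured
    relative to the car. This is an illustrative 2-D simplification; real
    mapping pipelines work in 3-D and correct for the sensor's own motion.
    """
    points = []
    for angle, distance in returns:
        world_angle = car_heading + angle
        px = car_x + distance * math.cos(world_angle)
        py = car_y + distance * math.sin(world_angle)
        points.append((px, py))
    return points

# As the survey car drives, every sweep adds points (curbs, medians, trees)
# to the growing city map.
city_map = []
city_map += lidar_returns_to_map_points(0.0, 0.0, 0.0,
                                        [(0.1, 12.5), (-0.4, 7.9)])
```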

The cars then use this map to navigate roads on their own. They continue to monitor their surroundings using lidar, and they compare what they see with what the map shows, keeping close track of where they are in the world.
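A toy version of that comparison, again an assumption-laden sketch rather than any vendor’s algorithm, is to score candidate positions by how many points in a fresh lidar scan land near points already in the stored map, and keep the best-scoring position:

```python
import math

def match_score(candidate_pose, scan, map_points, tolerance=0.5):
    """Count how many scan points fall near a known map point if the car
    were at `candidate_pose` (x, y, heading). A toy 2-D illustration of
    matching a live scan against a prebuilt map."""
    x, y, heading = candidate_pose
    score = 0
    for angle, distance in scan:
        px = x + distance * math.cos(heading + angle)
        py = y + distance * math.sin(heading + angle)
        if any(math.hypot(px - mx, py - my) < tolerance for mx, my in map_points):
            score += 1
    return score

def localize(scan, map_points, candidate_poses):
    # Keep the candidate pose whose predicted view best agrees with the map.
    return max(candidate_poses, key=lambda pose: match_score(pose, scan, map_points))
```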

At the same time, these sensors alert the cars to nearby objects, including other cars, pedestrians and bicyclists. But they do not do this alone. Additional sensors — including radar and cameras — do much the same. Each sensor provides its own snapshot of what is happening on the road, serving as a check on the others.
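The cross-checking idea can be illustrated with a short sketch. The object identifiers and the two-sensor voting rule below are hypothetical; production systems associate detections by position and track them over time, but the principle of requiring agreement between independent sensors is the same.

```python
def cross_checked_objects(camera_objs, radar_objs, lidar_objs, min_votes=2):
    """Keep only objects reported by at least `min_votes` of the three
    sensor streams, so no single sensor's mistake goes unchecked."""
    votes = {}
    for detections in (camera_objs, radar_objs, lidar_objs):
        for obj_id in set(detections):
            votes[obj_id] = votes.get(obj_id, 0) + 1
    return {obj_id for obj_id, count in votes.items() if count >= min_votes}

# A pedestrian seen by both the camera and the lidar is kept
# even if the radar misses it.
confirmed = cross_checked_objects({"car_12", "pedestrian_3"},
                                  {"car_12"},
                                  {"car_12", "pedestrian_3", "cyclist_7"})
```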

Waymo now offers an automated ride-hailing service in the suburbs of Phoenix, but the roads are wide, pedestrians are few and rain is rare. Expanding into other areas is a painstaking process that involves constant testing and retesting, mapping and remapping. Chris Urmson, the chief executive of the autonomous vehicle company Aurora, said the rollout could take 30 years or more.

Tesla is taking a very different tack. The company and its chief executive, Elon Musk, believe that self-driving cars can navigate city streets without three-dimensional maps. After all, human drivers do not need these maps. They need only eyes.


For years, Tesla has argued that autonomous vehicles can understand their surroundings merely by capturing what a human driver would see as they speed down the road. That means the cars need only one kind of sensor: cameras.

Since its cars are already equipped with cameras, Tesla argues, it can transform them into autonomous vehicles by gradually improving the software that analyzes and responds to what the cameras see. F.S.D. is a step toward that.
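In schematic form, the camera-only approach looks something like the sketch below. The data fields and control rules are invented for illustration, and the perception step that would actually fill them in from camera images is the hard, unsolved part; the point is only that every driving decision flows from what the cameras report, with no lidar map or radar track to cross-check against.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    # What a hypothetical vision network might report from camera frames alone.
    lane_offset_m: float            # how far the car sits from the lane center
    lead_vehicle_distance_m: float  # gap to the car ahead
    speed_limit_kph: float          # from a recognized speed-limit sign

def camera_only_driving_step(scene: Scene):
    """Derive steering and speed targets purely from the camera-based scene."""
    steering_correction = -0.1 * scene.lane_offset_m          # nudge back toward center
    target_speed = min(scene.speed_limit_kph,
                       scene.lead_vehicle_distance_m * 2.0)   # slow when following closely
    return steering_correction, target_speed

print(camera_only_driving_step(Scene(0.3, 25.0, 50.0)))
```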

But F.S.D. has notable limits, said Jake Fisher, senior director of Consumer Reports’ Auto Test Center, who has extensively tested these services. Automatically changing lanes can be enormously stressful and potentially dangerous, for instance, and summoning the car from a parking space works only occasionally.

“These systems are good at dealing with the boring, monotonous stuff,” Mr. Fisher said. “But when things get interesting, I prefer to drive.”

Machines cannot yet reason like a human. Cars can capture what is happening around them, but they struggle to completely understand what they have captured and predict what will happen next.

That’s why other companies are deploying their autonomous cars so slowly. And it is why they equip these cars with additional sensors, including lidar and radar. Radar and lidar can track the speed of nearby objects as well as their distance, giving cars a better sense of what is happening.

Tesla recently removed the radar from its new cars, which now rely solely on cameras, as the company always said they would. During a January earnings call, Mr. Musk said he was “highly confident the car will be able to drive itself with reliability in excess of humans this year.”

This promise rests on a “beta” service, now under test with a limited number of Tesla owners, that aims to automate driving beyond highways. In a March post on Twitter, Mr. Musk estimated that 2,000 people were using the beta, called “Autosteer on city streets.”

But like Autopilot and other F.S.D. services, the beta calls for drivers to keep their hands on the wheel and take control of the car when needed.

Most experts say this is unlikely to change soon. Given the speed of cameras and the limitations in the algorithms that analyze camera images, there are still situations where such a setup cannot react quickly enough to avoid crashes, said Schuyler Cullen, a computer vision specialist who oversaw autonomous driving efforts at the South Korean tech giant Samsung.

With a system that relies solely on cameras, crash rates will be too high to offer the technology on a wide scale without driver oversight, said Amnon Shashua, chief executive of Mobileye, a company that supplies driver-assistance technology to most major carmakers and has been testing technology that is similar to what Tesla is testing. Today, he said, additional sensors are needed.

Tesla was not necessarily wrong to remove the radar from its cars, Mr. Shashua added. There are questions about the usefulness of radar sensors, and Tesla may have seen an opportunity to remove their cost. But that does not mean the company can reach full autonomy solely with cameras. The technology needed to do this safely and reliably does not yet exist.

“That approach, in my opinion, will never work,” Dr. Cullen said.
