Tesla’s recall of 2 million cars relies on a fix that may not even work: Tesla agreed to the recall last week after a federal investigation found the system to monitor drivers was defective and required a fix.

  • Voroxpete@sh.itjust.works (+68 · edited · 1 year ago)

    Tesla’s website says that Autopilot and more sophisticated “Full Self Driving” software cannot drive themselves

    Full self driving

    Cannot drive themselves

    Christ I can smell the bullshit all the way from Canada.

  • Fapper_McFapper@lemmy.world (+40/−5 · 1 year ago)

    Here I am hoping that Tesla, Twitter, SpaceX, and any other brand associated with Elon Musk burn to the fucking ground. Burn baby burn, show this wannabe emperor that he’s wearing nothing at all.

      • Beetschnapps@lemmy.world (+21 · 1 year ago)

        He got the chance to cancel a cross-state rail project because he’d rather crowd cars into a tunnel with no fire exit than let people take a fucking train.

        https://time.com/6203815/elon-musk-flaws-billionaire-visions/

        “Musk later admitted to his biographer that he had never planned to build a Hyperloop system in California, and primarily promoted it in order to prevent conventional HSR proposals from breaking ground.”

      • barsoap@lemm.ee (+3 · 1 year ago)

        Herrenknecht employees probably still make jokes about how he was going to “improve” the boring machine they built that he bought.

  • ExLisper@linux.community (+14/−2 · 1 year ago)

    So they are pretty much trying to figure out how to make sure the driver is paying attention to the road? IDK, maybe make the car respond to the steering wheel so that the driver has to move it or the car will not turn? That would ensure the driver is actually looking at the road.

    Alternatively, ask them questions about the surroundings. “Driver, what state is the car in front of you from? You have 3 seconds to answer or FSD will be disabled.”
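    To see why a steering-input check is such a weak attention signal, here’s a toy sketch (purely illustrative; the threshold value and function names are made up, not Tesla’s actual logic). Any constant small torque, such as a weight hung on the wheel, looks identical to an attentive driver:

    ```python
    # Toy model of a hands-on-wheel check based on steering torque alone.
    # TORQUE_THRESHOLD_NM is a hypothetical value for illustration.

    TORQUE_THRESHOLD_NM = 0.3  # assumed minimum torque treated as "hands on"

    def hands_on_wheel(torque_nm: float) -> bool:
        """Return True if the measured wheel torque passes the check."""
        return abs(torque_nm) >= TORQUE_THRESHOLD_NM

    # An attentive driver and a defeat device are indistinguishable here:
    print(hands_on_wheel(0.5))  # driver nudging the wheel -> True
    print(hands_on_wheel(0.5))  # weight clipped to the wheel -> True
    print(hands_on_wheel(0.0))  # hands off entirely -> False
    ```

    The check can only tell torque from no torque; it says nothing about where the driver’s eyes are, which is why researchers in the article favor camera-based eye monitoring instead.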

    • abhibeckert@lemmy.world (+11/−1 · edited · 1 year ago)

      Just because a driver has their hands on the wheel doesn’t mean they’re watching the road. They might be watching a movie.

      As for asking about number plates - that sounds like a distraction that would cause accidents rather than prevent them.

      For me these systems need to be really clear. Either the person is driving, in which case they are fully responsible for every crash, or the car is driving, in which case the car is fully responsible. There’s no room for any grey area in the middle.

      In my opinion Tesla should be forced to refund anyone who was told their car has “full self driving”. I’m OK with autopilot though, since the airplane and boat version of that feature has always pretty much been “just keep going in a straight line until a human disengages autopilot”.

      • ExLisper@linux.community (+2 · 1 year ago)

        Asking questions was obviously a joke.

        As for the rest, I don’t know what it would take to make sure the driver is paying attention. Distracted driving is the most common cause of accidents, so clearly even in normal cars we can’t be sure drivers are paying attention. I think we can agree cruise control is generally good, but I have no idea what happens once the car has lane following. Is it the same? Do you focus on the road more? Or do you stop paying attention completely? It’s a question for scientists, really. Someone has to test it rigorously before it’s actually added to cars. My feeling is that once you don’t have to drive yourself (as in steer and brake) you eventually stop paying attention, so yeah, either the car drives itself 100% or you drive.

      • alienangel@sffa.community (+1 · 1 year ago)

        Note: it’s not NHTSA’s or any other agency’s responsibility to figure out a solution for Tesla. They just need to set the bar for safety and tell Tesla, “Make it as safe as full low-light eye tracking, with whatever solution you want. But if you can’t make it at least that safe, your cars shouldn’t be allowed back on the roads.”

        I was the biggest cheerleader for self-driving cars because I hate driving — but “our best self-driving car still can’t self-drive at all” isn’t good enough, and letting them keep doing half-assed shit like this does more harm than good to bringing people around to the technology.

  • serial_crusher@lemmy.basedcount.com (+4 · 1 year ago)

    A better solution, experts say, would be to require Tesla to use cameras to monitor drivers’ eyes to make sure they’re watching the road. Some Teslas do have interior-facing cameras. But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

    In case you were wondering who wrote the article

  • AutoTL;DR@lemmings.world (bot) (+4 · 1 year ago)

    This is the best summary I could come up with:
    Tesla’s recall of more than 2 million of its electric vehicles — an effort to have drivers who use its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.

    But research conducted by NHTSA, the National Transportation Safety Board and other investigators show that merely measuring torque on the steering wheel doesn’t ensure that drivers are paying sufficient attention.

    “I do have concerns about the solution,” said Jennifer Homendy, the chairwoman of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor trailers.

    Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said it’s widely accepted by researchers that monitoring hands on the steering wheel is insufficient to ensure a driver’s attention to the road.

    But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

    Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that weren’t controlled access highways while testing a Tesla Model S that received the software update.


    The original article contains 1,028 words, the summary contains 212 words. Saved 79%. I’m a bot and I’m open source!

  • PatFusty@lemm.ee (+3/−26 · 1 year ago)

    This is so fucking stupid it actually makes me mad. A tiny percentage of people died misusing the feature and now Tesla is forced to upgrade people to a technology that doesn’t exist yet??? For free??? Holy shit this is dumb. Tesla should just relabel it to “auto assist” or something.

      • PatFusty@lemm.ee (+2/−10 · 1 year ago)

        Buyer’s remorse over getting a free upgrade? Cope that you can’t get free things. There are 42,000 deaths a year from regular cars, while Tesla’s count over 10 years is like 400 total.