A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

    • ayyy@sh.itjust.works · 17 hours ago

      To put your number into perspective: if it failed only once every hundred miles, it would kill you multiple times a week at the average commute distance.
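      Roughly, assuming a ~32-mile round-trip commute and five commute days a week (both assumptions for illustration, not figures from the thread), a minimal sketch of that arithmetic:

      ```python
      # Back-of-the-envelope: a 1-in-100-miles failure rate against an
      # ordinary commute. All inputs are assumptions for illustration.
      failure_rate_per_mile = 1 / 100   # hypothetical: one failure per 100 miles
      round_trip_miles = 32             # assumed average US round-trip commute
      commute_days_per_week = 5

      weekly_miles = round_trip_miles * commute_days_per_week
      expected_failures = weekly_miles * failure_rate_per_mile
      print(f"{weekly_miles} miles/week -> {expected_failures:.1f} expected failures/week")
      # 160 miles/week -> 1.6 expected failures/week
      ```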

      • KayLeadfoot@fedia.io (OP) · 16 hours ago

        Someone who doesn’t understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not credibly demonstrated non-negative value.

        • bluewing@lemm.ee · edited 48 minutes ago

          You are trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives. For automotive travel, the alternative to FSD is to continue to have everyone drive manually, and it turns out most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do driving manually. And the last time I saw any numbers on that, FSD was statistically that “bit better” than you (see the sketch below).

          FSD isn’t perfect. No such system will ever be perfect. But the goal isn’t perfection; it just needs to be better than you.
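          To make that comparison concrete, a minimal sketch; both rates are invented purely for illustration, since real figures are contested and depend heavily on road mix and reporting:

          ```python
          # The "just needs to beat the average human" claim as a ratio
          # of crash rates. Neither number is a real measurement.
          human_crashes_per_million_miles = 2.0  # assumed average manual driver
          fsd_crashes_per_million_miles = 1.5    # assumed supervised FSD

          ratio = human_crashes_per_million_miles / fsd_crashes_per_million_miles
          print(f"Under these assumptions, FSD is {ratio:.2f}x safer per mile")
          ```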

      • NιƙƙιDιɱҽʂ@lemmy.world · 12 hours ago

        …It absolutely fails miserably fairly often, though, and would likely crash that frequently without human intervention. Not to the extent seen here, where there isn’t even time for human intervention, but I frequently had to take over when I used to use it (post v13).

      • Echo Dot@feddit.uk · 16 hours ago

        Even with the distances I drive (and I barely drive my car anywhere since covid), I’d probably last only about a month before the damn thing killed me.

        Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.

    • Echo Dot@feddit.uk · 16 hours ago

      That’s probably not the actual failure rate, but a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.

      Let’s say it’s only a 0.01% risk; that’s still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to the things they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.
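      To sanity-check that fleet-level arithmetic, a minimal sketch, assuming the fleet logs on the order of 50 million FSD miles a year (an assumption chosen only to illustrate the scale, not a real figure):

      ```python
      # How a "0.01% per mile" risk scales across a whole fleet.
      failure_rate_per_mile = 0.0001          # 0.01% chance of a crash per mile
      fleet_fsd_miles_per_year = 50_000_000   # assumed annual FSD miles, fleet-wide

      expected_crashes = fleet_fsd_miles_per_year * failure_rate_per_mile
      print(f"{expected_crashes:,.0f} expected crashes per year")  # -> 5,000
      ```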

      It wouldn’t be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they’re never going to add lidar scanners, so it’s literally never going to get any better; it’s always going to be this bad.

      • FreedomAdvocate@lemmy.net.au · 5 hours ago

        Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.

        The biggest thing you’re missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it’s in FSD Supervised mode doesn’t mean you should just sit back and watch it drive you off the road into a lake.

        • Echo Dot@feddit.uk · 1 hour ago

          You’re saying this on a video where it drove into a tree and flipped over. There isn’t time for a human to react; that’s like saying we don’t need emergency stops on chainsaws because the operator just needs to not drop it.

      • KayLeadfoot@fedia.io (OP) · 16 hours ago

        …is literally never going to get any better; it’s always going to be this bad.

        Hey now! That’s unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won’t know which until it tries to kill you in new and unexpected ways :j