• rsuri@lemmy.world · 11 months ago

    Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”

    Tesla’s terminology is so confusing. If “Autopilot” isn’t self-driving technology, does that mean it’s different from “Full Self Driving”? And if so, is “Full Self Driving” also not a self-driving technology?

    • Buffalox@lemmy.world · 11 months ago

      I heard Elon Musk call it: “Assisted full self driving”. Which doesn’t make any sense. LOL

    • anlumo@lemmy.world · 11 months ago

      The term autopilot comes from aviation, where the only kind of problem resolution an autopilot does is turning itself off.

      Other than that, it just flies from checkpoint to checkpoint.

      • machinin@lemmy.world · 11 months ago

        If only we could implement testing protocols similar to the aviation ones to validate its safety!

      • Captain Aggravated@sh.itjust.works · 11 months ago

        Depends on the autopilot. There are some that are as rudimentary as a “wing leveler.” They only have control of the ailerons and can level the wings and maybe make turns. Other systems have control of all three major control axes and are integrated with the navigation systems so they can do things like climb to an altitude and level off, turn to a heading, or even fly holds and approaches.

        They do require training on the part of the pilot to use in flight.
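        Those tiers can be sketched as a toy model (purely illustrative; the tier names and the comparison are invented here, not any real avionics API):

```python
from enum import IntEnum

class AutopilotTier(IntEnum):
    """Toy model of the autopilot tiers described above (names invented)."""
    WING_LEVELER = 1   # ailerons only: level the wings, maybe make turns
    THREE_AXIS = 2     # all three control axes: hold altitude, turn to a heading
    NAV_COUPLED = 3    # integrated with navigation: can fly holds and approaches

def can_fly_approach(tier: AutopilotTier) -> bool:
    """Only a navigation-coupled autopilot can fly holds and approaches."""
    return tier >= AutopilotTier.NAV_COUPLED

print(can_fly_approach(AutopilotTier.WING_LEVELER))  # False
print(can_fly_approach(AutopilotTier.NAV_COUPLED))   # True
```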

        • anlumo@lemmy.world · 11 months ago

          Yeah, but even the best ones would happily crash into a mountain if the pilots don’t set their altimeters properly (and ignore the terrain warnings).

            • anlumo@lemmy.world · 11 months ago

              Hard to say, it might depend on the plane model. I’ve heard that Boeing 777 autopilots are really snarky.

    • Thorny_Insight@lemm.ee · 11 months ago

      Autopilot is a more basic driver-assist system than FSD. FSD is what will eventually become what the name suggests, but it’s obviously not there yet and everyone knows this. It’s just the name of the system.

      • sugar_in_your_tea@sh.itjust.works · 11 months ago

        Those are really crappy names. How about “driver assist” and “supervised self driving”? Drop the “supervised” once they’re ready to market it as real self driving.

        • Thorny_Insight@lemm.ee · 11 months ago

          FSD is called Full Self-Driving (Supervised) nowadays.

          Autopilot can be seen as a misleading term, but that has more to do with people not understanding what autopilot on airplanes actually does, which is quite similar to what it does on Teslas as well.

          • michaelmrose@lemmy.world · 11 months ago

            Autopilot isn’t being marketed to aviation enthusiasts, nor is a Tesla a plane, so it doesn’t matter how autopilot in planes works; what matters is the perception. They could have used a more appropriate term like “advanced cruise control”.

      • michaelmrose@lemmy.world · 11 months ago

        FSD is just a lie, because it’s a description of a product they intend to develop, not something that exists on the car you are buying now.

          • Honytawk@lemmy.zip · 11 months ago

            The car being able to get to its destination using the public road network without a single person in it, while fully complying with the law and road safety.

            • Thorny_Insight@lemm.ee · 11 months ago

              Well, the current version of FSD can do that. It’s just not approved for unsupervised driving (level 3), so the driver still needs to be there, ready to take over at any moment. The current version near-perfectly mimics a human driver. I highly recommend checking the YouTube reviews of version 12. It’s quite impressive.

              • racemaniac@lemmy.dbzer0.com · 11 months ago

                Yes, it can do that. Occasionally. And then it’ll randomly fail in the stupidest ways.

                And I’ve actually looked at some Tesla FSD reviews, and every review seems to be of the “two steps forward, two steps back” kind: look at all these things that improved, followed by a mention of all the things that used to work and are now broken again (of course with a lot more focus on the improvements, since hype pays).

                I’m honestly wondering how self-driving will evolve; it seems we’ve landed in the really hard last 10% of getting there, and progress has mostly come to a standstill.

          • machinin@lemmy.world · 11 months ago

            The one where Tesla is responsible if there is an accident (but this user blocks people critical of Tesla, so probably won’t see this message).

      • bitchkat@lemmy.world · 11 months ago

        Specifically, Autopilot is lane keeping plus traffic-aware cruise control (it will slow down if you’re going faster than the car in front). FSD adds automatic lane changes (it can do them by itself, or the driver can initiate them with the turn signal) and makes the turns necessary to follow navigation. It does a pretty decent job on freeways.

        What they are working on now is getting FSD to work better on city streets and secondary highways.
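        The slow-down behavior described above can be sketched as a toy rule (a minimal illustration, not Tesla’s actual controller; the function name and the 40 m gap threshold are invented):

```python
from typing import Optional

def target_speed(set_speed: float, lead_speed: Optional[float],
                 gap_m: Optional[float], min_gap_m: float = 40.0) -> float:
    """Toy traffic-aware cruise control: hold the driver's set speed,
    but match a slower lead car once the gap closes below min_gap_m."""
    if lead_speed is None or gap_m is None:
        return set_speed   # no car ahead: cruise at the set speed
    if gap_m < min_gap_m and lead_speed < set_speed:
        return lead_speed  # closing on a slower car: match its speed
    return set_speed

# Set to 70 mph, but a car 30 m ahead is doing 55 mph:
print(target_speed(70.0, 55.0, 30.0))  # 55.0
```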

  • istanbullu@lemmy.ml · 11 months ago

    You can’t call something Full Self Driving or Autopilot and then blame the driver. If you want to blame the driver, then call it driver assist.

    • kingthrillgore@lemmy.ml · 11 months ago

      Right! That’s why you have the FSD turn it over to the driver the moment a crash is unavoidable to make the driver liable.

      • pyre@lemmy.world · 11 months ago

        “at the time of the crash, the driver was in full control”

        (but not a couple seconds before)

  • AutoTL;DR@lemmings.world (bot) · 11 months ago

    This is the best summary I could come up with:


    SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.

    Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.

    Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly improve misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”

    The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.

    In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.

    Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal crash in Delray Beach, Fla., in 2019 when Jeremy Banner and his Tesla in Autopilot failed to register a semi truck crossing its path.


    The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I’m a bot and I’m open source!

    • NeoNachtwaechter@lemmy.world · 11 months ago

      Even when the driver is fully responsible, the assistance software must work properly in all situations. And it must be fully tested.

      If the software makes severe mistakes without warning, normal drivers may not have a chance to regain control. Normal drivers are not trained test drivers.

      • hoshikarakitaridia@lemmy.world · 11 months ago

        My morality says both are accountable: the driver, and Tesla. Tesla for damage caused by their system, and the driver if he does not retake control of the vehicle when given the chance.

        • umami_wasabi@lemmy.ml · 11 months ago

          But does the driver have a reasonable chance with adequate timeframe to regain control?

          Like what happened in the Boeing 737 Max MCAS incidents: Boeing expected the pilot to disengage the trim motor in a mere 4 seconds, which according to a pilot is “a lot to ask in an overwhelming situation”, or something similar.

          Normal people in a soon-to-crash situation are likely to freeze for a second or two as the fear kicks in. How the driver reacts next is hard to predict. Yet at the speed most US drivers love to go (I saw that 70+ mph on the freeway is the norm), the time available for them to make a well-thought-out decision is, I’d guess, quite short.
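          To put a rough number on that (a back-of-the-envelope calculation, assuming a 2-second freeze):

```python
MPH_TO_MPS = 0.44704  # metres per second per mile per hour

def distance_during_freeze(speed_mph: float, freeze_s: float) -> float:
    """Distance covered (in metres) while the driver is still frozen."""
    return speed_mph * MPH_TO_MPS * freeze_s

# At 70 mph, a 2-second freeze covers roughly 63 metres of road:
print(round(distance_during_freeze(70, 2)))  # 63
```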

          • hoshikarakitaridia@lemmy.world · 11 months ago

            You made me think about this for a second.

            In my head, the reason is not specifically to punish the driver, but to make drivers always be aware and ready to take control again. Yes, 100 people will have 1000 different ways to react to such a software error, but you need people to pay attention, and in law the only way is punishment. Obviously this needs to be well calibrated, but either you have multiple lines of defense (the software, the driver, maybe even additional safety features) or you have to remove the autonomous system.

            • michaelmrose@lemmy.world · 11 months ago

              For practical purposes it doesn’t matter: you can’t make people pay attention as if they were driving without the actual engagement of driving. There is going to be a delay in taking over, and in a lot of cases it will be too late by the time the human is effectively in control.

        • NeoNachtwaechter@lemmy.world · 11 months ago

          Imagine you are going along a straight road, not too much traffic, the speed limit is high and you are enjoying it. Suddenly your assistant software decides to turn your steering wheel hard to the left.

          You will have no chance.

          What have you done wrong? What is it what you are accountable for?

          • AA5B@lemmy.world · 11 months ago

            For mine:

            • there’s feedback to ensure you’re alert, touching the wheel every once in a while
            • when it made me nervous, it was drifting to the right or slowing, not suddenly moving anywhere

            So did the car think there was an impending collision? That should be obvious in the logs, and it’s the only reason for sudden maneuvers.

      • AA5B@lemmy.world · 11 months ago

        The article keeps calling it “Autopilot”, which is different from “Full Self Driving”.

        If they are correct, then it’s all on the driver. Autopilot is just a nicer adaptive cruise control and should be treated as such. Many cars have one, even non-smart vehicles. Even my seven-year-old Subaru had something similar (much dumber, but similar).

        That being said, people seem to confuse the names of these different functionalities all the time, including throughout this thread. However, even if they were confused and meant FSD, my car has feedback that requires your hands on the wheel, so I don’t understand how you can claim ignorance.

        • NeoNachtwaechter@lemmy.world · 11 months ago

          The article keeps calling it “Autopilot”, which is different from “Full Self Driving”. If they are correct, then […]

          No. That difference is meaningless, since both systems provide level 2 autonomy. The responsibilities are exactly the same.