Victim in critical condition

  • @[email protected]
    link
    fedilink
    English
    -1
    1 year ago

    “Do nothing” is usually not that bad an approach

    That doesn’t get any truer even if you repeat it a few more times.

    The truth is that a general approach was not sufficient here. This car's programming was NOT good enough. It made a bad decision with bad consequences.

    • FaceDeer
      link
      fedilink
      0
      edit-2
      1 year ago

      And “no it isn’t” isn’t a very convincing argument to the contrary.

      Yes, in this particular case, maybe the car should have moved a bit. I’m talking about the general case. What are the odds that a car happens to come to a stop with its wheel exactly on top of someone’s limb, versus having that wheel finish up somewhere near the person where further movement might cause additional harm? And how can the car know which situation it’s currently in?

      • @[email protected]
        link
        fedilink
        English
        5
        edit-2
        1 year ago

        What are the odds

        Wrong question.

        If you want autonomous cars outside in the real world (as opposed to artificial lab and test scenarios), then they have to deal with real world situations. This situation has happened in reality. You don’t need to ask about odds anymore.

        how can the car know which situation it’s currently in?

        That is an engineering question. A good one. And, again, one of those that should have been solved before they let this car out into the real world.
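
        Purely to illustrate what that engineering question might look like: a minimal sketch of a post-impact decision rule, in Python, assuming hypothetical perception outputs (PedestrianZone, wheel_on_person) that a real system would have to estimate from its sensors. It is not how any actual vehicle is programmed.

        ```python
        from dataclasses import dataclass
        from enum import Enum, auto

        # Hypothetical classification of where a detected person is relative to the car.
        class PedestrianZone(Enum):
            CLEAR = auto()          # nobody near the vehicle footprint
            NEAR_WHEEL = auto()     # person adjacent to a wheel path
            UNDER_VEHICLE = auto()  # person (or limb) beneath the chassis

        @dataclass
        class PostImpactState:
            zone: PedestrianZone
            wheel_on_person: bool   # e.g. load sensors suggest a wheel is resting on someone

        def post_impact_action(state: PostImpactState) -> str:
            """Choose the most conservative motion once a collision has been detected."""
            if state.wheel_on_person or state.zone is PedestrianZone.UNDER_VEHICLE:
                # Moving either way could cause more harm than staying put,
                # so hold still and hand the decision to a human.
                return "hold position, request human assessment"
            if state.zone is PedestrianZone.NEAR_WHEEL:
                return "remain stopped, hazard lights on"
            return "pull over when clear"

        # Example: a person is detected under the vehicle, so the car holds position.
        print(post_impact_action(PostImpactState(PedestrianZone.UNDER_VEHICLE, wheel_on_person=False)))
        ```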

        • FaceDeer
          link
          fedilink
          2
          1 year ago

          This situation happened, yes. Do you think this is the only time an autonomous car will ever find itself straddling a pedestrian and needing to decide which way to move its tires to avoid running over their head? You can't just grab one very specific case and tell the car to treat every situation as if it were identical to that, when most cases are probably going to be quite different.