• bufalo1973@lemmy.ml · 2 months ago

    The funny part will be once the car doesn’t have a driver and is fully autonomous. If the car kills someone, who’s to blame?

    • Glytch@lemmy.world · 2 months ago

      The company that rented it to you, because fully self-driving cars won’t be for private ownership; they’ll just replace rideshare drivers.

      • explodicle@sh.itjust.works · 2 months ago

        Who’s to say that will be immediate? Many people won’t be quick to abandon their guaranteed-available vehicle, especially while every house and employer has parking.

        • Sizzler@slrpnk.net · 2 months ago

          OK, so ten years then. In that time nearly all average family cars will be smart: they’ll have self-driving (they can come pick you up). There will be a few years of insurance claims and premiums showing they aren’t responsible for 99% of crashes, and insurers will react accordingly, pushing up the premiums of the last holdouts so far that it becomes uneconomical for the average person to drive “manual”.

          • explodicle@sh.itjust.works · 2 months ago

            It sounds like we’re assuming a similar adoption curve and are just using terms differently. In those intermediate years while insurance is reacting, if the driverless car kills someone, who’s to blame?

    • Schadrach@lemmy.sdf.org · 2 months ago

      You treat it like any other traffic accident, except that if a self-driving car is responsible, that responsibility lies with the vehicle’s owner.

      • Wogi@lemmy.world · 2 months ago

        It would have to be the manufacturer.

        If someone steals your car and kills someone with it, then disappears without ever being identified, the car owner doesn’t assume liability. Liability falls on whoever was operating it at the time. If software was driving, then the software company assumes the liability.

        • Schadrach@lemmy.sdf.org · 2 months ago

          Doubt it. I mean, any self-driving car is going to make the driver agree to responsibility for what the car does and ensure the user has a manual override available, just in case.

          No company is going to ship fully autonomous driving software (for example, for fully autonomous driverless taxis) without contractually making the fleet owner responsible for their fleet cars.

        • explodicle@sh.itjust.works · 2 months ago

          But you bought the driverless car and turned it on. You never agreed to the thief’s joyride. Where do you draw the line for “operation”: is it like operating a steering-assist car, or like operating a Roomba?

      • bufalo1973@lemmy.ml · 2 months ago

        It’s not the same. When you have a dog, you use a leash and, if needed, you can muzzle it.

        In this case you are not in control. And you can’t be. You are just a passenger. And you should have the same responsibility as a passenger on a train: none.

        • boatsnhos931@lemmy.world · 2 months ago

          I didn’t know about your parameters. I would think your example drives it home: no car should ever be fully autonomous, and it should have a “leash” that a human could “restrain” the car with if necessary. Is that no good?

    • supercriticalcheese@lemmy.world · 2 months ago

      Whichever party was at fault, in my non-lawyer opinion.

      What kind of penalty you apply to a self-driving car found guilty of causing an accident is a good question, though.

      • bufalo1973@lemmy.ml · 2 months ago

        I guess it would be the car maker’s responsibility if you are only a passenger in the car.