• Darkassassin07@lemmy.ca · 7 hours ago

      It would, but he explicitly says ‘without even a slight tap on the brakes’ in the YouTube video.

      Then:

      Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas.

      - Mark Rober, on Twitter

      • billwashere@lemmy.world · 6 hours ago

        He did state that he hit the brakes. Just on the fog one, not the wall. 🤷

        But the fact that FSD disengaged 17 frames before the crash also implies the software, along with the car, crashed 😂 I’d love to see the logs. I imagine the software got real confused real quick.

        • LeninOnAPrayer@lemm.ee · 5 hours ago

          This is likely what happened. The software hit an invalid state that defaulted to disengaging the autopilot feature. It likely hit an impossible state because the camera could no longer reconcile the “wall” image of the road with the actual road once it got too close.

          Likely an error condition occurred that would happen in the same way if an object suddenly covered the camera while driving. It would be even more concerning if the autopilot DIDN’T disengage at that point.
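
          Purely as an illustration of the fail-safe pattern being described here (not Tesla’s actual code), a minimal sketch, assuming a hypothetical perception module that reports whether its scene estimate is still internally consistent plus a confidence score:

          ```python
          # Illustrative only: a generic fail-safe disengagement pattern,
          # NOT Tesla's actual software. All names here are hypothetical.
          from dataclasses import dataclass

          @dataclass
          class PerceptionState:
              consistent: bool   # does the scene interpretation still hold together?
              confidence: float  # 0.0 - 1.0, how much the planner should trust it

          MIN_CONFIDENCE = 0.5

          def disengage_and_alert_driver() -> None:
              # In a real system this would hand control back and sound a takeover alarm.
              print("Autopilot disengaged: perception invalid, driver must take over")

          def autopilot_step(perception: PerceptionState, engaged: bool) -> bool:
              """Return whether autopilot stays engaged after this control tick."""
              if not engaged:
                  return False
              # If the model can no longer reconcile what it sees (e.g. a "road"
              # texture filling the frame with no matching depth), fall back to
              # the driver instead of acting on garbage input.
              if not perception.consistent or perception.confidence < MIN_CONFIDENCE:
                  disengage_and_alert_driver()
                  return False
              return True
          ```

          Hitting a branch like that a few frames before impact would show up in the logs as exactly the kind of last-moment disengagement being discussed above.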

      • LeninOnAPrayer@lemm.ee · 5 hours ago

        The software likely disengaged because the wall was so close that it could no longer function. It would be the same as if an object suddenly covered up the camera while you were driving: it would turn off Autopilot/FSD or whatever they call it. (At least I hope it would.)