• snooggums
    19 · 15 days ago

    Paraphrasing:

    “We only have the driver’s word they were in self driving mode…”

    “This isn’t the first time a Tesla has driven onto train tracks…”

    Since it isn’t the first time, I’m gonna go ahead and believe the driver, thanks.

    • Pika
      12 · 15 days ago

      Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-drive mode or not when it went onto the track. So the fact that they didn’t go public saying it wasn’t means that it was in self-drive mode, and they want to save face and limit liability.

      • @IphtashuFitz@lemmy.world
        6 · 15 days ago

        I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN of my Tesla, and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.

          • Pika
            2 · 15 days ago

            Dude, in today’s world we’re lucky if the data stops at the manufacturer. I know of a few insurers that have contracts with major dealers and just automatically get the data that’s logged by the cars’ systems, so they can make better decisions regarding people’s car insurance.

            Nowadays it’s a red flag if you sign up with a car insurer and they don’t offer you a discount for putting on something like drive pass, which logs your driving, because it probably means your car is already sending them that data.

            • @CmdrShepard49@sh.itjust.works
              1 · 14 days ago

              We just got back from a road trip in a friend’s '25 Tundra, and it popped up a TPMS warning for a faulty sensor; minutes later he got a text from the dealership telling him about it and asking him to bring it in for service.

    • @Mouselemming@sh.itjust.works
      1 · edited · 15 days ago

      Since the story has 3 separate incidents where “the driver let their Tesla turn left onto some railroad tracks,” I’m going to posit:

      Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.

      Prove me wrong, Tesla

    • @XeroxCool@lemmy.world
      1 · 15 days ago

      The ~2010 runaway-Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, outright lying to evade a speeding ticket, etc. accounted for many of the cases.

      Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute, when they invade the driver’s privacy and release the driving events.

      • snooggums
        1 · 15 days ago

        Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.

        • @BlueLineBae@midwest.social
          1 · edited · 15 days ago

          I have no sources for this, so take it with a grain of salt… but I’ve heard that Tesla turns off self-driving just before an accident so they can say it was the driver’s fault. Now in this case, if it was on while the car drove onto the tracks, I would think that would prove it’s Tesla’s faulty self-driving plus human error for not correcting it. Either way, it would prove Tesla partly at fault if it was on at the time.

          • snooggums
            0 · 15 days ago

            On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF turning onto the tracks didn’t involve a drop-off as deep as the rails. Someone caught off guard isn’t going to be able to steer a passenger car off the tracks, because the rails are tall and there’s no angle the wheels can get to climb over them.

            So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, once even the front wheels were on them because the driver wasn’t fast enough, it may have been impossible to recover, and going forward might have been their best bet. It depends on how the crossing is built.

            • @ayyy@sh.itjust.works
              0 · 15 days ago

              If you’re about to be hit by a train, driving forward through the barrier is always the correct choice. It will move out of the way and you stay alive to fix the scratches in your paint.

    • @TheKingBee@lemmy.world
      0 · edited · 15 days ago

      Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

    • @NuXCOM_90Percent@lemmy.zip
      0 · 15 days ago

      I mean… I have seen some REALLY REALLY stupid drivers, so I could totally see multiple people thinking they found a shortcut, or not realizing the road they’re supposed to be on is 20 feet to the left and that there’s a reason their phone is losing its shit, all while their suspension is getting destroyed.

      But yeah, it is the standard Tesla corp MO. They detect a dangerous situation and disable all the “self driving”. Obviously because it is up to the driver to handle it, and not because they want the legal protection to say it wasn’t their fault.

      • @AA5B@lemmy.world
        1 · 15 days ago

        At my local commuter rail station, the entrance to the parking lot is immediately next to the track. It’s easily within the margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.

        There are plenty of cues, so drivers shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them since it’s a bit of an outlier.

        That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite on the GPS, and I missed any cues there may have been.

        • @XeroxCool@lemmy.world
          1 · 14 days ago

          Sounds reasonable to mix up dirt roads at a campsite. Idk why the other commenter had to be so uptight. I get the mix-up in the lot if it’s all paved and smooth, especially if, say, you make a left into the lot and the rail has a pedestrian crossing first. It shouldn’t happen, but there’s significant overlap in the appearance of the ground. The average driver is amazingly inept, inattentive, and remorseless.

          I’d be amused if your lot is the one I know of where the train pulls out of the station, makes a stop for the crosswalk, then proceeds to just one other station.

          But the stretch of rail that isn’t paved between crossings? That should always be identifiable as a train track. I can’t understand when people just send it down the tracks. And yet it still happens, even at the station mentioned above, where they pulled onto the 100 mph section. Unreal.

  • Eww
    10 · 14 days ago

    Teslas have a problem with the lefts.

  • @J52@lemmy.nz
    4 · 14 days ago

    Hope no one was hurt, regardless of whether they were stupid, distracted, or whatever! If we can’t build fail-safes into cars, what are our chances with real AI?

    • @danhab99@programming.dev
      2 · 14 days ago

      Okay I don’t want to directly disagree with you I just want to add a thought experiment:

      If it is a fundamental truth of the universe that a human literally cannot program a computer to be smarter than a human (because of some Neil deGrasse Tyson-esque interpretation of entropy), then no matter what, AIs will crash cars as often as real people do.

      And the question of who is responsible for an AI’s actions will always come back to a person, because people can take responsibility and AIs are just machine tools. This basically means there is a ceiling on how autonomous self-driving cars will ever be (because someone will have to sit at the controls, ready to take over), and I think that is a good thing.

      Honestly, I’m in the camp that computers can never truly be “smarter” than a person in all respects. Maybe you can max out an AI’s self-driving stats, but then you’ll have no points left over for morality; or you can balance the two, and it might just get into less morally challenging accidents more often ¯\_(ツ)_/¯. There are lots of ways to look at this.

      • @mojofrododojo@lemmy.world
        1 · edited · 14 days ago

        a human can literally not program a computer to be smarter than a human

        I’d add that a computer vision system can’t integrate new information as quickly as a human, especially when limited to vision-only sensing, which Tesla is strangely obsessed with even as the cost of other sensors is dropping and their utility has been proven by Waymo’s excellent record.

        All in all, I see no reason to attempt to replace humans when we have billions. This is doubly so for ‘artistic’ ai purposes - we have billions of people, let artists create the art.

        show me an AI driven system that can clean my kitchen, or do my laundry. that’d be WORTH it.

    • @MBech@feddit.dk
      3 · 15 days ago

      I’m not sure I’d be able to sleep through driving on railroad tracks. I’m going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn 2-ton death machine themselves.

    • @Darleys_Brew@lemmy.ml
      2 · 15 days ago

      I was gonna say it’s not so much the fact that the car was hit by a train, but that it turned onto the tracks… but 40 or 50 feet?

      • @NotMyOldRedditName@lemmy.world
        2 · 15 days ago

        Cop: WTF happened here?

        Driver: It drove itself onto the tracks

        Cop: Okay, but what about the other 49 feet of the 50 feet it’s on the tracks?

        Driver: …

  • @altphoto@lemmy.today
    2 · 14 days ago

    Honey, are those train tracks? … Yes, looks like we’ll turn left onto the tracks for half a mile. It’s a detour.

    • NιƙƙιDιɱҽʂ
      1 · edited · 14 days ago

      Yeaaaaah, I mean fuck Tesla for a variety of reasons, but right here we’re looking at a car that drove itself onto a set of train tracks, continued down the train tracks, and the people inside did…nothing? Like, they grabbed their shit and got out when it got stuck. The car certainly should not have done this, but this isn’t really a Tesla problem. It’ll definitely be interesting when robotaxis follow suit though.

      • @stephen01king@lemmy.zip
        1 · edited · 11 days ago

        This is unironically something I’ve heard people argue about public transport in general: that it’s a tool to control people’s movement.

  • nanook
    1 · 14 days ago

    @Davriellelouna I am sure it was all monitored in real time and a revised algorithm will be included in a future update.

  • @XeroxCool@lemmy.world
    1 · 15 days ago

    If only there was a way to avoid the place where trains drive.

    I checked first: they didn’t make a turn at a crossing; it turned onto the tracks. Jalopnik says there’s no official statement that it was actually driving under FSD(elusion), but if it was strictly under human driving (or FSD turned itself off after driving off), I guarantee Tesla will invade the driver’s privacy and slander them by the next day for the sake of the court of public opinion.