• Concerns rise as Neuralink fails to provide evidence of its brain implant’s success, raising questions about safety and transparency.

• Controversy surrounds Neuralink’s lack of data on surgical capabilities and alarming treatment of monkeys with brain implants.

• While Neuralink touts achievements, experts question true innovation and highlight developments in other brain implant projects.

  • @LibertyLizard@slrpnk.net
    62
    edit-2
    9 months ago

    Teslas are already directly dangerous to Musk’s customers, but our society is numb to traffic violence, so people don’t care as much as they should. But “full self-driving” has already killed people.

    Edit: removed “a lot” because while I suspect it is true, it remains unproven.

    • @Thorny_Insight@lemm.ee
      19
      edit-2
      9 months ago

      “full self-driving” has already killed a lot of people.

      There’s only one death linked to the FSD beta, and even in that case the driver was drunk.

      In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology

      Von Ohain and Rossiter had been drinking, and an autopsy found that von Ohain died with a blood alcohol level of 0.26 — more than three times the legal limit

      Source

      However, there are approximately 40 accidents that have led to serious injury or death involving the less advanced driver-assist system, “Autopilot”.

      • @LibertyLizard@slrpnk.net
        15
        9 months ago

        You’re right, I was conflating the two. However, I suspect there are more cases than just this one due to Tesla’s dishonesty and secrecy.

      • @evatronic@lemm.ee
        4
        9 months ago

        (Why would the human’s inebriation level matter if the vehicle is moving autonomously?)

        • @Jrockwar@feddit.uk
          8
          9 months ago

          Because it’s not autonomous, nor “full self-driving”. It’s glorified adaptive cruise control. I don’t think it’s even in the L3 category… (I’m not the biggest fan of the autonomy “levels” classification, but it’s an OK reference for this; a rough summary of the levels is below.)
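
          For reference, here’s a rough sketch of the SAE J3016 driving-automation levels I’m referring to, as a simplified Python paraphrase (not official SAE wording); the note on where Tesla’s system sits matches how Tesla itself has characterized it to regulators:

              # Simplified paraphrase of the SAE J3016 driving-automation levels.
              from enum import IntEnum

              class SAELevel(IntEnum):
                  L0 = 0  # No automation: warnings at most; the human does all the driving
                  L1 = 1  # Driver assistance: steering OR speed support (e.g. plain cruise control)
                  L2 = 2  # Partial automation: steering AND speed support, but a human must
                          # supervise at all times (adaptive cruise + lane centering)
                  L3 = 3  # Conditional automation: the system drives in limited conditions;
                          # the human takes over only when prompted
                  L4 = 4  # High automation: no human fallback needed inside its operating domain
                  L5 = 5  # Full automation: drives anywhere, in all conditions

              # Tesla has characterized Autopilot/FSD Beta to regulators as a Level 2
              # driver-support feature, i.e. the human remains responsible for supervising.
              print(SAELevel.L2.name)  # -> "L2"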

        • @LibertyLizard@slrpnk.net
          2
          9 months ago

          Agreed. Also, while it’s impossible to say in any individual case, I suspect people might be more likely to drive while inebriated if they believe the autopilot will be driving for them.

        • @Thorny_Insight@lemm.ee
          0
          9 months ago

          This kind of thinking is why these accidents happen. The goal of autonomous driving is for it to one day be a better driver than the best human driver, but this technology is still in its infancy and requires an attentive driver behind the wheel. Even Teslas tell you this when you engage these systems.

        • RedFox
          -3
          9 months ago

          What if we compare that to injuries caused by human drivers?

          I bet more people were killed by other human drivers today. Probably another right now…

          I’m not excusing a lack of safety in the tech, but I think there’s a double standard when this isn’t put in context.

          • @LibertyLizard@slrpnk.net
            2
            edit-2
            9 months ago

            So I hear what you’re saying: what we really want to measure is deaths avoided versus deaths caused. But it’s difficult to measure how many people the technology has saved, so while I’m cognizant of this issue, I’m not sure how to get around it. That said, the article mentions that Tesla drivers are experiencing much higher collision rates than drivers of other brands. There could be multiple factors at play here, but I suspect the autopilot (and especially Tesla’s misleading claims around it) is among them.

            Also, while there may be an unmeasured benefit in reducing collisions, there may also be an unmeasured cost in inducing more driving. This has not been widely discussed in this debate but I think it is a big problem with self-driving technology that only gets worse as the technology improves.
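
            To make the “deaths avoided versus deaths caused” point concrete, here’s a rough sketch of the kind of exposure-adjusted comparison I mean. All the numbers are made-up placeholders, not real crash data, and the function is purely illustrative:

                # Illustrative only: placeholder numbers, NOT real crash statistics.
                # Raw crash counts mean little without exposure (miles driven); the number
                # we actually want is the counterfactual: crashes with the system engaged
                # versus crashes the same drivers would have had without it.

                def crashes_per_million_miles(crashes: int, miles: float) -> float:
                    """Normalize a crash count by miles of exposure."""
                    return crashes / (miles / 1_000_000)

                # Hypothetical fleet with the driver-assist system engaged
                assist_rate = crashes_per_million_miles(crashes=12, miles=30_000_000)

                # Hypothetical comparison fleet of human-only driving
                human_rate = crashes_per_million_miles(crashes=18, miles=30_000_000)

                print(f"assist: {assist_rate:.2f} crashes per million miles")  # 0.40
                print(f"human:  {human_rate:.2f} crashes per million miles")   # 0.60

                # Even this hides the hard parts: selection effects (drivers choose when and
                # where to engage the system) and induced driving (extra miles driven because
                # the system exists) both shift the denominator in ways we can't observe.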

            • RedFox
              2
              edit-2
              9 months ago

              Yeah, I’m hoping, though, that it progresses to the point where we can meaningfully reduce vehicle-related incidents.

              Between drunk driving, texting, and generally not paying attention, I’d love more people using automated driving if it became statistically safer.

              Some people are scared to fly even though it’s statistically safer. They don’t want to be the rare exception. Unless it’s Boeing; then check your doors…

              Edit: I also agree you can’t easily track or correlate things that didn’t happen, given all the factors here.