• @wahming
      link
      English
      11
      9 months ago

      No, mistake implies it wasn’t intentional

    • @LibertyLizard@slrpnk.net
      link
      fedilink
      4
      9 months ago

      They are claiming the logo was not visible due to darkness, but I’d like to see the footage to judge how credible that claim is. I’m assuming they were watching with an infrared camera; could you see such a logo with that technology?

      • @Huckledebuck@sh.itjust.worksOP
        link
        fedilink
        2
        edit-2
        9 months ago

        I don’t know. I was trying to point the blame beyond the obvious. I didn’t really mean that they mistook the emblem.

        I’m sure their preparations included making sure everyone was aware how they were leaving. Israel just doesn’t care.

        Edit: I realize now the extra question marks don’t read as the sarcasm I had meant.

        • @Forester@yiffit.net
          link
          fedilink
          2
          edit-2
          9 months ago

          From what we have recently learned, the IDF is using dragnet surveillance and AI to choose targets. Its purpose is purportedly to rank every individual in Gaza with a score from 0 to 100. The higher your number, the more likely you are to get struck. The intel released thus far indicates that the system gives you a higher score the more Hamas members you associate with. I’m pretty sure the AI noticed the aid workers distributing aid, and thus interacting with a large number of Hamas-affiliated individuals, so they rapidly gained a high score. And this is why AI should not make decisions.

          Note, I am not stating that all the people the aid workers interacted with are active Hamas militants, just people who are related to or neighbors of militants, which is close enough for most AI systems.
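
          To make the flaw concrete, here is a purely hypothetical sketch of guilt-by-association scoring. Every name, weight, and threshold below is invented for illustration; nothing here reflects how any real system actually works. It just shows why scoring people by who they contact punishes anyone whose job involves contacting lots of people:

          ```python
          # Hypothetical guilt-by-association scorer. All names and numbers
          # are invented for illustration only; this does NOT describe any
          # real system's logic.

          def association_score(contacts: list[str], flagged: set[str]) -> int:
              """Return a 0-100 score based solely on what fraction of a
              person's contacts appear on a flagged list. The core flaw:
              frequent contact (e.g. distributing aid) inflates the score
              regardless of intent."""
              if not contacts:
                  return 0
              flagged_contacts = sum(1 for c in contacts if c in flagged)
              return min(100, round(100 * flagged_contacts / len(contacts)))

          # An aid worker who hands out aid to many flagged individuals
          # scores high even though every interaction is innocuous.
          flagged = {"p1", "p2", "p3"}
          aid_worker_contacts = ["p1", "p2", "p3", "p4"]
          print(association_score(aid_worker_contacts, flagged))  # 75
          ```

          The point is that the metric has no notion of *why* the contact happened, so the more effective an aid worker is, the more suspicious they look.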