• @girlfreddy@lemmy.ca · 98 points · 3 months ago

    Testing armed robot dogs in the Middle East instead of the US is pretty telling.

    Can’t be accidentally murdering Americans with a software glitch.

          • @beebarfbadger@lemmy.world · 4 points · edited · 3 months ago

            Don’t worry, no danger of killing real people in the Middle East. All the “collateral damage” will be brown people, not Americans. They’ll have all the kinks ironed out and will make sure that the AI doesn’t hurt white targets before the technology is distributed to every national police district.

            I wish this post even deserved a /s.

    • Which is wild when you add perspective: police in the US are less disciplined than troops overseas, and the US still uses substances banned by the Geneva Convention on its own civilian population. So if even the US won’t test it on their own people, it’s bad.

      • Jojo, Lady of the West · 8 points · 3 months ago

        Listen, the Geneva convention only specifies what we can’t use on enemies, okay? As long as the targets are technically friendlies, it’s fair game!

        • @PapstJL4U@lemmy.world · 4 points · 3 months ago

          The GC is for war, and soldiers are combatants, not criminals, by default (switching can happen easily). As an example, hollow-point rounds against criminals are okay because they can protect surrounding bystanders.

          It’s a bit weird, but for countries war is different from domestic problems.

    • Annoyed_🦀 · 8 points · 3 months ago

      “Testing facility” in Gaza, just like Israel does.

    • @RangerJosie@lemmy.world · 26 points · 3 months ago

      Oh it was already tremendously fucked. This is just gravy on top.

      Fuckin killbots. Coming soon to the 1033 program and thus, your local police department. The Boston Dynamics: Wardog!

    • We should never have moved away from sticks and stones tbh. Anything that works at long range makes people misunderstand what war is. War needs to look disgusting, because the more clean and automated it looks, the less horrible it looks to the people spectating it. But it is indeed just as horrible as beating someone to death with a rock.

      • Flying Squid · 21 points · 3 months ago

        I mean, I’d rather not be hunted down by an AI robot dog, but you do you.

        • @RangerJosie@lemmy.world · 14 points · 3 months ago

          It’s happening anyway. We build them. Others build them in response because they have to. The sophistication of killbots will increase. Terrorists will get hold of them eventually. They’ll be hacked and turned on their handlers and/or civilians.

          All this is on top of ever increasing climate catastrophe. Look at Appalachia. The topography of those mountains was just rewritten. Whole towns erased like they were never there.

          • Flying Squid · 4 points · 3 months ago

            That’s not a reason for me to want it to happen. Which was your original post’s suggestion.

            • @RangerJosie@lemmy.world · 2 points · 3 months ago

              My first post was about letting the army fuck around and find out. Let the natural course of events remind them of those scifi movies they forgot about.

                • @RangerJosie@lemmy.world · -1 points · 3 months ago

                  Thousands at least. The more effective the killbots are the more money our war economy will throw at warbot R&D.

                  This is happening. Nothing on this planet can stop it.

      • Bone · 1 point · 3 months ago

        Someone is bound to be dropping out of the sky to help us any minute now…

    • @SmilingSolaris@lemmy.world · 4 points · 3 months ago

      I remember some kind of skit about sci-fi authors writing about how bad a torture matrix would be, ironically inspiring real people to create the torture matrix because it’s the future.

    • @pigup@lemmy.world · 8 points · 3 months ago

      Well you see, the owners know you won’t die for them anymore, but now they’re able to take you out of the equation. Don’t even need poors to conquer the world. It’s really a great deal for them.

      • @Warjac@lemmy.world · 1 point · 3 months ago

        I dunno, I’m subscribed to the BD YouTube channel, and the very sudden change in facilities and upgrades to the bots seems a little too in line with this. Like someone definitely caved, in my opinion.

  • @iAvicenna@lemmy.world · 19 points · 3 months ago

    don’t worry, first they test it where civilian lives don’t matter, and once it passes some basic tests, it will become available for domestic (ab)use

  • @Asafum@feddit.nl · 19 points · 3 months ago

    Without reading the article can I take a wild guess and say this is from “we promise never to make weaponized robots” Boston Dynamics?

    A promise from a corporation is just a lie by another name.

  • @TommySoda@lemmy.world · 11 points · 3 months ago

    So if a robot commits a war crime, they can just blame it on AI and call it a day, right? Sounds like an easy way to do whatever the fuck you want.

  • @Sundial@lemm.ee · 10 points · edited · 3 months ago

    Is this their way of exterminating civilian populations like the Palestinians without dropping bombs and contributing so significantly to climate change?

    “The US military has been adopting a new climate friendly mindset and approach to international conflict. With this invention we can help our genocidal colonies acquire more land with little to no carbon emissions. We plan to be carbon-neutral by 2050, provided no one retaliates and attacks back.”