Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read the short story ‘Death and the Gorgon’ by Greg Egan, as Egan has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I’m going to suggest that more people do the same. The only complaint you could have is that it gives no real ‘steelman’ airtime to the subjects it is being negative about. But well, he doesn’t have to; he isn’t the Guardian. Anyway, I’m not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I’m not sure this PDF was intended to be public. I did find it via Google, but it might not be meant to be accessible this way.)

  • Charlie Stross

    @bencurthoys @Amoeba_Girl @Soyweiser I’m pretty sure that about 10-20 years ago Egan came out with a serious repudiation of his own ideas about achieving AI through iterated simulations of less-intelligent entities: he noted that implementing it was implicitly genocidal (by murdering all entities that didn’t *quite* meet some threshold set by the experimenters, you’d inevitably kill huge numbers of sentient beings just for failing an arbitrary test).

    • @blakestacey@awful.systems

      I vaguely recalled a statement of his to that effect and found one here:

      What I regret most [about Permutation City] is my uncritical treatment of the idea of allowing intelligent life to evolve in the Autoverse. Sure, this is a common science-fictional idea, but when I thought about it properly (some years after the book was published), I realised that anyone who actually did this would have to be utterly morally bankrupt. To get from micro-organisms to intelligent life this way would involve an immense amount of suffering, with billions of sentient creatures living, struggling and dying along the way. Yes, this happened to our own ancestors, but that doesn’t give us the right to inflict the same kind of suffering on anyone else.

      This is potentially an important issue in the real world. It might not be long before people are seriously trying to “evolve” artificial intelligence in their computers. Now, it’s one thing to use genetic algorithms to come up with various specialised programs that perform simple tasks, but to “breed”, assess, and kill millions of sentient programs would be an abomination. If the first AI was created that way, it would have every right to despise its creators.

      He even wrote a story on that theme, “Crystal Nights”.

    • Ben Curthoys

      @cstross @Amoeba_Girl @Soyweiser My usual handle when playing online games is “Bickel”, because I happened to be re-reading “Destination: Void” at the time that I first signed up my World of Warcraft account, and killing huge numbers of sentient beings in the pursuit of artificial consciousness was definitely not a problem for Frank Herbert =)

      • @Amoeba_Girl@awful.systems

        Herbert is so obsessed with his particular vision of eugenics that it ends up circling back around to being endearing. Look at our big boy, building his big torture worlds just so they can roundaboutly excrete one superman. Such a specific, endlessly restated fetish.

    • @gerikson@awful.systems

      There’s a fun/horrifying scene in Ken MacLeod’s The Stone Canal where the protagonists revive superhuman intelligences from cold storage, get the answers they need from them, then destroy them with nanotech the superhumans have not developed defenses against. As one of them says when confronted: “standard programming practice, keep the source code, blow away the object code”.

      (It’s partially justified in that, if left alone, the superintelligences will just iteratively bootstrap themselves into catatonic insanity anyway.)