AI singer-songwriter ‘Anna Indiana’ debuted her first single ‘Betrayed by this Town’ on X, formerly Twitter—and listeners were not too impressed.

  • PupBiru
    0 points · 1 year ago

    it’s only qualitative because we don’t understand it

    when an LLM “experiences” new data via training, that’s subjective too: it works its way through the network in a manner that depends on what came before it… if different training data had come before it, the network would look different, and the data would change the network as a whole in a different way

    • queermunist she/her
      -2 points · 1 year ago

      When an LLM feeds on its own outputs, though, it quickly starts to hallucinate. I think this is actually closer to creativity, but it betrays the fundamental flaw behind the technology - it does not think about its own thoughts and requires a curator to help it create.

      I’ll believe something is an AI when it can be its own curator and not drive itself insane.
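The self-feeding claim above can be sketched with a toy experiment (my illustration, not from the thread): a trivial “model” that just fits a Gaussian to its data, then trains the next generation on its own samples, steadily drifts and loses variance — a bare-bones version of the model-collapse effect being described.

```python
import numpy as np

# Toy sketch (illustrative assumption, not an actual LLM): each generation's
# "model" is a Gaussian fit to the previous generation's outputs. Training on
# your own samples compounds estimation error, so the variance decays.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=20)  # generation 0: "real" data

stds = []
for generation in range(100):
    mu, sigma = data.mean(), data.std()    # "train" on the current data
    stds.append(sigma)
    data = rng.normal(mu, sigma, size=20)  # next generation: model's own output

print(f"std at generation 1:   {stds[0]:.3f}")
print(f"std at generation 100: {stds[-1]:.3f}")  # noticeably smaller
```

Nothing here reasons or curates; the point is only that a loop of “train on your own outputs” amplifies its own errors without an outside check.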

      • PupBiru
        0 points · edited · 1 year ago

        that’s a lack of understanding of concepts though, rather than a lack of creativity… curation requires that you understand the concept that you’re trying to curate: this looks more like a dog than this; this is a more attractive sunset than this

        current LLMs and ML don’t understand concepts, which is their main issue

        i’d argue that it kind of does “think about its own thoughts” to some degree: modern ML is layered, and each layer of the net feeds into the next… one layer of the net “thinks about” the “thoughts” of the previous layer. now, it doesn’t do this as a whole, but neither do we: memories and neural connections are lossy; heck, even creating a creative work isn’t going to turn out exactly like you thought it in your head (your muscle memory and skill level will affect the translation from brain to paper/canvas/screen)
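The layering being described can be sketched in a few lines (a minimal illustration, with made-up sizes and random weights): each layer consumes only the previous layer's activations, so layer N operates on layer N−1's "thoughts" rather than on the raw input.

```python
import numpy as np

rng = np.random.default_rng(42)

def layer(x, w, b):
    """One dense layer with a ReLU nonlinearity."""
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(1, 8))  # raw input
weights = [(rng.normal(size=(8, 8)) * 0.5, np.zeros(8)) for _ in range(3)]

activations = [x]
for w, b in weights:
    # each layer feeds on the output of the layer before it
    activations.append(layer(activations[-1], w, b))

# every layer after the first saw only the previous layer's output,
# never the original input directly
print([a.shape for a in activations])
```

This is the structural point only: stacked layers re-represent earlier representations, which is a much weaker property than reflective self-awareness.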

        but even we hallucinate in the same way. don’t look at a bike, and then try to draw a bike… you’ll get the general things like pedals, wheels, seat, handlebars, but it’ll all be connected wrong. this is a common example people use to show how our brains aren’t as precise as we might like to think… drawing a bike requires a lot of very specific things to be in very specific places, and that’s not how our brain remembers the concept of “bike”

        • queermunist she/her
          -2 points · edited · 1 year ago

          current LLMs and ML don’t understand concepts, which is their main issue

          This is a relevant issue to the question!

          If I take a dose of LSD and paint the colors I hallucinate, is that creative? I’d argue it’s not.

          Only when I, the subjective self, curate my own thoughts and sensations can I engage in a creative process. I can think about my own thoughts without going insane (how do the colors make me feel, what do the colors mean?) and that’s a fundamental part of creativity and intelligence. Conceptualization is key to subjectivity.

          I don’t think this is far off. I just don’t think we’re there, either, and we should be skeptical of marketing hype.

          • PupBiru
            0 points · 1 year ago

            i don’t agree with that definition of creative… there’s lots of engineering work that’s creative: writing code and designing systems can be a very creative process, but it doesn’t involve feeling… it’s problem solving, and that’s a creative process. you’re narrowly defining creativity as artistic expression of emotion, but there’s lots of ways to be creative

            now, i think that’s a bit of a strawman (so i’ll elaborate on the broader point), but i think it’s important to define terms

            i agree we should be skeptical of marketing hype for sure: the type of creativity that i believe ML is currently capable of is directionless. it doesn’t understand what it’s creating… but the truth lies somewhere in the middle

            ML is definitively creating something new that didn’t exist before (in fact, i’d say its trouble with hallucinating language is a good example of that: it certainly didn’t copy those characters/words from anywhere!)… this fits the easiest definition of creative: marked by the ability or power to create

            the far more difficult definition is: having the quality of something created rather than imitated

            the key here being “rather than imitated” which is a really hard thing to prove, even for humans! which is why our copyright laws basically say that if you have evidence that you created something first, you pretty much win: we don’t really try to decide whether something was created or imitated

            with things like transformative works or things that are similar, it’s a bit more of a grey area… but the argument isn’t about whether something is an imitation; rather, it’s about how different the work is from the original