• @realitista@lemm.ee
    6
    8 months ago

    My issue with generative AI is not that it doesn’t have uses, but that it seems to me that the vast majority of those uses are nefarious.

    As far as I can tell, it has the most potential for:

    • Creating sock puppet accounts on social media to sway public opinion

    • Creating fake media and enabling identity theft

    • Plagiarizing various art media and melding them together enough to make attribution difficult

    Other positive use cases like summarization or reformatting seem to pale in comparison to the potential negative effects of the bad use cases. There are many marginal use cases like coding or law where you may save some time, but the review required likely takes nearly as long as it would for a good programmer or lawyer to just write it themselves.

    • aiccount
      7
      8 months ago

      Most positive use cases are agent-based, and the average user doesn’t have access to good agent-based systems yet because they require a bit of willingness to do some “coding”. This will soon not be the case, though. I can give my crew of AI agents a mission, for example, “find all the papers on baby owl vocalizations and make 10 different charts of the frequency range relative to their average size after each of their first 10 weeks of life”, and come back an hour later with something that would have taken a grad student 100 hours just last year. Right now I have to wait an hour or so; soon it will be instant.
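      To make “crew of agents” concrete: the pattern is decompose a mission into subtasks, delegate each to a role-specific agent, and collect the results. The sketch below is purely illustrative; every class and name here is hypothetical (real frameworks would have each agent call an LLM and external tools instead of returning a string).

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    role: str

    def run(self, task: str) -> str:
        # A real agent would call an LLM and external tools here;
        # this stub just records what it was asked to do.
        return f"[{self.role}] completed: {task}"

@dataclass
class Crew:
    agents: list[Agent]
    results: list[str] = field(default_factory=list)

    def dispatch(self, mission: str, subtasks: list[str]) -> list[str]:
        # Round-robin the subtasks across the crew and collect outputs.
        for i, task in enumerate(subtasks):
            agent = self.agents[i % len(self.agents)]
            self.results.append(agent.run(f"{mission}: {task}"))
        return self.results

crew = Crew([Agent("searcher"), Agent("analyst"), Agent("plotter")])
out = crew.dispatch(
    "owl vocalization survey",
    ["find papers", "extract frequency ranges", "chart weeks 1-10"],
)
```

      The point of the structure is the one the comment makes: once the mission is expressed as subtasks, the human only writes the mission string and reads the collected results.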

      The real usefulness of these agents today is enormous, it is just outside of the view of many average people because their normal lives don’t require this kind of power.

    • @CanadaPlus@lemmy.sdf.org
      3
      8 months ago

      You forgot porn.

      Edit: Actually, in the article it mentions coding assistants and various interfaces. Not to mention the plagiarism thing is a misunderstanding. I’m not sure why I decided to jump on the jerk there; I disagree with you.