And the quality of the AI output sucks. I was recently looking for information about the positive sign convention for yaw, pitch, and roll in aircraft. I was looking at az and yaw and got reasonable results from the AI, but when I looked at pitch and el, all of the results were about elevator pitches. Even when I spelled out "elevation" it insisted on elevator pitches. I scroll past the AI results as a matter of principle, but I usually look at them so I have something specific to complain about when people ask why I am so virulently anti-AI.
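For anyone landing here with the same question: what I was after is the standard right-handed aerospace body-axes convention (x forward, y out the right wing, z down). This is my own sketch of it, not something any AI produced; the function and variable names are just illustrative:

```python
import math

# Standard aerospace sign convention (right-handed body axes,
# x = forward, y = right wing, z = down):
#   roll  (phi):   positive = right wing down  (rotation about x)
#   pitch (theta): positive = nose up          (rotation about y)
#   yaw   (psi):   positive = nose right       (rotation about z)

def rot_x(phi):
    c, s = math.cos(phi), math.sin(phi)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(psi):
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def body_to_world(phi, theta, psi):
    # Compose in the conventional yaw-pitch-roll (Z-Y-X) order.
    return matmul(rot_z(psi), matmul(rot_y(theta), rot_x(phi)))
```

Sanity check: a +90° yaw takes the body x-axis (nose) to +y, i.e. the nose swings right, which is what "positive yaw" means in this frame.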
The other day I tried to have it help me with a programming task on a personal project. I am an experienced programmer, but I only “get by” in Python (typically just by looking up the documentation for the standard library). I thought, “OK. This is it. I will ask Llama 3.3 and GPT-4 for help.”
That shit literally set me back a weekend. It gave me approaches and answers so bad — and I could tell they were bad (aforementioned experience in programming, degree in comp sci, etc.) — that I got confused about writing Python. Had I just done what I usually do, which is to look up the documentation and use my brain, I would have finished my weekend task a whole weekend sooner.
It scares me to think what people are doing to themselves by relying on this, especially if they’re novices.
I recently started as a graphic designer despite knowing absolutely nothing about it, so I am constantly searching how to do stuff in the Adobe suite at work. Half the time Google’s AI can’t even keep “Cmd” and “Ctrl” straight, telling me to use “Cmd+Shift+H” on Windows or “Ctrl+Shift+H” on Mac. I don’t even know how it botches that, but it does it about 25% of the time.
AI is useful for basic, mundane tasks and that’s about it. Trying to force it to be some sort of Uber search engine is such a bad idea.
Yeah, that’s a bad example of what to use AI for, at least right now. You’re going to get bad results with that question.
It’s good for things, if you pay.
I don’t want to ask AI. Google automatically gives me AI search results that are piss poor. Those useless results still use energy to generate.