@brt01010101@sh.itjust.works to No Stupid Questions@lemmy.world • edited, 9 months ago • NSFW
@A_Very_Big_Fan@lemmy.world • 9 months ago

So you do unironically think it takes that amount of equipment and power to output to a single device lmao
@mojofrododojo@lemmy.world • 9 months ago

I can’t tell if you’re fucking dense or can’t read.

AN LLM RUN ON A PHONE WILL DO YOU FUCK-ALL GOOD.

You uninformedly think you can run an AI worth a damn on your phone — and the corpus to teach it?

Fuck off, you stupid git. Good luck with your HAL 9. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.
@A_Very_Big_Fan@lemmy.world • edited, 9 months ago

Here’s a Raspberry Pi doing a variety of tasks with various LLMs, like programming and accurately describing a picture.

There’s a literal mountain of evidence of what these models can do. It’s been fun making you rage :3
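The feasibility claim above can be sanity-checked with simple arithmetic. A rough sketch (assuming weights-only quantization at a given bit width; it ignores KV-cache and runtime overhead, so real usage is somewhat higher):

```python
# Back-of-envelope memory estimate for running quantized LLMs on small devices.
# Assumes weights-only quantization; ignores KV cache and runtime overhead.

def weights_gib(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory needed for model weights, in GiB."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

for name, params in [("3B", 3.0), ("7B", 7.0), ("13B", 13.0)]:
    print(f"{name} model @ 4-bit: ~{weights_gib(params, 4):.1f} GiB of weights")
```

A 7B model at 4-bit comes out to roughly 3.3 GiB of weights — within the RAM of an 8 GiB Raspberry Pi or a flagship phone, albeit at modest token rates. (Training a model from a corpus is a different matter; the dispute here is about inference.)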