@misk@sopuli.xyz to Technology@lemmy.world • English • 5 months ago
AI chatbots unable to accurately summarise news, BBC finds
www.bbc.com
@tal@lemmy.today • 5 months ago
They are, however, able to inaccurately summarize it in GLaDOS's voice, which is a strong point in their favor.
JackGreenEarth • 5 months ago
Surely you'd need TTS for that one, too? Which one do you use, and is it open weights?
@brucethemoose@lemmy.world • 5 months ago
Zonos just came out, and it seems sick: https://huggingface.co/Zyphra
There are also some "native" TTS LLMs like GLM 9B, which "capture" more information in the output than pure text input.
@ag10n@lemmy.world • 5 months ago
A website with zero information, and barely anything on their Hugging Face page. What's exciting about this?
Ahh, you should link to the model: https://www.zyphra.com/post/beta-release-of-zonos-v0-1
@brucethemoose@lemmy.world • 5 months ago
Whoops, yeah, I should have linked the blog. I didn't want to link the individual models because I'm not sure whether the hybrid or the pure transformer variant is better.
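For anyone who wants to try it, here is a minimal sketch of fetching the Zonos weights with huggingface_hub. The repo IDs are assumptions based on the hybrid and transformer variants mentioned above, not confirmed names; check https://huggingface.co/Zyphra for the actual repositories.

# Minimal sketch (assumption): the weights are published as
# "Zyphra/Zonos-v0.1-transformer" and "Zyphra/Zonos-v0.1-hybrid".
# Verify the repo IDs on https://huggingface.co/Zyphra before running.
from huggingface_hub import snapshot_download

# Downloads the full model repository into the local Hugging Face cache
# and returns the path to the downloaded files.
local_dir = snapshot_download(repo_id="Zyphra/Zonos-v0.1-transformer")
print("Model files downloaded to:", local_dir)

From there, the model's own README or the Zyphra blog post linked above describes how to load the weights and run text-to-speech inference.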