• @JustUseMint@lemmy.world
      5 points · 10 months ago

      Absolutely. Host your own. Like the other person said, check out Hugging Face, and look into llama.cpp as well. Wizard-Vicuna-Uncensored is worth a try (probably spelled that wrong).
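      For anyone who wants a concrete starting point, here's a minimal sketch using the llama-cpp-python bindings (a Python wrapper around llama.cpp) with a GGUF file pulled from Hugging Face; the repo and file names below are just placeholders, so swap in whichever model and quantization you actually want:

      ```python
      # Minimal sketch, assuming llama-cpp-python and huggingface_hub are installed.
      # Repo and file names are placeholders -- pick any GGUF build you like.
      from huggingface_hub import hf_hub_download
      from llama_cpp import Llama

      # Download one quantized GGUF file from a Hugging Face repo
      model_path = hf_hub_download(
          repo_id="TheBloke/Wizard-Vicuna-7B-Uncensored-GGUF",  # placeholder repo
          filename="Wizard-Vicuna-7B-Uncensored.Q4_K_M.gguf",   # placeholder quant
      )

      # Load the model and run a completion entirely on your own machine
      llm = Llama(model_path=model_path, n_ctx=2048)
      out = llm("Q: What does llama.cpp do? A:", max_tokens=128)
      print(out["choices"][0]["text"])
      ```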

    • @BananaOnionJuice@lemmy.dbzer0.com
      4 points · 10 months ago

      I finally found some offline ones: jan.ai and koboldcpp. You download a GGUF model and run everything on your own PC; it just takes a lot of CPU and GPU for it to work acceptably, and my setup can’t really manage much more than a 7B model.
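
      In case it's useful, this is roughly what the CPU/GPU split looks like through the llama-cpp-python bindings (koboldcpp is built on the same llama.cpp engine); the model path, thread count, and layer count are placeholders for whatever your hardware can handle:

      ```python
      # Rough sketch, assuming llama-cpp-python and an already-downloaded 7B GGUF.
      # Path, thread count, and GPU layer count are placeholders.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./some-7b-model.Q4_K_M.gguf",  # any quantized 7B GGUF file
          n_threads=8,       # CPU threads for the layers that stay on the CPU
          n_gpu_layers=20,   # offload this many layers to the GPU (0 = CPU only)
          n_ctx=2048,        # context window; larger costs more RAM/VRAM
      )

      out = llm("Explain in one sentence what a GGUF file is.", max_tokens=64)
      print(out["choices"][0]["text"])
      ```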