petsoi@discuss.tchncs.de to Linux@lemmy.ml · 10 months ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
19 comments
Vincent@feddit.nl · 10 months ago
And llamafile is a binary you can just download and run, no installation required. “Uninstallation” is deleting the file.
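For anyone curious what that download-run-delete cycle looks like, here is a minimal Python sketch of the workflow the comment describes. The URL and filename are placeholders, not a real llamafile release, and the same steps can of course be done by hand with wget/chmod/rm.

```python
# Minimal sketch of the llamafile workflow: fetch one self-contained binary,
# mark it executable, run it, and "uninstall" by deleting the file.
# The URL and filename below are placeholders, not a real release.
import os
import stat
import subprocess
import urllib.request

LLAMAFILE_URL = "https://example.com/some-model.llamafile"  # placeholder
LOCAL_PATH = "some-model.llamafile"

# Download the single-file executable.
urllib.request.urlretrieve(LLAMAFILE_URL, LOCAL_PATH)

# Make it executable (the equivalent of `chmod +x`).
os.chmod(LOCAL_PATH, os.stat(LOCAL_PATH).st_mode | stat.S_IXUSR)

# Run it; everything the model needs ships inside this one file.
subprocess.run([f"./{LOCAL_PATH}"])

# "Uninstallation": just delete the file.
os.remove(LOCAL_PATH)
```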