

I recommend Ollama; it's easy to set up, and the CLI can download and run LLMs. With some more tech-savviness you can get Open WebUI as a nice UI.
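For reference, the basic CLI flow looks something like this (the model name here is just an example; pick whatever fits your hardware):

```shell
# Install Ollama (official install script for Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model — "llama3" is an example name, not a recommendation
ollama pull llama3

# Chat with it interactively in the terminal
ollama run llama3

# See which models you have downloaded locally
ollama list
```

Open WebUI can then point at the local Ollama server for a browser-based chat UI.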
I was expecting a “that is all” and was disappointed.
I personally like LXCs over VMs for my home lab. I run a dedicated LXC for Docker and another one running a single-node k8s.