

Mostly Kubernetes and sometimes Podman
Personally, Framework has become a bit too expensive for me. If you’re in the US, I’d look at the older Dell Precision and HP ZBook workstations from 2020 or earlier; they have amazing specs and go for $400 or so. They’re fairly repairable, because enterprises demanded that they be, and they have gobs of power for anything you want.
Older MacBooks still have that darned Wi-Fi card, which needs special proprietary drivers. And basically nothing in that chassis is standard; everything is Apple-specific if you want to repair it. I don’t recommend MacBooks.
OS-hardening is exactly what I meant. Thanks
Thank you for your comment, I will save it. This really cleared it up
It is possible. You can host IMAP on your own server and simply use an SMTP server operated by a different entity for outbound mail. There are companies offering SMTP relays for free as long as you stay under a sending limit.
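As a rough sketch of that split (the hostname and credentials below are placeholders, not a real provider): mail lives on your own IMAP server, while outbound messages are handed to the third-party relay.

```python
# Sketch of the IMAP-local / SMTP-external split: mail is stored on your own
# server, but outbound messages go through a third-party relay.
# smtp.example.com and the credentials are placeholders, not a real provider.
import smtplib
from email.message import EmailMessage

def build_message(sender, recipient, subject, body):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def send_via_relay(msg, host="smtp.example.com", port=587, user=None, password=None):
    # Port 587 with STARTTLS is the usual submission setup for hosted relays.
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        if user is not None:
            smtp.login(user, password)
        smtp.send_message(msg)
```

The relay is the only part that needs a good IP reputation, which is exactly the bit you’re outsourcing.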
Thanks. After speaking with some others here, I’ve realised that this is actually quite doable (in theory). The other commenter has a great note on DKIM and SPF that I’m sure will help anyone looking to do this. Thanks for your help; I’ve also found a lot of companies offering a free SMTP server for a limited number of emails (which is more than I’ll ever send, so it works for me).
The previous commenter mentioned MXroute and I got SendGrid from your comment. I will look at these products; is there any other provider that you recommend?
Amazing comment. Saved. Thank you so much.
Indeed, I have thought about hosting my own email, but the problem of dealing with IP blacklists made it seem not worth it.
Thank you so much for the explanation of DKIM and SPF. It makes sense to me now; I really didn’t have a clue about either of these before I read your comment. Thank you for breaking it down.
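For anyone else landing here, the two mechanisms boil down to a pair of DNS TXT records. The domain, selector, relay name, and key below are placeholders, not real values:

```
; SPF: declares which servers are allowed to send mail for the domain
example.org.               IN TXT "v=spf1 mx include:relay-provider.example ~all"

; DKIM: publishes the public key that outgoing mail is signed with,
; under a selector ("s1" here) chosen by whoever signs the mail
s1._domainkey.example.org. IN TXT "v=DKIM1; k=rsa; p=MIIBIjANBgkq..."
```

Receiving servers look these records up to check that the sending server is authorised (SPF) and that the message signature verifies (DKIM).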
I have an alternative for you if your power bills are cheap: X99 motherboard + CPU combos from China
In general how much VRAM do I need for 14B and 24B models?
I didn’t know that. I thought it was just one ROCm binary to install, then run Ollama and that’s it. Thanks for the explanation.
Do you have any recommendations for running the Mistral Small model? I’m very interested in it, alongside CodeLlama, Oobabooga and others.
Wait, how does that work? How is 24GB enough for a 38B model?
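The usual answer is quantization. As a back-of-the-envelope sketch (the 2 GB overhead figure for KV cache and activations is my own rough assumption), shrinking each weight from 16 bits to 4 bits makes the difference:

```python
# Rough VRAM estimate: weights take params * bits/8 bytes, plus some
# headroom for the KV cache and activations (the 2 GB here is a guess).
def approx_vram_gb(params_billion, bits_per_weight, overhead_gb=2.0):
    return params_billion * bits_per_weight / 8 + overhead_gb

print(approx_vram_gb(38, 16))  # full FP16: ~78 GB, far beyond 24 GB
print(approx_vram_gb(38, 4))   # 4-bit quantized: ~21 GB, fits in 24 GB
```

The same arithmetic says a 14B model at 4-bit needs only around 9 GB, which is why mid-range cards can run it.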
The 7900XTX was $1000 when it launched, I wouldn’t mind it used either.
I don’t mind multiple GPUs, but my motherboard doesn’t have two or more electrically connected x16 slots. I could build a new homeserver (I’ve been thinking about it), but consumer platforms simply don’t have the PCIe lanes for two actual x16 slots. I’d have to go back to Broadwell Xeons for that, and those are really power hungry. Oh well, I don’t think it matters considering how power hungry GPUs are now.
I am OK with either Nvidia or AMD, especially if Ollama supports it. With that said, I have heard that AMD takes some manual effort whilst Nvidia is easier. It depends on how difficult ROCm is.
Thank you. Are 14B models the biggest you can run comfortably?
Do you have two PCIe x16 slots on your motherboard (in terms of electrical connections, not just physical slots)?
Why aren’t you just using Kodi?