These are amazing. Dell, Lenovo, and I think HP make these tiny things, and they were so much easier to get than Pis during the shortage. Plus they’re incredibly fast in comparison.
Bad article title. This is the “Textbooks Are All You Need” paper from a few days ago. It’s programming-focused and, I think, Python-only. For general-purpose LLM use, LLaMA is still better.
Yep, I’m using an RTX 2070 for that right now. The LLMs just run on the CPU.
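Roughly, CPU-only inference looks like this. This is just a sketch using llama-cpp-python with a quantized model, not my exact setup; the file name, thread count, and prompt are placeholders:

```python
# Sketch of CPU-only LLM inference with llama-cpp-python.
# The model path is a hypothetical local GGUF file, not my actual one.
from llama_cpp import Llama

llm = Llama(
    model_path="./wizardlm-13b.Q4_K_M.gguf",  # placeholder quantized model
    n_gpu_layers=0,   # keep everything on the CPU; the GPU stays free for Stable Diffusion
    n_ctx=2048,
    n_threads=8,      # tune to your core count
)

out = llm("### Instruction: Say hello.\n### Response:", max_tokens=64)
print(out["choices"][0]["text"])
```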
Do you recommend this email provider? Lots of people are looking to get off Gmail lately.
Are you running your own mail server? I only ever integrated SpamAssassin with Postfix.
Stable Diffusion (Stability AI version), text-generation-webui (WizardLM), a text embedder service with Spacy, Bert and a bunch of sentence-transformer models, PiHole, Octoprint, Elasticsearch/Kibana for my IoT stuff, Jellyfin, Sonarr, FTB Minecraft (customized pack), a few personal apps I wrote myself (todo lists), SMB file shares, qBittorrent and Transmission (one dedicated to Sonarr)… Probably a ton of other stuff I’m forgetting.
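For the embedder piece, here’s a rough sketch of what that kind of service can look like, assuming sentence-transformers behind FastAPI; the model name and endpoint path are illustrative, not my actual setup:

```python
# Minimal embedding service sketch: sentence-transformers + FastAPI.
# Model name and route are placeholders.
from fastapi import FastAPI
from pydantic import BaseModel
from sentence_transformers import SentenceTransformer

app = FastAPI()
model = SentenceTransformer("all-MiniLM-L6-v2")  # example model

class EmbedRequest(BaseModel):
    texts: list[str]

@app.post("/embed")
def embed(req: EmbedRequest):
    # encode() returns a numpy array; convert to plain lists for JSON
    vectors = model.encode(req.texts)
    return {"embeddings": [v.tolist() for v in vectors]}

# Run with: uvicorn embedder:app --host 0.0.0.0 --port 8000
```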
For the really old stuff, I used to run NetBSD. I’m sure their 32-bit x86 support is still top-notch.