Getting started with local LLMs
I recently dove headfirst into running LLMs on my local hardware, and I wanted to share what I’ve learned and how I’ve set everything up, so that others who want to do the same can follow along. If you’re on a Windows desktop machine and you just want to interact with LLMs on your desktop, skip ahead to the section about installing Ollama.

Hardware

When it comes to hardware, I repurposed my media PC, which had mostly been sitting dormant. It has a Ryzen 7 5700X and an RTX 3070 Ti, which is more than enough to get started. As of 2024, it’s still a great time to build a PC from new or used parts, and if you need to brush up on how to build one, LTT created a great video here.

I’ve found that most LLMs under 10B parameters (which we’ll talk about later) only need ~8GB of VRAM. So if you’re able to secure a GPU with at least that much VRAM, you’ll be in a great spot.

...
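To see where that ~8GB figure comes from, a useful rule of thumb is: VRAM ≈ parameter count × bytes per parameter (set by the quantization), plus some headroom for the KV cache and runtime buffers. Here’s a minimal sketch of that arithmetic; the estimate_vram_gb helper and the 20% overhead factor are my own assumptions for illustration, not measured values:

```python
def estimate_vram_gb(params_billion: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to hold a model's weights, with ~20% headroom
    for the KV cache and runtime buffers (an assumed fudge factor)."""
    weight_gb = params_billion * bits_per_param / 8  # e.g. 7B @ 4-bit = 3.5 GB
    return weight_gb * overhead

# A 7B model at common quantization levels:
print(f"7B @ 4-bit: ~{estimate_vram_gb(7, 4):.1f} GB")   # ~4.2 GB
print(f"7B @ 8-bit: ~{estimate_vram_gb(7, 8):.1f} GB")   # ~8.4 GB
print(f"7B @ fp16:  ~{estimate_vram_gb(7, 16):.1f} GB")  # ~16.8 GB
```

At the 4-bit quantizations that local runners like Ollama commonly ship, a 7B model fits comfortably in 8GB of VRAM, and even 8-bit lands right around that mark.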