What kind of graphics card do I actually need to run DeepSeek 67B on my own computer at home? Sorry if this is a really basic question but I am completely lost with all these numbers and letters. I work at a small marketing agency here in Chicago and we keep hearing about these open source models being better for privacy so I wanted to try setting it up myself to learn. My budget is around $1,500 maybe $1,800 if I stretch it but I have no idea if that is even enough to get started.
I keep seeing people talk about VRAM and parameters and it feels like another language to be honest. Is 67B the size of the file or how much memory it uses? I was looking at some gaming cards like the RTX 4070 but then I saw someone say you need like 48GB of something to run the big ones and now I'm just confused. Is there a specific card that just works for this model, or do I have to buy two cards and link them together somehow? I really want to get this built by the end of next month so I can start testing it for some copywriting stuff we do for clients. If I buy the wrong thing I'm gonna be out a lot of money so I figured I should ask here first before I click buy on anything...
I learned the hard way that 24GB isn't enough for 67B; I once tried loading a similar-sized model on a single 24GB card and it crashed instantly with an out-of-memory error. Realistically you need about 48GB of VRAM to run a quantized version of it comfortably, and much more than that if you want full precision.
The 67B stands for 67 billion parameters. It basically tells you how big the model's brain is. To run it, you need enough VRAM to hold all those parameters, and each parameter takes memory: 2 bytes each at the usual fp16 precision, or roughly half a byte each with 4-bit quantization.
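If it helps make the numbers concrete, here's a back-of-the-envelope sketch of where those VRAM figures come from. This is just the weight storage; the function name is mine, the 0.5-bytes-per-parameter figure for 4-bit quantization is an approximation, and real loaders need several extra GB on top for the KV cache and activations:

```python
def vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough VRAM needed just to hold the model weights, in GiB.

    Ignores runtime overhead (KV cache, activations), which adds
    several more GB in practice.
    """
    return params_billion * 1e9 * bytes_per_param / 1024**3

# DeepSeek 67B at different precisions:
print(round(vram_gb(67, 2), 1))    # fp16 weights: ~124.8 GiB
print(round(vram_gb(67, 0.5), 1))  # 4-bit quantized: ~31.2 GiB
```

So even quantized, the weights alone are over 31GB, which is why people say a single 24GB gaming card can't hold it and you end up needing 48GB-class hardware.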