What is the best GPU for running DeepSeek-V3 locally?

0
Topic starter

Honestly so hyped to finally get DeepSeek-V3 running on my own hardware instead of paying those API fees every month. I'm stuck between two paths right now for my home lab setup in the garage.

  • Option A: grabbing two used 3090s off eBay so I can get that 48GB VRAM pool, but I'm worried about the power draw on my old house circuits
  • Option B: biting the bullet on a single 4090 and just running a really small quantized version

I've got about $3,500 to spend on this by next Friday for my local coding assistant project. Do you guys think the dual-3090 setup is still the king for VRAM-heavy stuff like V3, or is the 4090's speed better for a single user?
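For reference, here's the rough back-of-the-envelope math I've been using to compare the two options. It's just weight memory ≈ params × bits ÷ 8, ignoring KV cache and runtime overhead (which add more on top), and the numbers you plug in depend entirely on which variant/quant you actually load:

```python
def weight_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough GB of VRAM needed just for model weights.

    weights ~= params * (bits / 8) bytes; this ignores KV cache,
    activations, and framework overhead, which all add more on top.
    """
    return params_billions * bits_per_weight / 8

# Invert the formula: how many billion weights fit in each VRAM pool at 4-bit?
for vram in (24, 48):  # single 4090 vs dual-3090 pool
    max_params = vram * 8 / 4
    print(f"{vram} GB at 4-bit fits roughly {max_params:.0f}B weights")
```

So a 48GB pool roughly doubles the parameter budget at the same quant level, which is the whole tradeoff I'm wrestling with.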


3 Answers
12

In my experience building these rigs over the years, 24GB is just going to frustrate you with V3. Since you have $3,500, maybe skip the dual setup and look for a used NVIDIA RTX A6000 48GB GDDR6. It pulls way less power than dual 3090s and keeps everything on one card. Check r/hardwareswap or local liquidators... you can usually find them within your budget if you dig around.


12

Just catching up on this... honestly I'm pretty satisfied with my multi-GPU rig and it handles these massive models like a champ. For DeepSeek-V3, you've got to prioritize VRAM over raw speed or you're going to have a bad time.

  • NVIDIA GeForce RTX 3090 24GB GDDR6X: Grabbing two of these used is the best bang for your buck. Having 48GB total lets you run the 4-bit quants, which are way more reliable for coding.
  • NVIDIA GeForce RTX 4090 24GB GDDR6X: The 4090 is insanely fast, but 24GB is basically a cage for a model this big. You'll have to compress it so much it'll probably start giving you garbage code. Since you've got $3,500, you'll actually have a ton of cash left over for a beefy PSU like the EVGA SuperNOVA 1600 P+ 80+ Platinum to handle the dual cards. Cap the power limit on the GPUs to 75% using software and you won't even have to worry about the old garage breakers. It's worked well for me and stays pretty quiet too.
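If it helps, here's roughly how I script the power cap. `nvidia-smi -pl` is the real flag for setting a power limit; the 350 W figure below is the reference-spec board power for a 3090, so verify yours first with `nvidia-smi -q -d POWER` (on Linux you may also need persistence mode enabled, and the command needs root/admin):

```python
import subprocess

STOCK_LIMIT_W = 350  # reference 3090 board power; check yours with `nvidia-smi -q -d POWER`

def power_cap_cmd(gpu_index: int, fraction: float = 0.75) -> list[str]:
    """Build the nvidia-smi command that caps one GPU's power limit."""
    watts = int(STOCK_LIMIT_W * fraction)
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

if __name__ == "__main__":
    # Cap both cards to 75% of stock (262 W each). Requires admin rights.
    for idx in (0, 1):
        subprocess.run(power_cap_cmd(idx), check=True)
```

Note the limit resets on reboot unless persistence mode is on, so I run it from a startup script.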


3

> grabbing 2 used 3090s off ebay so i can get that 48gb vram pool but im worried about the power draw

Unfortunately, running a dual setup on older residential circuits is frequently a recipe for disaster. I had significant issues with power spikes tripping my breakers last year, and it was not as stable as expected. Before you commit, how many amps is that garage circuit actually rated for? Also, are you prioritizing inference speed or the total context window?
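To put numbers on the breaker question, here's the quick sanity check I'd run once you know the rating. Assumptions: US residential 120 V, the common 80% rule of thumb for continuous loads, power-capped GPU wattage, and a guessed figure for the rest of the system that you'd really want to measure at the wall:

```python
def circuit_headroom_w(breaker_amps: float, volts: float = 120.0) -> float:
    """Usable continuous watts on a circuit: 80% of the breaker rating."""
    return breaker_amps * volts * 0.8

def rig_draw_w(gpu_watts: float, n_gpus: int, rest_of_system_w: float) -> float:
    """Estimated steady-state wall draw (transient spikes go higher)."""
    return gpu_watts * n_gpus + rest_of_system_w

# Example: dual 3090s power-capped to ~262 W each, plus ~250 W for CPU/board/etc.
budget = circuit_headroom_w(15)   # a 15 A circuit -> 1440 W continuous budget
draw = rig_draw_w(262, 2, 250)    # -> 774 W estimated steady-state
print(f"budget {budget:.0f} W vs draw {draw:.0f} W")
```

Even if the steady-state math looks comfortable, 3090s are known for millisecond transient spikes well above their limit, which is exactly what trips marginal breakers.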

