RTX 3090 really is a shockingly good buy
Date: December 5th, 2025 7:25 PM
Author: https://imgur.com/a/o2g8xYK
I doubted it for a long time because it's a Samsung product. However, if you ONLY want to run local LLMs, it's actually 180. It runs Gemma 27B lightning fast. Sure, it will drink 350 W while it's doing it, but it drops right down to 5 W at idle. It's really quite efficient in that sense, because you're hardly ever hitting it. If all you want is something to host ollama, it's perfect. I know people are using AMD, but it's a nightmare, believe me. Having Nvidia means everything just works. These are like $800 on eBay last I checked. 24 GB of VRAM. There's no point in getting the Ti if you're just using it for AI.
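For anyone wondering what "host ollama" looks like in practice: ollama serves a REST API on localhost:11434 by default, and you talk to it with plain JSON. A minimal sketch below (the helper name and the `gemma2:27b` model tag are illustrative, not from the post):

```python
import json

# Default ollama endpoint; change if you bound the server elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for ollama's /api/generate endpoint."""
    return {
        "model": model,    # e.g. "gemma2:27b"
        "prompt": prompt,
        "stream": False,   # one JSON response instead of a token stream
    }

if __name__ == "__main__":
    body = build_generate_request("gemma2:27b", "Why is the sky blue?")
    # POST this with any HTTP client, e.g.:
    #   curl http://localhost:11434/api/generate -d '<this JSON>'
    print(json.dumps(body))
```

Point any HTTP client at that URL and the 3090 does the rest; nothing else on the box needs to know a GPU is involved.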
(http://www.autoadmit.com/thread.php?thread_id=5806767&forum_id=2&show=week#49487222)