The most prestigious law school admissions discussion board in the world.
Rate my PC build for large model local LLM


Date: December 11th, 2024 1:25 AM
Author: pale ape native

AMD Ryzen 9 9950X

GPU1: RTX 4090

GPU2: RTX A6000

128gb RAM

8TB NVMe 4.0

Windows 11 Pro

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435761)




Date: December 11th, 2024 2:03 AM
Author: Sticky awkward boistinker azn

why not two 4090

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435788)




Date: December 11th, 2024 2:12 AM
Author: pale ape native

My combo is more optimal for dual GPU. You can't onboard dual 4090s on one motherboard, and an eGPU throttles 50-70% of the 4090's performance due to the inherent bandwidth limits of a Thunderbolt connection. The A6000, by contrast, is optimized for low bandwidth and has 48gb of VRAM (vs 24gb for the 4090) to handle massive data sets



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435793)
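The VRAM math behind the 24gb vs 48gb point can be sketched with a back-of-the-envelope calculation. The quantization widths and the ~20% overhead factor below are rough assumptions, not measured figures:

```python
# Back-of-the-envelope VRAM check for the 4090 (24 GB) + A6000 (48 GB) combo.
# All numbers are rough assumptions, not measurements.

def model_vram_gb(params_b: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate VRAM needed: weight bytes plus ~20% assumed overhead
    for KV cache and activations."""
    weight_gb = params_b * bits_per_weight / 8  # params in billions -> GB
    return weight_gb * overhead

total_vram = 24 + 48  # GB across both cards

for params, bits in [(8, 16), (70, 4), (70, 8)]:
    need = model_vram_gb(params, bits)
    fits = "fits" if need <= total_vram else "does not fit"
    print(f"{params}B @ {bits}-bit: ~{need:.0f} GB -> {fits} in {total_vram} GB")
```

By this rough estimate a 70B model at 4-bit quantization fits in the combined 72 GB, while the same model at 8-bit does not.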




Date: December 11th, 2024 3:34 AM
Author: Sticky awkward boistinker azn

the CPU does absolutely nothing for LLMs?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435877)




Date: December 11th, 2024 5:14 AM
Author: pale ape native

CPU matters. That's why I got the best one

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435907)




Date: December 11th, 2024 8:22 AM
Author: aphrodisiac church building

There's something called CPU offloading, where some of the processing is handled by the CPU. But the kinds of operations used in LLMs are different from the kinds of operations at which CPUs excel.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48436129)
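The offloading idea above can be sketched as a simple split: keep as many layers as fit in VRAM on the GPU and spill the rest to system RAM (tools like llama.cpp expose this via the --n-gpu-layers flag). The layer size and VRAM budget below are invented round numbers:

```python
# Toy illustration of CPU offloading: given a fixed VRAM budget, keep as many
# transformer layers on the GPU as fit and offload the rest to system RAM.
# Layer size and budget are invented round numbers, not real measurements.

def split_layers(n_layers: int, gb_per_layer: float, vram_gb: float) -> tuple[int, int]:
    on_gpu = min(n_layers, int(vram_gb // gb_per_layer))
    return on_gpu, n_layers - on_gpu

gpu_layers, cpu_layers = split_layers(n_layers=80, gb_per_layer=0.55, vram_gb=24)
print(f"{gpu_layers} layers on GPU, {cpu_layers} offloaded to CPU")
```

The layers left on the CPU run much slower, which is why generation speed falls off sharply once a model no longer fits entirely in VRAM.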




Date: December 11th, 2024 1:36 PM
Author: pale ape native

Do you think Apple's future M4 max/ultra cpu will be better at LLM?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437276)




Date: December 11th, 2024 2:57 PM
Author: aphrodisiac church building

no

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437673)




Date: January 8th, 2025 9:20 PM
Author: sexy haunted graveyard elastic band

M1 Ultra is still plenty

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533861)




Date: December 11th, 2024 1:34 AM
Author: Sooty electric sandwich double fault

i have similar but just for poasting

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435773)




Date: December 11th, 2024 2:14 AM
Author: pale ape native



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435795)




Date: December 11th, 2024 3:05 PM
Author: aquamarine stirring center



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437693)




Date: December 11th, 2024 9:07 PM
Author: irradiated office



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438902)




Date: December 11th, 2024 11:24 AM
Author: Razzmatazz institution trump supporter

Where are you finding 4090s for sale that aren't jacked up 2x MSRP by scalpers?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48436665)




Date: December 11th, 2024 1:33 PM
Author: pale ape native

I got it months ago and am just putting it together now.

Are prices on 4090 still that bad? I thought 5090 and 5080 were about to drop any day now

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437259)




Date: December 11th, 2024 1:34 PM
Author: razzle-dazzle ladyboy

For work?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437264)




Date: December 11th, 2024 8:56 PM
Author: pale ape native

Yep

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438874)




Date: December 11th, 2024 1:35 PM
Author: crimson plaza

Is setting up a local LLM pretty easy? Does it need coding savvy?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437271)




Date: December 11th, 2024 1:36 PM
Author: razzle-dazzle ladyboy

really easy with ollama, https://ollama.com/

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437274)
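Beyond the CLI, Ollama also listens on a local REST API (port 11434 by default), so a setup like this can be scripted with nothing but the standard library. A minimal sketch; the model name and prompt are just placeholders, and actually calling `ask` requires `ollama serve` running with a pulled model:

```python
# Minimal sketch of talking to a local Ollama server over its REST API.
# Assumes the default endpoint http://localhost:11434; no third-party packages.
import json
import urllib.request

def build_request(model: str, prompt: str) -> urllib.request.Request:
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    # Sends the prompt and returns the generated text from the JSON response.
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example usage, with a server running and a model pulled via `ollama pull llama3`:
# print(ask("llama3", "Summarize the parol evidence rule in one sentence."))
```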




Date: December 11th, 2024 1:38 PM
Author: aquamarine stirring center



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437286)




Date: December 11th, 2024 3:40 PM
Author: heady coral nowag crackhouse

whats the benefit of running these locally vs just subscribing to claude or whatever



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437815)




Date: December 11th, 2024 3:45 PM
Author: razzle-dazzle ladyboy

the ones running locally are not as powerful and don't support really long context windows. the main benefits are no limits on the number of requests, and you can find uncensored models.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437832)




Date: December 11th, 2024 3:55 PM
Author: heady coral nowag crackhouse

does the hardware above make it practical to train your own? could you train it on just a giant dump of legal cases and get one superior at legal research at the expense of everything else, or is something better already available commercially at this point?



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437869)




Date: December 11th, 2024 4:38 PM
Author: pale ape native

1) Yes, the above hardware can easily handle large models and is possibly even overkill for CRM, basic legal queries/prompts about a client's file, and drafting complex legal briefs

2) A local LLM on Win11 Pro (eg, ollama, GPT4All, Chat RTX) is the ONLY option for attorneys to do this offline, given client data sensitivity

3) if you spent $10k on a fully spec'd-out Mac Pro/Studio with 192gb unified RAM, it would not come close to this setup for local LLM work in terms of speed and the ability to handle large data and complex writing tasks

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438017)




Date: December 11th, 2024 5:23 PM
Author: razzle-dazzle ladyboy

0 chance you will be able to train your own LLM; these companies spend tens of millions of $$$ on GPU hardware to train one.

The best you can do locally is "retrieval augmented generation" (RAG) that is specific to legal cases/research.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438197)
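A minimal sketch of what RAG means in practice: retrieve the snippets most relevant to the question, then stuff them into the prompt. This toy version ranks by word overlap; a real setup would use embeddings and a vector store, and the case snippets here are purely illustrative:

```python
# Toy retrieval-augmented generation: rank case snippets by word overlap
# with the question, then build a prompt from the best matches.
# A real pipeline would use embeddings instead of word overlap.

def score(query: str, doc: str) -> int:
    # Number of words the query and document share (crude relevance proxy).
    return len(set(query.lower().split()) & set(doc.lower().split()))

def build_prompt(query: str, corpus: list[str], k: int = 2) -> str:
    top = sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]
    context = "\n".join(f"- {d}" for d in top)
    return f"Use only these sources:\n{context}\n\nQuestion: {query}"

cases = [
    "Hadley v Baxendale: consequential damages must be foreseeable",
    "Pennoyer v Neff: personal jurisdiction requires in-state service",
    "Lucy v Zehmer: outward manifestation of assent forms a contract",
]
print(build_prompt("when are consequential damages foreseeable", cases))
```

The resulting prompt would then be sent to the local model; the point is that the model never needs retraining, it just gets handed the relevant material at query time.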




Date: December 11th, 2024 1:37 PM
Author: Histrionic Plum Striped Hyena

You don’t need this much power just to take notes/exams at NYU Law tax classes.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437283)




Date: December 11th, 2024 8:48 PM
Author: pale ape native

163

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438847)




Date: December 11th, 2024 10:29 PM
Author: irradiated office



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439148)




Date: December 11th, 2024 10:32 PM
Author: autistic resort



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439153)




Date: December 12th, 2024 12:12 PM
Author: irradiated office



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440746)




Date: December 11th, 2024 4:41 PM
Author: vivacious trailer park telephone

how much did it all come to? will a local llm be capable of translating from exotic languages?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438026)




Date: December 11th, 2024 8:52 PM
Author: pale ape native

Under $5k but I didn't buy a6000 card yet

Those fuckers are expensive!

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438854)




Date: December 11th, 2024 5:25 PM
Author: peach regret

how many watts does the PSU have?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438199)




Date: December 11th, 2024 8:50 PM
Author: pale ape native

170

Edit: this is the CPU watts

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438849)




Date: December 11th, 2024 9:06 PM
Author: peach regret

that's a bit low, I don't think u have actually built a computer.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438901)




Date: December 11th, 2024 9:15 PM
Author: pale ape native

Dark Side of the Moon 1350W PSU, Advanced Liquid Cooled CPU & Clear Side Panel

Item number: 321-BIKW

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438939)
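For reference, a rough power budget for the listed parts against that 1350W unit. The component draws are approximate board-power figures, not measurements:

```python
# Rough power budget for the listed parts against a 1350 W PSU.
# Component draws are approximate board-power figures, not measurements.

draws_w = {
    "RTX 4090": 450,
    "RTX A6000": 300,
    "Ryzen 9 9950X": 170,        # the "170" above is the CPU package power
    "board/RAM/NVMe/fans": 100,  # lumped estimate for everything else
}

total = sum(draws_w.values())
headroom = 1350 - total
print(f"estimated draw {total} W, headroom {headroom} W on a 1350 W PSU")
```

By these estimates the dual-GPU build draws around 1020 W under full load, leaving a few hundred watts of headroom, though transient spikes on a 4090 can eat into that.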




Date: December 12th, 2024 8:50 AM
Author: peach regret

You are using an OEM Alienware part for your PSU? How does that make sense?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440037)




Date: December 12th, 2024 8:54 AM
Author: maroon godawful goyim

OP is growing a beard and changing monikers as we speak

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440043)




Date: December 12th, 2024 10:17 AM
Author: pale ape native



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440299)




Date: December 11th, 2024 8:53 PM
Author: ocher area depressive

I really want to put together my own local LLM setup too. Tagging this thread for later. 180

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438860)




Date: December 11th, 2024 8:53 PM
Author: pale ape native

We should start a discord or something

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438863)




Date: December 11th, 2024 8:59 PM
Author: ocher area depressive

I'm down, I'm still a total noob at this I haven't even started to do my own research on what hardware I'll need

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438883)




Date: December 11th, 2024 9:01 PM
Author: pale ape native

If you have an M2 or later MacBook/mini you can install ollama or lm studio right now on an external ssd and start fucking around

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438889)




Date: December 11th, 2024 10:31 PM
Author: Sticky awkward boistinker azn

can u show us what u can do with ur setup

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439150)




Date: December 12th, 2024 10:23 AM
Author: Yellow Adventurous Piazza

What will you use it for? How is it better than just getting a subscription to Claude/ChatGPT?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440310)




Date: December 12th, 2024 10:58 AM
Author: pale ape native

Re 1st question: scaling and streamlining CRM and customer service; reducing bottlenecks and automating as much of the client lifecycle as possible

Re 2nd question: mainly compliance with HIPAA, SCA, ECPA, CPRA, etc.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440474)




Date: December 12th, 2024 11:04 AM
Author: Sticky awkward boistinker azn

link to wat u can get from ur setup

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440496)




Date: December 12th, 2024 1:06 PM
Author: maroon godawful goyim

automating as much of the client lifecycle as possible

tptptptp

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48441000)




Date: January 8th, 2025 8:34 PM
Author: Razzmatazz institution trump supporter

Bump for 5090, how many are you getting OP?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533708)




Date: January 8th, 2025 9:08 PM
Author: crimson plaza

Isn't it just easier to buy a Mac Studio or whatever it's called? Supposedly it's very good for private LLM stuff

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533823)




Date: January 28th, 2025 3:12 PM
Author: Soul-stirring irate property

You can use this for DeepSeek now!

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48598692)