The most prestigious law school admissions discussion board in the world.

Rate my PC build for large model local LLM

Date: December 11th, 2024 1:25 AM
Author: Bistre Bawdyhouse Filthpig

AMD Ryzen 9 9950X

GPU1: RTX 4090

GPU2: RTX A6000

128gb RAM

8TB NVMe 4.0

Windows 11 Pro

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435761)




Date: December 11th, 2024 2:03 AM
Author: Harsh bearded stage

why not two 4090

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435788)




Date: December 11th, 2024 2:12 AM
Author: Bistre Bawdyhouse Filthpig

My combo is more optimal for dual GPU: you can't onboard dual 4090s on one motherboard, and an eGPU throttles 50-70% of a 4090's performance due to the inherent bandwidth limits of the Thunderbolt connection, whereas the A6000 is optimized for low bandwidth and has 48gb VRAM (vs 24gb for the 4090) to handle massive data sets



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435793)




Date: December 11th, 2024 3:34 AM
Author: Harsh bearded stage

the CPU absolutely does nothing for LLM?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435877)




Date: December 11th, 2024 5:14 AM
Author: Bistre Bawdyhouse Filthpig

CPU matters. That's why I got the best one

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435907)




Date: December 11th, 2024 8:22 AM
Author: stirring maniacal codepig

There's something called CPU offloading, where some processing is handled by the CPU. But the kinds of operations used in LLMs are different from the kinds of operations at which CPUs excel.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48436129)
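For anyone curious what "CPU offloading" means in practice: inference runtimes split a model's layers between GPU VRAM and system RAM, and the fewer layers fit on the GPU the slower it goes. A rough back-of-the-envelope sketch of that split (the function and numbers here are illustrative, not from any particular library):

```python
def layers_on_gpu(vram_gb, n_layers, model_size_gb, reserve_gb=2.0):
    """Estimate how many transformer layers fit in VRAM, leaving
    headroom for the KV cache and CUDA overhead. Layers that don't
    fit are offloaded to the CPU/system RAM."""
    per_layer_gb = model_size_gb / n_layers
    usable = max(vram_gb - reserve_gb, 0)
    return min(n_layers, int(usable / per_layer_gb))

# A ~39 GB 4-bit 70B model with 80 layers on a 24 GB 4090:
print(layers_on_gpu(24, 80, 39))  # -> 45, the other 35 run on the CPU
```

Runtimes like llama.cpp expose this as a layer-count knob; the point is that anything spilled to the CPU runs at RAM bandwidth, not VRAM bandwidth.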




Date: December 11th, 2024 1:36 PM
Author: Bistre Bawdyhouse Filthpig

Do you think Apple's future M4 max/ultra cpu will be better at LLM?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437276)




Date: December 11th, 2024 2:57 PM
Author: stirring maniacal codepig

no

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437673)




Date: January 8th, 2025 9:20 PM
Author: https://imgur.com/a/o2g8xYK


M1 Ultra is still plenty

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533861)




Date: December 11th, 2024 1:34 AM
Author: Mahogany associate

i have similar but just for poasting

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435773)




Date: December 11th, 2024 2:14 AM
Author: Bistre Bawdyhouse Filthpig



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48435795)




Date: December 11th, 2024 3:05 PM
Author: Sticky metal messiness travel guidebook



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437693)




Date: December 11th, 2024 9:07 PM
Author: vermilion odious abode legal warrant



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438902)




Date: December 11th, 2024 11:24 AM
Author: racy olive partner university

Where are you finding 4090s for sale that aren't jacked up 2x MSRP by scalpers?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48436665)




Date: December 11th, 2024 1:33 PM
Author: Bistre Bawdyhouse Filthpig

I got it months ago and just putting together now.

Are prices on 4090 still that bad? I thought 5090 and 5080 were about to drop any day now

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437259)




Date: December 11th, 2024 1:34 PM
Author: carmine cracking base patrolman

For work?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437264)




Date: December 11th, 2024 8:56 PM
Author: Bistre Bawdyhouse Filthpig

Yep

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438874)




Date: December 11th, 2024 1:35 PM
Author: wild casino

Is setting up a local LLM pretty easy? Does it need coding savvy?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437271)




Date: December 11th, 2024 1:36 PM
Author: carmine cracking base patrolman

really easy with ollama, https://ollama.com/

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437274)
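To back this up: once ollama is installed it runs a local HTTP server (by default on port 11434) that you talk to with plain JSON, so "coding savvy" amounts to one POST request. A minimal sketch, assuming ollama's documented /api/generate endpoint and a pulled model named "llama3.1":

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default local endpoint

def build_request(model, prompt, stream=False):
    """Build the JSON body that ollama's /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_request("llama3.1", "Summarize the parol evidence rule.")
# POST `body` to OLLAMA_URL (with requests or urllib) once `ollama serve` is running
```

Or skip code entirely and just type `ollama run llama3.1` in a terminal for an interactive chat.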




Date: December 11th, 2024 1:38 PM
Author: Sticky metal messiness travel guidebook



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437286)




Date: December 11th, 2024 3:40 PM
Author: sepia unhinged stag film newt

whats the benefit of running these locally vs just subscribing to claude or whatever



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437815)




Date: December 11th, 2024 3:45 PM
Author: carmine cracking base patrolman

the ones running locally are not as powerful and don't accept really long context windows. the main benefit is no limits to the number of requests and you can find uncensored models.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437832)




Date: December 11th, 2024 3:55 PM
Author: sepia unhinged stag film newt

does the hardware above make it practical to train your own? could you train one on just a giant dump of legal cases and get one superior at legal research at the expense of everything else, or is something better already available commercially at this point?



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437869)




Date: December 11th, 2024 4:38 PM
Author: Bistre Bawdyhouse Filthpig

1) Yes the above hardware can easily handle large models and is possibly even overkill for CRM, basic legal queries / prompts about client's file, and drafting complex legal briefs

2) Local llm with win10 Pro (eg, ollama, GPT4All, chat rtx) is the ONLY option for attorneys to do this offline given client data sensitivity

3) if you spent $10k on a fully spec'd out Mac Pro/Studio with 192gb unified ram it would not come close to this setup for local LLM in terms of speed and ability to handle large data and complex writing tasks

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438017)
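A quick sanity check on whether a build "easily handles large models": the weights alone need roughly params × bits-per-weight / 8 bytes, plus overhead for the KV cache and activations. A rough estimator (the 20% overhead figure is an assumption, not a spec):

```python
def model_vram_gb(n_params_b, bits_per_weight, overhead=1.2):
    """Approximate memory (GB) to load a model: billions of params
    times bits per weight, divided by 8, plus ~20% for KV cache
    and activations."""
    return n_params_b * bits_per_weight / 8 * overhead

# A 70B model at 4-bit quantization vs full fp16:
print(round(model_vram_gb(70, 4), 1))   # -> 42.0 GB: fits across a 4090 (24) + A6000 (48)
print(round(model_vram_gb(70, 16), 1))  # -> 168.0 GB: doesn't fit, would need offloading
```

So the 24 + 48 GB of VRAM here comfortably runs quantized 70B-class models, which squares with the "possibly even overkill" claim for drafting and query tasks.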




Date: December 11th, 2024 5:23 PM
Author: carmine cracking base patrolman

0 chance you will be able to train your own LLM, these companies spend tens of millions of $$$ on GPU hardware to train an LLM.

The best you can do locally is "retrieval augmented generation" that is specific to legal cases/research.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438197)
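The core of RAG is just "find the most relevant documents, paste them into the prompt." A toy sketch of the retrieval step, using bag-of-words cosine similarity as a stand-in for the embedding vectors a real pipeline would use (the case snippets are made up for illustration):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity of two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k docs most similar to the query; in a real RAG
    pipeline these get prepended to the LLM prompt as context."""
    q = Counter(query.lower().split())
    scored = sorted(docs, key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return scored[:k]

cases = [
    "parol evidence rule bars extrinsic evidence of prior agreements",
    "rule against perpetuities voids remote contingent interests",
]
print(retrieve("extrinsic evidence of a prior agreement", cases))
```

Real setups swap the Counter for embeddings from a model and a vector store, but the shape of the pipeline is exactly this: retrieve, then generate.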




Date: December 11th, 2024 1:37 PM
Author: Spruce soul-stirring range

You don’t need this much power just to take notes/exams at NYU Law tax classes.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48437283)




Date: December 11th, 2024 8:48 PM
Author: Bistre Bawdyhouse Filthpig

163

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438847)




Date: December 11th, 2024 10:29 PM
Author: vermilion odious abode legal warrant



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439148)




Date: December 11th, 2024 10:32 PM
Author: Floppy abusive gaming laptop



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439153)




Date: December 12th, 2024 12:12 PM
Author: vermilion odious abode legal warrant



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440746)




Date: December 11th, 2024 4:41 PM
Author: concupiscible turquoise pervert

how much did it all come to? will a local llm be capable of translating from exotic languages?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438026)




Date: December 11th, 2024 8:52 PM
Author: Bistre Bawdyhouse Filthpig

Under $5k but I didn't buy a6000 card yet

Those fuckers are expensive!

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438854)




Date: December 11th, 2024 5:25 PM
Author: medicated fluffy scourge upon the earth clown

how many watts does the PSU have?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438199)




Date: December 11th, 2024 8:50 PM
Author: Bistre Bawdyhouse Filthpig

170

Edit: this is the CPU watts

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438849)




Date: December 11th, 2024 9:06 PM
Author: medicated fluffy scourge upon the earth clown

that's a bit low, I don't think u have actually built a computer.

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438901)




Date: December 11th, 2024 9:15 PM
Author: Bistre Bawdyhouse Filthpig

Dark Side of the Moon 1350W PSU, Advanced Liquid Cooled CPU & Clear Side Panel

Item number: 321-BIKW

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438939)




Date: December 12th, 2024 8:50 AM
Author: medicated fluffy scourge upon the earth clown

You are using an OEM Alienware part for your PSU? How does that make sense?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440037)




Date: December 12th, 2024 8:54 AM
Author: Maroon market

OP is growing a beard and changing monikers as we speak

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440043)




Date: December 12th, 2024 10:17 AM
Author: Bistre Bawdyhouse Filthpig



(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440299)




Date: December 11th, 2024 8:53 PM
Author: Lavender principal's office dragon

I really want to put together my own local LLM setup too. Tagging this thread for later. 180

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438860)




Date: December 11th, 2024 8:53 PM
Author: Bistre Bawdyhouse Filthpig

We should start a discord or something

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438863)




Date: December 11th, 2024 8:59 PM
Author: Lavender principal's office dragon

I'm down, I'm still a total noob at this I haven't even started to do my own research on what hardware I'll need

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438883)




Date: December 11th, 2024 9:01 PM
Author: Bistre Bawdyhouse Filthpig

If you have an M2 or later MacBook/mini you can install ollama or lm studio right now on an external ssd and start fucking around

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48438889)




Date: December 11th, 2024 10:31 PM
Author: Harsh bearded stage

can u show us what u can do with ur setup

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48439150)




Date: December 12th, 2024 10:23 AM
Author: Garnet corner dog poop

What will you use it for? How is it better than just getting a subscription to Claude/ChatGPT?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440310)




Date: December 12th, 2024 10:58 AM
Author: Bistre Bawdyhouse Filthpig

Re 1st question: scaling and streamlining CRM and customer service; reducing bottlenecks, automating as much of the client lifecycle as possible

Re 2nd question: mainly compliance with HIPAA, SCA, ECPA, CPRA etc

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440474)




Date: December 12th, 2024 11:04 AM
Author: Harsh bearded stage

link to wat u can get from ur setup

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48440496)




Date: December 12th, 2024 1:06 PM
Author: Maroon market

automating as much of the client lifecycle as possible

tptptptp

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48441000)




Date: January 8th, 2025 8:34 PM
Author: pitbulls eating your face in hell forever tp

Bump for 5090, how many are you getting OP?

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533708)




Date: January 8th, 2025 9:08 PM
Author: imagine if the races were reversed

Isn't it just easier to buy a Mac Studio or whatever it's called? Supposedly it's very good for private LLM stuff

(http://www.autoadmit.com/thread.php?thread_id=5647806&forum_id=2:#48533823)