  The most prestigious law school admissions discussion board in the world.

should i dump like 50k into a local LLM setup


Date: December 17th, 2025 1:29 PM
Author: peach bespoke station feces

we're gonna get fucking rugged on this man it's actually crazy

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516827)




Date: December 17th, 2025 1:30 PM
Author: Angry territorial pozpig

https://youtu.be/t_hh2-KG6Bw?si=Msowi7NeNeCH3LhX

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516833)




Date: December 17th, 2025 1:32 PM
Author: peach bespoke station feces

that shit probably sucks though

local setups are pointless if you have to wait 60 seconds for it to do anything
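
Rough arithmetic on where a 60-second wait comes from, as a sketch; the tokens/sec figures below are illustrative assumptions, not benchmarks:

# Back-of-envelope latency for one chat turn: prompt processing (prefill)
# plus reply generation (decode). Throughput numbers are assumed, not measured.
def turn_latency_s(prompt_tokens, output_tokens, prefill_tps, decode_tps):
    return prompt_tokens / prefill_tps + output_tokens / decode_tps

# e.g. a 2k-token prompt and a 400-token answer:
local = turn_latency_s(2000, 400, prefill_tps=300, decode_tps=8)     # ~57 s on a modest local box
hosted = turn_latency_s(2000, 400, prefill_tps=5000, decode_tps=60)  # ~7 s on a hosted endpoint
print(f"local ~{local:.0f}s, hosted ~{hosted:.0f}s")

Decode speed is the part you feel in chat, and it is largely bound by memory bandwidth, which is exactly where budget local builds fall short.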

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516838)




Date: December 17th, 2025 1:33 PM
Author: Angry territorial pozpig

You can't get that much VRAM for $50k
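
For scale, weight memory alone is roughly parameter count times bytes per parameter; a minimal sketch (the model sizes and quantization levels are just illustrative):

# Rough VRAM needed just to hold model weights, ignoring KV cache and activations.
def weights_gb(params_billions, bits_per_param):
    return params_billions * (bits_per_param / 8)  # decimal GB

for params, bits in [(27, 16), (70, 16), (70, 4), (405, 8), (671, 4)]:
    print(f"{params}B @ {bits}-bit ≈ {weights_gb(params, bits):.0f} GB")
# 27B @ 16-bit ≈ 54 GB, 70B @ 16-bit ≈ 140 GB, 70B @ 4-bit ≈ 35 GB, 671B @ 4-bit ≈ 336 GB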

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516843)




Date: December 17th, 2025 1:33 PM
Author: arousing nowag parlour

no lol there is still a huge disparity in the speed/quality of frontier models vs what you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool

self-hosted llms are great for niche purposes (like they can basically solve the problem of Searching your OS being Awful), but those models are going to be so small you won't need much hardware to run them.
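
For that OS-search niche specifically, a tiny embedding model is enough; a minimal sketch with sentence-transformers (the model name, file glob, and query are assumptions for illustration):

# Minimal local semantic file search: embed paths once, then rank them against
# a natural-language query by cosine similarity. Runs fine on CPU.
from pathlib import Path
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small (~80 MB) embedding model

docs = [str(p) for p in Path.home().rglob("*.pdf")]  # whatever you want indexed
doc_emb = model.encode(docs, convert_to_tensor=True, normalize_embeddings=True)

query = "that engagement letter from the antitrust matter"
q_emb = model.encode(query, convert_to_tensor=True, normalize_embeddings=True)

for hit in util.semantic_search(q_emb, doc_emb, top_k=5)[0]:
    print(f"{hit['score']:.2f}  {docs[hit['corpus_id']]}")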

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516844)




Date: December 17th, 2025 1:33 PM
Author: Angry territorial pozpig

(no links)

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516846)




Date: December 17th, 2025 1:35 PM
Author: arousing nowag parlour

what are you Doing with your Local Models, champ? go ahead, sell me

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516852)




Date: December 17th, 2025 1:36 PM
Author: Angry territorial pozpig

Once in a while I argue with Gemma 27b about antitrust law
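
That kind of use fits on a single box; a minimal sketch against a locally running Ollama server (the model tag, default port, and prompt are assumptions, and the model has to be pulled first):

# One chat turn with a locally served Gemma 27B via Ollama's REST API,
# e.g. after `ollama pull gemma2:27b`, with the server on its default port.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "gemma2:27b",
        "messages": [{"role": "user",
                      "content": "Steelman the market-definition argument in a tying case."}],
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])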

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516859)




Date: December 17th, 2025 1:45 PM
Author: arousing nowag parlour



(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516911)




Date: December 17th, 2025 1:37 PM
Author: cowardly gay queen of the night

Real-time infrared target acquisition & control

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516863)




Date: December 17th, 2025 1:57 PM
Author: arousing nowag parlour

you don't need frontier models for specialized use cases like that though; there's already been crazy progress with stuff like qwen3-vl. I'm talking about the dweebs trying to run 500B-1T parameter models at 30 tokens/s or whatever locally just to do the same shit they would use chatgpt for

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516951)




Date: December 17th, 2025 2:06 PM
Author: Angry territorial pozpig

For coding there's no reason to do it locally. You need extra RAM just for the long-ass context windows coding requires.

For anything else there are good reasons to keep it off the cloud and not have it linked to your credit card.
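
On the context-window point: the extra memory is mostly the KV cache, which grows linearly with context length. A rough sketch (the model geometry is illustrative, not any particular model):

# KV-cache size ≈ 2 (K and V) * layers * kv_heads * head_dim * context_len * bytes_per_elem
def kv_cache_gb(layers, kv_heads, head_dim, context_len, bytes_per_elem=2):
    return 2 * layers * kv_heads * head_dim * context_len * bytes_per_elem / 1e9

# e.g. a mid-size dense model with grouped-query attention at a 128k-token coding context:
print(f"{kv_cache_gb(layers=60, kv_heads=8, head_dim=128, context_len=128_000):.1f} GB")
# ≈ 31.5 GB on top of the weights, just to hold the context at fp16.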

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516971)




Date: December 17th, 2025 1:38 PM
Author: peach bespoke station feces

they are going to keep neutering frontier models until the only thing they are able to do is blurt out that white people are evil and make porn videos and boomer bait AI slop

imo the best argument against investing in a local setup is that hardware will keep getting better - but now that commercial hardware access is getting rugpulled, i'm starting to seriously worry

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516874)




Date: December 17th, 2025 1:41 PM
Author: cowardly gay queen of the night

I think there's going to be some big money in decommissioning & recycling old equipment at data centers. Many contracts require them to literally shred the equipment they lease, rather than resell it. Huge black market opportunity.

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516891)




Date: December 17th, 2025 1:43 PM
Author: peach bespoke station feces

interesting

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516904)




Date: December 17th, 2025 2:03 PM
Author: Angry territorial pozpig

Nope. The old shit that's getting decommissioned will be stripped for VRAM, and the new stuff is all HBM and not suitable for home use. We'll be buying off-brand recycled GPUs on AliExpress if we're lucky

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516963)




Date: December 17th, 2025 2:10 PM
Author: cowardly gay queen of the night

Jfc kikes

110 will happen sooner than you think.

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516983)




Date: December 17th, 2025 1:41 PM
Author: Angry territorial pozpig

They can't stop China from releasing optimized-as-fuck OSS models that require 30% less compute. However China is just as fucked as we are when it comes to supply of basic PC components. Maybe more so.

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516892)




Date: December 17th, 2025 1:37 PM
Author: Purple location

This might actually be the thing to get me to kms

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516865)




Date: December 17th, 2025 1:39 PM
Author: Charismatic high-end property digit ratio

no

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516882)




Date: December 17th, 2025 1:41 PM
Author: Concupiscible Theatre

for the most part you're not going to get Biglaw out of an LLM unless it's NYU Tax and you already came from a decent law school

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516887)




Date: December 17th, 2025 2:12 PM
Author: Big Lime Wrinkle Heaven

seems unlikely that llms can do pure math (they can, and they score better than most experts on many metrics) but somehow can't do Biglaw work, considering pure math and theoretical physics are exponentially more difficult. I think you guys are just pretending it can't for ego purposes. Are you sure you aren't just expecting some perfect output in one prompt instead of using it to make several different processes and tasks much easier in a way that compounds into basically not having to try anymore?

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516989)




Date: December 17th, 2025 1:42 PM
Author: provocative clear doctorate

just do what pewdiepie did

https://www.youtube.com/watch?v=qw4fDU18RcU

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516899)




Date: December 17th, 2025 2:01 PM
Author: geriatric stead filthpig

based

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516957)




Date: December 17th, 2025 2:09 PM
Author: peach bespoke station feces

gotta spend even more than he did to get an actually useful setup though

(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&hid=#49516976)