  The most prestigious law school admissions discussion board in the world.

So can any of you NERDS explain how you "train" a local AI? You just buy expensi


Date: February 21st, 2026 10:07 PM
Author: average/ordinary/typical citizen/person

Expensive ass graphics card and then what? Download some shit and it starts training AI?

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49685993)




Date: February 21st, 2026 10:20 PM
Author: robot daddy

no, it's more complicated than that. you don't do full training yourself at this point, i.e. pre-training across billions of tokens from scratch. that's something frontier labs do; it takes distributed training, tons of GPU hours, millions of dollars. you can train your own small models, but most of what people do today is fine-tune big models. I wouldn't do the training locally either. consumer GPUs aren't optimized for that; you use a cloud server GPU, you pay for the VM, deploy it, and they charge you per hour while it's running. train it on the cloud, then basically you pull the model directory + weights down to your local system. whether your local computer can run it depends on how big and complex the model is. I don't know as much about that part, I have never run anything directly on a local machine. I can tell you how to set up cloud servers and how to set up a CLI runner on them that plugs into hugging face transformers etc, but I don't know exactly how to import it locally; it would take actual experimentation for me to figure it out
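
fwiw the cloud-side runner looks roughly like this (a sketch from memory, untested; the base model name, dataset path, and hyperparameters are just placeholders):

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base = "meta-llama/Llama-3.1-8B"                      # placeholder base model
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token

# load the base model on the cloud GPU and wrap it with a LoRA adapter
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# plaintext corpus, one example per line (placeholder path)
ds = load_dataset("text", data_files="corpus.txt")["train"]
ds = ds.map(lambda x: tok(x["text"], truncation=True, max_length=1024), remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=2,
                           gradient_accumulation_steps=8, num_train_epochs=3,
                           learning_rate=2e-4),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),   # causal LM, no masking
)
trainer.train()

# this output directory (adapter weights + tokenizer) is what you copy down to your own machine
trainer.save_model("out/finetuned")
tok.save_pretrained("out/finetuned")

the local half is supposedly just AutoModelForCausalLM.from_pretrained pointed at the copied directory with device_map="auto", but like I said I haven't actually run that part myself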

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686010)




Date: February 21st, 2026 11:04 PM
Author: just say the word



(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686073)




Date: February 22nd, 2026 1:27 AM
Author: average/ordinary/typical citizen/person

And do what with it

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686201)




Date: February 22nd, 2026 1:30 AM
Author: Patel Philippe

There are a lot of reasons people need a local LLM (but used in parallel with cloud AI for reasoning)

Do you think the guy who is working on the next Coke product or designing the next iPhone is chatting with Claude or using MCPs? Now imagine an enterprise org holding millions of people's SSN/DOB and ACH info

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686205)




Date: February 22nd, 2026 1:31 AM
Author: average/ordinary/typical citizen/person

Ok I'm imagining. Now what

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686208)




Date: February 22nd, 2026 1:32 AM
Author: Patel Philippe

Start using Claude and ask the same questions you're asking here

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686211)




Date: February 22nd, 2026 1:33 AM
Author: average/ordinary/typical citizen/person

No

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686213)




Date: February 21st, 2026 10:20 PM
Author: cr friend

Use Axolotl for training. Train a QLoRA in 8-bit and format your dataset as plaintext; you also need to establish a prompt format
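
the config is roughly this shape (a sketch from memory, so double-check the field names against the examples in the axolotl repo; the base model and dataset path are placeholders):

base_model: meta-llama/Llama-3.1-8B      # placeholder
load_in_8bit: true                       # 8-bit quantized base; axolotl uses "qlora" for the 4-bit variant
adapter: lora
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
lora_target_linear: true

datasets:
  - path: data/corpus.jsonl              # placeholder, one {"text": "..."} object per line
    type: completion                     # plaintext/completion format, no chat template

sequence_len: 2048
micro_batch_size: 2
gradient_accumulation_steps: 8
num_epochs: 3
learning_rate: 0.0002
optimizer: adamw_bnb_8bit
lr_scheduler: cosine
output_dir: ./out/my-lora

then launch it with accelerate launch -m axolotl.cli.train config.yml (newer releases also ship an axolotl train entrypoint), no python runner file needed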

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686012)




Date: February 21st, 2026 10:31 PM
Author: robot daddy

I have never used that. Isn't it supposed to be config-file based, with no need to write a runner file w/ python, torch etc? Is it good? Or are there limitations?

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686022)




Date: February 22nd, 2026 1:33 AM
Author: gibberish (?)

Yeah I think that's where you get Bitcoin too

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686212)




Date: February 22nd, 2026 1:34 AM
Author: average/ordinary/typical citizen/person

You'd think AI would be DEVVING thousands of entirely new groundbreaking L1 blockchains every day. But no one is doing this. They're focusing on personal assistant agents

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686215)




Date: February 22nd, 2026 1:37 AM
Author: gibberish (?)

Not flame, the ban on porn usage is a dumb artificial throttle. If you want true innovation you need to have cumming as an end goal.

(http://www.autoadmit.com/thread.php?thread_id=5837098&forum_id=2&mark_id=5310844#49686219)