should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: Cracking Church
no lol there is still a huge disparity in the speed/quality of frontier models vs what you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool

self-hosted llms are great for niche purposes (like they can basically solve the problem of Searching your OS being Awful), but models for those are going to be so small you won't need much hardware to run them.
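To make the "fix OS search" point concrete: the core loop is just "embed every file, embed the query, rank by similarity," and a small local model is plenty for that. Here's a minimal sketch of that pipeline; the bag-of-words `embed` is a toy stand-in for a real local embedding model (e.g. one served via llama.cpp or sentence-transformers), and the file paths/contents are made up for illustration.

```python
# Toy sketch of local semantic file search: embed docs + query, rank by
# cosine similarity. embed() is a crude stand-in for a small local
# embedding model; swap it out for real embeddings in practice.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # stand-in "embedding": token counts (a real setup would call a model)
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, docs: dict[str, str], k: int = 3) -> list[str]:
    qv = embed(query)
    ranked = sorted(docs, key=lambda p: cosine(qv, embed(docs[p])), reverse=True)
    return ranked[:k]

# hypothetical local files, indexed as {path: contents}
docs = {
    "~/taxes/2024_return.txt": "federal tax return forms income deductions",
    "~/notes/llm_build.txt": "gpu vram budget for local llm inference rig",
    "~/recipes/chili.txt": "beef beans tomato chili recipe slow cooker",
}
print(search("how much vram for a local model", docs, k=1))
# → ['~/notes/llm_build.txt']
```

The point of the structure: the embedding model is the only piece that needs an LLM at all, and it can be a tiny one that runs fine on modest hardware, which is the poster's argument against a $50k rig for this use case.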
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&mark_id=5310864#49516844)
Date: December 17th, 2025 1:42 PM Author: ungodly sable station alpha
just do what pewdiepie did
https://www.youtube.com/watch?v=qw4fDU18RcU
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2&mark_id=5310864#49516899)