should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: white vibrant abode
no lol, there is still a huge disparity in the speed/quality of frontier models vs. what can be self-hosted. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted llms are great for niche purposes (like they can basically solve the problem of Searching your OS being Awful), but the models for those tasks are going to be so small you won't need much hardware to run them.
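The "searching your OS" niche above can be sketched without any big hardware. In a real setup a small quantized local embedding model (served by something like Ollama) would produce the vectors; as a self-contained stand-in, here is plain bag-of-words cosine similarity over filenames, which illustrates the same retrieve-by-meaning-of-query idea:

```python
import math
from collections import Counter

def vectorize(text):
    # Bag-of-words stand-in; a real local setup would call a small
    # embedding model here instead (e.g. one served by Ollama).
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    # Rank candidate filenames by similarity to the query.
    q = vectorize(query)
    return max(docs, key=lambda d: cosine(q, vectorize(d)))

files = [
    "tax return 2024 final draft",
    "vacation photos italy summer",
    "llm benchmark notes local inference",
]
print(search("local llm notes", files))  # -> llm benchmark notes local inference
```

Swapping `vectorize` for calls to a small local embedding model is the whole upgrade path; the index-and-rank loop stays the same, which is why the hardware requirements for this niche are modest.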
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2#49516844)