we can't ban open source AI so we'll remove the ability to run it locally
Date: December 17th, 2025 9:25 AM Author: Twinkling masturbator
why don't you "ping" your pencil neck you weird little freak
(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2Firm#49516133)
Date: December 17th, 2025 7:49 PM
Author: https://imgur.com/a/o2g8xYK
You need 24GB of VRAM to do inference; 16GB isn't enough. However, unless you are coding there's little reason to go above 24GB. The reason coding can use more VRAM is that iterating on code generates long context windows, and if you run out of context window the AI forgets what it was doing earlier. This is also why you can't run 15GB models on 16GB of VRAM: the context window spills into system RAM and slows everything down.
48GB of VRAM lets you do more with image and video generation, but it won't give you measurable gains in inference. You can put bigger models on the system, but they probably won't give you better results.
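The spillover point above can be sketched with back-of-the-envelope arithmetic. The KV cache grows linearly with context length, so a model whose weights nearly fill the card leaves no headroom for long contexts. The layer/head/dim numbers below are illustrative (a Llama-style config with grouped-query attention), not taken from any specific model in the post:

```python
def kv_cache_bytes(context_tokens: int,
                   num_layers: int = 32,
                   num_kv_heads: int = 8,
                   head_dim: int = 128,
                   bytes_per_value: int = 2) -> int:
    """Rough KV-cache size: one K and one V tensor per layer,
    each num_kv_heads * head_dim values per token, fp16 (2 bytes)."""
    return 2 * num_layers * num_kv_heads * head_dim * bytes_per_value * context_tokens

GIB = 1024 ** 3
weights_gib = 15  # the "15GB model" from the post

for ctx in (4096, 32768):
    total_gib = weights_gib + kv_cache_bytes(ctx) / GIB
    print(f"{ctx:>6} tokens: ~{total_gib:.1f} GiB total")
```

With these assumed shapes, the cache costs 128 KiB per token: a short 4k context fits a 16GB card only barely, and a 32k coding session needs roughly 19 GiB, which is where the spill into system RAM (and the slowdown) kicks in.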
(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2Firm#49517900)