Date: December 17th, 2025 1:49 PM
Author: https://i.imgur.com/chK2k5a.jpeg
Without CUDA support, you'll find that nothing works. Sure, AMD has their own ROCm platform, but 95% of models either don't work or run like shit on ROCm. Furthermore, AMD seems to have bailed so hard on ROCm development that, last I checked, they still didn't have support for RDNA4 GPUs. That's right: you can buy an AMD Ryzen AI box with 64GB of soldered RAM, but it won't run Ollama.
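If you want to see what I mean, here's a minimal sketch (assuming a local PyTorch install, nothing Ollama-specific): most of the ecosystem keys off torch.cuda, and a ROCm build of PyTorch only works to the extent it can impersonate that namespace through HIP.

import torch

# Which backend was this torch build compiled against?
cuda_ver = torch.version.cuda                  # e.g. "12.4" on CUDA builds, None otherwise
hip_ver = getattr(torch.version, "hip", None)  # set on ROCm builds, None otherwise

if torch.cuda.is_available():
    # ROCm builds reuse the torch.cuda namespace via HIP
    backend = f"ROCm/HIP {hip_ver}" if hip_ver else f"CUDA {cuda_ver}"
    print(f"GPU visible via {backend}: {torch.cuda.get_device_name(0)}")
else:
    print("No usable GPU backend; falling back to CPU")

If your card isn't on the (short) list of GPUs a given ROCm release actually supports, that check comes back CPU-only no matter how much soldered RAM you paid for.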
(http://www.autoadmit.com/thread.php?thread_id=5811374&forum_id=2...id.#49516931)