@[email protected] to [email protected] • 3 days ago
The AI revolution is coming (lemmy.world) • 76 comments
@[email protected] • 2 days ago
As someone who is rather new to the topic: I have a GPU with 16 GB of VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔
@[email protected] • 1 day ago
I also have 16 GB of VRAM and the 32b version runs OK. Anything larger would take too long, I think.
Lurker • 1 day ago
You can try from the smallest size up to the biggest. You can probably run the biggest too, but it will be slow.
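A minimal sketch of that "try from smallest to biggest" approach using the Ollama Python client. The model tags, prompt, and timing loop here are assumptions for illustration, not something from the thread; check `ollama list` or the Ollama model page for the tags you actually have available.

```python
# Rough comparison of Deepseek R1 distill sizes on a 16 GB GPU.
# Assumes the `ollama` Python package is installed and the tags below
# exist on the registry -- adjust to whatever `ollama list` shows.
import time

import ollama

# Smallest to largest; on 16 GB of VRAM the larger tags will typically
# spill into system RAM and slow down noticeably.
CANDIDATE_TAGS = ["deepseek-r1:7b", "deepseek-r1:8b", "deepseek-r1:14b", "deepseek-r1:32b"]

PROMPT = "Explain the difference between VRAM and system RAM in two sentences."

for tag in CANDIDATE_TAGS:
    ollama.pull(tag)  # no-op if the model is already downloaded
    start = time.time()
    response = ollama.chat(
        model=tag,
        messages=[{"role": "user", "content": PROMPT}],
    )
    elapsed = time.time() - start
    print(f"{tag}: {elapsed:.1f}s")
    print(response["message"]["content"][:200], "\n")
```

The timing printed per tag gives a rough feel for which size is still comfortable on your card; typically the 14b distill fits fully in 16 GB of VRAM, while 32b partially offloads and, as noted above, still runs but noticeably slower.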