baatliwala@lemmy.world to memes@lemmy.world · 3 days ago
The AI revolution is coming
Mora@pawb.social · 2 days ago
As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔
kyoji@lemmy.world · 1 day ago
I also have 16 GB VRAM and the 32b version runs OK. Anything larger would take too long, I think.
Lurker@sh.itjust.works · 2 days ago
You can try from the smallest size and work your way up. You can probably run the biggest one too, but it will be slow.
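A rough way to sanity-check which deepseek-r1 tag should fit on a given card, sketched in Python. The 4-bit quantization and the 20% runtime overhead are ballpark assumptions for illustration, not official Ollama figures:

```python
# Rule-of-thumb VRAM estimate for a quantized model.
# Assumptions (not Ollama specs): ~4-bit quantization means about
# 0.5 bytes per parameter, plus roughly 20% extra for the KV cache
# and runtime buffers.
def estimated_vram_gb(params_billion: float, bits: int = 4) -> float:
    weights_gb = params_billion * bits / 8  # weight memory in GB
    return weights_gb * 1.2                 # + ~20% runtime overhead

for size in (7, 14, 32, 70):
    est = estimated_vram_gb(size)
    verdict = "fits" if est <= 16 else "needs CPU offload"
    print(f"deepseek-r1:{size}b ~ {est:.1f} GB -> {verdict} on 16 GB")
```

By this estimate the 14b tag fits comfortably in 16 GB, while 32b spills into system RAM, which matches the experience above: it runs, just slowly.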
DeepSeek is good locally.