Stamets to White People [email protected] • 6 months ago — "The dream" (lemmy.world) • 237 comments • 1.94K upvotes
@[email protected] • 2 points • 6 months ago

> I don't know of an LLM that works decently on personal hardware

Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7b should work well on a card with 8 GB of VRAM.
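For anyone who wants to try this, a minimal sketch of the Ollama workflow (assumes Ollama is already installed and running; the exact model tags available in the registry may differ):

```shell
# Pull a model from the Ollama registry to local storage
# (model name/tag is an example; check the registry for current tags)
ollama pull mistral

# Start an interactive chat session with the downloaded model
ollama run mistral

# List models currently downloaded to local storage
ollama list
```

Ollama exposes a local HTTP API as well, which is what frontends like ollama-webui talk to.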
@[email protected] • 1 point • 6 months ago

If you have really low specs, use the recently open-sourced Microsoft Phi model.