mr_right@lemmy.dbzer0.com to linuxmemes@lemmy.world · 11 months ago
"you don't need more 4GB of RAM" (image, lemmy.dbzer0.com) · 201 comments · 1.48K upvotes
AdrianTheFrog@lemmy.world · 2 points · 11 months ago
depends on quantization
Just running LLaMA locally takes 32 GB of RAM.
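To illustrate why quantization matters here, a rough back-of-the-envelope sketch: weight memory scales linearly with bits per weight, so a model that needs ~32 GB at 16-bit precision fits in far less when quantized to 4 bits. The parameter count and quantization names below are illustrative assumptions (an 8B-parameter model, llama.cpp-style format names), not figures from the thread; real runtimes also add KV-cache and other overhead on top.

```python
# Rough estimate of the RAM needed just to hold an LLM's weights at
# different quantization levels. n_params and the bits-per-weight
# values are assumptions for illustration, not measurements.

def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Memory for the weights alone, in GiB (ignores KV cache/overhead)."""
    return n_params * bits_per_weight / 8 / 2**30

n_params = 8e9  # assumed model size, e.g. an 8B-parameter model
for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{name}: ~{weight_memory_gib(n_params, bits):.1f} GiB")
```

Under these assumptions an 8B model drops from roughly 15 GiB of weights at fp16 to under 4 GiB at 4-bit, which is why quantized models are viable on machines where the full-precision version is not.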