• DoucheBagMcSwag
    10 days ago

    Hey thanks for this. I got a 4070. Beefy enough? 😅

    Shit I’ve never used docker but the Lemmy community talks about it so much

    • IngeniousRocks (They/She) @lemmy.dbzer0.com
      10 days ago

      Deffo! I run my models on a 3070 with ComfyUI in low-VRAM mode, so it uses my DRAM as well. You need a good amount of DRAM if you’re doing it that way, though; I have 64 gigs and still get OOM errors when spilling AI models into DRAM.

      The 4070’s 12 gigs of VRAM should cut it though!
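      If it helps, low-VRAM mode is just a launch flag. A minimal sketch, assuming a standard ComfyUI checkout with its dependencies already installed:

      ```shell
      # From the ComfyUI repo root; --lowvram makes ComfyUI offload
      # parts of the model to system RAM instead of keeping it all in VRAM.
      python main.py --lowvram
      ```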