• IngeniousRocks (They/She) @lemmy.dbzer0.com
      7 days ago

      If you have a decent GPU, check it out: https://github.com/lecode-official/comfyui-docker

      Look at Civitai and Hugging Face for models to use. The hentai-type generators drop in just as easily as any other image generation model. If you want specific poses, you can use LoRA networks to modify the output. If you want video, you’ll want to look into HunyuanVideo or something of that nature; keep in mind you’ll need big VRAM for that. I recommend using a smaller quantized GGUF for HunyuanVideo generation, because it can break the process into smaller chunks at the expense of longer processing time.
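
      To make that concrete, here’s a minimal Python sketch of pulling a quantized GGUF down from Hugging Face and dropping it into ComfyUI’s model folder. The repo id, filename, and path below are placeholders (pick whatever quant you actually want), and it assumes you have huggingface_hub installed:

      ```python
      # Minimal sketch: download a quantized GGUF of a video model from Hugging Face
      # and copy it into ComfyUI's model folder. REPO_ID, FILENAME, and COMFY_MODELS
      # are placeholders -- swap in the actual quant repo you pick and wherever your
      # ComfyUI install lives.
      from pathlib import Path
      from shutil import copy2

      from huggingface_hub import hf_hub_download

      REPO_ID = "someuser/hunyuan-video-gguf"     # placeholder, not a real repo name
      FILENAME = "hunyuan-video-Q4_K_M.gguf"      # smaller quant = less VRAM, more chunking
      COMFY_MODELS = Path("ComfyUI/models/unet")  # adjust to your ComfyUI path

      def fetch_gguf() -> Path:
          """Download the GGUF (cached by huggingface_hub) and copy it into ComfyUI."""
          cached = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
          COMFY_MODELS.mkdir(parents=True, exist_ok=True)
          target = COMFY_MODELS / FILENAME
          copy2(cached, target)
          return target

      if __name__ == "__main__":
          print(f"GGUF ready at {fetch_gguf()}")
      ```

      The smaller quant is exactly the tradeoff above: it fits in less VRAM by working in smaller chunks, so generations take longer.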

      • DoucheBagMcSwag
        7 days ago (edited)

        Hey thanks for this. I got a 4070. Beefy enough? 😅

        Shit, I’ve never used Docker, but the Lemmy community talks about it so much.

        • IngeniousRocks (They/She) @lemmy.dbzer0.com
          7 days ago

          Deffo! I run my models on a 3070 with ComfyUI in low-VRAM mode so it uses my DRAM as well. You need a good amount of DRAM if you’re doing it that way, though; I have 64 gigs and still get OOM errors when using DRAM for AI models.

          The 4070’s 12 gigs of VRAM should cut it though!
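
          If you want to sanity-check what you’re working with before launching, here’s a rough Python sketch (using torch) that reads off your GPU’s VRAM. The 12 GB cutoff for suggesting --lowvram is just my rule of thumb, not anything official, though --lowvram itself is a real ComfyUI launch flag:

          ```python
          # Rough sketch: report how much VRAM the GPU has before launching ComfyUI.
          # The 12 GB threshold below is an assumption (rule of thumb), not an official
          # ComfyUI number. ComfyUI's --lowvram flag offloads parts of the model to
          # system RAM (DRAM) at the cost of speed.
          import torch

          def report_memory(lowvram_threshold_gb: float = 12.0) -> None:
              if not torch.cuda.is_available():
                  print("No CUDA GPU visible -- check your driver or docker --gpus flag.")
                  return
              props = torch.cuda.get_device_properties(0)
              vram_gb = props.total_memory / 1024**3
              print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")
              if vram_gb < lowvram_threshold_gb:
                  print("Tight for video models: consider starting ComfyUI with --lowvram.")
              else:
                  print("Fine for image models; big video models may still want --lowvram.")

          if __name__ == "__main__":
              report_memory()
          ```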