Escalation@lemmy.world to AI Generated Porn@lemmynsfw.com · 9 months ago · Check it out (image.civitai.com)
IngeniousRocks (They/She)@lemmy.dbzer0.com · 9 months ago: Deffo! I run my models on a 3070 with ComfyUI in low-VRAM mode so it uses my DRAM as well. You need a good amount of DRAM if you're doing it that way, though; I have 64 GB and still get OOM errors when using DRAM for AI models. The 4070's 12 GB of VRAM should cut it, though!
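A quick way to check how much VRAM your own card reports, as a minimal sketch: it assumes PyTorch with CUDA support is installed, and the 8 GiB cutoff is just a rough rule of thumb, not a ComfyUI rule.

```python
# Minimal sketch: report the GPU's VRAM and suggest ComfyUI's --lowvram flag
# when it looks tight. Assumes PyTorch with CUDA support is installed; the
# 8 GiB threshold is an arbitrary rule of thumb, not an official number.
import torch

if not torch.cuda.is_available():
    print("No CUDA GPU detected; ComfyUI would have to fall back to CPU (very slow).")
else:
    props = torch.cuda.get_device_properties(0)
    vram_gib = props.total_memory / (1024 ** 3)
    print(f"{props.name}: {vram_gib:.1f} GiB VRAM")
    if vram_gib < 8:
        print("Tight on VRAM -- consider launching ComfyUI with --lowvram so it can spill into system RAM.")
    else:
        print("Default settings should be fine for most checkpoints.")
```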
DoucheBagMcSwag@lemmynsfw.com · 9 months ago: Do I have to use this… Docker thing…? I have like zero experience with it.
IngeniousRocks (They/She)@lemmy.dbzer0.com · 9 months ago: I highly recommend using Docker; it's probably the easiest way to set it up if you're a Linux or Intel Mac user. Alternatively, Comfy.org has the Windows and Apple Silicon versions as executables.
Good to know. Thanks!