Since the shutdown of SD on Colab
What happened? I'm out of the loop
Colab no longer supports or allows Stable Diffusion on their free tier. You can still buy time for SD, though.
There are a few.
Draw Things (https://drawthings.ai) and DiffusionBee (https://diffusionbee.com) are apps available on macOS. Not sure about other platforms.
Other than that, a more complex approach is installing AUTOMATIC1111, a web interface that runs locally: https://www.youtube.com/watch?v=kqXpAKVQDNU
Just had a little look at ^ this guy's YouTube channel, and he has guides for installing on Windows too if that's needed for ya.
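If it helps, going the AUTOMATIC1111 route on Linux/macOS boils down to roughly this (the repo URL is the real one; the rest is a sketch assuming you already have git and a recent Python installed):

```shell
# clone the web UI and let its launcher create a venv and pull dependencies
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui

# drop a model checkpoint (.safetensors) into models/Stable-diffusion/ first,
# then launch; on Windows you'd run webui-user.bat instead
./webui.sh
```

First launch takes a while because it downloads everything; after that it starts much faster.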
I struggled to replace a dying HDD this month, there's no way I can afford a SD-capable GPU.
You don't need a GPU. It'll just be a bit slower.
....a lot slower to be honest
I never knew it was possible to run on CPU. Thanks for the idea, but that doesn't seem usable: SD outputs are terrible 9 times out of 10, and at ten-plus minutes a generation...
If you use ComfyUI you can queue as many images as you want, so you can let it run overnight. Not ideal, but probably the best solution without a GPU.
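You can even script the overnight queue instead of clicking in the UI. This is just a sketch: it assumes you exported your workflow with ComfyUI's "Save (API Format)" button to `workflow_api.json`, and that ComfyUI is running on the default 127.0.0.1:8188 (adjust if yours differs).

```python
import copy
import json
import urllib.request

def build_queue(workflow, n, start_seed=0):
    """Return n copies of an API-format workflow, each with a different seed
    so the queued images actually differ."""
    jobs = []
    for i in range(n):
        job = copy.deepcopy(workflow)
        for node in job.values():
            # sampler nodes keep their seed under inputs["seed"]
            if "seed" in node.get("inputs", {}):
                node["inputs"]["seed"] = start_seed + i
        jobs.append(job)
    return jobs

def queue_all(jobs, host="127.0.0.1:8188"):
    """POST each job to ComfyUI's /prompt endpoint."""
    for job in jobs:
        data = json.dumps({"prompt": job}).encode("utf-8")
        req = urllib.request.Request(f"http://{host}/prompt", data=data)
        urllib.request.urlopen(req)

if __name__ == "__main__":
    with open("workflow_api.json") as f:
        wf = json.load(f)
    queue_all(build_queue(wf, 20))  # queue 20 renders and walk away
```

Then just leave the machine on and collect the results from ComfyUI's output folder in the morning.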
TBH, it would take a long time to come up with a prompt that didn't produce trash even when generating quickly on Colab. Trying to do this on my CPU sounds like it could end up taking weeks or even months just to get a few good images.
EDIT: Well, it took me a long while to get set up, but as it turns out, I'm one of the lucky ones who can run on 2GB VRAM. I generated this in about a minute.
Good to hear you got it working.
If you want to speed it up even further and you're willing to boot Linux from a USB, ROCm is much faster than DirectML right now.
edit: Also, you can run without UI, saving even more VRAM.
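For the record, "without UI" here means AUTOMATIC1111's API-only mode. Something like the below should do it (flag names from memory; double-check them against `--help` on your install):

```shell
# webui-user.sh -- launch A1111 with the HTTP API only, no Gradio frontend
# --nowebui: skip the browser UI
# --lowvram: aggressive model offloading for small cards
export COMMANDLINE_ARGS="--nowebui --lowvram"
./webui.sh
```

You can then generate by POSTing to the txt2img endpoint, e.g. `curl http://127.0.0.1:7860/sdapi/v1/txt2img -H "Content-Type: application/json" -d '{"prompt": "a lighthouse at dusk", "steps": 20}'`.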
Interesting about ROCm, I'll have to look into that. As for running without the UI, I honestly don't think I know enough to do that right now, lol.
Have you tried running it on your PC? ComfyUI seems to run it on a potato.
I think Automatic1111 runs on most CPUs, but you need a lot of RAM.
I didn't have any hope for being able to run it locally, but regardless, I've been getting it set up on and off all day. Well, as it turns out, I'm lucky enough to be able to generate with DirectML and 2GB VRAM in a not-terrible amount of time.
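In case anyone else is on a small AMD card: these are the launch args that got it working for me. This assumes the DirectML fork of the web UI (`--use-directml` is a fork flag, not in mainline A1111), so treat it as a starting point rather than gospel:

```shell
REM webui-user.bat -- low-VRAM DirectML setup
REM --use-directml: DirectML backend (fork-specific flag)
REM --lowvram: offload model parts to keep under ~2GB VRAM
REM --opt-sub-quad-attention: memory-friendlier attention
set COMMANDLINE_ARGS=--use-directml --lowvram --opt-sub-quad-attention
```

With 3-4GB of VRAM, `--medvram` instead of `--lowvram` is noticeably faster.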
To test, I generated a couple of simple prompts. This one is from 'a poster for Sex the movie'. It's not NSFW, but I'd say it's suggestive... in a weird, warped way. I didn't use any negatives to test.