this post was submitted on 28 Feb 2024
Stable Diffusion


Discuss matters related to our favourite AI Art generation technology

[–] Scew@lemmy.world 3 points 8 months ago* (last edited 8 months ago) (2 children)

No, lol. Well, I'm not 100% familiar with the Pi's newest offerings, but I have my doubts about their PCIe capabilities. Direct quote:

The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

Your question seems silly when I try to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

I've been running all the image-generation models, including SD-XL (the model they "distilled" theirs from), on a 2060 Super (8GB VRAM) up to this point. Reading the article, I'm not really sure what they think differentiates their model.
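For context, here's a minimal sketch of how one typically squeezes SD-XL onto an 8GB card with Hugging Face diffusers. The `memory_options` helper and its thresholds are my own illustration, not from the thread; `enable_model_cpu_offload` and `enable_attention_slicing` are real diffusers calls, but actually running `main()` requires a CUDA GPU and a model download:

```python
# Sketch: picking memory-saving options for a given VRAM budget.
# The thresholds below are illustrative assumptions, not official guidance.
def memory_options(vram_gb: float) -> dict:
    """Suggest pipeline options for a given amount of GPU memory."""
    return {
        "use_fp16": vram_gb <= 12,          # half precision halves weight memory
        "cpu_offload": vram_gb <= 10,       # park idle submodules in system RAM
        "attention_slicing": vram_gb <= 6,  # trade speed for lower peak memory
    }

def main():
    # Needs a CUDA GPU, the diffusers/torch packages, and a model download.
    import torch
    from diffusers import StableDiffusionXLPipeline

    opts = memory_options(8.0)  # e.g. a 2060 Super
    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16 if opts["use_fp16"] else torch.float32,
    )
    if opts["cpu_offload"]:
        pipe.enable_model_cpu_offload()  # keeps only the active submodule on GPU
    if opts["attention_slicing"]:
        pipe.enable_attention_slicing()
    image = pipe("a photo of a raspberry", num_inference_steps=20).images[0]
    image.save("out.png")
```

With CPU offload enabled, only the submodule currently doing work sits in VRAM, which is the usual trick for fitting SD-XL into 8GB.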

[–] grue@lemmy.world 3 points 8 months ago (1 children)

Your question seems silly when I try to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

Jeff Geerling has entered the chat


[–] Even_Adder@lemmy.dbzer0.com 3 points 8 months ago

There are three models, and the smallest one is 700M parameters.
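Some back-of-envelope arithmetic (my own, not from the article) on why a 700M-parameter model is modest: at fp16 (2 bytes per parameter) the weights alone are well under 2 GiB, versus roughly 6.5 GiB for SD-XL's ~3.5B parameters. Activations and overhead come on top, but both fit an 8GB card.

```python
# Back-of-envelope weight memory, assuming fp16 (2 bytes per parameter).
# Activations, text encoders, and framework overhead are extra.
def weight_memory_gib(params: float, bytes_per_param: int = 2) -> float:
    return params * bytes_per_param / 2**30

small = weight_memory_gib(700e6)  # smallest of the three models
sdxl = weight_memory_gib(3.5e9)   # SD-XL base, roughly 3.5B params
print(f"700M model: {small:.2f} GiB, SD-XL: {sdxl:.2f} GiB")
```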