I want to buy a new GPU, mainly for SD. The machine-learning space moves quickly, so I want to avoid buying a brand-new card only for a fresh model or tool to come out and leave it behind the times. On the other hand, I also want to avoid needlessly spending thousands of extra dollars pretending I can get a 'future-proof' card.
I'm currently interested in SD and training LoRAs (etc.). From what I've heard, the general advice is simply to go for maximum VRAM.
- Is there any other advice I should keep in mind?
- Is NVIDIA vs. AMD a critical decision for SD performance?
I'm a hobbyist, so a couple of seconds difference in generation or a few extra hours for training isn't going to ruin my day.
Some example prices in my region, to give a sense of scale:
- 16GB AMD: $350
- 16GB NV: $450
- 24GB AMD: $900
- 24GB NV: $2000
edit: prices are for new cards; I haven't explored the pros and cons of used GPUs
Do you know how much power those GPUs actually draw, even if I disconnected my solar panels and ran the card at 100% load 24/7? Protip: as a share of global emissions, it rounds down to zero.
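To put rough numbers on that (all of these are my own ballpark assumptions: a 450 W card, a grid at ~0.4 kg CO2/kWh, ~37 Gt of global annual CO2 emissions):

```python
# Back-of-the-envelope: one GPU's share of global CO2 emissions.
# Every input here is a rough assumption for illustration only.
gpu_watts = 450                                     # assumed worst-case draw for a high-end card
hours_per_year = 24 * 365
kwh_per_year = gpu_watts / 1000 * hours_per_year    # ~3,942 kWh
kg_co2_per_kwh = 0.4                                # assumed grid carbon intensity
gpu_tonnes = kwh_per_year * kg_co2_per_kwh / 1000   # ~1.6 tonnes CO2/year
global_tonnes = 37e9                                # ~37 Gt CO2/year, rough global total
print(f"Share of global emissions: {gpu_tonnes / global_tonnes:.2e}")  # ~4e-11
```

Four hundred-billionths of the problem. You'd need more decimal places than any spreadsheet shows before it stops rounding to zero.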
If you're serious about the global environmental crisis, comrade, organize with others to fight the industrial-scale culprits instead of wasting your valuable time blaming individuals whose impact is trivial.
So for the price of one 4090 you could buy 2x 24GB AMD cards and a good power supply to run them (2 x $900 plus a beefy PSU lands right around the $2000 the 24GB NVIDIA costs).
I didn't even think of dual cards, because I have an old budget motherboard with a single slot. But 2x 16GB GPUs plus a new motherboard (and, if necessary, a new CPU) and PSU might still come out cheaper than a 24GB NVIDIA card for me. Of course I'd have to explore the trade-offs in detail, since I've never looked into how dual-card setups work.
(But truth be told, I could just as easily settle for a single 16GB card if I were confident it could train AuraFlow or FLUX LoRAs, even if slowly, for the upcoming Pony v7 model. It's just a hobby.)
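If it helps, here's the quick sanity check I'd run on whichever card you end up with, a minimal sketch assuming a working PyTorch install. As far as I know, the ROCm build of PyTorch for AMD cards reports through the same torch.cuda API, so it should work on either vendor:

```python
# Minimal check: does PyTorch see the card, and how much VRAM does it report?
# Works on NVIDIA (CUDA) and, to my knowledge, on AMD via the ROCm build,
# which reuses the torch.cuda namespace.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    backend = "ROCm" if torch.version.hip else "CUDA"
    print(f"{props.name} ({backend}): {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No GPU visible to PyTorch")
```

That won't tell you whether a particular trainer fits in 16GB, but it at least confirms the toolchain sees the card at all before you sink hours into a training run.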
Yeah. I don't think dual cards are a great solution, since I don't think they can both be made to work on the same job at the same time, but if you were generating many images it could make sense.
I don't know, but maybe somebody else has experience.
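For what it's worth, here's roughly what "two cards, two independent jobs" could look like, a minimal sketch assuming diffusers, a PyTorch build that sees both GPUs, and placeholder model/prompt names:

```python
# Sketch: two independent SD pipelines, one per GPU, each working through
# its own share of a prompt queue. The checkpoint and prompts below are
# placeholders; assumes diffusers is installed and both cards are visible.
import threading
import torch
from diffusers import StableDiffusionPipeline

MODEL = "runwayml/stable-diffusion-v1-5"  # placeholder checkpoint
prompts = [f"a watercolor landscape, variation {i}" for i in range(8)]

def worker(device: str, batch: list) -> None:
    # Each thread loads its own pipeline entirely onto its own card,
    # so nothing has to cross between GPUs.
    pipe = StableDiffusionPipeline.from_pretrained(
        MODEL, torch_dtype=torch.float16
    ).to(device)
    for i, prompt in enumerate(batch):
        image = pipe(prompt).images[0]
        image.save(f"{device.replace(':', '_')}_{i}.png")

# Split the queue across the two cards and run them side by side.
threads = [
    threading.Thread(target=worker, args=("cuda:0", prompts[0::2])),
    threading.Thread(target=worker, args=("cuda:1", prompts[1::2])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

So it's not one job going twice as fast, it's two queues draining in parallel, which is exactly the "generating many images" case where a second card earns its keep.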