Illustrious and NoobAI versions please and Pony if possible.
Hey team, great work again. Could you please create a version for each popular SDXL variant?
Illustrious (wai latest version).
NoobAI
Pony V6
Or release a script so we can do the conversion ourselves.
Upping this, please.
It would be nice to have a nunchaku version of each since they all have different strengths.
Please add Pony V6 for the RTX 5070 Ti; I'd like to see if I can generate faster.
Which is better PonyV6 or WAI-NSFW-illustrious-SDXL?
Of course WAI-NSFW-illustrious-SDXL, since most people have shifted away from Pony V6. But there are still good uses for Pony V6 as well, so you can consider it too.
A simple demo here https://huggingface.co/heziiiii/nunchaku-sdxl-noob
Unfortunately, it won't work for me; I have a 5070 Ti.
I'm using WAI-NSFW-illustrious-SDXL, and it's amazing. They've significantly improved the hands.
Do SDXL models in FP4 format improve speed without affecting quality? Or do they run faster but also lose quality?
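To make the trade-off concrete, here is a toy sketch (not the real Nunchaku FP4 kernel, whose exact scheme I haven't verified) that simulates rounding a weight tensor to a 4-bit integer grid. It shows why low-bit formats are faster and smaller but introduce a bounded rounding error, which is where any quality loss comes from:

```python
import numpy as np

# Toy illustration only: simulate symmetric 4-bit quantization of a
# weight tensor to see the rounding error that trades quality for
# speed/memory. The real Nunchaku FP4 format may differ.
rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)

levels = 2 ** 4                              # 16 representable values in 4 bits
scale = np.abs(w).max() / (levels / 2 - 1)   # per-tensor scale factor
q = np.clip(np.round(w / scale), -(levels // 2), levels // 2 - 1)
w_deq = q * scale                            # what the kernel effectively computes with

# Storage drops ~8x vs fp32; the worst-case rounding error is about scale/2.
err = np.abs(w - w_deq).max()
print(f"max abs error: {err:.4f}, scale/2: {scale / 2:.4f}")
```

So the honest answer is usually "faster, with a small but nonzero quality cost" — how visible that cost is depends on the model and the calibration.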
@heziiiii how did you do the conversion? Any resources you can share?
Unfortunately, I haven't done the conversion myself; I don't know how long it would take on my GPU. The only information I've found is this link, which says it can do the conversion:
https://github.com/nunchaku-tech/deepcompressor
I don't know what's needed for LoRA support. Right now I'd like to see this checkpoint and the new Z IMAGE TURBO; that would be awesome.
UP!!
where?
I just wanted to get this back on the team's radar! I need the WAI-illustrious-SDXL v16.0 model.
For both a 4060 8GB and a 5070 TI.
I don't understand why the Nunchaku project is so stagnant.
If there were a tutorial that showed step by step how to do the conversion, I would do it myself. The cost of renting GPUs is low; it's not too expensive to rent a GPU that runs 24/7 on Vast.
The problem is not being sure whether the conversion is actually happening correctly.
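One cheap way to gain confidence in a conversion is to render the same prompt and seed with the original and the quantized checkpoint, then compare the two images numerically, e.g. with PSNR. A minimal sketch (the image data below is synthetic, standing in for the two renders):

```python
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer images."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Synthetic stand-ins for "original render" and "quantized render":
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float64)
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255)

score = psnr(ref, noisy)
print(f"PSNR: {score:.1f} dB")
```

As a rough rule of thumb, scores above ~30 dB for the same prompt and seed suggest the quantized model tracks the original closely; a sudden drop flags a broken conversion.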
Hi @stablecollector, we can do this using:
https://github.com/nunchaku-ai/deepcompressor
Yes, I posted the same link too.
From talking to other AI creators, I've learned that using that alone is not enough.
You also need a calibration dataset, and to know how to run the engine for SDXL.
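On the dataset point: post-training quantization tools like deepcompressor generally need a small set of representative prompts for calibration. I haven't confirmed the exact input format deepcompressor expects, so this is only a sketch of collecting prompts into a file (file name and format are hypothetical):

```python
import json
from pathlib import Path

# Hedged sketch: PTQ needs representative inputs for calibration.
# The exact format deepcompressor expects is NOT confirmed here; this
# only shows gathering typical SDXL prompts into a JSON file.
prompts = [
    "1girl, masterpiece, best quality, detailed background",
    "scenic landscape, golden hour, ultra-detailed",
    "portrait, studio lighting, sharp focus",
]

out = Path("calib_prompts.json")  # hypothetical file name
out.write_text(json.dumps(prompts, indent=2))
print(f"wrote {len(prompts)} calibration prompts to {out}")
```

The key idea is that calibration prompts should resemble what you'll actually generate with the model, so the quantizer sees realistic activation ranges.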
Unfortunately, I'm not writing in my native language, and I don't understand much about creating or converting AI models.
I also asked ChatGPT, and it confirmed that a simple conversion is not enough, so I gave up.
I only built a few LoRAs with SD1.5 a few years ago; that's about it.
If the team made a step-by-step tutorial, there would be many more models available for Nunchaku.