is LCM Lora possible? #26
Closed
habanerogossip started this conversation in Ideas

habanerogossip:
is Latent Consistency Model LoRA possible for this? https://huggingface.co/blog/lcm_lora

Replies: 2 comments 2 replies
-
Of course it is. In my opinion, you may just need to train a specific LoRA (or another PEFT adapter for the Transformer) on PixArt-α; a rough sketch of that setup is included below.
2 replies
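For readers who want a concrete starting point, here is a minimal sketch (not an official recipe) of attaching a trainable LoRA adapter to the PixArt-α transformer with diffusers and peft. The checkpoint id, rank, and `target_modules` names are assumptions that depend on your library versions, and the LCM consistency-distillation training loop itself is not shown.

```python
import torch
from diffusers import PixArtAlphaPipeline
from peft import LoraConfig

# Assumed base PixArt-α checkpoint; swap in the weights you actually use.
pipe = PixArtAlphaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-XL-2-1024-MS", torch_dtype=torch.float32
)

# LoRA over the attention projections of the DiT blocks. The module names
# ("to_q", "to_k", "to_v", "to_out.0") are assumptions about the diffusers
# Transformer implementation and may differ across versions.
lora_config = LoraConfig(
    r=64,
    lora_alpha=64,
    init_lora_weights="gaussian",
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)
pipe.transformer.add_adapter(lora_config)

# Only the injected LoRA weights remain trainable; these are what an
# LCM-style consistency-distillation loop would optimize.
trainable = sum(p.numel() for p in pipe.transformer.parameters() if p.requires_grad)
print(f"trainable LoRA parameters: {trainable}")
```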
-
PixArt-LCM released; a minimal usage sketch with diffusers is included below.
0 replies
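For anyone finding this thread later, a minimal usage sketch of the released PixArt-LCM weights with diffusers. The Hub id and the few-step, zero-guidance settings follow the usual LCM setup and should be double-checked against the model card.

```python
import torch
from diffusers import PixArtAlphaPipeline

# Assumed Hub id of the released PixArt-LCM weights; verify on the model card.
pipe = PixArtAlphaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-LCM-XL-2-1024-MS", torch_dtype=torch.float16
)
pipe.to("cuda")

# LCM-distilled models sample in very few steps and without classifier-free
# guidance, hence num_inference_steps=4 and guidance_scale=0.0.
image = pipe(
    prompt="a small cactus wearing a straw hat, studio lighting",
    num_inference_steps=4,
    guidance_scale=0.0,
).images[0]
image.save("pixart_lcm_sample.png")
```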