Instructions to use InstantX/FLUX.1-dev-Controlnet-Union with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diffusers
How to use InstantX/FLUX.1-dev-Controlnet-Union with Diffusers:
```
pip install -U diffusers transformers accelerate
```

```python
import torch
from diffusers import FluxControlNetModel, FluxControlNetPipeline
from diffusers.utils import load_image

# This repo is a ControlNet, not a standalone pipeline: load it first,
# then attach it to the FLUX.1-dev base model.
controlnet = FluxControlNetModel.from_pretrained(
    "InstantX/FLUX.1-dev-Controlnet-Union", torch_dtype=torch.bfloat16
)
pipe = FluxControlNetPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    controlnet=controlnet,
    torch_dtype=torch.bfloat16,
).to("cuda")  # switch to "mps" for Apple devices

prompt = "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k"
control_image = load_image("path/to/condition_image.png")  # e.g. a canny edge map
image = pipe(
    prompt,
    control_image=control_image,
    control_mode=0,  # union ControlNet mode index (0 = canny)
).images[0]
```
- Notebooks
- Google Colab
- Kaggle
FP8 or NF4 version (#8)
opened by GeroldMeisinger
I just tried the ComfyUI implementation and it works great, but the memory requirements are quite high. With Flux barely fitting in 16 GB of VRAM, this ControlNet is too much. Could you also add an FP8 version (or, better yet, an NF4 version), please?
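For context, a rough back-of-the-envelope estimate of what the requested formats would save, assuming the FLUX.1-dev transformer's roughly 12B parameters dominate weight memory (the parameter count is from the FLUX.1-dev model card; per-weight sizes are the standard ones for each format, ignoring quantization metadata, text encoders, VAE, and activations):

```python
# Approximate weight memory for a ~12B-parameter transformer at
# different precisions.
PARAMS = 12e9  # FLUX.1-dev transformer, roughly 12 billion parameters

bytes_per_weight = {
    "bf16": 2.0,  # 16-bit brain float
    "fp8": 1.0,   # 8-bit float
    "nf4": 0.5,   # 4-bit NormalFloat, excluding small scale overhead
}

for fmt, b in bytes_per_weight.items():
    gb = PARAMS * b / 1e9
    print(f"{fmt}: ~{gb:.0f} GB")  # bf16: ~24 GB, fp8: ~12 GB, nf4: ~6 GB
```

Even before counting the ControlNet itself, bf16 weights alone exceed a 16 GB card, which is why the quantized variants are being requested.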
GeroldMeisinger changed discussion title from FP8 version to FP8 or NF4 version
I think a GGUF version would also be great.
Unfortunately, I did not manage to get it running on 16 GB of VRAM. The Q4_1 GGUF model and t5xxl_fp8 were not enough to run it.
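Rough numbers on why that combination is still tight on 16 GB (parameter counts are approximate; the Q4_1 block layout is the standard GGUF one, 20 bytes per block of 32 weights):

```python
# Approximate weight memory for the quantized combination above.
FLUX_PARAMS = 12e9  # FLUX.1-dev transformer, ~12B parameters
T5_PARAMS = 4.7e9   # T5-XXL text encoder, ~4.7B parameters

# GGUF Q4_1 packs a block of 32 weights into 20 bytes
# (16 bytes of 4-bit values + fp16 scale + fp16 min).
flux_q4_1_gb = FLUX_PARAMS * 20 / 32 / 1e9
t5_fp8_gb = T5_PARAMS * 1 / 1e9  # fp8: 1 byte per weight

print(f"Flux Q4_1: ~{flux_q4_1_gb:.1f} GB")  # ~7.5 GB
print(f"T5 fp8:    ~{t5_fp8_gb:.1f} GB")     # ~4.7 GB
print(f"Combined:  ~{flux_q4_1_gb + t5_fp8_gb:.1f} GB")
```

That leaves only a few GB for the VAE, CLIP, activations, and the ControlNet itself, which is exactly why a quantized ControlNet was requested.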
install xformers and try smaller images (512x512)
> install xformers and try smaller images (512x512)
What's the point of using Flux if you're creating such small images, though? 🤷
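On why smaller images help so much in the first place: Flux attends over the latent token sequence, and naive attention memory grows with the square of the token count (a memory-efficient kernel like xformers avoids materializing that matrix). A quick sketch, assuming Flux's 8x VAE downsampling plus 2x2 latent patching, so each token covers a 16x16 pixel area:

```python
def latent_tokens(height, width, patch=16):
    # Flux's VAE downsamples 8x and the transformer packs 2x2 latent
    # patches, so each image token covers a 16x16 pixel area.
    return (height // patch) * (width // patch)

t_1024 = latent_tokens(1024, 1024)  # 4096 tokens
t_512 = latent_tokens(512, 512)     # 1024 tokens

# Naive attention scores scale with tokens**2, so halving each image
# dimension cuts that term by 16x.
print(t_1024, t_512, (t_1024 ** 2) / (t_512 ** 2))  # 4096 1024 16.0
```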