mradermacher/Huihui-Qwen3-30B-A3B-Instruct-2507-abliterated-i1-GGUF 31B • Updated 4 days ago • 2.35k • 10
MOE/Mixture of Experts Models (see also "source" collection) Collection Mixture of Experts models by me. This leverages the power of multiple models at the same time during generation for next-level performance. • 126 items • Updated 13 days ago • 12