Gemma 3 Collection All versions of Google's new multimodal Gemma 3 models, including QAT variants, in 1B, 4B, 12B, and 27B sizes; available in GGUF, dynamic 4-bit, and 16-bit formats. • 55 items • Updated 11 days ago • 102
Kimi Linear: An Expressive, Efficient Attention Architecture Paper • 2510.26692 • Published Oct 30, 2025 • 119
Article Accelerating Qwen3-8B Agent on Intel® Core™ Ultra with Depth-Pruned Draft Models Sep 29, 2025 • 22
Article Reachy Mini - The Open-Source Robot for Today's and Tomorrow's AI Builders Jul 9, 2025 • 748
Brainstorm Adapter Models - Augmented/Expanded Reasoning Collection Adapters by DavidAU: splits apart the reasoning center(s) and multiplies them 3x, 4x, 8x, 10x, 20x, 40x+. Creativity+ / Logic+ / Detail+ / Prose+ ... • 203 items • Updated 20 days ago • 26