Leco Li (imnotkitty)
Recent Activity
posted an update about 12 hours ago
GLM-5-Turbo just dropped: The "OpenClaw Native" Model
Z.ai just shipped a 744B MoE beast that's 2-3x faster than GLM-5, with 200K context + 128K max output.
What's different:
- Tool call stability (no more random failures mid-chain)
- Complex instruction decomposition (breaks down messy prompts)
- Time-aware execution (understands scheduled/persistent tasks)
- High-throughput long-chain efficiency (doesn't choke on 50-step workflows)
- ZClawBench: leads mainstream models in OpenClaw scenarios
Trade-off: +20% price vs GLM-5
Anyone trying it yet?
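For readers wondering what "tool call stability" and "128K max output" translate to in practice, here is a minimal sketch of the request payload a tool-calling workflow would send through an OpenAI-compatible chat API. The model identifier `glm-5-turbo`, the `schedule_task` tool, and its schema are illustrative assumptions, not Z.ai's documented API surface:

```python
# Sketch of an OpenAI-compatible chat-completions payload exercising the
# tool-calling and time-aware-execution features described in the post.
# Model name and tool definition are assumptions for illustration only.
import json


def build_tool_call_request(user_prompt: str) -> dict:
    """Assemble a chat-completions payload with one hypothetical tool."""
    return {
        "model": "glm-5-turbo",       # assumed model identifier
        "max_tokens": 128_000,        # post claims 128K max output
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "schedule_task",  # hypothetical tool
                "description": "Schedule a persistent task to run later.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "task": {"type": "string"},
                        "run_at": {"type": "string", "format": "date-time"},
                    },
                    "required": ["task", "run_at"],
                },
            },
        }],
    }


payload = build_tool_call_request(
    "Remind me to rerun the eval suite tomorrow at 9am."
)
print(json.dumps(payload, indent=2))
```

The payload is built locally and never sent; swapping in a real client and endpoint is left to the reader, since the post gives no API details.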
reacted to BibbyResearch's post about 17 hours ago
We are building the largest dataset and pre-trained models for text-to-speech and speech-to-text in Marwari, a low-resource language spoken in India.