zai-org/GLM-5 · Hugging Face
GLM-5 is a 744-billion-parameter mixture-of-experts (MoE) model from Zhipu AI, with 40B parameters active per token. It scales up GLM-4.5's 355B architecture, was pre-trained on 28.5T tokens, and adopts DeepSeek Sparse Attention for long-context efficiency.
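To make the sparse-activation figure concrete, below is a minimal top-k MoE routing sketch in PyTorch. It shows the general mechanism only (each token is routed to k experts, so just a fraction of the total parameters run per token); the router, expert shapes, and k are illustrative assumptions, not GLM-5's actual architecture.

```python
import torch
import torch.nn.functional as F

def topk_moe_forward(x, router, experts, k=2):
    """Route each token to its top-k experts and mix their outputs
    by the renormalized router probabilities."""
    # x: (num_tokens, hidden); router: Linear(hidden -> num_experts)
    probs = F.softmax(router(x), dim=-1)            # (tokens, num_experts)
    topk_probs, topk_idx = probs.topk(k, dim=-1)    # (tokens, k)
    topk_probs = topk_probs / topk_probs.sum(dim=-1, keepdim=True)

    out = torch.zeros_like(x)
    for slot in range(k):
        for e, expert in enumerate(experts):
            mask = topk_idx[:, slot] == e           # tokens sent to expert e
            if mask.any():                          # only run experts with work
                out[mask] += topk_probs[mask, slot].unsqueeze(-1) * expert(x[mask])
    return out

# Toy usage: 8 small FFN experts, 2 active per token (sizes are illustrative).
hidden, num_experts = 64, 8
router = torch.nn.Linear(hidden, num_experts)
experts = [torch.nn.Sequential(torch.nn.Linear(hidden, 4 * hidden),
                               torch.nn.GELU(),
                               torch.nn.Linear(4 * hidden, hidden))
           for _ in range(num_experts)]
y = topk_moe_forward(torch.randn(10, hidden), router, experts, k=2)
```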
It outperforms GLM-4.6 on reasoning, coding, and agentic benchmarks, finishing Vending-Bench 2 with a $4,432 balance, and it leads open-source models while approaching Claude Opus 4.5. The weights are released under the MIT license on Hugging Face and ModelScope; the model supports local deployment and integrates with Z.ai for document generation and agent modes.
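As a sketch of what local deployment could look like through the standard Hugging Face transformers interface: the repo id is taken from the card title, and the chat-template call assumes the repo ships one. A model of this size realistically needs multi-GPU or multi-node serving (e.g., vLLM or SGLang), so treat this as illustrative rather than the recommended setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id assumed from the card; exact loading options (dtype, remote code,
# sharding strategy) may differ -- check the model card before running.
model_id = "zai-org/GLM-5"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",        # shard weights across available GPUs
    trust_remote_code=True,
)

messages = [{"role": "user",
             "content": "Summarize the MIT license in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```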