Compared with GLM-4.5, this generation brings several key improvements:
Longer context window: The context window has been expanded from 128K to 200K tokens, enabling the model to handle more complex...
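As a rough illustration of what the larger window enables, a long document can be sent to the model in a single request rather than being chunked. The sketch below assumes an OpenAI-compatible chat endpoint; the base_url, API-key handling, and the model identifier "glm-4.6" are assumptions for illustration, not details taken from this text.

```python
# Minimal sketch: passing a long document to GLM-4.6 in one request.
# The endpoint URL and model name below are assumptions; substitute your provider's values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.z.ai/api/paas/v4",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

# Read a long source document; with a 200K-token window this can be
# far larger than what fit in the previous 128K limit.
with open("long_report.txt", "r", encoding="utf-8") as f:
    document = f.read()

response = client.chat.completions.create(
    model="glm-4.6",  # assumed model identifier
    messages=[
        {"role": "system", "content": "Summarize the key findings of the document."},
        {"role": "user", "content": document},
    ],
)
print(response.choices[0].message.content)
```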
Unique: GLM-4.6 is trained on multilingual data with particular strength in Chinese and English, providing better performance on CJK languages than English-first models like GPT-4, while maintaining competitive performance across European languages.
vs others: Outperforms English-centric models on Chinese-language tasks and code-switching scenarios thanks to its balanced training data, while remaining competitive with specialized translation models on single-language translation tasks. A hedged example of a code-switching prompt follows below.
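The following is a minimal sketch of the code-switching scenario mentioned above: a single prompt that mixes Chinese and English. As in the earlier sketch, the endpoint URL and the model identifier "glm-4.6" are assumptions, and the prompt itself is purely illustrative.

```python
# Hypothetical sketch of a code-switching request mixing Chinese and English.
# Endpoint URL and model identifier are assumptions, not taken from this text.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.z.ai/api/paas/v4",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

# Prompt that switches between Chinese and English within one instruction.
mixed_prompt = (
    "请用中文总结下面这段英文内容, then translate that summary back into English:\n\n"
    "Large context windows allow models to reason over entire codebases "
    "and long documents in a single pass."
)

response = client.chat.completions.create(
    model="glm-4.6",  # assumed model identifier
    messages=[{"role": "user", "content": mixed_prompt}],
)
print(response.choices[0].message.content)
```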