Mistral: Mistral Nemo — scored 25/100 for "reasoning and multi-step problem solving"
A 12B-parameter model with a 128k-token context window, built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese,...
Unique: Mistral Nemo's instruction tuning includes reasoning tasks and chain-of-thought examples, so it can generate explicit reasoning steps when prompted. The 128k context window also accommodates longer reasoning chains than smaller-context models allow.
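As a minimal sketch of how prompt-based chain-of-thought is elicited in practice: the snippet below builds an OpenAI-style chat request that asks the model to number its reasoning steps before answering. The model id `open-mistral-nemo` and the message schema are assumptions about Mistral's hosted API, not details stated in this entry; the function itself only constructs the request payload and makes no network call.

```python
# Hypothetical helper for eliciting explicit chain-of-thought from
# Mistral Nemo. The model id "open-mistral-nemo" and the chat message
# schema are assumptions (OpenAI-style), not taken from this document.

def build_cot_request(question: str, max_tokens: int = 1024) -> dict:
    """Build a chat request that asks for explicit reasoning steps."""
    return {
        "model": "open-mistral-nemo",  # assumed API model id
        "max_tokens": max_tokens,
        "messages": [
            {
                "role": "system",
                "content": (
                    "Reason step by step. Number each step, then give "
                    "the final answer on a line starting with 'Answer:'."
                ),
            },
            {"role": "user", "content": question},
        ],
    }

req = build_cot_request(
    "A train travels 120 km in 1.5 hours. What is its average speed?"
)
```

The system message is where the transparency trade-off mentioned below lives: the numbered steps make the model's reasoning inspectable, at the cost of spending output tokens on them.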
vs others: Reasoning capability is weaker than that of larger (70B+) models but sufficient for many reasoning tasks. Prompt-based chain-of-thought is more transparent than implicit reasoning, though less token-efficient than specialized reasoning architectures.