Models overview

Xiaomi MiMo models

The public MiMo line spans the earlier reasoning-first 7B releases and the newer MiMo-V2-Flash family. This directory keeps the main public variants readable in one place.

Flagship reasoning and agent model

MiMo-V2-Flash

A Mixture-of-Experts model positioned around fast reasoning, coding, long context, and agentic workflows.

309B total parameters · 15B active parameters · 256k context
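To make the Mixture-of-Experts figures above concrete: "total" counts every expert's weights, while "active" counts only the weights used for a given token. A small sketch using the numbers from this card (the helper function is illustrative, not part of any MiMo tooling):

```python
def active_fraction(total_b: float, active_b: float) -> float:
    """Fraction of parameters that participate in one MoE forward pass."""
    return active_b / total_b

# MiMo-V2-Flash figures quoted above, in billions of parameters.
total_params_b = 309.0
active_params_b = 15.0

print(f"{active_fraction(total_params_b, active_params_b):.1%} of parameters active per token")
# → 4.9% of parameters active per token
```

This is why an MoE model of this size can be positioned around fast inference: per-token compute scales with the 15B active parameters, not the 309B total.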

Source: MiMo-V2-Flash README

Base model variant

MiMo-V2-Flash-Base

The base MiMo-V2-Flash variant used as the public foundation for benchmark comparisons and downstream evaluation.

309B total parameters · 15B active parameters · 256k context

Source: MiMo-V2-Flash model table

Reasoning-tuned small model

MiMo-7B-RL

The reasoning-focused public 7B model from the earlier MiMo release, tuned with reinforcement learning and positioned around math and code performance.

7B-class model · RL-tuned · Reasoning and code focus

Source: MiMo README

Earlier base checkpoint

MiMo-7B-Base

The earlier base model from the MiMo line, useful for understanding the public evolution from MiMo-7B to MiMo-V2-Flash.

7B-class model · Pre-training heavy · Reasoning-oriented

Source: MiMo README