Tongyi DeepResearch 30B A3B (Model): 24/100 via “iterative-search-refinement-with-model-directed-queries”
Tongyi DeepResearch is an agentic large language model developed by Tongyi Lab, with 30 billion total parameters of which only 3 billion are activated per token. It is optimized for long-horizon, deep information-seeking tasks.
Unique: Implements a closed-loop search strategy where the model's reasoning directly controls search execution and evaluation, rather than treating search as a separate tool invoked once. The model maintains state across search iterations and makes explicit decisions about strategy pivoting, enabling adaptive research workflows.
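The closed-loop idea above can be sketched as a small control loop: the model's reasoning step inspects accumulated state after each search pass and decides whether to refine the current query, pivot strategy, or stop. This is a minimal illustration with toy stand-ins (`toy_search`, `toy_policy` are hypothetical names invented here; in the real system the LLM's own reasoning plays the policy role):

```python
# Closed-loop, model-directed search sketch. All names are illustrative,
# not Tongyi DeepResearch's actual API.

def toy_search(query: str) -> list[tuple[str, float]]:
    # Stand-in retrieval backend: returns (snippet, relevance) pairs.
    corpus = {
        "agentic search": [("Agent issues queries, reads, refines", 0.8)],
    }
    return corpus.get(query, [("no results", 0.0)])

def toy_policy(state: dict) -> tuple[str, str]:
    # Stand-in for the model's reasoning: looks at all findings so far
    # and returns (decision, next_query). A real agent would reason in
    # natural language; here we use a fixed heuristic.
    best = max((score for _, score in state["findings"]), default=0.0)
    if best >= 0.8 or len(state["queries"]) >= 4:
        return "stop", ""
    if best == 0.0:
        # Dead end: pivot to a different search strategy.
        return "pivot", "agentic search"
    return "refine", state["queries"][-1] + " details"

def run_research(question: str, max_iters: int = 5) -> dict:
    # State persists across iterations, so later decisions see
    # every earlier query, result, and decision.
    state = {"queries": [], "findings": [], "decisions": []}
    query = question
    for _ in range(max_iters):
        state["queries"].append(query)
        state["findings"].extend(toy_search(query))
        decision, query = toy_policy(state)
        state["decisions"].append(decision)
        if decision == "stop":
            break
    return state
```

The key contrast with single-pass RAG is that retrieval sits inside the loop, and the decision trace (`state["decisions"]`) is part of the output rather than hidden in a ranking function.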
vs others: More adaptive than static RAG systems that execute a single retrieval pass, and more transparent than black-box search ranking because the model's reasoning about search strategy is part of the output.