MUM algorithm (MUM)
MUM stands for Multitask Unified Model, the language model Google announced in May 2021 as the successor to BERT. Google's claims: 1000x more capable than BERT, multimodal (text and images), and multilingual across 75 languages. MUM powers Things to Know and underpins parts of AI Overviews.
Long definition
MUM was announced at Google I/O 2021 by Pandu Nayak. The headline claims: built on the Transformer architecture (the same family as BERT), trained across 75 languages simultaneously, capable of reasoning over text and images, and "1000x more powerful than BERT" — a comparison Google never fully formalized, but one that signals an order-of-magnitude leap in parameter count and training data.
The unique pitch of MUM is multitask reasoning. The example Google used at launch: a query like "I've hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?" requires understanding fitness levels, comparing two specific mountains, factoring in seasonal differences, and synthesizing across many sources — possibly in different languages. A traditional retrieval system can't reason across that complexity. MUM was designed to.
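What that kind of reasoning implies in practice is easier to see as a pipeline: decompose the query into facets, gather evidence per facet (possibly in other languages or modalities), then synthesize one answer. The sketch below is purely conceptual; Google has not published MUM's internals, and every type and function in it is hypothetical.

```python
# Conceptual fan-out over the Mt. Fuji query. All names here are hypothetical;
# this illustrates the idea of multitask reasoning, not MUM's implementation.
from dataclasses import dataclass

@dataclass
class SubTask:
    question: str   # one facet of the original query
    modality: str   # "text" or "image" evidence
    language: str   # language of the corpus the answer likely lives in

def decompose(query: str) -> list[SubTask]:
    # A multitask model would infer these facets; hard-coded for illustration.
    return [
        SubTask("How do Mt. Adams and Mt. Fuji compare in elevation and terrain?", "text", "en"),
        SubTask("What is Mt. Fuji's weather like in autumn?", "text", "ja"),
        SubTask("What gear do hikers carry on Mt. Fuji?", "image", "ja"),
    ]

def retrieve(task: SubTask) -> str:
    # Stand-in for cross-lingual, multimodal retrieval.
    return f"[evidence: {task.question} ({task.language}/{task.modality})]"

def synthesize(query: str, evidence: list[str]) -> str:
    # Stand-in for generating one answer from many sub-answers.
    return f"{len(evidence)} sub-answers synthesized for: {query}"

query = "I've hiked Mt. Adams and want to hike Mt. Fuji next fall; how should I prepare?"
print(synthesize(query, [retrieve(t) for t in decompose(query)]))
```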
MUM's actual deployment in Search has been gradual and quieter than the launch announcement implied:
- Things to Know — this SERP feature, which surfaces related sub-aspects of a topic, uses MUM-derived classification.
- Search refinements and "broaden/narrow your search" — MUM helps generate the related-query suggestions in the AI Overviews experience.
- About this result — context about source domains is informed by MUM-derived signals.
- AI Overviews (formerly SGE, the Search Generative Experience) — Google has been less explicit about MUM's role here, but the multimodal, multilingual reasoning capability is foundational to the AI-generated answer features.
- Vaccine information surfacing (2021-2022) — Google used MUM to identify 800+ variations of vaccine names across more than 50 languages, so that COVID-19 vaccine information panels triggered consistently worldwide; the cross-lingual matching idea is sketched after this list.
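As a rough illustration of the cross-lingual matching the vaccine work relied on, the sketch below uses off-the-shelf multilingual sentence embeddings (the sentence-transformers library and a public MiniLM model, both stand-ins rather than anything Google has described): name variants of one vaccine embed close together regardless of language, so a single model can recognize them without per-language rules.

```python
# Cross-lingual matching with multilingual embeddings. A stand-in sketch,
# not Google's pipeline: variants of one vaccine name cluster together.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

variants = [
    "AstraZeneca vaccine",   # English brand name
    "Vaxzevria",             # EU trade name for the same vaccine
    "Covishield",            # Indian trade name for the same vaccine
    "vacuna de AstraZeneca", # Spanish
    "hiking boots",          # unrelated control query
]

embeddings = model.encode(variants, convert_to_tensor=True)
similarity = util.cos_sim(embeddings, embeddings)

# Same-vaccine variants score high against each other; the control does not.
for i in range(1, len(variants)):
    print(f"{variants[0]!r} vs {variants[i]!r}: {similarity[0][i].item():.2f}")
```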
MUM is part of Google's broader generative AI stack alongside PaLM 2 and Gemini. Where BERT is purely for understanding, MUM and its successors can both understand and generate — although Google has been careful not to deploy MUM as an open-ended generator in the way ChatGPT launched; the user-facing surfaces remain conservative.
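The understand-versus-generate distinction is concrete at the architecture level. Below is a minimal sketch, assuming the Hugging Face transformers library and two publicly available stand-in models (MUM itself is not released): an encoder-only model like BERT can only fill a slot it can see, while a text-to-text encoder-decoder (the T5 framework MUM reportedly builds on) maps input text to newly generated output text.

```python
# Minimal contrast: encoder-only (understanding) vs. text-to-text (generation).
# Both models are public stand-ins; MUM itself is not publicly available.
from transformers import pipeline

# Encoder-only (BERT): can score or complete a masked slot, nothing more.
fill = pipeline("fill-mask", model="bert-base-uncased")
print(fill("Mt. Fuji is the tallest [MASK] in Japan.")[0]["token_str"])

# Encoder-decoder (T5 family): maps input text to output text, so it can
# answer in free prose (quality aside for a model this small).
generate = pipeline("text2text-generation", model="google/flan-t5-small")
print(generate("What should I pack for an autumn hike on Mt. Fuji?",
               max_new_tokens=40)[0]["generated_text"])
```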
Common misconceptions
- "You can optimize for MUM." No more than for BERT — meaning, write naturally and answer questions thoroughly. There's no MUM-specific keyword strategy. The system rewards comprehensiveness and topical authority because it's looking for content that actually addresses multifaceted queries.
- "MUM replaced BERT." They coexist. Google has many models in the ranking stack — RankBrain, BERT, MUM, neural matching, and others. MUM is added capacity for harder queries, not a replacement for the rest.
- "MUM is what makes AI Overviews work." It's part of the foundation, but AI Overviews specifically use Gemini-family models for the generative output. MUM contributed to the underlying retrieval and reasoning research that AI Overviews builds on.
- "1000x more capable means 1000x better rankings." The 1000x figure refers to model capability and training scale, not search-result quality. The user-facing impact has been incremental — better disambiguation on hard queries, not a wholesale overhaul of ranking.