DeML OS Daily: Latest Frontier Analysis
Explore Frontier
02.05
2026
Thu
📄
Paper
Mosaic Learning: A Framework for Decentralized Learning with Model Fragmentation https://arxiv.org/abs/2602.04352
Sayan Biswas Fragmentation Non-IID Gossip

Notes

DeML OS Q & A
Deep Dive 💬
02.05
2026
Thu
😇
How is Mosaic Learning intuitively different from standard decentralized learning?
Standard decentralized learning exchanges full models each round; Mosaic Learning instead partitions each model into fragments and routes different fragments along different, independently sampled paths. This spreads information more evenly across the network, similar to splitting a monolithic check into multiple independent checks.
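As a toy sketch of this contrast (not the paper's implementation; the function name `mosaic_round`, the push-then-average rule, and the fanout are assumptions for illustration), one fragment-wise gossip round might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def mosaic_round(models, k, fanout):
    """One illustrative Mosaic-style round: each node splits its flat
    parameter vector into k fragments, pushes each fragment to an
    independently sampled set of `fanout` peers, then averages every
    copy of each fragment it holds (its own plus the received ones)."""
    frags = {i: np.array_split(m, k) for i, m in models.items()}
    # inbox[node][j] starts with the node's own j-th fragment
    inbox = {i: [[frags[i][j]] for j in range(k)] for i in models}
    for i in models:
        for j in range(k):
            # each fragment independently picks its own receivers,
            # so different fragments travel over different paths
            peers = rng.choice([p for p in models if p != i],
                               size=fanout, replace=False)
            for p in peers:
                inbox[p][j].append(frags[i][j])
    return {i: np.concatenate([np.mean(copies, axis=0)
                               for copies in inbox[i]])
            for i in models}

models = {i: rng.normal(size=8) for i in range(6)}
updated = mosaic_round(models, k=4, fanout=2)
```

Standard decentralized learning corresponds to the special case k=1: one peer set per node, and the entire vector shipped along it.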
😎
😊
Why does model fragmentation not hurt convergence guarantees?
Each fragment's gossip operator matches Epidemic Learning's in expectation. Analyzing the fragments separately and aggregating the per-fragment results therefore yields the same worst-case convergence rate as EL, independent of the number of fragments K, akin to batch evaluation preserving asymptotic cost.
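A quick Monte Carlo check of the "same operator in expectation" point (the push-then-average mixing rule below is an assumed stand-in for EL's operator, not the paper's exact construction): two different fragments experience independent draws of the same random mixing matrix, so their empirical mean operators agree up to sampling noise.

```python
import numpy as np

def push_matrix(n, fanout, rng):
    """Mixing matrix one fragment experiences in one round: node i pushes
    to `fanout` random peers; every node averages the copies it holds.
    Entry [p, i] is node i's weight in node p's post-round fragment."""
    holds = np.eye(n)                  # holds[p, i] = 1 if p has i's copy
    for i in range(n):
        peers = rng.choice(np.delete(np.arange(n), i),
                           size=fanout, replace=False)
        holds[peers, i] = 1.0
    return holds / holds.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
n, fanout, trials = 8, 2, 4000
# independent draws, as seen by two different fragments of the same model
mean_a = np.mean([push_matrix(n, fanout, rng) for _ in range(trials)], axis=0)
mean_b = np.mean([push_matrix(n, fanout, rng) for _ in range(trials)], axis=0)
```

Both empirical means are row-stochastic and coincide up to Monte Carlo noise, which is why a per-fragment analysis can reuse EL's expected-operator bounds and the worst-case rate does not degrade as K grows.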
😎
🤓
From a linear-systems view, why does increasing the number of fragments K speed up consensus?
Fragmentation yields a block-diagonal gossip structure where parameter subspaces mix independently. For quadratic objectives, this lowers the largest eigenvalue of the contraction matrix, so consensus error decays faster per round, similar to decomposing sums in sum-check protocols.
😎