Abstract
The Minimum Message Length (MML) Principle is an information-theoretic approach to induction, hypothesis testing, model selection, and statistical inference. MML, which provides a formal specification for the implementation of Occam's Razor, asserts that the 'best' hypothesis for observed data is the one that yields the shortest two-part message: a statement of the hypothesis, followed by a statement of the data encoded using that hypothesis. MML was first published by Chris Wallace and David Boulton in 1968.
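The two-part message idea can be illustrated with a toy sketch. This is not Wallace's Strict MML construction; the 1-bit model-naming cost and the 5-bit parameter precision below are illustrative assumptions chosen for the example, as is the coin-flip data set.

```python
import math

# Toy two-part MML comparison (illustrative only, not Strict MML):
# total message length = length(hypothesis) + length(data | hypothesis), in bits.

def log2_bernoulli_likelihood(p, n, k):
    """log2 probability of k heads in n flips under head-probability p."""
    return k * math.log2(p) + (n - k) * math.log2(1 - p)

n, k = 100, 60  # observed data: 60 heads in 100 flips

# Hypothesis A: fair coin. Assumed cost: 1 bit to name the model, no parameters.
len_fair = 1 - log2_bernoulli_likelihood(0.5, n, k)

# Hypothesis B: biased coin, with its probability stated to 1/32 precision.
# Assumed cost: 1 bit to name the model + 5 bits for the quantised parameter.
p_hat = round(k / n * 32) / 32
len_biased = 1 + 5 - log2_bernoulli_likelihood(p_hat, n, k)

# MML prefers the hypothesis giving the shorter total message.
best = "fair" if len_fair < len_biased else "biased"
print(f"fair: {len_fair:.2f} bits, biased: {len_biased:.2f} bits -> {best}")
```

At 60 heads in 100 flips, the biased model's better data fit saves fewer bits than its parameter costs to state, so the simpler fair-coin hypothesis wins; with more extreme data the balance tips the other way. This is the Occam's Razor trade-off the abstract describes.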
Recommended Reading
Allison L (2009) MML website. http://www.allisons.org/ll/MML/
Dowe DL, Gardner SB, Oppy G (2007) Bayes not bust!: why simplicity is no problem for Bayesians. Brit J Phil Sci 58:709–754
Dowty JG (2013) SMML estimators for 1-dimensional continuous data. Comput J. doi:10.1093/comjnl/bxt145
Dai H, Korb KB, Wallace CS, Wu X (1997) A study of causal discovery with weak links and small samples. In: Proceedings of the fifteenth international joint conference on artificial intelligence. Morgan Kaufmann, San Francisco, pp 1304–1309
Edgoose T, Allison L (1999) MML Markov classification of sequential data. Stat Comput 9(4):269–278
Farr GE, Wallace CS (2002) The complexity of strict minimum message length inference. Comput J 45(3):285–292
Grünwald P (2008) The minimum description length principle. MIT Press, Cambridge
Honkela A, Valpola H (2004) Variational learning and bits-back coding: an information-theoretic view to Bayesian learning. IEEE Trans Neural Netw 15(4):800–810
Lanterman AD (2001) Schwarz, Wallace and Rissanen: intertwining themes in theories of model selection. Int Stat Rev 69(2):185–212
MML software: www.datamining.monash.edu.au/software, http://allisons.org/ll/Images/People/Wallace/FactorSnob/
Neil JR, Wallace CS, Korb KB (1999) Learning Bayesian networks with restricted interactions. In: Laskey K, Prade H (eds) Proceedings of the fifteenth conference on uncertainty in artificial intelligence (UAI-99), Stockholm, pp 486–493
O’Donnell R, Allison L, Korb K (2006) Learning hybrid Bayesian networks by MML. Lecture notes in computer science: AI 2006 – Advances in artificial intelligence, vol 4304. Springer, Berlin/New York, pp 192–203
Wallace CS (1990) Classification by minimum-message length inference. In: Akl SG et al (eds) Advances in computing and information – ICCI 1990. Lecture notes in computer science, vol 468. Springer, Berlin
Wallace CS (2005) Statistical and inductive inference by minimum message length. Information science and statistics. Springer, New York
Wallace CS, Boulton DM (1968) An information measure for classification. Comput J 11:185–194
Wallace CS, Boulton DM (1975) An information measure for single-link classification. Comput J 18(3):236–238
Wallace CS, Dowe DL (1999) Minimum message length and Kolmogorov complexity. Comput J 42(4):330–337
Wallace CS, Freeman PR (1987) Estimation and inference by compact coding. J R Stat Soc Ser B 49:240–252
Wallace CS, Patrick JD (1993) Coding decision trees. Mach Learn 11:7–22
Copyright information
© 2017 Springer Science+Business Media New York
Cite this entry
Baxter, R.A. (2017). Minimum Message Length. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_547
DOI: https://doi.org/10.1007/978-1-4899-7687-1_547
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4899-7685-7
Online ISBN: 978-1-4899-7687-1
eBook Packages: Computer Science, Reference Module Computer Science and Engineering