Learning Bayesian Networks Based on a Mutual Information Scoring Function and EMI Method

  • Conference paper
Advances in Neural Networks – ISNN 2007

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4492)

Abstract

At present, most algorithms for learning Bayesian Networks (BNs) use the EM algorithm to deal with incomplete data. They are inefficient because the EM algorithm must run an iterative process of probabilistic inference to complete the incomplete data. In this paper we present an efficient BN learning algorithm that combines the EMI method with a scoring function based on mutual information theory. The algorithm first uses the EMI method to estimate, from incomplete data, the probability distributions over local structures of BNs; it then evaluates BN structures with the scoring function and searches for the best one. The detailed procedure of the algorithm is given in the paper. Experimental results on the Asia and Alarm networks show that, while achieving high accuracy, the algorithm is much more efficient than two EM-based algorithms, SEM and EM-EA.
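The scoring step described in the abstract can be illustrated with a short sketch. This is not the paper's actual algorithm: it assumes complete data (the paper's EMI method exists precisely to estimate these local distributions from incomplete data), and the `mutual_information` and `score` helpers are hypothetical names introduced here. A structure is scored by summing, over each node, the empirical mutual information between the node and its parent set; a search procedure would then compare candidate structures by this score.

```python
# Illustrative sketch (assumption: complete data; not the paper's EMI-based
# estimation). Scores a candidate BN structure by summing, per node, the
# empirical mutual information I(X; Pa(X)) between the node and its parents.
import math
from collections import Counter

def mutual_information(data, x, parents):
    """Empirical I(X; Pa(X)) from complete records (a list of dicts)."""
    if not parents:
        return 0.0  # a node with no parents contributes nothing
    n = len(data)
    joint = Counter((row[x], tuple(row[p] for p in parents)) for row in data)
    px = Counter(row[x] for row in data)
    ppa = Counter(tuple(row[p] for p in parents) for row in data)
    mi = 0.0
    for (xv, pav), count in joint.items():
        p_joint = count / n
        mi += p_joint * math.log(p_joint / ((px[xv] / n) * (ppa[pav] / n)))
    return mi

def score(data, structure):
    """structure: dict mapping each node to a list of its parent nodes."""
    return sum(mutual_information(data, x, pa) for x, pa in structure.items())
```

For example, on data where variable B mostly follows A, the structure with the edge A→B scores higher than the empty structure, so a greedy search over edge additions would prefer it.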

References

  1. Cheng, J., Greiner, R., Kelly, J., Bell, D.A., Liu, W.: Learning Bayesian Networks from Data: An Information-Theory Based Approach. Artificial Intelligence 137, 43–90 (2002)

  2. Chickering, D.M., Heckerman, D.: Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables. Machine Learning 29, 181–212 (1997)

  3. Chow, C.K., Liu, C.N.: Approximating Discrete Probability Distributions with Dependence Trees. IEEE Trans. Information Theory 14, 462–467 (1968)

  4. Dash, D., Druzdzel, M.J.: Robust Independence Testing for Constraint-based Learning of Causal Structure. In: UAI, pp. 167–174 (2003)

  5. Friedman, N.: The Bayesian Structural EM Algorithm. In: Fourteenth Conf. on Uncertainty in Artificial Intelligence (1998)

  6. Heckerman, D.: Bayesian Networks for Data Mining. Data Mining and Knowledge Discovery 1, 79–119 (1997)

  7. Tsamardinos, I., Brown, L.E., Aliferis, C.F.: The Max-Min Hill-Climbing Bayesian Network Structure Learning Algorithm. Machine Learning 65(1), 31–78 (2006)

  8. Lam, W., Bacchus, F.: Learning Bayesian Belief Networks: An Approach based on the MDL Principle. Computational Intelligence 10, 269–293 (1994)

  9. Myers, J., Laskey, K., Levitt, T.: Learning Bayesian Networks from Incomplete Data with Stochastic Search Algorithms. In: Fifteenth Conf. on Uncertainty in Artificial Intelligence (UAI 1999)

  10. Singh, M.: Learning Bayesian Networks from Incomplete Data. In: The 14th National Conf. on Artificial Intelligence (1997)

  11. Spirtes, P., Glymour, C., Scheines, R.: Causation, Prediction, and Search, 2nd edn. MIT Press, Cambridge (2001)

  12. Suzuki, J.: Learning Bayesian Belief Networks Based on the MDL Principle: An Efficient Algorithm Using Branch and Bound Techniques. In: Saitta, L. (ed.) Proceedings of the 13th International Conference on Machine Learning, Bari, pp. 462–470. Morgan Kaufmann, San Francisco (1996)

  13. Tian, F., Lu, Y.-c., Shi, C.-Y.: Learning Bayesian Networks with Hidden Variables Using the Combination of EM and Evolutionary Algorithms. In: Cheung, D., Williams, G.J., Li, Q. (eds.) PAKDD 2001. LNCS (LNAI), vol. 2035, pp. 568–574. Springer, Heidelberg (2001)

  14. Tian, F., Zhang, H., Lu, Y.: Learning Bayesian Networks from Incomplete Data based on EMI Method. In: Proceedings of the 3rd IEEE International Conference on Data Mining (ICDM 2003), Melbourne, Florida, USA, pp. 323–330 (2003)

  15. Zhang, S., Wang, X.: Algorithm for Bayesian Networks Structure Learning based on Information Entropy. Mini-Micro Computer Systems 26, 983–986 (2005)

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Tian, F., Li, H., Wang, Z., Yu, J. (2007). Learning Bayesian Networks Based on a Mutual Information Scoring Function and EMI Method. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_50

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-72393-6_50

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72392-9

  • Online ISBN: 978-3-540-72393-6

  • eBook Packages: Computer Science, Computer Science (R0)
