Abstract
Modular structures are ubiquitous in biological brains and neural networks. Inspired by these biological networks, we explore Hopfield-type recurrent neural networks with sparse modular connectivity for associative memory. We first show that an iterative learning algorithm, which determines the connection weights according to the network topology, outperforms the one-shot learning rule. We then examine the topological factors that govern the memory capacity of sparse modular neural networks. Numerical results suggest that uniformity in the number of connections per neuron is an essential condition for good performance. Finally, we discuss a method for designing energy-efficient neural networks.
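The setting described above can be illustrated with a minimal sketch: a Hopfield network whose connectivity is restricted to a sparse modular mask, trained with the one-shot (Hebbian) rule that the paper uses as a baseline. The module counts, connection probabilities, and noise level below are illustrative assumptions, not parameters taken from the paper, and the paper's iterative learning algorithm is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def modular_mask(n_modules=4, module_size=25, p_intra=0.5, p_inter=0.05):
    """Sparse modular connectivity: dense within modules, sparse between them."""
    n = n_modules * module_size
    labels = np.repeat(np.arange(n_modules), module_size)
    same = labels[:, None] == labels[None, :]
    prob = np.where(same, p_intra, p_inter)
    mask = (rng.random((n, n)) < prob).astype(float)
    mask = np.triu(mask, 1)          # symmetric coupling, no self-connections
    return mask + mask.T

def hebbian_weights(patterns, mask):
    """One-shot Hebbian rule, restricted to the existing connections."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    return W * mask

def recall(W, state, steps=20):
    """Synchronous sign-update dynamics until a fixed point is reached."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

mask = modular_mask()
pattern = rng.choice([-1.0, 1.0], size=mask.shape[0])
W = hebbian_weights([pattern], mask)

# Recall from a noisy cue: flip 10% of the stored pattern's bits.
cue = pattern.copy()
flip = rng.choice(len(cue), size=len(cue) // 10, replace=False)
cue[flip] *= -1
overlap = np.mean(recall(W, cue) == pattern)
```

With a single stored pattern, each connected neuron's local field has the sign of its stored bit, so the pattern itself is a fixed point of the dynamics and the noisy cue typically converges back to it; the interesting regime studied in the paper is how many patterns can be stored before this recall breaks down, and how that capacity depends on the topology.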
© 2014 Springer International Publishing Switzerland
Cite this paper
Tanaka, G., Yamane, T., Nakano, D., Nakane, R., Katayama, Y. (2014). Hopfield-Type Associative Memory with Sparse Modular Networks. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8834. Springer, Cham. https://doi.org/10.1007/978-3-319-12637-1_32
Print ISBN: 978-3-319-12636-4
Online ISBN: 978-3-319-12637-1