Abstract
Recurrent neural networks are frequently used in the cognitive science community for modeling linguistic structures. Usually a more or less intensive training process is performed, but several works have shown that untrained recurrent networks initialized with small weights can also be used successfully for this kind of task. In this work we demonstrate that the state-space organization of an untrained recurrent neural network can be significantly improved by choosing appropriate input representations. We support this claim experimentally on several linguistic time series.
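The following minimal sketch (ours, not the authors' code; the network size, weight scale, symbol alphabet, and both encodings are illustrative assumptions) shows the mechanism the abstract refers to. An untrained network with small recurrent weights has contractive dynamics, so its states cluster by recent input history; how well those clusters separate depends on the input representation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_HIDDEN = 50        # reservoir size (illustrative choice)
SCALE = 0.1          # small weights keep the dynamics contractive
ALPHABET = ["a", "b"]

# Untrained weights, drawn once and never adapted.
W_rec = rng.uniform(-SCALE, SCALE, size=(N_HIDDEN, N_HIDDEN))
W_in = rng.uniform(-0.5, 0.5, size=(N_HIDDEN, 2))

def run_states(seq, encoding):
    """Drive the untrained network over seq; return all hidden states."""
    h = np.zeros(N_HIDDEN)
    states = []
    for sym in seq:
        h = np.tanh(W_rec @ h + W_in @ encoding[sym])
        states.append(h.copy())
    return np.array(states)

# Two hypothetical input representations of the same two-symbol alphabet:
one_hot = {"a": np.array([1.0, 0.0]), "b": np.array([0.0, 1.0])}
dense   = {"a": np.array([0.9, 0.8]), "b": np.array([1.0, 0.7])}  # nearly collinear

seq = list(rng.choice(ALPHABET, size=500))

for name, enc in [("one-hot", one_hot), ("dense", dense)]:
    states = run_states(seq, enc)
    # Group states by the symbol that produced them; the distance between
    # group centroids is a crude proxy for state-space organization.
    prev = np.array(seq)
    centroids = {s: states[prev == s].mean(axis=0) for s in ALPHABET}
    gap = np.linalg.norm(centroids["a"] - centroids["b"])
    print(f"{name:8s} encoding: centroid separation = {gap:.3f}")
```

With the one-hot encoding the state clusters for the two symbols sit noticeably farther apart than with the nearly collinear dense encoding, which is the kind of input-representation effect the paper studies on linguistic sequences.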
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Čerňanský, M., Makula, M., Beňušková, Ľ. (2009). Improving the State Space Organization of Untrained Recurrent Networks. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5506. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02490-0_82
DOI: https://doi.org/10.1007/978-3-642-02490-0_82
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-02489-4
Online ISBN: 978-3-642-02490-0