Abstract
In this paper we propose a new kernel for Support Vector Machine learning, inspired by the biological world. The kernel is based on Gabor filters, which are a good model for the response of cells in the primary visual cortex and have been shown to be very effective in processing natural images. Furthermore, we build a link between energy efficiency, a driving force in biological processing systems, and the good generalization ability of learning machines. This connection can serve as a starting point for developing new kernel-based learning algorithms.
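The abstract does not spell out the kernel's construction; as a minimal sketch of the general idea, a Gabor-style kernel can be formed by modulating a Gaussian (RBF) envelope with a cosine term. The parameters `sigma` and `omega` below are illustrative assumptions, not the authors' choices, and the paper's actual parameterization (e.g. per-dimension frequencies, 2D image filters) may differ.

```python
import numpy as np

def gabor_kernel(x, y, sigma=1.0, omega=2.0):
    """Gabor-style kernel: Gaussian envelope modulated by a cosine.

    k(x, y) = exp(-||x - y||^2 / (2 sigma^2)) * cos(omega * sum(x - y))

    Both factors are positive semi-definite kernels (the cosine term by
    Bochner's theorem), so their product is a valid SVM kernel.
    sigma and omega are illustrative placeholders.
    """
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2)) * np.cos(omega * d.sum())

def gram_matrix(X, **kw):
    """Gram matrix K[i, j] = k(X[i], X[j]), as consumed by a kernel SVM."""
    n = len(X)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = gabor_kernel(X[i], X[j], **kw)
    return K

X = np.random.RandomState(0).randn(5, 2)  # toy data
K = gram_matrix(X)
```

A Gram matrix built this way could be handed to any SVM solver that accepts precomputed kernels, e.g. scikit-learn's `SVC(kernel='precomputed')`.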
References
Anguita, D., Pischiutta, S., Ridella, S., Sterpi, D.: Feed-forward Support Vector Machines without Multipliers. IEEE Trans. on Neural Networks (in press, 2006)
Anguita, D., Boni, A., Ridella, S.: A Digital Architecture for Support Vector Machines: Theory, Algorithm and FPGA Implementation. IEEE Trans. on Neural Networks 14, 993–1009 (2003)
Anthony, M., Bartlett, P.L.: Neural Network Learning: Theoretical Foundations. Cambridge University Press, Cambridge (1999)
Bartlett, P.L.: The Sample Complexity of Pattern Classification with Neural Networks: The Size of the Weights is More Important than the Size of the Network. IEEE Transactions on Information Theory 44, 525–536 (1998)
Chandrakasan, A., Brodersen, R.: Minimizing power consumption in digital CMOS circuits. Proc. of the IEEE 83, 498–523 (1995)
Cortes, C., Vapnik, V.: Support-vector networks. Machine Learning 20, 273–297 (1995)
Genton, M.G.: Classes of kernels for machine learning: a statistics perspective. Journal of Machine Learning Research 2, 299–312 (2001)
Herbrich, R.: Learning Kernel Classifiers: Theory and Algorithms. MIT Press, Cambridge (2002)
Herbrich, R., Graepel, T., Shawe-Taylor, J.: Sparsity vs. Margins for Linear Classifiers. In: Proc. of the 13th Conf. on Computational Learning Theory, pp. 304–308 (2000)
Hertz, J., Krogh, A., Palmer, R.G.: Introduction to the Theory of Neural Computation. Addison-Wesley, Reading (1997)
The International Technology Roadmap for Semiconductors. ITRS (2005), http://public.itrs.net
Jones, J.P., Palmer, L.A.: An evaluation of the two-dimensional Gabor filter model of simple receptive fields in cat striate cortex. J. Neurophysiol. 58, 1233–1258 (1987)
Mitchell, T.: Machine learning. McGraw-Hill, New York (1997)
Parhami, B.: Computer arithmetic: algorithms and hardware design. Oxford University Press, Oxford (2000)
Poggio, T., Girosi, F.: Networks for approximation and learning. Proc. of the IEEE 78, 1481–1497 (1990)
Poggio, T., Girosi, F.: A Theory of Networks for Approximation and Learning. Technical Report 1140, MIT AI Lab (1989)
Poggio, T., Mukherjee, S., Rifkin, R., Rakhlin, A., Verri, A.: b. Technical Report 198, MIT CBCL (2001)
Rätsch, G., Onoda, T., Müller, K.R.: Soft margins for AdaBoost. Machine Learning 42, 287–320 (2001)
Rumelhart, D.E., McClelland, J.L.: Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press, Cambridge (1986)
Schmidhuber, J.: Discovering neural nets with low Kolmogorov complexity and high generalization capability. Neural Networks 10, 857–873 (1997)
Schölkopf, B., Sung, K., Burges, C., Girosi, F., Niyogi, P., Poggio, T., Vapnik, V.: Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Trans. on Signal Processing 45, 2758–2765 (1997)
Shawe-Taylor, J., Bartlett, P.L., Williamson, R.C., Anthony, M.: Structural Risk Minimization over Data-dependent Hierarchies. IEEE Trans. on Information Theory 44, 1926–1940 (1998)
Valiant, L.G.: A theory of the learnable. Comm. of the ACM 27, 1134–1142 (1984)
Vapnik, V.: Statistical Learning Theory. John Wiley & Sons, Chichester (1998)
Vapnik, V.: The Nature of Statistical Learning Theory, 2nd edn. Springer, Heidelberg (2000)
Vincent, B.T., Baddeley, R.J.: Synaptic energy efficiency in retinal processing. Vision Research 43, 1283–1290 (2003)
Wang, Y., Chua, C.-S.: Face recognition from 2D and 3D images using 3D Gabor filters. Image and Vision Computing 23, 1018–1028 (2005)
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Anguita, D., Sterpi, D. (2006). Nature Inspiration for Support Vector Machines. In: Gabrys, B., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2006. Lecture Notes in Computer Science(), vol 4252. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893004_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46537-9
Online ISBN: 978-3-540-46539-3
eBook Packages: Computer Science (R0)