Abstract
Unlike previous SVM algorithms, which use a kernel to evaluate dot products of data points in a feature space, here points are explicitly mapped into a feature space by a Single hidden Layer Feedforward Network (SLFN) whose input weights are randomly generated. In theory this formulation, which can be interpreted as a special form of Regularization Network (RN), tends to provide better generalization performance than the Extreme Learning Machine (ELM), an existing algorithm for SLFNs, and it leads to an extremely simple and fast nonlinear SVM algorithm that requires only the inversion of a potentially small matrix whose order is independent of the size of the training dataset. The experimental results show that the proposed Extreme SVM produces better generalization performance than ELM in almost all cases, and runs much faster than other nonlinear SVM algorithms while achieving comparable accuracy.
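The abstract's recipe, an explicit random feature map followed by a regularized least-squares solve over a matrix whose order equals the number of hidden nodes, can be sketched as follows. This is a minimal NumPy illustration under assumptions (tanh activation, a ridge-style system matrix I/C + HᵀH), not the authors' implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def esvm_train(X, y, n_hidden=50, C=1.0, seed=None):
    """Sketch of an Extreme-SVM-style classifier: a random hidden layer
    maps points explicitly into feature space, then output weights are
    found by a regularized least-squares solve."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))  # random input weights, never trained
    b = rng.standard_normal(n_hidden)       # random hidden biases
    H = np.tanh(X @ W + b)                  # explicit feature mapping of the data
    # Solve (I/C + H^T H) beta = H^T y: an (n_hidden x n_hidden) system,
    # so the matrix to invert is independent of the number of training samples.
    A = np.eye(n_hidden) / C + H.T @ H
    beta = np.linalg.solve(A, H.T @ y)
    return W, b, beta

def esvm_predict(X, model):
    """Classify by the sign of the network output on the mapped points."""
    W, b, beta = model
    return np.sign(np.tanh(X @ W + b) @ beta)
```

The training cost is dominated by forming HᵀH and solving one small linear system, which is why the method scales well with dataset size when the hidden layer is kept small.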
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Liu, Q., He, Q., Shi, Z. (2008). Extreme Support Vector Machine Classifier. In: Washio, T., Suzuki, E., Ting, K.M., Inokuchi, A. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2008. Lecture Notes in Computer Science(), vol 5012. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-68125-0_21
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-68124-3
Online ISBN: 978-3-540-68125-0