Abstract
In subspace pattern recognition, the basis vectors represent the features of the data and define each class. Previous work derived these basis vectors with standard principal component analysis (PCA). Compared with standard PCA, nonlinear PCA captures higher-order statistics and yields non-orthogonal basis vectors. We combine nonlinear PCA with a subspace classifier to extract edge and line features from images. Simulation results indicate that the basis vectors obtained by nonlinear PCA classify edge patterns better than those obtained by linear PCA.
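The two ingredients of the abstract can be sketched concretely. The following is a minimal illustration, not the authors' implementation: it uses one common form of the nonlinear PCA learning rule (a Hebbian update passed through a `tanh` nonlinearity, as analyzed by Oja and by Karhunen and Joutsensalo), and a subspace classifier that assigns a pattern to the class whose learned subspace reconstructs it best. Because a nonlinear-PCA basis need not be orthogonal, the projection is computed by least squares rather than by simple inner products. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def nonlinear_pca(X, n_components, lr=0.01, n_epochs=50, seed=0):
    """Learn a basis with a nonlinear PCA rule (illustrative sketch).

    X: (n_samples, dim) zero-mean data.
    Update: dW = lr * g(y) (x - W^T g(y))^T with g = tanh and y = W x.
    The tanh nonlinearity brings in higher-order statistics, and the
    resulting rows of W are not forced to be orthogonal.
    """
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    W = 0.1 * rng.standard_normal((n_components, dim))
    for _ in range(n_epochs):
        for x in X:
            y = W @ x
            g = np.tanh(y)
            W += lr * np.outer(g, x - W.T @ g)
    return W  # rows span the learned (possibly non-orthogonal) subspace

def subspace_classify(x, bases):
    """Assign x to the class whose subspace reconstructs it best.

    For each class basis W, solve min_c ||x - W^T c|| by least squares
    (handles non-orthogonal bases) and pick the class with the
    smallest residual.
    """
    errs = []
    for W in bases:
        coef, *_ = np.linalg.lstsq(W.T, x, rcond=None)
        errs.append(np.linalg.norm(x - W.T @ coef))
    return int(np.argmin(errs))
```

As a toy usage, one can train a one-dimensional basis per class on two synthetic pattern families (e.g. signals concentrated along different axes) and verify that `subspace_classify` recovers the generating class; in the paper the same scheme is applied to edge and line patterns extracted from image windows.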
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, Y.-W., Zeng, X.-Y. (2004). Image Feature Representation by the Subspace of Nonlinear PCA. In: Negoita, M.G., Howlett, R.J., Jain, L.C. (eds.) Knowledge-Based Intelligent Information and Engineering Systems. KES 2004. Lecture Notes in Computer Science, vol. 3214. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30133-2_43
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23206-3
Online ISBN: 978-3-540-30133-2