
Image Feature Representation by the Subspace of Nonlinear PCA

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2004)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3214)


Abstract

In subspace pattern recognition, the basis vectors represent the features of the data and define the class. In previous work, standard principal component analysis (PCA) was used to derive the basis vectors. Compared with standard PCA, nonlinear PCA can capture higher-order statistics and yields non-orthogonal basis vectors. We combine nonlinear PCA with a subspace classifier to extract edge and line features from an image. Simulation results indicate that the basis vectors from nonlinear PCA classify edge patterns better than those from linear PCA.
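The approach described in the abstract can be sketched in two parts: learning a per-class basis with a nonlinear PCA rule, and classifying a pattern by which class subspace best reconstructs it. The sketch below is a minimal illustration, not the authors' code; the tanh nonlinearity, learning rate, and epoch count are assumptions, following the Karhunen-Joutsensalo nonlinear PCA subspace learning rule, and the classifier uses a pseudo-inverse projection since nonlinear PCA bases need not be orthogonal.

```python
import numpy as np

def nonlinear_pca(X, k, lr=0.01, epochs=50, seed=0):
    """Learn k basis vectors with a nonlinear PCA subspace rule (sketch).

    X: (n, d) array of samples (rows). Returns W: (d, k) basis matrix.
    The update is W += lr * (x - W y) y^T with y = tanh(W^T x); the
    nonlinearity is what brings in higher-order statistics.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = 0.1 * rng.standard_normal((d, k))
    for _ in range(epochs):
        for x in rng.permutation(X):
            y = np.tanh(W.T @ x)               # nonlinear outputs
            W += lr * np.outer(x - W @ y, y)   # reconstruction-error update
    return W

def subspace_classify(x, bases):
    """Assign x to the class whose subspace retains most of its energy.

    bases: list of (d, k_c) basis matrices, one per class. Because the
    basis vectors may be non-orthogonal, the orthogonal projector onto
    span(W) is built with the Moore-Penrose pseudo-inverse.
    """
    scores = []
    for W in bases:
        P = W @ np.linalg.pinv(W)              # projector onto span(W)
        scores.append(np.linalg.norm(P @ x))
    return int(np.argmax(scores))
```

In use, one basis would be trained per class of image patches (e.g. edge orientations), and a test patch assigned by `subspace_classify`; with orthogonal bases the pseudo-inverse projection reduces to the usual `W @ W.T` projector of linear subspace methods.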




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, YW., Zeng, XY. (2004). Image Feature Representation by the Subspace of Nonlinear PCA. In: Negoita, M.G., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2004. Lecture Notes in Computer Science (LNAI), vol 3214. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30133-2_43


  • DOI: https://doi.org/10.1007/978-3-540-30133-2_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-23206-3

  • Online ISBN: 978-3-540-30133-2

  • eBook Packages: Springer Book Archive
