
Relative Reduct-Based Selection of Features for ANN Classifier

  • Conference paper
Man-Machine Interactions

Part of the book series: Advances in Intelligent and Soft Computing ((AINSC,volume 59))

Abstract

Artificial neural networks hold an established position as efficient classifiers used in decision support systems, yet to be efficient an ANN-based classifier requires careful selection of features. An excessive number of conditional attributes does not guarantee high classification accuracy; it means gathering and storing more data and increasing the size of the network. The implementation of the trained network can also become more complex, and the classification process takes more time. This line of reasoning leads to the conclusion that the number of features should be reduced as far as possible without diminishing the power of the classifier. The paper presents an investigation of the attribute reduction process performed by exploiting the concept of reducts from rough set theory, employed within stylometric analysis of literary texts, which belongs to automatic categorisation tasks.
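The abstract refers to relative reducts from rough set theory as the basis of feature selection for the ANN classifier. Below is a minimal sketch of one standard way such a reduct can be computed: greedy forward selection of conditional attributes that preserve the positive region of the decision attribute, followed by backward elimination of superfluous attributes. The toy decision table, the attribute names (discretised function-word frequencies), and all function names are illustrative assumptions, not taken from the paper.

from itertools import groupby

def partition(rows, attrs):
    # Equivalence classes (sets of row indices) of the indiscernibility
    # relation induced by the attribute subset `attrs`.
    key = lambda i: tuple(rows[i][a] for a in attrs)
    ordered = sorted(range(len(rows)), key=key)
    return [set(group) for _, group in groupby(ordered, key=key)]

def positive_region(rows, attrs, decision):
    # Union of the equivalence classes whose objects all share one decision value.
    pos = set()
    for block in partition(rows, attrs):
        if len({rows[i][decision] for i in block}) == 1:
            pos |= block
    return pos

def greedy_relative_reduct(rows, conditional, decision):
    # Forward selection: add the attribute that enlarges the positive region most,
    # until it matches the positive region of the full conditional attribute set.
    full = positive_region(rows, conditional, decision)
    reduct = []
    while positive_region(rows, reduct, decision) != full:
        candidates = [a for a in conditional if a not in reduct]
        best = max(candidates,
                   key=lambda a: len(positive_region(rows, reduct + [a], decision)))
        reduct.append(best)
    # Backward elimination: drop any attribute whose removal changes nothing.
    for a in list(reduct):
        trimmed = [x for x in reduct if x != a]
        if positive_region(rows, trimmed, decision) == full:
            reduct = trimmed
    return reduct

# Hypothetical stylometric decision table: function-word frequencies
# discretised to low/high, with authorship as the decision attribute.
rows = [
    {"and": "high", "the": "low",  "of": "high", "author": "A"},
    {"and": "high", "the": "high", "of": "high", "author": "A"},
    {"and": "low",  "the": "low",  "of": "low",  "author": "B"},
    {"and": "low",  "the": "high", "of": "low",  "author": "B"},
]
print(greedy_relative_reduct(rows, ["and", "the", "of"], "author"))  # -> ['and'] for this toy table

In this example the single attribute "and" already discerns the two authors as well as all three attributes together, so it forms a relative reduct and the other features can be discarded without weakening the classifier.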




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

StaƄczyk, U. (2009). Relative Reduct-Based Selection of Features for ANN Classifier. In: Cyran, K.A., Kozielski, S., Peters, J.F., StaƄczyk, U., Wakulicz-Deja, A. (eds) Man-Machine Interactions. Advances in Intelligent and Soft Computing, vol 59. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-00563-3_35


  • DOI: https://doi.org/10.1007/978-3-642-00563-3_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-00562-6

  • Online ISBN: 978-3-642-00563-3

  • eBook Packages: Engineering (R0)
