Abstract
One-class classification is an important branch of machine learning. Feature extraction is a key means of improving the performance of one-class classifiers, but no general-purpose method has yet been reported for this problem. In this paper, a framework for one-class feature extraction is proposed. The framework divides the original feature space into two orthogonal subspaces, namely the principal space and the complementary space. The principal space is used to learn features of the target class, and the complementary space is used to learn features of the abnormal class. The features extracted from the two spaces are fused into the final one-class feature vector of the original feature space. Furthermore, a specific implementation, complete principal component analysis (CPCA), is proposed. First, CPCA performs principal component analysis to compute the projection scores of the target-class samples in the principal space. Then, from the projection vectors of the principal components obtained in the principal space, the corresponding complementary space is constructed. The projection of each sample onto the complementary space is computed and reduced to its first-order (L1) norm as the feature extracted from the complementary space. Several datasets are used to evaluate the proposed method. The experimental results show that CPCA generalizes well across one-class feature extraction problems.
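The following is a minimal sketch, in Python with numpy and scikit-learn, of the space-decomposition idea described above; the function names (fit_cpca, cpca_features) and the number of retained components are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.decomposition import PCA

def fit_cpca(X_target, n_components=2):
    # Fit PCA on target-class samples only; the retained components span the
    # principal space, and their orthogonal complement is the complementary space.
    pca = PCA(n_components=n_components)
    pca.fit(X_target)
    return pca

def cpca_features(pca, X):
    # Fuse principal-space projection scores with the first-order (L1) norm
    # of the complementary-space (residual) projection.
    scores = pca.transform(X)                       # projections onto the principal space
    reconstruction = pca.inverse_transform(scores)  # back-projection into the original space
    residual = X - reconstruction                   # component lying in the complementary space
    comp_feature = np.linalg.norm(residual, ord=1, axis=1, keepdims=True)
    return np.hstack([scores, comp_feature])        # fused one-class feature vector

# Usage (hypothetical data): fit on target-class training samples, then
# extract features for training and test samples.
# model = fit_cpca(X_train, n_components=3)
# F_train = cpca_features(model, X_train)
# F_test = cpca_features(model, X_test)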





Data availability
Available upon request.
Funding
The authors gratefully acknowledge the financial support of the Natural Science Foundation of Zhejiang Province (LY21C200001 and LQ20F030059), the National Natural Science Foundation of China (62105245 and 61805180), and the Wenzhou Science and Technology Bureau General Project (S2020011 and G20200044).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Human or animal rights
This article does not contain any studies with human participants or animals performed by any of the authors.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Huang, G., Chen, X., Chen, X. et al. A one-class feature extraction method based on space decomposition. Soft Comput 26, 5553–5561 (2022). https://doi.org/10.1007/s00500-022-07067-y