Automated Feature Engineering
Sparsh
Introduction
Literature Survey
Methodology
Proposed Method
Future Scope
Conclusion
Introduction
Automated Feature Engineering (AFE) aims to generate, transform, and select predictive features from raw data automatically, reducing the manual effort that feature engineering normally requires. The literature surveyed for this work is summarized in the table below.
Literature Survey
Study | Techniques | Accuracy (%) | Research Gap
Automated Essay Scoring | Neural Networks (Transformer Models) | 88.5 | Limited research on hybrid models combining syntactical and grammatical elements with Transformer models.
Feature Engineering for Bankruptcy Prediction | Random Forest, XGBoost, Feature Engineering Methods | 75.2 | Lack of exploration into hybrid models incorporating both domain-specific and deep learning-based features for bankruptcy prediction.
Efficient Data Pre-processing and Feature Engineering | Automated Data Pre-processing Tools, Transformers | 93.7 | Need for more research on the generalizability of automated feature engineering methods across various domains and datasets.
Multi-Layer Visual Feature Fusion for Groundnut Leaf Disease Classification | MLVSF Model, CNN, Handcrafted Features | 98.9 | Limited studies on the robustness of MLVSF models in detecting rare groundnut leaf disease patterns and their scalability to larger datasets.
Slim-FCP for Cooperative Perception in Automated Vehicles | Slim-FCP Model, CNN, Salp Swarm Algorithm | 82.6 | Research gap in evaluating Slim-FCP's performance under diverse driving conditions and its integration with real-time automated vehicle systems.
Automated Feature Engineering for Downstream Tasks | Reinforcement Learning, Feature Pre-Evaluation Model | 89.6 | Lack of investigation into the scalability and robustness of the proposed AFE framework across different types of datasets and downstream tasks.
Auto-Spam Detection with Salp Swarm Algorithm | Salp Swarm Algorithm, RBNN Classifier | 95.4 | Need for further exploration of the ASD-SSFSML method's performance with different spam datasets and its applicability in real-time spam detection systems.
Automated Groundnut Leaf Disease Classification | CNN, Feature Fusion, Mini Batch K-Means | 91.3 | Limited research on the generalizability of the proposed feature fusion technique to other crop disease classification tasks and datasets.
ASD-SSFSML for Spam Detection | Fire Hawk Optimizer, Feature Selection Algorithms | 87.9 | Research gap in evaluating the ASD-SSFSML model's performance with various email servers and spam detection thresholds.
Lightweight Feature-Based Cooperative Perception | Semantic Feature Encoder, Slim-FCP Model | 78.1 | Need for further exploration of Slim-FCP's efficiency and accuracy across diverse road environments and its integration with real-time cooperative perception systems.
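To make the surveyed idea of automated feature engineering concrete, the sketch below shows one minimal generate-then-select loop using scikit-learn. The synthetic dataset, the interaction-only expansion, and the choice of k = 12 are illustrative assumptions and do not correspond to any specific method in the surveyed papers.

```python
# Minimal generate-then-select AFE loop (illustrative only).
from sklearn.datasets import make_classification
from sklearn.preprocessing import PolynomialFeatures
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=8, random_state=0)

# Step 1: automatically generate candidate features (pairwise interactions).
candidates = PolynomialFeatures(degree=2, interaction_only=True,
                                include_bias=False).fit_transform(X)

# Step 2: score candidates against the target and keep the strongest ones.
selector = SelectKBest(f_classif, k=12).fit(candidates, y)
X_engineered = selector.transform(candidates)

# Step 3: check whether the engineered features help a downstream model.
base = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
eng = cross_val_score(LogisticRegression(max_iter=1000), X_engineered, y, cv=5).mean()
print(f"baseline accuracy: {base:.3f}, with engineered features: {eng:.3f}")
```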
Future Scope
Integration of Domain Knowledge: Future research in AFE will focus on incorporating domain-specific
knowledge into feature engineering pipelines, enabling more context-aware feature generation and
selection.
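A minimal sketch of this idea is given below, assuming a hypothetical tabular dataset with amount and balance columns: an expert-crafted ratio feature is injected ahead of an automated expansion step, so domain knowledge and automated generation coexist in one pipeline.

```python
# Sketch: injecting a domain-specific feature into an automated pipeline.
# The column names ("amount", "balance") and the ratio feature are hypothetical.
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer, PolynomialFeatures
from sklearn.linear_model import LogisticRegression

def add_domain_features(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Domain knowledge: an analyst believes the spend-to-balance ratio is predictive.
    out["amount_to_balance"] = out["amount"] / (out["balance"] + 1e-6)
    return out

pipeline = Pipeline([
    ("domain", FunctionTransformer(add_domain_features)),        # expert-crafted features
    ("auto", PolynomialFeatures(degree=2, include_bias=False)),  # automated expansion
    ("model", LogisticRegression(max_iter=1000)),
])
# pipeline.fit(X_train, y_train)  # X_train: DataFrame with "amount" and "balance" columns
```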
Hybrid AFE Frameworks: There is scope for hybrid AFE frameworks that combine multiple techniques, such as symbolic regression with deep learning, to address complex data challenges and improve model performance.
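The sketch below illustrates the hybrid idea under stated simplifications: scikit-learn's PolynomialFeatures stands in for a symbolic feature generator, and a small MLP serves as the deep-learning component; a dedicated symbolic-regression library could replace the first stage.

```python
# Sketch of a hybrid AFE stack: symbolic-style feature generation feeding a neural model.
# PolynomialFeatures is only a stand-in for a true symbolic-regression generator.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, n_features=6, noise=0.1, random_state=0)

hybrid = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),  # candidate "symbolic" features
    StandardScaler(),                                  # keep the MLP well-conditioned
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
print("R^2 (5-fold CV):", cross_val_score(hybrid, X, y, cv=5).mean())
```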
Automated Model Selection and Tuning: AFE could extend beyond feature engineering to encompass
automated model selection and hyperparameter tuning, creating end-to-end automated machine learning
pipelines for enhanced efficiency and effectiveness.
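As a sketch of such an end-to-end pipeline, the example below searches jointly over feature-generation, feature-selection, and model hyperparameters with scikit-learn's GridSearchCV; the grid values are arbitrary placeholders.

```python
# Sketch: one search over both feature-engineering and model hyperparameters.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

pipe = Pipeline([
    ("gen", PolynomialFeatures(include_bias=False)),
    ("select", SelectKBest(f_classif)),
    ("clf", LogisticRegression(max_iter=2000)),
])
param_grid = {
    "gen__degree": [1, 2],        # how aggressively to generate features
    "select__k": [3, 5, 10],      # how many engineered features to keep
    "clf__C": [0.1, 1.0, 10.0],   # model regularization strength
}
search = GridSearchCV(pipe, param_grid, cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```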
Dynamic Feature Engineering: AFE methods may evolve to adapt dynamically to changing data distributions and concept drift, ensuring that feature engineering pipelines remain effective over time in dynamic environments.
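One simple way to realize this, sketched below, is to monitor an incoming feature column with a two-sample Kolmogorov-Smirnov test and re-fit the feature engineering pipeline when drift is detected; the significance level and the simulated data are illustrative assumptions.

```python
# Sketch: detect distribution drift on a raw column and trigger a pipeline re-fit.
import numpy as np
from scipy.stats import ks_2samp

def drift_detected(reference: np.ndarray, current: np.ndarray, alpha: float = 0.05) -> bool:
    """Two-sample Kolmogorov-Smirnov test on one feature column."""
    _, p_value = ks_2samp(reference, current)
    return p_value < alpha

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=5000)   # data the pipeline was fitted on
current = rng.normal(0.4, 1.3, size=5000)     # newly arriving data with a shifted distribution

if drift_detected(reference, current):
    # In a real system this would trigger re-running feature generation and selection.
    print("Drift detected: re-fit the feature engineering pipeline on recent data.")
```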
Interpretable AFE Techniques: There will be a growing demand for interpretable AFE techniques that
provide insights into the underlying relationships between features and target variables, enhancing model
transparency and trustworthiness.
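A minimal sketch of this direction is shown below: engineered features are ranked by permutation importance on held-out data, which attributes predictive power to individual generated features; the dataset, model, and number of repeats are illustrative choices.

```python
# Sketch: ranking engineered features by permutation importance for transparency.
from sklearn.datasets import make_classification
from sklearn.preprocessing import PolynomialFeatures
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=6, random_state=0)

gen = PolynomialFeatures(degree=2, include_bias=False)
X_eng = gen.fit_transform(X)
names = gen.get_feature_names_out()

X_tr, X_te, y_tr, y_te = train_test_split(X_eng, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Permutation importance on held-out data attributes predictive power to each feature.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
top = result.importances_mean.argsort()[::-1][:5]
for i in top:
    print(f"{names[i]:>12s}  importance={result.importances_mean[i]:.4f}")
```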
Privacy-Preserving AFE: With increasing concerns about data privacy, future AFE research may focus on
developing techniques that preserve privacy while extracting meaningful features from sensitive data,
enabling secure and ethical data-driven decision-making.
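The sketch below illustrates one privacy-preserving pattern: releasing an aggregate feature with Laplace noise in the style of differential privacy. The epsilon, clipping bounds, and the hypothetical income column are illustrative assumptions, not calibrated privacy guarantees.

```python
# Sketch: differential-privacy-style aggregate feature via Laplace noise.
import numpy as np

def noisy_mean(values: np.ndarray, lower: float, upper: float, epsilon: float = 1.0) -> float:
    """Mean of a clipped column plus Laplace noise scaled to the query's sensitivity."""
    clipped = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(clipped)   # sensitivity of the mean query
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return float(clipped.mean() + noise)

rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=10.5, sigma=0.6, size=10_000)  # hypothetical sensitive column

# The noisy aggregate can be used as an engineered feature without exposing individual records.
print("private mean income feature:", round(noisy_mean(incomes, 0.0, 200_000.0), 2))
```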
Conclusion