Abstract: In real-world scenarios, datasets generally contain mixed-type attributes and imbalanced class distributions, and the minority classes are often the primary research focus. Attribute reduction is a key step in data preprocessing, but traditional attribute reduction methods commonly overlook the significance of minority-class samples, damaging the critical information those samples carry and decreasing classification performance. To address this issue, we develop an attribute reduction algorithm based on a composite entropy-based uncertainty measure to handle imbalanced mixed-type data. First, we design a novel oversampling method based on the three-way decision boundary region to synthesize minority-class samples, so that the boundary region contains more high-quality samples. Then, we propose an attribute measure for selecting candidate attributes that considers the boundary entropy, the degree of dependency, and the class weights. On this basis, an attribute reduction algorithm guided by the composite entropy-based uncertainty measure is developed to select an attribute subset for imbalanced mixed-type data. Experimental results on UCI imbalanced datasets indicate that the developed attribute reduction algorithm significantly outperforms other attribute reduction algorithms, especially in overall AUC, F1-Score, and G-Mean.
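To make the oversampling idea concrete, the following is a minimal, self-contained sketch (not the paper's exact method): a minority sample is treated as lying in the three-way boundary region when the minority fraction among its k nearest neighbours falls between two thresholds, and new samples are synthesized by SMOTE-style interpolation from those boundary points. All function and parameter names here (`boundary_oversample`, `low`, `high`) are illustrative assumptions.

```python
import random

def boundary_oversample(majority, minority, k=2, n_new=4, low=0.3, high=0.7, seed=0):
    """Toy sketch of boundary-region oversampling (hypothetical names/thresholds).

    A minority point belongs to the three-way boundary region when the fraction
    of minority points among its k nearest neighbours lies in [low, high];
    synthetic samples are interpolated between boundary points and randomly
    chosen minority points.
    """
    rng = random.Random(seed)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Label each point: 1 = minority, 0 = majority.
    all_pts = [(p, 1) for p in minority] + [(p, 0) for p in majority]

    # Identify minority points whose neighbourhood is "mixed" (boundary region).
    boundary = []
    for p in minority:
        neigh = sorted((q for q in all_pts if q[0] != p),
                       key=lambda q: dist(p, q[0]))[:k]
        frac = sum(lbl for _, lbl in neigh) / k
        if low <= frac <= high:
            boundary.append(p)

    # Interpolate new minority samples from boundary points.
    synthetic = []
    for _ in range(n_new):
        if not boundary:
            break
        p = rng.choice(boundary)
        q = rng.choice(minority)
        t = rng.random()
        synthetic.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    return synthetic
```

Because new points are interpolated between existing minority points, every synthetic coordinate stays inside the minority class's bounding box, which is one informal sense in which the synthesized boundary samples remain "high quality".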
Abstract: Feature selection focuses on selecting important features that can improve the accuracy and simplicity of a learning model. Nevertheless, for the ordered data found in many real-world applications, most existing feature selection algorithms consider only a single measure when selecting candidate features, which may harm classification performance. Based on these insights, a multi-measure feature selection algorithm is developed for ordered data, which not only considers the certainty information captured by dominance-based dependency, but also uses the discernibility information provided by dominance-based information granularity. Extensive experiments are performed to evaluate the performance of the proposed algorithm on UCI data sets in terms of the size of the selected feature subset and classification accuracy. The experimental results demonstrate that the proposed algorithm not only finds a relevant feature subset, but also achieves classification performance that is better than, or comparable to, other feature selection algorithms.
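A minimal sketch of how the two measures might be combined can help fix ideas. The code below is an assumption-laden illustration, not the paper's algorithm: it uses dominating sets over criteria, a simple dominance-based dependency (fraction of objects whose dominating set is decision-consistent), a knowledge-granularity term (average normalized dominating-set size), and a weighted combination of the two. The names `multi_measure_score` and `alpha` are hypothetical.

```python
def dominating_set(data, feats, i):
    """Objects whose values are >= object i's values on every feature in feats."""
    return {j for j in range(len(data))
            if all(data[j][f] >= data[i][f] for f in feats)}

def dominance_dependency(data, labels, feats):
    """Fraction of objects i whose dominating set is decision-consistent,
    i.e. every dominating object has a label >= labels[i]."""
    n = len(data)
    consistent = sum(
        1 for i in range(n)
        if all(labels[j] >= labels[i] for j in dominating_set(data, feats, i)))
    return consistent / n

def dominance_granularity(data, feats):
    """Knowledge granularity of the dominance relation: average normalized
    dominating-set size (smaller means finer, more discerning granules)."""
    n = len(data)
    return sum(len(dominating_set(data, feats, i)) for i in range(n)) / (n * n)

def multi_measure_score(data, labels, feats, alpha=0.5):
    """Hypothetical combined measure: trade off certainty (dependency)
    against discernibility (1 - granularity); higher is better."""
    return (alpha * dominance_dependency(data, labels, feats)
            + (1 - alpha) * (1 - dominance_granularity(data, feats)))
```

A forward-selection loop could then grow the feature subset by repeatedly adding the feature that maximizes `multi_measure_score`, stopping when the score no longer improves.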
Abstract: Feature selection can effectively reduce the dimensionality of data. Most existing feature selection approaches using rough sets focus on static, single-type data. However, in many real-world applications, data sets are hybrid, including symbolic, numerical, and missing features. Meanwhile, the object set in hybrid data often changes dynamically over time. Since acquiring all the decision labels of hybrid data is expensive and time-consuming, only a small portion of the labels is typically available. Therefore, in this paper, incremental feature selection algorithms based on information granularity are developed for dynamic partially labeled hybrid data under variation of the object set. First, information granularity is defined to measure feature significance for partially labeled hybrid data. Then, incremental mechanisms of information granularity are derived for variations of the object set. On this basis, incremental feature selection algorithms are proposed for the variation of a single object and of a group of objects, respectively. Finally, extensive experimental results on different UCI data sets demonstrate that, compared with non-incremental feature selection algorithms, the incremental algorithms can select a feature subset in less time without losing classification accuracy; in particular, when a group of objects changes dynamically, the group incremental feature selection algorithm is more efficient.
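The core incremental idea can be sketched generically. The class below is an illustrative assumption, not the paper's mechanism: it maintains knowledge granularity G(U) = Σ|Xᵢ|² / |U|² over equivalence-class blocks and, when an object arrives, updates only the one affected block in O(1) instead of recomputing over the whole universe. The class and method names are hypothetical.

```python
class IncrementalGranularity:
    """Sketch of incrementally maintained knowledge granularity
    G(U) = sum(|X_i|**2) / |U|**2 over equivalence classes X_i.

    Adding one object changes a single block's size from m to m + 1,
    so the squared-size sum is patched in O(1) per insertion.
    """

    def __init__(self):
        self.blocks = {}   # feature-value signature -> block size
        self.n = 0         # |U|
        self.sum_sq = 0    # sum of squared block sizes

    def add(self, signature):
        """Insert one object whose feature values hash to `signature`."""
        m = self.blocks.get(signature, 0)
        self.sum_sq += (m + 1) ** 2 - m ** 2   # only one block changes
        self.blocks[signature] = m + 1
        self.n += 1

    def granularity(self):
        return self.sum_sq / (self.n * self.n) if self.n else 0.0
```

Processing a batch of arriving objects is then just repeated calls to `add`, which is the single-object case; a group-incremental variant would patch several block sizes at once before renormalizing, amortizing the update further.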