Feature Selection via Mutual Information: New Theoretical Insights

Mario Beraha, Alberto Maria Metelli, Matteo Papini, Andrea Tirinzoni, and Marcello Restelli

International Joint Conference on Neural Networks (IJCNN), 2019.

CORE 2018: A   GGS 2018: B

Abstract
Mutual information has been successfully adopted in filter feature-selection methods to assess both the relevance of a subset of features in predicting the target variable and its redundancy with respect to other variables. However, existing algorithms are mostly heuristic and do not offer any guarantee on the proposed solution. In this paper, we provide novel theoretical results showing that conditional mutual information naturally arises when bounding the ideal regression/classification errors achieved by different subsets of features. Leveraging these insights, we propose a novel stopping condition for backward and forward greedy methods which ensures that the ideal prediction error using the selected feature subset remains bounded by a user-specified threshold. We provide numerical simulations to support our theoretical claims and compare against common heuristic methods.
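The idea sketched in the abstract can be illustrated with a minimal forward greedy selector: at each step, add the feature with the largest conditional mutual information with the target given the features already selected, and stop when the best gain falls below a threshold. This is only an illustrative sketch, not the paper's algorithm: it uses plug-in (empirical-count) estimates on discrete data, and the threshold `eps` is a stand-in for the paper's user-specified error bound.

```python
import numpy as np

def entropy(*cols):
    """Empirical joint entropy (in nats) of one or more discrete columns."""
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cmi(x, y, z_cols):
    """Plug-in estimate of the conditional mutual information I(x; y | z)."""
    if not z_cols:
        return entropy(x) + entropy(y) - entropy(x, y)
    return (entropy(x, *z_cols) + entropy(y, *z_cols)
            - entropy(*z_cols) - entropy(x, y, *z_cols))

def forward_select(X, y, eps=0.05):
    """Greedy forward selection with a CMI-based stopping condition:
    add the feature whose conditional MI gain is largest, and stop
    once the best remaining gain drops below eps (hypothetical threshold)."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        z = [X[:, k] for k in selected]
        gains = [cmi(X[:, j], y, z) for j in remaining]
        best = int(np.argmax(gains))
        if gains[best] < eps:
            break
        selected.append(remaining.pop(best))
    return selected
```

For example, if the target is a deterministic function of features 0 and 2 while feature 1 is independent noise, the greedy loop picks the two informative features and then stops, since the noise feature's conditional gain is (near) zero.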

[Link] [BibTeX]

@inproceedings{beraha2019feature,
    author = "Beraha, Mario and Metelli, Alberto Maria and Papini, Matteo and Tirinzoni, Andrea and Restelli, Marcello",
    title = "Feature Selection via Mutual Information: New Theoretical Insights",
    booktitle = "International Joint Conference on Neural Networks ({IJCNN})",
    organization = "IEEE",
    pages = "1--9",
    year = "2019",
    url = "https://doi.org/10.1109/IJCNN.2019.8852410",
    doi = "10.1109/IJCNN.2019.8852410"
}