Publication
ISCAS 2000
Conference paper
Maximum entropy and maximum likelihood criteria for feature selection from multivariate data
Abstract
We discuss several numerical methods for optimum feature selection from multivariate data based on maximum entropy and maximum likelihood criteria. Our point of view is to consider the observed data x1, x2, ..., xN in R^d as samples from some unknown pdf P. We project the data onto d directions, estimate the pdf of the univariate data along each direction, find the maximum entropy (or likelihood) over all multivariate pdfs in R^d whose marginals in these directions are prescribed by the estimated univariate pdfs, and finally maximize the entropy (or likelihood) further over the choice of these directions. This strategy for optimal feature selection depends on the method used to estimate the univariate pdfs.
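As a rough illustration only (not the paper's actual numerical methods), the sketch below assumes histogram-based univariate density estimation, orthonormal direction sets, and a crude random search over rotations. It relies on the fact that, for marginals prescribed along orthonormal directions, the maximum-entropy joint density is the product of those marginals, so the objective reduces to the sum of the estimated marginal entropies. All function names and parameter choices here are illustrative assumptions.

import numpy as np

def marginal_entropy(samples, bins=32):
    """Plug-in differential-entropy estimate of a 1-D sample via a histogram density."""
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0
    # Entropy of the piecewise-constant histogram density p_i / width_i.
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

def entropy_objective(X, W):
    """Sum of marginal entropies of the projections X @ W.

    For orthonormal columns of W, the maximum-entropy joint pdf with these
    prescribed marginals is their product, so its entropy is this sum.
    """
    Y = X @ W
    return sum(marginal_entropy(Y[:, j]) for j in range(Y.shape[1]))

def random_rotation(d, rng):
    """Random orthonormal direction set via QR of a Gaussian matrix."""
    Q, R = np.linalg.qr(rng.normal(size=(d, d)))
    return Q * np.sign(np.diag(R))

def search_directions(X, n_trials=200, seed=0):
    """Crude random search over direction sets; a gradient method could replace this."""
    rng = np.random.default_rng(seed)
    best_W = np.eye(X.shape[1])
    best_H = entropy_objective(X, best_W)
    for _ in range(n_trials):
        W = random_rotation(X.shape[1], rng)
        H = entropy_objective(X, W)
        if H > best_H:
            best_W, best_H = W, H
    return best_W, best_H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 3)) @ rng.normal(size=(3, 3))  # correlated toy data
    W, H = search_directions(X)
    print("selected directions:\n", W, "\nmaximum-entropy objective:", H)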