Abstract
Current approaches to feature selection over multiple data sources must join all the data in order to evaluate features against the class label; they are therefore not scalable and involve unnecessary information leakage. In this paper, we present a way of performing feature selection through class propagation, eliminating the need to join the data before feature selection. We propagate to each data source a very compact data structure that provides enough information to select features, allowing features to be evaluated locally without examining any other source. Our experiments confirm that our algorithm is highly scalable while effectively preserving data privacy. Copyright © by SIAM.
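To make the idea concrete, here is a minimal sketch of local feature evaluation under class propagation. It assumes the propagated structure is a mapping from join keys to class labels and the local scoring criterion is information gain; both specifics are illustrative assumptions, not the paper's exact data structure or algorithm.

```python
from collections import Counter, defaultdict
from math import log2


def entropy(counts):
    """Shannon entropy of a class-count distribution."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c > 0)


def local_feature_scores(local_table, key_to_class):
    """Score each local feature against the propagated class labels,
    without joining this source with any other data source.

    local_table : list of dicts, each with a join 'key' plus local feature values
    key_to_class: compact propagated structure mapping join key -> class label
    """
    # Overall class distribution over the records this source can match.
    overall = Counter(key_to_class[row["key"]] for row in local_table
                      if row["key"] in key_to_class)
    base_entropy = entropy(overall)
    n = sum(overall.values())

    scores = {}
    features = [f for f in local_table[0] if f != "key"]
    for feat in features:
        # Class distribution conditioned on each value of this feature.
        by_value = defaultdict(Counter)
        for row in local_table:
            label = key_to_class.get(row["key"])
            if label is not None:
                by_value[row[feat]][label] += 1
        cond_entropy = sum(
            (sum(c.values()) / n) * entropy(c) for c in by_value.values()
        )
        scores[feat] = base_entropy - cond_entropy  # information gain
    return scores


# Example: one data source evaluates its own features locally,
# seeing only the propagated key -> class mapping.
propagated = {1: "yes", 2: "no", 3: "yes", 4: "no"}
source_a = [
    {"key": 1, "income": "high", "region": "east"},
    {"key": 2, "income": "low",  "region": "east"},
    {"key": 3, "income": "high", "region": "west"},
    {"key": 4, "income": "low",  "region": "west"},
]
print(local_feature_scores(source_a, propagated))
```

In this toy run, "income" perfectly separates the classes (gain 1.0) while "region" carries no information (gain 0.0); each source can rank its own features this way and only the propagated key-to-class structure ever crosses source boundaries.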