Publication
Proceedings of the IEEE
Paper
Decision making with quantized priors leads to discrimination
Abstract
Racial discrimination in decision-making scenarios such as police arrests appears to be a violation of expected utility theory. Drawing on results from the science of information, we discuss an information-based model of signal detection over a population that generates such behavior as an alternative explanation to taste-based discrimination by the decision maker or differences among the racial populations. This model uses the decision rule that maximizes expected utility, the likelihood ratio test, but constrains the precision of the threshold to a small discrete set. The precision constraint follows from both bounded rationality in human recollection and finite training data for estimating priors. When combined with social aspects of human decision making and precautionary cost settings, the model predicts the own-race bias that has been observed in several econometric studies.
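The core mechanism can be illustrated with a small numerical sketch. This is not code from the paper: the Gaussian signal model, equal costs, and the particular quantization grid are illustrative assumptions chosen to show how forcing the likelihood-ratio threshold onto a coarse grid shifts the operating point (here, the false-alarm rate) away from the expected-utility optimum.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def lrt_threshold(p):
    """For H0: x ~ N(0,1) vs H1: x ~ N(1,1) with equal error costs,
    the likelihood ratio test 'decide H1 if x > tau' has
    tau = 0.5 + ln((1-p)/p), where p is the prior P(H1)."""
    return 0.5 + math.log((1 - p) / p)

def quantize(p, grid):
    """Round the prior to the nearest value in a small discrete set,
    modeling limited precision in recollection or estimation."""
    return min(grid, key=lambda q: abs(q - p))

# Illustrative coarse grid of rememberable prior values (an assumption).
grid = [0.1, 0.3, 0.5, 0.7, 0.9]

for p_true in (0.15, 0.25):
    tau_exact = lrt_threshold(p_true)
    tau_quant = lrt_threshold(quantize(p_true, grid))
    # False-alarm rate: probability of deciding H1 when H0 is true.
    pfa_exact = 1 - phi(tau_exact)
    pfa_quant = 1 - phi(tau_quant)
    print(f"true prior {p_true}: P_FA exact={pfa_exact:.4f}, "
          f"quantized={pfa_quant:.4f}")
```

Two populations whose true priors fall on different sides of a grid point end up judged with the same quantized threshold, so their error rates diverge from the optimum in different directions, which is the kind of disparity the abstract attributes to quantized priors rather than to taste-based discrimination.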