Decision making with quantized priors leads to discrimination
Abstract
Racial discrimination in decision-making scenarios such as police arrests appears to violate expected utility theory. Drawing on results from information science, we present an information-based model of signal detection over a population that generates such behavior, offering an alternative explanation to taste-based discrimination by the decision maker or to differences among the racial populations. The model uses the decision rule that maximizes expected utility, the likelihood ratio test, but constrains the precision of the decision threshold to a small discrete set. This precision constraint follows both from bounded rationality in human recollection and from finite training data for estimating priors. Combined with the social aspects of human decision making and precautionary cost settings, the model predicts the own-race bias observed in several econometric studies.
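The core mechanism described above can be illustrated with a minimal sketch. It is not the paper's model, only an assumed instantiation: a Bayes-optimal likelihood-ratio threshold computed from a prior and two misclassification costs, where the prior estimate is quantized to a small discrete set (the group names, prior values, and quantization grid below are hypothetical).

```python
def lrt_threshold(prior, c_fa=1.0, c_md=1.0):
    """Bayes-optimal LRT threshold: eta = (1 - p) / p * (c_fa / c_md),
    where p is the prior probability of the 'signal present' hypothesis,
    c_fa the false-alarm cost, and c_md the missed-detection cost."""
    return (1.0 - prior) / prior * (c_fa / c_md)

def quantize_prior(prior, levels):
    """Constrain the prior estimate to the nearest value in a small
    discrete set, modeling limited precision in human recollection."""
    return min(levels, key=lambda q: abs(q - prior))

# Hypothetical true priors for two subpopulations (illustrative only).
priors = {"group_A": 0.27, "group_B": 0.33}

# Coarse grid of representable prior values (assumed quantization set).
levels = [0.1, 0.3, 0.5, 0.7, 0.9]

for group, p in priors.items():
    q = quantize_prior(p, levels)
    print(group, "optimal:", round(lrt_threshold(p), 3),
          "quantized:", round(lrt_threshold(q), 3))
```

Under this toy grid, both groups' priors quantize to the same level, so both receive the same decision threshold even though their optimal thresholds differ; with other grids or cost settings the quantized threshold instead diverges from the optimum for one group, which is the kind of systematic error the abstract attributes to precision-constrained priors.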