The stochastic complexity of the observed data, defined as the greatest lower bound on the code length with which the data can be encoded, embodies a global maximum likelihood principle, which permits comparison of models regardless of the number of parameters in them. For important special classes, such as the Gaussian and the multinomial models, formulas for the stochastic complexity give new and powerful model selection criteria, while in the general case approximations can be computed with the MDL principle. Once a model is found with which the stochastic complexity is attained, there is nothing further to learn from the data with the proposed models. The basic notions are reviewed and numerical examples are given. © 1987, Taylor & Francis Group, LLC. All rights reserved.
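As a rough illustration of code-length-based model selection (a sketch in the spirit of the abstract, not the paper's exact stochastic-complexity formulas), the following compares two Gaussian models by a two-part code length: the negative log-likelihood at the maximum likelihood estimates plus the familiar (k/2) log n penalty for the k estimated parameters. The data values and function name are hypothetical.

```python
import math

def gaussian_code_length(data, fixed_mean=None):
    """Two-part MDL-style code length (in nats) for a Gaussian model:
    negative log-likelihood at the MLE plus (k/2) * log(n) for the
    k estimated parameters. Illustrative sketch, not the paper's
    exact stochastic-complexity formula."""
    n = len(data)
    if fixed_mean is None:
        mu = sum(data) / n   # mean estimated from the data
        k = 2                # mean and variance both estimated
    else:
        mu = fixed_mean      # mean fixed in advance
        k = 1                # only the variance is estimated
    var = sum((x - mu) ** 2 for x in data) / n
    # Gaussian negative log-likelihood at the MLE of the variance
    nll = 0.5 * n * math.log(2 * math.pi * var) + 0.5 * n
    return nll + 0.5 * k * math.log(n)

# Hypothetical sample clustered near 10: the free-mean model pays a
# larger parameter penalty but fits far better, so its total code
# length is shorter and it is preferred.
sample = [9.5, 10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 9.7, 10.3, 10.1]
zero_mean_cost = gaussian_code_length(sample, fixed_mean=0.0)
free_mean_cost = gaussian_code_length(sample)
```

The model achieving the shorter total code length is selected; in the limit of the best attainable code length, this is the comparison the stochastic complexity formalizes.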