Probabilistic inference is a versatile tool for solving a wide variety of pixel-labeling problems in computer vision, such as stereo matching and image denoising. Belief Propagation (BP) is an effective method for such inference tasks and has also shown attractive error-resilience properties: the ability to converge to usable solutions in the presence of low-level hardware errors. This property is of increasing interest, as the looming end of Moore's Law scaling brings with it a vast increase in the statistical variability of nanoscale circuit fabrics. In this work we seek to understand why certain combinations of BP and error-resilience mechanisms work so well in practice. We focus on Algorithmic Noise Tolerance (ANT) techniques as the resilience mechanism and Max-Product BP for inference. We analyze the error characteristics of BP in this hardware context, derive novel asymptotic error bounds, and provide theoretical reasoning for why ANT works well in this BP setting. Experimental results from detailed resilient-BP simulations on various stereo matching tasks offer empirical support for this analysis.