Runtime noise management systems typically rely on on-chip noise sensors to capture voltage emergencies accurately. As such, the threshold voltage at which noise sensors report emergencies serves as a critical tuning knob between the system failure rate and the runtime performance loss (RPL) caused by false alarms. Unfortunately, despite its importance, the problem of computing the optimal threshold voltage remains open in the literature. The problem is further complicated by process variations, which introduce significant variations in load currents, and thus in noise, across different chips; a uniform noise margin may not work optimally for all chips. In this paper, we first formulate the problem of minimizing the system failure rate subject to a given RPL constraint. We then put forward a uniform threshold scheme that finds an optimal solution for all chips. Compared with a seemingly more intuitive but overly conservative approach, experimental results on a set of industrial designs show an average 32.1% reduction in system failure rate under the same RPL constraint. We further show that, with the help of Iddq measurements during testing, which reveal process variation information, a per-chip optimal threshold voltage can be computed efficiently. This approach further reduces the system failure rate by 25.0% on average compared with the uniform threshold approach, under the same RPL constraint. To the best of the authors' knowledge, this is the first in-depth study of optimal threshold voltage computation for noise sensors. We hope it will open new directions for systematic studies of on-chip noise sensor utilization.
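As a rough sketch of the formulation summarized above (the symbols below are illustrative placeholders, not necessarily the notation used in the body of the paper), the uniform-threshold problem can be viewed as choosing the sensor threshold voltage $V_{\mathrm{th}}$ to solve
$$\min_{V_{\mathrm{th}}} \; P_{\mathrm{fail}}(V_{\mathrm{th}}) \quad \text{subject to} \quad \mathrm{RPL}(V_{\mathrm{th}}) \le \mathrm{RPL}_{\max},$$
where $P_{\mathrm{fail}}$ denotes the system failure rate and $\mathrm{RPL}_{\max}$ the given runtime performance loss budget; the per-chip variant would solve the same program separately for each chip using its measured process variation information.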