Traditionally, the reliability of gate oxides is evaluated mostly using a Constant Voltage Stress (CVS) based stress-and-sense methodology, in which a constant stress voltage is interrupted to quantify the degradation. In this paper, we discuss the use of a Voltage-Ramp Stress (VRS) based stress-and-sense method, in which the stress voltage is ramped at a constant rate and degradation is quantified by interrupting the VRS at increasingly higher stress voltages. The two methods are applied and compared for dielectric breakdown and bias temperature instability measurements on high-k/metal gate stacks. It is shown that the two methods can be used interchangeably, producing very similar reliability assessments. The VRS method, however, has distinct advantages over the CVS approach: it is fast and "foolproof", providing reliable information on the voltage acceleration without any prior knowledge of the dynamics of the degradation. The VRS method is therefore well suited for monitoring reliability parameters during gate stack and process development and during device optimization, where large variations in the reliability parameters are common. © The Electrochemical Society.