The accuracy of microwave measurements is critical not only for applications in telecommunications and radar but also for future quantum computers. Qubit technologies such as superconducting qubits or spin qubits require the detection of minuscule signals, typically achieved by reflecting a microwave tone off a resonator that is coupled to the qubit. Noise from cabling and amplification, e.g., from temperature variations, can be detrimental to readout fidelity. We present an approach to detect phase and amplitude changes of a device under test based on the differential measurement of microwave tones generated as two first-order sidebands of a carrier signal. The two microwave tones are sent through the same cable to the measured device, which exhibits a narrow-band response for one sideband and leaves the other unaffected. The reflected sidebands are made to interfere by down-conversion with the carrier. By choosing the amplitudes and phases of the sidebands, suppression of either common-amplitude or common-phase noise can be achieved, allowing for fast, stable measurements of frequency shifts and quality factors of resonators. Test measurements were performed on NbN superconducting resonators at 25 mK to calibrate and characterize the experimental setup and to study time-dependent fluctuations of their resonance frequency.
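The common-noise rejection described above can be illustrated with a minimal numerical sketch. The notch-type reflection coefficient, the resonator parameters, and the noise magnitude below are illustrative assumptions, not values from the experiment: one sideband probes the resonator while an off-resonant reference sideband travels through the same (noisy) cable, and the differential phase between the two reflected tones cancels the common phase noise.

```python
import numpy as np

def s11(f, f0, q_total, q_coupling):
    """Reflection coefficient of an assumed notch-type resonator (Lorentzian dip)."""
    return 1 - (q_total / q_coupling) / (1 + 2j * q_total * (f - f0) / f0)

# Illustrative parameters (not from the measurement described in the text)
f0 = 6e9            # resonance frequency, Hz
qt, qc = 1e5, 2e5   # total and coupling quality factors
f_sig = f0          # sideband probing the narrow-band resonator response
f_ref = f0 + 50e6   # far-detuned reference sideband, essentially unaffected

rng = np.random.default_rng(0)
phi_common = rng.normal(0.0, 0.1, 1000)  # common phase noise from shared cabling

# Both sidebands pick up the same cable phase noise
sig = s11(f_sig, f0, qt, qc) * np.exp(1j * phi_common)
ref = s11(f_ref, f0, qt, qc) * np.exp(1j * phi_common)

# Single-tone phase fluctuates with the cable noise ...
single_tone_std = np.std(np.angle(sig))
# ... while the differential phase (sideband interference) is noise-free
diff_phase_std = np.std(np.angle(sig * np.conj(ref)))

print(single_tone_std)  # ~0.1 rad, set by the injected noise
print(diff_phase_std)   # ~0, common phase noise cancels
```

In this sketch the differential phase is insensitive to the injected common-mode fluctuations by construction, mirroring the common-phase-noise suppression claimed for the two-sideband scheme; a common-amplitude version follows analogously by comparing magnitudes instead of phases.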