The Y-factor method is a widely used technique for measuring the gain and noise temperature of an amplifier. It is based on the Johnson-Nyquist noise of a resistor at two different, known temperatures.[1]
Consider a microwave amplifier with a 50 ohm input impedance and a 50 ohm resistor connected to the amplifier input. If the resistor is at a physical temperature $T_R$, then the Johnson-Nyquist noise power coupled to the amplifier input is $P_\text{in} = k_\text{B} T_R B$, where $k_\text{B}$ is Boltzmann's constant and $B$ is the bandwidth. The noise power at the output of the amplifier (i.e. the noise power coupled to an impedance-matched load that is connected to the amplifier output) is $P_\text{out} = G k_\text{B} (T_R + T_\text{amp}) B$, where $G$ is the amplifier power gain and $T_\text{amp}$ is the amplifier noise temperature.

For the Y-factor technique, $P_\text{out}$ is measured for two different, known values of $T_R$. Each measured $P_\text{out}$ is then converted to an effective temperature $T_\text{out}$ (in units of kelvin) by dividing by $k_\text{B}$ and the measurement bandwidth $B$. The two values of $T_\text{out}$ are then plotted as a function of $T_R$ (also in units of kelvin), and a line is fit to these two points (see figure). The slope of this line is equal to the amplifier power gain $G$, and the x-intercept of the line is equal to the negative of the amplifier noise temperature, $-T_\text{amp}$, in kelvin.
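As a concrete illustration of the procedure, the following minimal Python sketch converts two measured output noise powers to effective temperatures and fits the line described above. The function name y_factor_fit, the 77 K and 290 K load temperatures, and the 10 MHz bandwidth are illustrative assumptions, not values from the source.

```python
K_B = 1.380649e-23  # Boltzmann's constant in J/K


def y_factor_fit(t_cold, t_hot, p_cold, p_hot, bandwidth):
    """Estimate amplifier power gain and noise temperature from two
    Y-factor measurements.

    t_cold, t_hot -- physical resistor temperatures in kelvin
    p_cold, p_hot -- measured output noise powers in watts
    bandwidth     -- measurement bandwidth in hertz
    """
    # Convert each output power to an effective output temperature (kelvin)
    t_out_cold = p_cold / (K_B * bandwidth)
    t_out_hot = p_hot / (K_B * bandwidth)

    # Fit the line T_out = G * T_R + G * T_amp through the two points
    gain = (t_out_hot - t_out_cold) / (t_hot - t_cold)  # slope = power gain G
    y_intercept = t_out_cold - gain * t_cold             # equals G * T_amp
    t_amp = y_intercept / gain                            # x-intercept is -T_amp
    return gain, t_amp


# Hypothetical example: loads at 77 K and 290 K, 10 MHz bandwidth, with
# output powers synthesized for an amplifier having G = 100 (20 dB), T_amp = 50 K
bw = 10e6
p_cold = 100 * K_B * (77.0 + 50.0) * bw
p_hot = 100 * K_B * (290.0 + 50.0) * bw
print(y_factor_fit(77.0, 290.0, p_cold, p_hot, bw))  # approximately (100.0, 50.0)
```

In this sketch the slope of the fitted line recovers the power gain and the intercept recovers the noise temperature, mirroring the graphical interpretation given above.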