Sensitivity (radio receiver)

A receiver's sensitivity is a measure of its ability to discern low-level signals.

Sensitivity in a receiver is normally taken as the minimum input signal (S_min) required to produce an output signal with a specified signal-to-noise (S/N) ratio, and is accordingly defined as the minimum signal-to-noise ratio times the mean noise power.
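
Written in symbols (using the quantities defined below for the voltage equation), this definition reads

S_min = R \cdot k(T_i+T_s)B

Because kT_0 is roughly -174 dBm/Hz at the reference temperature T_0 = 290 K, practical sensitivity figures are often estimated in dBm as -174 dBm + 10 log10(B in Hz), plus the required SNR and the receiver's noise contribution, both in dB.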

Because receive sensitivity indicates how faint a signal can be and still be successfully received, the lower the power level, the better. This means that the larger the absolute value of the negative number, the better the receive sensitivity. For example, a receive sensitivity of -98 dBm is better than a receive sensitivity of -95 dBm by 3 dB, which corresponds to a factor of two in power. In other words, at a specified data rate, a receiver with a -98 dBm sensitivity can hear signals with half the power of the weakest signals a receiver with a -95 dBm sensitivity can hear.
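
As a quick check of the dB arithmetic, the short Python snippet below (using a small dbm_to_mw helper written for illustration) converts the two levels from the example above to linear power:

    import math

    # Convert dBm (decibels relative to 1 mW) to linear power in mW.
    def dbm_to_mw(dbm):
        return 10 ** (dbm / 10)

    ratio = dbm_to_mw(-98) / dbm_to_mw(-95)
    print(ratio)                    # ~0.501: -98 dBm is about half the power of -95 dBm
    print(10 * math.log10(ratio))  # -3.0 dB difference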

The simplified equation for the sensitivity (in volts) is

e = \sqrt{k(T_i+T_s)B(R-1)Z_0}

where

k = Boltzmann's constant
Ti = noise temperature that the antenna sees (kelvins)
Ts = noise temperature of the receiver (kelvins)
B = bandwidth (Hz)
R = required SNR at input (linear, not in dB)
Z0 = characteristic impedance (typically 50 Ω)
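
For illustration, here is a minimal Python sketch of this formula. The numeric inputs (290 K temperatures, 1 MHz bandwidth, 10 dB required SNR) are assumed example values, not figures from the article:

    import math

    k = 1.380649e-23    # Boltzmann's constant (J/K)
    T_i = 290.0         # antenna noise temperature in kelvins (assumed example value)
    T_s = 290.0         # receiver noise temperature in kelvins (assumed example value)
    B = 1.0e6           # bandwidth in Hz (assumed example value)
    R = 10.0            # required SNR, linear (= 10 dB; assumed example value)
    Z_0 = 50.0          # characteristic impedance in ohms

    # Sensitivity voltage per the formula above; for R >> 1 the factor
    # (R - 1) is approximately R, matching S_min = R times the noise power.
    e = math.sqrt(k * (T_i + T_s) * B * (R - 1) * Z_0)

    # Corresponding signal power delivered to Z_0, expressed in dBm.
    p_watts = e ** 2 / Z_0
    p_dbm = 10 * math.log10(p_watts / 1e-3)

    print(f"sensitivity: {e * 1e6:.2f} microvolts ({p_dbm:.1f} dBm)")

With these inputs the sketch reports a sensitivity of roughly 1.9 microvolts, or about -101 dBm.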