Atmospheric moisture content is described by several metrics; two commonly encountered ones are relative humidity and dew point. Relative humidity expresses the water vapor present in air as a percentage of the amount needed for saturation at the same temperature: a relative humidity of 50% means the air holds half the water vapor it could hold at its current temperature. Dew point, by contrast, is the temperature to which air must be cooled, at constant pressure and water vapor content, for saturation to occur; when air cools to its dew point, condensation begins to form. The two measures are linked: the higher the relative humidity, the closer the dew point is to the current air temperature, so humid air needs only a small drop in temperature before condensation begins, while dry air must cool much further.
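The link between the two measures can be sketched numerically. The snippet below uses the Magnus approximation to estimate dew point from temperature and relative humidity; the constants 17.62 and 243.12 are one common parameterization (other sources use slightly different values), and the function name `dew_point_c` is ours, chosen for illustration.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (deg C) from air temperature (deg C) and
    relative humidity (%) via the Magnus formula.

    The constants below are a common parameterization, valid roughly
    from -45 deg C to +60 deg C.
    """
    b, c = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

# At 20 deg C and 50% relative humidity, the dew point is about 9.3 deg C:
# the air must cool by roughly 11 deg C before condensation forms.
print(round(dew_point_c(20.0, 50.0), 1))  # → 9.3
```

Note that at 100% relative humidity the formula returns the air temperature itself, matching the definition: saturated air is already at its dew point.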
Understanding these measures is crucial in fields ranging from meteorology to agriculture. In weather forecasting, atmospheric moisture data helps predict the likelihood of fog or precipitation. In agriculture, it informs irrigation scheduling and helps prevent crop diseases that thrive in high-humidity conditions. Historically, such measurements were qualitative, relying on human observation; modern instruments provide precise, readily available data, improving accuracy across these applications.