Some water in the form of invisible vapour is intermixed with the air throughout the atmosphere. It is the condensation of this vapour which gives rise to most weather phenomena: clouds, rain, snow, dew, frost and fog. There is a limit to how much water vapour the air can hold and this limit varies with temperature. When the air contains the maximum amount of vapour possible for a particular temperature, the air is said to be saturated. Warm air can hold more vapour than cold air. In general the air is not saturated, containing only a fraction of the possible water vapour.
The amount of vapour in the air can be measured in a number of ways. The humidity of a parcel of air is usually denoted by the mass of vapour it contains, or by the pressure that the water vapour exerts; this is the absolute humidity of the air. Relative humidity compares the actual mass of vapour in the air with the mass of vapour in saturated air at the same temperature. For example, saturated air at 10°C contains 9.4 g/m³ (grams per cubic metre) of water vapour. If air at this temperature contains only 4.7 g/m³ of water vapour, then the relative humidity is 50%.
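The worked example above can be sketched as a short calculation; the saturation value of 9.4 g/m³ at 10°C is the one given in the text:

```python
def relative_humidity(actual_g_per_m3, saturation_g_per_m3):
    """Relative humidity as a percentage: actual vapour mass divided by
    the mass the air would hold when saturated at the same temperature."""
    return 100.0 * actual_g_per_m3 / saturation_g_per_m3

# Air at 10°C holds 9.4 g/m³ of vapour when saturated (value from the text).
print(relative_humidity(4.7, 9.4))  # → 50.0
```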
When unsaturated air is cooled, its relative humidity increases. Eventually it reaches a temperature at which it is saturated and the relative humidity is 100%. Further cooling leads to condensation of the excess water vapour. The temperature at which condensation sets in is called the dew point. The dew point and other measures of humidity can be calculated from readings taken by a hygrometer. A hygrometer has two thermometers: a dry-bulb thermometer, which records the standard air temperature, and a wet-bulb thermometer. The wet-bulb thermometer is an ordinary thermometer whose bulb is covered with a muslin bag, kept moist by an absorbent wick dipped into water. Evaporation of water from the muslin lowers the temperature of the thermometer. The difference between the wet- and dry-bulb temperatures is used to calculate the various measures of humidity.
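One common way to estimate the dew point from air temperature and relative humidity is the Magnus approximation. This empirical formula and its constants are not part of the text above, so the sketch below is illustrative rather than the method hygrometer tables use:

```python
import math

def dew_point_celsius(temp_c, rh_percent):
    """Estimate the dew point (°C) with the Magnus approximation.

    The constants a and b below are one widely used empirical fit,
    reasonable for ordinary surface temperatures; other fits exist.
    """
    a, b = 17.27, 237.7
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Air at 20°C and 50% relative humidity condenses at roughly 9.3°C.
print(round(dew_point_celsius(20.0, 50.0), 1))

# Saturated air (100% relative humidity) is already at its dew point.
print(round(dew_point_celsius(10.0, 100.0), 1))  # → 10.0
```

Note the consistency check in the second call: at 100% relative humidity the logarithm term vanishes and the formula returns the air temperature itself, matching the definition of saturation given above.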