How Is the Heat Index Calculated and How Is the Humiture Index Used To Measure Heat and Humidity?

The term humiture was coined in 1937 by O. F. Hevener, a meteorologist, to refer to the discomfort felt in the upper temperature ranges when the humidity rises.

The higher the humidity, the warmer it feels to the human body, because the body cannot evaporate sweat as easily.

Many indexes have been devised to quantify the discomfort.

All are faulty, because discomfort varies from person to person and from day to day for the same person.

All the indexes are based on similar concepts.

For example, the humiture equals the temperature in degrees Fahrenheit plus the amount by which the actual vapor pressure of the air, in millibars, exceeds twenty-one.
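As a minimal sketch of that arithmetic in Python (the 90-degree, 30-millibar readings are hypothetical values chosen only for illustration):

```python
def humiture(temp_f: float, vapor_pressure_mb: float) -> float:
    """Humiture as defined above: temperature in degrees Fahrenheit
    plus the actual vapor pressure in millibars minus 21."""
    return temp_f + (vapor_pressure_mb - 21.0)

# A muggy 90-degree day with a vapor pressure of 30 millibars:
print(humiture(90.0, 30.0))  # 99.0
```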

A millibar is a unit of pressure, the amount of force over a given area that the air is exerting; 1,000 millibars equals 29.53 inches of mercury. Total air pressure, measured with barometers, is the weight of the air above you; part of that weight is the vapor in the air.
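That equivalence gives a simple conversion, sketched below; the 1,013.25-millibar figure is standard sea-level pressure, used here only as a familiar test value:

```python
INCHES_HG_PER_MB = 29.53 / 1000.0  # from the equivalence stated above

def mb_to_inches_hg(pressure_mb: float) -> float:
    """Convert a pressure in millibars to inches of mercury."""
    return pressure_mb * INCHES_HG_PER_MB

print(round(mb_to_inches_hg(1013.25), 2))  # ~29.92, standard sea-level pressure
```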

The humiture is close to another index, the apparent temperature, which is calculated by a much more complex formula but gives similar values under most circumstances.
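The article does not spell that formula out. One widely published version of the apparent temperature is the regression the United States National Weather Service uses for its heat index, sketched here for illustration; it is meant only for hot, humid conditions, roughly 80 degrees Fahrenheit and above with relative humidity of 40 percent or more:

```python
def heat_index_f(temp_f: float, rel_humidity_pct: float) -> float:
    """NWS Rothfusz regression for the heat index (apparent temperature).
    Reasonable only for roughly temp_f >= 80 and rel_humidity_pct >= 40."""
    t, rh = temp_f, rel_humidity_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

print(round(heat_index_f(90.0, 70.0)))  # about 106
```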

Another familiar measure, relative humidity, is the percentage of moisture actually present in the air compared with how much the air could hold at that temperature. But the hotter air gets, the more water vapor it can hold, so for the same amount of moisture, the relative humidity actually goes down as the temperature goes up.
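A short sketch of that effect in Python; the Magnus-type approximation for saturation vapor pressure is one common choice among several, used here purely for illustration:

```python
import math

def saturation_vapor_pressure_mb(temp_c: float) -> float:
    """Approximate saturation vapor pressure in millibars
    (Magnus-type formula; an illustrative choice, not the only one)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity_pct(vapor_pressure_mb: float, temp_c: float) -> float:
    """Relative humidity: actual vapor pressure as a percentage
    of the saturation vapor pressure at that temperature."""
    return 100.0 * vapor_pressure_mb / saturation_vapor_pressure_mb(temp_c)

# The same 30 millibars of vapor gives a lower relative humidity in hotter air:
print(round(relative_humidity_pct(30.0, 30.0)))  # ~71% at 30 C (86 F)
print(round(relative_humidity_pct(30.0, 35.0)))  # ~53% at 35 C (95 F)
```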

The dew point, the temperature at which dew forms, might be a much better and simpler number for defining discomfort.

Virtually everyone is comfortable when the dew point is less than 60 degrees Fahrenheit. When it is more than 60, some will be uncomfortable.

Most will be uncomfortable over 65.

When it is 70 or more, virtually everyone agrees that the air is oppressively humid.
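Those thresholds translate directly into a simple lookup; the band boundaries below follow the rough figures above and are inherently fuzzy:

```python
def dew_point_comfort(dew_point_f: float) -> str:
    """Map a dew point in degrees Fahrenheit to the rough comfort
    bands described above; the cutoffs are approximate."""
    if dew_point_f < 60:
        return "comfortable for virtually everyone"
    if dew_point_f <= 65:
        return "uncomfortable for some"
    if dew_point_f < 70:
        return "uncomfortable for most"
    return "oppressively humid for virtually everyone"

for dp in (55, 63, 68, 72):
    print(dp, dew_point_comfort(dp))
```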