Did bleeding or bloodletting the patient in ancient times actually help any medical condition or did it make it worse?

Not unless the victim (that is, the patient) happened to be suffering from a hypertensive crisis or an abnormal buildup of iron in addition to whatever disease the doctor thought he was treating.

For example, draining 10 to 12 ounces of blood to start, and up to 80 ounces in all, was routinely prescribed for yellow fever among Washington’s troops during the American Revolution. The practice, called phlebotomy, is unlikely to have done the soldiers any good, especially if they were already weakened by cold, malnutrition, purges, and enemas.

Bleeding, or bloodletting, was common for centuries, if not millennia. It was based on the ancient theory that the bodily humors (black bile, yellow bile or choler, phlegm, and blood) had to be kept in balance.

The humors were also thought to determine temperament, making people choleric or sanguine, for example. If a person was ill, the theory went, balance had to be restored by reducing whatever humor was thought to be in excess (a condition called plethora), often by draining off blood.

Finally, in the nineteenth century, a French pathologist, P. C. Louis, decided to find out whether bleeding really helped by using what was then a revolutionary method: counting who got better and who got worse.

His “numerical method” was applied to 78 cases of pneumonia, 33 cases of erysipelas (a streptococcal infection of the skin), and 23 cases of inflammation of the throat. He found no advantage to bloodletting, and his results marked the beginning of the end for bleeding, and for leeching as well.

However, using leeches to reduce blood-engorged swelling is still a practical part of modern medicine.