
6. Slavery and Diseases in the Antebellum American South


As an economic institution, American slavery was certainly privately profitable.1 Slavery easily passed the most rigorous evolutionary test: survival. American slavery existed and thrived for over two hundred years; it was extinguished only in 1865, after the bloodiest conflict fought in the Western Hemisphere. Prior to the Civil War, slavery had spread throughout the American South. It was entrenched as far north as Kentucky and Missouri; in the West, it had spread across Texas and was a contentious issue in the debates over statehood for New Mexico. For over two centuries, slaveowners profited immensely from their ownership of human chattel, and they were willing to plunge the American nation into war rather than contemplate any possible attenuation of their “peculiar institution.” Nevertheless, these facts do not prove that slavery was profitable (efficient) for the entire non-enslaved Southern society, nor do they disprove that only slaveowners profited from slavery while the rest of Southern society bore the brunt of slavery’s costs. Whether slavery was profitable to the entire non-enslaved population of the South is an issue unto itself, one that demands a complete accounting of all the costs that slavery imposed on the South that were not paid by the owners of slaves.

Some costs of slavery were known to be borne by the entire society, not just by slaveowners; other costs that fell on the broader society were incompletely recognized, and some were completely unrecognized. Included in the costs borne by the entire South were (1) the costs of the legal system that defined and delineated property rights in humans, (2) the costs of the judicial and law enforcement systems that implemented the laws, and (3) the costs of the restrictions that the laws and customs of slavery imposed on the rights and privileges of the non-enslaved. These costs are widely ignored in the literature on slavery. Prior to the Civil War, mere discussions about the morals, ethics, and feasibility of slavery that did not reflect well on the institution were hazardous to the people who entertained them. Extrajudicial capital punishment (lynching) existed in the South before the Civil War, but in that era its victims were mostly white, and many were lynched because of their antislavery views. The restrictions on human freedoms that upholding slavery entailed were no small cost either. For example, whites and blacks could not legally marry, and those who married without legal sanction could not send their children to school. The second Roman Catholic Bishop of Portland, Maine, James A. Healy, was a child of such a mixed union. Healy and his siblings were separated from their parents in Georgia to receive an education in the North. These marital restrictions on freedom and choice are eerily suggestive of Nazi Germany.2

These everyday restrictions and expenses were all real costs of slavery; no less real were the costs imposed by the disease pools that slavery introduced and perpetuated in the American South. Two different forces made the southern disease environment different from that of the other states before the Civil War. One was climate, which differed by degree from that of the North; the other was slavery. It was slavery that made profitable the plantation economy (large landholdings cultivated primarily by non-family labor) that dominated much of southern agricultural output. Slavery and the plantation economy differed in kind from the way agricultural output and labor markets were organized in the North. Biological, epidemiological, and historical evidence documents the role of the plantation South in the maintenance and spread of pathogens, and the debilitating effects of infectious parasitic diseases in the South.

In our discussion of the profitability (efficiency) of slavery, we accede to the common (yet morally reprehensible) practice of excluding the welfare of slaves from the calculus. We do this because if the welfare of slaves is included in the calculus, then the question of slavery’s profitability and efficiency is trivially resolved: slavery was unprofitable and inefficient. If the benefits of slavery to slaves exceeded the costs to them, there would have been no need for physical compulsion (slavery); voluntary contractual arrangements could have replaced chattel slavery. People are willing to and do work under degrading, deplorable, and dangerous conditions if they are sufficiently compensated. Professional “sports” such as alligator wrestling and cage fighting are blatant examples of people sacrificing their dignity and their physical and mental well-being for fame and fortune. One might cast aspersions on the judgment and/or mental abilities of the participants, but there is no physical compulsion that forces them to undertake these activities; they are not slaves in any literal sense of the word.

After the Civil War, plantation owners found that they could not afford to pay freedmen enough to work as they had when they were slaves; the resumption of plantation-type agricultural production failed because the extra revenues generated by working under slave-like conditions (gang labor) were insufficient to compensate a free labor force for working under those conditions. The wages that would have had to be paid to a free labor force to work as it did before emancipation far outweighed the extra revenues the labor would have produced in plantation agriculture. Sharecropping, tenant farming, and other forms of agricultural organization were the institutional response to the financial impossibility of using pre-war production techniques (gang labor) with an emancipated labor force. This gives us the obvious conclusion: if the welfare of the enslaved is included in the calculus, slavery is patently unprofitable. The value of leisure (freedom) and more desirable work conditions was greater than the extra revenue generated by working people like slaves.

In this chapter, we examine the antebellum southern disease environment and the interactions between diseases and slavery. This involves assessing the impact of the primary “southern” diseases and their disparate effects on people of different ancestral heritages (ethnicities), identifying the interactions with the slave system, and exploring how this affected slavery and its profitability. We assess the role of slavery and the plantation system in the spread and perpetuation of infectious parasitic diseases throughout the southern populations—black and white—and discuss the impact of these diseases on health, human development, productivity, and society.3

African-American Slavery and Diseases

We begin with an observation that is medically and scientifically uncontroversial: diseases have different impacts on people of different ethnicities. Despite its acceptance by the medical and scientific communities, this statement still raises hackles in the social sciences; it appears to be a matter of morality for many social scientists to believe that there should be no genetic traits that differ markedly between ethnic (racial) groups. Stampp (1956) is a prominent example of this school of thought. In the absence of any evidence, and with only moral indignation supporting his case, he bluntly asserted his belief that there is no difference in ethnic susceptibilities to diseases: “The slave of tradition was a physically robust specimen who suffered from few of the ailments which beset the white man. A tradition with less substance to it has seldom existed. In the South, disease did not discriminate among men because of the color of their skins; wherever it was unhealthy for whites to live it was also unhealthful for Negroes” (Stampp 1956, p. 296). Charitable critics might say this statement is literally true: diseases do not discriminate on the basis of skin pigmentation but on other criteria (Duffy blood type, sickle cell trait, sweating response, and so forth). But even here a recent study by Pierson (2008) suggests that vitamin deficiency diseases (specifically, lack of vitamin D) differentially affected darker-skinned people.

Stampp’s beliefs are contradicted by evidence and objective reality. In the context of the antebellum South, there were identifiable diseases that had disparate impacts on the black and white southern populations. The warm southern environment and the plantation system of relatively clustered dense rural populations were particularly conducive to the perpetuation and spread of infectious pathogens. Plantation slavery played a significant role in creating and maintaining a distinctive southern disease ecology that, in turn, affected both the southern economy and society.

Even before the Civil War, slavery in America was a constant source of scholarly and public interest. In the present era, African-American slavery has simultaneously disgusted scholars and seduced them into expending scholarly effort on researching it. In 1974, the landmark addition to the continual flow of scholarship on slavery was the book Time on the Cross by Robert William Fogel and Stanley L. Engerman. The book intensified the slavery debates, inflamed scholarly passions, and spurred interest among scholars and wider audiences. Both book and authors crossed over from academic venues to the general media. Book reviews appeared in all major newspapers in the United States, and Fogel gave both radio and television interviews and appeared on nightly talk shows. Publicity and notoriety spurred an avalanche of scholarly (and nonscholarly) articles, books, and manuscripts on slavery. One of the more contentious issues that arose was that of slave living standards and the adequacy of slave diets. Indeed, the heat of the debate on slave living standards can be gauged from the headline in The New York Review of Books (May 2, 1974) when it featured Time on the Cross in its lead book review. The review by C. Vann Woodward was headlined “The Jolly Institution.” Interest in slave living standards metamorphosed into an interest in the role of nutrition and its effects on slave health and human development. Physiological measurements (termed anthropometrics) were used to address the question of the adequacy of slave diets, as it was maintained that the average stature (height) of a population is a relatively good proxy for its net nutrition and can serve as a substitute for explicit evidence on diet when such evidence is missing.4

Part of the reason for the emphasis on anthropometrics was the availability of large datasets containing physical measurements of enslaved Americans. Such measurements, for instance, were required as a component of the prohibition of the international slave trade that was enacted by Congress on March 2, 1807, and took effect January 1, 1808. To ensure that no slaves in the antebellum era’s coastwise trade were smuggled in from abroad, all slaves boarding and disembarking ships in ocean ports of the United States beginning in 1808 had to have a manifest detailing their physical characteristics lodged at the customs houses of the ports of origin and destination. These coastwise manifests recorded estimates of slave ages, heights, weights, sex, and other physical characteristics; the records could then be used to ascertain whether any of the slaves in the coastal waters of the United States had been smuggled into American ports. The data are enormous (over 146,000 individual observations); consequently, the slave manifests provide fairly reliable estimates of the physical characteristics of slaves who were shipped on the coastal waters of the United States. Another large dataset containing measurements of the heights (and other personal characteristics) of nineteenth-century southern blacks, a substantial portion of whom were former slaves, comes from Union Army records during the Civil War. These data also are enormous: over 186,000 black males enlisted in the Union Army from late 1862 to 1865. While they do not contain as many observations as the preceding datasets, other substantial sources also detail the physical characteristics of nineteenth-century southern blacks (most of whom were born and raised slaves); among these are convict (penitentiary) records for various states.

Yet like most data, the data on slave (black) heights can mislead the unwary. The data have to be standardized to make sense of them; an obvious example is that the data have to be standardized for age and sex—because males are typically taller than females, and adults taller than children—before heights can be assessed. It also is standard for height data to be reported by year of birth; a group defined this way is termed a birth cohort. Even so, the data are not revealing unless taken in historical context with an appreciation of the nuances that the entire dataset entails. A good example is that some of the anthropometric data indicate that the mean height of various slave cohorts was increasing during parts of the antebellum period and decreasing during other parts; other data indicate that the mean height of black Tennessee convicts born in the 1830s was increasing; still other data indicate that the heights of black Georgia convicts born in the 1830s and 1840s were stable (if not slightly increasing). These findings have misled scholars into concluding that diets, not diseases, were certainly the more important influence on slave heights (see Komlos and Coclanis 1997; Rees, Komlos, van Long, and Woitek 2003; Sunder 2004).

The anthropometric data, though, are inadequate to answer the diet versus disease question for at least two reasons. First, the height findings derived from the data are themselves ambiguous about the exact time profile of slave heights; they indicate both fluctuations in heights and some (contradictory) time trends depending on the slave and birth cohorts examined and the data used.5 Second, there are historical reasons. In the years between the end of the Revolutionary War (1783) and the ban on the international slave trade (1808), extraordinarily large numbers of African slaves were imported into the United States.6 Slave imports during this period dwarf those of any other similar period. The most recent direct estimate of slave imports for 1783 to 1810 puts imports at 170,300 slaves (McMillin 2004, pp. 30–48, and tables 7, 9); accepted estimates of total slave imports into the United States (or what became the United States) from colonial times to 1810 are between 600,000 and 660,000 slaves. This means that between 26 and 28 percent of all slave imports were brought into the country in less than thirty years.7
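The 26 to 28 percent range follows directly from the figures cited above; a minimal sketch of the arithmetic, using only the McMillin estimate and the accepted range of total imports, is given below for readers who wish to verify it.

```python
# Illustrative check of the import-share arithmetic cited in the text.
imports_1783_1810 = 170_300                 # McMillin (2004) estimate for 1783-1810
total_low, total_high = 600_000, 660_000    # accepted range of total imports through 1810

lower_bound_share = imports_1783_1810 / total_high  # share if total imports were 660,000
upper_bound_share = imports_1783_1810 / total_low   # share if total imports were 600,000

print(f"{lower_bound_share:.0%} to {upper_bound_share:.0%} of all slave imports")
# prints: 26% to 28% of all slave imports
```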

There are two reasons why the absolute number of slaves imported matters for the interpretation of slave heights. First, people born in tropical West Africa during the eighteenth and early nineteenth centuries were much shorter than their American-born descendants. Kiple and Kiple (1980, p. 786) summarize this issue: “Research conducted by Fogel, Engerman, and Higman concerning the height of some 25,000 Trinidadian slaves indicates that newly imported Africans were significantly shorter than Creole-born slaves. Fraginals has found the same to be true for Cuba. First-generation Creole slaves were significantly taller than freshly imported Africans.” This means that the average slave height was depressed by the addition of African-born slaves. So the large number of slave imports from 1783 until the trade ban in 1808 would have lowered the mean heights of slaves in any data that included slaves born before 1808. But this is arithmetic; it does not indicate deteriorating diets in the United States.8

Second, and more important, the number of slaves imported is significant because with the importation of Africans came an increasing number of pathogens to which American-born slaves had no, or few, acquired immunities. The spread of these diseases to American-born slaves during the years when the pathogens were most rampant (1783 to 1808) would have affected American slave children born during those years and affected their adult heights. This can help explain the sharp increase in the heights of male and female slaves, aged 12 to 17, in the birth cohorts born circa 1817 to 1830 (Steckel 1979, p. 377). A milder disease ecology, resulting from the decline in “new” African diseases during their childhoods, also can help explain the increasing heights of the adult male slaves born between circa 1805 and 1820 (Steckel 1979, p. 377), the stable heights of adult male black Georgia convicts born in the second half of the 1830s and in the 1840s (Komlos and Coclanis 1997, pp. 439–40), and the increasing heights of adult male black Tennessee convicts born in the 1830s (Sunder 2004, pp. 80–81, 84–85). Alternatively, the increasing heights could be attributed to better diets, or they could be due to both diets and an ameliorating disease ecology. The possibility also exists that there were entirely different reasons for the increasing heights. The data alone do not reveal truth; to unravel the connections, we have to use all the knowledge available, and still hope that we are right and not missing something. Still, attributing increasing slave heights to improving slave diets may be entirely incorrect. Slave diets in the United States could have been unchanged, and the mean heights of slaves still would have risen as the proportion of African-born slaves in the coastal shipping data fell and as the disease ecology in America ameliorated.

Regardless of these considerations, scholars suggest from the anthropometric findings that adult male slaves were given adequate sustenance but that slave infants and children (and perhaps females) were severely malnourished (Steckel 1986b, 1992). The data indicate that adult slaves were only slightly shorter than contemporaneous whites; slave children were extraordinarily short, and slaves had relatively high rates of neonatal and infant mortality (Fogel 1986; Steckel 1986b, 1992). The obvious question is: If adult slaves were relatively healthy, then why were slave infants and children severely malnourished? Steckel (1986b, 1992) addresses this issue, arguing that slaveowners systematically deprived infants and children of a nutritious diet.

We offer an alternative explanation: it was not the diets of slave children but a combination of the plantation system and diseases that caused much of the phenomenon of relatively healthy adults and sickly infants and children. Of course, there were synergies between diets and diseases in slave health; inadequate diets increased slaves’ susceptibility to diseases, and existing diseases affected slaves’ appetites and eating habits, further degrading their nutrition. More precisely, both diets and diseases mattered. We emphasize the role of infectious parasitic diseases in the plantation South because heretofore such diseases have been given a subsidiary role in the history of the American economy. The role of infectious parasitic diseases in slave health is fundamental to a complete history of antebellum slavery, including the issues surrounding the biological standard of living of slave infants, children, and their caregivers, as well as the economics of slave productivity.9

Yellow Fever and the Southern Urban Disease Environment

Some diseases that were relatively common in the South (“southern” diseases) were less common in the North. As noted in the previous chapter, the further south one travels in the United States, the more the climate is conducive to the maintenance and spread of pathogens that depend on warm weather for their own survival or for the survival of their vectors (most usually arthropods). Yellow fever, while never endemic to the United States, was one of the parasitic diseases initially imported with the slave trade, and, we believe, it was instrumental in shaping southern society.

To repeat points made earlier, yellow fever is typically a disease of urban or very dense populations, and an individual once infected with yellow fever has life-long immunity to re-infection. It is an acute viral disease that has a relatively brief duration in humans. It does not persist during cold periods because its vector (the mosquito Aedes aegypti) cannot survive freezing temperatures and is inactive in cool weather. A. aegypti has a range of less than a few hundred meters and can only survive a few days without water; consequently, it is usually active in wet and warm areas such as the American Southeast. Without continuously warm weather, yellow fever cannot become endemic. It was endemic in the Caribbean islands and tropical South America, where it is still endemic in monkey populations. Currently, it spreads from monkey populations to the occasional human, but modern medicine and vaccinations have reduced yellow fever to a minor public health menace. In nineteenth-century America, yellow fever epidemics were most severe in urban environments; the disease has been absent from the American landscape since 1905, but its mosquito vector is still abundant and widespread.

Yellow fever was imported into the port cities of the antebellum South, particularly when the international slave trade was in its noxious bloom. Along with tropical produce, yellow fever continued to be episodically introduced into southern port cities after the prohibition of the international slave trade in 1808. Children infected by the disease seem to suffer less and die less frequently than adults. Southerners who grew up and lived in or near port cities (New Orleans, Charleston, Mobile, Savannah, and Wilmington) were likely to have had a childhood exposure that made them immune to yellow fever. People of tropical West African ancestry were (and are) relatively resistant to yellow fever. While the physiological basis for African resistance to yellow fever has not been identified, tropical West Africans and their descendants are much less susceptible to death from yellow fever than non-Africans. What this means is that the people who were especially susceptible to yellow fever were recent arrivals to southern port cities who were not of West African ancestry. Northerners and European immigrants were the peoples who fit this category; the death rate from yellow fever among recent arrivals was so appalling that the disease was given the appellation the “strangers’ disease.” Pritchett and Tunali (1995, p. 519) estimate that non-Americans constituted 89.5 percent of all fatalities in the 1853 yellow fever epidemic in New Orleans; in contrast, African-Americans constituted just 1.2 percent of the fatalities and whites born in New Orleans 3.3 percent. Pritchett and Tunali (1995, p. 519) also estimate that the mortality rate in New Orleans for foreign-born whites from all diseases in 1853 was 148 per thousand, contrasted with 24 per thousand for New Orleans-born whites.

The differential in death rates is even more impressive when one accounts for the presumed age and health of immigrants. The population of people who emigrated in the nineteenth century was not an unbiased sample of the originating population; immigrants tend to be concentrated between the ages of 18 and 40, and they tend to be healthy. Given the rigors of long-distance travel in the nineteenth century, we expect these traits in immigrants to have been even more pronounced. This means that a relatively young and healthy population died at a rate six times higher than that of the native white population (a population that included infants, the elderly, and the sickly), according to Pritchett and Tunali’s estimates; in 1853, an immigrant to New Orleans had a better than 1 chance in 7 of dying. Moreover, a little over a decade later, in New Orleans in 1867, black troops in the Union Army who had no previous exposure to yellow fever (and consequently no acquired immunities) suffered a mortality rate of 73 per thousand when they contracted yellow fever; their white counterparts died at a rate of 256 per thousand (Kiple and King 1981, p. 45).
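A minimal sketch of the arithmetic behind the "six times higher" and "1 chance in 7" statements, using only the Pritchett and Tunali rates quoted above; it is an illustrative verification, not a re-estimation.

```python
# Relative mortality in New Orleans, 1853, per the rates cited from Pritchett and Tunali (1995).
foreign_born_rate = 148 / 1000   # deaths per person, foreign-born whites, all diseases
native_born_rate = 24 / 1000     # deaths per person, New Orleans-born whites

ratio = foreign_born_rate / native_born_rate   # about 6.2 times the native-born rate
odds = 1 / foreign_born_rate                   # about 1 chance in 6.8 of dying

print(f"{ratio:.1f}x the native-born rate; roughly a 1 in {odds:.1f} chance of dying")
```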

Part of the excess mortality of immigrants was due to living conditions; immigrants tended to live in crowded conditions without many amenities; in short, slum-like housing. Standing pools of water (New Orleans had no sewers) served as breeding grounds for yellow fever’s mosquito vector; this explains part of the excess mortality of immigrants. Another part can be explained by their relative poverty; the staple southern diet of the poor consisted of cornmeal, molasses, and pork fatback. This diet can and did cause pellagra, a disease of nutritional (niacin) deficiency. Immigrant nutritional resources would have been depleted by the long ocean voyage and further strained by their adoption of the diet of the southern poor; sickly people are likely to suffer more severely when attacked by another disease.

The urban South was decidedly unhealthful for newcomers who were not of West African ancestry. This had a profound impact on southern culture and American history. Besides yellow fever, southern urban areas harbored the ubiquitous diseases common to most nineteenth-century American cities. Among the more common urban diseases were some that are the childhood diseases of today—measles, whooping cough, diphtheria—in addition to cholera, dysentery, and tuberculosis. While it might seem that these ubiquitous urban diseases would have had no particular differential effect on European immigrants or northern migrants in the South, this is not correct. Because the southern climate was warmer than the climates from which the immigrants and migrants came, the specific strains of pathogens that thrived in southern cities in the nineteenth century remained virulent through more of the year than in the North or in Europe.

Death and disease, and the competition from slave labor, made the South unattractive to European immigrants and northern migrants. The low numbers of such newcomers had the effect of homogenizing the white society of the South. The politically active immigrants who fled European tyrannical rulers and secret police did not typically go to the South; those who did go south (and survived) were too few in number to question the southern system of slavery and aristocracy. This, and the low numbers of northern migrants, allowed the belief that the institution of racial slavery was natural and just to flourish in the South because there were no substantial elements in the white population who spoke in opposition. All true (white) southerners supported the system; a relatively homogeneous population allowed the easy suppression of heterodox opinions. This was unlike the North, where immigrants were relatively common and a myriad of opinions on slavery and race, while not welcomed, were tolerated for the most part. Southerners viewed the election of Lincoln as a kind of blasphemy; no right-thinking (white) man would question the morality of slavery.

In the North in 1860, there was no overwhelming desire for war to free slaves. There was an emotional attachment to the concepts of one country and a shared history. It was only the southern actions to dissolve the Union by force that galvanized the North to go to war. And why was the South so unyielding? It was because everyone knew that slavery was beneficial and right. A substantial portion of southern belligerence and intransigence can be attributed to the southern disease environment that visited illness and death so abundantly on northerners and European immigrants who may have had radically different views on the ethics of African slavery.

Slave Plantations and the Southern Rural Disease Environment

The living conditions of most African-American slaves were very different from those of other agriculturalists in the pre–Civil War United States. The ownership of slaves was relatively concentrated. In 1850, for the entire South, 21.6 percent of all slaves were owned by people who owned 50 or more slaves; in 1860, the percentage was 24.9 percent. In the lower South, the percentages were 30.6 percent in 1850 and 33.2 percent in 1860 (Gray 1958, vol. 1, p. 530). There is a caveat: these data are for slaveowners, not slaves resident per plantation. However, the source of the data, Gray (1958, vol. 1, p. 530), states that “While a single large slaveholding might be distributed among several different plantations, this was undoubtedly exceptional.” Relying on Gray, we can interpret these data for the most part as slaves per plantation, except for the largest slaveowners, say owners with 200 or more slaves. For the entire South, though, owners with 200 or more slaves accounted for only 2.2 percent of all slaves in 1850 and only 2.4 percent in 1860 (Gray 1958, vol. 1, p. 530). Gray (1958, vol. 1, pp. 529–35) also has data on the size of slave holdings; for the entire South, the median holding was 20.6 slaves in 1850 and 23.0 slaves in 1860; for the lower South, the median holding was 30.9 slaves in 1850 and 32.5 slaves in 1860.

The resident slave population on plantations was for the most part kept in compact areas (“slave quarters”), close but not adjacent to the dwellings of the white residents. In nineteenth-century America, the housing of slaves represented a highly unusual form of farm housing. Non-slave agricultural communities had a much more dispersed population, with the distance between residences vastly greater than that between slave cabins. Free farms had many fewer acres and workers (both family members and any hired workers) per farm relative to slave plantations; free agricultural workers typically lived on or very close to the land they cultivated. In contrast, the density of the slave population and housing was unique in nineteenth-century agriculture because on large slave plantations the slave quarters typically had multiple families and many slaves per acre.

The slave quarters that have survived into the twenty-first century are outliers; the typical shoddily built structures were much more likely to have been torn down and/or abandoned than more substantial structures. Nevertheless, from surviving structures and documentary material, it appears that slave quarters were constructed with between 15 and 30 cabins per acre. A little arithmetic supports this observation. The typical slave cabin is reported to have been 800 square feet. Suppose, to err on the side of caution, we double it to 1,600 square feet and give an additional 1,000 square feet of grounds between cabins; our “typical” slave cabin then occupies 2,600 square feet (including grounds). This gives a separation of about 20 feet between each cabin. From casual observation of surviving slave quarters, this seems about right. There are 43,560 square feet in an acre; consequently, these dimensions imply approximately 17 slave cabins per acre (17 cabins times 2,600 square feet per cabin equals 44,200 square feet). If each cabin housed a family of five, then 20 cabins would have housed 100 people. The 20 cabins would require 2,600 square feet each, for a total of 52,000 square feet, or approximately 1.19 acres. As a result, a plantation with 100 slaves could have housed them on just over one acre.
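The calculation above can be reproduced directly; the cabin and lot sizes are the stated assumptions of the text (an 800-square-foot cabin doubled for caution, plus assumed grounds), not measurements, and the sketch below simply restates the arithmetic.

```python
# Reproduce the slave-quarter density arithmetic from the text.
SQFT_PER_ACRE = 43_560

cabin_sqft = 1_600     # reported 800 sq ft, doubled to err on the side of caution
grounds_sqft = 1_000   # assumed grounds between cabins
lot_sqft = cabin_sqft + grounds_sqft            # 2,600 sq ft per cabin including grounds

cabins_per_acre = SQFT_PER_ACRE / lot_sqft      # about 16.8, i.e., roughly 17 cabins per acre

cabins = 20                                     # 20 cabins at 5 persons each houses 100 people
people = cabins * 5
acres_needed = cabins * lot_sqft / SQFT_PER_ACRE  # about 1.19 acres

print(f"{cabins_per_acre:.1f} cabins per acre; {people} people on {acres_needed:.2f} acres")
```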

The animals and food storage of the slave quarters were similarly confined and concentrated in a relatively small area. Defecation, urination, disposing of waste products, and cooking invite environmental pollution and infestation by pathogens. Somewhat ameliorating the increased hazards associated with crowding and pollution was that slaveowners could more readily absorb the fixed costs associated with a “public works” infrastructure. For example, wells could be dug and lined with brick and mortar to prevent ground water and animal contamination, and privies could be built some distance away from housing. In contrast, many small farms in the South, both before and after the Civil War, had no dedicated facilities for the disposal of human wastes; their sanitary facilities were bushes or secluded outposts.

We focus on the diseases that were both endemic to the southern plantation system and had disparate effects on African-American and European-American peoples. Hookworm and malaria are given special consideration later in the chapter because (1) they have disparate effects on African-Americans and European-Americans that have been documented, (2) they were both widespread in the antebellum rural South, (3) they affect adult productivity even though they have relatively low levels of case mortality, and (4) they both have a substantial historical, medical, and scientific literature that allows for their investigation and analysis.10

Nutritional and Parasitic Diseases on Southern Plantations

The diseases of the slave quarters can be broken into two parts—nutritional and parasitic. Nutritional diseases were significant and relatively widespread in the slave quarters; they were caused by ignorance rather than malice, though this should not be taken to imply that slaveowners were benevolent. Slaves were very valuable and expensive assets, and any deliberate nutritional deprivation would have been economically stupid. Nevertheless, there were nutritional deficiencies in the slave quarters, and some of these deficiencies manifested themselves in the form of illnesses.

Probably the most prevalent of the nutritional diseases was pellagra, but pellagra was not identified as a nutritional disease until the first half of the twentieth century. Pellagra is caused by a lack of niacin (vitamin B3). People whose diets rely heavily on corn (maize) are particularly susceptible to it because untreated corn is deficient in niacin.11 Symptomatic of pellagra are the “four D's”: diarrhea, dermatitis, dementia, and death; other symptoms include sensitivity to sunlight, aggression, red skin lesions, insomnia, weakness, mental confusion, and paralysis of the extremities. Kiple and Kiple (1977) argue that pellagra, although not identified as a specific disease at the time, was widespread in the slave population of the antebellum South.

There are difficulties in identifying pellagra because it advances in stages that are not easily recognized as one clinical manifestation. Kiple and Kiple (1977) maintain that pellagra manifested itself seasonally, with its early symptoms appearing in late winter and early spring when slave diets were most monotonous and most dependent on corn meal. They argue that “black tongue” and other diseases that were identified by the southern medical establishment were simply manifestations of pellagra in its various stages. The case they make for the prevalence of pellagra in the antebellum South is compelling; their arguments are even more convincing because in the early twentieth century pellagra was found to be endemic among the poorer peoples of the South.

The prevalence of other nutritional diseases is less well documented than that of pellagra. Protein deficiencies can cause kwashiorkor (protein malnutrition, indicated generally by fatigue, irritability, lethargy, growth failure, diarrhea, dermatitis, and mental retardation); other specific vitamin and mineral deficiencies also can cause mental retardation, physical deformities, and reduced growth. While a single episode of vitamin-protein-calorie malnutrition may not cause permanent disability, an entire regime of childhood malnutrition will have permanent effects (Das and Pivato 1976; Das and Soysa 1978). Except for iodine deficiencies (goiter and cretinism), most nutritional diseases would have been eliminated, or at least alleviated, by the more abundant and varied diet that accompanied the warm-weather months of the year. As noted earlier, Pierson (2008) suggests that darker-skinned African-Americans were more susceptible to diseases associated with vitamin D (the “sunshine” vitamin) deficiency. To repeat a point, the diseases caused by nutritional deficiencies were due to neither malice nor avarice: slaveowners could have been morally obtuse, but they would not have deliberately hazarded slave capital valued at over a thousand dollars and compromised slave productivity to save a few dollars per year (see Lebergott 1984, pp. 210–35). Nutritional diseases were a hazard in the slave quarters, but unlike parasitic diseases they were not infectious.

Parasitic diseases are those that are caused by foreign organisms entering the human body; these organisms include viruses, fungi, protozoa, bacteria, and multicellular parasites. The human body contains a multitude of organisms that are nonhuman. Indeed, it has been estimated that the amount of nonhuman DNA in a human body vastly exceeds that attributable to Homo sapiens. (So when people use the term “we” rather than “I” it can be interpreted literally.) The vast majority of these organisms do not cause illness in humans; some are symbionts (mutually beneficial), others are commensals (nonharmful parasites), and others become pathogenic only during specific circumstances.

Not all the diseases caused by foreign organisms (parasites) have been identified; the acquisition of knowledge is ongoing, and what we know today may be undone tomorrow. The history of stomach ulcers is illustrative of the growing realization of the importance of microorganisms. Twenty years ago (in 1990) the prevailing medical view was that stomach ulcers were caused by stress and diet. Barry Marshall, a relatively obscure (at least to the medical establishment and to the authors of medical textbooks) internist in Australia, had in the 1980s co-authored papers showing that a bacterium (Helicobacter pylori) was associated with gastric ulcers and that the ulcers responded to antibiotics. Only after the renowned medical organ, The National Enquirer (March 13, 1990), publicized his work did the medical establishment pay heed (cited in Ewald 2000, p. 233). Today antibiotics are the usual treatment for gastric ulcers. In a similar vein, some forms of cancer have been shown to be caused by viruses; thinking about the causal factors of disease is being transformed as the importance of microorganisms is increasingly recognized (Ewald 2000).

Parasitic diseases are infectious; just how they are transmitted depends on the organism. Some are transmitted by soil or water pollution, others have arthropod vectors, and others are communicable from other humans. We have difficulties listing all the parasitic diseases of the slave quarters because such diseases are superabundant and we cannot completely identify which diseases were present and when, and which were not. Microorganisms as a causal factor in disease were not recognized until the work of Louis Pasteur, Robert Koch, Charles Laveran, Ronald Ross, Carlos Finlay, and the other pioneer microbiologists of the late nineteenth and early twentieth centuries. But we do know that parasitic agents existed before they were identified, and that diarrhea, which was common in the South, is a symptom widely associated with parasitic and water-borne diseases.

Diseases that are soil transmitted, water-borne, and/or density dependent would have been much more abundant in plantation slave quarters than on family farms in similar climatic and geographic circumstances. These diseases are so ubiquitous that they are frequently not identified by the causal parasite or scientific name, but by symptom; they are lumped together and characterized as diarrheal diseases. We cannot attempt an even modestly comprehensive list of parasitic diseases because there are just too many pathogens and they are (and were) evolving too rapidly to assemble a comprehensive catalog of them.

To illustrate the problem for water-borne diseases, the Centers for Disease Control and Prevention estimated that 58 percent of all cases of water-borne infection in the United States for the years 1991 to 1998 had no identified causal agent (Cullimore 2008, p. 29). The public health authorities in the United States at the end of the twentieth century were competent and conscientious; the reason that so many infections were not matched with pathogens is that the task was not possible given the state of knowledge. Nevertheless, some parasitic diseases related to human density (in addition to hookworm and malaria) that are mentioned prominently in the literature and were likely to have been in and around antebellum slave quarters are hepatitis A, infectious hepatitis, viral gastroenteritis, Campylobacter, Escherichia (E.) coli, leptospirosis, Salmonella enteric fever, rotavirus, Shigella, cholera, yersiniosis, amebiasis, Giardia lamblia, and enterobiasis. Most of these diseases are due to fecal pollution. Besides hookworm, other helminthic infections of tropical origin also would, in all probability, have been more abundant in plantation slave quarters than on family farms, as they are spread through fecally polluted soils and water and are density dependent. The most common of the other parasitic worms in the antebellum South were Ascaris (large roundworm) and Trichuris (whipworm). While they were neither as abundant nor as widespread as hookworm, the available evidence suggests that large roundworm and whipworm infections were relatively common in the antebellum South.12

Although the significance, both medical and economic, of large roundworm and whipworm infections is tertiary to that of hookworm, it is not trivial, and the two parasitic worms have many similarities with hookworm.13 The survival, maintenance, and spread of large roundworm and whipworm are similar to those of hookworm. Both predominate in areas of poor sanitation where there is fecal pollution. Individuals acquire the parasites when they inadvertently ingest fecally contaminated soil or consume contaminated food or water. Because of their behavior, infants and young children are more commonly and more heavily infected in poor, rural areas with warm humid climates and moist sandy soils. (Neonates also can acquire the worms through the placenta.) Both parasites eventually end up in the human intestine, where they live on the host’s tissue secretions. Many of their symptoms and consequences also are similar to those of hookworm but typically are not as severe because these worms do not suck the host’s blood (and its nutrients) as hookworms do. Individuals with either large roundworm or whipworm infections usually are asymptomatic. When they are not, they are likely to have abdominal cramps, fever, coughing and wheezing, nausea, and vomiting.

For large roundworm infections, a heavy worm load can block the intestine, resulting in severe pain and vomiting, and may lead to cachexia (general ill health, malnutrition, weight loss, wasting of muscle, loss of appetite, and physical weakness). Heavy worm loads also are associated with iron-deficiency anemia and impairments of growth and cognition (mental retardation). Children infected with heavy loads may not grow or gain weight normally. For whipworm infections, the common symptoms are present only for individuals with very heavy worm loads, and there are no pulmonary symptoms because, unlike large roundworms and hookworms, there is no pulmonary migration with whipworms. A large number of whipworms in the colon can lead to loss of appetite, chronic diarrhea, and dysentery. Exceptionally large whipworm loads, especially in children, may lead to weight loss, bleeding from the intestine, iron-deficiency anemia, and stunted growth.

Recent research suggests that another helminth also could have been present on southern slave plantations. Hotez and Wilkins (2009) argue that infection with the roundworms Toxocara canis and Toxocara cati is the most common intestinal infection among Americans today; the infections are especially prevalent in poor rural and urban areas in the American South and among African-Americans. As with other parasitic worms, T. canis and T. cati are spread when humans inadvertently ingest fecally contaminated soils, but in this case the transmission is zoonotic; it is the feces of dogs and cats, respectively, that contain Toxocara eggs and larvae. The most common clinical features of infection are wheezing, pulmonary infiltrates, and eosinophilia (a blood abnormality commonly associated with helminthic infections). As these features also are characteristic of childhood asthma, it has been hypothesized that Toxocara infection and childhood asthma are linked; Hotez and Wilkins (2009) note that the two have been linked in a handful of studies. Toxocara infection also has been linked with mental retardation and developmental delays. Given conditions on southern plantations and the prevalence of dogs in the antebellum South, it seems likely that, at the least, T. canis infection could have been common among the resident plantation population.

We emphasize that the slave quarters of nineteenth-century America were highly suited for the transmission of parasitic diseases, and that the increased density of the biomass provided ecological niches for pathogenic infestation. Microbes are ubiquitous; so the appropriate question is not “where are there microbes,” but “where are the places from which microbes are absent?” Everywhere there are humans there are microbes, and indeed we harbor multitudes of them inside our bodies; bodily functions of all animals release microbes that may become pathogenic outside of their normal environment. The interactions of diseases within the human body can create negative synergies that may exacerbate the consequences of disease. For example, individuals afflicted with pellagra and hookworm are much sicker than if they had just one or the other (Kunitz 1988; Martin and Humphreys 2006). The same is true of individuals infected with cholera and hookworm (Harris, Podolsky, Bhuiyan, et al. 2009). Moreover, having one disease often makes treatment and recovery from another disease more problematic.

When we consider the health effects of slave plantations, the interactions among diseases magnify the negative consequences of the plantation system. Slavery exacerbated the disease situation; low living standards, poor sanitation, and dense human and animal populations made the slave quarters a haven for many pathogens that adversely affected human health. While we concentrate on the effects of these pathogens on the enslaved population, the pathogens were opportunistic, afflicting all to a greater or lesser degree. Large slave plantations served as disease reservoirs that aided the survival of pathogens and assisted in their transmission from the plantation to the wider world. Within the plantation South, slavery and diseases interacted and had profound effects.

We illustrate in figure 6.1 how we conceive of the biohazards that surrounded the inhabitants of the slave quarters on large plantations; the arrows in the figure indicate the direction of causality. Starting at the top, large-scale slave plantations led to (1) increased rural human density, (2) poorly constructed and dense slave housing, (3) slave infants and young children concentrated in nurseries, and (4) a general increase in the biomass available for pathogens. Slave children who had not yet learned to control their bowel functions defecated around the slave quarters. Coprophagous animals (dogs, chickens, and swine) further disseminated the fecal biohazards. These conditions produced a plantation-specific disease ecology conducive to the establishment and propagation of pathogens arising from fecal contamination of soil and water, zoonoses from the farm animals, food-borne diseases, parasites, and a host of other conditions associated with poor living standards, inadequate sewage and water systems, and animals and humans confined to a small area. Given these conditions, plantation water supplies would have been frequently contaminated by runoff and poor sanitation. All of this led to a deleterious plantation disease environment. These disease conditions would have affected slave infants and young children more severely than older children and adults because their immune systems had not yet been exposed to the infections common on the plantations; consequently, they had no ready supply of antibodies to combat these infections.

Figure 6.1

Interactions of antebellum agriculture and slave health

The results of the plantation disease environment and infections were (1) low birth weight babies and small, stunted children, (2) high neonatal and infant mortality, and (3) the effects of severe protein deficiency—hypoalbuminemia (inadequate protein) and kwashiorkor in infants and young children afflicted with the pathogens abundant in and about the slave quarters.

Relative to living conditions in the twenty-first century, slave living conditions were abysmal. Still, this is a book about history, and relative to early nineteenth-century conditions, slave living conditions were better than some and worse than others.14 A point we wish to make here is that the living conditions of slaves likely were improving throughout the nineteenth century’s slave era. Improvements would have been due to (1) an amelioration of the disease environment associated with the end of the international slave trade, (2) an increase in knowledge about providing medical care, sewage disposal, and water on large plantations, (3) an increase in slave “income” due to increasing productivity, and (4) rational behavior as the price of slaves increased substantially with increasing productivity and the end of slave imports. So, while conditions were not up to standards that anyone reading this book would wish to experience, they were improving.

The two diseases that we discuss in detail below, hookworm and malaria, were initially brought to the American South as a by-product of the Old World migrations. There is speculation that hookworm and some types of malaria had appeared in other locations by the time of the Voyages of Discovery, but generally they are considered to be of tropical African origin. These infectious parasitic diseases can place severe burdens on their victims. In the South, the diseases affected the white and black populations differently and at different stages of their life cycles.

Before the Civil War, many of the enslaved blacks were concentrated on plantations; infants and young children were placed in plantation crèches (“nurseries” or “day care centers”) under the care of older children, pregnant women, and elderly slaves. The combination of slavery and large-scale plantation agriculture created pockets of relatively dense rural populations that allowed the maintenance and spread of malaria. Malaria during pregnancy, infancy, and childhood adversely affected younger slaves, who, while they might have had some innate immunity, had no acquired immunities. (More discussion of the two types of immunities appears later in the chapter.) The environment of slave crèches also was conducive to the spread of hookworm, as was the practice of allowing children to go barefoot. Hookworm placed stresses on slave children, reduced the nutritional value of a given diet, and reduced appetites. Moderate to severe hookworm infection would lead to low slave birth weights, short stature, extraordinarily small slave children, and high neonatal and infant mortality. These symptoms, which the anthropometric literature attributes to inadequate diets, also are consistent with the plantation South’s disease environment.

White infants and children were much less likely to be raised in conditions that were as conducive to the breeding and transmittal of disease. White children typically harbored fewer of the pathogens common to the plantation than slave infants and children because they were not subject to the disease environment that existed in the slave quarters 24 hours per day. The diseases also affected black and white adults differently. When slaves entered the adult workforce, they were taken from the disease breeding grounds of the slave quarters and “day care centers” and sent into the relatively (for blacks) healthy fields, while whites who went into the fields found a disease environment that was typically worse than that of their childhood. The antebellum agricultural labor force of the South was thus composed of two populations, white and black, both diseased, with the adult white population the more sickly wherever the disease environment supported pathogens that flourished in warm weather.

Plantation slavery allowed reservoirs of parasitic diseases to exist. Human hookworm and malaria require human hosts to complete their life cycles. Without a critical mass of both carriers and potentially infected humans, these diseases could not have persisted. Slave plantations concentrated human beings into close proximity with one another; the increased density on plantations ensured the diseases’ survival and transmission. Endemic to the plantation South, these diseases spread to nonplantation populations.15

Disparate Impact of Diseases on Ethnic Groups

Repeating a major theme of this book, diseases do not have the same impact on people of different ethnic (ancestral) backgrounds; there are two reasons for this: acquired and innate immunities. Acquired immunities are obtained when an individual contracts the disease. Individuals who recover have immune systems that are primed to resist subsequent re-infection by the same pathogen. In an environment where a pathogen is endemic, acquired immunities may make the disease a childhood infection, with few adults contracting it. Some common diseases (yellow fever, mumps, measles, and chickenpox) appear to affect children much less severely than adults. But appearances may be deceiving, and in fact these diseases may not affect children less severely. The apparent resistance of children to some “childhood” diseases may arise because, in an environment where the diseases are endemic (they are a common feature of life, so adults born in the area typically were exposed as children), there are disease-free adults available to care for sick children. If both adults and children are stricken, as in epidemics, many children perish simply for lack of adequate care. For example, diarrhea causes dehydration, and the effects of dehydration are a function of body mass; because of their small size, infants can easily die from a loss of fluids in less than 24 hours.

Innate resistance is the product of evolutionary selection. Both pathogens and humans are native to specific geographic environments. Human populations become accommodated to the disease environments in which they reside generation after generation. Depending on a variety of factors (virulence, nonhuman hosts, and the method of transmission), pathogens can evolve into less deadly forms (Ewald 1994). It is widely thought that syphilis is less virulent now than in early sixteenth-century Europe in part because its more deadly forms were culled from the bacterial population (McNeill 1976). The genetic characteristics that make an individual more resistant to the onslaught of pathogens will spread among members of the population as the individuals with those characteristics survive longer and have more surviving and successfully reproducing children. Genetic characteristics are innate. They are transmitted by parents to the fertilized egg and are in the individual at birth; they are not acquired after exposure to a pathogen. People who do not possess genetic traits that convey resistance to disease pathogens are at a disadvantage in environments that harbor the pathogens.

A single gene, as noted, typically has more than one phenotypic effect; these effects may have positive, negative, or no impact on reproductive success. Genes react with one another and with the external environment to shape phenotypic effects. Genes that confer resistance to pathogens that abound in a particular disease ecology and whose net effect on reproduction is positive (they have no offsetting phenotypic effects that in toto reduce reproductive success) are pro-adaptive; they will spread through the population. Conversely, in environments that do not harbor diseases to which the specific genes confer resistance, the phenotypic effects will typically prove maladaptive; populations that have these genes will have less reproductive success than populations that do not possess them. The sickle cell gene is illustrative of this process; where malaria is highly endemic, populations that carry the sickle cell trait have more reproductive success than populations without it because the trait confers substantial resistance to malaria, and malaria places heavy burdens of increased morbidity and mortality on nonresistant populations. In environments that are not malarious, populations that have the sickle cell trait have less reproductive success because when both parents have the trait, on average, one-quarter of their children will have sickle cell disease and die young. In a malarious environment, these deaths are less burdensome than the toll that malaria extracts from a nonresistant population. This illustrates that the relative frequency of genes in a population is subject to change over time; these changes depend on, among other things, the disease ecology, evolution, and human responses to the prevalence of diseases.
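A minimal sketch of the Mendelian arithmetic behind the "one-quarter" figure: if both parents carry one sickle cell allele (genotype AS), the expected offspring genotype frequencies follow from equally likely allele transmission. This is a textbook calculation included only for illustration, not data from the chapter.

```python
# Expected offspring genotypes when both parents are sickle cell carriers (AS x AS).
from collections import Counter
from itertools import product

parent_1 = ["A", "S"]   # each parent passes one allele with probability 1/2
parent_2 = ["A", "S"]

offspring = Counter("".join(sorted(pair)) for pair in product(parent_1, parent_2))
total = sum(offspring.values())

for genotype, count in sorted(offspring.items()):
    # AA: 0.25 (unaffected), AS: 0.50 (carriers), SS: 0.25 (sickle cell disease)
    print(f"{genotype}: {count / total:.2f}")
```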

Slaves in the antebellum South were predominantly, but not entirely, of tropical West African heritage. Africans from other regions were sometimes brought to the North American continent; there also were subsequent unions between people of tropical West African ancestry and individuals of other ancestry. As evidence of these unions, Wahl (1996, p. 21) relates a vignette concerning a “blonde, blue-eyed, light-skinned slave” in the pre–Civil War South, obviously a slave whose American ancestry was more “European” than “African” owing to sexual unions between African slaves and whites in the New World. Free labor in the antebellum South was predominantly of northwestern European ancestry. Even accounting for unions of people of different ancestral heritages, any innate resistance that tropical West Africans had to specific pathogens would have been overrepresented in the enslaved population relative to other Americans. There is ample evidence that people of tropical West African ancestry (blacks) are, and were, more resistant to infection by hookworm and malaria than are people of northwestern European ancestry (whites).

Hookworm

The Disparate Impact of Hookworm

Hookworm disease is caused by an infection of parasitic nematodes (superfamily Strongyloidea). The two major hookworms that afflict humans are Ancylostoma duodenale and Necator americanus. While the two species differ in their habitat, mode of transmission, and other significant details, their effects on humans are similar (see Breeden 1988; Ettling 1981; Savitt and Young 1988).

As noted in chapter 5, hookworm was not identified as a major problem in the United States until 1902 when stool samples drawn from the southern population that September were microscopically examined by Charles Wardell Stiles; the samples indicated that hookworm was endemic to the South. The presence of endemic hookworm among the southern population was then given wide publicity in a paper presented by Stiles at a scientific conference in December 1902. But it appears to have taken several years for medical authorities to accept the fact of endemic hookworm (Chernow 2004, pp. 487–88). Compounding the error of not recognizing the disease, the type of hookworm infecting southern Americans was identified as native to America and named Necator americanus (American killer). The scientific appellation remained even after it was discovered that Necator americanus was endemic to tropical West Africa, parts of the Middle East and southern Europe, and the Indian subcontinent.

In reaction to Stiles’s work, “[n]o less than eight investigators surveyed all or part of the southern lowlands. . . . In every instance, the results were the same; each researcher uncovered numerous cases of the disease and provided concrete evidence that hookworm was indeed a condition from which many in the South suffered” (Marcus 1988, p. 91). The Rockefeller Sanitary Commission for the Eradication of Hookworm Disease, organized in late 1909 and lasting until March 1914, estimated that more than 43 percent of all southerners were infected with hookworm in the early 1900s, or 7.5 million people out of a population of approximately 17.5 million. Brinkley (1994, p. 84) similarly estimates that 39 percent of the southern labor force was infected in 1909.

But the impact of hookworm is not the same for people of different ancestry. Clear empirical regularities show that, relative to other ethnic groups, descendants of tropical West Africans have less infection in the same hookworm environment (Chandler 1929; Dock and Bass 1910; Smillie and Augustine 1925). Descendants of West Africans also appear to tolerate a given parasite load better than descendants of people from northwestern Europe (Weisbrod, Andreano, Baldwin, et al. 1973, p. 76). The scientific basis of this relative resistance of people of West African ancestry is not known, though evidence has emerged suggesting that genetic factors may partially account for human susceptibility to hookworm infection (Williams-Blangero, Blangero, and Bradley 1997). The apparent reason for the lack of a scientific explanation is that the medical and scientific communities in the United States and other developed countries devoted relatively few resources to the study of hookworm and other parasitic diseases from the end of World War II until recently. The seeming eradication of hookworm disease from the developed world by World War II is probably the reason so few resources were engaged in its study. Worldwide, however, estimates indicate that over one billion people are still afflicted with hookworm (Hotez and Pritchard 1995, p. 68; Schad 1991, p. 179).

Early evidence indicating ethnic differences in hookworm disease in the United States comes from the examination and treatment of infections among southern men enlisted in the United States Army during the early twentieth century. Knowlton (1919) found that soldiers classified as white from the Carolinas and Florida had about four times the worm burden of soldiers classified as colored (black) from the same states. The study, conducted at Camp Jackson, South Carolina, found 69 cases of hookworm infection among the white soldiers and only 18 cases among the colored soldiers; the average worm count for the white soldiers was more than four times that for the colored soldiers, 155.3 worms versus 38.3 worms, respectively (Knowlton 1919, pp. 701, 703). In other early twentieth-century United States Army hookworm studies, Frick (1919), Kofoid and Tucker (1921), Lucke (1919), and Siler and Cole (1917) all reported similar findings for colored and white southern soldiers, with the white soldiers having substantially more hookworm infection and higher worm counts in each study.

In a study of southern civilians, Smillie and Augustine (1925) showed that children classified as white had substantially more hookworm than those classified as colored (black). Table 6.1 reports their data on Alabama children aged six to sixteen. Smillie and Augustine's (1925) summary of their own findings is: “Our results clearly show that when the two races are living under almost identical conditions of sanitation, economic status, occupation, soil, temperature, etc., the whites may have a heavy infestation whereas the negroes have a very light infestation” (p. 1962).

Table 6.1

Prevalence and intensity of hookworm infection by ethnicity in Alabama and Covington County, Alabama, 1922

Source: Smillie and Augustine (1925).

Note: The number in parentheses (n) is the number of children examined.

The data in table 6.1 are for 1922. They should not be confused with nineteenth-century experiences. In 1860, the majority of African-Americans living in the rural South were grouped together on plantations of twenty or more slaves; in the lower South, more than 60 percent of all slaves lived on such plantations (Gray 1958, vol. 1, p. 530), with infants and young children likely kept together in crèches, resulting in greater exposure to hookworms. After the Civil War, African-Americans living in the rural South were apt to live in cabins separated from their neighbors by the surrounding fields. The decreased density of African-Americans resident in the rural South after the Civil War would have reduced their exposure to hookworm disease.

In contrast, there is evidence of severe consequences of hookworm exposure among the more susceptible northern whites resident in the South during the Civil War years. Northern Union soldiers imprisoned at the notorious Confederate Military Prison at Andersonville, Georgia, died in 1864 at extreme rates, even by Andersonville standards. Cross (2003) attributes much of this extreme mortality to intensely endemic hookworm at Andersonville in the summer of 1864. In one case, 224 (or 59 percent) of 379 northern whites from the Vermont Brigade of the Army of the Potomac, who were captured June 23, 1864, and imprisoned at Andersonville, died during captivity or subsequently as a direct result of their prison experience. In another case, 34 (or 69 percent) of 49 northern whites from Company A, 5th Rhode Island Heavy Artillery, who were captured May 5, 1864, died at Andersonville or afterward in southern captivity elsewhere.

The differences in hookworm infection between the black and white Alabama children are depicted in figure 6.2, which illustrates the means of the data in table 6.1. The data reveal some interesting anomalies. In rural areas, white children had vastly more hookworm than colored children; yet in urban areas colored children had more hookworm than in rural areas, and in Covington County they had even more hookworm than white children. Our explanation for the difference in the ethnic disparity between rural and urban areas is that in urban areas people of African ancestry were poorer and deprived of public services relative to their counterparts of European ancestry. African-American children residing in tightly packed living quarters, with poor sanitation and probably going barefoot much of the year, were subject to constant hookworm infestation. People of tropical West African ancestry are relatively resistant to hookworm infestation, but in an environment where hookworm is pervasive, they too will be infested with the parasites.

Figure 6.2. Mean hookworm burden for blacks and whites in Alabama and Covington County, Alabama, 1922. Source: Smillie and Augustine (1925).

Nevertheless, urban white children had living conditions less amenable to the transmission of hookworm than those of their rural counterparts. With relatively good sanitation and shoes, they were spared the hookworm burden that afflicted whites in the countryside and blacks in the cities. The relative differences between white and black children with respect to the impact of hookworm in urban environments can be attributed to economic conditions and to discrimination in the provision of sanitation, roads, and other public services to nonwhite urban areas.

The Rockefeller Foundation–sponsored International Health Board (IHB) conducted a series of hookworm inspections from 1920 to 1923 to determine the continuing presence of hookworm in the American South. Unlike the earlier Rockefeller-sponsored hookworm inspections of the 1910s, those conducted in the 1920s reported hookworm infection among individuals by “race” for selected counties in eleven southern states. Surviving summary statistics for these county inspections indicate that the white infection rate was greater than the black rate in 57 of the 60 counties surveyed (“Resurveys, Southern States” 1920–23).

Elsewhere, we examined the data for 542 black and white residents of Marion County, South Carolina, in 1922, one of the handful of counties for which the IHB raw data survive (Coelho and McGuire 2006). The study indicates a large, statistically significant difference in hookworm infection between African-Americans (blacks) and European-Americans (whites) in Marion County in 1922. Controlling for other demographic factors, we estimate that an otherwise average white was 2.8 times as likely to be infected with hookworm as an otherwise average black. The estimated probability of testing positive for hookworm is 56.1 percent for an average white and only 20.3 percent for an average black, an incremental effect of “race” of 35.8 percentage points. Consistent with the results for Marion County, Martin (1972) reports that in a survey of residents of rural southeastern Georgia in mid-1969, whites had a hookworm infection rate of 16 percent while blacks had an 8 percent infection rate.
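The two ways of expressing the Marion County gap are related by simple arithmetic:

$$\frac{0.561}{0.203} \approx 2.8 \quad \text{(relative likelihood)}, \qquad 0.561 - 0.203 = 0.358 \quad \text{(an incremental effect of 35.8 percentage points)}.$$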

The Spread of Hookworm

Hookworms attach themselves to the human intestine and obtain nourishment by consuming their host’s blood and other nutrients. The hookworm arrives in the small intestine by a circuitous route. Hookworm larvae in the soil penetrate the skin of their host, frequently through the feet of people walking barefoot, or through the hands and arms of people in contact with the ground, such as farmers and miners. Ancylostoma duodenale also can be transmitted by nursing mothers to their children; this route of infection is not available to Necator americanus.16 The host’s body reacts to hookworm larvae by “causing dermatitis labeled ‘ground itch’ or ‘dew poison’ in the southern United States” (Ettling 1993, p. 784). The larvae make their way through the bloodstream into the lungs. The worms are then coughed up, and if the mucus containing them is swallowed, the hookworms have their route to the small intestine. Once attached, hookworms live one to five years, grow, and mate (hookworms reproduce sexually). The hookworm eggs produced by the female are passed with the fecal matter of their host.

Once attached and fertilized inside the human gut, the female hookworm commences her egg production. The female hookworm is a prodigious egg layer: “Frequently cited estimates of egg output range from 9,000 to 25,000 eggs/female per day” (Schad 1991, p. 33). Wherever its human host (victim) goes, hookworms go, and when the human defecates, large numbers of eggs are deposited along with the feces. The deposited eggs produce larvae that molt twice before they can infect another host. Depending on circumstances, hookworm larvae can live for a few days to several months before finding a host; under laboratory conditions, with constant temperature and humidity, some larvae are remarkably long-lived, surviving more than two months (Smith 1990, pp. 97–99). The optimal conditions for long-term larval survival are protection from direct sunlight, a moist (but not liquid) environment, and a temperature between 15°C and 35°C (59°F to 95°F).
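A rough calculation conveys the scale of contamination a single worm could produce (the figures are illustrative, taking a mid-range output of 10,000 eggs per day and the one- to five-year attachment noted above):

$$10{,}000 \ \text{eggs/day} \times 365 \ \text{days/year} \times 2 \ \text{years} \approx 7.3 \ \text{million eggs}$$

deposited into the environment by one female over a two-year attachment.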

The life cycle of the hookworm is fairly straightforward (see Chandler 1929, pp. 91–98). The hookworm eggs deposited in the feces typically consist of four cells and are not infectious. The eggs mature into an embryonic stage (under favorable conditions this takes over 48 hours), and then into a larval stage (approximately another 48 hours). Before the larvae can become infectious, they must mature further, typically an additional four to five days. Consequently, under normal conditions, from the time the eggs are passed in the feces to the time hookworm larvae are infective is about eight to nine days (two days in the egg stage, two in the initial embryonic stage, and four to five days in the second larval stage).

Given the primitive state of sanitation in the pre-modern rural American South, hookworm was common. Sanitary privies were rare in the rural South prior to the twentieth century; toilet facilities in rural areas were frequently just secluded bushes (Dock and Bass 1910, pp. 86–93). Free-roaming coprophagous animals (pigs, chickens, ducks, dogs, and some wildlife) could eat feces and pass viable hookworm eggs out in their own feces. “[In the typical county privy] [t]he animals root and scratch the feces, and then scatter them about over the ground. Not only this, but they often eat feces. If the feces eaten by such animals contain hookworm ova, they pass undigested and are thus widely distributed” (Dock and Bass 1910, pp. 86–87). Once established, hookworm became permanent in the rural South because of (1) a climate favorable to hookworm throughout much of the year, (2) the lack of sanitary privies, and (3) the non-use of shoes in much of the rural South in warm weather. The African slave trade introduced hookworm into North America; it spread wherever infected people went. Given the hookworm’s life expectancy of one to five years attached to the human intestine and the female’s fecundity (laying up to 25,000 eggs per day), the result was endemic hookworm throughout much of the antebellum South.

The Health Effects of Hookworm

Hookworm disease causes a number of medical problems. Depending on the number of worms attached and sucking, the signs of disease that are commonly associated with hookworm are anorexia, abdominal pain, nausea, headache, rash, weakness, fever, vomiting, diarrhea, dysentery, and intestinal bleeding. Figure 6.3 reproduces Crompton and Stephenson’s (1990, p. 235) outline of the symptoms, effects, and outcomes of hookworm disease in human populations. This should not be interpreted as a listing of predetermined outcomes; the clinical impact of hookworm disease depends on how many hookworms are attached, the nutritional and health status of its victims, and the host's ability to fight off hookworm. Age, living conditions, climate, cultural practices, and other diseases affect the rate of hookworm infection and its impact.

Figure 6.3. Effects and consequences of hookworm disease in human populations. Source: Crompton and Stephenson (1990, p. 235), reprinted with permission.

It is instructive to observe that hookworm disease can cause anorexia, diarrhea, and vomiting, which contribute further to the malnutrition of its victims. “[A]norexia caused by hookworm infections in children may be an important cause of loss of weight and failure to thrive. All of these early manifestations of hookworm disease can cause a great deal of morbidity but with no anemia” (Nelson 1990, p. 420). Because of children’s smaller size, the same number of hookworms is more deleterious to their health than to the health of adults. Hookworm symptoms can also contribute significantly to anemia and hypoalbuminemia.

Hypoalbuminemia is common [among hookworm victims]. As a direct consequence of infection it is probably explained by the fact that the worms abstract more fluid than the measurement of blood loss indicates, in other words, that they cause a true protein-losing enteropathy. The degree to which the patient becomes hypoalbuminemic depends, on the one hand, on the volume of fluid lost and the concentration of albumin which it contains, and on the other, on the capacity to increase the rate of synthesis in compensation. . . . The small albumin pool and maximal synthetic capacity of children may explain why hypoalbuminemia may dominate the clinical picture in infants (Migasena and Gilles 1991, p. 186).

Hookworm reduces the net nutrition available to its host for a number of reasons. The hookworm-infected human has a smaller intake of nutrients because of the anorexia (loss of appetite) caused by the disease. The nutritional value of the food that the host does manage to ingest is further reduced by the diarrhea and vomiting associated with hookworm disease. Finally, hookworms reduce the net nutrition available to the host through the red blood cells and other fluids that the worms consume.

The deleterious effects of hookworm disease on physiological development and growth were documented at least a century ago. Dock and Bass (1910, pp. 115–17) estimated that adult men in the American South in 1910 who were thought to have had long-term hookworm infections weighed 8.25 pounds less and were 2.33 inches shorter than men not infected with hookworm. Strong (1916) and Kelley (1917) likewise reported negative growth effects of hookworm infection on southern children in the early twentieth century. Smillie and Augustine (1926, tables 1 and 3, charts 1–2, pp. 154–59) determined that southern white adolescents (14- to 15-year-olds and 16- to 17-year-olds) who were found to have heavy hookworm infections (more than 500 worms) weighed about 9 to 14 pounds less and were about 2 inches shorter than similar adolescents with no hookworms.

The negative effects of hookworm disease on human health and physical development have been well documented for contemporary populations as well. In the last thirty years, numerous studies have investigated the interrelationships among various parasitic diseases in highly endemic environments and their impact on the development, physical fitness, and appetite of children.17 The major focus of these studies is to measure the influence of different treatments on improving the physical development and health of children; the studies provide valuable evidence on the influence of a reduction in the prevalence and intensity of hookworm, roundworm, and whipworm (the three most common intestinal worm infections), as well as other diseases. Bundy, Kremer, Bleakley, et al. (2009) recently concluded that such deworming programs provide economic benefits that greatly exceed their costs.

Separating the impact of individual diseases can be difficult because in highly endemic areas people are often afflicted with multiple helminthic infections, as well as malaria and schistosomiasis. Consequently, when any one parasitic burden is decreased, other infections may continue, and re-infection is an additional problem. Despite these problems of disentanglement, a lengthy series of studies of Kenyan school children indicated significant improvements in the children’s growth and health.18 In one study of primary school boys in Kenya, the findings indicate that after treatment for several intestinal worm infections there was significant improvement in physical fitness, growth, and appetite (Stephenson, Latham, Adams, et al. 1993b). The improvements were measurable despite persistent exposure to re-infection and a less than complete cure after treatment. Four months after treatment, hookworm prevalence had fallen significantly from 98 to 44 percent, roundworm prevalence had fallen from 98 to 85 percent (not statistically significant), and whipworm prevalence had fallen significantly from 41 to 18 percent; there also were significant reductions in the intensity of all three helminths as measured by fecal egg counts. The treated Kenyan schoolboys gained on average 167 percent more in weight and 43 percent more in height than an untreated control group. There were significant improvements in the aerobic capacity of the treated boys as well; fitness scores, resting heart rates, and heart rates after exercise all improved. (The untreated group showed no improvement in the fitness test and no significant changes in the two heart rate measurements.) The reduction in intestinal worms also affected the potential nutritional intake of the schoolboys. “The most important new finding . . . is the significant improvement in perceived and measured appetite” for the treated boys, and “in the prevention and control of malnutrition and its functional sequelae, it is often just as important to improve children’s desire to eat” as it is to grow or buy more food (Stephenson, Latham, Adams, et al. 1993b, p. 1044).

In a study of iron-deficiency anemia among African children, an analysis of several thousand Zanzibari school children showed that infections with malaria, hookworms, and other tropical worms “were all associated with worse iron status; the association with hookworms was strongest by far” (Stoltzfus, Chwaya, Tielsch, et al. 1997, p. 153). The study indicates that the intensity of hookworm infection was the strongest variable in explaining anemia, accounting for 25 percent of all anemia cases, 35 percent of iron-deficiency anemia, and 73 percent of severe anemia in the Zanzibari school children.

In another study, designed specifically to determine the magnitude of stunting caused by hookworm disease, Foo (1990) collected anthropometric and parasitological data in Malaysia in the 1980s for Indian and Malay school children. The data show significant differences in anthropometric measurements between children moderately infected with hookworm and those not infected, holding other factors the same. Infected 7-year-old Indian children were on average 2.4 centimeters (0.95 inch) shorter than those without hookworm; stunting among the 7- to 9-year-old Malay children was much greater; those with hookworm were on average 3.8 centimeters (1.5 inches) shorter than those not infected. Both groups of hookworm-infected children also were lighter and had lower hemoglobin values than those not infected. Foo’s (1990) findings also indicate that the anthropometric and hemoglobin deficits were both significantly related to the intensity of hookworm infection.

The deleterious cognitive effects of hookworm disease in infants and children also have been documented in several different populations. Hookworm disease, with its associated anemia and malnutrition, along with infection by the other helminths, Ascaris (large roundworm) and Trichuris (whipworm), has been a major factor in the reduced cognitive development of school-age children and children under five years old, primarily in the southern United States in the early twentieth century and in tropical Africa and South Central Asia in more recent times. Research suggests that an entire childhood of hookworm disease and its associated malnutrition can have permanent effects on the mental capabilities of children. Hookworm infestation among infants and developing children can be particularly devastating cognitively because the growth process, and the growing brain in particular, demands substantial nutrients; infestation both reduces the nutrition available and diverts some of it to the immune system that is combating the parasites. Repeating a point made earlier: “From an energetic standpoint, a developing human will have difficulty building a brain and fighting off infectious diseases at the same time, as both are very metabolically costly tasks” (Eppig, Fincher, and Thornhill 2010, p. 1).19

The Health Effects of Hookworm in the Antebellum American South

Clearly, hookworm was endemic in the American South in the years before 1902, when it was first recognized as widely endemic. While the first known case of hookworm in the United States was not identified until 1893, and only nine cases in total were known before the disease was recognized to be endemic (Marcus 1988, pp. 81, 89), we do know that parasitic diseases do not take hold and become highly endemic overnight. Because hookworm’s effects are so gradual, the disease is not normally recognized from its symptoms. Nevertheless, there is evidence suggesting that hookworm was prevalent in the South before it was identified as endemic.

In the antebellum American South, plantations concentrated slaves into relatively close quarters with infants and young children even more concentrated in crèches that facilitated the spread of disease. The concentration of young children continues to contribute to disease to this day: “The rate of diarrhea for non–toilet-trained infants in day care centers in urban areas of the United States [in the 1990s] is comparable to the rate of illnesses seen in Third World countries” (DuPont 1993, p. 676). Young slave children, diaperless and wandering around barefoot, wearing only shirts, defecated randomly. Diarrhea made the situation even more favorable to the transmission of hookworm. Hookworm transmission typically occurred during the warmer months of the year and caused increased morbidity and mortality in slave children throughout the year.

The environment of endemic hookworm affected the health and net nutrition of slaves. Steckel (1992) reports that slaves were small relative to twentieth-century norms. In fact, “At age 1 the slaves were nearly Lilliputians” (Steckel 1986a, p. 177). Steckel (1992) attributes the remarkably small stature of slave children to the diet of slaves during infancy and childhood, which he believes was severely protein deficient because it was not profitable to feed slave babies and young children a nutritious diet. Aside from any moral qualms slaveowners may have had, for Steckel’s view to be plausible and make economic sense a number of necessary conditions would have to hold. First, the differential cost between an adequate diet and an inadequate one would have to have been substantial. Second, real interest rates would have to have been such that the present value of current savings on food exceeded any losses that inadequate diets caused in slave prices, productivity, and the future stock of slaves. (It is worth noting that Steckel acknowledges that taller slaves were worth more than shorter slaves; accordingly, a rational slaveowner would consider the decline in the market value of slaves, approximately “1.5 percent per inch of height” (Steckel 1999, p. 48), in any decision to reduce expenditures on food.) Third, any declines in the present value of workforce productivity caused by slaves caring for sick children, or mourning over dead infants and young children, would have to have been modest. Fourth, little or no food would have been transferred from older slaves to infants and young children. Steckel (1992) reports that slaves underwent a growth spurt after they left the children’s quarters to work in the fields, attributing the spurt to the better diet that field workers had relative to children. But this suggests that slaves who were more than adequately fed (they were doing fieldwork and undergoing a growth spurt) did not share some of their abundance with their younger siblings or their own children, a behavioral trait among adolescent and adult slaves counter to that documented in all other populations in deprived situations. And, fifth, the morale of the plantation workforce would have to have been unaffected, or, if it was affected, the effect did not reduce the present value of labor productivity.

Even if all these conditions were met, the disease environment still would have extracted its toll. Slavery was a morally repugnant institution. Yet moral repugnance does not mean that slaveowners systematically deprived infants and young children of adequate diets. For a variety of reasons they could have starved infants and young children, but there is a simpler and more compelling explanation for low birth weights, stunted children, high neonatal and infant mortality, and a late-adolescent growth spurt. The explanation is disease; the catalog of diseases that afflicted the slave quarters is long and still incomplete. Endemic hookworm disease alone could have caused these phenomena; repeating a point, slave nurseries also were nurseries for hookworm infections. Regardless of the amount of food provided, severe hookworm disease would have had deleterious effects on the net nutrition of pregnant women, infants, and young children, resulting in (1) an increase in neonatal and infant mortality, (2) low birth weight babies, (3) small-stature children, and (4) the effects of severe protein deficiency (anemia, hypoalbuminemia, and kwashiorkor).

Consider pregnant slaves: if owners assigned them to tasks that were less physically demanding than fieldwork, those assigned to the care of infants and young children would have been at severe risk of hookworm disease. Pregnant slaves and their fetuses would have faced a worse environment in and around the slave quarters than in the fields; for the future newborns it may have been a worse environment even after accounting for the effects of maternal fieldwork on the fetus.20 The lack of bowel control among children in “day care,” and the shaded environment of slave nurseries, would increase the probability of hookworm infection in the pregnant women assigned to care for the children. If the women went barefoot, the risks of infection would be compounded. Pregnant women with hookworm disease give birth to smaller babies and nurse them less well than women who carry no hookworm burden (Brooker, Hotez, and Bundy 2008). Steckel’s (1992) estimates of stillbirths, neonatal and infant mortality, and birth weights, which are derived from the heights of slave children, are all consistent with hookworm disease, as is the evidence on stunted children. People of tropical West African ancestry are more resistant to hookworm infection than other peoples; however, we emphasize that this resistance is relative. People of African ancestry are still afflicted by hookworm, and hookworm disease in African children can have severe consequences.21

Steckel (2000; 2009) has questioned a claim of ours about the deleterious effects of hookworm on the heights of slave children. Using estimates of the average height of antebellum slave children, modern height standards, and contemporary estimates of the impact of hookworm on the heights of Indian and Malay children, we argued elsewhere (Coelho and McGuire 2000) that hookworm infection during infancy and early childhood may account for as much as 31 percent (3.12 centimeters, or 1.23 inches) of the implied 10.1-centimeter (3.98-inch) mean height deficit that slave children had relative to modern height standards. Steckel (2000) maintains that our estimate is overstated by a factor of ten, basing his conclusion in part on a modern intervention study that examines child growth over a given period of time after ridding children of a significant number of hookworms. We believe that Steckel (2000, 2009) ignores what amounts to the effects of a lifetime of exposure to hookworm during the crucial growth years for slave infants and young children when he claims that a temporally brief intervention study indicates that hookworm infection accounts for only a tiny proportion of slave children’s stunting. To approximate the amount of stunting due to hookworm in slave children, Steckel (2000) relies on the only intervention study (Stoltzfus, Albonico, Tielsch, et al. 1997), among many intervention studies, that indicates a small impact of treatment: one year after treatment for hookworms the heights of African school children improved only 0.3 centimeter (0.12 inch), one-tenth of our 3.12-centimeter estimate of stunting. We are aware of the advantages of intervention studies for estimating hookworm stunting over a given period, and we recognize that the Indian and Malay study upon which we relied raises the problem of controlling for confounding factors and itself notes the advantages of intervention studies (Foo 1990, p. 11). We still believe, however, that our 3.12-centimeter estimate is likely a lower bound for antebellum slave stunting, for a number of reasons.
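The magnitudes at issue in this exchange are easy to keep straight:

$$\frac{3.12 \ \text{cm}}{10.1 \ \text{cm}} \approx 0.31, \qquad \frac{0.3 \ \text{cm}}{3.12 \ \text{cm}} \approx 0.10,$$

that is, our estimate attributes roughly 31 percent of the implied height deficit to hookworm, while the single intervention study on which Steckel relies would, if read as a measure of lifetime stunting, imply only about one-tenth of that amount.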

First, as noted, many historical and modern studies of the physical effects of hookworm infection indicate sizable amounts of stunting.22 Second, antebellum slaves were sicklier than contemporary Indians and Malaysians (and most other contemporary populations); as a result, hookworm infection would have had more adverse effects on the sicklier slaves. Third, in our view, Steckel (2000) confused the measured effect of a deworming program on child growth with the effects of a lifetime of hookworm infection on child growth. He confused a one-year effect in one modern treatment study with the effect of years of infection on the heights of slave children. In intervention studies the researchers give some dosage of a therapeutic substance to a study group of children and a placebo (or no treatment) to another group; they measure the observed effects on growth sometime after the substance is expected to take effect, and then compare the growth of the treated children to that of the placebo (or no treatment) group. These “growth effects” are typically a centimeter or so. How large the effects are depends on the efficacy of the substance as an anthelmintic, the size of the dosage, the number of treatments, the duration of the study, and the rate of re-infection. If a study reports a “growth effect” of 0.3 centimeter, for example, over one year for a treated group, it means that one year after the study began the subjects given the treatment grew 0.3 centimeter more than the placebo (or no treatment) group. The 0.3 centimeter should not be interpreted as the total effect of hookworm on child growth over many years unless one believes that the effects on stature of a childhood of exposure to hookworm can be offset by one year’s “growth effect” after deworming. We know of no reputable expert in the field who makes that contention.23

We believe that Steckel (2000, 2009) confuses the “growth effect” in modern intervention studies with the antebellum “catch-up growth” that began in slave adolescence, continued during the teen years and early adulthood, and allowed maturing slaves to recover several inches of growth. The antebellum “catch-up growth” was the result of many years of living and working in a reduced hookworm environment compared to the worm-rich environment of infancy and early childhood; it was not the result of only a few months’ or a year’s growth because of having fewer worms.

When slave children were sent from the slave quarters to do farm work, they were freed from an environment highly conducive to the spread of hookworm (and other diseases). Slaves sent to work were typically given shoes, which inhibited hookworm infection. Going barefoot in the rural American South was common among children and lower-income people during warm-weather months, but going barefoot would have been less likely while engaged in physical labor. And those slaves who were sent to cotton fields were sent to a physical environment that was inhospitable to the survival of hookworm larvae (Chandler 1929, p. 186). Sawmill camps, by contrast, were hospitable to the transmission of hookworm. So whether the health of a particular slave child was enhanced by promotion to adult work depended on where he or she went; if sent to the cotton fields, it was enhanced. If hookworm were responsible for their small stature, children promoted to adult field work in early adolescence, or before, would have experienced a nutritionally induced growth spurt in subsequent years as the hookworm burden decreased. Because an attached hookworm has a life expectancy of one to five years in the human intestine, an individual with a heavy hookworm burden who is not re-infected will be cured over time as the worms age and die. Recall that, among other benefits, a reduction in the prevalence and intensity of hookworm produces a significant improvement in appetite.

Slaves wearing shoes and working in the fields would have had a much-reduced exposure to hookworm infection. Some adult practices, such as chewing tobacco and spitting, also inhibit hookworm infection; because hookworm gains entrance to the intestine when the human swallows the worm-bearing mucus coughed up from the lungs, constant spitting (as from tobacco chewing) reduces or eliminates this mode of infection (Dock and Bass 1910, p. 147). Older slaves also could have acquired immunities to hookworm because a childhood of constant exposure primed their immune systems to resist infections. Working, wearing shoes, adult lifestyles, and previous exposures all contributed to reducing the hookworm burden a field-working slave would have borne. Only when exposed to the children’s play area or to ground where coprophagous animals were roaming would a field slave have had a high risk of hookworm disease.24

In their cabins, if adult slaves took their shoes off, neither wooden nor earthen floors would allow the easy transmission of hookworm. Wood floors in slave cabins would be prophylactic, and dirt floors are not necessarily “dirty.” Vlach (1993, p. 165) states that “[a] packed earthen floor can, if properly maintained, become as hard and smooth as concrete.” Regardless of the type of floor in slave cabins, because hookworm eggs do not mature into infective larvae until eight to nine days after passing, it is highly unlikely that hookworm would have been transmitted from the floors of slave cabins. Adult slaves would not have been likely to leave feces lying on their cabin floors for eight to nine days.

Malaria

The Disparate Impact of Malaria

Malaria is an infectious disease caused by parasitic protozoa (genus Plasmodium). Four species of malaria protozoa cause the disease in humans: Plasmodium falciparum, Plasmodium malariae (quartan), Plasmodium vivax, and Plasmodium ovale, though only the first three are commonly found in humans. The most threatening to human life is P. falciparum. Before modern nomenclature was developed, P. falciparum was frequently referred to as “malignant malaria”; in contrast, the others were commonly called “benign malaria.” The use of the term “benign” may be questioned given that the case mortality rate for these types is above zero and they cause significant suffering and hardship.

People of tropical West African ancestry show a remarkable resistance to malarial infection. The genetic defenses against P. falciparum are the most effective ones; evolution works by selectively increasing the frequency of genes in a population that enhance the probability of having surviving offspring, and death halts reproduction. Plasmodium falciparum is the most deadly form of malaria, and it is endemic to tropical West Africa. The sickle cell trait is the best known of the innate defenses that evolved to combat P. falciparum; not surprisingly, the sickle cell trait is relatively abundant in tropical West Africa and among people of West African ancestry. Malaria is considered endemic if “over a succession of years there is a constant measurable incidence of cases due to natural local transmission. The term epidemic malaria is applied when the incidence of cases in an area rises rapidly and markedly above its seasonal level” (Sambasivan 1979, p. 281, emphasis in original). Malariologists classify endemicity into four categories; from the least to the most severe, they are hypoendemic, mesoendemic, hyperendemic, and holoendemic malaria. Much of tropical Africa is considered holoendemic.

Kiple and King (1981, p. 17) write that “so long as the possessor of this [sickle cell] trait is heterozygous (inherited from one parent only), he or she does not develop deadly sickle cell anemia.” When the individual is homozygous (the gene is inherited from both parents) for the sickle cell gene, the result is sickle cell anemia. The resistance that the heterozygous sickle cell trait confers against malaria offsets the evolutionary disadvantage of sickle cell anemia. Although sickle cell anemia does confer resistance to P. falciparum, many with sickle cell anemia die by adolescence (Kiple and King 1981, p. 17). When infected with P. falciparum, individuals with the sickle cell trait exhibit lower parasite counts than individuals without the trait, and are much less frequently beset with deadly complications such as blackwater fever or cerebral malaria (Kiple and King 1981, p. 18). The prevalence of the sickling trait among antebellum American blacks is not known with exactitude; still, based on prevalence rates among contemporary West African and African-American populations, medical authorities have estimated that at least 22 percent of the Africans first brought here possessed the trait (Savitt 1989, p. 331).

The sickle cell blood abnormality illustrates the cost–benefit analysis that evolution employs. People who are heterozygous for the sickle cell trait have substantial resistance to malaria and will do much better in an environment of endemic malaria than people without the trait. But people who are homozygous for the sickle cell gene develop sickle cell anemia and typically die before reaching maturity. The arithmetic is grim: parents who are both heterozygous for the sickle cell trait will, on average, have one-fourth of their children homozygous for sickle cell; these children will develop sickle cell anemia and are destined for an early death. The environment has to be highly malarious and deadly for a gene that confers resistance to malaria, yet dooms one-fourth of the children of carrier couples, to spread over a large segment of the gene pool. The toll it extracts is large; this is why the sickle cell trait is vanishingly rare in populations (and their descendants) that did not live in highly malarious environments.
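The one-fourth figure follows directly from Mendelian inheritance (a textbook calculation, not specific to any one source cited here): each heterozygous (AS) parent transmits the sickle allele S to a given child with probability 1/2, so for a child of two AS parents

$$P(SS) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}, \qquad P(AS) = \tfrac{1}{2}, \qquad P(AA) = \tfrac{1}{4}.$$

One-quarter of such couples’ children are expected to develop sickle cell anemia, another quarter inherit no protection against malaria, and only half inherit the protective heterozygous state.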

A second innate trait that populations of tropical West African ancestry have relative to others is “a blood enzyme deficiency bearing the somewhat forbidding name glucose-6-phosphate dehydrogenase deficiency . . . better known as G6PD deficiency . . . many variants of G6PD deficiency (over 100) have been discovered with more turning up all the time” (Kiple and King 1981, p. 19). Savitt (1989, p. 331) indicates that about 20 percent of West Africans possess G6PD deficiency, and estimates that between 30 and 40 percent of African slaves would have possessed either the sickle cell trait or G6PD deficiency. According to Kiple and King (1981, p. 19), “As with the sickling trait . . . [this] deficiency [glucose-6-phosphate dehydrogenase] is one . . . for which there is definitive evidence of stabilization by selection with geographic incidence pointing to malaria as the selective factor . . . The trait has been observed among Sephardic Jews, Greeks, Iranians, and other areas in which falciparum malaria is (or was) endemic.”

A third innate trait that populations of tropical West African ancestry possess is a deficiency of the Duffy blood group antigens, Fya and Fyb; the scientific evidence indicates that the Duffy antigens act as the receptors through which P. vivax enters red blood cells. Without the Duffy antigens, P. vivax parasites cannot invade human red blood cells (Allison 1961; Crutcher and Hoffman 1996). Savitt (1989, p. 330) maintains that among contemporary populations approximately 90 percent of West Africans and about 70 percent of African-Americans lack the Duffy antigens.

There also are acquired immunities to malaria, but because malaria protozoa reproduce sexually, the human immune system has difficulty combating malarial pathogens (Dunn 1993, pp. 858–59; Kiple and King 1981, pp. 12–23). Sexual reproduction changes the genetic markers that immune systems home in on when a foreign substance invades the body. As a result, each variety of malaria has a large number of strains that the human immune system may not immediately recognize as invaders, even if it has previously been infected by the same variety (but a different strain). An individual growing up in tropical West Africa was very likely to have had many exposures to different varieties and strains of malaria. Although these acquired immunities would not carry over to their New World descendants, the slave South did have malaria, and while it was not as abundant as in tropical West Africa, the existing forms and intensity, especially on plantations, would have conferred some acquired immunity on people who survived its initial onslaughts.

While malaria in Europe has been dated as far back as the mid-Pleistocene epoch and the disease was endemic to northwestern Europe until about the mid-nineteenth century, P. falciparum (the most frequently deadly form of malaria) was never endemic there. Because evolution works by culling the breeding population, the absence of the relatively deadly P. falciparum meant that malaria’s presence there did not confer on northwestern European populations the type of innate resistance that existed in West African populations. The sickling trait, G6PD deficiency, and Duffy blood antigen deficiency are, and were, much rarer in northwestern Europeans than among tropical West African populations. While good estimates of the prevalence of these traits among northwestern European populations for the nineteenth and earlier centuries are not available, examination of contemporary populations suggests that the historical prevalence among individuals of European ancestry was quite low relative to peoples of West African ancestry. Recent estimates of the prevalence of the sickle cell trait in the United States are about 250 per 100,000 for whites and about 6,500 to 7,000 per 100,000 for blacks; the prevalence of the sickling trait today among American whites is thus less than one-twentieth of the prevalence among American blacks (National Library of Medicine 1998). Additional evidence comes from Marks and Gross (1959), who show the differences in the prevalence of G6PD deficiency among twentieth-century blacks and whites in the United States.

Historical records suggest the reason for less innate resistance to malaria among northwestern Europeans is that northern Europe had a lower incidence of endemic malaria, and the types and strains of malaria were substantially less varied and deadly than in tropical West Africa (Bruce-Chwatt and de Zulueta 1980). The northern European climate was one factor affecting the incidence of malaria; malaria was not a year-round phenomenon there as it was, and is, in much of tropical West Africa. Ample historical epidemiological evidence supports the conclusion of less innate resistance among northwestern Europeans: Europeans involved in the early African slave trade suffered extraordinary mortality rates from malaria and other tropical diseases; their mortality rates in Africa during this period were on the order of fifteen to twenty times those of Africans (see chapter 5).

Malaria in northwestern Europe consisted primarily of P. vivax and, to a lesser extent, P. malariae. As noted, P. falciparum was not endemic there, nor was P. ovale a factor. Sizable prevalence of the sickle cell trait and G6PD deficiency in a population depends on people living and reproducing within an environment of intensely endemic (hyper- or holoendemic) and stable P. falciparum. Plasmodium falciparum is the most deadly of the malarial diseases; consequently, it has the greatest impact on evolutionary selection. A substantial proportion of a population being deficient in the Duffy blood group antigens is the result of people living and reproducing within an environment of intensely endemic and stable P. vivax (Bruce-Chwatt 1980). Epidemiologists classify malaria as stable in an area if transmission is intense, robust, and difficult to interrupt; stable malaria accordingly is thought of as continuous throughout the year. Malaria transmission that is endemic to an area but lacks these traits is referred to as unstable and intermittent. Malaria becomes less stable as endemicity declines.

With respect to any acquired immunities, Europeans were less likely to be protected by them. As Kiple and King (1981, p. 16) explain, “although immunity once acquired, [to vivax, the primary type of malaria protozoa found in northwestern Europe] renders the host more or less refractory to a reinoculation with the same parasite (homologous acquired immunity) all strains of the same species (and there are many) are not immunologically similar. Therefore, despite premunition (immunity to a specific strain), one remains liable to infection of other strains, particularly those of a more virulent [non-vivax] nature.”

The Spread of Malaria

The life cycle of malaria is more complex than that of hookworm and is intimately involved with its vector and hosts, anopheline mosquitoes (genus Anopheles) and humans (see Oaks, Mitchell, Pearson, and Carpenter 1991, pp. 25–30). The female anopheline mosquito, seeking a blood meal, bites a human whose blood harbors infectious malaria gametocytes. The gametocytes are both male and female, and they mate and reproduce in the mosquito’s gut. The newly produced protozoa undergo several stages inside the mosquito until they reach the insect’s salivary glands, and the sporozoites, the product of the protozoan sexual reproduction, are injected into another human. There the protozoa may remain dormant (as hypnozoites) or become active; when active, they find their way to the liver and undergo further transformations, emerging as schizonts that release merozoites into the bloodstream. The merozoites and the immune system’s reaction to them cause symptomatic malaria. Some of the merozoites in the blood differentiate sexually into male and female; these are the gametocytes that can reproduce only inside the mosquito’s gut. When another anopheline mosquito bites the infected person, the cycle begins again with the gametocytes being imbibed with the blood meal.

The larvae of a mosquito infected with malaria are free of malaria, so the malarial protozoa depend on human hosts for their survival. Because nonhuman animals do not contract most forms of human malaria, they cannot serve as malarial reservoirs. Humans must serve as the reservoir, with the exception of P. ovale, which is believed to have avian hosts in its African homeland. In places where there are killing frosts, most mosquitoes die over the winter; only their larvae (malaria free) survive. So a mass of infected humans must be available to start the cycle again. Humans maintain the cycle in two ways. One is by harboring dormant liver-stage protozoa that later become active (they can be dormant for years; this is termed relapse). The other is by an outbreak of gametocytes caused by surviving blood-stage protozoa; this is termed recrudescence. Plasmodium malariae and P. vivax can lie dormant within the human body, reappear, and cause the disease malaria even decades after the original attack. Wills (1996, p. 151) reports that a British veteran of the Asian campaigns of World War II who contracted malaria in the 1940s had it reappear starting in 1987.

While other types of malaria can persist in the human body for years, P. falciparum cannot; at most it survives 18 months (more typically 9 months). After these months, survivors of P. falciparum will have had the plasmodia purged from their bodies. Since the mosquito vector must imbibe the plasmodia from a malarious person in order to spread the disease, P. falciparum does not usually survive in areas where there are killing frosts. Plantation slavery allowed P. falciparum to persist by providing a large number of potential human carriers living in a malarious environment. The potential carriers were of tropical West African ancestry and relatively resistant to the disease; they could be infected, but the effects would be relatively mild, and in some cases they were asymptomatic carriers. These conditions increased the probability that there would be people infected with the falciparum version of malaria to host new generations of mosquito vectors. Large slave plantations provided a reservoir for P. falciparum in the cold-weather (mosquito-free) months, from which malaria could spread with the return of warmth and mosquitoes. Malaria (presumably P. falciparum) was a significant cause of death in both the North and the South before the Civil War. As a cause of death in the non-southern United States, malaria declined substantially after the war; we attribute a large part of this decline to the demise of the plantation system and its role as a malaria reservoir.

The spread of malaria in nineteenth-century America is not well documented. The biological and medical sciences did not identify the malaria protozoan until 1880, and even then its identification was not well accepted for a number of years. The mosquito vector was not identified until 1898, and the protozoan life cycle was not fully worked out until well into the twentieth century. Nevertheless, we can identify the symptoms of classic malaria (intermittent and recurring fever, chills, and low case mortality) throughout nineteenth-century America. Medical identification of malaria was certainly less accurate in the nineteenth century than in the twentieth. So when the American military reports that Fort Leavenworth, Kansas, had 629 cases of malaria per 1,000 men during the 1830s (Drake [1850, 1854] 1964, p. 707), the report may be biased downward or upward. A downward bias would be imparted if nineteenth-century medical practitioners incorrectly diagnosed malaria as typhus, typho-malarial fever, congestive fever, or some other idiosyncratic term; an upward bias would be imparted if practitioners instead misdiagnosed typhoid fever or typhus as malaria.

Malaria appears to have been widespread in the antebellum American South and Midwest, declining in both areas after the Civil War. Before the United States banned the importation of slaves in 1808, the importation of enslaved Africans brought with it many varieties of malaria protozoa, especially P. falciparum (Blanton 1930; Childs 1940; Curtin 1968; Rutman and Rutman 1976). In the nineteenth century, the growth of large-scale slave plantations in the South and the reduction in transportation costs and times facilitated the spread of malaria throughout the South and into the Midwest. On plantations with large numbers of slaves, the enslaved were housed in densely populated slave quarters. The availability of human hosts (slaves) who typically survived the disease enabled malaria to survive the winter months and spread when mosquitoes were once again abundant. The chances of surviving a bout of malaria were enhanced by the use of quinine, which was widely known and efficacious in combating malaria. The financial resources of large plantation owners would have ensured that the cost of quinine was not an obstacle to its use; the value of a slave was substantially greater than the cost of quinine.

Anopheline mosquitoes were the vectors and incubators of malaria; with the right conditions, mosquitoes and malaria flourished in nineteenth-century America. The “right” conditions for malaria are a climate and geography conducive to harboring many mosquitoes and a relatively dense population of infected and non-infected people. The ideal climate and geography would be consistently humid lowland areas, temperatures between 17°C and 32–34°C (63°F and 90–93°F), standing water (typically freshwater pools or marshes), and average monthly rainfall in the range of 80 millimeters (3.15 inches) to provide a source for the standing water and humidity. Many mosquitoes were necessary because a mosquito must harbor the malarial pathogens for a number of days before it becomes infectious; before that, the mosquito is infected but not infectious. For P. falciparum in the mosquito Anopheles gambiae, the incubation period of the pathogen is eleven days, while the life expectancy of A. gambiae in the wild is estimated to range from eight to about fifteen and one-half days. The result is a rate of prevalence (the percentage of infectious mosquitoes) of malaria in mosquitoes of less than 2 percent (Anderson and May 1991, pp. 387–88).
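One reason the infectious fraction is so small can be sketched with a simple survival calculation (a simplification offered here for illustration; the fuller Anderson and May treatment also depends on the probability that a mosquito ever takes a blood meal from an infectious person). If mosquito mortality is roughly constant over the insect’s life, the share of mosquitoes that survive the eleven-day incubation period is approximately

$$e^{-\tau / L}, \qquad \tau = 11 \ \text{days},$$

which for life expectancies $L$ of eight to fifteen and one-half days is only about 25 to 50 percent; combined with the low chance of ever having bitten an infectious human, the infectious prevalence falls to the small figure (under 2 percent) reported above.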

A relatively dense human population is necessary for the spread of malaria: because the number of infectious vectors is low, there must be a relatively large number of people nearby for malaria to remain an endemic disease. Since female mosquitoes simply want a blood meal, they do not have to bite humans. Other mammals will do as well or better because horses, cows, and pigs, for example, are less successful in swatting mosquitoes than are humans. And if a nonhuman animal is bitten, malaria is not transmitted. How many people there have to be, and how densely settled, depends on other factors such as the availability of nonhuman mammals, herding practices, the quality of human housing, and agricultural developments. Because the anopheline vector is primarily a rural mosquito, malaria is usually transmitted in rural areas.

Wesenberg-Lund (1920–21, pp. 161–95) attributes the decline in malaria in parts of northwestern Europe in the mid-nineteenth century to an increase in animal husbandry and the practice of stabling animals some distance from human habitation. The increase in animal husbandry provided the mosquito with alternate food supplies, and the stabling of animals allowed mosquitoes to survive the winter. The mosquitoes that “preferred” nonhuman animals would have increased relative to mosquitoes that “preferred” humans, and to the extent that preferences were genetically transmitted between mosquito generations, malaria would have declined over time. Wesenberg-Lund (1920–21, p. 168) reports that the mosquito A. maculipennis living in barns did not bite humans. In other areas, A. maculipennis bit humans and transmitted malaria. This suggests some mosquitoes may have “preferences” both for and against biting humans.

In the antebellum American South, the slave labor force was kept close together, and animals were penned rather than stabled. Mosquitoes would have been more likely to bite humans under these circumstances than when animals were stabled and humans dispersed. A constant supply of unexposed children in slave compounds assured the survival of malaria on large plantations.

Malaria spread between plantations, and from there to the non-slave population, by mosquitoes infectious with malaria. This occurred in a number of ways: people with malaria going off the farm, people coming to a malaria-infested farm, or mosquitoes flying or being transported (by winds or inadvertently by humans or animals) into other areas. The reduction in transportation costs and times in the antebellum United States facilitated the spread of malaria. By 1830, an upstream steamboat passage from New Orleans to Louisville, Kentucky took less than ten days, and passenger fares had fallen to 26 percent of their 1815 values (Mak and Walton 1972, pp. 630, 639, app. table II, respectively). The advent of the steamboat allowed people in 1830 to be infected with malaria in the state of Mississippi, take the steamboat to southern Illinois, and be there before they knew they had the disease. The latent period for malaria is from nine to sixteen days; once an individual had the disease, they typically remained infectious for two to nine and a half months (Anderson and May 1991, p. 378).25

The transmission mechanism of malaria to the Midwest is now transparent: the loci of infection were along the lower Mississippi River and its tributaries. As the New South became more settled, and transportation costs and times declined, the disease pools of the American South and Midwest were integrated; malaria became endemic throughout. With the post–Civil War abandonment of the plantation system and the growth of animal herds and stabling, malaria’s hold on the Midwest waned rapidly. Malaria declined more slowly in the South; it was not until after World War II that malaria was no longer a significant factor in the public health of the United States. The use of quinine as both a curative and a prophylactic assisted in the reduction of malaria. Curtin (1989) ascribes the nineteenth-century reductions in malarial mortality rates for British troops stationed in various locations to the isolation of quinine from cinchona bark in 1820 and the increasing use of quinine.

The question remains: How widespread was malaria in the antebellum American South? Because the malaria plasmodium was not recognized until 1880 and medical authorities did not accept it as the causal factor until even later, scientific data directly measuring the amount of malaria prior to 1880 are not available. However, the 1850 to 1900 censuses reported mortality data on deaths by specific causes, including malaria. The 1850 census includes mortality data for the white and slave populations; the 1860 census did not report deaths by “race.” From 1870 to 1900, the census resumed reporting mortality for both whites and blacks. We calculated “malarial fever” mortality rates for whites and slaves for 1850 and for whites and blacks for 1870 to 1900 from the data reported in each of these censuses. The mortality rates for fourteen southern and border states are presented in table 6.2. (The accuracy and reliability of the 1850 to 1900 census mortality data, issues involved in the calculation of the malaria mortality rates, and the data sources are discussed in detail in appendix C. Appendix C also presents various mortality rates, discussed in chapter 7, for all causes of death and for nearly two dozen major infectious diseases for all states for 1850 to 1900 for which data exist.)

Table 6.2

Death rates for malaria per 100,000 in the southern United States

Source: United States Bureau of the Census (1855, 1872b, 1886, 1896, 1902b).

Note: For 1850, malaria death rates are calculated from the total deaths for “bilious,” “congestive,” “intermittent,” and “recurrent” fevers. For 1860, deaths were not reported separately for whites and slaves. For 1870, malaria death rates are calculated from the total deaths for “intermittent,” “recurrent,” and “typho-malaria” fevers; other malaria-symptomatic fevers were not reported in 1870. Beginning with the 1880 census, deaths from “bilious,” “congestive,” “intermittent,” and “recurrent” fevers were listed as deaths from “malaria fever.” For 1880, deaths in Kentucky and Missouri were not reported for blacks.

The malaria mortality rates in table 6.2 indicate that during the era of slavery African-Americans likely had lower death rates from malaria than European-Americans; in later years the situation was reversed. The reported death rates from malaria in 1850 among whites were greater than those for blacks in all but two slave states (Maryland and Virginia); in 1900, the reported death rates from malaria were lower for whites than for blacks in all the southern and border states except Missouri (see table 6.2). This anomaly may be explained by the greater resources that slaveowners had to treat malaria and/or their reluctance to put slave capital in highly malarious areas. In most of the American South, the malaria mortality rates for whites increased up to 1880 and declined thereafter (in nine of the states in table 6.2; in two other states, the mortality rates peaked in 1870), while the death rates for blacks generally continued to increase until 1890 and declined thereafter (in ten of the states in table 6.2).

The mortality rates indicate a nontrivial amount of malaria in the antebellum American South. While direct data on the case mortality rate for malaria in the mid-nineteenth-century South do not exist, it can be inferred from other data to have ranged, most likely, from well below 1 percent to about 1 percent at most. Using “realistic” case mortality rates, the data in table 6.2 imply that a nontrivial proportion of the southern population had malarial attacks during each of the census years listed. For example, in Alabama in 1850, there were 53.9 deaths per 100,000 among whites. Using a “high” but realistic case mortality rate of 1 percent implies there were 5,390 malaria cases per 100,000 whites in 1850. (To derive the rate of malarial disease per 100,000 whites, the number of deaths due to malaria per 100,000 whites is divided by the malaria case mortality rate: 53.9 divided by 0.01 equals 5,390 per 100,000 whites.) Using a “low” but realistic case mortality rate of 0.2 percent implies there were 26,950 cases of malaria per 100,000 whites. These estimates suggest that between 5.4 and 27 percent of the white population of Alabama in 1850 would have been malarious.
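The same arithmetic applies to any state and census year in table 6.2. The short Python sketch below reproduces the Alabama figures; the death rate (53.9 per 100,000) and the bracketing case mortality rates (1 percent and 0.2 percent) are taken from the text, while the function and variable names are ours.

```python
def implied_cases_per_100k(deaths_per_100k: float, case_mortality: float) -> float:
    """Infer the malaria morbidity rate (cases per 100,000) from a death rate
    and an assumed case mortality rate (deaths per case)."""
    return deaths_per_100k / case_mortality

# Alabama whites, 1850: 53.9 malaria deaths per 100,000 (table 6.2).
deaths = 53.9
for label, cfr in [("high", 0.01), ("low", 0.002)]:
    cases = implied_cases_per_100k(deaths, cfr)
    share = cases / 100_000
    print(f"{label} case mortality ({cfr:.1%}): "
          f"{cases:,.0f} cases per 100,000 ({share:.1%} of the white population)")

# Output (approximately):
#   high case mortality (1.0%): 5,390 cases per 100,000 (5.4% of the white population)
#   low case mortality (0.2%): 26,950 cases per 100,000 (27% of the white population)
```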

These “realistic” case mortality rates are derived from Curtin (1989). Curtin reports mortality for British troops stationed in various locations during the nineteenth and early twentieth centuries for symptomatic malarial diseases (fevers, and continual and paroxysmal fevers), among other diseases. In the Madras Presidency in India, the combined case mortality rate for these fevers declined from 1.3 percent during the years 1837 to 1846 to 0.79 percent during the 1860 to 1867 period. (These case mortality rates are calculated from the morbidity and crude mortality rates in Curtin 1989, pp. 181–82.) In estimating the case mortality rate for the mid-nineteenth-century American South, we take the Curtin data for the Madras Presidency as a benchmark. The American case mortality rate was certainly lower than that of British troops in Madras because Madras is a much more tropical area (approximately thirteen degrees north of the Equator) than the American South, the bulk of which lies above thirty degrees north. This means that the length of the mosquito season, as well as the intensity of mosquito infection, was greater in Madras. Consequently, we consider a 1 percent case mortality rate a realistic “high” estimate. The benchmark for the lower-bound case mortality rate is more problematic. The lowest malarial case mortality rate for British troops in India was 0.2 percent during the years 1886 to 1894. (This case mortality rate is calculated from the morbidity and crude mortality rates in Curtin 1989, p. 185, where data are presented for all of India, not just the Madras Presidency.) If, similar to our “high” mortality rate, we used a “low” rate for the American South below the lowest Indian rate, say, a 0.1 percent case mortality rate, the implied malarial morbidity rate (53,900 per 100,000 whites for Alabama in 1850) would probably be far too high for the actual American experience. So, rather than adjust the lowest calculated case mortality rate for India, we use the unadjusted lowest Indian rate. This gives us what we believe to be a realistic “low” case mortality rate of 0.2 percent. (Recall the case mortality rates for malaria in the Chesapeake during colonial times presented in chapter 5.)

The Health Effects of Malaria

The impact of malaria infection on its human victims depends on the severity of the disease and how often it is contracted. Malaria incapacitates by causing fever, chills, and other maladies. The malaria protozoa attack the human host’s red blood cells and also attack various organs (primarily, the liver and spleen). In a population that is severely infested with malaria, there are visible effects:

The term “chronic malaria” is sometimes applied to the condition seen in children who in highly endemic areas suffer from many attacks, often untreated, of malaria. This is seen more often in vivax and quartan infections and results in “malarial cachexia” characterized by stunting of growth, wasting, anemia, and much enlargement of the liver and spleen. The temperature is often normal, at other times low fever occurs; parasitemia is variable, and the thick blood film may be often negative; only prolonged observation can establish the diagnosis, which is not finally clinched until improvement occurs under appropriate therapy (Bruce-Chwatt 1980, p. 44).

Malaria does have a differential impact on specific segments of a population. Pregnant women, and in particular women pregnant with their first child, are especially vulnerable to malaria because pregnancy suppresses immunities to malaria. The vulnerability of pregnant women is well documented for Africans (Brabin 1983, 1991; Gilles, Lawson, Sibelas, et al. 1969; Fleming 1989). Malarious women give birth to low birth weight babies who are subject to anemia and other illnesses. Because immunities to malaria are acquired through exposure, the incidence and severity of malarial disease decline with age in highly malarious environments; children, who lack acquired immunities, are consequently relatively more vulnerable in such environments. Figure 6.4 summarizes the symptoms, effects, and outcomes of malaria in humans.

Figure 6.4

Effects and consequences of malarial disease in human populations. Sources: Bruce-Chwatt (1980); Meuris, Piko, Eerens, et al. (1993); Oaks, Mitchell, Pearson, and Carpenter (1991); Steketee, Wirima, Hightower, et al. (1996); Strickland and Hunter (1982)

An examination of figure 6.4 and a comparison with the outcomes of hookworm in figure 6.3 indicate that there are many similarities between the outcomes of malarial and hookworm diseases. The impact of these parasitic diseases on children also is very similar. Both diseases cause symptomatic anorexia, a loss of appetite, so infected children have less desire to eat. Both diseases cause anemia and other nutrient losses. And both malarial and hookworm diseases lead to low birth weights, increased fetal and infant morbidity and mortality, reduced body size of newborns and infants, and stunting of child growth.

Studies of contemporary populations in malarious environments indicate the severe effects that endemic malaria has on human health and development. In a study of malaria disease and fetal growth in a malarious region of Zaire, Meuris, Piko, Eerens, et al. (1993) conclude that malaria in pregnant women had negative effects on their newborns. Among the several hundred expectant mothers studied, circulating malaria parasites, malaria-associated placental lesions, and low hemoglobin levels were observed either individually or together in different combinations. Over 25 percent of the women with both malarial pathological findings had low birth weight (less than 2,500 grams or 5.5 pounds) babies, while only 6.5 percent of women without any malarial findings had low birth weight babies. Low birth weight is considered “the greatest single risk factor for neonatal and early infant mortality” (Meuris, Piko, Eerens, et al. 1993, p. 33). The low birth weight differences were significant for both malarial conditions. Moreover, a measure of the size of the Zaire newborns (the ponderal index, a measure of body mass) was significantly smaller for the newborns of women with the placental lesions.

In a clinical study of the effects of malaria treatment on the anthropometric measurements of newborns in a sample of more than 1,700 Malawians, Steketee, Wirima, Hightower, et al. (1996) find that women with placental blood malaria were 1.7 times more likely to have had low birth weight babies. Placental blood malaria in pregnant women was, among other factors, significantly related to low birth weight babies due to intrauterine growth retardation, while umbilical cord blood malaria (fetal) was significantly related to preterm (delivery at less than thirty-seven weeks’ gestation) low birth weight babies. The combined incidence of intrauterine growth retardation and preterm low birth weight was over 31 percent among firstborn children of women with malaria versus 24 percent for women without malaria; it was over 20 percent (with malaria) versus 10 percent (without malaria) among second-born children, and 16.5 percent (with malaria) versus 9 percent (without malaria) in all other births. Steketee, Wirima, Hightower, et al. (1996, p. 40) conclude that “in highly malarious areas, malaria may account for more than 30 percent of preventable [with treatment] LBW [low birth weight].”

In a study of the changes in weight gain and anemia among infants and children six to forty months old in a holoendemic malarial environment in Tanzania, Shiff, Checkley, Winch, et al. (1996) show statistically significant improvements when the subjects were given insecticide-impregnated bed nets. One year after implementation of the program, the infants and children not given bed nets grew, on average, 286 grams (10.1 ounces) less and were twice as likely to be anemic over the five-month study period. The authors conclude that P. falciparum has a marked effect on increased anemia and a negative effect on human growth as measured by weight gain. (Strictly speaking, P. falciparum reduces the hemoglobin/red cell content of the blood, leading to the clinical condition known as anemia.)

A treatment study of the relationship among schistosomiasis, hookworm, and malarial infections, and the nutritional status of Kenyan primary-school children, indicates the children’s growth increased significantly during the first six months of the sixteen-month study after treatment for schistosomiasis (Stephenson, Latham, Kurz, et al. 1986). Growth, as indicated by several anthropometric measures, was much slower in the last ten months of the study. Stephenson, Latham, Kurz, et al. (1986, p. 41) conclude that “the most likely explanation for this phenomenon, of the variables we measured, is the dramatic increase in malarial infection . . . that probably began soon after Exam 2 [after the first six months of the study].”

The Health Effects of Malaria in the Antebellum American South

During the antebellum period in the United States, slaves who did not have any innate immunity to malaria (due either to random genetic mixing or to having a parent who was not refractory) would have suffered in malarious environments. The poorly constructed slave quarters on plantations allowed the mosquito vectors easy access; the high density of large plantations allowed easy transmission from human to mosquito and the reverse. The concentration of slave infants and young children in crèches facilitated the transmission of the malaria pathogens among them and their caretakers: older children, pregnant women, and elderly slaves.

When whites contracted malaria, the consequences would have been more severe because the white population lacked the innate resistance that people of tropical West African ancestry typically had. But the white population would have suffered fewer mosquito bites. The resident white population of a plantation typically was absent from the slave quarters at the times when mosquito bites were most likely, early mornings and evenings. Many wealthy whites adapted to existing conditions by leaving their plantations during the malaria season, or by sending away the most susceptible members of their families. Plantation whites also were wealthy enough to purchase quinine, which was available as both a remedy and a prophylactic. Consequently, the overall impact of malaria on antebellum plantation whites is difficult to assess.

Antebellum nonplantation whites also would have suffered fewer mosquito bites because they did not live in the relatively high-density plantation environment. Yet for whites residing in areas to which malaria had spread and become endemic, the impact of malaria is more predictable. The impact of malaria on nonplantation (and relatively low-income) whites would have been more severe than on African-American slaves, other factors the same, because whites lacked the innate resistance to malaria that blacks typically possessed. The consequences of chronic malaria among these whites would include, among others, increased morbidity and mortality and, in all probability, reduced productivity in physical labor.

Given the southern climate and geography, poor conditions of slave quarters, and high human densities on large plantations, the southern plantation environment resulted in endemic malaria that affected the health and net nutrition of slaves. We believe the plantation disease environment explains the phenomena that plagued slave plantations: low birth weights, high neonatal and infant mortality, stunted slave children, and the effects of protein deficiency.26 Endemic malaria was a key element in shaping the plantation disease environment. As with endemic hookworm (and other tropical diseases), slaves in the South were relatively more resistant to malaria infection than others (white southerners and northerners). Nevertheless, in the highly malarious environment of the plantation South, African descendants (slaves) still would have been afflicted with malaria, and malaria in pregnant women, infants, and young children can have severe consequences.

The results of chronic malaria among slave infants, children, and pregnant slaves are consistent with the anthropometric measurements of slaves in the antebellum South: low birth weights, relatively high neonatal and infant death rates, anemic children and mothers, and severely stunted children. Regardless of the food resources available, a malarious population will have, relative to a nonmalarious population, reduced body size and increased morbidity and mortality.

Implications of Endemic Parasitic Diseases in the Antebellum South

The disease environment of the antebellum South affected various elements of the southern population differently. All other factors the same, we expect that warm-weather diseases would more severely affect (1) infants and children relative to adults, (2) whites relative to blacks, (3) women pregnant for their first time relative to other women, (4) residents of areas where tropical pathogens were abundant relative to people who lived in areas that were less conducive to the survival of tropical pathogens, (5) plantation residents relative to nonplantation residents, and (6) whites that were in close proximity to blacks relative to whites who were more isolated from blacks. The disease environment also affected the anthropometric dimensions of the southern populations differently. (But recall our earlier discussion of caveats when drawing implications from the slave height data.) All other factors the same, we expect that (1) plantation slave children would have been stunted, (2) plantation slave adults would have been shorter than normal stature, (3) high-income plantation whites would have been of normal stature, (4) low-income whites in close proximity to blacks and located in hookworm and/or malarial regions of the South (areas that were warm, moist, swampy, and contained humus and/or sandy soils are examples) would have been below normal stature, (5) nonplantation whites located in other regions of the South (the border states and areas that were colder, more upland, drier, and contained clay soils are examples) would have been of normal stature, and (6) nonplantation blacks would have been of normal stature and taller than plantation-born and reared blacks.

There are anthropometric data that are consistent with elements of our story, supporting the hypothesis of the impact of the southern disease environment on human populations. The data analyzed by Bodenhorn (1997), Coclanis and Komlos (1995), Komlos (1992), Margo and Steckel (1982, 1992), and Steckel (1995b) show that living in the South close to the Atlantic coast made black populations shorter. The reason is that the coastal areas are more humid and have fewer killing frosts. The human inhabitants of the coast were subject to pathogenic onslaughts more frequently than were their inland contemporaries.27

The hypothesis of endemic hookworm, malaria, and other tropical pathogens in the antebellum southern United States also has implications for the existing evidence on the relative productive efficiency of antebellum American slavery shown in the studies of Fogel and Engerman (1971, 1974a, b, 1977, 1980). The differential susceptibilities between African-Americans and European-Americans to parasitic diseases suggest differential agricultural productivity between the two groups of people in favor of higher productivity for antebellum blacks irrespective of the form of the labor system. This suggests that the entire issue of the relative productivity of slavery must be reexamined.

From the known genetic variations between populations of northwestern European ancestry and tropical West African ancestry concerning susceptibility to malaria, we surmise that some of the measured productivity differences between antebellum free farms and slave plantations in the American South can be explained by the presence of endemic malaria alone. Likewise the historical evidence from the early twentieth century on the impact of hookworm disease on the southern labor force and the significantly lower hookworm prevalence rates among African-Americans suggest that productivity differences due to hookworm in favor of blacks also were likely in the antebellum American South. If black adults (slave labor) were more productive than white adults (free labor) because of their greater resilience to the pathogens that abounded in their environment, then part of the measured differences in productivity between slave plantations and free farms should be attributed to the lower morbidity rates for individuals of West African ancestry, rather than to any inherent efficiencies of the institution of slavery.

The hypothesis of endemic tropical parasitic diseases has implications for the interpretation of the history of the post–Civil War American South as well. Real incomes and wages were lower in the South than in other regions of the United States, and they remained below (probably substantially below) those of the rest of the country until after World War II (Easterlin 1961; Roberts 1979). Why? In their examination of regional incomes in nineteenth-century America, Coelho and Shepherd (1979, p. 78) explain the lower southern incomes this way: “[we] can only speculate that a combination of the remnants of slavery, Jim Crow laws, and racial discrimination had the effects of increasing supplies of unskilled workers in the South.” But this explanation of below-average southern incomes is flawed because it was not simply the preponderance of unskilled labor that drove southern incomes down. The implication of the hypothesis of endemic diseases is that post–Civil War southern incomes remained below those of other regions because the productivity of southern workers was lower as a result of the southern disease environment. In support of this, the Rockefeller Sanitary Commission “believed that ‘inefficient’ workers whose productivity was thought to have been reduced by disease could be made more productive simply by ridding them of the parasite [hookworm], thus contributing to the economic and industrial development of the region” (Kunitz 1988, p. 143).

Concluding Thoughts on the Antebellum Southern Disease Environment

The physical world shapes and constrains all human activities. We have focused on the biological environment and parasitic diseases that confronted people living in the pre–Civil War American South. Appropriately, slavery also can be viewed as a form of parasitism in which slaveowners, like biological parasites, live off the labors of the enslaved. Viewed this way, slaveowners have incentives to maintain their capital across generations, making undernourishment of slave infants and children both less probable and less profitable. Unlike slavery in ant societies, though, American slavery was, by and large, an inherited condition by the nineteenth century.28

The diseases that were endemic to the American South and the slave system were not dramatic diseases that kill, but lingering diseases that debilitate. Lingering diseases affect infant and child development, adult heights, mental development, and economic productivity. Humans are prone to believe that their histories and fates are the products of human actions that are amenable to reason. Thus the Shakespearean quote: “The fault, dear Brutus, is not in our stars but in ourselves.” In opposition, we contend that much of the economic and social history of the American slave South is the product of uncaring, mindless forces that were unimagined by the contemporaneous human population. The pathogens that were abundant in the American South were not the products of human malevolence. They were the unintended consequences of trade with Africa and the tropical New World. It must be recognized that these pathogens were not equal opportunity diseases. People of tropical West African ancestry were more resistant than were those whose ancestors came from other areas.

Our examination of the biological and historical role of pathogens of tropical West African origin calls for a new interpretation of the economic and social history of American slavery. The evidence presented here calls into question the prevailing views of the inadequacy of the diets of slave infants and children and of the relative productive efficiency of slavery. This suggests that an alternative interpretation of the histories of both the pre– and post–Civil War American South might be warranted as well. We envision an interpretation that incorporates the traditional factors of resources, institutions, and profitability along with the minute organisms that, although mindless and unconcerned with human welfare, still impose costs and haunt humanity. We share the world we inhabit with organisms that consider Homo sapiens a resource; we may not be able to see them with the naked eye, but their impact on humans has been visible.
