Reason and Revelation

In April 1829, Cincinnati, the city Frances Trollope found so dull and unsophisticated, hosted a week of intellectual excitement. The famous British rationalist Robert Owen, back in the United States again following the failure of his Indiana utopia, had offered to prove in debate “that all the religions of the world have been founded on the ignorance of mankind.” The popular Irish-born postmillennial evangelist Alexander Campbell took up Owen’s challenge. For eight days the two debated before an audience averaging twelve hundred people. Each debater spoke for thirty minutes every morning and another thirty in the afternoon. Inveterate controversialists, both of them loved the publicity such events provided; they treated each other with courtesy. Owen argued that planned communities would undergird social morality more effectively than did religion. Campbell defended Christianity as essential to human dignity and social progress. At the end, Campbell asked everyone in the audience who believed in Christianity or wished it to “pervade the world” to stand up. All but three members of the audience rose. Campbell claimed victory and proceeded to publish the full text of the debate as a vindication of Christianity; he invited the reader to “reason, examine, and judge, like a rational being, for himself.” The historian Daniel Feller comments, “Owen and his freethinkers were left aside, prophets of a discarded future, as Americans by the thousands decided for a Christian destiny.” The episode sums up much about the Christianity characteristic of antebellum America: its commitment to social progress, its confidence in popular judgment, and, most of all, its faith in rational discourse.1

Alexander Campbell believed in the Bible—believed that it comprehended all religious truth and constituted a sufficient guide to Christian practice in the present. He also believed it perfectly compatible with reason, history, and science. “The Bible contains more real learning than all the volumes of men,” he declared. Although Campbell gave the scriptures his own distinctive interpretation, his attitude toward them typified

1. Alexander Campbell, ed., Debate on Evidences of Christianity... held in the City of Cincinnati, Ohio from the 13th to the 21st of April, 1829 (Bethany, Va., 1829), 5; Daniel Feller, The Jacksonian Promise (Baltimore, 1995), 105; Mark Noll, America’s God (New York, 2002), 243.

the faith of evangelical Americans in general. The American Bible Society distributed 21 million copies of the Good Book during the fifty years after its founding in 1816 (in a country whose entire population in 1860 was 31 million). The Reformation principle of sola scriptura, that the Bible contained all things necessary for salvation and could be properly interpreted by any conscientious believer, lived on and heavily influenced American culture. It helped promote universal literacy, democratic politics, and art that emphasized verbal expression. Respect for the Bible conditioned national identity, social criticism, natural science, the educational system, and the interpretation of authoritative texts like the Constitution.2

The Owen-Campbell debate was not unique. Frequent public debates over religion, like those over politics, attracted large crowds and attention from the national press. Debaters addressed infant baptism, universal salvation, and many other theological subjects. Catholics debated Protestants. Religious issues, like political ones, aroused widespread interest in antebellum America. The young Abraham Lincoln enjoyed debating religious questions before he took up political ones. Historical inquirers have long wrangled about which doctrines he ultimately embraced; what is beyond dispute is that he, like so many other reflective Americans of his generation, thought, read, and argued about them a good deal.3

Learned theological reflection flourished in the antebellum United States. Its professional practitioners numbered among the leading American intellectuals of their day. The Protestant majority included Nathaniel William Taylor of Yale, Henry Ware of Harvard, Moses Stuart of Andover Seminary, Charles Hodge of Princeton (the theological seminary, not the university), John W. Nevin of Mercersburg Seminary, James Henley Thornwell of South Carolina College, and Horace Bushnell (who was based not in an academic institution but in a parish, the North Congregational Church in Hartford, Connecticut). All were philosophically sophisticated and determined to apply reason to religion. All save Nevin and Bushnell shared a commitment to religious individualism, the empirical basis of knowledge, the Scottish philosophy of common sense, and an understanding of the Bible as historically accurate, its claim to divine authority confirmed by miracles. Nevin and Bushnell embraced a more

2. See Nathan Hatch and Mark Noll, The Bible in America (New York, 1982); Paul Gutjahr, An American Bible (Stanford, 1999); James T. Johnson, ed., The Bible in American Law, Politics, and Rhetoric (Philadelphia, 1985), Campbell quoted on 62.

3. Brooks Holifield, “Oral Debate in American Religion,” Church History 67 (1998): 499–520; Richard Carwardine, Lincoln (London, 2003), 28–40.

Germanic philosophical outlook, a more organic social theory, and a stronger sense of the evolution of Christianity through history. Like Henry Ware and his “liberal” party, they criticized revivalism and its demand for a conversion experience. All these theologians argued with each other over many issues, including freedom of the will, original sin, atonement, and even whether Catholics could be considered Christians.4 Among the Roman Catholic minority, the scholarly Bishop Francis Kenrick of Philadelphia (later archbishop of Baltimore), who translated the Bible, and the lay Protestant convert Orestes Brownson were noteworthy thinkers. Within the even smaller Jewish minority, Isaac Harby of Charleston sought to adapt Judaism to American life in ways that prefigured the growth of Reform after Rabbi Isaac Mayer Wise arrived from Bohemia in 1846.5

Disputes over theology could have institutional consequences. The “Old School” of Hodge and Thornwell expelled the “New School” of Taylor from the Presbyterian Church in 1837, partly because the New School seemed too sympathetic to social reform. In Massachusetts, the “liberal” and “orthodox” Calvinist wings of the Congregational Church, theologically centered at Harvard and Andover Seminary respectively, split after decades of tension. Because Congregationalism had constituted a state religious establishment since Puritan times, often tax supported, this division entailed a legal controversy. In a case arising from the town of Dedham, the Supreme Judicial Court of Massachusetts ruled in 1820 that the “parish”—that is, the congregation as a whole—had the right to name a clergyman for the town even against the wishes of most members of the “church”—that is, the (typically much smaller) group of persons who had experienced conversion and received communion. Since the parish paid the minister’s salary, the decision had a certain logic, but it also had theological implications. “Liberal” views prevailed more among parish members than among church members; indeed, many liberals did not believe in “conversion experiences” and took little interest in the sacrament of communion. The decision facilitated the takeover of about a hundred Massachusetts Congregational churches by the liberals (eventually named Unitarians because they rejected the doctrine of the Trinity). Disillusioned with this outcome, orthodox Congregationalists saw no

4. See Leonard Allen, “Baconianism and the Bible,” Church History 55 (1986): 65–80; more generally, Brooks Holifield, Theology in America (New Haven, 2003).

5. On Kenrick, see Gerald Fogarty, American Catholic Biblical Scholarship (San Francisco, 1989), 14–34; on Brownson, Holifield, Theology in America, 482–93; on Harby, Michael O’Brien, Conjectures of Order (Chapel Hill, 2004), II, 1076–82.

reason to continue paying taxes for the support of parishes whose theology they no longer endorsed. They allied with religious dissenters (Baptists, Methodists, Episcopalians) to separate church and state in Massachusetts in 1833, ending the last of the state religious establishments. Congregationalists and Unitarians continued as two different denominations.6


Writing to the lieutenant governor of Kentucky in 1822, James Madison had to admit that Virginia’s educational system did not constitute a fit model for the younger commonwealth to follow; instead, Kentuckians should look to the New England states.7 New England’s township-based system of primary schools was the daughter not of the Enlightenment but of the Reformation; it had been created in colonial times to comply with the precept that all good Christians should be able to read the Bible for themselves. In the early republic, Protestant religion remained an important spur to literacy and New England a leader in education.8 In principle, the American Enlightenment typified by Jefferson and Madison also endorsed universal literacy in the interest of an informed citizenry. Jefferson himself had drafted a plan in 1817 for Virginia to provide free white children with three years of primary education, but the state legislature rejected even this minimal proposal. In practice, Jefferson’s political followers generally made their top priority low taxes, and public education accordingly suffered, not only in Virginia but in many other states as well.9

The churches of the American republic stepped into the breach left by the states. One of their educational initiatives, the Sunday school, provided one day a week of instruction in basic literacy for 200,000 American children by 1827. Only after public primary education became more widespread did Sunday schools concentrate exclusively on religious instruction.10 As Yankees moved from New England into the Old Northwest,

6. See Conrad Wright, The Unitarian Controversy (Boston, 1994); for the Dedham decision, 111–36.

7. James Madison to W. T. Barry, The Writings of James Madison, ed. Gaillard Hunt, IX (New York, 1910), 103–9.

8. See David Paul Nord, “Religious Reading and Readers in Antebellum America,” JER 15 (1995): 241–72; David Tyack, “The Kingdom of God and the Common School,” Harvard Educational Review 36 (1966): 447–69.

9. Much of this section is adapted from Daniel Howe, “Church, State, and Education in the Young American Republic,” JER 22 (2002): 1–24. For more on Jefferson’s views, see James Gilreath, ed., Thomas Jefferson and the Education of a Citizen (Washington, 1999).

10. Carl Kaestle, Pillars of the Republic (New York, 1983), 45; Anne Boylan, Sunday School (New Haven, 1988).

they replicated the publicly funded weekday primary schools with which they had been familiar. This project could be seen as hastening Christ’s Second Coming. The Connecticut state legislature issued a remarkable exhortation promising westward migrants who established schools a celestial reward. “How great will be your happiness,” it assured them, “to look down from heaven” in future centuries and behold your “enlightened and pious and happy” descendants, living under “the mild reign of the PRINCE of peace,” who will have returned and established his thousand-year kingdom.11

Much of the innovation in secondary education, too, came from religious impulses. In the absence of state-supported high schools, academies under religious auspices flourished. In the early days they hardly ever boarded students but drew them from their own vicinity. Usually coeducational, the academies played a role in opening up secondary education to girls. Of course, their students had to pay tuition. Only after 1840 did public high schools gradually supplant most of these academies.12

Secular authorities did not necessarily mind saving the taxpayers money by letting religious groups deal with educational needs. The educational goals of Christian and secular educators dovetailed conveniently. In traditional republican political thought, free institutions rested on the virtue of the citizenry, that is, on their devotion to the common good. Religious educators inculcated that respect for the social virtues which republicanism considered indispensable. Christian and Enlightenment moral philosophy alike taught that young people needed conscientious training to form a properly balanced character in which reason and the moral sense could prevail over baser motives. An educated Protestant laity provided the basis for an educated citizenry.13 Under the circumstances, public authorities sometimes granted religious educational institutions favors and subsidies to keep them functioning, rather than spend more money to create secular alternatives. Despite Jefferson’s famous allusion to a “wall of separation” (he used the phrase in a private letter, not a public document, and with reference to the federal government, not the states), strict separation of church and state, as later

11. Connecticut General Assembly, An Address to the Emigrants from Connecticut (Hartford, Conn., 1817), 18.

12. James McLachlan, American Boarding Schools (New York, 1970), 35–48; Theodore Sizer, The Age of the Academies (New York, 1964); J. M. Opal, “Academies and the Transformation of the Rural North,” JAH 91 (2004): 445–70.

13. See Stephen Macedo, Liberal Virtues (Oxford, 1991); Daniel Howe, Making the American Self (Cambridge, Mass., 1997).

generations came to understand that principle, did not characterize the young republic.14

For African Americans, religion was even more important as a source of education than it was for the whites. The “invisible institution” of religion in the slave quarters profoundly influenced African American culture.15 What little interest the state took in the education of black people was mainly negative: In a number of states the law prohibited schools for slaves, intending to insulate them against abolitionist propaganda. However, individual masters were often allowed to teach their own slaves. In fact, between 5 and 10 percent of adult slaves must have possessed some informally taught literacy and numeracy, useful in the skilled and supervisory occupations performed by the top echelon of enslaved workers. Some masters and mistresses also felt a religious obligation to teach their slaves to read. As late as Reconstruction times, northern religious philanthropy worked in tandem with the Freedmen’s Bureau to support the newly established schools for southern black children.16 In the free Negro communities schools existed—almost always segregated, even in the North. These schools had usually been created by white religious philanthropy and/or black self-help, seldom by local public authorities. In Connecticut, the authorities actually opposed Prudence Crandall’s efforts to provide secondary education for black girls.17

Religious-sponsored education for Native Americans figured prominently in the Cherokee Removal crisis. The American Board of Commissioners for Foreign Missions and other Christian missionaries had established schools in the Cherokee Nation, promoting knowledge of Western civilization. Sometimes such mission schools enjoyed a measure of federal support. Acting with the blessing of the Jackson administration, however, the state of Georgia determined to interrupt this educational process and expel the

14. See Daniel Dreisbach, “Thomas Jefferson, a Mammoth Cheese, and the ‘Wall of Separation Between Church and State,’” in Religion and the New Republic, ed. James Hutson (Lanham, Md., 1999), 65–114.

15. See, for example, Thomas Webber, Deep like the Rivers: Education in the Slave Quarter Community (New York, 1978); Paul E. Johnson, ed., African-American Christianity (Berkeley, 1994).

16. There are no hard data on slave literacy. The most commonly given estimate is 5 percent, but the most thorough study to date concludes 10 percent is closer to the truth. Janet Cornelius, “When I Can Read My Title Clear”: Literacy, Slavery, and Religion in the Antebellum South (Columbia, S.C., 1991), 8–9, 62–67. See also Beth Barton Schweiger, The Gospel Working Up (New York, 2000), 73.

17. Kaestle, Pillars of the Republic, 171–75; Brown, Strength of a People, 170–83. Jennifer Rycenga has a book on Prudence Crandall in progress.

missionaries, provoking the Supreme Court case of Worcester v. Georgia. An educated Indian population, determined to retain their land and develop its resources, could have blocked white intruders more effectively.18

But schools are not the only vehicles for literacy. Many youngsters learned how to read at home. Beth Schweiger’s studies of literacy among the southern yeomanry show that while only 40 percent of southern white children went to school, 80 percent of southern white adults could read. That indicates a lot of people acquiring at least rudimentary literacy at home. Why did farm parents, tired at the end of a day’s work, make time to teach their children? The primary motive seems not to have been commercial or political, still less to facilitate the children’s upward social mobility. It was religious. Although Protestant piety did not produce free common schools in the rural and individualistic South, the way it did in the villages of New England, Protestantism still promoted literacy in the South.19 Unlike reading, the ability to write did not have religious significance. Reading was more widely taught than writing, and many people—especially females—who could read at least a little had no experience with writing.

In its broadest definition, education is the entire process of cultural transmission. The rapidly expanding communications of the antebellum period enabled people to be better informed about the world than ever before from magazines, newspapers, and books; mail service integrated commercial and civic life. The Second Great Awakening both exploited and fostered these developments.20 Voluntary associations, such as foreign missions, the Sunday-School Union, and the sabbatarian, temperance, antislavery, and peace movements, educated a broad public in the issues of the day. The educational function of the evangelical associations seems particularly important in the case of women. Excluded from political institutions, their increasing literacy nevertheless enabled them to read the news and organize benevolent societies, acquiring further skills in the process.21

The commercialization and diversification of the economy multiplied

18. For a contemporary source sharing the missionaries’ outlook, see Jedidiah Morse, Report to the Secretary of War on Indian Affairs (New Haven, 1822). William McLoughlin has written two contrasting assessments: Cherokees and Missionaries (New Haven, 1984) and Champions of the Cherokees (Princeton, 1990).

19. Schweiger, Gospel Working Up, 67 and 202.

20. See Richard D. Brown, Knowledge Is Power (New York, 1989); Richard John, Spreading the News (Cambridge, Mass., 1995), esp. chaps. 5 and 7; and the works cited in chap. 6, n. 71.

21. See, for example, Nancy Hardesty, Your Daughters Shall Prophesy: Revivalism and Feminism in the Age of Finney (Brooklyn, N.Y., 1991); Nancy Hewitt, Women’s Activism and Social Change (Ithaca, N.Y., 1984); Stuart Blumin, The Emergence of the Middle Class (Cambridge, Eng., 1989).

jobs requiring literate and numerate skills; continued economic development would demand still more of them. American society needed an educational program synthesizing the civic objectives of Jefferson’s Enlightenment with the energy and commitment of the religious Awakening. Such a movement appeared in the educational reforms embraced by the Whig Party in the 1830s. The greatest of the Whig educational reformers was Horace Mann, who became secretary of the newly created Massachusetts State Board of Education in 1837. From that vantage point Mann tirelessly crusaded on behalf of “common schools”—that is, schools that the whole population would have in common: tuition-free, tax-supported, meeting statewide standards of curriculum, textbooks, and facilities, staffed with teachers who had been trained in state normal schools, modeled on the French école normale. In Massachusetts, Mann could build on the strongest tradition of public education in any state. There, local communities had become accustomed to taxing themselves to support education. Mann had no hesitation about employing the resources of the state; he was a political disciple of John Quincy Adams. The normal schools that he created (beginning with Lexington in 1839) constituted Mann’s most important innovation, the precursors of teacher training colleges. The normal schools turned out to be the avenue through which women in large numbers first entered any profession. Since they were paid less than men, women teachers provided a human resource agreeable to legislators worried about the cost of Mann’s ambitious plans.22

As envisioned by Mann and his successors until long after the Civil War, the common schools embodied a common ideology. The ideology of the American common schools included patriotic virtue, responsible character, and democratic participation, all to be developed through intellectual discipline and the nurture of the moral qualities. It would never have occurred to Mann and his disciples that such an educational program should not include religion, but since they wanted above all to achieve an education common to all, this necessitated a common religious instruction. In the days of more local autonomy, school districts had taught the religion of the local majority. Now, the Massachusetts School Board prescribed that only those doctrines should be taught on which all Protestants agreed.

22. On Mann in his context, see Jonathan Messerli, Horace Mann (New York, 1972); Daniel W. Howe, “The History of Education as Cultural History,” History of Education Quarterly 22 (1982): 205–14; Susan-Mary Grant, “Representative Mann: Horace Mann, the Republican Experiment, and the South,” Journal of American Studies 32 (1998): 105–23.

The Whig governor, Edward Everett, gave Mann solid support in appointments to the Board and helped him overcome opposition from jealous local authorities, doctrinaire Christian groups, and pedagogically conservative schoolmasters. When a Democrat, Marcus Morton, was elected governor by a margin of one vote in 1839, he proved unable to persuade the Massachusetts legislature to abolish Mann’s Board of Education and its new normal schools.23 Democrats throughout the country remained suspicious of educational programs like Mann’s as the creation of a remote elite; they preferred to leave schools under local control as much as possible. What probably tipped the scales in favor of states assuming some responsibility for education was the growth of cities and towns. With apprenticeship programs declining, the new urban working class embraced common schools as their children’s guarantor of opportunity—besides keeping them off the street. In rural areas, schools always competed with the need for children to work on the farm. The older ones could only attend a few months during the winter, when their labor and that of their part-time teacher could best be spared; the younger ones could also be taught in the summer after planting and before harvesting.24

At the top of Mann’s agenda stood the education of the immigrants, especially the children of immigrant laborers. But nondenominational Protestant schools proved to be unacceptable to the growing Irish immigrant community. In New York, the conflict between Protestant public schools and the Catholic minority led by Bishop (later Archbishop) John Hughes embarrassed the enlightened Whig governor, William H. Seward. Seward tried vainly to bridge the gap between the two sides with a proposal for state subsidies to Catholic schools, as Protestant educational enterprises had often been subsidized. Instead the legislature ruled that no public money should go to any school in which religion was taught.25

The lesson for the rest of the country was clear: Where public aid to Protestant institutions had been within the bounds of political acceptability, such aid to Catholic institutions was not. When faced with a

23. Messerli, Horace Mann, 326–31; Carl F. Kaestle and Maris Vinovskis, Education and Social Change in Nineteenth-Century Massachusetts (Cambridge, Mass., 1980), 221–28.

24. W. J. Rorabaugh, The Craft Apprentice (New York, 1986), 113–27; Joel Perlmann et al., “Literacy, Schooling, and Teaching Among New England Women,” History of Education Quarterly 37 (1997): 117–39.

25. See Glyndon Van Deusen, “Seward and the School Question Reconsidered,” JAH 52 (1965): 313–19; Vincent P. Lannie, Public Money and Parochial Education (Cleveland, 1968).

charge of inconsistency, public authorities would cut off aid to Protestants rather than extend it to include Catholics.

To be sure, many public, or common, schools would retain features of nondenominational Protestantism for a good many years to come. Horace Mann hoped that passages from the Bible, read without interpretation, might offer a nonsectarian common religious ground. Although Catholics and even some of the Protestant sects did not find this acceptable, Bible-reading in the common schools remained a widespread and even increasing practice in nineteenth-century America. Probably over half of American common schools practiced Bible reading at the end of the nineteenth century.26

In 1840, the U.S. census takers for the first time asked questions about literacy. They recorded 9 percent of adult American whites as illiterate, a rate comparable with that of Prussia, whose educational system, run by the established church, was much admired. Even when the African American population was included, U.S. illiteracy at 22 percent compared favorably with the 41 percent illiteracy in England and Wales recorded by their census of 1841. American literacy varied widely by region. In New England no state had less than 98 percent literacy, which equaled Scotland and Sweden, the two countries where energetic programs sponsored by Protestant established churches had forged the world’s highest literacy. The American state with the highest white illiteracy in 1840 was North Carolina: 28 percent. The public school system called for in the North Carolina state constitution of 1776 had never been implemented. However, in 1839 the Whigs gained control of North Carolina’s legislature and put through a long-delayed law authorizing common schools in counties that consented to them. As a result, white illiteracy in North Carolina fell to 11 percent over the next twenty years.27


Like school systems, higher education in the antebellum period reflected the energy of religious bodies and the frequent reluctance of civil authorities to spend money or expand the sphere of government. George Washington had conceived of a national university in the District of Columbia

26. R. Laurence Moore, “Bible Reading and Nonsectarian Schooling,” JAH 86 (2000): 1581–99.

27. See Lee Soltow and Edward Stevens, The Rise of Literacy and the Common School in the United States (Chicago, 1981), 11–22; Brown, Strength of a People, 141–48; Carl Kaestle, “History of Literacy and Readers,” in Perspectives on Literacy, ed. Eugene Kintger (Carbondale, Ill., 1988), 105–12.

but had never been able to persuade Congress to implement his vision. He had gone so far as to will a portion of his estate to form a core endowment for such a university. But Congress ignored his bequest, and in 1823 the fund became worthless when the company in which it was invested went bankrupt. Jefferson had originally supported a national university but eventually decided that an amendment to the Constitution would be required to authorize it. Madison and John Quincy Adams both recommended a national university; neither could budge Congress. Opposition came from existing colleges that feared being overshadowed, from strict construction of the Constitution, and from sheer parsimony. Under the Jacksonians the project of a national university vanished.28

As an alternative to a university created by the federal government, Thomas Jefferson founded the University of Virginia, intended as a model of public secular education. Drawing upon his network of political influence, the ex-president was able to get himself named rector and its site located in Charlottesville, close enough to Monticello that he could oversee every detail. The versatile elder statesman designed its architecture and mode of governance, named the professors, and even presumed to prescribe the curriculum—at least in sensitive subjects like politics and religion. After his death in 1826, Jefferson’s tombstone proclaimed the three achievements of which he was proudest: “Author of the Declaration of American Independence, of the Statute of Virginia for religious freedom; & Father of the University of Virginia.”29

Jefferson made the University of Virginia an architectural masterpiece. As an institution of higher learning, however, its distinction was not immediately apparent. In the first functioning academic year, 1825–26, the only one the founder lived to see, the students took advantage of Jefferson’s permissive discipline to get drunk, gamble, skip classes, and misbehave; among those who had to be expelled for participation in a riot was the founder’s own great-grandnephew.30 The shortcomings of the student body reflected the legislature’s failure to establish a proper system of preparatory secondary schools. Funding remained perennially problematic; the recruitment of both faculty and students, difficult. Sadly, Jefferson’s own vision for his beloved university had contracted over time.

28. This section is adapted from Howe, “Church, State, and Education.”

29. Merrill Peterson, Thomas Jefferson and the New Nation (New York, 1970), 976–88; Philip Bruce, History of the University of Virginia (New York, 1920), I; James Morton Smith, The Republic of Letters (New York, 1995), III, 1883–1951.

30. Robert McDonald, “Thomas Jefferson’s Image in America, 1809–1826” (master’s thesis, Oxford University, 1997). Cf. Bruce, University of Virginia, II, 300.

Originally he had imagined it drawing students from all over the Union, but after his political vision narrowed during the Missouri controversy, his plans for the university changed too. In the end, he conceived it as a bastion of southern sectionalism.31

The University of Virginia was by no means the only example of a disparity between promise and realization in American higher education during the early national period. In some states, the gap was greater. The so-called University of the State of New York had been created in 1784 as part of a grandiose plan intended to coordinate all levels of education, primary, secondary, and higher, in the state. In practice, however, this “University” exerted little control over the activities nominally subject to it, some of them private and sectarian. Not until after World War II did New York actually create a state university that would engage directly in teaching and research. The contrast between dream and reality appears again in the case of Michigan. Augustus Woodward, whom Jefferson appointed chief justice of Michigan Territory, projected the “University of Michigania.” However, for a long time the state only implemented primary and secondary levels of instruction; the Ann Arbor campus did not open until 1841, although the present University of Michigan proudly declares the date of its founding to be 1817. Finally, one might note the 1816 constitution of Indiana, which called for “a general system of education, ascending in regular gradation from township schools to state university, wherein tuition shall be gratis, and equally open to all.” It took more than thirty years for the Indiana legislature to begin to implement this promise. In the meantime, a Presbyterian seminary-turned-college operated in Bloomington.32

At the time of independence the United States contained nine colleges, all with religious connections. The status these colleges would enjoy in the republic achieved definition only gradually. At first they seemed “mixed corporations,” privately owned but subsidized in return for serving public functions, like some banks and turnpikes of the time. The instability of such a status appeared in the Dartmouth College case of 1819. This lawsuit originated in a dispute between the president of the college and the trustees. Both sides were Federalist and Calvinist, but a majority of the trustees supported organized revivals and novel moral reforms like temperance. The president of the college had no sympathy

31. Thomas Jefferson to James Breckinridge, Feb. 15, 1821, TJ: Writings, 1452; Peterson, Thomas Jefferson, 981.

32. Lawrence Cremin, American Education, The National Experience (New York, 1980), 150–53, 160–63, 171–72.

with this program, and the trustees dismissed him. The Republican-controlled state legislature intervened on the side of the president, trying to make a mixed public-private institution more responsive to religious diversity. But the trustees resisted. Daniel Webster, a Dartmouth alumnus, took their case before the U.S. Supreme Court, arguing that the state legislature had no business tampering with Dartmouth’s royal charter. Webster won, and by his victory he set Dartmouth on a course of transformation from a mixed public-private institution into a completely private college.33

If the New Hampshire Republicans lost their battle against the Dartmouth trustees, they won the war on another front. The same Jeffersonian legislature that tried to alter the Dartmouth charter took advantage of the division among Federalists to strip the Congregational Church of its favored status in New Hampshire. Republican secularists allied with Baptists and other dissenters to pass what they called a “toleration act” that (in effect, if not in theory) disestablished religion in New Hampshire.34 The disestablishment of the New England state churches foreshadowed the disestablishment of what we call the Ivy League colleges, though the separation of college and state occurred more gradually. Eventually all but one of the colonial foundations became private and, if they did not need to fear for their autonomy, neither could they look to their state governments for financial assistance.35 The College of William and Mary in Virginia, alone among the nine colleges predating independence, ended up a state institution. The secularization of the colonial colleges is another story, one that takes place after the Civil War.

The most successful example of a state-founded, state-supported venture in higher education in the early national period was South Carolina College, founded in 1801. Although roiled by some of the same early problems with discipline as the University of Virginia, the college surmounted them to become the only institution of higher learning in the United States generously supported by annual legislative appropriations.

33. See Steven Novak, “The College in the Dartmouth College Case,” New England Quarterly 47 (1974), 550–63; Donald Cole, Jacksonian Democracy in New Hampshire (Cambridge, Mass., 1970), 30–41; Lynn Turner, The Ninth State: New Hampshire’s Formative Years (Chapel Hill, 1983), 334–43.

34. Turner, Ninth State, 352–56.

35. See John Whitehead, The Separation of College and State (New Haven, 1973), 53–88. The nine colleges predating independence are Harvard, William and Mary, Yale, Princeton, Columbia (originally King’s College), Rutgers (originally Queen’s College), Dartmouth, Brown, and Pennsylvania. All save Rutgers and William and Mary constitute, along with Cornell, the modern Ivy League.

Since the state did not support public schools, they did not compete with the college for funds.36 Thomas Cooper, an expatriate Englishman, accepted the presidency of South Carolina College in 1821. Cooper combined proslavery politics with anti-clericalism; Jefferson declared him “the greatest man in America, in the powers of mind,” and had tried desperately to recruit him to head the University of Virginia.37 In South Carolina, Cooper won popularity with his ardent state-rights rhetoric during the nullification crisis, only to lose it soon afterwards by his tactless denunciations of Christianity. Under fire from a combination of Presbyterian clergy and political Unionists, Cooper found it necessary to resign in 1834. The one example of successful state-sponsored higher education in the country also illustrated the unacceptability of state-sponsored secularism.38

Disestablishment did not dismay New England’s Congregationalists. Looking for ways to reassert their influence, they founded educational institutions. Yankees moving west created a host of Congregationalist colleges across their band of settlement, including Western Reserve University and Oberlin College in Ohio, Illinois College, Beloit College in Wisconsin, and Grinnell College in Iowa. Some of these institutions were in effect daughter colleges of Yale, founded by Yale graduates and imitating the Yale curriculum.39 But in the enthusiasm of the Second Great Awakening, the denominations that had never been established proved even more prolific in founding colleges than did Congregationalists and Episcopalians. By 1848, the Presbyterians had founded the most colleges (twenty-five), followed by Methodists and Baptists (with fifteen each), Congregationalists (fourteen), and Episcopalians (seven). Presbyterian Princeton had an academic empire in the South comparable to Yale’s in the North.40 Since denominational affiliation mattered little to the college curriculum in most cases, student bodies typically included youths from the area across denominational lines. These numerous little colleges were serving the purposes of their local communities, not just their

36. Michael Sugrue, “South Carolina College, the Defense of Slavery, and the Development of Secessionist Politics,” History of Higher Education Annual 14 (1994): 39–71.

37. Jefferson to Joseph Cabell, quoted in Robert P. Forbes, “Slavery and the Evangelical Enlightenment,” in Religion and the Antebellum Debate over Slavery, ed. John R. McKivigan and Mitchell Snay (Athens, Ga., 2001), 88.

38. Daniel Hollis, South Carolina College (Columbia, S.C., 1951), 74–119.

39. The classic account is Richard Power, “A Crusade to Extend Yankee Culture,” New England Quarterly 13 (1940), 638–53.

40. Statistics based on Donald Tewksbury, The Founding of American Colleges and Universities Before the Civil War (1932; New York, 1972), 32–46. See also Mark Noll, Princeton and the Republic, 1768–1822 (Princeton, 1989).

Table 4
Some American Institutions of Higher Education Founded Before 1848

Institutions marked with an asterisk admitted women before 1848. Some institutions that began as colleges later became universities.

Institution | Location | Religious affiliation
Allegheny | Meadville, Pa.
Amherst | Amherst, Mass.
Beloit | Beloit, Wisc.
Bethel | McKenzie, Tenn.
Brown | Providence, R.I.
Bucknell | Lewisburg, Pa.
Cincinnati, Univ. of | Cincinnati, Ohio
Colby | Waterville, Maine
Colgate | Hamilton, N.Y.
Charleston, Coll. of | Charleston, S.C.
Columbia (orig. King's College) | New York, N.Y.
Cumberland | Lebanon, Tenn.
Dartmouth | Hanover, N.H.
Denison | Granville, Ohio
Duke | Durham, N.C.
Earlham | Richmond, Ind.
Emory | Atlanta, Ga.
Fordham | Fordham, N.Y. | Roman Catholic
George Washington | Washington, D.C.
Georgetown | Washington, D.C. | Roman Catholic
Georgia, Univ. of | Athens, Ga.
Gettysburg | Gettysburg, Pa.
Greensboro Female* | Greensboro, N.C.
Grinnell | Grinnell, Iowa
Hampden-Sydney | Hampden-Sydney, Va.
Harvard | Cambridge, Mass. | Congregational; after 1805 Unitarian
Holy Cross | Worcester, Mass. | Roman Catholic
Illinois College | Jacksonville, Ill.
Kenyon | Gambier, Ohio
Lafayette | Easton, Pa.
Louisiana, Univ. of | New Orleans, La.
Marietta | Marietta, Ohio
Maryland, Univ. of | Baltimore, Md.
Maryville | Maryville, Tenn.
Miami Univ. | Oxford, Ohio
Mississippi, Univ. of | Oxford, Miss.
Missouri, Univ. of | Columbia, Mo.
Mount Holyoke* | South Hadley, Mass.
Mount Union | Alliance, Ohio
New York Univ. | New York, N.Y.
North Carolina, Univ. of | Chapel Hill, N.C.
Notre Dame | Notre Dame, Ind. | Roman Catholic
Oberlin* | Oberlin, Ohio
Ohio Univ. | Athens, Ohio
Pennsylvania, Univ. of | Philadelphia, Pa.
Princeton (orig. the College of New Jersey) | Princeton, N.J.
Rutgers (orig. Queen's College) | New Brunswick, N.J. | Dutch Reformed
St. Louis Univ. | St. Louis, Mo. | Roman Catholic
Tennessee, Univ. of | Knoxville, Tenn.
Transylvania | Lexington, Ky. | Disciples of Christ
Trinity | Hartford, Conn.
U.S. Military Academy | West Point, N.Y.
U.S. Naval Academy | Annapolis, Md.
Union College | Schenectady, N.Y. | Presbyterian with Congregational
Vermont, Univ. of | Burlington, Vt.
Villanova | Villanova, Pa. | Roman Catholic
Wake Forest Coll. | Wake Forest, N.C.
Wesleyan Univ. | Middletown, Conn. | Wesleyan Methodist
Western Reserve Univ. | Cleveland, Ohio
Wheaton* | Norton, Mass.
William and Mary | Williamsburg, Va.
Williams | Williamstown, Mass.
Wisconsin, Univ. of | Madison, Wisc.
Xavier | Cincinnati, Ohio | Roman Catholic
Yale | New Haven, Conn.
particular sects. But they existed on the margin of financial viability and frequently succumbed to the same economic downturns that claimed business and financial enterprises.41

41. See David Potts, “American Colleges in the Nineteenth Century,” History of Education Quarterly 11 (1971): 363–80; idem, “‘College Enthusiasm!’ as Public Response, 1800–1860,” Harvard Educational Review 47 (1977): 28–42.

In 1815, thirty-three colleges existed in the United States; by 1835, sixty-eight; and by 1848, there were 113. Sixteen of these were state institutions, which by then were generally distinguishable from private religious ones. Eighty-eight were Protestant denominational colleges; the remaining nine, Roman Catholic.42 Catholic educational initiatives in the United States were largely the work of religious orders. They included Georgetown and Fordham (both Jesuit), Notre Dame (Order of the Holy Cross), and Villanova (Augustinian). All these were founded in advance of heavy Catholic immigration; they aimed initially at proselytizing, not simply catering to an existing Catholic population. Protestants like Lyman Beecher correctly interpreted the Catholic institutions as an ideological challenge.

American higher education responded to pressure for vocational utility at the graduate level. The early national period witnessed the foundation of professional schools, starting with medicine, law, and divinity. At the undergraduate level, however, Protestant, Catholic, and public colleges all emphasized a liberal education—that is, one designed to develop the student’s intellectual powers rather than to provide vocational training. It was termed “liberal” because designed to be liberating and thus suitable for a free man (liber meaning “free” in Latin).43 The Yale Report of 1828, issued by the faculty of what was then the country’s largest and most influential institution of higher learning, defended the traditional conception of a liberal education against its critics. The curriculum centered on the classics, particularly Latin. Advocates of curricular innovation succeeded in introducing modern history, modern literature, and modern foreign languages, but classics remained the core discipline, along with some mathematics and science. Colleges generally required some Latin for entrance, which in turn influenced secondary school curricula. Undergraduates had few elective subjects. Classical study inculcated intellectual discipline and provided those who pursued it, the world over, with a common frame of reference. The use of Latin marked one as educated and gave weight to one’s arguments. Physicians wrote their prescriptions in Latin; lawyers sprinkled their arguments with Latin phrases. American statesmen defended their principles of “classical republicanism” with arguments drawn from Aristotle, Publius, and Cicero. Sculptors flattered

42. These numbers could vary slightly because of the existence of evanescent and marginal institutions. See Tewksbury, Founding, 32–46.

43. OED, s.v. “liberal.”

public figures by portraying them in togas. Congress met in a Roman-style Capitol.44

A distinctive feature of antebellum American colleges was the course on moral philosophy, typically taught to seniors by the president of the college. The capstone of an undergraduate education, it treated not only the branch of philosophy we call ethical theory but also psychology and all the other social sciences, approached from a normative point of view. The dominant school of thought was that of the Scottish philosophers of “common sense,” Thomas Reid and Dugald Stewart, plus Adam Smith (whom we remember mostly for his work in economics, then a branch of moral philosophy). These philosophers were valued for their rebuttal to the atheistic skepticism of David Hume, their reconciliation of science with religion, and their insistence on the objective validity of moral principles. They sorted human nature into different “faculties” and explained the difficult, yet important, task of subordinating the instinctive and emotional faculties to the higher ones of reason and the moral sense. Moral philosophy as taught in the colleges reflected American middle-class culture’s preoccupation with character and self-discipline. This course, very similar at all public and Protestant colleges, substituted for the study of sectarian religious doctrine, which moved into the professional divinity schools and seminaries for training ministers.45

The colonial Puritans had included educational provision for girls as well as boys in their primary schools, and in the early nineteenth-century United States, secondary education opened up to girls with little controversy. The finest girls’ secondary school, Troy Female Seminary, founded in 1821 by Emma Willard, offered college-level courses in history and science. By the middle of the century, the United States had become the first country in the world where the literacy rate of females equaled that of males. At least as startling, the first few higher educational opportunities appeared for women. Religious motivations remained important in this, illustrated by the Calvinism of Mount Holyoke College, the evangelical abolitionism of coeducational Oberlin, and the Wesleyan Methodism of

44. Jack Lane, “The Yale Report of 1828,” History of Education Quarterly 27 (1987): 325–38; Daniel Howe, “Classical Education and Political Culture in Nineteenth-Century America,” Intellectual History Newsletter 5 (Spring 1983): 9–14; Carl Richard, The Founders and the Classics (Cambridge, Mass., 1994).

45. See D. H. Meyer, The Instructed Conscience (Philadelphia, 1972); Daniel Howe, The Unitarian Conscience: Harvard Moral Philosophy, 1805–1861, 2nd ed. (Middletown, Conn., 1988); Allen Guelzo, “The Science of Duty,” in Evangelicals and Science in Historical Perspective, ed. David Livingstone et al. (New York, 1999), 267–89.

Georgia Female College, all of them founded in the 1830s. No individual did more to apply the Second Great Awakening to women’s education than Catharine Beecher, eldest daughter of the evangelist Lyman Beecher.46

The United States pioneered higher education for women, and by 1880 one-third of all American students enrolled in higher education were female, a percentage without parallel elsewhere in the world.47 Scholars have often debated how far American history is “exceptional” by comparison with the rest of the world. No better example of American exceptionalism exists than higher education for women. Through the efforts of Christian missionaries, the American example of higher education for women has influenced many other countries.


As this chapter is written in the early twenty-first century, the hypothesis that the universe reflects intelligent design has provoked a bitter debate in the United States. How very different was the intellectual world of the early nineteenth century! Then, virtually everyone believed in intelligent design. Faith in the rational design of the universe underlay the world-view of the Enlightenment, shared by Isaac Newton, John Locke, and the American Founding Fathers. Even the outspoken critics of Christianity embraced not atheism but deism, that is, belief in an impersonal, remote deity who had created the universe and designed it so perfectly that it ran along of its own accord, following natural laws without need for further divine intervention. The commonly used expression “the book of nature” referred to the universal practice of viewing nature as a revelation of God’s power and wisdom. Christians were fond of saying that they accepted two divine revelations: the Bible and the book of nature. For deists like Thomas Paine, the book of nature alone sufficed, rendering what he called the “fables” of the Bible superfluous. The desire to demonstrate the glory of God, whether deist or—more commonly—Christian, constituted one of the principal motivations for scientific activity in the early republic, along with national pride, the hope for useful applications, and, of course, the joy of science itself.48

46. Barbara Solomon, In the Company of Educated Women (New Haven, 1985); Kathryn Sklar, “The Founding of Mount Holyoke College,” in Women of America, ed. Carol Berkin and Mary Beth Norton (Boston, 1979), 177–201; idem, Catharine Beecher (New Haven, 1973).

47. Solomon, In the Company of Educated Women, 63.

48. I discuss belief in intelligent design during this period more fully in Unitarian Conscience, 69–82.

One such demonstration of divine purpose appeared in the widely used textbook Natural Theology (1805 and ten subsequent American editions by 1841), written by the English clergyman William Paley. Paley presented innumerable cases to illustrate the teleological argument for the existence of God (that is, the argument that we find apparent design in nature and can infer from this a purposeful designer). For example, Paley argued, the physiology of the human eye shows as much design as a human-made telescope.49 Though a popularizer, Paley did not misrepresent the attitude of most scientists of his time. Natural theology, the study of the existence and attributes of God as demonstrated from the nature He created, was widely studied, with Paley’s book used as its text. A synthesis of the scientific revolution with Protestant Christianity viewed nature as a law-bound system of matter in motion, yet also as a divinely constructed stage for human moral activity. The psalmist had proclaimed, “The heavens declare the glory of God, and the firmament showeth his handiwork.” The influential Benjamin Silliman, professor of chemistry and natural history at Yale from 1802 to 1853, affirmed that science tells us “the thoughts of God.”50

Silliman and other leading American scientists like Edward Hitchcock and James Dwight Dana harmonized their science not only with intelligent design but also with scripture. They insisted that neither geology nor the fossils of extinct animals contradicted the book of Genesis. They interpreted the “days” of creation as representing eons of time and pointed out that Genesis had been written for an ancient audience, with the purpose of teaching religion, and not to instruct modern people in scientific particulars. Scientists varied in the importance they attached to identifying approximate parallels between science and scripture, such as comparing geologic evidence of past inundation with Noah’s flood. The most widely held theory explaining the emergence of different species over time, that of the great French biologist Georges Cuvier, held that God had performed successive acts of special creation. When the Scotsman Robert Chambers published Vestiges of the Natural History of Creation in

49. William Paley, Natural Theology; or, Evidences of the Existence and Attributes of the Deity, Collected from the Appearances of Nature (Boston, Mass., 1831), 19–38. For context, see John Hedley Brooke, Science and Religion (Cambridge, Eng., 1991), 192–225; D. L. LeMahieu, The Mind of William Paley (Lincoln, Neb., 1976), 153–83.

50. Quotation from John C. Greene, “Protestantism, Science, and the American Enlightenment,” in Benjamin Silliman and His Circle, ed. Leonard Wilson (New York, 1979), 19; idem, The Death of Adam (New York, 1961), 23. See also Chandos Brown, Benjamin Silliman (Princeton, 1989), the first volume of a projected two.

1844, anticipating Darwin’s theory of evolution, he argued that evolution, too, was compatible with intelligent design. The scientific community rejected this theory of evolution until Charles Darwin supplied natural selection as a mechanism to explain how it worked. But Louis Agassiz of Harvard, the renowned discoverer of past ice ages, defended the theory of special creation even after Darwin published his Origin of Species in 1859. His Harvard colleague, the botanist Asa Gray, led the American fight to accept the theory of evolution, but argued (contrary to Darwin’s own opinion) for evolution’s compatibility with intelligent design.51

The early nineteenth century distinguished two branches of science: natural history (biology, geology, and anthropology, all then considered mainly descriptive) and natural philosophy (physics, chemistry, and astronomy, more mathematical in nature). Scientific activity in the United States emphasized natural history, the collection of information about flora, fauna, fossils, and rocks. Exploring parties like those of Lewis and Clark in 1804–6, Army Major Stephen Long across the Great Plains in 1819–23, and Navy Lieutenant Charles Wilkes through the Pacific in 1838–42 contributed to this knowledge. Many scientists were actually amateurs who earned their living in other ways, frequently as clergymen, physicians, or officers in the armed forces. Science figured in the standard curriculum of both secondary and higher education, and the subject enjoyed a broad base of interest in the middle class. Science, like technology, benefited from the improving literacy and numeracy of nineteenth-century Americans. Popular magazines carried articles encouraging interest in the natural history of the New World. The perceived harmony between religion and science worked to their mutual advantage with the public. As the industrial revolution reflected the ingenuity of innumerable artisans, so early modern natural history profited from the dedicated curiosity of many nonprofessional observers and collectors— women as well as men.52

51. John C. Greene, “Science and Religion,” in The Rise of Adventism, ed. Edwin Gaustad (New York, 1974), 50–69; A. Hunter Dupree, Asa Gray (Cambridge, Mass., 1959), 288–303, 358–83; idem, “Christianity and the Scientific Community in the Age of Darwin,” in God and Nature, ed. David Lindberg and Ronald Numbers (Berkeley, 1986), 351–68.

52. See Margaret Welch, The Book of Nature: Natural History in the United States, 1820–1875 (Boston, 1998); Sally Kohlstedt, “Education for Science in Nineteenth-Century America,” in The Scientific Enterprise in America, ed. Ronald Numbers and Charles Rosenberg (Chicago, 1996), 61–82; John C. Greene, American Science in the Age of Jefferson (Ames, Iowa, 1984).

The careers of several prominent figures in American natural science illustrate the unspecialized quality of their intellectual life and times. Henry Schoolcraft, son of a farmer and glass manufacturer, never went to college. As a government Indian agent with the Ojibwa (also called Chippewa), he described his hosts’ language, folklore, and customs with the aid of his wife, who was half Ojibwa and half Irish. In this way he became one of the earliest anthropologists to live with the people he studied. The immigrant Constantine Rafinesque combined prolific identification of new plants and flowers with the study of the Mound Builders and other Amerindian peoples. Some of the leading figures in American natural history are remembered today as both scientists and artists, such as John James Audubon and Charles Willson Peale. Books like Audubon’s Birds of America, Jedidiah Morse’s American Universal Geography, and Alexander Wilson’s American Ornithology, along with Peale’s museum in Philadelphia, displaying its celebrated mastodon skeleton, brought natural history to a broad public. Only two American women of this period considered themselves professional scientists: Emma Willard, who taught mathematics and natural philosophy at Troy and published on physiology, and Maria Mitchell, who discovered a comet in 1847 and later became professor of astronomy at Vassar.53

It was an age when scientists, like other scholars, placed a premium on organizing, classifying, and presenting their discoveries in readily intelligible form. Taxonomy (the classification of biological species) was understood as reflecting the rationality of the Creator rather than a process of natural selection. The exploration of the globe went hand in hand with improvements in cartography (mapmaking) as well as the discovery of species and varieties. The introduction of the metric system rendered measurements uniform. The organization of scientific data into statistics and graphs accompanied the development of accounting and bookkeeping. Dictionaries, encyclopedias, and law codes excited the imagination. Just as Americans formed voluntary organizations and publications to promote religious, benevolent, and political causes, they also formed scientific ones. The American Philosophical Society and the American Academy of Arts and Sciences had been founded before 1815; they were joined over the years by a host of societies, institutes, and lyceums, often local or regional in nature, and frequently concerned with the presentation of science to the lay public. Silliman’s American Journal of Science appeared in 1818, devoted to the publication of new research for a professional

53. Nina Baym, American Women of Letters and the Nineteenth-Century Sciences (New Brunswick, N.J., 2002).

audience. In short, scientific activity reflected the dramatic improvements taking place in communications and information retrieval, as well as the increased public interest in accessing information.54

The antebellum federal government played a somewhat larger role in scientific research than in education, as its three great exploring expeditions demonstrate. Another federal enterprise producing much scientific knowledge was the U.S. Coast Survey, which charted the oceanography of the expanding American empire. Conceived during the Jefferson administration, it was reinvigorated during Jackson’s. Designed to facilitate ocean commerce, the Coast Survey reflected the interest of the Jeffersonians and Jacksonians in international trade. Even John Quincy Adams’s much-maligned call for a federal astronomical observatory was implemented before long. The Jackson administration found money within the Navy Department to construct a small observatory in 1834 as an aid to celestial navigation, and the first Whig Congress passed an appropriation for a larger one in 1842, to Adams’s delight. The U.S. Naval Observatory remains today in Washington, D.C.55

But one of the most significant federal scientific undertakings, the Smithsonian Institution, was thrust upon the government from outside. A wealthy English scientist named James Smithson willed his estate to the U.S. government to found “an establishment for the increase and diffusion of knowledge.” President Jackson denied that he had authority to accept the gift and referred the matter to Congress, where Calhoun opposed it as unconstitutional. After Congress agreed to take the money, the bequest came across the Atlantic in a packet ship laden with half a million dollars’ worth of gold coins, arriving in New York harbor on August 28, 1838; a Democratic administration would not accept mere paper. A dozen years of further wrangling ensued over what to do with the endowment, and not until 1846 did Congress create the Smithsonian Institution, with a museum, laboratory, library, and art gallery. Among those in Congress deserving credit for the outcome were Benjamin Tappan of Ohio (brother of the abolitionists Lewis and Arthur), Robert Dale Owen of Indiana (son of Robert Owen), and John Quincy Adams.56

54. Alexandra Oleson and Sanborn Brown, eds., The Pursuit of Knowledge in the Early American Republic (Baltimore, 1976); Daniel Headrick, When Information Came of Age (New York, 2001).

55. A. Hunter Dupree, Science in the Federal Government, 2nd ed. (Baltimore, 1985), 29–33, 62–63.

56. David Madsen, The National University (Detroit, 1966), 60; William Rhees, ed., The Smithsonian Institution: Documents (Washington, 1901), I.

The new Smithsonian was fortunate to get as its secretary (chief executive officer) America’s leading physicist, Joseph Henry, professor of natural philosophy at the College of New Jersey (later Princeton University) since 1832. Henry’s researches into electromagnetism had already helped prepare the way for both the electric motor and Morse’s telegraph, though Henry had not realized how close his “philosophical toys” were to marketable applications and had taken out no patent. A devout Old School Presbyterian, Henry believed in the intelligent design of the universe and in the compatibility of reason with revelation; he enjoyed a close friendship with the conservative Calvinist theologian Charles Hodge of Princeton Theological Seminary. A capable administrator, Henry concentrated the Smithsonian’s endeavors on scientific research and publication and turned its book collection over to the Library of Congress. Together with Alexander Dallas Bache, the head of the Coast Survey, Henry led in the formation of a self-conscious American scientific community and founded the American Association for the Advancement of Science in 1848, modeled on the British Association for the Advancement of Science.57

The young American republic enjoyed a Protestant Enlightenment that bestowed an enthusiastic religious endorsement upon scientific knowledge, popular education, humanitarianism, and democracy. The most widespread form of Christian millennialism added faith in progress to this list. The spread of literacy, discoveries in science and technology, even a rising standard of living, could all be interpreted—and were—as evidences of the approach of Christ’s Second Coming and the messianic age foretold by the prophets, near at hand.58


Improvements in travel and transportation had their downside: the spread of contagious disease. Endemic in the Ganges River Valley of India, cholera moved along trade routes in the early nineteenth century to Central Asia, Russia, and across Europe from east to west. In the summer of 1832, it crossed the Atlantic with immigrants to Canada and the United States. Cholera hit the great port cities of New York and New Orleans

57. Theodore Bozeman, Protestants in an Age of Science (Chapel Hill, 1977) 41, 201; Albert Moyer, Joseph Henry (Washington, 1997), 66–77; quotation on 73. On Bache, see Hugh Slotten, Patronage, Practice, and the Culture of American Science (Cambridge, Eng., 1994).

58. James Moorhead, World Without End: Mainstream American Protestant Visions of the Last Things (Bloomington, 1999), 2–9.

hardest, but the disease spread along river and canal routes, exacting a heavy toll wherever crowded and unsanitary conditions (polluted water in particular) prevailed. Of course, the poor suffered the most.59

In response to the epidemic, the Senate passed a resolution introduced by Henry Clay calling upon the president to declare a day of national “prayer, fasting, and humiliation.” Jackson, following the example of Jefferson rather than that of Washington, Madison, and the elder Adams, decided that compliance would violate the separation of church and state. The Evangelical United Front supported the resolution, but some denominations backed the president, including Roman Catholics and Antimission Baptists. To Jackson’s relief, the resolution did not pass the House of Representatives. Most churches observed the day anyway, on their own authority, and twelve state governments endorsed it. The political issue remained alive, and a partisan one. When another cholera epidemic occurred in 1848–49 and both houses requested such a day, Whig president Zachary Taylor issued the proclamation. Whatever the effect of the prayers, at least they did no harm to the victims of the disease—more than one can say for the remedies of the physicians: bloodletting and massive doses of poisonous mercury.60

Of all major branches of science in this period, possibly the least well developed was medicine. Vaccination against smallpox constituted one of the few valid medical interventions practiced. The germ theory of disease had been suggested (under the name “animalcular theory”) but remained untested, an eccentric speculation. That rotting garbage and excrement fostered disease had long been recognized, blame focusing on their evil-smelling fumes (“miasma”). Recurrent epidemics prompted cities to start to improve sanitation provisions, but they did not act decisively until much later in the nineteenth century. Physicians practiced neither asepsis nor antisepsis and often infected a patient with the disease of the last one they had seen. In 1843, Oliver Wendell Holmes the elder, professor of medicine at Harvard, published a paper showing that unhygienic doctors bore grave responsibility for spreading puerperal fever among women in childbirth.61

59. Charles Rosenberg, The Cholera Years (Chicago, 1987); Sheldon Watts, Epidemics and History (New Haven, 1997), 167–212.

60. Rosenberg, Cholera Years, 47–52, 66, 121–22; Adam Jortner, “Cholera, Christ, and Jackson,” JER 27 (2007): 233–64.

61. Joel Mokyr, Gifts of Athena (Princeton, 2002), 94. So reluctant was the medical community to accept Holmes’s findings that he republished the paper in 1855. It appears in his collected Medical Essays (Boston, 1889), 103–72.

Physicians like Jacob Bigelow of Harvard Medical School, looking for materia medica (medicinal drugs), classified large numbers of plants and herbs for the benefit of natural history, but in practice the pharmacopoeia chiefly consisted of laxatives and opiates. Holmes remarked with a candor uncommon among his profession that if the entire materia medica of his time could be thrown into the sea, it would be “all the better for mankind, and all the worse for the fishes.” Physicians acted as their own pharmacists, selling the medicines they prescribed. The invention of the stethoscope in France in 1819 helped diagnosis, but there was little doctors could do to help even a correctly diagnosed patient. Few therapies of the day had any efficacy beyond symptomatic relief. “Heroic,” that is, drastic, measures of bloodletting, purging, and blistering found favor with physicians for a wide variety of diseases. They carried the endorsement of Benjamin Rush, still America’s leading medical authority long after his death in 1813, despite criticism from Bigelow in 1835. One of the few therapeutic improvements was the isolation of quinine from cinchona bark in 1820 and its gradual application to treating malaria.62

Reacting to the futility of scientific medicine, many patients resorted to a variety of alternatives: homeopathy, hydropathy, Thomsonianism, Grahamism, phrenology, spiritualism, and folk remedies (Euro-American, African American, and Native American).63 To defend their turf, a group of leading orthodox physicians founded the American Medical Association in 1847. But, like religion, American popular medicine reflected the free marketplace of ideas. Though unorthodox practitioners could be unscrupulous charlatans, some of them had sounder ideas and did less harm than the M.D.s. The unorthodox included Sylvester Graham, a Presbyterian minister who combined millennial preaching with advice on health. He advocated temperance, vegetarianism, and avoiding tobacco, heavily salted food, and “stimulating beverages” like coffee. He claimed that most diseases could be prevented by a wholesome diet, exercise, and cleanliness, both personal and public. Graham’s teachings made virtues of ordinary Americans’ necessities. With heating water so inconvenient that it discouraged bathing, Graham recommended washing in cold water. With most households having a scarcity of beds, he endorsed sleeping on a hard surface. With finely ground flour expensive, he promoted the coarse-grained flour of his famous Graham cracker. With many women hoping to limit the size of their families, he cautioned men that frequent sex would debilitate them. Graham’s lectures and writings on physiology exerted influence and provoked controversy throughout the 1830s and ’40s. The Seventh-day Adventists perpetuated Graham’s dietary program after the Civil War; one of them, John Kellogg, invented corn flakes.64

62. John S. Haller, American Medicine in Transition (Urbana, Ill., 1981), 36–99; Holmes, Medical Essays, 203; Alex Berman, “The Heroic Approach in 19th-Century Therapeutics,” in Sickness and Health in America, ed. Judith Leavitt and Ronald Numbers (Madison, Wisc., 1978), 77–86.

63. James Cassedy, Medicine in America (Baltimore, 1991), 33–39; John Duffy, From Humors to Medical Science (Urbana, Ill., 1993), 80–94. On African American folk medicine, see Sharla Fett, Working Cures (Chapel Hill, 2002).

As the country grew, medical schools multiplied but remained small, unlicensed, and sometimes poorly equipped. Many practitioners never attended one anyway, but learned their profession through apprenticeship. (According to one estimate, in 1835 only 20 percent of Ohio physicians held a medical degree.) Gross anatomy was the aspect of medicine then best understood; yet, facing a chronic shortage of cadavers, anatomists made themselves unpopular by grave-robbing. Medical students who wanted the best training went overseas.65 The only recognized medical specialty was surgery, long regarded as an occupation altogether different from that of the physician; most doctors engaged in general practice.

The benevolent causes of the period included medical philanthropies like hospitals, insane asylums, and care for the deaf and blind. But in the absence of effective therapies, hospitals did not do their patients much good. They generally treated only the poor and recruited convalescents to “nurse” those sicker than themselves; professional schools for training nurses did not yet exist. People who could afford to pay for treatment usually received it at home, with care from family members between the doctor’s visits. The medical care of slaves reflected their masters’ financial stake in their productive and reproductive capacities but suffered from their often unsanitary living conditions and, of course, from the poor state of therapeutic knowledge. Sometimes physicians experimented on slave patients in ways they would not have done on free white ones.66

Not only cholera but other infectious diseases like typhoid spread more readily than in earlier years, as more people traveled and population density increased, especially in unhygienic commercial centers. Schools transmitted disease as well as literacy to children. Even rural families came under increased risk when they moved into the malarial lowlands of the Mississippi and Ohio river basins. Endemic contagions like tuberculosis (then called “consumption”) and malaria (“ague”) actually constituted a graver health threat than startling unfamiliar epidemics like those of cholera and yellow fever.67 With medical science unable to understand, prevent, or cure most of these illnesses, the health of the nation deteriorated during the first half of the nineteenth century. Between 1815 and 1845 the average height of native-born white males dropped from 173 to 171.6 centimeters; life expectancy at age 10, from 52 to 47 years. Increasing democracy and economic productivity, even rising real wages, did not offset the spread of contagious diseases, which stunted the growth of young people even if they survived. Economic development outran medical science, and those who lived through this era paid a real physical price.68

64. See Robert Abzug, Cosmos Crumbling (New York, 1994), 163–82; Jayme Sokolow, Eros and Modernization (London, 1983), 161–68; Stephen Nissenbaum, Sylvester Graham and Health Reform (Westport, Conn., 1980), 152–54.

65. William Rothstein, American Medical Schools and the Practice of Medicine (New York, 1987), 15–53; Ohio data on 50.

66. For hospitals, see Charles Rosenberg, The Care of Strangers (New York, 1987), 15–46. For slaves, see Marie Jenkins Schwartz, Birthing a Slave (Cambridge, Mass., 2006); Deborah McGregor, From Midwives to Medicine (New Brunswick, N.J., 1998), 33–68.

Dentistry provided a bright spot in the generally gloomy picture. Although the transportation revolution had harmful consequences in contagious disease, the spread of commercial society, advertisements in the printed media, and the widespread aspiration to a better life stimulated desire for dental care and products. Dental fillings, extractions, and prostheses (false teeth) improved in quality in response to consumer demand and competition among providers. The expanding middle class adopted tooth brushing, a major step in the improvement of health. A New Orleans dentist named Levi Parmly recommended his patients floss their teeth with silk thread as early as 1815, though flossing did not become common until after the invention of nylon in the twentieth century. In Europe dentistry had often been considered a trade rather than a profession, but in the United States its status improved. Leading dentists held M.D. degrees. In 1840 the first American dental school opened in Baltimore, and within a generation American dentistry had become recognized as the best in the world.69

67. Gerald Grob, The Deadly Truth (Cambridge, Mass., 2002), 96–101, 108–15, 121–32; Thomas Cuff, The Hidden Cost of Economic Development (Burlington, Vt., 2005), xv.

68. Richard Steckel, “Stature and Living Standards in the United States,” in American Economic Growth and Standards of Living Before the Civil War, ed. Robert Gallman and John Wallis (Chicago, 1992), 265–310; Robert Fogel, The Fourth Great Awakening (Chicago, 2000), 139–51, graphs on p. 141; Robert Fogel, The Escape from Hunger and Premature Death (Cambridge, Eng., 2004), 35.

69. Malvin Ring, History of Dentistry (New York, 1992), 197–228; Suellen Hoy, Chasing Dirt: The American Pursuit of Cleanliness (New York, 1995), 5, 89; Bridget Travers, ed., The World of Invention (New York, 1994), 635.

One major medical innovation did occur in the United States: the demonstration of anesthesia in 1846. Until then, only alcohol and versions of opium mitigated the agony of surgery. In the absence of anesthesia, patients were reluctant to undergo operations for any but the most serious of reasons, limiting surgeons’ opportunities to learn new procedures. Nevertheless, amputation of limbs was tragically common, because in unsanitary surroundings, wounded extremities often developed septicemia or gangrene. Without anesthesia, surgeons placed a great premium on getting their procedures over with quickly, although their haste increased the risk of errors. About a quarter of amputees died from shock or infection.70

On October 16, 1846, William Morton (significantly, a dentist by profession) successfully administered ether during an operation by Dr. John C. Warren for the removal of a neck tumor at the Massachusetts General Hospital in Boston. Others had been engaged in parallel research on anesthesia, including Morton’s former dental partner, Horace Wells, and a Georgia surgeon, Crawford Long. Morton’s public demonstration at one of America’s leading hospitals brought anesthesia international attention, but his efforts to obtain patent rights brought him only litigation and controversy—especially with Wells and a Harvard chemistry professor named Charles T. Jackson, who had provided advice. A farmer’s son dreaming of riches and fame, Morton neglected his practice to pursue his court actions. He died twenty-two years later in embittered poverty. Meanwhile, ether, chloroform, and other varieties of anesthesia, despite justified concern about their safety, had gained applications in surgery, dentistry, and obstetrics throughout the Western world. Besides its medical impact, anesthesia stimulated philosophical and religious debate over the function of pain in human existence. With the invention of anesthesia, medical science intersected with the humanitarian reform impulse that sought to minimize the infliction of physical pain in a wide variety of contexts, including corporal punishment of schoolchildren, wives, convicts, slaves, and members of the armed forces.71

70. James Cassedy, American Medicine and Statistical Thinking (Cambridge, Mass., 1984), 87; Elaine Crane, “The Defining Force of Pain in Early America,” in Through a Glass Darkly, ed. Ronald Hoffman et al. (Chapel Hill, 1997), 370–403. Thomas Dormandy, Worst of Evils: The Fight Against Pain (New Haven, 2006) came into my hands too late for me to use.

71. G. B. Rushman et al., A Short History of Anaesthesia (Oxford, 1996), 9–19; Martin Pernick, A Calculus of Suffering (New York, 1985).


The Bible occupied an even more prominent position in discussions of morality than it did in education and science. Pre–Civil War Americans debating moral issues almost always appealed to biblical authority. This practice extended to the most divisive of all arguments over social morality, the debate over slavery. In 1837, Theodore Dwight Weld published The Bible Against Slavery. Like other abolitionists, he quoted St. Paul’s great speech in Athens, that God “hath made of one blood all nations of men for to dwell on all the face of the earth” (Acts 17:26). One did not enslave kinfolk. But the defenders of slavery answered by quoting Noah: “Cursed be Canaan; a servant of servants shall he be unto his brethren” (Genesis 9:25). In rebuttal, Weld responded that no evidence showed Africans descended from Canaan. For abolitionists like Weld, slavery clearly violated a precept of Mosaic Law that Jesus had declared one of God’s greatest commandments: “Love thy neighbor as thyself” (Leviticus 19:18; Mark 12:28–31). To this, the redoubtable Southern Baptist Thornton Stringfellow pointed out that many other passages in the Pentateuch indicate God’s Chosen People practiced chattel slavery and that God, far from issuing a blanket condemnation of the institution, prescribed legal rules for it (as in Exodus 21). Rabbi M. J. Raphall of New York City vouched for the legality of slavery under the Torah.72 Abolitionists retorted that the patriarchs practiced polygamy too, but this did not legitimate it for Christian men. When opponents of slavery appealed to the Golden Rule in Jesus’ Sermon on the Mount, proslavery writers pointed out that Paul’s Epistle to Philemon proved that the church of New Testament times, like the Israel of Old Testament times, had included slaveholders and recognized their rights.73

Although David Strauss published his Life of Jesus in Germany in 1835, on the western side of the Atlantic American Christians carried on their debates without reference to the “higher criticism” of the Bible that Strauss’s book exemplified. Nevertheless a difference marked the two sides’ use of biblical references. Southerners seized upon specific and literal textual examples, while the advocates of antislavery invoked the general tenor of the Bible, for example, that “God is no respecter of persons” (Acts 10:34). The abolitionist Angelina Grimké declared the real issue not whether Jesus had ever explicitly condemned slavery but whether one could imagine Him owning a slave.74 The debate over the scriptural status of slavery did not involve only the extremists on both sides. One of the most comprehensive exchanges on the subject occurred in a series of letters between two Baptist clerical moderates, Francis Wayland (president of Brown University and author of the most widely used American textbook on moral philosophy) and Richard Fuller (pastor of a large Baltimore congregation and a leader of the new Southern Baptist Convention).75 Who “won” the biblical debate depends on whom you ask. At the time, each side felt it had the better of the argument. Some American historians have ruled in favor of the proslavery controversialists, but most contemporary Protestant foreign observers found the antislavery side more convincing—as would most American Christians today. To Jesuit commentators in Rome, the debate demonstrated the chaotic consequences of Protestants’ lack of a single religious authority.76

72. Raphall is quoted in Mark Noll, America’s God (Oxford, 2002), 393–94.

73. A recent anthology of primary documents from the debates over slavery is A House Divided, ed. Mason Lowance (Princeton, 2003); see 63–67, 92–96. See also Stephen Haynes, Noah’s Curse (New York, 2002).

After the great debate over slavery by the Virginia legislature in 1831–32 had concluded, Thomas R. Dew, a professor at William and Mary College, published a Review of the Debate (1832) that commanded great attention throughout the South. Demonstrating the broad intellectual range of the moral philosophers of his time, he drew upon classical economics and the demography of Malthus. Dew sought to prevent further agitation of the slavery question in the southern states because it would indicate to the slaves that insurrections such as Turner’s might pay off. He concentrated his fire on the colonization proposals that had been advanced by legislators from western Virginia and constituted the most widespread version of antislavery in the South. Compensated emancipation and/or colonization would add prohibitively to the tax burden, he argued; uncompensated emancipation he dismissed as manifestly unjust. Dew did not shrink from defending slavery on economic grounds as an efficient and profitable system. Colonization programs would create a labor shortage and deprive the state of its valuable export of surplus slaves to the Southwest, he warned. Dew did not go so far as to claim slavery superior to free labor, but he included philosophical and biblical defenses of slavery in his presentation to show that it was not necessarily an immoral system. Dew belonged to a generation that readily believed in the providential identity of morality and profitability. If at some future time slavery ceased to be profitable in Virginia (an eventuality he thought quite possible), then would be the just time to reconsider emancipation. Although not flawless, Dew’s arguments hurt the cause of colonization in the South at the same time that it came under fire in the North from abolitionists. In the first half century of independence, comparatively few intellectual defenses of slavery had appeared. Dew’s skillful and wide-ranging presentation commenced a new era of boldness on the part of slavery’s defenders. Not many of them, however, followed him in emphasizing the economic case.77

74. Daniel McInerney, “The Political Gospel of Abolition,” JER 11 (1991): 371–94; Richard W. Fox, Jesus in America (San Francisco, 2004), 206. Among all U.S. biblical scholars, only Theodore Parker found any merit in Strauss’s approach; see Dean Grodzins, American Heretic (Chapel Hill, 2002), 186–90.

75. Francis Wayland and Richard Fuller, Domestic Slavery Considered as a Scriptural Institution (New York, 1847) is a tribute to the confidence of the participants in reasoned argument.

76. After an extended discussion, Elizabeth Fox-Genovese and Eugene Genovese declare the proslavery side victors in The Mind of the Master Class (Cambridge, Eng., 2005), 473–527, though they dismiss the Curse of Ham argument as weak. On foreign observers, see Noll, America’s God, 400–401, 408. An excellent brief summary of the debate is Holifield, Theology in America, 494–504.

In the early years of the republic, critics of slavery had by no means all come from the North, nor were its few defenders necessarily southerners. Indeed, the existence of opposition to slavery within the South had reassured northerners that the task of emancipation could safely be left in state hands. By the 1830s, however, debates over slavery, often conducted between clergymen and highlighting the biblical arguments, had taken on an overwhelmingly sectional character, although northern biblical scholars like Moses Stuart and Charles Hodge occasionally supplied ammunition their southern colleagues could use to effect.78 Within the South, criticism of slavery was dampened down by the severe controls imposed in reaction against the abolitionist petition campaign of 1835. Dedicated southern abolitionists like James Birney and Angelina Grimké found they had to move to the North. Defying all threats, Cassius Clay (cousin of Henry Clay) managed to stay in Kentucky and maintain an antislavery movement there. Meanwhile, the increasing world demand for cotton made slavery ever more attractive economically, and the felt need to justify the system against its outside critics all the more urgent following emancipation in the British West Indies (1833). Southern intellectuals rallied to their section’s defense. Of possible arguments on behalf of slavery, they most often employed the biblical.79

77. Thomas R. Dew, Review of the Debate in the Virginia Legislature of 1831 and 1832 (Richmond, Va., 1832); William Shade, Democratizing the Old Dominion (Charlottesville, Va., 1996), 205; John Daly, When Slavery Was Called Freedom (Lexington, Ky., 2002), 34–56.

78. See Kenneth Minkema and Harry Stout, “The Edwardsean Tradition and the Antislavery Debate,” JAH 92 (2005): 47–74.

The evangelical churches in the South had been a source of antislavery agitation in the eighteenth century. As late as 1818 the nationwide Presbyterian Church had declared slavery “utterly inconsistent with the law of God,” without any southern objection voiced. But southern evangelicals gradually made their peace with their section’s “peculiar institution” as the price for continuing undisturbed with their preaching and voluntary activities. By the 1830s, their clergy typically endorsed the biblical warrants for practicing slavery. They directed their reform efforts to temperance and combating the high level of violence in southern society, while providing religious instruction to slave and free alike and reminding slaveholders of their paternalistic responsibilities to their dependents. “Masters, give unto your servants that which is just and equal; knowing that ye also have a Master in heaven” (Colossians 4:1). The very clergy who would quote scripture to defend the slave system against outside critics also admonished masters, sometimes in the presence of their slaves, against breaking up families or preventing slaves from hearing or reading for themselves the divine word. The most distinguished South Carolina theologian, James H. Thornwell, justified slavery from the Bible but advocated state legislation during the 1840s to protect slave marriages and repeal restrictions on slave literacy. The Georgia Presbyterian clergyman Charles Colcock Jones, owner of three plantations and one hundred slaves, devoted his ministry to The Religious Instruction of the Negroes (title of his 1842 book), sometimes working in collaboration with black preachers. Jones saw himself as a social reformer trying to humanize the institution of slavery.80

The South’s evangelical clergy did not usually claim that slavery was “a positive good” (as Calhoun and some southern Jacksonian politicians began to do), but they certainly denied its intrinsic immorality. Chiefly, they resented the imputation that slaveholders were necessarily evil people. In 1844, when the national Methodist Church refused to accept as a bishop a man whose wife had inherited slaves, the Southern Methodist Church seceded. The following year the Southern Baptists likewise created their own denomination. The Presbyterians, who had split along Old School/New School theological lines in 1837, split again on sectional lines just before the Civil War. Emancipation and colonization at some undefined future time allotted by divine providence, when “conditions are ripe,” remained a vague but not uncommon hope among antebellum southern evangelicals. The earthly millennium would bring deliverance from slavery.81

79. See Jan Lewis, “The Problem of Slavery in Southern Political Discourse,” in Devising Liberty, ed. David Konig (Stanford, 1995), 265–97; Drew Faust, Southern Stories (Columbia, Mo., 1992), 72–87; Ralph Morrow, “The Proslavery Argument Revisited,” Mississippi Valley Historical Review 48 (1961): 79–94.

80. William Freehling, The Reintegration of American History (New York, 1994), 59–81; O’Brien, Conjectures of Order, II, 1149–57. On Jones, see Erskine Clark, Dwelling Place: A Plantation Epic (New Haven, 2005), esp. 135–39.

The Roman Catholic Church in the United States adopted a position not far removed from that of southern evangelical Protestants—if anything, more conservative. In 1839 the otherwise arch-conservative Pope Gregory XVI forbade Catholics to participate in the Atlantic slave trade (by then largely in the hands of the Spanish and Portuguese) but did not condemn slavery itself. Scripture and natural law (going back to Aristotle) sanctioned the institution so long as masters permitted slaves to marry and receive religious instruction. Even when masters did not live up to their obligations, the church taught it preferable to suffer the wrong than to risk social turmoil, perhaps even race war, by immediate emancipation. Abolitionist rhetoric invoked principles derived from Protestantism and the Enlightenment, and emphasized the urgency of the slavery problem; it conveyed little appeal to antebellum American Catholics. Their religion honored the spiritual discipline of patient suffering and submission more than Protestantism did, and valued individual autonomy less. Sometimes individuals had to sacrifice for the sake of public order or community welfare, even to the point of accepting enslavement. In Europe, the Roman Catholic Church generally set its face against liberalism, modernism, and republicanism. It had not embraced the Enlightenment as Anglophone Protestantism had. Most of the American Catholic bishops who came after John Carroll were Europeans and shared that predominantly conservative outlook. Social engineering, such as planned colonization, seemed anathema to such men. Postmillennial expectations, which gave theological underpinning to Protestant Americans’ faith in progress, had no Catholic analogue. Anti-Catholicism among Protestants and anti-Protestantism among Catholics, both of them strong and mutually reinforcing, prevented cooperation in antislavery (or, indeed, any other enterprises).82

81. Christine Heyrman, Southern Cross (New York, 1997), 5–6; Anne Loveland, Southern Evangelicals and the Social Order (Baton Rouge, 1980), 191–219; Daly, When Slavery Was Called Freedom, 61–72.

82. John McGreevy, Catholicism and American Freedom (New York, 2003), 49–56. See also Madeleine Rice, American Catholic Opinion in the Slavery Controversy (New York, 1944).

Socioeconomic factors underscored the alienation of Catholics from antislavery. Most American Catholics were also immigrants and poor. They despised what they saw as the hypocrisy of those abolitionists who deplored the plight of distant slaves while ignoring that of the hungry newcomers on their doorstep. Sadly but understandably, poor Catholic immigrants, especially the Irish, treasured the whiteness of their skin as their one badge of privilege over the free Negroes who competed with them for jobs as laborers. Abolitionists, especially black abolitionists, deeply resented the attitude of Irish Americans and their church, contrasting it with the sympathy American antislavery received in Ireland itself from nationalists like Daniel O’Connell. As a result, abolitionists sometimes allied with the cause of nativism.83

But not even Catholics argued that slavery was a “positive good” and the best way to organize a society. Those who wished to make that case generally found it necessary to invoke secular rather than religious ideologies to justify their position. John C. Calhoun, theorist of southern sectional unity and constitutional interpretation, made himself the most widely known exponent of the “positive good” of slavery. On February 6, 1837, the South Carolinian addressed the Senate to oppose reception of petitions calling for the abolition of slavery in the District of Columbia. (Since Congress had no power to abolish slavery in the states, the District of Columbia became a favorite target for abolitionists wishing to focus national attention on their cause.) In his speech Calhoun abandoned the conventional Jeffersonian doctrine of slavery as an unfortunate legacy that the South must be left to deal with on its own. “I take higher ground,” he declared. “I hold that in the present state of civilization, where two races of different origin, and distinguished by color, and other physical differences, as well as intellectual, are brought together, the relation now existing in the slaveholding States between the two, is instead of an evil, a good—a positive good.” Without the coercion of slavery, Calhoun foresaw, white supremacy would be at risk; “the next step would be to raise the negroes [sic] to a social and political equality with the whites.” Slavery’s virtue lay not in its mere profitability but in its broad social consequences. It prevented both race and class conflict in the South, Calhoun claimed. “There is and always has been in an advanced stage of wealth and civilization, a conflict between labor and capital,” he insisted. But southern slavery “exempts us from the disorders and dangers resulting from this conflict.” This speech prompted the historian Richard Hofstadter to label Calhoun “the Marx of the Master Class.”84

83. See David Roediger, The Wages of Whiteness (London, 1991); Noel Ignatiev, How the Irish Became White (New York, 1995), 10–31; Charles Morris, American Catholic (New York, 1997), 63–80.

Proslavery propagandists seized with delight upon the Census of 1840, conducted by the Van Buren administration. Among the data this census collected for the first time were statistics on the number of the insane. The census returns wildly inflated the number of insane free Negroes (in some communities it exceeded the total colored population). Southern politicians cited these numbers, seemingly indicating far higher rates of insanity among free than enslaved blacks, to demonstrate that African Americans could not handle freedom. Calhoun himself used the statistics in public statements on behalf of expanding and protecting the beneficent institution of slavery. Meanwhile, however, the absurdity and contradictions contained in the data had been exposed by Edward Jarvis, a northern statistician. John Quincy Adams secured a congressional resolution calling for an inquiry into how the mistakes had occurred. One William Weaver of Virginia had been in charge of the census, and Calhoun (as secretary of state under Tyler) appointed him to head the investigation too, thus assuring a cover-up. Weaver succeeded in delaying the inquiry and obfuscating its outcome; proslavery politicians continued to exploit the returns. How the erroneous data got into the census remained a mystery until the detective work of historian Patricia Cline Cohen, who traced it to small print and confusingly labeled columns on the forms the collectors filled out. The census of 1840, the first to show the United States surpassing Great Britain in population and the first to collect information on literacy, was also the last of the amateurish ones. The census of 1850, conducted by a Whig administration, took advice from Jarvis and made considerable advances in the collection and processing of social statistics.85

The hothouse political atmosphere of South Carolina nurtured the attitude that slavery was a “positive good.” William Harper, enthusiastic supporter of nullification and chancellor of the state’s high court of equity, joined Calhoun in repudiating Jefferson’s principles; he explicitly rejected the assertion that “all men are created equal” in the Declaration of Independence as well as its doctrine of natural rights. A disciple of Edmund Burke, Harper distrusted those who would rebuild society on theoretical principles. Though slavery had its evils, so too did “free society,” and who could say that there was more happiness or less immorality in England or the northern states than there was in the South?86 In the years between 1848 and the Civil War, other South Carolinians, including William Smith and James Henry Hammond, would elaborate this proslavery ideology. But the process reached its apogee during the 1850s in the writings of a Virginian. George Fitzhugh repudiated individualism and natural rights entirely in favor of a theory of universal social subordination: children to parents, wives to husbands, subjects to rulers. Abolitionists pointed out that by Fitzhugh’s logic, all workers, white as well as black, would be better off enslaved.87

84. John C. Calhoun, “Speech on the Reception of Abolition Petitions,” in his Works (New York, 1851–55), II, 625–33, quotations from 631–33; Richard Hofstadter, The American Political Tradition (New York, 1948), 68.

85. Patricia Cohen, A Calculating People (Chicago, 1982), 175–204; James Cassedy, Medicine and American Growth (Madison, Wisc., 1986), 124–26; Frederick Merk, Slavery and the Annexation of Texas (New York, 1972), 61–68, 85–92, 117–20.

Most often, however, the “positive good” school of slavery apologists followed Calhoun and based their arguments on race. They asserted that Negroes were inherently intellectually “defective” and therefore naturally suited to enslavement by their superiors. Josiah Nott, a Mobile physician, took this line to its farthest extreme in the 1840s. Black Africans represented an entirely different species, he claimed, created separately by God from whites. Racial interbreeding produced hybrid offspring inferior to either parent. Nott’s theory (called “polygenesis”) found some supporters among naturalists of the day but ran into trouble because it contradicted the creation account in Genesis, which clearly affirmed the descent of all human beings from a single original couple (“monogenesis”). The failure of Nott’s theory to win over southern public opinion—even though it pandered to popular prejudices and despite its claims to scientific respectability—testified to the strength of the prevailing conception of harmony between reason and revelation.88

Bible-centered Protestantism, synthesized with the Enlightenment and a respect for classical learning, helped shape the culture, determine the patterns of intellectual inquiry, and define the terms of debate in the antebellum American republic. On the slavery issue, the synthesis was ambiguous; in most other ways it underwrote democratic values. It supplied a young and rapidly changing society with a sense of stability. Without resolving moral controversy, it endowed moral standards and rational discourse with each other’s authority, strengthening both.

86. William Harper, Memoir on Slavery (Charleston, S.C., 1838); O’Brien, Conjectures of Order, II, 946–59.

87. See Eugene Genovese, The World the Slaveholders Made (New York, 1969), 118–244; O’Brien, Conjectures of Order, II, 972–91.

88. Reginald Horsman, Josiah Nott (Baton Rouge, 1987), 81–103, quotation (“defective”) from 88. See also George Fredrickson, The Black Image in the White Mind (Middletown, Conn., 1971), 78–82.