Whenever we Americans meet someone new, within a few minutes one or the other asks the inevitable question, “What do you do?” By “do” we mean “work.” We mean “What is your occupation?”
We ask this question because the work we do is central to our personal identity and to our position within the socioeconomic structure of American life. The simplest way to place (or to rank) the person to whom we are talking is to know his or her work.
To ask the question is normal, but it is not natural. It is a deep-seated, culturally conditioned behavior. Like most such behaviors, we don’t think much about the question or the central role of work in American life or about the status of employees within our economic order.
We believe in work, and have taken pride in our certainty that Americans work harder, smarter, better than people in other countries. We know about slavery. We probably know a little about the history of labor unions; we may even recognize that such men as Samuel Gompers, A. Philip Randolph, and John L. Lewis were heroes of labor history.
For the most part, however, we don’t know why work is so important to us, nor do we know how American ideas about work are connected to other elements of American culture, especially such important symbolic systems as our belief in individualism or our stories about the frontier or our faith in meritocracy.
We certainly don’t think of our current economic stresses as rooted in the role of work in the culture as a whole. But they are, and the fact that we are blithely unaware of the connection is a big part of our current problems.
That lack of awareness is, of course, only part of our general disregard for history. We think that we can gain “historic” background for current events by looking back 40 or 50 years. Such short-range thinking is often worse than useless. Another problem is that if we look back to our national beginnings we too easily sacralize early decisions. We treat the Founders and the Constitution that they drafted as founts of ageless wisdom.
The Founders were not better people than we. They had problems to solve, and they bravely embraced revolutionary solutions. We don’t know what each signer of the Declaration of Independence thought about human equality, but we do know that human equality was available to them as a philosophical proposition on which one could build a justification for revolution, for casting off English rule, and for creating a government of their own.
After casting off English rule, they then had the courage to create a government of equals, a government based on the consent of the governed. It eliminated all inherited right to rule. No man could claim a right to political power because his father had previously held that power. Only those who by themselves won the consent of the governed could hold political power.
It was a magnificent achievement.
But . . .
This solution to the problem of political power created social problems. The elimination of hereditary claims to political power eliminated the foundation of hereditary hierarchical social class. In its place the Constitution of 1787 created a new set of political classes. “The governed” were never explicitly defined. However, by permitting the continuation of slavery and by mandating no change in the legal status of women, the Constitution of 1787 effectively excluded both slaves (in effect, all non-whites) and women from the class of “the governed” whose consent is needed by a just government.
By implication, these two Great Exclusions created a new hereditary political class of white males. Even though all other hereditary claims to political power were eliminated, all white males were born with a claim to political power while all women and all non-whites were hereditarily excluded from political power. Through the course of the nineteenth century Americans created their New Order of the Ages on the basis of this unusual hereditary political class structure.
The Constitution of 1787 created political classes, but it created no social order. For one thing, it provided no arrangements for relations among white men. Politically, white men were a society of equals. No such society had existed, so with little historical guidance, white men had to figure out what it meant to live in a society of equals.
An even greater challenge arose from the fact that white women and white men existed in separate classes, with women legally subordinate to men. The New Order that emerged was primarily a white male realm. Women’s lives became more dependent on men but increasingly separate from men’s lives. The lives of non-whites, as non-whites, were at best irrelevant. Indians and blacks, in differing ways, learned the high price of exclusion from the white male scheme.
By the end of the nineteenth century, the United States achieved social and political stability as a white man’s country.
As the twentieth century progressed, however, the race and gender exclusions lost coherence, and in the decades immediately following World War II, social and political stability weakened. As blacks and other non-white groups, and women, asserted claims on the American Dream, things fell apart. The claim of white men, individually and as a class, to a legitimate monopoly on political power (and hence to social and economic power) collapsed.
This collapse of the entitlement of white men to political power did not, however, end the actual power enjoyed by white men. No stable new foundation of political legitimacy has developed, so political power and government itself have no secure claim to be rooted in the consent of the governed. Political paralysis spreads, and social cohesion disintegrates.
Our Impossible Dream
We remain trapped in the failure of legitimacy and the resulting political paralysis, like a sleeper unable to escape a nightmare, because we are in fact trapped in a Bad Dream, the Impossible Dream we call the American Dream. While we all believe in the American Dream, we have no shared agreement on its meaning. Is it the right of the poor man to become a multi-billionaire? Is it the right of everyone to own a house of his own? Is it a society in which each individual has the opportunity to develop her personal capabilities to the full? Is it a nation powerful enough to dominate the rest of the world? Is it a society in which everyone enjoys a life of dignity? Is it a nation with the least possible government and lowest possible taxes?
Yes, it is. All of the above. The American Dream is not a coherent political philosophy, nor is it a collection of public policy goals. It is an incoherent tangle of choices we have not made, of issues we have not confronted. It is an Impossible Dream.
This Impossible Dream retains great power because it remains a living myth. It is not a pernicious scheme imposed from on high by the Founding Fathers but a genuine folk culture that was created gradually as Americans lived their daily lives after independence. It was shaped by the revolutionary social-political structure created by the Constitution of 1787, the unprecedented degree of freedom and equality experienced by individuals, and the existence of a vast expanse of unsettled land. It emerged as a mythology that tied the Enlightenment ideology that had shaped the Revolution to the daily lives of citizens of the United States of America and to the Western frontier that became a defining characteristic of the nation.
Freedom and equality were the central values of that ideology. Whether Thomas Jefferson had his fingers crossed when he wrote the line about “all men are created equal” is not a problem. He needed the ideas of human equality and its political manifestation, popular sovereignty, to justify rebellion against English rule. He could use them because the people in England’s North American colonies were already living with more freedom and equality than any European people enjoyed.
Integral to that freedom and equality was the decay of the power of family to shape the lives of young men. Until Independence the continuing allegiance to England’s king provided pressure to keep the old ways working. After Independence, however, all aspects of life were open to change. There was, however, no model to follow. Both national and individual identities—both “America” and “the American”—had to be shaped. Americans had no choice but to create a New Order of the Ages out of the materials available.
In this fluid environment, the new states developed from two very different notions about freedom and equality, and that difference fell roughly along what became the Mason-Dixon Line (the southern border of Pennsylvania and western border of Delaware). South of that line, colonial leadership had modeled itself on the English aristocracy. Among Southerners freedom was an elite status—the freedom of a landed English gentleman who lived off the work of others. Equality was a notion that applied only to the free elite.
North of the Mason-Dixon Line leadership fell to men whose strongest attribute was educated intellectual strength or practical achievement. For a century or more the Northern colonial elite had consisted largely of the ordained clergy. Others attained elite status as attorneys, as merchants, as craftsmen, or as relatively successful farmers. Apart from the handful of proprietors of the great Hudson River estates, there was no landed elite who could pretend to be English country gentlemen.
At the time of the Revolution, Southern leadership came from the landed planter class while Northern leadership came from professional or entrepreneurial men: John Adams was a lawyer, Sam Adams was a brewer, Paul Revere was a silversmith, John Hancock was a merchant, Ben Franklin was a printer. Among these men freedom was a general human right, not an elite value. In 1776 slavery existed in all thirteen of the newly independent states, but by 1804 all states north of the Mason-Dixon Line had slavery on the way to abolition and had begun a process of realizing human equality to a previously unknown degree.
The Constitution of 1787 united the newly independent states by carefully papering over those fundamental differences, but by 1804 cultural differences had begun to sunder what the Constitution had united. The nation had divided geographically, and the great dividing issue was opposed systems of labor. The South preferred slave labor and the North preferred what it came to call “Free Labor.” By the time of the Civil War, “Free Labor” had achieved mythological status, linking labor to the Frontier in a belief system that asserted the availability of opportunities open to all.
White Men and Work
“Free Labor” ideology built on the most fundamental social process in the early years of the republic, the creation of occupation-centered identity for white men.
Two great dynamics merged in the creation of occupation-centered identity, the transformative power of equality among white men and the growth of the market economy. At the time that equality forced white men to find new ways to think about their relations to the world, the power of capitalism, technology, and industry provided a whole new world of occupations that met their needs. As traditional family-centered identity, including marriage, lost its ability to organize men’s lives, new occupations provided a replacement.
Occupation-centered identity replaced family-centered identity.
Four major elements converged in the emergence of occupation-centered identity: 1) the professions established the norms that separated middle-class occupations from the under-class world of labor; 2) higher education shifted from training young men to be traditional gentlemen to training them to function successfully in a world of changing science and technology; 3) the idea of a career provided a forward looking life story to replace the static life story of traditional society; and 4) “management” was invented to distinguish sub-professional middle class employment from labor.
With the growing emphasis on work, the individual’s sense of worth came to rest not on the undifferentiated equality which all white men shared but on the varied socially recognized values of what each one did. The big problem in 1800 was the shortage of “varied socially recognized values” for different kinds of work. Much of America’s social history throughout the nineteenth century was the process of sorting out the social meanings related to various kinds of work and establishing control of those meanings.
The central standard that emerged—more important even than money—was the independence of the worker. Control of one’s personal life depended on one’s work, so control of work became central. Professional work provided the model.
The high status of professional work was secure because genteel Northern Americans, more than any other group, felt the declining value of family-centered identity as a serious loss. They more than others had based identity on status conferred by birth, and so they were the first to feel that it was ceasing to provide a reliable basis for their personal identities. For the genteel, however, a strategy was at hand. Northern America’s genteel had never been an idle group. They had worked. Their basic strategy, then, was to transfer the symbolic value of genteel standing from family to occupation. Gentility became equated with the traditional genteel occupations—the ministry, public service, law, land ownership, and trade.
The characteristic of professional work that had the greatest cultural significance was the independence of the professional. In vital respects, the professional did not work to the standards of others but worked to his own standards. In both work life and personal life, the professional was independent. In the words of Burton J. Bledstein, “The culture of professionalism incarnated the radical idea of the independent democrat . . . a self-governing individual exercising his trained judgment in an open society.” (The Culture of Professionalism, p. 87)
A crucial element of the transfer of gentility from family to occupation was that in time genteel standing could be claimed by admission to those occupations regardless of family background. In the old order, the gentleman was a lawyer because he was a gentleman; in the new order, the lawyer was a gentleman because he was a lawyer. An important general principle was thus established: certain occupations could by themselves confer high status within a community.
Simply by doing certain work, individuals could enjoy high standing. And the reverse may have been of even greater importance: by doing certain other things, such as working as someone else’s employee, individuals could enjoy only lesser standing. Occupations provided a new hierarchy on which American life could be structured. Occupational status provided social status.
Occupations provided a new hierarchy that honored equality because access to professional standing without regard to family background was long established in the northern states. In northern colonial society the Protestant clergy enjoyed very high status. In many communities, the clergy were the elite of the elite. Family standing alone could not provide access to the ministry, however. Only formal education could open the way to ordination and the associated clerical status.
Young white American men seeking professional standing naturally followed the clerical pattern and turned to higher education. The existing educational system failed them badly. Apart from training the clergy, colleges had trained young gentlemen for lives such as their fathers had lived. Such colleges could not provide what America’s young white men needed. The clearest measure of their failure and the pressures that young men felt was the astonishing student violence—students beat and occasionally murdered their classmates, they treated faculty with contempt and threats of violence, and they wreaked havoc on buildings and furnishings of both college and town. For example, James Fenimore Cooper’s older brother William was expelled from Princeton under suspicion of having set a fire that destroyed the college’s main building, and James was later expelled from Yale after blowing the door off the room of a fellow student who had beaten him rather badly in a fight.
After the Civil War colleges finally began to deal seriously with the challenges of American individualism. The traditions of education centered on ancient languages and philosophy—the education of the eighteenth century—gave way to modern languages and science. The greatest innovation was the addition of “elective” subjects. With elective subjects the new curriculum acknowledged that each student was an individual who could not deal with the world successfully by imitating his father.
Higher education became a growth industry. According to Bledstein, “By 1870, there were more institutions in America awarding bachelor’s degrees, more medical schools, and more law schools than in all of Europe.” Higher education did much more than control access to professional status. It could confirm claims to professional status by the myriad new lines of work emerging from scientific and technological developments. Science provided techniques for validating the creative work of individuals, and the growing academic world quickly applied the scientific mindset to achievements in all fields. Individuals then could build confidently on each other’s innovations. This process led logically to the creation of new academic specialties in not only the sciences but also in the emerging “social sciences” and in the traditional humanities. Colleges became universities, and by the end of the nineteenth century the modern American university was fully in place, serving as the culture’s fundamental source of occupational and personal status.
Universities also became a mechanism for multiplying professions and the number of professional positions available within the culture. After the Civil War, men trained in the new specialties began to organize themselves into national associations whose primary function was (and still is) to create and control the professional status and income of their members. In the last quarter of the nineteenth century the major professions—school teachers and college professors, history teachers and teachers of languages and literature, attorneys, doctors, engineers—defined themselves through new organizations and professionally defined specializations. Those who did not organize lost out to those who did: female midwives, for example, did not organize themselves, and they lost not only status but the right to practice their craft to the male physicians who organized the American Medical Association.
The development of universities and professional organizations facilitated the emergence of another device for organizing individual lives, the career. In traditional society a young man’s life story would be the story of his father and his father’s father, just as the story of his son and his son’s son would be the same as his. For young American white men in the nineteenth century, the traditional process ceased to work. They experienced “the failure of traditional expectations to provide their lives with meaning. Home, family, community, religion, college: none of these traditions . . . supplied a dependable basis . . . for the prediction and control of an active life.” (The Culture of Professionalism, p. 196)
In a dynamic, egalitarian market economy each individual needed to imagine his own life’s story, an imaginative projection into the future rather than a look back at the past. The solution to this new demand facing young men was the “career.”
The conception of a human life as a career was very much a nineteenth century invention. The man with a career was elevated above the mere hurly-burly of the day’s competition because the career was an outline plan for realizing the individual’s true potential. The career therefore took on much of the spiritual resonance of Emersonian individualism, which itself had absorbed much of the power of the Puritan notion of a “calling,” a role in the world to which one is called by God. The career retained the specialness of the calling, and therefore to pursue a career was to serve a lofty purpose in the world.
“Career” was not merely a synonym for any individual’s life course. It implied a competitive life in which an individual planned to advance through rising levels of achievement. It implied ambition and planning and recognizable intermediate goals. It implied individual control over one’s own life.
The career also provided at least a partial resolution to the problem of the control of work for the many who rose above manual labor but did not reach the ideal of professional independence—those who did not work with their hands but remained employees of someone else. At the beginning of the nineteenth century most “white” American workers, even the “mechanics” who worked with their hands, enjoyed substantial control over the work they did. As the century progressed and new work of all kinds developed, an intense struggle was waged over control of work. By the end of the century labor had clearly lost. A new category of work, “management,” had been invented, and managers controlled laborers and their work.
The separation between management and labor was understandable, but it was not inevitable. The explosive growth of American industry during and after the Civil War created a truly national economy, and with it came a need for and a drive toward standardization. During the 1860s a single national currency replaced a host of local currencies; a single railroad gauge (the width between the rails) was adopted, as were uniform standards for screw threads. While standards became increasingly uniform and precise, processes became increasingly complicated. As steel making developed, for example, precise quantities of materials became critical, because as little as a half cup of carbon per hundred pounds of iron could significantly alter the characteristics of the finished product.
When standards of such precision were involved, industry needed systems that could manage production to insure that uniform standards were reliably met. The need for control over the product led to a struggle for control of the work to be done. In the opening years of American industrialization, “factories” were only slightly larger than individually run shops had been. “Management” determined what was to be made, but workers continued to control the process of fabrication. Workers chose the tools to be used, they maintained those tools, they determined how the tools would be used, they selected the material to be worked on, they determined the speed at which the work was done; where heat was part of the process, workers controlled the heat. Relying on their own experience and on rules of thumb of their crafts, workers exercised significant control over their own activities and thus over the product they produced.
Management succeeded in winning control over work in part because of technological advances and the rapid development of new products. As long as work focused on making what had been made before in the way it had been made before, workers could keep control of their own labor. As new products and new techniques emerged, however, management was able to win increasing control over work by cutting workers out of the development process. By removing the planning and design for the new production from the work floor to management offices, a profound division was created between managers and workers.
Operational efficiency was only part of the impulse behind that division. The pioneering management consultant Frederick Taylor made clear the underlying assumption that workers are different kinds of people from managers. He argued that “even if the workman was well suited to the development and use of scientific data, it would be physically impossible for him to work at his machine and at a desk at the same time. It is also clear that in most cases one type of man is needed to plan ahead and an entirely different type to execute the work” (The Principles of Scientific Management, p. 38, emphasis added). He stripped all thought and judgment from the activities of laborers and awarded it all to managers.
In the decades following the Civil War the distinction between management and labor took on clear class and racial meanings. As early as the construction of the Erie Canal in the 1820s, it was clear that the hardest, dirtiest, and most dangerous jobs were not appropriate for native-born Americans. Irishmen had to be imported because the native-born would not accept either the working conditions or the low pay. Workers for such jobs had to be recruited abroad. The “different types” of men to which Taylor referred are clear. Not all native-born white men could be managers, but all managers would be native-born white men.
By 1900 what we refer to as “the meritocracy” was in place.
The meritocratic system was a top-down invention. It took care of the comfortable classes, but it effectively consigned those who worked with their hands to second-class status.
This result was ratified by the election of 1896. That election is remembered for the struggle over the relative roles of gold and silver in national monetary policy. Beneath that arcane issue, however, the election was about the character of the nation.
The issue was most eloquently formulated by the Democratic candidate, William Jennings Bryan. Speaking to the Democratic nominating convention, Bryan made clear the underlying oppositions in American life symbolized by Gold and Silver. The speech is remembered primarily for its impassioned closing line, “You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold.”
Read in its entirety, Bryan’s speech is terribly sad. It marks the fact that large numbers of Americans had come to feel inferior to other Americans, to feel excluded from the political life of the nation, to feel that they had no say in matters that affected their lives deeply—that many Americans felt that their lives lacked dignity and were no longer of value in the life of the nation. Bryan hoped to reverse that trend. He says of his people, “We have petitioned, and our petitions have been scorned; we have entreated, and our entreaties have been disregarded; we have begged, and they have mocked when our calamity came.” And he then continues, “We beg no longer; we entreat no more; we petition no more. We defy them!”
Silver defied Gold, West defied East, Labor defied Capital, Farm defied Industry, and they lost.
What was lost in the election of 1896 was the experience of assured equality among Americans. One of the saddest points in Bryan’s speech is his need to assert explicitly the equality of “his people” with the people of Massachusetts (then still identified as the center of moneyed interest). He said that “[I] stand here representing people who are the equals, before the law, of the greatest citizens in the state of Massachusetts.” An equality that must be thus asserted is an equality that has been denied.
The inequality felt by laborers and small farmers had roots deep in the pre-industrial past. The comfortable classes in America were not unmindful of the suffering of the working class and small farmers, but they lacked any intellectual machinery for doing anything serious about it. During the Revolutionary period, workers (especially skilled craftsmen in the cities and larger towns) were politically active on behalf of the Revolution, and as a group they enjoyed solid status through the first twenty years of the government under the Constitution of 1787. By 1815, with the end of the War of 1812, however, the world of skilled workers was changing.
Forces that had been invisible in 1787 were transforming the economy. New machinery and new systems of credit were transforming the world of goods handcrafted to order by a single craftsman into one in which goods were made wholesale, often with a number of different workers responsible for the completion of individual products. Management of production passed from master craftsmen to merchant capitalists. Workers were caught between competitive pressures: immigration brought in workers willing to work for less than Americans were accustomed to while market competition drove the capitalists to cut costs rather than raise prices.
Caught between these two long-term trends, workers felt not merely a lowering of their standard of living. Men who had experienced independence were becoming increasingly dependent. Instead of independent men, they were becoming employees, a substantial loss of status. Workers were coming to feel inferior, to no longer experience themselves as the equals of all of their countrymen. And they were no longer treated as equals.
This weakening of a general sense of equality reflected the inability of Americans to develop ideas that could dignify manual labor. Americans were not alone in finding the emergence of industrial labor problematic. The core problem was the stigma traditionally attached to people who worked for other people. Although feudal relationships had been breaking down for two centuries or more, the legal concepts with which they were managed were largely unchanged throughout Western European culture. In English and American tradition, those concepts were embodied in the Master and Servant provisions of common law: a person who worked for a master was the Master’s Servant. The Master’s rights to his Servant’s labor could be enforced legally with fines and imprisonment. Even as Parliament made legislative revisions to the common law in the nineteenth century, it continued to label them as “Master and Servant Acts.” Only in 1875 did the legislation become the “Employers and Workmen Act,” although even then the penal provisions were not completely eliminated.
The United States had a very different experience: all thought about labor was shaped by the existence of chattel slavery. Workers had to be independent people. They had to be free of any hint of the dependence that characterized the position of the slave. For the comfortable classes it was imperative to find a way to think about workers as fundamentally different from slaves. Traditional Master-Servant relations were no longer workable. As the effects of the revolutionary commitment to freedom and equality spread, the master-servant relationship ceased to provide a workable understanding of employer-employee relations.
Prior to independence, in the North it had been rare for a white man to remain as an employee throughout his life. The dominant expectation was that adult males would achieve economic—and therefore social and political—independence. Such independence was contrasted with the lifelong dependence of the slave. At the time of Independence, however, the North was rapidly developing as a market economy within which it was increasingly common for white men to spend their lives as employees of other white men. As the struggle over slavery grew, the South taunted the North with the charge that lifetime employment by another amounted simply to “wage slavery.” The industrial worker had no more hope than a slave of achieving economic independence.
The Southern taunt was the more painful because it was obvious that many industrial workers enjoyed a standard of living no better than that of a slave. Further, employees suffered from periods of unemployment during which no one cared for them in the way a master would look after his slaves during periods of slow work, through illness and infirmity and old age. The experience of chattel slavery could be dressed up to look better than wage slavery.
Northerners dealt with this problem in part by sending forth their women to perform benevolent deeds to help relieve suffering among the workers, and they gradually expanded the franchise so that by 1840 most white males could vote. They did not, however, do anything fundamental. In colonial experience it was common for young men to start adult life working for another, but it also was common for young men in time either to acquire land of their own or to achieve independence in a craft or entrepreneurial activity. Employee status itself was accorded no dignity. Comfortable Northerners, therefore, extended the colonial assumption that any respectable individual would rise above working for another. They developed an ideology of Free Labor and convinced themselves that all was well.
Free Labor ideology embraced two basic elements: the contract and free land.
As a general principle, the contract had emerged gradually as the basic device for reconstructing the relations of individuals to others. As traditional social and family relationships became increasingly unable to manage relations among free and equal individuals, the available conceptual substitute was the contract. There was a certain symmetry: the only personal relations dealt with by the Constitution of 1787 were the involuntary master-slave relationship and the voluntary relationship of the contract.
By the time of Independence, the contract had already begun to replace many of the domestic relations provisions of common law, including marriage, and the contract was the basic conception underlying the Constitution of 1787. The conception of the labor contract enabled the North to break the connection between employee status and the stigmatized master-servant provisions of English common law.
Eighteenth century labor contracts, most common in the form of indentures for a period of years or for apprenticeship, subordinated workers to employers. In such situations, the master retained extensive authority over both the work and the personal life of the worker. The Free Labor Contract, in contrast, was thought of as a mechanism which in itself insured the worker’s independence. The Free Labor Contract could be imagined because people had learned to think of the individual’s capacity to work as a commodity. In slavery, the master owned the slave, and he therefore owned not merely the fruits of the slave’s labor but the labor itself. The slave, being utterly dependent on his master, had no control of his own labor. In contrast, the free laborer was independent of any employer and was free to treat his labor as a commodity to be sold in the market. The individual’s sale of his labor was conceived of as a free and voluntary contract between employee and employer. Because the labor contract was voluntary, it insured the independence of both parties. For those in the North, that independence was the ultimate trump card proving the superiority of Northern Free Labor over Southern slavery.
The fervor of the Northern faith in the contract can scarcely be exaggerated. The contents of the contract—hours of work and rate of pay—were irrelevant. Abolitionists were especially cavalier about the contents. For them the moral challenge to slavery depended on the rightness of Free Labor Contracts. For William Lloyd Garrison, for example, “the wage contract had become the very token of freedom.” (Amy Dru Stanley, From Bondage to Contract, p. 21)
After the Civil War the labor contract provided the intellectual foundation for dealing with the 4,000,000 freed slaves. As one scholar has noted, “In postbellum America contract was above all a metaphor of freedom. . . . To contract was to incur a duty purely by choice and establish its terms without the constraints of status or legal prescription. . . . Contract marked the difference between freedom and coercion.” (Amy Dru Stanley, From Bondage to Contract, p. 2.) Thus during Reconstruction it was possible for reasonable people to believe that forcing freedmen to sign labor contracts with former slave owners insured their freedom.
Unfortunately, Free Labor ideology did not and could not eliminate the deeply rooted contempt for those who worked as employees. The contempt of Southern gentlemen for those who dirtied their hands in their daily work was in fact widely shared in the North. The North had a more generous estimate than did the South of the value of head work of a professional order, and it even extended that respect grudgingly to the self-employed farmer working his own land or the skilled craftsman with his own shop or to the keeper of a retail shop. The relatively egalitarian experience of colonial years had demanded that much.
Nothing in colonial experience, however, provided a basis for respecting any work done long term as another person’s employee. Free Labor ideology did nothing to elevate the status of employees. Instead it promised that no decent American needed to remain an employee for life. No wage slavery here!
Every craftsman could become a master and own his own shop. That failing, everyone could go west and achieve independence on free land. The labor issue was not confronted. It was ducked. The condition of Northern wage earners in fact was not obviously superior to that of slaves. As Southerners pointed out, in many respects slaves were better off than wage workers. If there was no work, or if the worker was ill or injured or aged, the Southern slave and family still received food, shelter, and clothing (at least in theory). In such situations the Northern wage worker was abandoned by his employer to freeze or starve along with his family.
The theory that the wage laborer was better off because he had the freedom to sell his labor by contract and to change employers when he wished was very cold comfort. The theory of Free Labor added warmth with the promise that the wage laborer was free not only to change employers. The wage laborer, it was assumed, would naturally rise above working for another and achieve economic independence. This cheerful assumption rested on an even bigger assumption: that free land would always be available to provide the rising young man with the bulk of the capital needed to become a truly free man—the American Dream.
Free Labor and America-as-Frontier
Free land was central to Free Labor, and free land was above all the frontier. Free Labor ideology developed along with the association of American national identity with the frontier. Free Labor became fully embedded in the mythology of the Frontier.
In 1788 both “America” and “the American” were unknown quantities. Given the adulation lavished on George Washington, it is surprising that he and his genteel life did not become symbolic of the new nation.
For the first two generations it seemed that the nation would in fact be managed by Southern planters or New England gentlemen. Beneath the genteel varnish, however, a very different character was becoming established as the symbolic American. Beginning with an “autobiography” of Daniel Boone published in 1784, Americans showed an increasing attachment to the symbolic values of the Frontiersman and the Frontier. Through extensive experimentation with such figures as the very un-genteel Yankee, the uncouth and vicious Backwoodsman, and various confidence men, Americans were prepared by the 1820s to accept James Fenimore Cooper’s character Leatherstocking as representative of themselves.
Leatherstocking was a frontier scout. Born to English colonists, he was raised for a number of years by Mohicans. He roamed the great forests of colonial America rescuing English maidens from Indian captivity and generally making the world safe for European settlers. Leatherstocking was so popular that Cooper became the first American to make a fortune writing novels. For several decades his stories were imitated by popular novelists, and historic figures such as Kit Carson and John Fremont consciously marketed themselves in his image.
The broad and enduring popularity of Leatherstocking should be a puzzle. Readers of novels lived in the cities and towns of the long-settled strip of land along the Atlantic Coast. What would have been the appeal of Cooper’s tales set in trackless wilderness? The answer is that Americans in settled areas experienced their fluid, highly competitive, and rapidly changing social world as a kind of frontier.
It is clear that Ralph Waldo Emerson did. Born in Boston and later living in Concord, Mass., he imagined the American self as a frontiersman. At the center of “Self-Reliance,” perhaps his most influential essay, Emerson wrote:
When good is near you, when you have life in yourself, it is not by any known or accustomed way; you shall not discern the foot-prints of any other; you shall not see the face of man; you shall not hear any name—the way, the thought, the good, shall be wholly strange and new. It shall exclude example and experience. You take the way from man, not to man.
This passage takes us into the world of the frontier scout, moving away from man: the way is unaccustomed, no one else has left foot-prints on the path, no other people are visible, no names are spoken, everything is strange and new. In another influential essay, “The American Scholar,” Emerson wrote that, “So much only of life as I know by experience, so much of the wilderness have I vanquished and planted, or so far have I extended my being, my dominion.”
The American self is a frontiersman, free in a trackless wilderness, confidently alone in a world without other people, vanquishing the wilderness, extending his dominion, rising in the world.
The identification of the American with the frontiersman took practical form with the election in 1828 of Andrew Jackson of Tennessee as President. He represented the ideal outcome of Free Labor: a self-made man, the son of Irish immigrants, who started with nothing, rose to fame and wealth, and yet remained strongly identified with the common man.
Jackson broke the hold of East Coast mandarins on the Presidency. Later, the election of 1840 ratified the Frontiersman-as-American by electing William Henry Harrison as President. Harrison was the scion of a wealthy Virginia family who moved west and married the daughter of one of the major land speculators in the Old Northwest. Like Jackson he had been a successful Indian fighter, having defeated Tecumseh’s great coalition at Tippecanoe. He had subsequently been governor of Indiana territory and both U.S. Representative and Senator for Ohio. He was not a “man of the people” in origins, and his instincts all lay with the landed and the wealthy.
The Whigs, inheritors of the defunct Federalist preference for the wealthy, had concluded that the way to win an election was to run a “man of the people” with frontier credentials. They packaged Harrison as a humble man from the West. The campaign newsletter was called “The Log Cabin,” and it featured a sketch of a small cabin with a bearskin tacked to the wall—almost certainly homage to a true frontiersman, Davy Crockett. Issues were not debated, Harrison was pushed as a man of the frontier, liquor flowed at rallies and at the polls, and Harrison won. As one voter acknowledged, “So far as ideas entered into my support of the Whig candidate [Harrison], I simply regarded him as a poor man, whose home was in a log cabin, and who would in some way help the people . . .”
Whatever else the Frontiersman might be, he was the embodiment of the core of the Free Labor mythology—a free man who began with nothing and rose in the world, achieving economic independence in association with the Free Land of the frontier.
Free Labor and Manifest Destiny
The absence of serious policy debates during the election of 1840 makes clear the emotional power which had gathered around frontier imagery. That emotional power stood out in relief against the intellectual vacuum, but its practical consequences remained unarticulated. Americans had not discussed what they intended to do.
What they intended to do quickly became clear, however: they would settle the land—not merely the land already part of the United States but as much of the continent as they could get control of. In 1841 Congress passed a permanent Preemption Act, a law designed to make it easier for settlers who “squatted” on land to obtain clear title once federal control was established. In 1843 the first emigrant wagon train traveled for months on the Oregon Trail to settle on land west of the Louisiana Purchase that Britain still claimed. The British used the territory for the fur trade, not as land to be settled. The land was there, Americans wanted it, and they took it.
These well-known details deserve a moment’s reflection. The idea of legitimizing “squatter’s rights” as a matter of national policy expresses an extraordinary attitude toward land—that unsurveyed land can be claimed by the first person to grab it. True, one had to live on it and had to pay a nominal purchase price once it was surveyed, but the seizure and utilization of the land by itself established a claim to title.
Here we can see the backing for Free Labor’s promise that individuals can rise above work as another’s employee. In 1845 the assumptions implicit in these actions were given a concise formulation, “Manifest Destiny.” The editorial writer for the United States Magazine and Democratic Review (July-August, 1845) who coined the phrase did not even have to argue that the nation possessed such a destiny. He simply assumed both that it existed and that his fellow Americans would agree that it existed. He was concerned not with defining America but with denouncing those who would block our expansion. He denounced other nations (not internal critics) who dared to think of “hampering our power, limiting our greatness and checking the fulfillment of our manifest destiny to overspread the continent allotted by Providence for the free development of our yearly multiplying millions.”
“Allotted by Providence.” “Manifest Destiny.” The Free Land that sustained Free Labor ideology was no mere human scheme but part of the cosmic plan. And in three years—1845-1848—the United States increased its supply of land by 67%.
This explosive growth was not a simple natural process. What happened in the 1840s was unprecedented, utterly different, for example, from the purchase of Louisiana. That purchase was an unforeseen opportunity, an offer too good to refuse, but the decision to purchase was guided by no explicit national policy and certainly not by public demand. It was an action of the President acting virtually on his own authority, and probably exceeding it.
The expansion in the 1840s, by contrast, was a national action. There was strong presidential leadership, but tremendous pressure also came from outside the government. Public support was strong and articulate (as was opposition). Opportunity was not offered to the country, as when Napoleon offered to sell Louisiana. On the contrary, the lands acquired between 1845 and 1848 were taken by force or the threat of force. Americans wanted the land, they believed that they had a right to it as their “Manifest Destiny,” and they took it.
In the 1840s “Manifest Destiny” was less a government policy than a peoples’ policy. At stake was not the right of the government to conquer foreign lands but the right of “the multiplying millions” to “overspread the continent.” The people, not the government, possessed a preemptive claim to the land and were to be the primary actors in carrying out the national destiny. In effect, the role of government was legitimized by the role of the people. Popular preemption of the land justified the government’s use of force to sustain those claims. And the people’s claim rested on nothing more than a shared assurance that the cosmos blessed their actions.
By 1848 the Free Soil Party linked the frontier to Free Labor. Its slogan was “Free Soil, Free Speech, Free Labor, Free Men.” In 1856 Lincoln’s Republican Party clarified the slogan as “Free Labor, Free Land, Free Men.” The availability of free land underwrote the belief in free men. In the conclusion to his speech to Congress in 1861, President Lincoln framed the Civil War in terms of the competing systems of labor in relation to capital: “whether it is best that capital shall hire laborers, and thus induce them to work by their own consent, or buy them, and drive them to it without their consent”—whether it is best to rely on free labor or on slavery.
He proceeded then to dismiss the wage-slavery argument that “whoever is once a hired laborer is fixed in that condition for life.” He insisted that “The prudent, penniless beginner in the world, labors for wages awhile, saves a surplus with which to buy tools or land for himself; then labors on his own account another while, and at length hires another new beginner to help him.” And he continued to reiterate the promise of the frontier: “This is the just, and generous, and prosperous system which opens the way to all—gives hope to all, and consequent energy, and improvement of condition to all.”
As an economic and cultural “system,” this happy picture was a fantasy when Lincoln spoke and had been a fantasy for several decades. By his time it was an ideology that had hardened into mythology. It had ceased to be a truthful observation about the world and had become an established “truth” which was imposed on the world with scant regard for inconvenient facts of life in a market economy.
Some Inconvenient Facts
As Lincoln’s comments suggest, by 1861 Free Labor mythology was ready to confront what would seem to be the greatest inconvenient fact of all, the ending of the frontier. Some 30 years later, when the director of the census declared that the frontier had ceased to exist, Americans took the news in stride. The American Dream no longer depended on free land. All it needed was the glorious “system” that “gives hope to all, and consequent energy, and improvement of condition to all.”
There were other inconvenient facts, however, and they were much more troublesome. Deprived of free land, Free Labor ideology depended on two vital assumptions: 1) there would always be enough work for all, and 2) the available work would pay enough to allow the laborer to live an existence superior to that of a slave.
All were aware that there often was not enough work and that pay for the hardest, dirtiest, and most dangerous work was usually too low for a respectable life. Nevertheless, these assumptions became embedded in the Free Labor myth because they were vital to those who were not laborers. Northerners needed to believe in the humanity of Free Labor as badly as Southerners needed to believe in the humanity of chattel slavery.
Northerners were struggling to make sense of their lives in a newly egalitarian social order and in the unknown demands of a market economy. Their New Order of the Ages required them to function as independent individuals, and their great fear was dependence, the condition of slaves. They had little choice but to see dependent individuals as threats to themselves and to their society. In other words, they had to believe that laborers remained independent individuals who sold their labor under voluntary contracts, that reasonable contracts were always available to all.
The idea of the Free Labor contract enabled Northerners to sustain their faith in the humanity of the emerging market economy. As the American social order took form, it was important to those whose livelihoods did not depend on labor contracts to believe in the fairness of the contracts for those who did. This faith in the voluntary character of the individual labor contract produced a deep conviction that unemployment or poverty was a failure of individual workers. In spite of the evidence, Free Labor simply assumed that there was work at adequate pay for all who would work. The system could not fail, so the individuals must be the problem if there was no work or pay was too low. Those with too little work or too little pay had no one but themselves to blame.
The ideology of Free Labor, through the magic of the contract, defined employed workers as independent individuals responsible for their own lives. That definition gave workers little comfort. It was above all a reflection of the intense importance that the emerging culture placed on individual “independence” and the intensely negative meaning of “dependence.” “Dependence” was so great an evil that no political steps could be taken to free the unemployed from the curse of dependence. Workers could not organize among themselves to fight for a better deal nor could society legitimately lighten the burdens of the unemployed and poorly paid. The individualizing drive in American life made such solutions impossible. The unemployed man had to remain independent, even if it meant that he and his family starved. Starvation was thought to be beneficial as a mighty spur to productivity.
With admirable consistency Americans pursued this faith in their system to its logical conclusion. Lack of work, they came to believe, sprang not from systemic failure but from the laziness (or other failing) of the independent worker, and they therefore decreed that unemployment was a crime. Especially in the cities, toward which the unemployed gravitated, police were empowered to arrest men simply because they lacked any “visible means of support.” Once arrested, men could be rushed through crude proceedings in which they were presumed guilty, convicted on the shabbiest of evidence, and sentenced to work, often hired out to be exploited by private employers.
Free Labor ideology did much for the middle class but very little for workers. It could not dignify labor, especially the dirty and dangerous but necessary drudge work that all towns and growing industry depended on. In fact, Free Labor made it difficult to endow with dignity any work conducted in the employment of another. Dignity resided not in work but in independence. Lincoln did not find dignity inherent in the work for another done by his “penniless beginner.” Dignity lay not in the beginning but in the rising to independence. The model of dignified labor was (and still is) the work of the professional or the independent entrepreneur.
The fear of dependence, finally, prevented workers from helping themselves. Conceived as working under individual contracts with employers, employees were legally prevented from organizing themselves to improve working conditions or pay. Free Labor ideology decreed that workers didn’t need unions. Such dependence on each other was inconsistent with the philosophy of the free labor contract. Free Labor ideology simply could not consider that such contracts might be the product of asymmetrical negotiations between single individuals on one hand and a powerful employer of many individuals on the other.
This anti-union dimension of Free Labor formed quite early, as is clear in a frequently cited 1836 labor case in New York. Twenty tailors had been fired for joining together to seek higher pay. They then were blacklisted by an employers’ association for such organizing (employers, it seems, could organize without violating their independence), and in response the tailors had picketed the employers’ shops. They were arrested for conspiracy, found guilty of committing acts injurious to trade, and heavily fined.
The judge in the case concluded that only foreigners could hold foolish notions such as the need of workers to organize to enhance their bargaining power with employers. In his decision he wrote: “In this favored land of law and liberty the road to advancement is open to all, and the journeymen may by their skill and industry, and moral worth, soon become flourishing master mechanics. Every American knows . . . that he has no better friend than the laws and that he needs no artificial combination for his protection. They [ideas about “artificial combinations”] are of foreign origin and I am led to believe mainly upheld by Foreigners.” (A History of American Labor, p. 82)
Even at that early date, Free Labor was mythologized into a truth that could be imposed on reality without regard to inconvenient facts. In 1859, speaking to an audience in Milwaukee, Wisconsin, Abraham Lincoln summarized the practical conclusion when the myth was imposed on reality: “If any continue through life in the condition of hired laborer, it is not the fault of the system, but because of either a dependent nature which prefers it, or improvidence, folly, or singular misfortune” (in Lincoln, Vol. III, p. 479). The system is right, therefore all blame falls on the individual.
Such myth does not remain mere myth. It becomes institutionalized.
The full institutional logic of the Free Labor mythology played out in the post-war South. The Freedmen’s Bureau adapted Northern vagrancy laws to deal with Freedmen. Since the great solution for dealing with the freed slaves was to have them enter into free labor contracts, the notion of vagrancy was relied on to force unwilling black men into work contracts with white men. After the end of Reconstruction, the process was exploited to create an industrial slavery that provided a large pool of very cheap labor to support the development of industry in the New South. (The extent of this industrial slavery, which was closer to forced labor in Nazi or Soviet concentration camps than to ante-bellum plantation slavery, has only recently been detailed. See Slavery by Another Name by Douglas A. Blackmon).
As the vagrancy laws made clear, Free Labor left workers terribly vulnerable. Free Labor ideology could not dignify industrial labor, and it could not deliver on the promise of a brighter future. Instead, it trapped workers in an intolerable present. Its emphasis on the individual character of the labor contract preserved a powerful bias against any organized effort by workers to improve their position as a group. In hard times they were at risk not merely of starvation. They were at risk of falling out of the system altogether, of becoming “savages” outside of constitutional protections.
In the turbulent industrial development in the decades following the Civil War, that is what happened. When workers organized, the power of the militia (National Guard) and of the U.S. Army was turned against them. The readiness to use military power against the threat to civilization represented by workers became manifest in new local armories. After the Civil War, hundreds of armories were built. They were not built in the South to control blacks. They were built in towns and cities in the industrialized Northeast and Midwest to keep industrial workers under control. Symbols of local pride and middle-class power, they were as common as the convention centers and arenas of today.
Those armories, not the Frontier, were metaphors for the institutionalized position of labor within American culture.
Ideas about the self and the nation were shaped around the development of occupation-centered identity and Free Labor mythology. Those ideas developed out of the daily lives of Americans exploring the meaning of their New Order of the Ages.
The process obscured the role of capital because Free Labor ideology positioned the accumulation of capital as the expected result of every man’s work. That some accumulated more than others did not trouble the basic theory. Only after the Civil War did it become obvious that it was possible for some to accumulate so much capital that they possessed political and economic privileges equal to those enjoyed by the top ranks of European nobility and sufficient to destroy democracy.
The Constitution of 1787 did not anticipate the possibility that capital accumulations would become the foundation of a new privileged class. The framers, by and large, expected that people like themselves—the “better sort”—would naturally continue to run the government. They thought they had guaranteed that by having Senators selected by the state legislatures rather than by the people themselves and by having the President selected by that oddity, the Electoral College.
In the absence of inherited privileges, however, these devices did not preserve political power in the hands of the traditional genteel class. Although the genteel probably had not thought of the lesser classes as their equals, those lesser classes had had the gall to believe themselves equal to their betters. Driven by that belief, they had gradually forced the expansion of the franchise, until almost all of them—all of the men, that is—had the right to vote. The wake-up call for the traditional elites came in 1828, with the election of Andrew Jackson as President.
The demise of the genteel class appeared to usher in an era of unprecedented democracy. What it really did was create a vacuum quickly filled by nouveau riche upstarts. Long on money and short on fine manners and correct English, this new class of capitalists arose out of the same constitutional root as Free Labor: the contract.
The Constitution of 1787 dealt with only two relations between individuals: the involuntary relationship between master and slave and the voluntary relationships created by contracts. The Constitution is mealy-mouthed about slavery, but it takes a clear stand on contracts. The framers’ clarity was inspired in part by the fact that some state legislatures had taken to canceling debts—aiding rural people at the expense of city slickers—and so the Constitution specifically asserts that no state legislature may make any law “impairing the Obligation of Contracts.”
Other provisions in the Constitution of 1787 protected interests of the wealthy—Congressional regulation of interstate commerce, the power to levy tariffs to protect American industry, the honoring of Revolutionary War debts, the protection of property rights in slaves, the authority to regulate the value of money—but the Contracts Clause lent itself most fully to the protection of business interests.
The significance of the Contracts Clause grew immensely under the Supreme Court when John Marshall was Chief Justice (1801-1835). Marshall, who was appointed by John Adams, was a firm Federalist with a sharp commitment to and vested interest in property rights. Under his leadership the Court established the principle of its right to subject acts of Congress to judicial review and the practical importance of the Contracts Clause.
In general terms the Marshall Court shielded all contracts from interference by state governments. Marshall’s frequently cited decision in Dartmouth v. Woodward prevented the State of New Hampshire from altering Dartmouth College’s charter in order to transform the private college into a state institution. The critical importance of that decision was that it established as precedent the treatment of corporate charters as contracts. Before that decision, corporate charters, each of which required specific action by the legislature, were rarely granted. Further, they had been granted on the understanding that the purpose of the corporation was to perform a public service, such as operating a ferry, toll road, or college. The Dartmouth decision opened the way for corporations serving purely private purposes, such as manufacturing, building and operating railroads, and so forth.
Over the following decades, Supreme Court decisions led the way as legislatures and lower courts opened up the process of incorporation. By the end of the Civil War the process had advanced to the point at which experience with corporate structures had enabled corporations to aggregate and concentrate large amounts of capital under the direction of a relative handful of men.
The explosive growth of corporations in the last decades of the nineteenth century provided what looks a great deal like an economic class structure: a small number of wealthy capitalists; a cadre of educated managers, professionals, small entrepreneurs, and the skilled traditional craftsmen organized by the AFL; on the bottom, an army of workers.
Race and Gender
As Austin Lewis described it in 1911, American society looked like this:
“A few thousands of millionaire capitalist ‘kings,’ uniting the means of a few hundred thousands of passive stockholders, and served by, perhaps an equal number of well-salaried managers, foremen, inventors, designers, chemists, engineers and skilled mechanics, will absolutely control an army . . . of practically property-less wage laborers, largely Slavonic, Latin, or Negro in race.” (quoted by David R. Roediger, Working Toward Whiteness, p. 6)
The striking feature of this description is the assurance that the millionaires “will absolutely control . . . wage laborers” combined with the identification of wage laborers by race. The “new immigrants,” Slavonic (eastern European) and Latin (southern European), are lumped together with the freed slaves. Laborers were identified by race, and by definition the “capitalist kings” and their “servers” were “white men.” The economic order, like the social order and the cultural order, was also a racial order. What is left implicit is the assumption that the millionaires and their servers are native-born white men. As early as 1884, one social commentator narrowed the meaning of “American” even further, asserting that the term refers to only a fraction of American society, the “four million . . . brain-workers.”
By about 1900 the United States had reconstructed itself out of the chaos of the Civil War. People who had fought and killed each other found a way to imagine themselves as a single nation. The keystone of national reunification was the establishment of a national identity based on the presumed superiority of white males. They constructed the White Republic, in which America was its white males and white males were America.
The establishment of white male superiority as the defining characteristic of the nation was not a simple process. Both race and gender superiority were implicit in the Constitution of 1787, but they were not developed as a conscious theory. The subordination of women and non-whites was assumed, but the superiority of white males lacked explicit formulation.
Gender issues followed a distinctly American course. The exclusion of women from the political class, combined with the elimination of social class as a shared bond, led to the gradual emergence of the understanding that women dwelled in their own “sphere.” The lives of white men and white women became increasingly separate. In time that separation produced gender-based tensions which, for some women, turned into active hostility.
That hostility revealed itself most clearly after the Civil War in the Women’s Christian Temperance Union (WCTU). Under the leadership of Frances Willard, the WCTU developed what she called the “do everything” policy. The WCTU worked not only to discourage the use of alcoholic beverages (a largely male failing) but also to support female suffrage, teach women about nutrition, work with prostitutes, and push for jail reform. It opposed vivisection and supported mediation as a path to international peace.
In short, the WCTU worked in opposition to the dynamic of the world being created by men. The separation of white men and white women into separate political classes was leading some women to identify primarily with other marginalized groups within society—workers, non-whites, William Jennings Bryan’s small farmers and businessmen, prostitutes, criminals, the poor.
By the mid-1890s, however, the national movement toward identification of the nation with its white men forced women to make a choice. They had to choose between identification with other marginalized groups and identification with their husbands, fathers, brothers, and sons. It was not a real choice. Leadership of the temperance movement itself was taken over by the male-led Anti-Saloon League, and the WCTU abandoned the “do everything” policy, limiting its activities to temperance issues. The political structure implicit in the Constitution of 1787 asserted itself, and in a new way white men dominated white women.
The new separation of the lives of white men from those of white women demanded no ideological shift in thought about gender relations; women were merely restored to their “rightful” places. Racial thought, however, required a new ideology. The instinctive racism that shaped the treatment of Indians and blacks provided by itself no foundation for national identity. It was only negative. Reuniting North and South on racial grounds needed a positive theory: Anglo-Saxonism.
Anglo-Saxonism could emerge only after it became possible for Americans to accept a new racial identification with the English. This crucial shift occurred after the conclusion of the War of 1812. By the end of that war, independence and the integrity of national boundaries were assured. Americans no longer needed to look on the English as enemies. Racial identification with the English became acceptable.
American Anglo-Saxonism initially focused on establishing the superiority of Anglo-Saxons over all other European peoples, a project already developed by German scholars. While no “Anglo-Saxon” people ever existed, the term generally included the people of the northern German principalities, the Netherlands, the Scandinavians, the Scots, and the English—the northern European Protestants. Positive American racism began with efforts to assert the superiority of Anglo-Saxons over immigrants from Ireland and southern or eastern Europe.
The impact of these developments was limited as long as the country was run by the East Coast patrician establishment, men who had been shaped by Enlightenment thought. With the election of Andrew Jackson in 1828, power passed to men more in the Romantic mode. The change in thinking was most obvious in Jackson’s determination to clear all Indians off lands east of the Mississippi. The treatment of the Cherokees, who had done everything possible to assimilate themselves to American culture, revealed that the reason for the assault on Indians had changed. The problem was no longer with what Indians had done or not done. The animosity toward Indians was purely racial. Jackson hated Indians because they were Indians. Their claims to land that white men wanted could not be respected.
Cherokee resistance to removal produced two cases heard by the Supreme Court in 1831 and 1832. The results of those cases were ambiguous. The Indians were treated respectfully by the Court, but their claims to rights seem not to have been taken seriously. By 1857 the tone had hardened. In his opinion in Dred Scott v. Sandford, Chief Justice Roger Taney not only declared that blacks were not and could not become citizens; he went on to declare that Africans were “so far inferior, that they had no rights which the white man was bound to respect.”
Many Americans were outraged by the decision, but the sorry truth is that Taney was closer to prevailing attitudes toward non-whites than were the Radical Republicans who forced through the Fourteenth Amendment in 1868. The first paragraph of the amendment was written as an explicit rejection of the entire argument of the Dred Scott decision. It granted citizenship to “All persons born or naturalized in the United States.”
Congress added further language which has had continuing radical effects on American life. The Fourteenth Amendment goes on to assert that those who are citizens of the United States are also citizens of the states where they reside. Recognizing that citizenship alone was no guarantee of equality, the amendment expressly insists on legal equality for all citizens:
No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.
With these words the egalitarian ideals of the Declaration of Independence became the law of the land. So that there would be no question, the final section of the amendment states simply that “The Congress shall have power to enforce, by appropriate legislation, the provisions of this article,” a provision that made the federal government the protector of individual rights against the states.
For a few years Congress tried to enforce the rights of blacks, but the nation lacked any conception of how to include blacks as equals. Like Abraham Lincoln, many Americans persisted in wishing that all blacks could be shipped back to Africa. Some hoped that they would simply die out. There was no functional conception of how the nation could be reunited if blacks were treated as equals.
The only foundation for national reunification was white supremacy. Mere racial prejudice was not enough, but even at the end of Reconstruction in 1877 there was no systematic identification of the nation with whites. Anglo-Saxonism, a positive racism, needed further development.
The intellectual framework of white supremacy was rooted in notions of Anglo-Saxon superiority that reached back at least to the struggles between England’s Parliament and the King in the seventeenth century. The Parliamentary view was that Anglo-Saxon superiority was a matter of their superior institutions (Parliament and the common law). In nineteenth century America, that emphasis on institutions yielded to a simple racial explanation: Anglo-Saxons are inherently superior people.
By about 1890, American Anglo-Saxonism had matured. In 1889 Theodore Roosevelt published the first volume of a projected eight-volume history entitled The Winning of the West. In the introductory chapter Roosevelt outlines an interpretation of American history that makes the Americans “a separate and individual people.” He acknowledges the participation of non-English peoples in the creation of the new American people, but he insists that the core is Anglo-Saxon. He was troubled by the fact that “the negro, unlike so many of the inferior races [Indians, in other words], does not dwindle away in the presence of the white man.” (I, xviii) In spite of this aberration, Roosevelt does not add blacks to the list of contributors to the new American people, nor does he embrace southern and eastern Europeans (notably Italians and Slavs) or Asians.
Roosevelt moved the center of force down from the heavens to man, specifically to the new American race of Anglo-Saxon origin. The chosen people are no longer defined by their religious purity or by their commitment to freedom and equality; they are defined by their racial purity.
With Roosevelt we see America’s Manifest Destiny reduced to white supremacy. By 1900 white racial solidarity had reunified the nation as the White Republic.
There was one more thing required to complete the construction of “America”: a good mythological narrative that could present a compelling version of the United States as a unified white male nation.
Much of the work of creating a national narrative was carried out by academics writing supposedly scientific (i.e., evidence-based) history on the German model. The academics performed two key tasks: they disposed of blacks as a meaningful presence in American life, and they dodged the divisive fact that slavery was the dominant political issue for the first full century of national existence.
The presence of blacks was left to Southern historians. Until the migration of blacks out of the South early in the twentieth century, there were so few blacks in the North that their presence attracted little attention. While many Northern whites were disturbed by the post-Reconstruction treatment of blacks in the South, they weren’t disturbed enough to think that the issue touched them. For Southerners blacks were a central issue. Radical Republicans had tried to turn former slaves into free and equal citizens, and the task of Southern historians was to prove that blacks were unworthy of such inclusion. They accomplished that demonstration by writing histories of Reconstruction that portrayed blacks as unfit for democratic self-government. Blacks were saddled with responsibility for everything that had gone wrong during Reconstruction, most of which was beyond anyone’s control or was in fact the doing of white folks. Northerners, sorely tired of the entire issue, accepted those histories at face value.
With blacks out of the way, the historic reality of slavery remained an obstacle to reunification on the basis of white supremacy. Slavery dominated American history and gave it a North-South orientation. The solution was to reorient American history along an East-West axis, the story of the frontier and Manifest Destiny. During Reconstruction national attention followed the Army units as they moved from the occupation of the defeated Confederate states to the effort to subdue the Indians west of the Mississippi. Theodore Roosevelt’s popular The Winning of the West made the westward movement the defining story of our history. Then, in 1893, a young academic named Frederick Jackson Turner published an essay entitled “The Significance of the Frontier in American History.”
In that essay Turner propounded what is known as “The Frontier Thesis.” He argued that the democratic character of the nation had been shaped by the frontier, and he worried that the closing of the frontier, which the director of the census had recently announced, would cripple democracy. His argument has generated controversy ever since, but it put the Frontier at the center of official American history.
Meanwhile, popular culture was celebrating the Frontier in the form of Buffalo Bill’s Wild West and its imitators. For over three decades these “historic” pageant/circuses traveled throughout the United States and Europe celebrating the far western frontier as the central drama of American history. Racial conflict involved only Indians, and the cowboy with his six-guns became the hero of our history.
By the end of the nineteenth century the Western novel tradition had flourished and then degenerated into mere celebrations of violence. In 1902, however, the tradition was revived by Owen Wister’s The Virginian: A Horseman of the Plains, broadly recognized as the origin of the modern Western. It celebrates the Free Labor myth as the hero rises from a poor farm in Virginia to become a wealthy rancher in Wyoming. It also affirms the submission of women to men as an inherent element of the American story.
The final development of our mythological narrative was the presentation of central elements of the Western story with blacks in the roles of savage Indians in the film Birth of a Nation by Thomas Dixon and D. W. Griffith. Probably the single most influential artifact of popular culture in the twentieth century, the film is a brilliant propaganda piece that celebrates the reunification of North and South on the basis of white racial supremacy. Seen by tens of millions of Americans, it replaced the gentle old black male of Uncle Tom’s Cabin and Joel Chandler Harris’s Uncle Remus with the black male as bestial rapist. The triumph of white Southerners over oppressive rule by blacks leads to white unity. The film’s closing shot shows a peaceful world of happy white folk presided over by an image of Jesus.
Birth of a Nation was released in 1915, the 50th anniversary of the end of the Civil War. It stands as the completion of an effective national narrative of America as a white man’s country.
The Revolution of the 1960s
The White Republic provided the United States with stability for two generations. It survived the Great Depression with its racial character enhanced by the two most effective New Deal programs, Social Security and the Federal Housing Administration. Neither program explicitly excluded blacks, but both did so very effectively.
After World War II, however, things began to fall apart. White males continued to enjoy their entitlement, but the status that supported those privileges slowly gave way. White men continued to possess dominant power, but the post-war generation could not, would not grant them authority. The Civil Rights movement, and then the Women’s movement, seized the opportunity provided by that failure of authority and transformed America.
We celebrate the moral achievement of those movements, but the revolution lay more in the collapse of white male status than in the reforms themselves, for successful reform depended on the general collapse of authority. Unfortunately, the reforms were less sweeping than the collapse. While the status of white men, as white men, failed, the power of white men remained largely undisturbed.
The system had nevertheless been profoundly shaken. Power—social, economic, and political power—had been legitimized by the superior status of white men. The collapse of white male status created a crisis of legitimacy. A new foundation for power in America needed to be fashioned. The Democratic Party, reeling from the murders of Martin Luther King and Robert Kennedy and burdened with the albatross of Vietnam, imagined no effective policy. It embraced the revolution but failed to pursue its logic: the collapse of white male entitlement had to be followed by an assault on white male power. Instead, Democrats fell back on judicial decisions rather than legislated change, pursuing increasingly marginal incremental advancement of minority and female rights. Their minds were untarnished by new ideas.
In the election of 1968, Richard Nixon exploited the insecurities created by the loss of white male status. With winks, nods, and coded phrases, he implied that he would restore white males to their proper place in American life.
It worked. That is, Nixon won the election. White supremacy, however, was merely an electoral strategy for Nixon. It was not a governing policy. The problem of legitimate power remained unresolved. The Republican Party adopted a basically counter-revolutionary attitude, but an overtly racist and sexist policy had become unthinkable on the national level.
The first Reagan administration created a serious alternative. Reagan continued to speak in code to those who wished to restore the gender and racial claims to privilege for white men, but he in fact made little effort to turn back the clock. Instead he worked to establish money alone as the legitimate basis of social, political, and economic power.
Reagan in effect recognized a fait accompli. Beginning in the early 1970s, the collapse of white racial solidarity helped American business cease pretending that it had any responsibility to employees. The position of labor in the new scheme of things was made clear in 1981, when President Reagan fired striking air traffic controllers and ruthlessly broke their union. Corporations relied increasingly on the theory that they owed allegiance to no one but stockholders—not to employees, not to customers, not to suppliers, not to the communities within which they lived, not to the United States. Profit—money—was all that business was about. The main business of government was to assist those who made a great deal of money to make even more.
Government ceased serving all the people and instead increasingly served the wealthy. Reductions of taxes on large incomes and on businesses are the best known steps taken on behalf of wealth, but as early as 1990 Kevin Phillips had detailed the myriad ways in which federal policies had been managed to concentrate wealth (see The Politics of Rich and Poor). The Clinton administrations did not raise significant challenges to these policies, and George W. Bush, of course, pursued them with amazing vigor.
Money has had its chance to legitimize its claim to power. Although personal and corporate wealth have acquired greater power than they have enjoyed since the 1890s, that power has failed in myriad ways to establish political legitimacy. Our political confusion arises from the lack of a legitimate source of sovereign power. If we are to restore our vitality as a nation, we must reconstitute the sovereign power of the governed, the power of “We the People.”
The issue now, as it was in 1863, is whether “government of the people, by the people, for the people” shall perish.