THE REPUBLIC OF BEES
In 1776, the colonies declared themselves independent. The bitter war that followed ended in an American victory. Peace, of course, raised as many questions of government as it answered. A plan of government is a plan for distribution of the power and wealth of a society. The choice of system, then, is no idle exercise in political theory. How to plan the new American government was the major policy issue of the late 18th century. The first grand scheme was embodied in the Articles of Confederation. It proved unsatisfactory to powerful circles in the country. After the failure of the Articles, a federal Constitution was drawn up in 1787, and ratified in 1788.
Each colony, too, underwent its own revolution. Colonies became states, and embarked on new courses of action with new problems and new programs. First, they had to fight a war and patch up domestic disruptions. All this called for a major outburst of lawmaking. In Pennsylvania, for example, a constitutional convention, in 1776, declared a general amnesty and established a new form of government. Old officials were replaced by men loyal to the Revolution. The ordinary business of government was to continue, where possible; and the emergencies of war had to be coped with. In September 1777, British troops "penetrated into [the] state, and after much devastation and great cruelty in their progress," seized Philadelphia; in October, the state government created a "council of safety," with vast and summary powers "to promote and provide for the preservation of the Commonwealth." It had power to seize goods "for the army and for the inhabitants," punish traitors, and "regulate the prices of such articles as they may think necessary." But the "ordinary course of justice" was to continue as far as feasible. In the same year, the legislature passed a bill of attainder against a number of men who had "traitorously and wickedly" gone over to the king. The state redefined and punished treason, declared bills of credit of the Continental Congress and the state to be legal tender; and, inevitably, legislated about the militia, army supplies, taxes, and the policy of war.
When the war ended, debates over law continued. The king of England and his government had been successfully overthrown. Should the king's law be also overthrown? Should ordinary private law be radically altered? The first generation seriously argued the question. The common law was badly tarnished; so was the reputation of the lawyers, many of whom had been Tories. It seemed to some men that new democratic states needed new institutions, from top to bottom, including fresh, democratic law. A pamphleteer, who called himself Honestus, asked, in 1786: "Can the monarchical and aristocratical institutions of England be consistent with...republican principles?" It was "melancholy" to see the "numerous volumes" of English law, "brought into our Courts, arranged in formidable order, as the grand artillery to batter down every plain, rational principle of law." Thomas Paine, an old firebrand, spoke for at least some zealots when he denounced, in 1805, the "chicanery of law and lawyers." He complained that Pennsylvania courts, even at that late date, had "not yet arrived at the dignity of independence." The courts, he said, still "hobble along by the stilts and crutches of English and antiquated precedents," which were often not democratic at all, but "tyrannical." During Shays's Rebellion, in Massachusetts (1786), mobs stopped the courts from sitting, and forcibly staved off execution of judgments against debtors. It was easy to attribute class bias to the courts, and attribute this class bias in turn to the antiquated, oppressive, inappropriate common law.
There were two apparent alternatives to the stilts and crutches. The common law could be replaced by some rival system. Or all systems could be abandoned in favor of natural principles of justice. The first alternative had some slight basis, in hope if not in fact. There were other systems of law. After the French revolution, American liberals were particularly attracted to the French civil law. In the early 19th century, the Napoleonic Code served as a symbol and model of clarity and order. Some civil-law jurists were translated into English during this period: A Treatise on Obligations, Considered in a Moral and Legal View, "translated from the French of [Robert] Pothier," appeared in New Bern, North Carolina, in 1802. To some small extent, French scholars influenced American legal thought. Compared to civil law, common law seemed, to a number of jurists, to be feudal, barbaric, uncouth.
In hindsight, the common law had little to fear. It was as little threatened as the English language. The courts continued to operate, continued to do business; they used the only law that they knew. Few lawyers had any grasp of French. French lawbooks were rare and inaccessible; English authorities flooded the country. To be sure, there were some American jurists who had the education and skill to handle Continental law -- James Kent of New York, for example. Joseph Story, who served on the Supreme Court, was a tower of erudition. These men cited and used bits of foreign law in their writings and opinions. But they were not revolutionaries. They believed in purifying and improving the law, not in overthrowing it. They were willing to snatch doctrines and ideas from Continental Europe; but even English law did that. One of the culture heroes of the American legal elite was England's Lord Mansfield, who died in 1793. Mansfield was Scottish by birth and an ardent admirer of Roman-flavored civil law.
And of course the common law had many defenders. Not everybody saw the common law as old and despotic. It was also the birthright of free men, a precious inheritance, perverted by the British under George III, but still a vital reality. One rhetorical pillar of the men of 1776 was that the common law embodied fundamental norms of natural law. The first Continental Congress, in 1774, adopted a Declaration of Rights; it declared that the colonies were "entitled to the common law of England," in particular the right of trial by jury. Americans were also entitled to the benefit of those English statutes which "existed at the time of colonization; and which they have, by experience, respectively found to be applicable to their several local and other circumstances."
Common-law lawyers were among the heroes of the Republic. John Adams was one; Thomas Jefferson, for all his ambivalence toward common law and its judges, another. Lawyers mostly drafted the state and federal constitutions. Courts were increasingly manned by lawyers, who listened to the arguments of other lawyers. Lawyers moved west with the line of settlement; they swarmed into state capitals and county seats. Wherever one looked in political life -- in town, city, county, state, and national government -- the lawyers were there. Unlike some later revolutions, and some earlier colonial Utopias, the new republic did not try to do business without lawyers. Old lawyers continued to function, training new lawyers in their image, who, like their teachers, turned almost instinctively to common law. The common law was also a weapon of integration. The Northwest Ordinance imposed common law on the lands of the American frontier. In the prairies and forests, where French settlers lived and worked in the path of the American onrush, the common law was an agent of American imperialism.
The common law would have to be Americanized, of course. Now that the states had freedom to choose, what parts of English law would remain in force? This was a tortuous question, not easily solved. Many states passed statutes to define the limits of the law in force. A Virginia law of 1776 declared that the "common law of England, all statutes or acts of Parliament made in aid of the common law prior to the fourth year of the reign of King James the first, and which are of a general nature, not local to that kingdom...shall be considered as in full force." The Delaware constitution of 1776 (art. 25) provided that "The common law of England as well as so much of the statute law as has been heretofore adopted in practice in this State, shall remain in force," except for those parts which were "repugnant to the rights and privileges" expressed in the constitution and in the "declaration of rights."
The New York experience was particularly complex. A law of 1786 declared the common law in force, and such English statutes as were in effect in the colony on April 19, 1775. Later, New York specifically re-enacted some British laws -- the Statute of Frauds, for example, a law first passed in 1677, and which had virtually become a part of the common law. In 1788, a New York law, "for the Amendment of the Law, and the better Advancement of Justice," declared that "after the first day of May next," no British statutes "shall operate or be considered as Laws" in the state. The New York Constitution of 1821 (art. VII, sec. 13) stated that "Such parts of the common law, and of the acts of the legislature of the colony of New York, as together did form the law of the said colony" on April 19, 1775, and the resolutions of the colonial Congress, "and of the convention of the State of New York," in force on April 20, 1777, would continue to be law, unless altered or repealed, and unless they were "repugnant" to the constitution. No mention was made of British statutes; for good measure, an act of 1828 specifically pronounced the British statutes dead.
Yet even this flock of New York laws fell short of solving the problem. A New York court later held that some English statutes had become part of the "common law" of the colony. This meant that an undefinable, unknowable group of old laws somehow maintained a ghostly presence. They lived on, of course, only insofar as they were not "repugnant" to the constitution or unsuitable to conditions. One could never, then, be sure if an old law were dead or alive. New York was not the only state whose judges held that some of the old statutes were valid, and thus sentenced the legal public to a certain amount of uncertainty. To this day, an occasional case still turns on whether some statute or doctrine had been "received" as common law in this or that state. The question of "reception" had troubled the colonials too. Independence merely altered the form of the question. And in a broader sense, the question is an abiding one in all common-law jurisdictions. Judges must constantly re-examine the law, to see which parts still suit society's needs, and which parts must be thrown on the ash heap, once and for all.
The reception statutes dealt with the older English law. What about new law? There was, as expected, a strong burst of national pride. To Jesse Root of Connecticut, writing in 1798, it was "unnecessary, and derogatory" for courts of an independent nation to be governed by foreign law. His ideal was "the republic of bees," whose members "resist all foreign influence with their lives," and whose honey, "though extracted from innumerable flowers," was indisputably their own. In pursuit of the republic of bees, New Jersey passed a law, in 1799, that
no adjudication, decision, or opinion, made, had, or given, in any court of law or equity in Great Britain [after July 4, 1776]...nor any printed or written report or statement thereof, nor any compilation, commentary, digest, lecture, treatise, or other explanation or exposition of the common law,...shall be received or read in any court of law or equity in this state, as law or evidence of the law, or elucidation or explanation thereof.
Kentucky prohibited the mere mention of recent British law. Its statute, passed in 1807, declared that "reports and books containing adjudged cases in...Great Britain...since the 4th day of July 1776, shall not be read or considered as authority in...the courts of this Commonwealth." During Spring Term, 1808, Henry Clay, appearing before the court of appeals of Kentucky, "offered to read" a "part of Lord Ellenborough's opinion" in Volume 3 of East's reports; the "chief justice stopped him." Clay's co-counsel argued that the legislature "had no more power to pass" such a law than to "prohibit a judge the use of his spectacles." The court decided, however, that "the book must not be used at all in court."
But Lord Ellenborough was not so easily banished, in New Jersey, or Kentucky, or elsewhere. The New Jersey statute was repealed in 1819. As a practical matter, English law continued to be used by lawyers and courts, throughout the period, throughout the country. England remained the basic source of all law that was not strictly new or strictly American. The habits of a lifetime were not easily thrown over, despite ideology. Indigenous legal literature was weak and derivative. There was no general habit of publishing American decisions; American case reports were not common until a generation or more after Independence. To common-law lawyers, a shortage of cases was crippling. To fill the gap, English materials were used, English reports cited, English judges quoted as authority. In the first generation, more English than American cases were cited in American reports. Ordinary lawyers referred to Blackstone constantly; they used his book as a shortcut to the law; and Blackstone was English to the core. Sometimes curiously old-fashioned bits of law -- phrases, old doctrines, old writs -- turned up in curious places (for example, the American frontier); the reason was the ubiquity of Blackstone.
American law continued, in short, to borrow. The English overlay was obvious, pervasive -- but selective. The English doctrines that were invited to this country were those which were needed and wanted -- and only those. Sweeping changes took place in American law in the years between 1776 and the middle of the 19th century. During that time, there developed a true republic of bees, whose flowers were the social and economic institutions that developed in their own way in the country. They, not Lord Ellenborough and Lord Kenyon, were the lawmakers that made American law a distinctive system: a separate language of law within the family founded in England.
The second apparent alternative to the common law was also a mirage. To abolish the tyranny of lawyers and their rules, to reduce law to a common-sense system, at the level of the common man's understanding, a system of simple, "natural" justice: this was an age-old yearning, but it flared up with special vigor after 1776. As one citizen of Kentucky put it, the state needed "a simple and concise code of laws...adopted to the weakest capacity."
In part, the antilaw movement was an outgrowth of radical politics. One current of thought distrusted the common law on the grounds that it was remote from the needs of ordinary people, and was biased toward the rich. Another current of thought distrusted the law because it was archaic, inflexible, irrelevant; it did not suit the needs of merchant or businessman. Both groups could make common cause against lawyers' law, which suited nobody's wants but the lawyers. There was a general interest, then, in a reform of legal institutions, in which rich and poor, radical and conservative could share. In a complex society, however, it was Utopian to imagine that lawyers' law could be overthrown and replaced by natural justice, whatever that might mean. On the contrary, more and more rules, of more and more definite shape, were needed as time went on. The reform urge, as we shall see, did not abate; but it came to mean, not artlessness, but adaptation to the needs of a market economy.
One basic, critical fact of 19th-century law was that the official legal system penetrated, and had to penetrate, deeper and deeper into society. Medieval common law was not the law everywhere in England; nor was it everybody's law. American law was more popular, in a profound sense. It had no archaic or provincial rivals. It had no competitive substratum. Paradoxically, American law, divided into as many subsystems as there were states, was less disjointed than the one "common law" of older England.
Of course, millions were beyond the reach of formal law and indifferent to it. But comparatively speaking, American law had an enormous range. It drew its strength from the work and wealth of millions, and it affected the work and wealth of millions more. In 16th- or 18th-century England, few people owned or dealt in land. Only a small percentage were inside the market economy. Only a few were potential customers for family law, the law of torts, the law of corporations. There was surely less oligarchy in the United States than in the old kingdoms of Europe. A law for the millions, for the middle class, had to develop. And this law, to survive, had to be more pliant and accessible than a law for the wealthy few.
In short, law had to suit the needs of its customers; it had to be at least in a form that lawyers, as brokers of legal information, could use. What happened to American law in the 19th century, basically, was that it underwent tremendous changes, to conform to the vast increase in numbers of consumers. It is dangerous to sum up long periods and great movements in a sentence. But if colonial law had been, in the first place, colonial, and in the second place, paternal, emphasizing community, order, and the struggle against sin, then, gradually, a new set of attitudes developed, in which the primary function of law was not suppression and uniformity, but economic growth and service to its users. In this period, people came to see law, more and more, as a utilitarian tool: a way to protect property and the established order, of course, but beyond that, to further the interests of the middle-class mass, to foster growth, to release and harness the energy latent in the commonwealth: "Dynamic rather than static property, property in motion or at risk rather than property secure and at rest."
It was not only property to which the word dynamic seemed more and more apt. These two polar words -- dynamic and static -- aptly describe a fundamental change in the concept of law. The source of the change lay not so much in the Revolution as in revolution: the transformation of economy and society that occurred in the machine age and the age of rational thought. A dynamic law is a man-made law. The Constitution talked about natural rights, and meant what it said; but these rights did not define the duties and status of the subject; rather, they served as a framework for the fulfillment of people's needs and desires. Gradually, an instrumental, relativistic theory of law made its mark on the system. It meant a more creative view of precedent. It meant asking for the functions of past law, and measuring these against demands of the present and future. Once, change in law was looked on as rare and treated almost apologetically. But in the 19th century, Americans made law wholesale, without any sense of shame. Basically, this was legislative law, law made by elected representatives, rather than law made by judges. To be sure, the boldness of the judges and the rapidity of social change meant that there was room for both institutions in the house of creative law-making; the judges seized this opportunity, and played a mighty, if secondary, role in making fresh law.
CONSTITUTIONS: FEDERAL AND STATE
The Revolutionary period was, by necessity, an age of innovation in fundamental law. The old ties with England had been snapped. The states and the general government decided to put their basic political decisions in the form of written constitutions. Some states had begun as chartered colonies; they had gotten into the habit of living under these charters, and had even learned to revere them, as guarantees of their liberty. American statesmen tended to look on a written constitution as a kind of social compact -- a basic agreement among citizens, and between citizens and state, setting out mutual rights and duties, in permanent form.
The Articles of Confederation (1777) envisioned a loose, low-key grouping of highly sovereign states. It did not provide for a strong executive. It had no provision for a federal judiciary. Congress, however, was given some judicial power; it was "the last resort on appeal in all disputes and differences...between two or more states concerning boundary, jurisdiction or any other cause whatever." Congress also had power over matters of admiralty law, with "sole and exclusive right" to establish "rules for deciding, in all cases, what captures on land or water shall be legal," and how prizes might be "divided or appropriated." Congress also had sole right to set up "courts for the trial of piracies and felonies committed on the high seas," and courts "for receiving and determining, finally, appeals in all cases of captures" (art. IX).
The Articles of Confederation, by common consent, were a failure; the Constitution of 1787 was a stronger, more centralizing document. The Northwest Ordinance (1787), which set up a scheme of government for the Western lands, and which was enacted shortly before the Constitution, took it for granted that all future states would have a "permanent constitution." Any new states carved out of the Northwest Territory would have a "republican" constitution, consistent with federal law (Northwest Ordinance, 1787, art. V).
The original states had in theory the option to write or not write constitutions. But most of them quickly chose the way of the newly written word. Within a short time after the war broke out, eleven states had drafted and adopted new constitutions. To some, a constitution was a rallying point, a symbol of unity during war. The New Jersey Constitution (1776) put it this way:
in the present deplorable situation of these colonies, exposed to the fury of a cruel and relentless enemy, some form of government is absolutely necessary, not only for the preservation of good order, but also the more effectually to unite the people, and enable them to exert their whole force in their own necessary defense.
A few states chose to rest upon their original charters. But these, too, were eventually replaced by new documents, of the constitutional type. Connecticut discarded its charter and adopted a constitution in 1818. Eventually, every state in the union came to have a constitution in the strict sense of the word. All, in short, embarked on careers of making, unmaking, and remaking constitutions.
Constitutionalism answered to a deep-seated need, among members of the articulate public, for formal, outward signs of political legitimacy. This urge had driven tiny, isolated colonies in what is now Rhode Island or Massachusetts to express the structure and principles of government in the form of an agreement -- a visible, legible bulwark against the lonely disorder of life outside the reach of the mother country. Much later, but by something of the same instinct, the remote Oregon pioneers, in a no-man's land disputed among nations, drew up a frame of government and called it a constitution. So did the residents of the "lost state of Franklin" in the 1780s, in what is now part of eastern Tennessee. So did the handful of citizens of the "Indian Stream Republic," in disputed territory near the border of New Hampshire and Canada. And so did the Mormons of the "State of Deseret." These "constitutions," to be sure, were mostly copycats; they borrowed provisions from existing constitutions, taking a phrase here, a clause there, and making whatever changes were considered appropriate. They were short-lived and of dubious legality. But they illustrate how strong the idea of the written constitution had become in American life.
There have been dozens of state constitutions. Their texts, style, and substance vary considerably. Some of the earliest ones, written before the 1780s, were quite bold and forward-looking for their times. The first Pennsylvania Constitution (1776) represented a sharp victory for the liberals of the state. Virginia pioneered a Declaration of Rights (1776). The idea and content of the Bill of Rights came from sources in the states. The federal Constitution could not have been ratified without the promise of a bill of rights, which took the form of ten amendments. After 1787, the language and organization of the federal Constitution became in turn a powerful model for state constitutions. One feature, however, was not easily transferred to the states: durability. There has been only one federal Constitution. It has been amended from time to time -- but never overthrown. A few states (for example, Wisconsin) have also had only one constitution. Other states have followed a more variegated, or chaotic, constitutional career. Louisiana has had nine constitutions, perhaps ten, depending on how one counts. Georgia has had at least six.
The federal Constitution was marvelously supple, put together with great political skill. The stability of the country -- Civil War crisis aside -- has been the main source of its amazing survival. But the Constitution itself deserves a share of the credit. It turned out to be neither too tight nor too loose. It was in essence a frame, a skeleton, an outline of the form of government; on specifics, it mostly held its tongue. The earlier state constitutions, before 1787 and for some decades after, also guarded themselves against saying too much. There were, of course, some idiosyncratic features, even before 1787, in state constitutions. New Hampshire (1784), in the spirit of Yankee thrift, solemnly declared that "economy" was a "most essential virtue in all states, especially in a young one; no pension shall be granted, but in consideration of actual services, and...with great caution,...and never for more than one year at a time." But most state constitutions began with a bill of rights, described the general frame of government, and left it at that.
A constitution, if at all different from ordinary law, has two functions. First, it provides a terse exposition of the structure of government -- its permanent shape, the nature of its organs or parts, and their boundaries and limits. Second, it may contain a list of essential rights, essential limitations on government, essential rules -- all those propositions of high or highest law, which the drafters mean to secure against the winds of temporary change. But this second function has no natural boundary. Opinions differ from generation to generation on what rights and duties are most fundamental. Even the federal Constitution was more than mere framework. Imbedded in it were fragments of a code of law. Trial by jury, for example, was guaranteed (art. III, sec. 2, par. 3). The Constitution defined the crime of treason (art. III, sec. 3), and set out minimum requirements for convicting any man of this crime. The Bill of Rights contained a miniature code of criminal procedure.
What existed in embryo, and in reasonable proportions, in the federal Constitution was carried to much greater lengths in the states. The inflation of constitutions reached its high point (or low point) after the Civil War. But the process began long before that. Even the state bills of rights became bloated. The federal Bill of Rights had ten sections; Kentucky's, in 1792, had 28. Some of these were quite vague: "elections shall be free and equal" (art. XII, sec. 5); others seemed hardly to warrant their exalted position -- for example, that estates of "such persons as shall destroy their own lives shall descend or vest as in case of natural death."
The Delaware constitution of 1792 was another offender. It included many details of court organization and procedure; for example, "No writ of error shall be brought upon any judgment...confessed, entered, or rendered, but within five years after the confessing, entering or rendering thereof." This constitution also specified minutely how courts should handle accounts of executors, administrators, and guardians. The Alabama constitution of 1819 provided that "A competent number of justices of the peace shall be appointed in and for each county, in such mode and for such term of office, as the general assembly may direct." Their civil jurisdiction, however, "shall be limited to causes in which the amount in controversy shall not exceed fifty dollars." Clearly, the fifty-dollar limit was no immutable right of man. The Tennessee constitution of 1796 set a ceiling on the governor's salary ($750 a year) and forbade any change before 1804. No one could deduce these numbers from natural law.
There was a point to every clause in these inflated constitutions. Each one reflected the wishes of some faction or interest group, which tried to make its policies permanent by freezing them into the charter. Constitutions, like treaties, preserved the terms of compromise between warring groups. These sometimes took the form of a clause that postponed the power of the state to enact a given kind of law. The federal Constitution left the slave trade untouchable until 1808; until that year "the Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress" (art. I, sec. 9, par. 1). Ohio (1802) made Chillicothe the "seat of government" until 1808; the legislature was not to build itself any buildings until 1809. For very delicate issues, the tactics of constitutionalism appeared essential. Otherwise, slight changes in political power could upset the compromise. One legislature can swiftly repeal the work of another; a constitution is harder to change.
Between 1790 and 1847, state constitutions became more diverse and diffuse. Some developments, like some problems, were peculiar to one state, or to one group of states; some were common to the country as a whole. The most general problems were apportionment and suffrage. Any change in the electoral map or in the right to vote meant a reallocation of political power. The suffrage was a bottleneck of law; who voted determined who ruled. Hence, constitutional disputes over suffrage and apportionment were widespread and sometimes bitter. In Rhode Island, the franchise was narrow, and the apportionment scheme outdated. Only those men who owned real estate worth $134 were entitled to vote; this excluded perhaps nine out of ten even of white males over 21. Conservatives stubbornly resisted any change. The so-called "rebellion" of Thomas Dorr (1842) was an unsuccessful, mildly violent attempt to force a change. A new constitution, which went into effect in 1843, finally brought about some measure of reform.
In other states, bloodless revolutions overthrew constitutions and reformed the suffrage. The search for permanence was constant, but permanence escaped men's grasp. The Pennsylvania constitution of 1776, a product of advanced 18th-century liberalism, was replaced in 1790 by a much more conservative constitution; and in 1838 by a moderate one. Statute books were supple; new governments changed them as they wished. Constitutions were brittle. They could be patched up at times; but when they were too deeply impregnated with the policies and interests of an old or lost cause, they had to be completely redone. Inflexibility was the vice of constitutions, as well as the virtue.
An observer with nothing in front of him but the texts of these state constitutions could learn a great deal about state politics, state laws, and about social life in America. The Southern constitutions gave more and more attention, over time, to the protection of slavery, and the repression of free blacks. Legislatures were forbidden to emancipate slaves, unless the master agreed and was compensated. In Pennsylvania (1838) any person who fought a duel, or sent a challenge, or aided or abetted in fighting a duel, was "deprived of the right of holding any office of honor or profit in this State." The Connecticut constitution of 1818, though it paid lip service to freedom of worship, froze every resident into his "congregation, church or religious association." A man might withdraw only by leaving a "written notice thereof with the clerk of such [religious] society." Constitutions often dealt with the state militia, a matter of considerable interest to the Revolutionary generation. In Ohio (1802) brigadiers-general were to be "elected by the commissioned officers of their respective brigades." Some states barred practicing clergymen from public office. The Tennessee constitution of 1796 testily remarked that "ministers of the gospel are, by their professions, dedicated to God and the care of souls, and ought not to be diverted from the great duties of their functions." The draft constitution of "Franklin," in 1784, would have extended this ban to lawyers and "doctors of physic." The Georgia constitution of 1777 declared that "Estates shall not be entailed; and when a person dies intestate, his or her estate shall be divided equally among their children; the widow shall have a child's share, or her dower, at her option." 
As early as 1776, North Carolina provided that "the person of a debtor, where there is not a strong presumption of fraud, shall not be confined in prison, after delivering up, bona fide, all his estate real and personal, for the use of his creditors, in such manner as shall be hereafter regulated by law." Many 19th-century constitutions contained provisions of this general type, on the subject of imprisonment for debt.
State constitutions reflected the theories of the day on separation of powers, and on checks and balances. The earlier the constitution, however, the weaker the executive branch. For example, 18th-century constitutions gave only feeble powers to the chief executive (called a governor, like his colonial antecedent). His term of official life was typically brief. The Maryland constitution of 1776 solemnly asserted that "a long continuance" in "executive departments" was "dangerous to liberty"; "rotation" was "one of the best securities of permanent freedom." This constitution practiced what it preached. The governor -- a "person of wisdom, experience, and virtue" -- was to be chosen each year, on the second Monday of November, for a one-year term, by joint ballot of the two houses of the legislature. But he could not continue his wisdom in office "longer than three years successively, nor be eligible as Governor, until expiration of four years after he shall have been out of that office."
The Pennsylvania constitution of 1776 showed a similar bias. It too called for rotation in office. In England, office tended to depend on the crown or the great grandees; public office was essentially a nice warm udder to be milked. American constitutions firmly rejected this notion. According to the Pennsylvania constitution of 1776, "offices of profit" were not to be established; such offices led officeholders to a state of "dependence and servility unbecoming freemen," and created "faction, contention, corruption, and disorder" in the public (sec. 36). But, as the emphasis on rotation shows, these constitutions also rejected the modern notion of politics as a specific career. Rather, it was a duty, a form of public service, open to the virtuous amateur. This notion, alas, did not long survive.
Early constitutions, as was mentioned, slighted the executive; they preferred to give the lion's share of power to the legislature. In the light of American political history, this was only natural. The colonial governor -- and the judiciary, to a certain extent -- represented foreign domination. The assemblies, on the other hand, were the voice of local influentials. The Pennsylvania constitution of 1776 gave "supreme legislative power" to a single house of representatives. No upper house or governor's veto checked its power. Over the course of the years, however, the states became disillusioned with legislative supremacy. The governor was one beneficiary of this movement. Typically, he gained a longer term of office, and the power to veto bills. In the federal government, the President had this power from the start.
Judicial power, too, increased at the expense of the legislature. Judicial power took the form of judicial review; the judges, in private litigation, passed on acts of other branches of government; and had the right to declare these acts void if, in the judges' opinion, they were unauthorized by the constitution. Ultimately, judicial review fed on constitutional detail; the more clauses a constitution contained, especially clauses that did something more than set out the basic frame of government, the more potential occasions or excuses for review. But the power was, on the whole, an unused sword in this period. True, in the landmark decision of Marbury v. Madison (1803), John Marshall and the Supreme Court, for the first time, dared to declare an act of Congress unconstitutional. But the Court made no clear use of this power, against Congress, for over 50 years. The weapon was used more frequently against state statutes. State supreme courts, too, began to exercise judicial review. It was an uncommon technique; it was hated by Jeffersonians; some judges resisted it; it made little impact on the ordinary working of government. But when its occasion arose, it was an instrument of unparalleled power.
Legislative supremacy declined because influential citizens were more afraid of too much law than of not enough. In a number of states, scandals tarnished the reputation of the legislatures. Blocs of voters became afraid that landlords, "moneyed corporations," and other wealthy and powerful forces were too strong in the lobbies of these assemblies. A movement arose to limit the power of the legislatures. Rules to control legislation were written into one constitution after another. The process began modestly enough. Georgia's constitution (1798) outlawed the practice of legislative divorce, except if the parties had gone to "fair trial before the superior court," and obtained a verdict upon "legal principles." Even then, it took a "two-thirds vote of each branch of the legislature" to grant a divorce. Indiana, in 1816, forbade the establishment by statute of any "bank or banking company, or monied institution for the purpose of issuing bills of credit, or bills payable to order or bearer."
The Louisiana constitution of 1845 was something of a turning point. More than those that came before, this constitution was a charter of economic rights and obligations, and a code of legislative procedure, as well as a plain frame of government whose economic import was implicit. The constitution sharply restricted the state's power to pledge its credit or to lend its money. The state was not to "become subscriber to the stock of any corporation or joint-stock company." Lotteries, legislative divorces, and special corporate charters were forbidden. Every law was to "embrace but one subject, and that shall be expressed in the title." No new exclusive privileges or monopolies were to be granted for more than twenty years. No banks of any sort were to be chartered. These were not chance notions. They were not, in the eyes of contemporaries, extreme. The state was trying to reach legitimate legislative goals; but what was new (and ominous) was that it was doing it through anti-legislation: by foreclosing whole areas of law to statutory change. Other states enthusiastically followed Louisiana's lead. But when times changed, and conditions changed, these overblown constitutions became all too often embarrassments. They were therefore evaded, or amended, or done over completely.
No two state constitutions were ever exactly alike. But no constitution was pure innovation. Among the states, there was a great deal of copying, of constitutional stare decisis. A clause or provision tended to spread far and wide, when it met some felt need, or caught the fancy of lawmakers and constituents. New states embarked on statehood with constitutions that borrowed heaps of clauses and sections from constitutions in older states. Some new states favored the latest, most recent model; others looked to their neighboring states; others to the home state of their settlers. The New York constitution of 1846 left a deep mark on Michigan, and later on Wisconsin. The first constitution of California was heavily indebted to Iowa; Oregon was indebted to Indiana.
Borrowing and influence are not, of course, the same. The states shared a common political culture. Michigan was not a legal satellite of New York; the people in the two states were Americans of the period, and, for the most part, thought alike on political and legal issues. The New York constitution was a recent model; thus people in Michigan used it. When convenient patterns were readily at hand, it was inefficient to start drafting from scratch. Borrowing was always selective. No constitution was swallowed whole. On matters of great importance, conscious choice was never absent. No state was bound to adopt the constitution of another state, or any part of it. They adopted out of expedience and fashion.
In a common-law system, judges make at least some of the law, even though legal theory has often been coy about admitting this fact. American statesmen were not naive; they knew it mattered what judges believed and who they were. How judges were to be chosen and how they were to act was a political issue in the Revolutionary generation, at a pitch of intensity rarely reached before or since. State after state -- and the federal government -- fought political battles over issues of selection and control of the judges.
The bench was not homogeneous. Judges varied in quality and qualification, from place to place, and according to their position in the judicial pyramid. Local justices of the peace were judges; so were the justices of the United States Supreme Court. English and colonial tradition had allowed for lay judges, as well as for judges learned in law. There were lay judges both at the top and the bottom of the pyramid. In the colonies, the governor frequently served, ex officio, as chancellor. New Jersey continued this system, in its constitution of 1776. This constitution also made the governor and council "the court of appeals in the last resort in all causes of law as heretofore." Since governor and council were or might be laymen, this meant that nonlawyers had final control over the conduct of trials and the administration of justice. In the New York system, too, laymen shared power at the apex of the hierarchy of courts. The constitution of 1777 set up a court "for the trial of impeachments, and the correction of errors," consisting of senators as well as judges. This system lasted well into the 19th century.
The lay judges were not necessarily politicians, though ordinarily they were. But they were invariably prominent local men. William E. Nelson has studied the background and careers of the eleven men who served as justices of the superior court of Massachusetts between 1760 and 1774, on the eve, that is, of the Revolution. Nine had never practiced law; six had never even studied law. All of these lay judges, however, had "either been born into prominent families or become men of substance." Stephen Sewall, chief justice in 1760, was the nephew of a former chief justice; he had served thirteen years as a tutor at Harvard College.
The base of the pyramid was even more dominated by laymen. Lay justice did not necessarily mean popular or unlettered justice at the trial court level. The English squires were laymen, but hardly men of the people. Lay justice in America had something of the character of rule by the squires. Nor was lay justice necessarily informal. Laymen, after years on the bench, often soaked up the lawyer's jargon and tone. After all, the difference between lawyers and nonlawyers was not that sharp; frequently, a man came to the bar after the briefest of clerkships and with little more than a smattering of Blackstone. The way lay judges absorbed their law was not much different from the way men in general learned to be "lawyers."
There are many anecdotes in print about the coarseness and stupidity of lay judges. Old lawyers, writing years later, and historians of bench and bar, have tended to drag the reputation of these judges through the mud. For sentimental and other reasons, these lawyers and lawyer-historians wanted to exaggerate the rawness and vulgarity of pioneer judges, and to make the point that laymen who wore the clothing of judges must be incompetent. The actual facts are harder to unearth. Popular feelings against the courts, in the late 18th century, had nothing to do with whether judges were laymen or not. The complaint was not that justice was crude, but that it was biased in favor of creditors, in favor of the rich.
In any event, the lay judge was in slow retreat throughout the period. Eventually, he disappeared entirely from upper courts. The first lawyer on Vermont's supreme court was Nathaniel Chipman (1752-1843). He took office, in 1787, as an assistant judge of the court. Not one of the other four judges was an attorney. On such a court, a lawyer found it easy to take the lead. Chipman later became chief justice, and edited law reports for Vermont. In other states, professionalization came even earlier. All five of the judges of the Virginia Court of Appeals, established in 1788, were lawyers; Edmund Pendleton headed the court.
Historically, judges were appointed from above. But American democracy put strong emphasis on controls from below. This implied some more popular way to choose the judges. The Vermont constitution of 1777 gave to the "freemen in each county" the "liberty of choosing the judges of inferior court of common pleas, sheriff, justices of the peace, and judges of probates." Under the Ohio constitution of 1802, "judges of the supreme court, the presidents and the associate judges of the courts of common pleas" were to be "appointed by a joint ballot of both houses of the general assembly, and shall hold their offices for the term of seven years, if so long they behave well." This gave the electorate at least an indirect voice in judicial selection. Georgia in 1812, and Indiana in 1816, provided for popular election of some judges; Mississippi in 1832 adopted popular election for all. New York followed in 1846, and the rush was on.
The movement, according to Willard Hurst, was "one phase of the general swing toward broadened suffrage and broader popular control of public office which Jacksonian Democracy built on the foundations laid by Jefferson." It was a movement, however, "based on emotion rather than on a deliberate evaluation of experience under the appointive system." The hard facts about judicial behavior were never easy to come by. There is no simple way to compare elected and appointed judges. Still, if judges were not elected, how could they be forced to respond to the will of the people?
There was plenty of evidence from which a jaundiced mind could conclude that judges were political men, and had to be kept in check. Thomas Jefferson and his partisans, in particular, were quite convinced of this. Federal judges were appointed for life. Before Jefferson came to power, they were naturally Federalists. Their behavior was quite controversial. To Jefferson's men, the judges seemed "partial, vindictive, and cruel," men who "obeyed the President rather than the law, and made their reason subservient to their passion." As John Adams was leaving office, in 1801, Congress passed a Judiciary Act. The act created a flock of new judgeships, among other things. Adams nominated judges to fill these new posts; they were confirmed by the Senate in the last moments of the Adams regime. Jefferson's party raged at these "midnight judges." It was a final attempt, Jefferson thought, to stock the bench forever with his political enemies.
The law that created the "midnight judges" was repealed; the judges lost their jobs; but the other Federalist judges stayed on serenely in office. These holdover judges threatened Jefferson's program, he felt; there was no easy way to be rid of them: "Few die and none resign." He wanted to limit their power; he wanted to make them more responsive to national policy -- as embodied, of course, in Jefferson and in his party. John Marshall, the Chief Justice of the United States, was particularly obnoxious to Jefferson. He was a man of enormous talents, and (as luck would have it) enjoyed good health and long life. He outlived a score of would-be heirs to the office, including Spencer Roane of Virginia, and cheated a whole line of Presidents of the pleasure of replacing him. Equally annoying to Jefferson and his successors was the fact that later justices, who were not Federalists, seemed to fall under Marshall's spell, once they were safely on the bench. Madison appointed Joseph Story, who was at least a lukewarm Democrat. On the bench, he became a rabid fan of the Chief Justice. Roger Brooke Taney, who finally replaced John Marshall, had been a favorite of Andrew Jackson, and a member of his cabinet. But Taney, too, became living proof of the perils of lifetime tenure. He outlived his popularity with dominant opinion, at least in the North. The author of the Dred Scott decision tottered on in office until 1864, when the Civil War was almost over.
The prestige of federal courts stands high today, particularly with liberals and intellectuals. An independent court system is (at least potentially) a tower of strength for the poor, for the downtrodden, for the average person facing big institutions or big government. Hence Jefferson's famous fight against the Federalist judges is one policy of his party that has not done well in the court of history. Yet, undeniably, some federal judges behaved in ways that would be considered disgraceful today. Federal judges did not run for re-election; but they played a more active political role than is standard for judges today. Some Federalists made what were in effect election speeches from the bench. They harangued grand juries in a most partisan way. This gave some point to Jefferson's attacks. Other judges were more discreet, but (in Jefferson's view) equally partisan. One of the sources of Marshall's strength was his tremendous solemnity. His opinions, mellifluous, grandly styled, even pompous, purported to be timeless and non-political. They appealed to principle, to the sacred words of the Constitution. Their tone implied that their true author was the law itself in all its majesty; the judge was a detached, impartial vessel. This attitude inordinately annoyed Jefferson, who saw in it nothing but a subtle, maddening hypocrisy. Either way, it was an effective piece of political theater.
Jefferson's attacks on the judges did not totally fail. Like Roosevelt's plan to pack the court in 1937, Jefferson and his successors may have lost the battle but won the war. In both cases, an extreme tactic -- the threat of impeachment, a court-packing plan -- ended in failure. But perhaps, in both cases, the real impact of the tactic was to scare the opposition; it served its chief role as a bogeyman, and not without impact.
The Constitution gave federal judges tenure for life. It left only one way open to get rid of judges: the terrible sword of impeachment. Federal judges could be impeached for "Treason, Bribery, or other high Crimes and Misdemeanors" (art. II, sec. 4). The South Carolina constitution of 1790 permitted impeachment for "misdemeanor in office" (art. V, sec. 3). Literally interpreted, then, the law permitted impeachment only in rare and extreme situations. But there were a few notable cases, in the early 19th century, in which impeachment was used to drive enemies of party and state out of office. In 1803, Alexander Addison, presiding judge of the fifth judicial district in Pennsylvania, was impeached and removed from his office. He was a bitter-end Federalist; as a lower-court judge, he harangued grand juries on political subjects. On one occasion, he refused to let an associate judge speak to the grand jury in rebuttal; impeachment was grounded on this incident. A straight party vote removed Addison from the bench. Eighteen Republicans in the Pennsylvania senate voted him guilty; four Federalists voted for acquittal.
On the federal level, the purge drew first blood in 1804. It was a rather shabby victory. John Pickering, a Federalist judge, was impeached and removed from office. He had committed no "high crimes and misdemeanors"; but he was an old man, a drunk, and seriously deranged. It was understandable to want him off the bench; but it was far from clear that the words of the Constitution applied to this kind of case. Pickering's removal was, in fact, a stroke of politics, a dress rehearsal for a far more important assault, the impeachment of Samuel Chase.
This celebrated affair took place soon after Pickering's trial. Chase was a Justice of the United States Supreme Court. He had a long and distinguished career, but he was an uncompromising partisan, and a man with a terrible temper. He too was notorious for grand-jury charges that were savage attacks on the party in power. President Jefferson stayed in the background; but his close associates moved articles of impeachment against Chase. The articles of impeachment made a number of specific charges of misconduct against Chase; among them, that at a circuit court, in Baltimore, in May 1803, he did "pervert his official right and duty to address the grand jury" by delivering "an intemperate and inflammatory political harangue," behavior which was "highly indecent, extra-judicial and tending to prostitute the high judicial character with which he was invested to the low purpose of an electioneering partizan."
But the anti-Chase faction overplayed its hand. In a frank private conversation, Senator Giles of Virginia admitted what was really at stake:
a removal by impeachment [is] nothing more than a declaration by Congress to this effect: You hold dangerous opinions, and if you are suffered to carry them into effect you will work the destruction of the nation. We want your offices, for the purpose of giving them to men who will fill them better.
The trial was long, bitter, sensational. Enough Republicans defected that Chase was acquitted, by the slimmest of margins. John Marshall and his court were thenceforward "secure." It was the end of what Albert Beveridge claimed was "one of the few really great crises in American history." In January 1805, an attempt was made to impeach all but one of the judges on Pennsylvania's highest court; it failed by a narrow vote. Impeachment was not a serious threat after these years.
Another radical plan to get rid of bad judges was to abolish their offices. Of course, Jefferson could not do away with the Supreme Court, even had he wanted to; that would have meant major Constitutional change, which was clearly impossible. But his administration repealed the Judiciary Act of 1801; that at least put the midnight judges out of business. An ingenious method for removing unpopular judges was tried out in Kentucky. In 1823, the Kentucky court of appeals struck down certain laws for relief of debtors; this act aroused a storm of protest. Under Kentucky law, the legislature could remove judges by a two-thirds vote; the "relief party" could not muster that percentage. Instead, the legislature abolished the court of appeals and created a new court of four judges, to be appointed by the governor. The old court did not quietly give up its power. For a time, two courts of appeal tried to function in the state, and state politics was dominated by the dispute between "old court" and "new court" factions. Most lower courts obeyed the old court; a few followed the new court; a few tried to recognize both. Ultimately, the "old court" party won control of the legislature, and abolished the new court (over the governor's veto). The old court was awarded back pay; the new court was treated as if it had never existed at all.
As these crises died down, it seemed as if the forces of light had triumphed over darkness -- that this country was to have a free, independent judiciary rather than a servile mouthpiece of state. The Chase impeachment failed (it is said) because in the end both parties believed in a strong, independent judiciary; both believed in the separation of powers. Many politicians did in fact have qualms about impeachment; it smacked of overkill. Some of Jefferson's men shared these qualms. They did not feel that a sitting judge should be replaced on political grounds. But the failure of impeachment was not a clear-cut victory for either side. It was rather a kind of social compromise. The judges won independence, but at a price. Their openly political role was reduced; and ultimately most states turned to the elective principle. There would be no more impeachments, but also no more Chases. What carried the day, in a sense, was the John Marshall solution. The judges would take refuge in professional decorum. It would always be part of their job to make and interpret policy; but policy would be divorced from overt, partisan politics. Principles and policy would flow, at least ostensibly, from the logic of law; they would not follow the naked give and take of the courthouse square. Justice would be blind; and it would wear a poker face. This picture of the behavior of judges had enough truth, and enough hypnotic force, to influence the role-playing of judges, and to bring some peace and consensus to issues of tenure, selection, behavior, and removal of judges.
This did not mean that judges could, or would, avoid the mine-fields of politics and policy. High courts faced sensitive issues every term. Some of these issues were so charged with emotion that the veil of objectivity fell or was torn from the judges' faces. Dred Scott, in 1857, was an example of the court overreaching itself -- a blatant political act, and more significantly, a wrong-headed one. There were many minor Dred Scotts at lower levels of decision. Prejudice and arrogance in court did not die out in 1810, or 1820, or ever; but they became less overt, and, in any event, harder to document. Long after Chase, there were political trials in the United States, and judges who persecuted unpopular or dissenting men. In 1845, Circuit Judge Amasa J. Parker presided at the trial of the antirent rioters of upstate New York; his behavior was as partisan and prejudiced as any Federalist judge of the early 1800s. But more and more, the judges assumed the outward posture of propriety.
Meanwhile, the actual power of judges, as makers of doctrine and framers of rules, may have actually grown somewhat after 1800. The courts had to hammer out legal propositions to run the business of the country, to sift what was viable from what was not in the common-law tradition, to handle disputes and problems thrown up in the course of political, social, and technological change. Once case reports began to be published, judges had rich opportunities to mold law, as logic and social sense directed, and as the docket, responsive to outside pressure and the demands of the litigants, required. They did not let the opportunity slide. The legal generation of 1776 had been a generation of political thinkers and statesmen. The Constitution of the United States is their greatest legal monument. In the next generation, the great state papers, in a sense, were such judicial opinions as Marbury v. Madison, or the Dartmouth College Case. The best of the early 19th century judges had a subtle, accurate political sense, and firm economic and social beliefs. In particular, the judges turned their attention to law in one of its prosaic meanings: the workaday rules of American life. They built and molded doctrine -- scaffolding (as they saw it) to support the architecture of human affairs.
Perhaps the greatest of the judges was John Marshall, Chief Justice of the United States. He, more than anyone else, gave federal judgeship its meaning. It was, of course, conceded that the judiciary made up a co-ordinate branch of government. They were separate, but were they equal? In Marbury v. Madison (1803), John Marshall invented or affirmed the power of judicial review over acts of Congress. But the Marbury decision was only a single dramatic instance of Marshall's work. His doctrines made constitutional law. He personally transformed the function and meaning of the Supreme Court. When he came on the bench, in 1801, the Supreme Court was a frail and fledgling institution. In 1789, Robert Hanson Harrison turned down a position on the court to become chancellor of Maryland. John Jay resigned in 1795 to run for governor of New York. In the first years, the court heard very few cases; it made little impact on the nation. By the time Marshall died, the court was fateful and great.
Marshall had a sure touch for institutional solidity. Before he became Chief Justice, the judges delivered seriatim (separate) opinions, one after another, in the English style. Marshall, however, put an end to this practice. The habit of "caucusing opinions," and delivering up one unanimous opinion only, as the "opinion of the Court" had been tried by Lord Mansfield in England. It was abandoned there; but Marshall revived the practice. Unanimity was the rule on the Court; for a while, until William Johnson (1777-1834) was appointed by Jefferson, in 1804, the unanimity was absolute, with not a single dissenting opinion. Johnson broke this surface consensus. Yet neither Johnson nor any later justices could or would undo Marshall's work. Doctrines changed; personalities and blocs clashed on the court; power contended with power; but these struggles all took place within the fortress that Marshall had built. The court remained strong and surprisingly independent. Jefferson hoped that Johnson would enter the lists against Marshall. Yet Johnson more than once sided with Marshall, in opposition to Jefferson, his leader and friend. The nature and environment of the court -- life tenure, the influence of colleagues -- loosened his other allegiances. It was to be a story often told. Joseph Story betrayed Madison; Oliver Wendell Holmes bitterly disappointed Theodore Roosevelt; the Warren Burger court slapped Richard Nixon in the face.
There were strong leaders and builders in the state courts, too. James Kent dominated his court in New York, in the early 19th century. As he remembered it, "The first practice was for each judge to give his portion of opinions, when we all agreed, but that gradually fell off, and, for the last two or three years before I left the Bench, I gave the most of them. I remember that in eighth Johnson all the opinions for one term are 'per curiam.' The fact is I wrote them all...." Kent's pride in his work was justified. But the opportunity for creative work was there. The judges were independent in two senses: free from England, but also free, for the moment, from stifling partisan control. Also, they were published. The colonial judges, who left no monuments behind, are forgotten men. From 1800 on, strong-minded American judges, whose work was recorded, influenced their courts and the law. In New York there was, as we mentioned, Chancellor Kent (1763-1847); in Massachusetts, Theophilus Parsons (1750-1813) and Lemuel Shaw (1781-1861); in Pennsylvania, John B. Gibson (1780-1853); in North Carolina, Thomas Ruffin (1787-1870); in Ohio, Peter Hitchcock (1781-1853); in Louisiana, Francis Xavier Martin (1762-1846).
The spheres of these state judges were less floodlit, of course, than the Supreme Court in its greater moments. Their work had less national significance. But in their states, and in the world of the common law, they made a definite impact. Some were excellent stylists; all wrote in what Karl Llewellyn has called the Grand Style: their opinions were often little treatises, moving from elegant premise to elaborate conclusion, ranging far and wide over subject matter boldly defined. They were, at their best, far-sighted men, impatient with narrow legal logic. Marshall, Gibson, and Shaw could write for pages without citing a shred of "authority." They did not choose to base their decisions on precedent alone; law had to be chiseled out of basic principle; the traditions of the past were merely evidence of principle, and rebuttable. Their grasp of the spirit of the law was tempered by what they understood to be the needs of a living society. Some were conservative men, passionately attached to tradition; but they honored tradition, not for its own sake, but for the values that inhered in it. And they became famous not because they stuck to the past, but because they worked on and with the living law. Most of the great judges were scholarly men; a few were very erudite, like Joseph Story, who could stud his opinions with acres of citation -- a thing Marshall tended to avoid. The great judges were creative, self-aware, and willing to make changes. James Kent described his work as chancellor as follows:
I took the court as if it had been a new institution, and never before known in the United States. I had nothing to guide me, and was left at liberty to assume all such English Chancery powers and jurisdiction as I thought applicable....This gave me grand scope, and I was checked only by the revision of the Senate, or Court of Errors....
My practice was, first, to make myself perfectly and accurately...master of the facts....I saw where justice lay, and the moral sense decided the court half the time; and I then sat down to search the authorities until I had examined my books. I might once in a while be embarrassed by a technical rule, but I most always found principles suited to my views of the case....
Kent noticed that no one ever cited the work of his predecessors. Their opinions, unpublished, were gone with the wind. He made sure he would not share that fate. He worked closely with William Johnson (1769-1848), who reported his cases. Other judges did this job themselves. F. X. Martin compiled Louisiana decisions from 1811 to 1830; during much of this time, he was himself one of the judges. Many judges were active in collecting, revising, or digesting the statutes of their states, and in writing or rewriting treatises. Joseph Story wrote a series of definitive treatises on various branches of law. James Kent, after retiring from the bench, wrote his monumental Commentaries. John F. Grimke (1753-1819) published the laws of South Carolina, wrote a treatise on the "Duty of Executors and Administrators," and the inevitable "South Carolina Justice of the Peace" (1788). Harry Toulmin (1766-1823) edited the statutes of Kentucky; then, as judge of Mississippi Territory, he edited the territorial statutes. Still later, in Alabama Territory, he edited the statutes there too.
Many appellate judges were versatile and highly educated. George Wythe of Virginia (1726-1806), chancellor of the state from 1788 to 1801, was perhaps the foremost classical scholar in the state; as "professor of law and policy" at William and Mary (1779-1790), he occupied the first chair of law in an American college. Augustus B. Woodward (1774-1827), judge of Michigan Territory from 1805, prepared a plan for the city of Detroit, wrote on political subjects, and published a book called A System of Universal Science. Some judges were classicists, had gifts for science or language, or, like Story, wrote bad poetry. Theophilus Parsons of Massachusetts published a paper on astronomy, and dabbled in mathematics. His "Formula for Extracting the Roots of Adfected Equations" appears as an appendix to the memoir of his life, which his son published in 1859. Parsons was also a Greek scholar and wrote a Greek grammar, which was never published. John Gibson of Pennsylvania was a devotee of French and Italian literature, a student of Shakespeare and an excellent violinist. He was probably the only major judge who designed his own false teeth. William Johnson, while a Supreme Court justice, wrote and published a life of Nathanael Greene, in two volumes, nearly 1,000 pages long. He was also an active member of the Charleston, South Carolina, horticultural society, and wrote a "Memoire on the Strawberry" in 1832. John Marshall himself wrote a life of George Washington.
Most frequently, judges had political careers before they reached the bench. Although most of them willingly "put aside ambition" when they "donned the ermine," their prior lives probably included elective or appointive office. On the frontier, judging was only one aspect of a politician's busy life. Harry Toulmin, in the Old Southwest, was "also postmaster, preached and officiated at funerals and marriages, made Fourth of July orations, practiced medicine gratuitously and in general was the head of the settlements." Before his arrival in the territory, he had been secretary of state in Kentucky. John Boyle (1774-1835), chief justice of Kentucky, was a member of Congress before he became a judge. Joseph Story was speaker of the Massachusetts House when he was appointed to the court. For some judges -- John Jay, for example -- judgeship was an interlude between other political posts.
Judgeship, then, was not a lifetime career for all judges. It was a stepping-stone, or a refuge from politics, or a political reward. It was not a distinctive career, with its own distinctive pattern of training and background, as in many Continental countries. Judgeship was a matter of luck and opportunity, not special skill, background, or aspiration. It was and is an offshoot of the bar -- of that part of the bar active in political affairs. Kermit Hall studied the men appointed to the federal bench between 1829 and 1861. An astonishing 78 percent of the judges had held prior elective public office. This was the Jacksonian era; but it was also true (though to a slightly lesser degree) of the pre-Jacksonians -- 65 percent of those federal judges had served in elective office.
A prominent lawyer usually took a cut in income when he became a judge. The salaries of judges, as of public officials in general, were not generous. Judges continually complained that they were pinched for money. By statute in 1827, New Jersey fixed the salary of the justices of the state supreme court at $1,100. The chief justice earned $1,200. The governor earned $2,000. Lemuel Shaw became chief justice of Massachusetts in 1830, at a salary of $3,000. The low salary, in fact, was Shaw's only source of reluctance in accepting the post. For trial judges, the fee system was common. A New York law of 1786 awarded fees to the "judge of the court of probates...to wit; For filing every petition, one shilling; for making and entering every order, six shillings; for every citation, under seal, to witnesses, or for any other purposes, six shillings...for copies of all records and proceedings, when required, for each sheet consisting of one hundred and twenty-eight words, one shilling and six-pence." Under the fee system, some lower-court judges got rich. Still, a seat on a state Supreme Court paid off in the coin of high status.
THE ORGANIZATION OF COURTS
The Constitution of 1787 created a new system of federal courts. Like the privy council before 1776, the federal courts had power to review the work of state courts. The extent of that power was and still is a matter of constant redefinition. It was limited severely in scope by the Constitution; yet, in its sphere, it was more potent than colonial review by the British government. Federal courts were close at hand; no ocean intervened; they were familiar with local problems and how to handle them. The Constitution gave the federal courts jurisdiction in admiralty; and, on matters of federal law, litigants could appeal from state to federal courts. In nonfederal matters, however, the states and their courts were supreme. Of course, the judicial clauses of the Constitution merely set an evolution in motion and marked off some of its obvious limits. No one even dreamt of the vast powers that would grow from these 18th-century seeds. In the late 18th century, and in the first half of the 19th, the federal courts clearly played second fiddle to the state courts. Where they were supreme, they were supreme; but the realm was a narrow one. Little is known about the way the lower federal courts functioned. A pioneer study, by Mary K. Tachau, examined the records of the federal court in Kentucky, between 1789 and 1816. She found a surprisingly active, useful, and competent court, handling a large volume of casework, in a place that was at the very rim of American civilization.
The federal Constitution was a document of its time, bearing the mark of contemporary theory. In particular, it embodied concepts of separation of powers, and the idea of checks and balances in government. These theories also influenced judicial organization in the states. But the states did not immediately move to seal off judicial power from the rest of the government. It was a deep-seated tradition to mix branches of government at the highest level. In Connecticut, the governor, lieutenant governor, and council constituted a "Supreme Court of Errors." Formerly the legislature itself had acted as the court of last resort. In New Jersey, under the constitution of 1776, the governor held the office of chancellor; and together with his council, constituted "the court of appeals in the last resort." In New York, the court for the "trial of impeachments and the correction of errors" -- the highest court, under the constitutions of 1777 and 1821 -- consisted of the president of the senate, the senators, the chancellor, and the justices of the supreme court.
One by one, however, states began to give their highest court final judicial authority. In New Jersey the constitution of 1844, and in New York the constitution of 1846, put the seal on this change. Rhode Island was the last state to join the movement. As late as the 1850s, the legislature passed an act which in effect granted a new trial in a pending case. The Rhode Island court, in 1856, interposed itself, labeled the practice intolerable, and appealed to the concept of separation of powers. By that time, political opinion, almost universally, rejected this kind of legislative pretension. On the other hand, throughout the period legislatures passed private acts that filled functions now thought of as purely judicial or executive. They granted charters for beginning corporations, and divorces for ending marriages. They quieted title to property, declared heirships, and legalized changes of name.
As in colonial days, in most states there was no clear-cut division between trial and appellate courts. High-court judges often doubled as trial or circuit judges. Appellate terms of court were held in the capital at fixed times of the year. In between, judges spent part of the year on coach or horseback, traveling circuit. Circuit duty was a considerable hardship. The country was large and sparsely settled, the interior a wilderness. Roads were made torturous by mud or dust, depending on the season. Often, the judges rode circuit in their home districts, which somewhat lightened the burden. Still, many judges complained ceaselessly of their gypsy life. The system did have some virtues. It brought justice close to home; it conserved judicial manpower. Circuit duty gave the appellate judge actual trial experience. This gave him exposure to real litigants, and this (some felt) was good for his soul.
Trial work also cemented relations between bench and bar. Around 1800, in York County, in what later became the state of Maine, judges and lawyers traveled together, argued together, joked together, drank and lived together; this traveling "collection of lawyers, jurors, suitors, and witnesses filled up the small villages in which the courts were held." Often there was literally no room at the inn. "It was quite a privilege, enjoyed by few, to obtain a separate bed." The members of the bar "were no ascetics. The gravity and dignity of the bar...were very apt to be left in the courtroom -- they were seldom seen in the bar room." On the frontier, the circuit system bred a rugged type of judge. In Missouri, around 1840, Charles H. Allen, appointed to one of the Missouri circuits, traveled "on horseback from Palmyra," his home, about two hundred miles; then "around the circuit, about eight hundred more. This he did three times a year, spending nearly his whole time in the saddle. He was, however, a strong, robust man, and capable of the greatest endurance. He always traveled with a holster of large pistols in front of his saddle, and a knife with a blade at least a foot long. But for his dress, a stranger might have readily taken him for a cavalry officer."
As the states grew in wealth and population, they tended to sharpen the distinction between trial courts and courts of appeal. Some states established tiers of intermediate courts, called circuit courts, superior courts, or courts of common pleas. These middle courts were typically trial courts of quite general jurisdiction. They also heard appeals, or retried matters begun in the lowest courts. The supreme court tended to become more and more an appeal court, even if the judges, on occasion, still rode circuit.
Only the highest courts, chiefly the Supreme Court of the United States, have been the subject of any sizeable amount of historical research. The judges of the Supreme Court wore robes, stood heir to a great tradition, and heard cases of far-reaching importance. The further down one goes in the pyramid of courts, state or federal, the thinner the trickle of research. Yet it is certain that the everyday courts, churning out thousands of decisions on questions of debt, crime, family affairs, and title to land, were of vital importance to society. Who were the trial court judges? And what sorts of people served as justices of the peace? What was their background and influence? The justice of the peace was "arch symbol of our emphasis on local autonomy in the organization of courts," reflecting the "practical need, in a time of poor and costly communications, to bring justice close to each man's door." But what was the quality of that justice? How much of the justice's work was social control of the English type -- autocratic guidance through democratic forms? The role and function of lower-court judges probably changed greatly between 1790 and 1840; and there were probably great differences between East, West, and South. But little about form, function, and staff is definitely known.
State courts tended more toward specialization than the federal courts. Some states, for example, used separate courts for wills and probate matters. Others, as is the practice today, gave these problems over to courts of general jurisdiction. In New Jersey, the "ordinary" probated wills, and granted letters of administration and guardianship in his "prerogative court." Other estate matters were handled in New Jersey by the oddly named orphans' court. This tribunal was composed of judges who also served on the court of common pleas in the New Jersey counties. Some states kept separate courts for equity. Georgia's constitution of 1789 (art. III, sec. 3) declared that "Courts-merchant shall be held as heretofore"; but the constitution of 1798 deleted the clause. In Delaware, there were courts of oyer and terminer, and courts of general sessions and jail delivery. Memphis, Cairo, and Chattanooga had their own municipal courts, New York its mayor's court; Richmond's hustings court preserved an ancient, honorable name. The St. Louis Land Court (1855) was an unusual example of specialization.
Unlike the state constitutions, which often discussed the structure of courts in some detail, the federal Constitution was quite laconic on the subject. Judicial power was to be vested in a "Supreme Court" and such "inferior courts" as "the Congress may from time to time ordain and establish." For a few types of cases, the Supreme Court was to have "original" jurisdiction, meaning that these cases came to the Court right off the bat, and not by way of appeal. This was true for cases involving ambassadors, and those to which a state was party. Congress had power to decide how much appellate jurisdiction, and of what sort, the Supreme Court would enjoy. The President had power to appoint justices, subject to Senate confirmation. The Senate proved to be no rubber stamp. George Washington appointed John Rutledge of South Carolina to succeed Chief Justice Jay. A bloc of Federalists, annoyed that Rutledge had opposed Jay's treaty, punished him by refusing confirmation, though Rutledge had already served a few months on an interim basis, and had even written a few opinions.
Under the Constitution, Congress could have dispensed with any lower federal courts. State courts would then have had original jurisdiction over all but a few federal issues. The famous Judiciary Act of 1789 provided otherwise. It divided the country into districts, each district generally coextensive with a state, each with a Federal District Court, and a District Judge. The districts, in turn, were grouped into three circuits. In each circuit, a circuit court, made up of two Supreme Court justices and one district judge, sat twice a year. In general, the circuit courts handled cases of diversity of citizenship -- cases in which citizens of different states were on opposite sides of the case. The district courts heard admiralty matters. In certain limited situations, the circuit courts heard appeals.
One defect of the system was that it forced the justices to climb into carriages and travel to their circuits. At first, the justices were on circuit twice a year. Later, the burden was lightened to once a year. New justices were added to the Supreme Court, and new circuits were created as the country grew. But these changes helped only a little. In 1838, Justice McKinley traveled 10,000 miles in and to his circuit (Alabama, Louisiana, Mississippi, Arkansas). Five other justices traveled between 2,000 and 3,500 miles each. As with state courts, there were those who argued that circuit duty was beneficial. It brought the judges closer to the people; it retaught them the legal problems of trial work. But as Gouverneur Morris pointed out, no one could argue that
riding rapidly from one end of this country to another is the best way to study law....Knowledge may be more conveniently acquired in the closet than in the high road.
Reform proved politically impossible. In Congress, a strong states-rights bloc was hostile to the federal courts. This bloc saw no reason to cater to the judges' convenience. The Judiciary Act of 1801 abolished circuit riding for Supreme Court justices. Unhappily, this was the famous act of the midnight judges. It was an admirable law in the technical sense, but doomed by its political implications. The Jefferson administration promptly repealed it. Again and again, reform proposals became entangled in sectional battles or battles between Congress and the President, and went down to defeat. As was so often true, notions of technical efficiency were sacrificed to more powerful values and interests.
Left to themselves, the colonies might have developed modes of procedure more rational and streamlined than anything in England. The process was retarded in the 18th century by the trend toward Anglicization. Many lawyers and judges were English-trained; English legal texts were in use. Imperial rule, English prestige, and the growth of a professional bar led to a reaction against the "crudities" of colonial procedure. The colonial systems became compromises between English law and native experiment. By the time of the Revolution, colonial procedure was bewilderingly diverse.
There was no chance that classical English pleading would be established after Independence. For one thing, the American bar was simply not equipped to handle this dismal lore. Nor was it in anybody's interest, least of all the merchants', to introduce into the law the complexities of English pleading. It was one thing to evolve such a system, as the English had; to bring it in from outside was quite another. But since English lawbooks were easily available, English procedure did have a certain degree of influence. This was particularly true of the first generation after 1776, and for the upper reaches of the profession. Some pleading and practice manuals were written in America, stressing local forms and procedures. When Joseph Story published A Selection of Pleadings in Civil Actions (1805), he took some of his forms from Massachusetts lawyers; and he justified his book on the basis of differences in "customs, statutes, and common law" between England and Massachusetts. But basically the book was English, some of it culled from fairly ancient sources. It was merely adapted for Americans, in a process that had the merit of avoiding "the high price of foreign books." James Gould's Treatise on the Principles of Pleading in Civil Actions, published in Boston in 1832, made the rather grandiose claim of setting out principles of pleading as a "science." It paid scant attention to American law as such:
As the English system of pleading is, in general, the basis of that of our country; the former has, in the following treatise, been followed exclusively, without regard to any peculiarities in the latter; excepting a very few references to those, which exist in the law of my native State.
In fact, the more popular English manuals were in actual use in the United States. In addition, special American editions of English treatises were prepared. Joseph Chitty's treatise on pleading was so well received that an eighth American edition was published in 1840. Chapter 9 of this dreary book bore the following title: "Of Rejoinders and the Subsequent Pleadings; of Issues, Repleaders, Judgments non Obstante Veredicto, and Pleas Puis Darrein Continuance, or now Pending Action; and of Demurrers and Joinders in Demurrer." In this book, the American lawyer could drink in such strictures as the following:
A traverse may be too extensive, and therefore defective, by being taken in the conjunctive instead of the disjunctive, where proof of the allegation in the conjunctive is not essential. Thus, in an action on a policy on ship and tackle, the defendant should not deny that the ship and tackle were lost, but that neither was lost.
Clearly, English common-law pleading was no model to follow slavishly. Pleading was an elaborate contest of lawyerly arts, and winning a case did not always depend on substantive merits. There were too many rules, and they were too tricky and inconsistent. The idea behind English pleading was not itself absurd. Pleading was supposed to distill, out of the amorphousness of fact and fancy, one precious, narrow issue on which trial could be joined. Principles of pleading were, in theory, principles of economy and order. Pleading demanded great technical skill. Those who had the skill -- highly trained lawyers and judges -- saw no reason to abandon the system. But the country was not run by lawyers. Merchants, bankers, and landowners were more important in American life. To these, the "science of pleading" could not appear otherwise than obfuscation, a lawyer's plot to kill justice, or to frustrate the expectations of ordinary people of affairs. One wonders what Chitty's ship-merchant would have thought of disjunctive and conjunctive traverses.
English procedure, then, was too medieval for the modern world. (One wonders why it was not too medieval for the Middle Ages.) Reform of civil procedure, at any rate, found fertile soil in the United States. Pleading reform was one of the changes made necessary by the explosion in legal consumers. Legal skill was a resource; and it was scarce. It had to be husbanded. Shrewdness in pleading was not cheap enough and universal enough to pay its way in American society. Popular democracy, the colonial tradition, business demands for rationality and predictability: these too were allies in opposition to the strict "science of pleading." Against them were arrayed a few troops of bench and bar. But this was a raggle-taggle army, halfhearted in defense of its position. Francophile judges had contempt for the common law. Rank-and-file lawyers were too untrained for Chitty. English procedure had become like a drafty old house. Those born to it and raised in it loved it; but no outsider could tolerate its secret panels, broken windows, and on-again, off-again plumbing.
Reform did not come in one great burst. The actual practice of courts, particularly trial courts, was freer and easier than one might assume from reading manuals of procedure. As more research is done, more variance between book learning and reality comes to light. Alexander Hamilton prepared a manuscript manual of practice in 1782, probably for his own use in cramming for the bar. In it, he noted a decline in "nice Learning" about pleas of abatement. These pleas, he said,
are seldom used; and are always discountenanced by the Court, which having lately acquired a more liberal Cast begin to have some faint Idea that the end of Suits at Law is to Investigate the Merits of the Cause, and not to entangle in the Nets of technical Terms.
New York was, relatively speaking, quite conservative in legal affairs. Hamilton's manual of New York pleading, if it can be trusted, shows deep dependence on English formalities and forms. Yet even here many cracks in the armor appear, many defections in favor of "the Merits of the Cause." Other jurisdictions departed much further from English propriety. Georgia, in the 18th century, passed a series of laws that went a long way toward rationalizing civil procedure. The climax was the Judiciary Act of 1799. Under Georgia's laws, a plaintiff might begin civil actions simply by filing a petition, setting out the case "plainly, fully and substantially"; the defendant would file his answer, stating whatever was necessary to constitute "the cause of his defense." Georgia's law was, among other things, a courageous attempt to unite equity and common-law pleading. It also got rid of the forms of action, those ancient pigeonholes (or straitjackets) of procedure into which pleaders were forced to fit their pleas, or else. Conservative Georgia courts, outraged by the reforms, supposedly undermined the Georgia scheme and led to its downfall. But the evidence for this charge rests on nothing more than scraps of talk from appellate courts. Quite possibly, the reform had lasting effects on trial court behavior in Georgia.
By the 1830s, reform of procedure was in the air. In England itself, native lair of the common law, reform was a definite force. Jeremy Bentham cast a long shadow. Lord Brougham, in Parliament in 1828, fairly begged for reform. These men and their supporters set off a process with enormous significance for the common-law systems. After all, England too was a modernizing society, years ahead of America in industrial development, if not in class structure. During the 19th century the two nations were mutually influential -- their experiences meshed neatly, in parallel currents of change. But most of the key developments happened toward the middle of the century, or later.
In the Middle Ages, equity courts had been a source of law reform. Equity boasted a flexible collection of remedies; hence it had often prodded and pushed the more lethargic common law toward rationality. But equity had itself become hidebound; by 1800, it needed procedural reform as desperately as the common law; it was equity, not law, that was the subject of Dickens's Bleak House. The very existence of equity was an anomaly -- a separate and contradictory jurisprudence, living uneasily beside the "law." In the United States, many states simply handed over equity jurisdiction to the general courts of common law; the same judges decided both kinds of case, merely alternating roles. North Carolina, for example, in an act of 1782, empowered "each superior court of law" to "be and act as a court of equity for the same district." In a few states (Mississippi, Delaware, New Jersey), the courts were distinct, with different judges for law and equity. This was true of New York until the 1820s, when the circuit judges were made "vice-chancellors," and given full equity powers within their circuits. But a loser could not appeal from these "vice-chancellors" to ordinary appellate courts; instead, he had to appeal to the chancellor. Finally, some states had no developed system of equity at all (as distinct from "law"). Louisiana was one of these, because of its civil-law heritage. Massachusetts and Pennsylvania were outstanding common-law examples. Before 1848, however, only Georgia (and Texas, whose history was deviant) had actually abolished the distinction.
States that lacked "equity" had developed a rough union of their own. Statutes made equitable defenses available in cases at "law." In Massachusetts, the supreme judicial court had "power to hear and determine in equity all cases...when the parties have not a plain, adequate and complete remedy at the common law." The statute listed what these "cases" were, for example, "All suits for the specific performance of any written contract." In Pennsylvania, "Common law actions were used to enforce purely equitable claims; purely equitable defenses were permitted in common law actions; and, at rare times, purely equitable reliefs were obtained by means of actions at law." Throughout the period, Pennsylvania legislatures, by private laws, gave permission to individuals to do acts which, in other states, were within the scope of equity power. For example, they sometimes permitted executors to sell parcels of land to pay the debts of a deceased. In 1836, on the advice of a law-revision commission, Pennsylvania gave broad equity powers to some state courts. Thus Pennsylvania (and Massachusetts), by a piecemeal process, attained a "curious anticipation of future general reforms." In essence, "law" was bent to suit "equity"; but not all the change was in one direction. Equity was addicted to documents and written evidence, and classically tolerated nothing else; the common law favored the spoken word. But the Judiciary Act of 1789 provided for oral testimony in federal equity cases. Georgia allowed trial by jury in some causes which by tradition belonged to the equity side of the bench. North Carolina, in a statute of 1782, did the same. And a North Carolina law of 1801 provided a simple way to continue equity actions after the death of a party on either side. This replaced the "bill of revivor," one of the prime procedural culprits, a device so slow and technical that it gave rise to the suspicion that a chancery bill, once begun, would never come to an end.
In English law there were countless kinds of "appeal," calling for different pleadings and forms, full of dark mysteries, tripping and trapping the unwary. The word appeal, strictly speaking, applied only to review by higher equity courts. The states, continuing a trend that began in the colonial period, fixed on and emphasized two kinds of review by higher courts: the equity appeal and, for common-law actions, proceedings by writ of error or the equivalent. Local variations were frequent. In North Carolina, instead of writs of error, writs of certiorari were used -- a form of review which in England was confined to review of non-common-law courts. In Connecticut the writ of error reviewed decisions in equity, usurping the place of the appeal.
The basic problem of review or appeal is how to avoid doing everything over again -- which would be a tremendous waste -- but at the same time make sure that lower-court mistakes are corrected. In essence, writs of error corrected only some kinds of errors, those that appeared on the face of the formal record. These were pleading errors mostly, except insofar as a party, in a bill of exceptions, preserved the right to complain about other kinds of "error" -- if the judge, for example, had let in improper evidence. These "errors" rarely went to the heart of the matter. The appeal system suffered from what Roscoe Pound has called "record worship" -- "an excessive regard for the formal record at the expense of the case, a strict scrutiny of that record for 'errors of law' at the expense of scrutiny of the case to insure the consonance of the result to the demands of the substantive law."
This system was not completely irrational; it could be defended as a reasonable way to divide powers and functions between high and low courts. High courts expounded the law, low courts decided cases. By correcting errors of record, high courts were able to adjust any kinks in formal doctrine. But at the same time, they left the day-to-day work of the trial courts alone. Still, record worship meant that a lot of mistakes and injustices could never be reviewed by a higher court; at the same time, high courts sometimes reversed perfectly proper decisions on highly technical grounds. But the actual control of lower courts by upper courts -- and the actual impact of record worship on the life cycle of a typical trial -- is exceedingly hard to measure.
Record worship was a disease that probably did not randomly infect every type of case. Courts are stickier, for example, about small errors in cases where life or liberty is at stake. It would be no surprise, then, to find that the law of criminal procedure outdid civil procedure in record worship and technical artifice. This branch of law had a special place in American jurisprudence. The Bill of Rights contained what amounted to a miniature code of criminal procedure. Warrants were not to be issued "but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized" (art. IV). No person "shall be held to answer for a capital, or otherwise infamous crime" without presentment or indictment by grand jury (art. V). No one "shall be compelled in any Criminal Case to be a witness against himself" or be "deprived of life, liberty, or property, without due process of law" (art. V). The accused "shall enjoy" the right to a "speedy and public trial," by an "impartial" jury of the vicinage; he must be told the nature of his crime; must be able to confront the witnesses; must "have compulsory process for obtaining witnesses in his favor" and the "Assistance of Counsel for his defense" (art. VI). "Excessive bail" was not to be required (art. VIII).
Many states adopted their own bills of rights even before the federal government did. Other states copied or modified the federal version. Criminal procedure was a major part of all of these bills of rights. The basic rights of man turned out, in large part, to be rights to fair criminal trial. These rights were supposed to guard against the tyranny of autocrats and kings. Abuse of power by Federalist judges only strengthened the ideas that underlay the Bill of Rights. Criminal procedure, on paper, gave a whole battery of protections to persons accused of crime. The defendant had the right to appeal a conviction; the state had no right to appeal an acquittal. In a number of cases, it seemed as if the high court searched the record with a fine-tooth comb, looking for faulty instructions, improper evidence, error in formal pleadings, or prejudicial actions by the judge. Sometimes the upper court quashed an indictment for a tiny slip of the pen or set aside a verdict for some microscopic error at the trial. In State v. Owen, a North Carolina case of 1810, the grand jurors of Wake County had presented John Owen, a cabinetmaker, for murder. According to the indictment, Owen, in Raleigh, on April 21, 1809,
not having the fear of God before his eyes, but being moved and seduced by the instigations of the Devil...with force and arms, at the city of Raleigh...in and upon one Patrick Conway...feloniously, wilfully, and of his malice aforethought, did make an assault...with a certain stick, of no value, which he the said John Owen in both his hands then and there had and held...in and upon the head and face of him the said Patrick Conway...did strike and beat...giving to the said Patrick Conway, then and there with the pine stick aforesaid, in and upon the head and face of him the said Patrick Conway, several mortal wounds, of which said mortal wounds, the said Patrick Conway then and there instantly died.
Owen was found guilty and appealed. His attorney argued that the indictment was defective, in that it did not "set forth the length and depth of the mortal wounds." A majority of the supreme court of North Carolina regretfully agreed: "It appears from the books, that wounds capable of description must be described, that the Court may judge whether it be probable, that death might have been produced by them." Since the indictment did not allege that the wounds were two inches long and four inches deep, or the like, the conviction had to be overturned, and Owen won a new trial.
Roscoe Pound has called this kind of case the "hypertrophy of procedure." This hypertrophy, he felt, was an example of 19th-century individualism run riot. The criminal law tolerated "hypertrophy" because it served the needs of the dominant American male -- the self-reliant man, who lived in a small town or on the frontier. Such a man was supremely confident of his own judgment, but tended to be jealous of the power of the state. Better to let the guilty free on a technicality than allow courts and prosecutors real power or discretion.
There is a serious question how far Pound's description fits the working law and how much State v. Owen describes a real phenomenon rather than an occasional mutation. Pound's picture of criminal procedure best describes a tiny group of instances, preserved in appellate records like flies in amber. Recorded appellate cases have always been a small minority of litigated cases. Who was it that took advantage of procedural rights and the record worship of appellate courts? Certainly not the slaves, the dependent poor, the urban masses. It is hard to tell how fair the average trial was. American law had its dark underside -- vigilantism, lynching, mob rule. These show that in some communities, for some cases, public opinion did not tolerate the niceties of fair trial. Some formal rights, recognized today, were not recognized in the 19th century. Thousands were arrested, tried, and sentenced without lawyers. Yet only a good lawyer could make effective use of the full guarantees of the Bill of Rights, let alone take advantage of record worship and "hypertrophy." Probably the average criminal trial fell far short of the ideal of procedural due process. The average trial, no doubt, was simple and short. And there were even some major trials -- trials with political overtones -- that were by any standards unfair.
THE LAW OF EVIDENCE
There is good reason to believe that the law of evidence tightened considerably between 1776 and the 1830s. Judging from surviving transcripts of criminal trials, courts had rather loose attitudes toward evidence around 1800. In the trial of the so-called Manhattan Well Mystery -- the transcript was found among Alexander Hamilton's papers -- hearsay was freely admitted; and "some of the most important testimony was not elicited by questions" from the attorney, but rather "by allowing the witnesses to give a consecutive and uninterrupted account." Opposing counsel did not meekly wait their turn to cross-examine. Rather, they broke in with questions whenever they wished.
When a field of law becomes cancerously intricate, some fundamental conflict of interest, some fundamental tension between opposing values, must lie at the root of the problem. The American political public has always resisted strong, central authority. Power tends rather to be fragmented, balkanized, dispersed. This attitude, which found expression in the theory of checks and balances, affected the law of evidence too. The modern European law of evidence is fairly simple and rational; the law lets almost everything in, and trusts the judge to separate good evidence from bad. But American law distrusts the judge; it gives the jury full fact-finding power and, in criminal cases, the final word on innocence or guilt. Yet the law distrusts the jury as much as it distrusts the judge, and the rules of evidence grew up as some sort of countervailing force. The jury only hears part of the story, that part which the law of evidence allows. The judge's hands are also tied. If he lets in improper testimony, he runs the risk that the case will be reversed on appeal. Hence the rules of evidence bind and control both jury and judge.
It was during this period that many rules of evidence received classic formulation. The rules were heavily influenced by the jury system and the attitude of the law toward the jury. In medieval times, the jury had been a panel of neighbors -- knowing busybodies, who perhaps had personal knowledge of the case. When the function of the jury changed to that of an impartial panel of listeners, the law of evidence underwent explosive growth. Rules were devised whose point was to exclude all shaky, secondhand, or improper evidence from the eyes and ears of the jury. Only the most tested, predigested matter was fit for the jury's consumption. Consequently, in the 19th century, the so-called hearsay rule became one of the dominant rules of the law of evidence. This was a bizarre kind of rule -- one simple, if utopian, idea, along with a puzzle box of exceptions. The general rule was that juries should not hear secondhand evidence: they should hear Smith's story out of his own mouth, and not Jones's account of what Smith had to say. But in many situations, the rule was deemed too strict, and there were recognized exceptions. These ranged from involuntary utterances ("ouch!"), to shopbooks, to the dying words of a murder victim, naming his killer.
Hearsay rules grew luxuriantly, on both sides of the Atlantic. Most of the doctrines appeared first in England; but not all. The business-entry rule admits records made in the ordinary course of business, even though these are, strictly speaking, hearsay. The rule seems to have started in America around 1820. In general, the American law of evidence outstripped English law in complexity, perhaps because of a deeper American fear of concentration of power.
The rules relating to witnesses were as complicated as rules about the kind of evidence that could be heard, and for similar reasons. In 1800, husbands and wives could testify neither for nor against each other. No person could testify as a witness if he had a financial stake in the outcome of the case. This meant that neither the plaintiff nor the defendant, in most cases, was competent to testify on his own behalf; their mouths were shut in their very own lawsuits. During the period, some fresh restrictions were added. The doctor-patient privilege, for example, prevented a doctor from giving medical testimony without his patient's consent. It seems to have first appeared in a New York law of 1828. Missouri passed a similar law in 1835.
In theory, the rules of evidence were supposed to be rational rules of legal proof. Trials were to be orderly, businesslike, and fair. Each rule had its reason. But the rules as a whole, in their sheer complexity, tended to defeat the rationale. They seem bewildering and irrational, as liable to cheat justice as to fulfill it. Exclusionary rules, said Jeremy Bentham, had the perverse effect of shutting the door against the truth; the rules gave "license to oppression by all imaginable wrongs." Quakers would not take an oath on religious grounds; the rule excluding their testimony,
in a case of rape...includes a license to the whole army to storm every quakers' meeting-house, and violate the persons of all the female part of the congregation, in the presence of fathers, husbands, and brothers.
It is unlikely that this particular gang rape ever took place. But there was plenty of real and possible abuse, in the law of evidence, to feed the appetite for reform. Indeed, the law of evidence began to be purged of excesses at the very time that courts were putting the final building blocks in place. But major changes were achieved only later in the century.
At the close of a trial, the judge instructs the jury -- tells them about the applicable law. In the 20th century, the lawyers write these instructions themselves, or copy them from form-books. The instructions tend to be stereotyped, antiseptic statements of abstract rules. In many cases, it is hard to see how juries can make heads or tails of them. They seem to matter mostly to lawyers, who argue about the wording and base appeals on "errors" in instructions. In 1776 or 1800, judges tended to talk more freely to the jury. They summarized and commented on the trial; they explained the law in simple, nontechnical language. Instructions were clear, informative summaries of the state of the law. Chief Justice Richard Bassett, of Delaware, explaining adverse possession to a jury, in 1794, remarked: "If you are in possession of a corner of my land for twenty years by adverse possession, you may snap your fingers at me." All this changed in the 19th century; there was to be no more finger snapping, no more vivid language, no more clarity, no metaphors. As early as 1796, a statute in North Carolina made it unlawful, "in delivering a charge to the petit-jury, to give an opinion whether a fact is fully or sufficiently proved," since that was "the true office and province of the jury." In the 19th century, a number of state statutes took away the judge's right to comment on evidence. This made the jury less liable to domination by the judge. The stereotyped instructions may have confused the jury, but they helped maintain its autonomy. The judge was more rulebound than before -- hamstrung, one might even say.
In the early years of the 19th century, the roles of judge and jury were subtly altered and redefined. Up to that point, both judge and jury had been relatively free to act, each within its sphere. According to William Nelson, the jury "ceased to be an adjunct of local communities which articulated into positive law the ethical standards of those communities." It was rather an "adjunct" of the court, whose main job was to handle "facts," not "law." Nelson traces to the first decade of the 19th century, in Massachusetts, the articulation and definition of what became a classic definition of powers and roles: that the judge was master of "law," the jury of "fact." The result, in theory at least, was a better balance of power. Ultimately, it made possible a more rational, predictable system of justice, especially in commercial cases. At least that much seemed plausible. And within the little world of the courtroom, the two major powers, judge and jury, were locked into tighter, better definitions. Checks and balances were more than constitutional concepts; they pervaded the whole of the law.
Copyright © 1973, 1985 by Lawrence M. Friedman
A History of American Law, Revised Edition