  The presidential election of 1936 had raised the curtain on a new politics, the pursuit of power in the context of the regulatory, administrative state. Roosevelt cultivated the support of constituencies that his policies had made into self-conscious groups. These included farmers attached to the federal government by the Agricultural Adjustment Act of 1933 and the Farm Security Administration of 1937, union members empowered by the Wagner Act of 1935, and the elderly entitled by the Social Security Act of 1935. So, fifty years later, President Ronald Reagan, whose politically formative years came during the New Deal and who had been an ardent supporter of FDR, spoke at the Illinois State Fair, where he made this boast: “No area of the budget, including defense, has grown as fast as our support of agriculture.”112 The farmers’ applause interrupted his eleven-minute speech fifteen times. Confession is, as Mark Twain said, good for the soul but bad for the reputation, which perhaps explains why the Republican Party does not confess that it has largely come to terms with New Deal–style politics in the context of a regulatory state and an entitlement society. And with permanent deficit spending.

  For this nation’s first two centuries, deficit spending was largely for investments—building infrastructure, winning wars—that benefited future generations. So government borrowing appropriately shared the burden with those generations. Now, however, there is continuous borrowing because there are unending deficits, even when the economy is growing briskly and there is full employment. The borrowing and deficits burden future generations in order to fund current consumption of government goods and services. As Christopher DeMuth says, we have gone from investing in the future to borrowing from the future. This is because the promises we have made to ourselves through the entitlement state cannot, as a political matter, be covered by taxation sufficient to pay for the government that the political class has been pleased to create and that the public has been pleased to have created. And there is approximately no constituency for additional kinds of taxes—consumption or value-added taxes—to fund the entitlement state’s architecture. It is unreasonable to expect there to be, given that, in today’s decadent democracy, there are no ethics to restrain the off-loading of burdens onto the unborn. Infantilism—the refusal to will the means for the ends that one wills—has become the national norm.

  Modern conservatism began in Edmund Burke’s splendid recoil from the French Revolution, not only from the Terror but also from the Revolution’s assault on privacy in the name of civic claims. Conservatism has always been defined by its insistence on limits to the claims the collectivity—the public sector—can make on the individual. Contemporary American conservatism was born in reaction to the New Deal and subsequent excessive enlargements of the state. Today, this conservatism is a persuasion without a party, a waif in a cold climate. However, over the last fifty years America’s politics have shifted in ways disadvantageous to progressives. Watergate and Vietnam caused an erosion of confidence in the motives of government. The internationalization of economic life has weakened the power of governments to control economic forces. The mobility of money and businesses inhibits governments because wealth can flee from currencies threatened by inflation, or from jurisdictions where growth is slow because government is meddlesome. Furthermore, recurrent recessions and slow growth have increased individuals’ anxieties and decreased social solidarity, thereby weakening society’s support for collective action through government. Perhaps there is a paradox lurking here: If government does what conservatives wish it would do, if it retrenches and does less to impede growth, it might experience a rebirth of prestige and be poised to advance the progressives’ agenda. This is a risk conservatives should cheerfully run.

  Lincoln’s vision, and the essence of the early Republican Party’s free labor doctrine, was that “the prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.”113 Lincoln insisted that in America “there is no such thing as a freeman being fatally fixed for life, in the condition of a hired laborer.”114 The essence of America’s aspiration, however imperfectly realized, is that no one should be “fatally fixed for life.” Hence the constant need to refresh the nation’s commitment to a life of perpetual social churning by forces beyond the control of politics and of government, with its bias toward the status quo.

  People often lament the “impersonal” economic forces that shape the lives of individuals and communities. But personal forces, meaning political forces, are rarely preferable. Often they are more bitterly resented, because they are seen and felt to be personal. Also, before too heartily celebrating the organic life of small communities, one should revisit an American literary genre that gave us Sinclair Lewis’ depiction of Gopher Prairie, Minnesota, and Sherwood Anderson’s Winesburg, Ohio. Before deploring the disruptive effects of new technologies, consider that one of the best things that ever happened to African-Americans was the mechanization of agriculture that destroyed so many of their jobs. Time was, in places like rural Mississippi, African-Americans lived in stable, traditional, organic communities of a sort often admired by intellectuals who praised them from far away. There they led lives of poverty, disease, and oppression, experiencing the grim security of peonage. Then came machines that picked cotton more efficiently than stooped-over people could, so lots of African-Americans stood up, packed up, got on the Illinois Central, got off at Chicago’s Twelfth Street station, and went to the vibrant South Side, where life was not a day at the beach but was better than rural Mississippi. Destruction of a “way of life” by “impersonal” economic forces can be a fine thing.

  In any case, Americans have no alternative to embracing economic dynamism, with its frictions and casualties and uncertainties. Otherwise they must live with the certainty of stagnation and of a zero-sum politics of distributional conflicts driven by government as the allocator of wealth and opportunity. Americans must choose to live somewhere on the continuum between stasis and the whirl and fluidity of modern life—people, ideas, and capital flowing hither and yon. This is inherently unsettling; it tests and disrupts settled ideas and arrangements. Theodore Roosevelt said the mission of public officials is “to look ahead and plan out the right kind of civilization.” This, from the man who peered into the future and spotted an imminent “timber famine” caused by railroads’ needs for wood rail ties, a famine that never arrived. Decades later, another New Yorker, urban planner Robert Moses, spoke of “the clean-cut, surgical removal of all of our old slums,” some of which, of course, remain, as do human casualties from the surgery. Ross Perot, a billionaire businessman, ran for president in 1992 saying that he had been unintentionally training for this job because “I’ve spent the last forty years designing, engineering, testing, and implementing complex systems.” Today, the idea of empowering the political class to design and engineer society has lost its allure. The Bible, Virginia Postrel reminds us, teaches that no sooner had God created man and woman than He seemed to lose control of events.115 To the conservative sensibility, much of the pleasure of life derives from the fact that in an open society, events, and the future, are splendidly beyond control.

  Chapter 6

  CULTURE AND OPPORTUNITY

  The Scissors that Shredded Old Convictions

  The central conservative truth is that it is culture, not politics, that determines the success of a society. The central liberal truth is that politics can change a culture and save it from itself.

  Daniel Patrick Moynihan1

  There had been an unusual spring snowstorm on March 30, 1964, the day protracted debate and attendant maneuverings began as the US Senate took up the Civil Rights Act. The temperature in Washington was one hundred degrees when the Senate finally voted for the cloture that led to the July 2 passage of this legislation that, among other things, forbade racial discrimination in “public accommodations”—places of business open to the public. Also on July 2, 940 miles away in Kansas City, Missouri, Eugene Young, a thirteen-year-old African-American, had been turned away from the barbershop at the Muehlebach Hotel, which had refused service to African-Americans since it opened in 1915. That afternoon, at a White House ceremony, President Lyndon Johnson signed the bill into law. At eight a.m. the next day, Young returned to the Muehlebach, and for two dollars received a haircut from Lloyd Soper, who said: “I didn’t mind cutting that little boy’s hair.”2

  Getting Eugene Young that haircut was difficult. It took a century of struggle, culminating in months of legislative maneuverings. But getting that boy into that barber’s chair was much the easiest part of America’s long coming-to-terms with its racial problem. In 1964, Americans had not begun to fathom how entangled racial problems were, and remain, with problems of class, a subject that makes Americans deeply uneasy. A decade earlier, the US Supreme Court had begun confronting the problems of class, without quite realizing that it was doing so.

  The Muehlebach Hotel is sixty-two miles east of what then was Monroe Elementary School in Topeka, Kansas. In 1951, Oliver Brown, whose wife, Leola, had attended Monroe, decided that his daughter Linda, nine, should not have to. Monroe, a school for black children in Topeka’s segregated school system, was separate from, but not equal to, schools for white children. Linda’s name is associated with the Supreme Court case Brown v. Board of Education, which propelled progress toward the 1964 Civil Rights Act and the dismantling of racial segregation by law. That the board of education being sued for its segregation policies was in Kansas indicates how widely segregation was practiced, and how much more widely it was approved. In Montgomery, Alabama, it was illegal for a white person to play checkers in public with a black person. Congress was running a segregated school system in the nation’s capital. In 1948, President Harry Truman could not persuade Congress to make lynching a federal crime. When Brown was first argued in 1952, the Supreme Court was composed entirely of Democratic—of Roosevelt and Truman—appointees. If the court’s composition had not been unexpectedly changed in 1953 by the addition of a Republican nominee, the legal basis of segregation—the doctrine that “separate but equal” public facilities are constitutional—probably would have been affirmed in 1954. No Republican nominee had served on the court since Owen Roberts, a Hoover nominee, resigned in 1945. But eight months into Dwight Eisenhower’s presidency, there occurred the most fateful heart attack in American history. It killed Chief Justice Fred Vinson, a Kentuckian who believed the “separate but equal” doctrine, enunciated in the 1896 Plessy v. Ferguson decision, should remain. If he had survived, the Plessy precedent probably would have, too.

  In Brown, the court held that assigning white and black children to separate schools on the basis of race violates the Fourteenth Amendment’s guarantee of equal protection of the laws. Unfortunately, the court’s ruling was insufficiently radical. The court waxed sociological, citing such data as the preference of some black children for white dolls, which might have been related to feelings of inferiority caused by school segregation. And the court cited studies—studies more problematic than the court assumed—concerning the effects of segregation on children’s abilities to learn. By resting the desegregation ruling on theories of early childhood development, the court limited the ruling’s anti-discrimination principle to primary and secondary education. As Robert Bork said, making the ruling contingent on sociological findings “cheapened a great moment in constitutional law.”3 The proper, more radical rationale for the Brown outcome was simply that the government’s use of racial classifications in making decisions is incompatible with the Constitution’s guarantee of equal protection of the laws. Had the court said this plainly in 1954—had the justices been content to apply not sociology but this sweeping legal principle—much subsequent court-produced mischief might have been avoided. Instead, before a generation had passed, the court was ordering busing—excluding children, on the basis of race, from neighborhood schools and transporting them to more distant schools to which they were assigned because of their race.

  The Brown decision presaged the 1964 Civil Rights Act, in which Congress mandated non-discrimination in much of public life, proscribing discrimination by government and by individuals in employment and public accommodations. Or so Congress thought. Just four years later the court was saying otherwise. The 1964 act defined school “desegregation” as “the assignment of students to public schools…without regard to their race.” But in 1968 the court held that compliance with Brown involved more than ending segregation, which hitherto had been understood as the government-compelled separation of the races by law. The court said that where almost-all-white or almost-all-black schools—so-called “de facto segregation”—still existed, government-ordered racial discrimination was required.

  The phrase “de facto segregation” is an Orwellian oxymoron. Segregation, properly understood, is de jure—by law—or it is not segregation. Nevertheless, soon there was compulsory busing, which became one of the most costly failures—costly in money, ill will, educational distortion, and flight from public schools—in the history of American social policy. One tragedy of racial policy since Brown is that the 1964 Civil Rights Act was twisted, against legislative intent, by people whose idealism made them serene in their cynicism. Racial discrimination is any action based on race. The 1964 act forbade discrimination in employment. Yet the court has held that the spirit of the act requires what the letter of the act forbids—that employers often must take race into account for various “affirmative action” purposes. Brown begat the Civil Rights Act of 1964, which propelled the nation toward a painful, still ongoing reckoning with this fact: Dismantling the laws that enforced inequality did not bring the nation close to a general enjoyment of equal opportunity. Rather, the path to the present ran through a tragedy of social regression and through economic and educational developments that would produce new forms of social stratification and new impediments to social mobility.

  LIFTING WEIGHTS

  The American creed is that all individuals are created equal in the possession of freedom—the capacity for human agency—and in the right to exercise it. Government is instituted to give citizens equal protection of their enjoyment of liberty. An open and democratic society is one in which people are equally free to become socially and politically unequal. It is indisputable that equality of rights will breed inequality of conditions. People with equal rights will have unequal aptitudes, abilities, and situations, and will encounter random advantages and disadvantages.

  For generations conservatives have been recoiling from what they correctly consider many unwise, unjust, and counterproductive government policies redistributing wealth. In doing so, however, conservatives have clung to a distinction that is increasingly difficult to draw: They have said that they favor equality of opportunity, not equality of condition. This is facile because opportunities are conditioned by conditions. Chapter 4 argued that conservatives’ reflexive rhetoric in praise of judicial “restraint” enabled the progressive agenda and disabled conservatism from standing for a political end, liberty, rather than a political process, majority rule. This chapter argues that conservatives have allowed their intelligence to be anesthetized by lazy recourse to the tired formulation “equality of opportunity, not of outcomes.” Equality of outcomes is, as conservatives rightly argue, neither possible nor desirable. Equality of opportunity is, however, far more complex and elusive than conservatives can comfortably acknowledge.

  On his long, circuitous train trip from his home in Springfield to Washington for his inauguration, Abraham Lincoln paused in Philadelphia where, at Independence Hall, he was prompted to give what he said was “wholly an unprepared speech.” In it he said that the Declaration of Independence’s affirmation of equality “gave promise that in due time the weights would be lifted from the shoulders of all men, and that all should have an equal chance.”4 Before we in the twenty-first century can understand what it would mean for all to have an equal chance, conservatives, especially, must think more clearly than many of them have done about what Lincoln called “weights” on individuals’ shoulders.

  Lyndon Johnson, the greatest presidential benefactor of African-Americans since Lincoln, advanced the nation’s thinking on June 4, 1965, at Howard University in Washington, D.C. There he delivered the most consequential speech on race ever given by an American president, and perhaps the most important ever given by a white American. Johnson’s crucial contention was that “freedom is not enough.” He declared: “You do not wipe away the scars of centuries by saying: Now you are free to go where you want, and do as you desire, and choose the leaders you please. You do not take a person who, for years, has been hobbled by chains and liberate him, and bring him up to the starting line of a race and then say, ‘you are free to compete with all the others,’ and still justly believe that you have been completely fair. Thus it is not enough just to open the gates of opportunity. All our citizens must have the ability to walk through those gates.”5 The subsequent pursuit of even incomplete fairness—there can be no other kind—has been generally noble and in many ways successful. It also has been an arduous tutorial in the truth of Michael Oakeshott’s axiom that attempting the impossible is inherently corrupting.

  It would have been helpful—wholesomely disconcerting and embarrassing—if Johnson had given an example of how the federal government itself had closed to African-Americans “the gates of opportunity.” Created in 1934, the Federal Housing Administration invented and enforced “redlining,” explicitly steering new mortgages away from black buyers to maintain the racial homogeneity of neighborhoods. A 1946 FHA manual said: “Incompatible racial groups should not be permitted to live in the same communities.” And: “Properties shall continue to be occupied by the same social and racial classes.” And: “Appraisers are instructed to predict the probability of the location being invaded by . . . incompatible racial and social groups.” Invaded. During World War II, when a developer sought FHA guarantees for proposed housing on the last of the farmland still within the sprawling city of Detroit, the FHA initially refused because the development would be contiguous with a black neighborhood. The developer proposed a solution: I will build a wall. It would be between his development and the incompatibles. He did; you can see it today. Mollified, the FHA guaranteed mortgages on the white side. Almost half of all postwar suburban homes built in the United States had FHA mortgage guarantees. From 1934 through 1962, whites received 98 percent of those guarantees.6