Tuesday, November 30, 2010

The Generalist

Chris Beneke

Gordon Wood’s favorable review (“The Real Washington at Last”) of Ron Chernow’s massive new biography of George Washington appears in the latest New York Review of Books.* For a man who said so little and wrote so economically, Washington has inspired an avalanche of words. As Wood notes:

[W]e now have assessments of Washington’s political philosophy, his constitutionalism, his religion, his private life, his portraits, his leadership, his physical appearance, his interest in the Virginia backcountry, his concern for the decorative arts, his enlightenment, his place in popular culture, his view of the Union, and his relations with his wife Martha, Lafayette, James Madison, Henry Knox, Nathanael Greene, Benedict Arnold, his other generals, and various other revolutionaries. There are studies of Washington as a president, as a slaveholder, as a man of the West, as a general, as a partisan fighter, as an American symbol, as the modern Cincinnatus, as a Freemason, as a young man, as a patriarch, as a visionary, as a spymaster, as the architect and owner of Mount Vernon, as the designer of the nation’s capital, as the French saw him, and as the master manipulator of public opinion.

Wood’s title isn’t ironic. He contends that Chernow gets us closer to the “real Washington” than any of the legions of earlier biographers. Chernow is the beneficiary of a series of herculean archival efforts, including the ongoing project at the University of Virginia to publish all of Washington’s papers, which will eventually consist of ninety volumes.

Chernow benefits from another fortuitous circumstance, according to Wood—he’s not an academic. It isn’t that academic historians write especially badly. By comparison with other fields, our prose is neither wholly dull nor completely impenetrable. The problem lies, says Wood, in our tendency to write for one another and to publish books on “specialized problems” that few readers outside of History Departments will ever comprehend, never mind enjoy at the beach.

As Wood notes, we share this internal orientation with chemists and literary theorists alike. Like theirs, ours is an “accumulative science.” We are sunk in its immensity. “[T]he monographs have become so numerous and so refined and so specialized that most academic historians have tended to throw up their hands at the possibility of synthesizing all these studies, of bringing them together in comprehensive narratives. Thus the academics have generally left narrative history-writing to the nonacademic historians and independent scholars who unfortunately often write without much concern for or much knowledge of the extensive monographic literature that exists.”

Not Chernow. Wood says that he writes well and knows the secondary literature. The result is a very big and illuminating portrait of our national icon of sincerity, the general who always managed to elude his pursuers.**

******

* Barnet Schecter’s book, George Washington’s America: A Biography Through His Maps is also reviewed here. But the focus is on Chernow.

** For a sharp and less reverent account of both Chernow’s book and Washington’s life, see Jill Lepore’s “His Highness” in The New Yorker (September 27). Lepore isn’t persuaded that Chernow has made Washington more comprehensible.

Monday, November 29, 2010

Oral History and Iconic Red Desk Objects

Heather Cox Richardson

Morgan’s post on oral history struck a chord. (Among other things, he observes how valuable information is lost from one generation to the next.) I was shocked, recently, when talking to a high school student about her National History Day project, to learn that she had never heard of the Cold War hotline between the US and the USSR.

Indeed, why should she have? She was born after the end of the Cold War, and knows the USSR only from history books, most of which are too general to mention the hotline.

But in the 1960s and 1970s, everyone knew the story of the Red Telephone. It was such common knowledge that no one, apparently, bothered to make a point of passing it down.

The significance of that loss goes far beyond understanding the mechanics of the connection. Indeed, the actual hotline was not a red telephone on the President’s desk; it was a teletype machine at the Pentagon. (The history of the hotline is told wonderfully here, by Webster Stone, now producer and executive of the American Film Company.)

The mechanics of the line are far less important than the cultural context it evoked. Imagine watching TV or films from the era of the Cold War without the knowledge of what a red telephone meant. Everyone who lived during that time understood that when a red phone sat on a desk, it was not a fashion accessory. It was a symbol of an enormously important link on which hung the fate of the world. (See this clip of a 1967 episode of Batman, for instance.)

But to a more recent generation, it’s just a red telephone.

For younger readers who don’t see why this matters, think of a red Swingline stapler. It’s a key prop from the black comedy Office Space. It represents the stifling bureaucracy of the modern office, cut into cubicles staffed with faceless paper pushers. (This is also the film that gave us “Didn’t you get the memo?”) To a certain generation, a red stapler carries an indictment of the soul-crushing big business of the early twenty-first century. Ignorance of that meaning tears a critical understanding away from modern popular TV and film.

But will anyone bother to tell their children what a red stapler signifies?

It seems to me that such cultural context is one key aspect of history that is lost without oral history. People simply don’t write down what is common knowledge. It is more likely to get recorded in a passing comment made to an oral historian.

Friday, November 26, 2010

Archeologists and Historians at Work

Randall Stephens

What do historians do? How do they go about their work?

The work of a historian is not entirely unlike that of an archeologist. Both see through a glass darkly. (Maybe that's more "darkly" for an archeologist.) They look at primary source materials (material culture and texts), make comparisons to corroborate evidence, think about the context of one era compared to that of another, and use secondary literature to give them a bigger picture.

A great deal of interpretation and analysis informs the work of archeologists and historians. What's more, archeologists often disagree with each other just as historians do. Questions still remain open for debate. And debates can easily become politicized, tied into issues of national identity, or personal.

The hard work of archeology is spelled out clearly in a wonderful piece in the latest issue of National Geographic: Robert Draper, "Kings of Controversy: Was the Kingdom of David and Solomon a Glorious Empire—or Just a Little Cow Town?" (December 2010).

See, for instance, how Draper sets up the nature of controversies:

In no other part of the world does archaeology so closely resemble a contact sport. Eilat Mazar is one of the reasons why. Her announcement in 2005 that she believed she had unearthed the palace of King David amounted to a ringing defense of an old-school proposition under assault for more than a quarter century—namely, that the Bible's depiction of the empire established under David and continued by his son Solomon is historically accurate. Mazar's claim has emboldened those Christians and Jews throughout the world who maintain that the Old Testament can and should be taken literally. Her purported discovery carries particular resonance in Israel, where the story of David and Solomon is interwoven with the Jews' historical claims to biblical Zion.

Draper also sheds light on how texts are used (or misused) to ground the material evidence. Pieces of the puzzle are put together slowly over decades:

The books of the Old Testament outlining the story of David and Solomon consist of scriptures probably written at least 300 years after the fact, by not-so-objective authors. No contemporaneous texts exist to validate their claims. Since the dawn of biblical archaeology, scholars have sought in vain to verify that there really was an Abraham, a Moses, an Exodus, a conquest of Jericho. At the same time, says Amihai Mazar, Eilat's cousin and among Israel's most highly regarded archaeologists, "Almost everyone agrees that the Bible is an ancient text relating to the history of this country during the Iron Age. You can look at it critically, as many scholars do. But you can't ignore the text—you must relate to it."

Of course, historians don't use hard science or equipment in the same ways that archeologists do—carbon dating, chemical analysis, shock-proof computers that can handle intense heat and dirt. Historians don't typically study prehistoric cultures. And historians spend their time digging, mostly, in the comfort of air-conditioned archives.

Still, the similarities and points of contact between the two fields are quite interesting. We read evidence, whether it comes in the form of pottery shards and olive pits stuck in the side of a Palestinian hill or 18th-century English newspapers and court records.

Does the ambiguity of the archeological record produce controversies that burn hotter than historical controversies? After reading the National Geographic article, I could not think of historical debates that rage with the same intensity.

Wednesday, November 24, 2010

Pre-Holiday Stress Relief

Heather Cox Richardson

When I was in graduate school, the story circulated that one of our very few female professors had protested the scheduling of a committee meeting on the Wednesday before Thanksgiving. “The next day is Thanksgiving,” she reportedly told the room full of male colleagues. She then asked: “Just who cooks in your house?”

We thought she was brilliant (and the meeting time got changed).

In honor of everyone stressing about the upcoming holiday, for whatever reason, I offer some classic moments in the popular culture of Thanksgiving:

There was the problem of commercial advertising, covered brilliantly by WKRP in Cincinnati, when the station manager decided to hold a turkey giveaway:



There’s the problem of politics, expertise, and celebrity. This was covered well by West Wing, when the President checks out the Butterball hotline:



And then there’s the problem of dissent in a democracy. This was covered, of course, in the all-time classic "Alice’s Restaurant." (Updated here. Would embed it here, but that's been disabled.)

Enjoy.

Tuesday, November 23, 2010

Death, Memory, and Oral History

Morgan Hubbard

A recent piece in The Root on World War II-generation African Americans got me thinking about memory.

When people die, they take their memories with them; those memories become inaccessible forever. Our understanding of events that shaped our world can be lost with the death of one woman, man, or child.

When memories have contemporary political valence, this can be dangerous. Alessandro Portelli’s The Order Has Been Carried Out: History, Memory, and the Meaning of a Nazi Massacre in Rome (Palgrave Macmillan, 2003) illustrates this point well. Portelli shows how nationalist Italians have purposefully misremembered the circumstances of a Nazi massacre more than sixty-five years ago for present political purposes. In 1944, anti-fascist Italian partisans attacked a column of German soldiers in Nazi-occupied Rome. The Nazis retaliated less than 24 hours later by massacring more than 300 Italians. Pro-fascist Italians at the time concocted a counter-narrative that blamed the partisans, not the Nazis, for the massacre, alleging that the partisans ought to have turned themselves in to forestall the murders. The documentary record demonstrates that this was never possible, but the counter-narrative is persistent, even among young right-leaning Italians today. Portelli's work rescues the truth, but only in the nick of time—the citizens of Rome who remember what really happened are now elderly. Many have already passed away.

Granted, most cases of memory are not so politically and morally fraught. But the fact remains that the loss of memory accompanying a person's death is also tragic for the historical record. Since the inception of large-scale oral history projects like the Works Progress Administration's Federal Writers' Project in the 1930s—and the cultural turn in the humanities since the 1960s—academic historians in America have increasingly taken this to heart. This is a good thing; memory enriches the documentary record.

The next step is to understand that generational memory loss is no longer as inevitable as it once was, thanks to technology, which has made democratized/amateur oral history a reality. If you have a laptop, or even a smart phone, you can conduct an oral history. StoryCorps has an excellent Do-It-Yourself guide to oral histories; the American Folklife Center at the Library of Congress is another good place to start.

Monday, November 22, 2010

Andrew Johnson Sworn in as President

Heather Cox Richardson

It always surprises me how much I think I know about Civil War history that I really don’t.

What did it mean that Georgia was “remanded to military rule” in 1869? I always thought that phrase indicated that troops marched into the state and took control. That’s wrong. Actually, what being “remanded to military rule” meant was that Congress did not seat the elected representatives from Georgia that session.

What did it mean that President Rutherford B. Hayes “removed the troops from the South” in 1877? That always sounded to me like the soldiers packed up and moved out. That’s wrong, too. Actually, in April 1877, the president removed federal troops from around the South Carolina State House, permitting Wade Hampton’s men to take control of the government from Republican incumbent Daniel Chamberlain. (There were very few troops in the South at that point, in any case, since many had been moved to the northwest plains to fight the Lakota and Northern Cheyenne in 1876. More moved out in summer 1877 to combat the Great Railroad Strike.) Federal troops remained in the South for years after 1877, a nominal number, but enough to be a thorn in the side of Southern Democrats.

Knowing that much we “know” is wrong, I’ve always wondered if Andrew Johnson was actually president during Reconstruction, or if he was only acting president—a legal distinction, to be sure, but an important one.

It turns out that this is a story we’ve gotten right. Johnson did, indeed, take the oath of office and become president, not simply acting president, of the United States.

On Monday, April 17, 1865, the New York Times ran a stark account of the event, the very sparseness of the language conveying some of the reporter’s shock at what had transpired in the past two days.

Shortly after President Lincoln breathed his last at 7:22 on April 15, Attorney General James Speed visited Vice-President Johnson at his rooms in Kirkwood House on Pennsylvania Avenue. The newspaper reporter simply recorded that Speed delivered a message informing Johnson of Lincoln’s death and impressing upon him that “the emergency of the government” required that he take the oath of office immediately. What he did not say was that James Speed was the older brother of Lincoln’s best friend Joshua Speed, and that he was quite likely both in shock and in tears.

Johnson replied that he would take the oath at 10:00 in his rooms.

At that hour, eleven men arrived at Kirkwood House for the ceremony. Curiously, they presented a fair representation of Lincoln’s presidency.

Lincoln had close friends there from the early years in which he had learned his profession and built a political following. James Speed attended, undoubtedly remembering the younger Lincoln who had roomed with his brother and visited the older James at his law office in Kentucky to talk business. Two of Lincoln’s friends from his early days in Illinois also came: Senator Richard Yates, with whom a young Lincoln had plotted for political advancement, and General John F. Farnsworth, who was a fan of the off-color jokes Lincoln used to appeal to rural voters.

Wartime political rivals were represented as well: Salmon P. Chase, whom Lincoln had recently neutralized by appointing him Chief Justice of the Supreme Court, arrived to administer the oath of office to Lincoln’s successor.

There were political associates who understood the difficulties Lincoln had suffered under for the past four years, and who had wished him well. Frank and Montgomery Blair, father and son, former Democrats and strong Lincoln supporters from border regions, had come; hot-tempered Montgomery had been Lincoln’s Postmaster General for three years. Also there was Secretary of the Treasury Hugh McCulloch, who had seen Lincoln the morning of the assassination and was relieved to find the war-weary president happier and more cheerful than McCulloch had ever seen him.

Solomon Foot, president pro tem of the recently adjourned Senate, and Senator Alexander Ramsey of Minnesota, who was not especially close to the president, lent the gravitas of the party organization to the occasion.

Newcomers eager to underline their connection to the famous president were represented by Senator William M. Stewart, of Nevada, who had shaken Lincoln’s hand the night before outside his carriage as he left for the theater, and who later claimed to have received the very last lines Lincoln ever wrote: a note inviting Stewart to bring a friend to meet the president the next morning, a memo whose significance Stewart could not anticipate, and that he threw away as soon as he had read it.

Finally, staunch Republican Senator John P. Hale of New Hampshire was there, an unhappy symbol of Lincoln’s assassination. Hale’s daughter Lucy was a Washington belle, and was romantically involved in some fashion with John Wilkes Booth—possibly secretly engaged to the famous actor.

The eleven men gathered in Johnson’s rooms. Chief Justice Chase read the oath of office, and Johnson repeated it. Chase declared Johnson president, and those gathered gave him their best wishes.

“All were deeply impressed by the solemnity of the occasion,” the New York Times reporter wrote.

Indeed.

Sunday, November 21, 2010

World Heritage Sites

Randall Stephens

The Dallas Morning News features a piece on World Heritage sites. Lynn O'Rourke Hayes notes that "The U.N. Educational, Scientific and Cultural Organization works to preserve significant and inspirational places worldwide. Designated World Heritage sites, they're as diverse as Yellowstone National Park, Shark Bay in Australia and the historic center of Vienna, and they symbolize the world's collective history, culture and landscape. Reviewing the list of 911 World Heritage locations provides an impressive history lesson. Here are five your family would enjoy."

UNESCO's World Heritage Committee makes the decisions on who's in. On the criteria: "To be included on the World Heritage List, sites must be of outstanding universal value and meet at least one out of ten selection criteria. . . . The criteria are regularly revised by the Committee to reflect the evolution of the World Heritage concept itself."

A handful of the sites in China: Imperial Palaces of the Ming and Qing Dynasties in Beijing and Shenyang; Mausoleum of the First Qin Emperor; Mogao Caves; Mount Taishan; Peking Man Site at Zhoukoudian; The Great Wall . . .

A few in the United States: Mesa Verde National Park; Yellowstone National Park; Independence Hall; Statue of Liberty; Hawaii Volcanoes National Park . . .

A building, town, area, or natural feature designated a World Heritage Site can still sink into the sands of time. The Australian reports that "Historic treasures across Italy, from the fabled Golden House built by the emperor Nero to the Colosseum, are at risk of collapse. The treasures are under threat because of official neglect and budget cuts, heritage experts say."

It helps to have a thriving economy and a relatively corruption-free political order.

Friday, November 19, 2010

Lincoln and November 19, 1863… 1864… and 1865

Heather Cox Richardson

Seven score and seven years ago Abraham Lincoln brought forth on this continent a new sentiment, conceived in liberty, and dedicated to the proposition that all men are created equal.

Civil War historians know the Gettysburg Address so well that writing about it seems almost trite. We lecture about it; we teach it in discussion groups; we know it by heart.

It is hardly innovative to note that this famous speech marked a turning point in the meaning of the Civil War. With his masterful invocation of the Declaration of Independence, President Lincoln redefined the conflict. No longer would it be a fight solely to prevent the dismembering of the Union; from 1863 forward, it would be a struggle to guarantee that everyone born in America would have equal access to education, economic opportunity, and the law.

Lincoln’s declaration was truly a rededication of America. This, as much as anything, earned Lincoln a dominant place in the American pantheon. His words spoke directly to the true meaning of modern America.

But this belief in equality in America has never gone uncontested. It seems that Lincoln could have been speaking to the present when he warned at Gettysburg that the living must defend the legacy of the dead: “It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced. It is rather for us to be here dedicated to the great task remaining before us—that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion—that we here highly resolve that these dead shall not have died in vain—that this nation, under God, shall have a new birth of freedom—and that government of the people, by the people, for the people, shall not perish from the earth.”

A year to the day after delivering the Gettysburg Address, on November 19, 1864, President Lincoln offered another epigram about America.

Among the blizzard of correspondence that crossed his desk that day was a brief note the President jotted to General William S. Rosecrans. In it, Lincoln stayed the execution of Confederate Major Enoch O. Wolf, convicted of murdering Major James Wilson and six members of the cavalry of the 3rd Missouri State Militia.

The President freely admitted he did not know anything of the circumstances of the case, and that the decision about Wolf’s future was in Rosecrans’s hands. He had suspended the sentence because he wanted to make sure Rosecrans understood that the general’s own inclinations were unimportant, and that he must do only what was best for the nation. “I wish you to do nothing merely for revenge,” Lincoln wrote, “but that what you may do, shall be solely done with reference to the security of the future.”

After 1863, Lincoln turned his masterful political skills solely toward securing equality for all Americans. As he counseled Rosecrans to do, he lost himself in his vision for the nation. Lincoln took hit after political hit, deflected opponents’ wrath with wry stories, and tried to find middle ground with his enemies. As he indicated to Rosecrans, he had only one goal: to make the American dream accessible to all Americans.

In the end, Lincoln was unable to blunt the hatred of the men who saw his defense of equality as an assault on civilization. By November 19, 1865, the President was dead. But he left behind him a new vision of America, and a charge to those born after the night that he, too, died for it: “that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion.”

November Issue of Historically Speaking

Randall Stephens

The November issue of Historically Speaking should be arriving in mailboxes soon. Not long after that it will appear on Project Muse. The issue features essays on race and culture, "modernist" economics, the Viking Age, Byzantium, Arabs and Jews in Israel, and the philosophy of history. It also includes a forum on comparative ways of war and interviews with Donald Kagan and Richard Reinsch.

Nancy Marie Brown has an essay here, too, about an intriguing figure from the Dark Ages of whom I knew absolutely nothing: "In Search of the Scientist Pope, Gerbert of Aurillac (c. 950-1003)." (Brown also supplied us with the striking photo of the statue for the cover.) This piece on a monk who would become Pope Sylvester II made me wonder how many other historical characters, perhaps lost to the ages, might make us rethink what we know about a given period. Brown writes:

Born a peasant in the mountainous Cantal region of France in the mid-900s, Gerbert entered the monastery of Saint-Gerald’s of Aurillac as a child. There he learned to read and write in Latin. He studied Cicero, Virgil, and other classics. He impressed his teacher with his skill in debating. He was a fine writer, too, with a sophisticated style full of rhetorical flourishes.

To further his education, his abbot sent him south with the count of Barcelona in 967 to the border of al-Andalus. Islamic Spain was an extraordinarily tolerant culture in which learning was prized. The library of the caliph of Cordoba held 40,000 books (some said 400,000); by comparison, Gerbert’s French monastery owned less than 400. Many of the caliph’s books came from Baghdad, known for its House of Wisdom, where for 200 years works of mathematics, astronomy, physics, and medicine were translated from Greek and other languages and further developed by Islamic scholars. Arabic was then the language of science. It was from al-Andalus that the essence of modern mathematics, astronomy, physics, medicine, philosophy—even computer science—would seep northward into Christian Europe over the next 300 years. While Gerbert lived in Spain the first of many science books were translated from Arabic into Latin through the combined efforts of Muslim, Jewish, and Christian scholars. Many of the translators were churchmen; some became Gerbert’s lifelong friends and correspondents.

Gerbert’s role in bringing these ideas from Cordoba to Rome is unclear. Writers in the 11th and 12th centuries made him the instigator. Gerbertus Latio numeros abacique figuras runs a verse on two mathematical manuscripts, meaning—as the illustrations clearly show—the Arabic numerals 1 to 9. Seven manuscripts (out of eighty) give him credit for the first Latin explanation of the astrolabe. Even William of Malmesbury, whose 12th-century history of Gerbert’s stay in Spain reads like The Arabian Nights, says he “surpassed Ptolemy in knowledge of the astrolabe” and “was the first to seize the abacus from the Saracens.” . . .

Gerbert also took an experimental approach in his study of music. He made a pipe organ and wrote a treatise explaining how to compute the length of organ pipes for a span of two octaves. He was searching for a mathematical truth: a law for computing the dimensions of an organ pipe that would sound the same note as the string of a certain length on the monochord. He came up with an equation, using what physicists call “opportune constants” (or “fudge factors”), that allowed him to switch, mathematically, from the monochord to organ pipes and back. His treatise shows an extraordinarily modern perspective. He did not simply theorize—or search out authorities. He collected data and made practical acoustic corrections. His solution is ingenious, though labor intensive, and stands up to the scrutiny of modern acoustics theory. (Read more in the print or the on-line version when it's posted next week.)

Historically Speaking (November 2010)

Modernist Economics
Wyatt Wells

Winning All the Battles
Robert L. O’Connell

The Waspish Hetero-Patriarchy: Locating Power in Recent American History
Kevin M. Schultz

Empathy and the Etiology of the Viking Age
Robert Ferguson

Whittaker Chambers the Counterrevolutionary: An Interview with Richard Reinsch
Conducted by Donald A. Yerxa

Byzantine Exceptionalism and Some Recent Books on Byzantium
Warren Treadgold

Comparative Ways of War: A Roundtable

The German Way of War Revisited
Robert M. Citino

The American Way of War Debate: An Overview
Brian McAllister Linn

The Many Ways of Chinese Warfare
Peter Lorge

Wending Through the Way of War
James Jay Carafano

Thucydides and the Lessons of Ancient History: An Interview with Donald Kagan
Conducted by Randall J. Stephens

On the Liberation from the Tyranny of the Past: Arabs and Jews in Israel
Alon Confino

History’s a Mystery
Bruce Kuklick

In Search of the Scientist Pope, Gerbert of Aurillac (c. 950-1003)
Nancy Marie Brown

Thursday, November 18, 2010

Jill Lepore to Lecture Tonight at 7:00pm, Eastern Nazarene College

The Eastern Nazarene College History Department will present a lecture by Harvard University historian Jill Lepore at 7 p.m., Thursday, November 18, in Shrader Lecture Hall (23 East Elm Ave, Quincy, Mass). The lecture is free and the public is invited to attend.

Titled “Poor Richard’s Poor Jane,” Lepore’s talk will be based on her forthcoming biography of Benjamin Franklin and his sister, Jane Mecom.

Lepore is the author of New York Burning: Liberty and Slavery in an Eighteenth-Century City, The Name of War: King Philip’s War and the Origins of American Identity, which won the Bancroft Prize, A is for American: Letters and Other Characters in the Newly United States and, most recently, The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History. Lepore has written for the New York Times, the Washington Post, the New Yorker, and the Los Angeles Times.

The ENC History Department Public Lecture Series is made possible by the support of ENC alumni.

Wednesday, November 17, 2010

Glenn Beck, the Slave Power Conspiracy, and the Paranoid Style

Today's guest post comes from Michael Bonner, an adjunct professor of 19th-century history at the University of Arizona. Here, Bonner considers the "paranoid style" rhetoric of Glenn Beck, conservative radio and TV commentator, and slave-power conspiracy theorists in the 1850s.


Michael Bonner

Fox News host Glenn Beck has accumulated enough public prestige, through widespread media exposure, to alter the nation’s political dialogue. How did Beck evolve into such a potent source of political opinion? It is the style of delivery, not the message, that drives Beck’s success. Beck is a master practitioner of the “paranoid style.” (More on that below.) This is not a new approach to building a political base, nor is it confined to either the right or the left. To simply compare Beck’s popularity with other right-wing examples of the paranoid style would be pointless. However, analyzing Beck’s success alongside the most revered American progressives, the abolitionists of the 1850s, shows not only that the “paranoid style” is a recurrent theme in American political thought, but also that a similar rhetorical strategy made both Beck and the creators of the “slave power conspiracy” powerful voices in their respective political dialogues.

Columbia University historian Richard Hofstadter spelled out this phenomenon in The Paranoid Style in American Politics (1964), and defined it as “a way of seeing the world and of expressing oneself” that included a “feeling of persecution . . . [and] grandiose theories of conspiracy.” The paranoid style is constructed from several foundational themes. First, one must believe in “the existence of a vast, insidious, preternaturally effective international conspiratorial network designed to perpetrate acts of the most fiendish character,” and society is at war with “a vast and sinister conspiracy . . . set in motion to undermine and destroy a way of life.” Second, paranoid stylists are incredulous about others’ sincerity—opposing arguments, they think, are not based on reasonable deliberation, but must originate from some other source. As a result, they tend to believe that “history is a conspiracy, set in motion by demonic forces of almost transcendent power,” which cannot be conquered by “the usual methods of political give-and-take,” but instead require the constant efforts of “an all-out crusade.” The final element of the paranoid style is what Hofstadter called the “conclusive jump.” After compiling sufficient circumstantial evidence, its practitioners are ready to weave the proof into a conspiracy theory. This is the paranoid style’s weakest aspect and one that critics successfully attack when deconstructing conspiracy theories. Hofstadter declared, “what distinguishes the paranoid style is not . . . the absence of verifiable facts (though it is occasionally true that . . . the paranoid occasionally manufactures them), but rather the curious leap in imagination that is always made at some critical point in the recital of events.”[1] The willingness to interpret issues based on far-flung dubious evidence, or the “conclusive jump,” appears illogical to objective onlookers. Still, it symbolizes the essence of the paranoid style.

Far from being a moribund historical curiosity, the paranoid style is alive and well on television and radio. A close inspection of the May 13, 2010 episode of the Glenn Beck show offers an excellent example. Beck called this episode “Crime Inc.” and discussed an allegedly impending authoritarian global government. He used an “umbrella” analogy to describe several supposedly interconnected leaders motivated by the doctrines of United Nations functionary Maurice Strong. Beck focused on a 1990 television interview in which Strong asked, “isn’t the only hope for the planet that the industrialized civilizations collapse? Isn’t it our responsibility to bring that about?” Next, Beck listed the “small group” bent on global “control,” which included a variety of progressives: John Holdren, Jeremiah Wright, Anita Dunn, Andy Stern, and Donald Berwick. Beck compiled a series of similar views, expressions, and comments and masterfully represented them as conspiratorial evidence, ready for interpretation through the paranoid lens. The Fox host asked viewers to fall in with his interpretation of the alleged proof and affirmed the conclusive jump by reminding them that they were looking for “a small group, that . . . will collapse the industrialized system.”[2] It is the delivery of information, not the content, that makes Beck such a powerful political rhetorician. Any single piece in the chain of evidence could be disputed as out of context, one-sided, or at times absurd, but when carefully arranged to fit specific ends, the collection of otherwise disparate evidence evolves into a plausible conspiracy.

Glenn Beck uses a rhetorical strategy strikingly similar to the one anti-slavery leaders employed in the 1850s. Abolitionists made up the most radical element of the fledgling Republican Party and struggled to advance immediate emancipationist views against competing party platforms of nativism, prohibition, and free labor. Politically active abolitionists conceptualized the slave power conspiracy to demonize southern Democrats, increase their power within the Republican Party, win popular support, and provide a foundation for Republican unity. Ample circumstantial evidence existed for the slave power thesis, and using it, proponents accumulated political power out of proportion to their numbers. David Brion Davis writes: “It was a fairly small group of men—scarcely over twenty-five or thirty—who first delineated the Slave Power in speeches, articles, and books,” but their concept greatly influenced the mainstream Republican political rhetoric of William Seward, Charles Sumner, and, to a degree, Abraham Lincoln.[3] Thus the paranoid style assisted Republican growth in the 1850s and enhanced the power of immediate abolitionists over the party’s rhetorical strategy.

Objective thinkers realized that there was no Slave Power Conspiracy determined to take over the world in the 1850s, just as today there is no global shadow government bent on the destruction of industrialized nations.

The paranoid style makes sense of Beck’s TV and radio rants. The misuse of evidence is not a right-wing phenomenon but a historical style adaptable to rhetorical strategies on both the right and the left. It is not Beck’s message or his compilation of evidence that makes his arguments appealing to audiences, but the content’s organization and delivery. From a long-term historical perspective, Beck is not unique, but simply a professional purveyor of the paranoid style.
___

1. Richard Hofstadter, The Paranoid Style in American Politics (New York: Vintage Books, 2008 reprint), 4, 14, 29, 37.

2. Quoted material from May 13, 2010 “Glenn Beck” television program on the Fox News Channel.

3. David Brion Davis, The Slave Power Conspiracy and the Paranoid Style (Baton Rouge: Louisiana State University Press, 1969), 62.

Tuesday, November 16, 2010

Art of the Americas

Randall Stephens

The Boston MFA opens its new Art of the Americas Wing this week. (Watch a video on the space.) The extension represents the most significant public art project in America today. With a price tag of over $500 million, it will display a portion of the museum's massive South and North American collections, ranging from the pre-Columbian era to the 20th century. Some critics have complained that the wing will lack some of the MFA's best post-war pieces. That seems like a minor issue, though.

WBUR reports:

And while this is a big cultural moment for Boston, the rest of the world is paying attention, too.

“Museums all over the country have expanded over the last decades and this is the MFA’s entry in the ‘space race,’ ” said arts writer Judith Dobrzynski, who has reported on the MFA’s expansion for the Wall Street Journal and the New York Times.


“And because it’s a beautiful space, and because it’s chock-a-block with art from the Americas, which is different. This is very important and I think it’s probably going to raise the profile and the stature of the MFA.”


Dobrzynski is impressed by the MFA’s fundraising effort, and the $504 million represents a feat for Boston. Other cities — bigger cities, she said — including Los Angeles and Chicago, have not been as successful in raising money for their own museum campaigns. Or as strategic. The MFA’s new construction cost about $340 million — much of the rest sits in an endowment.
“You know the MFA was very ambitious here,” she said, laughing, “but at the same time a bit conservative by going for a building fund and an endowment fund at the same time. I shouldn’t say conservative. I should say responsible here.”>>>

As for North American artists, highlights will likely include:

Washington Allston
John Singer Sargent
Gilbert Stuart
Frederic Edwin Church
Martin Johnson Heade
Fitz Henry Lane
Thomas Eakins

John Singleton Copley
Childe Hassam
Frank Benson
Edmund Tarbell
Winslow Homer
Mary Cassatt
Ellen Day Hale
Gretchen Rogers
Lilian Westcott Hale
Georgia O’Keeffe
Arthur Dove

Monday, November 15, 2010

The Journal Standard

Chris Beneke

Historians are people of the book. We write piles of them—monographs, textbooks, and edited books, strictly academic books and books intended (usually with no foundation in reality) for the bestseller list. Some of our better books are histories of the book; some of our better historians are historians of the book. We cherish books dearly, not least for their narrative artistry. But we also value their utility within the academic world. At research universities and colleges with research aspirations, after all, the scholarly book serves as the elusive ticket to the vastly overrated world of the tenured associate professor, and later to invitations to speak, comment, and publish still more books.

It’s probably safe to say that, as a profession, we are agreed that a series of significant journal articles or book chapters may substitute for “the Book” when it comes to things like tenure and promotion. Harvard probably won’t grant you tenure for such an achievement—to be honest, Harvard won’t grant you tenure under any circumstances—but other research-oriented institutions probably will. Still, “the Book” rules in nearly every history department at every institution that aspires to climb the U.S. News rankings. “How’s your book coming along?” is the haunting refrain that echoes in the corridors of the Marriotts and Grand Hyatts at convention time.

It’s no revelation that different criteria prevail in disciplines such as medicine, engineering, business, mathematics, and the natural sciences. For scholars in these fields, books often connote “textbooks” and are therefore of little interest to serious scholars. What matters is journal publication—and not just in any old venue. Hits in so-called A-list journals are coveted. Even in the social sciences and some of the humanities, junior faculty are expected to produce journal articles and book chapters, but nothing in codex form with a single author’s name on it.

In short, historians are producing, recognizing, and even celebrating work that runs sharply against the grain of research in other disciplines. To put it in the bluntest terms, we have a Book Standard; they have a Journal Standard. It’s not that we don’t value that other form of scholarly currency. We just don’t value it quite as much.

Nonetheless, the advantages of the Journal Standard to university administrators and trustees seeking both transparency and higher, measurable rates of faculty research production are almost self-evident. Journals are ranked according to clear-cut categories, typically from A to C. An article in one A-list journal can be treated like an article in another A-list journal. That isn’t the case with books. To this point, blessedly, we haven’t had to bother with such lists. The unfortunate consequence is that a book’s “impact” may now be more difficult to demonstrate.

Of course, we know that Harvard, Yale, the University of Chicago Press, and the like are selective and prestigious publishers. But do we know anything more than that? And, if we don’t, aren’t we, the authors of such books, liable to inflate their value? That is certainly the conclusion that an administrator or trustee, especially one trained outside the humanities and the humanistic social sciences, might draw. Even humanists could be tempted by such heretical logic.

We might look upon the situation as analogous to that faced by the United States in the late nineteenth century. To pay for the massive buildup of forces in the Civil War, the federal government issued paper “greenbacks,” which were not directly convertible to specie (gold or silver, etc.). The unsurprising result was inflation. In an attempt to limit price increases, currency fluctuations, and speculation, the United States, along with much of the industrializing West, moved to a Gold Standard.

It’s not hard to see the current evaluation regime in universities as equivalent to a return to the Gold Standard following a massive expansion in scholarly output (we could likewise imagine our books as the equivalent of silver coinage and our relatively balanced treatment of both books and journal articles as something akin to bimetallism). To historians, of course, books are golden. But that’s not necessarily how the rest of the university will see them in the future, or even now. Moreover, if administrators and scholars in other fields have clear metrics for gauging the impact of non-historical work and ranking their non-historical outlets and we do not, we may find ourselves at a severe disadvantage in the quest for scarce research resources. We may also hurt the very scholarly form that we claim to prize too much to defend in instrumental terms.

Here’s the payoff: To save scholarly historical books and the sustained research efforts that they represent, we may need to think hard about the impact (I won’t quibble now about the use of that term) of such books. That might mean doing things that are unpalatable to humanists, like going beyond the perennial questions asked by promotion and tenure committees (e.g., how many journal articles does it take to equal one scholarly monograph?) and considering ranking book publishers across all fields of history and within them. We might need to find some weighted measure of reviews (determined partially by the journals in which they appear), as well as working to track and recognize citations made many years after publication. We could also tighten up the peer review process, attempt to account for the influence of subventions, and properly credit books that appear only online. In other words, to ensure that our work is fairly judged and equally valued, and to save these often beautiful, extended works of scholarship that we call “books” (whose covers are still adorned with old paintings and sharp modern designs), we may have to set aside our reservations and occasionally treat them as nothing more than well-recognized and secure media of scholarly exchange.

Saturday, November 13, 2010

Roundup: American Civil War at 150

"Into the flames: An ambitious reappraisal of the bloody war between the states," Economist, November 11, 2010.

AMERICANS proudly, and understandably, stress the exceptional character of their history. Yet from the beginning it has been intertwined with larger narratives of European, Atlantic and world history, including themes of migration, trade, ideology and power. The American revolution itself was an episode in the long conflict between Britain and France. Now Amanda Foreman, an Anglo-American historian, has looked at the American civil war “as it was seen by Britons in America, and Americans in Britain”. She points out that it was a defining moment not just in American history but in the relations between the two countries.>>>

Jenny Upchurch, "Civil War history comes to life at Nashville celebration. Re-enactors kick off state's celebration," Tennessean, November 13, 2010.

100 re-enactors . . . return today to the Bicentennial Mall in downtown Nashville to cook, march, fire cannons, make glass photo negatives and pump the bellows of a charcoal fire. The living history event at the state park, setting the scene of the beginning of the Civil War in 1860, kicks off the state's celebration of the war's 150th anniversary.>>>

Fredrick Kunkle, "In Richmond, a Civil War expert seeks to emancipate history's narrative," Washington Post, November 7, 2010.

. . . [Edward Ayers says,] "I am trying to get us to rethink what the war is about, and what we've been doing in Richmond is instead of talking of one sesquicentennial, one anniversary, it's really two: One's the Civil War, and the other's Emancipation," Ayers says, with the faintest drawl. "The main thing that happened, the consequence of the war, was freedom for 4 million people who had been held in bondage for over two centuries in this country.">>>

Ted Widmer, "Lincoln’s Mailbag," New York Times, November 12, 2010.

. . . Most of Lincoln’s correspondence is housed in the Library of Congress, just off the East Portico of the Capitol, where he gave his two great inaugural addresses. (They are there, too.) The Library is a national treasure, both for its holdings and for its robust commitment to make these priceless artifacts available to all. That means putting them online, for free, which the Library has been doing since February 2000, with scholarly support from the Lincoln Studies Center at Knox College.>>>

Bruce Smith, "Black SC Civil War vet honored with grave marker," Washington Post, November 11, 2010.

CHARLESTON, S.C. -- Almost 100 years after his death, a black Union Civil War vet from South Carolina finally has a veterans marker on his grave. The gravestone for Henry Benjamin Noisette was unveiled Thursday in a black Charleston cemetery. Noisette escaped slavery and joined the U.S. Navy in 1862.>>>

Friday, November 12, 2010

Honor and the War in Afghanistan

Bertram Wyatt-Brown

"Honor has come back, as a king, to earth, And paid his subjects with a royal wage."
- Rupert Brooke (1914)

Does the venerable ethic of honor apply in any way to our ongoing Middle Eastern wars? In dealing with Afghanistan, it seems that President Barack Obama feels obliged to preserve America’s honor despite his personal skepticism regarding the outcome. General David H. Petraeus, the Afghanistan commander, proposes that the U.S. Army’s honor is at stake as well. But if the latest offensive does not achieve realistic results, should we order the troops to soldier on for honor’s sake indefinitely? Are there better alternatives for a cause with few positive advantages? Those are some of the big questions that will face us next summer when the American mission is scheduled to draw down.

From the American perspective, honor, it could be argued, prompts our continuing war in Afghanistan. We don’t use that rather antiquated term except in the ceremony that confers the Medal of Honor for outstanding, self-sacrificing bravery in battle. Since the dawn of human history, however, armies have universally required respect for and obedience to higher authorities, self-denying discipline, and loyalty toward and a willingness to defend others in the ranks. These imperatives make up the essence of a code. To fail to meet them can mean shame and disgrace. That stigma must be averted at all costs.

In Afghanistan, the American government obviously sought retribution for the brutal assault by Al Qaeda. We went after the Taliban, who were harboring the Arab terrorists. The result was a swift overthrow of the Taliban government. Despite the Bush administration’s mistake in subsequently ignoring the Afghan situation, even Bush’s Democratic and peace-minded successor could not withstand the thrust of American cultural and military history—never admit defeat.

At the same time, according to Bob Woodward’s new exposé, Obama’s Wars, the president has opposed a full-scale escalation, as strenuously promoted by General Petraeus, Admiral Michael G. Mullen, Chairman of the Joint Chiefs, and other military leaders. In an interview about his book, Woodward mentions an exchange between the President and General Petraeus. Obama told him, “You have to recognize also that I don't think you [will] win this war.” The commander replied, “I think you keep fighting . . . This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.” Woodward observes that the President knows “how dreary it [the war] is.” Moreover, “he realizes he’s been dealt a bad hand, but he can’t walk away, and so he’s committed but it’s not the George Bush kind of ‘bring it on’ commitment.” To go down that path would mean spending trillions more and absorbing heavier casualties, still without prospect of “mission accomplished.”

While setting a relatively early date for reducing the national commitment there, the president could not brush aside the military enthusiasm for fighting on. As a result, a minor escalation of 30,000 additional troops was granted to carry out a new war strategy. Yet, this compromise was also designed to satisfy the army’s need for reassurance of its honor in the eyes of the nation. The fear is that not to do so would create disrespect both at home, politically, and abroad. We must be esteemed for our American determination, guts, and willingness to see things through.

To better understand the Afghan issue, let’s look at the role of honor in the Middle East. There, its ancient principles have their deepest roots. Knowing little of the cultural setting in which we find ourselves engaged, the American public must be appalled, for instance, by what are known as “honor killings.” A 2000 UN report estimated that 5,000 women are murdered every year at the hands of relatives. In 2003 a sub-cabinet official in the Pakistani government estimated that at least 1,261 Pakistani women had been killed for sexual misconduct, as the community and family perceived it. Often enough such deaths are authorized by the jirgas, or councils of patriarchs. The Middle East is largely dominated by male authority. In that region, familial, clan, and tribal ties are in the hands of men, notably so in Afghanistan. In a rigidly structured hierarchy, all adult males, regardless of social status, bear the prime responsibility for defending and projecting the honor of their relations, clan, and tribe. The honor code makes deadly revenge a paramount duty. That is especially so when Afghans believe that they, their way of life, and their Islamic faith have been grossly insulted by dangerous outside forces.

In dealing with a never-conquered mountain warrior people, General Petraeus offers an innovative approach to warfare. It may prove more successful than the tradition of “search and destroy.” The formula is to show respect for the whole culture, its leaders and civilians. The Afghans can be most welcoming when the guest’s deep respect is manifest. That, Petraeus believes, is the key for winning over reluctant tribesmen. He orders subordinates to follow some well-planned instructions. To that end, he prescribes that all officers and men should conform to the revered principles of agreeable conversations over tea and offered hospitality; make skillful gestures to suggest the agreed upon equality of all parties; and engage in the exchange of gifts. They are what we might consider bribes. In Afghan eyes, however, the reception of largesse is an honorable transaction. The exchange serves as a pledge for solidifying the mutual loyalty and respect of the parties involved. Money or favors seal the oral contract. At the same time, the general pursues with as much force as possible military action against the Taliban and Al Qaeda.

This subtle, nuanced, and yet aggressive approach may still be insufficient. Thousands of years of nearly constant warfare of family against family, clan against clan, and tribe against tribe are not to be overcome in even a few years. We may bring a modern approach to martial ways. Nonetheless, the Afghans, especially the Pashtuns, know how to fight small actions and how to wear down their foes by unappeasable resistance. The country continues to live up to its reputation as “the graveyard of empires.” Do we have the patience to see this war through to its perhaps endless denouement, as Petraeus predicts?

Without a military draft commensurate with the alleged seriousness of the conflict, the American military establishment relies on a relatively small number of increasingly battle-worn troops, some of whom have been redeployed as many as a dozen times. The public is blissfully indifferent to their plight. We can hope, however, that General Petraeus and the President can prove the indisputable worthiness of the mission. Our own sense of honor in warfare has already exacted a high price. With reference to the ironic epigraph by the English war poet Rupert Brooke, should honor continue to be worshiped as a “king”? The poet had in mind the monarchs whose armies were engaged in the Great War—King George V of England, Kaiser Wilhelm II of Germany, Emperor Franz Joseph of Austria-Hungary, Tsar Nicholas II of Russia, Sultan Mehmed V of the Ottoman Empire. Their honor was deemed at stake in the desperate struggle. Yet, even if that primordial code offers a “royal wage” in the form of blood and treasure, as Brooke implies, are our nation’s current aims and sacrifices worthwhile?

The answer will not be easy to fathom at this juncture in our national history.

Thursday, November 11, 2010

In Defense of Facts and Memorization

Randall Stephens

I recently had a student in a large survey class who did not appear to be prepared for an exam. That's not unusual. But this student answered the essay question on the test in a very unusual way. She/he wrote a poem describing how much she/he hated "history." (I was glad to be spared his/her wrath, at least in the poem.)

This got me thinking about why students say they despise history. It certainly could be related to how history is presented to them: in dry-as-dust fashion, or in one-damn-thing-after-another mode. Perhaps such students think of lectures, textbooks, and history classes in general as producing storms of useless facts, unconnected to reality. Some non-majors complain that they did not come to college to learn about the past or about irrelevant dead people.

Some students might not have an aptitude for history, plain and simple. That's fine.

But how much of the undergraduate complaint against history has to do with an unwillingness to learn content? Surely one needs to know real details about the past in order to understand it.

It strikes me that historians can be a little too defensive about teaching too many of the facts, the details of history. To be sure, history is not a collection of pointless facts, as I tell my students. Among other things, history helps us understand who we are by examining who we were. I like how Peter Stearns puts it in "Why Study History?" on the AHA site: "The past causes the present, and so the future. Any time we try to know why something happened—whether a shift in political party dominance in the American Congress, a major change in the teenage suicide rate, or a war in the Balkans or the Middle East—we have to look for factors that took shape earlier."

A student will need to know what actually happened in the past before he or she can go on to write history, tell a story, formulate arguments, and do the interesting work of interpretation.

That's not unique to history. Content and some basic memorization are at the heart of most disciplines. Biologists have to learn anatomy and classifications. Others in the hard sciences must memorize formulas and need a grasp of mathematics. Learning a language requires plenty of memorization. And on and on.

History professors, though, blush a bit when they ask students to memorize a list of names, ideas, dates, and the like. A student of Antebellum America should know the difference between John Calhoun and John Brown. A student in a course on the Early Republic should be able to distinguish a Federalist from an Anti-Federalist. A student in a colonial history course will need to know that the French and Indian War came before the American Revolutionary War.

OK, I may be overstating the case, or grossly oversimplifying things . . . But, I'd like to say nothing more than this . . . facts matter, memorization has its place, and history does require exposure to and understanding of real content.

Oh . . . and George Washington never drove a Dodge Challenger.

Wednesday, November 10, 2010

Screening the Past: Films for the Second Half of the Western/World Civ Course

Randall Stephens

I've compiled a list below of films that I use for a course I'm teaching this semester, The West in the World since 1500. I usually use short selections from these. Roughly 15 minutes of film for a 1 hour and 15 minute class seems to work well.

Most of the documentaries and features included below use historians as commentators. Many contain archival photos, paintings, and prints; artful dramatizations; and vintage film footage.

A DVD search on the WorldCat site can usually yield movies on a wide range of subjects. (Though I've been surprised that there are not enough good ones on 17th-century topics: European wars, absolutism, colonial encounters, advances in science . . . I'd also like to find more history docs on Africa, Asia, and the Middle East . . .)

In the past I have included embedded video segments in the PowerPoint presentations I use for lectures. (The free software Handbrake is the best I've found for ripping DVDs onto my MacBook. It's easy from there to put them into a presentation. See also this tutorial on how to download and embed YouTube videos into a PowerPoint slide. I have not tried this, so I'm not sure how well it works.) I've not been entirely happy with the quality of ripped videos, and the size of the files makes them a little impractical. YouTube or a simple cued-up DVD works much better for me.

In the list below I've thrown in a number of DVDs that I've not been able to use in class. (Far more feature films could be added to this, too.) In chronological order:

Luther (2004).

Martin Luther (PBS, 2003).

Empires: Islam: Empire of Faith (PBS, 2001).

Conquistadors (PBS, 2001).

The Return of Martin Guerre (1982).

Classical Destinations (Sky Arts, 2006), YouTube clip of Versailles, Louis XIV, and Paris.

Versailles (2004), YouTube, multiple sections.

Vatel (2000), see trailer.

Girl with a Pearl Earring (2003).

John Adams (HBO, 2008).

The French Revolution (History Channel, 2005), entire film can be watched in sections on YouTube.

Egalité for All: Toussaint Louverture and the Haitian Revolution (PBS, 2009).

The Lost Kingdoms of Africa (2010).

Slavery and the Making of America (PBS, 2004).

Amazing Grace (2007).


Charles Dickens (Biography, 1995), watch instantly on Netflix, if you have an account.

A History of Britain (BBC, 2000), I use sections from episodes on Industrialization. See YouTube clips.

Guns, Germs, and Steel (National Geographic, 2005), I use part of the last episode, which can be watched instantly on Netflix.

The Young Victoria (2009), watch instantly on Netflix.

China's Boxer Rebellion (History Channel, 1997).

The Last Emperor (1987), watch instantly on Netflix.

East Wind, West Wind: Pearl Buck, the Woman Who Embraced the World (1993).

The Great War (PBS, 1996).

Influenza 1918 (PBS, 1998).

Matisse Picasso (2008).

Sigmund Freud: Analysis of a Mind (Biography, 1997).

The People's Century: Red Flag, 1917 (PBS, 1997).

The War of the World: A New History of the 20th Century (PBS, 2008).

The Crash of 1929 (PBS, 2009).

Nanking (2008).

Europa Europa (1990), watch instantly on Netflix.

Downfall (2005).

Frontline: Memory of the Camps (PBS, 2005).

CNN: Cold War (1998), I use an episode on the iron curtain and the red scare.

1968 with Tom Brokaw (History Channel, 2008).

About the United Nations: Decolonization (1999), not an easy one to track down.

The Road to 9/11: A Brief History of Conflict in the Middle East (PBS, 2006).

See also, "Some Films I Use for My Colonial History Course"; "Dancing about Historiography: At the Movies with a Methods Course"; and the March 2008 issue of Perspectives Online, which was devoted to film.

Monday, November 8, 2010

Look Back in Anger: The 1960s and Evangelical Conservatives

Randall Stephens

The rightward turn of voters in the 2010 elections and the traction that conservative candidates have gained have a variety of causes. Certainly, a number of Americans are unhappy with health care reform, unemployment, and a president they feel is far too liberal. But one group, white conservative Christians, is particularly up in arms.

The Pew Research Center reports that "Two of the largest religious groups in the electorate followed the same basic voting patterns in the 2010 elections for the U.S. House of Representatives as they have in prior elections: white Protestants voted overwhelmingly Republican and religiously unaffiliated voters cast their ballots overwhelmingly for Democrats. . . . However, among all white voters who describe themselves as born-again or evangelical Christians -- a group that includes Catholics and members of other faiths in addition to Protestants -- 78% voted Republican in 2010, compared with 70% in the last midterm election."

At least since the 1960s, many evangelical and fundamentalist Christians have felt besieged by new currents in the larger American culture. The upheaval of the swinging sixties shocked and frightened believers. Evangelicals and fundamentalists were horrified by campus riots, the counterculture, and what they saw as the excesses of the liberal political establishment. In the late 1960s and early 1970s Christianity Today, the chief magazine of American evangelicalism, published article after article on the terrors of the Left and the end of Christian civilization. Their world, so it seemed, was crumbling around them. (See Darren Dochuk's From Bible Belt to Sunbelt and Daniel Williams's recent God's Own Party for excellent insight into these and earlier developments.) The bestselling nonfiction work of the 1970s, Hal Lindsey's The Late Great Planet Earth, wove an evangelical end-times drama out of the explosive issues of the age. (Though Jesus People wore beads and Roman sandals and grew their hair "all long and shaggy," as Merle Haggard put it, they were in step with the apocalyptic temper of the times and drank in the anti-60s brew of the day.)

James Dobson, one of the most influential evangelical leaders of the modern era, described this anti-1960s view bluntly in 2008. Much of the wickedness of modern society, Dobson thundered, could be traced to that era of moral decline. The so-called Summer of Love in 1967 unleashed a whirlwind of hedonism and vice, he said. In his multi-million selling parenting manual, Dare to Discipline (1970), Dobson wrote: "In a day of widespread drug usage, immorality, civil disobedience, vandalism, and violence, we must not depend on hope and luck to fashion the critical attitudes we value in our children. That unstructured technique was applied during the childhood of the generation which is now in college, and the outcome has been quite discouraging. Permissiveness has not just been a failure; it’s been a disaster!"

Long after the bong smoke and tear gas have cleared, conservative American Christians--some in the Tea Party and many more who are happy enough with the Republican Party--continue to register post-60s fears. A number want to reclaim their America from secularists and godless liberals. They fear that an overpowering government wants to curtail their rights to raise their children as they see fit. They worry that their freedom to express their religious beliefs in the public sphere continues to come under attack. In other words, many are uncomfortable with a post-60s pluralism that has reshaped the nation and with a secular notion of the public good.

Such concerns are not lost on savvy politicians. Christine O'Donnell, Sarah Palin, Rand Paul, Ken Buck, and a host of others are intimately aware of constituents' fears. When such candidates lash out at secular experts and laud commonsense Christianity, they know perfectly well that they are tapping into a powerful counter ideology, one that has been decades in the making.

Sunday, November 7, 2010

NPR's Studio 360 Looks at Buffalo Bill and Western Myths

Randall Stephens

This week NPR's acclaimed Studio 360 series looked at Buffalo Bill's Wild West Show and some of the enduring myths of the West. The program explored the legend of this major American celebrity and tracked the career of the western, from dime novels to modern incarnations like HBO's Deadwood. A particularly interesting line that the Studio 360 program took concerned the decline in the popularity of the western in the 1960s and the western's reemergence in altered form. Robert Altman's Buffalo Bill and the Indians, or Sitting Bull's History Lesson (1976) would have been nearly sacrilegious in an earlier era. A drunk, egomaniacal Buffalo Bill (played by Paul Newman) staggers pathetically through this weird carnival of a film.

Buffalo Bill "was the most famous American in the world," says Studio 360, "a showman and spin artist who parlayed a buffalo-hunting gig into an entertainment empire. William F. Cody’s stage show presented a new creation myth for America, bringing cowboys, Indians, settlers, and sharpshooters to audiences who had only read about the West in dime novels. He offered Indians a life off the reservation — reenacting their own defeats. Deadwood producer David Milch explains why the myth of the West still resonates; a Sioux actor at a Paris theme park loves playing Sitting Bull; and a financial executive impersonates Buffalo Bill, with his wife as Annie Oakley."

Listen to the program here.

See Buffalo Bill-related posters, photographs, and more from the Library of Congress.

Watch the full episode of PBS's Buffalo Bill (American Experience).

Hear the Beatles' rehearsal of "The Continuing Story of Bungalow Bill" (1968) . . . unrelated, but excellent.