Thursday, June 30, 2011

American Geography and the Missouri River

Heather Cox Richardson

One of the things I insist all my undergraduates learn at the beginning of each of my classes in American history is basic American geography. This sounds terribly basic, I know, but no one ever bothered to teach it to me (or at least, if they did, I wasn’t paying attention), and once I started teaching it, I discovered that no one had ever taught it to most of my students, either.

I don’t do anything fancy. I start with a physical map of North America (with an inset of Hawaii, of course, pointing out the great piece of trivia that, thanks to Hawaii, America has six time zones). Then I tell them that they must know the East Coast, the West Coast, the Gulf Coast, Canada and Mexico, and three mountain ranges: the Appalachians, the Rockies, and the Sierra Nevada. I talk a bit about why each has been important in some major development (using the Donner Party for the Sierra Nevada, to keep them awake). I point out the Great Plains and the Great Basin and explain the low rainfall and high winds that create such distinctive features there, showing some of the scenery from those sections.

But the natural elements on which I spend the most time in my introduction to American geography are the Ohio River, the Mississippi River, and the Missouri River.

The Mississippi is the great artery of what became the United States, and figures prominently in most economic developments until after the Civil War (this is an easy one, since most of them have heard of the Mississippi, and maybe even have read Huckleberry Finn).

Then I turn to the Ohio. No one can understand early American history without understanding the Ohio River, the incredible fertility of the land around it, the determination of Indians to hold it, the line it created between slavery and freedom. (Huck can help here, too, since he and Jim went down the Mississippi River to reach freedom—counterintuitive until I explain they were trying to get down the Mississippi far enough to get up the Ohio.)

Then, balancing the importance of the Ohio on the western side of the Mississippi, there’s the Missouri River. The Missouri gets short shrift in American history texts and courses, it seems to me, and it shouldn’t. It dominates the West as thoroughly as the Ohio dominated the East in an earlier time. Just a few obvious points: It was the Missouri that offered Lewis and Clark a highway to the West Coast at the beginning of the nineteenth century. The land around the upper Missouri has always had some of the best hunting in the country, making it highly prized both by the native people who managed the lands before the coming of easterners and by the interlopers who started arriving in the 1860s. It was this region, of course, that produced Red Cloud and Sitting Bull and Crazy Horse, and it was to wrest control of this land that General George A. Custer marched his men to the banks of the Greasy Grass in 1876. Only a few decades later, it was the Missouri River that seemed to offer the answer when the dry plains defeated hopeful western farmers. Fifteen dams now slow the river, creating reservoirs and taming floods to make the upper Northwest more productive and habitable.

And that, of course, is the reason for this post. How many of our students—at least those east of the Mississippi—paid any attention to the historic and terrifying flooding of the Missouri River this spring? A large melting snowpack has combined with torrential rains to create a flooding emergency that rivals Hurricane Katrina. Levees are breaking up and down the river, sending evacuees fleeing as the water covers homes and fields, as well as the Fort Calhoun Nuclear Station in Nebraska (which does not appear to be in danger). Reuters is reporting that the floods along the Missouri and the Mississippi have damaged about 2.5 million acres of farmland in the U.S., which seems likely to hurt farm production. It’s probable that this will, in turn, drive up food prices. This is not an unimportant weather event.

Already, observers are speculating about what this disaster says about the way America has historically approached the taming of the Missouri. The management of water in the West has long been a subject that interests western historians, but it seems to have drifted past most people in the East. And now that debate has sprung pretty dramatically out of the pages of historical journals and into the headlines. This year’s disastrous flooding suggests that American historians might be able to do the country a service by making sure all their students know what the Missouri River is, why it’s been important in the past, and just how vital its management has been, and still is, to the nation today.

Wednesday, June 29, 2011

History and Myth

Dan Allosso

To follow up on Heather, Jonathan, and Chris's posts—and the general theme that we’ve been developing about history education, the state of the profession, and writing for the public—I’ve been listening to James Loewen’s lecture series, The Modern Scholar: Rethinking Our Past. Like Lies My Teacher Told Me (my thoughts on that book here), the lectures focus on bad history taught to high school students, and how we could do it better. Columbus, the Civil War and Reconstruction, and the racial “nadir” of the 1890s–1920s are again central, as is Loewen’s critique of “heroification.”

Heroification, Loewen says, is the rendering of pivotal figures in American history (Columbus, Washington, Lincoln, Wilson) as two-dimensional icons rather than as complicated humans. This process not only decontextualizes their achievements, it renders them useless as role models. And it feeds a myth-building process that is not only ethnocentric, but that stifles initiative and participation in civic life. At the moment, these are the parts that fascinate me: myth and participation.

The myths supported by mainstream high school texts include American exceptionalism and a subtler, but possibly more problematic, commitment to “Progress.” Loewen demonstrates that both myths are historically inaccurate with dramatic, fact-based counter-narratives. But I wonder: can myths be defeated with facts? Maybe what is needed is a powerful counter-myth.

When I began thinking seriously about writing, a few years ago, I had the opportunity to work with Terry Davis, author of the classic young adult novel Vision Quest. As the thousands who studied with Terry during his career at Mankato State can attest, Terry is not only a great writing coach, he’s a tireless advocate of serious writing for young adults. “You won’t have to condescend,” Terry said when he invited me to join his YA fiction workshop. So I tried not to, when I wrote Outside the Box. I tried to project myself back, and think about teen alienation from the inside, as I had experienced it myself.

I think it’s significant that Harry Potter never has to kill anybody. But what type of myth—what type of worldview—does this give to a generation growing up in a nation at war? Like the myth of Progress hidden in our high school histories, does it make the real world more or less comprehensible for young adults? Loewen says students are bored by history and sense that they’re being lied to. Unlike Harry Potter, their history teachers are not supposed to be presenting escapist fiction. But if Loewen is right, it’s not at the level of facts, but at the level of myth that high school history really operates. So what about alienation?

Young adults are expected to rebel, for a little while, against the society of their parents. But typically, they’re then expected to accept the authority of the people and institutions they had rebelled against. Their ultimate reward is that they will inherit that authority, and the cycle will be complete. This myth of cyclical youth rebellion is based on the myth of progress. What happens to it if we admit there are limits to growth? If society’s current behaviors are unsustainable, should we be reassuring ourselves and our children that their alienation is just a phase that they’ll outgrow?

What does this imply for those of us who want to write history for young adults? Are there counter-myths that can be useful? The myth of “the little guy against faceless power” resonates in our culture, from the Terminator movies to internet conspiracy theories. Historians on the right and the left have built their narratives on this structure—it’s no accident that Glenn Beck hides his “let the corporate overlords do what they want” message in a swirl of populist rhetoric. Can this counter-myth be used to make space for young Americans to challenge authority on something more than a generational-hormonal basis?

All myths leave out detail and complexity. Can a historian work at the mythic level without betraying history? America is said to be exceptional in the great distance between what our professional historians understand about our past and what the general public believes. Is this because—unlike other cultures, whose myths reside in shared ethnicity or religion or some other institution—Americans are held together by their mythical history? Should we distinguish, even more than we do, between historians and people who write for the public?

Tuesday, June 28, 2011

History’s Tests

Chris Beneke

Testing brings out the anti-Whig in all of us, it seems. The declension model was back in fashion last week as the American public was reminded of how little history it knows. The National Assessment of Educational Progress’ report on U.S. history revealed that American eighth graders have difficulty enumerating colonial advantages over the British in the Revolutionary War, fourth graders have trouble explaining Abe Lincoln’s importance, and twelfth graders often fail to grasp who was allied with whom during the Korean War.

Here’s a quick summary of the academic fallout:

Several historians suggested that we un-knot our knickers. Sam Wineburg reminds us that we’ve been wringing our hands over our historical ignorance for about a century now and assures us that common-knowledge questions are not included in these assessments. In other words, graduating seniors probably do appreciate the significance of Harriet Tubman, Rosa Parks, the bombing of Hiroshima, and Auschwitz. They just aren’t asked about stuff that we know they know. Paul Burke echoes Wineburg’s claim that the much-bemoaned results may simply reflect the test’s design, rather than the United States’ descent into barbarity. James Grossman adds that, whatever the use of the individual questions, they may not have been asked of the right students. “[I]n many states children don’t study much U.S. history until fifth grade.” “Next year,” he quips, “let’s give fourteen-year-olds a test on their driving skills.”

Very little history until fifth grade? Linda Salvucci’s argument is that this is precisely the problem. Indiana elementary students, for example, get a grand total of twelve minutes of history instruction per week. Salvucci says that “parents … really ought to be mobilizing to demand that public officials get serious about adequately funding history education in the schools. History must not be allowed to become some optional or occasional add-on to the ‘real’ curriculum.” Her conclusion: “We need a STEM-like (Science, Technology, Engineering and Math) initiative for history.”

James Livingston lays some blame for the alleged poor performance at the feet of professional historians and their affinity for anti-glorious counter narrative. Viewing the matter from the perspective of a 12th grader, he writes: “If you tell me the past doesn’t matter because it’s a record of broken promises, systematic cruelty, and failed dreams, or because it’s an irretrievable moment of eccentric deviations from a norm of appalling complacency, fine, f--- it. If I can’t use it to think about the present, why should I bother? Thanks, Doc, you convinced me that I don’t have to.”

Happily, the NAEP flare-up coincided almost exactly with the publication of Historically Speaking’s roundtable on historical thinking at the K-12 level. Fritz Fischer suggests that “[w]hen it comes time to write the guidelines for how history is taught in the classroom, historical thinking [as opposed to the digestion of content] needs to become the guide.” Bruce Lesh agrees and details how he focuses student attention on a series of provocative questions and gets them engaged in interpreting primary sources. Robert Bain draws this lesson from the teaching of world history—teachers need to keep the overarching economic and social forces in mind “while attending to what their students are thinking and learning.” The two goals, he points out, are not easily reconciled, because while you may be thinking about the geopolitical forces that propelled the European conquest of America, your students are thinking about “Columbus’s desire for personal wealth and glory” (or something along those lines . . . ). Linda Salvucci wraps up the forum with a call for nudging the public toward better history. “[W]e need to grab, define, and educate the audience,” she writes. We need to offer history that is both accessible and edifying.

Monday, June 27, 2011

Jump Right in, the Water's Fine

Jonathan Rees

In the new issue of the Journal of the Historical Society, Allan Kulikoff makes a series of suggestions about how to improve history education at the higher ed level. One of the problems he cites is that:

Historians have uncovered entirely too many social facts to digest. The glut in scholarship sets the stage for increasingly impenetrable survey textbooks, puts ever-longer lists of must-read books before graduate students, narrows the focus of dissertation research, and increases the flood of unreadable monographs.

There seems to be a budding consensus on the textbook part of that complaint, as no less a personage than David McCullough recently unloaded on them in an interview with the Wall Street Journal:

What's more, many textbooks have become "so politically correct as to be comic. Very minor characters that are currently fashionable are given considerable space, whereas people of major consequence farther back"—such as, say, Thomas Edison—"are given very little space or none at all."

Mr. McCullough's eyebrows leap at his final point: "And they're so badly written. They're boring! Historians are never required to write for people other than historians."

I would take issue with the notion that the facts in most textbooks are comic in their political correctness, since McCullough and I clearly have different priorities. Nonetheless, we historians should probably all agree that fitting everything we want students to know and think about history between the covers of a single volume has become increasingly difficult in the last forty years, at least since the advent of the New Social History (which is, of course, now rather old). Textbook authors have to make choices, and it is inevitable that those of us who assign their books will disagree with many of the choices that they make.

While Kulikoff proposes a series of interesting suggestions attacking the entire crisis in history education (which I’ll let you read yourself by getting a hold of the JHS June issue), I have a modest proposal of my own to take care of the textbook problem: don’t assign one. No, I’m not kidding. I ditched the textbook in my survey class last semester for the first time and was delighted by the results.

While I’d like to credit a prominent history blogger from the northern part of my state for giving me the idea, the truth is that I had been thinking about killing my textbook for years, but never had the nerve to try it until I read that she had already done so. I had been switching textbooks about once a year and was unsatisfied with every text I tried before I started assigning primary sources instead. It’s not that all textbooks are as badly written as McCullough suggests (although some clearly are); it’s that none of them emphasized the same facts and themes that I did in class. I wanted a textbook that complements my teaching rather than one that provides a competing narrative. Now I build my own reading list based upon what I teach already and have more time left to teach other skills besides memorization.

What did my students think? I did a special evaluation toward the end of the course, and they seemed to like it just as much as I did. Yes, this might be expected when you’re giving them less reading, but I like to compare my new syllabus to the Sugar Act of 1764: I assign fewer pages than I used to, but I enforce the reading of the pages that I still assign much more stringently. Deep in my heart I knew that nobody read the textbook before, but now I see the documents I assign and teach get directly referenced in the best student essays. I’m convinced that by pouring fewer facts into their heads, fewer of them are coming out the other side.

Why admit to such pedagogical heresies in a public forum?

I’m convinced that if more of us no-textbook professors make ourselves known, more historians will jump on the bandwagon. I was once afraid to go with my gut, but I’m through living in fear of the unknown. Just because you’ve assigned a textbook in the past (the same way that your teachers assigned you a textbook and their teachers assigned them a textbook), you do not have to assign a textbook in the future. Think of the students in your survey classes who will never take another history class. Do you really want their last memory of our discipline to be an overly long, dull book, written by a committee and lacking an argument?

If you’re happy with your survey textbook, then disregard this post. If not, then I say jump right in, the water’s fine.

Jonathan Rees is Professor of History at Colorado State University – Pueblo. He blogs about history and other matters at More or Less Bunk.

Friday, June 24, 2011

A Hotline for Teachers

Heather Cox Richardson

It hasn’t been a great week for history teachers. News media made headlines out of a new report that only 13% of high school seniors are proficient in American history. Students perform worse in history than in the other subjects routinely tested by the NAEP, the National Assessment of Educational Progress.

Who is to blame for the appalling condition of our historical knowledge? Most of us could make a pretty decent list of things that make it difficult to teach history today, but according to Rick Santorum, the problem is liberal teachers. In Ames, Iowa, hot on the presidential trail, the Republican former Senator from Pennsylvania ignored the Texas textbook controversy, the Virginia textbook controversy, the rewriting of history by David Barton and Sarah Palin and Michele Bachmann and Mike Huckabee. Instead, Santorum declared:

We don’t even know our own history. There was a report that just came out last week that the worst subject of children in American schools is — not math and science — its history. It’s the worst subject. How can we be a free people. How can we be a people that fight for America if we don’t know who America is or what we’re all about. This is, in my opinion, a conscious effort on the part of the left who has a huge influence on our curriculum, to desensitize America to what American values are so they are more pliable to the new values that they would like to impose on America.

The bad news for teachers continued. How does Congress propose to combat this deficit in history? By slashing, or perhaps eliminating altogether, the funding for the Teaching American History program.

So for all the disheartened teachers out there, I offer a ray of hope. Months ago, I mentioned a high school sophomore who had never heard of the Cold War hotline between the U.S. and the U.S.S.R. Just yesterday, the student sent me a copy of her final project for National History Day: an interactive website about the history of the hotline. There was a strict word limit and an equally strict limit for images, so the project does not take long to read. But I highly recommend you take some time to click through it (my favorite is the section on pop culture). It shows that, without a doubt, at least some students are learning and some teachers are teaching well.

The student is a minor, so I’m not going to give her name, but hats off to both her and to her teacher, Mr. Christopher Kurhajetz!

Nice work, both of you. You give us hope.

Thursday, June 23, 2011

Some Teaching Ideas

Heather Cox Richardson

Next year, I’m going to have the opportunity to teach the American Civil War and Reconstruction to a small group of students at a new university. It seems like a perfect time to try out some new approaches. I’ve been trawling through syllabi and teaching discussions on the internet, and have unearthed a couple of new ideas I think are worth entertaining:

First of all, I always organize classes backward, using my own version of the Understanding by Design method developed by Grant Wiggins and Jay McTighe. This method requires tight organization of material to illuminate a larger point: the “take-away” students should learn from a course. Every reading, every lecture, supports this larger theme (sometimes by challenging it).

A number of people use diagnostic essays early in the semester to gauge the level of student writing, but Professor Lendol Calder of Augustana College in Illinois has added an interesting twist to an early assignment that could make the diagnostic essay do double duty. Calder asks students in the survey to write a two-page essay explaining what they know about American history. His goal is less to gauge their writing than to see what “story” they use to organize their knowledge. Have they learned “the glory story, the people versus the elites,” or that the story of America is that of “high ideals/mixed results”? This twist on the diagnostic essay would fit particularly well with a course on the Civil War, since so many students come to such a course with fervent views of the conflict, but it seems like it would work well with a number of courses.

And then there’s this gem. I’m afraid I might have to break down and switch to a Mac to be able to use this timeline. Constructed carefully—and yes, it will take ages, I’m sure—this would free up a great deal of time I used to spend lecturing, enabling us to spend more time analyzing primary documents.

What about assessments? I’m still not sure about them, or rather, I’m even less sure about them than I am about the other aspects of the course. For an upper-division course on American history, a long paper based on primary research is a no-brainer. Americanists have the great luxury of being able to send students to almost limitless primary sources in a language they speak. There’s no reason not to let them experience the excitement of doing their own research and the fun of writing it up. (There will soon be similar cause for celebration for people studying British history, too.)

But what about exams? I’ve written before about team research projects, and I do like the idea of assigning teams of people to research a topic. In this era, knowing how to find information, weigh it, and assimilate it into an argument seems crucial. But for a topic as widely covered as the Civil War, is it possible to come up with interesting assignments that will really require significant teamwork?

The same friend who has tried cooperative work has also tried exams that place the student directly in the era being studied, as in: “You are a 25-year-old man from southern Ohio, visiting New York City on July 12, 1863. How will you spend the day?” In this example, a student would have to understand the history that made southern Ohioans tend to be Democratic and sympathize with the South, and would have to realize that the young man would arrive in New York City in the middle of the Draft Riots that pitted Democrats against Republicans, local government against the national government, and workers against African Americans. I have not yet heard how the experiment went when my friend tried it. It seems to me to have great potential, but also the risk of producing some utter fantasies that would be incredibly irritating.

It’s a wonderful thing to have so many new tools for teaching. Perhaps most of all, it’s wonderful to have the intellectual space to try new approaches. If anyone else has been kicking around new ideas, do let the rest of us know.

Wednesday, June 22, 2011

Half-Breeds, Stalwarts, and Contingency

Heather Cox Richardson

A year or so ago, a graduate student studying for her comprehensive exams asked me if I could explain the difference between Stalwart and Half-Breed Republicans in the nineteenth century. I could, I said, so long as she understood that no one cared.

I went on to explain that the difference between Stalwarts and Half-Breeds crystallized in the Republican national convention in 1880. In that contest, Stalwarts, led by Roscoe Conkling and Don Cameron, wanted to secure the nomination for former president U. S. Grant. Half-Breeds backed James G. Blaine.

What was at stake in the nomination was really nothing other than a personal feud. Conkling and Blaine had hated each other since the war years. Conkling had become Grant’s right-hand man during his presidency, and he hoped to become the crucial figure in a renewed Grant government, handing out patronage to his supporters in New York. Blaine had a different set of friends, and they pushed his nomination in the hope that he would cut out the Conkling men.

The nomination process did not proceed as either camp hoped. A significant body of delegates refused to support either Grant or Blaine. They threw their votes to a dark horse, James Garfield of Ohio. The Blaine delegates, willing to vote for anyone but Grant, followed. Garfield won the 1880 nomination, and the 1880 election.

I told my friend that the distinction between Stalwarts and Half-Breeds didn’t matter because there was little daylight between the actual policy positions of Conkling and Blaine. The ascendancy of either would not have changed the course of the Republican Party, or the legislation it supported. For the purposes of the country they were interchangeable, and so, for that matter, was Garfield. The difference between Stalwarts and Half-Breeds was rather like the difference today between John Boehner and Eric Cantor: a difference in style, to be sure, but not such a great difference that one can imagine history graduate students in 2130 being asked to explain its significance.

Today, as I wrote about this fight, I rethought that flippant dismissal.

In order to quiet the angry Stalwarts, the Republican convention put Stalwart Chester Arthur into the vice presidency. But then, to calm the Half-Breeds, President Garfield named James G. Blaine himself to the position of Secretary of State in the new administration. This was the most powerful position in the Cabinet, and Stalwarts—led, as ever, by the insatiable Roscoe Conkling—cried foul.

When Garfield offered the fabulously lucrative position of collector of the port of New York to an appointee without consulting Conkling, the fat was in the fire. Conkling, a famously touchy character, was undoubtedly personally affronted. But he claimed to oppose the appointment on the grounds that the Senate’s power to advise and consent gave Senators the power of appointment. Garfield was usurping power, he insisted; senators had ultimate say over who received government appointments in their home states. What was really at stake, though, was whether or not Conkling would control New York.

To force the issue into the open, Conkling resigned his position in the Senate. New York’s other senator, Thomas Platt, joined him (thereby earning the memorable nickname “Me Too” Platt). They were confident the New York legislature would reelect them, thus slamming Garfield and returning the Stalwarts to the top position in the government. They were wrong. New Yorkers had had enough of Conkling’s dictates and were not willing to endorse the idea that senators should hold sway over the president’s appointments. The legislature turned to entirely new senators.

Those traditional historians who tried to find any significance in this teapot tempest blamed the fight between Conkling and Blaine for the assassination of President Garfield two months later. In July, Charles Guiteau shot President Garfield in an apparent attempt to put Stalwarts in power after all. From this the nation got civil service reform but, as I told my scholarly friend, no real change in governmental policies.

Here is where histories generally drop the Stalwarts and Half-Breeds.

But I came to realize today that there was, in fact, a terrifically significant event to come out of this clash: Roscoe Conkling was out of a job.

With his political career suddenly gone, Conkling had to find a way to rebuild his fortunes. He had always been a brilliant orator, and he turned naturally to the law. The first client through his door was Jay Gould, the railroad magnate. Conkling used his great popularity and fame as a legislator to become one of the nation’s premier litigators for big business. It was in that capacity that he argued the case of San Mateo County v. Southern Pacific Railroad in December 1882. In this case, Conkling argued that a county tax on the railroad violated the due process clause of the Fourteenth Amendment. He insisted, based on his position as a congressman who had participated in the framing of that amendment, that when it adopted the Fourteenth Amendment, Congress had intended for it to protect corporations as well as individual persons. The court did not explicitly comment on Conkling’s really quite outrageous claim in 1882, but in 1886, it announced that Conkling’s doctrine was so definitive that it would not hear arguments to the contrary. The principle that corporations were protected by the Fourteenth Amendment limited government regulation of business well into the twentieth century.

And therein lies the great historiographical revelation of my current project. I have always seen political history as the story of competing ideas, but again and again I find small personal quirks changing the course of history. If only Thurlow Weed had not wanted to protect William Henry Seward, we would not have gotten President Andrew Johnson; if only Charles Sumner hadn’t been so snobbish, the Grant presidency would have succeeded.

And if only Stalwarts and Half-Breeds had been on speaking terms, corporations might never have been protected by the Fourteenth Amendment.

Tuesday, June 21, 2011

June Issue of Historically Speaking

Randall Stephens

It's a little past mid-June. And that means . . . the June issue of Historically Speaking is now on the Project Muse site. (As usual, access it through your library's website or through a university or college computer.) Hard copies should be arriving in mailboxes about right now.

This latest issue features a roundtable on teaching high school history; interviews with Robert M. Citino, Leo Damrosch, Thomas Albert Howard, Simon Price, and Peter Thonemann; essays by Sean McMeekin, David T. Courtwright, Bruce Mazlish, and Toby Wilkinson; and review essays by Aaron L. Haberman and William R. Shea.

Toby Wilkinson's "The Army and Politics in Ancient Egypt" is particularly relevant given the ebb and flow of the Arab Spring. An excerpt:


To the student of Egypt’s ancient history, the pervasive influence of the army in the country’s current politics comes as no surprise. Throughout the pharaonic era, from the foundation of the Egyptian state (ca. 3000 B.C.) to its absorption into the Roman Empire (30 B.C.), military might played a role at least as important as hereditary succession in determining who ruled the Nile Valley. The first king of the First Dynasty, Narmer, won his throne by force and proclaimed his victory in a great commemorative stone palette decorated with scenes of military victory. Celebrated as Egypt’s founding document, the Narmer Palette stands today in the entrance of the Egyptian Museum in Cairo’s Tahrir Square, just yards from the site of the popular uprising that recently unseated President Hosni Mubarak.

In the centuries and millennia following Narmer, the kingship of Egypt was always vulnerable to seizure by the strongman of the day, despite its presentation in art and writing as a sacred institution, god-given and immutable. More often than not, that strongman was an army commander. This pattern of succession is most apparent at times of political upheaval, for example the century of turmoil that followed the collapse of the Middle Kingdom in the 18th century B.C. In this uncertain time, when the kingship passed from one claimant to another with bewildering rapidity, one of the men who seized the throne (and ruled long enough to commission a stone statue of himself) was called Mermesha. His name simply means “overseer of the army.” Another ruler, King Sobekhotep III (ca. 1680 B.C.), started his career in the palace guard and rose through the ranks of the army to a position where he was able, successfully, to challenge for the throne. On his royal monuments he made a virtue of his background, openly flaunting his non-royal origins so as to distinguish himself from the tired and discredited royal family that, a generation or two earlier, had led Egypt into disunity and chaos. Sobekhotep III did not inaugurate a dynasty of his own, but instead in typical army fashion he left the throne to three brothers—Neferhotep I, Sahathor, and Sobekhotep IV—who shared his military background (their grandfather was an infantry officer).

Even in periods of strong central rule, such as the “golden age” of the Egyptian Eighteenth Dynasty (ca. 1539–1319 B.C.), a close study of the historical sources reveals the central role of the army in the succession to the throne. The founding kings of the Eighteenth Dynasty came to power as victors in a protracted civil war. Once established in the royal palace, they were keen to portray themselves in a new light: as kings by divine right, the guardians of Egypt’s religious traditions. Yet they never forgot their military origins. Thus when the childless Amenhotep I looked for an heir to succeed him, he chose an ambitious and dynamic army leader (the future Thutmose I) who would extend the borders of Egypt and forge an empire in the Middle East. Thutmose I’s surviving royal inscriptions betray his origins. In a tone of rampant militarism, they extol warfare as the righteous duty of an Egyptian ruler, and laud the king as a great warrior who is ready to roam the earth and take on any adversary: “He trod its ends in might and victory seeking a fight, but he found no one who would stand up to him.” . . .

Historically Speaking (June 2011)

Jihad-cum-Zionism-Leninism: Overthrowing the World, German-Style
Sean McMeekin

Is “Right Turn” the Wrong Frame for American History after the 1960s?
David T. Courtwright

The Politics of Religion in Modern America: A Review Essay
Aaron L. Haberman

Military History at the Operational Level: An Interview with Robert M. Citino
Conducted by Donald A. Yerxa

The Birth of Classical Europe: An Interview with Simon Price and Peter Thonemann
Conducted by Donald A. Yerxa

Historical Thinking at the K-12 Level in the 21st Century: A Roundtable

The Historian as Translator: Historical Thinking, the Rosetta Stone of History Education
Fritz Fischer

Making Historical Thinking a Natural Act
Bruce Lesh

Considering the Hidden Challenges of Teaching and Learning World History
Robert B. Bain

“The Music Is Nothing If the Audience Is Deaf”: Moving Historical Thinking into the Wider World
Linda K. Salvucci

Galileo Then and Now: A Review Essay
William R. Shea

Tocqueville in America: An Interview with Leo Damrosch
Conducted by Randall J. Stephens

God and the Atlantic: An Interview with Thomas Albert Howard
Conducted by Donald A. Yerxa

Ruptures in History
Bruce Mazlish

The Army and Politics in Ancient Egypt
Toby Wilkinson

Monday, June 20, 2011

An Interview with John Fea on the Historian's Public Role and the Christian Nation Debate

Randall Stephens

With the 4th of July just around the corner, it's a good time to reflect on how Americans conceive of the settlement and founding of their nation. John Fea has been thinking and writing about colonial America and the Revolutionary Era for quite some time. (He writes on that and related matters at his popular, always interesting blog The Way of Improvement Leads Home.) Fea is the author of The Way of Improvement Leads Home: Philip Vickers Fithian and the Rural Enlightenment in Early America (U of Pennsylvania Press, 2008) and an editor of Confessing History: Explorations in Christian Faith and the Historian's Vocation, with Jay Green and Eric Miller (University of Notre Dame Press, 2010). His most recent book, Was America Founded as a Christian Nation? A Historical Introduction (Westminster John Knox Press, 2011), uncovers the historical roots of the Christian nation question and offers much-needed, timely insight. I recently caught up with Fea and asked him about his book, contemporary discussions on the matter, and his experience lecturing on the topic.

Randall Stephens: Gordon Wood has commented on the strange fascination Americans have with their founders. Other westerners, he observes, are not so obsessed with the lives and values of their nations' progenitors. Why do you think it matters so much, to so many Americans, that the founding fathers and the nation itself were and are Christian?

John Fea: As I argue in the first four chapters of Was America Founded as a Christian Nation?, the overwhelming majority of Americans have always seen themselves as living in a Christian nation. Though the idea of America as a "Christian nation" has been understood in different ways by different groups, I think one could make a pretty good argument that today's advocates of a "Christian America" have a large chunk of American history on their side. This is more a statement about the influence of Christianity on American culture and less a statement about whether or not the founders believed that they were creating a uniquely Christian nation or whether those who believe today that America is a Christian nation are correct in their assumption.

Anyone familiar with the historiography of American religion knows that American evangelicalism and American nationalism have, in many ways, grown up together. A lot of Christians today have a hard time separating the two. Since this book came out I have talked to several people—both on the radio and on the lecture circuit—who struggle to distinguish their patriotism from their Christian faith. One radio host told me point blank that if America was not a Christian nation he could not be a patriot.

Stephens: You write about the difference between what the public wants out of the past and how historians actually practice history. Is this difference at the heart of the Christian nation debate?

Fea: Yes, I think it is. Before I wrote this book I was aware, at a cognitive level, that most people were in search of a usable past. But the reaction to this book has really opened my eyes to the way ordinary Americans think about history. I don't believe that there is anything wrong in searching for a past that helps us achieve our present-minded agendas. Lawyers do it all the time. But those who only approach history in this way miss out on the transformative power that the study of the past can have on our lives and our society. In a world in which self-interest, individualism, and even narcissism reign supreme, history forces us to see ourselves as part of a larger human story. It has the potential to humble us. Its careful study has the potential to cultivate civility as we learn to listen to voices that are different from our own. I am working on a book on this topic, which should be out sometime in late 2012.

I would like to start a crusade to promote good historical thinking as a means of contributing to civil society. Now if only I could find a wealthy philanthropist or foundation willing to fund my project. (If anyone wants to talk more about this, let me know.)

Stephens: What do you think accounts for the widespread popularity of amateur Christian nation historians like David Barton and the late Peter Marshall?

Fea: I can think of three reasons for their popularity. First, as I mentioned in my answer to a previous question, Barton and Marshall are reinforcing the God-and-country narrative that many American Christians feel comfortable with. Christians can read these authors and breathe a sigh of relief because someone is affirming their already held beliefs about the American past.

Second, I think both Barton and Marshall are/were effective communicators. Barton is smooth. He can be very compelling. I have watched him on television and have found him to be an effective salesperson. Marshall and his co-author David Manuel were excellent writers. When compared to your average history textbook, their million-selling The Light and the Glory reads like a page-turner.

Third, I think Barton and Marshall have been so successful because, frankly, they are the only game in town. Most scholars, academics, and intellectuals have not stepped up to the plate to provide an alternative narrative. Scholars do not go out on the lecture circuit or write for popular audiences. They do not have public relations people or connections with local churches. They don't have the time or inclination to do these things.

When most Christian academics think about being public intellectuals they think in terms of writing for The New Republic or The Atlantic or a similar venue. Don't get me wrong, I think that Christian intellectuals should be writing in these venues. I also think that Christian intellectuals should be publishing with major university presses and trade presses, but if they want to serve the church and society they need to think about their careers, or at least part of their careers, in a different way.

Stephens: How do you address the Christian nation debate in the classroom?

Fea: I have had several opportunities in the classroom to address this issue. I usually don't dive into it in any great depth in my U.S. survey course (although this could change since I have now written this book), but I do explore it a great deal in my upper-division course on the American Revolution and a seminar I teach on religion and the founding. I teach mostly Christian students so many of them come to my classes with some opinion about the whole Christian America debate. Since I am a history professor, I usually try to approach the topic without a strong opinion one way or the other. (In other words, I don't come into class with my proverbial guns blazing and tell my students that I am going to try to debunk their false views about the relationship between Christianity and the American founding). Instead, we usually handle the issue through the reading and discussion of primary sources.

Stephens: You've lectured extensively on the subject of your book. Could you comment briefly on your experience? Would you recommend going on the lecture circuit?

Fea: My experience on the so-called "lecture circuit" has been mixed. I have thoroughly enjoyed the speaking and engagement with those who come to my talks. On the other hand, I have realized just how difficult it is to get people to think historically about this topic. Most people who come to one of my lectures come with their minds already made up about the question in the title of my book and are thus looking for me to confirm their position. When this does not happen (for those on both sides of the debate), I think folks get a bit uncomfortable or perhaps even defensive. I welcome this response. After all, history is complex and messy. Any type of education or learning should make us a bit uncomfortable.

Would I recommend the lecture circuit? Yes. I think that academics and scholars should be able to take their research and explain it to a popular audience in an enthusiastic and passionate way. On the other hand, such an approach to an intellectual or academic life requires taking the time to leave the ivory tower and get on the road in order to meet people in all kinds of settings. Needless to say, I have had my share of cookies and punch in church basements, chicken dinners on college campuses, tours of local historical societies, microphone problems at revolutionary-era round tables, and long car rides listening to E-Street Radio on XM. I have enjoyed it all and hope to do more of it, but I also realize that not all academics will want to do this, nor should they.

Friday, June 17, 2011

A Day in the Life: Art and History

Heather Cox Richardson

Everyone knows the iconic image of John, Paul, George, and Ringo from the cover of Abbey Road. That image launched deep investigations into its hidden meanings—“Paul is Dead,” anyone?—and into the stories it might be telling about the Beatles.

There was a story behind the image, too, and it’s one in which art and history intersect. As with any photo shoot, the photographer Iain Macmillan took a number of different shots at the August 1969 session with the Beatles. The shots swept in bystanders, cars, different expressions on the musicians’ faces, different interactions.

What can these photos tell us about history?

I wonder, not only because it’s Friday, but because of another treasure trove of images recently discovered in Chicago. Vivian Maier emigrated from France in the 1930s and worked as a child in a New York sweatshop. When she was older, she worked as a nanny in Chicago. She had few friends, apparently, and interacted with the world largely through her camera. She left her photos, largely unseen, in a Chicago storage facility, which put them up for sale when her payments became overdue after her death. John Maloof, writing an Images of America book about a Chicago neighborhood, bought them.

What he found was, to my mind, incredible. These are simply stunning pieces of art, chronicling the world of the streets, primarily in Chicago but also in New York and distant countries. Her use of line, light, and texture is extraordinary.

Her photos are works of art, but they are also unusual snapshots of life in the mid-twentieth century. What can they tell us about the world in her era?

The historical reading of photographs intended to tell a societal story is straightforward compared to reading the Abbey Road photos or the Vivian Maier collection. Jacob Riis was making a point about urban poverty; Nick Ut was making a point about the Vietnam War with his 1972 image of Kim Phuc. A recent article honoring the late Tim Hetherington suggested that the key to successful war photography was an understanding of the complexity of the conflict and the ability to capture images encapsulating that story.

Artists, of course, have a different imperative. Their stories are not, necessarily, driven by current societal concerns. But if art historians can use paintings to interpret the world in which the images were made, shouldn’t historians be able to use artistic photography to interpret the modern world? And if so, how?

What can the Abbey Road photos tell us about their era?

Thursday, June 16, 2011

The Writer’s Toolkit

Philip White

What makes a successful writer? Talent? Certainly. Knowledge of and passion for one’s subject? Absolutely. The ability to find a market for your wares? No doubt.

Yet without the proper tools, a writer, like any craftsperson, will face serious difficulties.

The best communicators through the ages have turned to the latest innovations to help them eke out words and a living. The Roman orator and statesman Cicero relied on his secretary Tiro’s invention of tablet-based shorthand (no, not an iPad 2!), a Royal Quiet Deluxe portable typewriter was Hemingway’s weapon of choice, and Raymond Chandler used a Dictaphone for the first drafts of his screenplays.

In the past few years I—a Twitter-averse, text message-avoiding, Facebook-shunning curmudgeon—have forced myself to find tools that eliminate paper-shuffling inefficiency and allow me to record late-night thoughts that invariably evade me the next morning. (Putting such things out of reach of my four-year-old son, Johnny, and his two-year-old brother, Harry, was also a good move.) So, here goes with my list of treasured tech tools, which see a lot more use than my dust-collecting hammer, screwdriver, or pliers.

Speech Recognition Software + Mic

Winston Churchill tormented many a secretary with late-night transcribing duties and, while she’d make a fine scribe, I doubt my good lady wife, Nicole, would care to record my nocturnal babble. So, I turned to technology—namely, a wireless, Bluetooth-compatible microphone and a copy of Dragon NaturallySpeaking. When my mother first purchased its predecessor, Dragon Dictate (complete with a very poorly designed dragon logo), some 15 years ago, it was a crude, ineffective technology that required more coaching than a preening, high-strung NFL wide receiver. Now, it’s quick, user-friendly, and mostly accurate, although you can’t eliminate the occasional hilarious gaffe—I don’t think Joe Stalin would’ve cared for what my speech recognition software calls him.

Tablet + Stylus

We’ve moved on a little since the aforementioned Tironian notes were in their heyday, but the concept is similar—take a stylus and mark semi-intelligible scribble on a tablet. Now, I know there are a lot of Apple fanboys and girls who will want to string me up for saying this, but, despite its numerous merits, the iPad is not the best thing for the job. That honor goes to the HTC Flyer, whose seven-inch form and handy “Magic Pen” make on-the-go note taking a cinch. The real power is the wireless sync with Evernote, which runs OCR on your notes so you can find certain words later with a full content search from any device. And if it can read my scrawl, it can read anything. Another bonus is the ability to highlight and annotate within e-books downloaded from the Kobo store—as close to marking up a real paper-and-glue copy as you can get on a slab of aluminum and plastic. Yes, having to pay extra for the stylus at Best Buy is a fine example of tech company money grabbing, but to me, at least, it was worth it.

Olympus Phone Call Recording Thingy

OK, before you think I’m going all 1984 here, I try to tell interviewees that I am going to record our conversation lest they outpace my makeshift shorthand. The technology here is simple: a tiny microphone that sits in my ear and records both sides of any phone conversation on my voice recorder. Just like the NSA (totally kidding, noble overlords). I then connect the recorder via a USB cable and rip the file right into iTunes. I love this gizmo, except for the droning of my own voice. (Who, except talk show hosts, politicians, and Charlie Sheen, actually likes to hear themselves talk?) There has to be something similar for the iPhone, iPad, iWhatever, too, Applenistas.

How about you, dear readers? Do you have any tech toys/tools that you’d find it hard to live and write without?

Wednesday, June 15, 2011

17th-Century English History Roundup

James Weeks, "Diggers for victory: 17th-century radicals inspire choral music," Guardian, June 9, 2011

As we wallow in our 21st-century mires of recession, environmental destruction and gluttonous children of a selfish and profoundly unequal society we seem to have no serious intention of reforming, it's salutary to read these bracing words from a distant, more hopeful time. In 1649, as parliament consolidated its triumph in the civil war and Charles I mounted the scaffold, Gerrard Winstanley and his band of True Levellers climbed St George's Hill, near Weybridge in Surrey, and began digging to cultivate the earth for food.>>>

"Nameberry: 12 best virtue names," Kansas City Star, June 13, 2011

In the 17th century, for some of the most puritanical of the Puritans, even biblical and saints' names were not pure enough to bestow on their children, and so they turned instead to words that embodied the Christian virtues. These ranged from extreme phrases like Sorry-for-sin and Search-the-Scriptures (which, understandably, never came into general use) to simpler virtue names like Silence and Salvation.>>>

Antoinette Kelly, "English aristocracy consumed the skulls of Irish killed in battle," Irish Central, May 26, 2011

The skulls of Irish who lost their lives during 16th and 17th century battles were ground up and consumed by the English aristocracy, as it was believed they could cure illnesses and heal wounds.

The claim is made in a new book "Mummies, Cannibals and Vampires", by the British academic Dr Richard Sugg, who is a lecturer at Durham University.>>>


George Webster, "Real 'pirate of the Caribbean' was funded by London elite," CNN World, May 25, 2011

Forget peg-legs, parrots and eye-patches -- the real pirates of the Caribbean were much more complicated.

According to an eye-opening new exhibition near the bank of London's river Thames, a number of Britain's most notorious buccaneers colluded with high-profile politicians and businessmen during the "golden age" of piracy in the 17th century.>>>

Franklin W. Knight, "The fading allure of revolution in the Caribbean," Jamaica Observer, June 8, 2011

With all the political agitation throughout the Arab World, there is resurgence in the use of the word revolution to describe the aspirations of the restless ones. But with few exceptions such as Venezuela, Ecuador and Cuba, revolution hardly emerges in the political discourse in the Americas. That is a pity, especially among Caribbean folk. After all, the Caribbean constitutes the region par excellence for revolutionary change. . . .

By the end of the 17th century, the word had passed into all the European vernacular languages. The English parliament used the phrase, "Glorious Revolution" to describe its overthrow of the Catholic King, James II, in 1688 and his replacement by the Dutch Protestant, William III of Orange-Nassau and his English queen, Mary II of England. But monarchical replacement hardly constitutes revolutionary change. This so-called "Glorious Revolution" was more a coup d'état than any profound change, so Jamaicans need not worry about replacing their monarchy with a republic.>>>

Tuesday, June 14, 2011

American Political Partisanship in Historical Perspective

Heather Cox Richardson

Peter Orszag had an article last week at Bloomberg arguing that political partisanship in America has increased dramatically in recent years because Americans have self-segregated their housing according to political leanings. Once in like-minded groups, he suggests, they tend to reinforce each other and drift toward extremes as individuals try to outdo each other in enthusiasm for their political affiliations.

This is an interesting theory. It suggests that our political inclinations are beyond our control and that society has spiraled into extremism through forces we cannot stop.

It would certainly be quite interesting to Americans of the 1840s and 1850s, whose partisanship was so extreme that congressmen took guns to the House of Representatives to protect themselves, settlers in Kansas and Missouri murdered each other in their beds, and millions of men killed off several hundred thousand of each other before deciding to call it quits. Who knew that when they moved West, setting up shelters wherever they found good land, antebellum Americans were unconsciously segregating into political neighborhoods?

There is a much more obvious and more plausible explanation than political segregation for the increases in political partisanship that have occurred with pretty cyclical regularity in American history. It is an explanation that suggests that partisanship and compromise are both deeply embedded in the American political tradition.

Rising politicians need to be able to attract attention. To that end, they need to distinguish themselves from the successful politicians who hold power. When those senior politicians have emphasized compromise, aspiring politicians have attacked them and advocated more extreme positions. Extremism begets extremism until the system becomes utterly dysfunctional. At that point, aspiring younger politicians can attract attention by advocating not extremism, but compromise.

This cycle of compromise to partisanship to extremism to compromise has turned over again and again in American history.

To see how this works, let’s look at the first generation of professional politicians in America: the Jacksonians of the 1830s. The men who wanted to put Andrew Jackson in the White House needed a way to garner support for a man who was widely regarded as volatile and a rather dim bulb. How could they elevate him when men like Henry Clay, the Great Compromiser, controlled the political scene? By viciously attacking Clay and men of his ilk with unfounded accusations, decrying compromise as weakness, and building a constituency that despised the very art of compromise Clay performed so well.

How then could the next generation of politicians opposed to Jackson’s Democratic Party build its own constituency? By attacking the Democrats, of course.

As political leaders squared off, the newspapers that supported them echoed their rhetoric. There, and not in unconsciously politically segregated communities, individual editors turned up the heat of extremism. Each tried to outdo the competition to draw readership and the advertising dollars readers attracted. Partisanship rose as voters learned to value conflict rather than compromise.

As members of each party more and more often characterized their opponents as corrupt, dangerous, and evil, compromise became increasingly unthinkable.

We know how that turned out.

But the need for politicians to distinguish themselves from their predecessors can serve compromise as well as conflict. When partisanship has become more important than actual governing, the government ceases to function in any competent way. Astute younger politicians can then build careers by promising to compromise with opponents to create solutions that make the government work again. Theodore Roosevelt, for example, recognized that voters were frustrated by the extremism of the late nineteenth century that had paralyzed government just when the nation was desperate for solutions to the crises of industrialism. Roosevelt created an image of himself as bipartisan, willing to side with Democrats even against members of his own party to do what was good for the country (a position that infuriated old-fashioned Republicans and Democrats both). Roosevelt was not alone, though. His construction of a politics of compromise was part of the reaction of his generation to the partisanship of the previous generation. That premium on compromise produced the bipartisan Progressive Era.

Which, in turn, was followed by the growing extremism of the 1920s . . . and so on.

These political swings have been part of American society since at least the 1830s. They are not about living quarters. While housing patterns may reflect the current political values of the national culture, Americans are not first self-segregating politically and then self-integrating politically every few generations. What they are doing, though, is listening to their political leaders, reading the news, watching TV, and now, using the internet. What they hear drives their attitudes toward politics.

Far from being a reflection of living patterns created without our conscious control, partisanship and compromise are both deliberate decisions made by political leaders.

Monday, June 13, 2011

Revere, Revisited

Chris Beneke
Now that public interest has shifted to the contents of Sarah Palin's email account, it appears that the dust has settled on her imaginative reconstruction of Paul Revere’s Ride. It was fun while it lasted. The high point may have been Stephen Colbert’s demonstration of how Revere could have rung a bell and fired multiple warning shots from a front-loading (single shot) musket, while riding on a rocking, coin-operated steed.
The editors of Revere’s once relatively sedate Wikipedia page were kept very busy with this extra attention. Palin supporters descended upon them with Palin-friendly edits. Then the gawkers, like me, stopped for a look. The page saw as many as 140,000 visitors on June 6.
At least we were all motivated to learn something about Paul Revere and the American Revolution (how many of those 140,000 were history professors and teachers making sure they had their stories straight?). The chief authority on this topic might be David Hackett Fischer, author of the magisterial book with the deceptively quaint title Paul Revere’s Ride. But Fischer appears to have (wisely) made himself scarce during this controversy.
Though the subject is one on which very few, outside of Minute Man National Historical Park, are expert, Palin’s Revere comments gave some very respectable historians and pundits a chance to address the public on an early American history topic and to reflect more broadly on our commitment to education.
Here, forthwith, is a brief snapshot of the historically informed media attention:
In the New Yorker, Jill Lepore described Revere’s ride as a form of “hyperlore, which passes from one computer to the next, along a path best called hyperbolic.” Lepore provides a helpful link to Revere’s 1775 deposition for the Massachusetts Provincial Congress, which is held at the venerable Massachusetts Historical Society. It’s well worth the few minutes it takes to read Revere’s account, charmingly laden with contemporary expressions and the variety of spellings for which early Americans are justly known.
On Salon.com, Andrew Burstein and Nancy Isenberg were in grading mode, awarding Palin an "'F' on the Paul Revere quiz." They continue: "Okay, Sarah. Here's your guide to what you need to know about Paul Revere. He did not ring bells or fire warning shots. He did not warn the British. He did not defend ‘freedom.’ And he did not yell, ‘The British are coming!’ because he was a British subject in 1775. As Professor David Hackett Fischer explained in his book 'Paul Revere's Ride,' Revere would have shouted, ‘The regulars are coming!’ That is, the regular army. Americans in and around Boston were called ‘country people.’ Revere was not defending a nation, because the nation we became did not exist yet. Before the phrase ‘United States of America’ was born with the Declaration of Independence, those resisting British power, identifying with the Continental Congress, were collectively known as the ‘United Colonies.’"
Burstein and Isenberg’s larger point is that Palin, who lacks “a basic respect for knowledge,” should be, but is decidedly not, “embarrassed by her ignorance.”
Robert Allison was more sympathetic to Palin in his New York Daily News op-ed, finding several nuggets of truth in her understanding of Revere’s Ride: “[S]he was, in a sense [right]. Revere, in fact, was warning the British Empire—of which Massachusetts was part—that it could not invade the rights of Americans. Revere himself did not ring bells or fire shots, but the colonists he alerted did. The British troops beginning their march westward heard the bells, and knew the alarm was out. The rest of it—the warning about being secure and being free—was metaphorical . . .”
Allison’s larger point is that historians should take responsibility for failing to educate the public and be grateful for this opportunity to share what they know. "Sarah Palin is not a historian. . . . She is a politician, and quite emphatically a representative of ‘ordinary Americans.’ If her reading of Revere is too subtle for the professoriate, and if she comes across to many as woefully misinformed after visiting these sites, whose fault is it? Hers, or ours, as tour guides and historians?"
Acknowledging how much we all could do with more learning, Pulitzer Prize-winning commentator Leonard Pitts, Jr., observes that "while it is comforting to think Palin’s gaffe speaks only to her own considerable limitations, it is also short-sighted. The evidence suggests that she is less an exception to, than a reflection of, a nation that is in the process of forgetting itself."
Which, I now editorialize, makes Congress’ recent decision to gut the Teaching American History (TAH) program especially disappointing. I just assisted with a TAH proposal and was looking forward, as part of the proposed program, to taking local elementary school teachers on a tour of the battle sites at Lexington and Concord, showing them where Paul Revere was likely captured (and where he warned the Regulars that colonial militia were mustering), as well as discussing Paul Revere’s Ride with them.
The alarm has rung, but getting the actual message out will now be more challenging.

Friday, June 10, 2011

Less is More in Elgin Park, and in Writing

Heather Cox Richardson

Michael Paul Smith has been described as the “Mayor of Elgin Park,” a town he has created entirely through photographs posted on the internet. Elgin Park is a town in the American Midwest, constructed as if it were the 1950s, without any inhabitants. In the photographs visible on the web, it appears to be a real town. But Elgin Park exists only in pixels.

My first reaction to the models created by Mr. Smith was to be a bit creeped out. The recreation of a “perfect” model of an imaginary idyllic past, documented in photographs, seems too close for comfort to the world of History as Fantasy that historians so abhor.

But looking at Mr. Smith as an artist rather than looking at his art as historical representation offers an interesting perspective on writing history.

Mr. Smith explains that he works hard to make sure he does not provide too much information in his images. He leaves room for the viewer to project himself or herself into the photograph, using his or her own eyes and emotions to fill in details.
“Things visually ‘read’ better when the amount of information is kept in check,” Mr. Smith notes. “The brain / eye / emotions will fill in the details, even when there is minimum amount of data available. On the other hand, there can be too much information. When that happens, you end up with a literal representation of something and very little room for personal interpretation. The more the viewer can project themselves into something, the more powerful it becomes.”

This struck a chord with me because it is precisely what my wonderful editor hammered home when we worked together on a recent project. She insisted on chopping all my sentences in half. While I worried the resulting simplicity would insult readers by suggesting I thought they were stupid, she held her ground and told me the book itself read better with very simple prose. I came, eventually, to see that long, complicated sentences and drawn-out paragraphs commandeer all a reader’s attention, forcing him or her to work at deciphering the mechanics of the prose. This can be useful if the writer’s idea is to focus, as certain theoreticians do, on words and their meaning. But for historians exploring other aspects of our field, it serves no real purpose. With complicated writing, a story never comes to life. Instead, it sits stubbornly on the page, imprisoned in a tangle of words.

Simple sentences, like Mr. Smith’s uncluttered images, free a reader’s mind to fill in the ideas and the emotions of a story.

Laura Ingalls Wilder, the author of the Little House books, was a master of creating evocative scenes with very simple sentences. Here, for example, she describes a department store in Dakota Territory:

The inside of the store was all new, and still smelled of pine shavings. It had, too, the faint starchy smell of bolts of new cloth. Behind two long counters, all along both walls ran long shelves, stacked to the ceiling with bolts of muslin and calicoes and lawns, challis and cashmeres and flannels and even silks.

There were no groceries, and no hardware, no shoes or tools. In the whole store there was nothing but dry goods. Laura had never before seen a store where nothing was sold but dry goods.

At her right hand was a short counter-top of glass, and inside it were cards of all kinds of buttons, and papers of needles and pins. On the counter beside it, a rack was full of spools of thread of every color. Those colored threads were beautiful in the light from the windows. (Little Town on the Prairie, 1941, p. 48).

In nine sentences, she suggests the look and feel of a brand-new store, the excitement it generates, and the isolation and poverty in which Laura has always lived. Wilder has left room for her readers to imagine the scene, rather than forcing us to use all our mental energy on her prose.

While a simple style is certainly not the only way to write evocatively, it is one that historians, especially beginning historians, should not shun out of fear that they will look stupid unless they write in tangles. As Mr. Smith says, less can often be more.

Thursday, June 9, 2011

Editing over the Decades

Randall Stephens

I've worked, at most, seven years on a single project. But I'm just one person, toiling on my books and articles. The scholars at the Jefferson Papers (Bland Whitley is one of them) have been editing away for over 60 years. I wonder how long the researchers at Yale have been doing the same with the Jonathan Edwards Papers. There must be a record for the longest ongoing project. 100 years? 200 years? 1,000 years?

See the June 6 article in the NYT, "After 90 Years, a Dictionary of an Ancient World," by John Noble Wilford. Writes Wilford: "Ninety years in the making, the 21-volume dictionary of the language of ancient Mesopotamia and its Babylonian and Assyrian dialects, unspoken for 2,000 years but preserved on clay tablets and in stone inscriptions deciphered over the last two centuries, has finally been completed by scholars at the University of Chicago."

And the dictionary is more of an encyclopedia than simply a concise glossary of words and definitions. Many words with multiple meanings and extensive associations with history are followed by page after page of discourse ranging through literature, law, religion, commerce and everyday life. There are, for example, 17 pages devoted to the word “umu,” meaning “day.”

The word “ardu,” for slave, introduces extensive material available on slavery in the culture. And it may or may not reflect on the society that one of its more versatile verbs was “kalu,” which in different contexts can mean detain, delay, hold back, keep in custody, interrupt and so forth. The word “dīnu,” like “case” in English, Dr. Cooper pointed out, can refer to a legal case or lawsuit, a verdict or judgment, or to law in general.>>>

The dictionary set costs a fortune, but can be downloaded for free in PDF form. So, if you're hankering to know what the earliest recorded wisdom was on love, food, work, law, and more, browse away!