Monday, February 28, 2011

Screening the Past: On History Docs, 1960s Counterculture Films, and Online Abundance

Randall Stephens

Werner Herzog once quipped: "It's all movies for me. And besides, when you say documentaries, in my case, in most of these cases, means 'feature film' in disguise." Perhaps that's a post-mod nod to relativism. It is true that there are documentaries and there are "documentaries," just as there are feature films and "feature films." Be wary about which ones you use in a history class. The History Channel's series of films on ancient aliens stands in contrast to episodes of PBS's American Experience. (Do history students know the difference?)

I've seen a few history-related documentaries in the last few months that are as enchanting as many feature films. I've also seen some feature films that play like weird documentaries, or period pieces, trapped in the amber of time.

While The Unseen Alistair Cooke: A Masterpiece Special first aired on PBS in 2008, I hadn't seen it until a couple of weeks ago. The film "chronicles Cooke's decades in America, friendships with Hollywood icons, celebrated journalism career and years as host of Masterpiece Theatre. Marking the November 2008 centennial of his birth, The Unseen Alistair Cooke: A Masterpiece Special turns an admiring eye on the master observer." It's a captivating story of an endlessly fascinating character. For anyone interested in exploring how Brits view Americans and vice versa, this is a fun one.

The latest installments of American Experience include some films that tie in to anniversaries. On the Civil War front, the Robert E. Lee bio aired in January. On February 28 Triangle Fire will run on PBS, marking the 100th anniversary of that tragedy. HBO will be showing a similar documentary. "The PBS special is affecting," writes Aaron Barnhart at the Kansas City Star, "much more so than the overly talky HBO documentary on the fire airing next month. PBS also takes more liberties with the facts. In 1909, about a year earlier, the Triangle ladies had led a walkout that quickly spread to other Garment District shops. Crucially, some of New York’s leading aristocratic women, such as Anne Morgan (daughter of J.P.), joined them."

Moving forward in time and genre . . . I watched the 1970 film Getting Straight, which seems strangely proud of its relevance and countercultural bona fides. (In full here.) The film stars that ubiquitous actor of the Me Decade, the hirsute Elliott Gould, as a former campus radical who returns to school with a single-minded purpose: He wants his degree and his fun, with no political strings attached. Complete with chamber-pop hippie soundtrack, Getting Straight features a young Candice Bergen, an even younger Harrison Ford, and an array of stock characters playing Baby Boomer roles. There's the Dionysian stoner, the African-American radical (who demands a black studies department), and various libertines and longhair sign carriers. It almost has the feel of a clunky play, with the youngsters squaring off against the hopeless, old squares in the admin. (Watching it, I was reminded of Christopher Lasch's line about the era: "Even the radicalism of the sixties served, for many of those who embraced it for personal rather than political reasons, not as a substitute religion but as a form of therapy" [Culture of Narcissism, 33].) Getting Straight's cinematography borrows heavily from pop art in some visually appealing ways. The filming is playful, even silly at times. A great period piece, which can, at times, be grating.

I'm always on the lookout for 60s films that can be used, in bits, in class. Maybe next time I teach America in the 1960s I'll use a clip or two from the Monkees' colossal psychedelic bomb, Head (1968), or Roger Corman's The Trip (1967). The Graduate (1967) or The Strawberry Statement (1970) (watch the latter in full here) might work as generational flicks in ways that Getting Straight would not.

Much, much, much can be tracked down on YouTube or on the "watch instantly" feature on Netflix. Selections from two well-made rock docs, both on Netflix, come to mind: Who Is Harry Nilsson (And Why Is Everybody Talkin' About Him?) (2010); and Rolling Stones: Stones in Exile (2010).

On the accessibility of film clips, performances, and historical movies, Dan Chiasson observes in his review of "Keef's" new biography ("High on the Stones," March 10 NYRB):

Anyone reading this review can go to YouTube now and experience Muddy Waters, or Chuck Berry, or Buddy Holly, or the first Stones recordings, or anything else they want to see, instantly: ads for Freshen-up gum from the Eighties; a spot George Plimpton did for Intellivision, an early video game. Anything. I am not making an original point, but it cannot be reiterated enough: the experience of making and taking in culture is now, for the first time in human history, a condition of almost paralyzing overabundance. For millennia it was a condition of scarcity; and all the ways we regard things we want but cannot have, in those faraway days, stood between people and the art or music they needed to have: yearning, craving, imagining the absent object so fully that when the real thing appears in your hands, it almost doesn’t match up.

It all makes screening the past in the history classroom much easier, though it also means more choices than ever to sift through.

Friday, February 25, 2011

Numbers

Chris Beneke

And the LORD hearkened to the voice of Israel, and delivered up the Canaanites; and they utterly destroyed them and their cities. (Numbers 21:3)

Estimating mass deaths requires an accounting in which the sums are always grievous. For the twentieth century, the numbers are so terribly large that the researcher could be tempted to discount the meaning, and the value, of each one. For those who study other times, the scale of the twentieth century’s killing is both humbling and edifying. In the early hours of April 19, 1775, eight provincials were killed on Lexington Green; in the spring and early summer of 1994, eight hundred thousand Tutsis were killed in Rwanda. We’re certain about the number killed in Lexington on that morning. We’ll probably never know how many Rwandans were killed in those grim months. The lives lost in the former were no more dear than the lives lost in the latter. But proximity, collective memory, and relative connection to transformative events can easily obscure that moral truth. As Yale historian Timothy Snyder puts it:

Discussion of numbers can blunt our sense of the horrific personal character of each killing and the irreducible tragedy of each death. As anyone who has lost a loved one knows, the difference between zero and one is an infinity. Though we have a harder time grasping this, the same is true for the difference between, say, 780,862 and 780,863—which happens to be the best estimate of the number of people murdered at Treblinka. Large numbers matter because they are an accumulation of small numbers: that is, precious individual lives.

Snyder’s essay in the latest New York Review of Books (“Hitler vs. Stalin: Who Killed More?”) focuses on death tolls but also addresses moral culpability. In some cases, as when the Nazis gassed Jews or the Stalinists shot dissidents, the evil and its sources are transparent. On this ledger, the Germans were directly responsible for the death of 11 to 12 million noncombatants (depending on the measure), and the Soviets 6 to 9 million. But these figures do not include the millions who died on the battlefield, nor the millions who starved as a result of invasion, collectivization, and murderous indifference. Snyder’s striking point here and in the bestselling book from which the essay is derived is that the “most fundamental proximity of the [Hitler and Stalin] regimes … is not ideological but geographical.” In the great mass of land between Berlin and Moscow, where colossal armies clashed in both World Wars I and II, “we must take seriously the possibility that some of the death and destruction wrought in the lands between was their mutual responsibility.” Killing there was multiplicative, rather than merely additive.

Moving from west to east, Roderick MacFarquhar’s review of Frank Dikötter’s Mao’s Great Famine: The History of China’s Most Devastating Catastrophe, 1958-1962 (NYRB, Feb. 10) presents some chilling tallies as well. According to Dikötter, the longstanding estimate of 30 million deaths attributed to China’s Great Leap Forward needs to be revised upward to “a minimum of 45 million.” These deaths resulted primarily from food shortages, rather than secret police forces, mobs, machine guns, or forced labor camps. At the same time, MacFarquhar observes, the “exploitation of the peasantry during the GLF and into the famine was so unprecedently excessive that provinces were left with virtually no food for the people who had produced it.” As with the terror unleashed by the Nazis and the Stalinists, the relative degree of culpability may be in dispute, but the fact of incalculable loss is not.

Thursday, February 24, 2011

Middle East History Roundup

Ishaan Tharoor, "A History of Middle East Mercenaries," Time, February 23, 2011
. . . . Foreign warriors were valued by monarchs wary of their own restive populations and the rivalries and jealousies of local nobles. The great empires of the Middle East all boasted a rank of soldiers drawn (or abducted) from abroad. The Ottomans had the janissaries, mostly young Christians from the Caucasus and the Balkans, who converted to Islam and were reared from an early age to be the Sultan's elite household troops, often forming a powerful political class of their own in various parts of the empire. Elsewhere, the Mamluks, slave warriors from Africa to Central Asia forced into service by Arab potentates, managed to rule a large stretch of the modern Middle East from Egypt to Syria for some 300 years, repulsing the invasions of European crusaders as well as the Mongol hordes.>>>

John Melloy, "Middle East Mirrors Great Inflation Revolutions Since 1200 AD," CNBC, February 23, 2011
Inflation has led to political revolutions since Medieval times and we may be witnessing the fifth such great revolution in history unfolding in the Middle East and in our own country right now, said Dr. Ed Yardeni, president and chief investment strategist of Yardeni Research.

Yardeni cites the work of historian David Hackett Fischer, who described civilization’s first four major inflation cycles in his 1999 work The Great Wave: Price Revolutions and the Rhythm of History.>>>

"Different Meanings Of Democracy For West, Middle East," NPR, February 5, 2011
The chants, chaos and cries from the streets of Cairo and other cities in Egypt this week revive questions for historians and political scientists that politicians have to answer with practical policies. Host Scott Simon speaks with Dr. J. Rufus Fears, a historian and Classics scholar at the University of Oklahoma, about western concepts of democracy and the events now sweeping Egypt and the Middle East.>>>

Robert Darnton, "1789—2011?" NYRB blog, February 22, 2011
The question has come to haunt every article and broadcast from Egypt, Tunisia and other countries in the region whose people have revolted: what constitutes a revolution? In the 1970s, we used to chase that question in courses on comparative revolutions; and looking back on my ancient lecture notes, I can’t help but imagine a trajectory: England, 1640; France, 1789; Russia, 1917 … and Egypt, 2011?>>>

Ibrahim Al-Marashi, "The Arab World's Leadership Deficit," History and Policy (February 2011)
The Arabs have few victories to claim, going back a millennium, all the way to 1187 to celebrate a leader, Salah al-Din and his victory in Jerusalem during the Crusades. What remains after that date are only a few de facto victories. Victories defined in terms of survival. In 1956, when the Egyptian President Jamal Abdul Nasser lost a war against Britain, France and Israel, the Arabs claimed it a victory because he stood up to the 'West'. Even then, the highly popular Egyptian leader was feared among the elites in Jordan, Iraq and Saudi Arabia. When Saddam Hussein was soundly defeated by Coalition forces in the 1991 Gulf War, the Iraqi leader claimed it a victory because he stood up to the 'West' and survived.>>>

Wednesday, February 23, 2011

Abraham Lincoln's Bank War, or Whigs Leaping out of Windows

Randall Stephens

Everyone now knows the story of how Wisconsin state senators stopped the wheels of government, for a moment at least, by getting out of the state. Over at the Chicago Tribune Eric Zorn points out another such instance ("As a State Legislator, Lincoln Tried to Play the 'Run Away' Game, too," Chicago Tribune blog, Feb 21, 2011). In 1840 Lincoln and the Illinois Whigs tried the same exit strategy against the then-dominant Democrats. (Zorn quotes from Gerald J. Prokopowicz's book on the topic.) The Whigs hoped to build canals and railroads throughout the state. And then the Panic of 1837 set in. The Democrats and the Whigs squared off on the matter of the Illinois State Bank. What transpired played out some of the national themes of the Jacksonian era.

A bit more on it from Alexander Davidson and Bernard Stuvé, A Complete History of Illinois from 1673 to 1873 (Springfield, 1874), 423.

Parties in Illinois became almost divided upon the subject of the banks. Nearly all the leading democrats opposed them and the acts legalizing their suspensions, although they were authorized and their capital stocks were increased irrespective of party. The whigs were called bank-vassals and rag-ocracy, and charged to be bought and owned by British gold. The bank officers were sarcastically denominated rag-barons; and the money was called rags and printed lies. The whigs retorted that the democrats were disloyal, and destructive of their own government; that the banks were the institution of the State, and to make war upon the currency was to oppose its commerce and impede its growth and development. Although parties were in a measure divided upon the banks, with the democrats largely in the majority, this was not without benefit to those institutions. It gave them unswerving friends. Besides, the merchants and business men of that day were, with rare exceptions, whigs, who gave currency or not to the money as they pleased. Partisan zeal led them to profess that the banks were not only solvent, but that they were unduly pursued, and that the opposition to them was nothing but absurd party cry.

When the suspensions of the banks was legalized again in 1839, it was to extend until the end of the next general or special session of the general assembly. The legislature for 1840-41 was convened two weeks before the commencement of the regular session to provide means to pay the interest on the public debt, due on the first of January following. . . . The democrats now, however, thought that their time of triumph had arrived. It was by them contended, that that portion of the session preceding the time fixed for the regular session to begin, constituted a special session, and if the suspension was not further extended, the banks would be compelled to resume specie payment on the day the regular session should begin or forfeit their charters and stop business. Upon the other hand, it was contended that the whole constituted but one session. Much party animosity was, besides, manifested at this session. The fate of the banks seemed to hang upon the motion pending to adjourn the first part of the session sine die. It was perceived that the motion would prevail. To defeat it in the House, the whigs now essayed to break the quorum. But the doors were closed, a call of the House ordered, and the sergeant at arms sent in quest of the absentees. The whigs, being thus cut off from the usual avenues of retreat, bounded pell mell out of the windows, but without avail—enough were held in durance to make a quorum, and the sine die adjournment was carried.

Tuesday, February 22, 2011

Media and Messages

Dan Allosso

Long before I ever thought of going into the history game, I worked in the computer industry. “Long before” is an interesting issue of periodization. It was about twenty years ago: less than a human generation, but ten or fifteen computer generations of Moore’s Law. In that time, transistor counts on the central processors of computers have risen from a few hundred thousand to a few billion. What does that mean to historians?
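The arithmetic behind that jump is easy to sketch. Here is a minimal back-of-the-envelope calculation in Python; the starting transistor count and the doubling intervals are round-number assumptions for illustration, not exact chip specs:

    # Moore's Law arithmetic: transistor counts double roughly every
    # 18-24 months. The starting figure is an assumed order of magnitude
    # for a late-1980s CPU, not an exact spec.
    start_transistors = 300_000
    years = 20

    for months_per_doubling in (18, 24):
        doublings = years * 12 / months_per_doubling
        end_count = start_transistors * 2 ** doublings
        print(f"{months_per_doubling}-month doubling: "
              f"{doublings:.1f} doublings -> ~{end_count:,.0f} transistors")

Depending on the doubling interval you assume, twenty years takes a few hundred thousand transistors to somewhere between a few hundred million and a few billion.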

I think the biggest change for me as a historian is that content has really become king. The ability to store and move huge volumes of data cheaply and effortlessly has changed the game for people who want to communicate their ideas with others. Network bandwidth and processing power have very visibly led the way, but storage technology has improved just as incredibly. And it’s the ability to store information that makes the whole thing work.

When I started building “clone” computers in the late ‘80s, we were putting 32 Megabyte Seagate hard drives in them. The ST-138R was a physically small drive by the standards of the day, measuring just 3.5” by 1.66” by about 5.25” and weighing a couple of pounds. An OEM could buy them for about $150, making them an attractive entry level drive. We also sold higher capacity drives, but the 32 MB drive and its 65 MB big brother were the “sweet spot,” the best deal on a dollars-per-megabyte basis.

32 Megabytes would hold a lot of text. Average word processor files used a couple dozen kilobytes per page, as they still do today. So you could write to your heart’s content. But there wasn’t a lot of room to store your research. A text-only copy of a decent-sized book (say James Joyce’s Ulysses on Project Gutenberg) took up a Megabyte and a half. So you’d only get about twenty of those on your disk—how you’d get them there was another issue, but we’ll skip over that. And if you were able to get your hands on high resolution images, you’d be lucky to store half a dozen.
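For anyone who wants that arithmetic spelled out, here is a quick sketch in Python using the rough figures above; the 5 MB per high-resolution image is my own assumed figure for illustration:

    # How much fits on a 32 MB drive? Sizes are the rough figures from
    # the text: ~25 KB per word-processor page, ~1.5 MB for a plain-text
    # Ulysses, and an assumed ~5 MB per high-resolution image.
    DRIVE_MB = 32
    PAGE_KB = 25
    BOOK_MB = 1.5
    IMAGE_MB = 5.0

    print("pages of writing:", int(DRIVE_MB * 1024 / PAGE_KB))  # ~1,300 pages
    print("plain-text books:", int(DRIVE_MB / BOOK_MB))         # ~21 books
    print("high-res images:", int(DRIVE_MB / IMAGE_MB))         # ~6 images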

Since that time, hard disks have gotten faster, smaller, and cheaper. It’s now possible to spend the same type of money that once bought an ST-138R, and get a disk that’s smaller in size but nearly a hundred thousand times larger in capacity. Or, if you prefer the ultimate in portability to the ultimate in capacity, you can dispense with disks entirely and store your data on chips. For less than $50, you can carry 32 Gigabytes of data on your keychain.

I recently moved my dissertation project to just such a device. I now carry a 32 GB flash drive that holds all my writing, as well as all my research. In what would have taken a thousand ST-138Rs (that late-80s drive), I can store ten thousand high resolution photos or scans, thousands of books, and all the writing I’ll ever do on this project. Think of a thousand hard drives. Think of the electrical current they drew. They would have filled a room and heated your home. I carry this thing in my pocket (it’s backed up at home and at the office), and it allows me to always have the most recent versions of my work at my fingertips. It plugs into the USB port of whatever computer I happen to be sitting in front of, and transfers my data so fast I can’t tell it’s not on my local drive.
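Running the same arithmetic for the 32 GB flash drive shows where the figures in this paragraph come from; the 3 MB per high-resolution scan is an assumed average, just for illustration:

    # The same arithmetic for a 32 GB flash drive. The ST-138R held 32 MB,
    # a plain-text book runs ~1.5 MB, and a high-resolution scan is assumed
    # here to average ~3 MB.
    OLD_DRIVE_MB = 32
    FLASH_MB = 32 * 1024
    BOOK_MB = 1.5
    SCAN_MB = 3.0

    print("ST-138Rs replaced:", FLASH_MB // OLD_DRIVE_MB)  # 1,024 drives
    print("plain-text books:", int(FLASH_MB / BOOK_MB))    # ~21,800 books
    print("high-res scans:", int(FLASH_MB / SCAN_MB))      # ~10,900 scans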

As I’m researching and writing, I can’t help thinking that, although I want this work to come out as a regular, old-fashioned, paper-and-cardboard book, my writing and all the supporting primary evidence in its original form fit on a chip. I can’t help but believe that in the long run, this will change the way we do our research, write our histories, and communicate them to other people. The only question in my mind at this point is, will that “long run” be measured in human years, or computer generations?

Monday, February 21, 2011

Is Your Teaching Stuck in an Industrial Paradigm?

Jonathan Rees

A few weeks ago Heather Cox Richardson recommended a video embedded in a post on this blog. I’ve been kind of freaked out about what I heard and saw on it ever since. In it, among other things, Sir Ken Robinson (a guy who I can tell you literally nothing about other than the fact that he’s obviously much smarter than I am) suggests that education, as we know it, is organized along the lines that factories were during the mid-nineteenth century.

Time periods are divided by ringing bells. The instruction in particular subjects is neatly divided into different rooms. Children are brought through the system in batches based upon how old they are. This educational system that we all take for granted was conceived, Robinson suggests, in the image of factories in order to produce people to work in factories.

For me, the idea that I’m doing anything along the lines of a factory is deeply disturbing. Had you asked me why I wanted to be a professor before I started graduate school, I might actually have said in order to be sure that I would never have to work in a factory. I study labor history in large part because I have such great respect for the people who did work so much harder than I do for much less reward. And yet, I don’t want my classroom to resemble a factory setting in any way!

Sometimes, though, I know that factory thinking rears its ugly head while I’m teaching. Whenever I get in one of those funks brought on by a large batch of uninspired answers coming from the students in front of me, I always imagine myself as Brian in that scene from Monty Python’s Life of Brian where he addresses all his new followers from a window.

“You are all individuals,” he tells them.

“We are all individuals,” they reply in unison.

“You are all different.”

“We are all different.”

“I’m not,” says a guy in the right foreground, just to be difficult.

How do we get more students to think for themselves, even if (like that difficult guy in the foreground) they don’t even realize that they’re doing it? Robinson, who’s mostly discussing secondary school students, seems to be suggesting that the best way to break the paradigm is to give up on standardized testing. Don’t measure output. Measure creativity. Create an incentive system in the classroom designed to foster creativity—the same kind of creativity that kids see in the new electronic media that surrounds them every moment of every day other than when they’re in school.

Leaving the current assessment craze in higher ed aside, trying to break the paradigm in the college history class seems like a much more difficult task than it would be for secondary schools, as the vast majority of the colleagues I know would already rather retire than ever hand their students a standardized or multiple-choice history test. We grade on composition, not memorization, but an essay produced as part of a system conceived along the lines of a factory probably isn’t the best possible essay it can be.

So what can we do to foster creativity in our students other than just shout “Be creative!” and hope we don’t get a response like “How shall we be creative, oh Lord!”? (That’s a variation on another Life of Brian joke there, by the way, but I can’t explain it on a family-friendly blog.)

Trying to make myself feel better, I found it wasn’t too hard to think of a few things I’ve already done that at least in theory promote this effect. For instance, I’ve tossed out the textbook this semester (and have been blogging about it here). You can’t get much more top down than most textbooks, with their declarations of what happened coming from an omniscient narrator with the voice of God. No ambiguity. No nuance.

But now I feel like I should be doing more. Robinson alludes to collaborative work and implies that more interdisciplinary instruction can be done, but alas doesn’t suggest how. So what are you doing to break down the education/industrial paradigm or have you (like me) not yet fully come to terms with the fact that you’re perpetuating it?

Jonathan Rees is Professor of History at Colorado State University - Pueblo. He blogs about history, academic labor issues and other matters at More or Less Bunk.

Saturday, February 19, 2011

Editors . . . Editing?

Randall Stephens

Last week Alex Clark wrote of the "Lost Art of Editing" in the Guardian. Presses have been cutting back for some time now. "Many speak of the trimming of budgets," notes Clark, "the increasingly regimented nature of book production and of the pressure on their time, which means they have to undertake detailed and labour-intensive editing work in the margins of their daily schedule rather than at its centre." A freelancer Clark consulted told her: "'big companies used to have whole copy-editing and proof-reading departments. Now you'll get one publisher and one editor running a whole imprint.'"

Clark's mostly talking about literary fiction here. But the cutting back on editing--line, copy, content--is something I've heard about repeatedly from historians and editors at university and trade presses.

You can do a thing or two to counter the trend. Have multiple historians, experts in your field, read your work. Getting far more than your two MS reviewers to take a look will be a big plus. And readers will probably be happy to have you return the favor at a later date.

See if you can get a second copy editor to go over your manuscript. I was able to work this out with Harvard Univ. Press for my first book, The Fire Spreads: Holiness and Pentecostalism in the American South. It helped. The two copy editors caught loads of grammatical infelicities, leaps in logic, spelling mistakes, etc. that I didn't have eyes to see. (It's even worth shelling out the money for an extra copy editor, if you can afford the $500 or so.)

And, finally, ask the people you speak with at the press whether editors do much "editing." How will your editor help you shape your MS? Talk to authors who have worked with that editor in the past to see what goes into the process. Does she have a hands-off approach? Will she help you craft your argument and ask for important revisions?

Friday, February 18, 2011

Plagiarism: Getting the Point Across

Heather Cox Richardson

Over the years, I’ve tried everything I can to warn students away from plagiarizing. I explain, cajole, and threaten. I even have a set performance attacking plagiarism in the middle of the semester (the Plagiarism Lecture ought to win me an acting award).

It appears those of us who are soldiers in the war against plagiarism now have a new weapon in our arsenal (from the University Library at the University of Bergen, Norway):

(If captions don’t appear immediately, click the cc button on the toolbar.)

There are pieces of this video that may be dicey for a classroom, but it does offer two crucial pieces of evidence that support our cause. First, it provides obvious proof to a student that plagiarism is not just the crazy hang-up of his or her particular teacher. It’s clear that a lot of money and time went into the making of this video. Second, it shows that plagiarism is hated everywhere, not just at a student’s particular school.

The way the video presents plagiarism as unacceptable is not how I present it. My own main point is that it is a profound version of theft. Still, educators I respect emphasize what the video does: that a student who plagiarizes cheats him or herself.

And in the era of Jon Stewart and Stephen Colbert, poking fun at the issue to make a serious point might just work.

Thursday, February 17, 2011

Think Borders Is Going Down the Tubes Because of e-Books? Not So Fast

Philip White

Today’s guest post comes from Philip White, a writer based in Kansas City. A Lion in the Heartland, his forthcoming book about Winston Churchill’s unlikely journey to Fulton, Missouri, to deliver the “Iron Curtain” speech, will be released by PublicAffairs in 2012.

So it’s now official–Borders, the real-life Fox Books (come on, admit that you’ve seen Tom Hanks hamming it up as a nationwide bookstore mogul in AOL vehicle You’ve Got Mail) has filed for bankruptcy.

Many a web-based postmortem is fingering electronic books, or e-books as the kids call ‘em, as the perpetrator. After all, supposed industry bellwether Amazon recently announced that e-books in its proprietary Kindle format are outselling paperbacks, though, as is the company’s wont, it conveniently neglects to mention how many of these e-books are free and avoids giving precise sales figures.

So are e-books to blame for the fall of this once-mighty purveyor of the printed word? Not to the extent that many are claiming.

Certainly, the sales of e-books are continuing to rise. Cheaper and better devices, the shameless self-promotion of the Kindle by Amazon on its homepage and atop many of its book listings, and cross-platform, multi-device support in the booming smartphone and tablet markets are just some of the reasons. Not to mention that a new e-book costs significantly less than its paper-based counterpart, and is, in many cases, available at the same time or before the hardcopy release.

But, for all the hype surrounding the rise of the e-book, many other factors contributed to the downfall of Borders, and most of its brick-and-mortar rivals. For the record, I will not be analyzing the viability of the online Borders offering, which may yet survive the follies of its physical location.

First to consider is one of the primary drivers of any consumer purchase: price. The reason that I get a coupon from Borders via e-mail each week–anywhere from 25 to 40 percent off a single item–is that the store often sells its wares at full list price. This can be above $30 for a new history hardback, and pushing $20 for a softback. Yikes! Meanwhile, Amazon, much to publishers’ chagrin, deeply discounts most of its titles from the get-go. No need to waste money on e-mail marketing and website customizations that promote special offers when your upfront prices are already the lowest (with some exceptions, like during the run up to Christmas). I have also, in my forgetful way, gone to my local Borders to make use of a coupon and, as Sod’s Law would have it, left the offending item on the kitchen counter. Many retail employees at other chains would help a poor, forgetful bibliophile out in such a situation by simply scanning a coupon from behind their particle-board checkout counter. A fine fellow at Half Price Books recently handed me back a 20 percent off coupon “In case you want to use it again later in the weekend.” Not so at Borders, at least in my experience. No, they’d rather belittle you and huff and puff for a few minutes before eventually yielding. Yay, customer service.

Then there’s the convenience factor, or, in the case of Borders and its ilk, the lack thereof. Option one: Drive 20 minutes across town to a big box store strip mall, and, once there, weave in and out of the parent-with-screaming-kids coming out of Old Navy and the scary-looking, 2-by-4-wielding fellow who looks like he wants to hurt this puny bibliophile for slowing his exit from Home Depot. Then spend 15 to 20 minutes, if I’m lucky, looking for a book that may or may not be in stock. Calling ahead doesn’t help, and even if it did, it’s just one more step. If, praise the Lord, the book is on a promotional stand, there is a brief moment of joy–3 for the price of 2!– followed by the horrendous realization that every other book on that neatly stacked table is, bar none, utter tripe. So, even if I reluctantly, begrudgingly make the decision to buy the book anyway, I have to wait in line for who knows how long. The reward? Dealing with the not-so-friendly, can’t-wait-to-get-home-after-another-crappy-shift “sales associate” who hosed me on the coupon the previous week. Repeat the parking lot debacle. Waste yet more time (and gas) on the drive home.

Option two: From my tablet or desktop, log onto any online bookseller’s site. Time elapsed: 1 second. Yep, the higher speed home broadband’s worth every cent. Search for the book. Buy with one click. E-mail confirmation received. E-mail with reading recommendations customized to my purchase history to follow. True, there is the wait for shipping, but that’s not the point–it takes far less time and is far less a hassle to buy online. And if I, the Luddite, ever bury my suspicions and embrace the e-book revolution, my text will be delivered in a minute or two. I could, of course, select borders.com as my online bookseller of choice. But that will (or should I say, would, for we speak of the deceased) do nothing to support a bloated, unsustainable network of characterless warehouses that happen to house books. (Note to corporate execs everywhere–adding in-store “cafés” that sell mediocre coffee at inflated prices does not a comfy shopper experience make.)

“But what of the tactile, in-store browsing experience?” I hear you ask. True, I do enjoy it. At local, independent stores (selling used or new books) or chains such as Half Price Books, where the proprietors and staff members know their Stephenie Meyer from their Ernest Hemingway. At Borders, there was always the same experience I got at every big box store. Employees who like the company health plan but despise every other aspect of their job, and have no idea who John Lukacs is, let alone where to find his latest book. Does that apply to each and every staffer? No, of course not, but when it’s the case with even 30 percent, that’s a problem, and the percentage is far greater than that. Indie bookstores also find ways to set themselves apart–the exemplary Rainy Day Books here in Kansas City has many author events each month, and bundles tickets with a hardback to add value. Borders? Not so much.

Finally, Borders, Barnes and Noble, and their kind annoy the daylights out of publishers and authors because of their pay-for-play positioning model. Publishers have to cough up outrageous sums to get their latest titles on the most prominent, front-of-store displays, and they’re rarely going to fork over that money for anyone other than their cash cow authors–Meyer, Rowling, Grisham et al. Something similar may be true in Amazon's New and Notable section, but not to the same mercenary degree, I’ll wager.

So, do I sympathize with the thousands of Borders employees who will likely be seeking new employment? Yes, of course. Do I have the same sympathy for their soon to be former employer? Not a smidge. Like Blockbuster vs. Netflix, Borders has been outfoxed by a nimble, visionary competitor, baffled by new technology and, fatally, unwilling to change because of the misplaced belief that its overestimated brand loyalty would save it. Does the fall of this behemoth mean printed books are dead, or that e-books are the only viable medium? That is, for now, overstatement on both counts.

Wednesday, February 16, 2011

Chris Beneke on The First Prejudice

Randall Stephens

[From Religion in American History]

About a week ago over at Religion in American History Paul Harvey posted on Chris Beneke and Christopher S. Grenda's The First Prejudice: Religious Tolerance and Intolerance in Early America (Univ. Pennsylvania Press, 2011). The edited volume "presents a revealing portrait of the rhetoric, regulations, and customs that shaped the relationships between people of different faiths in seventeenth- and eighteenth-century America. It relates changes in law and language to the lived experience of religious conflict and religious cooperation, highlighting the crucial ways in which they molded U.S. culture and politics." I recently caught up with Beneke, a Historical Society board member, by email and asked him some questions about the project and the work being done on tolerance/intolerance.

Randall Stephens: What is the unifying theme of The First Prejudice: Religious Tolerance and Intolerance in Early America?

Chris Beneke: Our title, The First Prejudice, plays on the popular understanding of religious liberty as the nation’s “First Freedom.” It also draws on the proposition that religion was initially the source of the deepest prejudice to afflict early Americans and the object of the first large-scale efforts to mitigate prejudice. We asked our contributors to be attentive to the distinguished and extensive historiography on church and state, but not beholden to it. The idea was to create a history of religious tolerance and intolerance that took into account a broader range of religious and cultural interaction than histories of religious liberty have traditionally done. For us, it presented an opportunity both to build a compelling new narrative of early American religious history, where religious differences are at center stage, and to develop a common set of reference points and questions that would frame more useful conversations about tolerance and intolerance in America.

Stephens: Why did religious tolerance develop in the West when and where it did?

Beneke: In a sense, it depends on what you mean by tolerance (I know it’s annoying when historians say that, but there, I’ve gone and done it). If you mean what Willem Frijhoff calls “everyday ecumenism,” or at least everyday cooperation and non-violence, then it’s very old indeed. Historians have been hard at work in the archives over the past two-plus decades, discovering that sort of tolerance in surprising places across medieval and early modern Europe. But as a commonly accepted ideal, as a stated commitment to some form of equality, and a legal practice that guaranteed a modicum of protection, tolerance is something that developed in the intellectual capitals of northern Europe during the late seventeenth and eighteenth centuries. And though I risk irritating my intrepid co-editor and some contributors by saying this, I think that it took hold in a much more fundamental and irrevocable way in the early national United States.

Stephens: When it comes to religious tolerance did the early United States differ all that much from Great Britain and western Europe?

Beneke: Here's the very short answer: official church establishments persisted across most of Europe into and beyond the twentieth century. In the United States, they did not. The U.S. may have maintained an unofficial Protestant establishment for many decades (via instruments such as the common law, public education, state religious tests), but the fact that it was un-official, and that it was accompanied by substantive protections for free exercise, was critically important. For all the disingenuity involved, an un-official establishment was surely more hospitable toward religious minorities than almost any official establishment might have been. Maybe just as importantly, the commitment to disestablishment and religious liberty meant that the religiously intolerant had to explain themselves and find ways to wrap bigotry in the mantle of tolerance.

These factors have always kept American religious intolerance in check.>>>

Tuesday, February 15, 2011

Ronald Reagan vs College Students, 1967

Randall Stephens

"NEW HAVEN, Dec. 4 [1967]--Gov. Ronald Reagan of California, who said he had never taught anything before except swimming and Sunday school, sat on a desk at Yale University today and conducted a class in American history." So reported the New York Times on the Gipper's visit to the ivy, where he was met with student protests and plenty of probing questions (December 6, 1967).

"Should homosexuals be barred from holding public office?" a senior from LA asked. The governor was surprised by the question. Rumors had been swirling that his administration had fired two staff members after their sexual preferences came to light. "It's a tragic
illness," said Reagan, after a pause. And, yes, he did think that homosexuality should remain illegal. Some students earlier had demanded that the school rescind its invitation to Reagan. The governor, who visited Yale as a Chubb fellow, gave his $500 honorarium to charity.

The confrontation between the 56-year-old governor and Yale students in 1967 speaks to the culture wars that roiled the decade and continue to reverberate to this day. In the video embedded here the students, with haircuts that make them look like clones of Rob from My Three Sons, square off with Reagan on poverty, race, and Vietnam.

The commemoration of the one-hundredth birthday of the 40th president brought with it the usual fanfare of radio specials, documentaries, guest editorials, and the like. The new HBO doc Reagan, like PBS's American Experience bio, spans the actor-turned-politician's career. (Watch the latter in full here.)

Lost in the telling, sometimes, is the scrappy, intensely ideological cold warrior and culture warrior from the 1960s and early 1970s. To correct that a bit, see the governor go at it with the somewhat nervous Yalies. Or, observe him lashing out against that "mess in Berkeley." (A clip from the HBO doc of the governor dressing down Berkeley administrators captures that pretty well.) The public memory version--rosy-cheeked, avuncular, sunny--overshadows that more fiery aspect of his personality and politics.

Americans remember their leaders as they choose. (The myths and legends are as stubborn as a Missouri mule.) But it is good to remind ourselves that the politicians and public figures we revere and/or study are rarely as one-dimensional as we'd sometimes think they are.

Monday, February 14, 2011

Medieval History Roundup

"Binham Priory discovery of giant medieval graffiti," BBC, January 20, 2011

Thousands of years of history at a remote priory in Norfolk could be unearthed after the discovery of giant medieval graffiti on its walls. The Norfolk Medieval Graffiti Survey (NMGS) group claims to have found 8ft (2.4m) building plans etched into the stonework of Binham Priory, near Wells. . . . "It was difficult to believe what I was seeing," said NMGS's Matthew Champion. "Most graffiti inscriptions tend to be relatively small and modest. This was just so big that it wasn't really possible to see exactly what it was until we had surveyed the whole wall surface.>>>

David Abel, "Below the Western Wall, passage to a cherished history," Boston Globe, February 13, 2011

JERUSALEM — The giant slabs of limestone, which have remained standing for two millennia despite repeated efforts to demolish them, make up the most sacred structure in the world for Jews, the ancient wall that once protected the Temple Mount. . . . The painstaking work of exploring the underworld along the hidden portion of the Western Wall has led to perhaps the most interesting and controversial tour in Jerusalem, one that has to be made by appointment.>>>

Arminta Wallace, "Time travel with an app," Irish Times, February 12, 2011

The iPhone app Dublin City Walls is a virtual tour guide of a kind that could transform our experience of historical sites. YOU KNOW the way you sometimes go to visit a historical site and when you find it there’s just a bump in the ground or a pile of stones – and that, pretty much, is that? Those who are trained in matters historical can get their imagination into gear and fill in some of the gaps. But for the rest of us it can be a somewhat underwhelming experience. A new iPhone app promises to revolutionise the way we interact with our heritage sites. Dublin City Walls uses high-resolution graphics, 3D imaging, video and GPS technology to bring the marvels of medieval Dublin right into the palm of your hand.>>>

Helen Castor, "The death-throes of the Wars of the Roses," TLS, January 26, 2011

Historical explanation is a tricky business. The reality of the past is lost to us, and all historical writing is an act of re-creation, in which the historian has a bewildering number of choices to make. One approach – adopted here by Desmond Seward – is to take a single thread and trace its path through the fabric of history. His theme in The Last White Rose is the precariousness of the Tudor title to the throne, and the repeated challenges the fledgling dynasty faced during the reigns of the first two Tudor kings.>>>

Friday, February 11, 2011

More on Our Virginia: Past and Present

Randall Stephens

The new issue of Harper's Magazine includes some of the findings of a "Report on the Review of Virginia’s Textbook Adoption Process, the Virginia Studies Textbook Our Virginia: Past and Present, and Other Selected United States History Textbooks." Heather blogged about the controversy surrounding the adoption of the textbook and some of its pseudohistory here. Now we return to it with more on the text's creative or unknowing anachronisms . . . I didn't see anything in the report about John Quincy Adams being a founding father.

The reviewers of the text included: Christopher Einolf (DePaul University); Mary Miley Theobald (Retired: Virginia Commonwealth University); Brent Tarter (Retired: Library of Virginia); Ronald Heinemann (Retired: Hampden-Sydney College); Lauranett L. Lee (Curator of African American History, Virginia Historical Society).

A note to any future textbook writers out there: Check your sources and then check them again. Or, at least have sources. Don't make things up.

I include below a sampling of the official critique. The items in quotes are from Our Virginia:

“In 1607 Queen Elizabeth sent three ships to found Jamestown, Virginia.” That would have been difficult, since Queen Elizabeth died in 1603, and neither she nor her successor, King James, “sent” any ships. They approved when a private company, the Virginia Company, sent ships to Jamestown, and no doubt King James approved when the colonists astutely named the town after him.

“They had been terribly persecuted and had seen friends killed.” I would like to know the source for this statement. I’ve never heard of Pilgrims being killed in England. Mostly they left England because they wanted to get away from the bad influences of the established Anglican Church. The statement seems over-the-top, but I can’t prove or disprove.

“Very few people in colonial America could read . . .” This is a myth. The overwhelming majority of white colonists were literate. In New England, literacy rates were higher than elsewhere because there were more schools and there was an emphasis on learning to read the Bible, but even in Virginia and other Southern colonies, almost all white men and even most white women could read in the eighteenth century. Percentages change over time, always growing larger, but even in the seventeenth century, about 60% of men in Virginia could read and about a quarter of the women. Figures are higher for the northern colonies. At no time in American history did “very few people” know how to read (unless one is talking about African Americans or Native Americans).

“. . . until you realize that it hurt America’s tea makers, whose tea already had a heavy tax.” America didn’t have any tea makers; the climate isn’t suited to growing tea. America had merchants who sold smuggled tea, avoiding the tax. Again, I understand it is hard to explain a complicated issue in simplistic terms, but this treatment of the Tea Act isn’t accurate.

“Washington and French General Lafayette inspect troops before the Battle of Morristown in New Jersey.” First of all, there was no Battle of Morristown. Morristown was where Washington and his troops wintered in 1777 (January 6- May 28). Second, Lafayette was not a general until July 31, 1777 and didn’t even meet George Washington until August 10, 1777, long after Morristown, so they wouldn’t have been reviewing any troops.

“Cyrus McCormick’s young grandson was there on the day the reaper was tested.” (then the book quotes the grandson’s “eyewitness” account). Since Cyrus McCormick was 22 in 1831 when he first tested his reaper, it is unlikely his grandson was present. The reason this grandson’s account is quoted is to “prove” that a black slave, Jo Anderson, helped invent the reaper. While the slave helped with all the farm work, including building a reaper, he should not be credited as a co-inventor, as some Politically Correct people would like. It is a serious mistake to title this section “Anderson and McCormick’s Reaper.” It was Cyrus McCormick’s reaper. http://www.virginialiving.com/articles/lion-of-the-hour/index.html

“The Quakers, a religious group, believed that all people were created by God.” A rather unnecessary sentence, don’t you think? What religious group does not believe that all people were created by God? It doesn’t say anything about the Quakers’ beliefs. It might be better to note that they were Christian pacifists who believed all people were equal, even women, Indians, and blacks.

“Evenings were spent playing cards or checkers, writing letters to loved ones, reading old worn newspapers, and playing baseball by torchlight.” That would be some trick, playing baseball by torchlight, since you couldn’t see to catch a ball. Torches give off almost no light beyond a few feet from the flame. I have never seen any mention of playing baseball by torchlight.

“Atlanta, Georgia, was one of the South’s largest cities. It was an important railroad hub . . .” Yes, Atlanta was a railroad hub, but it was one of the smaller cities in the South. In the 1860 census, Atlanta ranks 99th among American cities, with a population of 9,000. The South had many, many cities larger than Atlanta, including Baltimore at 212,000, New Orleans at 169,000, Louisville at 68,000, Richmond at 38,000 (25th), etc. Georgia alone had three cities larger than little Atlanta: Savannah, Augusta, and Columbus. http://www.census.gov/population/www/documentation/twps0027/tab09.txt

Map showing Fort Necessity (which is spelled wrong as Neccesity) locates the fort in the wrong state. Fort Necessity is in Pennsylvania, not Ohio. Fort Duquesne is also in Pennsylvania, not Ohio. (The text accurately states that Fort Duquesne is near Pittsburgh, but the map positions it in Ohio.)

St. Louis is always written as St., not Saint.>>>

History, Gender, and Sexuality Roundup

Lisa Hilton, "Mistresses through the ages Prostitute, concubine, mistress, wife: the boundaries are blurred in this study," TLS, February 9, 2011

What is a mistress? Elizabeth Abbott, who has also published A History of Celibacy and held the post of Dean of Women at Trinity College, University of Toronto, offers this definition: “a woman voluntarily or forcibly engaged in a relatively long-term sexual relationship with a man who is usually married to another woman”. Given the persistence of this model across time and cultures, Abbott maintains that “mistressdom”, like celibacy, is therefore an essential means by which to consider sexual relationships outside marriage – “in fact, an institution parallel and complementary to marriage”. Considering the media’s current obsession with love-rat footballers and cheating celebs, “mistressdom” might also be considered a safe bet for a publisher’s list, and Abbott duly provides us with a generally cheerful tumble through adultery down the ages.>>>

Elizabeth Varon, "Women at War," NYT, February 1, 2011

What do women have to do with the origins of the Civil War? Growing up in Virginia in the 1970s, I often heard this answer: nothing.

Much has changed since then. A new generation of scholars has rediscovered the Civil War as a drama in which women, and gender tensions, figure prominently. Thanks to new research into diaries, letters, newspapers and state and local records, we now know that women were on the front lines of the literary and rhetorical war over slavery long before the shooting war began.>>>

Adam Kirsch, "Macho Man: Exodus recast Israel’s founders as swaggering heroes and secured Leon Uris a place on the Jewish bookshelf even though, as a new biography shows, he was a mediocre writer and a troubled person," Tablet, February 1, 2011

Jews take pride in calling themselves “the people of the book,” and while there’s something a little vainglorious about the phrase—all peoples have books, don’t they?—its appeal is easy to understand. For millennia, in the absence of land and power, Jews found a kind of virtual sovereignty in texts, and the history of Judaism from the Babylonian Exile onward could be written as a history of books and writers—the Torah and the Prophets, the Mishna and Gemara, Rashi and Maimonides, down to modern, secular authors like Theodor Herzl and Sholem Aleichem and Primo Levi.

And then there’s Leon Uris.>>>

Carol Tavris, "The new neurosexism," TLS, January 26, 2011

. . . . Today we look back with amusement at the efforts of nineteenth-century scientists to weigh, cut, split or dissect brains in their pursuit of finding the precise anatomical reason for female inferiority. How much more scientific and unbiased we are today, we think, with our PET scans and fMRIs and sophisticated measurements of hormone levels. Today’s scientists would never commit such a methodological faux pas as failing to have a control group or knowing the sex of the brain they are dissecting – would they? Brain scans don’t lie – do they?>>>

Thursday, February 10, 2011

What Exactly is Latin America?

Joel Wolfe

After a lecture I gave to my Modern Latin America survey, a student asked me how Haiti could be considered Latin America given that it had been a French colony. It was a great question, but it was also a little annoying, because one of the themes I use to organize the survey is whether or not it makes sense to consider the region a region.

One point I make on the first day of class is that there is no right or wrong answer to that question. There are very strong arguments both for and against seeing Latin America as a unified whole.

Looking at the region as everything in the Western Hemisphere south of the United States can be very useful. The vast majority of the nations in this area share a common Iberian heritage. Most of them are predominantly Catholic, have large mixed race populations, and have had complex and often contested relations with the U.S.—the hemisphere’s dominant power—for more than a century. And, on many significant levels, these nations tend to see themselves as having a shared history. Sure, Haiti’s French roots complicate things (along with Jamaica’s and Belize’s ties to Great Britain), but even those countries tend to have more in common historically with the other nations of the region than not.

We can also find a great deal that differentiates one country from another, however. In the colonial era (ca. 1500-1820), Mexico City loomed as one of the world’s great cities, but the villages that became some of the great cities of the present (Buenos Aires and São Paulo, for example) were tiny backwaters. Today, life in Buenos Aires or São Paulo is more consonant with that in Barcelona or even Chicago than with that in the villages of highlands Guatemala or Bolivia or any number of other places in Latin America. It isn’t just a matter of the physical geography of these cities versus rural spaces. Many of the region’s largest cities exude a modern ethos and Western orientation. Such identities are often absent or at least contested in other, particularly rural, Latin American spaces.

This tension about Latin America’s coherence as a concept or even region has fueled more than just the ways I organize some of my classes; it has also shaped my scholarship. My first book is a study of the rise of Brazil’s industrial working class in the city of São Paulo. After writing my doctoral thesis and then revising it for publication, I realized that one of my study’s limits was São Paulo’s uniqueness. It is simultaneously the largest metropolitan area in the entire southern hemisphere and Latin America’s largest industrial complex, and yet in many ways it is atypical of Brazil. Within its own country, São Paulo (the city and the state), with its modernist ethos, large immigrant populations, and devotion to both advanced agricultural and industrial production, is both unique and dominant. In other words, you can’t rationally analyze Brazil without reference to São Paulo, but you would be wrong to see Brazil through the eyes of Paulistas (residents of the state).

I tried to address this issue in my new book on automobility in Brazil. Autos and Progress is a study of Brazil’s struggle to integrate the massive, often disconnected, and regionally diverse nation through the use of technology (cars, trucks, and buses). In many ways, the embrace of the technological fix by Brazilians was a Paulista idea, although autos and automobility had and have broad appeal throughout the country. The tensions among Brazilian regional identities and the very real (you can’t say “concrete” when you write about cars and road building!) struggles to physically, socially, and economically unify the nation became a central theme for the book and have become part of how I organize my History of Brazil class.

In other words, there is a great deal of utility in asking whether or not it makes sense to think of Latin America (or Brazil or Mexico, for example) as a unified whole. Thinking about what we gain and what we lose when we either split or lump regions and sub-regions in our teaching and scholarship can not only help our students make sense of a lot of complex history but also clarify key aspects of our research agendas.

Wednesday, February 9, 2011

Questioning the Assumptions of Academic History

A lively forum in the January issue of Historically Speaking critiques some of the assumptions of academic history. Here are selections from the lead essay by Christopher Shannon and a comment by Elisabeth Lasch-Quinn.

"From Histories to Traditions: A New Paradigm of Pluralism in the Study of the Past"
Christopher Shannon

The last forty years have witnessed a tremendous expansion of the range of historical topics deemed fit subject matter for professional academic historians in America. Beginning in the late 1960s, social historians concerned to recover the experience of common people led a revolt against the perceived elitism of the then-dominant fields of political, diplomatic, and intellectual history. The pioneers of social history, particularly those rooted in the field of labor history, soon came under attack for focusing on white male historical actors. This critique gave birth to the flourishing of studies of women, racial and ethnic minorities, and most recently, sexual minorities. Even those sympathetic and supportive of these developments at times wonder if any principle of unity or synthesis remains in the wake of so much diversity, or if those seeking to make sense of the past must give up on History and rest content with the proliferation of histories.

This diversity of subject matter masks a fundamental uniformity of method. For all its openness to new subjects for inquiry, the historical profession in America has refused to accept any fundamental questioning of its basic assumptions about how we gain meaningful knowledge of the past. Despite various philosophical challenges over the past century, most historians remain committed to a common-sense empiricism rooted in the philosophical assumptions of the physical sciences that dominated the intellectual landscape of the West at the birth of the historical profession in the late 19th century. This consensus is, moreover, at once epistemological and political: common-sense empiricism must be defended because it is the epistemology most appropriate to liberal modernity. The epistemological alternatives are historicism and relativism; the political alternatives are fascism and totalitarian communism.>>>

"Comment on Christopher Shannon"
Elisabeth Lasch-Quinn

. . . . In his 2009 book God, Philosophy, Universities, Alasdair MacIntyre paints a portrait of the modern university as fragmented beyond coherence by specialization into the various disciplines. No overarching understanding of how the disciplines are connected exists. Whether “professedly secular or professedly Catholic,” universities today are no longer informed by “a notion of the nature and order of things.” It is the same with the subspecialties. Shannon may be assuming too much unity of perspective in suggesting that it is the liberal world-view that informs most historical writing today. Autonomy is a common enough theme, but it is not necessarily any longer part of a unified view of the world. Agency is more a mantra than a well-articulated intellectual framework or philosophical system. It appears adequate to conclude a study of this or that marginal group with the suggestion that the group, however oppressed, exhibited agency. Why that matters is assumed, not defended.

The pluralistic tolerance of all views as equally valid is difficult, if not impossible, to reconcile with either traditions or histories possessing value judgments of any kind. Herein lies the problem. If the welcoming of Catholic history is to bear any fruit, we will need to confront head-on the fundamental clash between a totalizing pluralism, which only results in nihilism, and any normative story. As Shannon writes, it is not enough to concede that “objectivity is not neutrality.” But neither is it enough to let all flowers bloom. We have to confront the huge intellectual obstacles that arise when traditions are incompatible. I think this is what Shannon intends in his worthy embrace of “a new norm for historical inquiry, one less geared toward the industrial production of information about the past and more directed toward a philosophical reflection on human nature through the study of the past.” Moving toward this kind of reflective activity might be foreclosed by the unbridled embrace of traditions. It matters what those traditions are, their content, practices, and quality. I’m not sure I’d like to see “the ability of Enlightenment institutions like the modern history profession to open themselves to distinctly Catholic interpretive traditions” made into “a good test of their ability to include other nonliberal traditions.”>>>

Tuesday, February 8, 2011

Sports, History, and Culture

Randall Stephens

With the Super Bowl and the Puppy Bowl over, and the barrage of clever and not-so-clever ads that went with the former, I've been thinking about the history of sports. I admit, I know little to nothing about the subject. (I do know that a class on the history of sports would probably fill up quickly.)

I am fascinated by how recreation has changed over the centuries. Has it become less violent, less like hand-to-hand combat? Of course, we do have ultimate fighting in our era, and there are all sorts of ways to die in a high-speed NASCAR race, but it strikes me that common sports are less violent than they were in previous ages. A man does not need to train an animal to kill another animal to show that he is a force to be reckoned with.

Is it natural that sports should become more humane? Would dog fighting or bear baiting have struck late antebellum Americans as being as cruel and debased as most Americans think them today? Ideas about propriety and impropriety appear to have dominated conversations about recreation for centuries.

There are still class and cultural connotations to sports in our age, much as there were hundreds of years ago. (One of my favorite Onion articles in recent years revealed that "a professional wrestling 'fan' has written a shocking new book that claims wrestling fans are actually paid actors.") But were class and cultural markers much stronger 150 or 200 years ago?

What do sports tell us about the people who have enjoyed them? How long have sports been woven into consumer culture? What can we know about western history by looking at the way men and women "recreated"? (I hear that those who work in the subfield of cricket studies have some interesting things to say about empire and global culture.)

Anthony Fletcher's Gender, Sex, and Subordination in England, 1500-1800 (Yale, 1999) explores some of these topics. On sport in 17th-century England, he writes:

The gentry enjoyed the sport of their deer parks, their bowls and tennis; communal sports tested men's physical prowess and endurance, absorbing competitive vigor. Local tradition was deeply founded in this respect. In Wiltshire football was entrenched in the downlands, while bat-and-ball games like stoolball and trapball flourished in the vales of the north. East Anglian villages had their 'camping' grounds with their own indigenous and popular team games. There was something for everyone at the Whitsun Cotswold games, held annually from around 1611 on Dover's Hill, a marvelous green amphitheater outside Chipping Camden which is now owned by the National Trust. There was hunting and horse-racing for the nobility and gentry and the old sports, like wrestling, singlestick fighting and shin-kicking, for the country populace. The games were a veritable celebration of manhood which, at least until the 1640s, attracted people of all social ranks from miles around. (94-95)

An observer of late-17th-century England, Guy Miège, said a little about sports in his country. Notice the praise for bloodsports and the comment about foot-ball's popularity among the lower sort.
Guy Miège, The New State Of England Under Their Majesties K. William and Q. Mary: In Three Parts (London, 1691), 39-40.

Monday, February 7, 2011

Historians Teaching Grammar

Heather Cox Richardson

Over the years, I’ve experimented a lot with how to teach writing. Increasingly, I get a significant number of students who don’t even know what a sentence is. Of course this means I spend most of my time on the larger picture of writing: thesis statements, structure, supporting evidence and so on.

But at some point, I got tired of correcting the same grammatical errors for the thousandth time. That frustration made me play around with some new approaches. One approach that seems to work is to treat the mechanics of writing like a mathematical principle. Rather than repeatedly marking certain grammatical errors that come up again and again, and reminding the students to pay attention to them, I have taught the rule AS A RULE and told the students that this is the way things are, just as 2 + 2 = 4. Always. Period. I told them I would not accept any more mistakes on that particular rule. I’m still experimenting with how frequently I can introduce a new unbreakable rule, but one a week seems an acceptable pace for now.

Far from resenting this rote method—which conjures for me that dreadful orange grammar text we dragged ourselves through in eighth grade—the students seem thrilled to have found something they can hang on to with certainty in their writing. Yes, they still mess the rules up, but far less frequently. And when they do, I can just mark the error without explaining the rule in my comments.

I recently put to paper my four favorite rules for high school students. These are the ones that, if violated, immediately mark an essay as grammatically problematic (although they are only the top four). I’d like to keep building this list, so if anyone has suggestions, let me know.

Four crucial rules for ratcheting up your writing:

1. Make sure your subject and verb agree. For example: "Bats in the tent behind the tree were (not was) black."

2. Get your verb tenses correct. (This one is the hardest one on the list. Let me give you some tricks to get it right. First of all, put everything in past tense. English teachers sometimes get upset at that advice because they argue that literature is alive, and thus should be discussed in present tense. Yep. Okay. Got it. Now ignore that instruction unless you have a teacher who absolutely insists on it. Most readers are quite content to have everything in a single tense, and the past is a zillion times easier to manage in an essay than the present. Second, everyone—absolutely everyone—screws up the past and the past perfect. Don’t worry about the names of the tenses. Figure it out this way: imagine your action on a timeline. The majority of what’s happening in your essay will be in past tense. IF SOMETHING HAPPENS BEFORE THE MAJORITY OF THE ACTION, slip in a verb that indicates an earlier time. Usually, this will be the word “had.” For example: “He hoped that the ship would arrive that day, but he HAD heard the day before that it would be late.” This won’t always work, but it will work often enough that it’s worth doing. Just as if you were speaking, by the way.)

3. An introductory clause always, always, always, always, always, always, always, always, always, (get the picture?) always, always, always, ALWAYS modifies the noun that comes immediately after it. ALWAYS. So:

“Running down the alley, he dropped the knife,” is correct.

“Running down the alley, the knife dropped from his hand,” is wrong, wrong, wrong, because the knife is not running down the alley.

4. Keep your writing in active voice. Avoid passives whenever possible (which is about 98.5% of the time). This is a very hard thing for students to manage, for two reasons. First of all, for some reason we have this weird idea that it sounds smart to write prose that has no action, as if somehow it makes us sound learned and above the fray to write as if events just occur. Second, using passive voice makes it possible to refrain from taking any sort of position on your topic. In passive voice, things just happen; you don’t have to explain why or how they happened. Passive voice is disastrous for both writers and thinkers. Take a look at this example: Here are two ways to write about a horrific massacre of more than 250 people, shot and knifed as they surrendered to soldiers. So which is more honest? “Two hundred and fifty people were shot,” or “Angry soldiers murdered two hundred and fifty women and children”? In the first, the deaths just happen; no one is at fault. In the second, I have squarely blamed the murders on the people who committed them. In the second version I’ve tried to explain an event, the actors, and the action. I’ve had to figure out exactly what happened, I’ve thought about the action, and I’ve taken a stand. And that’s why we write, isn’t it? To tell someone about the world. Active voice makes you stick your neck out, and it will make people angry at times. But it enables you to contribute your own ideas and vision to the world. Plus it’s a zillion times more fun to read than passive voice!

Saturday, February 5, 2011

Reading Clothes, Hair Styles, Architecture, and More

Randall Stephens

I'm teaching a course this semester on American history from 1783 to 1865. I'd like to introduce the students to everyday life more than I have in previous years. So, I'm asking questions like: How did Americans behave, dress, eat, live, work, worship, and play? What can we learn from reading the material culture and the manners of, say, the Early Republic or the Age of Jackson?

A look at Jack Larkin's excellent The Reshaping of Everyday Life: 1790-1840 (Harper, 1989) seemed like a good place to start. The book is part of a series that examines the intimate and public lives of Americans in a given period. I read a couple of short passages to the class on Thursday. For example, Larkin says this of how Americans were greeting each other in the Jacksonian period:

Shaking hands became the accustomed American greeting between men, a gesture whose symmetry and mutuality signified equality. The Englishman Frederick Marryat found in 1835 that it was 'invariably the custom to shake hands' when he was introduced to Americans, and that he could not carefully grade the acknowledgment he would give to new acquaintances according to their signs of wealth and breeding. He found instead he had to 'go on shaking hands here, there and everywhere, and with everybody.'

All this will overlap nicely with a book that the class is reading--Leo Damrosch's wonderfully entertaining and insightful Tocqueville's Discovery of America (FSG, 2010). In Damrosch's telling, Tocqueville was quite sensitive to the styles, cultural peculiarities, and attitudes of the Americans he encountered in his trek across the country in 1831 and 1832.

I have been doing some searches on-line for websites and resources for the teaching of material culture. I wonder if there is a one-stop site that would include bibliographies and short summaries of what material culture and style can tell us about a given era. What can we know about American men over the decades by looking at changes in facial hair? (That topic would certainly lend itself to an interactive graphic.) Or, as one student asked me several years ago: Why did men have outrageous mustaches and muttonchops--like cats and walruses--in the 1850s-1870s, and why did so few have the same in the 1920s and 1930s? I don't really know. For those later decades, maybe faces were supposed to look like the fronts of streamlined trains. What can we learn about men and women, children and adults, in the Jacksonian period by looking at the clothes they wore? How might we compare those styles with ones from today? Can we speak about the democratization of architecture, speech, or, as Larkin writes, physical greetings?

Students seem to have fun with these kinds of topics. I do as well, though I know little about them. So . . . if anyone out there knows of some on-line resources to get at these kinds of material culture and cultural history questions, please let us know.