Friday, July 29, 2011

Thou Shalt Review Books Responsibly

Chris Beneke

Last week, Robert Pinsky, the former poet laureate, offered three exceedingly sensible “golden requirements for book reviews”:

1. The review must tell what the book is about.

2. The review must tell what the book's author says about that thing the book is about.

3. The review must tell what the reviewer thinks about what the book's author says about that thing the book is about.

Those who have mastered these requirements, writes Pinsky, can “then get quickly beyond them, in ways that are fun to read.” The problem is that too many reviewers fail to comply with all three. Some consider two—or even one—sufficient.

My sense is that historians are a little more solicitous than most when it comes to these matters. Maybe it’s because we’re a relatively small, incestuous community where you’re likely to run into the book’s author at the next major conference—or, heaven forbid, have your own book reviewed by the injured party three journal issues hence. Of course, most historical journals convey something along these lines in their reviewer guidelines, though seldom, if ever, with such crystalline precision.

It’s fairly obvious that historians follow one rule almost as piously as Pinsky’s Golden Three. It’s more tactic than principle, and goes something like this: In a favorable journal review, the review’s penultimate paragraph must identify the book’s minor flaws. Perhaps you object to this entrenched professional habit on aesthetic grounds, but it would be hard to make a strong ethical or professional case against it.

Anyway, to Pinsky’s Golden Requirements, we might add the following historically specific Decalogue:

1. Thou shalt not use the review to tell us about your own scholarship.

2. Thou shalt not tell readers that “the definitive history of such-and-such remains to be written” when you are the person who intends to write it.

3. Thou shalt not tell us too much—or really anything at all—about the supposed religious beliefs or political commitments of the author whose book is being reviewed.

4. Thou shalt not treat the omission of your own book from the endnotes as a personal affront, punishable by withering historiographical criticism.

5. Thou shalt not use the review to suck up to powerful and/or beloved members of the profession. (Corollary: Thou shalt honor thy dissertation advisor, but not in thy review.)

6. Thou shalt not use the review as an occasion to advance a specific political agenda.

7. Thou shalt not tell readers—either explicitly or implicitly—that the book under review does not deserve serious consideration. That shall be told to the editor, privately, before the review is written.

8. Thou shalt not submit the review six months after the due date, especially when the book was published three years ago.

9. Thou shalt not use the review to expose your utter ignorance of the topic.

10. In reviews of edited collections, thou shalt tell a little something about each contribution.

What am I missing?

Thursday, July 28, 2011

Random Questions about the Past Inspired by Films

Randall Stephens

I've been watching a variety of history-related films in the last few months. (Some loosely based on history, I'll admit!) A few are documentaries. I can't get enough of these, it seems. They include: Michael Collins (1996), Liam (2000), The General (1998), Flame and Citron (2008), Europa Europa (1990), Black Hawk Down (2001), Black Death (2010), Dean Spanley (2008), Great Balls of Fire! (1989), Girl Groups (1983), Amazing Journey: The Story of the Who (2007), Gangs of New York (2002), Secrets of Shangri-La: Quest for Sacred Caves (2009), The Sting (1973). Of course, some of these are very good, and some . . . not so much. I've found, though, that almost all history films bring to mind a number of questions about the past. Why did this or that happen as it did? What can we learn from the past? Perhaps many viewers of these flicks have similar questions. Maybe it's even natural that the visuals of a good costume drama or the narrative arc of a documentary start to make the gears in our heads turn in new ways.

So, what follows are some serious, and some rather ridiculous, questions about the past. The classroom might even be enlivened by some fun questions, sparked by films, about the cultures, peoples, and events of yesterday. (Granted, many of the questions here reflect my own current teaching and research obsessions.)

Why did women's hats become so elaborate and extraordinary in the late Victorian era?

What cultural turns gave rise to facial hair and strange hairstyles in different eras? (Think of the turned-down, satirical, post-colonial mustaches of 1967.)

Why did breeds of dogs proliferate so rapidly from the 19th century forward?

Why did Edwardians like ferns and wicker/rattan so much?

Why did 1950s Teddy Boys (or Teds) adopt the fashions of Edwardians?

Why did Mods and Rockers go at each other as they did?

How can we make sense of the widespread acceptance of rock music as a legitimate form in our era and the denigration of it as "trash" by elites in the 1950s?

Why Elvis?

How is it that the West came to dominate world history from the age of exploration forward?

What accounts for the punctuated equilibrium of technological progress in the 19th and 20th centuries?

Have industrialization and mechanization been largely good or bad for humankind? (This one comes from a bright student in my World Civ class!)

Why did writing develop when and where it did?

What accounts for the development of the world's religions?

Did premodern westerners think about race in markedly different ways than early modern westerners did?

Can historians make predictions about the future based on what they know of the past?

Wednesday, July 27, 2011

Persistence in Folkways

Heather Cox Richardson

My sister makes the world’s best baked stuffed lobster. She learned to make it from watching our mother who, in turn, learned how from her father. But Kath has a problem. She has tried for years to get someone else in the family to learn how to make the lobsters. No one will (not least because of just how gruesome the process is, something that my marine biologist sister handles with none of the squeamishness the rest of us show). She warns us that she will take the recipe to her grave if no one will learn how to prepare it.

She might well be right, but another recipe that has come to light recently suggests that recipes do persist in communities even when lost by their originators. About thirty-five years ago, a friend’s grandmother, who lived on a nearby island, served me a blueberry cake made with molasses. She didn’t have a recipe, but she gave me approximate amounts and I wrote them down as “Gram’s Blueberry Cake.”

In my parents’ house is a shelf of local cookbooks that cover the years from the 1890s to the present. I read them occasionally; I find it fascinating to see how eating patterns change over time. I was interested to see recently that the recipe I recorded as Gram’s Blueberry Cake has shown up repeatedly in cookbooks over the years, coming from different households. As it turns out, the woman who submitted the recipe to the local cookbook in one of its earliest appearances was one of my relatives.

Whether or not Kath’s recipe for baked stuffed lobster will come around to our great-grandchildren by way of someone else remains to be seen, but the persistence of the unusual blueberry cake recipe brings to mind an issue I’ve wondered about lately: the persistence of folkways in communities.

David Hackett Fischer explored this persistence in Albion’s Seed, when he traced the origins of four American folkways to four different European migrations. Malcolm Gladwell put great emphasis on it in Outliers, suggesting that the different experiences of various immigrant groups in America could be traced to their cultural heritage. More recently, Slate highlighted a study suggesting that anti-Semitism in modern-day German communities bears a direct correlation to whether those same communities murdered their Jewish populations during the Black Death in the 14th century.

Frankly, I always found these arguments about such extreme cultural persistence far-fetched. At least in the modern world, it’s hard to imagine that the forces of change do not outweigh cultural continuity over the course of even one generation, let alone hundreds of years. Even ubiquitous human conditions like racism change rapidly to reflect changing anxieties, making it hard to argue that even they reveal cultural continuity on anything but a macro level so broad that it has very little explanatory power.

But my recipes give me pause. If they have lived for a hundred years by word of mouth, what else has? What about attitudes and prejudices? Surely they persist, too, and surely they are crucially important when forming political affiliations, for example. But how do we, as historians, grapple with such intangible cultural artifacts in a meaningful way?

Tuesday, July 26, 2011

Bertram Wyatt-Brown, Southern Honor, and Festschrift

Randall Stephens

I was in Boothbay Harbor, Maine, last week visiting with Bertram and Anne Wyatt-Brown. Both have been longtime members and supporters of the Historical Society and have participated in our national conferences. Bert has contributed several essays to Historically Speaking, the Journal of the Historical Society, and this blog.

Before I left Boothbay and made my way back down to a sweltering Boston, I asked Bert some questions about his classic book Southern Honor: Ethics and Behavior in the Old South and his research into how honor continues to shape the world today. (Listen to audio of interview here.)

As some readers of this blog might know, Wyatt-Brown is the author of over 100 scholarly articles and essays and has written a variety of acclaimed books. Southern Honor was a finalist for the Pulitzer Prize. The study of masculinity, gender, violence, and southern culture has thrived in the years since Bert wrote his book. Not long after its publication it won high praise. Novelist Walker Percy called it "A remarkable achievement--a re-creation of the living reality of the antebellum South from thousands of bits and pieces of the dead past." And David Herbert Donald observed: "Unlike so many historians who have been interested in handing down judgments, favorable or unfavorable, on the Old South, Wyatt-Brown has studied Southerners much as an anthropologist would an aboriginal tribe. An important, original book which challenges so many widely held beliefs about the Old South."

In celebration of Bert's long career and his impact, the University Press of Florida will soon publish a Festschrift to honor him. Edited by Daniel Kilbride and Lisa Tendrich Frank, the volume includes essays by a number of Bert's students and others on whom he had a significant influence. It will be out just in time for the Southern Historical Association meeting in October.

Monday, July 25, 2011

Avast! Pirates in History

Heather Cox Richardson

One of my favorite graduate students was an expert on pirates. Trying to supervise his research gave me an opportunity to learn enough about historical piracy from him to have a working knowledge of it. From Roger, I learned that we actually have very few primary sources directly discussing pirates, and that much of what popular histories say of piracy is fantasy. I also learned that piracy was an economic and political enterprise that was vital to countries in the seventeenth and eighteenth centuries, and that early governments largely accepted it.

Finally, I learned that piracy is every bit as active today as it was in the Golden Age of Piracy, as men at sea make a living by stealing the wealth of others. There are even, my student pointed out, websites for the reporting of pirate attacks, although he explained that the legal tangles such accusations launch mean that piracy remains seriously under-reported. There are also companies that promise protection against pirates, selling technology that makes the days when sailors rounding Cape Horn scattered carpet tacks on their decks to thwart robbers seem quaint indeed.

It is in honor of this student that I carry my keys on a fob of pirate flags.

Roger also inspired me to start reading about pirates on my own. Some of the scholarly books out there, notably Robert C. Ritchie’s Captain Kidd and the War Against the Pirates, are smart and worth reading.

But Roger’s tutorial taught me enough to know that the best book on pirates I’ve come across is William Gilkerson’s Pirate’s Passage. It purports to be a children’s story, although the themes it addresses are relevant to everyone. It is the story of the relationship between a young boy in Nova Scotia in the 1950s and an old sailor who brings a 35-foot yawl into the family dock on a treacherously stormy winter night. While the actual age and status of the old sailor are deliberately obscure, there is little doubt that he is—or was—a pirate.

As the weeks pass, the boy endures bullying from the local rich family that is trying to get control of his mother’s valuable real estate and develop it. In the evenings, the mariner tells the boy stories of pirates. Eventually, the child’s quest to defend himself from the local thugs and save his mother’s property becomes a personal exploration of wealth, ownership, and piracy, in the past and the present, forcing the boy to make decisions about what is truly just.

Pirate’s Passage is an engaging romp that tells good history. The author clearly knows his primary sources on piracy, and he often includes passages from them as the elderly man reads to the boy. Gilkerson situates Sir Francis Drake, Mainwaring, Morgan, the buccaneers, and all their peers in their proper times and places; he also brings to life what it meant to be a sailor tied to the pirate life: hunting pigs on Hispaniola, raiding passing vessels, and getting a share of the take—a rough life, to be sure, but one that compared favorably to life on a Royal Navy ship, where lice, scabies, whipping, injuries, and endless work were the norm.

But Gilkerson does more than provide a good account of historical pirates. His book is a profound reflection on the meaning of history. The mariner refuses to let his young friend imagine the pirates as fun swashbucklers. They are human beings, trying to negotiate the shifting spheres of politics and power in order to survive and, whenever possible, make their fortunes. The first conversation the old man has with the boy about pirates begins with an observation that speaks directly to what historians do: “Rules are a given,” he says. “What could be more important than seeing who makes ‘em, and who breaks ‘em, and who makes their own, and how it’s worked through time?”

As the old man tells his stories and advises the young man on his exploits, he insists the boy look deeply into cause, effect, and, critically, responsibility. When his young friend dismisses Drake as a criminal, the mariner asks him to reconsider. Was it Drake who was the criminal, he asks, or Queen Elizabeth, who encouraged the famous pirate to go strike a blow for England? Who, exactly, is a pirate, when governments as well as individuals engaged in piracy? What justification for theft of property is acceptable? Are the rules the same for the rich as for the poor? These questions are not just academic in Pirate’s Passage, either. The local family persecuting the boy’s mother represents the local government, forcing the boy to cross a number of legal lines (in extraordinarily interesting ways) in order to protect her property. He becomes, the mariner tells him, a member of the Brotherhood.

In the end, the boy must sift through not only the past but also the present for his understanding of justice. He does so with the guidance of a wily old mariner, who refuses to let him accept easy answers.

It turns out the old salt is not just a pirate; he is a historian.

Friday, July 22, 2011

Sal Khan and Online Teaching

Dan Allosso

For those who don’t recognize his name, Sal Khan is the founder and faculty of Khan Academy, which offers over 2,400 educational videos on the web, free of charge. Khan’s goal is to educate the world.
He’s been featured on PBS’s NewsHour, he’s on Forbes’s list of “Names You Need to Know,” and he’s spoken at TED, where Bill Gates came up on stage at the end of his talk to say how cool he thought Khan was.

In the TED Talk clip (at about 2:50), Sal Khan tells the story of how he began tutoring his cousins, and putting “refresher” talks on YouTube for them to look at when he wasn’t around. The cousins preferred the videos to the tutoring, and Khan realized the reason was that they could go over them at their own pace, without the pressure of a teacher looking at them. He extended this idea to the classroom, suggesting teachers use the videos as “homework” and then follow up in class. Teachers become less involved in lecturing and more involved in mentoring; students get instruction that moves at their pace and requires them to master a concept before they move on (Khan says even smart students get to the end of traditional curricula with a “Swiss-cheese” knowledge of a subject, which can cause problems for them later); and schools can focus less on teacher-student ratios and more on meaningful interaction between teachers and students.

I think this is insanely cool. So does Bill Gates, and you can see what he says about it at the end of the TED Talk, and also in the clip from The Gates Notes on the Khan Academy homepage. Bill especially likes the fact that Khan “has taken all this material, and broken it down into little 12-minute lectures.” Of course I like this, because I’m doing the same thing with my online American Environmental History program. But more than that, I think it’s an editing process that helps you focus on what is really important in a topic.

Well okay, you may be thinking, but the Khan Academy format is better suited to some kinds of learning. Sure it is, but let’s not kid ourselves that there’s none of that kind of learning in what we do as history teachers. Even at the college level. Yes, he’s got a lot more math up there than he does history. And yes, I think that’s partly because history is much more driven by interpretation—it’s not just a careful accumulation of brick-like facts that get stacked one on another until you’ve built a wall (with apologies to Arthur Marwick). But the idea of turning lectures and discussions/reflections upside-down is exciting!

There’s been a lot of recent talk here and throughout the history blogosphere (for example) about the positive and negative possibilities of online education. I think Khan Academy is worth looking at carefully, and keeping a sharp eye on.

Thursday, July 21, 2011

New England's "Maruellous" Pine Trees

Heather Cox Richardson

How many people today have heard of the King’s Broad Arrow?

Not many, I’d wager, and yet it was once the key to settling a continent and the spark to a revolution. It’s a simple mark: three quick swings with an ax, one straight up and two in a V at the top, to make an arrow. After 1711, the King’s Mark branded old-growth New England white pines as the property of the King of England.

Those old-growth white pines were key to British interest in settling New England. In 1605, Captain George Weymouth explored the coast of what is now Maine, sailing the Archangel to Monhegan, Camden, and up the Kennebec River. He discovered vast shoals of fish and, as one of his comrades recorded, giant “firre-trees,” “out of which issueth Turpentine in so maruellous plenty, and so sweet, as our Chirurgeon and others affirmed they neuer saw so good in England. We pulled off much Gumme congealed on the outside of the barke, which smelted like Frankincense. This would be a great benefit for making Tarre and Pitch.”

The trees that so impressed Weymouth and his men were white pines (Pinus strobus), still known in England as the Weymouth pine.

These huge trees dominated the coastline where Weymouth sailed. They were the tallest trees in eastern North America, standing up to 230 feet. Their wood is soft, easy to cut, straight, and generally without knots. Unlike hardwood, it can stand for years without cracking, and it bends, rather than breaks, in a high wind. It was a perfect tree to make masts, and if there was one thing the Royal Navy needed, it was its own source of mast wood. As William R. Carlton put it in his 1939 New England Quarterly article titled “New England Masts and the King’s Navy”: “Masts, in the days of wooden ships, played a far greater part in world affairs than merely that of supporting canvas. They were of vital necessity to the lives of nations. Statesmen plotted to obtain them; ships of the line fought to procure them. . . .” They were vital to the well-being of the British Navy . . . and thus to Britain itself.

The Navy had been getting its masts from the Baltic countries and Norway, but the masts they supplied had to be spliced, and the supply was always susceptible to disruption. The discovery of a new source of masts was enough to spur interest in settling New England. By 1623, entrepreneurs in Maine and New Hampshire were milling pine masts for British navy yards, a trade centered out of Portsmouth, New Hampshire’s “Strawberry Bank.”

After a war with the Dutch closed off British access to the Baltic in 1654, England began to rely on the Colonies to supply masts. The resulting boom in mast wood created a frenzy of cutting which threatened to decimate the old-growth trees. By 1691, the Crown had protected almost all white pines more than 24 inches in diameter at 12 inches above the ground. Surveyors marked these potential masts with the King’s Broad Arrow.

Colonists were outraged. Pine wood was valuable—very valuable—not only for masts but also for boards. Men routinely poached the pines, sawing the old-growth trunks into boards no more than 22 inches wide to get around the new laws. They also protested the restrictions, which were a real hardship in a region where wood was imperative for everything from houses to heat. They began to mutter that Parliament had no right to intrude on their private property.

In 1772, a New Hampshire official tasked with protecting the King’s Trees charged six sawmill owners with milling trunks that had been marked with the King’s Broad Arrow. One of the owners refused to pay the resulting fine. He was arrested and then released with the promise that he would provide bail the next day. Instead, the following morning he and 30 to 40 men, their faces disguised with soot, assaulted the government officials and ran them out of town. While eight of the men were later charged with assault, the local judges let them off so lightly that the punishment could easily be seen as support for their actions.

The Pine Tree Riot, as it came to be called, has often been cited as a precursor to the Boston Tea Party. The latter is the more famous occasion when New Englanders challenged royal authority, but it is worth noting that the first flag of the American Revolutionaries bore the image of a white pine in the upper left-hand corner.

Wednesday, July 20, 2011

Doris Kearns Goodwin on Learning from American Presidents

Randall Stephens

In 2008, popular historian Doris Kearns Goodwin gave a TED talk on what she had learned from studying the lives of American presidents. I had not seen this before, and only became aware of it when it popped up on Facebook via American Experience.

Most non-history majors (and perhaps even many majors) will want to know how their knowledge of history can be applied in the present. (Few young majors, to be sure, have the antiquarian gene.) This clip seems like a good way to explore the "usable past."

Tuesday, July 19, 2011

Russian History Roundup

Michael Johnson, "A Romp Through History," American Spectator, July 18, 2011

Alexander Motyl was clearly having great fun when he wrote his latest book, The Jew Who Was Ukrainian, a comic novel with half-serious historical underpinnings. It manages to amuse and challenge without losing its headlong momentum into the realm of absurdist literature.>>>

Jennifer Siegel, "A Statesman For the Czar," Wall Street Journal, July 18, 2011

No political figure looms quite as large over late imperial Russia as Sergei Iulevich Witte. Among much else, he was Russia's finance minister from 1892 to 1903; its plenipotentiary representative at the Portsmouth negotiations ending the Russo-Japanese War; the foremost instigator of the manifesto that introduced moderate parliamentary representation and united government into the autocratic empire in the wake of the 1905 Revolution; the country's first prime minister; the political and financial architect of the Trans-Siberian Railway; and the mastermind behind the stabilization of the ruble and the implementation of the gold standard, through the construction of an intricate web of foreign capital investment in the empire.>>>

Tony Halpin, "Gulags reveal awful secrets," The Australian, July 18, 2011

The journey to the heart of Joseph Stalin's reign of terror was long and arduous. Finally, hidden behind a clump of trees, the gulag emerged. This is where victims of Stalin's repressions were imprisoned as slave labourers and worked to death on the Road of Bones, the notorious Kolyma highway that connects Khandyga to the port of Magadan, in Russia's far northeast.>>>

Gary J. Bass, "Why the Crimean War Matters," New York Times, July 8, 2011

The Crimean War was the first major war to be covered by professional foreign correspondents, who reported on the disastrous blundering of commanders and the horrors of medical treatment at the battlefront. Today, we remember fragmentary stories: the charge of the Light Brigade, symbolizing the blundering; Florence Nightingale, for the medical treatment. But the real war has faded away, eclipsed by the two vastly worse world wars that were to come.>>>

Thomas Gladysz, "He Who Gets Slapped," San Francisco Chronicle, July 4, 2011

As its closing film, the San Francisco Silent Film Festival will show He Who Gets Slapped (1924). It is, in my mind, the finest "sad clown" movie you'll ever see.

He Who Gets Slapped tells the story of "HE," a disgraced intellectual forced to find work as a circus clown. His popular act consists of being repeatedly slapped by the other clowns whenever he attempts to speak even a simple truth. The crowd, which likes to laugh at the misfortunes of others, loves this bizarre and rather pathetic act.>>>

Monday, July 18, 2011

Scholarly Journals

Dan Allosso

Along with the rest of the subscribers to H-Net’s mailing list, “C19-Americanists,” I got an appeal today from the editors of Poe Studies and ESQ: A Journal of the American Renaissance. Washington State University is cutting their funding, and the editors are soliciting support. They admit that the administration’s move “is a fiscal one,” but the editors also suggest there is a bad “climate for higher ed” in Washington, and that administrators lack “awareness about the value of humanities journals,” and by implication, the humanities.

All these claims may be true. And these two journals are probably wonderful, and well worth continuing, even at the expense of cutting other WSU programs—which I assume would be necessary. But in light of the appeal's wide circulation to the mailing list, I think it raises some interesting questions beyond the immediate situation in Washington.

Before today, I had never heard of either of these journals. I’ve certainly never held either of them in my hands, or read anything from them. So I’ve got to assume the editors are appealing to people like me in hopes we’ll feel a general sense of solidarity with other Americanists or humanities folks, a sense of antagonism toward clueless administrators, or a fear that our favorite journal may be next in line.

I do subscribe to a few journals (or rather, I get them as a result of membership), but I usually don’t read them in hard copy, even the articles that really grab my attention. Allan Kulikoff’s article in the June Journal of the Historical Society was very interesting (I blogged about it here). I read it online—can’t actually find my hard-copy June issue right now, but it’s still right there if I want to refer to it again.

I’m not suggesting that the Washington journals should be discontinued. Honestly, I think it’s criminally short-sighted how little of 21st-century America’s money is invested in education. So yes, the journals should get funding and we should build fewer military drones. But it might also be a good time for those of us who support scholarly communication and refereed exchanges of ideas to ask ourselves whether the way it’s always been done is the best way to do it now.

Since I don’t think anyone is getting rich writing for or publishing academic journals (correct me if I’m wrong), it doesn’t strike me that there’s an entrenched financial interest resisting change. So the question is, can we find less costly, more effective ways to do the things that journals do for the academy? Many journals are available online as a matter of course, with no increase in cost; so it seems reasonable to assume that online publishing does not add significant expense to the publishing process. I may be wrong, but it seems like a journal’s major expenses would be staff and printing/distribution.

Obviously, you can’t have a refereed journal without referees. But there seem to be important institutional interests supporting this process of validating and professionalizing fields of study. So I suspect there will continue to be ways to get this done—and to get it paid for. The function will be preserved, so it’s the form we’re worrying about.

Is part of the problem a continuing belief in the validation our writing acquires by being printed on paper? Isn’t this belief especially redundant in the case of peer-reviewed journal articles? Would the Washington journals be able to survive in electronic-only form? I don’t know the answer to this (I emailed them the question, and I’ll let you know what they say); nor do I know whether the Washington State administrators would be willing to negotiate, if they were presented with a lower-cost option than the journals’ current budgets. I’m just suggesting that it’s time to think about these issues. As I think the Washington State journal advocates implied in their letter to the list, our favorite journal could be next.

Friday, July 15, 2011

Acknowledging Shortcomings as a Writer

Philip White

As a writer, you get to know pretty quickly what you’re naturally good at. My good lady wife is the best untrained copy editor I’ve come across and, as Erik Larson says about his other half, my “secret weapon” in writing my next book. A good friend is a grammar wizard, and what’s more (unlike me) he actually enjoys wielding his Chicago Manual of Style and Gregg Reference Manual. Another longtime buddy is the master at describing people and places, and holy crap I hate him for it!

Perhaps you also, in times of honest self-appraisal, realize what your weaknesses are. I lack the skills of the three mentioned above, and wonder how on Earth they acquired such gifts. The answer is obvious: a combination of acquired knowledge and God-given talent. It’s easy to get down on your inadequacies, to despair when staring your shortcomings in the eye, and to covet your neighbor’s grasp of the past participle!

But I’ve found that embracing what I lack and seeking assistance is actually quite a formative experience. By involving the aforementioned people in my writing and editing processes I’m submitting myself to an ongoing skills development program. And if you can check your ego enough to take constructive criticism from your spouse, you can certainly take it from any editor.

Next is to seek people who are willing to provide mentoring. I’m lucky enough to be related to one such person and seek regular guidance from my former college adviser on all matters regarding the written word. Another professor (and author of 15+ books) who’s based on the West Coast has guided me through the tangled web of the publishing industry. The keys to learning from these people? Humility, receptiveness to the opinions of people who know more than me, and a willingness to share my weaknesses openly.

Another way I am constantly trying to improve is by reading the work of writers I admire. This involves poring over all forms of books: history (Rick Atkinson, David McCullough), nonfiction (Larson, John Berendt), and historical fiction (Robert Harris, Juliet Barker), as well as keeping up on the latest features from the WSJ and magazines such as The Atlantic. The third individual I mentioned is the executive editor of a prominent culture magazine, and as he’s kindly put me on the mailing list (thanks again, Luke!), I am confronted by his brilliance once a month.

The idea here is that you become what you behold. So by focusing on those who are skilled writers, I’m hoping that I assimilate some of their powers of description, mastery of pacing, and brevity (hmmm, still working on that one, for sure).

I admit to not having arrived at a place where I can be satisfied with my writing, nor do I ever want to get to such a place. But surrounding myself with talented people who can teach me something and reading the best work in the genres I dabble in means I’m better than I was yesterday, and tomorrow will be another step along the road to “writing well.”

Thursday, July 14, 2011

Got My Degree . . . Now What?

Randall Stephens

Get a job, first and foremost. . . .

At the HS blog we've had some discussions about careers for history majors. (Here, here, and here, for instance.) We'd like to keep the conversation going. Some questions that might be worth pursuing:

* What career information resources are available for history majors?

* Is history a portable major? Can the skills learned in the discipline transfer into other fields?

* What can history majors do if they choose not to teach history or go on to grad or law school?

* Why do many prelaw students major in history?

* If the job market is so bad right now, why should an undergrad major in history rather than majoring in a preprofessional program?

A recent graduate emailed me the other day. He was having trouble finding a job that was directly related to history. It made me think that my history department should sponsor at least one lecture every year that addresses the subject head on. I talk about what history majors can do with their degrees in my classes, but perhaps it's time to go beyond that.

Fortunately, my former student's question came right as the AHA blog put up an excellent post on "Finding History Jobs Outside of the Academy." Serendipity! "Looking for a history job outside of academia? While the AHA publishes job ads (including some outside the classroom) online and in the back pages of Perspectives on History, there are many other history job listing sites online. This post draws from our Careers in Public History page, where you can find a number of links to job postings for work in museums, historical societies, state and local government, archives, and more."

For more on this, see John Fea's thoughtful blog, The Way of Improvement Leads Home. Fea regularly posts on "What Can You Do with a History Major?" See also the AHA's "What Can You Do with a Graduate Degree in History?" Other resource pages include: Stanford University, Department of History, Careers for History Majors; American University, Careers in History; University of Texas at Austin, Career Resources for History Majors; and Wright State University, Public History, Career Opportunities.

Or, if you'd prefer old-fashioned print, have a look at: Stephen Lambert and Julie DeGalan, Great Jobs for History Majors (McGraw-Hill, 2007); What to Do with Your History or Political Science Degree (Princeton Review, 2007); Katharine Brooks, You Majored in What? Mapping Your Path From Chaos to Career (Viking, 2009); Jules R. Benjamin, A Student's Guide to History (Bedford/St. Martin's, 2009).

Wednesday, July 13, 2011

Eating Our Way through History Roundup

Tim Carman, "With America Eats Tavern, Jose Andres offers bites of history," Washington Post, July 12, 2011

A few historic cookbooks from Jose Andres’s personal collection are displayed inside a hulking glass case at his new America Eats Tavern in the former Cafe Atlantico space. The chef is attempting to explain each volume — a notebook kept by George Washington’s chef, a “Chemistry of Cookery” tome that proves Harold McGee didn’t invent that field — when he can’t stand it anymore. He suddenly wraps his arms around the glass case, gives it a big bear hug and yanks it off the stand. He wants to paw through his books and actually show me what he’s talking about.>>>

Michael Knock, "History as an ingredient," Iowa City Press-Citizen, July 12, 2011

It's time for a little history. But don't worry, this is history you can really sink your teeth into.

The Johnson County Master Gardeners will host the 16th annual Taste of the Heritage Garden on July 20 at Plum Grove in Iowa City. The dinner, which runs from 5:30 to 7 p.m., offers the opportunity to learn a bit about Iowa's culinary history by letting attendees sample some of the foods and dishes our ancestors might have enjoyed.>>>

James McWilliams, "How 'Conscientious Carnivores' Ignore Meat's True Origins," Atlantic, July 12, 2011

. . . . The rationalization is that because factory farming is so horrifically brutal to animals, the conscientious carnivore can vote with his or her fork by purchasing meat from farmers who raise their animals in a more "humane" manner—free-range pork, grass-fed beef, cage-free eggs, and all that. The reality, however, is that the so-called conscientious consumers who support these alternative systems are doing very little to challenge the essence of factory farming. In fact, they may be strengthening its very foundation.>>>

Ann Treistman, "Eatymology: Our favorite summer foods, explained," Salon, July 9, 2011

Thinking about American cookery from its very roots reveals how nearly everything we eat came from Europe with settlers. It also makes very clear the elaborate -- and sometimes random -- updates and changes that have been made to these dishes. Brownies were once prepared without chocolate (is a brownie without chocolate really a brownie? you might ask). Pumpkin pie was made with rosemary, thyme and apples. Granula, a precursor to today's granola, was as hard as a rock and had to be soaked in milk before it was eaten. Biscuits went from twice-cooked pucks taken on ship journeys because they never became stale (they started out that way) to the flaky, buttery mounds we enjoy today. Peanuts for peanut butter were once boiled, not roasted. And there are dozens of variations on meatloaf; we added the ketchup and the cheese.>>>

Tuesday, July 12, 2011

San Francisco Moving Picture Time Machine

Randall Stephens

It's really hard to believe that 60 Minutes has been on the air since 1968. (In fact, you can watch original episodes in their entirety here.) This Sunday proved the show still has much to offer all these years later.

In a segment called "60 Minutes Rewind," the program turns its attention to a remastered film, an amazing, rare bit of footage shot more than 100 years ago, when two clever filmmakers mounted a camera to the front of a trolley car rattling down Market Street. The footage is astonishing. I'm thinking about using it in my fall course, the United States from Reconstruction to World War I. Like this 1848 daguerreotype of Cincinnati, the moving pictures might encourage class discussion of what we can learn about a large American city from film made so long ago. (See also how the story follows the sort of digging and investigation historians have to do.) Have a look and see what you think . . .

Monday, July 11, 2011

Key Questions for a World Civ Seminar

Bill McCoy

Today's guest post comes from my Eastern Nazarene College history department colleague Bill McCoy. Bill is a PhD candidate in African history at Boston University, where he is completing his dissertation: "To Heal the Leper: The Mbuluzi Leprosy Hospital in Swaziland, 1948 to 1982." Along with teaching non-western history, McCoy has taught courses on Europe since the middle ages, world political geography, and a Swaziland travel course on the history of missions. Here, McCoy considers something that quite a few of us probably think about: how to frame our courses with key questions in mind.

This coming Fall, I have a chance to teach a course at Eastern Nazarene College titled "Contemporary Questions." It's a seminar for first-year honors students, which will (because I am teaching it) replace their general education history requirement (in our context, a survey called The West in the World Since 1500). In the past, the course has been a replacement for the general education philosophy requirement, and in the future, it might replace a literature requirement or something else, depending on the specialty of the faculty member teaching the course.

So the class is a history class, but instead of the traditional chronological survey approach, I am building the course around significant questions for our contemporary world and then trying to help students work through the ways that history helps us answer those questions, even if the questions will not have definitive answers. In the past few weeks, I've been brainstorming the questions that will shape the course syllabus, but I'd love some input from others about this. What questions matter most in the world today? What reading material might students enjoy/get the most out of in a course such as this? To get things started, I'll offer a few examples of questions I've considered; I would love to get reactions to these and, especially, suggestions about other questions to add to the list:

* What is the role of geography and the environment in history?
* Why is there such massive economic inequality in the world?
* What are the causes of horrors like genocide?
* Why do we live in nation-states?
* Is patriotism a virtue?
* Why do so many people live in cities?
* Who makes history? Who matters in history?
* How have humans expressed themselves in the arts?

Friday, July 8, 2011

Kaboom! The Legacy of World War I

Randall Stephens

The past is never dead. It's not even disarmed.

Among the many legacies of World War I: the unexploded bombs that still litter Europe. (This is also a major inheritance of the Second World War. See the DW clip embedded here.) The Christian Science Monitor's Randy Dotinga discusses the problem in a brief review of Adam Hochschild's new book, To End All Wars: A Story of Loyalty and Rebellion, 1914-1918 (Houghton Mifflin, 2011). "I had my own encounter with an unexploded shell in 2006," says Dotinga, "during a tour of World War I battlefields and cemeteries in the region of Belgium known as Flanders. We dropped by an archaeological site in a field next to a gas station and found volunteers who'd just uncovered an unexploded German shell. You can see it in the accompanying photo. The date on the shell, which was about 20 inches long, is 1916. Holding the shell – carefully – was probably harmless. Banging it with a hammer, however, would have been a very bad idea."

In the June issue of Historically Speaking, Sean McMeekin, who teaches diplomatic history in the Department of International Relations at Bilkent University in Ankara, Turkey, considers the enormous impact of the Great War and the ongoing scholarly discussion/debate surrounding it. I excerpt a portion of that essay here.

Sean McMeekin, "Jihad-cum-Zionism-Leninism: Overthrowing the World, German-Style"

It is often said that the First World War marks a watershed in modern history. From the mobilization of armies of unfathomable size—more than 60 million men put on uniforms between 1914 and 1918—to the no less mind-boggling human cost of the conflict, both at the front and beyond it (estimated military and civilian deaths were nearly equal, at some 8 million each), the war of 1914 broke all historical precedent in the scale of its devastation. Ruling houses that had endured for centuries—the Romanov, Habsburg, and Ottoman—shook, tottered, and fell, unleashing yet more misery as these precariously assembled multiethnic empires were wracked by internecine warfare. As the war of 1914 spread beyond Europe into the Balkans and Middle East, racial and religious score-settling and reprisals led inevitably to large-scale ethnic cleansing, with millions of civilians uprooted from their ancestral homes, which most would never see again. Even the victorious Western powers, France and Britain, suffered a collapse in cultural confidence that arguably has never been repaired. After centuries of progress had brought the West to a position of unparalleled domination of global affairs, it took only four years for the whole glittering edifice of European civilization to fall apart.

If 1914–18 marked an epitaph for Old Europe, we may usefully ask: Was it murder or suicide? Popular historians have usually leaned toward the latter verdict, viewing the catastrophe of 1914 as a tragedy of miscalculation, the idea being that no European statesmen were truly guilty of intending the war, at least not the horrendous global war of attrition that it turned into. Since the Fritz Fischer debate of the 1960s professional historians have generally favored the former explanation, explaining the war’s outbreak in terms of German and/or Austrian premeditation, coming down with a verdict of, if not outright homicide, then at least civilizational manslaughter. The German decision for war in 1914, Holger Herwig writes in a recent scholarly collection on the conflict, was not quite Fischer’s aggressive and deliberate “bid for world power” but rather “a nervous, indeed panicked ‘leap into the dark’ to secure the Reich’s position of semihegemony on the Continent.” In the new “consensus” interpretation, Berlin still bears primary responsibility, no longer for premeditated imperial aggression in the sense implied by the Versailles Treaty and by Fischer, but for an impulsive preemptive strike to ward off incipient strategic decline, with further mitigation in that the Germans received a strong assist in unleashing the dogs of war from their equally panic-stricken (and equally pessimistic) Austrian allies.

This sort of moderate academic consensus is usually welcomed. Now that so few historians have a real personal or patriotic stake in the controversy (as many Germans with memories of both world wars still did in the 1960s), scholars working in the field today are spared the bitter acrimony of the Fritz Fischer years. Even on the level of practical politics, with the centennial approaching, there is now a sense of “goodbye to all that”—literally, as the last German reparations payment was finally processed in 2010!>>>

Thursday, July 7, 2011

Summer Fiction Reading

Randall Stephens

Have you noticed all the newspaper and magazine articles on summer reading? Can you take your iPad, Kindle, or Nook to the beach? (Someone must have invented a waterproof case already.)

Over at the Chronicle and the Guardian, you'll find some great suggestions.

Can't get enough of the academy? Already missing the longstanding cold wars, never-ending committee meetings, and seeing what your sartorially challenged colleagues are wearing? Do you have a taste for higher-ed schadenfreude and academic drama? Then have a look at Ms. Mentor's list of academic novels. "In real life, academicians do have flashes of wit, and they love gossip," she writes. "They're honest researchers and dedicated teachers, and some revel in committee work. They may even have romances and happily marry each other, despite their terrible fear of fun. They rarely kill anyone, even at MLA meetings. But they do love to write about themselves and about the classroom as a site for contested and resisted hermeneutical hegemonies." She highlights "Lucky Jim, by Kingsley Amis; The Lecturer's Tale and Publish and Perish, by James Hynes; Changing Places and Small World, by David Lodge; Straight Man, by Richard Russo; and Moo, by Jane Smiley," and a score of others.

Maybe historical fiction is more your bag. Over at the Guardian Andrew Miller offers up his "Top 10 Historical Novels: From Rosemary Sutcliff to Hilary Mantel, the novelist chooses his favourite books drawing on history's 'rattle-bag of wonderful stories.'" He notes: "The books listed here share the essential virtues of all good fiction: the renewal of our sense of the world, of ourselves, of language, the extension of ourselves across time and space. And how odd it would be, how dull, if novelists and readers confined themselves, in the name of some dubious notion of relevance, to the events and style of one particular period." Among others the list includes: Kepler by John Banville; Rites of Passage by William Golding; A Place of Greater Safety by Hilary Mantel; Memoirs of Hadrian by Marguerite Yourcenar; and I, Claudius by Robert Graves.

That last bit reminds me that I need to order the HBO series Rome for our library. Intrigue, violence, military campaigns, ancient occult . . . It will make for some great summer viewing.

Wednesday, July 6, 2011

Interviewing No-Nos

Philip White

British newspaper The Independent is a top-drawer broadsheet that features insightful, timely commentary from the likes of John Walsh, Adrian Hamilton and Mary Dejevsky. Its columnists typically research their op-eds well, providing ample evidence to back up their claims, and interviews are usually conducted with the subject as the focus (as should always be the case), in a manner that seems legit.

So it was with some shock that I read last week about the alleged misconduct of Johann Hari, an Independent writer who won the prestigious Orwell Prize (that he may now lose) for journalistic excellence in 2008. He is being accused of two offenses that would make even Jayson Blair blush. First, Hari supposedly (and I choose my words carefully here, because his guilt or innocence has yet to be established and he has denied the accusations) copied and pasted quotes from other sources into his interviews. Second, some say he has lifted text from authors’ written work and used these excerpts in place of quotes they gave him during interviews.

Now, this writer is in no way trying to pass judgment on Hari, particularly as I am suspicious of people being tried in the court of Twitter opinion. However, whether Hari did or did not do such things, his case brings into focus interviewing etiquette and journalistic ethics. Hari stated in his defense that:

When you interview a writer—especially but not only when English isn't their first language – they will sometimes make a point that sounds clear when you hear it, but turns out to be incomprehensible or confusing on the page. In those instances, I have sometimes substituted a passage they have written or said more clearly elsewhere on the same subject for what they said to me, so the reader understands their point as clearly as possible.

Is this acceptable practice, or taking the interviewer’s attempt to provide clarity to his audience too far? I will let you be the judge. Meanwhile, here’s a list of no-nos accrued over a decade’s worth of my interviews, as well as from journalism professors and esteemed colleagues and mentors. Not all of these are ethics-related, and some may seem obvious, but each is something I share with college undergrads when teaching them how to interview (and how not to).

Don’t Be Late: In an age of smartphones and iPads, the wristwatch may soon become passé, but it’s still worth wearing one to make sure you’re on time for interviews. Showing up late to an in-person conversation or a “phoner” makes it seem that you believe your time is more valuable—it isn’t! (Confession time—I still struggle with American time zones, despite having lived in the U.S. for almost 10 years, so I always make sure to ask the interviewee or publicist to confirm the time zone, as well, so I don’t make the embarrassing “Oh, sorry, I thought we said Pacific time” mistake.)

Don’t Phone it In: You cannot go into an interview cold and expect to get anything meaningful from it. Not even the greats—Larry King, Ed Murrow, et al.—would’ve relied on their interviewing prowess and walked into a Q&A unprepared. The more a person is interviewed, the more tired they will become of dull, unimaginative, and generic questions: “Did you enjoy writing the book?” is not something you need to waste time on. Check other recent interviews with that person, think up five to 10 original questions, and write them down (that last part is too often forgotten). Pack extra batteries for your voice recorder, check that your iPhone/iPod is charged, double-check the venue, and always take too many pens (if you’ve seen the Russell Crowe journalism flick State of Play, you’ll know what I mean).

Don’t Fill in the Gaps Yourself: Is your interviewee a mumbler, or are you on a bad line so you couldn’t quite hear the end of a response? Are they talking in riddles, or an Elven language? It’s better to ask, “Could you repeat that, please?” or “Can you explain what . . . means?” rather than stumble over your transcription later and try to fill in the blanks of what you thought was said or meant. If after typing up your notes you’re still unsure, try to follow up with the interviewee/their PR rep via e-mail for clarification.

Don’t Let the Recorder Do all the Work: Last week, HS blog contributing editor Heather Cox Richardson used the phrase “engaging with the text.” When I am conducting an interview, writing notes (formerly in a reporter’s notebook, now on a tablet) while recording the conversation does just that—connects me in a tactile way to the subject. With practice you’ll be able to keep eye contact and observe visual cues during in-person interviews while scribbling away. Mastery of shorthand, whether your own version or a traditional method, is also helpful. Then there’s the disaster-planning reason for combining recording and note-taking—if your iPhone/infernal voice recorder fails you, you’ll have backup.

Don’t Force your Interviewee into a Comment: Even if you’re an investigative reporter, you cannot force an interviewee to cough up information (unless you live in a country that supports advanced interrogation techniques and you work for the state propaganda mouthpiece). So, if your subject refuses to answer a question and you’ve asked it another way without result, don’t step over a line and try to put words in their mouth. Or you may have a libel suit on your hands, not to mention losing the chance to interview that person again.

Don’t Make Your Writing the Focus: If you’re writing a feature, remember that your reader cares very little about your ability to craft fancy motifs and very much about what your subject has to say and who they are. So get out of the story’s way, already. Ask questions that allow your interviewee to tell their story, in their words, in a way that’s more compelling than any “look at me, I went to journalism grad school” fireworks.

Don’t Rush Transcription: When you’re on deadline, it’s tempting to rush the transcription process. Is it always fun to make sure you got every utterance down verbatim, particularly if it was a long interview? No, but it is always worth it. The interviewee did you the courtesy of sparing their time and entrusting you with their words. You’re duty-bound to represent them (and the publication you’re working for, even if it’s just your blog) accurately, so make some coffee, put on your earbuds, and take as long as it takes for an accurate transcription.

Don’t ‘Borrow’ from Other Interviewers’ Work/The Interviewees’ Work: See the intro!

Tuesday, July 5, 2011

Bloody Sundays

Dan Allosso

I was driving home from work the other day, listening to music instead of audiobooks for a change; randomly playing the “top-rated songs” from my iPod. It’s surprising how songs you forgot you’d loved show up there from time to time. I found myself driving along to the snare-drum beat of U2’s 1983 hit, “Sunday Bloody Sunday,” which got me thinking about the events behind the song and about popular protest.

The U2 song refers to the 1972 shooting of unarmed protestors in Derry, Northern Ireland. But the name Bloody Sunday has also been used to describe the reaction to the first Selma March (1965), police violence against unemployed protestors in Vancouver, BC (1938), a 1905 St. Petersburg massacre that helped spark the Russian Revolution of that year, and two other days of violence in the Irish conflict (1920, 1921). The original Bloody Sunday was a November 1887 demonstration in London that was routed by the Army and Metropolitan Police.

The issues that led to 1887’s Bloody Sunday included the 1885-86 Irish Coercion Acts, but the demonstration was really a culmination of tensions brought on by England’s “Long Depression” of the 1870s. East Londoners had been demonstrating against unemployment and poverty in their section of London for several years. The difference in November 1887 was that they took their protests westward, to Trafalgar Square.

The 1887 East End protestors were not the first to march on London’s centers of power. In 1848, Chartists had planned to march on Parliament, and had been turned away only when the Duke of Wellington placed cannons on Westminster Bridge over the Thames. But less than a generation later, Britain’s Reform League led demonstrations that brought hundreds of thousands of protestors to Trafalgar Square and Hyde Park in 1866, and to the Agricultural Hall and Hyde Park again in 1867. These protests had been declared illegal by government authorities. So how did the reformers “get away with” these massive demonstrations?

The leaders of the Reform League demonstrations had learned from the 1848 intimidation of the Chartists. Charles Bradlaugh was an East Londoner who had been beaten by police at age fifteen, while leading an East London sympathy demonstration on the day the Chartists surrendered to Wellington. Bradlaugh was subsequently posted to County Cork with the British Army, where participating in the eviction of starving Irish peasants helped to radicalize him, and where he learned cavalry tactics. Bradlaugh insisted on public protests, against the wishes of many of his fellow Reform League leaders, and personally led them with the military discipline he had learned in Ireland.

Some 200,000 Londoners marched on Hyde Park in July 1866. Home Secretary Spencer Walpole had outlawed the protest a few days earlier and threatened military action. Bradlaugh and his radical allies in the Reform League leadership declared the government’s position unconstitutional, and announced they would challenge Walpole’s illegal attempt to bar peaceful, free assembly. Of course, Bradlaugh realized that in order to be legal, the protest would have to be peaceful. And aside from the famous destruction of some railings around the Marble Arch gate, where a troop of 1,600 police and soldiers prevented marchers led by a carriage of Reform League executives from entering, the demonstration was disciplined and nonviolent. Bradlaugh wasn’t at the railings; he was leading another column of protestors across Knightsbridge toward Hyde Park at the time.

To make a long story short, Spencer Walpole resigned in disgrace, and the Reform League’s demonstrations established the right of Londoners to march, occupy public parks and squares, and demonstrate for their political rights. In 1867, the Second Reform Act passed, extending voting rights to working-class men for the first time in British history. And yet, twenty years later, Bloody Sunday. What had changed?

In 1887, Charles Bradlaugh was not at the head of the demonstration. He was exhausted and ill, following a six-year battle to take the seat in Parliament he’d been elected to in 1880. And he had doubts about the programs proposed by the Social Democratic Federation, which was sponsoring the protest. Bradlaugh advised readers of his National Reformer to stay away, and warned SDF and Fabian Society leaders to be cautious and consider the security of their people. Believing the precedent set by the Reform League had established their rights once and for all, Annie Besant, George Bernard Shaw, and the other leaders of the demonstration defied the government and marched. The results were disastrous.

Marchers, men, women, and children alike, were beaten by London police. Over two hundred people were hospitalized. Three were killed. Infantry and cavalry troops were on hand with fixed bayonets, but luckily they were not called into action. Several of the march’s leaders were arrested and held behind bars for six weeks.

It’s difficult to pinpoint all the factors that had changed between 1866 and 1887. There was a new government, with new people in key positions—although in both cases the Tories were in power. Chicago’s Haymarket riot of 1886 was fresh in the minds of both the protestors and the government. The radicals who had been united behind the Reform League had split into factions. The socialists who seemed to be gaining ground in London political circles offered utopian ideals but had little connection with actual working people (the Independent Labour Party was founded in 1893, partly in response to these issues). But it’s possible that the key difference was the 1887 leaders’ belief that their right to protest, established twenty years earlier, was inviolable.

The lesson of Bloody Sunday, it seems to me, is that we can’t take our rights for granted. If we are not prepared to defend them, they can be taken from us. Like Charles Bradlaugh and his fellow reformers, we have to be resolute, disciplined, and nonviolent—but prepared to defend ourselves. It’s not insignificant that when Bradlaugh died four years after Bloody Sunday, one of the mourners at his simple secular funeral was the twenty-one-year-old Mohandas Gandhi.

Monday, July 4, 2011

4th of July Roundup

Jim Cullen, "The Declaration of Independence and the American Dream," HNN, June 29, 2011

"America is a young country," people sometimes say. What they really seem to mean is: "the United States is a young nation." Such a statement makes some sense if one thinks of a political entity that came into existence circa July 4, 1776. It makes less sense when one considers that the Constitution that followed about a dozen years later is the oldest written functioning one in the world. Compared with a traditional nation-state like France or Spain, sure—the United States is a young nation, even if France has had a few republics since then.>>>

"American civil war re-enactment in South Yorkshire - in pictures," Guardian, July 4, 2011

Enthusiasts from all walks of life took part in re-enacting scenes from the American Civil War and in 'living history' events in the grounds of Cusworth Hall, near Doncaster, which are as authentic as possible.>>>

Peter Rothberg, "What Is Patriotism," Nation, July 1, 2011

The first sentence of The Nation's prospectus, dated July 6, 1865, promised "the maintenance and diffusion of true democratic principles in society and government," surely a patriotic sentiment, as was the magazine's name.

Since that time The Nation has attempted to represent and give voice to the best of American values and culture and has steadfastly resisted all efforts through the years to brand dissent as unpatriotic.>>>

Amy Bingham, "Almost a Fourth of Americans Do Not Know When the U.S. Declared Independence," ABC News, July 4, 2011

American Fourth of July traditions are tightly woven into the fabric of U.S. society, but the history of the country’s independence seems to have slipped through the seams.

A Marist poll released Friday shows that only 58 percent of Americans know when the country declared independence. Nearly a fourth of respondents said they were unsure and sixteen percent said a date other than 1776, when the Declaration of Independence was signed.>>>

Brian Handwerk, "Fourth of July: Nine Myths Debunked," National Geographic, July 4, 2011

Many time-honored patriotic tales turn out to be more fiction than fact. On the Fourth of July—today marked by a continent-spanning Google doodle—here's a look at some memorable myths from the birth of the United States.>>>