Tuesday, February 28, 2012
Documentary Films and History Features Roundup
The Amish, American Experience (2012)
An intimate portrait of contemporary Amish faith and life, this film examines how such a closed and communal culture has thrived within one of the most open, individualistic societies on earth. What does the future hold for a community whose existence is so rooted in the past? And what does our fascination with the Amish say about deep American values?
Clinton, American Experience (2012)
The biography of a president who rose from a broken childhood in Arkansas to become one of the most successful politicians in modern American history, and one of the most complex and conflicted characters to ever stride across the public stage.
Melvyn Bragg on Class and Culture, BBC Two
Melvyn Bragg explores the relationship between class and culture from 1911 to 2011 in a new, three-part series.
Into the White (2012)
Into the White is an anti-war movie. High above the harsh Norwegian wilderness, English and German pilots shoot each other to the ground after a violent chance encounter. Isolated, they must fight to survive the brutal winter. Though war has made them enemies, antagonism is hard to maintain as days go by. Through mutual need, unlikely friendships bloom. Somehow, they become comrades. War, after all, is absurd.
Nicolaus Mills, "Why 'Downton Abbey' is a hit in America," CNN, February 25, 2012
Americans love a period drama, and they dote on the British aristocracy. That's the way the popularity of "Downton Abbey," the British television series that drew 5.4 million viewers for the finale of its second season on PBS, is being explained these days.
It's an explanation that reflects our television history. In the 1970s, the British series "Upstairs Downstairs" was wildly popular on PBS. "Downton Abbey," which this season took place during and after World War I, covers much of the same social territory in following the trials of the fictional Crawley family, headed by Robert, the earl of Grantham.
Monday, February 27, 2012
Capitalism and Colonialism
When I was reading for my US History oral exams, one of the historiographical arguments that really got my attention was the long-running debate over the market transition. The question of when America made the turn from being an agrarian, egalitarian society to becoming a commercial, class society fascinated me; and so did the heated disagreements of eminent historians. As I read more, I realized that a lot of the argument really had to do with the definition and grouping of these terms (as Michael Merrill brilliantly pointed out in a 1995 article called “Putting Capitalism in its Place”). Were Joyce Appleby and Christopher Clark (not to mention Allan Kulikoff or Winifred Rothenberg!) even talking about the same thing when they used the words capitalism, market, commerce, and agrarian? Did “agrarian” naturally line up against “commerce,” and did either side really own the moral high ground?
Now I’m teaching Honors US History to undergrads. Clearly it wouldn’t be appropriate to expose them to the full glare of this debate. It would not only take too long to do, but it would be drilling too deep in even an Honors general education class for non-history majors. But I don’t want to cruise through this moment in history without mentioning it – I’m trying to challenge these students to think critically, so it’s my job to bring up the complex issues the textbook buries.
I had them read a couple of chapters of Matthew Parker’s 2011 book The Sugar Barons. Parker writes about Barbados in the early decades of its sugar revolution, the 1630s and 40s. He includes a detailed description of the introduction of slaves into the British sugar economy, through an interesting series of highly conflicted excerpts from the memoirs of English observers. A really valuable addition, from my perspective, was Parker’s extensive use of letters between several Barbados planters and merchants and John Winthrop, Governor of the City on the Hill.
The direct connection between Boston and the West Indies is useful, I think. Unlike Virginia or the New Netherlands or the Spanish colonies, which are usually presented to students as business ventures, the New England colonies are often portrayed as the seat of . . . something different. Something exceptional. The early link between Boston and Barbados, the Winthrop family’s business interests in the Caribbean, and the close connection that developed during the English Civil War, when Barbados became a principal market for New England produce, are all important challenges to the idea that there was ever a clean separation between commerce and colonies.
This is not to say that the type of agrarian anti-capitalism described by historians like Kulikoff never existed. But perhaps it suggests that when such sentiments developed, they were reactions to a colonial system built on a very problematic type of commerce rather than attempts to claim that a naïve, pre-commercial yeomanry had ever existed in America. From this perspective, even the earliest “agrarian” documents like Jefferson’s Notes on the State of Virginia seem to share something with writings of back-to-the-land idealists of the 19th, 20th, and 21st centuries.
Friday, February 24, 2012
Submarine History
This infographic from the BBC is just too good to pass up.
It’s fun for a Friday, in any case, just to take a look at ocean life at different depths. But the historians out there—especially the historians of science—will want to scroll all the way down to the bottom, where there is an interview with Don Walsh, one of the two men ever to travel to the bottom of the deepest place in the world: the Mariana Trench.
The US Navy bathyscaphe Trieste descended to the bottom of the Trench on January 23, 1960, making the deepest dive of Project Nekton, an effort designed to launch careful study of the deep sea. Those plans ran aground, though, as the U.S. looked toward space exploration rather than oceanic studies. Other countries have continued to invest heavily in ocean research—Denmark’s Galathea 3 Project drew international attention in 2006-2007, for example—but the focus of American popular interest has drifted elsewhere.
While Walsh notes that there has never been another manned expedition to the Trench, there are at least three plans underway right now to send crews to the deepest part of the ocean.
None of the groups planning the descents are scientific crews: they are a marine consulting company, a company that makes private submarines, and Richard Branson’s Virgin Oceanic group, which plans to offer undersea adventures.
Thursday, February 23, 2012
The Battle of Olustee, February 20, 1864
Monday was the anniversary of the Battle of Olustee, the largest Civil War battle fought in Florida. Although few people today have even heard of it, Olustee was crucial in convincing Civil War era Americans to accept black freedom.
On February 7, 1864, Federal troops landed in Jacksonville. Carving Florida off from the rest of the Confederacy had several obvious advantages. First, the Confederacy was hurting for food, especially cattle. When the Union took the Mississippi River, it cut off the Texas herds from the rest of the South. The cattle herds in Georgia, Alabama, North Carolina, and South Carolina could not fill the gap. If the Union could cut the lines for moving the Florida cattle that still fed the South in 1864, it would be closer to starving the South into submission. One general estimated in 1864 that 20,000 head of cattle and 10,000 hogs a year went from Florida to feed the Southern armies.
President Lincoln also wanted to reorganize Florida out from under the Confederacy as a free state much as he was trying to do in Louisiana. Opponents carped that he was trying to get Florida back into Congress so he could count on more electoral votes in the 1864 election, although there were obvious reasons to want Florida back on the Union side even without the president’s reelection fight looming on the horizon.
Finally, an excursion into Florida promised to attract black recruits to fight for the Union. And in 1864, new soldiers would be quite welcome additions to the battle-thinned Union ranks.
Brigadier General Truman Seymour, the head of the expedition, had strict orders not to move far from Jacksonville. Instead, Union troops under Colonel Guy V. Henry of the Fortieth Massachusetts mounted quick raids that destroyed supplies and reconnoitered the Confederate army. Their operations among the poor and dispirited people were successful and relatively painless: they suffered few losses.
It was perhaps the ease of the raiding up to that point that made General Seymour decide on February 17 to march his 5,500 men 100 miles west to destroy the railroad bridge over the Suwannee River. Seymour did not know that Confederate officers had surmised the danger to Florida and had moved troops quickly to prevent Union forces from gaining a foothold in the interior. Five thousand Confederates under Brigadier General Joseph Finegan were encamped on the road Seymour’s men would take, near the railroad station at Olustee, about fifty miles from Jacksonville.
When the two armies came together in mid-afternoon on February 20, Seymour threw his men in without much forethought, apparently believing he was up against the same ragtag fighters Henry had been smashing for weeks. But Finegan’s men were experienced troops. They trained their cannons and held their ground. The Union lost more than 1,800 men to the Confederacy’s 950. Most of the surviving Union soldiers fled the field and hightailed it down the road back to Jacksonville.[1]
The Union rout did not turn into a panic, thanks only to the remnants of the Massachusetts 54th and the 35th U.S. Colored Troops, which held the Confederates back to cover the retreat. The soldiers of the Massachusetts 54th had earned their reputation for bravery in the assault on Fort Wagner, near Charleston, South Carolina, the previous July. At Olustee, the black soldiers of the 54th and the 35th held their ground until past dark, enabling the white troops to get safely out of range, before they received their orders to move back toward Jacksonville.
Few people now remember Florida’s major Civil War battle, but it made a searing impression on President Lincoln. “There have been men who have proposed to me to return to slavery the black warriors of Port Hudson & Olustee to their masters to conciliate the South,” he told visitors in August 1864. “I should be damned in time & in eternity for so doing.” (AL to Alexander W. Randall and Joseph T. Mills, August 19, 1864, in Roy P. Basler, ed., Collected Works of Abraham Lincoln, vol. 7, p. 507.)
___________
[1] The New York Times noted that Colonel Henry had three horses shot out from under him during the battle but was himself unhurt. His luck would not hold. Henry continued to serve in the army until 1892. He fought in the Apache campaign before joining the Sioux Wars. He was shot in the face at the Battle of the Rosebud, losing part of his cheek and one eye. He later led the Ninth Cavalry in the events surrounding the Ghost Dance and the Wounded Knee Massacre.
Tuesday, February 21, 2012
Winston Churchill and the New Digital “Iron Curtain”
March 5th will mark the 66th anniversary of Winston Churchill’s “Sinews of Peace” address, better known as the “Iron Curtain speech,” delivered in a gymnasium at Westminster College in tiny Fulton, Missouri. There, Churchill offered the epoch-defining view of the division between the Communist “Soviet sphere” and the democratic West, the memorable (and now almost overused) characterization of the Anglo-American partnership as the “special relationship,” and a word-perfect exhortation to the principles of freedom and liberty.
But all these years later, with the USSR no more, do Churchill’s words still ring true?
In searching for an answer, one need look no further than the recent censorship actions of another Communist regime, North Korea. Following the death of Supreme Leader Kim Jong-Il, the Pyongyang authorities declared that anyone caught using a mobile phone during the state-ordered 100-day mourning period would be punished as a war criminal. Similarly, during the recent crackdown in Syria, the tech minions of Bashar al-Assad used a “kill switch” to cut the country’s embattled citizens off from the web – the same tactic used by the panicking regime in Egypt during its last days. Meanwhile, Iran tried to close down all social networking sites to prevent protest organizers from spreading the word.
And how does this relate to Churchill, a technophobe who, after all, denied Westminster College president Franc “Bullet” McCluer’s request to broadcast the Iron Curtain speech by TV, telling him "I deprecate complicating the matter with technical experiments”?
One of Churchill’s reasons for using the “iron curtain” metaphor was that Stalin’s cronies were preventing media access to Poland, Yugoslavia, and other countries struggling under the Red Army’s jackboots. Despite the Marshal’s feigned support for “free and unfettered elections” in the Yalta Declaration, diplomats from Britain, America, and elsewhere were, just weeks later, followed and harassed, and some expelled. Stalin had rung down this solid metal curtain to prevent reports of his puppets’ malfeasance from leaking out, and to keep his new subjects and their tales of woe in.
The modus operandi of the new dictatorships is different, but the spirit is the same. Essentially, the people of Syria, North Korea, and Iran (not to mention China, which also restricts internet use) are living behind a virtual iron curtain, every bit as oppressed as their predecessors in the USSR. And while bright minds in these countries are jury-rigging internet connections via old fax machines and (for those with the resources) satellite phones, and Twitter gives Iranian dissidents a platform to expose the Revolutionary Guard’s brutality, we still need a Churchill to enunciate their plight on the world stage.
In addition, our leaders must be forthright not only in explaining the inherent wickedness of totalitarian rule, but also in defending the principles we are privileged to enjoy in a democracy: the rule of law; freedom of religion, expression, and the ballot box; and the chance to advance ourselves without the backhanders and corruption that are rampant in a police state. Too often we take these great pillars of liberty for granted, or we fear that praising them will make us sound like self-righteous imperialists. Churchill knew that this was not so: it is only by confessing our creed that we can hope to perpetuate it and, by putting it into practice through strong diplomacy, to help others who find themselves under the dictator’s yoke to obtain it. As he said at Fulton, “We must never cease to proclaim in fearless tones the great principles of freedom and the rights of man which are the joint inheritance of the English-speaking world.” Just as true now as it was 66 years ago.
Monday, February 20, 2012
Board Games, Capitalism, and Piracy
It’s fairly widely known that the game Monopoly was developed in America around the turn of the twentieth century to illustrate the evils of land monopoly. Rising prices, especially in the cities, in the 1870s and 1880s brought fortunes to a lucky few and misery to many. Ideas for negotiating this rough transition to a modern economy sprouted from all sorts of fertile minds, but few held the popularity accorded to Henry George’s Single Tax plan. George had lived in both California and New York during land booms and argued that land values rose through public development, rather than through individual enterprise. To restore equality, he argued, the government should take this unearned wealth back from the pockets into which it fell by taxing the value of the land.
This seemed a remarkably easy way to address the problem of growing inequality. Henry George clubs sprang up across the U.S. and even spread to Europe. George came close to winning the mayoralty of New York City in 1886 (he won more votes than newcomer Theodore Roosevelt). And Elizabeth Magie invented The Landlord’s Game, Monopoly’s forerunner, to explain the principles of land monopoly to potential Single Tax acolytes.
As anyone who has endured a rainy afternoon as a child knows, playing Monopoly was also a brutal lesson in the harshest form of capitalism. Invariably, one player emerged early as the canniest trader, or was lucky enough to capture Boardwalk and Park Place. S/he would slowly bleed the rest of the players dry over the long, painful course of hours. The only real option for a losing player was to rob the bank (something that, sadly, I didn’t figure out until I watched my children play the game). As someone said to me today, a young loser did not figure out the game was rigged, but rather assumed s/he was just bad at the game.
The structure of “land monopoly” and the internalization of failure, of course, were what Henry George’s followers were trying to highlight.
In contrast to the long, slow death of Monopoly stands the original Pirateer, a game that took the toy world by storm in 1994. It was produced independently, and very briefly, by the Mendocino Game Company. In 1996, it won the Mensa Select Award for board games. In Pirateer, four gangs of pirates compete to collect a treasure from the island at the center of the board. They must then get it back to their own harbor before their ships are sunk by the other pirates, tacking according to wind patterns and the roll of the dice. It is a rollicking game, essentially a free-for-all, but one bounded by natural laws (the wind), by limited elements of luck (the roll of the dice), and by a player’s strategic skill.
Crucially, anyone can win Pirateer right up to the very last play of the game. A clever four-year-old who sees the patterns of the board differently than his opponents can beat a seasoned player. No one can have a lucky break that determines the entire course of the game. Everyone stays enthusiastic. No one gets an early advantage that means success four painful hours later. And the resentments at the end of Pirateer are correspondingly minor compared with those after Monopoly.
The contrast between these games hit me today when someone suggested that the true secret to the success of capital accumulation was protecting goods from piracy. The discussion was of the 1400s and the importance of walled cities, but it seems to me to hold true for colonial settlements in America, and even for modern-day attempts to regulate the internet.
Monopoly and Pirateer. Worth thinking about.
Friday, February 17, 2012
Pardoning Alan Turing
Last week the British House of Lords declined to pardon Alan Turing for the crime of being gay. Convicted of indecency in 1952, Turing chose chemical castration rather than a prison term. Two years later, he killed himself by ingesting cyanide. Perhaps not ironically—the symbolism was almost certainly intentional for so brilliant a man—he delivered the poison in an apple.
Alan Turing is widely considered to be the father of the modern computer. He was a key figure in Britain’s World War II code-breaking center at Bletchley Park, inventing a machine that could break ciphers, including the difficult German Enigma codes. After the war, he continued to work in the world of artificial intelligence. Engineers still use the Turing Test to judge a machine’s ability to show intelligent behavior.
In 2009, Prime Minister Gordon Brown made a formal apology to Dr. Turing. Noting that the brilliant scientist had truly helped to turn the tide of war, Brown called it “horrifying” that he was treated “so inhumanely.” “While Turing was dealt with under the law of the time, and we can't put the clock back, his treatment was of course utterly unfair, and I am pleased to have the chance to say how deeply sorry I and we all are for what happened to him,” Brown said. “So on behalf of the British government, and all those who live freely thanks to Alan's work, I am very proud to say: we're sorry. You deserved so much better.”
The formal apology was followed by an on-line petition asking British government officials to pardon Turing. By February 2012, 23,000 people had signed it. Last week, the Justice Minister declined to do as they asked. “A posthumous pardon was not considered appropriate as Alan Turing was properly convicted of what at the time was a criminal offence,” he explained.
Never shy about his defense of gay rights, columnist Dan Savage compared the conviction of Turing to the conviction of a Swiss man who also broke a law we now find appalling. In 1942, Jakob Spirig helped Jewish refugees from Germany cross into Switzerland, and he was sent to prison for his crime. In January 2004, the Swiss government pardoned Spirig, along with all the other people convicted of helping refugees escape Nazi Germany. Savage asked the House of Lords: “Did the Swiss government err when it pardoned Jakob Spirig? Or did you err by not pardoning Alan Turing?”
Much though I hate to disagree with Dan Savage, who could rest on his laurels for the It Gets Better Project alone, I’m not a fan of pardoning people who have committed the crime of being human under inhumane laws. This describes Turing. He doesn’t need a pardon; the society that made him a criminal does. As the Justice Minister went on to explain: “It is tragic that Alan Turing was convicted of an offence which now seems both cruel and absurd, particularly... given his outstanding contribution to the war effort…. However, the law at the time required a prosecution and, as such, long-standing policy has been to accept that such convictions took place and, rather than trying to alter the historical context and to put right what cannot be put right, ensure instead that we never again return to those times.”
An apology is appropriate; a pardon is not.
Some things can never be put right. Pardoning a dead victim for the crime of being hated is a gift to the present, not the past. It lets modern-day people off the hook. They can be comfortable in their own righteousness, concluding that today’s injustices have nothing to do with such right-thinking people as they are. But they do. Laws reflect a society, and the ones that turned Turing and Spirig into criminals implicated not just their homophobic or pro-Nazi fellow citizens, but all of the members of their society who accepted those laws. A pardon in a case like Turing’s is a Get Out of Jail Free card not for him, but for us.
It’s way too late to pardon Alan Turing. And it’s way too early to pardon ourselves.
Wednesday, February 15, 2012
Tuesday, February 14, 2012
Civil War Soldiers and Conversation Hearts
Valentine’s Day is synonymous with Sweethearts Conversation Hearts, those heart-shaped sugar candies with the messages on them: “Be Mine,” “I’m Yours,” and now, “E-mail Me.” The hearts are made in Revere, Massachusetts, at the New England Confectionery Company. Necco makes them from late February through mid-January of the following year for the Valentine’s Day market. They manufacture about 100,000 pounds a day, making more than 8 billion of the hearts every year.
Conversation hearts are the holiday version of the perennial Necco Wafers. Rumor—in the shape of an NPR story, a Boston Globe story, and hundreds of references on the internet—says that Union soldiers carried Necco wafers, which were called Hub Wafers in those days, in their haversacks. To write a lecture on the growth of industry during the Civil War, I set out to verify that little tidbit.
It was harder than you would think.
The histories on the Necco website and the Reference For Business website explain that Necco is the oldest candy company in America. In its earliest incarnation, it was founded in 1847 by an English immigrant to Boston, Oliver B. Chase, and his brother. Chase invented a device that would cut a simple candy, made of sugar, gelatin, and flavoring, into wafers. These early candies were called “Hub Wafers” (“the Hub” was the nineteenth-century nickname for Boston). They were surprisingly popular, since they were cheap, durable, and kept for a very long time. That made them easy to ship and to carry, and thus a likely thing for Civil War soldiers to take with them on their long marches.
But these histories don’t mention the Civil War. To solve my problem, I decided to try the company directly. Telephone calls to Necco headquarters netted me only answering machines, and an email to the company went unanswered.
So I turned to Civil War reenactors. Surely, in their quest for accuracy, they would know whether or not Union soldiers had carried Hub Wafers. Two hours of trawling through reenacting outfitters, newsletters, and educational websites about what Civil War soldiers carried netted me fascinating glimpses of army life (the US Army Center of Military History has a great website on haversacks and mess gear), but nothing on Hub Wafers. A telephone call to a reenactor yielded a fun chat, but no information on Hub Wafers. He had never heard of them.
Turning to primary sources on-line offered more information. In fact, Hub Wafers—or Necco Wafers, as they were sometimes known by the twentieth century, since both words were on the wrapper—seem to have become very visible and very popular in 1913-1915. Both advertising images and original wrappers are available from this era, some for purchase. The ads promise apothecary owners that the wafers will increase profits; they assure mothers that the candies are healthy and a good addition to afternoon tea. It turns out that this timing makes sense. According to the company history, Necco launched an aggressive advertising campaign in 1912; and in 1913, made much of the fact that Arctic explorer Donald MacMillan took Necco Wafers along on his journey, “using them for nutrition and as rewards to Eskimo children.”
By the 1930s, the candies had a huge following. During the Depression, they were cheap enough to be a treat, and well-loved enough that Admiral Richard Byrd took two tons of Necco Wafers on his own polar expedition. By the time of Pearl Harbor, they were the nation’s premier portable candy: the US government requisitioned the entire output of the Necco factory during WWII.
I never did find a historical source for the information that Civil War soldiers carried Hub Wafers. My guess is that some Boston boys did, as they would have carried other foods from home, but that it was hardly systematic. Instead, my search taught me that before modern techniques of refrigeration enabled people to keep chocolate from melting and spoiling, portability was such an important quality in a sweet that Necco was able to turn its wafers into the nation’s leading candy. While eclipsed now by newer products, Necco Wafers still exist, sold in a wrapper almost identical to that of 100 years ago.
Not bad for a candy that may—or may not—have been carried by Civil War soldiers.
Monday, February 13, 2012
The Malleability of “the Law”
Are Supreme Court decisions important? Most intellectuals, lawyers, teachers, professors, and anyone with an eighth-grade education would exasperatedly answer, “Of course! What a stupid question.” The Supreme Court hands down “the law of the land.” In Dred Scott v. Sandford (1857), the Supreme Court fueled the tension between North and South over the fugitive slave law. Minor v. Happersett (1875) held that women are citizens, but that citizenship does not mean suffrage. Brown v. Board of Education (1954) declared segregation in American public schools unconstitutional. Finally, Citizens United v. Federal Election Commission (2010) opened the floodgates for political contributions and significantly changed the nature of American politics. These cases have had an important impact on our society—increasing tension (and even war) between branches, parties, regions, and social classes—or have they?
The fundamental question lurking below the surface is: Are these decisions truly that significant in themselves, or is their significance historically constructed? In other words, are Supreme Court decisions an end in themselves, or are they a means to an end crafted not by the Court, but by historians, lawyers, academics, or historical figures seeking to advance their own agendas? In studying the “history” of such decisions, and how intellectuals and public figures use them for their own purposes, one arrives at the uncomfortable notion that the law is malleable for those who choose to pursue a political agenda.
2003 was the bicentennial of Marbury v. Madison (1803), the fundamental Supreme Court case that set in motion “judicial review” and established the legitimacy of the Court. Or so constitutional scholars tell us. In an article titled “The Rhetorical Uses of Marbury v. Madison: The Emergence of a ‘Great Case,’” Davison Douglas, dean of the William & Mary law school and professor of law, tracked the evolution of the decision and pointed out that “between 1803 and 1887, the Court never once cited Marbury for the proposition of judicial review” (376). How is it that such a “landmark case” was not mentioned or invoked by the nineteenth-century Court?
Douglas contended that only when the Court struck down the income tax as unconstitutional in Pollock v. Farmers’ Loan & Trust Co. (1895) did defenders of the Court’s decision make Marbury relevant. Douglas wrote, “In the struggle to defend the Court’s actions, judicial review enthusiasts elevated the Marbury decision—and Chief Justice Marshall—to icon status to fend off attacks that the Court had acted in an unwarranted fashion” (377). Douglas then went on to outline how politicians, populists, and other figures trashed the Court’s decision, and also how the Warren Court used Marbury as a shield against segregationists and as a sword to assert its power (409). Thus, the Marbury decision was not an end in itself, but a means to an end.
It is common knowledge that the Plessy v. Ferguson (1896) decision espoused the “separate but equal” doctrine. American history textbooks, law books, and historians overwhelmingly cite it as a landmark decision. On May 19, 1896, the day following the decision, The New York Times, the nation’s leading paper, reported of this monumental case, “No. 210 Homer Adolph Plessy vs. J.H. Ferguson, Judge, &c. In error to the Supreme Court of Louisiana. Judgment affirmed, with costs.” For those living at the time, the Plessy decision was simply not that important. The South did not use the Court’s decision to implement Jim Crow, nor did the decision legalize the Jim Crow laws; Jim Crow came long before the 1896 decision. The New York Times did not mention the case again until the 1950s, when Brown v. Board of Education (1954) was making its way through the courts.
If one examines the history and evolution of a Supreme Court decision, one finds that decisions in themselves are not ends, but instead, a means to an end for others not on the Court. This is evident in examining the context of our major court cases and tracking their usage across time. Historians, from generation to generation, construct, tear down, and then reconstruct a narrative. While “deconstruction” has had its moment in the sun, it nevertheless has some validity for history: the narrative changes from generation to generation with those who interpret the past. If history is malleable, then the law is malleable, and that is a scary concept.
Thursday, February 9, 2012
Roundup: Exhibits, Preservation, and a Stuffed Horse
Steve Kastenbaum, "History amended by earliest recording of sound," CNN, February 8, 2012
(CNN) -- Thomas Edison came up with a way to play back recorded sound in 1878. But 20 years before the inventor patented the phonograph, French scientist Édouard-Léon Scott de Martinville was fiddling around in his laboratory trying to come up with a way to record sound. His invention, the phonautograph, enabled him to create a visual representation of his voice.>>>
Christine Legere, "1912 capsule opens Old Abington tricentennial," Boston Globe, February 9, 2012
ABINGTON - Fifty residents from three towns that composed Abington before their late 19th-century split eagerly awaited the opening of a 100-year-old time capsule Sunday afternoon, but when the sealing wax was broken, the red twine snipped, and the brown wrapper removed, the contents were a little disappointing.
The capsule contained five programs from the weeklong 1912 celebration of “Old Abington’s” 200th anniversary and a handful of sepia images of buildings and local landscapes from 100 years ago, most of them familiar from copies already in the possession of local historians.>>>
"Historians to uncover and preserve DC history," NECN, February 5, 2012
WASHINGTON (AP) — With its wooden sign in imperfect French advertising "frittes, ales, moules" seven days a week, Granville Moore's on H Street NE looks like any other hip gastropub. But its exposed brick and chalkboard menu of craft beers belie the tavern's rich history. In the 1950s, the Formstone row house housed the office of Granville Moore, one of the city's most respected African American doctors.>>>
Alex Garrison, "Historians help honor horse of a different era," Lawrence Journal World, February 2, 2012
Comanche, the horse that famously survived — reportedly with 20 bullet wounds — the battle of the Little Bighorn in 1876, is 150 years old. But what exactly his equine legacy should be so many years later is still being debated.
A commemorative event in Kansas University’s Dyche Hall on Thursday acknowledged all sides. Historians spoke about how the beloved horse came to KU, how he was heavily restored in 2005 and how the community so fascinated by him should process its interest with reverence for the lives lost at Little Bighorn and other battles.>>>
Tuesday, February 7, 2012
Norway Doorway, pt 3: Trysil and Bergen
A couple of weeks ago I gave a series of lectures in Trysil, Norway, on American history, regional culture, and religion in the South. It was a wonderful visit, though I think I never figured out just how to say "Trysil" like a native. (Spoken, it sounds like "Trusal" to me.)
My hosts were wonderfully gracious: lively conversationalists and the sort of people you meet briefly and then miss quite a bit when you're back on the road.
The school at which I spoke had a culinary vocational program. Meaning: fantastic multi-course lunches that featured a salmon casserole and then moose burgers. Quite a few of the students here were part of a sports program. And, from what I understand, some of those were on the professional track, with sponsorships and bright futures. Xtreme energy-drink ski suits with aggro fonts and nuclearized color schemes.
One of the sessions I gave in this lovely ski resort town was for teachers. I focused on the range of teaching materials and resources out on the world-web, inter-tubes:
"Teaching American History and Culture with Online Newspapers and Images."
As part of that talk I gave the teachers a handout listing the sources we discussed. Here's that list, with some brief descriptions:
Google Books
http://books.google.com/
DATA MAPS
Bedford/St. Martin’s Map Central
http://worth.runtime.com/browse/Music
HISTORICAL IMAGES
Artcyclopedia (paintings, prints, lists of movements and countries)
http://www.artcyclopedia.com/
Library of Congress, Prints and Photographs On-line (cartoons, photographs, paintings)
http://www.loc.gov/pictures/
Picturing America (historical American paintings)
http://picturingamerica.neh.gov/
HISTORICAL MAPS
Beinecke Rare Book and Manuscript Library (images, manuscripts, maps)
http://beinecke.library.yale.edu/digitallibrary/
LESSON PLANS/TEACHING AMERICAN HISTORY
Gilder Lehrman Institute of American History (lesson plans, guides, documents, and more)
www.gilderlehrman.org/
Smithsonian: Teaching American History
www.smithsoniansource.org/
Teaching History: National Education Clearinghouse
http://teachinghistory.org/
NEWSPAPERS
Chronicling America: Historic American Newspapers
http://chroniclingamerica.loc.gov/
University of Pennsylvania Libraries: Historical Newspapers Online
http://gethelp.library.upenn.edu/guides/hist/onlinenewspapers.html
SOME OF EVERYTHING
American Memory from the Library of Congress (music, movies, documents, photos, newspapers)
http://memory.loc.gov/ammem/index.html
Internet Archive (documents, movies, photos, music)
www.archive.org
Bedford/St. Martin’s Make History Site (documents, photos, maps, and more)
http://bcs.bedfordstmartins.com/makehistory2e/MH/Home.aspx
Today I arrived in Bergen for three days of sessions with students and teachers at the Bergen Cathedral School, founded, according to legend, all the way back in 1153 by Nicholas Breakspear, who would go on to become Pope Adrian IV. Breakspear . . . no relation to Burning Spear, right? (Check out where I am and where I'll be on the Google Map.) These are bright, bright kids. They will no doubt keep me on my toes!
When I'm not doing the shutterbug thing around town and on the wharfs, I'll be speaking about the following: “The Praying South: Why Is the American South the Most Religious Region of the Country?” and “What do American English & Regional Accents Tell Us about America?”
Next week it's on to Øya videregående skole, 7228 Kvål. Say that five times quickly.
Monday, February 6, 2012
American Pre-history
I thought I’d try something different this fall, to add an element of perspective and Big History to the beginning of my US History survey.
So I pulled some of the latest ideas in genetics-enhanced archaeology from books I’ve read recently, including Clive Finlayson’s The Humans Who Went Extinct: Why Neanderthals Died Out and How We Survived, Colin Tudge’s Neanderthals, Bandits and Farmers: How Agriculture Really Began, and David J. Meltzer’s First Peoples in a New World: Colonizing Ice Age America. Although most textbooks nowadays briefly mention pre-Columbian America, the impression you get is of a pre-history that is vaguely understood, remote, and largely irrelevant to American history. By starting my syllabus at 36,000 BP rather than 1492 and devoting my first lecture to “Pre-history,” I tried to suggest to the students that the pre-Columbian American past is interesting and relevant.
Of course, you’d be interested in pre-Columbian Americans if you carried their blood, and I thought my students might be interested to know that outside the U.S., the majority of Americans are partly descended from the people who were here when the Europeans arrived. Norteamericanos are unique in the degree to which we didn’t mix, although the Mexicans and even the Canadians did much better than those of us living in the middle third of our continent. And I thought my students ought to understand that three out of the five most important staple crops in the modern world (maize, potatoes, and cassava – the other two are rice and wheat) were developed by early American farmers. Even the 2010 textbook I’m using fails to escape the gravity-well of the master narrative, repeating the myth that Indians were poor farmers and that agriculture was invented in the eastern Mediterranean and later in China.
Finally, I was really fascinated (and I hope some of my students were as well) by the recent developments in theories of migration. Although anthropologists are not yet unanimous on the issue, there is growing support for the theory that most of the people alive today are principally descended from an ancestral population of plains hunter-gatherers who migrated from Africa between 80,000 and 50,000 years ago and settled on the steppe north of the Black Sea. When the last ice age began, steppe and tundra environments spread across Eurasia from the Atlantic to the Pacific, and these people followed the herds of caribou, mammoth, and woolly rhinoceros. The plains hunters prospered as earlier human populations and their temperate forest habitats dwindled. By about 30,000 to 26,000 years ago, some of these plains hunters had spread westward into Europe (where, based on genetic evidence, they mixed with some older groups, including Neanderthals), while others had covered the entire breadth of Siberia and arrived at Beringia, their gateway to the Americas.
Most of my students seemed vaguely aware that the first Americans migrated from Asia across a “land bridge.” I tried to impress on them that Beringia, which lasted for 16,000 years and was 1,100 kilometers wide, was really not a “bridge,” and the people who crossed it weren’t “migrating.” They were living in Beringia and northwestern Alaska, just as they had done for hundreds of generations. But I think the most important element of this story is that it helps the students recognize that Native Americans and the Europeans they encountered in the Caribbean in 1492 were cousins who had expanded in different directions from the same ancestor population. The differences between them were extremely recent – as is recorded history.
As a final example of this recent rapid change, I talked a little about milk. Most people in the world cannot digest milk after childhood. The ability of Europeans to keep producing lactase, and thus to digest lactose, into adulthood comes from a recent mutation, dating to about 10,000 years ago. It corresponds with the domestication of the aurochs into the modern cow (several African groups like the Masai share this trait, but scientists believe they developed it and domesticated cattle independently), and the mutation probably spread rapidly because it gave its bearers a tremendous nutritional advantage in times of famine. This rapid spread of a biological change, I hope, will suggest other ways that Europeans and Native Americans diverged from their shared ancestry, while at the same time reminding my students of that shared heritage.
Thursday, February 2, 2012
Gilder Lehrman History Scholar Award - Application Open Through March 15th
The History Scholar Award, administered by the Gilder Lehrman Institute, honors outstanding graduating college seniors who have demonstrated academic and extracurricular excellence in American History or American Studies. The Institute recognizes the outstanding achievements of Phi Alpha Theta undergraduate members. We are contacting you to invite the graduating seniors in your Phi Alpha Theta chapter to apply for this honor.
Highlights of the Gilder Lehrman History Scholar Award include:
• Special meetings with eminent history scholars.
• Exclusive behind-the-scenes tours of historic archives.
• Celebratory awards dinner.
Recipients will be reimbursed for up to $600 for travel expenses to New York, and room and board will be provided during the award weekend.
Application Deadline: March 15, 2012
Notification Deadline: April 16, 2012
To apply, or for more information, visit: www.gilderlehrman.org/historyscholaraward.
If you have questions about the award, please email scholars@gilderlehrman.org.
Wednesday, February 1, 2012
TV Debates: Political Discussion or MMA in Suits?
When John F. Kennedy and Richard Nixon took the stage for the first of four televised debates on September 26, 1960, the world of politics changed forever. Nixon was recovering from knee surgery and looked gaunt and ill-prepared as he sweated under the glare of the lights. In contrast, the sun-tanned young junior senator from Massachusetts appeared fit and confident as he answered questions from Howard K. Smith, the venerable CBS reporter and moderator for that evening’s exchange on domestic affairs. The debates were Kennedy’s idea, and it was soon apparent why—his youth, good looks, and confident demeanor put his opponent at a distinct disadvantage.
At this point, 88 percent of Americans owned at least one TV set, and the medium had eclipsed radio as the primary source for news. Ed Murrow and his “Murrow Boys” had ushered in the golden age of American TV journalism (though, as Lynne Olson and Stanley Cloud point out, he far preferred radio), and the other major networks were trying everything in their power to catch up with CBS. Eager to raise his profile and to put a dent in Nixon’s campaign, Kennedy deduced, with the help of Ted Sorensen and other advisors, that he could become the favorite once he got in front of the cameras. He was spot on. Seventy-four million viewers tuned in for that opening exchange, and Kennedy later acknowledged, “It was the TV more than anything else that turned the tide.”
Though the debate was spirited and the participants were far apart ideologically, they treated each other courteously and avoided insults and undue criticism. Indeed, a New York Times subhead declared that “Sharp Retorts are Few as Candidates Meet Face to Face.” How times have changed!
In the United States, it is now inconceivable to run a national political race without TV, though in England the first TV debate between prime ministerial candidates took place only just before David Cameron’s election triumph. And yet, despite our familiarity with the medium, it is worth asking whether we put too much emphasis on how our would-be leaders fare on the box.
Do we count out less telegenic candidates who might have flourished in a bygone era? Have we put too much power in the hands of moderators and their potential agendas? Is it fair to dismiss a politician after a major gaffe?
Certainly, the definition of what makes a “good speaker” has changed. In the late 1800s and early 1900s, audiences packed halls to see scientists introduce new wonders, to hear authors talk about their new books, and to listen to lecturers ply their trade. Then, during World War II, British audiences were spellbound by Winston Churchill’s inspirational and defiant rhetoric; yet, when asked if he would permit a live TV broadcast of his “iron curtain” speech in 1946, he replied curtly, “I deprecate complicating the occasion with technical experiments.” He, for one, was better suited to well-prepared speeches than to impromptu exchanges. Formidable as he was as an opponent in the House of Commons, would he have floundered or flourished in a TV debate?
Another questionable element of the TV forum is sponsorship. Media outlets across the ideological spectrum want in on the act, and YouTube has even extended the format to the web. How long until corporate sponsors follow, and we have a Tostitos Debate on National Security or a Five Hour Energy Debate on Foreign Affairs, complete with tailored, Super Bowl-style commercials?
And then there’s the matter of frequency. Do we really need to see debate after debate to make up our minds whom to vote for, or do the overexposure and the increasingly repetitive content just turn us off? Do we benefit from celebrities weighing in on TMZ about their favorite candidates’ virtues, or from their denunciations of those they oppose?
The tone of the candidates’ conversation is also subject to scrutiny. A far cry from the civilized banter between JFK and Tricky Dick more than 50 years ago, we appear to be nearing the point at which we will arm the competitors with rotten fruit or jousting lances before they go on the air. Perhaps that would make for “better” TV, or at least allow us to confess that, beyond gaining new insight into the candidates’ views, we love seeing one gladiator emerge triumphant from the arena while another is left bloodied and vanquished. Excuse me, I’m off to watch UFC on Fox.