George Mason University's
History News Network

Roundup: Talking About History


This is where we excerpt articles about history that appear in the media. Among the subjects included on this page are: anniversaries of historical events, legacies of presidents, cutting-edge research, and historical disputes.

SOURCE: The Atlantic (3-15-12)

Terrence M. McCoy is the Gordon Grey Fellow of International Journalism at Columbia University.

...

In the 1920s, the lingering specter of World War I and austere German reparations battered Europe's market-based economy, giving rise to class tension and stark inequality. For worn-down workers, socialism and communism started sounding like pretty good ideas. A world revolution -- indeed, the rise of the proletariat -- seemed possible, and the Communist International was stoked.

But the Americans just wouldn't fall into line. The United States had long since passed the United Kingdom as the world's largest industrial power, but hadn't yet plunged into the Great Depression. To members of the U.S. Communist Party, it was a paradox. Why, in what appeared to be the purest capitalist Western economy, wasn't there any desire for egalitarianism? Had Marx been wrong when he wrote socialism would, inexorably and universally, emerge from the ruins of capitalism? ...

In 1929, Communist leader Jay Lovestone informed Stalin in Moscow that the American proletariat wasn't interested in revolution. Stalin responded by demanding that he end this "heresy of American exceptionalism." And just like that, this expression was born. What Lovestone meant, and how Stalin understood it, however, isn't how Gingrich and Romney (or even Obama) frame it. Neither Lovestone nor Stalin felt that the United States was superior to other nations -- actually, the opposite. Stalin "ridiculed" America for its abnormalities, which he cast under the banner of "exceptionalism," Daniel Rodgers, a professor of history at Princeton, said in an interview....

How did a phrase intended as derision become a rallying cry of American awesomeness? As significant portions of the electorate -- think Southern Democrats -- shifted toward the GOP in the 1960s and 1970s, conservative thinkers charted a new Republican identity emboldened by triumphalism and uncompromising patriotism. Doubting exceptionalism became "un-American." Looking to history for more evidence, conservative intellectuals stumbled across Tocqueville, who in Democracy in America had described the nation as "exceptional" for its devotion to practicality over art or science. He lent enough oomph to credibly define America as categorically transcendent, Rodgers said.

It worked. In a 2010 Gallup Poll, 80 percent of Americans agreed that based on history and the Constitution, the United States was the "greatest country in the world." American exceptionalism, along with flag pins shining from one's lapel, is one of the rare issues where Republicans and Democrats agree. In 2009, President Obama said in Strasbourg, France, that he subscribed to American exceptionalism (just as other nations, he stressed, should feel the same about their own country). Gingrich used the phrase 44 times in his recent book. For whatever reason, its author, Stalin, didn't even get a cameo....



SOURCE: WaPo (3-16-12)

John R. McNeill is a professor of history at Georgetown University and a vice president of the American Historical Association.

My 94-year-old father and two of my uncles were among the 16.5 million men and women who served in the American armed forces during World War II. Both uncles, and another who served with the Canadian military in the war, are now dead. I have only snippets of information about their lives in uniform. At my urging, my father recently wrote the story of his life on active duty in the U.S. Army from 1940 to 1946....

Those stories are a form of national treasure. For years, historians, journalists and family members have been collecting letters, diaries, journals and interviews from a few of those 16.5 million. The Veterans History Project at the Library of Congress, created by an act of Congress in 2000, has materials of some sort from 48,000 World War II vets. University libraries, state historical societies, military units and other organizations have collected a few thousand more. No one knows exactly how many because there is no clearinghouse or coordination. But it is likely that fewer than 1 in 200 of these veterans’ stories are preserved in any fashion.

Of those that are preserved, a small share is digitized and easily accessible to the public. The Veterans History Project has put up 7,000 World War II vets’ stories on its Web site. The Rutgers Oral History Archives have an additional 469. The total in all digitized collections is well under 10,000. For the rest, one has to travel to a library or historical society....



SOURCE: WSJ (3-16-12)

Mr. Zaretsky teaches at the Honors College, University of Houston, and is author most recently of Albert Camus: Elements of a Life (Cornell, 2010).

Fifty years ago this Sunday, the greatest human migration in postwar Europe began to unfold. When French and Algerian representatives signed a series of peace accords in the spa town of Evian on March 18, 1962, the two nations ended an unspeakably bloody and brutal eight-year war. But the agreements also heralded the exodus of nearly one-and-a-half million French Algerians—an event of seismic proportions, yet now mostly forgotten.
 
Not surprisingly, the anniversary of the Evian Accords is less the occasion for commemoration—forget celebration—than it is for confusion. The perplexity reflects how France's past in Algeria is not only not dead, but not even past. The television programs and magazine stories all too predictably focus on the war. Center stage is taken by the Battle of Algiers and the mad contagion of terrorism on both sides, the simmering civil war in France and the pivotal role played by Charles de Gaulle, who in bringing an end to this war without a name, brought down one republic and created another.
 
Mostly ignored in the sound and fury, though, are the so-called oubliés de l'histoire, or history's forgotten ones. This, in particular, is the case with the pieds-noirs: the European immigrants who had begun to settle in Algeria in the mid-19th century. Hailing largely from other Mediterranean countries, these men and women learned the French language, read French history, served in the French army and became French citizens.
 
But here's the rub: The pieds-noirs were citizens of a nation they scarcely knew...


SOURCE: openDemocracy (UK) (3-13-12)

Anthony Lock is an academic and freelance journalist based in Sydney. His current research is concerned with the relationship between the sciences and arts, with a particular focus on applying Orwell's work to memetic study.

March 8th marked the 75th anniversary of the publication of George Orwell’s 1937 landmark The Road to Wigan Pier, a work of extreme candor on pre-war poverty in England. It is a cherished snapshot of the North in the 1930s, and The Observer, among others, has been nostalgic in printing pictures from the area to commemorate Orwell’s journey.
 
As an Orwell scholar, my interest in Wigan Pier is largely in the role it played in the road to Animal Farm and 1984. But the anniversary of the publication comes at a pertinent time - during what is the worst economic hardship since just after the Second World War. It is now being asked if Wigan Pier can be used to address present anxieties. Would Orwell think his original argument still stands for current poverty in the North of England and beyond?
 
Wigan Pier is significant because it was the first salient work produced by Orwell during a period when, in his own words, “every serious line” he wrote was intently written “directly or indirectly, against totalitarianism and for democratic socialism”. Having written previously on imperialism and other economic and social issues, he came to view certain aspects of supposedly ‘liberal’ capitalism under MacDonald and Baldwin as stops on the road toward the most extreme form of oppression: totalitarian governments. Wigan Pier emerged when Orwell’s editor commissioned him in 1936 to investigate the poverty in the north of England, and it is, in accordance with this larger project, an argument that the causes of 1930s northern poverty themselves constitute a form of totalitarianism...


SOURCE: Financial Times (UK) (3-12-12)

Gideon Rachman is chief foreign-affairs commentator for the Financial Times and author of Zero-Sum Future: American Power in an Age of Anxiety.

How would a Chinese superpower treat the rest of the world? Anyone wanting to peer into the future could start by looking back at the past – or, at least, at the official version of China’s past. The message is not reassuring. China’s schoolchildren are being taught a version of history that is strongly nationalist. The official narrative is that their country was once ruthlessly exploited by rapacious foreigners. Only a strong China can correct these historic wrongs.
 
This official story has a lot of truth in it. China in the 19th and 20th centuries was indeed the victim of foreign imperialism. The trouble is that China’s official history lacks the quality that Maoism was meant to stress: self-criticism. If you visit the exhibitions in the vast National Museum of China on Tiananmen Square you will see and read about the terrible things that foreigners have done to the Chinese. There is almost nothing about the even more terrible things that Chinese people did to each other – largely because most of these crimes were committed by the Communist party, which still runs the country.
 
These gaps matter. A more honest debate about the past will be an essential part of China’s journey to a more open political system. A view of Chinese history that moves beyond a narrative of victimhood might also make China’s rise to global power smoother...


SOURCE: Slate (3-13-12)

Jonathan D. Sarna is the Joseph H. & Belle R. Braun professor of American Jewish history at Brandeis University and chief historian of the National Museum of American Jewish History. This article is adapted from his new book, When General Grant Expelled the Jews.

On Dec. 17, 1862, as the Civil War entered its second winter, Gen. Ulysses S. Grant issued the most notorious anti-Jewish official order in American history: “The Jews, as a class violating every regulation of trade established by the Treasury Department and also department orders, are hereby expelled from the department within twenty-four hours from the receipt of this order.” Known as General Orders No. 11, the document blamed Jews for the widespread smuggling and cotton speculation that affected the area under Grant’s command. That area, known as the “Department of the Tennessee,” stretched from northern Mississippi to Cairo, Ill., and from the Mississippi River to the Tennessee River. Grant ordered Jews expelled from every inch of it and warned that “any one returning ... will be arrested and held in confinement until an opportunity occurs of sending them out as prisoners.” Lest anyone try to change his mind, Grant made clear that “no passes will be given these people to visit headquarters for the purpose of making personal application for trade permits.”...



SOURCE: BBC (3-9-12)

David Cannadine is a British historian, author and professor of history at Princeton University.

In a few days' time, David Cameron will be journeying to Washington to visit Barack Obama, and according to a White House Statement, his visit will "highlight the fundamental importance of the US-UK special relationship and the depth of friendship between the American people and the people of the United Kingdom".

Perhaps it will, and I hope it does, but it's also likely to give rise to at least two challenging questions. Is America's relationship with Britain as special as it used to be? And is it genuinely more special than with any other country?

These matters have been much on my mind of late, because I've recently returned from lecturing at Westminster College, in Fulton, Missouri, where in March 1946, Winston Churchill gave one of his most significant post-war speeches in which he launched the phrase "special relationship" into popular currency.

Churchill was merely a private citizen, having been turned out of 10 Downing Street at the general election in the previous summer, but he was introduced by the American President Harry S Truman, and during the course of his speech, he offered a new and in many ways alarming view of the post-war world.

From Stettin in the Baltic to Trieste in the Adriatic, Churchill insisted, an Iron Curtain had descended across Europe, dividing the continent between a free and democratic west and a totalitarian and Communist east. The Iron Curtain was the first phrase his Fulton speech made famous, and the second was indeed the "special relationship" which he believed existed between Great Britain and the United States…



SOURCE: American Heritage (11-1-11)

Edward G. Lengel, the senior editor of The Papers of George Washington and an associate professor of history at the University of Virginia, is author most recently of To Conquer Hell: The Meuse-Argonne, 1918 (Henry Holt and Co. 2008).

As the editor of the papers of George Washington at the University of Virginia in Charlottesville, I have the privilege of interacting with many people who come bearing documents supposedly signed by the first president. More often than you might think, I have the unenviable task of informing them that their letter, often lovingly framed and passed down for decades in their family, is a fake. An office file, which we've marked "Forgeries," overflows with dozens of similar examples. Individuals and families are not the only ones duped by what I've discovered has been a robust 150-year-old market in Washington forgeries. Recently a well-meaning alumnus sold a multipage letter apparently in Washington's handwriting to a major university library. The librarians placed the letter on display and trumpeted their acquisition to the media. Several months later, an astute visitor pointed out some strangely awkward flourishes in the letter's handwriting. Upon examination, the letter turned out to be the work of a prolific 19th-century forger.

Just as high demand for popular consumer goods today inspires the production of cheap imitations to satisfy the market, the demand for Washington letters in the mid-19th century inspired the production of thousands of forgeries. Ironically, these fakes are now often collector's items in their own right....

Like many villains throughout history, the British-born Robert Spring could turn on the charm. He immigrated to America as a young man in the 1850s, starting out as a Philadelphia bookseller. He quickly learned that success came by actively appealing to the interests of collectors. At some point he realized that true wealth lay in creating interest where it didn't already exist.

The clever bookseller purchased an armful of volumes that once had belonged to Washington, marked them up significantly, and sold them at a great profit. Pleased with the results, he pulled some dusty, 18th-century books from his back room and lied to customers that Washington had once owned them. They sold as fast as he could put them on display. Spring then began signing Washington's signature on the title pages of some old tomes that he had been trying to unload for years. Customers snatched them up....



SOURCE: NYT (3-6-12)

Frank Jacobs is a London-based author and blogger. He writes about cartography, but only the interesting bits.

“I think I’ll write a book today,” the writer Georges Simenon was said to tell his wife at breakfast. “Fine,” she would reply, “but what will you do in the afternoon?” Winston Churchill was similarly prolific, and not just in the field of letters. In his later years, he liked to boast that in 1921 he created the British mandate of Trans-Jordan, the first incarnation of what still is the Kingdom of Jordan, “with the stroke of a pen, one Sunday afternoon in Cairo.”

Also like Simenon, Churchill wasn’t averse to the odd tipple, and according to some, that Sunday afternoon in Cairo followed a particularly liquid lunch. As a consequence, the then colonial secretary’s penmanship proved a bit unsteady, allegedly producing a particularly erratic borderline. The result is still visible on today’s maps: the curious zigzag of the border between Jordan and Saudi Arabia.

Starting at the Gulf of Aqaba, the Jordanian-Saudi border drifts northeastward as six relatively short, straight lines, manacled together into an unsteady chain gang that doesn’t quite know which direction to take. Then, in a single, 90-mile stretch, the border suddenly and spectacularly lurches northwest, aiming for the southern Lebanese coast. But finally, it seems to regain its footing, continuing the 130 miles northeast toward the Iraqi border in a near-straight line, as if running away from all those twists and turns....

According to the legend of its creation, the border owes its strange shape to nothing more significant than Churchill’s propensity for champagne, brandy and whisky. This stretch of border is still, and in retrospect rather euphemistically, referred to as Winston’s Hiccup, or Churchill’s Sneeze. Wouldn’t it be ironic if Saudi Arabia, the nation that puts the “total” in teetotal, owed part of its external border to the inebriated scribblings of a British boozehound?

Unfortunately, even though Churchill was sufficiently involved in redrawing the map of the Middle East to boast that Jordan was his creation, and even though he was fond of a tipple, the hiccup part of the story is entirely apocryphal....



SOURCE: AHA Perspectives (3-1-12)

William Cronon (Univ. of Wisconsin–Madison) is the president of the AHA.

...Given the immense public appetite for history, and the essential contributions history can make to public understanding of all manner of problems in the present, the risks associated with too narrow and academic a definition of "professional history" could not be more clear.

This is why, I would argue, we should keep a close watch on boredom if we want to make sure history continues to reach beyond our professional circles to a public that includes not just an educated citizenry, but intellectuals in other disciplines and historians in other fields. If professional history is sometimes boring, let's ask what it is about our professionalism that makes it so.

This is also why professional historians who work in the academy should be immensely grateful when they are joined in an organization like the AHA by professional historians who make documentaries, design web sites, post blogs, curate exhibits, teach school, and publish popular books. Only if we all gather together under the same big tent will we be able to learn from each other the ways good history can be more effective in reaching the many audiences that hunger for its insights. Forty million people watched Ken Burns's documentaries on The Civil War. Barbara Tuchman probably influenced more people's understanding of the First World War than any other historian of her generation. Public school teachers shape the historical consciousness of many millions more students (and citizens) than college teachers ever will. And so on and on.

How do we avoid professional boredom? By making sure we don't define "professional" too narrowly. By not talking only with each other. By welcoming into our community anyone and everyone who shares our passion for the past and who cherishes good history. By remembering that no matter what else we do, we are all teachers whose foremost responsibility is to share what we know in ways people can understand—and, more basic still, in ways that people will find interesting, even intriguing. By communicating as clearly and engagingly as we can. By telling good stories.

And: by never forgetting that our first and most important job—the one on which all others depend—is to make the past come alive for nonprofessionals who would otherwise find it dry, dead, . . . and boring.



SOURCE: Spiegel International (3-6-12)

Translated from the German by Christopher Sultan. 

Ten days before Christmas, the German Interior Ministry acquitted itself of an embarrassing duty. It published a list of all former members of the German government with a Nazi past.

The Left Party's parliamentary group had forced the government to come clean about Germany's past by submitting a parliamentary inquiry. Bundestag document 17/8134 officially announced, for the first time, something which had been treated as a taboo in the halls of government for decades: A total of 25 cabinet ministers, one president and one chancellor of the Federal Republic of Germany -- as postwar Germany is officially known -- had been members of Nazi organizations.

The document revealed that Chancellor Kurt Georg Kiesinger, a member of the conservative Christian Democratic Union (CDU) who governed Germany from 1966 to 1969, had been a member of the Nazi Party ever since Adolf Hitler seized power. According to the Interior Ministry list, German President Walter Scheel, a member of the business-friendly Free Democratic Party (FDP) who was in office from 1974 to 1979, had been a Nazi Party member "from 1941 or 1942."

The list names ministers of all political stripes and from a wide range of social backgrounds. Some, like leftist Social Democratic Party (SPD) mastermind Erhard Eppler (Minister of Economic Cooperation), did not become Nazi Party members until the end (at 17, in Eppler's case). Others, like conservative Christian Social Union (CSU) agitator Richard Jaeger (Minister of Justice), had been part of Hitler's paramilitary organization, the SA (since 1933, in Jaeger's case). Even FDP luminary Hans-Dietrich Genscher (first Interior Minister and later Foreign Minister), who denies to this day that he knowingly joined the Nazi Party, is listed as a Nazi Party member.

According to the government list, former SPD Finance Minister Karl Schiller was in the SA, while his fellow cabinet minister Horst Ehmke was a Nazi Party member, as were ("presumably," the list notes) former SPD Labor Minister Herbert Ehrenberg and Hans Leussink, a former education minister with no party affiliation. On the conservative side, the report names several former Nazi Party members, including former CDU Foreign Minister Gerhard Schröder and former CDU Minister for Displaced Persons Theodor Oberländer, as well as former CSU Post and Communication Minister Richard Stücklen and former CSU Interior Minister Friedrich Zimmermann.

Germany's Dark Past

None of this information is new. It isn't just since the 1968 student revolts that critical citizens, intellectuals and the media have broadcast new details on the contemporary relevance of Germany's dark past. For years, the notion that partisans of the Nazi regime were able to manipulate their way into the top levels of government in the young federal republic, and that former Nazi Party members set the tone in a country governed by the postwar constitution in the 1950s and '60s, has been a subject for historians.

But six decades after the Nuremberg Trials against the leaders of the Nazi regime, a new attempt -- the first official one, at that -- to come to terms with postwar Germany's Nazi past is now underway...



SOURCE: National Interest (2-28-12)

Robert W. Merry is editor of The National Interest and the author of books on American history and foreign policy.

Of all the U.S. presidents since Franklin Roosevelt, none stands taller in history or exercises a greater lingering influence on American politics than Ronald Reagan. Republican politicians invoke his name as example and lodestar, and Democrats have granted him increasing respect as the passions of his presidential years have ebbed with time. Surveys of academics on presidential performance, initially dismissive, now rank him among the best of the White House breed. Even President Obama has extolled his approach to presidential leadership.

This veneration poses a dark danger—that Reagan will become associated with philosophies he never held and policies he never pursued. This is happening today with increasing force as neoconservative intellectuals and politicians seek to conflate Reagan’s Cold War strategy with their push for American global dominance in the name of American values. Their aim is to equate today’s Islamic fundamentalism with Russian Bolshevism and thus boost the argument that U.S. military actions in the Middle East are a natural extension of Reagan’s forceful—and successful—confrontation with Soviet Communism in the 1980s.

Back in 2002, when neoconservative war advocates were beating the drums for the Iraq invasion, William Kristol, editor of the neoconservative Weekly Standard, told the Financial Times: "Americans see clearly which are democratic states and which are tyrannies in the world today, as they did when the Soviet Union was the main enemy." Writing in Kristol’s magazine, Reuel Marc Gerecht declared that George W. Bush’s "liberation theology" constituted a "Reaganesque approach." Bush himself, in declaring that the "advance of freedom is the calling of our time," invoked Reagan as a progenitor of his missionary drive. Some months later, Kristol noted admiringly that Bush’s foreign-policy advisers, nudging the country to war, were "all Reaganites."

More recently, these efforts to wrap Bush’s aggressive foreign policy in a Reagan blanket have been duplicated in the rhetoric of the now-dwindling field of candidates for this year’s GOP presidential nomination. Mitt Romney, Rick Santorum and Newt Gingrich all identified Reagan as a model for the foreign-policy bellicosity they espoused with such zeal during the campaign.

This is all specious... 



SOURCE: Guardian (UK) (3-5-12)

David Smith is the Guardian's Africa correspondent.

"Nature was our playground," writes Nelson Mandela in his memoir Long Walk to Freedom. "The hills above Qunu were dotted with large smooth rocks which we transformed into our own rollercoaster. We sat on flat stones and slid down the face of the rocks. We did this until our backsides were so sore we could hardly sit down."

Walking down the grassy slope into a breeze, I came upon it: Mandela's "sliding stone". The big granite boulder has an unmissable track worn smooth and shiny by his childhood sport. It is one of the rocky outcrops overlooking the bucolic valley of Qunu, where South Africa's first black president grew up and which, at 93, he still calls home.

I had come here on the "Mandela trail" in the rolling hills of the area formerly known as the Transkei. I've previously stood in the poky bedchamber in Stratford-upon-Avon where it is thought William Shakespeare was born, the grander birthplace of Winston Churchill at Blenheim Palace and the cramped abode where Stan Laurel breathed his first in Ulverston, Cumbria. Thus I could hardly neglect one of the most famous people on the planet, a man who, in South Africa, has been canonised in his own lifetime. Johannesburg alone boasts a Mandela statue in Mandela Square, a Mandela bridge, Mandela house, Mandela theatre and Mandela foundation.

Indeed, if he wasn't so authentically loved, such idolatry could look North Korean. When it was recently announced that Mandela's face will replace the Big Five wild animals on the national currency, one columnist described it as "a bit banana republic".

Have those rocky outcrops in Qunu already been carved out like Mount Rushmore?...

 



SOURCE: The Age (AU) (3-2-12)

Christopher Bantick is a Melbourne writer and senior literature teacher. He was formerly head of history at Trinity Grammar in Kew, where he taught Australian History and Revolutions.

BRITISH historian Simon Schama once observed of contemporary students, "What of history do they know?" It's a good question. A more immediate one is why don't Australian undergraduates want to know about their own country?

The blunt fact is that Australian history, once the gold standard of university history courses, is dying. This year, La Trobe University does not have any undergraduate Australian history subjects. The University of Melbourne is winding up its Australian Studies Centre. Why? Both La Trobe and Melbourne are universities with enviable records in the study of Australian history. So how can the decline in undergraduate demand be explained?

A lack of interest in Australian history undergraduate courses does not begin at university. It starts far earlier. Look no further than schools....

Schools killed Australian history. Reduced to a brain-deadening subject where nothing seemed to happen, it proved Henry Ford's assessment of history being "bunk".

Primary schools peacocked the past and hammered Ned Kelly's armour to the density of tin plate. Gallipoli became the rite of passage story and the First Fleet was a good place to start genealogy. Gold presented a rush of enthusiasm but Lola Montez's spider dance was edited out....



SOURCE: WSJ (3-5-12)

Mr. Galambos, professor of history at Johns Hopkins University and an editor of The Papers of Dwight David Eisenhower, is the author of The Creative Society—and the Price Americans Paid for It (Cambridge, 2012).

The effort to memorialize Dwight Eisenhower in Washington has stirred up a hornet's nest of controversy. The design by architect Frank Gehry includes a formidable metal tapestry devoted to the Abilene, Kan., countryside where the 34th president spent his childhood. There are also plans for a statue of young Ike, positioned between two very large stone blocks devoted to the presidency and the distinguished military career of the man who led the Allied forces to victory in World War II.
 
Those opposed to Mr. Gehry's design seem to prefer something traditional and neoclassical, along the lines of existing monuments in the nation's capital.
 
The debate is as much over Eisenhower's role in history as it is over an innovation in Washington architecture. Was Ike essentially a traditionalist, a stodgy staff-man, a Republican president who wanted to return America to a golden age of small towns and honest folk who wore proper hats? Or was he an agent of change who embraced modern technology and worked quietly and efficiently to bring the country into a new and challenging international role?...


SOURCE: NYT (2-25-12)

Ross Douthat is a columnist for the New York Times.

...In a 2011 Gallup poll on the greatest president, Eisenhower came in a lame 12th, in a tie with Jimmy Carter. He performs solidly in scholarly surveys, but he’s frequently ranked behind his prominent 20th-century rivals.

In part, this underestimation is a result of the political persona Eisenhower cultivated — an amiable, grandfatherly facade that concealed a ruthless master politician. In part, it reflects the fact that his presidency has always lacked an ideological cheering section. Liberals (who preferred Adlai Stevenson) generally remember the Eisenhower administration as a parenthesis between heroic Democratic epochs, while conservatives (who favored Robert Taft) recall a holding pattern before their Goldwater-to-Reagan ascent.

But ultimately Eisenhower is underrated because his White House leadership didn’t fit the template of “greatness” that too many Americans pine for from their presidents. He was not a man for grand projects, bold crusades or world-historical gambles. There was no “Ike revolution” in American politics, no Eisen-mania among activists and intellectuals, no Eisenhower realignment.

Instead, his greatness was manifested in the crises he defused and the mistakes he did not make. He did not create unaffordable entitlement programs, embrace implausible economic theories, or hand on unsustainable deficits to his successors. He ended a stalemated conflict in Korea, kept America out of war in Southeast Asia, and avoided the kind of nuclear brinkmanship that his feckless successor stumbled into. He did not allow a series of Middle Eastern crises to draw America into an Iraq-style intervention. He did not risk his presidency with third-rate burglaries or sexual adventurism. He was decisive when necessary, but his successes — prosperity, peace, steady progress on civil rights — were just as often the fruit of strategic caution and masterly inaction....



SOURCE: NYT (2-25-12)

Robert Zaretsky is a professor of history at the Honors College, University of Houston, and a co-author of “France and Its Empire Since 1870.”

THE French empire is back — this time, though, rather than coming to you, you will need to go to it.

Earlier this month, Yves Jégo, the mayor of a small town southeast of Paris, officially announced his plans for the Bivouac de Montereau, better known as Napoleonland — an amusement park commemorating French history, with an emphasis on the emperor’s achievements, that will rival nearby EuroDisney....

Against the background of a dispiriting presidential campaign, an anemic economy and a deepening social divide — not to mention this year’s 200th anniversary of Berezina, a word long synonymous in French with “disaster” — now seems like an odd time to announce the empire’s return. But the French seem to be hoping that such a park might galvanize not just the local economy but also a national sense of purpose.

Napoleonland’s best historical precedent suggests that only one of these hopes is justified. During the World Exposition of 1889, held to mark the centenary of the French Revolution, the French similarly turned to the past for the stuff of a theme park. The exposition included a replica of the Bastille, as well as more modern attractions, like the Gallery of Machines and the new Eiffel Tower....

Today, the debate over the revolution is the affair of historians; in 1889, it was the affair of citizens. For those on the left, the revolution had given the world a nation based on the ideals of liberty, equality and fraternity; for those on the right, it had given the world a nation awash in the blood spilled by the guillotine. When the exposition opened, conservatives saw the crowds it attracted as the dregs of society unleashed by the revolution. Recoiling in front of “the vast bestial joy” of the visiting masses, the critic Edmond de Goncourt shuddered at the thought of France’s democratic future....



SOURCE: Philadelphia Inquirer (3-1-12)


Jonathan Zimmerman teaches history at New York University and lives in Narberth. He is the author of "Small Wonder: The Little Red Schoolhouse in History and Memory" (Yale University Press). He can be reached at jlzimm@aol.com.

In March 1942, a few months after America entered World War II, the U.S. Army issued its first official regulation designed to screen out gay men. The directive listed three supposedly telltale signs of homosexuality: "feminine bodily characteristics," "effeminacy in dress and manner," and a "patulous rectum." For those who don't have a dictionary handy, patulous means "expanded." And, yes, the Army regulation really said that.

I thought of this shameful history as I read a recent Inquirer story about the mass murderer Howard Unruh, whose case files were released by the Camden County Prosecutor's Office after persistent requests by the newspaper. Nobody will ever know what exactly led Unruh, a World War II combat veteran, to gun down 13 innocent people in East Camden on Sept. 6, 1949. But the case files confirm that Unruh was gay at a time when American culture attached a heavy stigma to homosexuality....

Between 1941 and 1945, roughly 9,000 soldiers and sailors were discharged from the U.S. military for being gay. Servicemen suspected of homosexuality were frequently forced to strip naked or submit to a "gag test" with a tongue-depressor; if you didn't gag, you were assumed to be gay.

Before they were discharged, gays were detained in "queer brigs" or "pink cells." Some of these brigs were outdoor pens where servicemen awaiting discharge were publicly humiliated by passersby. In private, meanwhile, they were sometimes forced to sexually gratify their guards.

If that sounds a bit like the Nazis' persecution of gays, it's because it was. Ironically, some American homosexuals were inspired to enlist in the armed forces after reading reports of German mistreatment of gays, only to find themselves under attack by their own military....



SOURCE: Rolling Stone (2-22-12)

Rick Perlstein is the author of Before the Storm: Barry Goldwater and the Unmaking of the American Consensus and Nixonland: The Rise of a President and the Fracturing of America. He writes a weekly column for RollingStone.com.

...Here's the problem: Even if Obamaism works on its own terms – that is, if [Andrew] Sullivan is right that Obama’s presidency is precisely on course – it can't stop Republicans from wrecking the country. Instead, it may end up abetting them.

To understand why, let's look at Ronald Reagan. Barack Obama has famously cited him as a role model for how transformative a president can be. Well, what did he transform, and how did he do it? Here's how: He planted an ideological flag. From the start, he relentlessly identified America's malaise with a villain, one that had a name, or two names – liberalism, the Democratic Party – and a face – that of James Earl Carter. Reagan's argument was, on its face, absurd. For all Carter's stumbles as president, the economic crisis he inherited had been incubated under two Republican presidents, Nixon and Ford (see this historical masterpiece for an account of Nixon's role in wrecking the economy), and via a war in Vietnam that Reagan had supported and celebrated. What's more, to arrest the economy's slide, Jimmy Carter did something rather heroic and self-sacrificing, well summarized here: He appointed Paul Volcker as Federal Reserve chairman with a mandate to squeeze the money supply, which induced the recession that helped defeat Carter – as Carter knew it might – but which also slayed the inflation dragon and, by 1983-84, long after Carter had lost to Reagan, saved the economy.

In office, Reagan, on the level of policy, endorsed Carter's economics by reappointing Volcker. But on the level of politics, in one of the greatest acts of broad-gauged mendacity in presidential history, he blamed Carter for the economic failure, tied that failure to liberal ideology and its supposed embrace of "big government" (Carter in fact took on big government), and gave conservatism credit for every success. Deregulation and supply-side tax-cuts brought us "morning in America," he said. That was bullshit, but it won him a reelection landslide against Walter Mondale, Carter's VP, whom he labeled "Vice President Malaise."

What's the lesson? It’s not that you have to lie – Republicans had to do that to win, but Democrats don't. No, Democrats, in 2009, could simply have told the truth, and called it hell. The truth was this: For the first few years of this new century, America had ventured upon a natural experiment not attempted since the 1920s – governing the country with conservatives in control of all three branches of government. The result, of course, was – smoking ruins. Everybody knew it. A majority of Americans was receptive to "liberal solutions," and even conservatives knew it – which was why, after Obama delivered his February 24, 2009 speech defending the stimulus that, as I noted last week, got a 92 percent approval rating, and Bobby Jindal responded to it by excoriating the $140 million in stimulus spending "for something called 'volcano' monitoring," David Brooks said his "stale, government-is-the-problem, you can't trust the government" rhetoric was "a disaster for the Republican Party.”...



SOURCE: Chronicle of Higher Ed (2-26-12)

Aaron Bobrow-Strain is an associate professor of politics at Whitman College. This essay is adapted from his new book from Beacon Press, White Bread: A Social History of the Store-Bought Loaf.

Not long ago I found myself crammed into a tiny commuter plane bucking over the Cascade Range of Washington State. Behind me, two women distracted themselves from the turboprop's lurching and groaning by talking loudly about beef.

A self-avowed "foodie" with a bit of ranching in my background, an overdeveloped interest in how people think about eating, and a pressing need to escape thoughts of turbulence-induced aviation disaster, I listened in.

The two women had clearly read their Michael Pollan. They spoke ably about the dangers of contaminated meat, the environmental consequences of feedlot production, and the benefits of grass-fed beef. At one point I heard them bandy about the name of a company that sold organic meat in the town we had just left. Then, as we dropped low over Seattle and the bumps eased, one of them concluded: "I guess I know all that, but we still buy regular meat from Walmart. It's just who we are." "Yup," replied the second woman.

Thanks to an explosion of socially and environmentally aware food writing, readers in the United States now have access to a great deal of information about the shortcomings of our industrial food system as well as a growing collection of fairly simplistic ideas about how to change it. Nevertheless, very little has been written about the complex world of habits, desires, aspirations, and anxieties that define Americans' relationship to eating—the emotional investments that frustrate reformers and help keep the industrial food system as it is.

Most foodie discourse assumes that once people have knowledge about the difference between "good" and "bad" food, along with improved access to the former, they will automatically change their diets—like a dammed river freed to find its natural course. But what about all the people, like the two women on my flight, who know and could change, but don't?

I wrote a book about ultrasoft, mass-produced sliced white bread because I wanted to understand America's fraught relationship to industrial eating in all its contradictory ferment. Over the past 100 years, few foods have been as revered and reviled as industrial white bread. It has served as a touchstone for the fears and aspirations of racial eugenicists, military strategists, social reformers, gourmet tastemakers, health experts, philosophers, and food gurus. Sixties counterculture made it an icon of all that was wrong with Amerika, and the famed style arbiter Diana Vreeland proclaimed: "People who eat white bread have no dreams." By which she meant that they don't dream the right dreams, the up-to-date, hip dreams. This sentiment still resonates today, as industrial white bread has become even more widely associated with poor choices and narrow lives. But, whatever the context, Americans' embrace (or rejection) of industrial white bread has never been a simple matter of taste, convenience, or health.

From the 1860s to the 1960s, Americans across class, gender, and, to a certain extent, racial lines got more of their daily calories from bread than any other single food: 25 percent to 30 percent, on average, and higher during times of war and recession. Not surprisingly, what people thought about bread said a lot about who they were. And I don't just mean that bread has long been a marker of social status, although that is true, too. Rather, what I found was that America's love-hate relationship with this fluffy stuff has been wrapped up in a series of much larger questions about who we are as a nation, how we understand progress, how we envision America's role in the world, what we believe counts as responsible citizenship, and, ultimately, how we relate to each other across our differences....


