George Mason University's
History News Network

Roundup: Talking About History


This is where we excerpt articles about history that appear in the media. Among the subjects included on this page are: anniversaries of historical events, legacies of presidents, cutting-edge research, and historical disputes.

SOURCE: Washington Decoded (2-12-12)

John Earl Haynes and Harvey Klehr are contributing editors to Washington Decoded, and the authors of many books on the history of American Communism and Soviet espionage.

The relationship of physicist J. Robert Oppenheimer to communism and Soviet espionage has been a controversial subject since 1954, when the decision of the Atomic Energy Commission (AEC) to decline renewal of his security clearance put the issue firmly into the public arena. Journalists and historians addressed the issue repeatedly in the decades that followed. Nothing fueled the liberal/left critique of the so-called “national security state” more than the supposed excesses of the US government in the Oppenheimer case, save the cases involving Alger Hiss and the Rosenbergs.[1]

But while the emotional pitch, even shrillness, of the debate persisted, the substance of the argument became increasingly stale and repetitive; there was little new evidence to clarify the ambiguities of the matter. In the last two decades, however, new evidence has emerged that, while not resolving all ambiguities and still leaving a number of details unclear, nonetheless allows confident answers to the question of whether Robert Oppenheimer was a Communist and a spy. It demonstrates that he had, indeed, been a Communist but had not been a spy.

We addressed the issue of Oppenheimer’s involvement in Soviet espionage in “Special Tasks and Sacred Secrets on Soviet Atomic Espionage,” which critiqued and rejected the claims in books written by former KGB officer Pavel Sudoplatov and journalists Jerrold and Leona Schecter that Oppenheimer consciously assisted Soviet espionage and did so in a substantial way. This essay, first, reviews the evidence indicating that Oppenheimer was a secret member of the Communist Party, USA (CPUSA), joining at some point in the late 1930s and actively participating in a secret Party faculty unit at the University of California, Berkeley, in 1939, 1940, and 1941. Second, it critiques the conclusion of Oppenheimer biographers Kai Bird and Martin Sherwin that he was never a Communist. Finally, it discusses the evidence indicating that in early 1942 he quietly left the Party, coinciding with, and likely connected to, his formal recruitment into the Manhattan atomic bomb project.[2]

In private conversations with security officers and later in public statements, Oppenheimer admitted his political and financial support for Communist-backed causes in the late 1930s and his social relationship with various Communists. While admitting in 1954 to generous contributions to CPUSA official Isaac Folkoff, he said the funds were not for the CPUSA itself but for various causes supported by the Party, such as Spanish Civil War veterans and the unionization of farm workers. He explained, “I doubt that it occurred to me that the contributions might be directed to other purposes than those I had intended, or that such purposes might be evil. I did not then regard Communists as dangerous; and some of their declared objectives seemed to me desirable.” He also emphatically denied under oath Party membership or any covert participation in Communist Party meetings or activities.[4]

Even prior to 1999, several sources contradicted Oppenheimer’s denials of direct CPUSA links. In December 1943 FBI listening devices picked up a conversation between Steve Nelson, chief of the CPUSA in the San Francisco Bay area, and Bernadette Doyle, organizational secretary of the CPUSA’s branch in Alameda County, which included Berkeley, where Oppenheimer lived. Nelson and Doyle spoke of both Oppenheimer brothers as CPUSA members, but Nelson mentioned that Robert had become inactive. (That Frank was not characterized similarly, along with some other evidence, throws doubt on Frank Oppenheimer’s claim that he dropped out of the CPUSA at the end of 1941.) Nor was this the only time FBI surveillance picked up such incriminating information. Earlier, in 1940, a wiretap of the phone of Isaac Folkoff alerted the FBI to a private meeting of senior Communists at Haakon Chevalier’s home. Follow-up FBI surveillance noted that Oppenheimer’s car was parked outside the meeting place. In 1945 the Bureau tapped a meeting of the Executive Committee of the North Oakland Communist Club, at which one Party official, Katrina Sandow, stated that Oppenheimer was a Communist Party member and another official, Jack Manley, boasted that he had been “close to Oppenheimer,” whom he called “one of our men.” Lastly, an undated FBI report, sourced only to informant “T-2,” identified Oppenheimer as belonging to a secret Communist Party professional section.[5]



SOURCE: WSJ (2-11-12)

Mr. Wasserstrom is the author of "China in the 21st Century: What Everyone Needs to Know."

In the early 1860s, a violent fight raged to determine the fate of a vast country. An insurrection had split it in two, leaving much of the southern half governed by men who claimed to be the leaders of a new state but were dismissed by their foes as illegitimate "rebels," outlaws who had given themselves fancy titles. The conflict involved legendary generals with names that schoolchildren still memorize, and it had not just local but international significance: In far-off London, debates raged over whether the British Empire should back the rebels, with whom some Britons felt a sympathetic bond.

American readers might naturally assume that this description refers to our Civil War. In fact, I had in mind an Asian conflict, which may be little known to Americans today but which was far bloodier than the struggle that pitted Grant against Lee (tens of millions dead, compared with under a million). The insurgents with fancy titles in this case were the self-proclaimed "Kings" of the Taiping Uprising, a movement that at its apogee held sway over a territory roughly the size of Italy.

Hong Xiuquan (1814-64), the "Heavenly King" who was the movement's supreme leader, strove to transform China by fulfilling a quasi-Christian millenarian prophecy. A frustrated scholar who had been exposed to a missionary tract while preparing to take the all-important civil-service examination that would secure him a post in the official bureaucracy, Hong went into a trance after failing the grueling test and awoke convinced that he was Christ's younger brother, selected by God to save China from rule by barbarian "demons," his term for the Manchu members of the Qing royal family.

Stephen Platt's "Autumn in the Heavenly Kingdom" is an impressive, gracefully written account of the war that ensued. Like many historians of our War Between the States, Mr. Platt presents stirring accounts of battles and finely etched portraits of military commanders. On the insurgent side, the commanders included figures like Chen Yucheng (aka the "Brave King"), who started his life in poverty and ended it near the top of the Taiping hierarchy. Ranged against Chen were men such as Li Hongzhang, a famous military modernizer, and Li's mentor, Zeng Guofan. A shrewd strategist torn by competing loyalties—to his family, his home province of Hunan and the dynasty he served—Zeng did more than anyone else to topple Hong's Taiping Kingdom. Mr. Platt's richly textured portrait of this complex, conflicted official is one of the strengths of the book....



SOURCE: Chronicle of Higher Ed (2-12-12)

Thomas Bender is a professor of history and university professor of the humanities at New York University. This essay is adapted from a talk at the annual meeting of the American Historical Association.

I want to pose a question to historians, as a profession: Did we make a decision in the past that has had consequences, presumably not positive ones, for our present and future? The question invites looking backward as a way to think about the future. Were there possibilities we missed? Did we lose part of ourselves sometime in the past?

The question is related to the plea recently made by the president and the executive director of the American Historical Association to consider nonacademic as well as academic careers as proper outcomes of doctoral education. This issue is forcing serious thinking not just in history, but in the humanities generally. And it affects not just our professional lives, but our lives as citizens.

The cultures of our departments too often discourage open discussion of nonacademic careers. One of the dirty little secrets discovered by the AHA Committee on Graduate Education, which I and others reported in The Education of Historians for the Twenty-First Century (2004), was that graduate students from various institutions were afraid to tell their advisers that Plan B, a nonacademic career, was for them Plan A. They preferred to pursue the profession of history in museums, historical societies, filmmaking, and the park service, among other possibilities. But they worried that if their advisers learned of that ambition, they could expect little or no future support from them.

Equally unsettling, the study reported that survey research covering thousands of doctoral students showed that graduate students in history, more than any other discipline except philosophy, entered graduate school to become teachers and left wanting to be researchers with as little teaching responsibility as possible. That really narrows the definition of our profession....



SOURCE: Spectator (UK) (02-09-2012)

Jonathan Freedland writes a weekly column for the Guardian. He is also a regular contributor to the New York Times and the New York Review of Books, and presents BBC Radio 4's contemporary history series, The Long View.

The Americans have 1776, the French have 1789 and we have 1940. The date is not official for us the way it is for them; it marks no formal founding of a nation or republic. But the events of that year — specifically, Britain’s lonely stand against the Nazi menace — have acquired the status of a creation myth, the heroic and finest hour in which modern Britain was born.

Our children study the second world war in school and when we vote for our Greatest Briton we choose the hero of 1940, Winston Churchill. When David Cameron deployed his quasi-veto in Brussels in December, the headline-writers and cartoonists instantly reached for 1940 and the image of the solitary Tommy: ‘Very well, alone.’ Meanwhile, an unlikely Twitter sensation, with more than 200,000 followers, is RealTimeWWII, providing 140-character updates of what happened on this day and at this hour in 1940.

It’s not hard to fathom this centrality of the war, and particularly 1940, in our collective memory. It was, we tell ourselves, the moment — perhaps the last — when Britain stood unambiguously for good against evil. Others surrendered, others dithered, but we fought, even at a great and enduring cost in blood and treasure, for what was right.

That is the national, founding narrative of modern Britain. The United States has a version of it too, even if it does not enjoy quite the same mythic status. In this Saving Private Ryan picture of America, the US was the lead force on the side of justice, represented by a ‘greatest generation’ which played the crucial role in defeating Hitler. It was all but a matter of manifest destiny: where else could America be but on the side of freedom against tyranny?..



SOURCE: WSJ (02-09-2012)

Mr. Roberts, a historian, is author most recently of The Storm of War: A New History of the Second World War (Harper, 2011).

Queen Elizabeth II acceded to the British throne 60 years ago this week. The Commonwealth will mark the occasion with celebrations throughout the year, but many republicans claim that the queen has made no genuine difference to the history of her country. Is that so?

The past six decades haven't been easy for the United Kingdom, and were it not for the monarchy there is no telling what social and political unrest might have ensued. To celebrate the good times properly, it's important to recall the perilous ones, and what might have happened had the queen not been, in the words of the historian John Grigg, "a bastion of stability in an age of social and moral flux."

When the queen's father, King George VI, died in February 1952, Evelyn Waugh remarked that his reign had been the most disastrous since that of King Stephen in the 12th century. Appeasement, World War II and the transfer of power in India meant that Britain had slipped from great-power status—where Churchill could meet Stalin and Roosevelt on equal terms at Tehran and Yalta—to the position of a second-division power teetering on bankruptcy. Elizabeth II's reign saw Britain further shorn of any delusions of global grandeur and forced to readjust to a new, less exalted but more honest place in the world. That this was managed without severe internal dissension was largely due to the political stability and continuity personified by the queen.

In France, the decolonization of Algeria led to bloody riots on the streets and several attempts to assassinate President Charles de Gaulle. The British were spared such traumas as the queen visibly supported the post-imperial settlement...



SOURCE: Irish Times (02-08-2012)

Paul Bew is professor of politics at Queen’s University Belfast, and an Independent (crossbench) member of the House of Lords. He has just published Enigma: A New Life of Charles Stewart Parnell (Gill and Macmillan, 2011).

Winston Churchill made his first public appearance in Ireland in 1878. In 1877 Disraeli had sent the Churchill family into a form of internal exile – the Duke of Marlborough was appointed viceroy in Dublin Castle and his son Randolph decided to act as his aide. Randolph’s wife Jenny – proud mother of cherubic Winston – painted his portrait and placed it on public display at a Dublin exhibition, to the joy of the local press.

He also learned his first political lesson. His nanny warned him against the dangers posed by the Fenians, reasonable advice given that in 1882 republican assassins murdered Lord Frederick Cavendish, the incoming chief secretary, in the nearby Phoenix Park.

Churchill’s relationship to Ireland is encapsulated for many by a few famous phrases – his celebrated reference to the integrity of the quarrel of the dreary steeples in Fermanagh and Tyrone, and his sharp critique of de Valera and neutrality in the fight against Hitler. But what did Churchill really think about Ireland?..



SOURCE: TIME (02-08-2012)

Susana Ferreira is a freelance journalist based in Haiti.

On Jan. 30, more than a year after former "President for Life" Jean-Claude Duvalier returned to Haiti, a Port-au-Prince judge concluded his lengthy investigation into the ex-dictator's brutal 1971-86 rule. Supreme Court Magistrate Carvès Jean had at his disposal reams of documents, human rights complaints, testimony from torture victims, copies of checks, international bank transfers and diary entries from former political prisoners at the notorious Fort Dimanche prison. Yet while Jean ruled that Duvalier should be tried on financial corruption charges — for the hundreds of millions of dollars allegedly plundered from Haiti's national coffers — he decided the statute of limitations on Duvalier's crimes against humanity had expired.

The U.K.-based Amnesty International spoke for most human rights groups worldwide when it called Jean's dismissal of the torture and murder charges against Duvalier "a disgrace." Some of Duvalier's alleged victims, including national soccer hero Bobby Duval and former U.N. Secretary-General spokesperson Michèle Montas, have vowed to appeal the ruling — citing, for one thing, Haiti's ratification of the American Convention on Human Rights, which puts the country under the jurisdiction of the Inter-American Court of Human Rights. Under international law, there is no time limit on crimes against humanity.

But other victims weren't so surprised by Jean's ruling. "I had no doubt, not even for a fraction of a second," says former parliamentary deputy Alix Fils-Aimé, who was held in solitary confinement by Duvalier's secret police for 16 months and then at Fort Dimanche before being exiled. "I have no faith in [Haiti's] justice system." And that's especially true, critics like Fils-Aimé fear, when it comes to the handling of Duvalier. Few countries, especially after the devastating 2010 earthquake that killed an estimated 200,000 people, have as troubling an image problem to solve as Haiti does. And the last thing the national re-branding campaign of President Michel Martelly needs is an ugly, protracted trial that would remind the world of the 30,000 Haitians abducted, tortured and killed by the regimes of "Baby Doc" Duvalier and his more ruthless father, François "Papa Doc" Duvalier, who ruled from 1957 until his death in 1971.

As a result, a big question is whether Martelly's efforts to market the idea of a new Haiti — one that is open for investment, tourism and modernization — include sweeping away the atrocities of the Duvalier dynasty like so much quake rubble. Another is whether otherwise well-intentioned U.S. celebrities like Haiti's new Ambassador-at-Large, Hollywood actor and humanitarian Sean Penn, have signed on with that controversial approach...



SOURCE: Washington Times (02-03-2012)

Edwin Meese III was attorney general of the United States and is a founding member of the policy board of the Carleson Center for Public Policy.

As the years pass, Ronald Reagan’s stature continues to grow, and it has reached the point where all sorts of people quote him to support their policies or candidacies.

Last year, during the contrived "crisis" over a possible national default, California Sen. Barbara Boxer invoked the Gipper to justify raising taxes: "I find myself these days quoting Ronald Reagan. ‘The full consequences of a default,’ he said, ‘or even the serious prospect of a default by the United States are impossible to predict and awesome to contemplate.’ "

Never mind that President Reagan, whose 101st birthday is Feb. 6, said in 1982: "We don’t have a trillion-dollar debt because we haven’t taxed enough; we have a trillion-dollar debt because we spend too much."

With our nation at a crossroads and in desperate need of the kind of leadership Reagan afforded us, it’s important to set the record straight about what he stood for...



SOURCE: NYT (2-5-12)

Joseph Tartakovsky is a law clerk at the United States Court of Appeals for the 10th Circuit.

TUESDAY is the bicentenary of the birth, in Portsmouth, England, of Charles Dickens, literature’s greatest humanist. We can rejoice that so many of the evils he assailed with his beautiful, ferocious quill — dismal debtors’ prisons, barefoot urchin labor, an indifferent nobility — have happily been reformed into oblivion. But one form of wickedness he decried haunts us still, proud and unrepentant: the lawyer.

Lawyers appear in 11 of his 15 novels. Some of them even resemble humans. Uriah Heep (“David Copperfield”) is a red-eyed cadaver whose “lank forefinger,” while he reads, makes “clammy tracks along the page ... like a snail.” Mr. Vholes (“Bleak House”), “so eager, so bloodless and gaunt,” is “always looking at the client, as if he were making a lingering meal of him with his eyes.” Most lawyers infest dimly lighted, moldy offices “like maggots in nuts.” (No, counselor, writers dead since 1870 cannot be sued for libel.)

Dickens knew whereof he spoke. At 15, he was hired as an “attorney’s clerk,” serving subpoenas, registering wills, copying transcripts; later he became a court reporter. For three formative years he was surrounded by law students, law clerks, copying clerks, court clerks, magistrates, barristers and solicitors who (reborn in his fiction) uttered cheerful sentiments like “I hate my profession.”...



SOURCE: LA Times (02-01-2012)

Elliot Perlman is the author of, most recently, the novel The Street Sweeper.

Some six or seven years ago I happened to see an Academy Award-winning documentary, "The Last Days," directed by James Moll and with Steven Spielberg as executive producer. It was of interest to me because, like the novel I was then writing, it dealt with the Holocaust and tangentially with the role of African American troops in World War II.

In the film, Paul Parks, an African American World War II veteran and civil rights activist, recounts being one of a number of black troops of the then-segregated U.S. Army present at the liberation of Dachau, the first concentration camp the Nazis built and one of the last to be liberated. Although it was not one of the six death camps created specifically for mass murder, many thousands of people died there during the Third Reich. The historical and moral significance of African American troops taking part in the liberation of Dachau was of interest to me.

Subsequently I learned that "The Last Days" and "Liberators: Fighting on Two Fronts in World War II" — a 1992 PBS documentary that also drew attention to the presence of black troops at Dachau — were roundly attacked either for their unquestioning acceptance of claims by allegedly dishonest black veterans or for allegedly fabricating the story.

I was curious about the motives of each side in this dispute. Why would black veterans say there were black troops present if there were not?..



SOURCE: Common-place (1-30-12)

Emily Redman is a PhD candidate in the history of science at the University of California at Berkeley who, when not distracted by enviro-political issues of yore, studies the history of mathematics education reform in the twentieth-century United States.

The planet is in the spotlight somewhat literally these days. Arguably interchangeable locutions of global warming, climate change, or "solar variations" have made headlines in the past decades—yes, those same decades that brought us An Inconvenient Truth and extreme storms. The underlying science has effectively bisected Washington, with the left and right offering partisan legislation aimed at the decidedly nonpartisan climate. Yet despite circular debates on Capitol Hill, options are being proffered to Americans for their fight to protect the global environment.

Efforts from Capitol Hill, you say? Given Americans' conflicted relationship with the regulatory powers of Washington, this fight is unsurprisingly politicized. Where the battlegrounds lie, however, is at once surprising and historically awkward.

Recently, media channels have brought to our attention the efforts underway to provide Americans with alternatives—federally mandated alternatives, no less—to the good ol' familiar light bulb. Scientists and engineers, tasked with developing eco-friendly light sources that mimic Thomas A. Edison's (1847-1931) incandescent bulb aesthetically while improving on it technologically, have unveiled an LED version of the original with all the federal subsidies and fanfare that Washington can offer. This past summer, Philips, the Netherlands-based producer of consumer electronics, collected $10 million in prize money for developing a highly efficient alternative to the standard sixty-watt incandescent. The award, familiarly known as the L Prize, was sponsored by the U.S. Department of Energy in the wake of George W. Bush-era legislation that requires light bulb makers to improve the efficiency of bulbs by twenty-five percent. The L Prize, then, was instituted as a government-sponsored nudge to spur lighting manufacturers to develop higher-efficiency alternatives to Edison-era products disparaged as "dated" on the prize website. And in a dangerous flirtation with the "nanny state," the website promises the prize will drive market adoption.

Apparently, however, Edison's familiar glass-bulb-meets-metal-filament is near and dear to many American hearts. Despite those years of thoughtlessly tossing cardboard boxes of replacements into our shopping carts, we've become inextricably connected to these devices. Edison, we shout, championing our American-scientist-hero who bore innovation. We are sure that there is some mistake, that Edison could not have led us astray; his light bulb seems irreplaceable and must be compatible with modern-day America....



SOURCE: WaPo (01-27-2012)

Mark Feldstein, the Richard Eaton professor of broadcast journalism at the University of Maryland, is the author of Poisoning the Press: Richard Nixon, Jack Anderson, and the Rise of Washington’s Scandal Culture.

Richard Nixon was many things — crafty, criminal, self-pitying, vengeful, paranoid. But gay?

According to a book to be released Tuesday, “Nixon’s Darkest Secrets,” the former president and his best friend, Charles “Bebe” Rebozo, had a relationship of a “possibly homosexual nature.” But author Don Fulsom, a former radio reporter who covered the White House from Lyndon Johnson’s presidency to Bill Clinton’s, provides scant evidence for this claim. No new White House tapes. No love letters, incriminating pictures or diary entries. No recently declassified government documents. Just a recollection from retired journalist Bonnie Angelo, who, in an interview with me, confirmed the story she told Fulsom: In 1972, she saw a tipsy Nixon pull Rebozo into a group photo at a Florida restaurant and hold his hand for “upwards of a minute.”

That’s pretty thin gruel — but not so thin that it keeps the author from enthusiastic speculation. “Was Nixon’s tough-guy attitude toward gays just a cover for his own homosexuality, bisexuality or asexuality?” Fulsom writes. “Well, he isn’t still called ‘Tricky Dick’ for nothing.”.. 



SOURCE: Smithsonian Magazine (2-1-12)

Toby Lester’s new book, Da Vinci’s Ghost, is about the history behind Leonardo’s Vitruvian Man. You can read more of his work at tobylester.com.

In 1986, during a visit to the Biblioteca Comunale Ariostea, in Ferrara, Italy, an architect named Claudio Sgarbi called up an anonymous copy of the Ten Books on Architecture, written by the Roman architect Vitruvius. The only such treatise to have survived from antiquity, the Ten Books is a classic, studied by historians of architecture and antiquity alike. Early copies are of great interest to scholars, but few had any idea this one existed. Academic inventories made no mention of it, and the Ariostea catalog described it unpromisingly as only a partial manuscript.

When Sgarbi took a look at it, he discovered, to his amazement, that in fact it contained almost the full text of the Ten Books, along with 127 drawings. Moreover, it showed every sign of having been produced during the late 1400s, years before anyone was known to have systematically illustrated the work. “I was totally astonished,” Sgarbi told me. But then he made what he calls “a discovery within the discovery”: On the manuscript’s 78th folio, he found a drawing that gave him the chills. It depicted a nude figure inside a circle and a square—and it looked uncannily like Leonardo da Vinci’s Vitruvian Man.

Everybody knows Leonardo’s drawing. It has become familiar to the point of banality. When Leonardo drew it, however, he was at work on something new: the attempt to illustrate the idea, set down by Vitruvius in the Ten Books, that the human body can be made to fit inside a circle and a square....



SOURCE: Telegraph (UK) (1-28-12)

Piers Paul Read’s ‘The Dreyfus Affair: The Story of the Most Infamous Miscarriage of Justice in French History’ is published by Bloomsbury this week.

France in the last decades of the 19th century saw an extraordinary flourishing in the arts, the sciences and technology which, along with its climate of sexual permissiveness, earned this period the title of la belle époque. To celebrate these achievements, the French government prepared for a Universal Exposition in Paris in 1900, with an ambitious programme of building that included two railway stations, Gare de Lyon and Gare d’Orsay, and two exhibition halls, the Grand and the Petit Palais.

These plans were suddenly jeopardised, in the autumn of 1899, by an international campaign to boycott the exhibition, a result of the outrage felt throughout the world at the conviction, at a court martial in Rennes, of a Jewish officer, Alfred Dreyfus, on charges of passing secret documents to the Germans. This was his second court martial. The first, five years earlier, had led to a sentence of life imprisonment on Devil’s Island. A campaign by his family, his lawyer and a small number of supporters had eventually uncovered overwhelming evidence that the traitor was not Dreyfus but another officer, Charles Walsin-Esterhazy. However, senior officers on the general staff and in military intelligence feared that to admit a miscarriage of justice would not just lose them their jobs but discredit the army. To thwart a revision of the case against Dreyfus, they resorted to a series of threats, forgeries and dirty tricks.

On January 13, 1898, France’s leading novelist, Émile Zola, entered the fray with a polemic, J’Accuse, naming the officers responsible for the conspiracy against Dreyfus. It was hailed as heroic by the Left, outrageous by the Right, and provoked anti-Semitic riots throughout France. Opinion abroad was incredulous. How could France, the most civilised country in Europe, experience this eruption of medieval barbarism? Why had the case of one Jewish officer led to this rage against all Jews?...



SOURCE: NYT (1-31-12)

Roger Cohen is a columnist for the NYT.

VILNIUS, LITHUANIA — The “double genocide” wars that pit Stalin’s crimes against Hitler’s are raging in wide swathes of Europe and every now and again along comes a gust from the past to stoke them. The 70th anniversary this month of the Nazi adoption at Wannsee of annihilation plans for the Jews provided one such squall.

Yes, the past is still treacherous beneath Europe’s calm surface. Memory swirls untamed in the parts of the Continent that the American historian Timothy Snyder calls “Bloodlands,” the slaughterhouses from Lithuania to Ukraine that Hitler and Stalin subjected to their murderous whim.

To mark the Wannsee anniversary, over 70 European Parliament members, including 8 Lithuanians, signed a declaration objecting to “attempts to obfuscate the Holocaust by diminishing its uniqueness and deeming it to be equal, similar or equivalent to Communism.” It also rejected efforts to rewrite European school history books “to reflect the notion of ‘double genocide.’”...



SOURCE: Japan Times (01-27-2012)

Gwynne Dyer is a London-based independent journalist whose articles are published in 45 countries.

I go to France quite often, but after this article is published, I may be liable to arrest if I set foot in the country. The French parliament has just passed a bill, proposed by President Nicolas Sarkozy's party, that will make it a crime to question whether the Armenian massacres in eastern Turkey in 1915 qualified as a genocide. Sarkozy will doubtless sign it into law next month, just in time for the presidential elections.

It won't be just a crime to deny that hundreds of thousands of Armenians, perhaps as many as a million, were killed in eastern Anatolia in 1915, and that it was the responsibility of the Turkish state. That is a historical fact, and only fools, knaves and Turkish ultranationalists deny it. It will also be a crime, punishable by one year in prison and a fine of up to €45,000, even to question the use of the word "genocide."

"Genocide" doesn't just mean killing a lot of people, even a lot of civilians. If it did, then the United States would be guilty of genocide because of Hiroshima. Genocide is a deliberate attempt to wipe out much or all of a specific ethnic, linguistic or religious group. Words matter...



SOURCE: American Interest (01-24-2012)

Walter Russell Mead is professor of foreign affairs and the humanities at Bard College and editor-at-large of The American Interest.

Writing about the onset of the Great Depression, John Kenneth Galbraith famously said that the end had come but was not yet in sight. The past was crumbling under their feet, but people could not imagine how the future would play out. Their social imagination had hit a wall.

The same thing is happening today: The core institutions, ideas and expectations that shaped American life for the sixty years after the New Deal don’t work anymore. The gaps between the social system we inhabit and the one we now need are becoming so wide that we can no longer paper over them. But even as the failures of the old system become more inescapable and more damaging, our national discourse remains stuck in a bygone age. The end is here, but we can’t quite take it in.

In the old system, most blue-collar and white-collar workers held stable, lifetime jobs with defined benefit pensions, and a career civil service administered a growing state as living standards for all social classes steadily rose. Gaps between the classes remained fairly consistent in an industrial economy characterized by strong unions in stable, government-brokered arrangements with large corporations—what Galbraith and others referred to as the Iron Triangle. High school graduates were pretty much guaranteed lifetime employment in a job that provided a comfortable lower middle-class lifestyle; college graduates could expect a better paid and equally secure future. An increasing "social dividend", meanwhile, accrued in various forms: longer vacations, more and cheaper state-supported education, earlier retirement, shorter work weeks, more social and literal mobility, and more diverse forms of affordable entertainment. Call all this, taken together, the blue model.

In the heyday of the blue model, economists and social scientists assumed that from generation to generation Americans would live a life of incremental improvements. The details of life would keep getting better even as the broad outlines of society stayed the same. The advanced industrial democracies, of which the United States was the largest, wealthiest and strongest, had reached the apex of social achievement. It had, in other words, defined and was in the process of perfecting political and social "best practice." America was what "developed" human society looked like and no more radical changes were in the offing. Amid the hubris that such conceptions encouraged, Professor (later Ambassador) Galbraith was moved to state, in 1952, that "most of the cheap and simple inventions have been made." If only the United States and its allies could best the Soviet Union and its counter-model, then indeed—as a later writer would put it—History would end in the philosophical sense that only one set of universally acknowledged best practices would be left standing.

Life isn’t this simple anymore...



SOURCE: openDemocracy (UK) (01-24-2012)

Christopher Sisserian is a freelance journalist living in London, and currently a graduate student of International Politics at the School of Oriental and African Studies (SOAS).

The recent passing of the French bill criminalising denial of the Armenian genocide has been the cause of much celebration for Armenians in France and across the world. Though celebrated by many as a step towards recognition and justice for a crime committed nearly 100 years ago, it is difficult to see how the law presents anything other than another obstacle to the process of reconciliation between Armenians and Turks. Though the formal process of reconciliation has definitely stalled in the past year, informal contacts between Armenians and Turks have continued to grow, maintaining the small possibility of improved neighbourly relations.

The assassinated Turkish-Armenian newspaper editor Hrant Dink, whose murder brought the importance of Turkish-Armenian relations to the forefront of global consciousness, was resolutely against the passing of any such law and even promised to travel to France and deny the genocide himself if such a law were ever passed. That Dink is sadly no longer alive to stand up for the freedom of speech he campaigned for in his native Turkey is testament to the fact that relations urgently need to be improved.

However, rather than provide a step forward, the new French law only serves to fuel the seemingly diametrically opposed nationalist identities that can trace their roots back to the events being legislated over. The media storm created by the bill adds to the discourse presenting the issue as a simple binary, with Armenians claiming one thing and Turks maintaining another. Yet it is not Armenians that claim genocide occurred but rather scholars of various nations, including Turks, who have documented and analysed the history. Conversely, it is not Turks that present the counterargument; rather it is the Turkish state that denies the genocide through its official policy. Many Turks are aware of what happened towards the end of the Ottoman Empire, particularly in intellectual circles and in the south east of Turkey, where the tragedy of 1915 is maintained in oral histories. Consequently, the number of Armenians and Turks managing to find common ground and come to terms with their shared history is constantly growing...



SOURCE: The Atlantic (1-25-12)

Dominic Tierney is assistant professor of political science at Swarthmore College. He is the author of How We Fight: Crusades, Quagmires, and the American Way of War.

On June 5, 1944, General Dwight Eisenhower wrote down a message, carefully folded it, and placed it in his wallet. It contained a public statement in case the D-Day invasion failed. Twenty-five years later, in 1969, Richard Nixon's White House drafted a speech to use if the moon landing was unsuccessful and the astronauts were trapped on the lunar surface. This is not alternate history. This is very real history, about leaders preparing for a contingency that never transpired. More than anything, the messages reveal the fine line between triumph and disaster.

Publicly, Eisenhower radiated confidence about the liberation of Europe. But privately, he was deeply worried that the Germans would push the invaders back into the sea. After all, the Allies could initially propel only five divisions by sea and three divisions by air against an area held by 58 German divisions. If the Allies had been defeated, Eisenhower planned to issue a statement.

Our landings in the Cherbourg-Havre area have failed to gain a satisfactory foothold and I have withdrawn the troops. My decision to attack at this time and place was based on the best information available. The troops, the air, and the Navy did all that bravery and devotion to duty could do. If any blame or fault attaches to the attempt it is mine alone.



SOURCE: Foreign Policy (01-24-2012)

Michael Dobbs is a prize-winning foreign correspondent and author.

The photograph above is a unique historical image. It captures a massacre actually in progress near the United Nations "safe area" of Srebrenica around 17:15 on July 13, 1995. What makes this image even more remarkable -- and worth studying by anyone interested in the subject of genocide prevention -- is that it became a public document one day after the massacre, on July 14. It was part of a video reportage on events in Srebrenica aired by a Belgrade television station.

Granted, the photograph is initially difficult to interpret. If you look closely, however, you can identify bodies piled outside a warehouse, guarded by a soldier. In the video from which the image was taken (shown below), you can hear shots, and a reporter talking about "dead Muslim soldiers." Combined with overhead reconnaissance collected by the United States, intercepts, and eyewitness accounts, the fleeting image displayed on Belgrade Studio B was clear evidence that terrible events were taking place in eastern Bosnia.

Of course, it is easy to pull all this evidence together now and analyze exactly what it means. The challenge for the American intelligence community back in 1995 was the same as it was during the run-up to 9/11: "connecting the dots." An additional problem, in the case of Srebrenica, was that preventing genocide in a faraway country ranked low on the list of U.S. intelligence priorities. At the time, the U.S. government was more interested in the military/strategic aspects of the three-and-a-half-year Bosnia war...


