George Mason University's
History News Network

Book Reviews


Book Editors: Ron Briley, Jim Cullen, Murray Polner, Luther Spoehr

This Department features reviews and summaries of new books that link history and current events. From time to time we also feature essays that highlight publishing trends in various fields related to history.

If you would like to tell the editors about a new book (even your own) that addresses the concerns of HNN -- current events and history -- or would like to write a review, please send us an email: editor@historynewsnetwork.org.


SOURCE: HNN (1/14/2012)

[J. Stanley Lemons, Professor of History Emeritus at Rhode Island College, is the author of First:  The History of the First Baptist Church in America (2001) and Rhode Island: The Ocean State (2004), among other works.]

 

            John Barry has written one of the best and clearest books about Roger Williams that I have ever read, a book about the ideas of Roger Williams and their context. It is not a biography in the usual sense, but it tells what ideas, forces, and events shaped Williams’ thinking. Barry concludes that Williams was truly revolutionary and was the source of one of two major strands of the American “soul.” He was the fountain of the stream of religious freedom, separation of church and state, and political liberty. The other stream was represented by John Winthrop, the first governor of Massachusetts. Winthrop presented America as a Christian “city on a hill” whose continued success required the state to be a “nurturing father” to the church. The government and the church had to be partners in curbing human wickedness and preserving their covenant with God. If they failed, God would surely make an example of their city, treating it as if it were Sodom or Gomorrah. Williams, on the other hand, said that the state had absolutely no role to play in religion.

            Barry devotes about a third of the book to the British context of Williams’ development, especially stressing the impact of Sir Edward Coke and Sir Francis Bacon. The greater influence was Coke, who plucked the boy Williams from obscurity, made him his amanuensis, and saw that he received the finest formal education available at Charterhouse School and Cambridge University. Williams accompanied Coke to Parliament, the Court of Star Chamber, Court of Common Pleas, the Privy Council, conferences with the King, and other high-level meetings. Williams learned about the law, government, and justice first hand at the elbow of England’s greatest jurist and legal thinker.

While Barry makes a solid case for Coke’s great influence on Williams, his contention that Williams’ view of the world was formed by the scientific ideas of Sir Francis Bacon seems strained. Barry contends that Williams’ reliance upon evidence rather than dogma or logic resulted from his acceptance of Bacon’s experimental view of the world. But it seems that what Williams concluded about religion, the state, and humanity flowed instead from his interpretation of the Scriptures and from his personal experiences.

            His interpretations and experience led Williams to hold the most enlightened view of the Native Americans of any Englishman of his time, and they led him to his revolutionary conclusion that church and state must be separated. He founded, for the first time in modern history, a totally secular state. The town government that he created in 1637, the charter that he obtained from Parliament in 1644, and the government established for Providence Plantations in 1647 were all secular entities. The enormity of this development provoked the neighboring colonies of Massachusetts, Plymouth, and Connecticut to try to dismember, overawe, and destroy Williams’ colony. Barry observes that when Williams secured a charter to legitimize his plantation, the friendship that he had had with John Winthrop went cold and Massachusetts tried to claim Providence Plantations on the basis of a fraudulent charter.

            The “wall of separation” between church and state was Williams’ metaphor 150 years before Thomas Jefferson used it. And, while there is no evidence that Jefferson ever read anything written by Williams, everyone knows how important John Locke was to Jefferson. Barry agrees with many historians that Locke knew Williams’ work, even though Williams was more radical than Locke or John Milton in advocating religious freedom for “all men of all nations.”

            A couple of minor factual errors appear. For example, Barry has the irrepressible, obstreperous Samuel Gorton coming to Newport before it had even been founded, not realizing that Gorton was one of the people who caused William Coddington and company to withdraw from Portsmouth in 1639. Elsewhere, Barry says that Roger Williams’ famous debate with the Quakers in 1672 lasted two days, when, in fact, the debate lasted four excruciating days: three in Newport and a fourth in Providence five days later.

            A more important issue is applying the “Seeker” label to Williams. Barry accepts the authority of James Ernst, who wrote a major biography of Roger Williams in 1932 in which he argued that Williams was a Seeker. However, more recent scholars, such as Edmund Morgan and Edwin Gaustad, dismissed this interpretation, noting that the Seeker label was applied to Williams by his enemies. It was meant to discredit him. The Seekers were generally understood to believe in universal salvation and to deny the divinity of Jesus Christ, two positions that were totally anathema to Roger Williams. It is interesting to note that the last persons burned at the stake in England, Bartholomew Legate and Edward Wightman in 1612, were identified as Seekers, and Legate was executed close enough to where the nine-year-old Roger lived that he may actually have witnessed the burning. The bells of Williams’ parish church tolled while Legate “burned to ashes.”

            Williams never accepted the label, but consistently described himself as a witness for Christianity. “Witness” is the English translation of the Greek word “martyr,” and that is exactly how Williams repeatedly described himself. One did not have to die to be a martyr, but one had to suffer, and Williams felt that he had certainly suffered for his faith. He had been driven from England, then condemned by his fellow ministers and former friends and banished from Massachusetts (and, de facto, Plymouth), and forced from civil society into the wilderness. He saw himself as a “witness” to Christianity until God would send a new apostle to restore the Church. “Seeker” has come to have a positive meaning in our time, but in Williams’ time it was a slander. One might as validly call him a “Roman Catholic” as call him a “Seeker,” but everyone would recognize that as absurd.

            Originally Barry had intended to write about Billy Sunday, the popular Fundamentalist evangelist of the early 20th century, but that led him back to the 17th century and Roger Williams. He concludes that the division between Winthrop and Williams is still with us and that the two represent divergent understandings of the American soul. The debate that Roger Williams had in the 1640s with John Winthrop and the champion of the Massachusetts way, John Cotton, still rages in the 21st century. The Religious Right and those who maintain that the United States was founded as a Christian nation articulate the same position that John Winthrop and John Cotton advanced in the 17th century.

            The late Peter Marshall, Jr., wrote three books trying to make the case that the United States was founded as a Christian nation and to call Americans back to their “Christian and biblical heritage.”  One of these books, The Light and the Glory, written in 1977, argued that God had a plan for America, beginning with Columbus in 1492. But, Marshall identified two villains who damaged God’s plan for America: Roger Williams and Thomas Jefferson.

In many ways, Roger Williams is a greater problem for Fundamentalists because he was a devout Christian minister, a profound Biblicist whose ideas flowed from his interpretation of the Bible. And Williams demolished the arguments of those who would claim that God favored the nurturing father-state or even that there was any such thing as a “Christian nation.”  Williams’ profound analysis of these matters and the conclusions he drew have made him especially attractive to those concerned about the rise of the Religious Right, Christian Reconstructionists, and those who think that the United States is God’s new Israel.


Saturday, January 14, 2012 - 16:44

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press later this year. Cullen blogs at American History Now.

The wacky premise of this novel merits a look. On March 4, 1913, on the final day of a presidency wedged between those of the more commanding Theodore Roosevelt and Woodrow Wilson, the outgoing president William Howard Taft -- all 300+ pounds of him -- somehow slips through a time portal and reappears on the White House grounds in late 2011. Shot by a Secret Service agent terrified by the muddy beast, Taft, he of stuck-in-the-bathtub lore, is nursed back to health and introduced to the 21st century, where there's a lot more affection for the 27th president than there ever was a century ago.

There's some entertainment to be had in this fish-out-of-water story. "Good God, man. Is this all truly necessary? I must look like a cut-rate Manila harlot," the one-time administrator of the Philippines says. He wonders what ever happened to good old tap water, and expresses surprise that cell phones didn't come along sooner.

First-time novelist Heller, a music journalist and writer of genre fiction, renders Taft as a colorful cartoon, which is mildly amusing, though all the attention to his gargantuan appetite and handlebar mustache becomes a bit tiresome after a while. (Other characters are a good deal less compelling.) We watch Taft as he visits familiar places, gets drunk, gets laid, and passively finds himself drawn into presidential politics (just as he was the first time around). Heller augments his traditional storyline with a series of mock documents -- television talk-show transcripts, Secret Service memos, Twitter feeds, polling data -- that contextualize the story.

In this fictional world, Barack Obama is still president, running against an unnamed Republican. Taft's politics are a bit of a cipher, which is at least partially Heller's point. One of the great ironies here, of course, is that the trust-busting, good-government policies of a man who was perceived as a conservative Republican then put him far to the left of anyone in the GOP now, and indeed far to the left of many Democrats. But libertarians are quick to note that his tax rates were lower than any today, and a dissatisfied general electorate rallies to anyone who seems authentic. So it is that we witness the birth of the Taft Party, an apparent satire of the Tea Party in all its incoherence (we get a particularly wrong-headed discussion of immigration from a surveillance tape of two men discussing Taft while standing in front of their respective urinals).

Heller weaves in a subplot involving Big Agriculture that figures in the climax of the story. But having seized on an arresting premise, he has a little trouble maintaining control of his material, which takes a bit long to develop and which fizzles somewhat. But it's nevertheless a fast, light read.

Taft is revealing in the way it taps a longstanding American nostalgia that goes back at least as far as Mr. Smith Goes to Washington. We want straight shooters, until they start telling us things we don't want to hear. The difference now is that we literally can't (as opposed to won't) afford the pretty promises of a military that will always remain powerful, services that will always be adequate, and taxes that will always be low. I suspect that the longings Heller describes are real enough and available to be exploited by those with fewer scruples than Taft, one of the few good men to be president and, not coincidentally, like other good men -- a pair of Adamses, an elder Bush, Gerald Ford, Jimmy Carter -- a one-term president. Maybe what we really need is an Iron Lady (with Meryl Streep's wit) instead.

Note: There is an accompanying website for Taft 2012, and a Facebook page worth a connection. It's fun to see updates like, "Time lauded Mitt Romney for using 'clear, concise, declarative sentences' in this week's debates. We don't expect much today, do we?" Sounds like he might be a good commentator to have around for the presidential campaign.


Thursday, January 12, 2012 - 11:09

“Socialism” has been redefined, praised and denigrated by free marketeers, assorted liberals and conservatives, pandering politicians and Obama-haters.  But in the end, it never took hold in this country because of governmental persecution, corporate opposition, recurring internal divisions, the New Deal, and perhaps above all the “American Dream,” which led American workers to support capitalism.

Was it possible to reconcile socialism with capitalism?  In socialism’s heyday, Eugene Debs, the Socialist candidate, received 6 percent of the vote in the 1912 election running against the Republican William Howard Taft, the pugnacious Bull Mooser Teddy Roosevelt, and the Democrat Woodrow Wilson.  Eight years later, in the 1920 election, while Debs was still imprisoned in the Atlanta penitentiary for opposing World War I and the draft, he received nearly one million votes.  It’s hard to remember that today’s consistently Republican/conservative Oklahoma once contained the nation’s second-largest socialist party.  The nadir of socialism’s national electoral popularity came when Norman Thomas, the party’s perennial candidate, won 140,000 votes in his final run in 1948, next to Harry Truman’s 24 million and Thomas Dewey’s 22 million.  In that last fruitless presidential campaign he railed against communism, Henry Wallace’s Progressive Party, and the bellicose foreign policies of both Democrats and Republicans.

Neither Debs nor Thomas, nor for that matter such once-prominent socialists as Daniel DeLeon, Morris Hillquit, and even Helen Keller, is remembered today.  Nor are socialism’s historical ancestors:  Brook Farm, the Amana and Oneida communities, Robert Owen’s New Harmony colony, and Edward Bellamy’s late nineteenth-century “Looking Backward,” a hugely popular novel about a socialist and utopian America.

All the more reason, then, to welcome Norman’s great-granddaughter Louisa Thomas’s Conscience, a fitting title for the story of the four Thomas sons and their two sisters.  It’s an affectionate, well-crafted, and occasionally critical biography of the writer’s extended family set during the early twentieth century, when poverty was common and suffering unrelieved by safety nets.  Strikes were brutally smashed by the military, police, the National Guard, hostile judges, and rapacious corporations.  It was a century that began with the Spanish-American and Philippine-American wars and continued with U.S. military intervention in China, the Dominican Republic, Haiti, Mexico, and Russia—and of course World War I.  The aftermath of the Great War witnessed federal agents and vigilantes hunting down “subversives,” and lynchings and attacks against Black Americans.

After graduating from Princeton, Norman became an ordained liberal minister, serving in an East Harlem Protestant church where, for the first time in his rather sheltered life, he was exposed to his congregation’s poverty and desperation.  Never really a doctrinaire theologian, he discarded his Christianity in 1918 and turned instead for the rest of his life to an unrequited devotion to socialism, pacifism, defense of liberty and protection of working men and women.

His younger brother Evan was an ethical and religious conscientious objector to WWI.  Encouraged by pro-war officials, super-patriots condemned COs as traitors.  Seventeen received death sentences and some one hundred fifty received life sentences, all of them ultimately rescinded, yet not until the early 1930s were the last COs finally released.  Louisa Thomas describes Evan’s remarkable courage and stubbornness as he was tortured in federal prisons, a practice repeated in our secret prisons after 9/11.

The Thomas family also produced Ralph, an Army captain whose soldier son was killed in World War II, and the youngest, Arthur, who briefly served in the army but never in combat.  The book’s apt subtitle, “Two Soldiers, Two Pacifists, One Family,” leaves room as well for the two younger Thomas sisters, whose outlooks resembled Norman’s and Evan’s.  Emily became a social worker in a hospital for interned Japanese Americans during the Second World War, while Agnes was a teacher in a Quaker school and active in the Women’s International League for Peace and Freedom.

Louisa Thomas spends the greater part of her absorbing book on Norman.  He was hardly a dogmatic socialist, and one imagines he never spent much time composing theoretical works about the blessings of socialism or trying to master Marx and the writings of European socialists.  “Norman became a socialist,” writes Louisa, realistically but somewhat sorrowfully, because “he wanted to help workers who were powerless to change their situations, but the truth was that he was powerless, too.”

Once World War I ended, he challenged the notorious Red Scare, denounced the cruelties directed at powerless blacks, and, with Roger Baldwin and Rabbi Judah Magnes, helped establish the ACLU.  Thomas, who ran for the presidency six times and lost each time, was during World War II one of the rare well-known Americans who dared to denounce the incarceration of Japanese Americans.  Until Pearl Harbor and the resulting collapse of the huge non-interventionist America First movement to which he belonged (other prominent members included Joseph Kennedy, Gerald Ford, the progressive senator Robert La Follette, Jr., and Father Charles Coughlin, the anti-Semitic Catholic radio priest), he opposed U.S. entry into the war.  In his old age, Norman continued speaking against his nation’s addiction to war, appearing at anti-Vietnam War rallies where he always seemed to echo Randolph Bourne’s classic remark, “War is the Health of the State.”

It is long forgotten that it was the socialists in this country who early on backed labor unions, the 8-hour day, laws against child labor, and other protections for working people.  Yet Norman and his adherents never succeeded in building an organization strong enough to challenge the forces that created the Great Depression.  The growing Communist Party and its loyalists were in love with Stalin’s Russia, while Democrats were infatuated with FDR and the New Deal.  Moreover, the bombing of Pearl Harbor meant that few Americans wanted to listen to any talk of pacifism or of reordering the capitalist system.  Louisa is absolutely correct in summing up her great-grandfather: “Norman’s conscience,” she wrote, “was not the nation’s; it was his own.”


Tuesday, January 10, 2012 - 15:08

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN and the author of Born in the USA: Bruce Springsteen and the American Tradition (1997, 2005).  He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press later this year. Cullen blogs at American History Now.

Greil Marcus is the Ernest Hemingway of cultural criticism. I don't mean that in terms of style -- Hemingway's laconic prose is light years away from that of the effusive, endlessly analogizing Marcus -- but rather that Marcus, in a manner perhaps only paralleled by Pauline Kael, has inspired a generation of bad imitators. Myself among them.

I discovered Marcus somewhat belatedly, at the time of the second (1982) edition of his classic 1975 study Mystery Train: Images of America in Rock 'n' Roll Music. I read the book multiple times in ensuing iterations, enchanted by its intoxicating prose, despite the fact that it would be years before I heard much of the music on which it was based. I was thrilled by the idea that popular music could be a subject of serious fun. It's hard to imagine that I would ever have received a Ph.D. in American Civilization, specializing in the history of popular culture, had I not encountered that book at a formative period in my life.

Though he has been a consistently productive magazine journalist, Marcus's output as a writer of books was relatively modest in the twenty years following Mystery Train, notwithstanding that his 1989 book Lipstick Traces: A Secret History of the 20th Century has had the heft and durability of a major study. But in the last two decades -- and in the last five years or so in particular -- his pace as a writer, editor and collaborator has picked up. He's taken to writing quick, impressionistic books on subjects like Bob Dylan and Van Morrison. The Doors: A Lifetime of Listening to Five Mean Years represents relatively fresh territory, not only because the band has not really been a long-term fixture of his writing, but also because the group has always had a mixed critical reputation. Conventional critical wisdom holds that while the Doors produced a few deeply suggestive songs that have had a remarkably durable life on FM radio, lead singer Jim Morrison in particular was, in the main, undisciplined at best and boorishly pretentious at worst. Though his overall stance toward the band is positive, Marcus does not fundamentally challenge this view, instead focusing on what he considers the band's best work in its brief life in the second half of the 1960s.

I use the word "focusing" loosely; Marcus has never been an especially tight writer. Indeed, as a number of impatient readers have complained, the Doors are less the subject of this book than a point of departure for a series of riffs on subjects that seem loosely connected at best. A chapter centered on the 1991 Oliver Stone biopic The Doors jumps (or perhaps lurches) from there into an extended analysis of the now obscure 1990 Christian Slater film Pump Up the Volume for reasons that are never entirely clear. If you look up Slater in the index of the book, you'll find him sandwiched between the Situationists, Tennessee Ernie Ford's "Sixteen Tons," and Josef Skvorecky on one side, and Grace Slick, Bessie Smith, and Peter Smithson on the other. As one who considers himself about as well read as anyone in 20th century cultural history, I find myself wondering if Marcus could possibly expect anyone to keep up with him as he leaps from pop music to architecture to crime fiction and back again.

He can exasperate at the level of the individual sentence as well. He writes of "The End," one of the better-known songs in the Doors canon, that "The furious, impossibly sustained assault that will steer the song to its end, a syncopation that swirls on its own momentum, each musician called upon not just to match the pace of the others but to draw his own pictures inside the maelstrom -- in its way this is a relief, because that syncopation gives the music a grounding you can count on, that you can count off yourself." To which I say: Huh? He describes "Roadhouse Blues" "not as an autobiography, not as confession, not as a cry for help or a fuck you to whomever asked, but as Louise Brooks liked to quote, she said, from an old dictionary, 'a subjective epic composition in which the author begs leave to treat the world according to his own point of view.'" Marcus has long been lionized as a founding father of rock criticism, and one can't help but wonder whether he and others regard him as beyond the quotidian vagaries of line editing.

But there's a reason Marcus is lionized. At his best he opens cultural windows that can only be jimmied open with unconventional prose. Of the long shadow cast by his generation, he writes, "This is what is terrifying: the notion that the Sixties was no grand, simple, romantic time to sell to others as a nice place to visit, but a place, even as it is created, people know they can never really inhabit, and never escape." (Coming of age in the seventies, I certainly had that oppressive feeling.) He describes the prescient dark mood of the Doors by noting that "After Charles Manson, people could look back at 'The End,' 'Strange Days,' 'People are Strange,' and 'End of the Night' and hear what Manson had done as if it had yet to happen, as if they should have known, as if, in the deep textures of the music, they had." Yes: the Doors did ride a curdling cultural wave as the promise of the early sixties gave way to the mindless violence of the Manson murders. Marcus distills the essence of the band better than they ever had themselves: "They didn't promise happy endings. Their best songs said happy endings weren't interesting, and they weren't deserved."

Marcus is like a stand-up comedian who only speaks in punch lines, refusing to set up the payoff (in this case, brief biographical sketches, career overviews, and something resembling a systematically offered sense of context). Such omissions appear to be an avowed (Beat) aesthetic, even a moral principle: You don't get to the old weird America by traveling down familiar highways. The problem, for him no less than the pop artists he writes about -- Jim Morrison in particular -- is that in the negotiation between reader and writer there's a thin line between bracing challenge and alienating self-indulgence, and it's hard to avoid concluding, as much as I hate to, that there are times when I feel Marcus crosses it.

I find myself thinking about Marcus the way he felt about Elvis Presley: awed by his talent but dismayed by his lack of constancy. I've got this idea that asking him to be different would be ungrateful at best and stupid at worst, failing to value the very devil-may-care quality that made him special in the first place. And I'm not sure how much in the way of evolution I should expect of any person old enough to have earned Social Security benefits, among other benchmarks. But I feel that not to ask would also be a betrayal of sorts, a willingness to settle that Marcus taught me long ago is a seductively dangerous temptation in American life. So I'll say: thank you, Greil Marcus. You changed my life. And I'll ask: Should we go somewhere else now?


Tuesday, January 3, 2012 - 13:57

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press later this year. Cullen blogs at American History Now.

It's always a surprising pleasure to find an English professor able to write about literature in comprehensible English. It's even more surprising when that professor can write narrative history better than most historians do. What's stunning is an English professor who writes good history that spans about 1800 years and who manages to ground his story in a set of richly contextualized moments that he stitches together with notable deftness. But then, this shouldn't really be all that surprising: we're talking about Harvard professor Stephen Greenblatt here. This New Historicist extraordinaire -- author of Will in the World: How Shakespeare Became Shakespeare -- has just won the National Book Award for his latest book, The Swerve: How the World Became Modern.

The point of departure for The Swerve is the year 1417, when an obscure former papal scribe named Poggio Bracciolini enters a German monastery. Greenblatt manages to capture the way in which Poggio is a figure of his time even as he conveys the novelty, even strangeness, of this bibliophile's quest to recover ancient works and the practical difficulties a man of his station faced in doing so. He then describes how Poggio encounters On the Nature of Things, a poem by the Roman poet/philosopher Lucretius, written in the first century BCE. Lucretius was deeply influenced by the Greek philosopher Epicurus (341-270 BCE). In the context of its pre-Renaissance recovery, the poem represented a radical challenge to the common sense of its time in its emphasis on pleasure as an end unto itself, as well as its de-emphasis on the role of the divine in human aspiration and fate.

Greenblatt's analysis leads to some deeply satisfying digressions, among them an explanation of Epicurean philosophy and the place of Greek thought in the Roman republic and empire. It also includes an explanation of the ongoing scholarly significance of Pompeii as a source for understanding ancient life in the 250 years since its discovery under the mountain of ash spewed by Mt. Vesuvius in 79 CE. (On the Nature of Things was discovered in an impressive library in a house there.) And, most hauntingly, it includes an explanation of the process whereby the classical legacy was gradually erased from the human record by a combination of disasters, neglect, and active forgetting by an ascendant Christianity determined to eliminate epistemological rivals. It's difficult to finish reading this segment of The Swerve without having one's confidence shaken that our current state/memory of civilization is destined for permanence, especially when one considers the utter fragility of electronic information when compared with the strength, never mind beauty, of vellum.

From here, Greenblatt resumes telling the story of what happened when On the Nature of Things was re-injected into the bloodstream of western civilization. This was by no means a straightforward process. Ever a man of the world even amid his classical studies, Poggio skillfully navigated papal politics even as he grew exasperated by a friend's unwillingness to return the book. Eventually, however, On the Nature of Things was re-copied and distributed all over Europe, where its Epicurean vision laid the foundations for the Renaissance in Italy and beyond. Greenblatt traces its influence across sources that include Montaigne, Shakespeare (of course), and Thomas Jefferson.

Readers with intimate familiarity with these subjects will no doubt quibble with aspects of Greenblatt's account, among them the centrality of Lucretius or Epicurus in kick-starting modernity. Whether or not they're correct, The Swerve is simply marvelous -- emphasis here on simply -- in illustrating cultural disruption and transmission as a deeply historical process even as ideas partially transcend the circumstances of their articulation. In some sense, Greenblatt is playing the role of popularizer here, but he could never mesh his subjects and analyze them as well as he does without a lifetime of immersion and first-hand observation. One can only hope that this book will be among those that survive fires, floods, microbes and sheer human cupidity so that others will know what the finest flower of our academy could produce.


Sunday, January 1, 2012 - 15:15

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation, among other books. He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press next year. Cullen blogs at American History Now.

I didn't really want to read this book. I'm finishing one on a related topic, and have reached that point in the process where I just want to be done with it already. But my editor sent Hollywood Left and Right along to me, as good editors do, as a way of nudging me a little bit farther. I'm glad he did. It's a good piece of scholarship. And, I'm happy to report, an entertaining one.

What Steven J. Ross, a seasoned film historian, offers here is a set of ten biographies that function as case studies in the way movie stars and impresarios -- sometimes the same person -- have used their cinematic careers for the purposes of political activism. With a sense of judiciousness and empathy toward all his subjects, he renders five careers on the left (Charlie Chaplin, Edward G. Robinson, Harry Belafonte, Jane Fonda, and Warren Beatty) and five on the right (Louis B. Mayer, George Murphy, Ronald Reagan, Charlton Heston, and Arnold Schwarzenegger). Ross gently suggests that while we tend to think of Hollywood as a liberal bastion, it has had a series of prominent conservative champions, who on balance have been more successful than liberals in actually realizing their political goals. To that extent, at least, the book has a revisionist air.

Ross does a lot of things well. Each of his chapters offers skillfully limned portraits (Murphy and Reagan, whose careers coincided and interests overlapped, are treated as a pair). In some cases their stories are familiar, but Ross is able to season them with relevant, sometimes first-hand, observations. He managed to get interviews with many of his principals, among them reclusive subjects like Beatty, as well as their associates, such as George McGovern and Gary Hart.

Ross is also a deft analyst. He weaves in close readings of particular films, contextualizing them in their immediate sociopolitical environments. There's very good stuff, for example, on the complexities of anticommunism and Hollywood unions at mid-century and their impact on the careers of Robinson and Reagan. He's also able to stitch together his subjects by periodically comparing and contrasting them with each other, allowing their nuances to come into focus.

Indeed, one of the more interesting aspects of Hollywood Left and Right is the varied ways stars have actually exploited their star power. Some, like Chaplin and Fonda, formed their political consciousness only after they became celebrities, and channeled that celebrity into potent fundraising machines. Others, like Belafonte and Schwarzenegger, had already formed their convictions before entering show business and then applied their personal skills to political activism. Still others, like Reagan and Heston, underwent political transformations (which always seem to go from left to right). Murphy, Reagan, and Schwarzenegger, of course, eventually won elective office. Yet many of these people -- Fonda in particular -- had a surprisingly durable impact through their behind-the-scenes organizations. These and other permutations give the book a kaleidoscopic quality.

At the end of this study, Ross poses the necessary question of whether it's all that healthy for the democratic process to have such outsized figures exercising their influence on the body politic. He notes the reasons why the answer might actually be no, but makes the case that many of these stars serve an important purpose in mobilizing otherwise indifferent segments of the electorate. In a perfect Hollywood world, such people might be undesirable. But in the sometimes benighted political world in which we live, we may need the stars to see.


Tuesday, December 20, 2011 - 11:01

Thomas Fleming is a former president of the Society of American Historians and is on the advisory board of HNN.

New Yorkers—and many other people—are likely to find this brief, briskly written book a fascinating read.  It combines the story of Westchester County in the Revolution with its climax—General Washington’s decision to march south from his encampment at Dobbs Ferry and nearby towns to trap Charles, Lord Cornwallis, at Yorktown.

The author does an excellent job of describing the war in Westchester, including the crucial Battle of Stony Point.  But he naturally focuses most of the book on the fighting that accompanied the creation of an encampment for the French and American armies in 1781 as they debated whether to attack British-occupied New York.

Not a few people will be surprised by how much gunfire echoed around Dobbs Ferry when the British sent a fleet of warships up the Hudson to destroy American boats that were ferrying supplies to both armies.  The allies had set up a redoubt at Dobbs Ferry, equipped with numerous cannon, and they blasted the British ships coming and going.  One, HMS Savage, took a direct hit on a powder box that exploded, terrifying twenty sailors into jumping overboard.

Next come some graphic pages on the “Grand Reconnaissance,” the probe of the British northern defenses around New York along the Hudson and Harlem Rivers and the realization that the allies lacked the manpower to win a victory.  That led to the decision to march south.  It took four days to get both armies across the Hudson.  One French officer expressed amazement that the British warships had not made another foray up the river. They could have inflicted horrendous damage. But not a shot was fired at the allied army and soon they were marching south. The rest was history in capital letters.

In a final chapter, the author narrates one of the last encounters with the British in Westchester County—a 1783 winter foray by fifty Westchester militia on horseback.  The horsemen penetrated deep inside the British lines in an attempt to capture one of their most courageous enemies, loyalist Colonel James Delancey.  A battle exploded around Delancey’s house in West Farms, in the present-day Bronx.  It rapidly became apparent that Delancey had more than enough men to make capture impossible.

Soon the patriots were in headlong retreat, with the loyalists pursuing them.  On the banks of the Croton River, they were about to be surrounded.  It was every man for himself, and the rebels rode in all directions.  One of them, John Odell, galloped onto the ice-covered river, pursued by two saber-swinging loyalists.  In a wild encounter, with the horses slipping and sliding beneath them, Odell managed to knock one man off his horse with a blow to the head, and the other man abandoned the pursuit.  It was a symbolic final clash, dramatizing the bitterness and determination on both loyalist and rebel sides that persisted until the British evacuation of New York several months later.


Tuesday, December 13, 2011 - 12:40

Luther Spoehr, an HNN Book Editor, teaches at Brown University.

One of the most difficult tasks when “thinking historically” is to avoid presentism and instead see the world as it looked at the time through the eyes of participants who acted on the basis of incomplete or inaccurate information and couldn’t know for sure how their decisions would turn out.  Steven Gillon, a historian at the University of Oklahoma and author of (among other books) The Kennedy Assassination—24 Hours After,  is up to it.  He vividly recreates and interprets President Franklin Roosevelt’s activities in the 24 hours after the Japanese attacked the American naval base at Pearl Harbor on December 7, 1941, famously designated by FDR as “a date which will live in infamy.”

Taking the long view, Gillon asserts that “Pearl Harbor was the defining event of the twentieth century” because “it changed the global balance of power, set the stage for the Cold War, and allowed the United States to emerge as a global superpower.”  But no one could know that then.  At the time, FDR needed to find out quickly what had happened (at a time when “intelligence was scarce and difficult to obtain”), then decide how to set America on the right path for its next step.  The President, Gillon says, “was forced to make every major decision based on instinct and his own strategic sense of right and wrong.  There were no instant surveys to guide his actions, no twenty-four-hour television coverage offering him a glimpse into the national mood.  Making matters worse, the president’s advisors were anxious and divided.”

Compared to news of the Kennedy assassination or the 9/11 attacks, “news about Pearl Harbor spread slowly, trickling out over the radio in the afternoon.”  The White House press corps included only about a dozen reporters, all of whom were off duty on that Sunday afternoon when the first word came through.  FDR himself initially heard about it at 1:47 p.m.  Thus began “perhaps the most well-documented day of the Roosevelt presidency,” written about by the people around Roosevelt and subsequently by several government investigations into how the disaster could have happened.  Roosevelt retreated to his Oval Study (his private office, far more informal than the Oval Office), where, surrounded by the clutter of his books, stamp collection, ship models, and other miscellanea, he met with advisors, pieced together the shards of information that came in, and crafted the brief, 500-word war message that he would deliver the next day. 

FDR, of course, wasn’t the only one hobbled by incomplete information—and he used that fact to his advantage.  As the scale of the damage became progressively clearer to him, he took refuge in ambiguity and vagaries when speaking to others, not least because he didn’t want word to get to the Japanese of how successful they had been.  As Gillon aptly points out, “Roosevelt’s successful leadership depended on a level of deception that would be unacceptable by today’s standards.”  (One thing that he was not deceptive about:  the fact that the attack was indeed a surprise.  Gillon rightly spends very little time on diehard adherents of the “back door to war” thesis, but all the evidence he uses in his narrative thoroughly refutes them.)

As reports on damage (including sinking of or damage to eight battleships and four destroyers, and over 2,300 men killed) came in, it was clear that the United States would be facing a foe more formidable than previously estimated.  And there was every expectation that very soon Japan would strike elsewhere—but where?  the Philippines?  Samoa?  California?  Nobody knew—and reports coming in about those places were confused and conflicting.  But FDR had no doubt that the primary enemy was still Hitler’s Germany.  The Japanese seemed to him, in Gillon’s words, “German puppets,” who wouldn’t dare attack the United States on their own.  Initial reports—later shown to be inaccurate—that two of the planes over Hawaii were marked with swastikas probably reinforced his conviction. 

Here, then, is a classic case of doing the right thing for reasons that are at least partly erroneous.  Even if Roosevelt had calculated Japanese intentions and capacities more accurately, there was still no question that Germany posed the greater, more immediate threat.  And that fact required that FDR manage the delicate feat of arousing the nation sufficiently to meet the threat in the Pacific but not stirring it so much that it forgot about Europe.  Hence the speech’s terseness (Secretary of State Cordell Hull had wanted a much longer recital of America’s grievances against the Japanese) and FDR’s tone of grim determination.  Indeed, throughout the day, FDR displayed what Eleanor Roosevelt referred to as “deadly calm,” a quiet, relentless focus that she had seen before, when he began his struggle with polio.  (Gillon adds that if that earlier struggle had made him resilient, it also enhanced his “propensity for deception.”)

Although the war had been drawing near for some time, and security arrangements in Washington had been enhanced by the Secret Service when hostilities began in Europe, America was still far from a national security state.  That began to change immediately after Pearl Harbor.  The Secret Service, says Gillon, “formed an airtight seal around FDR.”  When he went to deliver his war message, he rode in an armored car that had been confiscated from Al Capone. 

On the afternoon of December 8, less than four hours after Roosevelt had asked Congress to declare war, both the House and Senate did so, and the President signed the measure.  On December 11, Germany, for reasons of its own, declared war on the United States, relieving FDR of his concern about how to continue an unofficial war with Germany while fighting an official one against Japan.

If Roosevelt had underestimated Japan, Japan most certainly underestimated the United States.  As Gillon points out, “within two weeks, the army had almost as many planes in Hawaii as before December 7.”  And mobilization for a two-front war, already primed by Lend-Lease and the naval conflict in the Atlantic, instantly roared to life.

All of these achievements, however, lay in a hazy, uncertain future on December 7 and 8, 1941.  Gillon’s brisk tale that follows Franklin Roosevelt through the fire and fog of Pearl Harbor is a miniature model of the historian’s craft.  He gives careful consideration to the national and international contexts.  He gives accident and coincidence their due.  He emphasizes appropriately the significance of individual personality and character.  And, most of all, he shows how important it is to know what the President knew, and didn’t or couldn’t know, if we are to understand and evaluate his leadership at a critical moment in history.


Sunday, December 11, 2011 - 21:07

Murray Polner is the author of numerous books, the former editor of Present Tense, and a book review editor for the History News Network.

Not since Harrison Salisbury’s The 900 Days appeared in 1969 has there been an English-language book devoted to the German siege of Leningrad (now renamed back to St. Petersburg). The longest blockade in recorded history, it claimed 1.5 million lives, half of them civilians, many of them children. In merciless, unvarnished detail, Anna Reid’s Leningrad is filled with searing images of starvation, cannibalism, corruption and death in that most Westernized and striking of Russia’s cities.

The siege has essentially been overlooked in the West.  But then, too, we’ve ignored the enormous sacrifices of the Russian people and their military forces in defeating Nazi Germany and its allies.

Reid is a onetime Ukraine correspondent for The Economist and the Daily Telegraph, and a journalist who holds an advanced degree in Russian studies.  The heart of her book is the memoirs, archives, letters and diaries of people who lived through the siege.  Her heartbreaking and angry account does not spare the vicious German invaders, and she rightly excoriates the Communist regime for waging a reign of terror against the city’s imagined dissenters.

Trapped Leningraders would in time turn livid at the sight of well-fed Party bureaucrats while everyone else starved.  Reid is on target in wondering why sufficient food supplies were not stocked before the Germans invaded and surrounded the city.  She also faults Party officials for failing to order a general evacuation until it was far too late.  While public opinion is admittedly difficult to measure, Reid finds that the diaries and memoirs “show Leningraders raging as much against the incompetence, callousness, hypocrisy and dishonesty of their own officials as against the distant, impersonal enemy.”

Yet Stalin’s purges and punishments never ceased.  The NKVD and two of Stalin’s closest henchmen, Andrei Zhdanov (who once denigrated the great Leningrad poet Anna Akhmatova as “a cross between a nun and whore”) and Georgi Malenkov (who would become one of Stalin’s successors after the dictator’s death in March 1953 and then just as abruptly would be removed and sent, or so it is said, to Siberia for an alleged offense) carried out a reign of fear aimed at domestic “enemies.”

Reid cites a Leningrad NKVD study listing the sorts of people punished, among them supposed anarchists, Trotskyists, kulaks, tsarist sympathizers, the rich, and, of course, Jews.  She offers a devastating portrait of one roundup of people awaiting banishment.  According to an eyewitness, Lyubov Shaporina:  “…about a hundred people waited to be exiled.  They were mostly old women…  These are the enemies our government is capable of fighting…  The Germans are at the gates, the Germans are about to enter the city, and we are arresting and deporting old women—lonely, defenseless, harmless people.”  Reid’s book is filled with similar examples.  The popular poet Olga Berggolts’s doctor father had, she wrote, loyally served the Soviet Union since the Russian Civil War, but was dispatched, emaciated, to Krasnoyarsk in Siberia because, Reid speculates, he was Jewish and refused to spy on his colleagues.

Even party officials were not immune, and their personal conflicts resembled a Mafia shootout, with the persecutors turning on one another during the darkest days of the siege.  Most notably, Malenkov tried to eliminate his rival Zhdanov, but Stalin spared his sycophant.  Everyday Leningraders were not as lucky while Malenkov, who survived his fight with Zhdanov, and Vyacheslav Molotov (who disillusioned many Communist Party members throughout the world in 1939 when he brushed off the Soviet-Nazi Non-Aggression Pact by saying, “Fascism is a matter of taste”) began yet more arrests and deportations of “suspects” inside the city.

While Reid dwells on Soviet crimes and ineptitude, she also turns toward the Germans who, after all, were the primary cause of the city’s misery.  After they captured nearby Pavlovsk and Pushkin, the “fiercely anti-Bolshevik” diarist Olga Osipova initially believed that, compared to the Communists, Hitler and the Nazis were not so bad.  But she quickly learned “that the Nazis were different” after seeing them in action.  All of Pushkin’s Jews were executed.  Another memoirist, the composer Bogdanov-Berezovsky, met a former neighbor who described countless examples of hangings and shootings of Jewish civilians in surrounding regions.

In the end, Reid argues, “the war was won at unnecessarily huge cost.  Of this the blockade of Leningrad is perhaps the most extreme example….Had Russia had different leaders she might have prepared for the siege better, prevented the Germans from surrounding the city at all, or, indeed, never have been invaded in the first place.”

Eventually the siege ended and a few years later the war ended.  More than twenty million Russians were dead.  Leningraders, and in fact many Russians, looked forward to a change in direction since “[h]aving fought, worked and suffered for their country for four years, they felt they had earned the right to be trusted by its government.  They longed for the ordinary decencies of civilized life.”

It was not to be, and the repression continued unabated.  Some 4.5 million Soviet troops were captured by the Germans and only 1.8 million emerged alive; many of the rest, especially Jews and Party members, were executed by their captors.  POW returnees were often punished by a government that suspected them of betraying the Party.  Lev Kopelev, a Red Army soldier, publicly objected to the mass rapes of German and Polish women and was sent to a gulag until 1954, the year after Stalin finally departed this earth.

The oppression that followed rivaled the darkest years of the thirties when leading communists, generals and intellectuals were put to death. At its simplest, most inane level, French bread was renamed Moscow bread, echoed in 2003 by American pseudo-patriots’ transforming French fries into freedom fries after France objected to the U.S. invasion of Iraq.  And Soviet propagandists insisted that baseball was really a Russian invention.  Far more seriously, terrified colleagues informed on one another and the camps once again began filling up.  Jews again bore the brunt of the regime's vindictiveness.  Molotov’s Jewish wife was falsely accused of being a Zionist and advocate of group sex and was sent to a camp, after which her husband, ever faithful to the Party and his own personal survival, divorced her.  Not until Stalin died did the terror begin to subside, yet paradoxically millions of Russians publicly grieved for their brutal Georgian leader, though just as many millions were silently thankful that he was finally gone.


Sunday, December 11, 2011 - 14:14

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation, among other books. He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press next year. Cullen blogs at American History Now.

When a psychiatrist friend in our reading group recently suggested that our next book discussion focus on the topic of sexual deviance, my instinctive reaction was one of aversion. (Not that there's anything wrong with that. Is there?) I did recall, however, that the latest Russell Banks novel deals with that subject. I've long been a Banks fan -- his 1995 novel Rule of the Bone was a rich re-imagining of an unlikely interracial friendship spanning North and Latin America, and his 1998 novel Cloudsplitter helped me understand the 19th century abolitionist freedom-fighter/terrorist John Brown in a way no one else ever had -- but again, the topic of sex offenders was not particularly appetizing. Still, I figured that if anyone could make that subject compelling, Banks could, and the group agreed to adopt it as our next title.

I took for granted that it was going to take a while to get into Lost Memory of Skin. But from the opening page, when its fearful young protagonist -- known only as the Kid -- goes into a public library in order to ascertain whether he can be found on an Internet site listing local sex offenders, I was riveted. Here as in his other fiction, Banks demonstrates a remarkable ability to make us care about people in situations we are unlikely to understand, much less sympathize with, were we to encounter them in real life. But I found myself with an instant attachment to this character in his unselfconscious affection for his pet iguana, the only living creature in his life with which he experiences anything resembling emotional reciprocity. The Kid is instinctively smart and yet profoundly ignorant, and I was stunned by the intensity of my desire that this homeless, fallible human being get a second chance after a foolish mistake, and by my anxiety that he would not.

The Kid, who never knew his father, grew up with a mother whose stance toward him was one of benign neglect (emphasis on the latter). Since she was largely concerned with a string of disposable sexual liaisons, the socially isolated Kid viewed online pornography as his primary window on the outside world. A stint in the army was cut short by a maladroit act of generosity, sending him back home again to South Florida. We eventually learn what he subsequently did with a minor that resulted in a three-month jail sentence. More punishing than the jail stint is his ten-year prohibition against living less than 2500 feet from any public setting in which there are children, which effectively makes it impossible for him to do much other than pitch a tent under a highway in a makeshift community of other convicts. We meet various members of this community, whose appeal and moral stature vary widely.

We also meet another mysterious character who, like the Kid, is known only by an enigmatic name: the Professor. A sociologist of immense girth and intellect, the Professor enters the Kid's life just after the young man has experienced a series of setbacks involving his job and makeshift residence. But the Professor's motives are murky, something the Kid knows just as well as the reader. The omniscient narrator allows us to see more of the Professor's life than the Kid does, and we sense decency in his motives, even as we know that there's a lot of his story that's missing. Over the course of the tale we learn more (not everything, but more) about him. The Kid, meanwhile, finds himself ever more dependent on the Professor. There's irony in this, because the Professor helps the Kid adopt new pets for which he can exercise responsibility, and he aids the Kid in assuming a role of leadership among the sex offenders in their efforts to survive in the face of community hostility and poor living conditions. But there's another irony as well, because in the key plot twist of the novel, the Kid finds himself in a position to help the Professor, though he's not sure he should.

Like Rule of the Bone, Lost Memory of Skin -- the title has reptilian, sexual, and other connotations -- resonates with the spirit of Mark Twain's The Adventures of Huckleberry Finn, which is invoked by name a number of times here.  In all three cases, we have an unlikely friendship between an older man and a younger one in a world that regards both with suspicion. But Lost Memory of Skin is a bit different from the others in that it's less a story of flight than a quest for its main characters to keep a home despite pasts that make this seemingly impossible. There is no territory for the Kid to light out for; as for the Professor, unseen walls are closing in. That's what makes their tale so gripping, and so sad.

In a more important sense, however, this novel really is consonant with Huck Finn. Banks, like Twain, believes that we are all born with varying forms of decency independent of the circumstances of our birth. At the same time, however, our notion of morality is shaped by those circumstances, which can lead us to tragically misguided notions of right, wrong, and our capacity to know the truth. Yet the belief -- and we are in the realm of faith -- that we can find a justifiable reality gives the novel a sense of earned hope. Not optimism, mind you, but hope.

I understand -- insofar as anyone who hasn't experienced sexual abuse can ever really understand -- the imperative to protect people from a real evil, even as I wonder about the costs of what appears to be an intensifying taboo. I sometimes find myself wondering whether my appetite for reading is simply one more form of addiction, albeit one in which I am fortunate because my predilections don't afflict anyone beyond loved ones who may wish they had more of my undivided attention. But I experienced Lost Memory of Skin not as a fix for a bad habit, but rather as an experience that widened and deepened my understanding of the world. I'm grateful for the compassion of Russell Banks. And I'll try to keep an eye out for the Kid.


Tuesday, December 6, 2011 - 11:31

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation, among other books. He is completing a study of Hollywood actors as historians slated for publication by Oxford University Press next year. Cullen blogs at American History Now.

Before there was Facebook, before there were iPhones, there was MTV. After an unprepossessing launch in 1981, the cable network became a powerful force in American popular culture, exerting a much-noted impact not only on the music and television industries, but also on film, fashion, and even politics. Some of the attention MTV got was celebratory; some of it highly critical (from a variety of directions). About the only thing more striking than the network's dramatic impact is the degree to which it has receded since its first decade of cultural dominance. So the time seems right for an assessment of its trajectory.

Former Billboard editor Craig Marks and music journalist Rob Tannenbaum make a shrewd choice in rendering the MTV story as an oral history, taking a page from Live from New York, the 2003 Tom Shales/James Andrew Miller history of Saturday Night Live (and before that, George Plimpton's ground-breaking 1982 biography of Edie Sedgwick, Edie). Marks and Tannenbaum conducted hundreds of interviews that they arrange in a kaleidoscopic array of voices that includes corporate executives, performers, video directors, and so-called "VJs" like Martha Quinn and Mark Goodman.

From its inception, MTV was a slick corporate product. Underwritten by the somewhat unlikely duo of Warner Cable and American Express -- which at the time hoped to sell financial services via interactive television -- the network rested its commercial premise on an audacious concept: to use one kind of advertising (musical acts promoting themselves) in order to sell another (ads that would be sandwiched between the videos). Even more audacious, MTV got its programming free, at least initially: it expected record labels to supply the material it broadcast, though the actual cost of the videos was typically charged to the artists in the form of an advance against royalties. There was widespread skepticism in just about every direction that this business model would actually work, but it proved to be spectacularly successful.

Like the advent of sound in motion pictures, the rise of music video rearranged the power structure of the music business. British musicians, who had long been making video clips for shows like the much-beloved Top of the Pops, were better prepared to exploit the opportunity, both in having content at hand and in their willingness to produce more, spawning a second British Invasion in the early 1980s that included acts like A Flock of Seagulls, Culture Club, and the Human League. Similarly, established acts with photogenic and/or charismatic lead singers, such as the Police and U2, were able to capitalize on the potential of the new genre. By contrast, those who lacked such assets, or who failed to fully understand the form, suffered; there's an amusing chapter in I Want My MTV that chronicles the way rock star Billy Squier's video "Rock Me Tonite" was directed in a gay-friendly manner that wrecked his credibility among his core audience.

In aesthetic terms, music video evolved with remarkable rapidity, its development greatly accelerated by Michael Jackson, who overcame early resistance to having his videos broadcast and took the form to a whole new level. Madonna was similarly successful in bending the channel to showcase her talents, not the least of which was creating a sexual brand. But MTV was ultimately a director's medium, and it was instrumental in launching a series of careers, among the most important of which was that of David Fincher, whose apprenticeship in music video became the springboard for a distinguished, and ongoing, Hollywood career.

But almost from the start, MTV had a remarkably decadent corporate culture that over time sapped its vitality. In part, it was corrupted -- insofar as the term makes any sense in the music biz -- by an unholy alliance between executives and artists, who collaborated in a regime of sex, drugs, and rock & roll that made the counterculture of the 1960s seem tame by comparison. But MTV's indulgences were not only sybaritic. The network cultivated incestuous commercial relationships with certain performers and indulged in racist, sexist, and other questionable practices. Above all, it was corroded by money, chiefly in the form of inflated video budgets that gave accounting precedence over art.

Marks and Tannenbaum chart these developments at the network with surprising detail and clarity, the panoply of voices showing both multiple perspectives on the same video and the way in which prevailing perceptions were widely shared. The authors also document the many memorable highlights and byways of MTV's history: Madonna's notorious appearance in a wedding dress at the 1984 MTV Video Music Awards, for example, or Tipper Gore's crusade against Twisted Sister and other bands with the Parents Music Resource Center (PMRC) in the mid-eighties. They also chart the network's gradual move into hip-hop, which revived the vitality of pop music as well as video in the early 1990s, and the role of MTV in electing Bill Clinton president in 1992.

By this point, however, the vast center MTV had created -- for much of the eighties it was the de facto national radio station, creating and/or sustaining huge mass audiences for the likes of Prince and Bruce Springsteen -- was beginning to crack. A rotation that included R.E.M., Debbie Gibson, and Public Enemy was intrinsically centrifugal, and as such less attractive to advertisers. The rise of grunge rock, particularly that of Nirvana and Pearl Jam, represented a bracing new chapter for MTV, but that's because such bands overtly challenged much of what the network stood for. At the same time, the channel found other sources of cheap programming, like The Real World, that squeezed time for music videos, which gradually but inexorably disappeared from sight. Finally, the advent of the Internet, which empowered viewer choice to an unprecedented degree, balkanized audiences to the point of no return. As Marks and Tannenbaum note, "Offering MTV to a kid in 1993 was like offering a board game to a kid in 1981."

Today, MTV is just another cable channel, albeit one that enjoys commercial success with Jersey Shore, a tawdry show that honors the network's brash roots in style, though not in content. Music video lingers, chiefly on Internet sites like YouTube, where it remains the marketing tool it always has been. It's much less important than it used to be, but something closer to what its more modest champions imagined three decades ago. Reliving the glory days of MTV in this book is entertaining but sobering: the things that once seemed to matter so much now seem so small. Sic transit gloria mundi, Facebook. As Elvis Costello put it so memorably way back in "Girls Talk," his 1979 song from before the MTV era, "You may not be an old-fashioned girl but you're gonna get dated."


Saturday, November 26, 2011 - 19:33


Murray Polner is the author of numerous books, the former editor of Past Tense, and a book review editor for the History News Network.

Reviewing Ian Kershaw’s The End: The Defiance and Destruction of Hitler’s Germany, 1944-45 in the New York Times, James J. Sheehan wondered why everyday Germans, facing imminent defeat in mid-1945, “continued to obey a government that had nothing left to offer them but death and destruction.”  That’s not an easy question.  Why did they and their fellow Germans blindly follow murderers and thugs they had once hailed and faithfully served?  Could it have been simply obedience to leaders?  Or was it, Thomas A. Kohut asks, a tribal tendency to “belong and, consequently, to exclude”?  Kohut, professor of history at Williams College and author of Wilhelm II and the Germans: A Study in Leadership, has no definitive answer, nor does the vast literature on the subject.  The virtue of this book is that he tries to see that blood-spattered era through the eyes of individuals, rather than politicians, generals and others justifying what they did and didn’t do.

A partial if still unsatisfying answer may nonetheless lie in Kohut’s fifteen-year quest to collect sixty-two oral histories of “ordinary” Germans, many of them composites, a method to which some may object.  He explains that while the sixty-two “cannot be equated with an entire generation,” they do show “significant characteristics of the generation to which they belong, at times to a pronounced degree.”  Kohut tells us his father was a Viennese Jew, a fact that would have meant his death had he not fled to the U.S. in 1938, and a detail that may explain Kohut’s remark, “I do not particularly like the interviewees.”  It is hard to know when, or if, personal feelings and ideology cloud a scholar’s view.  Whatever the truth, his conclusions have been corroborated by many scholars, few of whom “liked” the people they wrote about.

All sixty-two were German Protestants, all born before World War I, all members of German youth movements.  They welcomed the Nazis’ coming to power in January 1933 and remained fervent, overwhelmingly supportive Nazis for most of the years that followed.  One interviewee hailed the Anschluss with Austria in 1938 and proudly served in the military.  Another spoke of the good life until it all came crashing down with the arrival of devastating bombing raids and allied armies from east and west.  Once the war ended they needed to rationalize their behavior, but were blindsided when their adult children of the 1968 generation wanted to know why they had so enthusiastically favored so brutal a regime.

Kohut devotes much space to Germany from World War I through the Weimar Republic, a weak democratic government despised by putative and actual Nazis but also by run-of-the-mill Germans trapped by raging inflation, massive unemployment, and bitter and divisive political wars, all of which attracted them to promises of jobs as well as dreams of renewed German power.  Indeed, Germany’s fate was sealed when the German Communist Party (KPD), following Moscow’s orders, slandered the Social Democrats (SPD) as “social fascists” in the elections of the early 1930s, thus splitting the opposition and clearing the Nazis’ path to power.  This review concentrates on the Nazi epoch that followed.

The issue of anti-Semitism was raised in his interviews because it was so prominent during the Nazi era.  Jews had lived in the German states and later unified Germany for centuries and contributed much to music, the arts, sciences and literature as well as political and economic life.  Yet, emboldened and persuaded by incessant Nazi propaganda, many of the sixty-two accepted the anti-Jewish line.  Enabled by the silence of the Catholic and Protestant churches (though church leaders felt brave enough to speak out against the Nazis’ euthanasia program), they either “looked away”—or approved—when confronted by the sight of Jews beaten on public streets, their businesses shattered, and their disappearance from cities, towns and villages.  Writes Kohut: “One way to characterize the interviewees’ knowledge of the Holocaust before the end of the war is that they knew facts that should have led them to conclude that atrocities were being committed against Jewish people.  It took an act of will not to have known what was going on.”

How could they not have known, at least from stories they surely must have heard from furloughed and wounded soldiers, especially those who served in Poland, the Baltic states, and Russia, where Jews were regularly murdered by Germans and their Baltic, especially Latvian, and Ukrainian allies?  Germans filled the ranks of, and volunteered for, the Einsatzgruppen death squads, murderers of an estimated one million Jews, plus many others (only fourteen of the killers ever received death sentences in postwar war crimes trials, and even then few were executed; most had their sentences commuted, and in 1958 all surviving executioners were freed).  There is, too, the glaring example of the “ordinary men” in Christopher Browning’s searing book of the same name (Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland), the story of a Hamburg police battalion, most of whose members had been rejected by the army as physically unfit but who were willing to kill every Jew they could find.  Browning’s thesis was that they became mass murderers because of their deference to authority.  Yet in Hamburg, known in pre-Hitler Germany as “Red” Hamburg for its large number of Communist sympathizers, they were not scorned by the populace when they returned from their bloody service.

 

“Franz Orthmann,” one of Kohut’s subjects, was a child of the middle class, and “absolutely susceptible” to the “powerful sense of revitalization” he dreamed the Nazis would create.  He joined the party in 1938 after the Anschluss with Austria, ecstatic about the realization of a Greater Germany.  He also enlisted in the army, became an officer, and always believed he was serving a noble cause.  But had he ever known of the sadism at home and on the various fronts?  Only rumors, he answered.  “I never gave a thought to what it all meant, and there was much about the Propaganda Ministry that one shouldn’t simply dismiss out of hand.”  Besides, he added, “I believed that I was serving in a great cause.”

But what about the many civilians victimized by genocide?  Orthmann says that when a soldier described several killings he had seen, he called the man a “pig.”  After a driver for the Oranienburg concentration camp told him of gas chambers, of Jewish children “tossed up in air and spitted on bayonets,” he says he finally became convinced that the rumors were true.  All the same, after Richard von Weizsäcker, president of the Federal Republic of Germany from 1984 to 1994, told the German parliament that all Germans knew about the concentration camps and the savagery, Orthmann became incensed, at first claiming it wasn’t so.  He finally changed his mind when he personally heard an SA man boast of the mass killing of Jews.

Magdalene Beck, another composite interviewee, was eighteen when the Nazis came to power.  Her husband was a soldier.  An early and unswerving enthusiast for the Nazis, she nevertheless claims to have been apolitical.  Though the Jews in her town suddenly were nowhere to be seen in their homes and local schools, she looked away, even as stories began circulating about horrific happenings to civilians in Poland, Russia and elsewhere in the conquered lands.  She and many Germans have always defended their silence, arguing that individuals who objected to what their government was up to were helpless before the power of the Nazi state.  Dissenters had all been crushed early on, and the penalty for opposition was often execution, the fate of the White Rose trio and the July 20 plotters, for example.  In the documentary film Elusive Justice, Heinrich Gross, who ran a homicidal institute that euthanized children, justifies his role by saying, “If you were a staff member and refused to participate, you would be killed by the Nazis.”

Resist?  “You are asking someone to resist who knows what will happen to him if he opposes such a dictatorial regime,” she told Kohut, echoing Gross and far too many Germans.  “He’ll implicate his whole family.  You try that!  You try that!”  There is some truth in this: few individuals anywhere dare to challenge their tyrannical governments.  Yet silence also meant that conforming and passive Germans had little to fear from the Nazis -- except, as Kohut notes in his mesmerizing book, that they ultimately paid a heavy price: six-and-a-half million non-Jewish Germans died in the war, five-and-a-half million of them soldiers.
 
In the final days of the war, with the Red Army approaching and stories of massacres in East Prussia circulating, Beck said she was terrified of rape, given the accounts of mass rapes in Germany and Poland by Russian troops, though she said nothing about what her countrymen had done to Polish and Russian women.  And yet when Russian soldiers finally arrived, she says, they were unexpectedly kind, especially to children.  But, echoing the Nazi line, she was frightened of black American soldiers and told her children to avoid taking candy from them, fearing that it might contain poison.

“I was never one of those fanatical Nazis,” Magdalene Beck told Kohut after the war, “but I believed in it with all my heart and soul, and I worked for it.”


Wednesday, November 23, 2011 - 14:17


Mel Ayton is author of The JFK Assassination—Dispelling The Myths (2002), A Racial Crime—James Earl Ray and the Assassination of Dr Martin Luther King Jr (2005) and The Forgotten Terrorist—Sirhan Sirhan and the Assassination of Robert F Kennedy (2007).  His latest book about racist killer Joseph Paul Franklin, Dark Soul of the South, was published in May 2011.  Readers can access his online HNN articles here.

Every now and then a JFK assassination book comes along that bristles with erudition and common sense, providing the reader with rational answers to anomalous pieces of evidence in the case that have been exaggerated beyond belief by bogus historians cashing in on the public’s desire for drama and intrigue.

In the 1970s, Priscilla Johnson McMillan’s Marina and Lee, a book which could be characterized as “Marina Oswald’s memoirs,” gave the American public an insight into the mind and character of JFK’s assassin Lee Harvey Oswald, an enigmatic young man who had remained a puzzle to the American people since the November 1963 assassination.

In the 1980s, Jean Davison’s Oswald’s Game gave readers a logical explanation for the assassination:  Oswald, a hero-worshipper of Fidel Castro and a wannabe revolutionary, had political motives and he likely acted out of a distorted sense of political idealism.

In the 1990s, Gerald Posner’s Case Closed, a well-written account of the assassination that debunked numerous conspiracy scenarios, provided a refreshing antidote to Oliver Stone’s movie about the assassination, JFK.  Stone’s largely fictional drama had been released in cinemas in the early 1990s.  Its central character was Jim Garrison, the New Orleans district attorney who accused the CIA of Kennedy’s murder.  The movie’s false history of the assassination had a corrosive effect on a new generation’s ability to understand this important event in U.S. history.  Fortunately, another corrective came in 1998 with the publication of Patricia Lambert’s excellent book False Witness, which firmly exposed Garrison as a charlatan and a fraud.

In recent years Vincent Bugliosi’s Reclaiming History, a mammoth 1,600-page book, examined every theory and every conspiracy claim.  The former Los Angeles prosecutor, who became famous for convicting cult leader Charles Manson, took the debate about conspiracy allegations a step further by providing a devastating, no-nonsense response to the ridiculous assassination scenarios constructed by conspiracy authors, all of whom, as his book ably demonstrates, deliberately skewed the evidence in the case.  His book was a masterwork that decisively marginalized JFK conspiracists.

So at the end of the first decade of the new century the matter appeared to be settled.  Like many JFK assassination researchers, I would have thought there was nothing more to say on the subject.  The above authors had answered the conspiracy allegations to the satisfaction of history.

I was wrong.  John McAdams has added to the sum of knowledge about this case and other famous conspiracy theories by writing a book which will help many who have fallen victim to the vast conspiracy literature on the market.  His “how to” book challenges readers to look at how conspiracy writers have interpreted the evidence using seriously flawed methods.

McAdams has provided a blueprint for understanding how conspiracy theories arise and how anyone interested in them should judge the mass of contradictory evidence in the case.  Having studied the JFK assassination for the past two decades, he has developed a sharp ability to point out the illogical nature of virtually all conspiracy theories, and he helps the reader separate the wheat from the chaff in Kennedy assassination literature.

The author’s intent is not to persuade the reader that there is no credible evidence that JFK was assassinated as the result of a conspiracy.  Instead, McAdams concentrates on advising the reader how to think about conspiracy theories, especially those surrounding the JFK assassination.  By addressing the logical weaknesses in conspiracy books, he demonstrates how not to be duped about this important event in American history.  McAdams asks the reader to think logically, to stick to the evidence, and to stick to common sense.  He teaches you how to reach a rational, compelling conclusion based on evidence and reason, not on emotion or conjecture.  His work rests not on theory, speculation, rumor, third-hand hearsay, or secondary evidence or opinion (save those of scientifically qualified experts).  Instead, he advises the reader to reach a conclusion by reflecting on the notion of “coincidence,” watching for selectivity in the use of evidence, making an informed choice between contradictory pieces of evidence, and searching for evidence that fits a coherent theory.  This advice is central to his didacticism.

Many of the assassination’s elements have become part of American folklore—the so-called “Magic Bullet” (the subject of a recent National Geographic Channel documentary, The Lost Bullet), the grassy knoll shooter, the ballistics and medical evidence, and the alleged mysterious deaths.  McAdams immerses the reader in the fine points of each element, then demonstrates how illogical the conspiracist interpretation really is.

Three of the more interesting expositions in the book address the alleged conspiracy remarks made by FBI Director J. Edgar Hoover, the alleged involvement of the CIA in the president’s murder and the repeated and wrongful use of Jack Ruby’s statements to the press and the Warren Commission.

As McAdams demonstrates, Hoover was “clueless” in the first weeks after the assassination.  The FBI director had been kept informed about the direction of the FBI’s investigation by his agents on the ground.  Inevitably, investigating agents were confronted by contradictory statements made by witnesses at the scene of the assassination and by the doctors who attended the president and Governor Connally.  The “less than coherent” data that agents collected in the frenetic circumstances of the time was utilized by Hoover when the director passed information about the investigation to President Johnson, Bobby Kennedy and other government leaders.  The FBI eventually cleared up the false data, false leads and false witness statements, and its completed report on the assassination became central to the Warren Commission’s own investigation.  However, conspiracists simply ignored its contents and seized on Hoover’s wrong-headed comments as proof of a conspiracy, rather than putting his remarks in context as the words of a confused man trying to grasp what exactly had happened in the hours and days following the assassination.

McAdams also challenges those who believe the FBI was part of a conspiracy by asking, “So just how does somebody who is so confused on so many points direct a cover-up?”  In a similar vein, McAdams debunks allegations of CIA involvement in the assassination by demonstrating how the agency mishandled their investigation into Oswald’s nefarious political activities.  In telling the story of the CIA’s involvement in Jim Garrison’s 1967/1968 New Orleans investigation, McAdams allows the reader to come to the logical conclusion that bureaucratic bungling, rather than conspiratorial malfeasance, lay at the heart of their efforts.

McAdams, in his chapter “Bogus Quoting: Stripping Context, Misleading Readers,” shows how conspiracy writers have abused the evidence by taking quotes and statements out of context.  He demonstrates this nowhere better than in the countless instances in which conspiracists have cited Jack Ruby’s published statements to the press and the Warren Commission that make reference to a “conspiracy.”  For example, the conspiracist par excellence Mark Lane wrote, “Ruby made it plain that if the commission took him from the Dallas jail and permitted him to testify in Washington, he could tell more there; it was impossible for him to tell the whole truth so long as he was in the jail in Dallas... (Ruby said) ‘I would like to request that I go to Washington and... take all the tests that I have to take.  It is very important...Gentlemen, unless you get me to Washington, you can't get a fair shake out of me.’”

However, it is clear from Ruby's Warren Commission testimony that he simply wanted to inform the commissioners of a conspiracy to murder Jews.  Earl Warren, the commission's chairman said, “I went down and took Jack Ruby's testimony myself – he wouldn't talk to anybody but me.  And he wanted the FBI to give him a lie detector test, and I think the FBI did, and he cleared it all right.  I was satisfied myself that he didn't know Oswald, never had heard of him.  But the fellow was clearly delusional when I talked to him.  He took me aside and he said, ‘Hear those voices, hear those voices’?  He thought they were Jewish children and Jewish women who were being put to death in the building there.”  Ruby told Earl Warren, Gerald Ford and others, “I am as innocent regarding any conspiracy as any of you gentlemen in the room.”  Ruby was actually begging the commission to take him back to Washington so that he could take a polygraph examination and prove that he was telling the truth when he denied any role in a conspiracy.

McAdams divides his book into further chapters dealing with how eyewitnesses and ear witnesses behave, how over-reliance on witness testimony weakens any crime investigation, the use of photographic evidence and how bureaucracies behave.  He allows the reader to become a detective who tries to solve an intriguing puzzle.  The solution, in each case, involves using intellectual tools and skills.

If those wishing to learn the truth about the JFK assassination (and other bogus conspiratorial hauntings of the American psyche) follow his step-by-step approach to understanding conspiracy claims, there may well come a time when a new generation of Americans will be able once more to take control of their own history.

In the opinion of this reviewer, John McAdams’ book is the final nail in the coffin of the conspiracy theorists who have grabbed the attention of the mainstream media for far too long—mainly because the media understands all too well how the public loves a mystery.  If John McAdams’ book is read in conjunction with the excellent books mentioned earlier in this review, the JFK assassination will be no mystery at all.


Monday, November 21, 2011 - 10:41

SOURCE: (11-17-11)

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation, among other books. He is completing a study of Hollywood actors as historians, slated for publication by Oxford University Press next year. Cullen blogs at American History Now.

To say that the Civil War ain't what it used to be is to indulge a postmodern cliché: by this point, we all understand that what we "know" is socially constructed -- and contested. The takeaway from this anthology edited by Tom Brown at the University of South Carolina seems more prosaic but is actually a good deal more pointed: the Civil War is not what it used to be because it matters less than it once did. Which is not to say it's unimportant; the war continues to be engaged, in some cases with real intensity. But these essays collectively assert that it is now less a defining touchstone of national identity than a point of departure or iconographic warehouse for cultural productions that invert, bend, or re-imagine the conflict in ways that previous generations would hardly recognize, much less endorse.

Significantly, this cultural shift is not simply that of the avant garde. One of the more compelling pieces in the collection is Brown's own contribution, which looks at the lingering contemporary obsession with the Confederate flag. He notes that in the century following Appomattox, the flag was a rallying point for a sense of shared Southern identity, one whose resonance intensified in the mid-twentieth century as a response to the Civil Rights movement. Now, however, he argues that the Stars & Bars, along with related symbols, have become emblems of a self-conscious white minority that defends its civil right of self-expression with consumerist logic that would appall earlier guardians of Confederate identity, who regarded selling flags or souvenirs as a form of sacrilege. Insofar as the Southern experience of defeat has any compelling moral or psychological legitimacy, it's via a Vietnam analogy that is itself fading into history.

One also sees the recession of the Civil War in Robert Brinkmeyer's piece on contemporary Southern literature. Brinkmeyer notes that for African-Americans in particular the military conflict seems far less important than the antebellum decades leading up to it, and the battles are less important than various aspects of the home front. (The Wind Done Gone, Alice Randall's 2001 parody of Gone with the Wind, is discussed by a number of essayists.) And for many white writers such as Bobbie Ann Mason or Ron Rash, the Civil War is a tangent, even a desiccated husk.

In many of these essays, local, even private, concerns trump national ones. In his piece on the growth of Juneteenth celebrations marking the anniversary of emancipation's arrival in Texas, Mitch Kachun observes that February 1, the day Abraham Lincoln signed the joint resolution that led to the Thirteenth Amendment, would be an apt candidate for a national holiday, especially since it comes at the start of Black History Month.  But it has been only one of many such commemorations, and not a particularly beloved one.

Even the stock of the blue-chip Lincoln has sunk a bit. Amid his analysis of how the Left in general and Barack Obama in particular have tapped into the mythology of the Great Emancipator, C. Wyatt Evans notes that the contemporary Right has largely given up on Lincoln, feeling uncomfortable with his Big Government reputation and awkward with his Civil Rights legacy. The Tea Party invokes the Revolution, not the Civil War, as the source of its power and legitimacy.

The primary focus of Remixing the Civil War, however, is the visual arts, where collective memory of the conflict functions as a postmodern closet that gets raided for varied acts of bricolage. Essays by Elizabeth Young, Gerard Brown, and W. Fitzhugh Brundage all look at the way images, particularly photographs, have been used to destabilize inherited notions of what the war was about. Sometimes contemporary artists complicate racial hierarchies or essentialized notions of blackness; other times their work involves the expansion or projection of alternative notions of sexuality or gender into nineteenth-century settings. Ironically, some art uses patiently recreated artifacts or settings to call attention to the artifice involved in remembrance.

Such work can be impressive in its passion, creativity, and intelligence. But it's a little depressing, too. In part that's because written history, scholarly and otherwise, seems to lack the spark these artists show, as even the most avowedly transgressive or revisionist scholarly writing remains helmeted in academic convention. Conversely, the deeply fragmented quality of contemporary Civil War remembrance suggests a larger crisis of confidence in which grand unifying themes or aspirations can only be looked on with a sense of irony or suspicion. It's remarkable to consider that the versions of the Civil War that do evince such confidence, like Ken Burns's celebrated documentary or the 1989 film Glory, are now (already!) a generation old. In becoming what can plausibly be considered the first real twenty-first-century rendition of its subject, this book provocatively suggests that the Civil War may really be running out of time.


Wednesday, November 16, 2011 - 21:52


Ron Briley is a history teacher and an assistant headmaster at Sandia Preparatory School in Albuquerque, New Mexico.

The construction of the transcontinental railroads following the Civil War is often celebrated as the triumph of American business and industry, with the support of government, unifying the country and fostering the growth of national markets.  In Railroaded, Richard White, the Margaret Byrne Professor of American History at Stanford University and one of the founding scholars of the New Western History, challenges this assumption, concluding that Americans were “railroaded” in the late nineteenth century by finance capitalists into supporting the construction of a transportation system that was not based upon the economic needs of the western United States.  Refuting the creative-destruction model of entrepreneurial capitalism employed by Joseph Schumpeter, White questions the assertion that the initial economic chaos of the transcontinentals paved the way for long-term progress.  In this important piece of scholarship, White doubts whether the farm and business failures, dispossession of Native populations, and environmental destruction wrought by the transcontinentals were harbingers of progress.  At a time when the achievements of corporate America are under great scrutiny, White’s history merits careful reading and consideration.

Viewing the American transcontinentals, the Canadian Pacific, and the railroads of northern Mexico as an interconnected railroad system essentially controlled by the same interests, White focuses most of his attention upon corporate records and the story of entrepreneurs who created these railroads.  Government played a crucial role in providing the credit that enriched private fortunes at public expense.  Describing the transcontinentals as dysfunctional corporations, White argues, “They built railroads that would have been better left unbuilt, and flooded markets with wheat, silver, cattle, and coal for which there was little or no need.  They set in motion a train of catastrophes for which society paid the price” (xxvi).

White also challenges the assumption of scholars such as Alfred Chandler and Robert Wiebe that the transcontinental railroad corporations were models of order and rationality, ushering modernism into the economy.  Instead, White perceives the organization of the railroads as dysfunctional and requiring government rescues to bail out corporate greed and mismanagement.  But White insists that he does not want to simply resurrect the Robber Baron interpretation of late nineteenth-century capitalism.  Rather than brilliant tycoons who masterfully manipulated the system to gain their personal fortunes, White views such entrepreneurs as often inept, while the corporations they created were often mismanaged and required public rescue.  Nevertheless, many of these business leaders were quite successful in creating personal financial empires, even if their railroads ended up bankrupt or in receivership.

This state of affairs promoted antimonopoly movements, which White depicts not as traditionalists opposed to modernism, but rather dynamic organizations of merchants, farmers, and laborers who sought control of government to limit corruption and the powers of corporations, recognizing the inequality imposed by the new social order.  While the antimonopolists were unable to wrest control of the government from railroad corporations, their story, notes White, indicates the possibilities for a different American history and future.

But it was never easy for the reformers, as the career of railroad critic Charles Francis Adams shows; Adams ended up serving as president of the Union Pacific Railroad before being ousted by Jay Gould.  White devotes considerable time and space to Adams’s career as a railroad executive.  Adams expressed contempt for the railroad men whose credit schemes milked the public to amass private fortunes.  Nevertheless, he was unable to change the corporate culture, and he succumbed to the notion that building even more inefficient rail lines could somehow restore competition and reform the system.  Adams found that the political influence of the railroads, or what White terms “friends” in high places, thwarted his efforts at rationalizing the system.

While White expresses some sympathy for Adams, he finds little redeeming value in the careers of Leland Stanford, Collis P. Huntington, and their Associates with the Central Pacific and Southern Pacific Railroads.  White argues that Stanford and Huntington understood the big truth that corporate failure could be even more lucrative than business success.  White concludes, “The smooth internal function of these corporations was not necessary to their persistence.  They could be internally chaotic, financially undisciplined, prone to failure, and tremendously attractive for insiders nonetheless.  Attached to this big truth was a little one:  if failure could be lucrative, then ignorance, incompetence, and disorganization were not incompatible with the corporate form” (232).

The financial schemes of Stanford, Huntington, Gould, Tom Scott, Henry Villard, and James J. Hill are complex, and White deserves credit for carefully researching these market manipulations and attempting to explain them to the reader.  These details can nevertheless be overwhelming, and it seems that even the perpetrators of these frauds were often in over their heads.  White, who has spent decades examining corporate records and archives, nonetheless seems able to follow the bouncing ball of stock manipulations.

While much of this massive book focuses upon the financial schemes of railroad tycoons, White does not ignore the workingmen who constructed and maintained the rail lines.  Railroad monopolies, sometimes referred to as “The Octopus,” threatened a republican economy based upon the opportunity for producer citizens to rise in society.  Unfortunately, to many labor leaders in the West, the exploitation of Chinese contract labor represented the degraded fate of all workers, and thus Chinese immigrants earned little sympathy from their white counterparts, whose struggles against the railroads were often plagued by racist rhetoric and action.  To combat consolidated corporate power and curtail vigilante actions against groups such as the Chinese, union leader Eugene Debs attempted to organize workers in the corporate image with a centralized and hierarchical structure.  The corporations, however, were able to enlist the aid of government and crush the countervailing power of labor.  White concludes that by the mid-1890s the railroads were even more wards of the government than thirty years earlier, when their construction was financed by land grants and the Crédit Mobilier.

Thus, White insists that the transcontinentals were “the triumph of the unfit, whose survival demanded the intervention of the state, which the corporations themselves corrupted” (509).  As an academic who has spent many years in railroad corporate archives, White is careful when making comparisons between the Gilded Age and the contemporary financial crisis.  Nevertheless, White finds the continuing linkage of corporate profit with state legislation and intervention to be a troubling legacy.  White writes, “Much has changed, but states and corporations remain intertwined, and structural conditions forged during the Gilded Age have never entirely disappeared.  My guys are dead and gone, but their equivalents—and the conditions that allow them to prosper—endure” (513).  White, however, perceives a more positive legacy in the tradition of antimonopoly, which opposes corporate corruption as endangering the democratic promise of an economic and social system based upon republican citizenship.  It is the democratic promise of American life that empowers the Occupy Wall Street movement and the questioning of corporate greed and bailouts.  White’s history of the late nineteenth-century transcontinental railroads deserves a wide readership as we ponder the continuing social, economic, and political costs of the corporate model of creative destruction pioneered by the railroad entrepreneurs of the Gilded Age.


Wednesday, November 16, 2011 - 20:22

Aaron Leonard is a freelance journalist based in New York. He is a regular contributor to the History News Network, truthout, rabble.ca, and PhysicsWorld.com. His writings can be found at www.aaronleonard.net.

When Mao Tse-Tung was alive he was cast alternately as bandit, communist leader, ruthless dictator, elder statesman, and mass murderer. Since his death the characterization has been less ambivalent: hedonistic despot, reckless utopian, unbridled monster. The change is anchored in the twists and turns of history. Those who celebrate the unfettering of capitalism in the wake of the collapse of the Soviet Union, and China’s manic opening to Western capitalism, have no interest in seeing Mao in shades of grey. He is part of the troika of twentieth-century “Evil”: Hitler, Stalin, Mao Tse-Tung.

Mao’s rap sheet encompasses two convulsive periods: the Great Leap Forward and the Cultural Revolution. The former is the evidently clinching event proving Mao’s personal culpability in the murder of tens of millions. So now we get the latest entry in the teeming “Mao is a monster” literature. Frank Dikötter’s Mao’s Great Famine: The History of China’s Most Devastating Catastrophe 1958-1962 is an examination of the Great Leap Forward period of rapid collectivization in the late 1950s and early 1960s, which coincided with a massive famine and enormous loss of life. This book presents a revised and even more horrifying picture of what happened during the Great Leap. Dikötter’s claim to originality is that not only has he studied this extensively, he has examined regional records in the provinces of China and thereby purports to etch a truly accurate picture of what really happened. The following breathless declaration gives us the flavor of Dikötter’s approach:

As the fresh evidence presented in this book demonstrates, coercion, terror and systematic violence were the foundation of the Great Leap Forward. Thanks to the often meticulous reports compiled by the party itself, we can infer that between 1958 and 1962 by a rough approximation 6 to 8 per cent of the victims were tortured to death or summarily killed—amounting to at least 2.5 million people. [1]

This declaration, if true, is damning and staggering. Yet a closer read reveals it as fallacious: artful writing full of extrapolation and conjecture. Here we have reports that are “often meticulous” (and what of the ones less so?), yet nonetheless we can only arrive at a “rough approximation.” To get to that dubious approximation we are given, without any explanation or elaboration, an arbitrary mathematical formula. Nowhere is there a table documenting the quantitative breakdown. There are no charts showing X number of victims in Y Province, or any other means for grounding us in exactly where these awesome numbers supposedly come from. What we have, in sum, are assertions based on tendentious guesswork. In short, this claim—like others in the book—is incredible. This rather glaring handicap ought to have led to the work being taken with a large grain of salt, if not rejected out of hand. Instead, there are only mainstream raves. This is a “masterly study” (the UK Guardian). Dikötter is “extremely careful with his evidence” (the New Republic), and this is “the best and last word on Mao’s greatest horror” (Literary Review, Edinburgh). So what is going on?

First and foremost, this analysis is ripped out of the larger historical context. There is no mention of famine ever occurring before benighted Communist rule. An earlier work on the same subject, Jasper Becker’s Hungry Ghosts, at least offered background—albeit through its own skewed lens—noting that pre-Communist China “suffered no fewer than 1,828 major famines.” [2] In other words, in the modern era China—like India and other parts of Asia—has been racked by famines whose toll of suffering is beyond human comprehension, and certainly beyond anything Mao’s opponents care to acknowledge. By not addressing previous famines, Dikötter looks at China under Communist rule in a narrow vacuum, thus dispensing with the inconvenient fact that famine in this part of the world has been a recurring phenomenon, one which Mao did not invent or even magnify.

This distorted lens, however, serves Dikötter’s central thesis that the Great Leap famine was the progeny of the diabolical Mao Tse-Tung alone. Dikötter has to stretch to get there, but stretch he does. “Unlike Stalin, he [Mao] did not drag his rivals into a dungeon to have them executed, but he did have the power to remove them from office, terminating their careers—and the many privileges which came with a top position in the party.” [3] In this world, having your career unfairly terminated is a crime on the level of being unjustly executed. Actually looking at what Mao had to say—something the author is loath to do—might have been instructive:

People may be wrongly executed. Once a head is chopped off, history shows it can’t be restored, nor can it grow again as chives do, after being cut. If you cut off a head by mistake, there is no way to rectify the mistake, even if you want to. [4]

Mao is repudiating Stalin’s method here—which ought to be of interest to someone wanting to understand this fraught period. For Mao, execution was not a moral issue; rather, it was a matter of calculating how the practice fit into the overall aim of achieving his Sinified version of socialism and communism. This approach was too often instrumental and problematic—a point that lends credence to the book’s attributing to Mao a certain "ends justify the means" philosophy. But there seems to be no interest in exploring that practice in any nuanced way (such as comparing it with how his opponents behaved). Refusing to consider Mao on his own terms, if only as a device to sharpen the argument, makes such writing as Dikötter’s the very kind of propaganda it is so incensed by when glimpsed in the work of enemies.

Also tellingly absent in this analysis is any political sense of proportion regarding the geopolitical situation. For example, we learn, “ever since the United States had started to provide military support for Taiwan and after the Americans introduced tactical nuclear missiles in March 1955, Mao had been set on having the bomb.” [5] What Dikötter doesn’t explore is the reason why, i.e., that the United States was preparing for the option of nuclear war with China. As a headline in the New York Times from the period unambiguously put it, “U.S. Called Ready to Use Atom Arms.” The article quotes James H. Douglas, then Secretary of the Air Force, who coolly lays out the strategy: “[T]he nuclear-armed missiles had a dual capability and were not limited to nuclear retaliation; they could use conventional high explosive as well.” [6] This was the threatening context in which China was racing to achieve modernization, self-sufficiency and, yes, nuclear weapons. It is hardly an exaggeration to say the situation was a life-and-death struggle. It was not a good thing that China ultimately obtained those weapons—or that any country did, including the United States—but to argue that Mao was “paranoid” in the abstract is disingenuous and misleading.

Under Mao’s leadership China decisively broke the grip of colonialism, defeating both Japan and the U.S.-backed Kuomintang regime. This radical upstart regime was a major obstacle to the U.S. in its quest for hegemony in post-World War II Asia—and the U.S. defeat in Vietnam cannot be understood fully without understanding the role of Communist China. Through a torturous and contentious process Mao and his adherents transformed China from the “Sick Man of Asia” into a country that, by the mid-1960s, was able to feed, clothe and supply healthcare for its people, all done in conscious opposition to a market-based economy. While it is legitimate to argue about how costly, even at times horrific, this process was, from the standpoint of virulent defenders of current capitalism to assert anything good about Mao is absolutely out of bounds. To do so would suggest that the miseries of capitalism that abound amid the splendor of present-day China actually have an alternative. The desirability and content of such an alternative are beyond our scope here, but the very idea of an alternative is anathema in such neoliberal quarters. In order to reinforce the status quo, Mao must be cast ignominiously into the dustbin of history.

It is in this skewed context that we witness a proliferation of memoirs and biographies whose sole aim is to depict Mao as among the worst people ever to walk the earth. In this familiar enterprise there are elements that range from base to surreal. You encounter Jon Halliday, a former board member of the New Left Review, who co-authored Mao: The Unknown Story, a biography so unrelentingly and sensationally harsh that a group of China scholars felt compelled to publish “Was Mao Really a Monster?” in an effort to bring the debate somewhere back into the realm of rationality. [7] Here, too, you find a New York Times reviewer castigating an author for being “chillingly cavalier about the tens of millions of people who lost their lives during Mao’s years in power,” [8] for failing to make a point of Mao’s monstrousness. That the book under review was by Henry Kissinger—no stranger to crimes against humanity—is an irony completely lost on the critic.

Which brings us back to Dikötter. This is a large volume whose key selling point is that the author spent ages researching dusty local archives. Yet when he presents his data, he repeatedly undercuts the legitimacy of those archives. For example, he writes, “Even when cadres were willing to confront the harsh reality of famine, who could have kept track of an avalanche of death?” [9] While this may be true, what does it suggest about the punctilious accuracy he seeks? He further undercuts his claims when he writes of existing statistics, i.e., those compiled by local Party officials, investigations carried out immediately, and investigations conducted in the years after:

The result is not so much a neatly arranged set of statistics revealing some absolute truth in a few telling numbers, but rather a mass of uneven and at times messy documentation compiled in different ways, at different times, for different reasons, by different entities, with different degrees of reliability. [10]

We might then conclude it is speculative to settle on a final figure. Instead we get the following:

Some historians speculate that the figures stand as high as 50 or 60 million people. It is unlikely we will know the full extent of the disaster until the archives are completely opened. But these are the figures informally discussed by a number of party historians. And these are also, according to Chen Yizi, the figures cited at internal meetings of senior party members under Zhao Ziyang. Yu Xiguan, an independent researcher with a great deal of experience, puts the figures at 55 million excess deaths. [11]

Here Dikötter is having his cake and eating it too. At the very end of his book he throws out numbers 10 million higher than his introductory estimate and validates them by invoking “figures cited at internal meeting[s],” as if that makes them authoritative. At the same time he covers himself by saying we will not know the actual story until the archives open up. He does not and cannot now know, yet he is saying he does.

All the foregoing criticism is not to say that bad things did not take place in China in this period, that people did not die as a result, or that Mao bears no responsibility. They did, and he does. That said—and this verdict will elicit a howl of outrage in certain quarters—these questions are not settled in any way. In this respect it is worth bearing in mind that common knowledge has held that “millions” were executed during the Great Purge in the Soviet Union and that “tens of millions” were executed during Stalin’s rule. Because the Soviet archives opened up in an unprecedented way in the early 1990s, historian J. Arch Getty was able to access formerly secret records and show such figures to be wildly inflated—the reality is more in the realm of hundreds of thousands executed during the Great Purge and on the level of 2 million deaths overall due to repression. [12] That revised figure does not diminish the horror—but facts do matter.

The totality of what happened during the Great Leap Forward needs to be understood without blinkers. To the degree that reckless policy, instrumentalist design, and utopian voluntarism played a role in causing enormous human suffering, that role needs to be identified and morally rejected. Yet to the degree that a dynamic was set loose that went beyond the control of those sitting at the levers of power—including natural forces such as drought—those factors, too, need to be understood. With this comes a need for respect for history. What China went through in the twentieth century—and there were hideous things long before the Great Leap—is of a piece with what much of the rest of the world had to undergo in striving for development and some degree of justice. We do not need another simple-minded screed justifying a priori verdicts and the status quo. We need to understand what happened in China because any effort to get to a different and better future requires it.

Notes

[1] Frank Dikötter. Mao’s Great Famine: The History of China’s Most Devastating Catastrophe, 1958-1962, Walker and Company, 2010, xi.

[2] Jasper Becker. Hungry Ghosts: Mao’s Secret Famine, Henry Holt, 1996, 9.

[3] Dikötter, xiii.

[4] Mao Tse-Tung, “On the Ten Relationships.”

http://www.marxists.org/reference/archive/mao/selected-works/volume-5/mswv5_51.htm

[5] Dikötter, 11.

[6] Jack Raymond, “U.S. Called Ready to Use Atom Arms,” New York Times, September 28, 1958.

[7] For a fuller discussion see Tariq Ali, “On Mao’s Contradictions,” New Left Review 66, November-December 2010.

[8] Michiko Kakutani, “An Insider Views China, Past and Future,” New York Times, May 9, 2011. http://www.nytimes.com/2011/05/10/books/on-china-by-henry-kissinger-review.html?_r=1&pagewanted=all

[9] Dikötter, 327.

[10] Dikötter, 328.

[11] Dikötter, 333-334.

[12] J. Arch Getty and Oleg V. Naumov. The Road to Terror: Stalin and the Self-Destruction of the Bolsheviks, 1932-39, Yale University Press, 2002, 590-91.


Tuesday, November 15, 2011 - 14:20

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: Short History of an Idea that Shaped a Nation (Oxford, 2003), among other books. He is completing a book currently titled The Arc of American History: What Hollywood Actors Tell Us about Ourselves. Cullen blogs at American History Now.

Is it possible to write a successful novel with unappealing characters? I don't mean a novel in which a protagonist is repellent in an avowedly provocative way, like the unnamed narrator of Fyodor Dostoevsky's Notes from Underground (1864). I mean people whom an author apparently wants us to like, but whom we find tiresome. This is the question I found myself asking while reading Jeffrey Eugenides's latest novel, The Marriage Plot. My answer, finally, was no: you can't really write a compelling novel this way. But as failures go, his is an interesting one.

One reason: Eugenides is a virtuoso writer with an extraordinary capacity to render an array of topics with great authority and clarity. In this regard, he's sort of like Jonathan Franzen with a warmer heart. Eugenides showed such brio in his multi-generational saga Middlesex (2002), and he does it again here. Whether the subject at hand is the mating habits of the intelligentsia, the pharmacology of mental illness, or the labor force of Mother Teresa's mission in Calcutta, Eugenides renders the details of subcultures with a sense of verisimilitude that impresses and informs. He has a wonderful sense of history, and in some subjects, his talents are dazzling. I can't think of another writer who can talk about religion with the unselfconscious ease he does, for instance. And his command of literary theory, in particular the 1980s mania for poststructuralism, is so sure that he can weave it in as a subtext for a novel that's also a metacommentary on bourgeois fiction of the 19th century. The ending of the novel in particular is delightfully clever.

The problem, again, is the people we're saddled with for this ride. They're a set of Brown students, class of 1982, whom we meet at that unlovely moment in the life cycle: the months following college graduation, when cosseted young adults are suddenly expected to make something of themselves. There's Madeline Hanna of the fictive Prettybrook, New Jersey, a Holly Golightly figure with a yen for literature who finds herself in a love triangle. She's close with her buddy, Midwesterner Mitchell Grammaticus, who pines for her romantically. But she's in love with Leonard Bankhead, a brilliant but volatile Oregonian who wins a prestigious science fellowship but struggles with manic depression. We meet these people on graduation day, flash back to their undergraduate years, and move forward as Madeline and Leonard try to find equilibrium in their relationship while Mitchell grapples with his unrequited love by taking a global sojourn that turns into a spiritual quest. The narration rotates among the three characters, and we hear some of the same situations described from more than one point of view.

But this device gets tedious, because these characters are tedious. Madeline is beautiful and smart and rich, and she has a passion for English authors like Jane Austen. But she seems like a highly conventional person, a product of her class in the broader sense of the term, and it's a little hard to reckon what either of the two men sees in her. Leonard's manic depression is rendered in sometimes harrowing detail, but it's hard to separate his grim persona from his illness, and while you find yourself wondering whether his unattractiveness is a function of your own hard-heartedness toward the mentally ill, that's not enough to make you like him. Mitchell, who appears most like a stand-in for the author himself, is a more broadminded figure. But his visionary potential is undercut by his callowness, most evident in his feelings for a girl about whom we find ourselves wondering, long before he does, whether she's worth all that.

It's a tribute to Eugenides that despite all this, you keep reading. But I doubt this will be seen as his best work. The Virgin Suicides (1993) has its partisans. But for my money, Middlesex is the place to begin. The Marriage Plot is, at best, a subplot in his oeuvre.


Thursday, November 3, 2011 - 18:17

 

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: Short History of an Idea that Shaped a Nation (Oxford, 2003), among other books. His work in progress is currently titled "The Arc of American History: What Hollywood Actors Tell Us about Ourselves." Cullen blogs at American History Now.

This little book manages to do a lot in the space of 170 pages. First published in France in 2007, with an evocative introduction by the late Tony Judt, it surveys its subject with grace and insight, as well as a lot of information.

Lacorne's point of departure in conceptualizing religious history rests on the work of John Murrin, who observed that in the United States "the constitutional roof" was built before the "national walls." As Lacorne is well aware, this assertion is contestable, particularly by those -- from Alexis de Tocqueville to Samuel Huntington, among others -- who have argued that American religious culture, like many other kinds, was well in place by the time of the American Revolution. But an important dimension of this even-handed study is an attempt to balance what he plausibly sees as too much emphasis on the Puritan roots and influence in American society. For Lacorne, a separate strand of U.S. evangelicalism has also been part of the picture. So has, at least as importantly, a largely secular one centered in the thought and legacy of the Founding Fathers. This latter one, whose institutional locus has been the Supreme Court, has been decisive, in his (generally approving) view.

There are at least three ways to use Religion in America, all of them arresting. The first is as a brief survey, one which begins with the Quakers and runs through an epilogue on the Obama years. The second is as a historiographic account of the shifting reputations of evangelicals, Catholics, and other religious movements in the United States, both among their contemporaries and subsequent historians. A related, but discrete, third lens looks more specifically at the French perspective (Lacorne is a senior research fellow at the Centre d'Études et de Recherches Internationales in Paris). France is an especially valuable standpoint for such a study, given its contrast with the Anglo-American tradition, its own republican tradition, and the long love-hate relationship between the two countries. Naturally, de Tocqueville looms large here, but Lacorne is nuanced in giving him his due even as he points out his limitations.

Lacorne's skill in juggling these three interpretive balls makes the book a notably versatile volume for teaching purposes. It's an edifying read for someone seeking grounding in the subject as well as a user-friendly course adoption. The individual chapters are also well-segmented, allowing them to be slotted into a general survey in addition to religion courses. Rarely does one encounter such effective one-stop shopping on such a large, important subject. One hopes and expects it to become a perennial.


Sunday, October 23, 2011 - 15:55

SOURCE: HNN (10/23/2011)

Luther Spoehr, a Senior Lecturer at Brown University and an HNN Book Editor, co-teaches a course on the history of intercollegiate athletics.

Long, long ago—before the Big Ten had 12 teams and the Big Twelve had 10; back when Knute Rockne was still learning to drop-kick—college football was in crisis.  The crisis had many parts:  already professionalism and commercialism had made inroads into the supposedly “amateur” game.  Players were getting money and perks, coaches were being overpaid (in 1905 Bill Reid, Harvard’s coach, made more than any professor on campus and almost as much as President Charles William Eliot), and rules about everything from how the game could be played to player eligibility ranged from ill-defined to non-existent.  Six-year varsity careers were not unheard of.  Sometimes those careers were compiled at several schools.  In 1896 the great Fielding Yost, enrolled at West Virginia University, transferred to Lafayette in time to help the Leopards snap the University of Pennsylvania’s 36-game winning streak, then transferred right back.

The worst aspect of the crisis—because it was the most public and most dramatic—was the game’s increasingly violent character, evidenced by the growing number of players seriously injured or even killed on the field.  As football had evolved from its beginnings (the game considered to be the first intercollegiate contest, between Princeton and Rutgers, took place in 1869) as a kind of combination of soccer and rugby, it was increasingly characterized by “mass play,” most notably the infamous “flying wedge.”  Even when played within the “rules” of the time, it was a bloody affair.  And with players crowded together, battling back and forth on a very small part of the field, opportunities for punching, kicking, biting, and other activities outside the rules were numerous—and exploited.

So, both inside and outside the university, voices began calling for the game to be reformed, even abolished.  The most prestigious voice was that of Harvard’s President Eliot, who decried football’s turn toward professionalism and spectacle, and away from what he saw as sport’s true purpose: participation by real students whose play would promote their fitness and character.  “What bothered Eliot most, it seems,” says John J. Miller in his lively, well-written chronicle of the crisis, “was competition—and how it motivated players to conduct themselves in ways he considered unworthy of gentlemen….Even the behavior of spectators appalled him.  Before the start of a game against Yale in Cambridge, he heard a group of his students chant, ‘Three cheers for Harvard and down with Yale!’  He regarded this as bad mannered….So he proposed an alternative:  ‘Why shouldn’t it be better to sing “Three cheers for Harvard and one for Yale”?’  His suggestion did not catch on.”

Eliot, whose suggestion, Miller says, “burnished his reputation as a killjoy,” serves as Miller’s main foil to Teddy Roosevelt.  TR was president when the crisis came to a boil in 1905, and by working mainly behind the scenes, Miller says, he “saved football,” an outcome Miller seems to regard unreservedly as a Good Thing.   Focusing on Roosevelt’s role also simplifies Miller’s narrative and allows him to digress at length about the Rough Rider’s well-known, lifelong obsession with the “strenuous life” and “muscular Christianity,” driven by his own personal demons and the fear, common within his social class in the late 19th and early 20th centuries, that America’s aristocracy was getting soft and could be displaced by cruder but fitter men. 

Football had been a bone of contention between TR and Eliot for a long time.  When Harvard (briefly) banned the game in 1885, TR called the University’s leaders “fools.”  He was just getting started.  As he became more prominent in American politics, he became a member of Harvard’s Board of Overseers, where his opinion mattered more.  Miller quotes at length (over 2 full pages) TR’s 1895 letter to Yale’s Walter Camp.  It is, Miller says accurately, “vintage Roosevelt,” full of “blunt talk and forceful opinion.”  It is also a document of TR’s time and place, full of concern that America was “tending…to produce in our leisure and sedentary classes a type of man not much above the Bengal baboo, and from this the athletic spirit has saved us.”  From its casual [and supposedly scientific] racism to its desire to find what William James called “the moral equivalent of war,” the letter captures TR’s characteristically assertive, moralistic confidence.  “I am utterly disgusted with the attitude of President Eliot and the Harvard faculty about foot ball,” he says.  Then, displaying his gyroscopic instinct for the middle of the road, he immediately adds, “though I must also say that I feel very strongly in favor of altering the rules, so far as practicable, to do away with needless roughness in playing, and, above all, in favor of severe umpiring, and the expulsion from the field of any player who is needlessly rough, even if he doesn’t come quite within the mark of any specific rule.  I do not know anything about umpiring foot ball games,” he adds (needlessly, but revealingly), “but I have a good deal of experience in umpiring polo games.  However, personally though I would like to see the rules change and to see the needless brutality abolished, I would a hundred fold rather keep the game as it is now, with the brutality, than give it up.”

TR got his way, of course.  And, as has been so often the case, he gets more credit here than he really deserves.  He was good at that.  The great Trust Buster didn’t really bust many trusts, despite his fulminations against “malefactors of great wealth.”  And the former polo umpire didn’t institute or push for the specific changes—such as legalizing the forward pass—that opened up the game and made it less brutal.  The changes didn’t come overnight—as Miller notes, a series of fatal incidents in 1909 revived calls for abolition—but, despite rearguard opposition from powerful figures such as Walter Camp, come they did.

College football’s survival was, if not inevitable, at least over-determined.  It fit perfectly into the Social Darwinian, hypercompetitive ethos of the Rooseveltian elite and made itself a spectator-pleasing mass spectacle that appealed to audiences well beyond the campus.  Chicago’s William Rainey Harper, who hardly appears in Miller’s book, knew how important “branding” was for his new University and hired Amos Alonzo Stagg (also given only a cameo role), who innovated not only on the field but also in marketing the college football-watching experience, which soon included marching bands, organized cheering sections, and other features pointing directly to the Jumbotrons and luxury suites of today.  Newspapers added and expanded their sports sections, and football became by far the most important public activity of the university.  (As the president of the University of Rhode Island remarked not long ago, “The Providence Journal doesn’t have a physics section.”)

So Miller’s vivid, quick-paced story isn’t the whole story.  (And one wonders what a writer for the National Review and the Wall Street Journal makes of the current president’s interest in reforming the Bowl Championship Series.)  But he spins his version of the tale well.  If it leaves the reader wanting to know more, so much the better.


Sunday, October 23, 2011 - 11:58

Jim Cullen, who teaches at the Ethical Culture Fieldston School in New York, is a book review editor at HNN. He is the author of The American Dream: Short History of an Idea that Shaped a Nation (Oxford, 2003), among other books. He is also the author of the recently published Kindle Single e-book President Hanks, part of a larger project with the working title "Bodies of Work: Hollywood Actors and the Master Narratives of American History." Cullen blogs at American History Now.

Dorothee Kocks has had an intriguing career. A graduate of the University of Chicago, she went on to pursue a doctorate in American Civilization in the decidedly different climate of Brown (where our paths crossed almost a quarter-century ago). She got a tenure-track job at the University of Utah, proceeding to publish a richly suggestive piece of scholarship, Dream a Little: Land and Social Justice in Modern America (California, 2000). Then she ditched her teaching post, took up the accordion, and began traveling widely, supporting herself with odd jobs. Last year, she made a foray into fiction by publishing her first novel, The Glass Harmonica, as an e-book with a New Zealand-based publisher. It has just been published in a print edition.

Kocks's unusual vocational trajectory is worth tracing here, because The Glass Harmonica is an unusual book. A work of historical fiction that bridges the late eighteenth and early nineteenth centuries, it also sprawls across Europe and North America. Napoleon Bonaparte makes a cameo appearance, but its core is a love story between Chjara Valle, a common-born Corsican musician, and Henry Garland, an entrepreneurial American purveyor of erotica. The two lovers encounter any number of obstacles -- principally in the form of spiteful people on either side of the Atlantic -- but nevertheless manage to build a life together, one animated by the mysteriously alluring (and thus to many threatening) glass harmonica, a musical instrument that enjoyed a vogue in the age of its inventor, Benjamin Franklin.

Such a summary makes the book seem simpler than it is. For one thing, The Glass Harmonica is rich with historical texture. Brimming with research, it vividly recreates any number of subcultures, ranging from continental drawing-room entertainments to the feverish intensity of revival meetings. As one might expect of a writer who has spent much of her life, and much of her work, exploring the concept of place, Kocks also evokes varied geographies -- urban Paris and Philadelphia, rural upstate New York, coastal New England, among others. An afterword limns her sources and provides a set of footnotes worth studying for their own sake.

Kocks also boldly trespasses over contemporary convention in realistic fiction, eschewing the spare, lean quality of modern prose in favor of lush, omniscient narration. "On the morning Chjara Valle quickened in her mother's womb, the sun reached its red fingers over the Mediterranean Sea," the novel opens. The book is engorged with such biological/anthropomorphic motifs.

But at its core, The Glass Harmonica is a novel of ideas. Sometimes those ideas are suggested in deceptively simple language, as in this exchange between Chjara and her mother, which suggests the paradoxes built into the very notion of an autonomous self:

"My destiny is here," Chjara said.

"Your destiny is not yours to decide."

"Who decides then?"

"Don't be impertinent."

Other times, characters engage in explicitly philosophical discourse, discussing theology, politics, and other topics.

But for all its intellectual sophistication, the argument of the novel -- part of its hybrid quality is that one can speak of it having a thesis -- rests on a simple idea: the pleasure principle, expressed most consistently in sexual terms. (The libertarian ethos of the book extends secondarily to economics as well.) Over and over again, her characters affirm it. "She wondered at this idea -- we are God's instruments -- and she vowed to live by the principle that what would make us feel more alive was good," Chjara declares at one point. Henry, for his part, "understood that his father's [Puritan] religion was not the only one in the world; Jefferson's deists gave [him] the confidence that the world had been made to work well regardless of his breakfast." The lovers will be forced to question this conclusion repeatedly over the course of the novel, most seriously when it appears their choices have damaged their children. Faced with trauma, they look to themselves: when, in a desperate moment, Henry feels compelled to pray, it's not to God but to Chjara. Later their son prays to himself. And yet for all their intimacy, Chjara and Henry also keep secrets from each other, a challenge to their fidelity more vexing than any adultery.

Kocks's libertine stance is both consistent and subtle (no mean trick). As such, it's hard to contest; though her protagonists encounter resistance, some of it internal, to their way of life, she makes a convincing case that their quest for self-actualization is a bona fide American tradition with deep roots in the Enlightenment. The problem I have with it is less one of contradiction -- or a disposition of intolerance reflected in characters who block the couple's path to bliss -- than insufficiency. The fuel of happiness ultimately depends on sources of power such as money, looks, smarts, health, or the admiration of others (reflected here in the proto-celebrity culture that springs up around Chjara, who exults in adoration), which are in short supply under the best of circumstances. Notwithstanding their obstacles, the couple is suspiciously well endowed in these categories. Lacking them, most of us try to find ways to redeem our lives beyond ourselves, which typically involves some sort of self-sacrifice, beginning with raising children, truly an electric transfer of energy at least as transformative, if not always as felicitous, as procreation (or sexual recreation). But beyond such private leveraging of personal resources, a libertarian sensibility is a thin reed on which to build a community life, too; it seems no accident that Chjara and Henry are itinerant. Nor is it easy to see, beyond a sympathetic disposition, how constructive their approach might be in other life-affirming quests, like the struggle to end slavery, for example.

As someone who pledges his loyalty to Adams more than Jefferson, as it were, I'll confess that I'm not certain how much better a life of duty, variously constituted, really is. To be sure, it has evident costs, often paid by others than those who make such a pledge. It is a strength of this book that it forces one to consider such questions. The Glass Harmonica is a provocative novel by an elegant writer who has blazed her own path. It's a path worth surveying, whether or not one takes it.


Sunday, October 16, 2011 - 14:47
