Americans Can Vote at 18 Because of Congressional Action 50 Years Ago

This month Congress takes up vital voting rights legislation: the For the People Act, which has already passed the House of Representatives, and the John R. Lewis Voting Rights Act. Exactly fifty years ago, in March 1971, federal lawmakers did the same when they debated and overwhelmingly passed what would become the 26th Amendment. This amendment extended the right to vote to 18, 19, and 20-year-olds. It was the last time the United States significantly expanded voting rights.

 

Like today, Congress was under pressure to act.  Also like today, voting rights advocates sought to ensure and expand this most fundamental right of citizens in a democracy.  But an exceptional set of developments contributed to congressional pressure in early 1971.  Most immediately, in late December 1970, the Supreme Court announced its ruling in the case of Oregon v. Mitchell, a judicial test of the Voting Rights Act of 1970.  This new Act had extended and amended the landmark Voting Rights Act of 1965.  Although the 1965 Act had enabled an estimated 800,000 to 900,000 African Americans in the South to regain their voting rights, more still needed to be done.  The 1970 Act provided a five-year extension and new coverage of jurisdictions outside the South, a national ban on literacy tests, and—most controversially—the 18-year-old vote. 

 

This last amendment, Title III of the Voting Rights Act of 1970, guaranteed citizens 18 years old and up the right to vote in federal, state, and local elections. It fulfilled the aim of proponents of youth voting rights, like Jennings Randolph, Democrat of West Virginia, who had been working in Congress to lower the voting age for thirty years. Beginning in the early 1940s during World War II and lasting to the early 1970s in the context of the Vietnam War, proponents pushed for youth suffrage at the national and state levels. Early on, only a few prominent figures and organizations were involved, and only the states of Georgia and Kentucky plus Guam, American Samoa, and Micronesia had the 18-year-old vote. But by 1969-1970, a sweeping number of state and territorial legislatures had proposed lowering the voting age, and a national youth franchise movement had emerged. “The hour is striking now for 18-year-old voting,” declared Senator Randolph.

 

This movement involved a broad and bipartisan coalition, formalized as the Youth Franchise Coalition. The coalition involved Democrats and Republicans, old and young, and a range of organizations. It included well-known, older, multi-issue organizations and groups, such as the National Association for the Advancement of Colored People and the National Education Association. New ones driven by young Americans and dedicated solely to the issue of youth voting rights, like Citizens for Vote 18 and Let’s Vote 18, also joined. “We can find no moral, legal, or political reason to justify keeping these young people on the outside of the decision-making arena of this country,” declared the NAACP’s James Brown, Jr. to a Senate subcommittee in 1970. That year, the movement, working together with federal lawmakers in both major parties, succeeded in winning Title III from Congress.

 

The Supreme Court qualified this achievement.  In Oregon v. Mitchell, the justices upheld every component of the Voting Rights Act of 1970 except Title III.  At stake was the relationship between states’ rights and federal power, an issue that originated with the nation’s founding.  The plaintiffs in the case, with the state of Oregon in the lead, argued they had the right to determine the minimum age for voting in their states, not Congress.  Eight of the nine justices divided evenly, with four in favor of the states and four in favor of Congress.  In a remarkable development, the ninth, Justice Hugo Black, split the difference both ways, forming a majority of one.  Black issued the Court’s opinion that Congress could set the voting age for federal but not for state and local elections.  Designed to ensure the continued enfranchisement of African Americans, the Voting Rights Act of 1970 had enacted the enfranchisement of 18, 19, and 20-year-olds at the national level.

 

Yet the Oregon decision set up an untenable situation. It would require a dual-age voting system in most of the country, whereby young people could vote in some but not all elections. The situation was also contradictory. As Paul J. Myer, a founder of the Youth Franchise Coalition, recalls, it would “allow 18-year-old men and women to vote for President of the United States and Congressmen, but they could not vote for the mayor or the dog catcher.”

 

This contradiction and the daunting prospect of two different sets of voting regulations led to the 26th Amendment.  In the early months of 1971, members of Congress, state officials, and members of the Youth Franchise Coalition mobilized to pass and ratify a constitutional amendment enfranchising young Americans.  House Joint Resolution 223 and Senate Joint Resolution 7 raced through their respective Judiciary Committees. 

 

In March, the Senate and House debated and voted on the proposed amendment. A pragmatic reason for passage was the cumbersome and costly burden that dual-age voting placed on state and local governments, but arguments of principle proved even more persuasive. Young Americans had the maturity and education to vote. Their participation would strengthen democracy and government in the United States. Most fundamentally, they fulfilled the responsibilities of citizenship, especially through military service, and thus deserved its rights. The rallying cry “old enough to fight, old enough to vote” captured this argument.

 

There was opposition, including from those who felt 18, 19, and 20-year-olds were not yet ready for this right. But when the House and Senate voted, the dissenters were overwhelmed: the Senate approved the amendment unanimously, and the House passed it 401 to 19. The amendment, stating that “the right of citizens of the United States, who are eighteen years of age or older, to vote shall not be denied or abridged by the United States or by any State on account of age,” had passed Congress.

 

On March 23rd, the amendment went to the states. The speed of ratification was unprecedented. Within three months, the required three-quarters (thirty-eight states) had ratified. Vying for first place, five states—Connecticut, Delaware, Minnesota, Tennessee, and Washington—ratified the same day Congress approved it. Later, the states of Alabama, Ohio, Oklahoma, and North Carolina competed for the 38th spot. The cascade of support for youth voting rights halved the record set by the ratification of the 12th Amendment in 1803-1804. By July 1, 1971, the 26th Amendment was the law of the land.

 

Propelled by a youth franchise movement and pursued through coalition politics, Congress took action 50 years ago to achieve the 18-year-old vote.  Proponents sought to make American democracy more inclusive and responsive to the voices of young citizens.  In 2021, after an election season in which young voters turned out in record numbers and made a difference in key races, voter suppression efforts are on the rise across the country.  Today’s Congress has an opportunity to stem the tide.  The 50th anniversary of the 26th Amendment can serve as inspiration and impetus.

Why Deb Haaland Matters

In 1968 the National Indian Youth Council served as one of the organizers of the Native American contingent for the Poor People’s Campaign. This multi-racial coalition marched on Washington and, in a document penned by the “Committee of 100,” presented its list of demands. With regard to Indigenous Peoples in the United States, they insisted on “the right to have a decent life in our own communities,” with “guaranteed jobs, guaranteed income, housing, schools, economic development, but most important–we want them on our own terms.”

 

These young protestors singled out for particular condemnation the United States Department of the Interior, founded by act of Congress in 1849. “The Interior Department,” the list of demands read, “began failing because it was built upon and operates under a racist, immoral, paternalistic, and colonialistic system.”  There was, they said, “no way to improve upon racism, immorality, and colonialism.”  They argued that “the system and power structure serving Indian peoples is a sickness which has grown to epidemic proportions. The Indian system is sick, paternalism is the virus, and the Secretary of the Interior is the carrier.”

 

These sentiments reflect a long-standing distrust and suspicion of the Interior Department on the part of Native peoples. Indeed, it was a little over a decade ago when Congress passed legislation settling the Cobell case with the three hundred thousand Native American plaintiffs who had demonstrated that the Interior Department had lost hundreds of millions of dollars owed to Indian trust beneficiaries. The Secretary of the Interior at the time admitted that serious mistakes had been made, and the trial court found “a century-long reign of mismanagement.”

 

President Biden nominated Deb Haaland, a congressional representative from New Mexico, to be the first Native American to become a Cabinet secretary and head the United States Department of the Interior. Interior oversees more than 480 million acres of public lands, or nearly one-fifth of the total land in the United States, mostly in the West, and 11 federal agencies, including the Bureau of Land Management, the National Park Service, and the US Fish and Wildlife Service. Interior also includes the Bureau of Indian Affairs, the Bureau of Indian Education, and the Bureau of Trust Funds, giving it the greatest impact on Indigenous lives and Tribes of any executive-branch agency.

 

Haaland certainly is well qualified for the job. She is a lawyer, an entrepreneur, and a forward-thinking innovator. She is a good steward of funds and is familiar with many of Interior’s responsibilities, including Indian gaming, having served as chairwoman of the Laguna Development Corporation, New Mexico’s second-largest tribal gaming business. And she is knowledgeable about the management of public lands, having served as co-chair of the US House Committee on Natural Resources.

 

Though in many ways conditions in reservation communities have improved since the Committee of 100 issued its demands in 1968, satiating America’s appetite for energy and mineral wealth always has come through exploiting lands important to Indigenous peoples. That has not changed. This was evident in 2016 at Standing Rock, and now at Oak Flat. It is a long history with which Haaland will have to contend.

 

Haaland comes to the agency after Trump appointees who aggressively sought access to the mineral wealth of Indian Country, and her appointment marks a significant break. One could feel the ground shifting during her painful confirmation hearing, as Republican senators closely tied to the energy lobby tried to drag her down. They failed.

 

Secretary Haaland handled their inane and insulting questions with patience and kindness. She is viewed as a threat by those who wish to develop the mineral and energy resources of Indian Country. Her confirmation was a signal event for those who value the environment and appreciate the 172-year history of relations between Interior and America’s Native Nations.

Rally 'Round the Rune: Fascist Echoes of the CPAC Stage

And the leaders in charge cry out, "Come, boys, come!"

Shout, shout the battle cry of Freedom!

            —Confederate version of “The Battle Cry of Freedom”

Much has been made recently of the incorporation into the stage design at the Conservative Political Action Conference (CPAC 2021) in Orlando, Florida, of the pre-Roman Odal or Othal rune motif, a symbol derived from the Elder Futhark writing system and infamously featured in the insignia of at least two Nazi SS units. As numerous commentators have noted, this design, with the telltale serifs added by the SS, has circulated widely within neo-Nazi and white supremacist circles in recent decades as a supposedly more palatable alternative to the better-known Nazi swastika itself.

 

Our particular interest is not solely the fact that the CPAC stage design draws upon Nazi and neo-Nazi imagery as such, but that this imagery was incorporated into a stage in particular. What is it about theatrical platforms that offers such powerful affordances for these kinds of highly charged symbolic acts, including those that draw on ancient occult imagery?

 

Although the designers may not have been consciously aware of the fact, their incorporation of mystical symbolism into the stage resonated with millennia-old themes in the history of theatrical platforms. From at least the time of ancient Athens, performance spaces have been intertwined with ritual action and political oratory. Within the Theatre of Dionysus, in the precincts dedicated to the god, masked dances and recitations were understood as sacred offerings that bound mortals and the visible world to the invisible world of divine beings. The adjacent Odeum of Pericles alternated between political speech-making, religious festival activity, and choreographed performances. These and related classical theatrical spaces were understood as dynamic thresholds between this world and the other world of the sacred, bearing profound affinities to altars and other spaces of spiritual veneration.  

 

The sacrality of the stage is echoed in other performance traditions around the world, including in Japan, where kagura dances and sumo matches have been understood for centuries as purifying offerings to the divinities. The ancient Roman innovation of a fixed raised stage incorporated secular and sacred elements, including an altar and a tripod to Apollo, from which oracular pronouncements might be delivered. 

 

The sacred and the theatrical continued to be conflated in medieval dramatic presentations. Our modern term “stage” is in fact derived from the mobile platforms or pageant wagons of medieval mystery plays, which dramatized the stations of the cross and other moments in the great sweep of scriptural history from Genesis to Doomsday. The portable “stage,” although not in itself a formally consecrated space of worship, was held to transport the lay audience into intimate encounters with the cosmic mysteries of the universe.

 

Nazi propagandists were well aware of the intermingled history of theatrical and religious performance. The Nuremberg rallies, celebrating the union of the National Socialist party, the German state, and its resurgent military apparatus, consciously sought to evoke the grandeur of Gothic cathedrals, transporting participants into a near-mystical sense of oneness with the mythic German Volk. The famous opening sequence of Leni Riefenstahl’s Triumph of the Will provides aerial shots of the assembled crowds on the ground as the cruciform shadow of Adolf Hitler’s airplane passes over them, effectively blessing them with an apparition descending from the heavens. The great podium presented Hitler as a latter-day Messiah, through whose body and voice hundreds of thousands of followers, chorusing back in unison, were assembled into a vast organic instrument of the leader’s will.

 

The Rune’s Meanings

 

The Odal/Othal rune’s ancient meanings emphasized property, lineage, and nobility. Those associations presumably informed the choice of the rune by the Nazi-era Race and Settlement Main Office, charged with upholding the racial purity of the Nazi Schutzstaffel (SS), as well as the rune’s continuing popularity in postwar white supremacist tattoos and insignia as a testament to the supposedly pure lineage of whites.

 

We may never know the precise thinking behind the designers’ choice, in the lead-up to the American Conservative Union’s 2021 gathering, to replicate the rune so precisely in the stage. Yet it is a reasonable inference that the Odal/Othal rune design, so well known in global white supremacist networks, was an unmistakable message to white nationalist supporters, near and far.

 

Having said that, one of the most intriguing aspects of the whole episode is the overt denial by the American Conservative Union leadership that the Odal/Othal rune was being referenced, in the face of overwhelming circumstantial evidence that the underlying symbolism informed the event space layout. Even the serif wings added to the rune by SS units were precisely reproduced in the stage design, after all. The current strategy on the right seems to be to signal to the most committed far-right racist base that the leadership is really with them, while maintaining plausible deniability with the broader electorate. The decision to incorporate the rune into the horizontal stage design, as opposed to, say, prominent vertical banners, would seem to be consistent with this overall strategy of sending out coded messages to the base that may later be disavowed.

 

The Othal rune motif, after all, was not all that visible to those within the hall itself; it was more likely to circulate through television and as a social media image meme, signaling to the alt-right their inclusion within the broader conservative movement. It was in this sense analogous to Donald Trump’s now infamous message to the Proud Boys and their ilk, to “stand back and stand by,” indicating that white supremacists and militia members should be prepared to return to the combat arena when needed.

 

Beyond that, the particular design, an encircling parallelogram with two symmetrical extensions upon which orators enter the stage, can be read as concentrating the energy of the speakers (most notably former President Trump himself) and then projecting that energy out into the assembled crowd within the hotel ballroom and through the extended television audience. There may even have been a sense that the rune-modeled base on which the presenters spoke quite literally grounded them to the earth, to the very soil they claim to be defending, a meaning that may have been linked to the ancient association of the rune with themes of ownership and noble lineage.

 

Theater in the Pandemic Era

 

We speculate that the incorporation of the rune was motivated, whether consciously or otherwise, in part by the current context of the pandemic, in which so much social interaction is mediated and remote. Millions long for a sense of liveness and intimate emotional connectedness. The emotionally potent symbol of the rune, upon which the speakers stood, may have helped activate the hoped-for sense of immediate and visceral engagement with the life of the extended national community, re-animating all that seemed suspended during the lockdown.

 

During the COVID-19 era, the Republican Party and activists on the right have emphasized what they conceive as the fundamental right to gather, maskless, in crowds, deriving energy and strength from face-to-face encounters with one another. In this, they markedly contrast themselves with the Democratic Party’s emphasis on masks, social distancing, lockdown restrictions, and virtual, digitally-mediated encounters. The jagged angles of the CPAC stage, visibly projecting out into the hall, would seem to emphasize the aggressive celebration of co-mingling, a proud overcoming of risk in this “year of living dangerously.” 

 

It hardly seems coincidental, in this light, that among the highest-energy moments of this year’s CPAC convention was the pointed attack by Governor Kristi Noem (R-SD) on Dr. Anthony Fauci, as she stood upon this very stage. She denounced mask mandates and claimed, in staggering denial of the established facts, that South Dakota on her watch had successfully weathered the COVID-19 pandemic, ignoring thousands of documented deaths. Her punch line, “Dr. Fauci is wrong a lot,” elicited a standing ovation. It was as if she conjured up on this stage the ghosts of Shakespeare’s demonized villains, from the Jewish Shylock to the Puritan killjoy Malvolio—who skulks off stage at the end of Twelfth Night, muttering “I'll be revenged on the whole pack of you.”

 

Blood and Soil, the Heartland, and the right to gather unabashedly in person were all juxtaposed against an ethnically coded representative of the East Coast’s scientific elite (not Jewish per se, but close enough), who counsels caution and prudence. The moment might even have hearkened back to the ritual dramas of ancient Athens, in which the pharmakós or ceremonial scapegoat was assaulted, sacrificed, or exiled. From within the interior of the rune-shaped circle, the domain of sanctified purity and the righteous anger of the crowd, the alien other was expelled in the interest of collective expiation.

 

In ancient Norse lore, knowledge of esoteric runes was first granted to the god Odin as he hung upside down for nine days from the tree of life, Yggdrasil, moving through the domain of death into rebirth and eternal life. To be sure, the vast majority of the intended audience for CPAC are not neo-Pagan adherents conversant with deciphering ancient runes. Yet, many are presumably familiar through the Thor movies and other entries in the Marvel Cinematic Universe with elements of ancient Norse mythology. They are likely to derive a frisson, if only unconsciously, from the imagery of ancient power-concentrating symbols that open doorways to a mysterious alternate world. (We find ourselves thinking of the well-known imagery from the Stargate film and television series, in which esoteric inscriptions are lined up to open a doorway to a distant dimension.)  For a movement that is increasingly predicated on the millenarian faith that “the storm is coming,” the peculiar stage at CPAC 2021, freighted with a long history of ritual theater in general and the particular drama of a rune of power, may well have been apprehended as a portal to a violently transformative future, energized by the promise of collective purification.  For those inclined to read portents in their social media feed, the mysterious rune’s presence at the center of the conference hall may well have signaled a longed-for oracular prophecy: the storm is here.

Policing, Protest, and the Role of the University

NU Community Not Cops

On November 1, 2020, Northwestern students gathered at the John Evans Alumni Center to protest policing in the city of Evanston and at Northwestern University. The student group, NU Community Not Cops, organized the protest to express their frustration with the university’s tepid response to their demands to discuss ways to redefine campus safety. They were met by Evanston police and Illinois crowd control officers, all of whom were deployed in riot gear with K-9 support. The officers surrounded the crowd of about 150 students, then unleashed pepper spray and fired pepperballs.

 

Unfortunately, such images of heavily armed police forces confronting students are not unique in American history. Indeed, two of the most famous incidents of campus violence – the shootings at Kent State and Jackson State – were recently back in the headlines as commentators reflected upon their significance on their fiftieth anniversary. And, once again, the interpretations returned to familiar tropes: generational chasms and differences along racial lines. On the one hand, the Kent State shootings, some argued, represented the generational and political chasms that had emerged in the United States by 1970 and continue to shape the contours of American political culture. As Richard M. Perloff recently noted in an op-ed for The New York Times, Kent State “helped unearth a growing political polarization rooted in different views about the cultural changes wrought by the 1960s.” The forgetting around the shootings at Jackson State University, on the other hand, demonstrates the ways America ignores violence against communities of color. Jackson State, writes historian Nancy K. Bristow, “was neither the first nor the last time white Americans have neglected the memory of state violence against people of color, unwilling, it seems, to reckon with the devastating realities of white supremacy in our past or to contend with their ongoing meaning in our present.” Without question, the divergent interpretations of the Kent State shootings – their causes and who was to blame – and the persistent forgetting of the shootings at Jackson State University illuminate the stark political and racial divisions in American political culture.

 

But the emphasis on Kent State and Jackson State as political events fails to situate the episodes in the right setting – that of campus violence. Indeed, reading recent campus movements like those at Northwestern in the context of the Kent State and Jackson State shootings reveals that the question about university-funded and supported policing is fundamentally about the institution’s perceived public purpose. The criticisms raised by contemporary student activists concern not only institutional policy-making and town-gown relations, but also reveal deeper, unresolved tensions in the modern American university that relate to ideas of institutional autonomy and neutrality.

 

The anti-war movement at Kent State, as on other college campuses, centered on the convergence of alternative visions of the university’s public purpose and organization. Two issues informed the anti-war movement on college campuses like Kent State University. The first concerned the presence of the Reserve Officers’ Training Corps (ROTC) and military recruitment efforts on campus. The second concerned military-university research contracts. Students called for the end of both, while also calling on the university to actively support the anti-war movement. In the lead-up to the march in May, student activists asked the university to support the movement by cancelling classes and providing institutional resources in the form of classrooms, mimeograph machines, and buses.

 

For university officials, such appeals – especially the use of the university for political purposes – ran counter to academic freedom and the university’s perceived public role. As Kent State University President Robert I. White explained, “Once the university takes a stand on (political issues) it forecloses an essential ingredient of academic freedom.” Moreover, White continued, “requests which would have the university as an institution deviate from this position by word or deed, regardless of how honorable the intention, must be rejected.” In this light, White hewed closely to an orthodox conception of institutional autonomy that held sway in the 1960s. This idea assumed that the university failed in its mission if it took political sides in any intellectual or social dispute. Defenders of this idea of institutional autonomy, like White, argued that the university’s role was to maintain a distance from wider political pursuits in order to enable scholars to pursue particular forms of research, whether normative or technical. The fruits of faculty research represented the university’s unique service to society.

 

In response, Kent State activists, like others in the anti-war movement, extended the institutional logic of the modern university. By providing ROTC facilities and actively aiding military research and recruitment, they argued, the university was already taking a political position. Students believed that supporting the anti-war movement was in fact central to realizing academic freedom. Co-opting White’s definition of institutional autonomy and academic freedom, student activists further argued that it was the mission of the university to promote an atmosphere “where all may research and debate the social and political issues dividing modern society.” In this way, students recast institutional autonomy. They hoped that the university’s distinctive distance from society could be used to change the surrounding society.

 

Student views and White’s counterargument represented different political claims made on the American university in the 1960s. The central disagreement between student activists and university officials concerned whether the university should function as a mirror of society or whether it should have a moral obligation to foster change. The different conceptions of the university’s role had wide-ranging implications in terms of research, education, and the institution’s mission.

 

The different perspectives also accommodated unreconciled tensions. While students called on the university to take an active political position, they also believed the university must be a source of open debate and deliberation. The former had the potential to foreclose the latter. At the same time, university officials like White also adopted two positions that were fundamentally at odds. Like the students, White believed the purpose of the university was to function as a marketplace of debate. But setting limits on campus protest—indeed, calling in the National Guard to quell student activism—while also accepting military funding meant that the university was far from a neutral arbiter.

 

If student activism at Kent State highlighted the complex interplay between state policy vis-à-vis the Vietnam War, academic research, and the university’s mission, student activism at Jackson State University focused on the relationship between campus governance and institutional autonomy. Black student activists in the 1960s had long questioned the decisions of Black university presidents in the context of the civil rights movement. In the early 1960s, for instance, Joyce and Dorie Ladner, two students at Jackson State, got involved in the civil rights movement, later becoming leading fieldworkers in the Student Nonviolent Coordinating Committee (SNCC). But, instead of finding support from Jackson State University president Jacob L. Reddix, they were expelled for their civil rights activism. For many students, the decisions of Reddix demonstrated the ways the Black university failed to support the struggle for political freedom. 

 

John A. Peoples, who became president of Jackson State University in 1967, was more supportive of student activism. As Gene Young, a prominent activist, noted, “after years of having a president who didn’t allow students to speak out, Peoples embraced us and allowed us to speak out.” Yet Peoples also had very little power, especially when Governor John Bell Williams called in close to 600 National Guardsmen and the Mississippi State police to quell student activism. In this light, even though students may have been disappointed in Reddix and Peoples for their lack of political bravery, they also recognized that both were ultimately constrained by the structure of Mississippi’s higher education system.

 

In the aftermath of the shootings on May 14, students formed the Committee of Concerned Students. The committee called for the “halt of the unilateral decision-making of M.M. Roberts, President of the State Board of Institutions of Higher Learning” while demanding “50% black representation of that board.” The students linked the power and membership of the State Board to the lack of institutional autonomy for Black universities like Jackson State University. Lynch Street, which ran through the campus, was regularly patrolled by Jackson city police and highway patrol. This had dire consequences for students. It was Jackson city police who fired thousands of rounds into the dormitory and murdered two students, Phillip Lafayette Gibbs and James Earl Green, on May 14, 1970.

 

The tragedy was part of a longer history of police violence directed at Jackson State University students and the Black community in Jackson. For years, wrote members of the Committee of Concerned Students, students had watched “police brutality, cold blood murder, and general harassment and intimidation of black people in Jackson.” In the period leading up to the shootings on May 14, Jackson city police and highway patrol regularly harassed and arrested Jackson State students for their activism and organizing on campus, even though many students acted within their rights as citizens.

 

The seeds of the protest at Jackson State University thus concerned institutional power, or the lack thereof. In the aftermath of the tragedy in 1970, Farish Adam and Warren Buxton, two Jackson State students, filed a complaint against Governor John Bell Williams. The complaint focused on the fact that Jackson city police and highway patrolmen had begun to clean up the crime scene before federal, local, and private investigators had completed their investigations. While the complaint concerned what the students saw as the destruction of evidence, it also demonstrated that students ultimately realized their educational and intellectual pursuits were at the whim of the white political structure of Mississippi.

 

They also drew a direct parallel to the Black community within Jackson, seeing the lack of institutional power as reflective of the lack of political power of Black citizens. In addition to a range of appeals for equal hiring, students at Jackson State called for “‘Community Control’ over the activities of the police” and “state-wide grievance procedure to handle complaints against the highway patrol,” two demands that are likely familiar to the contemporary reader. 

 

In the aftermath of the Kent State and Jackson State shootings, the Nixon administration authorized a commission on campus unrest. Chaired by William M. Scranton, the former Republican governor of Pennsylvania, the commission also included James E. Cheek, the new president of Howard University, and representatives from police departments, the legal community, faculty, and student governments. In the report, the authors identified the “question of politicization” as a central issue that emerged on college campuses. The university community and the general public, the authors noted, were concerned that campus unrest had resulted in the politicization of the university. While recognizing that the university was inherently a political institution, the authors maintained that the university should largely avoid institutional positions, except when policies affected higher education. In this regard, the commission at once recognized the fundamental dilemma facing the university and brushed the issue under the rug.

 

While the report largely sidestepped the more fundamental questions about the university’s public purpose – in particular, whether Kent State should be involved in military research and training, or whether Jackson State had the political power to challenge state education policy – what did change was who was called in during campus protest and unrest. In response to both events and a national outcry, college presidents and university administrators in the 1970s and 1980s lobbied for state legislation that allowed the creation of campus police departments. The 1990 Clery Act, which required colleges and universities to track, compile, and disclose crimes on and near their campuses, served as further justification for some in higher education to expand and arm police forces. Over forty-four states allow colleges to form campus police forces, and most public colleges and nearly all private universities have a standing police department on campus. Instead of calling in city police or the National Guard, universities have called in their own police forces.

 

Although the affiliation of the police force may have changed since the 1960s, a number of incidents in the past decade suggest that policing in higher education has not evolved much from the violent tactics used to suppress anti-war and civil rights activists. In 2011, a University of California, Davis police officer was caught on film pepper-spraying a row of nonviolent students participating in an Occupy Wall Street protest. The image, as many commentators noted, rivaled in symbolic power the campus violence of four decades prior. And, as both commentators and researchers have demonstrated, students of color have regularly confronted unlawful searches, harassment, racial profiling, assault, arrest, coercion, incarceration, and police violence by the very campus departments designed to make the university community “safer.”

 

In the context of the Black Lives Matter movement, such incidents are again raising questions about the role of campus police and the university, questions that parallel larger debates about the militarization of American society and the continued mistreatment of students of color and of those who exercise their right to protest. Put another way, despite the institutional changes, many of the issues that arose on campus in the late 1960s still persist. Indeed, campus protests against campus policing and institutional partnerships with city police concern the university’s public purpose. This view is best articulated by student organizers at Northwestern University, who called on the university to “invest in life-giving institutions and divest from law enforcement.”

 

Student activists at Northwestern, in other words, are asking a question similar to the one that emerged out of the 1960s: What should be the university’s public purpose? Should it reflect society as it is – that is, employ an armed police force and associate with city police forces – or should it function as a bulwark against prevailing social and political forces and model a different society for students and the wider public? These, of course, are difficult questions. But they are vital ones if we are to fully engage with the legacy of the now half-century-old shootings.

Power to Rule the Skies: A Forgotten Innovator of the Strategic Air Command

Gen. Thomas S. Power, Offutt AFB, 1957

As a brash, young Army Air Force commander, he masterminded and personally led the first mission to firebomb Tokyo, igniting a campaign that would end Imperial Japan. After the Axis surrender, the general witnessed first-hand the destructive power of the atomic bomb at Bikini Atoll. At the beginning of the Cold War, he took a role in the Berlin Airlift until he was called back quickly to the United States to turn the struggling Strategic Air Command from an amateurish collection of has-been World War II bombers and sullen aircrews into the nuclear striking force America needed to stare down the Soviet Union. From 1948 until 1964, he crafted SAC into the most powerful striking force the world had ever seen and commanded the force – some say as a cruel and even sadistic leader – during some of the Cold War’s most critical days, including the Cuban Missile Crisis. He started out flying World War I-era trainers and canvas bombers and ended his career commanding thousands of bombers and missiles capable of destroying entire nations – all because peace was his profession.

Many military historians may recognize this officer as General Curtis LeMay.  They would be wrong.

Thomas Sarsfield Power has long lived in LeMay’s shadow. If Power is remembered at all today, it is often as an intellectually dimmer, more sadistic copy of LeMay. Dimmer, because Power didn’t attend college (due only to family hardship). More sadistic, because Power was often LeMay’s hatchet man at SAC. These inaccuracies persist because LeMay and Power were almost inseparable for most of their post-war careers, with LeMay being the senior officer.

Power was in Kansas perfecting radar bombing in the B-29 while LeMay led B-17s deep into Germany. The men met in Guam, where Power served as one of LeMay’s B-29 wing commanders. After leading dangerous but ineffective bombing raids over Japan, Power used his technical experience to develop a plan for his B-29s to use radar to fly low-level bombing missions using incendiary, rather than high-explosive, bombs. LeMay immediately approved the brilliant idea and chose Power to lead the initial attack. The resulting mission was the single most destructive bombing raid in human history, burning 16 square miles of Tokyo in a matter of minutes and proving the low-level radar firebombing technique that would eventually devastate Imperial Japan well before the arrival of the atom bomb. Contemporary newsmen and later historians credited the attack and its tactics to LeMay, limiting Power’s contribution to flying the mission… if he was mentioned at all.

After World War II, Power’s career was intimately entwined with LeMay’s. LeMay chose Power to be his deputy commander when LeMay was tasked with rebuilding SAC. From 1948 to 1954, Power was LeMay’s right hand as America’s moribund bomber force was forged into the Big Stick that kept the Cold War peace. Many in SAC thought that LeMay ruled with a mailed fist rivaling the one on SAC’s shield, but it was often Power who brought that fist down on members who weren’t up to SAC’s exacting standards. Only a brief three-year stint as commander of Air Research and Development Command – where he led the development of some of the Air Force’s most iconic aircraft and missiles – interrupted Power’s SAC time.

However, because of their close connection, historians have tended to attribute Power’s successes to LeMay. As SAC commander, Power incorporated the ICBM into SAC’s operations. Power established the Joint Strategic Target Planning Staff and the Single Integrated Operational Plan (SIOP), which unified all United States strategic nuclear forces and dramatically enhanced America’s deterrent credibility. Power led SAC during the Cuban Missile Crisis, when SAC went to full airborne alert, dispersed its medium bombers to airfields all over the United States, flew reconnaissance missions that identified Soviet missiles in Cuba, and – perhaps most importantly – brought its untested force of Atlas, Titan, and even the brand-new Minuteman ICBMs to full alert. When the nation needed it most, Power’s SAC delivered. LeMay had very little to do with these advances, which count among SAC’s most iconic moments.

Power was not a secondhand LeMay. He was instead a brilliant tactician and operational commander in his own right, responsible for many great Air Force innovations, including setting a bold but forgotten vision for the Air Force in space. It is far past time for Power to take his rightful place as one of the Air Force’s most important and fascinating aerospace commanders.

Filibusters Create Legislative Paralysis. Make Them Rare or Eliminate Them

For years, Republicans have been losing ground to Democrats. Popular votes for electing the President and members of the House and Senate are trending blue. One of the G.O.P.’s most important weapons for holding back the expansion of Democratic influence is the Senate’s legislative filibuster. That procedure, which now requires 60 votes to pass major legislation, is extraordinarily controversial. Republicans employ the filibuster often to block progressive initiatives. Democrats succeeded in enacting a COVID relief package recently thanks to a “reconciliation” exclusion that permitted decisions by simple majority, but the filibuster may create legislative paralysis over the next four years. Republicans’ support for the filibuster could block Democrats’ plans for action on voting rights, climate change, health care, infrastructure, immigration, education, firearms, and wages.

Until recently, many Democrats in Congress defended the filibuster, but most of them now recognize that this legislative procedure has become a toxic force. It thwarts progress. The filibuster gives Republican senators the power to stymie reforms that most Americans want.

If the U.S. political system operated under the principle of majority rule, the G.O.P. would already be in a crisis. Republicans would be outnumbered in all three branches of the government. The G.O.P.’s candidates for President of the United States lost the popular vote in seven of the last eight presidential elections. If popular votes determined the presidency, Supreme Court justices nominated by Democratic presidents would now outnumber justices appointed by Republican presidents by 6-3. Instead, conservative justices have a 6-3 advantage. Democratic senatorial candidates have received substantially more popular votes nationally than Republicans, yet they often fail to achieve control. In 2018 Democrats won twelve million more popular votes than Republicans in Senate elections, but the G.O.P. retained control of the upper chamber. The system favors small states over large, heavily populated ones. Republican-leaning Wyoming, with just half a million citizens, receives two senators, as does Democratic-leaning California, with a population of 39 million.

There is no filibuster rule in the U.S. Constitution. The Founders designed the Senate to make decisions through a simple majority. In Federalist No. 22 Alexander Hamilton warned, “If a pertinacious minority can control the opinion of a majority . . . [the government’s] situation must always savor of weakness, sometimes border upon anarchy.” By the 1840s, South Carolina’s political strategist, John C. Calhoun, perfected the technique to defend slavery and white supremacy. After World War II, South Carolina’s Strom Thurmond, Georgia’s Richard Russell, and other southerners invoked the filibuster to block civil rights reform. After the 1964 Civil Rights Act became law, the filibuster was less useful for protecting Jim Crow.

Not long ago, strategists from both political parties backed the filibuster. The procedure required a supermajority to get important measures passed. The Senate was supposed to be the body where extreme bills passed by the House could be cooled, moderated, or left to die. Senators in both parties found the filibuster convenient because it protected the minority party in the Senate from being steamrolled by the majority party. Ideally, the filibuster encouraged negotiation and compromise. Because winning 60 or more votes for major legislation (the current requirement) demanded broad support, the practice seemed to promote deal-making and bipartisanship. Republicans and Democrats knew their party might want to employ it after losing an election and experiencing minority status in the Senate.

Rationales for defending the filibuster no longer hold. In recent decades politics in Washington turned sharply partisan. There is little evidence now of senators crossing party lines and voting with the opposition on important legislation. The pattern of intense partisanship was dramatically evident in the Senate’s vote on the $1.9 trillion COVID relief bill. Not a single Republican senator supported it.

Negotiation to achieve 60 Senate votes is extremely difficult in the current political environment. Meeting in the middle was much easier in the 1960s than in the 2020s. Most bills now languish in the Senate, in part because of the filibuster rule. This situation hurts Democrats much more than Republicans. Democrats favor active government. They propose far more bills for governmental initiatives than Republicans. Conservatives view the filibuster as a useful instrument for thwarting progressive programs.

Mitch McConnell, the Republicans’ Minority Leader in the Senate, defends the filibuster as a sacred tradition. He warns Democrats not to tinker with it. Yet senators adjusted the filibuster rules on numerous occasions in the past. Major changes occurred in 1917, 1974, 1975, and 2013. In 2017 McConnell engineered an important exception to the 60-vote requirement. He wanted to confirm Neil Gorsuch for a seat on the Supreme Court. Lacking 60 votes, McConnell and Republican senators changed the rule and secured Gorsuch’s appointment by a simple majority. Later, they placed Brett Kavanaugh and Amy Coney Barrett on the Supreme Court, again by simple majorities.

Mitch McConnell was not acting on high principle or personal commitment to Senate traditions when he altered the rules. He was determined to give conservative jurists lifetime appointments on the high court. McConnell engaged in raw opportunism.

Decades ago, senators rarely employed the filibuster. The tactic became more common from the 1960s to the 1990s, when divisions between the parties turned more severe. The number of cloture motions filed spiked in the Twenty-First Century, a time of intense partisanship. Now the filibuster rule is creating democratic gridlock. Numerous reform initiatives will die in the Senate because the 60-vote rule gives a minority party the power to act like a majority party. The filibuster needs to be eliminated or made extremely rare.

Review – The Ratline: The Exalted Life and Mysterious Death of a Nazi Fugitive by Philippe Sands

In August 1942, SS Major Otto Wachter was feeling frustrated. He was working long hours, setting up concentration camps in German-occupied Ukraine, but he couldn’t find anyone to perform maintenance work at his headquarters in Lemberg. In a letter to his wife, Charlotte, he complained that “the Jews are being deported in increasing numbers and it is hard to get powder for the tennis courts.”

The brutality and callousness of this Nazi officer, an Austrian lawyer, are on full display in The Ratline, the new book by Philippe Sands, which follows Wachter’s wartime career and mysterious death in 1949. Wachter, who was indicted after the war for the murder of 400,000 Jews in western Ukraine, regularly boasted about his accomplishments to his wife. Charlotte, in turn, complained about his prolonged absences while redecorating her new home with artworks and household goods looted from Jewish families and Polish museums.

The Ratline is based in part on several thousand letters exchanged between Otto and Charlotte, and the doomed love story between the ambitious SS major and his lonely wife is one of the running themes of the book. These letters and thousands of other documents and photos are part of the Wachter family archive now in the hands of Otto’s son Horst (born 1939).

The book unfolds as a gripping detective story as Sands, a human rights lawyer based in London, combs through the family’s archive and travels to Washington D.C. and Poland to uncover long-closed war crimes investigations from 1945-49.

The narrative takes an added twist with the perplexing response of the son, Horst Wachter, an elderly man now living in a refurbished castle in northern Austria. After first meeting with him in 2012, Sands gradually wins the trust of Horst, who slowly releases more and more of the family’s copious archives.

Horst even agrees to have a team from the U.S. Holocaust Memorial Museum come to his home. Equipped with high-speed scanning equipment, the museum’s team digitized 9,000 documents and photographs (they can be found online in the museum’s Otto Wachter archive).

Despite the mounting evidence that his father directly ordered the extermination of hundreds of thousands of Jews, Horst insists that his father was innocent of war crimes. Horst makes various excuses for his father, first stating that Otto was “trying to act humanely” in Cracow. Later, when confronted with memos detailing the liquidation of thousands of Jews, Horst insists that his father didn’t “want” to organize the killings, but was “ordered” to do so by a Nazi judge in Germany.

Towards the end of the book, there is a chilling series of photographs showing Otto Wachter directing a group of SS soldiers in the execution of two dozen Polish hostages. When the author presents the photographs to Horst, the son is momentarily stunned, but insists that his father “wasn’t happy about shooting hostages.”

Horst’s cooperation extends to appearing in a 2015 film, What Our Fathers Did: A Nazi Legacy, which features Philippe Sands in a series of conversations with him and Niklas Frank, the son of Hans Frank, the Nazi governor of Poland, who was found guilty at the Nuremberg Trials and hanged in 1946. In the film, Horst defends his father’s actions, maintaining he was trying to be humane. Niklas Frank, on the other hand, denounces his father and agrees that he deserved to be executed for his crimes (the film is available on streaming services).  

Death in Rome

While Otto Wachter’s wartime career is well documented, his public life ends in May 1945 with the collapse of the Nazi regime. He went underground, acquiring a series of different identities, living in the mountains of southern Austria, supplied with food and clothing by his wife. By 1949, Wachter sensed that Allied investigators were closing in. He was on a watch list of Nazi fugitives and worried he would soon be spotted, caught and put on trial. He made an arduous trek over the Alps into Italy, where he spent the last six months of his life living in a monastery controlled by a pro-Nazi Catholic Bishop, Alois Hudal. Although several Nazi fugitives (e.g., Adolf Eichmann) were able to escape from Italy to South America, Wachter was either unable or unwilling to obtain the papers required to get past port officials and successfully emigrate. He dallied in Rome, enjoying the historic city and even getting a part as an extra in an Italian film.

Otto Wachter died suddenly from a mysterious illness in a Catholic hospital in July 1949. Unable to communicate directly with his wife, he died alone and was buried in a Rome cemetery. Charlotte later exhumed his body and had it reburied near her home in central Austria.  

The book’s title, The Ratline, refers to the loose network of ex-Nazis living in Austria and Germany, who worked with a rogue element of the Catholic Church to provide housing and false identity papers to Nazi war criminals fleeing Allied investigators.

In the 1974 Hollywood film The Odessa File, starring Jon Voight, the ratline is portrayed as a centrally operated, well financed operation enabling hundreds of Nazi fugitives to quickly gain new identities and escape justice.

But Sands’ book exposes this to be a Hollywood fantasy. In fact, the ratline was a poorly coordinated, shoestring operation, dependent upon a small group of sympathetic German businessmen and Catholic priests. The Ratline also gives glimpses into the secret operations of the Army’s Counter Intelligence Corps, which screened many fugitive Nazis and recruited a select few to spy on Communists in Eastern Europe.

The Banality of Evil

In 1963, philosopher Hannah Arendt published Eichmann in Jerusalem: A Report on the Banality of Evil, which was based on her attendance at Eichmann’s trial in Israel for the mass murder of Europe’s Jews.

Arendt described the “manifest shallowness” of Eichmann. She asserted that he acted “thoughtlessly” and that his crimes were committed “under circumstances that made it impossible for him to know or feel that he was doing wrong.”

Arendt said “The deeds were monstrous, but the doer, at least the one now on trial, was quite ordinary, commonplace and neither demonic nor monstrous.”

Arendt only attended the first two days of the Eichmann trial, and many subsequent Holocaust historians have disputed her conclusions. For example, Deborah Lipstadt noted that Arendt failed to consider much of the documentation released by the Israeli government. She also pointed out that the senior Nazis were quite aware of their guilt because, as defeat loomed, they feverishly tried to destroy evidence of their crimes.

In The Ratline, we see that Otto Wachter was hardly a banal or thoughtless individual. A highly educated man from an affluent family, he was a successful lawyer in Vienna before fleeing to Nazi Germany after the failed putsch in 1934. Wachter was a zealous anti-Semite and firm believer in Nazi ideology. He was given substantial authority to carry out mass extermination in the captured territories, and he acted with speed and efficiency, directly supervising mass killings when necessary.

Although The Ratline can be enjoyed as an absorbing detective story, it contains important lessons for 21st-century readers. Tzvetan Todorov, a Paris-based Holocaust historian, wrote, “Understanding evil is not to justify it, but the means of preventing it from occurring again.”

Although the Nazi regime was crushed militarily in 1945, the poison of anti-Semitism and the attraction of fascist dogma remain present to this day. They can be found in Germany’s far-right AfD party and in some of America’s home-grown militia groups.

The Ratline gives readers a chilling look at the day-by-day actions of a top Nazi and the troubled legacy his family is still struggling with.

Incognegro, Part II: How New York Law Enforcement Worked to Destroy CORE

The release of the film “Judas and the Black Messiah” has put a renewed spotlight on the Counter Intelligence Program (COINTELPRO) of the Federal Bureau of Investigation (FBI). The film dramatizes the story of Black FBI informant William O’Neal, who infiltrated the Chicago branch of the Black Panther Party, and shows how his actions ultimately led to the assassination of its leader Fred Hampton and Panther member Mark Clark by the Chicago police department. The assassination was part of a larger set of objectives by police and the FBI’s COINTELPRO program to "misdirect, disrupt, discredit and neutralize" any Black activist group it deemed “Black extremist.”

 

As a result, Fred Hampton was not the only Black leader targeted. The recently released memoir The Ray Wood Story reveals a similar circumstance that began with infiltration into the Congress of Racial Equality (CORE) and ended with accusations of COINTELPRO’s role in the Malcolm X assassination. While much is being made of what the book has to say regarding the murder of Malcolm X, there is little discussion of the book’s claims concerning the New York City Police Department’s (NYPD) efforts to destroy CORE, one of the most significant organizations of the civil rights movement.

 

According to the book, Raymond Wood was a newly recruited NYPD officer in the Bureau of Special Services and Investigations (BOSSI) assigned specifically to discredit the local leadership of CORE chapters. Dr. Chenjerai Kumanyika suggests in his foreword to the book that these illegal actions were most likely a response to CORE’s campaigns against police brutality. I would add that the NYPD’s actions were, more generally, a response to CORE’s growing militancy.

 

According to The Ray Wood Story, in April 1964 Wood became a member of BOSSI, a special division of NYPD created to monitor and surveil political radicals. BOSSI was just one of many such local police groups around the country that functioned as satellites for COINTELPRO.

 

Wood was immediately assigned to infiltrate the Bronx chapter of CORE. The assignment followed only a few weeks after Bronx CORE carried out two major police brutality demonstrations at NYPD headquarters and City Hall. The demonstrations caused such a disruption that Bronx CORE’s chairman Herb Callender was singled out by the police commissioner in the press as one of the three most “irresponsible” Negro leaders in the city, along with Malcolm X and rent strike leader Jesse Gray. Callender was further accused of having a “lust for power,” “sinister motives,” and “no real concern for the fight for equality or for the people waging this battle.” Undaunted, Callender followed with his own statement to the press: “We’ll have to shock the officials and the public to get the city to face up to the realities of the day.”

 

In July, Callender, Wood and another member of Bronx CORE tried to make a citizen’s arrest of New York City (NYC) Mayor Robert Wagner. Callender was himself arrested, but instead of simply being placed in jail, he was committed by the judge to Bellevue Hospital for psychiatric evaluation. The implication that Callender might be mentally ill could have ruined his credibility as a civil rights leader. He spent five days there until he was bailed out, and he was eventually found guilty of disorderly conduct. In discussing the incident in his book Lay Bare the Heart, James Farmer stated that Callender told him the idea to arrest the mayor was Wood’s.

 

Wood himself confirmed Callender’s claim and admitted that his job was to discredit Callender as a leader and, by extension, potentially neutralize CORE. However, he claimed to have later rejected the assignment because he saw no real evidence of serious crimes being committed. His supervisors responded by applying “extreme pressure… to produce an arrest.” Wood was threatened by his handlers with arrest for selling marijuana, even though it was his handlers who had given him the cover of a drug dealer. Just as in the case of William O’Neal and the Black Panthers, law enforcement and the intelligence agencies used the threat of arrest as a common tool to coerce informers into acting as traitors to the Black Freedom movement.

 

This is only part of the story. There is additional evidence, not presented in the book, that the threats against Wood were far more lethal than imprisonment. In personal communication with me, Wood stated his handlers also threatened to expose him as an undercover cop – an act undoubtedly intended to make Wood fear a brutal attack and, ultimately, death in prison. This would help explain his motivation to remain, because in a very short period of time Wood worked his way up the ladder and became not just a leader within Bronx CORE but a delegate to CORE’s annual national convention. The resulting exposure enhanced his activist profile, brought him into contact with CORE members from all over the country, and gave him the opportunity to gather intelligence on them.

 

Meanwhile, he continued socializing locally with CORE members, dating them and going to other chapters’ functions, such as the events sponsored by the East Harlem chapter, East River CORE. Nicknamed the “River Rats,” East River CORE was arguably at one point the most militant chapter in NYC. On March 6th, the same day Bronx CORE sat in at NYPD headquarters, several members of East River CORE staged a sit-in on the nearby Triborough Bridge in the middle of rush-hour traffic. This was the first time the tactic had been used in the civil rights movement and, as the New York Times stated, it was carried out with military precision. Probably not by coincidence, the head of the chapter, Blyden Jackson, had been in the Marines.

 

East River CORE member Stuart Wechsler shared with me that when Wood came to meetings he would help set up chairs and be generally helpful, but he also once suggested to Wechsler that they rob liquor stores in order to raise funds for the chapter. Wood would later testify against Blyden Jackson for being a Communist before the House Committee on Un-American Activities in 1967. This helped create a narrative in which Jackson was potentially an agent of a foreign power, which played into false claims that the civil rights movement was part of a subversive plot to destroy America.

 

Rafael Martinez, another member of the New York CORE chapter in Harlem, remembers Wood being next to him when he sat in at that year’s World’s Fair demonstration. The sit-in inside the Fair coincided with an even more audacious protest, the Stall In, which was planned by the so-called “ghetto chapters”--Brooklyn CORE, Bronx CORE and New York CORE. The protest was meant to bring attention to CORE’s ongoing efforts to fight racism in the construction industry.

 

Set to take place on the opening day of the World’s Fair, April 22, 1964, the Stall In called for drivers to stall their cars in the middle of traffic throughout the city. With newspaper headlines such as “…CORE's Fair Tie-up Can Paralyze the City,” even the White House took notice. As historian Brian Purnell has pointed out in his work on Brooklyn CORE, the resulting chaos could not only have disrupted the World’s Fair on a massive scale, it could have embarrassed the United States internationally. The protest threatened to expose America’s hypocrisy in preaching democracy across the globe while failing to practice true democracy at home. The Stall In was announced publicly by Brooklyn CORE as early as April 9. According to the book, Wood was hired by NYPD on April 18th. This supports the idea that he was brought on specifically to deal with the rising militancy in CORE.

 

Even more interesting, Martinez stated to me that when he saw Wood again a few weeks later at the New York CORE office in Harlem, Wood suggested in conversation that they blow up national monuments, including the Statue of Liberty.

  

Wood recounted that he left CORE in November and was reassigned to other groups, specifically the Black Liberation Front (BLF). He successfully got BLF members Walter Bowe and Khaleel Sayyed arrested by instigating a plot to blow up the Statue of Liberty and other national monuments. Wood’s book contends this was done to weaken Malcolm X’s security, making it easier to assassinate him; Sayyed and Bowe had also been members of, and worked security for, Malcolm X’s Organization of Afro-American Unity.

 

While much is being made of the claims concerning Malcolm X’s assassination, lost in the discussion is how the experiences of CORE members support the claims by Bowe and Sayyed that it was Wood’s idea to commit felonious actions. Wood insisted that the idea came from his handlers.

 

Ray Wood was a classic agent provocateur, once described as an “ebony James Bond” by Max Stanford, one of the founders of the Revolutionary Action Movement. Wood’s actions were textbook COINTELPRO in that he not only discredited but potentially neutralized leaders of two of the most militant CORE chapters, not just in NYC but in the entire country. His attempts to engage CORE members in such acts of violence could have helped destroy the reputation of CORE, an organization dedicated to non-violent direct action, and de-legitimize the civil rights movement as a whole.

 

Wood’s presence demonstrates how CORE was seen as a real threat by the State. His case was hardly an isolated one. Interviews with Harlem CORE members indicate that Wood was far from the only police agent to have infiltrated the CORE chapters in NYC. Other chapters across the country, like Miami CORE and San Francisco CORE, had over the years been subject to similar dirty tricks. In fact, CORE had been monitored and under surveillance by the FBI since its inception in the early 1940s.

 

The conversation about the case of Ray Wood needs to be expanded beyond its connection to Malcolm X. Besides CORE, Wood interacted with members of many other groups, including the Nation of Islam, the Black Panthers, the Revolutionary Action Movement, the Student Non-Violent Coordinating Committee and the Organization of Afro-American Unity. The simplest way to confirm the veracity of Wood’s claims would be to release the FBI and NYPD files not just on Ray Wood and Malcolm X but on all these groups. We need to know and understand the full extent of the damage these programs and efforts did to the civil rights and Black Power movements specifically and the Black community in general.

 

The recent events of the Jan. 6, 2021 riot at the Capitol have brought national attention to the question of White supremacy in law enforcement, the intelligence agencies and the military. If the Biden administration is serious about dealing with this problem, here would be a great place to start.

 

White Terrorism: From Post-Civil-War Lynchings to the Present

Detail of Lawrence Beitler's photograph of the lynching of Thomas Shipp and Abram Smith, Marion, Indiana. August 7, 1930.

The violent occupation of the U.S. Capitol building on January 6 shocked many people into taking the threat of White terrorism seriously. Soon afterward, D.C. Mayor Muriel Bowser told “Meet the Press” host Chuck Todd that the essential question was how seriously our country will take threats of “domestic white extremism” and terrorism. Todd later added that “right-wing [White] terrorists perpetrated the majority of all plots and attacks in the United States from 1994 to 2020. Over the past six years, these attacks have occurred in 42 states. In other words, the violence we witnessed on January 6th has been hiding in plain sight.”

 

But right-wing White terrorism has long been a U.S. problem. Witness the examples of the Ku Klux Klan and lynchings, which were sometimes perpetrated by Klansmen. As historian Ron Chernow has written, “Klan violence [of the post-Civil War period] was unquestionably the worst outbreak of domestic terrorism in American history.” Heretofore, however, most Americans have paid no more attention to Klan activities and lynchings than to old cowboy movies. Now, though, media is offering us more reminders. Take, for example, Hulu’s recent “The United States vs. Billie Holiday,” in which the song she made famous, “Strange Fruit,” has a central role. That song was based on a poem of the late 1930s written by teacher, writer, and songwriter Abel Meeropol, who said that he wrote it after seeing a photo of the 1930 lynching of two Black men in Marion, Indiana (more on this lynching later).

 

Holiday’s “Strange Fruit,” which in 1999 Time magazine selected as the “song of the century,” reveals graphically the horror of such lynchings.

 

Southern trees bear a strange fruit
Blood on the leaves and blood at the root
Black bodies swinging in the southern breeze
Strange fruit hanging from the poplar trees

Pastoral scene of the gallant south
The bulging eyes and the twisted mouth
Scent of magnolias, sweet and fresh
Then the sudden smell of burning flesh

Here is a fruit for the crows to pluck
For the rain to gather, for the wind to suck
For the sun to rot, for the tree to drop
Here is a strange and bitter crop

 

A report on lynching (2017) by the Equal Justice Initiative (EJI) documented over 5000 “racial terror lynchings” between 1877 and 1950, overwhelmingly in the South, where until 1910 about 90 percent of Black people lived. (Such lynchings were often hangings, but could also include other forms of illegal killing.) Both before and after this period Black people were also lynched.

The EJI lynching report states, “These lynchings were terrorism”--defined here as the non-governmental use of violence, or threat of its use, for political purposes. The report adds, “Terror lynchings fueled the mass migration of millions of Black people from the South into urban ghettos in the North and West throughout the first half of the twentieth century.”

 

Heavily involved in the early post-Civil-War lynchings was the Ku Klux Klan, founded in Tennessee in 1866. Historian Jill Lepore writes that it “was a resurrection . . . of the armed militias that had long served as slave patrols” and “for decades had terrorized men, women, and children with fires, ropes, and guns, instruments of intimidation, torture, and murder.”

From 1866 to 1871 the newly-born Klan terrorized southern Blacks. According to the Southern Poverty Law Center, it “engaged in a violent campaign of deadly voter intimidation during the 1868 presidential election. . . . Similar campaigns of lynchings, tar-and-featherings, rapes and other violent attacks on those challenging white supremacy became a hallmark of the Klan.” The victor in that election, however, was former Union General Ulysses Grant, and he set out vigorously to end Klan terrorism.

Prior to his election, the Civil Rights Act of 1866 (passed over President Andrew Johnson’s veto) and the Fourteenth Amendment helped Black men vote. Where they could do so without illegal white interference, which was widespread in the Deep South, they voted overwhelmingly for Grant. Once in office, he championed the Fifteenth Amendment, ratified in 1870, which forbade denying the vote to Blacks. In 1870-71 he approved three Enforcement Acts to help protect that right. The third, sometimes referred to as the Ku Klux Klan Act, empowered the president to use federal force to suppress the Klan.

About the Klan Act, Chernow has written that the “law stood as a magnificent achievement for Grant, who had initiated and rallied support for it, never wavering,” and “by 1872, under Grant’s leadership, the Ku Klux Klan had been smashed in the South.”

But Elaine Frantz Parsons’ book on the birth of the Klan during Reconstruction states that by the time Grant had destroyed the Klan, it “had already done a great deal to increase the power and prosperity of white Democratic southerners at the expense of freedpeople and their allies.” Klansmen had lynched and shot hundreds, driven many thousands from their homes and official positions, scared off many Black voters, taken over Black properties, and committed various crimes against them, including rape. Nor did the destruction of the KKK end White terror in the South. An EJI Reconstruction report, for example, lists lynchings between 1873 and 1876 that killed over 350 southern Blacks.

 

The decade from 1891 to 1901 witnessed the most lynchings of African Americans, with over 100 lynched in every year but two. This was also the decade when the Supreme Court ruled that segregation, by then enacted in most of the South, was legal.

 

One of these lynchings was that of Sam Hose in Georgia in 1899. He was accused of killing his employer, a farmer, and raping the man’s wife. After Hose was chained to a tree, the mob cut off his ears, fingers, and genitals, stacked kerosene-soaked wood around him, and then set him afire. After his agonizing death, his body was further carved up and parts taken away as souvenirs. The famous Black leader W. E. B. Du Bois saw Hose’s knuckles displayed in an Atlanta store window. Atlanta’s leading newspaper justified the mob as “a people intensely religious, home-loving and just” who were outraged at a murder and rape--later evidence indicated the “murder” was committed in self-defense and the “rape” never occurred.

 

The second Klan began in Georgia in 1915 but soon spread across the USA, becoming strong in non-southern states like Indiana. After the 1919-1922 period, when at least 239 Black lynchings occurred, the frequency of lynchings began to recede: no subsequent year exceeded 29, the years from 1935 to 1951 saw between 1 and 8 per year, and from 1952 to 1968 the average was less than one per year. Most later lynchings did not directly implicate the Klan, though its earlier example helped create the toxic mix that sustained the lynching climate.

 

Increasingly, however, the KKK used other terrorist methods against Black people, as it did in 1921 in the case of a Dallas, Texas bellhop named Alex Johnson, whom Klansmen accused of having sex with a white woman. They branded “KKK” onto his forehead, but, as usual, the Klansmen responsible were not prosecuted.

 

The Klan reached its apex in the mid-1920s and then began to decline. In 1925 over a quarter of “native-born, white Indiana males” belonged to the Klan, but that November the head of Indiana’s KKK, David Stephenson, was convicted of raping and murdering a young woman. By 1930, due to that and various other factors, Klan membership, which had numbered perhaps 4-5 million in 1925, dropped to an estimated 30,000.

 

The situation in Marion was similar, and the KKK, as an organization, had no direct part in the lynching of two young Black men there in 1930. But that does not mean it had not contributed to the deadly attitudes that launched the lynching.

 

The two lynched men were Abram Smith and Tom Shipp, both accused of murdering Claude Deeter and raping his companion, 18-year-old Mary Ball. Also accused was another Black teenager, James Cameron. But after having a rope placed around his neck, he was returned to jail.

 

Later investigations, as was the case with Sam Hose in 1899, cast serious doubt on the stories the mob believed when it lynched Smith and Shipp, especially the occurrence of any “rape.” In fact, such false accusations against Black men commonly preceded lynchings.

 

The Marion lynchings became so significant partly because of Lawrence Beitler's famous photograph, which influenced both songwriter Abel Meeropol and singer Billie Holiday in the creation of “Strange Fruit.” In addition, the photo was widely displayed in newspapers and other publications. As Leon F. Litwack indicated in Without Sanctuary: Lynching Photography in America, his book of lynching photos that were often reproduced on postcards, such photos could be displayed not only in publications critical of such lynchings, but by their defenders and apologists, partly as a warning to Black people not to challenge the gospel of White supremacy. The book and its pictures suggest that many lynch mobs and onlookers were composed of “ordinary people,” and both historian James H. Madison’s book on the Marion lynching and Beitler’s photo suggest the same.

 

From 1930 to the present, both lynchings and the Klan have declined. One report on the 1981 Alabama lynching of Michael Donald refers to it as the “Last lynching in America.” A 2016 report on the KKK by the Anti-Defamation League states that it remains “a collection of mostly small, disjointed groups that continually change in name and leadership. . . . There are currently just over thirty active Klan groups in the United States, most of them very small. However, the association of Klan members with criminal activity has remained consistent.” Klan members have, however, been active in some recent right-wing protests, including the “Unite the Right” rally in 2017 in Charlottesville. In 2020 a Klan member drove his truck through a crowd of Black Lives Matter protesters near Richmond, Virginia.

 

Most white terrorist activities in recent decades, like the Oklahoma City bombing of 1995 or the shooting deaths of nine Black congregants in a Charleston, S.C. church in 2015, were not KKK-directed, but they still reflected a similar white supremacist mindset. It was often mixed with hostility to the federal government and its perceived infringement on citizen rights--for example, in regard to any type of gun control.

 

In their mixed motives, one of which was demonstrating white supremacy (note the Capitol occupier carrying the Confederate flag), those who used terrorist means to occupy the U.S. Capitol building in January 2021 were typical of recent right-wing protesters. The FBI recently noted that two right-wing groups, the Proud Boys and Oath Keepers, “were in the vanguard” of the occupiers. Although both groups deny being racist, the Southern Poverty Law Center indicates that white supremacy is part of their ideologies.

 

Most occupiers, however, were not members of right-wing extremist groups, but simply Trump followers. Three characteristics that they had in common with many who supported or failed to protest historical lynchings were 1) an us vs. them perspective, 2) a belief in their vision of Christianity, and 3) a willingness to believe rumored falsehoods.

 

In his book on the Marion lynchings, Madison writes of white crowds’ “us and them” outlook; and in observing Trump supporters in 2016, writer George Saunders noted that many of them suffered from “usurpation anxiety syndrome.” He defined it as “the feeling that one is, or is about to be, scooped, overrun, or taken advantage of by some Other with questionable intentions.” The “some Other” could be such groupings as Blacks, illegal immigrants, or big government.

Regarding the 1920s Klan, “the bedrock of [it] remained its commitment to the continuation of native-born white Protestant hegemony in American culture and governance.” In addition to opposing Black equality, the Klan supported Prohibition and targeted Catholics, Jews, and non-Protestant immigrants. Trump’s largest group of loyal supporters in both 2016 and 2020 were White Protestant evangelicals. Neither in the 1920s nor in Trumpian times did such Protestants believe that opposing racism was central to the Christian message.

In many of the lynchings of earlier times, we are struck by the gullibility of lynch mobs--their willingness, for example, to accept as truth that a Black man had committed rape. Many 2021 Capitol occupiers, and even many other Trump supporters, were also willing to believe unfounded rumors, especially about the 2020 election being “stolen” from Trump. A smaller percentage of occupiers were even willing to believe some of QAnon’s more outlandish conspiracy theories.

Although many Trump supporters were unwilling to countenance violent means to overturn the 2020 presidential election, the Capitol occupiers, like lynch mobs before them, were willing to employ such means. And still other Trumpians, like the Marion citizens viewing the 1930 lynchings, were insufficiently outraged by such behavior. Such a lack of indignation can only help White terrorism continue.

 

 

The Same Mistakes Twice? Teaching Dr. Seuss

Ted Geisel, better known as Dr. Seuss, has been subject to distortions from both right and left. Republicans who normally support the right of private businesses to do what they damn well please, even to the detriment of fellow citizens, as freezing Texans saw recently, are suddenly appalled at the decision of the Seuss publisher and heirs to withdraw six of his titles. These are not among Seuss’s greatest hits. I recall only one, just barely, from grade school sixty years ago.

For an accurate view of Seuss/Geisel's positions on race and ethnicity, see the anthology of his cartoons, Dr. Seuss Goes to War, with an introduction by Art Spiegelman. Incidentally, I bought the book at a special exhibit by the Houston Holocaust Museum. Seuss relentlessly combats isolationists (pp. 52-55), shows the "Father of Hate Radio" Coughlin on the line with Hitler (p. 51), depicts Lindbergh "Spreading the Lovely Goebbels Stuff" from a garbage truck (p. 41), and in one of the best, portrays auntie America First reading the story of Adolf the Wolf to a couple of freaked out kids: "and the Wolf chewed up the children and spit out their bones . . . But those were Foreign Children and it really didn't matter" (p. 45). Brings to mind the jacket slogan of a certain First Lady visiting the Mexican border.

Seuss combats racism and anti-Semitism stateside, especially in war industries (pp. 56-63), turning an old racist epithet on its head: "There seems to be a white man in the woodpile," at a plant that posts "No Colored Labor Needed." His caricatures of the Japanese enemy are vicious, but hardly more so than his portrayals of Hitler, and Nazis in general.

But like everyone else, the liberal cartoonist errs occasionally. He depicts Japanese Americans as the "Honorable 5th Column" stocking up on TNT and "Waiting for the Signal From Home" (p. 65), but he's in "good" company there with Earl Warren (R-CA), who later as Chief Justice overturned legal segregation.    

Despite cheap shots from the right and the left, I will continue to use Seuss/Geisel's cartoons in my teaching—and not only the anti-isolationist ones showing America sitting idly by when Britain and Russia were standing up to Hitler alone. Nobody gets through my junior immigration history course or even my freshman U.S. history survey without being posed the question of what Anne Frank would have become if her father had applied for admission to the U.S. (short answer: a concentration camp victim; he did apply in vain). I use Seuss cartoons to show that ignorance of European conditions was no excuse. I follow this up with the unjust internment of Japanese Americans, two-thirds of them U.S. citizens, cheered on by most Americans including Dr. Seuss.

Then I ask my students why it's necessary to include these "un-American" aspects. Rather than answering myself, I leave the final word to Betty Kanameishi. Her "love letter" to America, written behind barbed wire in her 1944 internment camp high school yearbook, closes with an admonition that could also be applied to Dr. Seuss: "I worship you in spite of the errors you have made. Yes, you have made errors and you have roamed on many wrong roads; but everyone makes mistakes. ... All I ask is that you do not make the same mistake twice." Then I tell students, that's my job--and yours as well. And it’s a lesson that Dr. Seuss himself learned in the course of his life.

Persevering to Mars

There is a lot to marvel at in Perseverance’s February 18 landing on Mars, beyond robotic exploration as an extreme sport.  Only half of attempted missions to Mars have succeeded, and the sheer technical audacity that stuck Perseverance’s landing is guaranteed to dazzle.  But America’s latest endeavor joins two other missions from civilizations re-emerging as global actors after centuries of exploring quietude.  Perhaps more deeply, Perseverance’s first-contact photo, a shadow selfie, raises questions about the very nature of discovery and the character of an explorer.

All peoples are curious, and all travel, but 600 years ago Europe began granting those impulses something like an institutional form dedicated to geographical exploration.  The project was spearheaded by Portugal, a marginal country flanking a marginal continent.  The motivations behind dispatching expeditions of discovery involved plenty of pushes and pulls.  In its founding century exploration often stammered and sputtered amid spurts and long pauses.

Critical to the dynamic was geopolitical rivalry.  The Great Age of Discovery was powered by European states that projected their competitions outward – first between Portugal and Spain.  Early on the pope intervened with the Treaty of Tordesillas, which divided the globe between them.  Where and when they stumbled, other European states moved in.  The chronic quarreling within Europe kept the larger pot simmering.

By the early 18th century, however, exploration was becoming moribund.  Trade routes were known, and newcomers like the Dutch, French, and British were content to pick off prize sites from Iberia’s creaking overseas empires.  Few new islands were discovered; buccaneering replaced exploration.  Even intellectual and literary enthusiasms were fading.  Robinson Crusoe (1719) is a paean to the perils of wanderlust; Gulliver’s Travels (1726) skewers discovery that then leads to conquest.  Like a rogue wave, the Great Age of Discovery had, it seemed, risen, spread, and subsided.

Then the process rekindled.  A new era of circumnavigation caught Europe’s fancy, northern Europe replaced Iberia as a colonizing force, and Britain and France began their long (and global) new hundred years war.  This time modern science had arrived to bond with voyages.  Expeditions were sent around the globe to track the transits of Venus and to measure the shape of the Earth, as knowledge and quests to fabled places joined spices and bullion as incentives.  What William Goetzmann termed a Second Great Age of Discovery revived and redefined Europe’s exploring tradition.

Successful exploration became a measure of power and prestige.  As settler societies sprawled across North America, Australia, and parts of Africa, the Neo-Europes began to sponsor expeditions of their own.  If the first age of discovery put the Renaissance to sea, the second age sent the Enlightenment.  A natural-history survey across a continent replaced circumnavigation as the age’s grand gesture.  By the 1870s expeditions had traversed the world’s habitable continents and the Challenger expedition was inventorying the world’s oceans.

Then, this age, too, stumbled.  It ran out of places to visit and, equally, the means to understand the remote lands that remained.  Antarctica posed challenges that led to some of the greatest survival stories in exploration history, but the available political, cultural, and scientific instruments ended in a whiteout amid a continent reduced to a single mineral.  Interestingly, the ‘heroic age’ attracted newcomers like Belgium, recent nations like Norway, aspiring nationalities like Scotland, and fast-industrializing Japan, all interested in using exploration to announce their arrival on the world stage.

Two world wars, a Great Depression, and an intellectual turn toward Modernism left exploration once more in a trough.  Geographic discovery and high culture had, it seemed, little to learn from each other.  Atoms and genes, Cubism and Dadaism had little valence with exploring new lands.  Decolonization removed Europe’s projected quarrels; the European Union, dedicated to replacing deadly competition with cooperation, would deny Europe the internal rivalries that had traditionally stoked exploration. 

But Western-style exploration was not yet ended.  Beginning in the post-war era, announced by the International Geophysical Year (1957-58), a Third Age of discovery began – this one outfitted with rockets, submarines, remote-sensing devices, all of which could take exploration to places previously denied it.  The Cold War between the U.S. and the USSR furnished the necessary competition. 

Antarctica was the pivot between second and third ages, but the major action occurred in the deep oceans and space.  All three realms were uninhabited, and uninhabitable without elaborate prostheses, even self-contained environments.  More broadly, the age’s hardware took it to novel places faster than the cultural software could process what it found.  The art, science, literature, politics, and legal systems that had interpreted previous ages struggled to understand places that lacked abundant specimens, perspectives, and people.  The great explorers are machines.

The moral drama of discovery, both for good and ill, has historically hinged on encounters between peoples.  But the primary explorers of the third age are robots and remote-sensing devices.  Rather than portraying explorers alongside longtime residents, first-contact images tend to be selfies, or in the case of Perseverance, a shadow of a selfie.  There is no Other, and in space, so far, not even other forms of life.  The ugliness by which contact had so often led to colonization does not exist, but neither do inherited expectations about what an encounter means.  Instead of an Other there is, it seems, an All, by which millions can participate in the first contact of Jezero Crater on Mars.

As in earlier ages, once the primary rivals stalled, others moved in.  What is striking today is the origin of those others outside the West.  India put an orbiter around Mars in 2014; Japan has dispatched spacecraft to asteroids; China and the UAE currently have orbiters circling Mars.  They are providing the money and motives to propel discovery.  Prestige, technological stimulation, and geopolitical jostling have returned, but without the long and peculiar cultural engagement that had made exploration a quest narrative for Western societies.

The third age is different: it visits places without people, it has refashioned the identity of an explorer and redefined the notion of an encounter, and it can allow millions of viewers to experience a first-contact moment virtually.  Efforts to imagine expeditions to Mars or the Laurentian Abyss through the prism of previous centuries can end in parodies.  But the tradition has adapted previously to new participants and has repeatedly adjusted to the prevailing culture of its times, and there is hope that the same will happen now, and that the tradition, transformed, will persevere.

"Freedom of the Press in Small-Town America"

Civil War Memorial, Jacksonville, Illinois – the hometown of historian and essayist Steven Hochstadt

This is not a book about freedom of the press; rather, the author sees it as an example of freedom of the press. The bulk of the book consists of selected weekly columns that ran in the Journal-Courier of Jacksonville, Illinois between November 2009 and October 2018. They are supplemented by pieces the author wrote for a list of friends and associates both before the start date and after the closing date. Each piece is given a short introduction. This is liberal opinion published by a conservative small-town newspaper. Steve Hochstadt is a Jew who grew up in a middle-class suburb on Long Island, New York. He taught history at Bates College in Maine before moving to Illinois College in Jacksonville in rural Illinois. The author seems surprised that he was able to publish his liberal opinions in conservative rural Illinois for nine years. Although he received written expressions of distaste, he was never verbally abused or threatened when he ventured out in this small community.

Yet, at the same time, he admits that he was “naïve” in supposing that he could change public opinion by stating a few facts each week in the newspaper. That does not seem to have worked. Morgan County, of which Jacksonville is the county seat, voted 62 to 65 percent Republican for president in every election from 2004 to 2020, with one exception: in 2008, John McCain carried Morgan County by less than one percent. I wish the author had speculated as to why.

Hochstadt did pull his punches somewhat in the published columns. The introductions and non-published pieces contain more sharply negative assessments of conservative (and Republican) policies and points of view. He also wrote rather often about non-political subjects—family holidays, gardening, dogs, the seasons, and sports figures, especially Jackie Robinson and Muhammad Ali.

The author is an honest, decent, fair-minded, generous, reasonable, and charitable man. He sees himself as an outsider in America, in part because his father as a Jew had to flee Vienna, Austria in 1938 to avoid the Nazis. He has published on the history of the Holocaust in which he, of course, lost family members. The threat of tribalism in every human society is never far from his consciousness.

He asserts that “My lifetime of opinions depends on the crapshoot of birth, the chance of geography, and the idiosyncrasies of family life” (p. 118). How is it, then, that the reviewer, of mostly German Protestant descent and raised on a financially precarious western Missouri hog farm, agrees with him on almost every topic he chooses to discuss? When Hochstadt asserts that “Political economy is what I care about” (p. 381), and “When Republicans turn their evil eye on the poor, I get sick” (p. 214), I’m right there beside him, as I am on climate change, gun control, respect for science, regulation of business, voting rights, health care, racism, anti-Zionism and the many other topics he addresses. Only in love for sports and dogs do our opinions differ. I suffered a great deal of shame for my klutziness on the school athletic field, and the family dog attacked me when I was about 12.

We do share similar experiences. I am only a single year older than Hochstadt, and we are both professionally trained in history. My father, too, had to leave the world he loved best as economic trends forced him off his small family farm when I was in high school. So I, too, have been the skeptical outsider when listening to American corporate capitalism’s promises of plenty. We both faced the Vietnam-era draft, although I was drafted and he escaped with a high lottery number. Yet all this is not enough to satisfactorily explain our political similarity—not enough for me or for the reader either, one suspects.

Reading this book allows one to review many of the most common political concerns liberals have had over the last dozen years. One likes to have one’s point of view reinforced. But I cannot share the author’s optimism about amelioration. Wish that I could!

Still, in this age of extreme political partisanship, one enjoys reading the musings of such a thoughtful and decent man—even one of firm political opinions.

The Roundup Top Ten for March 19, 2021

For 100 Years, the Filibuster has been Used to Deny Black Rights

by John Fabian Witt and Magdalene Zier

The filibuster is often associated with Southern conservatives' opposition to civil rights legislation, but it's important to note that the modern use of the tactic emerged to defeat the 1922 Dyer anti-lynching bill – the NAACP called the filibuster a "license to mobs to lynch unmolested." 

 

We Were Warned about a Divided America 50 Years Ago. We Ignored the Signs

by Elizabeth Hinton

The 1968 Kerner Commission Report on civil disorders recommended police reform, public employment, housing and school desegregation, and a basic minimum income to tackle economic inequality and racial segregation as conjoined problems. LBJ shelved the report, and we pay the price today. 

 

 

Napoleon Isn’t a Hero to Celebrate

by Marlene Daut

The veneration of Napoleon on the 200th anniversary of his death reflects a systemic problem in French education, which touts the color-blind universality of French republicanism (which Napoleon destroyed) without acknowledging his policy of attempted genocide in the effort to retake control of Haiti. 

 

 

Fascism and Analogies — British and American, Past and Present

by Priya Satia

"Historical and local specificities mean all analogies are ultimately inaccurate in ways that historians must always make clear. The point of such comparisons, however, is to uncover darker historical truths obscured by prevailing, more flattering comparisons."

 

 

Manufacturing Isn’t Coming Back. Let’s Improve These Jobs Instead

by Gabriel Winant

Instead of focusing on infrastructure projects, the federal government should act to improve the pay and working conditions of medical and care workers, who have been a growing share of the American working class for decades. This would make poorer and older Americans healthier as well. 

 

 

The Lost Story of Lady Bird

by Julia E. Sweig

"It is perhaps ironic that so many historians, intent as they are on the president, have missed her sway in the White House, because Lyndon himself was not shy in acknowledging Lady Bird’s crucial role in his administration."

 

 

The Forgotten Film That Paved The Way For This Year’s Oscars Contenders

by Rebecca Prime

For the 1968 film "Uptight!," white director Jules Dassin enlisted Ruby Dee and Julian Mayfield to remake the 1935 film "The Informer" around the Black Panther Party, a move which drew on all three principals' experiences with surveillance over political activism and provoked a sabotage effort by the FBI.

 

 

Why Can't Britain Handle the Truth about Winston Churchill?

by Priyamvada Gopal

"Churchill was an admired wartime leader who recognised the threat of Hitler in time and played a pivotal role in the allied victory. It should be possible to recognise this without glossing over his less benign side."

 

 

Neoliberalism with a Stick of Gum: The Meaning of the 1980s Baseball Card Boom

by Jason Tebbe

The baseball card craze of the late 1980s promised Gen X kids a repeat of the collectible windfall like the Boomers enjoyed with their surviving 1950s Topps cards. The reality proved quite different, giving a lens onto economic transformation. 

 

 

Heterophobia? Straightwashing on the Academic Job Market

by Rebecca L. Davis

"Heterophobia is a pernicious idea, one that suggests that to question sexuality’s normative history is to hate people who sexually desire people of a different sex."

 

Attacking Critical Race Theory: A Modern Campaign of Conversion?

Louis IX ordered the burning of copies of the Talmud in Paris in 1242.

One of the older (and odder) books in my personal library is a volume titled Narrative of a Mission of Inquiry to the Jews from the Church of Scotland in 1839. As the title gives away, this book recounts a mission trip to Palestine “to see the real condition and character of God’s ancient people, and to observe whatever might contribute [to interesting] others in their cause”—namely, the cause of converting the Jews in Palestine and beyond to the one true faith, Scottish Presbyterianism. Contained therein is a day-by-day account of their travels from Scotland through Europe (France, Italy, Malta, Greece) and on to Egypt and then Palestine, before returning home on a route through modern-day Turkey and Eastern Europe.

Given that it was authored by a group of missionaries, this Narrative of a Mission of Inquiry offers an amusingly honest account of their travels. For example, they write of a Jew with whom they spent the evening in Boulogne, “He had been long seeking the truth, and thought he was still doing so, but was not convinced that it lay with us.” However, being Scottish Presbyterians, they cannot fully commit to the humanity of the objects of their mission. Indeed, included in the book is an appendix titled “Striking Similarity in the Main Features of Judaism and Popery, Proving That They Have One Author” (that author being, of course, Satan).

The very first similarity drawn between Catholicism and Judaism runs thus: “Popery Says: The Bible is not the only rule of faith. The Church is to determine what is believed. Judaism says: The Talmud and Cabbala are as good authorities as the Bible. Nay, the Talmud is wine, but the Scriptures, taken by themselves, are only water.” This strange prejudice against the Talmud finds expression when the missionaries are returning home and, in the city of Hamburg, encounter one Mr. Moritz, himself a Jew turned Christian missionary, who informs the Scotsmen that Russia is “by far the most important field for a Jewish mission. There are at least two millions of Jews in European Russia, not including Poland, and all are Talmudists except in Courland, where a little more light has broken in.”

Railing against the Talmud has a long history within Christian anti-Semitism. For much of its early history, the Church paid little attention to the development of Rabbinic Judaism, regarding Jews primarily as adherents to the Old Testament alone, as essentially unfinished Christians. When, in the late Medieval period, the tradition of Talmudic study became known to the Church, “prelates and polemicists reacted as if this unknown text were a kind of intellectual well poisoning,” writes James Carroll in Constantine’s Sword: The Church and the Jews: A History. “They seized on news of the Talmud as an explanation for Jewish recalcitrance, as if the work’s secrets equipped Jews with the power to withhold the assent that the friars’ preaching demanded.” Pope Gregory IX even described the Talmud as “the chief cause that holds the Jews obstinate in their perfidy.” These “Jewish secrets” provided not only the explanation for the “ongoing Jewish rejection of Christian claims but also of heresy among Christians themselves,” the Talmud and related texts somehow providing the source of division within the Church.

This notion has recurred throughout history: the belief on the part of the powerful that some idea, Idea X, is the reason behind the current state of social disunity, and that if we could simply rid our society of Idea X, we could finally achieve peace everlasting, with all people happy and content in their respective social stations.

In the present-day United States, conservatives have labeled Critical Race Theory and the 1619 Project as the chief cause that holds certain populations in their perfidy—as our Idea X (conservative concern over the power of Critical Race Theory has even extended to the United Kingdom and France). The legislature in my own home state of Arkansas has recently considered two bills on this very subject. The first, HB 1218, prohibits public schools from offering any activity that “Promotes division between, resentment of, or social justice for a: (A) Race; (B) Gender; (C) Political affiliation; (D) Social class; or (E) Particular class of people.” The second bill, HB 1231, aims to ban any usage of the 1619 Project in state classrooms and describes the project as “a racially divisive and revisionist account of history that threatens the integrity of the Union by denying the true principles on which it was founded.” Notice how these bills use the terms “division” and “resentment,” as if classroom instruction, and not the nature of society itself, were the source of social division. Pope Gregory IX would certainly recognize the spirit underlying such legislation.

In 1242, King Louis IX (later St. Louis) of France ordered his soldiers to sack Jewish homes and synagogues and confiscate all known copies of the Talmud. The faculty at the University of Paris had declared the Talmud a work of heresy, dubbing it the chief reason that the Jews had refused to convert and arguing that destroying it would allow the Jews to recognize the fulfillment of the Old Testament in the person of Jesus Christ. And so the king’s men set fire to some 12,000 books dumped onto the pavement there in Paris. The fire lasted one and a half days. As James Carroll writes, “In the age to come, Jewish ignorance would be defined, ipso facto, as willful.”

Mass destruction of the Talmud did not, as had been hoped, facilitate mass conversion to Christianity. In fact, Christians continued to blame the Talmud for this state of affairs through several centuries and several waves of the Reformation, so that the words of Gregory IX found themselves reflected (ironically enough, given their attitudes toward “Popery”) in an 1839 mission to the Holy Land by a group of Scottish Presbyterians. And neither will the prohibition of Critical Race Theory or the 1619 Project in public schools facilitate mass acceptance of the traditional metanarrative of American history on the part of marginalized peoples. After all, Critical Race Theory and the 1619 Project are simply models designed to help all understand the difficulties large swaths of the American population continue to face in our allegedly post-racial era. These models are not the source of that marginalization, the source of those difficulties, no more than was the Talmud the source of any Jewish intransigence to the alleged truth of Christianity.

Conservatives can lay the blame now upon these models for all the divisions that face American society at present, but if history is any guide, they will not succeed in uniting the United States but, instead, will continue their campaigns of blame down through the centuries, making very little progress for their cause in the meantime. Eventually, Christian Europe decided the best way to kill Judaism was to kill Jews. So where do American conservatives see their own campaign of conversion ending up?

The Women Who Fought Tooth and Nail for the Flint Sit-Down Strikes

 

A Women's Brigade picketer breaks a window after police tear gassed the occupied Chevrolet Plant 9 during the Flint sit-down strikes.

Still from With Babies and Banners: Story of the Women's Emergency Brigade, Women's Labor History Film Project, 1978. 

Internet Archive.

In downtown Flint, Mich., stands a pantheon of statues dedicated to automotive pioneers. David Buick and Louis Chevrolet, the namesakes of two of General Motors’ classic brands, are both world famous. Some Flintstones would like to add a statue of a lesser-known figure: Genora Johnson, leader of the Women’s Emergency Brigade during the 1936-37 Flint Sit-Down Strike.

The strike began on December 30, 1936, when autoworkers occupied GM’s Fisher One plant, demanding better wages, more job security, and an end to the hated assembly line “speed up,” which so exhausted the workers that they could barely pick up a fork to eat at the end of the shift. That night, the “cut and sew” women in the upholstery department were ordered to leave the plant, to deter rumors of sexual mingling among strikers.

When Genora Johnson first offered to volunteer at strike headquarters, in Flint’s Pengelly Building, she was assigned to the kitchen, like all the other members of the Ladies’ Auxiliary. Johnson, a striker’s wife long devoted to socialist causes, thought women belonged on the front lines of the strike, beside their husbands, brothers, fathers and sons. First, she organized a picket line outside Fisher One. Her two-year-old son held a placard reading “My Daddy Strikes for Us Little Tykes.”

From With Babies and Banners

 

On January 11, 1937, the Flint police attacked Fisher Two, in what became known as the Battle of the Running Bulls. The strikers repelled the police by pelting them with door hinges and spraying them with fire hoses. During their retreat, the police opened fire, wounding 14 unionists. After the shooting stopped, Johnson urged a group of women to break through the police lines and protect the men inside the plant from further violence.

“I ask all the women here tonight to come down and stand with your husbands and brothers,” she declared through a loudspeaker mounted on a sound car. “If the police are cowards enough to shoot down defenseless men, they’re cowards enough to shoot down women. Women of the city of Flint, break through these police lines, and come down here and stand with your husbands and your brothers, your sons and your sweethearts.”

The Battle of the Running Bulls transformed the Ladies’ Auxiliary from a homemakers’ sodality into a quasi-military force. After the women formed a human shield outside Fisher Two, they realized they were just as courageous as the men, and just as capable of standing up to the police—maybe more so, because the “flatfeet,” as they called the cops, wouldn’t attack women.

“We have got to organize the women,” Johnson declared that night. “We have got to have a military formation of the women. If the cops start firing into the men, the women can take the front line ranks. Let them dare to shoot women!”

The next day, fifty mothers, daughters, wives, and sisters gathered at the Pengelly Building, in answer to Johnson’s call for women willing to place their bodies between police and strikers. “It can’t be somebody who’s weak of heart!” she announced. “You can’t get hysterical if your sister beside you drops down on a pool of blood. We can’t be bothered with having to take care of two people, if one is injured and another is going to go hysterical. Do not sign up for the Women’s Brigade, take your role in the strike kitchen, take your role in the first aid station in the Ladies’ Auxiliary.”

The first to stand was a woman in her seventies.

“This is going to be difficult for you,” Johnson cautioned.

“You can’t keep me out,” the old woman insisted. “My sons work in that factory. My husband worked in that factory before he died, and I have grandsons there.”

Of the thousand women who belonged to the Ladies’ Auxiliary, four hundred joined the Women’s Emergency Brigade. Every member was issued a red beret and red armband with the white letters “E. B.” Johnson appointed herself captain. Tall, with raw-boned features, a deep background in labor rhetoric, and a commanding manner, she was a natural leader. Her five lieutenants each commanded a squadron ready to gather outside a factory at the summons of a phone call.

The women adopted a military-style costume of jodhpurs, a waist-length jacket, and knee-high boots. And they armed themselves. From the men inside Fisher One, one woman acquired a blackjack, attaching it to a wristlet concealed inside her sleeve so she could flick it out at the first sign of trouble. All the women in the Brigade carried billy clubs, their handles whittled down to fit a female grip.

Women's Emergency Brigade picketer smashes a window at Chevrolet Plant 9. From With Babies and Banners.

 

The Brigade got its first taste of battle when the strikers attempted to capture and occupy two Chevrolet engine plants. At Chevy Nine, plant police resisted the takeover by firing tear gas at workers inside the plant. As the close air of the shop floor filled with the choking smoke, a striker broke a window. Pushing his bloodied face through the hole in the glass, he shouted to the women on the sidewalk, “They are gassing us! They are gassing us!”

“Smash the windows!” ordered a voice from the union sound car.

Dressed for battle in red berets and armbands, parading with an American flag at the head of their column, thirty Brigade members pulled billy clubs out from under their long winter coats and swung them at the bank of windows.

“They’re gassing our husbands!” one woman yelled. “Give them air!”

The women shattered every pane they could reach, littering the shop floor with tinkling shards of glass. Tear gas flowed out through the jagged holes. When the Flint police attempted to arrest a club-wielding Brigade member, she wriggled in their grasp, shrieking “Get your hands off me!”

From With Babies and Banners

 

(The next day’s New York Times reported their action under the headline “Women’s Brigade Uses Heavy Clubs.” The Flint Journal wrote, “These women smashed scores of windows in the plant in a hysterical frenzy, seemingly with an urge to destroy, for officials could find no other reason for smashing glass in window after window.”)

After the strikers captured Chevy Four, the Brigade formed a barricade around the plant.

“What kind of cowards hide behind women?” a cop bellowed, loudly enough so the men inside the plant could hear.

Johnson took this taunt personally. Climbing into the sound car, she grabbed the microphone to denounce the company’s “hired thugs.”

“We don’t want any violence,” she declared. “We don’t want any trouble. My husband is one of the Sit Down Strikers. We are going to fight to protect our men.”

From With Babies and Banners

 

The strikers occupied GM’s plants for 44 days before the company capitulated, allowing the United Auto Workers to negotiate on their behalf. The strike guaranteed middle-class wages and benefits for autoworkers for generations to come.

Wearing the red beret of the Women's Emergency Brigade, Genora Johnson Dollinger addresses the UAW in 1977, using the occasion of the fortieth anniversary of the Flint strikes to demand the union include issues like child care in contract negotiations. From With Babies and Banners.

 

At the Sit-Downers Memorial Park, outside UAW Region 1-D in Flint, there is a bronze statue of an anonymous Women’s Emergency Brigade member smashing a window with a billy club. On White Shirt Day, held every February 11 in Flint to commemorate the strike’s end, women dressed in red berets and armbands serve bean soup, bread, and apples, the food the Sit-Downers ate in the plants. Genora Johnson, though, has never been memorialized. A downtown statue of Flint’s Spartan woman would honor not only her, but all the women who played a role in winning the Sit-Down Strike.

 

 

Excerpted from Midnight in Vehicle City: General Motors, Flint, and the Strike That Created the Middle Class by Edward McClelland. Copyright 2021. Excerpted with permission from Beacon Press.

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179552 https://historynewsnetwork.org/article/179552 0
Biden Supports Amazon Workers' Right to Form a Union. Is this a Turn Back to Pro-Worker Policy?

 

 

 

The campaign of Amazon workers in Alabama to win representation by the Retail, Wholesale, and Department Store Union (RWDSU) may be a critical turning point in reversing a forty-four-year trend. Since 1978, corporations and the wealthy elite have accumulated more and more power while the living standards, rights, and power of the U.S. majority have declined. The Amazon union drive points the way toward improving workers’ lives and reducing the glaring inequality in the U.S.

 

As innovative as Amazon has been in many respects, it’s been old-fashioned in its abuse of workers’ safety on the job, its low-wage strategy, its anti-unionism, and its flexing of its monopolistic muscle against competitors. The need to change direction in the U.S. has become more compelling as a result of Amazon’s growing power and the harm done to workers’ health and their well-being in the COVID crisis as a result of their relative powerlessness.

 

Users of the Amazon website can support Amazon workers by speaking out, signing petitions, writing letters, and taking other actions to show opposition to its anti-union policies. The union representation election among Bessemer, Alabama warehouse workers is now underway and concludes on March 29. It’s time to send the company a message that however much we appreciate what it has to offer, we insist that it refrain from its oppressive and anti-democratic practices toward the workers who do the work that is the source of its wealth. During the current COVID crisis, Amazon workers should be receiving hazard pay and having their health protected, not being bombarded with anti-union propaganda.

 

The RWDSU drive at an Amazon warehouse in Bessemer, Alabama is part of the grass roots movement of low-wage workers in the retail, fast food, hotel, ride-share, nursing home, food processing, and textile industries, and in private home care. It is part of the rank-and-file upsurge for the $15 per hour minimum wage, for health protection for front-line workers, for protecting women workers from sexual harassment and providing full coverage for reproductive health, and for equality in the workplace. It is backed by unions and by independent workers’ centers. Many activists are radical visionaries unafraid of the red-baiting of employers and the political right.

 

Whether the Bessemer election is won or lost, it is part of a campaign to change the country’s labor law so that the institutional weight of government supports unionization as a means to promote grass roots democracy and increase the purchasing power of the masses.

 

The 1935 Wagner Act created the framework of a pro-union role for the federal government, of “encouraging the practice and procedure of collective bargaining,” protecting “freedom of association, self-organization, and designation of representatives of their own choosing,” and “negotiating the terms and conditions of their employment or other mutual aid or protection.” The law forbade as unfair labor practices any action by employers “to interfere with, restrain, or coerce employees” in the exercise of their rights.

 

 This pro-labor framework was weakened by the Taft-Hartley Act of 1947 and by pro-employer court decisions, but the Wagner Act ideals are still law. While employers initiated new tactics that further chipped away at labor rights, unions and their allies are today seeking enactment of the Protecting the Right to Organize Act, their sixth attempt to fully reestablish a pro-union federal policy.

 

An effective alliance between a Democratic president and liberal members of Congress is of central importance. The Wagner Act resulted from a combination of grass roots, left-influenced workers’ movements, support from President Franklin Roosevelt, and leadership in Congress from liberal Democrats and representatives from the American Labor Party of New York, the Farmer-Labor Party of Minnesota, and the Progressive Party of Wisconsin. Similarly, the gains by public workers in the 1960s and 1970s began with their own grass roots, left-influenced movements in federal, state, and local governments. Support came from the labor movement and President John F. Kennedy’s Executive Order 10988 establishing limited collective bargaining for federal workers. Strikes by Memphis African-American sanitation workers in 1968, and then in 1970 by postal workers across the country, two-thirds of whom were African Americans, helped consolidate the position of public worker unions in the U.S. economy.

 

Neither Roosevelt nor Kennedy had a perfect pro-labor record, but both gave crucial assistance to unions. In 1940, Roosevelt stopped an effort to gut the Wagner Act, which had passed the House of Representatives, from becoming law. Kennedy followed up his initial executive order with a second order establishing a check-off of union dues for federal unions.

 

The record of subsequent Democratic presidents has included the appointment of pro-union members to the National Labor Relations Board. However, from 1949 until today, efforts to reform the nation’s labor law have fallen short. Presidents Jimmy Carter, Bill Clinton, and Barack Obama all failed to offer the assistance needed to pass pro-union legislation. An additional factor was the use of the filibuster in the Senate to defeat reforms in 1965, 1978, and 1994. Although unions have been a crucial factor in Democratic election victories, it has been decades since they have reaped a labor law victory. The result has been a continuous decline in union density -- the percentage of workers belonging to unions -- since the early 1950s and an escalating growth of inequality since 1978.

 

The presidency of Joe Biden represents an opportunity for the alliance of unions and a Democratic president to come to the aid of workers seeking more income, a greater say in their workplaces, and improved health and security. Supporting Amazon workers as they cast ballots in a union representation election, as President Biden has done, is a first step. All of the intimidating anti-union practices that Amazon is using in this campaign should be illegal. The principle needs to be restored that deciding whether to join a union is the decision of workers, a decision in which employers should have no part. A second step will be eliminating the Senate filibuster so that labor law reform can at last be enacted. A third step will be passing the Protecting the Right to Organize Act and then more comprehensive labor law reform.

 

Support for unionization should come from progressive movements for peace, reproductive rights, environmental justice, race and sex equality, defunding the police, LGBTQ rights, Medicare for all, jobs for all, food security, and consumer safety. Workers and their unions are the heart of the broad movement for progressive change in the United States. The moment has come for progressive organizations to rally around Amazon workers and support their campaign for economic justice.

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179555 https://historynewsnetwork.org/article/179555 0
Peace, Waiting to Be Picked Up: The Secret Diplomacy Failure of 1916 that Changed the World

 

 

On August 12, 1916, France's president, Raymond Poincaré, walked up to the British military headquarters at Val Vion, in northern France, for a private conference with Britain's king, George V.  The king came out to greet him, wearing a beribboned khaki military uniform appropriate to the occasion.  President Poincaré joined him in a more somber kind of uniform, a livery of mourning.  Poincaré wore black from head to toe, without a bit of adornment or decoration. 

To the French public, Poincaré was a symbol of the united war effort, a conservative nationalist who personified France's "sacred union" to win the great war.  That was the public man.  But in private, with the distant thunder of the guns in the background, Poincaré had a sober message.  He confided to the king that he was in favor of "bringing the war to a conclusion as soon as possible."

How could this be done?  Poincaré had his eye on the American path to peace.  He expected the American president, Woodrow Wilson, to offer mediation by October.  "When an offer of American mediation comes," the French president explained, "the Allies should be ready to state their terms for peace."  The French public, he added, was "too optimistic."  The people did not know the full situation.  And he also felt "great anxiety in regard to the state of affairs in Russia" -- a country then about seven months away from the revolution that would topple Czarist rule.

Knowing nothing of this French-British exchange, only six days later, on August 18, the chancellor of Imperial Germany sent a momentous and secret cable to his able ambassador in Washington.  He and his Kaiser were also desperate to end the war and ready for compromise, including the restoration of Belgium.  "We are happy to accept a mediation by the President [Wilson] to start peace negotiations among the belligerents who want to bring this about," the German chancellor instructed.  "Please strongly encourage the President's activities in this regard." 

To avoid giving any impression that his country was weak, the chancellor's plea was utterly secret.  The German mediation request was unconditional.

For more than five months, from August 1916 until the end of January 1917, leaders from Germany, Britain, and the United States secretly struggled to end the Great War. They did so far out of public sight, one reason their battle is still little understood today.

Few know that the German government secretly sought peace and pleaded for President Wilson to mediate a peace conference. This was no informal feeler. It was a direct move made at the top, coordinated with allies and key political figures in Germany.  Few know of the German move; fewer still can trace exactly what happened to it.

Few know that Wilson entirely recognized the significance of this move and sought to act on it as quickly and emphatically as he could. He placed it at the top of his agenda as soon as he was reelected. Wilson also knew he had practically absolute leverage—mainly financial—over the Allied ability to continue the war. Given the political climate in the warring countries, it was the Americans who could give the peacemakers in all the warring capitals the face-saving way out.

Few know that the divided British coalition government was intensely, secretly debating its own growing pessimism about the war and its imminent bankruptcy in the dollars to sustain it. These debates were quickened by a still deeper layer of secret knowledge. British intelligence had learned of the secret German peace move.

Few know any of these things because, to outsiders then and to most historians now, it seemed that nothing happened.

During those five months of speculation, arguments, and choices behind closed doors, the future of the war, and the world, hung in the balance as never before.

The winter of 1916–1917 was pivotal for the history of the United States. Six months before America entered the war, few Americans (or British leaders) predicted it would. Even in January 1917, urged to look to the readiness of the armed forces, Woodrow Wilson, who had just been reelected with the slogan “He kept us out of war,” turned sharply on his adviser. “There will be no war,” the president said. “This country does not intend to become involved in this war.”

Until April 1917 the United States in its 141-year history had never sent a single soldier or sailor to fight on the continent of Europe. During the next year and a half, the United States, then a country of about one hundred million people, would send two million of them across the Atlantic Ocean to war. Neither Europe nor the United States would ever be the same.

There is a public story of why and how America’s historic neutrality came to an end. It is a story catalyzed by a debate over German submarine warfare. That story is well understood.

But behind that public story is the secret story. The Germans resumed their full U-boat war, the public road to wider war, because some German leaders concluded that the alternative road, the secret road, the peace road, had, after months of trying, reached a dead end.

The Americans faced the end of neutrality because they too had run out of options: President Wilson’s alternative, his peace diplomacy, had also failed, although—then and later—he never really understood quite what had gone wrong.

The 1916–1917 phase of peacemaking was also a unique moment in the history of the world. After 1916–1917, there would be other discussions about peace. But the alignment of possibilities slipped away. In March 1917, the Russian Revolution began. The Russian war effort slowly collapsed. That collapse eased some major problems for Germany and its allies. It gave them hope to carry on.

After 1916–1917, the British and French also had fresh reason to hope. They had America on their side. That sustained them, quite literally, in their darkest days.

So, what in August 1916 were two years of agonizing war had by November 1918 turned into more than four. Those further years of widening war changed the whole course of world history.

To pick just one example: without a continuation of the war, it is hard to work out any plausible scenario in which the Bolsheviks would have seized power in Russia. As the war continued, profoundly damaged most of all, beyond the countless individual human tragedies, were the future prospects for core regions of the world—Europe and the Middle East.

As horrific as the war had been until the end of 1916, the conflicts of 1917–1918 pushed Europe and the Middle East over the edge. The historian Robert Gerwarth has recently chronicled that descent.

“Notably in its final stages, from 1917 onwards, the Great War changed in nature….It was in this period that a particularly deadly but ultimately conventional conflict between states—the First World War—gave way to an interconnected series of conflicts whose logic and purpose was much more dangerous.”

 

As I wrote in the study of the 9/11 attacks by the 9/11 Commission, “The path of what happened is so brightly lit that it places everything else more deeply into shadow.” Much of what happened in this history, the secret debates and hidden crises, was already in shadow to begin with. This history should see the light, because, beyond the tragedy, it is also a story of inspiring possibilities.

Two roads diverged. Both were uncertain. One led toward peace, the other toward a wider war. The secret battles to end the war were not a blur of explosions and gunfire, the battles that kill thousands. They were the quieter, more secret kind that determine the fates of millions. A small number of leaders, mainly in London, Washington, and Berlin, faced their two roads.

Analytically, one can distill some of the miscues into cold isolates of timing, ambition, dissembling, and incompetence.  But, as with those who first encountered the world of molecular biology, the closer one looks at this episode with the historian's microscope, strange new worlds open to view.  And, as in the greatest tragedies, what stands out are some human beings, flawed as they are, who did strive courageously to avert catastrophe. They wrestled with a challenge that, in its way, was as great as any of the mud-spattered heroics in Flanders or Galicia, at Verdun or Belleau Wood.

The story of the lost peace would be easy if it were merely a story of governments with irreconcilable goals. But the chancellor of Germany and the president of the United States had a vision that meshed with the vision that held sway in much, if not most, of the British cabinet, at times including both of the relevant prime ministers. The possibilities for peace were tantalizingly within reach.

Some leaders rose to the occasion. Others did not. Some demonstrated the greatest civic courage; others, its absence. It was one of those times that reveal a person’s deepest strengths and weaknesses, in ability and in character.

“Peace is on the floor waiting to be picked up!” the German ambassador to the United States pleaded in November 1916. He was right. But with the war in full bloody bloom, peace depended on enough people choosing the less obvious outcome: they had to step onto the road less traveled by.

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179515 https://historynewsnetwork.org/article/179515 0
Remembering the Father of Vaccination

 

 

As the US COVID-19 vaccination program reaches full stride, approaching 2 million shots per day, the time is ripe to recall the contributions of the physician-scientist who first put vaccines on the map, Edward Jenner.  Some claim that Jenner saved the lives of more people than any other figure in history, yet his contributions are often poorly understood.

 

Jenner did not invent inoculation

Born in England in 1749, Jenner was inoculated as a child against smallpox, a dread disease that appears to have scarred 3,000-year-old Egyptian mummies.  Caused by the Variola virus, the disease manifested as fever and the development of a blistering skin rash referred to as pox.  It is thought that about 30% of infected people died of the disease, especially infants and young children.

In Jenner’s day, inoculation was by variolation. The skin was scratched and smallpox scabs or fluid from an infected person were rubbed into it. When it worked appropriately, this would cause the variolated individual to develop a mild case of smallpox which usually lasted several weeks, after which the patient would be immune. But a small percentage of variolated individuals died.

The technique had been popularized by Lady Mary Wortley Montagu, who both lost her brother to the disease and suffered severe facial scarring herself. While traveling in Turkey, she witnessed variolation, and in 1718 she had it performed on her young son. By the time Jenner was born, variolation had become widely incorporated into English medicine.

 

Other physicians studied vaccination

Jenner was not the first to suspect that prior infection with cowpox provided immunity against smallpox.  At least five physicians had tested cowpox, and even a farmer named Benjamin Jesty had used cowpox to vaccinate his wife and children during a smallpox epidemic.  Jenner, however, was the first to study vaccination in a scientifically rigorous way.

Knowing that milkmaids were generally immune to smallpox, Jenner hypothesized that the pustules on the hands of milkmaids could be used to confer immunity. In 1796, he tested the idea by inoculating James Phipps, the 8-year-old son of his gardener. He scraped material from the hands of Sarah Nelmes, a milkmaid with cowpox, and inoculated Phipps in both arms.

After several weeks, he variolated Phipps.  Although the boy did develop a mild fever, he did not develop full-blown smallpox.  After a time, Jenner variolated him again, with no effect.  The procedure had apparently conferred immunity.  We now know that the viruses that cause cowpox and smallpox are sufficiently similar that the immune response to one can confer immunity to the other.

 

Jenner died long before viruses were discovered

Today we talk easily of viruses, but in Jenner’s day they were completely unknown.  The great microscopist Antony van Leeuwenhoek had discovered bacteria around 1676, but viruses are far too small to be seen through light microscopes.  It was not until the invention of the electron microscope in 1931 that viruses were visualized for the first time.

Unanswered questions about the mechanism of vaccination led the Royal Society not to publish Jenner’s first manuscript, but after he conducted other trials, including one on his infant son, his paper was published.  Perhaps his greatest contribution was his insistence on challenging those who had received the cowpox inoculation with smallpox to prove they were immune.

The term vaccination betrays its origin.  Vacca is Latin for cow, the source of the cowpox material that Jenner and others used to inoculate against smallpox.  Some decades before Jenner’s work, US founding father Ben Franklin decided not to variolate his young son Francis, a decision he regretted the rest of his life because the boy died of the disease at age 4 in 1736.

 

Jenner’s legacy exceeded even his own dreams

Smallpox vaccination quickly spread around the world.  Spanish expeditions carried it to far-flung lands such as America and China.  Napoleon had his troops vaccinated.  Jenner received a host of domestic and foreign honors.  To allow him to focus his attention on his investigations, Parliament awarded him huge grants of 10,000 and 20,000 pounds. 

But Jenner could not have anticipated where his work would lead. Immunization by cowpox held sway until later in the 19th century, when a more modern live-virus vaccine was developed using the lymph of calves. Today versions of smallpox vaccine are available that do not use live viruses and thus cannot cause disease.

Yet no one is being immunized against smallpox today, because of a worldwide immunization program that led the World Health Organization to declare the disease eradicated in 1980.  US vaccination ceased in 1972, though many older adults still bear scars.  Today the smallpox virus is found only in a few secure laboratories, where it is used to prepare against the use of smallpox as a bioweapon. 

Jenner died of a stroke in 1823 at the age of 73.  He continued his scientific investigations until the end, presenting a paper on bird migration to the Royal Society in the year of his death.  Whether or not Jenner truly saved more lives than any other person, there is no doubt that his pioneering work on immunization laid the groundwork for today’s most effective tool against COVID-19, the vaccine. 

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179557 https://historynewsnetwork.org/article/179557 0
"We Just Did What We Had To": Telling the Story of a Slovenian Partisan and a Kiwi POW

 

 

The Note through the Wire is the true story of Josefine Lobnik, a Slovene resistance fighter, and Bruce Murray, a Kiwi prisoner of war – both complete strangers – who meet by chance when she passes a note through the wire of the local POW camp seeking information on her brother, who had been captured by the Nazis. Against all odds, they discovered love in the midst of a brutal war in the heart of Nazi-occupied Europe.

I’ve known about this story for almost 40 years. Bruce and Josefine had always been reluctant to talk about their wartime feats, but in 1998, several years after Bruce’s death, I finally persuaded Josefine to tell their remarkable story. Our family had just shifted to France and Josefine had agreed to share her account of the events that had brought her and Bruce together. Tragically, she was killed in a car accident in Slovenia just three days before she was due to visit us in Aix-en-Provence, and her memoir was never recorded.

I first heard about their love story from Anemarie, my wife (future wife at that time), who told me a little about how her parents met, but I wanted to learn more. I was too nervous to raise the subject at our first meeting, a traditional Slovenian dinner. To be honest, it was intimidating to be in the company of genuine war heroes, but I quickly learned that they were ordinary people. As Josefine always said, with typical modesty: “we just did what we had to.” On our third or fourth meeting, I did ask a few questions but, like many World War II veterans, they were loath to talk about their exploits. I didn’t push it at the time, but I was determined to find out more. After persistent questioning, I started to piece together their story, which only enhanced my admiration and respect for them.

Josefine’s siblings were all active partisans. Her older sister, Anica, held a senior role as liaison officer, and was captured and tortured by the Nazis. Her younger brother, Roman, was also captured and tortured and was seriously wounded twice in partisan battles against the Germans. Her other brother, Leopold, for whom she was searching when she passed the note through the wire, was not in any POW camp. He had been despatched to the dreaded concentration camps: Flossenbürg first and then - one of the worst - Dachau.

Josefine herself undertook many dangerous partisan missions and helped numerous POWs and airmen escape from Slovenia. She was decorated for bravery. In a letter to the New Zealand authorities, Bruce wrote: ‘For my wife’s part in my escape, and the part she played in the escape of others, she was awarded, for this highly dangerous type of war work, a citation by Field Marshal Alexander…’

After the war, Bruce made an application to the Immigration Department to resettle the whole Lobnik family in New Zealand: ‘All in all, this was quite a family of partisans … they fought as surely as my brother, my father and myself on the side of democracy.’ Inexplicably, that application was denied.

Bruce and I got on extremely well and, after a few drinks, he would talk more openly about what he had been through than Josefine ever would. I’m sure that painful memories of past suffering prevented her from relaying the horrors of life during Hitler’s brutal regime. Bruce’s two best friends were killed on the same day at Sidi Rezegh. He felt their loss for many years afterwards.

My own father died when I was quite young, and Bruce became more of a father to me than a father-in-law. We spent many happy times together and I often went with him to the local Returned Servicemen’s Club, where he was more inclined to reminisce about his front-line action in Greece and his time in the POW camps with other ex-servicemen who had shared similar experiences.

After the war, Bruce and Josefine lived unremarkable lives and few people would have known about their incredible story, but they were a devoted pair. I’m sure that the obstacles they had to overcome to realise their dream of a life together meant that they never forgot how precious their love for each other was. Bruce was Production Manager for a hosiery company and Josefine was a talented seamstress. It was difficult for Josefine when she first came to New Zealand. She spoke no English and the only common language she and Bruce shared was German. Josefine did learn English but that took some time – and she always spoke it with a strong accent. Once, she went into a local department store and said she needed a “large shit.” She meant a large sheet.

There was also a lot of prejudice shown towards her. New Zealanders at that time were very suspicious of Eastern Europeans and she was often taunted and told to “go back to where you came from.” This hurt her deeply, as she had risked her life to help Allied soldiers escape and to liberate her country from the Nazis while most of her abusers had lived relatively comfortable lives at home. None would have known that she was a war heroine decorated for bravery.

Josefine was a wonderful cook and loved to prepare Slovenian meals. Maribor, her hometown, was very close to the Austrian border so much of the food had a strong Austrian influence – schnitzels, gulasches, strudels. She also cooked traditional Slovenian delicacies like potica (a nut roll), štruklji (baked rolls with various fillings), klobasa (sausage) - and her favourite, kremna rezina (a custard cream cake). Sunday lunches at their home were often all-day affairs! They were warm, caring, genuine, and generous. I loved them both.

My interest in writing a book about their exploits was rekindled a couple of years ago after a family dinner. We were talking about what Bruce and Josefine had endured and I quickly realised that their great-grandchildren knew very little about their wartime feats – even their grandchildren didn’t know the full story – and I knew then that if I didn’t record the story now it would be lost to future generations. 

The research proved to be more challenging than I had ever imagined – information on Josefine was particularly hard to source because few formal records existed of partisan feats. The turning point came when my wife Anemarie found a box of letters, unopened since World War II, that described, in heart-breaking detail, many of the obstacles her parents had faced and painted a very grim picture of wartime life in Slovenia. The letters provided a wealth of information on the events that shaped their lives and gave an invaluable insight into their wartime experiences; they included not only correspondence between Josefine and Bruce immediately following the war but also letters written during the war from Josefine’s siblings and friends, some of which included coded messages.

One of the things that surprised and shocked me during my research was the harshness of the Nazi regime in Slovenia. I knew that Slovenia was in the heart of Nazi-occupied Europe but I had no idea of the extent of the barbarity. The Nazi intrusion into every aspect of life was both toxic and immediate. Every trace of Slovene heritage was removed: shop signs were taken down and German ones put up; street names were changed to their Deutsche equivalents; place names – even personal names – had to be Germanised; and Slovene literature was confiscated or burned. Anything reflecting the Slovene culture or character was destroyed. Murderous reprisals became routine. Anyone with partisan connections could expect no mercy, and ordinary citizens randomly selected to face the firing squad fared no better. Josefine herself witnessed these innocent residents being gunned down in cold blood in Maribor’s main square. I hadn’t realised, either, that Slovenia was a country riven by civil tensions and internal conflict.

What moved me most, though, was the bravery and resilience of the Slovene partisans who faced death and danger daily in their struggle to liberate their country from the clutches of the Nazi occupiers. I have unreserved admiration for what they accomplished under the most arduous conditions.

My only regret is that neither Bruce nor Josefine lived to see their remarkable story of courage, conviction and love finally recorded. I hope they would have approved.

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179516 https://historynewsnetwork.org/article/179516 0
The Birth, and Life, of a Word

 

 

 

 

During the presidential campaign of 1864, a seventy-two-page booklet appeared on the streets of Manhattan. This publication was titled Miscegenation: The Theory of the Blending of the Races, Applied to the American White Man and Negro. It cost 25 cents. According to the booklet’s anonymous author, “miscegenation” was a word he’d created by combining the Latin root words miscere (to mix) and genus (race). This, the pamphleteer explained, was a more scientific term than “amalgamation,” which he considered a “poor word” to describe white-black intermarriage.

 

The author then expounded at length about the virtues of miscegenation that would inevitably follow a Union victory in the Civil War. “The miscegenetic or mixed races are much superior mentally, physically, and morally to those pure or unmixed,” he wrote. For this reason, “it is desirable that the white man should marry the black woman and the white woman the black man...” When Asians and Indians were added to the mix, he continued, the result would be an improved race of “miscegens.”

 

This position may not sound preposterous in today’s multicultural world, but in the racially charged atmosphere of Civil War–era America, it was incendiary. The idea that intermarriage would be not only an inevitable, but a desirable, consequence of emancipation was a radical and—in the north and south alike—disturbing prospect. Yet this was exactly what Miscegenation proposed.

 

To increase the impact of his booklet, its author sent copies to a number of prominent Americans. The one that went to Abraham Lincoln was accompanied by a note extolling “human brotherhood.” This message expressed hope that the president would stand four-square for equality between “the white and colored laborer,” an inflammatory suggestion in working-class neighborhoods of racially and ethnically polarized cities such as New York. Although Lincoln did not endorse Miscegenation, some abolitionists who received copies did. A few anti-slavery publications, including the Anglo-African Review and the National Anti-Slavery Standard, reviewed it favorably.

 

The contents of Miscegenation quickly became a focal point of campaign oratory. As the 1864 election approached, talk of “miscegenation” dominated America’s political discourse. In Washington, D.C., a Polish expat wrote in his diary, “The question of the crossing of races, or as the newly-invented sacramental word says, of miscegenation, agitates the press and some would be savants in Congress.” Slavery-tolerant Democrats used this neologism to bludgeon antislavery Republicans who, they said, were hell-bent on mongrelizing the white race. Democratic publications warned that race-mixing would be the logical consequence of Abraham Lincoln’s “Miscegenation Proclamation.” The Cincinnati Enquirer advised its readers to beware of “zealous miscegenators.” A Democratic newspaper in New Hampshire ran a widely reprinted article headlined “Sixty-Four Miscegenation,” which claimed, falsely, that sixty-four abolitionist schoolteachers in New England had given birth to mixed-race babies. On the other side of the debate, humorist David Ross Locke, a staunch Republican and favorite of President Lincoln, incorporated the new word into his portrayal of a clueless, semiliterate Confederate named Petroleum Vesuvius Nasby. “Lern to spell and pronownce Missenegenegenashun,” Nasby/Locke advised. “It’s a good word.”

 

But not one that was genuine. Miscegenation turned out to be the creation of two New York journalists who weren’t advocates of race-mixing at all. Their booklet was a hoax: a political dirty trick meant to sabotage Republican prospects in the election of 1864. Years after Miscegenation was published, its authors were unmasked as David Goodman Croly and George Wakeman of New York’s Democratic newspaper The World. By coining this word and using it as the title of their provocative booklet, Wakeman and Croly hoped they could undermine Republican candidates by making the controversial issue of intermarriage a focal point of political discourse.

 

Without naming its perpetrators, a World article headlined “Miscegenation Hoax,” which appeared two weeks after Lincoln’s 1865 inauguration, expressed horror at the way avid abolitionists had overlooked “the barbaric character of the compound word ‘miscegenation.’” This article predicted that “the name will doubtless die out by virtue of its inherent malformation. We have bastard and hump-backed words enough already in our verbal army corps.” In fact, the World concluded, as a usable word, miscegenation had already “passed into history.”

 

Although the uproar surrounding “miscegenation” did die down after Lincoln’s reelection, the word itself did not. By now it’s a well-established part of our lexicon. In a sort of professional hat tip, the renowned hoaxer (and Republican stalwart) P. T. Barnum devoted an entire chapter of his 1866 book The Humbugs of the World to detailing the shrewd composition and brilliant rollout of Miscegenation. The successful propagation of this mock neologism was due to “one of the most impudent as well as ingenious literary hoaxes of the present day,” wrote Barnum. Even though it wasn’t meant to be taken seriously, or outlive its devious intent, miscegenation caught on and stuck around.  Following the Emancipation Proclamation, such a scientific-sounding term was needed to help us navigate the controversial topic of intermarriage. In the absence of anything better, miscegenation fit that bill. To this very day, that word is used so commonly for racially mixed relationships that it doesn’t even elicit synonyms in an online thesaurus. It has even spawned a verb. As literary scholar Walter Redfern once wrote, “The urge to miscegenate counteracts racism.”

 

David Croly had mixed feelings about his role in coauthoring a word that became so ubiquitous. Long after he died, Croly’s widow recalled the way miscegenation was conceived as her husband and a colleague (George Wakeman) composed their tract by that title. “I remember the episode perfectly, and the half joking, half earnest spirit in which the pamphlet was written,” she said. Be that as it may, Mrs. Croly concluded, her husband’s coinage “added a new, distinctive, and needed word to the vocabulary.”  One American who didn’t agree was David Croly himself. Croly considered amalgamation a perfectly good term, one he used until his death in 1889.

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179514 https://historynewsnetwork.org/article/179514 0
FDR and the Need for Truth

Betio Island, Tarawa Atoll, November 1943.

Photo U.S. Navy - U.S. Defense Visual Information Center photo HD-SN-99-03001

 

 

In the mid afternoon of Tuesday, December 28, 1943, Life magazine correspondent Robert ‘Bob’ Sherrod arrived at the White House’s West Wing in preparation for a scheduled 4.00 p.m. presidential radio and press conference.

President Franklin D. Roosevelt had been conducting several media briefings a month since America entered the war with Japan two years earlier. Sometimes, those conferences were filled with juicy material for the media, but Sherrod wasn’t expecting much to come out of this particular press conference. Just days after the Christmas break, which the president had spent at his private residence, Hyde Park, on the Hudson River, this was traditionally a quiet time of year for news.

As invited press and radio men were gathering in the West Wing’s lobby and four o’clock approached, Sherrod was surprised when Steve Early, the president’s long-time press secretary, came to him and took him aside.

“FDR would like a private word,” Early confided.

Sherrod had met the president one-on-one just once before, and briefly at that. Today, since he had only recently gained White House correspondent accreditation, he’d expected to be merely one of many reporters in the room throwing questions at Roosevelt. After serving as a war correspondent for Life in Australia and New Guinea, Sherrod had returned to the States that August, before a stint reporting the US Marines’ Pacific campaign. He had not long been back from covering the American landing at Tarawa Atoll in November.

Steve Early led Sherrod across the West Wing to its southeast corner, and, after knocking, opened the door to the Oval Office, then ushered the reporter inside. A tired-looking Roosevelt, reading papers behind the maple and walnut Hoover desk, looked up, and smiled.

“Ah, Bob,” said the president. He always called pressmen by their first name. Gesturing Sherrod forward, he said, “You were on Tarawa, so I hear.”

“Yes, Mr. President,” the pressman replied.

Roosevelt went on to tell a surprised Sherrod that he wanted his opinion on something. He revealed that, not long before, he had sat through several reels of harrowing 35mm film shot by Marine Corps cameramen attached to the Second Marine Division during the bloody taking of Tarawa.

“They’re pretty gory,” FDR remarked. “They show a lot of dead.” He meant American dead as well as Japanese dead.

“Yes, sir.” Sherrod had been there, had seen it firsthand. The battle, which American commanders had originally expected to bring easy victory, had in reality been like a visit to Hell, and Sherrod would never forget scenes he witnessed on Tarawa over several harrowing days.

Two recollections in particular lodged permanently in the reporter’s mind. Sitting on the beach with his back to the seawall, and with a US marine at his side, Sherrod had looked up as another young American walked briskly across the sands toward them, grinning at the man beside Sherrod, apparently a pal. And then the walking man had done a pirouette, to fall at Sherrod’s feet, looking up at him with a frozen look of surprise in his eyes and a sniper’s bullet in his brain.

An exasperated major had subsequently detailed men to find and eliminate that sniper, who turned out to be hiding in a Japanese coconut-log pillbox that had previously been cleared. Sherrod went with them, and watched as one marine nonchalantly tossed blocks of fused TNT into the pillbox. The detonating high explosive sent the sniper running out the side entrance. Another American marine, armed with a twin-cylindered flame thrower, was waiting for him.

The Japanese, caught in a withering stream of flame, flared up like celluloid. He was dead in an instant, but the bullets in his cartridge belt continued to pop for a good minute after the man had been charred beyond recognition. An eye for an eye? A life for a life? It all seemed so senseless to Sherrod.                 

Short clips from the Tarawa film footage had been released to American newsreel companies, none of them showing American dead. Now, as Roosevelt told Sherrod, he was contemplating releasing all the footage, uncensored, to allow it to be shown in movie theatres the length and breadth of the United States. But, he wondered aloud, were the American people ready for the graphic scenes of young Americans floating lifelessly in the surf, of American troops taking ID tags from dead comrades lying on the island sands?

“That’s the way the war is out there, Mr. President,” Sherrod unhesitatingly replied, “and I think the people are going to have to get used to the idea.”

The President nodded thoughtfully. “Good, good.”

Before consulting Sherrod, Roosevelt had been uncertain whether he should release the footage. He had taken a step in that direction in September, when he authorized the publication of a still photograph by Life magazine combat cameraman George Strock that showed three dead GIs lying on Buna Beach in New Guinea, their bodies covered with maggots. Prior to that time, the War Department had banned the publication of pictures of seriously wounded or dead American service personnel.

Ironically, Strock had taken the picture on captured Japanese film, after his own had been destroyed. Run full-page by Life, the graphic Buna Beach photo had shocked the nation, as Roosevelt had hoped. It had also brought criticism and censure down on the president.

A lesser man would have shied away from giving his critics more ammunition, but Strock’s Buna Beach photo had opened the door to exposing America to the grim realities of this war, and FDR knew that he had to capitalise on Buna Beach’s effect and wed the nation to an uncompromising win-the-war mindset.

Apart from fighting the Axis powers overseas, at home the president was fighting trenchant labor unions, an obstructive, conservative-dominated Congress, and alarmingly high absenteeism at factories producing America’s arms and ammunition. Many Americans just didn’t seem to be taking the war seriously enough, thinking a US victory was going to be a walk in the park.

As Roosevelt instructed Steve Early to usher the remainder of the press corps into the Oval Office, Bob Sherrod suspected that his support had helped the president make the decision to release the Tarawa footage.

At that time, there was a Press Room in the West Wing’s northwest corner – the modern-day Press Briefing Room in the White House sits over what in 1943 was FDR’s private swimming pool. However, for these personal briefings with a select few print and radio journalists, some parts of which were off the record, Roosevelt remained in the Oval Office and had the pressmen brought into him. That way, he was neither seen nor photographed in the wheelchair to which his declining health had confined him.

At 4.07 p.m., following the delay caused by the private Tarawa conversation between Sherrod and the president, the press conference got underway. Questions from the White House correspondents that afternoon covered a range of areas, but the subject of a threatened national railroad strike loomed above all others.

Before Christmas, acting decisively, Roosevelt had appointed nine railroad presidents to the rank of colonel in the US Army, and then made them and their employees answerable to the War Department. At a stroke, FDR had nationalized the railroads, making all railroad workers government employees. This had driven all but three railroad-related unions to arbitration, and as Roosevelt now told the press conference, he was confident the three holdouts would also soon come around to his way of thinking and the strike would be averted.

The president was next asked whether he was planning to continue with his New Deal program in the light of the war’s austerity measures. Roosevelt had introduced the New Deal in 1933 in response to the Wall Street Crash and resultant Great Depression. That program had saved the banking system, revolutionized pensions and social services, and slowly righted the economy.

This New Deal question offered the opening that FDR was looking for. Having now decided to release all the Tarawa footage, he knew that he had to further prepare the nation for the new mindset he expected of it. So, Roosevelt now gave the reporters a folksy analogy.

“The United States of America is like a sick man. Two years ago, he had a very bad accident. Not an internal trouble. Two years ago, on the 7th of December, he was in a pretty bad smash-up.”

Everyone in the room knew that he was referring to the Japanese attack on Pearl Harbor.

“Old Dr. New Deal didn’t know nothing about legs and arms,” FDR went on. “He knew a great deal about internal medicine, but nothing about surgery. So he got his partner, who was an orthopaedic surgeon, Dr. Win-the-War, to take care of this fellow who had been in this bad accident. And the result is that the patient is back on his feet. He has given up his crutches. He isn’t wholly well yet, and won’t be until he wins the war.” In case his audience hadn’t got the message, he concluded with: “The overwhelming first emphasis should be on winning the war.”

The reporters left the press conference itching to share FDR’s sick man analogy with their readers and listeners. None, apart from Bob Sherrod, realized its significance, or appreciated that it represented the core of the president’s changing propaganda strategy, in which truth was to replace triumphalism.

Roosevelt had said nothing to the pressmen about the Tarawa footage, but that had dictated his thinking at the press conference. Once his office was cleared, he called Office of War Information director Elmer Davis and instructed him to have the footage put together in a form that would make the greatest impact on the American public.  

Davis had all the rolls of film from Tarawa edited at Warner Brothers Studios in Hollywood over January and February 1944, creating a twenty-minute documentary. The film’s writer and director was Richard Brooks. Then a young member of the Marine Corps, Brooks would go on in post-war years to become a successful screenwriter and feature film director whose credits would include Blackboard Jungle, Cat on a Hot Tin Roof, and Looking for Mr Goodbar.

Taking the compilation of rough color and black-and-white footage shot by fifteen different Marine Corps cameramen—two of whom had been killed on Tarawa—under the command of Captain Louis Hayward, a South African-born former movie actor, Brooks added a soundtrack with sound effects, dramatic music and a gritty narration. Brooks personally wrote the narration, as if from the point of view of a marine on Tarawa. As specified by Davis, that narration included a pithy explanation for the sight of American dead: “This is the price we had to pay for a war we didn’t want.”

The resulting documentary, With the Marines at Tarawa, was released to movie houses across the country by Universal Studios on behalf of the OWI on March 2, 1944, and shocked and electrified the nation. It went on to win the 1944 National Board of Review Award for best documentary and the 1945 Academy Award for best documentary, short subject.

And so it was that, with the help of two Life magazine men, George Strock and Robert Sherrod, employees of Roosevelt’s ardent Republican critic, publisher Henry Luce, the president was able to loosen censorship in the United States and cement the public behind him in his bid to harden attitudes and strengthen the war effort.

Following the Buna Beach and Tarawa breakthroughs, the US Government permitted the publication of images of dead American service personnel, as long as they were not gratuitous and individual personnel or their units could not be identified.

Scholars today credit George Strock’s Buna Beach photograph with turning the tide in wartime public opinion in the US and stiffening the American resolve to win. In 2014, Time magazine went so far as to describe it as “the photograph that won the war.”

To the frustration of Roosevelt’s Republican opponents, the Buna Beach and Tarawa images probably also contributed to Roosevelt being returned to office in the November 1944 presidential election. Even so, his defeat of the Republican governor of New York, Thomas E. Dewey, was the closest of all his presidential victories. It was a victory that earned FDR an historic fourth term in the White House. Five months later, he would be dead.

Sadly, Roosevelt’s unvarnished truth approach to war news would not be adopted by future US administrations. By the time of the Vietnam War, inflated enemy body counts, glossy US military situation reports and unrealistic predictions had become the norm, and played a role in the shock experienced by the American nation when the US actually lost that war.

The Trump era has shown that there has never been a greater need for truth in American affairs. As Andrei Sakharov, father of the Soviet hydrogen bomb, dissident scientist, and Nobel Peace Prize winner, was to say: “The most powerful weapon in the world is not the bomb... it is truth.”

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179517 https://historynewsnetwork.org/article/179517 0
The Big Ideas History Syllabus  

 

This video essay articulates my personal Philosophy of History as it currently exists. In other words, my “Big Idea History Syllabus” expresses how I see and understand both the past and the study of the past. These ideas inform everything about how I teach, how I write, and how I live day-to-day.

 

Since his viral YouTube video “A Vision of Students Today” in 2007, I have been following Dr. Michael Wesch’s research, which often focuses on pedagogy. In particular, his “Big Idea Syllabus” for anthropology inspired my “Big Idea History Syllabus.” (As I teach a variety of subjects, I’ve also written a “Big Idea Writing Syllabus” and a “Big Idea Gender Studies Syllabus.”)

 

The core of the “Big Idea Syllabus” framework is to think beyond the specific curriculum and the learning objectives for any given class and instead to articulate points that actually matter in the big picture. Wesch urges educators to ask what big ideas can actually remain with students and help them embrace what he calls “the learning worth crying about.”

]]>
Mon, 22 Mar 2021 01:34:12 +0000 https://historynewsnetwork.org/article/179512 https://historynewsnetwork.org/article/179512 0
The Long History of Women Warriors Today the figuresfor the officer corps are significantly higher across almost all services. As of 2018, women represented 19% of the Army officer corps, 19% of the Navy’s, 21% of the Air Force’s, and 8% of the Marines’.   An important milestone occurred in 1976, when the first young women were allowed to enter the three service academies. I was privileged to teach the first group at the U.S. Military Academy at West Point and, in 1980, to witness the first female cadets graduate in 1980 and become second lieutenants.     A significant transformation in the roles women play in the military took place in December, 2015, when the Department of Defense opened to women combat roles across the services. Secretary of Defense Ashton Carter stated: “There will be no exceptions.” “They’ll be allowed to drive tanks, fire mortars and lead infantry soldiers into combat. They’ll be able to serve as Army Rangers and Green Berets, Navy SEALs, Marine Corps infantry, Air Force parajumpers and everything else that was previously open only to men.”   In that same year the Army opened its most challenging training course to women—Ranger School. Lieutenants Kristin Geist and Shaye Haver became the first women to graduate from the school,--a tough, 61-day course—the most demanding training I underwent in my 21-year Army career. As of April 2020, 50 women have graduated from the course.    Today women Army officers are commanding infantry and armor combat companies, indicating that they soon may be commanding combat battalions and larger Army units.    Recent archaeological discoveries and studies show that these current women warriors have actually a long pedigree. Women as warriors—or certainly hunters and not simply gatherers—have a long history reaching back thousands of years to pre-history.    In November of last year, researchers found that the remains of a 9,000-year-old hunter buried in the Andes mountains was a woman. The specialized tool/weapon kit at the burial site indicates she was a big game hunter.    This discovery encouraged the researchers to re-examine evidence from 107 other graves throughout the Americas from the same time period. Out of 26 graves with hunter tools, they were surprised to discover 10 contained women.    These discoveries challenge the traditional beliefs about gender roles in pre-recorded history: Men hunted and women gathered. The picture is now more mixed.    The richest body of literature and artifacts on women warriors in ancient Western history is found in ancient Greek history, and it deals with the mythical Amazons. Amanda Foreman, writing in the Smithsonian Magazine, (April, 2014) explains that the ancient Greek poet, Homer, writing in the 8th century BCE, was the first to mention these women warriors. In his “Iliad,” he mentions them briefly as Amazons “antianeiria,” a term translated variously as “antagonistic to men” or “the equal of men.” In any case, Homer made these women brave and stalwart military opponents to the Greek male military heroes, who of course always vanquished these women warriors.    Future Greek writers continued referencing the Amazons. For example, they supposedly fought for the Trojans in the Trojan War. Also, the demi-god Heracles completed his ninth labor by taking the magic girdle of the Amazon queen, Hippolyta.    Thus tales of the Amazons became inextricably intertwined with the rise of Athenian democracy which began in the 6th century BCE. 
In this century, images of Amazons battling Greeks spread; they appear not only on pottery but also on architectural friezes, jewelry, and household items.

Recent archaeological discoveries dating back to the 5th century BCE indicate that the Amazons were rooted in real equestrian, nomadic women of Eurasia—the Scythians. Adrienne Mayor, writing in “National Geographic History” (May/June 2020), states that the Greeks would have encountered these women in the 7th century BCE as they established colonies around the Black Sea.

Excavations of Scythian burial mounds began in the 1940s and revealed skeletons with spears, arrows, axes, and horses. Though originally identified as male, more recent DNA testing shows that some of the remains were women. About one-third of the Scythian women found in the burial sites had weapons, and their bones bear indications of combat: marred ribs, fractured skulls, and broken arms.

It is clear that the more egalitarian society we Americans continue to strive to create had an antecedent on the steppes of Eurasia.

Recommitting America to Historical Principles of Obligation to Veterans

Bonus Marchers camp at US Capitol, July 13, 1932

Veterans’ law is a dead letter. Twenty years after 9/11 and almost 50 years after the beginning of the All-Volunteer Force, the veterans’ benefits system needs reform. At stake are recruitment and retention, the pillars of the United States’ All-Volunteer Force and the heart of the common defense. Because reform must begin with first principles, the United States must once and for all upend the faulty premise that veterans’ benefits are mere gratuities.

 

Less than a generation after President Abraham Lincoln’s second inaugural address—in which he expressed the imperative “to care for him who shall have borne the battle,” the source of VA’s motto—the U.S. Supreme Court, in 1883, announced: “Pensions are the bounties of the government, which Congress has the right to give, withhold, distribute, or recall, at its discretion.” Courts have since transmuted this passage into the proposition that all veterans’ benefits are mere gratuities. This proposition serves as the foundational premise for veterans’ benefits administration and adjudication. History reveals why it’s wrong.

 

To be sure, the establishment of the Court of Appeals for Veterans Claims in 1988 (more precisely, its precedents) has gone a long way toward upending this premise. Still, with only 6% of veterans’ benefits claims appealed to the Court, the benefit to veterans at large remains in question. And although veterans’ benefits are very much at the discretion of Congress—which, after all, holds the lawmaking power—they are far from being mere gratuities. Veterans’ benefits have long had among their purposes to reward and encourage service. In addition, president after president has urged veterans’ care as the nation’s obligation.

 

George Washington, for one, learned the hard way while in command of the Continental Army that recruitment and retention depend on the nation’s promise of care to those who served. By April 1778, Washington had come to realize that “[soldiers] will not be persuaded to sacrifice all views of present interest, and encounter the numerous vicissitudes of War, in the defence of their Country, unless she will be generous enough on her part, to make a decent provision for their future support.” At war’s end in 1783, Washington bade farewell, noting “the obligations this Country is under, to that meritorious Class of veteran Non-commissioned Officers and Privates, who have been discharged for inability.” What Washington’s words conveyed was that veterans’ care was not only necessary to encourage service but also a matter of national obligation. Though notable among the leaders and lawmakers who have learned these lessons, he was not the first.

 

The tradition of state-funded veterans’ benefits in the United States traces to the time of Queen Elizabeth I and the defeat of the Spanish Armada. It was then that Parliament passed the first modern veterans’ benefits law, which left no mystery as to its purposes:

 

[S]uch as have . . . adventured their lives and lost their limbs or disabled their bodies, or shall hereafter adventure their lives, lose their limbs, or disable their bodies, in defence and service of Her Majesty and the State, should at their return be relieved and rewarded to the end that they may reap the fruit of their good deservings, and others may be encouraged to perform the like endeavors; Be it enacted . . . .

 

Providing veterans’ care to reward and encourage service continued into the New World. A law passed in the British colony of Virginia in 1624 provided “[t]hat at the beginning of July next the inhabitants of every corporation shall fall upon their adjoyning Salvages, as we did last year . . .”—that is, to make war on the neighboring natives—and that “[t]hose that shall be hurte upon service to be cured at the publique charge; in case any be lamed to be maintained by the country according to his person and quality.” Other colonies passed similar provisions.

 

In 1636, Plymouth Colony passed a law providing veterans’ benefits to encourage service against the Pequots: “[I]f any man shall be sent forth as a soldier and shall return maimed he shall be maintained competently by the Colony during his life.” As a law scholar observed, “This promise of relief was meant to encourage the colony’s soldiers against the Pequod Indians.” Similar laws sprang up in Maryland, New York, and Rhode Island, also to encourage service.

 

Over the course of the generation that followed the Revolutionary War, opinions of the war changed, and with them, opinions of the war’s veterans. Still, what endured was the idea of veterans’ care as a national obligation. By the time John Quincy Adams became the nation’s sixth president, he had much to say on the “duties” of the office, noting among them “the debt, rather of justice than gratitude, to the surviving warriors of the Revolutionary war,” and “the extension of the judicial administration of the Federal Government to those extensive and important members of the Union.”

 

In the era of national security, Franklin Delano Roosevelt (FDR) also reaffirmed the nation’s obligation to veterans. Ten years after he established the Board of Veterans’ Appeals, set up to make final decisions on veterans’ benefits claims, FDR delivered a radio address in 1943 on the mustering out of the veterans of World War Two: “May the Congress do its duty in this regard. The American people will insist on fulfilling this American obligation to the men and women in the armed forces who are winning this war for us.”

 

Other presidents have made similar statements over the course of U.S. history. The post-9/11 era is most notable: in it, each president, Democrat and Republican, has affirmed the nation’s obligation to veterans. These statements are important. After all, as political scientist Harold Lasswell urged (in the preface of the book What Washington Said: Administration Rhetoric and the Vietnam War 1949-1969): “[P]residential statements are political acts. When the president delivers a memorial eulogy in Arlington National Cemetery, he performs a ceremonial act of respect and gratitude, yet the political overtones are unmistakable. Clearly the nation’s power in the world arena is contingent on the support of those who put their lives on the line.”

 

With these political acts, presidents, Democrat and Republican, have spoken with one voice on the nation’s obligation to veterans, proclaiming, from this common ground, a common cause.

 

Notably, amid the United States’ divisions, on 9 February 2021, only a day after being sworn in as the new VA secretary, Denis McDonough noted an opportunity for unity, saying, “At this moment when our country must come together, caring for you – our country’s Veterans and your families – is a mission that can unite us all.” Presidents’ statements on the nation’s obligation to veterans show why he’s right.

 

Veterans’ care as a common cause follows from the concept of the common defense. Far from being only a government obligation, the imperative of veterans’ care is an obligation of the people. In the end, veterans’ care can unite us all, but only if we, the people, do the work.

The San Francisco School Board vs. Abraham Lincoln (High School)

The recent decision by the San Francisco School Board to rename forty-four schools has stirred up a nationwide controversy; most of the attention has justifiably focused on renaming Abraham Lincoln High School because the 16th president approved the execution of 38 Sioux men in Minnesota in 1862. That event is, however, far more complex and nuanced than the school board’s facile judgment suggests.

There had been massive theft of Sioux land in Minnesota despite treaties which guaranteed them ownership forever (a senator sympathetic to the Indians noted that “forever” in practice meant “until the white people want it”). Finally, a group of desperate and near-starving young Sioux raided an egg farm in southwestern Minnesota, killing five whites. Violence quickly escalated and hundreds of Sioux and white people were killed before the rebellion was crushed. A military commission promptly tried nearly 400 Sioux (42 defendants were tried in one day!), sentenced 303 to death, and demanded that Lincoln approve the sentences quickly to discourage further “savage” violence. Failure to do so, they warned, could make it “nearly impossible to prevent the indiscriminate massacre of all the Indians—old men, women, and children.” The president nonetheless ordered that no executions take place until he could personally review the records of all the convicted men.

Lincoln was appalled to discover that blatant bias, lack of reliable evidence and failure to provide counsel had all but guaranteed these convictions. Somehow, despite the crushing burdens of the war—the Confederates had just invaded Maryland and set their sights on Pennsylvania—he made the time to carefully review each conviction, reduced the number to be hanged to 38, and pardoned 265 of the prisoners. He also personally wrote out the name of each condemned man and instructed the Army telegraph operator to be extremely careful in spelling these unfamiliar names, since a careless error could result in hanging the wrong man. Several Minnesota officials later told the president that the widespread anger of the white population over the pardons would undermine Republican prospects in the 1864 presidential election; Lincoln responded, “I could not afford to hang men for votes.” He also told an Indian rights activist, “If I live, this accursed system shall be reformed.”

One might, not unreasonably, expect the school board of a major city like San Francisco to be somewhat more historically literate. But, perhaps Mark Twain was not far off the mark when he quipped, “In the first place, God made idiots. That was for practice. Then He made school boards.”

The Roundup Top Ten for March 12, 2021

The John Birch Society Never Left

by Rick Perlstein and Edward H. Miller

Journalists are calling for the Republicans to follow the lead of William F. Buckley and stand up to far-right extremists in their ranks. The problem is that neither Buckley nor the GOP of the 1960s did any such thing, instead perfecting the technique of speaking to two audiences. 

 

What the Election of Asian American GOP Women Means for the Party

by Jane Hong

The success of Asian-American Republican women candidates in Orange County suggests that the parties' efforts to appeal to a multiracial electorate must focus on the distinct histories and concerns of ethnic communities. 

 

 

The Coronavirus Killed the Gospel of Small Government

by Zachary D. Carter

Revisiting the work of Keynes highlights the fact that struggles to deal with the pandemic are not only public health failures but economic failures — an inability to marshal resources to solve a problem. 

 

 

Thucydides, Historical Solidarity, and Birth in the Pandemic

by Sarah Christine Teets

A classicist reflects on Thucydides' account of the Athenian plague, and concludes that the point of historical knowledge is to empathize, not to strategize. 

 

 

Lessons From All Democracies

by David Stasavage

The idea of the "torch" of democracy passing from one historical society to the present blinds us to understanding how popular sovereignty arises and why it's resilient. If we are concerned with protecting democracy, we must first understand it.

 

 

Tweeting To Find Community

by Varsha Venkatasubramanian

Don't fear Twitter, new historians. Use it for learning, networking, and fun.

 

 

On Shedding an Obsolete Past

by Andrew Bacevich

"Sadly, Joe Biden and his associates appear demonstrably incapable of exchanging the history that they know for a history on which our future may well depend. As a result, they will cling to an increasingly irrelevant past."

 

 

Socialite Mollie Moon Used Fashion Shows to Fund the Civil Rights Movement

by Tanisha C. Ford

Ebony Magazine's Fashion Fair offered a platform to Black designers while raising money for civil rights organizations – more than $60 million over a half-century. 

 

 

How Black Americans Used Portraits and Family Photographs to Defy Stereotypes

by Janette Greenwood

The author and her students researched and curated an exhibition of historical Black family portraits and discovered the way that photography served as a tool for rejecting stereotypes in an era of ascendant racism. 

 

 

We, the Nation, Born Under This Tree

by Sean Cleary

A speech by Edward Everett and a painting by N.C. Wyeth create a mythical founding moment of an American nation conceived as a white homeland.

 

The History Behind Demands for "Trial by Combat"

Depiction (ca 1540) of Trial by Combat in Augsburg in 1409

Trial by combat would seem to be a thing of the past, or something found in historical fiction like Sir Walter Scott’s Ivanhoe or the TV series Game of Thrones, where Tyrion Lannister demands a judicial duel to resolve a murder charge against him. But appeals for it still crop up occasionally in today’s news. Just last month, shortly before the January 6 assault on the Capitol, Rudy Giuliani told thousands of fired-up pro-Trump protestors that they should contest the election results via “trial by combat.” (Giuliani later claimed that he had merely been referring to Game of Thrones.) And as recently as January 2020, a Kansas lawyer called for “trial by combat” (with samurai swords) against his estranged wife and her attorney to resolve a child custody suit.

People today often have a hard time understanding how medieval society could have expected a judicial duel — often a fight to the death in criminal cases — to provide a fair conclusion to a legal dispute. A duel seems tantamount to a throw of the dice, or, worse, a thinly disguised form of the deeply flawed notion that might makes right. The fact that trial by combat, also known as “the judgment of God,” traditionally looked to heaven to assure a just and fair verdict only makes the whole thing seem even more preposterous to us today — although Giuliani’s appeal for it seemed to go down well with his crowd of Trump supporters.

The famous Carrouges-Le Gris affair of 1386, where a case of alleged rape was ultimately resolved by combat, was reputedly the last such combat ever ordered by the Parlement of Paris, although court-sanctioned duels continued to be fought in other parts of Europe long afterwards.  My book The Last Duel (Broadway, 2004) explores this very controversial case and is also the basis of the Ridley Scott film of the same title to be released in October.  Matt Damon, Adam Driver, and Jodie Comer play, respectively, the knight, the squire and the noblewoman caught up in the celebrated affair, which became the fourteenth-century equivalent of a high-profile celebrity scandal today.

In the BBC TV documentary (2006) also based on my book, historian Jonathan Sumption (at the time Queen’s Counsel and later a Justice of the Supreme Court of the United Kingdom) offered an explanation that may enable us to think our way back, at least a little bit, to a time when laws and beliefs were very different from today and the duel actually made a kind of sense to people:

They thought God supervised the world from moment to moment and intervened at any moment to achieve whatever result he thought desirable.  If you take that view of the world [and] of God, it’s not altogether irrational to suppose that if one man prevails over another in a duel, it’s because God caused it to happen.

 

The Body of Truth

A feature of judicial combat that may also help explain its appeal in the Middle Ages is the centrality of the body in medieval law as well as in chivalry, religion and other areas of life.  In legal confessions, for example, the body was seen as giving testimony against the accused, a view that was used in turn to justify torture as a means of extorting hidden truth.  The notion of hidden truth also supported the judicial duel.

Trial by combat, for all its military pageantry and obvious appeal as blood sport, was at its legal core a formal, sanctioned way to test an oath.   That is, each combatant solemnly swore in advance that he and only he was telling the truth — which clearly meant that one of the two had sworn falsely.  But which of the two?  It was the purpose of the duel to answer exactly that question.  The combat was a public and decisive way to test two opposed and mutually exclusive oaths — just as a jury trial (despite its many flaws) is the usual way today of testing two antagonistic claims, with words rather than swords.

The 1386 duel followed a time-honored tradition of putting legal claims to the test in the form of physical combat.  Just as a confession, even if extracted by torture, was thought to reveal a hidden truth, so the ferocious logic of the duel implied that proof was already latent in the bodies of the two combatants, and that the duel’s divinely assured outcome would reveal which man had sworn falsely and which had told the truth.  By definition, he who lost had lied, and the proof of the lie was his losing.

According to the Parlement’s summary of the 1386 case, Jean de Carrouges accused Jacques Le Gris of several crimes, but chiefly of rape.  If Le Gris refused to admit his guilt and could not be convicted by witness testimony or other evidence, Carrouges demanded a trial by combat in order “to show and prove” his accusations “by his body [de suo corpore] or that of his proxy.” Thus, Carrouges offered his own body in order to obtain proof through battle, a form of proof that required Le Gris’s body as well. 

Rules for Duels

To understand better the logic of the duel and its demand for bodily proof, let’s take a look at the traditional rules and regulations of trial by combat. These are contained in a type of document known as a formulaire. The formulaire still in force at the time of the Carrouges-Le Gris case had been issued eighty years earlier, in 1306, by King Philip IV. Its full title, translated, is Ceremonies for the Wager of Battle according to the Constitutions of Good King Philip of France, but for convenience we may simply call it “Rules for Duels.”

The Rules for Duels set out four conditions that must be met before a trial by combat may be authorized: (1) the crime must be certain to have occurred; (2) it must be a capital offense such as murder, rape, or treason; (3) the accused person must be widely suspected of the crime; and (4) all other legal remedies must have been exhausted, with armed combat — “proof by one’s body” — as the only means of conviction left.

Carrouges’s brief, as we have seen, expressly offers his own body as the means of proving his claim.

The rest of the 1306 decree specifies the elaborate rules and procedures of the duel, from the initial appeal and the formal challenge to the ceremonies at the field of battle that precede the combat itself. A key part of the battlefield ritual is the presentation of the scrolls, which takes place right after the two armored and mounted combatants identify themselves and enter the field.

 

Presentation of Scrolls, from Cérémonies des Gages. Courtesy Bibliothèque nationale de France

 

The scrolls, prepared in advance by lawyers, contain a summary of the two combatants’ charges — that is, why they have come to fight.  In one image, they sit their horses and brandish their scrolls like weapons!  The image perfectly captures the shift in the progress of their case from legal argument to mortal combat, reminding us that the duel, a legal last resort, was still an integral part of the law, unlike the private and illegal duels of later times. 

Another key ceremony is the series of three solemn oaths that each combatant must swear prior to combat, affirming the truth and justice of his charges.  Their mutually contradictory oaths are embodied by the staging of the second and third oaths in particular, where the two armed opponents kneel face to face, a sign of their antagonism.  With his right hand bared, each man also touches the holy objects on the altar — a massbook or a copy of the gospels, along with a crucifix and possibly saints’ relics as well to make the oath even more binding. 

The oath seals a mutually destructive contract to fight it out.  In speaking the words of the second oath — and all of the oaths are as heavily scripted as a liturgy — each man swears that his cause is just, that he speaks the truth on peril of his soul, and that he would give up the joys of heaven for the pains of hell if in fact he swears untruthfully.  In concluding this oath, each combatant vows that he places his sole reliance “on God and on my right, by my body [mon corps], by my horse, and my arms.” Again, the body is the means of proof — including the horse’s body!

A Sinister Oath

In some formulaires, the third oath includes an emphatic gesture.  Still kneeling face to face, the two combatants now reach across the altar and clasp their bared hands — using the left hand, not the right, in order to signify a hostile rather than a friendly oath. The marshal also holds their joined hands on his open palm, binding the two men together and to himself, the master of ceremonies (and the king’s representative) in the ritual of the duel.  Then the two combatants swear their final oath, beginning with the appellant.  What Jean de Carrouges would have said goes something like this:

“O thou, Jacques Le Gris, whom I hold by the hand, I swear on the Holy Gospels and by the faith and baptism that I hold from God that the deeds and words that I have attributed and made others attribute to you are true, and I have a good and true cause to summon you, while yours is evil.”

The fact that the accuser holds his opponent “by the hand” is contractual; it seals the solemn oath.  But it also symbolizes the bodily nature of the proof that the two men are about to provide on the field of battle, the physical combat to which the oath binds each of them.

This third and especially fierce oath, like the rest of the ceremonies, is heavily scripted, and yet we can imagine the personal feeling, the deadly hostility, that Carrouges and Le Gris also brought to the occasion.  One man was determined to defend his wife’s honor and to avenge the crime against her, while the other was equally determined to disprove the charges and remove the terrible stain on his name and reputation.

According to the ruthless logic of the judicial duel, or “the judgment of God,” one of the combatants must be guilty, while by the same token the other must be innocent. At the start of the duel, innocence and guilt remain hidden, but they will become perfectly clear by the end. Both men will fight, but only one will prevail. One man will leave the field victorious, purified by the ordeal, while the other will be defeated, or lie dead, his vanquished or lifeless body the proof of his guilt before all.

 

Today a call for trial by combat may seem like crazy talk or a sign of sheer desperation with the legal system.  But it may also hint at an old and deep-seated desire for a clear, definitive verdict in tangled human affairs where the truth proves elusive.  How satisfying it would be, some may think, to put the truth to the test of combat!  Indeed, traces of an older, more violent way of settling quarrels are still dormant in our speech today: the accused is known as the “defendant,” and we sometimes speak of “crossing swords” or “championing a cause,” a champion being originally a person who fights in another’s place (as in Tyrion Lannister’s case).  And while few of us may wish to revive trial by combat, its allure will persist as long as some dissatisfied plaintiffs call for the verbal jousting of the law courts to revert to the real thing — or when partisans rejecting the results of a fair and free election insist on settling the matter by violence in the public arena.

Black Votes Have Always Mattered  

The fear of black power drives today’s right-wing politics. Since Barack Obama’s election, a virulent stream of racism has coursed through the Republican Party, stimulating the birther campaign that catapulted Donald Trump to prominence. More recently, it drove the paranoia about “fraud” in Philadelphia, Atlanta, and Detroit.

 

There is nothing new to this antagonism. Anxiety about black voting is as old as the Republic, as are repeated attempts to exclude those voters. 

 

During and after Reconstruction, formerly enslaved “freedmen” elected two thousand of their own to office in the former Confederacy. After 1890, however, Jim Crow Democrats succeeded in excluding all but a handful. Not until 1965 did the Voting Rights Act finally overturn racial disfranchisement in the South, but this victory for democracy proved as temporary as Reconstruction. Since 2000, Republicans committed to the New Jim Crow have used new tools to disfranchise African Americans: voter identification laws, purges of voter rolls, closures of polling places, and other underhanded tactics.

 

Most of the above practices should be familiar to those who follow politics and history. Far less documented are the stories of the free black men who voted from the Founding onward, although in 1857 Abraham Lincoln cited those facts to denounce the Dred Scott ruling's claim that persons of African descent had never been citizens and had no voting rights.

 

Before the Revolution, most colonies allowed only Protestant men to vote and hold office. Religious qualifications were voided after 1775, but the Constitution gave the states control over voting, and there was great variation. This lack of consistency gave rise to many myths, including:

 

Only men could vote. From 1776 to 1807, New Jersey’s constitution enfranchised all “inhabitants” meeting a property requirement, including unmarried women.

 

Only men with property could vote. Some states retained the English requirement of a “forty pound freehold,” but five of the original thirteen authorized taxpayer suffrage in some form, and Vermont’s 1786 constitution enfranchised “all freemen.” As universal male suffrage became common, Rhode Island kept its property qualification until 1842 (maintaining it for naturalized citizens), and Virginia retained its own until 1850.

 

Only white men could vote. Of the original thirteen, only Virginia, South Carolina, and Georgia put a racial qualifier in their constitutions. When South Carolina tried inserting “white” into the Articles of Confederation’s definition of citizenship in 1778, it was overwhelmingly rebuffed. Certainly, after 1790 most new states legislated “white suffrage,” and some older ones moved to disfranchise. The exclusion of black voters culminated with Tennessee, North Carolina, and Pennsylvania in the 1830s. It is hardly the whole story, however.

 

In the 1790s and early 1800s, free black men voted and were courted by the likes of John Hancock in Massachusetts and Aaron Burr in New York. Up to the Civil War, Southern newspapers and Northern Democrats expressed their rage over black voting power by constantly attacking black people in New England cities where they held office and gained patronage, including Portland, Boston, Providence, and New Bedford.

 

Historians have emphasized how African Americans were disfranchised everywhere in antebellum America except Maine, New Hampshire, Vermont, and Massachusetts, without noting that they actually regained the vote in other key states.

 

In Ohio, an 1831 Supreme Court decision specified that any man “preponderantly white” was legally Caucasian. The Black Republican John Mercer Langston (Oberlin’s elected town clerk) clarified: “anybody that will take the responsibility of swearing that he is more than half-white, shall vote. We do not care how black he is.”

 

In Rhode Island, “Law and Order” conservatives re-enfranchised the state’s substantial black electorate in 1842 to hold off a populist insurgency demanding universal white male suffrage. In New York, black men allied themselves to the Whig and later Republican parties led by William Seward, and worked to meet the property requirement, with as many as eleven thousand voting by the 1850s. And from 1855 to 1860, Democrats excoriated Ohio’s Salmon P. Chase as a “Negro Governor,” whose winning margin came from black votes.

 

The controversy over black voting culminated in 1860. On election eve, the leading Democratic newspaper, the New York Herald, proclaimed that given the Dred Scott decision, the Democrats controlling Congress should “throw out the fourteen thousand negro votes in Ohio,” depriving Lincoln of an Electoral College majority, so that Congress could pick the president.

 

Uncovering the true history of black voting tells us something important. Since the Revolution, we have been fighting over who is an American, and whether or not real Americans were and are white, with everyone else a guest or second-class. Forget myths of inevitable progress; that fight is hardly over, nor is its result preordained. It helps to know that at the Founding, in Lincoln’s words, “free negroes were voters, and, in proportion to their numbers, had the same part in making the Constitution that the white people had.”

Four Things You (Probably) Don't Know about the Werewolves of the Ancient World

German Woodcut, ca. 1722

1. Yes, the ancient world did know about werewolves!

The modern conception of the werewolf and his lore (it usually is his) is mainly conveyed by the movies, in an unbroken chain beginning with Universal Pictures’ The Wolf Man of 1941, starring Lon Chaney Jr., and continuing today in the rather good Underworld series. The roots of these lie in the pulp fiction of the early twentieth century. Amongst all this, werewolf aficionados usually single out one novel as retaining some literary merit, Guy Endore’s 1933 The Werewolf of Paris.

But what about before this? Many may instinctively imagine roots further back in the medieval world, and that is right enough: the twelfth century AD is witness to a remarkable flowering of werewolf tales from England, France and the Viking realms of Iceland and Scandinavia. However, long before even this, werewolves were stalking the writings of the Greeks and the Romans between the fifth century BC (arguably earlier) and the fifth century AD. Striking stories and claims about them are found in such authors as Herodotus, Pliny (the Elder) and Pausanias, but the best ancient werewolf story is a Latin one preserved by Petronius in his bawdy Satyricon (AD 66).

Petronius tells how, somewhere in Campania, one Niceros had been making a night-time trip to visit his girlfriend, Melissa, the wife of Terentius the innkeeper, and had brought a soldier-friend along for the journey. At a certain point on the grave-lined road, the soldier stopped to pee against one of the tombstones. Niceros suddenly noticed that the soldier had taken all his clothes off and piled them up, and was peeing a circle on the ground around them. Then he turned into a wolf, howled, and ran off into the woods. Approaching the clothes, Niceros discovered they had been magically turned to stone. He continued in terror to Melissa’s home, where he learned that a wolf had just got in amongst their sheep and had been slaughtering them, but it had run off after a family slave had put a spear through its neck. As Niceros made his way back home the next day, he found the clothes were gone from the tomb, but in their place was a bloodstain, and on arrival he found the soldier abed, with a doctor treating him for a neck wound. At that point (and only at that point – oddly) he realised that the soldier was a werewolf, and refused ever to break bread with him again.

 

2.  The ‘werewolf gap’: it’s all about the folklore

Between St Augustine (c. AD 400) and the twelfth-century flowering of werewolf stories we hear nothing about them. What are we to make of this gap of some seven centuries? A simple explanation might be that the authors of the twelfth century rediscovered the long-forgotten werewolves in their ancient texts and just chose to start writing about them again. Such an explanation might initially seem to be favoured by the fact that Marie de France’s Anglo-Norman werewolf poem Bisclavret of AD 1160-78, for example, has much in common with Petronius’ story: signally, we find the recurring theme of the werewolf’s need to keep his clothes safe if he is to be able to recover his human form, with Bisclavret hiding his clothes under a rock when it is time for him to transform. However, it is unlikely that Marie had direct access to Petronius’ story…

The more interesting and intriguing possibility is that werewolves just went underground, as it were, and continued to thrive under the radar in the realm of folklore and folktale throughout these centuries, only to resurface into the world of fine literature again in the twelfth. And this is almost certainly what happened. A clue to this is to be found in what is a central theme of Marie de France’s tale, and the tales of other writers of her age: that of the adulterous wife.

When Bisclavret’s wife learns that he is a werewolf, she makes him reveal where he hides his clothes whilst under transformation, and accordingly steals them and makes off with them with the help of her lover, with whom she then elopes, leaving Bisclavret stranded as a wolf for many years before he is able to take his revenge on the pair and recover his human form. When we look back at Petronius’ tale we can see that the motif of an adulterous wife is already lurking in it in an incidental detail of which nothing is made: Melissa is conducting an adulterous affair with Niceros. There is no obvious reason why Marie and the writers of her time should have seized upon this incidental detail and elaborated it so greatly – even if they did, after all, have access to Petronius’ text. It is much more likely that, as an artful writer, Petronius had included the incidental detail of the adulterous wife in order to allude to another, related werewolf story he was familiar with but was not on this occasion telling. It will then have been upon this second story, preserved in folklore alone for a millennium, that Marie and her contemporary writers were eventually to seize.

 

3. The ancient world’s werewolves were already, pretty much, our werewolves

If one were to ask a child of today about the typical features of a werewolf story, he or she would probably reply along the following lines: (1) a werewolf is a man that transforms into a wolf for certain, limited, periods of time before turning back into his human form; (2) these times coincide with the full moon; (3) he changes in this way because he is under a curse – ‘the curse of the werewolf’; (4) in his wolf state he attacks and butchers humans; (5) he is best killed by a silver bullet.

All of these features can be found amongst the ancient lore of werewolves, with the unsurprising exception of the silver bullet, the ancient world knowing nothing of bullets, of course. This last well-loved motif seems to appear first in Jeremy Ellis’ pulp-fiction story ‘Silver Bullets’ of 1930. (1) The notion that a man might transform into a wolf in a cyclical way is implicit in Petronius and explicit in a Greek tale of Aesop, in which a thief pretends to be a werewolf in order to steal a cloak: he claims that he turns into one whenever he yawns three times. (2) The notion that a werewolf might change at the full moon is strongly hinted at in Petronius’ story, in which we are told that ‘the moon was shining like the midday sun.’ (3) As to the curse, Aesop’s bogus werewolf implies that he is under a divine curse. In Herodotus and the Latin poets of the Augustan age we find wizards and witches with the ability to turn people into wolves, though they tend to use their powers not so much to curse others as to transform themselves, the more easily to pursue their nefarious nocturnal activities. (4) And as to butchering human victims, a curious werewolf described by Pausanias, the so-called Hero of Temesa – described as a ghost in a wolfskin – is said to have devoured the city’s most beautiful girl every year. When young men transformed into wolves for a certain period in a mysterious Arcadian ritual associated with Zeus Lycaeus (‘Wolfy Zeus’), they could only turn back if they had – against their instincts, as it were – abstained from the devouring of human flesh in the meantime. And Aesop’s thief separates his victim from his cloak by pretending that he is about to eat him.

 

4. ‘Lycanthropy’ is an ancient Greek word… but the Greeks didn’t have a word for ‘werewolf’

Our posh word for werewolf, ‘lycanthrope,’ together with the related abstract term ‘lycanthropy,’ is derived from the ancient Greek lykanthrōpos, made up of the elements lykos, ‘wolf’ and anthrōpos, ‘human.’ However, in antiquity, these words, first found in the second-century AD writings of Marcellus Sidetes, were severely restricted in their use. They didn’t designate werewolfism proper, but rather just a medical condition that was metaphorically conceptualised in terms of it. The condition seems to have been a sort of severe depression or mental withdrawal, the sufferers of which were said to hang around tombs listlessly. When the Greeks wanted to talk about werewolves proper, they would simply deploy phrases of the sort ‘turning into a wolf.’ It isn’t until the early-ninth-century AD Byzantine chronicle of Theophanes the Confessor that we first find ‘lycanthrope’ applied indisputably to a werewolf proper. It could be said that the Romans too had no word for ‘werewolf.’ The word Petronius and others deploy is versipellis, which, literally, has the broader meaning of ‘skin-shifter.’ However, one gets the sense that ‘werewolf’ was, nonetheless, a privileged significance of the term. 

Whilst on the subject of language, we might say something of the origins of our own word ‘werewolf.’ It is commonly construed, quite understandably, and by a process of subtraction, as it were, to signify ‘man-wolf.’ An assumption of this sort underpins many derived coinages, including, in recent times, that of Wallace and Gromit’s delightful ‘Were-Rabbit.’ The understanding goes back a long way, all the way indeed to the turn of the thirteenth century and Gervase of Tilbury. In his Otia imperialia of AD 1210-14, Gervase, in speaking of the word in its Anglo-Saxon form, werewulf, derives the were- element from the Latin vir, ‘man.’ In recent years linguistic historians have related the first element rather to the Anglo-Saxon w(e)arg and the Old Norse vargr, the primary significances of which seem to have been ‘strangler,’ ‘outlaw’ and ‘outsider.’ However, there are vestigial indications in both languages that these words may always have borne a secondary significance of ‘wolf’ in their own right. And so it seems that in origin ‘werewolf’ may have signified ‘wolf-wolf.’ Perhaps, to get the logic of it, one must think oneself back into the situations in which a werewolf is first revealed to be such: ‘That man is a wolf I tell you, a very wolf!’

When Did America Stop Being Great?

My formative experience of America came during this country’s great summertime of resurgence. It was 1984, I was sixteen years old, and I had flown into Los Angeles on the eve of the Olympics. For the next six weeks, I watched, wide-eyed, as the long national nightmare of Vietnam, Watergate and the Iranian hostage crisis was brought to an end by a modern-day gold rush.

 

A multi-racial team of US athletes, led by the likes of Carl Lewis, Mary Lou Retton, Michael Jordan and Greg Louganis, completely dominated the medal table. Team USA even performed well in some of the more obscure events - a calorific boon for customers of McDonald’s, which ran a scratch-card promotion, planned presumably before the Soviet boycott, offering Big Macs, fries and Cokes when Americans won gold, silver or bronze. With the thumping chant of “USA, USA” echoing from coast to coast, it was hard, even as a visiting outsider, not to be swept up in this torrent of patriotism.

 

Later that year, of course, Ronald Reagan surfed this red, white and blue wave to a second term in the White House, winning 49 of 50 states. For millions of his supporters, many of them lifelong Democrats, truly it felt like morning again in America, the sunny slogan of his re-election campaign.

 

My new book, When America Stopped Being Great: A History of the Present, began as a quest for understanding. How had the United States gone from the self-confidence and swagger I experienced in the Reagan years to the American carnage of Donald Trump’s dystopian inaugural address? What had turned this country into a place of such chronic disunion, shared land occupied by warring political tribes? 

 

Then, as I was writing it, further questions arose, which came under the same rubric. Why was this superpower so vulnerable to the viral onslaught of COVID-19?  How had we arrived at the point where an insurrectionary mob could storm the US Capitol, violently seeking to overturn a presidential election in which Joe Biden had so obviously emerged the victor?

 

Like the unexpected victory of Donald Trump four years earlier, the botched response to the coronavirus outbreak and the brazen attack on the US democracy were culminating moments. They could not be written off as historical accidents or aberrations. Arguably, they had become historically inescapable.

 

How had this come to pass?

 

In locating the origins of this troubled present, we could reach back to the earliest days of the new Republic. “1776,” chanted the mob of MAGA diehards who invaded Capitol Hill, fervently believing they were acting in the spirit of the Revolutionary War. We could revisit the Constitutional Convention and the deliberations that produced the Electoral College, a relic of the Eighteenth Century that could never be described as the Founding Fathers’ finest work.

 

To understand America’s inherent contradictions, we could consider how the author of the Declaration of Independence could write that “all men are created equal” while also penning a pseudo-scientific treatise outlining what Thomas Jefferson believed was the biological inferiority of slaves.  Or we could journey to the battlefields of the American Civil War – Fort Sumter, Antietam, Manassas, Gettysburg – to be reminded of how division has long been this country’s default setting.

 

Instead, however, I have retraced the steps of my own American journeys: as an impressionable teenager during the Reagan era; as a student in the Nineties conducting research into the struggle for Black equality; as a fresh-faced foreign correspondent dispatched to Washington to cover the impeachment of Bill Clinton. Witnessing his Senate trial, I felt sure it would be a once-in-a-lifetime event. But afterwards those kinds of mega-stories came thick and fast: the disputed 2000 election, the attacks of September 11th, the war in Iraq, the Great Recession, the election of Barack Obama and the presidency of Donald Trump, with its back-to-back impeachments.

 

Part history, part memoir, my book describes the role of each successive president in paving the way for Donald Trump – and, yes, they all contributed to his rise. Reagan, who was the first commander-in-chief since Dwight D. Eisenhower to complete two full terms, elevated the presidency while at the same time dumbing it down.

 

After the showmanship of The Gipper, George Herbert Walker Bush demonstrated the value of a less theatrical presidency, but this moderate Republican failed to halt the rightward lurch of the conservative movement. Radicals, led by Newt Gingrich, snuffed out his thousand points of light. The Baby Boomers, who had cut their political teeth during the culture wars of the Sixties, usurped the Greatest Generation, whose belief in patriotic bipartisanship was forged during the Second World War.

 

Bill Clinton may indeed have built a bridge to the 21st Century, but for much of Middle America it felt more like a bypass. And though he presided over a period of peace and prosperity, the Nineties were pregnant with so many of the problems encountered in the new millennium: the financial meltdown of the subprime crisis, the unchecked power of Big Tech, the problem of mass incarceration.

 

George W. Bush, by pursuing his war on terror in such a polarizing way, failed to seize the opportunity presented by the calamity of 9/11 to reunify an ever more fractious nation. Like his father, he also failed to steer the conservative movement in a more compassionate direction.

 

Barack Obama helped stave off a financial meltdown when he took office in the midst of the Great Recession, but over his eight years he struggled to soothe the fears of blue-collar Americans who felt like castaways in a globalized and digitized economy. His presence in the White House, rather than closing the country’s racial rift, fuelled the rise of white nationalism and the presidential candidacy of the untitled leader of the birther movement.

 

And there are so many more milestones and waystations on the path to polarization.

 

The political success of Donald Trump should not have taken us all by such surprise. So many trend-lines – political, economic, racial, cultural, spiritual and technological – converged and culminated in his presidency. As the 2020 election underscored – a contest, remember, in which he won 25 states and amassed more than 74 million votes – his presidency was not some American aberration. He became the figurehead for much of this country, and remains so even after his role in inciting his moshpit of MAGA diehards on January 6th.

 

Just a few weeks ago, I was on the inaugural stand in Washington, just as I had been four years earlier, and listened to America’s 46th president, Joe Biden, make his plea for national healing. “We must end this uncivil war,” he said, in one of the more searing lines of his speech. Alas, the disturbing conclusion I reach in When America Stopped Being Great is that genuine national unification may now be impossible to achieve. The United States is riven with so many unbridgeable divides. Its very name has become a misnomer.

 

Travelling this vast land, I struggle to identify where politically, philosophically or spiritually it will find common ground. Not in the guns debate. Not in the abortion debate. Not in the healthcare debate. Not at weddings, where more than a third of Republicans and almost half of Democrats say they would be unhappy if their children married a partner from the other party, compared to 5 percent in 1960. Not in the singing of the national anthem at American football games.  Not in the debate over the country’s history, and how it should be memorialized.

 

Few, if any, national events are politically benign, ideologically neutral or detached from the culture wars. No longer are there demilitarized zones in US politics. It seems that everything is contested. Even the most rudimentary of facts. Even the simplest of protections, like the facemask. Even the most clear-cut of presidential elections.

 

After talking so much this century about the emergence of a post-America world, I fear we are living in a post-America America. The land that I fell in love with during that summertime of resurgence has entered a bleaker season.

Does Ridiculing Q Followers Fuel the Fire? Historical Lessons in Applied Social Science

The recent Senate acquittal of former president Donald Trump renewed attention on QAnon followers, with reports that some followers are taking the trial result as evidence of the correctness of their beliefs, galvanizing their faith in “the plan.” Prior to the acquittal vote, other QAnon followers had soured on the group, particularly after the failed prophecy on January 6 that Trump would remain president.

 

Families are also experiencing hard feelings and tensions associated with QAnon, as people struggle to relate with family members who are strong believers.

 

Whether a follower decides to leave or stay, outsiders and loved ones would do well to resist the urge to ridicule them. Deprogramming cultish beliefs is difficult under the best of circumstances. Shaming and making fun of adherents may make you feel good, but it won’t help you achieve the more important goal of getting your loved ones back. Social science helps us understand why.

 

I wrote earlier about the ways devotees to a cause can respond in the wake of a failed prediction. While there are acute differences between the Seekers cult and QAnon followers, particularly when it comes to their post-disconfirmation behavior thus far, we’d be naive to not learn from the allegiances of past failed prophecies.

 

The Seekers example dates back to 1954, when a doomsday group believed the world would end in a catastrophic flood with only the believers in the group being spared. Social psychologist Leon Festinger chronicled the group’s actions in the months leading up to and after the prophesied day of destruction.

 

What was remarkable about the Seekers (and which confirmed Festinger’s famous theory of cognitive dissonance) was that when faced with knowledge of the failed prediction—there was no flood—believers not only doubled down on their belief, but immediately began to proselytize. They became desperate to win new converts.

 

This may resemble what we’re seeing with QAnon followers post-acquittal, but by comparison, it’s hard to say what they’re currently thinking. Perhaps some are laying low and biding their time. Perhaps others have abandoned the group (as did some of the Seekers). Some may be experiencing a restored sense of hope given the most recent failed impeachment of Trump, but we haven’t yet seen evidence of increased proselytizing by QAnon adherents.

 

Both groups experienced a failed prediction on a massive scale, a prediction in which both groups’ followers believed strongly and to which they demonstrated fierce commitment. Seekers did so by giving up their jobs and handing over their possessions and savings, while QAnon followers risked exposure and prosecution on January 6 by rioting in the nation’s capital. But what determines whether QAnon followers will increase their proselytizing?

 

Inspired by Festinger’s test of his theory of cognitive dissonance, scientists Jane Hardyck and Marcia Braden published a report of a doomsday group they observed in 1962 called the Church of the True Word. The group also faced a disconfirmed prophecy that the world would soon end in nuclear war. In commitment to their belief, the group had built underground fallout shelters and stocked them with canned and dehydrated food, large jugs of water, and power generators. On word that the group’s prophet had received a divine message, having already been inspired by certain passages of the book of Revelation, 135 followers headed into their shelters.

 

Of those who went in, 103 remained for a full 42 days and nights, until the faithful received word to come out. Those who’d stayed inside claimed that their stay had strengthened their faith and increased their belief in the work of the group. While the Seekers reframed their own failed prediction as God having spared the world, the True Word group reacted in the same vein by claiming God was testing their faith. In other words, some members of both groups clung to their beliefs.

 

However, whereas the Seekers desperately began to search for new converts, the True Word group did not. What made the difference? The observations of scientists Hardyck and Braden suggested one possibility: news coverage of the Seekers was largely disdainful and ridiculing. By contrast, the True Word followers received very little ridicule from news outlets or local townspeople. In fact, the town mayor was quoted as explicitly calling for no one to ridicule the group for their beliefs.

 

It was ridicule that backed the Seekers into a corner from which relief could come only by attempting to find other people who believed the same. Because the True Word followers weren’t ridiculed, they were not pushed to the point where they felt no recourse but to proselytize. Herein lies the lesson for us.

 

Your loved one’s next move may depend on the amount of ridicule they experience. It may be tempting to scoff at the group, but in the end it may be counterproductive. If we really want to shrink QAnon, or at least curb its outreach efforts and heal our families, we have to recall the differences in treatment given to followers of the failed prophecies of the past. That’s not to say people who broke the law or engaged in other antisocial behavior shouldn’t be held accountable, but history does have a way of repeating itself.

Legendary Director Agnieszka Holland and Screenwriter Andrea Chalupa on the Ukrainian Famine and Their Film "Mr. Jones"

Director Agnieszka Holland (l) and writer Andrea Chalupa (r)

Director Agnieszka Holland is celebrated for her career in filmmaking and screenwriting and for her political activism in Poland. Among her achievements as a filmmaker, she collaborated on the screenplay adaptation of Andrzej Wajda's Danton (1983), then directed Angry Harvest (1985), which was nominated for an Academy Award for Best Foreign Language Film. In 1992, she earned even greater international acclaim, including a Golden Globe Award and an Oscar nomination for Best Adapted Screenplay for Europa Europa, based on the true story of a young boy who joins the Hitler Youth to hide his Jewish identity. In 2010, Holland was nominated for an Emmy for Outstanding Directing for a Drama Series for her work on HBO's Treme (2010). A year later, her feature film, In Darkness, was nominated for an Academy Award for Best Foreign Language Film. In 2017 she received the Alfred Bauer Prize (Silver Bear) for her film Spoor at the Berlin International Film Festival. And, in 2020, she was elected President of the European Film Academy.

Andrea Chalupa, the screenwriter of Mr. Jones, is a journalist and the author of Orwell and The Refugees: The Untold Story of Animal Farm. She has written for TIME, The Atlantic, The Daily Beast, and Forbes. She has spoken widely on Ukrainian affairs and is a founder of DigitalMaidan, an online movement that in recent years made the Ukrainian protests the top trending topic on Twitter worldwide. She also hosts the Gaslit Nation podcast, in which she focuses on authoritarianism at home and abroad. Her expertise includes Ukrainian language, history and politics.

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work has also appeared at BillMoyers.com, Salon.com, Writer’s Chronicle, Re-Markings, Crosscut, Documentary, NW Lawyer, Real Change, Huffington Post, and more. He has a special interest in the history of human rights and conflict. He can be reached by email: robinlindley@gmail.com.

 

Starvation. A protracted, agonizing way to die. As described by Professor Anne Applebaum in her book Red Famine: Stalin’s War on Ukraine (2017), starvation follows a set course once the human body is deprived of food. The body initially consumes its stores of glucose as one grows hungry and thinks constantly of food. In the next few weeks, the body consumes its fats and weakens dramatically. Then, the body cannibalizes its tissues and muscles, and the skin thins, the eyes distend, and the legs and belly swell as chemical imbalances result in water retention. Even small efforts cause exhaustion. As the vital organs fail, infections or illnesses such as pneumonia, typhus, diphtheria, and others may hasten death.

Millions of Ukrainians died in this manner during the horrific famine of 1932-33. The Soviet government under Stalin engineered this genocidal atrocity through policies that killed mostly poor farmers and their families, while the Soviet secret police eliminated Ukrainian leaders and intellectuals. Ukrainians refer to this intentional famine as the Holodomor—meaning “death by hunger” in Ukrainian. Ukrainians were denied access to grain that was shipped out of the region. Men, women, and children were reduced to eating weeds, tree bark, wall paint, and the corpses of animals. There were also many incidents of cannibalism.

The Soviet crackdown was a response to Ukrainian resistance to collectivization of farms and other Stalinist policies. Casualty figures range from three million to an astounding fourteen million deaths. In Red Famine, Professor Applebaum contends that at least three million Ukrainians died because the Soviet state deliberately planned to kill them in the Holodomor.

The West knew almost nothing of this mass campaign to destroy the people of Ukraine. A Welsh journalist, Gareth Jones, secretly and courageously gained access to restricted, famine-plagued regions of Ukraine and reported back to the Western press on the widespread catastrophe. Jones’s reports of the famine shocked readers, but the stories were undermined by Soviet propaganda denying his accounts, as well as by Western journalists who reported uncritically on the Soviet government to gain Stalin’s favor.

In her recent feature film Mr. Jones, legendary Polish film director Agnieszka Holland depicts the Ukrainian famine through the perspective of the reporter Gareth Jones. The film captures the tenacity and bravery of Jones and the array of forces pitted against him to keep the brutal reality of the famine from escaping the boundaries of Ukraine. The film is based on archival research, diaries, survivor accounts, and other material, much of it uncovered by Andrea Chalupa, the screenwriter for Mr. Jones and an expert on Ukrainian history and politics.

Ms. Holland and Ms. Chalupa graciously responded to a series of questions about the making of Mr. Jones, the history of the Ukrainian famine, their research process, and more.

 

Robin Lindley: How did you both work together on this cinematic historical opus?

Andrea Chalupa: Great! It was very easy to work on the script with Agnieszka. We both seemed to be on the same page about most things the entire time. I sent her the script and met with her by phone in August 2015, and she agreed to direct the film in September 2015. Then we were off and running. It took us about three years to raise financing and cast the film. 

Agnieszka Holland: We have always worked well together. Andrea wrote the script by herself, and I started to do my own research and then participated in the consecutive versions of the script. I had my own extensive knowledge of Holodomor history, as I had read Timothy Snyder’s Bloodlands and several other books about the Holodomor, the Stalinist politics of collectivization, and Stalin’s other crimes.

Robin Lindley: I sense that most Americans (including me) know little about the Ukrainian famine, yet this crime against humanity was one of the greatest atrocities in history. How would you briefly introduce this horrific history to readers?

Andrea Chalupa: The Holodomor, the Ukrainian word for death by hunger, is Stalin’s genocidal famine that killed millions of people, the vast majority in Ukraine. 

Agnieszka Holland: I’ve always felt it an injustice that there is a universal gap in knowledge about communist and particularly Stalinist crimes. Even where some facts entered the consciousness of the public during the Cold War, or after the publication of Solzhenitsyn’s Gulag Archipelago, they have since been forgotten and forgiven. And in Russia itself, where in every family you can find a victim of Stalin’s crimes, the memory was washed out, and the majority of Russians consider the murderous Stalin to be the greatest leader in Russian history. That is unjust toward the victims and dangerous, because it leads to misunderstanding the nature of totalitarian regimes. We cannot fully understand the present and hope for a healthy future if we neglect the most important lessons of the past. And it is most important to understand the past to understand the current Ukrainian and world situations. 

Robin Lindley: Was the great Ukrainian famine the result of poor Soviet policy with agricultural collectivization or was it a deliberate genocidal war on the people of Ukraine engineered by Stalin? If the latter, why would Stalin want to eliminate Ukrainians?

Andrea Chalupa: The Holodomor was genocide. I wrote and directed a short documentary featuring historians Anne Applebaum, Timothy Snyder, Norman Naimark, Serhii Plokhii, Frank Sysyn, and Alexander Motyl discussing the all-out assault on Ukrainian national identity that accompanied the Soviet-engineered famine. I’ve also interviewed and watched video testimony of survivors describing how their homes were searched by soldiers who confiscated the food they had hidden. One woman described to me how soldiers came and took away the pot of water she was boiling over a fire, filled with the twigs and leaves she was planning to eat since there was no food left. So not only was Ukraine’s grain seized and sold abroad to raise money to help rapidly modernize Stalin’s empire, there were also terror squads of soldiers and agents who searched out and destroyed whatever people used to try to feed themselves just to stay alive. This was state-organized mass murder.  

Agnieszka Holland: I share the opinion of many historians that the Holodomor was not only a side effect of the mistakes of collectivization, but also a deliberate policy of Stalin toward richer peasants and toward Ukrainians. Ukraine had a strong sense of independence and identity, and it had rich soil and the best agricultural organization in Stalin’s territory. Stalin wanted to break Ukrainians’ pride and resistance, and reap the riches of their soil and productivity. 

Robin Lindley: The scenes of the famine in Mr. Jones are especially haunting and heartbreaking. What did you learn about the human reality of famine and starvation? How did the famine affect individuals and families?

Andrea Chalupa: Anne Applebaum’s Red Famine: Stalin’s War on Ukraine goes into how starvation kills someone slowly and its gradual effects on the body. My grandfather described how his brother was driven mad by hunger and how he had to stop his brother from shoving dirt into his mouth when he was hallucinating and seeing food.

Starvation is a slow torture; it’s a painful way to die. There are horrific stories of cannibalism and packs of orphans wandering ghost villages. The actual history would require a horror film to show it more accurately. In Mr. Jones, we only give people a glimpse of the devastation. 

Agnieszka Holland: The human reaction to famine is the same in all circumstances. We could see it during the Irish famine, the terrible famine of Mao’s Great Leap Forward, the siege of Leningrad… and it’s all the more violent and destructive when famine is caused by man rather than natural catastrophe. The Holodomor is now the focus of several extensive historical and psychological studies, and we know that the mental and physical impacts of the famine remain present in the descendants of the survivors and their families, sometimes even many generations later. 

Robin Lindley: Do you have relatives or friends who experienced the famine or other personal connections to the Holodomor?

Andrea Chalupa: My grandfather on my mother’s side survived the famine with his family in eastern Ukraine. 

Agnieszka Holland: No, but in preparing for the film, I read many testimonials of survivors and their descendants; and while shooting the film in Ukraine, I met many Holodomor survivors and spoke with them about their experiences. 

Robin Lindley: The famine occurred during the Great Depression in the US. Did the US government know of the famine and did it somehow respond?

Andrea Chalupa: Ukrainian diaspora groups knew and tried to raise awareness. As we show in the film, FDR granted the Soviet Union official recognition in 1933. The scene of a fancy banquet in New York with [New York Times reporter Walter] Duranty being toasted to celebrate the US/USSR relationship actually happened. Applebaum goes into it in detail.

Agnieszka Holland: The depression was not deliberately planned by the US government. The Holodomor was deliberately planned and enforced by Stalin. So, with the Great Depression, we can speak about an incompetent reaction of a capitalistic society. In the case of Holodomor, it was a conscious, programmed crime, serving a political and ideological agenda.  

Robin Lindley: How did the famine end? Did poverty and starvation continue through the Second World War or did food shipments resume to Ukraine?

Andrea Chalupa: The Holodomor ended when the process of collectivization was complete, but the cover-up continued. People weren’t allowed to talk about it inside the Soviet Union.

Robin Lindley: What was the research process for the film? The sets, costumes, props and other details are very elaborate, and it’s evident that great care was taken in assuring authenticity. Of course, Ms. Holland’s films are renowned for their historical accuracy.

Andrea Chalupa: I studied history with a focus on Soviet history at UC Davis. I spent several years researching the history that inspired the film. We also had a team of historical advisors to vet the script and the finished film. We worked with an incredible crew that ensured everything stayed within the specific period in terms of props and costumes. Even stamps and packaging on envelopes and the articles the journalists present to Duranty were all created to fit that specific moment in time. 

Agnieszka Holland: We went through an extended research process in preparing for the film, consulting photos, paintings, movies, documentaries, documents. I like to be authentic, but in the first place, to know historical reality well enough to free my imagination. 

Robin Lindley: Your movie follows the journey of the intrepid Welsh reporter Gareth Jones, played by James Norton, who learned of the famine and brought the story to the West. You show Stalinist politics and the famine through the perspective of Jones. What are a few things you’d like readers to know about Jones? How did you learn of his story? 

Andrea Chalupa: The more I dug into the real Gareth Jones, the more undeniable it became that he was simply a good human being with a strong character. He’s a classic hero. After working on projects about anti-heroes, like House of Cards, Agnieszka was attracted to showcasing a morally courageous person, especially given the times we find ourselves in. She felt, as do I, that the world needs more heroes. 

Agnieszka Holland: When reading Andrea’s script, I thought that I had never heard of Jones, but actually his story was told in the Holodomor chapter of Snyder’s Bloodlands, so I had encountered him before. Afterward, I learned more through access to Jones’s notebooks and the documentary his grandnephew made about the circumstances of his death. 

Robin Lindley: Mr. Jones stands as a tribute to the dauntless Gareth Jones and also stresses the essential role of a free press in a democracy. Was that part of your intention in presenting this story at a time when an American president described members of the press as “enemies of the people”?

Andrea Chalupa: I first got the idea in 2003 to pay tribute to my grandfather and all that he had survived in Ukraine under Stalin. I of course never envisioned the story itself being so timely, and still find that surreal. 

Agnieszka Holland: The questions about the role of the media, and the importance of fact-checking and honest investigative journalism, were among the main reasons. Democracy will not survive when the media can be corrupted. 

Robin Lindley: What did you learn about Walter Duranty (played by Peter Sarsgaard), the Pulitzer Prize-winning New York Times Moscow bureau chief who undermined Jones and refused to report on the famine? Why was he determined—with other Western reporters—to cover up for Stalin’s regime? And did he actually host lavish sex orgies?

Andrea Chalupa: The more I dug into Duranty, the worse he seemed. Peter Sarsgaard delivers a sympathetic portrayal. The real Duranty, who had a child with his live-in housekeeper, left both the mother and their child behind when he left the Soviet Union. The hedonism seen in the film is inspired by the drunken orgies Duranty regularly attended in Moscow, including at a club called “Stable of Pegasus.” His biographer Sally Taylor describes this. Duranty shared a lover with the Satanist Aleister Crowley and participated in his black magic sex orgies in 1920s Paris. 

Agnieszka Holland: I mostly used Andrea’s research from her writing process, plus books, press articles, and other journalists’ statements. We are unable to fully know his real intentions, but the presented facts are quite incontrovertible.

Robin Lindley: Who was Ada Brooks (played by Vanessa Kirby), the journalist who befriends and helps Jones get to Ukraine in the movie? Was her character based on a real person?

Andrea Chalupa: Ada was invented, based on my own experiences of having an awful editor when I first started working in journalism. It turns out there was a young woman named Rhea Clyman who worked for Duranty for a time and broke away from him to report the truth about the famine. There’s a documentary that just came out about her. We named a character Rhea Clyman to pay tribute to her. 

Robin Lindley: George Orwell also makes an appearance in the film. Did he and Jones actually meet and become friends? Did Orwell’s Animal Farm grow out of the stories by Jones and the events of the Ukrainian famine, as the film suggests?

Andrea Chalupa: Orwell came into the story, inspired by the Ukrainian translation of Animal Farm produced by World War II refugees. It turns out that I have a copy, thanks to my uncle who, as a kid, immigrated to the US with it from a European refugee camp. I wrote about this in my book Orwell and The Refugees: The Untold Story of Animal Farm. Here’s an overview of that story in a piece I wrote for The Atlantic. 

Orwell and Gareth Jones never met, as far as we know. They shared a literary agent and an independent spirit. They were both around the same age and idealistic. 

Robin Lindley: Filmmaking is a complex, collaborative process, and I appreciate the effort both of you put into completing this pioneering work on Jones and the terrible famine. How did the technical making of the film evolve?

Andrea Chalupa: Every film that gets made is a miracle. It seemed as though the entire project was about to fall apart and then suddenly, we found ourselves on set in Ukraine in the middle of a snowstorm. It was a harrowing experience just to get the film made and finish it within budget and on schedule. This film especially needed a lot of miracles. 

Agnieszka Holland: Script, producers, director and, most importantly, money. For this kind of difficult, ambitious, independent movie, the financing is the most difficult part of the story. Then the casting, and—last but not least—the creation of the movie itself. And then another difficult step: effective delivery to the audience. 

Robin Lindley: The cast of Mr. Jones is first rate. As a fan of Grantchester, I especially appreciated James Norton’s star turn as Gareth Jones. How did the casting process work?

Andrea Chalupa: We originally cast another actor, who then asked us to delay the project for about six months so he could do a TV series. We needed snow and had to film that winter. So we had to scramble for another actor, and our tenacious casting director Colin Jones found us James Norton, who seemed born to play the role. 

Agnieszka Holland: This was a long process, as before our financing was closed, it was difficult to attract names. James came quite late to the game, but was immediately very enthusiastic. And he was a great trouper in this difficult adventure, creatively and humanely. 

Robin Lindley: The cinematography is striking, especially the muted scenes from famine-struck Ukraine. That remarkable transition in visual style was ingenious. I’ll always remember the glowing orange (in color) in the dark railroad cattle car. Can you talk about the decisions that go into cinematography on an epic film like Mr. Jones? I realize you’re a master, Ms. Holland.

Andrea Chalupa: The orange scene was taken from real life. Gareth Jones experienced that on a train headed into Ukraine. As for the colors and cinematography, that’s the genius of Agnieszka and our director of cinematography, Tomasz Naumiuk. 

Agnieszka Holland: Thank you! We had a young but very talented cinematographer, Tomasz Naumiuk, and we collaborated closely on the detailed concept of the general visual style and the particular sequences. And then we were inspired by reality: weather, light, sets.

Robin Lindley: Where was Mr. Jones filmed? Were you on location in Russia and Ukraine?

Agnieszka Holland: Ukraine, Poland, and Scotland. 

Robin Lindley: How have viewers responded to your film? The reviews seem very positive. Did you hear from Russian viewers? Was the film banned anywhere? Did you face any threats?

Andrea Chalupa: The film received a huge reception in Ukraine, which was extremely gratifying. One Ukrainian journalist who interviewed me for around two hours had seen the film three times in one week when it premiered in Berlin; she sounded like she had read every review and seemed to know the film as well as I did.

The reception in Ukraine was the most exciting part since this is their history that we want to help raise awareness of. We also received a lot of thoughtful questions from Russian journalists at the press conference for the film when it had its world premiere at the Berlin Film Festival. These were of course independent journalists not affiliated with Russian state media. 

Agnieszka Holland: The film has had a very good reception around the world, and especially in Ukraine. Unfortunately, it was not possible to sell this film to Russia, and most Russians today still believe the Stalinist version of the story.

Robin Lindley: The recent history of Ukraine is tumultuous, from the Chernobyl disaster and fall of the Soviet Union to the ongoing bitter conflict with Russia. The story of the Holodomor still seems resonant today. How do you see the recent history of Ukraine?

Andrea Chalupa: Ukraine’s recent history is a cautionary tale of corruption as a human rights issue. As Biden told Ukraine’s parliament when he was Vice President: clean up the corruption the Kremlin weaponizes against you.

What’s important to know as well is that Ukraine’s 2013-2014 Revolution of Dignity -- EuroMaidan -- was driven by people from all walks of life in Ukraine who want to live in a more European society, away from Moscow’s orbit. Many people I interviewed about the revolution told me that Moscow’s oppressive history plays a role in Ukrainians wanting to break free and join Europe. 

Agnieszka Holland: The political and economic situation in Ukraine is extremely difficult. The Donbas war never ended. The division of the country, the corruption, incompetent politicians, the “free world” which doesn’t pay any real attention to real Ukrainian challenges…these all add to the difficulties. But in creating this film, I met countless strong, motivated, educated young people in Ukraine who give me hope, and Ukrainian identity grows stronger every year.

Robin Lindley: Donald Trump’s dealings with Ukraine led to his first impeachment. How do you see the Trump-Putin relationship and its effect on Ukraine?

Andrea Chalupa: Donald Trump admires and looks up to dictators like Putin, because he wants to be one. I have a podcast called Gaslit Nation that examines the threat of authoritarianism in the U.S. and around the world. It regularly covers this topic. After the horrific quid pro quo pressure campaign Trump put Ukraine through, it must be a huge relief for Ukraine to now have a Biden administration. Biden was and remains a staunch supporter of Ukraine. So the next few years should have a positive impact on Ukraine in terms of getting the support it needs from the U.S. to resist Putin’s ongoing invasion and confront corruption through civil society programs from a rebuilt and robust State Department. 

Robin Lindley: Stalin was a master of keeping the story of the Holodomor from the outside world. Is Russia under Putin doing the same thing now in regard to stories out of Ukraine and other issues?

Andrea Chalupa: Under Putin, Stalin has been resurrected as a hero. There’s a heartbreaking story of Yuri Dmitriev, a historian who nearly had his life destroyed after uncovering Stalin-era mass graves. Russia’s official state Twitter accounts sometimes like to muddle the truth about the famine. 

Agnieszka Holland: The attempt to kill Navalny and the several political murders orchestrated by Putin show the real nature of today’s Russia. Russia intervenes in free elections and political life in many free countries, the USA included. I don’t have illusions about Putin’s intentions. Read Dostoyevsky’s Demons. It is again a very relevant book.

Robin Lindley: What do you hope viewers take from your film?

Andrea Chalupa: I hope Mr. Jones inspires people to learn more about the famine and read books like Anne Applebaum’s Red Famine: Stalin’s War on Ukraine and Bloodlands by Timothy Snyder. When I watch historical dramas, I always want to know what was real and what was poetic license. As one reviewer wrote of Mr. Jones, the stranger elements of the film tend to be true. 

Agnieszka Holland: Some understanding of the world and its hidden tragedies; the respect for free journalism and the courage of individual reporters; and the knowledge that when the media are corrupted, and the political class is cowardly, lazy and opportunist, and the society is indifferent—the scene is set for evil to arise and take root.

Robin Lindley: What are your next projects? Will you be doing more on Ukraine and its history?

Andrea Chalupa: I’m working on a script inspired by my father-in-law, who led a student uprising in Romania in 1956 in solidarity with the Hungarian Uprising next door. I like to write stories about individuals taking great risks against authoritarian systems. Given my family’s own history, those are the stories I’m attracted to. 

Agnieszka Holland: After Mr. Jones, I directed another film dealing with a real historical figure, Charlatan, which premiered at the 2020 Berlinale and was presented as the Czech entry in the international Oscar category. And I am closely observing reality, waiting for new inspiration and the output of different processes. 

Robin Lindley: We’re now living in a time of a deadly global pandemic, with democracy under threat in many nations. And you’ve explored one of the darkest moments in human history. What gives you hope at this challenging time?

Andrea Chalupa: From my years of research into dark chapters of human history, I’ve learned to look at moments like the one we’re currently living in as times of moral courage. Heroes always emerge. There is fierce resistance, because most people are decent and committed to staying human. I have tremendous faith in people and believe that we’re ultimately going to evolve from this dark time that we’re in. 

Robin Lindley: Is there anything you’d like to add about the film, the famine, Ukraine, or anything else?

Andrea Chalupa: History is healing. The more we learn of our history, the easier time we have understanding the issues we’re currently dealing with in the world and how to navigate them. The nation that knows its past protects its future.

 

The Return of Human Rights on the American Agenda?

 

 

 

Almost lost in the flurry of activity flowing from the new Biden Administration, the issue of human rights has now been thrust back to center stage. After a four-year absence from the U.S. agenda, the Biden team announced Tuesday a new set of sanctions against senior Russian officials for the attempted poisoning of Alexei Navalny, the highly charismatic and effective political activist who is leading protests against Russian corruption and autocracy.

 

With his rallies running into the tens of thousands, many believe Navalny poses the greatest threat in years to the stability of Russian President Vladimir Putin’s regime. The fact that he was sentenced to two and a half years in prison for alleged “parole violations” shows that officials high in Putin’s government see that threat similarly. The new U.S. move also called for the immediate release of Navalny from prison.

 

During his first full week in office, President Biden raised the issue with Putin in their initial conversation. While it’s not known precisely what Biden said, the important fact is that he raised it at all in a discussion that included such weighty issues as extension of a critical nuclear treaty, Russian hacking of government and private computer networks (“SolarWinds”), and interference in our 2020 elections. One can only conclude Biden was very forceful in conveying to Putin that it’s a new day in U.S.-Russia relations, and that human rights are again high on the list.

 

Biden’s quick response is no surprise; he has embraced the cause of human rights throughout his long public career. It is particularly timely now because it follows a four-year drought of any serious U.S. effort to advocate abroad for the fundamental values that most recent presidents of both parties have voiced in one way or another. Such advocacy is one of many value norms the previous administration failed to uphold, as it protested neither Russia’s attempted poisoning of Navalny nor Saudi Arabia’s assassination of an American-based journalist in Turkey. 

 

In his Inaugural address, President Biden thanked three of his predecessors for attending and mentioned that the evening before he had called the only other living former president, Jimmy Carter, who, at age 96, was unable to make it; Biden nonetheless saluted him “for his lifetime of service.” I believe there is a special affinity between Joe Biden and Jimmy Carter, one that began 45 years ago when a young man from Delaware became the first U.S. senator to endorse an obscure former governor of Georgia in his long-shot campaign for the presidency. In his 2010 “White House Diary,” Carter cited Biden as “my most effective supporter during the 1976 campaign”; that’s the kind of affinity in politics that only grows stronger over time.

 

I once asked Carter -- whose presidency was arguably stronger and more consistently focused on the issue than any other -- where his commitment to human rights came from. He said it sprang from what he described as the “explicit idea” underlying the nation’s founding, his deeply rooted Christian faith, and the experience of growing up where the vast majority of his neighbors were African Americans.

 

Joe Biden too has a deeply rooted faith; it’s apparent also that his whole being has been affected by his profound personal losses. Whatever the source, Biden has been a consistent and vocal supporter of human rights from his earliest days in the Senate. No one should doubt that support will remain strong during his presidency, as evidenced by his recent sanctioning of military leaders for their coup in Myanmar.  

 

The parallels between these two presidents on human rights are almost uncanny. Just as Biden begins his presidency with a major Russian human rights issue on his plate, so too did Carter more than four decades earlier. On January 20, 1977, he signaled in his Inaugural address a strong commitment to human rights: “Because we are free, we can never be indifferent to the fate of freedom elsewhere. Our moral sense dictates a clear-cut preference for those societies which share an abiding respect for human rights,” adding in case anyone missed the point, “. . . Our commitment to human rights must be absolute.” 

 

The next day Carter received a letter from Andrei Sakharov, the highly regarded Russian physicist who had won the Nobel Peace Prize for his human rights activism, asking Carter to join the struggle for the cause in the Soviet Union. President Carter responded immediately, in a way that both encouraged dissidents to continue their efforts and helped cement the term “human rights” into the international lexicon.

 

President Biden is dealing with a situation that is very different from Carter’s experience with Sakharov. Navalny is running a much more sophisticated and effective campaign than Sakharov was able to, with at least 40 offices throughout Russia and the use of social media and other technologies that keep his organization several steps ahead of the Russian authorities. As Navalny courageously returned to Russia knowing he would be arrested and likely imprisoned, his team used the moment to release videos of Putin’s plush “billion-dollar mansion” on the Black Sea, catching Russian officials off-guard and allowing his compatriots to organize rallies in more than 100 cities across Russia. His savvy aides have also urged the Americans to work closely with the Europeans to maximize the effect of sanctions, which they hope will be directed at the oligarchs close to Putin, so that they are felt where it hurts most.

 

Tuesday’s new sanctions put the U.S. in essentially the same place as the Europeans. Administration officials hinted that other punitive measures may follow to address events such as the SolarWinds hacking scheme, interference in the 2020 elections, and other human rights violations. Secretary of State Antony Blinken said the sanctions were intended to “send a clear signal that Russia’s use of chemical weapons and abuse of human rights have severe consequences.”

 

While these sanctions did not affect the oligarchs or Putin, their focus on lesser but still important officials points out why human rights have almost always been a conundrum: they are often mixed into a stew of other concerns when policy-makers are forming a comprehensive foreign policy, and are seen as an irritant or worse by those across the table. Diplomats are not hesitant to argue that pressing human rights can impede progress on what they see as more important issues, usually those of a geopolitical nature. It’s a tension that goes back a while, and it’s evident in President Biden’s recent sanctions of officials in both Russia and Saudi Arabia; the tension is inherent in the practice of human rights diplomacy, and will likely remain there for the foreseeable future.

 

Inevitably, critics of an American human rights effort will argue that we don’t have clean hands or “standing” when it comes to the subject. How can we tell others how to behave when the United States carries within itself such contradictions as centuries of racial injustice and the attempted eradication and subsequent abuse and neglect of indigenous Americans? President Carter addressed that question in 1977 when he told a United Nations audience that though the United States hadn’t always lived up to its ideals, it nonetheless “has a historical birthright to be associated with human rights.”   

 

Franklin D. Roosevelt said even earlier, “The presidency is preeminently a place of moral leadership.” Although FDR likely had dealing with the Great Depression in mind, we should be glad that both Jimmy Carter and Joe Biden took heed of the broader meaning of Roosevelt’s admonition, and that this president is prepared to face these new human rights violations, perhaps in conjunction with election interference, SolarWinds, and other matters. If so, the result could be an unprecedented set of measures that impose serious consequences on the Russians who are responsible.  

Was Madison Mistaken?

 

 

 

As the convention at Philadelphia wrapped up on September 17, 1787, there was no fist-pumping. Benjamin Franklin was supposedly asked what sort of government the delegates had invented, a monarchy or a republic. The 81-year-old is said to have responded, “A republic, if you can keep it,” as ambiguously as any Greek oracle, and just as dourly. Did he mean, “If you can defend it from foreign powers”? Did he mean “If you can avoid civil war”? Or—as Nancy Pelosi implied when she invoked the quotation to support Trump’s 2019 impeachment—did he mean “If you don’t fall victim to demagogues”?

 

A clearer guide, in fact the best guide to the framing of the American Constitution is James Madison. At our current moment of dangerous national division, we may well remember Madison’s greatest fear about the future of the Republic he did so much to shape: the problem of factions. But it is also a good time to celebrate his overall contribution to the design of our Republic. Perhaps, if we can now display even a fraction of Madison’s resourceful integration of theory and practice, we will manage to hang onto the gift that he and his fellow framers bequeathed to us. 

 

Madison regarded the chief challenge to keeping a republic as the human tendency to separate into factions, posing an inescapable danger to popular governments. Since factionalism could be eliminated only by “destroying… liberty,” the principal task of the Constitution, he maintained, lay in “controlling its effects.”

 

Madison’s Federalist 10 depiction of the sources of faction reads like a prophetic blueprint of American divisions ever since. “A zeal for different opinions concerning religion” comes first, followed by “attachment[s] to… leaders ambitiously contending for pre-eminence and power.”  “But the most common and durable source of factions,” his list concludes, “has been the various and unequal distribution of property.”  Those divisions “grow up of necessity in civilized nations, and divide them into different classes, actuated by different sentiments and views.” Writing six decades before the Communist Manifesto, Madison showed prescient insight into class conflict, depicting “an equal division of property” as a “wicked project.” Instead, managing economic and other societal divisions would form “the principal task of modern legislation,” involving “the spirit of party and faction in the necessary and ordinary operations of government.” The challenge was to involve factions in ways that do not suppress the liberty of the citizens or unravel popular government.

 

Since a pure democracy would offer “no cure for the mischiefs of faction,” Madison proposed “a republic” governed by “the scheme of representation.” A republic, unlike a direct democracy, could accommodate a large number of citizens and be extended over greater territory. But the question remained “whether small or extensive republics are most favorable to the election of proper guardians of the public weal.” A larger republic was better, Madison argued, not as a means to wealth and power, but because it was less likely to be overwhelmed by “factious combinations.” Greater size would encompass a wider variety of parties and interests, making it “less probable that a majority of the whole will have a common motive to invade the rights of other citizens.”

 

The flaws in that hope were manifest in 1861, when the factional division over slavery devolved into Civil War. And Madison could not have foreseen, at a time when news could travel no faster than a horse, how instantaneous mass communication—radio, then television, then the internet and social media—could become nationwide organizers and simplifiers of faction (conspiracy theories, though, he could imagine). The years since 2016 have, sadly, revealed that what Madison called “a man of sinister design” could forge a powerful faction “adverse to the rights of other citizens” and “to the permanent and aggregate interests of the community.” Vast size alone, we have learned, offers no permanent solution to the challenge of preserving a republic.

 

But the immediate problem Madison faced was creating one. When the Constitution’s draft was published, opposition was fierce. Federalist 10, like all 85 Federalist essays by Madison, Hamilton, and Jay, was an effort to persuade New Yorkers in particular to ratify the Constitution, and that state did so only a month after the Constitution went into effect on June 21, 1788. However one judges Madison’s views on how factionalism could be managed, his persuasive optimism contributed to an epochal turning point: the union of all thirteen fractious former colonies under one sovereign American government.

 

Madison was a politician after all, not just a theorist, and the Constitution proved a practical solution to an exigent problem. To his credit, the polemical challenge of the moment did not lead him to mask his belief that factional sources of “instability, injustice, and confusion,” those “mortal diseases under which popular governments have everywhere perished,” would pose a permanent challenge to the United States.

 

Madison’s contribution is of course far greater than his case for a geographically large republic. He authored the “Virginia Plan” that served as the basis of discussion at the Constitutional Convention, where he was the most frequent and persuasive speaker. He took the only full set of notes recording all the discussions, later published in two substantial volumes that remain the best source of information on what the framers meant. Finally, he wrote several of the most decisive Federalist Papers, and number 10 is the best single explanation of how and why his generation could hope for an American Republic to last, and to become an example to other nations.

 

After all, in very few pages, Federalist 10 presents a whole political theory, explaining why popular government is better than monarchy; why a republic is superior to a democracy; how factions or parties pose its greatest danger; why a larger republic is a better hedge than a smaller one; and, finally, why none of those solutions to preserving liberty and self-governance, or to managing conflict while advancing the general welfare, is guaranteed to last. After watching a classic demagogue deliberately polarize us, we see more clearly than ever that Madison was not mistaken. And Franklin? “If you can keep it” was apposite.

Review: Does "The Princess Spy" Pierce the Veil of its Subject's Fictions?

 

 

 

The Countess of Romanones, born Aline Griffith of Pearl River, NY, filled her memoir THE SPY WORE RED: My Adventures as an Undercover Agent in World War II with violent activities created to introduce her as a virtual "Jane Bond" of espionage when, in fact, she was a civil service clerk, grade 5, employed by the Office of Strategic Services (OSS), predecessor of today's Central Intelligence Agency (CIA). These violent activities, the assassinations, murders, wild car chases and the like, would have drawn the ire of Spain's dictator, Francisco Franco. Agents from both sides of the European conflict were allowed in "neutral" Spain provided they operated under the condition laid down by Franco: NO VIOLENCE. A single gunshot could signal a counter-revolution, the fear of all dictators.

 

THE PRINCESS SPY revisits Griffith’s memoir with an eye toward separating truth from fiction. "The bigger question I wanted to answer," Larry Loftis noted in his preface, "was whether she fictionalized or embellished all or most of her exploits. If she did, I realized, I'd have to find another spy to write about." Loftis answered that question sufficiently to proceed, but with some reservation, acknowledging the sensational events in the memoir were "historical fiction" but for one exception: a "bloody body" Aline discovered in her bed, which she had to dispose of. Loftis believed the story. I do not. His so-called "confirmation" of the incident comes from the unpublished memoir of a co-worker, one hardly willing to call her a liar.

 

How, then, to establish what Aline Griffith actually did in wartime? During her first year in wartime Spain, Griffith was employed as a civil service clerk, grade 5, trained to decode messages passing through the OSS Madrid station. Off duty, she became a regular in Madrid's social life, where she occasionally picked up information worth reporting to her superiors (pro-German Spain hosted thousands of Nazi agents as well as German businesspersons). Was this the setting for dramatic escapades?

 

This author’s book EDMUNDO: From Chiapas, Mexico to Park Avenue: The True Story of a Mexican-American Who Became a World War II Spy and Married a German Princess (2007), which became a key source for THE PRINCESS SPY, offers a starting point through the story of OSS agent Edmundo Lassalle.

 

Following the D-Day landings, the attention of the OSS turned to PROJECT SAFEHAVEN, a plan "to prevent defeated Germany from sequestering loot and German gold in neutral countries" and "to prevent the escape of possible war criminals" through Spain and other "neutral" countries such as Portugal and Switzerland.

 

In 1943, President Roosevelt established the Foreign Economic Administration with wide-ranging responsibilities that would come to include Safehaven. Placed in charge was Samuel Klaus, assigned from the Treasury Department. In September 1944, Klaus, along with a representative of the State Department, met in Madrid with OSS officials to discuss details of what was to become PROJECT SAFEHAVEN.

 

Edmundo Lassalle was assigned as the lead OSS agent in executing this crucial project, one that continued well beyond the end of the Second World War. In January 1945, he was sent to London (with priority to travel on the Pan Am Clipper), where he received training regarding SAFEHAVEN. He was joined by OSS agents from the newly created Art Looting Investigation Unit. The training kept him in London (cold and boring, he wrote his wife, Emilie) for a month. (My own research failed to establish a purpose for the London stay; Loftis's research was more successful.)

 

Also involved in this important project was Aline Griffith. Edmundo, however, soon shifted his attention to courting Princess Agatha Ratibor, divorcing his wife Emilie, and seeking permanent employment as the European representative of the Disney Company. (As important as Project Safehaven was, Edmundo used his "home leave" to return to Washington to inform his wife of his desired divorce, then went on to Hollywood to meet Walt Disney in an effort to remain on Disney's payroll as his European representative. Needless to say, government bookkeepers were puzzled as to how Edmundo could be employed by the OSS and the Disney Company at the same time. This became a troubling issue as his OSS service was terminated.)

 

The void in executing PROJECT SAFEHAVEN created by Edmundo's shifting interests was left to other agents, including, so it seems, Aline Griffith. Loftis deals with this issue exceedingly well, plying OSS records not yet available when I was writing my biography of Edmundo Lassalle.

 

In July, 1945, following Germany's defeat, Aline was promoted to civil service grade 7 "Intelligence Agent," code name Butch.  Since the OSS was abolished on August 1, this newly appointed "secret agent" served but a few days executing PROJECT SAFEHAVEN. As I noted in my biography of Edmundo, however "In February 1945, Aline Griffith, OSS code name "Butch," submitted her first report to Secret Intelligence (SI) and her last report came on July 15, 1945," Why her promotion was delayed, and the results of her SI work, while she nominally was still working as a code clerk, remained to me unclear. 

 

And we do know that this 22-year-old former New York fashion model's attention, at that time, was on Luis, a Spanish count reputed to be the wealthiest bachelor in Spain.

 

But as for PROJECT SAFEHAVEN, what were her contributions? Even more important, what were the contributions of the OSS to this crucial assignment? That we have never learned.

 

THE PRINCESS SPY is meticulously researched and well-written and might well become a bestseller. It is regrettable, however, that Loftis chose to write a "nonfiction biography" of Aline Griffith, aka the Countess of Romanones, especially having concluded that much of what she included in her memoir THE SPY WORE RED was "historical fiction." One of Aline's secretaries in New York in the 1980s said the Countess, in private, referred to her memoirs as "factions," a blend of facts and fiction; or, as the espionage expert Nigel West concluded, "the product of her imagination."

 

A FINAL COMMENT: The historian Richard Hofstadter once noted that history is largely "a comedy of errors and a museum of incompetence." There is considerable evidence of that in World War II and, for the OSS, writ large: much confusion, considerable waste, institutional mendacity and, as combat leaders proclaimed, minimal impact on the war's outcome. That condition, no doubt, explains why President Truman abolished the OSS following the surrender of Japan. Concurrently, it should be noted, our military was recruiting well-trained and experienced secret agents who had earlier served Adolf Hitler. And, as early as 1947, the OSS was reborn as the CIA, destined to add torture as a means of producing intelligence.

The Roundup Top Ten for March 5, 2021

James Weldon Johnson’s Ode to the “Deep River” of American History

by David W. Blight

A biographer of the poet, novelist and activist James Weldon Johnson considers his “St. Peter Relates an Incident of the Resurrection Day” as a reflection on the necessity of persistence and hope even amid dire times. 

 

‘Moral Evil, Economic Good’: Whitewashing the Sins of Colonialism

by Sabelo J Ndlovu-Gatsheni

Recent efforts to reframe Europe's history of colonialism as a net contribution to human welfare are misguided, argues a scholar of African history. 

 

 

From Washington to Trump: What Is Dereliction of Duty?

by Lindsay Chervinsky

Public ideas of the presidential duty to defend the nation against foreign and domestic enemies have evolved over two centuries; if Donald Trump had been president in 1793, his response to a pandemic wouldn't have cost him reelection. 

 

 

Seeking the True Story of the Comfort Women

by Jeannie Suk Gersen

A Harvard Law School professor tried to understand why her colleague made a provocative and contrarian argument that Korean "comfort women" engaged in voluntary sex work. She discovered that recourse to the facts was both straightforward and frustrating.

 

 

Toys Are Ditching Genders for the Same Reason They First Took Them On

by Paul Ringel

While social conservatives may bemoan the rise of gender-neutral toys as an attack on traditional values, the history of marketing to children suggests that the impetus for the change isn't coming from the "woke" but from the market. 

 

 

“Making a Living by the Sweat of Her Brow”: Hazel Dickens and a Life of Work

by Emily Hilliard

"Hazel’s song catalog is often divided into separate categories of personal songs, women’s songs, and labor songs. But in her view and experience, these issues all bled together; her songs address struggle against any form of domination and oppression, whether of women, workers, or herself."

 

 

A Rapidly Globalizing World Needs Strengthened Global Governance

by Lawrence Wittner

"The world is currently engulfed in crises—most prominently, a disease pandemic, a climate catastrophe, and the prevalence of war—while individual nations are encountering enormous difficulties in coping with them."

 

 

The Far Right’s Big Money Strategy Has Poisoned Our Politics

by Marc C. Johnson

The 1976 Supreme Court decision in Buckley v. Valeo ruled that spending money to influence a campaign is free speech, launching the era of big money in politics long before the much-maligned decision in Citizens United v. FEC. 

 

 

Originalism’s Original Sin

by Adam Shapiro

Liberal critics should understand the ways that Constitutional originalism's practices of reading and resolving conflicts in the text owe a great deal to biblical literalism. Historians of religion can help understand what's at stake. 

 

 

We Need a Second Season of ‘Mrs. America.’ Here’s Why

by Magdalene Zier

After the defeat of the ERA, Phyllis Schlafly's activist career entered a second act, pushing the federal judiciary in conservative directions.

 
