The Liberal Studies Department at Montana Tech recently initiated a new research colloquium and I was invited to be a discussant for a talk by Professor Henry Gonshak on the figure of the Holocaust survivor, as depicted in Sidney Lumet’s 1964 film, The Pawnbroker. I don’t have any training in film studies, so I just prepared some remarks about the political and social context of the film, focusing in particular on Nazism and the Holocaust survivor as cultural signifiers. Henry speaks first and my response begins at 38:51.
Very pleased to learn that my submission to the American Political Science Association (APSA) annual conference has been approved. I last participated in 2010 and enjoyed the experience—but it was only a poster presentation, so this year brings two firsts: my first panel presentation at the premier national political science conference and my first conference as a newly minted doctor.
The paper in question will be a version of the first chapter of my dissertation, entitled “Thresholds of Atrocity: Violence and Vision in Levinas, Murdoch, and Weil.” The benevolent gatekeepers behind the Foundations of Political Theory division have graciously placed me on a panel entitled “Trauma and Violence in Contemporary Political Life.”
Here’s the abstract I sent:
Every political community sets normative limits to the legitimate use of force—but how much is too much? What distinguishes atrocity from conventional violence? Where should we draw the line between the acceptable and the unconscionable? Few scholars have given atrocity sustained conceptual attention. From the Latin atrox, meaning heinous, cruel, or severe, the very word atrocity implies excess by definition. While Arendt writes that “[v]iolence can be justifiable, but it will never be legitimate,” atrocity is neither justifiable nor legitimate. This paper engages the aesthetically oriented philosophies of Emmanuel Levinas, Iris Murdoch, and Simone Weil to advance a theory of atrocity grounded in an expansive notion of moral vision (one that potentially includes literal vision as well as sounds, smells, voices, texts, etc.). After surveying the metaphorical importance of vision and blindness in relation to the rationalization of extreme violence, the paper draws on these three thinkers to assess the possibility of deriving ethical norms from phenomenological experience. Levinas offers the famous face-to-face encounter, while Murdoch and Weil draw upon Buddhist thought to advocate for moral attention. Crucially, much of the world’s violence that would otherwise register as atrocious goes unrecognized as such because efforts are taken to obscure and actively control the representation of violence, thereby impeding phenomenological comprehensibility and helping to legitimize the illegitimate.
My little daughter, my masterpiece,
Child in body, mind and spirit, beautiful,
Child so much a child.
When you have blossomed into womanhood,
May you be a Judith decapitating a Holofernes,
A Joan of Arc leading a people to victory,
A Louise Michel fighting on the barricades,
A Voltairine de Cleyre singing the songs of revolt,
An Emma Goldman preaching the gospel of rebellion.
I dedicate you,
Fruit of my blood, child of my soul,
I dedicate you to the cause of emancipation,
I dedicate you to the cause of truth and justice,
I dedicate you to the Social Revolution.
May your life and your death be the scourge of tyrants
And the inspiration of those who fight for human
This poem was taken from Adolf Wolff’s collection, Songs of Rebellion, Songs of Life, Songs of Love (1914). A self-described “poet, sculptor and revolutionist, but mostly revolutionist,” Adolf Wolff came to the United States as a child from Belgium. History, however, hasn’t been especially kind to either his art or his politics and he’s been reduced to near-total obscurity today.
There’s very little one can find about Wolff in the scholarly record at all. In fact, I only encountered his poem a few months ago while reading Paul Avrich’s biography of Voltairine de Cleyre. As Avrich and Francis Naumann write in a piece specifically about Wolff:
[T]he artistic accomplishments of Adolf Wolff have been almost totally forgotten. His name has fallen into obscurity, and goes unrecorded in even the most thorough histories and lexicons of American sculpture.
In any case, Wolff seems to have been very active in the radical politics that thrived in New York at the turn of the century. Nearly all the figures in that world were recent immigrants from Europe (Wolff included) disillusioned with the dire conditions of urban industrial capitalism and animated by left-wing politics in the heady days just before the Russian Revolution. To be an anarchist, Wolff said, was:
to be a human being without prejudice, without superstition, without fear, even in the face of death, and to love truth and justice, and to speak it out irrespective of consequences, to have law and order within, not without, to require no government except that of one’s conscience, and to be so strongly individual as to embrace with one’s own mind the well-being of the entire human collectivity.
This element of Wolff’s life is usually purged from any discussion of his artwork—itself a rare occurrence. When one of his sculptures appeared on Antiques Roadshow back in 2010, for example, an appraiser euphemistically described Wolff’s politics:
[H]e was also very involved in politics. A lot of artists were. … [H]e was, you know, embroiled with some very interesting characters and very interesting times.
Very interesting characters and times indeed! Though he knew them personally, Wolff would have been just young enough to regard the anarchist iconoclasts Emma Goldman and Voltairine de Cleyre with reverence and awe rather than any kinship grounded in equality. After all, “Red Emma” was one of the most notorious radicals in the country for at least two decades prior to her deportation in the Red Scare of 1919. She was a larger-than-life figure, and de Cleyre had already died at a relatively young age (in 1912) by the time Wolff’s poem was published.
Wolff taught art to children at the Modern School—an institution established by followers of the radical educator Francisco Ferrer (who was himself executed by the Spanish state in 1909)—and was arrested several times for his participation in left-wing demonstrations. His own daughter (for whom the poem was written) reportedly appealed to him at one such demonstration: “Please, papa, mamma wants you to smile and not be so angry.”
Wolff’s poem nicely captures his admiration of radical women. It traces a political lineage from the deuterocanonical Book of Judith—whose eponymous heroine assassinates a tyrannical general—and Joan of Arc to modern political radicals like Louise Michel—a prominent figure in the Paris Commune—as well as the two aforementioned American anarchists.
It’s obviously not a great poem, but I think it has a certain charm. It distills the hopes and dreams that attend the arrival of any newborn child and fuses them with high-minded political goals. Considering Wolff’s later transformation, during WWI, from radical to “upright American citizen,” it also bears an ironic nostalgia for idealism abandoned. Ⓐ
- Adolf Wolff, Songs of Rebellion, Songs of Life, Songs of Love (New York: Albert and Charles Boni, 1914), 15.
- Paul Avrich, An American Anarchist: The Life of Voltairine de Cleyre (Princeton, NJ: Princeton University Press, 1978), 13.
- Francis M. Naumann and Paul Avrich, “Adolf Wolff: ‘Poet, Sculptor and Revolutionist, but Mostly Revolutionist’,” The Art Bulletin 67, no. 3: 486–500.
Planning a homebirth in the United States means constantly dealing with medical professionals who think you’re the scientific and moral equivalent of an anti-vaxxer.
I’ve come to the conclusion that while some doctors do take an active interest in the medical literature, others view themselves mainly as technicians. Unfortunately, most seem to fall into the latter category. Not that I blame them for it. With all they do, how can the average doctor be expected to keep up with current research as well? But the very structure of American medicine seems to encourage an over-reliance on the official positions of the various medical associations (even when they differ quite significantly from international medical opinion). “What do I think about homebirth? I don’t know, what does the ACOG say about it?”
Several weeks ago, we were told by one particularly condescending doctor:
I just want healthy babies and healthy mothers. There’s no reason for anyone to die in childbirth anymore and the hospital is really the safest place. You’re taking a big risk.
Uh, no we’re not. Read the literature.
The United States has a 32.2% C-section rate, much higher than medical necessity might dictate (and even higher among poor women of color). While the American College of Obstetricians and Gynecologists (ACOG) discourages homebirth, insisting that “hospitals and birthing centers are the safest setting for birth,” they nevertheless note that “planned home births are associated with fewer maternal interventions, including epidural analgesia, electronic fetal heart rate monitoring, episiotomy, operative vaginal delivery, and cesarean delivery.” (This still hasn’t stopped them from distributing anti-homebirth bumper stickers: “Home Deliveries are for Pizza, not Babies.”)
Given the mind-boggling numbers, the statistical likelihood of one’s pregnancy ending in a C-section should in itself raise alarm. In addition to the standard risks associated with major abdominal surgery, women who undergo C-section are four times more likely to die from complications during or after childbirth. It is for this reason that the World Health Organization (WHO) says the overall C-section rate should not rise above 10-15%. When rates exceed this level, there is no indication that health outcomes improve. Yet C-section rates continue to climb… and women are dying.
Only about half the states bother to collect data on maternal death and there is no corresponding national effort. The WHO estimates somewhere around 12-28 deaths in the United States per 100,000 mothers—or about 1,200 annually. This is an astonishingly high figure for a technologically advanced country; higher than Iran, Turkey, or pre-revolutionary Libya.
Yet numerous studies show that an uncomplicated pregnancy is as safe or even safer at home than in a hospital, where one is much more likely to face medical intervention and its attendant problems.
A recent study on planned homebirth indicates a very slightly higher risk of perinatal death compared to in-hospital births but a significantly lower chance of unwanted medical interventions:
Perinatal mortality was higher with planned out-of-hospital birth than with planned in-hospital birth, but the absolute risk of death was low in both settings.
This study is just the latest installment in a solid body of evidence that out-of-hospital births attended by qualified midwives are a safe option for women with uncomplicated pregnancies. The argument that hospitals are the “safest setting” is therefore not as obvious as ACOG would like to suggest, and the emphasis on risk tends to obscure the fact that, for the vast majority of women, childbirth is a normal physiological process—one for which their bodies have been well equipped by the forces of human evolution. The cult of risk strips women of bodily agency. It presents a pregnant woman first of all as an emergency, or at least a potential emergency. As City University of New York sociologist Barbara Katz Rothman writes:
Virtually any house can be struck by lightning: Do you care to think of where you live as being ‘low-risk’ for lightning? This is just what contemporary medicine has done to pregnancy. It has distinguished between ‘low-risk’ and ‘high-risk’ pregnancies, with the emphasis always on risk, and then gone on to define an ever-increasing proportion of pregnancies as ‘high-risk.’
Perhaps more than anything else, the obsession with risk is one expression of our deep cultural desire for scientific and technological mastery, for the attainment of absolute certainty over matters of life and limb—even to the point of irrationality and skyrocketing C-section rates. But whether at home or in a hospital, there is no such thing as “risk-free” childbirth. It’s likely there never will be such a thing.
This is not an argument against either hospitals or modern medicine. Hospitals are ideal for medical emergencies, and the psychological comfort they provide makes hospital birth a no-brainer for many women. But pregnancy is not an illness. It’s not an emergency. Our midwife knows what she is doing, and it’s really getting to be a drag constantly dealing with practitioners clearly less interested in keeping up with the scientific literature than in chastising us for defying the hospital monopoly. Ⓐ
- Armstrong, Elizabeth Mitchell. “Home Birth Matters—For All Women.” Journal of Perinatal Education 19, no. 1 (2010): 8–11.
- Block, Jennifer. Pushed: The Painful Truth About Childbirth and Modern Maternity Care. Cambridge, MA: Da Capo, 2007.
- Cheyney, Melissa, Marit Bovbjerg, Courtney Everson, Wendy Gordon, Darcy Hannibal, and Saraswathi Vedam. “Outcomes of Care for 16,924 Planned Home Births in the United States: The Midwives Alliance of North America Statistics Project, 2004 to 2009.” Journal of Midwifery & Women’s Health 59, no. 1 (2014): 1–11.
- Simonds, Wendy, Barbara Katz Rothman, and Bari Meltzer Norman. Laboring On: Birth In Transition in the United States. New York: Routledge, 2007.
- Snowden, JM, EL Tilden, J Snyder, B Quigley, AB Caughey, and YW Cheng. “Planned Out-of-Hospital Birth and Birth Outcomes.” New England Journal of Medicine 373, no. 27 (2015): 2642–53.
Our daughter is due on March 11th.
It’s a peculiar way to discuss the arrival of a baby: to be “due.” Like a homework assignment. Or a debt. Still, babies are themselves usually reluctant to comply. A mere 4% of children are born on their predicted due date. Whether they come earlier or later, the rest are rebels even before extra-uterine life commences. 20% miss the mark altogether and opt to stay in the womb for at least another week before finally being evicted, which is probably a good thing considering the cognitive benefits demonstrated by children of longer pregnancies.
The uterus defies the pretense of schedules, predictions, forecasts, or prophecies. Scientists still don’t know exactly when or how a woman’s body comes to conclude it’s time to eject its infantile occupant. The mysteries of the uterus and its purported powers have troubled scholars for centuries. Few other organs have caused quite as much contention, grief, speculation, and superstition.
As early as 1900 BCE, we learn from an Egyptian papyrus that if a woman is “ill in seeing,” her womb is likely starved or dislocated. (Not to worry! A poultice of dried human feces and beer froth will clear the problem right up.) Other examples describe a range of symptoms the Greeks, more than 1,000 years later, would associate with hysteria, after their word for the uterus, ὑστέρα.
In the Timaeus, Plato writes that an “unproductive” womb “gets irritated and fretful” and travels about a woman’s body “generating all sorts of ailments, including potentially fatal problems, if it blocks up the air-channels and makes breathing impossible.” Aristotle concurred, and in his Nicomachean Ethics cites the deleterious emotional impact of uterine defiance (especially menstruation) to justify excluding women from politics.
It was Hippocrates, the “Father of Western Medicine,” who coined the term hysteria. He postulated the theory of the “wandering womb,” suggesting the uterus could literally float around a woman’s body causing mischief. To coax it back into place, he recommended sniffing acrid and foul odors.
Aretaeus of Cappadocia, an advocate of Hippocratic principles, described the doctrine’s basic tenets:
In the middle of the flanks of women lies the womb, a female viscus, closely resembling an animal; for it moves itself hither and thither in the flanks, also upwards in a direct line to below the cartilage of the thorax, and also obliquely to the right or to the left, either to the liver or the spleen; and it likewise is subject to prolapsus downwards, and, in a word, it is altogether erratic. It delights, also, in fragrant smells, and advances towards them; and it has an aversion to fetid smells, and flees from them; and, on the whole, the womb is like an animal within an animal.
And the Roman physician Galen, writing centuries later, continued in the same tradition:
I have examined many hysterical women, some stuporous, others with anxiety attacks […]: the disease [hysteria] manifests itself with different symptoms, but always refers to the uterus.
The solution? Hellebore, mint, laudanum, belladonna extract, valerian root, and other herbal remedies. Marriage also seemed to work wonders, as it frequently delivered the surest cure of all: pregnancy and childbirth.
Helen King explains this apparently “pharmacological interpretation” of “the social processes of marriage and motherhood”:
Not only does intercourse moisten the womb, thus discouraging it from moving elsewhere in the body to seek moisture, but it also agitates the body and thus facilitates the passage of blood within it. Furthermore, childbirth breaks down the flesh throughout the body and, by making extra spaces within which excess blood can rest, reduces the pain caused by the movement of blood between parts of the body. […] Since all disorders of women ultimately result from their soft and spongy flesh and excess blood, all disorders of women may be cured by intercourse and/or childbirth, to which marriage and pregnancy are the necessary precursors.
The myth of female hysteria persisted into the 20th century, making bloody detours along the way through so many inquisitions and witch-burnings. The Aristotelian belief that “the woman is a failed man” found advocates among the Patristic theologians and later in the work of thinkers like Thomas Aquinas. The 17th-century English physician William Harvey claimed women were “slaves to their own biology” and described the uterus as “insatiable, ferocious, animal-like.”
Even as late as the Victorian era, women embraced Hippocratic remedies. A sick woman was said to be “womby” or suffering from “wombiness.” To combat this epidemic, it was common practice to carry a bottle of smelling salts with which to tempt the “wandering womb” back to its proper anatomical locale.
Fortunately, modern uteri tend to be rather less troublesome than their unruly predecessors and, by this time next week, a cocktail of hormones will trigger a succession of biological impulses in my partner’s body that will ultimately result in the birth of our daughter. It is a meeting we have anticipated patiently for 40 weeks. Whatever the womb’s mysteries, real or imagined, it’s hard to believe anything might surpass the sheer wonder and anxiety of impending fatherhood. Ⓐ
- Adair, Mark J. “Plato’s View of the ‘Wandering Uterus.’” The Classical Journal 91, no. 2 (1996): 153–63.
- Aristotle. The Nicomachean Ethics. Translated by J. A. K. Thomson. New York: Penguin, 2004.
- Gilman, Sander L., Helen King, Roy Porter, G.S. Rousseau, and Elaine Showalter. Hysteria Beyond Freud. Berkeley, CA: University of California Press, 1993.
- Lefkowitz, Mary. “The Wandering Womb.” The New Yorker, February 26, 1996.
- Micklem, Niel. The Nature of Hysteria. New York: Routledge, 2015.
- Plato. Timaeus and Critias. Translated by Robin Waterfield. New York: Oxford University Press, 2008.
- Tasca, Cecilia, Mariangela Rapetti, Mario Giovanni Carta, and Bianca Fadda. “Women And Hysteria In The History Of Mental Health.” Clinical Practice and Epidemiology in Mental Health 8 (2012): 110–19.
* This article originally appeared at WarScapes on February 22, 2015.
Torture is a violation of the law, both domestic and international. It also happens to be a moral outrage. Leaving aside the legal definitions, the notion of a moral outrage entails a degree of subjective judgment—though such outrages tend to be more easily identifiable to outsiders than to insiders. After all, few of us possess the moral clarity it takes to reflect upon our own transgressions with the same zeal we readily adopt against others.
In the age of modern nationalism, we extrapolate from individuals and apply the same idea to the various institutions and agencies tasked with representing the political community at large. It is a matter of little dispute, for example, that Iran practices torture against prisoners. Most Americans accept this as obvious and uncontroversial, whether or not they happen to have read the latest human rights reports. Yet when agents acting on behalf of the United States stand accused of such practices—as they have with the partial release of the Senate Intelligence Committee’s report on CIA torture—euphemism becomes a national pastime.
Take former Vice President Dick Cheney’s recent remarks on Meet the Press, an especially adamant defense of the CIA’s interrogation program. There he insisted that waterboarding, a practice refined by the Spanish inquisitors and later embraced by Nazi Germany during WWII, is not a form of torture. Cheney has been consistently eager to draw a clear distinction between CIA actions and torture: the former ostensibly justified, the latter a legal and moral outrage.
While prominent voices within the mass media rightly dismiss Cheney’s arguments as a weak defense offered by someone directly implicated, a recent poll suggests that half the country doesn’t believe his distinction anyway: 49 percent of respondents believe that the CIA’s methods were torture. Yet a whopping 59 percent nonetheless believe they were justified. President Obama can deny it all he likes, but torture has seemingly become very much a part of “who we are.”
The passage of torture from unambiguous moral outrage to just another tool of American power is a remarkable story, but not a particularly surprising one if we consider the efforts undertaken to obscure it. The use of euphemism and legalese, the absence of victims’ voices in the media, and in some cases the outright suppression of evidence have all contributed to keeping the unpalatable details out of the spotlight. How are we to exercise moral judgment without an adequate view of the facts?
The philosopher and novelist Iris Murdoch believed that clear moral vision, the capacity for gaining “a refined and honest perception of what is really the case” requires immense determination. It requires a process of “unselfing,” her term for the stripping away of self-centered conceits and the influence of various ideological justifications (nationalism, sexism, religious dogma, etc.) that make it difficult for us to appreciate how others experience the world. By giving our full attention to the question at hand and exercising empathy, we are able to arrive at an appropriate moral judgment. Yet Murdoch’s emphasis on moral attention is completely meaningless when confronted with an ethical dilemma that cannot be seen in full. This is precisely the problem with torture.
Torture is conducted in secret, in dark rooms, by individuals whose identities are typically unknown to the general public. The victims are often anonymous. Whatever reassurances our political leaders offer as to the humanity of the methods or the tightly restricted conditions under which they are deployed, there has been very little discussion of the very real physical and psychological consequences of torture. Because of this, I suspect that what appears to be robust American support for torture is actually quite flimsy; it would likely collapse if confronted with a bit of transparency (and some Murdochian moral attention).
Is the public aware, for instance, that more than one hundred people have died in U.S. custody and that many of these deaths were later ruled homicides by military investigators? My students certainly were not, until I had them read about it. I teach a course on the history and politics of torture at Lehman College and many of my students—most of whom arrived as determined proponents of “enhanced interrogation techniques”—are horrified to learn that people have been literally tortured to death in American custody. Few of us are able to dismiss this terrible truth as contemptuously as Dick Cheney, who apparently has “no problem” with the deaths of wrongfully imprisoned detainees “as long as we achieve our objective.”
The invisibility of torture is not only a byproduct of widespread ignorance of the grisly details, however. It is also a result of the sterile descriptions used. If political language is, as George Orwell famously wrote, designed “to give an appearance of solidity to pure wind,” the written language of torture is designed to render the horrific benign. “Enhanced interrogation techniques” is by now widely recognized as the Bush Administration’s major contribution to the art of euphemism, but the individual techniques themselves continue to go unchallenged.
In print, “forced standing” reads like a minor inconvenience; “sensory deprivation” like a game of hide-and-seek; “rough handling” like a fraternal wrestling match; “stress positions” like a particularly intense session of yoga. This language is intentional. Modern torturers have dispensed with crude methods, and have instead devised techniques that remain either palatable or invisible to the general public. The political scientist Darius Rejali calls such techniques “clean torture,” practices perceived as less physically violent because they leave no permanent scars but which nevertheless cause immense physical suffering and often irreversible psychological damage.
Euphemism is not always up to the task, however. One of the more disturbing revelations described in the torture report is the CIA’s practice of “rectal rehydration,” a term that barely conceals the brutality of the act: force-feeding through the anus. If we take as a guide American federal law, which forbids “[t]he penetration, no matter how slight, of the vagina or anus with any body part or object … without the consent of the victim,” the practice would be more honestly described as “rape.” Even if the American public is told that such a practice is “legal,” as various pundits and politicians have claimed in recent days, the moral outrage remains. When 59 percent of the country claims to support the CIA’s interrogation practices, I doubt they have “rectal rehydration” in mind.
Very occasionally, we are offered more than either euphemisms or silence. The Abu Ghraib prison abuse scandal became the national outrage it did mainly because of the abundance of photographic evidence provided by the perpetrators. As the cliché has it, a picture is worth a thousand words; images of atrocities shock the conscience in ways the written word rarely achieves. One suspects that it is for this reason that President Obama has refused to declassify thousands of unreleased photographs depicting the abuse at Abu Ghraib prison, despite promising to do so as he entered office.
Major General Taguba, who led the investigation into the abuse at Abu Ghraib, has repeatedly claimed the unreleased photographs depict rape (the White House and Pentagon deny this) and yet he still urges that they not be declassified. As he argues, “the mere description of these pictures is horrendous enough, take my word for it.” While the techniques detailed in the Senate Intelligence report easily match and in some cases exceed the abuses committed by soldiers at Abu Ghraib, the CIA wisely destroyed all video evidence of waterboarding. All we are left with are the written descriptions.
Public opinion is important. It signals to our leaders the public’s willingness to either accept or oppose policy prescriptions. As Louis Brandeis once wrote, “Sunlight is said to be the best of disinfectants.” In this case, I doubt enough sunlight has been shed to foster adequate moral vision on the public’s behalf. When we’re told that a large majority of the American people now embrace torture as a necessary part of twenty-first century politics, we need to ask how that position developed. What is it the American people think the CIA has been up to all this time and how well does this picture match the reality? Declassifying the rest of the 6,000-page torture report won’t bridge this chasm on its own, but it would go a long way towards establishing basic conditions for moral vision. It might even allow for the kind of moral attention Iris Murdoch believed was necessary for understanding “what is really the case.”
Noam Chomsky, the M.I.T. linguist and renowned iconoclast of the Left, has exerted a tremendous influence on my political and intellectual development. Some of it had to do with hearing his voice as a young person coming of age politically in the immediate post-9/11 United States, with its hysterical jingoism and spurious justification for military intervention. A greater part of his influence on me, however, has to do with his gracious nature.

In 2006, I was a mediocre, solid B student at San Diego State University, writing my undergraduate thesis on Israel’s construction of the West Bank Barrier (under the supervision of SDSU professors Farid Abdel-Nour and Jonathan Graubart). It was probably the first assignment I took seriously up until that point and I sent a draft of the paper to about a dozen scholars, hoping for feedback, but not expecting anything much. To my surprise, Professor Chomsky was the only one to respond—and with extensive comments. That someone in his position would take the time to interact with someone in my position impressed me immensely. It still does. His encouragement was a revelatory experience for me intellectually and the paper, incidentally, went on to win a California-wide award for Best Undergraduate Research in the Social Sciences.

I was also frequently in touch with Professor Chomsky during my time working in the Gaza Strip (2007-08) and he was one of the first scholars to send a letter to Brooklyn College protesting my brief dismissal as an adjunct lecturer there in 2011. Just recently, he agreed to meet with me in his office at M.I.T. to discuss my dissertation. I have reproduced the transcript here, including annotations.
Kristofer Petersen-Overton: You talk about Cartesian common sense in much of your work, concerning our inability to recognize an act of atrocity. Iris Murdoch, whose work I use in my dissertation, uses an idea of moral vision, of attention, to express a similar notion. You give the example of the Soviet invasion of Afghanistan compared to the American invasion of South Vietnam. For your average person, it’s recognized as common sense that this was an invasion in the Afghan case, whereas it sounds bizarre to talk about the U.S. “invasion” of South Vietnam, something that should be common sense but is apparently obscured. My research is very much interested in this idea. What are the filters to actually seeing what is going on? First of all, the idea that certain forms of violence are invisible and perhaps then also even obscured by active manipulation on the part of political and economic elites. I’m thinking also about your own work on the media system. So I wanted to ask you how one begins to go about this project of achieving common sense, of clear moral vision without the filters. Where to start?
Noam Chomsky: Well take say the inability of educated Americans, let alone the so-called man on the street, to perceive American crimes as crimes. There’s a history. So, for example, if you talk about the war in Vietnam, the phrase “U.S. invasion of South Vietnam” simply does not exist in the professional, academic, and general cultural literature. You don’t have such a notion. The “[Soviet] invasion of Afghanistan” is, of course, normal. What’s the difference?
Well, take two original sins of American society. These are very serious crimes. One is slavery. The United States ran literal slave labor camps for centuries. The modern economy, the modern industrial economy, not just of the United States but of England and of other industrial countries that developed from it, is based on slavery. Cotton was the fuel of the early industrial revolution and most of it was produced right here in slave labor camps of a vicious character. Actually, one of the first books on it just came out: Edward Baptist’s book, The Half Has Never Been Told. Well, some was known, but he gives a vivid, detailed account of the nature of the slave labor camp and he discusses how maybe the North pretends they weren’t a part of it, but they were. That’s where the merchant manufacturers, bankers, importers of equipment and so on developed their wealth and developed the economy. That’s one and that went on. It didn’t end with the end of slavery. After slavery there was a compact between the North and the South which essentially permitted the South to reintroduce a form of slavery by criminalizing much of the black population and turning them into a slave labor force, except that they were run by the state instead of by the plantations. That’s what the prisons were and much of the American industrial revolution in the later period is also effectively based on slave labor. This went on until the Second World War. It’s been reinstated now with the drug war, which is racist, criminalizing the black male population. Well, that’s one crime. It’s not that people are unaware of it. In discussion of, let’s say, Ferguson, very little attention, in fact virtually none, is given to the fact that in nearly 400 years—1619 is when the first slaves came—African Americans have had a small taste of freedom, sporadically, now and then, for a few decades.
This interview was conducted by Alex Ellefson of Alternet and originally appeared there on November 24, 2014.
When Israeli bombs were falling on Gaza this summer, killing more than 2,000 Palestinians, it ignited a global controversy about whether Israel’s actions constituted war crimes. That controversy, in some ways, manifested at the University of Illinois at Urbana-Champaign. The board of trustees, responding to intense pressure from donors, voted to block the appointment of Native American studies professor Steven Salaita due to his “uncivil” tweets criticizing Israel’s assault on Gaza. Salaita, who is Palestinian and the author of Israel’s Dead Soul, left his job at Virginia Tech to take a tenured position at the University of Illinois. However, only a few weeks before he was supposed to start his new position, the school’s chancellor informed him that the job offer had been rescinded.
The incident sparked a backlash from scholars, civil rights groups and activists who argued that the university had violated Salaita’s freedom of speech by firing him. More than 6,000 academics have signed on to an academic boycott against the university and 16 of the school’s departments have passed no-confidence votes against the chancellor.
Salaita’s case is not extraordinary in that he is one of many college professors who have been fired or denied tenure for expressing viewpoints critical of Israel. Last week, Salaita spoke at several campuses about his battle with the University of Illinois. One of the lectures took place at Brooklyn College, part of the City University of New York (CUNY), which is not unfamiliar with explosive controversies related to Israel and Palestine. Almost two years ago, several New York City councilmembers threatened to pull funding from Brooklyn College if the school’s political science department did not drop its co-sponsorship of an event advocating for the Boycotts, Divestment and Sanctions (BDS) movement, which seeks to pressure Israel to end its military occupation of Palestine.
Salaita’s appearance at Brooklyn College caused a similar uproar last week. Several New York politicians, including State Assemblyman Dov Hikind, demanded that the event be canceled. It was the only stop on Salaita’s tour to elicit such a response from elected officials.
To better understand the controversy at Brooklyn College and Salaita’s case in general, I spoke to Kristofer Petersen-Overton, who in his first teaching position as an adjunct professor at Brooklyn College had an experience similar to Salaita’s. In 2011, several alumni, including Hikind, publicly objected to Petersen-Overton’s appointment to teach a graduate-level class on the Middle East. Hikind accused him of being “an overt supporter of terrorism” because of an academic paper he wrote about the concept of martyrdom in Palestinian society. Brooklyn College, which initially explained it was dismissing Petersen-Overton because he had not completed his PhD and thus was not qualified to teach the class, eventually reinstated him a week later in response to a global campaign from many of the same people protesting the decision against Salaita.
On September 9, 2014, I appeared for a second time on Democracy Now! with Amy Goodman to discuss the firing of Steven Salaita at the University of Illinois at Urbana-Champaign.
KRISTOFER PETERSEN–OVERTON: Yeah, well, I mean, I think there are important points of contact between my experience at Brooklyn College and Professor Salaita’s case. I mean, I was hired back in 2011 as an adjunct lecturer, so that’s a significant difference. I’m not a tenured professor. I’m a doctoral student, actually, at the CUNY Graduate Center. But many of us also teach courses in order to support our education. So I was hired to teach a one-semester course on Middle East politics. But before I was able to actually arrive in the classroom, a student complained to the department that she had googled me online and found some of my views apparently she took issue with and complained that I would be slanted and unfair towards Israel. The department asked her to hold off, and she turned around instead and went to a New York state assemblyperson, who then issued a press release calling me a, quote, “overt supporter of terrorism.” And this turned into an enormous controversy, which I didn’t expect, not knowing the political culture of Brooklyn College, not knowing the politics and background of this issue there. And unfortunately, the political science department, while supporting me, was routed by the administration, who intervened and canceled my appointment. And were it not for a large mobilization of students, faculty, activists and all sorts of independent organizations around the country and world, I wouldn’t have gotten my job back five days later.
* This article originally appeared in the Graduate Center Advocate in February 2014.
“We repudiate any effort to foreclose productive dialogue.” Such is the position of CUNY Interim Chancellor William Kelly, who released a short press statement in late December unilaterally reaffirming the consortium’s “long association with Israeli scholars and universities.” Kelly was responding, of course, to the controversial non-binding resolution recently passed by the American Studies Association (ASA) in favor of boycotting formal ties with Israeli universities. Similar statements have been released or signed by senior administrators at Harvard, Yale, Cornell, Amherst, Duke, Tulane, the University of Pennsylvania, and many more. The American Association of University Professors (AAUP), in view of its “long-standing commitment to the free exchange of ideas,” has also reaffirmed its opposition—since at least 2005—to academic boycotts.
Politicians have also joined in on the reaction. In late January the New York State Senate quietly passed a bill that would “prohibit any college from using state aid to fund an academic entity, to provide funds for membership in an academic entity, or fund travel or lodging for any employee to attend any meeting of such academic entity if that academic entity has undertaken an official action boycotting certain countries or their higher education institutions.” The bill, which the New York Times predicted would have “trample[d] on academic freedoms and chill[ed] free speech and dissent,” bore a disturbing resemblance to the “deeply anti-democratic” legislation passed in Israel that today subjects advocates of a boycott to criminal penalties. Fortunately, the New York version has now been scrapped; but the logic behind such moves is clear: it is necessary to boycott the boycotters in order to stop boycotts. Lost amid the clamor is the very real question of academic freedom itself, which is both poorly represented and widely mischaracterized.
Citing Israel’s occupation of Palestinian land since 1967, its relentless expansion of illegal settlements in the West Bank, the construction of a wall condemned by the International Court of Justice, the systematic discrimination against Palestinians, and the suppression of basic human rights (including the denial of academic freedom), the ASA voted on December 4, 2013 to endorse “the call of Palestinian civil society for a boycott of Israeli academic institutions.” The call is not compulsory and members are expressly encouraged to “act according to their conscience and convictions on these complex issues … [T]he ASA exercises no legislative authority over its members.” Put simply, scholars remain free to pursue their own work, while the ASA as a body simply chooses not to establish formal ties with Israeli institutions. Even the New York Times acknowledges that “the boycott does not apply to individual Israeli scholars engaged in ordinary exchanges,” yet most of the outrage mistakenly claims the opposite.
Such wide condemnation is mainly semantic. After all, who could possibly stomach the idea of “boycotting” the free exchange of ideas? The very suggestion smacks of McCarthyism—or worse! This peculiar interpretation (incidentally not at all what the boycott calls for) has the unfortunate effect of stirring pious indignation among many of the same individuals whose concern for academic freedom does not extend to threats on their own campuses. The potential perils faced by Israeli scholars apparently command more attention than the enormous structural threat to academic freedom posed by the exploitation of adjunct labor at home.
Yet even the contrived administrative concern for the potential threat to Israeli academic freedom is predicated on a misconception. If we agree with the AAUP’s 1940 statement of principles that academic freedom protects the “individual’s ability to conduct teaching and research without interference,” then even a cursory look at what the academic boycott proposes should dispel any suggestion that the boycott is itself a violation of academic freedom.
Each of us chooses to work or not to work with scholars for any number of reasons. This is a negative liberty we enjoy in the academy. As a negative liberty, unless restrictions are put in place that would impede such freedom, it is presumed to prevail. If academic freedom is sufficiently upheld then we cannot be compelled to work with anyone for any reason. The motives behind our decision are irrelevant. Perhaps I resent you personally; perhaps I think you produce shoddy scholarship; perhaps you hold views I find deeply offensive. Whatever my rationale, however correct or misguided, it remains my decision not to work with you. In refusing to establish formal ties with Israeli institutions, the ASA is merely expressing this liberty. Moreover, there’s something particularly obscene about the level of debate, the sheer output of concern over the ostensible threat to academic freedom faced by Israeli scholars, while the conditions faced by Palestinian scholars inspire far less piety—even as Palestinian scholars are subject to the inevitable impediments and challenges that military occupation brings with it.
The following case highlights this hypocrisy. Brandeis University recently severed various cooperative ties with Al Quds University in Jerusalem to protest an Islamic Jihad rally that took place on campus, apparently featuring Nazi-style salutes, fake weapons, and photographs of suicide bombers. No one at Brandeis seemed particularly disturbed by the decision to pull out—to effectively boycott Al Quds University—though it means terminating many established academic programs. Yet the entire American Studies department at Brandeis resigned from the ASA in protest of the ASA’s largely symbolic, non-binding resolution against Israeli institutions.
But let’s assume the academic boycott is, as many claim, a violation of academic freedom. If this is the case, then the logical implications of the argument take us to some fairly untenable conclusions. If it is a violation of academic freedom to refuse to work with certain institutions or to cut established ties with those institutions, then it follows that universities lacking established ties to those institutions are also in violation of academic freedom. I suppose these universities must now be compelled to immediately initiate cooperative endeavors, lest they undermine Israeli academic freedom. This becomes tiring very quickly and obliterates the negative liberty of choosing whom to work with, a key element of academic freedom. In a line of reasoning that may have inspired our esteemed state politicians, Indiana University has since withdrawn from the ASA in the name of academic freedom (of course). As Corey Robin writes pointedly:
Indiana University is so opposed to boycotts of academic institutions in Israel that it is going to boycott an academic institution in the United States.
The reader will have noticed that I avoided any discussion of the justifications motivating the boycott. I also did not discuss the boycott’s tactical virtues. As activists and scholars, many of us might disagree with an academic boycott on tactical grounds. Perhaps one feels such a move is counterproductive or will result in negligible gains for the Palestinian struggle. Those are valid arguments and should be taken seriously. Challenging the boycott on grounds of academic freedom is not.