
Burnt Offerings


Aaron Bushnell and the age of immolation

Image via Wikimedia.

Eventually the nerves will burn and the flames eating at the skin won’t feel like anything at all. Soon a place beyond sensation altogether: torrents of adrenaline, a brain relieving itself of duty. But not at first. “I think initially you’re talking about the most excruciating pain that you could ever experience,” one expert says. Often there are accelerants, which produce a hotter, deeper flame. For most people who perish, however, it is the smoke inhalation rather than the heat that is the proximate cause. They choke to death on the fumes of their own flesh.

Immolation seems like an atavism. Entrails burnt on altars; witches burnt at the stake; huts burnt to ashes by stray sparks leaping from the hearth. It’s not the kind of thing that should happen these days. Our civilization is built on combustion—the relentless extraction and incineration of ever-expanding quantities of fuel. But it is a controlled burn. In the US, total deaths from fire and burning declined steadily and significantly over the course of the 20th century, a trend paralleled in much of the rest of the world in recent decades. Building materials are safer, open flames are less necessary for lighting and heating, and fire-response infrastructures are more robust. For most of us, most of the time, fire is innocuous, confined to stovetops and grills and lighters.

There is one important exception. Beginning in World War I, modern warfare has vastly multiplied the range of available methods for burning people to death. Bullets still have their place, of course, as do more ancient tactics of siege and starvation, but wars since the early 20th century have probably lit more people on fire than all prior military conflicts in human history. Like so many fires, it happened slowly at first and then all at once. Shortly after the German army debuted the modern flamethrower in the trenches of the Western Front, the weapon became standard in the arsenals of militaries worldwide. But it was a cumbersome and inflexible device that made its operators easy targets, so in World War II its use was mostly limited to dismantling fortifications, suffocating troops entrenched in caves, and exterminating captive civilian populations, as in the Nazis’ suppression of the Warsaw Ghetto Uprising. By that point more efficient immolation delivery devices were widely available.

World War II was when firebombing really came into its own, after more experimental deployment earlier in the century. The Luftwaffe rained down fire on Warsaw and London, and the Allies later retaliated by burning most large German cities to the ground. The British developed a bomb they called the “Superflamer,” which upon detonation emitted a sphere of intense fire in a fifteen-foot radius that burned for at least two minutes without flagging. In the final months of the war Japan was one great conflagration. On one awful night in March 1945, US bombers burned 100,000 residents of Tokyo to death and left a million more homeless. And then came August. “The impact of the bomb was so terrific that practically all living things—human and animal—were literally seared to death by the tremendous heat and pressure set up by the blast,” Tokyo radio reported after the first nuclear weapon was dropped on Hiroshima. Over time the sci-fi novelty of other features of nuclear explosions came to eclipse the fact that the main thing they do is ignite tremendous fires. Daniel Ellsberg later recalled that estimates of the death toll from a US nuclear first strike prepared by the Joint Chiefs of Staff excluded the consequences of firestorms on the specious grounds that their course was difficult to predict precisely. In Ellsberg’s own assessment, taking firestorms into account would have doubled the official estimate, to over a billion people dead.

As it turned out, the cold war’s superpowers managed to avoid deploying additional nuclear weapons (except in 2,121 documented tests). But incineration proceeded apace. The United States dropped 32,557 tons of napalm on North Korea between 1950 and 1953. “We went over there and fought the war and eventually burned down every town in North Korea,” US strategic air commander Curtis LeMay later said, estimating that his forces killed 20 percent of the North Korean population. Ten times as much napalm torched the peasantry of Vietnam in the 1960s and 1970s. We will likely never know exactly how many thousands of people burned to death. There were also survivors, such as Phan Thị Kim Phúc, “the girl in the photo,” who regained the ability to move properly a full decade after her entire back was scorched by South Vietnamese napalm at nine years of age. Eventually the US phased out napalm in favor of an ostensibly new fuel mix, carried in MK-77 bombs, which in its practical effects is essentially the same thing as napalm. At least 30 MK-77s were dropped on Iraq in 2003 alone, including in civilian-populated areas in Baghdad. The US has also extensively deployed ultra-incendiary white phosphorus weapons in its military campaigns since the mid-20th century, most infamously in Fallujah in 2004. Likely the most enthusiastic user of white phosphorus today is Israel. Despite years of controversy and chastisement from human rights groups, Israel has used these munitions yet again in its ongoing genocidal operation in Gaza and its concurrent attacks on southern Lebanon. Its more conventional explosives inflict their fair share of burns as well. In November, Médecins Sans Frontières reported that its burn surgeons in one hospital in Khan Younis were performing about ten operations each day, even though the hospital was “overflowing with hundreds of patients with burns who must wait for surgical care.”

This is all, in theory, illegal. International law has sought to curb attacks on civilians since the burned-out aftermath of World War II, and half-hearted restrictions on the use of incendiary weapons in particular have obtained since the early 1980s. But the age of immolation has also seen the US and many other global powers abandon any pretense of legal or democratic oversight of military activity, an embrace of lawlessness enabled in large part by the increasing significance of aerial bombardment in contemporary warfare. Incendiary weapons are typically launched by planes and missiles and drones, which means that those who launch them seek surprise and therefore secrecy. Their use does not demand the conspicuous massing of troops that infantry-based operations entail. A phone is dialed, a button is pushed, and suddenly the world is aflame. Incineration has become a perfect military machine: precious few humans involved, except those who are burned.


On March 16, 1965, Alice Herz set herself on fire at the intersection of Grand River Avenue and Oakman Boulevard in Detroit. Seven days earlier, Lyndon Johnson had authorized the use of napalm in Vietnam. A Christian socialist of Jewish ancestry, Herz had fled the Nazis and settled in Michigan during the war, although she was denied US citizenship due to her refusal to vow to defend the nation with arms. She died of her injuries on March 26, at the age of 82. The Detroit Free Press’s headline read: “Human Sacrifice Is Dead of Burns.”

On November 2, 1965, a Quaker anti-war activist named Norman Morrison took his daughter, Emily, to the Pentagon. Standing beneath Robert McNamara’s office, he handed Emily to someone nearby, doused himself in kerosene, and struck a match. Roger Allen LaPorte, a member of Dorothy Day’s Catholic Worker Movement and a former seminarian, emulated Morrison a week later in front of the United Nations building in New York. As he burned, LaPorte sat in the same lotus position assumed by Thích Quảng Đức during his 1963 self-immolation in protest of the persecution of Buddhists by the regime of Ngô Đình Diệm, the United States’ first puppet in its struggle against the Viet Cong.

In all, at least 100 people set themselves on fire in the US and Vietnam to protest the war. After a long history on multiple continents as a tool of protest against religious persecution—the precedent on which Quảng Đức was drawing—these self-immolations cemented a new association in American culture between the tactic and anti-war activism. In February 1991, during the first US war in Iraq, Gregory Levey doused himself in paint thinner and perished in a fireball in a park in Amherst, Massachusetts, leaving behind a small cardboard sign that read, simply, “peace.” Malachi Ritscher, an experimental musician in Chicago, set himself on fire on the side of the Kennedy expressway during the morning rush hour one Friday in November 2006, after posting a long statement on his website explaining that he felt there was no other way for him to escape complicity with the “barbaric war” the US was then waging. He had been arrested at two previous anti-war protests.

Scholars often associate the rise of political self-immolation in the 1960s with the rise of television: a spectacular form of protest for the society of the spectacle. But of course there are less painful ways for protestors to attract eyeballs. The reality is that self-immolation registers the near-total impotence of protest—and even public opinion as such—in the face of a military apparatus completely insulated from external accountability. It is the rawest testament to the absence of effective courses of action. When war consists primarily of unelected men in undisclosed locations pouring fire on the heads of people we will never know on the other side of the world, there is very little that ordinary people can do to arrest its progress. But we still have our bodies, and it is in the nature of fire to refuse containment.

To ask whether self-immolation is good or bad, justifiable or non-justifiable, effective or ineffective is in large part to miss the point, which is that it is an option, whether anyone else likes it or not. It illuminates our powerlessness in negative space, but it also affirms the irreducible core of our freedom, that small flame of agency that no repression can extinguish. Since Aaron Bushnell’s death by self-immolation this week in protest of Israel’s genocide in Gaza, his detractors have warned about the risk of “contagion,” suggesting that his protest will encourage imitators (who, they imply, share his alleged mental instability). There may or may not be additional self-immolators before the slaughter comes to an end, just as Bushnell was preceded by a woman, yet to be identified publicly, who burned herself outside the Israeli consulate in Atlanta in December. But the purpose of lighting yourself on fire is not to encourage other people to light themselves on fire. It is to scream to the world that you could find no alternative, and in that respect it is a challenge to the rest of us to prove with our own freedom that there are other ways to meaningfully resist a society whose cruelty has become intolerable.


Aaron Bushnell posted an explanation of his actions on Facebook shortly before his death. “Many of us like to ask ourselves, ‘What would I do if I was alive during slavery?’,” he wrote. It is a vexing question. Most comfortable white Americans who were alive during slavery, after all, did not think of themselves as being alive “during slavery,” even as its presence saturated their everyday lives—they were just alive. But as they lived, they often found their thoughts dwelling on the subject of fire. White American Protestants in the eighteenth and nineteenth centuries were especially preoccupied with the fires of hell, imagery that dates back at least to the time of the New Testament but which came to occupy a particularly outsized place in the rhetoric of early American evangelists.

These great awakeners, like the Puritan preacher Jonathan Edwards, faced the task of explaining how it was that a kind and loving God could concoct such a nasty torture chamber for his most beloved subjects. In his 1741 sermon, “Sinners in the Hands of an Angry God,” Edwards answered that the fires of hell originate, in fact, in the human heart. “The corruption of the heart of man is immoderate and boundless in its fury,” Edwards averred, “and while wicked men live here, it is like fire pent up by God’s restraints, whereas if it were let loose, it would set on fire the course of nature.” That fiery loosening is precisely what happens in hell, Edwards concluded. God simply removes his restraint from the sinful impulses of the damned, and immediately they “turn the soul into a fiery oven, or a furnace of fire and brimstone.”

Edwards owned at least three slaves in his lifetime, including Venus, who was 14 when Edwards purchased her in Rhode Island, as well as a boy named Titus and another female slave named Leah. In the spring of 1741, the year that Edwards delivered “Sinners in the Hands of an Angry God,” a wave of fires swept the British colony of New York. Authorities, through coerced testimony and a sprinkle of their own imaginations, convinced themselves that a slave revolt was underway. Some of them must have remembered the revolt of April 6, 1712, when about twenty black slaves set fire to a building in lower Manhattan and then attacked the white colonists who came to try to put out the blaze. It remains unclear whether enslaved revolutionaries had hatched a similar plot in 1741, but that didn’t stop the colonial courts from burning thirteen alleged black conspirators to death that summer, along with dozens of other black people and white collaborators who were executed by other means. There was nothing extraordinary about that. Slave-burning was a widespread colonial practice, including in the Northeast; in 1713, New Jersey had passed a law authorizing the burning of law-breaking slaves at the stake.

In 1859, two weeks after John Brown’s raid on Harpers Ferry, the black abolitionist Henry Highland Garnet wrote: “A box of matches in the pocket of every slave, and then slavery would be set right.” Slavery in the US did ultimately meet its demise in an immense conflagration, what Abraham Lincoln called the nation’s “fiery trial.” As the historian Daniel Immerwahr has shown, cities and plantations and human beings were torched across the South—by slaves revolting against their masters, by Union soldiers, and by vindictive Confederates. One slaveowner said that he’d “burn all his slaves rather than let the Yankees have them.” But the fire was no longer up to the master class to control.

I think that they must have felt these flames already, those pious slavers who expounded on the subtler points of the doctrine of hellfire in the early days of the American republic. On some level they understood that the fire they were starting would be impossible to put out; that it would someday consume them and the whole edifice of murder and torture and kidnapping they had built. And I think Aaron Bushnell felt it too, working as a lowly IT guy in the bowels of the US Air Force, the mightiest incendiary device that humans have ever constructed. It is the same inferno that raged then, the one that has never ceased to rage, the one lit by all our sin and cruelty and lust for violence, never sated, all-devouring. Perhaps this is why he did it: he was already burning. I guess we all are. God gave Noah the rainbow sign. No more water, the fire next time.

