Archives for category: History

Peter Greene writes here about the demand by Christian nationalists to rewrite history to their satisfaction. Whatever promotes the religion of their choice is good; whatever contradicts it must be left out. They want fairy-tale history.

Greene writes:

Recently Oklahoma’s education Dudebro-in-Chief Ryan Walters went on another tear, this time warning textbook publishers that they’d better not try to sell any wokified textbooks in Oklahoma.

“If you can’t teach math without talking about transgenderism, go to California, go to New York,” he told Fox News Digital. He even sent out a letter, just so they’d know. “Listen, we will be checking for these things now. Do not give us textbooks that have critical race theory in them.”

Walters said lots of things. Maybe he’s auditioning for a media spot. Maybe he wants to be governor. Maybe he’s just a tool. But he says all sorts of things like “In Oklahoma, our kids are going to know the basics. We want them to master it. We want them to do exceptionally well academically. We’re not here for any kind of Joe Biden’s socialist Marxist training ground.”

But somewhere in this conversation, Walters lays out a succinct summary of our nation’s history as he believes it should be taught.

“So as you go through, you talk about the times that America has led the free world, that we have continued to be that light. We’ve done more for individual liberty than any other country in the history of the world. And those belief systems that were there in place, it allowed us to do it. You’ve got to talk about our Judeo-Christian values. The founders were very clear that that was a crucial part of our success. Then you go through and you evaluate. Are these times we lived up to our core principles? You’ve got to be honest with kids about our history. So you talk about all of it, but you evaluate it through the prism of our founding principles. Is this a time we lived up to those principles?”

Most of the elements of the christianist nationalist version of US history are here. American exceptionalism: the light that led the free world, the very most ever done for individual liberty. A nation founded on Judeo-Christian values.

With that as a foundation, it’s safe to note some of the lapses, all of which are framed as an aberration, a lapse from our foundation and certainly not part of it (take that, you 1619 project-reading CRTers). In the CN view, every good thing that ever happened is because of our God-aligned nature, and every bad thing is in spite of it, quite possibly because Wrong People were allowed to get their hands on some power.

There are plenty of implications for this view of history. One of the biggest is that these folks simply don’t believe in democracy, because democracy allows too many of the Wrong People to get their hands on power. As Katherine Stewart puts it in her must-read The Power Worshippers:

It [Christian nationalism] asserts that legitimate government rests not on the consent of the governed but on adherence to the doctrines of a specific religious, ethnic, and cultural heritage.

Or, as she quotes Gary North, a radical free-market libertarian christianist who developed the Ron Paul Curriculum:

Let us be blunt about it: we must use the doctrine of religious liberty to gain independence for Christian schools until we train up a generation of people who know that there is no religious neutrality, no neutral law, no neutral education, and no neutral civil government. Then we will get busy in constructing a Bible-based social, political, and religious order which finally denies the religious liberty of the enemies of God.

The idea of individualism is also important in the CN view of US history. There’s no systemic anything–just the work of either good or bad, Right or Wrong individuals. And if everything is about the individual, then your problems are strictly your problems; your failures are all on you, not on society or community (the village has no responsibility to raise your child). That emphasis on the individual runs all through the Hillsdale 1776 curriculum, both original flavor and the Jordan Adams stealth version.

The rejection of systemic views of society and history matters. It goes along with the view that we pretty much fixed racism in the 1960s (even if we got a little too socialist in the process). From which we can conclude that all attempts to talk about racism now are just attempts to grab power with made-up grievances.

To take another angle: the underlying idea of the Classical Education that is so popular with the CN crowd is that there is One Objective Truth. Back in classical times, great thinkers understood this Truth, but the 20th century brought a bunch of relativistic thought and the evil notion that there are different, subjective truths. But our Founding Fathers knew the Truth and encoded it into the Constitution and our founding principles, and as long as we are led by people who follow that Truth, which is somehow both a Christian Truth and an American Truth, we are okay. People who don’t follow that Truth are a threat to the integrity and fiber of our country; consequently, they have to be stopped.

People who claim that history is complicated, that our founders were complicated, that humans are complicated–those people are just trying to confuse the issue, to draw others away from understanding The Truth.

Please open the link to finish the article.

Let me add that I don’t want to go back to 1776. The guys who wrote the founding documents were brilliant, but not on subjects like slavery and women’s rights.

For the past few years, Virginia has been a hotbed of dissension over “parental rights.” Governor Youngkin won office by attacking public schools, teachers, trans kids, and libraries. On Tuesday, Virginia’s parents took back most school boards from MAGA extremists.

Pundits cast Virginia’s Tuesday general elections as a referendum on abortion rights. It was more than that. Further down the ballot, those votes also sent a strong message to those trying to disrupt public education: listen to parents. Parents who came out to vote in Fairfax, Loudoun, and even Spotsylvania, the epicenters of vitriol and fantasy, delivered a resounding “no” to candidates who focused on anti-CRT rhetoric, book bans, and transphobia. Parents overwhelmingly voted for moderate candidates campaigning on safe schools, feeding hungry kids, and supporting our teachers.

After almost four years of vile accusations of racism, pedophilia, incompetence, and more, voters in Fairfax rejected the lies and returned Rachna Sizemore Heizer, Melanie Meren, Ricardy Anderson, and Karl Frisch to the School Board, along with a sweep of all pro-public education newcomers. Rachna Sizemore Heizer said, “Today, Fairfax County resoundingly rejected the GOP’s divisive politics and relentless attacks on our schools, students and staff, and stood strong in support of public education. It has been a tough four years on the school board, but we’ve stood strong knowing the majority of Fairfax County shared our values of an excellent education in a welcoming and inclusive environment. Now on to work making our great schools even better for every child.”

Spotsylvania County, with one of the most “toxic” school boards in the Commonwealth, flipped from MAGA extremism to centrist, teacher-focused sanity. Carol Medowar, a newcomer to politics and part of the wave that flipped the Spotsylvania school board, stated, “I’m just so happy for the students, families, and educators who really get to breathe a sigh of relief for this race. It’s a huge flip on the Spotsy school board.”

In Loudoun County, the genesis of the politicization of public education, pro-public school supporters held their ground in a clear referendum on Youngkin’s plan to dismantle public schools, drive out teachers, and humiliate trans kids. The acrimony and chaos of the last four years drove every member of the prior school board out of the race. However, the new board, with all new members, will maintain a strong pro-public school majority, despite Youngkin’s concerted, last-minute attempt to influence the race. According to Loudoun public school advocate Andrew Pihonek, “a brand new school board will be a breath of fresh air for many in Loudoun.”

Albemarle-Charlottesville followed the same trend as Loudoun, Fairfax, and Spotsylvania, rejecting candidates who tried to rewrite our history and ban books.

If Glenn Youngkin and his minions truly want to listen to parents, now is their chance. Parents across the Commonwealth, in their first opportunity since his election to send a clear message, have rejected fear-mongering, whitewashing, transphobia, sabotage, and incivility. The question is no longer whether we will listen to parents, but whether he will. As Carol Medowar, a successful Spotsylvania candidate, pleaded a few weeks ago, “Let’s make school board meetings boring again.”

British historian Simon Sebag Montefiore wrote the following important article for The Atlantic. I urge you to subscribe to The Atlantic. Its content is consistently interesting and thoughtful.

He wrote:

Peace in the Israel-Palestine conflict had already been difficult to achieve before Hamas’s barbarous October 7 attack and Israel’s military response. Now it seems almost impossible, but its essence is clearer than ever: Ultimately, a negotiation to establish a safe Israel beside a safe Palestinian state.

Whatever the enormous complexities and challenges of bringing about this future, one truth should be obvious among decent people: killing 1,400 people and kidnapping more than 200, including scores of civilians, was deeply wrong. The Hamas attack resembled a medieval Mongol raid for slaughter and human trophies—except it was recorded in real time and published to social media. Yet since October 7, Western academics, students, artists, and activists have denied, excused, or even celebrated the murders by a terrorist sect that proclaims an anti-Jewish genocidal program. Some of this is happening out in the open, some behind the masks of humanitarianism and justice, and some in code, most famously “from the river to the sea,” a chilling phrase that implicitly endorses the killing or deportation of the 9 million Israelis. It seems odd that one has to say: Killing civilians, old people, even babies, is always wrong. But today say it one must.

How can educated people justify such callousness and embrace such inhumanity? All sorts of things are at play here, but much of the justification for killing civilians is based on a fashionable ideology, “decolonization,” which, taken at face value, rules out the negotiation of two states—the only real solution to this century of conflict—and is as dangerous as it is false.

I always wondered about the leftist intellectuals who supported Stalin, and those aristocratic sympathizers and peace activists who excused Hitler. Today’s Hamas apologists and atrocity-deniers, with their robotic denunciations of “settler-colonialism,” belong to the same tradition but worse: They have abundant evidence of the slaughter of old people, teenagers, and children, but unlike those fools of the 1930s, who slowly came around to the truth, they have not changed their views an iota. The lack of decency and respect for human life is astonishing: Almost instantly after the Hamas attack, a legion of people emerged who downplayed the slaughter, or denied actual atrocities had even happened, as if Hamas had just carried out a traditional military operation against soldiers. October 7 deniers, like Holocaust deniers, exist in an especially dark place.

The decolonization narrative has dehumanized Israelis to the extent that otherwise rational people excuse, deny, or support barbarity. It holds that Israel is an “imperialist-colonialist” force, that Israelis are “settler-colonialists,” and that Palestinians have a right to eliminate their oppressors. (On October 7, we all learned what that meant.) It casts Israelis as “white” or “white-adjacent” and Palestinians as “people of color.”

This ideology, powerful in the academy but long overdue for serious challenge, is a toxic, historically nonsensical mix of Marxist theory, Soviet propaganda, and traditional anti-Semitism from the Middle Ages and the 19th century. But its current engine is the new identity analysis, which sees history through a concept of race that derives from the American experience. The argument is that it is almost impossible for the “oppressed” to be themselves racist, just as it is impossible for an “oppressor” to be the subject of racism. Jews therefore cannot suffer racism, because they are regarded as “white” and “privileged”; although they cannot be victims, they can and do exploit other, less privileged people, in the West through the sins of “exploitative capitalism” and in the Middle East through “colonialism.”

This leftist analysis, with its hierarchy of oppressed identities—and intimidating jargon, a clue to its lack of factual rigor—has in many parts of the academy and media replaced traditional universalist leftist values, including internationalist standards of decency and respect for human life and the safety of innocent civilians. When this clumsy analysis collides with the realities of the Middle East, it loses all touch with historical facts.

Indeed, it requires an astonishing leap of ahistorical delusion to disregard the record of anti-Jewish racism over the two millennia since the fall of the Judean Temple in 70 C.E. After all, the October 7 massacre ranks with the medieval mass killings of Jews in Christian and Islamic societies, the Khmelnytsky massacres of 1640s Ukraine, Russian pogroms from 1881 to 1920—and the Holocaust. Even the Holocaust is now sometimes misconstrued—as the actor Whoopi Goldberg notoriously did—as being “not about race,” an approach as ignorant as it is repulsive.

Contrary to the decolonizing narrative, Gaza is not technically occupied by Israel—not in the usual sense of soldiers on the ground. Israel evacuated the Strip in 2005, removing its settlements. In 2007, Hamas seized power, killing its Fatah rivals in a short civil war. Hamas set up a one-party state that crushes Palestinian opposition within its territory, bans same-sex relationships, represses women, and openly espouses the killing of all Jews.

Very strange company for leftists.

Of course, some protesters chanting “from the river to the sea” may have no idea what they’re calling for; they are ignorant and believe that they are simply endorsing “freedom.” Others deny that they are pro-Hamas, insisting that they are simply pro-Palestinian—but feel the need to cast Hamas’s massacre as an understandable response to Israeli-Jewish “colonial” oppression. Yet others are malign deniers who seek the death of Israeli civilians.

The toxicity of this ideology is now clear. Once-respectable intellectuals have shamelessly debated whether 40 babies were dismembered or some smaller number merely had their throats cut or were burned alive. Students now regularly tear down posters of children held as Hamas hostages. It is hard to understand such heartless inhumanity. Our definition of a hate crime is constantly expanding, but if this is not a hate crime, what is? What is happening in our societies? Something has gone wrong.

In a further racist twist, Jews are now accused of the very crimes they themselves have suffered. Hence the constant claim of a “genocide” when no genocide has taken place or been intended. Israel, with Egypt, has imposed a blockade on Gaza since Hamas took over, and has periodically bombarded the Strip in retaliation for regular rocket attacks. After more than 4,000 rockets were fired by Hamas and its allies into Israel, the 2014 Gaza War resulted in more than 2,000 Palestinian deaths. More than 7,000 Palestinians, including many children, have died so far in this war, according to Hamas. This is a tragedy—but this is not a genocide, a word that has now been so devalued by its metaphorical abuse that it has become meaningless.

I should also say that Israeli rule of the Occupied Territories of the West Bank is different and, to my mind, unacceptable, unsustainable, and unjust. The Palestinians in the West Bank have endured a harsh, unjust, and oppressive occupation since 1967. Settlers under the disgraceful Netanyahu government have harassed and persecuted Palestinians in the West Bank: 146 Palestinians in the West Bank and East Jerusalem were killed in 2022 and at least 153 in 2023 before the Hamas attack, and more than 90 since. Again: This is appalling and unacceptable, but not genocide.

Although there is a strong instinct to make this a Holocaust-mirroring “genocide,” it is not: The Palestinians suffer from many things, including military occupation; settler intimidation and violence; corrupt Palestinian political leadership; callous neglect by their brethren in more than 20 Arab states; the rejection by Yasser Arafat, the late Palestinian leader, of compromise plans that would have seen the creation of an independent Palestinian state; and so on. None of this constitutes genocide, or anything like genocide. The Israeli goal in Gaza—for practical reasons, among others—is to minimize the number of Palestinian civilians killed. Hamas and like-minded organizations have made it abundantly clear over the years that maximizing the number of Palestinian casualties is in their strategic interest. (Put aside all of this and consider: The world Jewish population is still smaller than it was in 1939, because of the damage done by the Nazis. The Palestinian population has grown, and continues to grow. Demographic shrinkage is one obvious marker of genocide. In total, roughly 120,000 Arabs and Jews have been killed in the conflict over Palestine and Israel since 1860. By contrast, at least 500,000 people, mainly civilians, have been killed in the Syrian civil war since it began in 2011.)

If the ideology of decolonization, taught in our universities as a theory of history and shouted in our streets as self-evidently righteous, badly misconstrues the present reality, does it reflect the history of Israel as it claims to do? It does not. Indeed, it does not accurately describe either the foundation of Israel or the tragedy of the Palestinians.

According to the decolonizers, Israel is and always has been an illegitimate freak-state because it was fostered by the British empire and because some of its founders were European-born Jews.

In this narrative, Israel is tainted by imperial Britain’s broken promise to deliver Arab independence, and its kept promise to support a “national home for the Jewish people,” in the language of the 1917 Balfour Declaration. But the supposed promise to Arabs was in fact an ambiguous 1915 agreement with Sharif Hussein of Mecca, who wanted his Hashemite family to rule the entire region. In part, he did not receive this new empire because his family had much less regional support than he claimed. Nonetheless, ultimately Britain delivered three kingdoms—Iraq, Jordan, and Hejaz—to the family.

The imperial powers—Britain and France—made all sorts of promises to different peoples, and then put their own interests first. Those promises to the Jews and the Arabs during World War I were typical. Afterward, similar promises were made to the Kurds, the Armenians, and others, none of which came to fruition. But the central narrative that Britain betrayed the Arab promise and backed the Jewish one is incomplete. In the 1930s, Britain turned against Zionism, and from 1937 to 1939 moved toward an Arab state with no Jewish one at all. It was an armed Jewish revolt, from 1945 to 1948 against imperial Britain, that delivered the state.

Israel exists thanks to this revolt, and to international law and cooperation, something leftists once believed in. The idea of a Jewish “homeland” was proposed in three declarations by Britain (signed by Balfour), France, and the United States, then promulgated in a July 1922 resolution by the League of Nations that created the British “mandates” over Palestine and Iraq that matched French “mandates” over Syria and Lebanon. In 1947, the United Nations devised the partition of the British mandate of Palestine into two states, Arab and Jewish.

The carving of such states out of these mandates was not exceptional, either. At the end of World War II, France granted independence to Syria and Lebanon, newly conceived nation-states. Britain created Iraq and Jordan in a similar way. Imperial powers designed most of the countries in the region, except Egypt.

Nor was the imperial promise of separate homelands for different ethnicities or sects unique. The French had promised independent states for the Druze, Alawites, Sunnis, and Maronites but in the end combined them into Syria and Lebanon. All of these states had been “vilayets” and “sanjaks” (provinces) of the Turkish Ottoman empire, ruled from Constantinople, from 1517 until 1918.

The concept of “partition” is, in the decolonization narrative, regarded as a wicked imperial trick. But it was entirely normal in the creation of 20th-century nation-states, which were typically fashioned out of fallen empires. And sadly, the creation of nation-states was frequently marked by population swaps, huge refugee migrations, ethnic violence, and full-scale wars. Think of the Greco-Turkish war of 1921–22 or the partition of India in 1947. In this sense, Israel-Palestine was typical.

At the heart of decolonization ideology is the categorization of all Israelis, historic and present, as “colonists.” This is simply wrong. Most Israelis are descended from people who migrated to the Holy Land from 1881 to 1949. They were not completely new to the region. The Jewish people ruled Judean kingdoms and prayed in the Jerusalem Temple for a thousand years, then were ever present there in smaller numbers for the next 2,000 years. In other words, Jews are indigenous in the Holy Land, and if one believes in the return of exiled people to their homeland, then the return of the Jews is exactly that. Even those who deny this history or regard it as irrelevant to modern times must acknowledge that Israel is now the home and only home of 9 million Israelis who have lived there for four, five, six generations.

Most migrants to, say, the United Kingdom or the United States are regarded as British or American within a lifetime. Politics in both countries is filled with prominent leaders—Suella Braverman and David Lammy, Kamala Harris and Nikki Haley—whose parents or grandparents migrated from India, West Africa, or South America. No one would describe them as “settlers.” Yet Israeli families resident in Israel for a century are designated as “settler-colonists” ripe for murder and mutilation. And contrary to Hamas apologists, the ethnicity of perpetrators or victims never justifies atrocities. They would be atrocious anywhere, committed by anyone with any history. It is dismaying that it is often self-declared “anti-racists” who are now advocating exactly this murder by ethnicity.

Those on the left believe migrants who escape from persecution should be welcomed and allowed to build their lives elsewhere. Almost all of the ancestors of today’s Israelis escaped persecution.

If the “settler-colonist” narrative is not true, it is true that the conflict is the result of the brutal rivalry and battle for land between two ethnic groups, both with rightful claims to live there. As more Jews moved to the region, the Palestinian Arabs, who had lived there for centuries and were the clear majority, felt threatened by these immigrants. The Palestinian claim to the land is not in doubt, nor is the authenticity of their history, nor their legitimate claim to their own state. But initially the Jewish migrants did not aspire to a state, merely to live and farm in the vague “homeland.” In 1918, the Zionist leader Chaim Weizmann met the Hashemite Prince Faisal Bin Hussein to discuss the Jews living under his rule as king of greater Syria. The conflict today was not inevitable. It became so as the communities refused to share and coexist, and then resorted to arms.

Even more preposterous than the “colonizer” label is the “whiteness” trope that is key to the decolonization ideology. Again: simply wrong. Israel has a large community of Ethiopian Jews, and about half of all Israelis—that is, about 5 million people—are Mizrahi, the descendants of Jews from Arab and Persian lands, people of the Middle East. They are neither “settlers” nor “colonialists” nor “white” Europeans at all but inhabitants of Baghdad and Cairo and Beirut for many centuries, even millennia, who were driven out after 1948.

A word about that year, 1948, the year of Israel’s War of Independence and the Palestinian Nakba (“Catastrophe”), which in decolonization discourse amounted to ethnic cleansing. There was indeed intense ethnic violence on both sides when Arab states invaded the territory and, together with Palestinian militias, tried to stop the creation of a Jewish state. They failed; what they ultimately stopped was the creation of a Palestinian state, as intended by the United Nations. The Arab side sought the killing or expulsion of the entire Jewish community—in precisely the murderous ways we saw on October 7. And in the areas the Arab side did capture, such as East Jerusalem, every Jew was expelled.

In this brutal war, Israelis did indeed drive some Palestinians from their homes; others fled the fighting; yet others stayed and are now Israeli Arabs who have the vote in the Israeli democracy. (Some 25 percent of today’s Israelis are Arabs and Druze.) About 700,000 Palestinians lost their homes. That is an enormous figure and a historic tragedy. Starting in 1948, some 900,000 Jews lost their homes in Islamic countries and most of them moved to Israel. These events are not directly comparable, and I don’t mean to propose a competition in tragedy or hierarchy of victimhood. But the past is a lot more complicated than the decolonizers would have you believe.

Out of this imbroglio, one state emerged, Israel, and one did not, Palestine. Its formation is long overdue.

It is bizarre that a small state in the Middle East attracts so much passionate attention in the West that students run through California schools shouting “Free Palestine.” But the Holy Land has an exceptional place in Western history. It is embedded in our cultural consciousness, thanks to the Hebrew and Christian Bibles, the story of Judaism, the foundation of Christianity, the Quran and the creation of Islam, and the Crusades that together have made Westerners feel involved in its destiny. The British Prime Minister David Lloyd George, the real architect of the Balfour Declaration, used to say that the names of places in Palestine “were more familiar to me than those on the Western Front.” This special affinity with the Holy Land initially worked in favor of the Jewish return, but lately it has worked against Israel. Westerners eager to expose the crimes of Euro-American imperialism but unable to offer a remedy have, often without real knowledge of the actual history, coalesced around Israel and Palestine as the world’s most vivid example of imperialist injustice.

The open world of liberal democracies—or the West, as it used to be called—is today polarized by paralyzed politics, petty but vicious cultural feuds about identity and gender, and guilt about historical successes and sins, a guilt that is bizarrely atoned for by showing sympathy for, even attraction to, enemies of our democratic values. In this scenario, Western democracies are always bad actors, hypocritical and neo-imperialist, while foreign autocracies or terror sects such as Hamas are enemies of imperialism and therefore sincere forces for good. In this topsy-turvy scenario, Israel is a living metaphor and penance for the sins of the West. The result is the intense scrutiny of Israel and the way it is judged, using standards rarely attained by any nation at war, including the United States.

But the decolonizing narrative is much worse than a study in double standards; it dehumanizes an entire nation and excuses, even celebrates, the murder of innocent civilians. As these past two weeks have shown, decolonization is now the authorized version of history in many of our schools and supposedly humanitarian institutions, and among artists and intellectuals. It is presented as history, but it is actually a caricature, zombie history with its arsenal of jargon—the sign of a coercive ideology, as Foucault argued—and its authoritarian narrative of villains and victims. And it only stands up in a landscape in which much of the real history is suppressed and in which all Western democracies are bad-faith actors. Although it lacks the sophistication of Marxist dialectic, its self-righteous moral certainty imposes a moral framework on a complex, intractable situation, which some may find consoling. Whenever you read a book or an article and it uses the phrase “settler-colonialist,” you are dealing with ideological polemic, not history.

Ultimately, this zombie narrative is a moral and political cul-de-sac that leads to slaughter and stalemate. That is no surprise, because it is based on sham history: “An invented past can never be used,” wrote James Baldwin. “It cracks and crumbles under the pressures of life like clay.”

Even when the word decolonization does not appear, this ideology is embedded in partisan media coverage of the conflict and suffuses recent condemnations of Israel. The student glee in response to the slaughter at Harvard, the University of Virginia, and other universities; the support for Hamas amongst artists and actors, along with the weaselly equivocations by leaders at some of America’s most famous research institutions, have displayed a shocking lack of morality, humanity, and basic decency…

The Israel-Palestine conflict is desperately difficult to solve, and decolonization rhetoric makes even less likely the negotiated compromise that is the only way out.

Since its founding in 1987, Hamas has used the murder of civilians to spoil any chance of a two-state solution. In 1993, its suicide bombings of Israeli civilians were designed to destroy the two-state Oslo Accords that recognized Israel and Palestine. This month, the Hamas terrorists unleashed their slaughter in part to undermine a peace with Saudi Arabia that would have improved Palestinian politics and standard of life, and reinvigorated Hamas’s sclerotic rival, the Palestinian Authority. In part, they served Iran to prevent the empowering of Saudi Arabia, and their atrocities were of course a spectacular trap to provoke Israeli overreaction. They are most probably getting their wish, but to do this they are cynically exploiting innocent Palestinian people as a sacrifice to political means, a second crime against civilians. In the same way, the decolonization ideology, with its denial of Israel’s right to exist and its people’s right to live safely, makes a Palestinian state less likely if not impossible.

The problem in our countries is easier to fix: Civic society and the shocked majority should now assert themselves. The radical follies of students should not alarm us overmuch; students are always thrilled by revolutionary extremes. But the indecent celebrations in London, Paris, and New York City, and the clear reluctance among leaders at major universities to condemn the killings, have exposed the cost of neglecting this issue and letting “decolonization” colonize our academy.

Parents and students can move to universities that are not led by equivocators and patrolled by deniers and ghouls; donors can withdraw their generosity en masse, and that is starting in the United States. Philanthropists can pull the funding of humanitarian foundations led by people who support war crimes against humanity (against victims selected by race). Audiences can easily decide not to watch films starring actors who ignore the killing of children; studios do not have to hire them. And in our academies, this poisonous ideology, followed by the malignant and foolish but also by the fashionable and well intentioned, has become a default position. It must forfeit its respectability, its lack of authenticity as history. Its moral nullity has been exposed for all to see.

Again, scholars, teachers, and our civil society, and the institutions that fund and regulate universities and charities, need to challenge a toxic, inhumane ideology that has no basis in the real history or present of the Holy Land, and that justifies otherwise rational people to excuse the dismemberment of babies.

Israel has done many harsh and bad things. Netanyahu’s government, the worst ever in Israeli history, as inept as it is immoral, promotes a maximalist ultranationalism that is both unacceptable and unwise. Everyone has the right to protest against Israel’s policies and actions but not to promote terror sects, the killing of civilians, and the spreading of menacing anti-Semitism.

The Palestinians have legitimate grievances and have endured much brutal injustice. But both of their political entities are utterly flawed: the Palestinian Authority, which rules 40 percent of the West Bank, is moribund, corrupt, inept, and generally disdained—and its leaders have been just as abysmal as those of Israel.

Hamas is a diabolical killing sect that hides among civilians, whom it sacrifices on the altar of resistance—as moderate Arab voices have openly stated in recent days, and much more harshly than Hamas’s apologists in the West. “I categorically condemn Hamas’s targeting of civilians,” the Saudi veteran statesman Prince Turki bin Faisal movingly declared last week. “I also condemn Hamas for giving the higher moral ground to an Israeli government that is universally shunned even by half of the Israeli public … I condemn Hamas for sabotaging the attempt of Saudi Arabia to reach a peaceful resolution to the plight of the Palestinian people.” In an interview with Khaled Meshaal, a member of the Hamas politburo, the Arab journalist Rasha Nabil highlighted Hamas’s sacrifice of its own people for its political interests. Meshaal argued that this was just the cost of resistance: “Thirty million Russians died to defeat Germany,” he said.

Nabil stands as an example to Western journalists who scarcely dare challenge Hamas and its massacres. Nothing is more patronizing and even Orientalist than the romanticization of Hamas’s butchers, whom many Arabs despise. The denial of their atrocities by so many in the West is an attempt to fashion acceptable heroes out of an organization that dismembers babies and defiles the bodies of murdered girls. This is an attempt to save Hamas from itself. Perhaps the West’s Hamas apologists should listen to moderate Arab voices instead of a fundamentalist terror sect.

Hamas’s atrocities place it, like the Islamic State and al-Qaeda, as an abomination beyond tolerance. Israel, like any state, has the right to defend itself, but it must do so with great care and minimal civilian loss, and it will be hard even with a full military incursion to destroy Hamas. Meanwhile, Israel must curb its injustices in the West Bank—or risk destroying itself—because ultimately it must negotiate with moderate Palestinians.

So the war unfolds tragically. As I write this, the pounding of Gaza is killing Palestinian children every day, and that is unbearable. As Israel still grieves its losses and buries its children, we deplore the killing of Israeli civilians just as we deplore the killing of Palestinian civilians. We reject Hamas, evil and unfit to govern, but we do not mistake Hamas for the Palestinian people, whose losses we mourn as we mourn the death of all innocents.

In the wider span of history, sometimes terrible events can shake fortified positions: Anwar Sadat and Menachem Begin made peace after the Yom Kippur War; Yitzhak Rabin and Yasser Arafat made peace after the Intifada. The diabolical crimes of October 7 will never be forgotten, but perhaps, in the years to come, after the scattering of Hamas, after Netanyahuism is just a catastrophic memory, Israelis and Palestinians will draw the borders of their states, tempered by 75 years of killing and stunned by one weekend’s Hamas butchery, into mutual recognition. There is no other way.

Simon Sebag Montefiore is the author of Jerusalem: The Biography and most recently The World: A Family History of Humanity.

Chris Tomlinson is an award-winning columnist for the Houston Chronicle. In this column, he describes the damage that extremists are doing to our country.

The true story was almost custom-made for Hollywood.

A powerful man convinces his nephew to marry into a wealthy family, and the two conspire to kill four of the bride’s relatives to inherit their fortune. Then, they begin slowly poisoning the loving, unsuspecting wife. Will anyone catch on and stop the villains before they complete their nefarious plot?

A good yarn, but like too many products these days, it’s gotten caught in the culture wars because the killers were white supremacists, and their victims were members of the Osage tribe.

The National Review says the only thing Martin Scorsese’s “Killers of the Flower Moon” has going for it “is the woke idea that America’s white men are spiritually sick.”

I think the only thing conservative media has going for it is racial grievance, convincing older white people of their imminent demise if they don’t elect white supremacists. Bankrupted of ideas, these billionaire-financed outlets have become nothing more than outrage factories.

Conservative media wants to make every aspect of American life political. Want a vaccination? Woke! Don’t want to pay taxes? Righteous! Want renewable energy? Woke! Want a military-style rifle? Righteous! Electric vehicles? Woke!

Why? Because powerful people with financial interests vulnerable to human progress want voters to elect backward politicians who will protect their profits.

So, partisans work to make every consumer purchase a political talisman.

Come back to the office with a Chick-fil-A bag, and some of your coworkers may suspect you oppose LGBTQ rights. But conservatives will give you a side-eye too, because they’ve launched a Chick-fil-A boycott over the company employing an executive overseeing diversity, equity, and inclusion policies.

Conservatives and progressives have extensive lists of companies and individuals that have perpetrated some egregious act. The perceived misdemeanors are as wide-ranging as they are sometimes absurd. But propagandists know calling out a well-known brand, especially on Twitter, now called X, is a surefire way to grab attention…

I saw “Killers of the Flower Moon” last weekend, and Scorsese made another fine film about what Hannah Arendt might call the banality of evil. Almost all his films have been about America’s spiritually sick white men; this one is no different.

I agree with critics who say Scorsese spends too much time with the white guys and not enough with the Osage, whose people were murdered. Woke this film is not.

The director’s biggest mistake was making a historical drama revealing how white people did terrible things to people of color. Teaching history has also become a political act.

Anyone who deviates from the white supremacist narratives established between 1875 and 1955 should brace for conservative condemnation, no matter how many endnotes they include. Conservatives want to ignore how many of our ancestors sweetly depicted in sepia-toned photographs committed crimes against humanity. Talking about it gets you labeled woke or worse.

The Texas Legislature has made it a crime to teach American history that might make children uncomfortable. By that measure, no teacher can screen “Killers of the Flower Moon” without fear of prosecution.

The systematic murder of the Osage took place in Oklahoma, but Texas also has a long history of atrocities. The Texas Rangers are celebrating their bicentennial, but few are talking about how troopers massacred Mexican Americans or ethnically cleansed Native Americans in shocking numbers and violence.

The Republican majority has also made it illegal to explain how slavery was the original sin of the U.S. Constitution. Teachers must say slavery was a deviation from American values, even though Southerners forced Thomas Jefferson to cut a proposed part of the Declaration of Independence that called for abolition. The Constitution ordered that enslaved people only count as three-fifths of a human.

Pretty originalist to me.

These days, politicians rely on grievance and fear rather than ideas and hope. But politicizing everything only divides us, and ignoring our history condemns us to repeat it; look at the resurrection of fascism.

Patriots don’t hate their fellow citizens; they learn from the past and compromise for a more perfect Union.

This story is fascinating. It’s about the quest to understand the origins of a painting of three white children that originally included a young slave. At some point, the enslaved youth was painted over and eliminated.

One determined art collector enlisted the help of art historians to identify the white children and the enslaved youth. The young man was named Bélizaire. The painting was held for decades in storage in a New Orleans art museum. It was just another family painting: three children. If you can open the video, please do (I don’t know if it is behind a paywall).

Once the original painting was restored and its history documented, it was purchased by the Metropolitan Museum of Art, where it is prominently displayed.

“How a Rare Portrait of an Enslaved Child Arrived at the Met” is a 10-minute film that touches on themes of race, art, and history.

For many years, a 19th-century painting of three white children in a Louisiana landscape held a secret. Beneath a layer of overpaint meant to look like the sky: the figure of an enslaved youth. But a 2005 restoration revealed him, and now the painting has a new, very prominent home at the Metropolitan Museum of Art. Who was the enslaved child? Who covered over his figure? Why did the painting languish for decades in attics and a museum basement?

The Times further describes the journey of the painting:

To learn more, read “‘His Name Was Bélizaire’: Rare Portrait of Enslaved Child Arrives at the Met.” Alexandra Eaton writes:

One reason “Bélizaire and the Frey Children” has drawn attention is the naturalistic depiction of Bélizaire, the young man of African descent who occupies the highest position in the painting, leaning against a tree just behind the Frey children. Although he remains separated from the white children, Amans painted him in a powerful stance, with blushing cheeks, and a kind of interiority that is unusual for the time.

Since the Black Lives Matter movement, the Met and other museums have responded to calls to reckon with the presentation of Black figures. When the European Galleries reopened in 2020, the museum included wall texts to highlight the presence of African people in Europe and to call attention to issues of racism, previously unmentioned. In the American Wing, which had presented “a romanticized history of American art,” Kornhauser said, a presidential portrait was recast with the consciousness of the present: John Trumbull’s 1780 portrait of George Washington and his enslaved servant William Lee identified only the former president until 2020, when Lee’s name was added to the title. However, unlike Bélizaire, Lee is depicted at the margins, lacking in any emotion or humanity.

Jeremy K. Simien, an art collector from Baton Rouge, spent years trying to find “Bélizaire” after seeing an image of it online in 2013, following its restoration, that featured all four figures. Intrigued, he kept searching, only to find an earlier image from 2005, after the painting had been de-accessioned by the New Orleans Museum of Art and was listed for auction by Christie’s. It was the same painting, but the Black child was missing. He had been painted out.

“The fact that he was covered up haunted me,” Simien said in an interview.

One of my grandsons sent me an article about the national rush to mandate “the science of reading,” and it caused me to explain briefly (without boring him) the background of the latest panacea.

I didn’t tell him the history of the “reading wars,” which I researched and wrote about in Left Back (2000). I didn’t tell him that reading instruction has swung back and forth between the phonetic method and the “whole word” method since the introduction of public schooling in the first quarter of the 19th century. Horace Mann opposed phonics. But the popular McGuffey readers of that century were phonetic. In 1930, the Dick-and-Jane readers were introduced, and they swept the country. Unlike the McGuffey readers, they featured pictures of children (white and suburban), they used simple words that could be easily recognized, and they were bright and colorful. By the 1950s, Dick-and-Jane-style readers were used in about 80% of American schools. They relied on the whole word method, also known as look-say.

In 1955, this national consensus was disrupted by the publication of Rudolf Flesch’s wildly popular book, Why Johnny Can’t Read, which castigated the look-say method and urged a revival of phonics. The fervor for phonics then is similar to the fervor now.

But the debate about which method was best quickly became politicized. “Bring back phonics” was the battle cry of very conservative groups, who lambasted the whole-word method as the conspiratorial work of liberal elites. Phonics thus was unfairly tarnished as a right-wing cause.

The definitive book about the teaching of reading was written in 1967 by Harvard literacy expert Jeanne Chall: Learning to Read: The Great Debate. Chall wrote about the importance of phonics as part of beginning reading instruction, followed up by wonderful children’s literature. She warned against going to extremes, a warning that has been ignored with every pendulum swing.

The 1980s began the dominance of whole language, which brought back whole-word sight reading and de-emphasized phonics. Textbook companies boasted that their programs were whole language. Literacy conferences focused on whole language. Phonics was out. Many reading teachers held on to their phonics books, even though phonics was out of style.

There is always a crisis in reading, so in the late 1990s, the pendulum began to move again. As it happened, a very influential supporter of phonics held a key position at the National Institutes of Health. Dr. Reid Lyon was director of the NIH’s National Institute of Child Health and Human Development. His field of expertise was learning disabilities.

From Wikipedia:

From 1992 to 2005, Lyon served as a research neuropsychologist and the chief of the Child Development and Behavior Branch of the NICHD at the National Institutes of Health; in this role he developed and oversaw research programs in cognitive neuroscience, learning and reading development and disorders, behavioral pediatrics, cognitive and affective development, School Readiness, and the Spanish to English Reading Research program. He designed, developed and directed the 44-site NICHD Reading Research Network.

Lyon selected the members of the National Reading Panel. Like him, most were experimental researchers in higher education. Only one—Joanne Yatvin—was experienced as an elementary school teacher and principal. She wrote a “minority view” dissenting from the report, and she worried that the report would be misused.

President George W. Bush signed No Child Left Behind into law on January 8, 2002. This law was the single largest intrusion of the federal government into education in American history. Before NCLB, education was a state responsibility. With its passage, the federal government established mandates that schools had to obey.

One of the components of this law was the Reading First program. RF was based on the report of the National Reading Panel, which emphasized the importance of phonemic awareness, phonics, decoding, and fluency.

The Reading First program allocated $6 billion over six years to encourage districts to adopt the “science of reading,” as established by the National Reading Panel.

There were two reasons that the program ended.

First, there were financial scandals. (Google “Reading First Program Scandals.”) The New York Times reported here about conflicts of interest and steering of contracts to favored textbook publishers. “In a searing report that concludes the first in a series of investigations into complaints of political favoritism in the reading initiative, known as Reading First, the report said officials improperly selected the members of review panels that awarded large grants to states, often failing to detect conflicts of interest. The money was used to buy reading textbooks and curriculum for public schools nationwide.”

Second, the final evaluation of the program found that it taught what it aimed to teach, but there was no improvement in students’ comprehension.

Here is the summary of the final evaluation:

The findings presented in this report are generally consistent with findings presented in the study’s Interim Report, which found statistically significant impacts on instructional time spent on the five essential components of reading instruction promoted by the program (phonemic awareness, phonics, vocabulary, fluency, and comprehension) in grades one and two, and which found no statistically significant impact on reading comprehension as measured by the SAT 10. In addition to data on the instructional and student achievement outcomes reported in the Interim Report, the final report also presents findings based upon information obtained during the study’s third year of data collection: data from a measure of first grade students’ decoding skill, and data from self-reported surveys of educational personnel in study schools.

Analyses of the impact of Reading First on aspects of program implementation, as reported by teachers and reading coaches, revealed that the program had statistically significant impacts on several domains. The information obtained from the Test of Silent Word Reading Fluency indicates that Reading First had a positive and statistically significant impact on first grade students’ decoding skill.

The final report also explored a number of hypotheses to explain the pattern of observed impacts. Analyses that explored the association between the length of implementation of Reading First in the study schools and reading comprehension scores, as well as between the number of years students had been exposed to Reading First instruction and reading comprehension scores were inconclusive. No statistically significant variation across sites in the pattern of impacts was found. Correlational analyses suggest that there is a positive association between time spent on the five essential components of reading instruction promoted by the program and reading comprehension measured by the SAT 10, but these findings appear to be sensitive to model specification and the sample used to estimate the relationship.

The study finds, on average, that after several years of funding the Reading First program, it has a consistent positive effect on reading instruction yet no statistically significant impact on student reading comprehension. Findings based on exploratory analyses do not provide consistent or systematic insight into the pattern of observed impacts.

After the disgrace of the Reading First program, support for phonics dissipated. But in the past few years, journalists (led by Emily Hanford) have trumpeted the idea that the report of the National Reading Panel established the “science of reading.” New York Times columnist Nicholas Kristof wrote about the “Mississippi Miracle,” claiming that the “science of reading” had lifted fourth grade reading scores, and no new spending was needed in a very poorly resourced state. Kristof did not explain why the SOR did not cause a rise in eighth grade scores in Mississippi, nor did he understand that retaining low-scoring third graders raises the percentage of fourth graders who get high test scores. State after state is now mandating the “science of reading.”

And so the cycle begins again.

As a young person and a Jew, I swore I would never visit Germany. Growing up in Houston in the late 1940s and early 1950s, I occasionally met people who had a blue number tattooed on their arm, a legacy of their time in a Nazi concentration camp. I learned about the Holocaust at religious school, not public school. With my knowledge of the Holocaust, I was determined to avoid the nation that sought to eliminate the Jews of Europe. I was fortunate that my father’s parents came to America from Poland in the 19th century, and my mother arrived from Bessarabia after World War I. Every member of their families who remained in Europe was slaughtered. Not one survived.

In 1984, I received an invitation from the State Department to visit West Germany and Yugoslavia to speak about education. I decided to go. It was a fascinating trip, and I overcame my phobia about visiting Germany.

Years later, after the Wall had come down, I went to Germany as a tourist with my partner and our Brooklyn neighbors. The wife, an emergency room nurse, was born in Germany, and is one of the kindest people I know. For the first time, I saw Germany as a vibrant and thriving nation. I visited the Holocaust Museum in Berlin and saw the honesty with which Germany was confronting its past. Every town we visited had its memorials to those who had perished because of Hitler’s genocide.

A few days ago, I was again in Berlin. Frankly, I fell in love with Berlin. The German people acknowledge the horrors of their past. They don’t sugar coat it. Their contrition is impossible to ignore. There are memorials scattered across the city to those who were unjustly murdered—Jews, Roma, homosexuals, and others.

Right near our hotel was a field of 2,711 stelae of different sizes that looked like coffins. We stopped to view the site where Hitler’s bunker once existed. It’s now just blank ground with a large marker explaining what it was. It was where Hitler and Eva Braun married, knowing all was lost. She killed herself. Hitler killed himself. When the Soviets entered Berlin, they totally destroyed the bunker.

Several readers corrected my statement that Hermann Göring and his wife and children died in the bunker. They are right. It was Joseph Goebbels and his family who committed suicide in the bunker. Göring committed suicide in Nuremberg the night before he was to be executed by hanging.

As the war drew to a close and Nazi Germany faced defeat, Magda Goebbels and the Goebbels children joined Hitler in Berlin. They moved into the underground Vorbunker, part of Hitler’s underground bunker complex, on 22 April 1945. Hitler committed suicide on 30 April. In accordance with Hitler’s will, Goebbels succeeded him as Chancellor of Germany; he served one day in this post. The following day, Goebbels and his wife committed suicide, after having poisoned their six children with a cyanide compound. (Wikipedia)

On our last day in Berlin, we intended to go to the museum of the Stasi, the secret police that monitored every East German’s life. But we decided instead to visit the memorial center of the German resistance.

The museum tells the story of Germans who opposed the rise of Hitler in the 1930s, who worked against him during the war years, who anticipated that he would destroy Germany’s struggling democracy, and who worked to end his brutal tyranny. There were stories of opposition to Hitler by trade unionists and Communists, by Jews and Catholics and Protestants. The museum identified religious leaders, scholars, scientists, educators, students, social workers, and others who worked against Hitler. Most were killed. It went into great detail about the failed assassination attempt by leading German officers on July 20, 1944. All of them were murdered.

My partner, a former teacher of history and social studies, wondered why Holocaust studies in the schools do not tell their stories. In some sick way, the constant focus on bodies and atrocities was not having its intended effect; it was desensitizing the students to cruelty and inhumanity.

Of course, the brutality must be shown and remembered. But why not make resistance to evil the centerpiece? Why not focus on courage and heroism in the face of overwhelming force? Why not tell the story of Georg Elser, the German carpenter who tried to assassinate Hitler in 1939? Or the story of the White Rose, the college students who bravely distributed flyers about Nazi atrocities in 1942-43, who were captured and executed? They should be celebrated for their courage and conviction.

Meanwhile, back home, our own nation is convulsed by battles about teaching the past. Some insist on whitewashing history because the truth might make young people “uncomfortable.” We see the rising influence of groups like “Moms for Liberty,” who demand censorship and oppose honest teaching of the past and the present. They have a right to speak, but they should not have the right to impose their bigotry and intolerance on others. Moms for Liberty should learn from Germany about the importance of teaching truth.

If you visit Berlin, don’t miss this tribute to the resistance.

Jan Resseger writes brilliantly about the importance of education in a democracy. She reads widely in the work of authors who understand why education should not be privatized and turned into a consumer good. You will enjoy reading this essay.

She writes:

I find myself struggling these days to understand how those of us who prize our U.S. system of public education seem to have lost the narrative. As I listen to the rhetoric of today’s critics of public schooling—people who distrust or disdain the work of school teachers and who believe test scores are the only way to understand education, I worry about the seeming collapse of the values I grew up with as a child in a small Montana town whose citizens paid so much attention to the experiences its public schools offered for the community’s children. The schools in my hometown provided a solid core curriculum plus a strong school music program, ambitious high school drama and speech and debate programs, athletics, a school newspaper, and an American Field Service international student every single year at the high school. While many of us continue to support our public schools, what are the factors that have caused so many to abandon their confidence in public education?

It is in this context that I found myself reading “Education and the Challenges for Democracy,” the introductory essay in the current issue of Education Policy Analysis Archives. In his essay, Fernando M. Reimers, a professor in the graduate school of education at Harvard University, explores the interconnection of public education and democracy itself. Reimers explains, for example, that the expansion of our democracy to include more fully those who have previously been marginalized is likely to impact the public schools in many ways and that these changes in the schools will inspire their own political response:

“(T)he expansion of political rights to groups of the population previously denied rights (e.g. women, members of racial or religious minorities) may lead to increased access for these groups to educational institutions and a curriculum that prepares them for political participation. These changes, in turn, feed back into the political process, fostering increased demands for participation and new forms of representation as a result of the new skills and dispositions these groups gained by educational and political changes. But these increases in representation may activate political backlash from groups who seek to preserve the status quo. These forces may translate into efforts to constrain the manner in which schools prepare new groups for political participation. In this way, the relationship between democratic politics and democratic education is never static, but in perpetual, dynamic, dialectical motion that leads to new structures and processes. The acknowledgement of this relationship as one that requires resolution of tensions and contradictions, of course, does not imply an inevitable cycle of continuous democratic improvement, as there can be setbacks—both in democracy itself, and in education for democracy.”

Reimers continues: “Democracy—a social contract intended to balance freedom and justice—is not only fluid and imperfect but fragile. This fragility has become evident in recent years… In order to challenge the forces undermining democracy, schools and universities need to recognize these challenges and their systemic impact and reimagine what they must do to prepare students to address them.” While Reimers explains that the goal of his article is not only “to examine how democratic setbacks can lead to setbacks in democratic education, but also how education can resist those challenges to democracy,” he presents no easy solutions. He does, however, sort out the issues to which we should all be paying attention—naming five specific challenges for American democracy:

“The five traditional challenges to democracy are corruption, inequality, intolerance, polarization, and populism… The democratic social contract establishes that all persons are fundamentally equal, and therefore have the same right to participate in the political process and demand accountability. Democracy is challenged when those elected to govern abuse the public trust through corruption, or capturing public resources to advance private ends… Democracy is also challenged by social and economic inequality and by the political inequality they may engender… One result of political intolerance is political polarization… Political intolerance is augmented by Populism, an ideology which challenges the idea that the interests of ordinary people can be represented by political elites.” (emphasis in the original)

Reimers considers how these threats to democracy endanger our public schools: “The first order of effects of these forces undermining democracy is to constrain the ability of education institutions to educate for democracy. But a second order of effects results from the conflicts and tensions generated by these forces….” As the need for schools and educators to prepare students for democratic citizenship becomes ever more essential, political backlash may threaten schools’ capacity to help students challenge the threats to democracy.

In their 2017 book, These Schools Belong to You and Me, Deborah Meier and Emily Gasoi articulate in concrete terms what Reimers explains abstractly as one of the imperatives that public schools must accomplish today: “(W)e need a means of ensuring that we educate all future citizens, not only to be well versed in the three Rs, and other traditional school subjects, but also to be able to see from multiple perspectives and to be intellectually curious and incisive enough to see through and resist the lure of con artists and autocrats, whether in the voting booth, the marketplace, or in their social dealings.” (These Schools Belong to You and Me, p. 25) Schools imagined as preparing critical thinkers—schools that focus on more than basic drilling in language arts and math—are necessary to combat two of the threats Reimers lists: corruption and populism.

But what about Reimers’ other threats? How can schools, in our current polarized climate, push back against intolerance, inequality, and polarization? Isn’t today’s attack on “diversity, equity and inclusion” in some sense an expression of a widespread desire to give up on our principle of equality of opportunity—to merely accept segregation, inequality and exclusion? This is the old, old struggle Derek Black traces in Schoolhouse Burning—the effort during Reconstruction to develop state constitutions that protect the right to education for all children including the children of slaves—followed by Jim Crow segregation—followed by the Civil Rights Movement and Brown v. Board of Education—followed by myriad efforts since then to keep on segregating schools. Isn’t the attempt to discredit critical race theory really the old fight about whose cultures should be affirmed or hidden at school, and isn’t this fight reminiscent of the struggle to eliminate the American Indian boarding schools whose purpose was extinguishing American Indian children’s languages and cultures altogether? Isn’t the battle over inclusion the same conflict that excluded disabled children from public school services until Congress passed the Individuals with Disabilities Education Act in 1975? And what about the battle that ended in 1982, when, in Plyler v. Doe, the U.S. Supreme Court protected the right to a free, K-12 public education for children of undocumented immigrants? Our society has continued to struggle to accept the responsibility for protecting the right to equal opportunity. As Reimers explains, action to address inequality has inevitably spawned a reaction.

Educators and political philosophers, however, have persistently reminded us of our obligation to make real the promise of public schooling. In 1899, our most prominent philosopher of education, John Dewey, declared: “What the best and wisest parent wants for his own child, that must the community want for all of its children… Only by being true to the full growth of all the individuals who make it up, can society by any chance be true to itself.” (The School and Society, p. 1)

In 1992, political theorist Benjamin Barber advocated for the very kind of public schooling Reimers would like to see today: “(T)he true democratic premise encompasses… the acquired virtues and skills necessary to living freely, living democratically, and living well. It assumes that every human being, given half a chance, is capable of the self-government that is his or her natural right, and thus capable of acquiring the judgment, foresight, and knowledge that self-government demands.… The fundamental assumption of democratic life is not that we are all automatically capable of living both freely and responsibly, but that we are all potentially susceptible to education for freedom and responsibility. Democracy is less the enabler of education than education is the enabler of democracy.” (An Aristocracy of Everyone, pp. 13-14)

In a 1998 essay, Barber declared: “America is not a private club defined by one group’s historical hegemony. Consequently, multicultural education is not discretionary; it defines demographic and pedagogical necessity. If we want youngsters from Los Angeles whose families speak more than 160 languages to be ‘Americans,’ we must first acknowledge their diversity and honor their distinctiveness. English will thrive as the first language in America only when those for whom it is a second language feel safe enough in their own language and culture to venture into and participate in the dominant culture. For what we share in common is not some singular ethnic or religious or racial unity but precisely our respect for our differences: that is the secret to our strength as a nation, and is the key to democratic education.” (“Education for Democracy,” in A Passion for Democracy: American Essays, p. 231)

These same principles are prophetically restated by William Ayers in his final essay in the 2022 book, Public Education: Defending a Cornerstone of American Democracy: “In a free society education must focus on the production—not of things, but—of free people capable of developing minds of their own even as they recognize the importance of learning to live with others. It’s based, then, on a common faith in the incalculable value of every human being, constructed on the principle that the fullest development of all is the condition for the full development of each, and conversely, that the fullest development of each is the condition for the full development of all… Schools don’t exist outside of history or culture: they are, rather, at the heart of each. Schools serve societies; societies shape schools. Schools, then, are both mirror and window—they tell us who we are and who we want to become, and they show us what we value and what we ignore, what is precious and what is venal.” (Public Education: Defending a Cornerstone of American Democracy, p. 315)

Please open the link to complete the reading.

Nancy Bailey criticizes the ongoing campaign to raise academic expectations and academic pressure on children in kindergarten. She traces the origins of this misguided effort to the Reagan-era publication “A Nation at Risk” in 1983.

Although the gloomy claims of that influential document have been repeatedly challenged, even debunked*, it continues to control educational discourse with its assertion that American schools are failing. “A Nation at Risk” led to increased testing, to the passage of George W. Bush’s No Child Left Behind in 2002, to the creation of Barack Obama’s Race to the Top in 2009, to the release of the Common Core standards in 2010.

Despite nearly a quarter century of focus on standards and testing, policymakers refuse to admit that these policies have failed.

And nowhere have they been more destructive than in the early grades, where testing has replaced play. Kindergarten became the new first grade.

But, says Bailey, the current Secretary of Education wants to ratchet up the pressure on little kids.

She writes:

In What Happened to Recess and Why Are Our Children Struggling in Kindergarten?, Susan Ohanian writes about a kindergartner in a New York Times article who tells the reporter they would like to sit on the grass and look for ladybugs. Ohanian writes that the child’s school was built very deliberately without a playground: “Lollygagging over ladybugs is not permitted for children being trained for the global economy” (2002, p. 2).

America recently marked forty years since the Reagan administration’s A Nation at Risk: The Imperative for Educational Reform, which claimed that our schools were being eroded by “a rising tide of mediocrity that threatens our very future as a Nation and a people.”

Berliner and Biddle dispute this in The Manufactured Crisis: Myths, Fraud, and the Attack on America’s Public Schools. They state that most of these claims were said to reflect “evidence,” although the “evidence” in question either was not presented or appeared in the form of simplistic, misleading generalizations (1995, p. 3).

Still, the report’s premise, that public schools failed, leading us down the workforce path of doom, continues to be perpetuated. When students fail tests, teachers and public schools are blamed, yet few care to examine the obscene expectations placed on the backs of children since A Nation at Risk.

Education Secretary Cardona recently went on a bus tour with the message to Raise the Bar in schools. Raising the bar is defined as setting a high standard, raising expectations, and setting higher goals.

He announced a new U.S. Department of Education program, Kindergarten Sturdy Bridge Learning Community.

This is through New America, whose funders include the Bill and Melinda Gates Foundation, the Waltons, and others who want to privatize public education. Here’s the video, Kindergarten as a “Sturdy Bridge”: Place-Based Investments, describing the plan focusing on PreK to 3rd grade. This involves Reading by 3rd and the Campaign for Grade Level Reading.

Cardona says in the announcement:

Getting kindergarten right has to be top of mind for all of us, because what happens there sets the stage for how a child learns and develops well into their elementary years and beyond. 

Ensuring that kindergarten is a sturdy bridge between the early years and early grades is central to our efforts both to Raise the Bar for academic excellence and to provide all students with a more equitable foundation for educational success. The kindergarten year presents an opportunity to meet the strengths and needs of young learners so they can continue to flourish in the years to come.

Raise the bar? Kindergarten is already the new first grade. What will it be now? Second? Third? Fourth? What’s the rush? How is this developmentally sound? One thing is for sure: there will still be no idle time for children to search for ladybugs.

Few bear the brunt of A Nation at Risk as much as early learners, whose schools have been invaded by corporate schemes to force reading and advanced learning earlier than ever expected in the past.

If kindergartners aren’t doing well after all these years of toughness, higher expectations, and an excruciating number of assessments, wouldn’t it seem time to back off, instead of raising the bar higher?

Editor’s note:

*James Harvey and I will discuss the distortions contained in the “Nation at Risk” report at the Network for Public Education conference on Oct. 28-29 in Washington, D.C. James Harvey was a high-level member of the staff that wrote the report. He has written about how the Reagan-era Commission on Excellence in Education “cooked the books” to paint a bleak—but false—picture of American public schools. Please register and join us!

Thom Hartmann writes here about how George W. Bush and Dick Cheney cynically used the attacks of 9/11 to get us into America’s longest war. They wanted to go to war. I can’t help but think that if 537 votes in Florida had gone a different way, the world would be a different place today. It was those 537 votes that made Bush the President, not Al Gore. Remember that: Every vote counts.

Hartmann writes:

America has been lied into too many wars. It’s cost us too much in money, credibility, and blood. We must remember the lies, and tell our children about them so that memory isn’t lost…

Today is 9/11, the anniversary of the event that first brought America together and then was cynically exploited by George W. Bush and Dick Cheney to launch a war against Iraq, preceded by their illegal invasion of Afghanistan just a bit more than a year earlier.

Yet the media today (so far, anyway) is curiously silent about Bush and Cheney’s lies.

Given the costs of both these wars — and the current possibility of our being drawn deeper into conflict in both Ukraine and Taiwan — it’s an important moment to discuss our history of wars, both illegal and unnecessary, and those that are arguably essential to the survival of democracy in the world.

To be clear, I support US involvement — and even an expanded US involvement — in the defense of the Ukrainian democracy against Putin’s Hitler-grabs-Poland-like attack and mass slaughter of Ukrainian civilians. Had the world mobilized to stop Hitler when he invaded Poland in 1939 there almost certainly wouldn’t have been either the Holocaust or WWII, which is why Europe is so united in this effort.

If Putin succeeds in taking Ukraine, his administration has already suggested that both Poland and Moldova are next, with the Baltic states (Latvia, Lithuania, Estonia) also on the menu. That would almost certainly lead to war in Europe.

And China is watching: a Putin victory in Ukraine will encourage Xi to try to take Taiwan. Between the two — war in both Europe and the Pacific — we could find ourselves in the middle of World War III if Putin isn’t stopped now.

That said, essentially defensive military involvements like those in Ukraine or World War II have been the exception rather than the rule in American history. We’ve been far more likely to have presidents lie us into wars for their own personal and political gain than to defend ourselves or other democracies.

For example, after 9/11 in 2001 the Taliban that then ran Afghanistan offered to arrest Bin Laden, but Bush turned them down because he wanted to be a “wartime president” to have a “successful presidency.”

The Washington Post headline weeks after 9/11 put it succinctly: “Bush Rejects Taliban Offer On Bin Laden.” With that decision not to arrest and try Bin Laden for his crime but instead to go to war, George W. Bush set the US and Afghanistan on a direct path to disaster (but simultaneously set himself up for re-election in 2004 as a “wartime president”).

To further complicate things for Bush and Cheney, the 9/11 attacks were not planned, hatched, developed, practiced, expanded, worked out, or otherwise devised in Afghanistan or by even one single citizen of Afghanistan.

That country and its leadership in 2001, in fact, had nothing whatsoever to do with 9/11, as I detailed in depth here on August 15th of last year. The actual planning and management of the operation was done out of Pakistan and Germany, mostly by Khalid Sheikh Mohammed.

The Taliban were bad guys, trashing the rights of women and running a tinpot dictatorship, but they represented no threat whatsoever to America or our allies.

Almost two decades later, though, then-President Trump and Mike Pompeo gave the Taliban everything they wanted — power, legitimacy, shutting down 9 of the 10 US air bases in that country to screw incoming President Joe Biden, and the release of 5000 of Afghanistan’s worst Taliban war criminals — all over the strong objections of the democratically elected Afghan government in 2019.

Trump did this so he could falsely claim, heading into the 2020 election, that he’d “negotiated peace” in Afghanistan, when in fact he’d set up the debacle that happened around President Biden’s withdrawal from that country.

“The relationship I have with the Mullah is very good,” Trump proclaimed, after ordering the mullah who then named himself President of Afghanistan freed from prison over the furious objections of Afghanistan’s government, which Trump had cut out of the negotiations.

Following that betrayal of both Afghanistan and America, Trump and the GOP scrubbed the record of their embrace of the Taliban from their websites, as noted here and here.

And the conservative Boris Johnson administration in the UK came right out and said that Trump’s “rushed” deal with the Taliban — without involvement of the Afghan government or the international community — set up the difficulties Biden faced.

“The die was cast,” Defense Minister Ben Wallace told the BBC, “when the deal was done by Donald Trump, if you want my observation.”

So, Republican George W. Bush lied us into both the Afghanistan and Iraq wars, and then Donald Trump tried to lie us out of at least one of them.

But this was far from the first time a president has lied us into a war.

— Vietnam wasn’t the first time an American president and his buddies in the media lied us into a war: Defense Secretary Robert McNamara falsely claimed that an American warship had come under attack in the Gulf of Tonkin, and LBJ went along with the lie.

— Neither was President William McKinley’s lying us into the Spanish-American War in 1898 by falsely claiming that the USS Maine had been blown up in Havana harbor (it caught fire all by itself).

— The first time we were lied into a major war by a president was probably the Mexican-American War of 1846, when President James Polk lied that we’d been invaded by Mexico. Even Abraham Lincoln, then a congressman from Illinois, called him out on that lie.

— You could also argue that when President Andrew Jackson signed the Indian Removal Act in 1830, leading to the Trail of Tears slaughter and forced relocation of the Cherokee under President Van Buren (among other atrocities), it was all based on a series of lies.

Bush’s lies that took us into Afghanistan and, a bit over a year later into Iraq, are particularly egregious, however, given his and Cheney’s reasons for those lies.

In 1999, when George W. Bush decided he was going to run for president in the 2000 election, his family hired Mickey Herskowitz to write the first draft of Bush’s autobiography, A Charge To Keep.

Although Bush had gone AWOL for about a year during the Vietnam War and was thus apparently no fan of combat, he’d concluded (from watching his father’s “little 3-day war” with Iraq) that being a “wartime president” was the most consistently surefire way to get reelected (if you did it right) and have a two-term presidency.

“I’ll tell you, he was thinking about invading Iraq in 1999,” Herskowitz told reporter Russ Baker in 2004.

“One of the things [Bush] said to me,” Herskowitz said, “is: ‘One of the keys to being seen as a great leader is to be seen as a commander-in-chief. My father had all this political capital built up when he drove the Iraqis out of (Kuwait) and he wasted it.

“[Bush] said, ‘If I have a chance to invade Iraq, if I had that much capital, I’m not going to waste it. I’m going to get everything passed I want to get passed and I’m going to have a successful presidency.’”

The attack on 9/11 gave Bush his first chance to “be seen as a commander-in-chief” when our guy Osama Bin Laden, whom the Reagan/Bush administration had spent $3 billion building up in Afghanistan, engineered an attack on New York and DC.

The crime was planned in Germany and Florida and on 9/11 Bin Laden was, according to CBS News, not even in Afghanistan:

“CBS Evening News has been told that the night before the Sept. 11 terrorists attack, Osama bin Laden was in Pakistan. He was getting medical treatment with the support of the very military that days later pledged its backing for the U.S. war on terror in Afghanistan.”

When the Obama administration finally caught and killed Bin Laden, he was back in Pakistan, the home base for the Taliban.

But attacking our ally Pakistan in 2001 would have been impossible for Bush, and, besides, nearby Afghanistan was an easier target, being at that time the second-poorest country in the world with an average per-capita income of $700 a year. Bin Laden had run terrorist training camps there — unrelated to 9/11 — but they made a fine excuse for Bush’s first chance to “be seen as a commander-in-chief” and get some leadership cred.

Cheney, meanwhile, was in a world of trouble because of a huge bet he’d made as CEO of Halliburton in 1998. Dresser Industries was big into asbestos and about to fall into bankruptcy because of asbestos lawsuits that the company was fighting through the court system.

Cheney bet Dresser would ultimately win the suits and had Halliburton buy the company on the cheap, but a year later, in 1999, Dresser got turned down by the courts and Halliburton’s stock went into freefall, crashing 68 percent in a matter of months.

Bush had asked Cheney — who’d worked in his father’s White House as Secretary of Defense — to help him find a suitable candidate for VP.

Cheney, as his company was collapsing, recommended himself for the job. In July of 2000, Cheney walked away with $30 million from the troubled company, and the year after that, while he was VP, Halliburton subsidiary KBR received one of the first no-bid, no-ceiling (no accountability and no limit on how much they could receive) multi-billion-dollar military contracts.

Bush and Cheney both had good reason to want to invade Afghanistan in October 2001. Bush was seen as an illegitimate president at the time because his father’s corrupt appointee on the Supreme Court, Clarence Thomas, had cast the deciding vote in the Bush v. Gore lawsuit that made him president; a war would give him legitimacy and the aura of leadership.

Cheney’s company was in a crisis, and Afghanistan War no-bid contracts helped turn Halliburton from the edge of bankruptcy into one of the world’s largest defense contractors today.

Even Trump had to get into the “let’s lie about Afghanistan” game, in his case to have bragging rights that he’d “ended the war in Afghanistan.”

In 2019, Trump went around the Afghan government (to their outrage: he even invited the Taliban to Camp David in a move that disgusted the world) to cut a so-called “peace deal” that sent thousands of newly-empowered Taliban fighters back into the field, and then drew down our troops to the point where today’s chaos in that country was absolutely predictable.

Trump’s deal was the signal to the 300,000+ Afghan army recruits we’d put together and paid that America no longer had their back, and that if the Taliban showed up they should just run away. Which, of course, is what happened on Trump’s watch. As Susannah George of The Washington Post noted:

“The Taliban capitalized on the uncertainty caused by the [Trump] February 2020 agreement reached in Doha, Qatar, between the militant group and the United States calling for a full American withdrawal from Afghanistan. Some Afghan forces realized they would soon no longer be able to count on American air power and other crucial battlefield support and grew receptive to the Taliban’s approaches.”

Jon Perr’s article at Daily Kos did a great summary, with the title: “Trump put 5,000 Taliban fighters back in battle and tied Biden’s hands in Afghanistan.”

Trump schemed and lied to help his own reelection efforts, and the people who worked with our military and the US-backed Afghan government paid a terrible price for it.

As President Biden told America:

“When I came to office, I inherited a deal cut by my predecessor—which he invited the Taliban to discuss at Camp David on the eve of 9/11 of 2019—that left the Taliban in the strongest position militarily since 2001 and imposed a May 1, 2021 deadline on U.S. Forces. Shortly before he left office, he also drew U.S. Forces down to a bare minimum of 2,500.

“Therefore, when I became President, I faced a choice—follow through on the deal, with a brief extension to get our Forces and our allies’ Forces out safely, or ramp up our presence and send more American troops to fight once again in another country’s civil conflict. I was the fourth President to preside over an American troop presence in Afghanistan—two Republicans, two Democrats. I would not, and will not, pass this war onto a fifth.”

America has been lied into too many wars. It’s cost us too much in money, credibility, and blood. We must remember the lies, and tell our children about them so that memory isn’t lost.

When President Ford withdrew US forces from Vietnam (I remember it well), there was barely a mention of McNamara’s and LBJ’s lies that got us into that war.

Similarly, today’s reporting on the chaos in Afghanistan and the war to seize the Iraqi oil fields almost never mentions Bush’s and Cheney’s lies and ulterior motives in getting us into those wars in the first place.

George Santayana famously noted, “Those who cannot remember the past are condemned to repeat it.”

We can’t afford to let these lies go down the memory hole, as we have with the other wars we were lied into that I mentioned earlier. Sadly, it’s clear now that neither Bush nor Cheney will be held accountable for their lies or for the American, Afghan, and Iraqi blood and treasure they cost.

But both should be subject to a clear and public airing of the crimes they committed in office and required — at the very least — to apologize to the thousands of American families destroyed by the loss of their soldier children, parents, and spouses, as well as to the people of both Afghanistan and Iraq.

If the media refuses to mention the Bush/Cheney lies on this anniversary of 9/11, it’s all the more important that the rest of us use this opportunity to do so. Pass it on.