31 October 2014

Ghost hunter: Ted Cruz and the quest for the hidden majority

I empathize with the despair Senator Cruz of Texas must feel at the thought of a 2016 presidential election contested by a Bush and a Clinton, even though I don't necessarily share his fear that the nomination of Jeb Bush by the Republicans would guarantee the election of Hillary Clinton for the Democrats. The country seems to be in an eight-year presidential cycle that leaves voters disgusted enough by the standard-bearer of one party after two terms to choose the candidate -- any candidate, it must seem by now -- of the other major party. "Clinton fatigue" was blamed for George W. Bush's elections by those who didn't want to blame it entirely on the hanging chads in Florida, and something similar caused by Bush himself contributed to the election of Barack Obama. By now it's clear that many Americans will be quite sick of Obama by the fall of 2016, and their contempt will most likely benefit a Republican -- but Cruz isn't so sure. He worries that Republican primary voters and power brokers will waste the golden opportunity of 2016 if they opt for the latest Bush.  Whatever Cruz thinks of Jeb's brother, he has decided that the former governor of Florida is too much a moderate, too much in the mold of John McCain and Mitt Romney, to win a general election.

Cruz presumably doesn't agree that "Bush fatigue" helped Obama win in 2008. The Senator is one of many Republicans convinced that there remain vast untapped resources of Republican voters who will only come to the surface if the GOP gives them authentically impassioned and ideological conservative candidates. He's a believer in the hidden majority that must exist if his vision of America is to be viable. Like many a reactionary or ideologue, Cruz sees America not so much as what the American people say it is, and not even so much as what the Constitution dictates it should be, but as how Crassus saw Rome in Dalton Trumbo's script for Spartacus: as "an eternal thought in the mind of God." More importantly, he sees the American people themselves that way: not as what they say they are in the evidence you can see and hear, but necessarily as they should be according to Cruz's absolute ideology. There must be a majority out there that affirms traditional values, free enterprise and limited government, in his view, and they will reveal themselves when we use the right words and the right voice to summon them. Or else the America he believes in doesn't really exist.

Reality has disappointed Cruz before. Recall his encounter earlier this year with representatives of the oppressed Christian communities of the Middle East. He seemed authentically stunned that his co-religionists, beset by radical Islam, would not recognize Israel as their natural and rightful ally in the region. He will be disappointed again, in all likelihood, if his argument against Bush is really a plea for his own nomination. As some observers have suggested, a Cruz candidacy is, if anything, more likely than the nomination of another Bush to assure Mrs. Clinton's election, or that of any Democratic nominee. He's simply too polarizing a figure after only a short time in the national spotlight, unless you assume that the hidden majority will tip the balance toward one pole. However, Cruz might be surprised, more or less happily, by the outcome of a Bush-Clinton race. Hillary is sure to suffer for voters' "Obama fatigue," since she served in his administration, while Jeb has always been perceived as the more moderate and smarter Bush brother. More importantly, this is Hillary Clinton we're talking about -- the Devil Herself to a generation of right-wingers. If Republicans aren't motivated to go to the polls simply to vote against her, then they aren't as rabidly reactionary as we assume they are. Maybe this is what Cruz believes -- that the hidden majority isn't driven by fear but needs to hear a positive affirmation of their values, whatever that may sound like, before they'll rise in their majesty and reclaim the country.

Still, Cruz is correct to think of Bush vs. Clinton as a bad choice for the nation, if only because it would reaffirm a disquieting quasi-aristocratic turn in American politics. While both prospective candidates have won multiple elections in their own right, both still embody the clannish idea that virtue somehow inheres in families, and neither might have had a chance at election if not for their last names. This political version of brand-name loyalty has no more place in a working democratic republic than our brand-name loyalty to parties. We ought to hope that there's a hidden majority that would reject both families, but Cruz is right again if he thinks that any hidden majority needs to hear more than fearmongering before it will assert itself. Cruz's mistake is to assume that his message is something an actual hidden majority hasn't already heard. Our mistake, as I've warned elsewhere, would be to assume that there's a hidden majority that can be rallied behind one platform or vision for the country -- that any hidden majority is essentially centrist or moderate. Saving our country almost certainly won't be that easy.


29 October 2014

Partyism: the new bigotry?

Cass Sunstein, a liberal scholar, makes the audacious claim in an op-ed for Bloomberg News that prejudice on the basis of political partisanship is now more widespread than prejudice on the basis of race. Citing various surveys, he reports that people are more likely than ever to reject job applicants, or suitors for their children, for belonging to the wrong party. Nearly half of Republicans in one survey don't want their children marrying Democrats; the feeling is mutual for a third of Democrats. Fifty years ago, the numbers were in single digits on both sides. The problem is worse in the job market; partisans are more likely to reject applicants whose opposite-party affiliation can be inferred from their resumes, even when those applicants have better credentials for the job at hand. Following up on Sunstein, David Brooks laments what he calls the "hyper-moralization" of American politics. He seems to mean that people have grown more sweepingly judgmental about politics than ever before, with ever-worsening social consequences as partisans withdraw into their own affinity bubbles and avoid interaction with different points of view. Both Sunstein and Brooks blame the political advertising complex for this alarming state of affairs, but Brooks adds the observation that politics has grown more bitter because political debates have become our substitute for the sort of moral debates he claims were once more common in our culture. As philosophers and theologians have yielded the public sphere to parties and "media provocateurs," Brooks argues, political debates have escalated in fervor and rancor beyond all proportion to the issues at stake. Political campaigns become "a Manichean struggle of light and darkness" over "the existential fabric of life itself."

In one of the local papers, Brooks's column shared an editorial section with a Michael Smerconish op-ed on political polarization that puts the other writers' concerns in a different perspective. Looking at still more surveys, Smerconish agrees that polarization is worse than ever, but notes that only a small percentage of the American population is polarized. No more than 20 percent of the public occupies the poles, he notes, while the vast majority don't identify with the polar positions. For Smerconish, the problem with American politics today is that "those 20 percent still hold sway over the 80 percent." He's well aware of the reason for this: the extremes on either side are more committed to voting in every election, not to mention in party primaries. As a result, the 20 percent get to choose what the 80 percent has to choose from -- and it's no surprise that many in the larger group choose "none of the above" and don't bother voting. Smerconish advises this silent majority not to "keep sleeping" but to wake up and vote in order to end the rule of the extreme partisans.

Are Sunstein and Brooks describing a phenomenon exclusive to Smerconish's 20 percent, or does much of the 80 percent share "partyist" prejudices without any other political engagement? We should be careful not to assume that the 80 percent occupy a middle ground between the extremes; many may be even more radical or reactionary than the party faithful; their complaint may be that the major parties, despite their rhetoric, still make too many compromises for the sake of elections or money. I suspect that many in the 80 percent take a more draconian point of view on political and social problems, but not in a manner consistent with major-party platforms. We might well see prejudice against both Republicans and Democrats as representatives of a corrupt establishment -- but would that be prejudice?

I understand why some pundits worry about self-segregation along partisan lines in social media and elsewhere. Brooks believes that partyism can only damage communities and institutions that are denied "the benefits of divergent viewpoints and competing thought." In the practical world, partisanship and ideology shouldn't determine or constrain anyone's ability to work with others. However, certain beliefs that grow more widespread -- that Republicans don't believe in science, or that Democrats don't believe in morality or hard work -- may make even practical cooperation on presumably uncontroversial projects more difficult. But I also find the discovery of partyism as a new form of prejudice problematic, and Sunstein's alarmist hint that partyism is worse than racism even more so. Any attempt to equate discrimination based on beliefs with discrimination based on race or ethnicity is problematic because race prejudice is the definitive case of hating people for what they are rather than for what they do. Racists may claim that other races all do certain hateful things, but all such claims are easily disproved. With partisanship, ideology or religion, the difference is that there is a baseline of what the hated people do that can be verified. You can disprove the prejudice that all Republicans reject science, for instance, but there must be something that all Republicans believe, or else there are no Republicans -- and what they believe can be judged and found wanting. While it would be prejudicial to ostracize Republicans from whole professional spheres where their partisanship is or should be irrelevant, it is not prejudicial to judge the fundamentals of Republican or Democratic identity, as long as those fundamentals are not identified prejudicially but determined objectively by careful study of what partisans say and do on their own terms. In fact, such judgments need to be made, more so by the 80 percent than by the 20 percent. They should be made at the ballot box, and pundits shouldn't fear the possibility of the ballot box condemning certain ideologies to extinction. It's problematic enough to equate partisan hatred with race hatred, but it'd be even worse to equate parties and ideologies with endangered species on the assumption that there must always be a Democratic party, or that there must always be "conservatives" on the Barry Goldwater or Ronald Reagan model. I've long believed that the failure of either the Democrats or the Republicans to go extinct the way the Whigs did more than 150 years ago has exposed a flaw in the American political mechanism. Did the Whigs go extinct because of prejudice? Should their extinction be lamented today like the extinction of the dodo or the passenger pigeon? If not, then neither should the possible extinction of the Republican or Democratic parties be feared as a triumph of prejudice. We should not want individual partisans to be treated unfairly, or deny ourselves the benefit of their actual expertise due to prejudice, but we should not mistake the destruction of one or both major parties down the line for victimization by prejudice. Individuals should not suffer, but perhaps parties should. To deny that may be a form of prejudice we'd be better off without.

28 October 2014

Why Take Chances?

I took my mom to the hospital yesterday. Two or three times during the admitting process, she was asked if she had been in West Africa within the past month or so. Knowing my mom as I do, I had to chuckle inwardly at the absurdity of the question. This was a subjective absurdity, however; it seemed absurd only because I know so well that she is no world traveler. For the hospital staff, who don't know the absurd particulars of the situation, it remains a practical necessity to ask the question. To many health-care professionals, meanwhile, it seems similarly absurd for politicians, not to mention the general public, to demand that they submit to quarantine after returning from the West African hot zones. We've just seen the state of New Jersey buckle under the threat of litigation from a nurse who resented her admittedly crude quarantining, despite having tested negative for the Ebola virus. The medical establishment has rallied around the nurse and against the aggressive policies recently adopted in both New Jersey and New York, by a Republican and a Democratic governor respectively, while the White House backs the medical establishment. I'm not one to distrust institutions or resent elites, but there is a whiff of arrogance in the medical establishment's campaign against more sweeping quarantines. Certainly the doctors know better than the hysterical yahoos in local governments or the national media, after all! Their case boils down to this: if Ebola isn't contagious until a patient exhibits symptoms, then anyone who has treated Ebola patients should have full freedom of movement until he or she exhibits symptoms. But if the doctors and nurses want to persuade the general public of this, they need to make absolutely clear how quickly infected people can begin to exhibit symptoms, and how quickly those with symptoms can become contagious. If they prove unpersuasive, the public remains justified in asking: why take chances?

The federal government prefers to amplify a secondary argument about incentives. The problem with stricter quarantines, it's argued, is that American medical personnel won't want to go to Africa to treat Ebola if it means submitting to quarantine and becoming "pariahs" at home. This is a curious argument. The claim is that people willing to risk contracting the Ebola virus can't stand the thought of a 21-day quarantine. At one moment, they're pretty brave; in the next, they're throwing hissy fits because of some people's possibly excessive concern for public safety. If they're willing to risk their lives, however, they ought to have the fortitude to stand a quarantine of some sort. But what looks like inconsistency is really a kind of professional arrogance. What probably irks these people as much as the prospect of enforced isolation is the idea that someone other than them, outside the medical establishment, wants to declare a quarantine and has the power to do so -- that someone other than them can ask: why take chances? I suspect that this will cease to be an issue before long, as Ebola seems to prove difficult to catch and relatively easy to survive in the U.S.  By sometime next year many people may feel that their fears of Ebola were silly. But until then a little more respect for the concerns of the general public would be a good idea. In return, a little more faith in the objective findings of doctors would be a good idea as well.

24 October 2014

An epidemic of bad faith

The diagnosis of Ebola for a Doctors Without Borders physician who recently returned to New York City from Africa has reignited the debate over the doctor's and the government's public-safety responsibilities. One side of this debate believes that the doctor should have been quarantined for the 21 days from contact with Ebola patients during which the virus may incubate in his body. His critics contend that it's especially irresponsible for a physician not to take this prudent measure, and some make this another occasion to denounce the CDC and the Obama administration for not forcing the doctor and his peers into quarantine. The other side argues that a sweeping quarantine is unnecessary. If one side bemoans the doctor's freedom to ride the subway and go bowling, the other points out that he didn't begin to show symptoms until after his various excursions, and they're adamant on the point that he and other victims of the virus are not contagious until they show symptoms. Reading a comment thread on one news site, I saw this belief -- that asymptomatic carriers aren't contagious -- dismissed as an "article of faith" and defended as an empirical observation. The skeptics appear disinclined to believe what the CDC says on this subject. This skepticism seems based on an overall suspicion of authority, or a suspicion of the motives for not doing what the skeptics consider a matter of common sense. Common sense seems to ask "Why take chances?" while suspicion sometimes imagines sinister reasons for taking chances. An objective, nonpartisan debate on the wisdom of a quarantine should be possible, but a lot of nonpartisan things should be possible. If some people automatically denounce the lack of a quarantine (or of a ban on flights from the afflicted countries in Africa) because they doubt Obama's competence or question his ultimate motives, others may rush to defend his policies on a knee-jerk impulse. The latter wouldn't say these policies are automatically right because they're Obama's, but they may say they must be right because right-wingers or apparent paranoids oppose them. Yet it seems like an argument for a mandatory quarantine could be made without being labeled paranoid, hysterical or partisan. After all, what if you start to show symptoms in the middle of a public event? A lot of variables would remain, and I suspect that there's research to be done about susceptibility to Ebola that might mitigate if not minimize current fears. As we recently learned, the housemates of the Liberian who died in Texas have passed the incubation period and are Ebola-free, while so far only two of the nurses who supposedly breached protocols in treating the victim have contracted the virus, and one is already declared cured. Many factors apart from proximity to a blatantly sick patient may make some more likely to catch Ebola, some less. But while this is purely my own speculation, there seems to be no reason to dismiss out of hand pragmatic arguments for a quarantine for doctors returning from the hot zones. It may well be that many arguments for quarantine are made in bad faith -- out of irrational distrust of authority, or for partisan advantage -- but to dismiss the idea of quarantining returning doctors because some or even many making the argument are partisans or crackpots is to make an ad hominem argument, which according to logic is a fallacy. At this time we should be careful that bad faith doesn't spread further on both sides of the debate.

22 October 2014

Philistines and Palestinians at the Opera

The Death of Klinghoffer was composer John Adams's 1991 follow-up to Nixon in China, arguably the most popular if not the best American opera of the last half-century. Nixon had already been somewhat controversial, as its title character was still alive at the time it premiered. For Klinghoffer, Adams and librettist Alice Goodman raised the stakes, making an opera of the 1985 hijacking by Palestinian terrorists of the cruise ship Achille Lauro and the murder by the hijackers of Jewish passenger Leon Klinghoffer. Not nearly as memorable musically as Nixon, Klinghoffer is best known for being controversial. Give a performance and people will protest. From the beginning, what's been protested is the creators' failure to sufficiently demonize the terrorists. Because the terrorist singers are allowed to state their viewpoint in their own terms instead of singing something like, "We hate Jews because we're mean," the opera is accused of "glorifying" terrorists. "Glorifying" is the standard term employed by the censorious when morally questionable characters in media aren't demonized to the satisfaction of certain sensibilities. Even though Jimmy Cagney's character in The Public Enemy dies a gruesome death, that film was accused of "glorifying" gangsters because Cagney looked cool until he died. So it has been with crime movies ever since. A certain mentality never trusts audiences to make their own sound judgments; it requires art to become propaganda, moral or political, telling audiences quite explicitly what they should think of questionable characters. Gangsters should show no appealing (much less redeeming) qualities; nor should terrorists.

The Metropolitan Opera premiered a new production of Klinghoffer this week, and the protesters were led, rhetorically at least, by Rudolph Giuliani, the former mayor of New York City. Giuliani released a statement explaining his protest while defending himself against the charge of philistinism. I am no cultural illiterate, America's Mayor writes: "As an opera, the music and choruses are quite excellent. John Adams is one of America’s greatest composers, and I admire and enjoy his music." Alas, Klinghoffer is politically and hence morally incorrect. It is "factually inaccurate and extraordinarily damaging to an appropriate description of the problems in Israel and Palestine."

Giuliani's tortured explication of what's "appropriate" is revealing. He argues that the hijacking and murder must be understood as cynically motivated to promote the brand of the Palestine Liberation Organization as the organization with which the world must deal. It seems important to Giuliani that the murder be seen as a dispassionate act, not as a lashing out by angry people. The key sentences in Giuliani's screed are: "It was not the act of people feeling oppressed. This was the act of an organized group seeking international recognition, moral equivalency, and money." Therefore, any scene or aria in which Palestinian characters express feelings of oppression and grievance is "inaccurate" and "damaging."

A consistent part of the right-wing reaction to terrorism against its interests or allies is to deny the legitimacy of grievances. The right-wing argument is always that terrorists hate their targets not for what we do, but for what we are. A corollary argument is that the terrorists, rather than making reprisals against perceived oppressors, are always the aggressors. This fits a popular picture of Islam portraying the religion as always hostile and always on the offensive against infidels purely by virtue of their faith. From this standpoint, to let a terrorist character on stage say or sing what an author might fairly imagine is on his mind, even while making his terrorism obviously odious, is always subversive. Instead, the terrorist, or the enemy agent, must be motivated exclusively by hate, fanaticism, or selfish personal ambition. Anything else might make audiences think, even if the authors clearly don't intend audiences to take the villain's side. By protesting The Death of Klinghoffer, Giuliani claims he wants people to know the truth about the story behind the opera, but the truth about him is that he doesn't want people to think. He wants them to hate. I imagine most people who watch the opera will hate the terrorists anyway, but for people like Giuliani it has to be the right kind of hate, and it's up to him, apparently, to teach us how to hate properly. As Rodgers and Hammerstein wrote on a lower rung of musical-theater ambition, "you've got to be carefully taught."

20 October 2014

Unfit to keep and bear arms in New York State

A local paper reports that not only gun-rights advocates but mental-health advocates are protesting the designation by the state of New York of "some 34,500 persons" as mentally unfit to have firearms. Most of these people apparently self-diagnosed, since it turns out that fewer than 300 people will have to give up their guns after these findings. Nevertheless, a Queens doctor, representing other mental-health advocates, worries that "too many people are being deemed dangerous." He also worries that some genuinely troubled people will be discouraged from seeking help because it might mean losing or being denied guns. He doesn't like that he has to report "any kind of dangerousness," which raises the question of whether he believes in acceptable levels of "dangerousness" in mental patients. By comparison, the NRA looks almost reasonable in asking that decisions on mental fitness not be made "capriciously or maliciously," though their measures of caprice and malice may differ from other people's. Amid these concerns, it seems only fair that someone ask whether the number is actually too low. I concede, however, that other measures of fitness may be too subjective or controversial for psychiatrists to address scientifically -- even if those others may be the ones that count most.

17 October 2014

What ails America?

You don't have to be irrational about the prospects of the Ebola virus spreading across the U.S. to be appalled at the poor handling of the initial outbreak. Two people may not make an outbreak -- not counting the Liberian who brought his infection here and died earlier this month -- but compared to the efficiency shown in treating American aid workers who contracted Ebola in Africa, the performance of that Texas hospital and the CDC is troubling. The two infected nurses may be the end of the chain, but Americans need to think about alternate scenarios, yet may have a hard time doing so. As Charles Krauthammer notes, "In the face of a uniquely dangerous threat, we Americans have trouble recalibrating our traditional (and laudable) devotion to individual rights and civil liberties. That is the fundamental reason we’ve been so slow in getting serious about Ebola." Nothing taken to excess is laudable, however, and in facing the prospect of pandemics that American devotion may sometimes prove a handicap. Back during the George W. Bush presidency people worried that a pandemic might be used as a pretext for martial law; the advent of Barack Obama only changed the identities of some of the worriers. But you may not need to be paranoid to take an "I don't have to do that" attitude toward recommended precautions or protocols. Krauthammer writes that "choosing between security and liberty ... is the eternal dilemma of every free society," yet our entire culture, it sometimes seems, conditions us to prefer liberty every time. It certainly seems to discourage us from recognizing inherent obligations to our fellow citizens, yet our obligations only grow more obvious as a virus grows more virulent. Ebola has raged through Africa because of inadequate infrastructure and bad cultural habits, we're told. American habits may prove nearly as harmful in the absence of an ethical infrastructure suited to the challenge. This alarmist tone may prove premature insofar as this outbreak may peter out after a handful of cases. But if a wider outbreak, now or in the future, can be blamed on people failing or refusing, from a desire to stay "free," to do the right things, more Americans may finally question whether "freedom" really should be any culture's supreme value.

15 October 2014

The Iraq WMD Bush didn't want you to know about

The big twist in the New York Times story about chemical weapons found in Iraq after the 2003 American invasion is that the George W. Bush administration never took advantage of the discoveries to vindicate the President's decision to invade. Given how some Republicans today are pouncing on the news as proof that Bush was right all along, you wonder why W. or his handlers doubted the benefits of an announcement -- and you get a chilling suspicion that however dumb Dubya may have been, he was smarter than his base. The problem with the WMD the Americans found is that they were old: 1980s-vintage stuff left over from the Iran-Iraq war. In calling for the 2003 invasion, Bush argued that Saddam Hussein's government was making new, more dangerous chemical weapons, and nothing of that sort has yet been found. Worse for the Americans, their own people played roles in the production of some of the weapons found during the invasion and occupation. Reminders of our past relations with Iraq could only further fuel criticisms of U.S. Middle East policies guaranteed to generate "blowback." The Bush cover-up may also have been motivated partly by a desire to avoid responsibility for American soldiers sickened by handling the captured weapons. But considering how ready Republicans still are to believe the pre-invasion narrative about Saddam's threat, we might feel justified in concluding that Bush never bothered publicizing these finds because, in the end, he never really cared whether or not Saddam had old or new WMD. For him and his cronies, more likely, the invasion was a means to a more ambitious strategic goal -- the "democratization" of the region -- rather than an essential act of national defense.

But if the Times piece was intended to further damn Bush or revive skepticism toward meddling in the Middle East, the article undercuts itself with an alarmist note about the possibility of remaining stockpiles falling into the hands of fighters for the self-styled Islamic State. The report claims that the weapons as found were no threat to the U.S., but that components could be repurposed -- and were during the occupation -- for small-scale use in guerrilla warfare. It would be grimly ironic if the same stuff that comes closest to being evidence against Saddam Hussein were now used as evidence justifying an escalation of U.S. opposition to the IS. But at a time when even Democratic pundits question the effectiveness of bombing against a mobile enemy, nothing so ironic would surprise me.

14 October 2014

Religion as a last resort

The soldiers of the self-styled Islamic State are unapologetic about the atrocities they commit. They provoked a fresh wave of outrage this week, not with any new beheadings, but with the publication of the latest issue of their English-language magazine, Dabiq, in which IS writers reaffirm their right to slaughter alleged idolaters and enslave the women they capture. Browsing through the issue myself, I was struck by how these guys argue that it's better to take slaves -- for sex! -- than to commit adultery. But if it all adds up to an appalling system of values, it's still a system of values; the IS justifies it all by saying this is what God allows or orders them to do. That's why I think Thomas Friedman is wrong to characterize the IS as a force of disorder, or to say, quoting from a Batman movie, that the IS fighters just want to watch the world burn. They want to create order in their little caliphate, but on the basis of such authoritarian violence that many liberals simply refuse to recognize it as order. The desire for order is at least as much a factor in the appeal of the IS as the desire for violence. That desire for order is why so many people turn to religion in bad times. For a while during the 20th century it looked like people might look to themselves, or at least to Marx, Lenin, Stalin or Mao, to create order in the world, but Communism as collectively authored by the last three was "the god that failed." In the underdeveloped world especially, young people whose parents or grandparents vested their hopes in socialism or communism now turn to Islam, Pentecostalism, a more assertive and chauvinistic Hinduism, and so on. Why this seeming relapse? Is it because people still hope a god will provide for them when the man-gods of Marxism-Leninism failed? That's probably true to some extent -- it'd be the extent to which the desire for a god reflects people's feeling that they should be provided for, and that the power to provide for them must be out there somewhere. But there's more to religion's enduring appeal than that. It may be that, compared to the Market, the Party or even the State, a religion is something that can always use more people. The Market doesn't need all of us; it tells us to make ourselves useful or rot. States and parties too often sacrifice people's livelihoods, if not their lives, to austerity or competitiveness. Religions are no better, inherently, at providing for people than markets or states, but they promise everyone a place in an eternal order on what look like relatively easy terms -- especially, in the case of the IS, if you're a man. Those who worship the Market as a different sort of god make no such promises because they think it would encourage freeloading. Religions know better because, as I wrote, they can always use more people. Their promises lost their appeal not so long ago, but while the failures of the recent past loom large people around the world forget the lessons of the more distant past and assume that the old gods never failed. As some have suggested for some time now, it may take something like a Thirty Years War in the Middle East -- something that seems ever more likely lately -- to break the spell of Islamism, while we probably won't need anything so drastic to break the spell of Pentecostalism in the Third World. But where will the poor and all the people who feel that they have no place in the world look then? Somebody better have an answer.

13 October 2014

Columbus Day is the real Festivus

That second Monday in October is here again, and Americans, in some cases, will find time during the holiday to debate the legacy of Christopher Columbus. By now no one buys the idea that Columbus "discovered America," but his voyages clearly mark the beginning of an epoch of exploration and exploitation to which the U.S. owes its existence. Back in 1892, the 400th anniversary of his first voyage was a tremendous patriotic occasion from which came our present Pledge of Allegiance, if not all its controversy. Italian-Americans subsequently adopted the day as their own, their answer to St. Patrick's Day, albeit with less beer. In modern times, as his legacy of conquest if not genocide grew unbearable for many Americans, the idea of celebrating Columbus with a holiday grew more offensive. In some places "Indigenous Peoples' Day" or something like it is celebrated, while U.S. traditionalists protest that trend as a further advance of "political correctness." By this point no one, as far as I can tell, is calling for Columbus to be celebrated uncritically as a hero, but many argue that we should recognize that something important happened on or around the second Monday in October, 1492, and some feel that to repudiate the event entirely, as others seem to want, is somehow to repudiate our own national existence. So there may be parades in some places, but for the most part, when the occasion is noted it becomes the subject of argument and the airing of grievances from across the cultural spectrum. If the alternatives are unthinking patriotism and activist education about indigenous peoples, a holiday defined by debate looks just right.  If we trace our nation back to Columbus, it's only appropriate that his day be noted with griping from all sides.

10 October 2014

When Republicans defended voting rights, and Democrats cried fraud

Recent court decisions have pushed the question of identification requirements for voters back to the forefront of U.S. politics, just in time for an election season. By now the storyline is familiar: Republicans want to require voters to show photo ID at polling places in order to prevent fraudulent voting; Democrats protest, while dismissing all suspicions of fraud, that Republicans simply want to make it more difficult for certain populations who are less likely to have an ID card, or the documents necessary to get one, to vote Democrat. But it wasn't always so. In the course of my research for another project, I saw that one hundred years ago this month, a New York court struck down a law that had been passed by the state's Democratic legislature over protests from Republicans that the measure's only purpose was to keep certain people from voting. Given where Republicans and Democrats stood 100 years ago, you might expect that black votes were at stake, but in fact the Democrats were out to make life more difficult for rural voters in general. The controversial law required residents in rural communities to register in person with the local election board when they moved from one municipality to another. Democrats argued that in-person registration was necessary to prevent fraud by election workers, who in theory could add names to voting lists arbitrarily otherwise, while Republicans complained that requiring registration in person imposed a hardship on farm people who would have to travel long distances to take care of the paperwork at a time when transportation options were still quite limited. It was hard enough, presumably, to take the long trip just to vote; to require an extra, earlier trip simply to register would only discourage country people from voting. I don't know on what basis the court struck down the law, but in any event it was ruled unconstitutional, and Republicans immediately calculated how many more votes they'd get in that November's elections.

At that time, in the South, Democrats found every means possible to keep black people from voting. It took a change in black voting habits for Democrats to turn from opponents to defenders of black voting rights, while Republicans seem to have learned indifference to cries of hardship when it appears necessary to suppress "fraud" at the polls. We know they're not entirely indifferent, however, given how they accused Democrats of trying to ignore votes from overseas military personnel in recent elections. It still comes down to who you want to vote. If the groups adversely impacted by photo-ID requirements didn't vote consistently Democrat, Democrats would most likely not oppose those requirements so much, but it's also true that Republicans might not press for them so much. The question isn't whether one party or another is more inclined to commit fraud. Both major parties jockey for advantage constantly, and have done so throughout their 150+ year struggle for the American electorate. Republicans have just about always accused Democrats of driving immigrants to the polls regardless of their actual entitlement to vote, while Democrats until relatively recently strove to thwart black voting, not so much because Democrats were racist (though many were) but because blacks voted Republican. If a future demographic shift in voting habits threatens to tilt the balance of power, one party will seek ways to facilitate it, and the other will seek to thwart it. Each party really is more interested in maximizing the turnout of the most loyal populations than maximizing the vote of the entire American people -- but both could probably be defeated if Americans didn't do such a good job of suppressing their own votes through ignorance, complacency, or lack of imagination. That apathy toward alternatives to the two-party system is a greater threat to democracy than any of the tricks the two parties play on each other.

09 October 2014

Capitalism is history

The "interchange" of scholars on the history of capitalism in the new issue of the Journal of American History has an inescapable "blind men and the elephant" quality to it. I mean no offense to the scholars, all experts in their fields facing the challenge to teach about capitalism in history classes. The definition of capitalism still seems open to dispute. How do you define it? At first glance, capitalism is an "ism" or ideology, but in terms of economic or social history capitalism is no more an ideology than "feudalism" was. But capitalism isn't merely a phenomenon of economic history, not simply a set of practices adopted at some point or destined to be abandoned at another point. Back when I was in grad school historians were debating when the U.S. became capitalist. They focused on perceived transitions to a "market" economy from more traditional "moral" economies, or looked for turning points when people began producing primarily for markets rather than to achieve subsistence or "sufficiency" for themselves. A debate continues over whether slavery was a capitalist phenomenon. Some historians rule it out because they consider "free labor" a defining component of capitalism, or assume that since "reinvestment" is another defining component, capitalist slaveholders would have invested more in keeping their slaves fit to work at their most efficient. Another group of historians rejects stark separations of slave and capitalist economies, noting that much of the money many pioneer capitalists had to invest in things came from the slave trade in one way or another. Simpler broad-stroke definitions simply won't do, either, since historians can show readily that profit motives and acquisitive impulses have existed throughout history.

Capitalism might best be described as an event, but it is hard to describe because the event isn't over yet. We can begin to describe it because it has had to define itself in reaction to challenges from socialism, the labor movement, the regulatory state, etc. At a certain point in history, capitalism is simply progress. Once it becomes a reactionary force -- and even if you dispute whether what it resists is really "progress" or an inevitable change -- we can begin to define it by negation, by what it cannot or will not become. The beginnings are not so obvious, in part because there's no such thing as the "Capitalist Manifesto." Unlike with socialists, no author I know of offered a package of ideas called "capitalism" as the next necessary step in human progress back when kings and nobles still reigned and economies were governed by guilds, "just price" customs and other traditions. If socialism always has an element of conspiracy to it, if only because socialists knew what they wanted, capitalism developed more spontaneously and with much less sense of destiny. In broadest terms, the opportunities created by technological innovation and the Age of Exploration enticed people into challenging old rules and customs that appeared to impede the enrichment of ambitious individuals, of nations, and to some extent societies as a whole. If capitalism as an event comes to an end eventually, it most likely won't be ended by the sort of working-class revolution Marx expected. It does not appear to follow as logically as he thought that workers will want to take over the means of production. As long as they are satisfied with life after hours, the working classes in developed countries can reconcile themselves to whatever alienation or exploitation they endure at work. It seems more likely now, in an era of diminishing resources, that the state, defined not as the vehicle of the proletariat that will wither away but as the thing that keeps us alive and always must, will set the terms for the end of capitalism. Nothing except dogma says that states can't function as capitalist investors themselves -- China belies the claim -- but the crises of the future may compel states to become the sort of "command economies" that are seen as anathema to capitalism. In simpler terms, capitalism has been fueled by generations of new ideas about what can be done, but the future seems likely to be governed by a greater overriding sense of what must be done. If capitalism is ultimately irreconcilable with such necessities, future generations may be able to define it more precisely as a thing of the past, though what follows -- global democracy, global totalitarianism, global religion, or perhaps a new dark age -- may remain a mystery to future historians for some time afterward.

06 October 2014

Ben Affleck, defender of the faith

It was an interesting way for Academy Award winner and future Batman Ben Affleck to promote his current picture, but there he was on Bill Maher's talk show debating Maher and "militant atheist" Sam Harris (of The End of Faith fame) about Islam. Affleck questioned Harris's credentials as an expert on Islam and the menace the religion as a whole purportedly presents to the rest of the world, as well as Harris's contention that "Islamophobia" is used as a pejorative to discourage legitimate criticism of the religion. Basically, Harris and Maher were saying that one can criticize aspects of Islam without being a bigot, but Affleck wasn't buying it. He understood the other two to be saying that the threat of jihadist violence was inherent in Islam; to him that meant they were portraying one billion or so Muslims as potential jihadists, and to him that's bigotry. Maher and Harris stuck to their guns, citing surveys purporting to show that majorities of Muslims around the world believe, for instance, that apostates from Islam (like atheist heroine Ayaan Hirsi Ali) should be killed. You can read what all these people actually said here.

If Affleck and his antagonists seemed to talk past each other, it's probably because both sides are right to different extents. Affleck is certainly right in his assumption that the majority of Muslims around the world have never had a jihadist thought in their lives, while Harris and Maher are just as right to argue that the doctrines of Islam, and their political implications in the secular world, are proper subjects for criticism. The atheists are right to reject the equation of reasoned criticism of religious beliefs or cultural values with a "phobia," but the actor is right to worry about the collective stigmatization of practitioners of the world's second-largest religion. Maher and Harris are really taking the same stand critics of Israel take when accused of anti-semitism. On that analogy, Affleck is wrong to equate their criticism of Islam with anti-semitism, but he's still right to warn against an implication of collective responsibility that's really no different from the self-styled Islamic State's rationale for beheading harmless reporters, aid workers, etc. in reprisal for their nation's acts of war.  Harris and Maher insist that "moderate" Muslims have a duty to intervene against their violent, radical co-religionists. Implicitly, they'll judge Muslims by how well they perform this duty, regardless of any other issues Muslims themselves may raise. But if collective responsibility is wrong when asserted by the IS or al-Qaeda to justify the murder of American civilians, it must be wrong even when Americans merely blame all Muslims for failing to suppress their radical elements. As Affleck himself noted, Americans would reject the idea that U.S. atrocities in Iraq are "a reflection of what we believe in." In that case, implicitly, we should look elsewhere besides in the Qur'an (the "mother lode of bad ideas" in Harris's term) to understand why Muslims are waging jihad. Religion may shape the attack, but it doesn't necessarily provide the impetus. Jihad in our time is a reaction against something, or several things, not merely some madman's dream of power and plunder. While neither Harris nor Maher is a neocon, and Maher's definitely no Republican, their antipathy toward Islam may blind them to other causes of jihad that come from outside the dar al-Islam. It probably doesn't make sense to ask why they don't criticize the Christian Right as vehemently as they criticize Islam, but they probably should ask why outsiders have been riling up the Muslim world for the last century or so, and whether there's a way of thinking to blame that ought to be reformed.

03 October 2014

Talking Turkey about Syria: stupidity isn't Made in USA

The U.S. and "the West" in general suffer from what might be called democratic chauvinism. These countries presume that liberal or constitutional democratic republics are really the only legitimate form of government on earth, and reserve for themselves the right to determine whether any government that isn't such a republic is a threat not just to world peace or order, but to "freedom" as well. Some regimes, like the monarchy of Saudi Arabia, pass the test because they have something to offer and are willing to deal with us. When that isn't the case, the West has few reservations about demanding "regime change" as a moral imperative. To the rest of the world this looks like western imperialism or "arrogance," but the example of Turkey suggests that such arrogance may come inevitably to republics, wherever they emerge. Turkey has been a republic for nearly a century now, not counting periods of military rule. The republic's Islamist leader is accused, inevitably, of "authoritarian" tendencies both by outside observers and domestic opponents, but President Ergodan shows what might be described as small-r republican arrogance when it comes to his neighbor, Syria. Turkey showed some reluctance about joining the coalition against the self-styled Islamic State a little while ago, but with the IS driving Syrian Kurds across the Turkish border into refugee camps Erdogan now vows to fight the IS. But just like the Americans, he stubbornly refuses to recognize the most obvious, if not most necessary ally against the takfiri headcutters: the Syrian government. In practically the same breath, Erdogan says "we will continue to prioritize our aim to remove the Syrian regime." No doubt, also like the Americans, he assumes he can find rebels capable of toppling Bashar al-Assad, yet incapable of becoming a threat to his own country's interests.

In Turkey's case, this may be less democratic chauvinism than plain old Sunni chauvinism. The Turks may hate Assad less because he's a tyrant than because he's an Alawite -- though for all I know they may think him a tyrant because he's an Alawite. Like other Sunnis, Erdogan probably considers Assad too friendly with Shiite Iran. But worrying about Iran -- as the government of Israel also recommends to everyone -- seems to be a priority error at the present time. The IS, after all, is the latest existential threat to global peace, order, freedom, etc. Their high-profile but limited-scale atrocities have stigmatized them as the barbarians of the hour. Some people, no doubt, regard the self-styled IS caliph -- a figure who seems to keep a relatively low profile, at least as far as western media is concerned -- as the next Hitler. But if "next Hitler" is anything more than a rhetorical device, might the world not learn from how it dealt with the first Hitler? Wasn't it Winston Churchill, idol of neocons, who said, to justify Great Britain's alliance with Stalin's USSR, that if Hitler invaded Hell he'd make an ally of the Devil? It's all about priorities. The "greatest generation" understood this, so it makes sense that an American veteran of World War II, George H. W. Bush, having identified Saddam Hussein, however foolishly, as a "next Hitler" after Iraq invaded Kuwait, went to the trouble of assembling as broad a coalition as possible against Saddam -- including Syria under Bashar al-Assad's dad. It should be a matter of common sense, should you find a threat to global peace striving to overthrow a country, to join forces with that country's government, whether you like it very much or not. But that pragmatic step is morally or strategically unacceptable to Americans and Turks alike. It's reasonable to assume that each group is more interested in having their man or clique in control of Syria than in eliminating the IS.  Each probably wants to blame Assad for the rise of the IS -- for not disappearing conveniently when asked, or for not getting killed in a timely manner -- but the impulse from outside to destabilize Syria probably keeps the IS going more than anything else, even if no outsider really wants them to take over. Like it or not, the surest way to suppress the IS in Syria is to shore up the Assad regime. That doesn't mean Assad is a good man or a good leader or that Syria needs a dictatorship or whatever else you might infer. It means that the world -- the Americans, the West, the Sunnis --  needs to stop fantasizing about third forces and make the only choice possible in Syria, apart from letting history run its course.

02 October 2014

Could term limits empower Congress?

It's nothing new for George Will to advocate term limits for people in Congress. The columnist has long opposed the ascendancy of a careerist "political class" alleged to be interested only in expanding government for their own benefit. Term limits, Will hopes, would assure a steady rotation through the legislature of people with real careers -- people presumed familiar with, and sympathetic to, the needs of the private sector upon which national prosperity depends. Whatever his biases, there are small-d democratic arguments for and against term limits. One argument against is that if the people are sovereign, they should be able to elect whom they will as often as they wish. On the other hand, if democracy means more than elections to you, you might grow suspicious of the masses' dependence upon a handful of great leaders, and you might believe that rotation in office, enforced through term limits, affirms the democratic idea that any educated person is capable of legislating. Advocates of limited government like Will favor term limits, but that doesn't mean that term limits only serve to limit government.

This week, Will is concerned less with limiting government in general than with maintaining checks and balances. Like many observers on both right and left, he's concerned that Congress has grown too deferential toward the President, be he a Republican or a Democrat, on questions of airstrikes, troop deployments, etc. He worries that Congress is insufficiently jealous of its prerogatives when it comes to making war. His possibly paradoxical suggestion is that a Congress constrained by term limits would be more assertive of its rights against the other branches of government.

If anything, Will takes a more damning view than ever of the alleged political class. Promoting the views of Greg Weiner, a conservative constitutional scholar, Will argues that career politicians don't make a career of politics for the power as much as they do for the perks of office. In Weiner's words, "increasingly, there is evidence that members of Congress do seek office for something other than the power, for power is something with which they are not merely willing but often eager to part." This sounds more plausible when you think of war-powers issues than when you recall the determined obstructionism, often applauded by Will, of a Republican majority in the House of Representatives. But give both Weiner and Will credit if they're looking beyond the partisanship of the moment: congressional deference to the Commander-in-Chief on questions of war should be of concern to every citizen, regardless of party or ideology.

How would term limits reverse the trend? Here Will jumps from Weiner to James Madison, who wrote in the Federalist Papers that for checks and balances to work, "The interest of the man must be connected with the constitutional rights of the place." Will's hope is that limiting the career incentive would leave candidates for Congress who are true believers above all in the rights and prerogatives of Congress. These worthies would be less interested in bringing the bacon home to their constituents than in keeping an eye on the other branches of government, the executive in particular, lest they overstep their constitutional bounds. Having done their round of vigilance, they would then return happily to private life.

It would be easy to caricature this proposal as an exchange of careerists for ideologues, but for the sake of argument let's entertain the premise that people who aren't motivated to run for office by the promise of perks are more likely to act on principle (if not ideology) when they're elected. If that's the case, why not go further than limiting terms and eliminate as many of the corrupt incentives that Washington offers as possible? Without going so far as to deter working-class people from running for office, we might expect our representatives to live as spartan a life in the capital as possible consistent with the dignity of their offices. Put them up in dorms if need be -- comfortable but no place you'd want to spend the next twenty years. At the back end, take even more steps to keep them from cashing out by taking jobs with lobbying firms or anyone whose promise of future wealth might corrupt a legislator today. Do everything that would filter out anyone interested in anything other than helping Congress play its appropriate constitutional role. Would that leave only Tea Partiers? There's no reason to think so. People still run for office to accomplish things, not just to guard against the Executive Branch. While there's been many a shameful example of Democratic congresspersons living large while playing champion of the little guy, it doesn't follow that liberals, much less leftists, go to Washington for the glitz or the glamor. Nor does it follow that any one person needs to stay in Congress for a lifetime to accomplish something for the little guy. Democrats and liberals routinely argue against term limits mainly because, going back to the 1980s, they saw them as a sour-grapes Republican swipe at a then-entrenched Democratic majority in Congress. The fact that we see conservatives still calling for term limits after almost twenty years of GOP control of the House suggests that there's more than sour grapes involved. We need not agree with those conservatives about the necessity of term limits, but we might find them useful if not appropriate to our kind of democracy. In the end, however, it still comes down to what voters want from Congress, and in a democratic republic there are no term limits for the people.

01 October 2014

People Power in Hong Kong?

Instead of a "color" revolution, the Chinese island of Hong Kong is having an "umbrella" revolution this week. The name comes from the inexperienced protesters' resort to umbrellas to fend off tear gas. The protesters have gathered by the tens of thousands to protest what they consider a violation by the mainland Chinese government of the "one nation, two systems" principle under which the former British colony reverted to China in 1997. Since then, Hong Kong residents have continued to enjoy a greater degree of civil liberty than exists on the mainland -- but the islanders are understandably jealous of their privileged status and vigilant against encroachments on their freedom by the central government. In a few years, Hong Kong will elect its local chief executive for the first time; under the British the governor was appointed by London. For the present protesters, the milestone has been marred by Beijing's assumption of a right to screen or vet all candidates for the high office. That is, whoever might get nominated by the islanders, the central government will decide who actually gets to run. For the protesters, this means that the Communist party on the mainland gets to play the role of the guardian council in Iran. While that doesn't mean Communists will take over, it is presumed to mean that no one will be elected who isn't duly deferential toward the Communist leadership in Beijing, just as the Iranian council ensures that no one is elected who isn't duly deferential to Ayatollah Khamenei. The umbrella revolutionaries want Beijing to rescind its decision, but both Beijing and the current regime in Hong Kong has made it clear that that won't happen -- especially not at the dictate of a relative handful of "radicals."

To the extent that Chinese Communists remain "totalitarian," they are the people least likely to make any concessions to "people power," as was made clear most forcefully in Tiananmen Square 25 years ago. A communist party typically sees itself as the only legitimate representative of "the people," whether it's in power or not -- and leaving communism out of the question there are plenty of reasons to question the legitimacy of "people power" demonstrations anywhere, especially when outside observers spend so much time lecturing China about the importance of the rule of law. "People power" is incompatible with any vision of "rule of law." Liberals who applaud "people power" wherever they see it betray their liberalism, since "people power" is always a radical strategy. It may be appropriate sometimes for liberals to become radicals when "people power" rises in a polity that lacks a rule of law, but liberals often are too quick to identify the lack of liberalism in a country with the lack of a rule of law. They find it hard to concede that there might be a rule of law that isn't a liberal rule of law. Some see no rule of law in China because they assume that the Communist party can act arbitrarily when it pleases, with nothing to constrain it. Therefore China must be in the wrong in the current dispute with Hong Kong -- it must be exerting arbitrary power over the local elections and thus the locals are right to resist. It's certainly the protesters' prerogative to dislike Beijing's decision, and they should have a right to say so -- and in Hong Kong, as far as I know, they do have that right without having to hit the streets. But that doesn't mean that they're right. For one thing, they may not speak for a majority of people on Hong Kong itself. For another, it isn't for outsiders to judge -- at least not by their foreign standards. Is it wrong for the U.S. Constitution (Article IV, Section 4) to require that each state in the Union have a republican form of government? Does that compromise local autonomy at all -- and for that matter do all Americans agree on the meaning of small-r republicanism? It may sound outlandish to Americans for anyone to raise those questions, and it probably sounds just as outlandish to the Chinese government when people question its right to hold candidates for local offices to some standard of fitness. That doesn't mean that the Chinese are right, but we ought to consider the possibility that there is no right or wrong in such matters. That in turn doesn't mean the umbrella people are wrong, but it should be one thing to sympathize with protesters, and another to assume, as we too often do, that they deserve to win and should be helped to win. Neither side in Hong Kong is necessarily our side. When we decide that one side is our side, we've probably overstepped our bounds. Individuals should be able to choose sides without crossing that line, but governments ought to be more diplomatic. To use a sports metaphor, root for whom you like, but don't run on to the field.