The Neo-Imperialism of Intervention: Syria & the West

Media outlets reported on Friday that Bashar al-Assad’s government killed 37 more people in the ongoing crackdown in Syria against pro-democracy protesters. This mounting death toll, combined with horrific images of activists and civilians alike mowed down by tanks and machine guns, will surely contribute to the growing consensus that the West should intervene to stop the massacres. Considering NATO displayed a marked keenness to step in when Muammar Gaddafi brutally repressed the insurgents who rose up against him, many observers have denounced what they perceive as hypocrisy. Are Syrian lives somehow worth less than Libyan lives? Is al-Assad any less of a cruel despot than Gaddafi was? The West, these commentators argue, has nothing less than a moral obligation to intercede on behalf of the Syrian people, who stand no chance against the far better-armed Syrian security forces. Institute a no-fly zone, begin bombing Damascus, supply weapons to the rebels – the whole nine yards.

Those voices sounding off against intervention offer a rationale: Syria represents a different situation than Libya did when the West rode to the rescue. Unlike Libya, they say, civil war has not torn Syria asunder. The anti-government movement holds no territory of its own, and given the dominance and sophistication of the Syrian armed forces, any internal conflict is likely to be a one-sided affair. Additionally, Syria can count on support from its nearby allies, Hezbollah in Lebanon and Iran, whereas Gaddafi lacked friendly neighbors. Gaddafi could also not credibly warn that he would launch a suicidal attack on Israel the moment the first U.S. cruise missile hit his country’s soil. On top of all this, these voices note, Russia and China – two illiberal countries exasperated with the Arab Spring and this whole fascination with free and fair elections – are putting their collective feet down, quashing an emerging precedent of the West liberating citizens aspiring to freedom under authoritarian rulers. In other words, the West is not being hypocritical in “freeing” Libya and leaving Syria to descend into chaos; it is just that Libya represented a unique opportunity, with its tribal divisions, pariah state status and lack of resources. Gaddafi’s tyranny needed only a slight push (in the form of air raids and drone strikes, plus crates of automatic weapons) to bring it down.

The only thing keeping the West from fulfilling its ethical responsibility of preventing slaughter of the innocent, according to this line of thinking, is the set of unfavorable conditions particular to the Syrian case. Yet Western inaction concerning Syria is not the exception; it is by far the norm. You might find reams of words on “respected” news and commentary Web sites dedicated to Western outrage over oppression and butchery in Syria, and, going back a little farther, you will find similar impassioned editorials and blog posts about the persecution of homosexuals in Uganda or the “ethnic cleansing” of Africans by Arabs in Darfur. You will find far fewer paragraphs demanding a Western intervention in places like Bahrain, where over a dozen people have died over the last year due to government use of tear gas against peaceful protestors, or Saudi Arabia, where demonstrations are illegal and the Shiite minority in the eastern provinces has been viciously dealt with, sometimes fatally. In addition, could you imagine the backlash that would occur if The New Republic or The Atlantic – much less TIME or The Washington Post – ran a piece advocating that the United States bomb or invade Israel to end its illegal blockade of the Gaza Strip, which leaves a densely populated area without access to imported food, machinery and medicine?

The scholar Mahmood Mamdani has done an excellent job studying why some human rights violations and global bloodbaths receive more attention than others do. In his deconstruction of the campaign to save Darfur, Mamdani notes the silence in the West when the United Nations reported that 1,000 people were dying every day in Angola between May and October 1993, and how 3.9 million dead in the Democratic Republic of the Congo did not raise a murmur in 2004. By way of explanation, Mamdani cites the work of the journalist Lara Pawson, who observed that it was no coincidence that the U.S. received eight percent of its oil imports from Angola during its spiral into violence, while 18 British-based companies currently enjoy access to Congo’s rich mineral deposits. It would therefore not be in the economic interest of the West for military operations to disrupt the extraction happening in these countries, despite the bloodshed. Yet Mamdani goes beyond economic factors, and argues that the “Save Darfur” campaign framed a complex and nuanced problem as simplistic acts of political violence. It did not inform Westerners about what happened or was happening in Sudan, but played into the hands of the War on Terror, providing yet another example of “Arab Muslims behaving badly.” As in all marketing strategies, the message was clear enough for all to understand: “Muslims, who are evil and just so crazy, are killing Christians, and we all know how religion just makes people do crazy things, not like here in the secular world, am I right?”

It is mentally easy to connect war with profiting from natural resources, to boil wars down to “blood for oil.” Yet I tend to agree with Mamdani as well as the analysis by Immanuel Wallerstein that Western governments and civic institutions are not riled up for refineries, diamond mines and gas reserves alone. There is also genuine bleeding-heart liberal faith that the West has a “responsibility to protect,” to institute a “liberal imperialism” to defeat the “bad guys” and preserve the right to life, liberty and the American way – which, at Fukuyama’s “End of History,” is now the international way. If this sounds like the neoconservative ideology, that is because it is. Critics of Mamdani have reviled him for linking the “Save Darfur” movement, an offspring of the U.S. Left, with Dick Cheney, Paul Wolfowitz and other neocons. Let us not forget, however, that the reason the intellectual originators of neoconservatism are not called simply “conservatives” is that they used to be radical left-wingers, yet flocked to the imperialist right wing over what they perceived as the Left’s lack of adamant democracy promotion – such as through regime change. It would of course be incorrect to lump all pro-interventionist left-wingers and neoconservatives together, but in essence, their idealism springs from the same conceit: that the West, with its unchallenged power and enormous wealth, has license to stop the worst excesses of humanity, embodied by the Holocaust, from ever taking place again. The West, like a global Robespierre, should decide who lives and who dies, all for the sake of making the world “better.”

There has been a lot of hand-wringing on the Left over siding with traditional anti-imperialist arguments or being guilt-tripped into supporting excursions to topple dictators, end wanton murders and establish democracies. Yet those who struggle with this question should remember the adage to judge a policy by its consequences, not its intentions. Western interventions, even when they have been “successful,” have done little to get at the heart of a cleavage or a conflict, to address the long-standing issues that led to this civil war or that uprising. They have been flash-in-the-pan exercises – bombs and bullets, followed by rushed negotiated settlements that either unravel or barely contain the unsatisfied resentments of all parties. The bottom line is that the “white man’s burden” – to bring civilization and order to backward, bloodthirsty “savages” – is morally bankrupt, regardless of whether it is used as cover by the self-interested or is sincerely believed by naïve pseudo-leftists.


The Truly Dangerous: Iran vs. the West

Earlier this week, a classmate and I discussed the recent posturing by both the United States and Iran, which has caused many observers to pontificate on the possibility of a war between the two countries. I argued that such a conflict was unlikely, as it was difficult to perceive how it would benefit anyone. My classmate, however, claimed that, if pushed far enough, the Iranian government could very well start a war, even if defeat was certain. The kernel of his assertion was that Iran’s leaders – specifically the hard-line clerics and Revolutionary Guard officers – would be open to going out “guns blazing” if pressure, internal and external, became so great that their continued rule became unviable. In other words, while Iranian elites may act rationally under normal conditions, the ideology of political Islam means that irrational, self-destructive behavior is a distinct and dangerous possibility under the right circumstances.

My sole objection to this line of thinking is that it ascribes hazard only to the Iranians and their “exotic” and “inscrutable” culture and beliefs. Just because it is hard to fathom why so many Iranians, past and present, support the authoritarian theocracy that dominates their lives and curbs their freedoms, this does not mean that Iranians, even the vehement ultra-conservatives, possess alien mentalities beyond Western ken. It is just that their values are different from ours, their historical experiences vastly dissimilar. Various world powers have besieged Iran since the end of the 17th century, and while it no longer contends with Ottoman encroachment or European colonialism, Iran continues to view its relations with the world through a siege mentality. Today, the country is rightfully wary of an unchallenged, militaristic and arrogant superpower that has invaded and occupied two Middle Eastern countries and seeks to promote its imperial grand strategy of permanent world hegemony. Closer to home, Iranians still remember the 1981 Israeli airstrike that took out a nuclear reactor at Osirak, Iraq, solely on the Israeli suspicion that it would be used for nuclear weapons. Regardless of whether Saddam Hussein was planning to pursue a nuclear arsenal or attack Israel in 1981, what is relevant is that Israel launched an attack based on mere mistrust rather than hard proof.

Military action based on suspicion and hunches has become the new normal. The 1981 Israeli airstrike on Osirak and the 2003 U.S. invasion of Iraq both serve as textbook examples of preventive strikes – not pre-emptive ones, arising from an imminent threat, but ones designed to weaken an enemy, real or imagined, for the sake of regional or international interests. As the historian and activist Arthur M. Schlesinger pointed out, the Japanese sneak attack on Pearl Harbor previously embodied preventive strikes in the American mindset, reinforcing the natural distaste for treacherous acts of unwarranted aggression. Now, preventive strikes receive praise rather than denunciation. During the 1980s, the U.S. actively supported the Contras in their insurgency against the Sandinistas in Nicaragua, not because the country had become a “second Cuba,” but because it might eventually. The U.S., the wealthiest and strongest country in the world, felt compelled to intervene at the prospect of the Soviet Union, well on the decline, having not just one but two friends in all of the Americas. When the Soviet Union finally collapsed, the U.S. could no longer claim its unprovoked intrusions were acts of self-defense, no matter how absurd claims like the Nicaraguan “threat” had been. Briefly, the U.S. and her allies promoted “humanitarian interventions” in conflict-ridden states, even though these police actions did little in the long term to settle long-standing ethnic and religious cleavages and were geared more toward positive publicity for the continuance of NATO than truly ending genocide and civil wars. The U.S. struggled to convince either the world or its own population that it could be a global police officer, so much so that by 2000 George W. Bush actually ran as a non-interventionist skeptical of regime change. The September 11, 2001 attacks, however, provided an opportunity for the U.S. to initiate military operations as “defensive” measures.
The “War on Terror” proved an even greater godsend than the Cold War, as “terror” – unlike the Soviet Union – would never fade away or surrender. Ironically, the various wars, bombing campaigns and drone attacks carried out under the “War on Terror” banner have produced far-reaching resentment against U.S. jingoism and bravado, gifting anti-Western terrorists with greater recruitment tools than they could have envisioned.

In contrast to the U.S., Israel can point to Arab invasions and acts of terrorism committed against it throughout its existence, in addition to the anti-Semitism that has plagued the Jewish people for time out of mind. Traditionally, foreign states have acknowledged Israeli appeals to its right to defend itself and provided it with exceptional leeway. Yet, in the last decade at least, that leeway has worn thin. Israeli hawks have become so excessive in their fight for “national security” that they have blockaded Gaza and subjected its people to tremendous suffering, and apparently fail to see how decisions – such as the 2010 lethal raid on a ship that tried to break the blockade and bring humanitarian supplies to the Gazans – might invite disgrace and condemnation. It has gotten to the point where some U.S. politicians, usually thoroughly faithful in their support for Israeli policies, have begun to question the Israeli inclination to shoot first and ask questions later. There has not been much disquiet with regard to Israeli concerns over Iran’s nuclear ambitions, however, which is ironic, as the U.S. was notably silent about Israel’s own nuclear program dating to the late 1940s and has not, as in the case with Iran, demanded inspections and promises of peaceful purposes from the Israeli government. This is despite the fact that Israel has been ruthless in suppressing any knowledge of its nuclear weapons. When Israeli nuclear technician Mordechai Vanunu blew the whistle on Israel’s stockpile to the British press in 1986, he was soon thereafter drugged by Israeli agents, sentenced in a secret trial and imprisoned for 18 years, 11 of which he spent in solitary confinement. One would think that Tel Aviv would have relented after such brutal treatment of a prisoner of conscience, but since his release in 2004, he has been re-jailed several times, usually for talking with foreign journalists and trying to leave the country that persecutes him.

The above facts alone may be enough to understand not only why Iran has been less than submissive to Western demands, but why so many regular Iranian citizens support the actions of the government. During the Arab Spring, Western observers were usually shocked when they saw counter-demonstrations by supporters of Mubarak, Gaddafi and Assad. That so-called “rogue states” would rush into the arms of the West were they freed from their despotic rulers remains a popular trope in the U.S., most dangerously within the minds of neo-conservative thinkers. For example, the media pays little attention to the fact that Iranian reformers, including Green Movement leader Mir-Hossein Mousavi, generally oppose Iran giving up its desires for nuclear technology. The popular desire in Iran to refuse dictates issued from abroad is perhaps equaled only by the U.S. and Israeli refusal to abide by international law and U.N. resolutions.

“Still,” my classmate might say, “at least Mousavi would not threaten to shut down the Strait of Hormuz and possibly cause World War III.” Wouldn’t he? What choice would he have? Rationality, it must be remembered, means exercising reason in light of the reality one actually faces. For example, during the Iran-Iraq War, Iran employed the Basij, a volunteer militia composed of young boys and old men, in human wave attacks, sending them in rows, unarmed, against Iraqi positions. Irrational on the face of it, such a tactic makes perfect sense when one side in a conflict suffers from a technological inferiority but enjoys a surplus in population. The Soviet Union used the tactic to great effect against Nazi Germany, after all. Similarly, suicide bombings – including those performed by members of Hezbollah, sponsored by Iran – are generally viewed as irrational, considering they inevitably result in casualties on one’s own side as well as the enemy’s. Yet weigh them against the alternative: there is no feasible way for Hezbollah or any terrorist group to fight, at least on a consistent basis, a sustained war, conventional or guerrilla, against a highly advanced and sophisticated opponent. We should be careful to draw a distinction between suicidal tactics and a suicidal strategy.

On the face of it, threatening to shut down the Strait of Hormuz and discussing how easy it would be to do so seems self-destructive. Yet Iran knows as well as anyone else that the U.S. Fifth Fleet would respond almost immediately to restore the flow of oil from the Gulf. It also knows that its own economy would suffer from a lack of petroleum exports. It admittedly would be a lose/lose situation, but it is also the only viable situation Iran can present in which the West loses at all. Otherwise, it has no reply to the West using sanctions and other economic warfare to get Iran to do what it wants. While the West reconsiders whether it wants to pay the cost of an oil crisis to prevent Iran’s uranium enrichment, Iran in the meantime improves relations with major players outside the West, such as Russia, China and Latin American countries, to work to keep its economy functioning in the face of closed Western markets. It may not be orthodox foreign policy, but when the choice is between audacious brinkmanship and surrendering sovereignty and losing domestic support, can we really believe Mousavi would choose the latter course?

I will not argue that the Iranian leadership will always act in a sane fashion. A person can no more predict human behavior than count the stars in the sky. Certainly, there are ample precedents where regimes, faced with almost certain downfall, have opted to take as many others with them as they collapse. I will argue, however, that there is nothing in Tehran’s words or deeds that gives credence to fears that the saber-rattling has been anything but the usual posturing. In my opinion, the greatest menace to the continued existence of humanity appears not on distant horizons but within our own country. It is the United States, with its bellicose approach to world affairs and its dismissive attitude toward international laws and conventions, that has most consistently exercised the will and the means to scorn peace and make unjust war in recent history. If we want to oppose countries engaged in “irrational, self-destructive behavior,” then the battle begins at home.

The Obvious Offense We See: Afghanistan

In a story that is still developing, an Internet video has surfaced that apparently portrays four U.S. Marines urinating on the corpses of dead Taliban fighters. The U.S. government has of course condemned what the video depicts, and it appears as though the military has already identified the Marines responsible.

The video has already sparked outrage in Afghanistan and elsewhere in the region, as it continues an unfortunate trend in which the U.S. has cast human rights by the wayside when it comes to Islam and Islamic insurgents. In 2005, Newsweek reported that interrogators at Guantanamo Bay had desecrated the Quran by flushing it down a toilet. Afghans attacked U.N. employees last year when American pastor Terry Jones followed through on his on-again, off-again threat to burn the Islamic holy book. More than a few outlets have compared this latest debacle to the photos that came out of Abu Ghraib in 2004, which revealed that U.S. Army personnel and others had been torturing and abusing detainees held in Iraq.

Interestingly, the group you would most expect to be livid over the video, the Taliban, has stated that it will not deter them from participating in possible peace negotiations with the U.S. and Afghan governments. They are, they claim, not surprised by the latest insult in a string of disrespectful and barbaric practices the West has subjected them to in the course of its occupation. The problem with this position is not that it is irrational rhetoric by a band of militant zealots. Given our record in Afghanistan, it is actually quite logical.

It is easy to be angry and ashamed watching the video in question. The disregard for the dead is obvious; the conduct of men representing the United States is clearly juvenile and dishonorable. What is less evident – and therefore less discussed and rarely denounced – is how the war in Afghanistan has long been lacking in honor, and that the U.S. has not only repeatedly dismissed concern for slain combatants but for living innocents. No one has captured on film all the Afghan civilians killed in their fields and their homes in countless air raids and drone attacks. No one has piled high the child corpses that the war has claimed as “collateral damage” in the more than a decade this war has gone on. Granted, no one may have urinated on those bodies, but their deaths nevertheless speak to how brutal and callous modern “warfare” has become and how indiscriminate the daily slaughter is.

The difference, some would say, is that urinating on dead Taliban militants serves no purpose, whereas the deaths of civilians, while tragic, are an unavoidable byproduct of our overall mission to liberate Afghanistan and defeat terrorism. If this is the case, then that overall mission has been a failure.

Afghanistan is certainly not free, except in that it possesses the “freedom” to do what the U.S. allows it to do. Last year, Afghan President Hamid Karzai announced a ban on the bombing of houses due to the mounting number of noncombatant deaths such bombings caused, and the furor that arose as a result. U.S. officials quickly noted that Karzai did not have the power to veto who or what the U.S. wanted to bomb in his country. More recently, Karzai has protested NATO (in other words, U.S.) night raids into Afghan homes, again due to civilian casualties but also due to the understandable dislike Afghans feel about having their private sanctuaries violated without their permission. Coming from Oklahoma, I could not imagine U.S. troops invading civilian homes at 4 a.m. without judicious exercise of the Castle Doctrine, much less foreign troops. I could also not imagine any popularly elected politician who allowed such raids to happen to enjoy much support, and with Karzai’s decisions and expressions of his country’s interests reduced to mere suggestions by overriding U.S. orders, it is hard to envision Karzai having much credibility with his citizenry. After all, he has been the main native facilitator behind NATO (again, U.S.) military operations in Afghanistan, and those are deeply unpopular.

So is, admittedly, the Taliban. Yet the Taliban’s political arm knows that it does not need popularity; it just needs Afghans to see Karzai as a Western stooge, an illegitimate ruler propped up by Western wealth and power. In that instance, whatever organization can claim a legacy of resistance against the deprival of sovereignty, against civilian slaughter, against endless occupation will mobilize support to its side. Much as how Hamas evolved from a Palestinian offshoot of the Muslim Brotherhood into a Palestinian nationalist party, it is feasible that the Taliban could coalesce Afghan grievances into an endorsement of its radical opposition to the West, if not of its strident adherence to the ascetic lifestyle of the Prophet Muhammad’s early followers. The Taliban could transition from a revolutionary movement into a democratic institution. Again, the potential parallels with Hamas are striking. Just as Israel, with U.S. support, bestowed legitimacy upon Hamas repeatedly even when it was on the verge of alienating Palestinians with the strident elements of its platform, the U.S. may very well instigate a repeat of the 2006 Palestinian legislative elections, which transformed Hamas overnight from a terrorist group to a democratically elected government.

As to the charge that the Afghan war has not defeated terrorism, the proof should be common sense. Many in the Muslim world did not “hate us for our freedom” circa 2001, but since then many have come to despise us for the appalling incidents that have colored our occupations of Afghanistan and Iraq. In the background to these events, too, has been a persistent Islamophobia, evidenced by everything from the outcry over the so-called “World Trade Center mosque” to the sudden respectability of virulent atheists like the late Christopher Hitchens, who (for some reason) seemed to receive the most attention when he directed his ire against Islam specifically. When some of us read about efforts in Oklahoma to ban the implementation of sharia law, it is natural (and appropriate) for us to roll our eyes and laugh it off. How many of us, however, pause to consider the impact when this nonsense is reported in Istanbul, Abu Dhabi or Kuala Lumpur? How long can you denigrate an entire religious community, the second largest in the world, before that breeds resentment – resentment manifested, if not in involvement in anti-Western causes, then at least in sympathy for them?

There is no defense for the actions of the Marines displayed in the urination video. No matter how much some apologists may retreat into jingoistic comments or vague “war is hell” arguments, nothing can erase the basic fact that those who wear our country’s uniforms represent us all, and undoubtedly most Americans would not want such behavior undertaken in their name. Yet we must also ask ourselves whether the incessant mowing down of countless militants as well as innocent bystanders is something we want done in our name as well.

It is easy to feel anger at the obvious offense that we see. It is more difficult to feel anger at the injustice we are made less aware of but that exists nonetheless.

What’s Left?: The State of Play

When the world plunged into a severe economic recession a few years ago, it appeared as though there would be a major shift in the U.S. political landscape. The Democratic Party, well into ascendance following the 2006 midterm elections, had already profited from an unpopular Bush presidency, its disastrous foreign policy and a Republican Party mired in scandal. Reforming Wall Street moved to the top of the political agenda, prompting newly elected President Barack Obama to urge the financial industry to “learn the lessons of Lehman” and embrace rather than resist a regulatory overhaul. U.S. leaders echoed the pronouncements from the United Kingdom and France that they would no longer blindly trust in the ethos of laissez-faire capitalism.

Three years on, the political crosshairs have moved on from the “banksters” to “big government.” The Dodd-Frank Act – which was supposed to be a “bold” and “sweeping” introduction of major government oversight over the market – has lingered in limbo for over a year, 80% of its regulations unfinished, 90% of its rules incomplete and seven studies it mandated undone. Republicans, galvanized by the grassroots conservative Tea Party movement, neutered the bill as much as they could, and have since successfully shifted the economic narrative away from excessive deregulation to the enormous U.S. debt figures. The desperate need for spending cuts has quickly replaced the desperate need for stimulus, culminating in the copious buzz that surrounded Paul Ryan’s “Path to Prosperity” plan when it was released in April. The Ryan plan called for swapping Medicare with a voucher system, proposed drastic cuts to the vulnerable and disadvantaged, and placed most of the debt repayment burden on the poorest Americans. It never stood a chance at becoming law, but it aided in extracting future budget concessions amounting to $917 billion in spending cuts over a decade. It is widely understood that the cuts will primarily target “entitlements” – in other words, the social safety net extended to the impoverished and vulnerable among us.

That Obama and the short-lived Democratic majority in Congress would falter against the neoliberal consensus rather than challenge it comes as little surprise. In the 1930s, in the midst of the Great Depression, Franklin Delano Roosevelt clung to an orthodox approach to the economy that advocated balanced budgets and spending cuts, and had to be prodded and pulled into accepting the radical program now known as “The New Deal.” That program embodied in many ways the global Keynesian consensus that dominated economic policy for the next four decades (even under Nixon), although the government rarely extended the welfare state. When it dared to do so, such as during L.B.J.’s Great Society endeavor, financial analysts hollered about inflation and recession. When these twin demons did appear in the 1970s, severe supply shocks were to blame, but the Right used the opportunity to declare Keynesian approaches wrong and Friedman, Hayek and von Mises right. Reagan and Thatcher may have ruled from Washington and London, but their economics came from Chicago and Vienna. The Western Left, still receding from its high water mark of agitation in the 1960s, found itself in disarray, divided and discredited.

When the Left finally did return to the political arena decades later, it did so by proceeding along the so-called “Third Way.” This ideology-without-an-ideology claimed to embrace whatever worked, but more truthfully accepted whatever was conventional wisdom. That wisdom portrayed “tax and spend liberals” as odious and naively optimistic, and insisted government interference in the efficiencies of the market was a greater threat than any shady business on trading room floors. To his credit, Bill Clinton attempted a more humane form of capitalist society by bringing back public services, although Clinton’s attempts at a health care overhaul provided more disaster than delivery. On welfare reform, however, Clinton had few qualms about working with Newt Gingrich and other Republicans to vilify the poor as lazy fraudsters, driving them into low-paying jobs in the guise of “welfare-to-work.” In terms of deregulation, Clinton signed the Gramm–Leach–Bliley Act into law in 1999, effectively repealing the 1933 Glass–Steagall Act and enabling the rise of the mammoth “too-big-to-fail” banking institutions that caused the recent crisis (and have yet to be held to true account for it). Obama framed his electoral narrative as offering a genuine alternative to this trend, yet it became readily apparent to perceptive observers that the administration contained little in the way of either new personalities or new ideas. In terms of his economic team, this was evident in the prominence of Larry Summers (Clinton’s last Treasury Secretary) and Timothy Geithner (a protégé of Summers’ predecessor, Robert Rubin). “Change” served its purpose as a useful rhetorical device, even when there was no substance to it.

What is interesting is that the rhetoric of change is coming back again, but this time during an election when Obama is not railing against a disliked incumbent and an unfair status quo as an ambitious, unknown challenger. He is the incumbent; he represents the status quo. Yet last month Obama delivered a speech in Osawatomie, Kansas that harked back to the progressive ideals of Teddy Roosevelt. More recently, he took the uncharacteristically brazen step of appointing the head of the new Consumer Financial Protection Bureau without seeking the official seal of approval from Congress. What pushed this typically reticent president to, at the very least, posture as a more aggressive and reform-minded leader, in contrast to the “compromise-at-all-costs, only-adult-in-the-room” image he so eagerly cultivated during the budget fight?

A lot of it is the usual election year bluster. Still, Occupy Wall Street deserves some credit. This rare instance of a genuine, widespread social movement demonstrates that the Left is not dead, and more importantly, that young people today are not the depoliticized, apathetic perpetual adolescents society often makes them out to be. Granted, not all of us are so self-aware, and many prefer following Jersey Shore to Zuccotti Park. Yet you can only burden a generation with so much student debt, so many dim employment options and so many threats of austerity and sacrifice before it begins to see the raw deal it is facing and starts to fight back. Unlike our parents, we are not just rebelling against a government that has lost its claim to respect and dignity, or against a senseless war predicated on a lie – although we are living with those things as well. We are perturbed primarily by the realization that we will have to work harder than past generations to enjoy the things they enjoyed, and that it is very likely that our children may never have them at all. We – and our progeny – may not have the “luxury” of retirement at a reasonable age or the “comfort” of social insurance against market failures and social injustices. Already we are widely encountering the emptiness of the “promise” of a good education, the mantra that studying hard and cultivating intelligence would be sufficient to get us not just the good jobs available, but the jobs we wanted. This anxiety, this fear for the future, fueled the passion witnessed throughout 2011, when protestors faced repression and derision – beaten by the clubs of the police and mocked by the words of the press – yet sustained their anger and their energy.

Still, the weakness of the Left is evident. There is no clear organization behind Occupy Wall Street, and while it has been quick to diagnose the problems within the system, it has come up short on prescriptions. There is no unifying ideology to believe in, no alternative economic arrangement to implement. The end of the Cold War has meant that, essentially, there is only one ideology, only one system available: market-oriented bourgeois democracy. Fukuyama’s “end of history,” derided as it has been in academic circles, contains a kernel of truth, and this absence of choice has left nascent dissidents with a distrust of institutions, of programs, of politicians and parties. They are too jaded to work within the system, yet not adequately revolutionary to overturn it. As such, there is a constant danger that those in the parks and streets today will retreat from their zeal and, like the protestors of the 1960s and 1970s before them, find solace in the cynical wisdom that the system simply cannot be changed, that it has always been corrupt and always will be, and that while freedom and equality sound nice as democratic principles, they do not really work in reality. This sort of “wisdom” has probably defeated more protest movements than brute force ever has.

The farther we get into 2012, however, the more cause we have to be guardedly optimistic that we are not entirely past the high water mark of an activist Left. It does not seem, after all, that change will come from anywhere else but the grassroots. For all its agitation and placard-waving, the Tea Party will be rewarded with a presidential nominee in the form of Mitt Romney, a blatant political opportunist whose principles shift with opinion polls. Meanwhile, the punditry class eagerly wrote off Ron Paul, the one candidate with a consistent record of promoting the Tea Party’s ultimate goal – federal government on a Lilliputian scale – as not just a “long shot” but as someone who simply could never reach the White House. Similarly, no matter how many speeches he gives, Obama will be unable to hide the millions he took (and will take again) from banks and corporations, the sweetheart deals he made with Big Pharma on health care, or the naked sell-outs he made to the G.O.P. opposition and his Wall Street benefactors. Leading Democrats may pay lip service to the Occupy protestors, but they will not deign to meet with them and hear their concerns. At a time when so many Americans are politically conscious and antsy about what the future holds, it would be reasonable to assume that a lot would be riding on this election. In truth, however, the upcoming contest will very clearly be between two personalities sanctioned by the Establishment owing to their marketability and inoffensiveness to vested interests. Few can look at this election and say it represents a meaningful choice for a highly polarized electorate; the democratic façade is slipping.

Hopefully, this will mean a rejection of the carefully scripted party conventions, photogenic rallies and soundbite-laden stump speeches. If the frustration felt everywhere persists and economic growth remains sluggish, it is feasible that activists on the Left will resume their work outside the official political arena, summoning the spirit of 1968, filling the avenues and once more forcing a confrontation with elites. I will not predict that, even if such a confrontation were to take place, the people would emerge victorious. Yet what is certain is that no change will come if the people expect someone else to bring it to them on their behalf. If a triumph for the Left is coming in 2012, it will have to come in the streets.

Democracy or Meritocracy?

Writing in the early 20th century, the English economic historian and Christian socialist R.H. Tawney dismissed what he termed “the Tadpole Philosophy.” In response to growing socioeconomic inequality in Britain, some social critics were advocating for greater “equality of opportunity,” in which the impoverished and underrepresented would be able to escape their lowly status by climbing ladders of opportunity to more privileged social tiers. Tawney likened this to the minute chance of any given tadpole growing into a frog: the dire situation of the poor masses would remain subordinate to the opportunities open to a lucky few. This post analyzes the assertion that, in market-oriented democracies, socioeconomic inequalities generated by markets contradict the democratic principle of political equality. It then turns to the question of whether meritocracy, an ideology promoting social mobility common in many modern industrial democracies, is able to resolve the discrepancy between socioeconomic inequality and political equality, or whether it is merely a façade that engineers class repression instead of hindering it. It is clear that, despite the appeal of meritocratic “equality of opportunity,” the facts suggest Tawney was correct. Meritocracy has not decreased inequality in market-oriented democracies, but has instead reinforced an oligarchic system more akin to aristocracy than democracy.

For all the lip service paid to its value in most countries in which it flourishes, democracy has usually existed only within the political realm. Democratic citizenship carries with it certain political rights, characteristically the ability to exercise certain political freedoms. Yet these freedoms rarely extend to social and economic life. In market-oriented democracies, the market largely decides conditions such as employment opportunities and the costs of goods and services, which in turn determine how well citizens will be able to afford their education, their housing, and other factors that decide their place in the social hierarchy. As the market distributes resources, it generates and reinforces socioeconomic inequality, which in turn undermines a key element of optimal democracy: that all citizens exercise the same degree of political influence (Przeworski, 2010, pp. 66-67). The existence of a socioeconomically privileged class would inevitably lead to that class using its ample resources (and all that those resources could buy) to ensure that political decisions protected its interests. Of course, it was natural that the socioeconomically underprivileged would use what political power they had to fight back. Political theorists, such as James Mackintosh and Karl Marx, argued that, once states extended suffrage to the poor masses, those masses would mobilize to appropriate wealth and equalize society, leaving the wealthy little recourse other than to restrict suffrage, or failing that, to resort to force to keep the masses in line (ibid, pp. 82-83). There is a great deal of credence, then, to the view that democracy has been “a project simply blind to economic inequality” and that, by initially withholding political rights from the poor, democratic institutions merely “replaced aristocracy with oligarchy” (ibid, p. 85).
Yet, other than the occasional strike or spurt of protest over inequality, democracies have not proven as unstable as some theorists predicted they would be. Generally, the poor have not used democratic means to make themselves richer, and elite violence to suppress dissent over inequality features in some democracies but not all. (One should note that, while scholars and laypeople alike tend to ascribe such violence to developing Third World states, the recent brutality used by authorities against Occupy Wall Street protestors throughout the United States suggests that cases of such violence, while not common practice everywhere, may be observed in most market-oriented democracies, regardless of development.)

There are many competing explanations for why market-oriented democracies do not collapse under the inequality they naturally generate. One such argument is that, through collective struggle and the relative dispersion of economic power, the poor oblige the state to recognize them and thereby grant greater social inclusion. This is the kernel of the argument presented by Philip Oxhorn in his 2003 essay, “Social Inequality, Civil Society, and the Limits of Citizenship in Latin America.” Challenging T.H. Marshall’s claim that the evolution of civil and political rights for the working class precedes the eventual provision of social rights, Oxhorn argues that social inclusion must be preceded by economic change resulting in “greater dispersal of power resources” and “the capacity of distinct groups to organize” (Oxhorn, 2003, p. 41). This latter development is dependent on the state allowing a strong civil society to operate independently of it. Oxhorn contrasts the occurrence of these developments in Britain with Latin America, where civil society was weak and the working class was subject to “controlled inclusion” – in which the state captured the working class via corporatist organizations and attenuated its ability to mobilize against the ruling class (ibid, p. 44). For Oxhorn, the British working class gained a larger degree of political participation because of opportunities afforded to it economically and socially; these opportunities, more than trade unions or peasant organizations, paved the way for social and political mobility. In essence, Oxhorn’s argument is one of meritocracy. Socioeconomic inequality ceased to be a deterrent for the British working class in politics once conditions enabled it to utilize resources that allowed it to display entrepreneurship and intelligence outside the confines of the state, propelling it to a more comparable standing with the dominant classes.
The inequality created by the market became justified (or at least tolerated) because spreading affluence and institutions autonomous of the state supplied ladders out of the poor’s dire position in the economic hierarchy. Recent economic data, however, indicates that despite whatever “social inclusion” the British poor may have achieved through the conditions Oxhorn describes, being poor remains a profound disadvantage in the United Kingdom. A report released last year entitled An Anatomy of Economic Inequality in the UK found that “the household wealth of the top 10%” is “over 100 times higher than the wealth of the poorest 10%” and that the wide differences between resources available to the rich and poor hinder the latter’s ability to meet their potential (Gentleman and Mulholland, 2010). Why, then, has the lot of the British working class not meaningfully improved (at least over the last few centuries), assuming that Oxhorn is correct and developments in Britain permitted its ascendance to a position from which it could agitate for more equality? The answer lies in the fact that “controlled inclusion” does not necessarily need to be a state construct, but can operate within society as well; meritocracy serves as an example of this.

The term “meritocracy” began its life not in a positive sense, but as a satirical idea conceived by Michael Young, a British sociologist and Labour politician. In 1958, Young wrote about a Britain where being born into wealth or good pedigree had become irrelevant, so the elite ceased using economic or social class to select its membership and instead used education. Those deemed “intelligent” – the best students from the most prestigious schools – were praised and promoted, while those deemed less promising found their opportunities narrowed or closed. Although nature allocates ability at random, society only opens doors for a tiny minority (which then reproduces itself and concentrates power in its hands), leaving the remainder vulnerable and marginalized (Young, 1958). Those at the top, believing that they have reached where they are by their own merit, have few qualms about subsequently rewarding themselves with all manner of luxuries, including substantial salaries and bonuses. In our current society, where meritocracy is not a dirty word but an honored ideology, there is evidence that Young’s satire has become reality. In a 2011 article in The Washington Post, Peter Whoriskey documented that widening income inequality in the United States can be attributed to recent significant increases in the salaries of corporate executives, who justify the excess on the grounds that “companies are larger and more complex” and that, when profits grow, so should the rewards for decision-makers (Whoriskey, 2011). Some commentators have pointed to an evolving ethos, arguing that in recent decades greed has become more permissible. It is easy, however, to find indications that the rich believe they are rich because their merit made it inevitable. One need only consider the words of John D. Rockefeller: “I believe the power to make money is a gift of God … to be developed and used to the best of our ability for the good of mankind.
Having been endowed with the gift I possess, I believe it is my duty to make money and still more money and to use the money I make for the good of my fellow man according to the dictates of my conscience” (Flynn, 2007, p. 401).

It has been said that the study of politics is the study of power. Yet the study of politics often reveals that true power does not always lie in the political realm. In market-oriented democracies, power ostensibly lies in the hands of the people, yet the market and those who benefit most from it concentrate economic power in such a way that a majority of people cannot reach the Aristotelian end of politics: the good life. When the state fails to improve the condition of the poor, there is a temptation to look outside the state, to point to society as a means by which the worst off might become better. Ideology does not determine governments and policies alone, however. Ingrained in our culture are fundamental beliefs that seek to portray the status quo as fair, and the balance of power as natural and proper. Meritocracy is such a cultural ideology. Much as monarchs and later aristocrats appealed to divine right and age-old customs to validate their rule, the unequal societies that characterize most modern democracies cite a classical liberal faith in individual innovation and worth. If there is indeed widespread interest in not making the good of all secondary to the ascension of a minority, meritocracy is not the solution. The solution, rather, is an embrace of democracy’s belief in the equality of humanity, and in the principle that this equality should exist not just in our politics, but in our economic and social lives as well.

Works Cited

Flynn, John. 2007. God’s Gold: The Story of Rockefeller and His Times. Auburn: The Ludwig von Mises Institute.

Gentleman, Amelia and Hélène Mulholland. 2010. “Unequal Britain: richest 10% are now 100 times better off than the poorest,” The Guardian, 27 January 2010.

Oxhorn, Philip. 2003. “Social Inequality, Civil Society, and the Limits of Citizenship in Latin America,” in What Justice? Whose Justice? Fighting for Fairness in Latin America. Eds. Susan Eva Eckstein and Timothy P. Wickham-Crowley. Berkeley: University of California Press, pp. 35-63.

Przeworski, Adam. 2010. Democracy and the Limits of Self-Government. Cambridge: Cambridge University Press, pp. 66-98.

Whoriskey, Peter. 2011. “Income Gap Widens as Executives Prosper,” The Washington Post, 19 June 2011.

Young, Michael. 2008 [1958]. The Rise of the Meritocracy. New Brunswick: Transaction Publishers.