I Became an American

From Charles C.W. Cooke:

This morning, at 8 a.m., I did something I’ve wanted to do for as long as I can remember: I became an American.

I first applied for a visa in early 2011, and since then I have slowly worked my way through the system — first as a visa-holder, then as a permanent resident (green card), and, finally, as a citizen. It feels odd finally to be at this point. I decided that I wanted to be an American on my first visit here at age three. Sure, at that point it wasn’t quite clear to me whether there was an America outside of Walt Disney World. But on subsequent visits, of which there were many, I discovered that there really was, and that it was a giant, rambunctious, beautiful place. Since then, I’ve never wavered in my ambition to be of it.

Why? Well, how long have you got? I’ve been sharing my view that this is the last great hope for mankind for almost seven years now. It ebbs and flows as all experiments do, but America continues to serve as the last surviving incubator of the great classically liberal values. If you believe in human freedom, this is your huckleberry. But there’s something more than that to this — something ineffable. I tried to capture this in a cover story back in 2014:

Being asked to explain why I love America is sometimes like being asked to explain why I love my fiancée. There are all the tangible things that you can rattle off so as not to look clueless and sentimental and irrational. But then there is the fact that you just do, and you ultimately can say little more than that.

I don’t know why I love the open spaces in the Southwest or Grand Central Terminal or the fading Atomic Age Googie architecture you see sometimes when driving. I don’t know why merely glimpsing the Statue of Liberty brings tears to my eyes, or why a single phrase on an Etta James or Patsy Cline record does what it does to me. It just does. I have spoken to other immigrants about this, and I have noticed that there is generally a satisfactory explanation — religious freedom, the chance at self-expression, the country’s size — and then there is the wistful stuff that moistens the eyes. Show me a picture of two canyons, and the fact that one of them is American will make all the difference. Just because it is American. Is this so peculiar? Perhaps.

“My fellow Americans.” How sweet that sounds.

Source


A Country of Timeless and Universal Ideals

From Charles C.W. Cooke:

Today is my son’s first Independence Day.

He doesn’t know that, of course, because he’s only three-and-a-half months old. But my wife and I do, and we’ve attempted to mark the occasion nevertheless — in loco filius, if you will. As such, Jack will be dressed today in a special onesie (stylized picture of a milk bottle, “Come and Take It” tagline); he will wear his Old Glory sun hat; and he will be involved in all the festivities that the family has to offer. Naturally, none of this will make even the slightest bit of sense to him; as a matter of fact, today will be the same as is any other day in the life of a baby, just with more people around and a surfeit of BBQ. But you have to start somewhere, right?

Because Jack is three months old, it is acceptable for his parents to treat July Fourth as an excuse for the purchase of kitsch. But what about after that? What about when he is five? Or twelve? Or nineteen? As a native Brit, I am accustomed to the self-deprecating instincts that are the hallmark of British society, and I am acquainted, too, with the reflexive aversion to patriotism that is all too customary in the birthplace of Western liberty. In consequence, I know that if I were to leave my son befuddled by America’s Independence Day proceedings, he would probably stay that way in perpetuity. And that would be a tremendous, unconscionable shame — a shame that, frankly, would reflect poorly on me.

Once they reach a certain age, we expect our children to know what is what. As soon as they start speaking, we begin to teach them right and wrong; once they are old enough to be trusted with responsibility, we monitor closely how it is being used; and, in a process that is hopefully never-ending, we make sure that they know as much about the world around them as they are capable of taking in. It is in pursuit of this lattermost goal that we designate national holidays. In May, we celebrate Memorial Day, lest we forget what we owe our ancestors. In January, we observe Martin Luther King Day, that we might bring to mind the most uncomfortable parts of our nation’s past. And on July Fourth we arrange an ostentatious display of patriotism, in resounding commemoration of the moment that a ragtag bunch of philosopher-king rebels set their revolutionary ideals before a candid world, and changed human history forever.

In certain quarters it is fashionable to disdain these occasions, and, in so doing, to treat the past as if it were wholly disconnected from the present. Indeed, staunch defenders of the American Founding are often told that to embrace modernity is necessarily to jettison the antique. “Why,” it is asked, “do we celebrate these flawed men and their pieces of parchment? After all, John Adams couldn’t even have imagined Tinder.”

Though narrow, this critique is indisputably correct. John Adams could not have imagined Tinder, and I daresay that he had no conception of high-frequency trading, of synthetic fibers, or of advanced robots either. But, ultimately, that is irrelevant. The beauty of the American Founding was not that it provided a detailed roadmap that could predict the minutiae of the future in glorious perpetuity, but that it laid out for all people a set of timeless and universal ideals, the veracity and applicability of which are contingent upon neither the transient mood of the mob nor the present state of technology. Among those ideals are that “all men are created equal,” and that they “are endowed by their Creator with certain unalienable Rights”; that “Governments are instituted among Men” in order to “secure” their “rights”; that legitimate power derives “from the consent of the governed”; and that if any such government is seized or corrupted by tyrants, “it is the Right of the People to alter or to abolish it.” At times, the United States has failed disastrously to live up to these principles, and, on at least one occasion, significant forces within the union have rejected them outright. But that an ideal has been violated in no way undermines its value, and it seems patently obvious to me that the country has been blessed by having had an eloquent North Star to which its downtrodden could point in their moments of need.

If July Fourth is to represent anything concrete, it should serve as a golden opportunity to ensure that that star does not wither or implode or disappear from public view. In Britain — a less propositional nation in which the constitution is uncodified, in which there are no indisputably “foundational” documents, and in which there are no widely celebrated national days of meaning — it can be difficult to convey the importance of core national values, whatever those may be. Americans, by contrast, have fallen heir to an embarrassment of riches. If I cannot explain to my son how lucky he is to have been born here — and if I cannot demonstrate what a heavy responsibility it is to keep the candle burning — I do not deserve to be called “Dad.”

In a 1788 letter to Thomas Jefferson, James Madison outlined the didactic justification for the construction of the Bill of Rights. “Political truths declared in that solemn manner,” Madison proposed, tend to “acquire by degrees the character of fundamental maxims of free Government, and as they become incorporated with the national sentiment, counteract the impulses of interest and passion.” Such benefits are not limited to the Bill of Rights. Just as Americans will proudly cite the first ten Amendments in the course of defending the ordered liberty that is the birthright of all free men, so they are prone to cite the most explosive literature of the revolutionary era. If internalized and cherished, Abraham Lincoln argued, the Declaration of Independence would have the salutary effect of acting as “a rebuke and a stumbling-block to the very harbingers of re-appearing tyranny and oppression,” for Jefferson’s work was not “a merely revolutionary document” but the embalming of “an abstract truth, applicable to all men and all times.”

Yes, even to three-month-olds. Come and take it.

Source

The Left Won’t Let the Amtrak Tragedy Go to Waste

From Charles C. W. Cooke:

It took just a few sorry hours for the news to become politicized. On Tuesday evening, we were told of a tragedy. An Amtrak train running between New York City and Washington D.C. had derailed disastrously at Philadelphia, killing eight and wounding two hundred. By Wednesday morning, tragedy had become transgression. Speaking from the White House, press secretary Josh Earnest explained that he didn’t know for sure why the train had crashed, but that it was probably the Republicans’ fault. “We have seen a concerted effort by Republicans for partisan reasons to step in front of those kinds of advancements” that would have prevented crashes such as this one, Earnest proposed slyly. His message: “Yeah, the conservatives did it.”

Before long, this theory had become omnipresent on the Left. At PoliticsUSA, Sarah Jones complained that, “gambling with Americans’ lives,” “reckless Republicans” were planning to respond to the “deadly derailment with more proposed cuts to Amtrak.” At MSNBC, meanwhile, erstwhile transportation expert Rachel Maddow contrived to play Sherlock Holmes. “There’s no mystery about this disaster in Philadelphia,” Maddow submitted, “and there will be no mystery when it happens again.” The culprit, she proposed, was a lack of infrastructure spending. “This is on Congress’s head.” Not to be outdone, Mother Jones got in on the act, too: “The Amtrak Crash,” Sam Brodey declared excitedly, “Hasn’t Stopped Republicans From Trying to Cut Its Funding.” Well, then.

In all cases, the implication was clear: The dead were dead and the injured were injured because old rails had buckled under new weights; because underserviced wheels had locked up and given out; because the electrical wires that undergird the information systems had finally disintegrated and gone back to seed. Thus was a new tragedy ghoulishly recruited to an old cause. Rare is the day on which we are not told that America’s bridges are crumbling and that its roads are cracking, and that selfish and unimaginative politicians in Washington are rendering the United States a shadow of its former self. Rare, too, is the day on which it is not asserted by someone that if we would just have the good sense to funnel more money to our favorite groups, we would be able to escape our present economic mess. With the news of a terrible crash, the would-be spenders were given a chance to wave the bloody shirt and to put a face on an agenda. Disgracefully, they took it.

In a sensible world, this execrable line of inquiry would have been abandoned at the very moment that it was revealed that the train had been traveling at almost twice the rated speed limit when it flew off the tracks, and thus that physics, not funding, was the proximate cause of the crash. But, alas, we do not live in a sensible world. And so, rather than conceding that we should treat the questions of infrastructure spending and of Amtrak’s subsidies separately from the questions surrounding this incident, the partisans scrabbled around to find an alternate — and conveniently non-falsifiable — theory: To wit, that if more money had been available to Amtrak’s engineers, they would probably have been able to find a way of saving the deceased. Never mind that the money is already there, but is being spent elsewhere; never mind that the reason that existing “crash-preventing” technology has not been implemented has more to do with “unique” “logistical challenges” than with an absence of funding; never mind that new technology is as capable of failing as old technology. If Amtrak had just had some more money in the bank, something would have been different. If we had rendered unto Caesar what his acolytes had demanded, the laws of physics would have smiled more kindly on the Northeast.

At the Federalist yesterday, Mollie Hemingway argued persuasively that this sort of magical thinking is ultimately born of a peculiar form of secular theodicy, in which money has taken the place of piety and in which all accidents, hiccups, and human mistakes can be blamed squarely upon the unwillingness of the American taxpayer to pay their April tithes with alacrity. On Twitter, RedState’s Erick Erickson concurred, writing pithily that “the leftwing reaction to the Amtrak derailment” reminded him of televangelist “Pat Robertson’s reaction when a hurricane hits somewhere.” There is, I think, a great deal of truth to this. In our debates over education, healthcare, energy, and . . . well, pretty much everything, the progressive instinct is invariably to call for more money, regardless of the nature of the problem at hand. Naturally, there is a cynical pecuniary aspect to these entreaties: behind every “for the children” plea, it seems, is a union that is looking to get its claws into your wallet. But there is also a bloody-minded refusal to accept the world as it really is. We do not, pace Thomas Paine, “have it in our power to begin the world over again,” and we never will — however many zeroes the Treasury is instructed to scrawl on its checks. Accidents happen. Humans err. Evil prevails. Perfection is a pipe dream. The question before us: How do we deal with this reality?

On the left, the usual answer is to deny that there is any such reality. Just as conspiracy theorists prefer to take shelter in the comforting belief that 9/11 was the product of omnipotence and not of the unavoidable combination of evil, luck, and incompetence, the progressive mind tends to find calm in the heartfelt conviction that if we adjust our spreadsheets in the right way — and if we elect the correct people to public office — we will be able to plan and spend and cajole our way into the establishment of a heaven on earth. Thus did the arguments yesterday so dramatically shift and bend in the wind. Thus were their progenitors willing to say anything — yes, anything — in order to avoid the conclusion that the world can be a scary and unfair place and that there is often little we can do about it. The crash was caused by a lack of infrastructure spending that has left the railways in a dangerous shape! No, it was caused by a lack of interest in finding a way to prevent human error! No, it was caused by a general American unwillingness to invest in the sort of trains they have in Europe or Japan! Republicans did it! Midwesterners who don’t use trains did it! The rich did it! Quick, throw money at the problem, and maybe it’ll go away!

Throwing money at a problem is not always the wrong thing to do, of course. But one has to wonder where the limiting principle is in this case. There is no department or organization in the world that would struggle to find a use for more cash were it to become available. If our standard is a) that more funding might potentially equal less death, and b) that all death must inevitably be assuaged by more funding, we will soon run out of treasure. Alternatively, if the conceit is less absolute — i.e., if we accept that we do not have infinite resources and that this debate is therefore about priorities — one will still have to question the choices that Amtrak’s boosters would have us make. To support federal spending on Amtrak is by definition to suppose that every dollar spent on the trains is money that could not be spent better elsewhere: not by taxpayers; not by businesses; not by other parts of the government; not on paying down the debt; not on anything else on this earth. This, naturally, is highly debatable. Per the agenda-less, data-driven denizens of Vox, Americans today are 17 times more likely to be killed in a car accident and 213 times more likely to be killed on a motorcycle than they are to be killed on a train. Trains, in other words, are relatively safe. That being so, one has to ask why anybody would advocate increasing the train budget. By rights, shouldn’t that money be going to General Motors or to Harley-Davidson or to the various DMVs up and down the land? Shouldn’t it be “invested” in areas where it will be 17 and 213 times more useful? If it should not, why not? Why do those who wish to spend money on the trains and not on motorcycle safety not have blood on their hands, just as we are supposed to believe that those who wish to cut Amtrak’s budget do? Surely if Harley-Davidson had a little more money, they could develop systems to save lives. Why, pray, are they being denied that money?

One’s answers to these questions will vary according to one’s ideological outlook and one’s broader political judgment. For my part, I am not wild about the idea of subsidizing Amtrak at all. Others, I know, want the state to underwrite a much wider network of trains, the better to discourage Americans from flying or from driving their cars. Such disagreements are reasonable and, perhaps, inevitable. And yet they are only instructive when indulged dispassionately. It may make us feel good to hover over rapidly cooling bodies and, searching for anything that might assuage our grief, entertain our “what ifs” and nominate our villains. But it is certainly no grounds for the establishment of public policy. Whether they are broke or they are flush, terrible — yes, even fatal — things happen to good people all the time. Accepting that this is inevitable is the first step toward maturity. In Philadelphia, the inevitable happened; and “shoulda, woulda, coulda” were the last words of the charlatans.

Source

Spineless


From Charles C.W. Cooke:

It just gets worse. Yesterday evening, I lamented the astonishing news that the American film industry was being dictated to by a bunch of North Korean hackers. At the time of writing, the following had happened:

First, Sony Pictures, which produced the film, canceled tomorrow’s inaugural showing. (“Security concerns,” natch.) Then the Carmike Cinemas chain, which owns 278 theaters in 41 states, announced that it would not be showing it at all. In the last few hours, the Hollywood Reporter has suggested, the other four giants of American cinema — Regal Entertainment, AMC Entertainment, Cinemark, and Cineplex Entertainment — elected to join in the boycott. And, finally, the studio pulled the December 25 release entirely.

Now, per Reuters, Sony has pulled the whole thing:

Sony Pictures has canceled the release of a comedy on the fictional assassination of North Korea’s leader, in what appears to be an unprecedented victory for Pyongyang and its abilities to wage cyber-warfare.

Hackers who said they were incensed by the film attacked Sony Corp (6758.T) last month, leaking documents that drew global headlines and distributing unreleased films on the Internet.

Washington may soon officially announce that the North Korean government was behind the attack, a U.S. government source said.

This decision is not restricted to theaters:

“Sony has no further release plans for the film,” a Sony spokeswoman said on Wednesday when asked whether the movie would be released later in theaters or as video on demand.

In other words, a group of computer experts — which may or may not be backed by the North Korean government — has managed to convince a major American industry to write off a $44 million investment because . . . they don’t like some of its jokes. How utterly grotesque. How shameful. How antithetical to all of those principles for which the people of the United States are supposed to stand.

Worse still, those hackers have managed to convince Hollywood to cancel future projects, too. Deadline Hollywood reports:

The chilling effect of the Sony Pictures hack and terrorist threats against The Interview are reverberating. New Regency has scrapped another project that was to be set in North Korea. The untitled thriller, set up in October, was being developed by director Gore Verbinski as a star vehicle for Foxcatcher star Steve Carell. The paranoid thriller written by Steve Conrad was going to start production in March. Insiders tell me that under the current circumstances, it just makes no sense to move forward. The location won’t be transplanted. Fox declined to distribute it, per a spokesman.

If this is to be our approach, why not formalize the arrangement and run every script idea past our enemies before production starts? In fact, why not submit all creative speech to the roving gangs of outrage merchants and armed hecklers to which our appeasement is granting hope? I daresay that most books, movies, newspapers, television shows, and websites contain material that someone doesn’t like. Why not include them in the party, too? Sure, by the time that an idea has been scrutinized by Tehran, Moscow, and Berkeley, it will probably have been stripped of all its charm. But we wouldn’t want anybody to be upset, would we?

I’ve heard people arguing that this reflects “only on Sony.” But it doesn’t, really. It reflects on the many, many theater chains that canceled their screenings. It reflects upon a general corporate culture that is just too damn risk averse. And it reflects upon the zeitgeist, within which caving to pressure from the “offended” has become the unlovely norm. We are now reaching a point at which no college commencement speaker is permitted to do his thing unless he has been neutered, read whatever catechism is in vogue this week, and then had his entire past vetted by the most boring, self-indulgent, ignorant people in the world. George Will wasn’t allowed to speak at Scripps because a few of its students objected to a column he’d written in a newspaper. What exactly did we think was going to happen when the censors threatened to turn up with guns?

America doesn’t feel very free, or very brave, today.

Source

The Need for Transcendence

From Charles C.W. Cooke:

Per the New York Daily News:

A suspect in the beheading of American journalist James Foley is a British-raised rapper who left his parents’ million-dollar London home last year to fight for radical Islam in Syria.

Homegrown jihadist Abdel-Majed Abdel Bary, a 23-year-old rapper, may be the masked man who severed Foley’s head with a knife in a YouTube video in retaliation for U.S. airstrikes on ISIS in Iraq, according to reports in several British papers. . . .

In July 2013, he posted on Facebook, “The Unknown mixtape with my bro tabanacle will be the last music I’m ever releasing. I have left everything for the sake of Allah.”

On Aug. 13, he tweeted a photo of himself in Iraq holding a severed head with the caption, “Chilllin’ with my homie or what’s left of him,” The Times of London reported. His Twitter account was suspended soon afterward. It is unclear whose head he was holding.

Bary also tweeted a threat in June: “The lions are coming for you soon you filthy kuffs (infidels). Beheadings in your own backyard soon.”

Like many others, Bary has been taken in by an ideology — a disastrous, abhorrent, absolute, and apparently irresistible ideology. His discontent is not driven by poverty or oppression or historical experience. It’s driven by ideas, and by the human needs that those ideas seek to satiate. Bary, the Daily Mail reports, “grew increasingly radical and violent after mixing with thugs linked to hate preacher Anjem Choudary.” This, sadly, is too common a story. Look through the biographies of the 9/11 attackers. How many of them lacked food or healthcare?

Over the weekend, the New York Times’ Ross Douthat observed that the world’s uglier movements will always attract the bored, which is why, he suggested,

writing off the West’s challengers as purely atavistic is a good way to misunderstand them — and to miss the persistent features of human nature that they exploit, appeal to and reward.

These features include not only the lust for violence and the will to power, but also a yearning for a transcendent cause that liberal societies can have trouble satisfying.

As The Week’s Michael Brendan Dougherty argues, discussing the Europeans who have joined up with ISIS, liberalism’s “all-too-human order” — which privileges the sober, industrious and slightly boring — is simply “not for everyone.” Nor, most likely, will it ever be: in this century, the 22nd, or beyond.

Bary did not discover militant Islam over the Internet, but through his father:

Bary is one of six children of Adel Abdul Bary, an Egyptian militant who is facing terrorism charges in connection with Al Qaeda’s twin 1998 bombings at the U.S. embassies in Kenya and Tanzania that killed 224 people.

Nevertheless, he clearly found this lifestyle more appealing than the alternative, which was to live as an upper-middle-class musician in the West.

One reason that liberty can be difficult to preserve is that it so often lacks the romance, the heroism, and the sense of involvement that so many appear to crave. Bound by relatively few governmental or social constraints, citizens of free countries are obliged to make their own decisions, to establish and to participate in their own communities, and — crucially — to create their own sense of meaning. This can be tough — scary, even. To join a strictly defined and quasi-totalitarian movement such as IS, on the other hand, is instantly to feel a sense of belonging. As someone who is keenly motivated by a desire to leave people alone, I find it distressing to acknowledge that action-based collectivist philosophies are much, much more popular than I would wish. But they are. Why would Abdel-Majed Abdel Bary want to involve himself with a bunch of such extraordinary thugs? Well, at least they’re doing something.

Source

In Praise of Gridlock

From Charles C.W. Cooke:

A political cartoon, published in a newspaper at some point in the early 1990s, has long been burned into my memory. In it, newly elected President Clinton is being shown around the White House by a man in a butler’s uniform. Clinton arrives at a wall on which sit an abundance of political levers and buttons and, thrilled, his eyes widen. Yet he is quickly disappointed. “Sorry, sir,” the orderly’s speech bubble reads, “but these are all connected to Congress.”

The brinkmanship, gridlock, and rancor that the fight over the continuing resolution has yielded are disliked, at least in some manner, by almost everyone involved. But opinions as to what might be done about it vary wildly. On Friday, Wonkblog’s Dylan Matthews provided a suggestion: Americans, he wrote, “oughta start thinking seriously about how to prevent divided government from ever happening again.”

This is not partisan posturing. On the contrary, Matthews earnestly and consistently believes that America’s system is intrinsically unviable, and that it is to blame for our current predicament. And he is tapping into a sentiment that is reasonably popular among his peers. The last time that the United States teetered toward a shutdown or a default, Slate’s Matthew Yglesias wrote at length about what he regards as the “long-simmering problems with the basic structure of American political institutions.” Were Yglesias to draw the next panel of the old cartoon, he would presumably have Clinton do some rewiring.

This line is not a new one. Hostility toward America’s rigid separation of powers has a rich, if unappealing, history on the Left. Woodrow Wilson — a man whose animus to the constitutional order that he had sworn to uphold approached almost treasonable levels — was savvy enough to recognize that the expansive long-term ambitions of the Progressive movement were simply incompatible with the country’s founding documents. In consequence, Wilson proposed, Americans should change their expectations of government, invest their democratic ambitions in one man (the president), and abandon the country’s messy political settlement in favor of a streamlined and “efficient” state that was more akin to that in the Kaiser’s Germany or in the King’s Britain. “The Constitution,” Wilson wrote, “was not made to fit us like a straitjacket. In its elasticity lies its chief greatness.”

While his insistence that the Constitution was not supposed to be a “straitjacket” is incorrect, Wilson and his descendants are correct in their basic complaint: Separation of powers is inefficient; it is an obstacle to substantial change; and it does not merely “allow” gridlock but is explicitly designed to encourage it. Where they are wrong is to conclude that this should change with the times. The Constitution is the product of abiding insight into politics — an insight that does not change with the wind. Rather amazingly, Yglesias claims the opposite to be the case: The problem of gridlock, he wrote in 2011, stems directly from the Founders’ having had “little in the way of experience to guide them in thinking about how political institutions would evolve.”

This is not simply untrue; it is the perfect opposite of the truth. Having watched the radical transformation of the British system during the 17th and 18th centuries — and studied the undulations of the classical world, for good measure — most of the Founders were strikingly well versed in political theory. The introduction of limiting tools such as the rule of law, term restrictions, a codified constitution, a bill of rights, and divided government was intended to dispense with the presumption, famously termed “elective dictatorship” by Lord Hailsham, that the man who is voted in as leader every four or so years should have carte blanche to get things done. In other words, the Founders sought to block precisely what Yglesias and his cohorts covet. Nobody is perfect, of course, but I would wager everything I own that the architects of America were more au courant with the vagaries of human nature and the concentrating tendency of political actors than are the writers at Slate.

In some respects, Wilson has got his wish. Witness, for example, the peculiar manner in which many citizens, journalists, and legislators have presumed that Obama’s wishes for the congressionally designed budget should be the national starting point. Why? Because he won election to head up the executive branch, obviously! It seems that our debate has been upside-down from the start: Constitutionally speaking, if any elections should suggest the direction of the budget and of the laws, they are the 435 that determine the composition of the House. Alas, this no longer appears to be the case.

The truth that dare not speak its name is that the pronounced disharmony on show in the United States has a clear root cause — and it is not the structure of government. Democrats who complain that the House is being particularly obstinate are absolutely correct — it is. But rarely do they stop and ask “Why?” It seems obvious to me that at the root of our interminable trench warfare is the fact that one party made the regrettable decision to push through the most controversial piece of social legislation in a century without a single opposition vote. That party was, of course, entirely within its rights to do this when unified government presented them with the chance. Nevertheless, it is childish for it to complain that, the other side having been given a clear mandate to try to undo the measure, it is now doing just that. Elections do indeed “have consequences” — and that means all of them.

Critics of the United States correctly, if oddly, point out that the system of separated powers works only here. “We are the only country in the world in which . . .” is a typically witless refrain. In South America, where presidential democracies have been tried, gridlock has customarily led to the president “speaking for the people” by ordering a military coup and removing from the equation the legislators who demonstrated the temerity to serve as a check and a balance.

As a result of its mature political heritage and its British roots, the United States was spared this trend, blossoming quickly into a country in which the conflict that usually results from divided government is virtuously accepted by the people as the price of liberty. In America, Yale’s Juan Linz argues, strife that has led to violence in less-developed nations has become regarded as “normal.” Make no mistake: Dylan Matthews and his myopic ilk would unashamedly like to change this, rendering illegitimate the positions of the minority and subjugating the exquisite fractiousness of Congress to the imperium of a national leader. This is, of course, a prerogative they enjoy as free men. But there is nothing “progressive” about it at all.

Source

Words and Reality

To Mr. Obama, words matter. But they need not have any connection to reality.

From Charles C.W. Cooke:

I’m honestly at a loss as to where one might start with President Obama’s healthcare speech earlier. It was so utterly mendacious and cynical as to inspire awe. Perhaps one should begin with the outrageous claim that healthcare is an American “right,” which is not only untrue but was bizarrely allied with the insistence that, before Obamacare came along, health insurance had been reserved to the “privileged few” (also known as 90 percent of the country)? Perhaps one should start with that lawyerly language in which the president insisted that premiums would be “lower than expected,” an irrelevant, vague, and misleading metric that conveniently ignores his specific promise that costs would either go down or stay the same for absolutely everyone? Maybe one should start with the peculiar claim that “there’s no widespread evidence that the Affordable Care Act is hurting jobs”? Or, perhaps, with the president’s singling out of a silly and fringe HuffPo piece that compared the law to the Fugitive Slave Act, which was pulled from obscurity and launched into the mainstream in order to tar all conservatives as extremists and racists?

These were all astonishing and infuriating in equal measure. But they paled in comparison to the dishonesty of the president’s central claim, which was that the attempt to link defunding to the debt-ceiling fight is illegitimate because Obamacare has “nothing to do with the budget.” I struggle to imagine how the president could have kept a straight face when he said this. This is a law, remember, that was crowbarred through Congress with the questionable use of reconciliation, a parliamentary procedure that is reserved exclusively for budgetary matters; a law that was sold as a deficit-reduction measure; a law that contains a significant spending component, including a 5-10 percent increase in the size of the federal budget; and, alas, a law that boasts a central mandate that was upheld (rewritten) by the Supreme Court as a tax, thus ensuring that any changes to the penalties must be approved by the House. “Nothing to do with the budget”? This is what we call a lie, Watson.

Source