Wednesday, December 7, 2011

Rick Perry takes on Obama's 'war on religion'

“I’m not ashamed to admit that I’m a Christian,” the Texas governor says in the spot. “But you don’t have to be in the pew every Sunday to know that there’s something wrong in this country when gays can serve openly in the military, but our kids can’t openly celebrate Christmas or pray in school. As president, I’ll end Obama’s war on religion, and I’ll fight against liberal attacks on our religious heritage.”

But everybody can openly celebrate Christmas! They can pray in school! They just can't require that the school force everybody to do so. Obama's "war on religion"—to the extent that the president has even been part of these culture wars, which isn't much—only means that Christians don't have a privileged position in forcing the rest of society to observe their rituals.

Not that I expect to persuade anybody. There's a variety of Christian who believes that if they're not allowed to dominate society, they're being persecuted. These folks vote Republican.

It's probably a good thing Pearl Harbor is fading into history

On this 70th anniversary of the Pearl Harbor attack, there seems to be a great deal of hand-wringing that the event will soon pass out of our collective living memory. Today's New York Times story is pretty typical of the angst:
The fact that this moment was inevitable has made this no less a difficult year for the survivors, some of whom are concerned that the event that defined their lives will soon be just another chapter in a history book, with no one left to go to schools and Rotary Club luncheons to offer a firsthand testimony of that day. As it is, speaking engagements by survivors like Mr. Kerr — who said he would miss church services on Sunday to commemorate the attack — can be discouraging affairs.

“I was talking in a school two years ago, and I was being introduced by a male teacher, and he said, ‘Mr. Kerr will be talking about Pearl Harbor,’ ” said Mr. Kerr. “And one of these little girls said, ‘Pearl Harbor? Who is she?’

“Can you imagine?” he said.
Well, yeah, I can imagine. I don't have any idea how old this girl was, but it's entirely conceivable—even probable—that Pearl Harbor took place before her grandparents were born. This isn't just history to today's elementary school students: It's ancient history. Put it this way: If you were in elementary school 30 years ago—as I was—how much did you know and understand about World War I? I was a kid when this "Cheers" episode came out, and I remember being astonished as a child that there were any veterans of that war left.

This isn't a call to let Pearl Harbor slip from our collective memories. "Those who forget the lessons of history are doomed etc." But it's probably not a bad thing to let that memory become a little less urgent. There are plenty of cultures around the globe which harbor grudges from wars and battles that took place centuries upon centuries ago—those memories have remained urgent, often with the result that those cultures have a hard time moving into the future: They're too busy clinging to the past. There are still people who hate the Japanese because of Pearl Harbor. What a wasted, useless emotion.

And there are some folks who use their observance of the anniversary as a kind of "more American than thou" proclamation, a cudgel against those who don't keep the flame burning quite as bright. I guess I don't have much patience for that.

The longer our country and culture survive, the more battles we'll have under our belt. They'll seem, and be, life-shattering at the time. But we can remember them without carrying them into our present, and we probably should: It's healthiest to eventually let the old battles go. I salute the survivors of Pearl Harbor, but it's not a sin to let the memory fade, just a bit, as they fade away.

Monday, December 5, 2011

Stay-at-home dadding: Turns out I'm a trendsetter

You wouldn't know it from Fitler Square, where I'm often the lone daddy in a sea of mommies and nannies, but it turns out that stay-at-home dadding is increasingly common:
Among fathers with a wife in the workforce, 32 percent were a regular source of care for their children under age 15, up from 26 percent in 2002, the U.S. Census Bureau reported today. Among these fathers with preschool-age children, one in five fathers was the primary caregiver, meaning their child spent more time in their care than any other type of arrangement.
I'm lucky, in that my career and skills make it possible for me to earn money while staying at home with my son. On one hand, it's an economic no-brainer: Child care is frickin' expensive, and my staying home while writing subtracts that cost from our burdens while still letting me make enough money to pay the rent.

And I'm also lucky that I get to spend so much time around my son during his formative years. My dad was a hard worker: When I was young he was in college and worked full-time, and after he graduated he was on the road a lot; I got plenty of fathering, believe me, and everything he did was in the service of supporting his family. But I also know that I've had more of a chance to watch my son grow than he—or, really, all but a few men of his generation—ever did. There's a tradeoff: I'm not getting rich or skyrocketing to the top of my profession right now. Often, though, I wake up these days with my 3-year-old son climbing into bed with me and throwing his arms around my neck. It's a privilege to get that and still make a living, I realize. I might as well enjoy it.

A death blow for printed magazines

News from the United States Postal Service:
Unprecedented cuts by the cash-strapped U.S. Postal Service will slow first-class delivery next spring and, for the first time in 40 years, eliminate the chance for stamped letters to arrive the next day. 
The estimated $3 billion in reductions, to be announced in broader detail later Monday, are part of a wide-ranging effort by the Postal Service to quickly trim costs and avert bankruptcy. They could slow everything from check payments to Netflix's DVDs-by-mail, add costs to mail-order prescription drugs, and threaten the existence of newspapers and time-sensitive magazines delivered by postal carrier to far-flung suburban and rural communities.
According to the story, periodicals could take up to nine days to reach their destination through the mail. Which should pretty much destroy printed magazine subscriptions.

Maybe not all of them: If you subscribe to one of those Home and Garden magazines, it probably doesn't make much difference when you get them. But if you read something even a little time-sensitive—The New Yorker, The Atlantic, Time, etc.—you're running out of reasons to stick with print. Digital subscriptions will keep you up to date just fine.

Tim Tebow's ostentatious faith

National Review's Mario Loyola tries to get to the bottom of why Denver Broncos quarterback Tim Tebow annoys people so much, and concludes: "People aren’t upset at Tebow’s God talk. They’re upset that he might actually believe it."

Meh. Tim Tebow doesn't bother me one way or another, though I admit to finding his success this season rather fascinating. (And I'm not really a football guy.) Nonetheless, when I see his ostentatious displays of faith on the field, I'm reminded of some old gospel verses:
5 “And when you pray, do not be like the hypocrites, for they love to pray standing in the synagogues and on the street corners to be seen by others. Truly I tell you, they have received their reward in full. 6 But when you pray, go into your room, close the door and pray to your Father, who is unseen. Then your Father, who sees what is done in secret, will reward you. 7 And when you pray, do not keep on babbling like pagans, for they think they will be heard because of their many words. 8 Do not be like them, for your Father knows what you need before you ask him.
Guy named Jesus supposedly said that. But I'm sure Tim Tebow knows better.

(Photo from Tebowing.com)

Buskers are not the problem at Washington Square Park in New York

In New York, the city is busting buskers for violating an obscure rule against "vending" near monuments:
The department’s rule, one of many put in place a year ago, was intended to control commerce in the busiest parks. Under the city’s definition, vending covers not only those peddling photographs and ankle bracelets, but also performers who solicit donations. 
The rule attracted little notice at first. But the enforcement in Washington Square Park in the past two months has generated summonses ranging from $250 to $1,000.
I've actually spent a little time in Washington Square Park, and if there's a "commerce" problem there, it really isn't the buskers: It's all the open-air drug dealing. There's nothing subtle about it. And there's something both misguided and embarrassing about a municipal government that would focus on cracking down on folk singers and mimes while leaving the dealers relatively undisturbed. Seems to me the latter pose the far greater threat to the quality of life in the park and neighborhood.

Siri is just a crappy doctor

My PhillyMag editor Erica Palan loves her new iPhone 4S, but she's not worried that Siri won't point her to the nearest abortion doctor:
When I asked Siri where I could get a Pap smear nearby, she didn’t understand. She was similarly confused when I asked her where I could get a mammogram, a colonoscopy and a prostate exam. Also, when I told her I had a broken hand, she asked for an email address instead of recommending hospitals or doctors. When I asked for advice for my bunion, she responded, “I don’t know what you mean by Bunyon. [sic]” After wondering who could fix my toothache, Siri asked me to provide a contact name. She’s not pro-life. She’s just not a very good medical professional. 
Siri doesn’t know how to correlate the names of specific medical procedures to nearby results. Ask Siri where the nearest Planned Parenthood is and she instantaneously provides addresses. She also suggested 16 gynecologists in my area. Ask her for an emergency dentist and she serves up search results within seconds.
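For what it's worth, the gap Palan describes is easy to picture. Here's a minimal sketch of the distinction (purely my guess at the design, not anything Apple has published), assuming Siri's local search matches business names and categories but lacks a table mapping procedures to the providers who perform them:

```python
# A toy illustration of the gap: a local-search index keyed on business
# names and categories, with no knowledge of medical procedures.
# All listings and mappings below are invented for the example.

LOCAL_BUSINESSES = {
    "planned parenthood": "Planned Parenthood - 1144 Locust St",
    "gynecologist": "Dr. Example, OB/GYN - 123 Walnut St",
    "dentist": "Example Dental - 456 Chestnut St",
}

# The lookup Siri apparently lacks: procedure -> type of provider.
PROCEDURE_TO_PROVIDER = {
    "pap smear": "gynecologist",
    "toothache": "dentist",
}

def naive_search(query: str) -> str:
    """Match only on business names/categories; fails on procedure names."""
    return LOCAL_BUSINESSES.get(query.lower(), "I don't know what you mean.")

def smarter_search(query: str) -> str:
    """First translate a procedure into a provider type, then search."""
    key = PROCEDURE_TO_PROVIDER.get(query.lower(), query.lower())
    return LOCAL_BUSINESSES.get(key, "I don't know what you mean.")

print(naive_search("planned parenthood"))  # finds an address
print(naive_search("pap smear"))           # "I don't know what you mean."
print(smarter_search("pap smear"))         # finds the gynecologist
```

The second lookup table is the whole trick: "Planned Parenthood" is a business name, so a naive search finds it instantly, while "Pap smear" needs a translation step first. Not pro-life, just a missing table.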

The rich aren't so different from you and me, politically

When the party leanings of independents are taken into account, 57% of the nation's wealthiest adults associate themselves with the Republican Party, compared with 44% of the "99%." At the same time, Gallup polling finds little difference in the two groups' ideological views. Among the very wealthy, 39% say their political views are conservative, 41% call themselves moderate, and 20% liberal, similar to the percentages seen among all others.

But even if their self-identifications are similar, that doesn't mean their priorities—and their ability to press for those priorities—are the same. Lots of people call themselves "fiscal conservative/social liberal," for example, but that means a lot of different things depending on who is doing the talking.

The problem with third parties

Philadelphia Daily News columnist John Baer gets Third Party Fever:
WANT AN alternative to a 43-percent-job-approval president and whoever the Republican "Gong Show" offers? Fed up with Obama but afraid of the right?
Well, there's a move afoot, kicked off by a billionaire biz guy, to nominate a centrist, bipartisan ticket through a national online convention.

It's scheduled for next June, run by something called Americans Elect and open to all registered voters.

Its pitch: Pick a president, not a party. It plays to national disgust with Washington gridlock, Democratic disappointment in Obama and GOP angst over the Republican field. And it isn't a bad idea.
It's not a great idea. It would be nice to break through the Washington gridlock on occasion, yes, but electing a third party president won't do much to end that. If you really want to shake things up, you've got to start electing Americans Elect candidates to Congress.

That's arguably harder to do than fielding a candidate for the presidency. To run for Congress, you've actually got to develop a local constituency, and to do that you probably need to have an agenda somewhat more detailed than "those guys suck"—voters need to know, generally, what you stand for and what you'll try to get accomplished in office. And if you actually get to Congress, there's no guarantee that your presence there won't further calcify the divisions.

But Congress, not the presidency, is where the gridlock is. Third party efforts suck because they focus on the unattainable goal of hitting a home run and capturing the presidency; they skip the hard work that would really be needed to field a consistently credible alternative party.

Sunday, December 4, 2011

Federalist 54, slavery, and 'The 1 Percent'

Like a lot of liberals, when I think about the Constitution's original provision that counted slaves as three-fifths of a person for the purpose of apportioning representation in Congress, I often think about it in racial terms: They were literally saying that black people were less than fully human! Sometimes I think about it in political terms: Southern politicians were accruing power—and thus preserving slavery—by giving slaves any human weight at all. But I don't often think about it in economic terms.

Federalist 54 changes that for me. This is the paper in which James Madison must justify the three-fifths apportionment to the people of New York. And his primary justification is this: Slaves are a form of wealth. And wealth deserves a little extra representation in the halls of government.

No really. This is what Madison writes:
"After all, may not another ground be taken on which this article of the Constitution will admit of a still more ready defense? We have hitherto proceeded on the idea that representation related to persons only, and not at all to property. But is it a just idea? Government is instituted no less for protection of the property, than of the persons, of individuals. The one as well as the other, therefore, may be considered as represented by those who are charged with the government. Upon this principle it is, that in several of the States, and particularly in the State of New York, one branch of the government is intended more especially to be the guardian of property, and is accordingly elected by that part of the society which is most interested in this object of government. In the federal Constitution, this policy does not prevail. The rights of property are committed into the same hands with the personal rights. Some attention ought, therefore, to be paid to property in the choice of those hands.

"For another reason, the votes allowed in the federal legislature to the people of each State, ought to bear some proportion to the comparative wealth of the States. States have not, like individuals, an influence over each other, arising from superior advantages of fortune. If the law allows an opulent citizen but a single vote in the choice of his representative, the respect and consequence which he derives from his fortunate situation very frequently guide the votes of others to the objects of his choice; and through this imperceptible channel the rights of property are conveyed into the public representation. A State possesses no such influence over other States. It is not probable that the richest State in the Confederacy will ever influence the choice of a single representative in any other State. Nor will the representatives of the larger and richer States possess any other advantage in the federal legislature, over the representatives of other States, than what may result from their superior number alone. As far, therefore, as their superior wealth and weight may justly entitle them to any advantage, it ought to be secured to them by a superior share of representation."
Basically: Rich men have disproportionate influence on selecting representatives in government—more because of their awesomeness than because they purchase it, it seems—and so should rich states. That's why it's fair to (mostly) count slaves when determining a state's representation in the House of Representatives.

The logic, as Madison admits, is "a little strained." If wealth determines representation, then why not make a tally of all the assets within a state and determine representation accordingly? The answer, it appears, is that slaves can be punished for committing crimes—that separates them from mere livestock, and, well, it all gets very depressing to read and think about.

But Federalist 54 is interesting in light of the recent "Occupy Wall Street" protests. At the heart of the demonstrations, I believe, is a belief that every citizen should have roughly equal representation in the federal government—the anger against "The 1 Percent" is anger not just that rich people are getting richer much faster than the rest of us, but that they have disproportionate influence with our government to bend policies to their will. To the protesters, that seems undemocratic—a betrayal of the American promise.

If we're to take Madison at his word, though, the problem is actually pretty foundational: The idea that wealth deserves more say in the halls of our democratic government seems at odds with the "one person, one vote" ideals we're usually taught, but it's baked into our government's DNA, part of the founding documents. 

Karen Heller's oversight on Herman Cain

At the Philadelphia Inquirer today, Karen Heller pooh-poohs the idea that a candidate—Herman Cain, say—should have to give up pursuing the presidency just because of adultery allegations. "Adoring your spouse is an admirable quality, particularly in one's own partner. But fidelity shouldn't be the determining factor on which candidate gets your vote. Richard Nixon was faithful to Pat, just not to the Constitution," she writes.

I don't entirely disagree: I wrote something similar back when Anthony Weiner was in hot water. But that said: It's absolutely a good thing that Cain was driven from the race.

Why? Not because of the adultery allegations, at least not on their own. The problem is that the adultery allegations came after stories that Cain had sexually harassed subordinates back when he was running the restaurant lobbying association. Essentially: He used and abused his power to try to get women to go to bed with him. Those allegations didn't merely suggest Cain was a bad boy in his private life; they suggested that Cain handled the perks of leadership in selfish, abusive, distorted fashion. That should be the concern of voters—and to my mind, should've been enough to drive him from the race on their own. The adultery allegations were the straw that broke the camel's back.

So I agree with Heller: Adultery, on its own, shouldn't be a disqualifier from high office. When it's combined with power abuses, though, there's a real problem. It would've been nice if she'd at least acknowledged that part of the story, instead of stripping it down to a mere tale of adultery.

Friday, December 2, 2011

I wonder what Newt thinks of this?

Children as young as 12 toil on farms as long as 12 hours a day, six to seven days a week, often in sweltering conditions, a recent report by Human Rights Watch found. Because of biological characteristics (such as a greater surface-area-to-body-mass ratio and a lower sweat capacity) and a reduced tendency to know when to take a break in response to heat symptoms, young farm workers are particularly at risk of excessive heat exposure, Public Citizen said in its comments.

Reserving its objections to the practice of child labor, to which it is opposed, Public Citizen called on the DOL to establish a heat-stress threshold that requires employers to take immediate action to prevent the onset of heat injury, among other protective measures.

Do we deserve a Great Depression because the Greeks were irresponsible?

Rod Dreher seems to think so. Here he is, commenting on David Brooks' column sticking up for Germans who don't want to bail out their Eurozone counterparts:
I wonder what would be worse: a Depression that serves as nemesis for the hubris of the Eurozone tower of Babel, or saving the Eurozone by throwing overboard the “precious social construct” of moral hazard and an economic system that rewards virtue and punishes vice.
Those are two bad choices, but you know what? The Depression is worse. In the latter scenario, people who don't deserve to live comfortable lives get to do so—but so do the people who do deserve to. In the former scenario, people who made bad choices pay for them—but so do a lot of other people who didn't. I'm not a fan of tripping lightly over moral hazard, but I'm even less a fan of the misery that accompanies a Depression. And there's no telling where those ramifications would end. The last Depression ended with a genocidal world war, after all.

Is Islamic terrorism worse than other terrorism?

I'm perusing a Congressional Research Service report on "homegrown jihadism" in the United States—it'll take a bit to digest—but I couldn't help but notice the kicker to this paragraph:
How serious is the threat of homegrown, violent jihadists in the United States? Experts differ in their opinions. In May 2010 congressional testimony, terrorism expert Bruce Hoffman emphasized that it is, “difficult to be complacent when an average of one plot is now being uncovered per month over the past year or more—and perhaps even more are being hatched that we don’t know about.” By contrast, a recent academic study of domestic Muslim radicalization supported by the National Institute of Justice reveals that “the record over the past eight years contains relatively few examples of Muslim-Americans that have radicalized and turned toward violent extremism” and concludes that “homegrown terrorism is a serious but limited problem.” Another study has suggested that the homegrown terrorist threat has been exaggerated by federal cases that “rely on the abusive use of informants.” Moreover, the radicalization of violent jihadists may not be an especially new phenomenon for the United States. Estimates suggest that between 1,000 and 2,000 American Muslims engaged in violent jihad during the 1990s in Afghanistan, Bosnia, and Chechnya. More broadly, terrorism expert Brian Michael Jenkins notes that during the 1970s domestic terrorists “committed 60-70 terrorist incidents, most of them bombings, on U.S. soil every year—a level of activity 15-20 times that seen in most years since 9/11.” Few of the attacks during the 1970s appear to have involved individuals motivated by jihadist ideas.
So, no big deal then, right?

Now, it's true that jihadists scored one really spectacular attack with 9/11—and that attack, not all the small-bore and (mostly) ineffective operations since then, is what we've decided to address with the creation of the Department of Homeland Security and the reorientation of our national security infrastructure over the last decade. It's understandable, if not always laudable.

But the truth is that 1970s radicals were, on an ongoing basis, more deadly than American-grown jihadists. And it's also true that a government agency that points out that fact feels compelled to add something along the lines of: "Sure, the radical hippies committed a lot more bombings. But they weren't Muslim or anything."

Don't celebrate those new job numbers too much

However, at this pace of job growth, it will be more than two decades before we get back down to the pre-recession unemployment rate. Moreover, a shrinking labor force is not the way we want to see unemployment drop.  At this rate of growth we are looking at a long, long schlep before our sick labor market recovers.
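The "more than two decades" figure is just arithmetic. Here's a back-of-the-envelope sketch using round numbers I'm supplying purely for illustration (they are not the report's actual inputs): if the economy is short roughly 10 million jobs, payrolls grow by about 140,000 a month, and the labor force grows by about 100,000 a month, the net monthly gain against the gap is only 40,000 jobs.

```python
# Back-of-the-envelope: how long to close the jobs gap at a given pace?
# These are illustrative round numbers, not the report's actual figures.

jobs_gap = 10_000_000         # jobs short of pre-recession employment
monthly_job_growth = 140_000  # new payroll jobs per month
labor_force_growth = 100_000  # new workers entering the labor force per month

net_monthly_gain = monthly_job_growth - labor_force_growth  # 40,000/month
months_to_close = jobs_gap / net_monthly_gain               # 250 months

print(f"{months_to_close:.0f} months, about {months_to_close / 12:.1f} years")
# -> 250 months, about 20.8 years
```

That's the whole grim point: job growth only closes the gap to the extent it outruns labor force growth, and at anything like the current pace the remainder is measured in decades.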

Something is really, really wrong with the economy

The Commerce Department’s Bureau of Economic Analysis reports that in the third quarter, wages as a share of gross domestic product were the lowest they’ve been since 1929, and compensation (that includes health insurance) as a share of GDP was at its lowest point since 1955. Corporate profits as a share of GDP, by contrast, are the highest they’ve been since 1929.

It's not a depression. But it's depressing.

According to the study, to be released Friday by the John J. Heldrich Center for Workforce Development at Rutgers, just 7 percent of those who lost jobs after the financial crisis have returned to or exceeded their previous financial position and maintained their lifestyles.

The vast majority say they have diminished lifestyles, and about 15 percent say the reduction in their incomes has been drastic and will probably be permanent.

About those defense budget cuts

Here's a graph from the Congressional Research Service showing what proposed "cuts" to the defense budget mean: We still spend more ... just not quite as fast as we expected to.
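The trick, as ever, is that Washington scores "cuts" against a projected baseline, not against current spending. A quick sketch of how that scoring works, with invented numbers purely for illustration: a budget that was projected to grow 5 percent a year but instead grows 2 percent a year gets booked as a "cut," even though it rises every single year.

```python
# How a growing budget gets scored as a "cut": the reduction is measured
# against the projected baseline, not against last year's spending.
# All numbers are invented for illustration.

budget = 700.0  # starting budget, in $ billions
baseline_growth, revised_growth = 0.05, 0.02

baseline, revised = budget, budget
for year in range(1, 11):
    baseline *= 1 + baseline_growth
    revised *= 1 + revised_growth
    print(f"Year {year:2d}: baseline ${baseline:6.1f}B, "
          f"actual ${revised:6.1f}B, 'cut' ${baseline - revised:5.1f}B")

# The "actual" column rises every year; the "cut" is only the gap between
# what we planned to spend and what we spend instead.
```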

Paul Pillar on the Nazi analogy approach to foreign policy

There also is other, broader and longer term damage from the loose, profligate playing of the Nazi card. Repeatedly playing the card represents a failure to discriminate among different levels of threat. That undermines the tailoring of policy responses to make them appropriate for each threat. More specifically, it diminishes appreciation for the enormous magnitude of what the real Nazis did. If even problems that do not come anywhere close to what they did are rhetorically equated with Nazism, then the currency of discourse about human evil is debased. The rhetorical equation undermines understanding of the gigantic scale of the evil that the Nazis perpetrated, including the Holocaust.

Winning wars is OK. Waging wars is better.

At The National Interest, John Mueller suggests that Obama won't get much electoral lift from winning the Libya war—we're still talking about that?—because presidents rarely do:
Nobody gave much credit to Bush for his earlier successful intervention in Panama, to Dwight Eisenhower for a successful venture into Lebanon in 1958, to Lyndon Johnson for success in the Dominican Republic in 1965, to Jimmy Carter for husbanding an important Middle East treaty in 1979, to Ronald Reagan for a successful invasion of Grenada in 1983, or to Bill Clinton for sending troops to help resolve the Bosnia problem in 1995. Although it is often held that the successful Falklands War of 1982 helped British prime minister Margaret Thatcher in the elections of 1983, any favorable effect is confounded by the fact that the economy was improving impressively at the same time.
Right: Americans expect to win wars, so you don't really get special consideration as president for getting the job done. There are really only two war-related situations that seem to make much of a difference to a president's standing:

• Losing wars is bad. Think LBJ, of course, but even the relative success of the surge in Iraq wasn't enough to overcome Americans' (entirely correct) belief that George W. Bush had mostly prosecuted the war very badly. That led to Democrats' electoral success in 2006 and 2008.

• Going to war, on the other hand, is tremendously good in the short-term. George H.W. Bush saw his tepid popularity skyrocket when he led the U.N. coalition against Saddam in 1991; his son saw a similar boost after 9/11. A lot of that depends on the perceived justness of the cause; Obama didn't get a boost, most likely, because A) Americans barely cared about the war there, B) American military involvement was mostly kept out-of-sight, and C) his administration didn't do much in terms of rallying around the flag.

And a president has to show himself to be willing to go to war. Every president is scared of looking weak, and certainly political opponents are always willing to scream "appeasement" if a rival country gains an inch anywhere in the world.

The lesson? Be willing to go to war. Make sure you win it. Losing is really the only part of the equation that is for ... losers.