Where Yinz From?

I’m working late, so all I have for you tonight is this little quiz. Give it a go.

On Reaction

Julius Evola, from the opening pages of Men Among The Ruins:

Recently, various forces have attempted to set up a defense and a resistance in the sociopolitical domain against the extreme forms in which the disorder of our age manifests itself. It is necessary to realize that this is a useless effort, even for the sake of merely demonstrative purposes, unless the disease is dealt with at its very roots. These roots, as far as the historical dimension is concerned, are to be found in the subversion introduced in Europe by the revolutions of 1789 and 1848. The disease must be recognized in all of its forms and degrees; thus, the main task is to establish if there are still men willing to reject all the ideologies, political movements, and parties that, directly or indirectly, derive from those revolutionary ideas (i.e., everything ranging from liberalism and democracy to Marxism and communism)…

Strictly speaking, the watchword could then be counterrevolution; however, the revolutionary origins are by now remote and almost forgotten. The subversion has long since taken root, so much so as to appear obvious and natural in the majority of existing institutions. Thus, for all practical purposes, the formula of “counterrevolution” would make sense only if people were able to see clearly the last stages that the world subversion is trying to cover up through revolutionary communism. Otherwise, another watchword is to be preferred, namely reaction. To adopt it and call oneself “reactionary” is a true test of courage. For quite some time, left-wing movements have made the term “reaction” synonymous with all kinds of iniquity and shame; they never miss an opportunity to thereby stigmatize all those who are not helpful to their cause and who do not go with the flow, or do not follow what, according to them, is the “course of History.” While it is very natural for the Left to employ this tactic, I find unnatural the sense of anguish that the term often induces in people, due to their lack of political, intellectual, and even physical courage; this lack of courage plagues even the representatives of the so-called Right or “national conservatives,” who, as soon as they are labeled “reactionaries,” protest, exculpate themselves, and try to show that they do not deserve that label.

What is the Right expected to do? While activists of the Left are “acting” and carrying forward the process of world subversion, is a conservative supposed to refrain from reacting and rather to look on, cheer them on, and even help them along the way? Historically speaking, it is deplorable that a “reaction” has been absent, inadequate, or only half-hearted, lacking people, means, and adequate doctrines, right at the time when the disease was still at an embryonic stage and thus susceptible to be eliminated by immediate cauterization of its infectious hotbeds; had that been the case, the European nations would have been spared untold calamities…

Naturally, the term “reaction” intrinsically possesses a slightly negative connotation: those who react do not have the initiative of action; one reacts, in a polemical or defensive way, when confronted by something that has already been affirmed or done. Thus, it is necessary to specify that reaction does not consist in parrying the moves of the opponent without having anything positive to oppose him with. This misperception could be eliminated by associating the formula of “reaction” with that of “conservative revolution,” a formula in which a dynamic element is evident. In this context “revolution” no longer signifies a violent overthrow of a legitimate established order, but rather an action aimed at eliminating a newly emerged disorder and at reestablishing a state of normalcy. Joseph De Maistre remarked that what is needed, more than a “counterrevolution” in a polemical and strict sense, is the “opposite to a revolution,” namely a positive action inspired by the origins. It is curious how words evolve: after all, revolution, according to its original Latin meaning (re-volvere), referred to a motion that led again to the starting point, to the origins.

In conversation with my friends on the Left I often hear the phrase “the wrong side of history”; implicit in the use of this expression is the idea that it is the flow of history itself that ratifies changes in the condition of human society, rather than any higher and more permanent principle. The stark contrast between this view and that of the reactionary was brought home to me in two exchanges over the past weekend.

In the first, I replied to a remark made on Twitter about gay marriage. Someone had tweeted:

In 20 years, conservatives will be pointing out the positive effect marriage has on the gay community.

I replied:

And liberals will point out that opposition to gay marriage 20 yrs ago was just as strong as opposition to interspecies marriage is now.

In my mind this was a reductio ad absurdum, intended to show the lack of a limiting principle, and the folly of ascribing intrinsic wisdom to the entropic evolution of history.

A day later, the subject came up again, this time in private conversation with a dear, but very liberal, friend. I pointed out that, now that the ancient and universal understanding of marriage had been overthrown, marriage could defensibly become a relationship between a man and his goat.

She responded by reminding me that I was too mired in present-day attitudes, and that in a few decades it may well turn out to be considered perfectly acceptable for a man to marry his goat. What had been for me a reductio ad absurdum, then, was for her a perfectly plausible progression; in other words, it was her view that whatever such norms might become in the future, they are ratified, and justified, simply by virtue of their having evolved into whatever they will have become. This is the implicit meaning of “the wrong side of history”.

It seems to me that “reaction” stands in relation to this worldview in the same way that position is related to momentum in quantum mechanics: it is a complementary property of the human psyche. If we analyze the eigenfunction of a quantum particle so as to determine its position, we introduce uncertainty as to its momentum; by focusing on location, we lose sight of its motion. Likewise, we can understand the condition of a society either in terms of its location relative to an absolute frame of reference — i.e., to a set of immutable principles, or our concept of the sacred — or simply in terms of its momentum.
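
(For readers who would like the physics half of the analogy spelled out: the complementarity I have in mind is the one captured, roughly, by the Heisenberg uncertainty relation,

    \sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}

where \sigma_x and \sigma_p are the spreads in position and momentum, and \hbar is the reduced Planck constant. The more sharply the one is specified, the more diffuse the other must become. I offer the civilizational analogy only loosely, as a gloss, not as a derivation.)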

Uh-Oh!

From Australia’s News.com.au:

New forms of discrimination, known as “neoracism”, are taking hold in scientific research, spreading the belief that races exist and are different in terms of biology, behaviour and culture, according to anthropologists who spoke at the annual American Association for the Advancement of Science conference in Chicago.

This would be bad enough all by itself — but what makes the spread of this belief particularly worrisome is the fact that races exist, and are different in terms of biology, behaviour and culture.

Read the rest here.

Links

Yep, they’ve been piling up again. (Just like the snow is supposed to do, again, here in the Outer Cape tonight.)

Life, and love, in Russia.

The GDP of American states and foreign nations.

Theodore Dalrymple on the suppression of dissent.

See above.

More from Russia.

The most beautiful rendition of the National Anthem you will ever hear.

– Magnus Carlsen crushes Bill Gates.

The arrogance and ignorance of hoplophobic liberals.

Amazing paper sculptures.

A spectacular new lode of Burgess Shale fauna.

The “science is settled”, so the observations must be wrong.

The bloom is off the rose.

Another nail in the blank-slate coffin.

From sky to sty.

Crazy ants.

Jim Donald on natural law. Excellent.

Benghazi update.

The Dark Enlightenment Exposed

In which blogger Mark Shea is shamelessly pwned by a spectacularly imaginative troll. Here.

Recoil

Some heartening news on the Second Amendment front:

First, the good people of Connecticut are defying that state’s new registration laws with some old-fashioned civil disobedience. This is what you get when you pass bad laws: people will not respect them, and will not obey them.

Second, the Ninth Circuit (!) has tossed out, as unjustifiably restrictive, the “good cause” provision in California’s handgun-carry law. This is a major victory for the citizens of the Golden State, and will no doubt set an important precedent. New York, for example, has similar restrictions; I hope this ruling will provide ammunition for related litigation there and elsewhere.

Third, this “bears” watching: the Supreme Court will be announcing shortly whether it will take on yet another basic Second Amendment issue.

Caudillo

Got two items in the mail today from Barack Obama’s Ministry of Truth, shortly after learning that He had decided, on a whim, to suspend an explicitly articulated (and politically damaging) proviso of the healthcare law until after the upcoming elections.

Make sure you read the letter from ‘Cathi’, which I’ve pasted in below the ‘Selfie With The President’ email.

Does any of this bother you? The personality cult? The pathological, unceasing narcissism? The utter contempt for the law, and for us?

Well, get over it. As He reminded Himself earlier today, on camera: “That’s the good thing about being president: I can do whatever I want.”

“This could be you”? A tattooed Eloi female, in thrall to a malevolent megalomaniac? I think not.

Anyway, next there’s this:

*      *      *


Malcolm –

If you think you’re never going to win a chance to go backstage to meet President Obama, I need to tell you something:

That’s what I thought, too.

When I got the phone call to tell me my name had been selected, I was floored. It wasn’t long before my husband and I were getting on a free flight to D.C. to meet the President face to face.

I’m here to tell you — if you’re still on the fence, you should go for it right now.

Anyone who adds their name to help with the final push on health care enrollment is automatically eligible to win. Chances are, you were probably planning to help out anyway. This is a pretty amazing bonus.

When you get home, all your friends will want to know: What is it like to meet President Obama?

Here’s the truth — it felt like meeting an old friend.

The President is so warm and genuine — he wanted to know all about me and my husband. Talking with him completely reaffirmed my commitment to keep fighting for what is important. And the photo of the three of us is one of my favorite things to show off.

If I could go again, I’d do it in a heartbeat — heck, I’m going to throw my name in too, just in case.

Add your name to help spread the word on health care enrollment, and you’ll be automatically entered to meet President Obama himself:

http://my.barackobama.com/This-is-Your-Chance

Thanks — I’ll be pulling for you!

Cathi

The Employer Mandate


Plug

If any of you happen to be baseball fans, my son Nick has just launched a new website: Pitcher GIFS. Go have a look.

Cheer Up!

Here’s your antidote for those Monday blues: two stories to warm your heart.

First, you’ll be glad to see that New York State residents can own an AR-15 after all; the story also shows just how foolish these hysterical “assault-rifle” bans — which prohibit, on the basis of superficial appearance, weapons that are almost never used in gun violence — really are.

And then there’s this.

How Can This Be?

The CBS program 60 Minutes reported tonight, to everyone’s astonishment and dismay, on a recent, and heretofore completely unsuspected, scientific discovery.

The context was specific — differences in the effect of the sleeping pill Ambien on men and women — but it appears, shockingly, that the scope of the problem might be far more general, with truly horrifying implications.

I won’t sugar-coat it for you, readers; we are all grown-ups here, and I’m just going to give it to you straight: these profoundly disturbing new findings suggest that men and women might actually be quite different.

In a clip from this evening’s show, reporter Lesley Stahl interviews UC Irvine neuroscientist Larry Cahill. Ms. Stahl explains that Dr. Cahill “used to share his field’s assumption that males and females, outside the reproductive system, were fundamentally the same.”

One can almost feel the ground shifting beneath their feet as the interview continues. Asked by Ms. Stahl if this problem of sex-based differences might be bigger than just Ambien, Dr. Cahill confirmed her darkest fears:

“Once you see this difference — and that difference — and that difference — and that difference — and that difference — and that difference — and you see this thing’s everywhere, you go ‘wait a minute!’ So the assumption we’re making, that it really doesn’t matter — sex — is not a valid assumption!”

Later, in a panel discussion, Ms. Stahl was asked: “what surprised you most of all the things that you found out?”

She replied: “I guess the big thing is how pervasive the differences between men and women are.”

Yes, it’s surprising, all right — unless, of course, you have any acquaintance whatsoever with literature, history, folklore, human nature, human societies, actual men and women, the real world, the theory of evolution, or simple common sense.

As bad as this is, folks, it gets even worse. Our pal Mangan has more.

Food For Thought

A while back, the comment-thread of a post about the government shutdown turned into a discussion about the obesity of the American poor. A commenter remarked:

The reason why many poor people are obese, of course, is that the cheapest foods tend to be high in carbs and low in nutrients, which often leads to obesity, diabetes, and cardiovascular problems.

I replied:

Interesting. The cheapest food everywhere, throughout history, has always been starchy and nutrient-poor.

If you go to, say, poor Asian neighborhoods, where people of no greater income live (especially considering that government food subsidies are readily available), you don’t see much obesity. You see women in the markets, selecting inexpensive but nourishing ingredients to take home to cook.

Perhaps the difference is mostly that: taking eating seriously enough to make the effort to seek out nourishing food, and to cook it at home, rather than treating it as just another passing appetite to satisfy with the least effort possible, with no reflection upon the long-term consequences of one’s dietary choices.

The exchange continued:

This probably has a lot to do with what is locally available. If you live in Flushing, you can get great food which is fresh and cheap. If you live in Harlem, your choices are mostly overpriced supermarkets, overpriced Korean markets, and fast food joints.

I pointed out:

People only sell what others will buy.

This subject of “food deserts” — inner-city areas where fresh and nutritious food is hard to find — has been in the air again lately. In yesterday’s Best of the Web opinion digest, the Wall Street Journal’s James Taranto had this to say:

Poverty used to mean going hungry. These days — at least in the developed West, and especially in America — it means getting hungry, consuming loads of inexpensive carbohydrates, and becoming fat and unhealthy. It’s progress of a sort, but those concerned with social uplift aren’t wrong to see a problem here. But their assumptions about its cause and solution have been tested and found wanting.

As National Journal’s Clara Ritger describes it:

With the obesity epidemic in full swing and millions of American [sic] living in neighborhoods where fruits and vegetables are hard to come by, the Obama administration thought it saw a solution: fund stores that will stock fresh, affordable produce in these deprived areas.

But now, three years and $500 million into the federal Healthy Food Financing Initiative, there’s a problem: A study suggests it’s not working.

The idea behind what we wish the administration had the wit to call the Affordable Pear Act is that “food deserts” exist because of a market failure–because produce is in short supply. The experiment suggests the problem is more one of demand.

Ritger reports on a study in the February issue of Health Affairs. Researchers from Penn State and the London School of Hygiene and Tropical Medicine “studied two comparable neighborhoods in Philadelphia”:

When a grocery store was opened in one Philadelphia food desert, 26.7 percent of residents made it their main grocery store and 51.4 percent indicated using it for any food shopping, the report found. But among the population that used the new supermarket, the researchers saw no significant improvement in BMI, fruit and vegetable intake, or perceptions of food accessibility, although there was a significant improvement in perception of accessibility to fruits and vegetables. . . .
The researchers compared the Philadelphia neighborhood that would soon receive a new supermarket to a similar community three miles away, hoping to avoid any crossover effect from the opening of the new store. They polled the two communities before and after the store opened to see the effect of the change.

The results “mirror findings in the U.K., where researchers created a similar comparison of two neighborhoods in Scotland and observed no net effect on fruit and vegetable intake,” Ritger adds.

All of which suggests that the Affordable Pear Act rests on a backward assumption about cause and effect. It’s not that most “food desert” denizens eat unhealthy food because grocers refuse to supply them with fruits and vegetables. Instead, grocers don’t supply them with fruits and vegetables because the demand is insufficient.

Theodore Dalrymple wrote a piercing essay, back in 2002, about the malnourishment of the underclass. He wrote:

It has become a truth universally acknowledged that food deserts actually exist and must be the fault of the supermarket chains (and, by extension, the System). Indeed, the government, ever on the lookout for new areas of life to control with its dictatorial benevolence, has proposed a new law to eradicate what is now known as “food poverty” by irrigating these deserts with subsidies to food suppliers. As yet the additional provisions of the bill are not at all firm, except for the establishment of a Food Poverty Authority in every district, manned by bureaucrats, who will measure food poverty and count the miles people have to go to get fresh vegetables. One man’s poverty is another man’s employment opportunity: as long ago as the sixteenth century, a German bishop remarked that the poor are a gold mine.

Nothing has changed, at least not for the better. Read Dalrymple’s article here. (Though depressing, it’s well worth your time — and it contains the cleverest paraphrase of Hobbes ever written.)

Selected Shorts

Some good reads from the Web today:

Matt Ridley on inequality.

Kevin Williamson on feminism.

From the Statistics Lab at Cambridge University, a look at some climate-alarmist buncombe.

Little By Little

The Second Amendment notwithstanding, it appears that some pretty serious infringement is under consideration in Massachusetts, right-to-bear-arms-wise. Boston.com reports (my emphasis):

More than a year after the school shootings in Newtown, Conn., a panel of academic experts today released a long-awaited report recommending that Massachusetts tighten its gun laws, which are already considered among the toughest in the country.

The panel made 44 recommendations, including that Massachusetts join a national mental health database for screening potential gun owners, that it beef up firearms training requirements, and that it eliminate Class B gun licenses, which are seldom used.

It recommended that the Massachusetts Chiefs of Police Association help define a series of factors that could be used to prohibit “unsuitable persons” from acquiring firearms. The panel said the current process allows local law enforcement officials too much discretion to determine whether a person is suitable to be granted a license to carry.

It also said Massachusetts should require anyone wanting to purchase a hunting rifle or a shotgun to pass those standards of suitability. That could allow local police chiefs to deny gun purchases to people who have been arrested, but not convicted, of a crime.

Charles C. W. Cooke tweeted:

In other words, MA wants to deny a constitutional right to anyone who the state has accused of doing something wrong.

In other words:

1) You annoy the State.

2) The State arrests you, on charges that it knows it cannot prove.

3) You can now be legally and permanently disarmed.

Even in these dark days, it startles me that a “panel of academic experts” (i.e. statist Eloi hoplophobes) could issue, without fear of inciting widespread rebellion, such a breathtakingly audacious insult to the liberty of the people of Massachusetts, and to the Constitution that supposedly guarantees their most basic rights. (I should know better by now; things are moving very fast these days.)

Keep your powder dry, people.

Daniel Dennett on Sam Harris on Free Will

I used to spill a lot of ink here about the question of free will. In the most recent of a series of thirteen related posts on the topic, I mentioned a disagreement on this topic between two writers whose names are often linked: the philosopher Daniel Dennett and the neuroscientist Sam Harris. Both are, probably, best known to the public as members, along with Richard Dawkins and the late Christopher Hitchens, of the prominent quartet of atheist authors often lumped together as the “Four Horsemen”.

Before all this atheism, though, Daniel Dennett had already made a name for himself with his controversial work on the philosophy of mind, and with several books and papers on free will. When it comes to free will, Dennett is what’s known as a “compatibilist”: he believes that our traditional concept of volitional freedom, when examined closely, is incoherent, but that even an ordinary materialistic understanding of our nature — in which our minds, and our choices, are entirely supervenient upon the workings of our biological brains — can give us, nevertheless, all the free will that’s “worth wanting”. I have found his two books, Elbow Room and Freedom Evolves, to be persuasive, and I would say that I am a “compatibilist” myself. (If you are interested in this ancient question, you should read them both.)

Sam Harris, on the other hand, is what you might call a “hard determinist”. He agrees with Professor Dennett both that our ordinary idea of free will is incoherent, and that our minds and choices are simply the output of our material brains — but unlike Dennett, he argues that free will is simply an illusion, and that this has important moral consequences. He has written a book of his own about this.

Daniel Dennett has now written a serious critique of Sam Harris’s book, which Dr. Harris has posted at his own website. You can read it here. If this topic interests you at all, it’s well worth your time.

Paving The Road To Hell

Over the years readers have mentioned to me that too much of the discussion here takes place in the comment-threads, which are often far longer than the posts themselves. The days go by, the posts roll away down the screen, and exchanges that happen days after the original post are, effectively, hidden. I’ve been trying to remind myself to make substantial responses to comments in new posts, but I often forget.

In a recent item I mentioned that most conservatives regard liberals as decent, well-intentioned people. Responding to our consistently leftmost commenter, I wrote:

I think that both conservatives and well-intentioned liberals such as yourself want the same thing, which is to create and sustain a prosperous and well-functioning American society that maximizes opportunity and happiness, in harmony with our nature. What we disagree about is how best to achieve it (and I think this is due in large part to disagreements about the realities of human nature).

This prompted another commenter to inquire:

What I don’t understand is: why do conservatives insist on seeing liberals as good people with bad ideas?

I replied:

I feel that way only because I know so many of them. Looking at history with aloof, post-Enlightenment skepticism, modern liberals have come to believe that the source of humanity’s endless conflict and suffering is the self-confidence of traditional societies — above all, the discriminations that such cultures, in order to survive and flourish, necessarily make between friend and foe, higher and lower, self and other, good and evil, beauty and ugliness, wisdom and folly, sacred and profane, and right and wrong.

The key to harmony and happiness, then, is to reject and abandon all such confident discrimination in favor of radical doubt, which leads in turn to radical relativism. The equally radical — and poisonous — consequence of this is that if nothing and nobody can rightly be judged to be better or worse, or right or wrong, then the world’s obvious inequalities must mean that somehow, somebody cheated.

For example, look at the phenomenal success of Western, Judeo-Christian civilization, which effectively conquered and transformed the entire world, while creating sublime works of art and immeasurable wealth, and lifting billions out of poverty.

There are two possibilities. One is that for such a stupendous conquest to have happened, there must simply have been something inherently superior about such a civilization — and, because cultures do not fall from the sky, something special also about the people who created it.

But if you must rule that out, because it is offensive even to imagine such a thing, then what’s left? Only that somehow it was a dirty trick, a great injustice — and that in the name of justice the villains must be brought down, their perfidy exposed, their altars shattered, their ill-gotten wealth confiscated, and their ambitions confounded.

This necessary insight — that the essence and primum movens of modern liberalism is the pursuit of a radical equality that ultimately eliminates all basis for conflict, and that this pursuit makes necessary a corrosive and ultimately suicidal relativism — is obviously not my own. I had my first serious encounter with it in Allan Bloom’s 1987 book The Closing of the American Mind. Reading that book was what started me on my own road to Damascus.

Bloom wrote this about the students entering his university (my emphasis):

They are unified only in their relativism and in their allegiance to equality. And the two are related in a moral intention. The relativity of truth is not a theoretical insight but a moral postulate, the condition of a free society, or so they see it. They have all been equipped with this framework early on, and it is the modern replacement for the inalienable natural rights that used to be the traditional American grounds for a free society…

The danger they have been taught to fear from absolutism is not error but intolerance. Relativism is necessary to openness; and this is the virtue, the only virtue, which all primary education for more than fifty years has dedicated itself to inculcating. Openness – and the relativism that makes it the only plausible stance in the face of various claims to truth and various ways of life and kinds of human beings – is the great insight of our times. The true believer is the real danger. The study of history and of culture teaches that all the world was mad in the past; men always thought they were right, and that led to wars, persecutions, slavery, xenophobia, racism, and chauvinism. The point is not to correct the mistakes and really be right; rather it is not to think you are right at all.

President Obama recently called the quest for equality “the defining challenge of our time”, and said “it drives everything I do in this office”. It is the Holy Grail.

In my response to our commenter, I ended with a polemical flourish:

And so liberals, with the best of intentions, have made themselves the enemies of everything good, everything true, everything superior, and everything sacred: in short, everything that once made our civilization great.

This is, perhaps, a trifle excessive, though not by much. I certainly do not mean to say that in modern liberalism’s relentless jihad against all forms of inequality, nothing good has been accomplished. Although the capacity to make essential discriminations is necessary for the survival of any living organism — from a paramecium to a civilization — not all discrimination is good or just, and the liberal juggernaut of the late 20th century achieved some genuine moral victories, particularly in its struggle to ensure that all Americans, regardless of race, stood as equals before the law. The problem is that the radically egalitarian anti-discrimination at the heart of modern liberalism knows no limiting principle: it is a “universal acid” that gnaws relentlessly at any vessel that tries to contain it. Having leached away the pernicious discriminations in our society’s laws, it began next to attack the civil, social, and educational framework of society itself — and now dissolves the very foundations upon which our nation, and our civilization, were erected. In particular, it is antagonistic to both truth and liberty: to truth, because it must deny the permanent, natural inequalities of the world, and to liberty, because, as Will Durant reminds us:

[F]reedom and equality are sworn and everlasting enemies, and when one prevails the other dies. Leave men free, and their natural inequalities will multiply almost geometrically, as in England and America in the nineteenth century under laissez-faire. To check the growth of inequality, liberty must be sacrificed, as in Russia after 1917. Even when repressed, inequality grows; only the man who is below the average in economic ability desires equality; those who are conscious of superior ability desire freedom; and in the end superior ability has its way.

Links

I’ve been working long hours this week, and haven’t had the time — nor sufficient sleep — to write anything worth reading. Meanwhile, the links have been piling up, as they tend to do. So it’s time once again to flush, as we programmers say, the cache:

– Another reason that I hope humans never lose their fondness for creating beautiful, useless things.

Homage to Vermeer.

– From Ben Franklin, a handy compendium.

Goats.

Going, going

Paging Mr. Fort.

– We’ve been neglecting this topic. More later.

U.S. cities over time, in a lovely chart.

Income mobility.

Behind the mask.

Nice UI. (It was recommended for use with the GSS, as I recall.)

Safe house. See also here.

The Hole at the Pole.

Steve Sailer’s Race FAQ.

– A very strange ping-pong match.

Hitchens and Buckley, long, long ago.

Send In The Crowns

Here’s a pleasingly dyspeptic assessment of the State of the Union pageant, from Kevin D. Williamson.

Local Color

It’s the dead of winter here in the Outer Cape, and it’s been unusually cold and snowy. That doesn’t stop me from getting outside, though. Here are a few cell-phone shots from some recent walks around Wellfleet:

First, our little wooded lane, after Tuesday’s snowstorm:

Next, a few shots of Duck Harbor from about ten days ago. (The beach faces due west, across Cape Cod Bay.) These were taken at the end of a rainy day:

Just as the sun was about to set, the clouds broke. Here’s some of that famous Cape light:

It’s been so cold that the harbor was choked with ice earlier today:

From Powers Landing, a couple of days ago: snow, sand, ice, sea, and sky.

And from just before sunset today, a view of The Gut, from the Herring River dike:

I feel sorry, sometimes, for the throngs of people who come here every summer, but never get to see how beautiful it is in the winter. Can’t really say I miss ‘em, though.

Cast Out The Beam

Here’s a mighty funny item from The Daily Show.

Casting Out The Devil

Here’s something that seems to be in the air today.

Yesterday I added a comment to our Benghazi thread from a few days back. As usually happens as threads lengthen, the conversation had wandered off-topic toward the more general sort of ideological scuffling that is a constant attractor in any discussion of current events these days.

I was responding to a typical example of what self-styled “progressives” think conservatives are all about, to wit:

The Republican Party does “really stand for something.” It stands for a lot of things: tax cuts for high income households, subsidies for farmers, science denial, the criminalization of abortion, hostility to immigrants, indifference towards the environment, bans on gay marriage, favored treatment towards their favored industries, and so forth. You could look at the concatenation of Ayn Rand / Rand Paul / Paul Ryan and see a miserly and contemptuous worldview which would make Ebenezar [sic] Scrooge blush.

After I had responded to some of the other points the commenter had made, I ended with:

I will say this though, in a spirit of generosity: leaving aside professional political operatives of both parties, whose only aim is to seize and retain power, I think that both conservatives and well-intentioned liberals such as yourself want the same thing, which is to create and sustain a prosperous and well-functioning American society that maximizes opportunity and happiness, in harmony with our nature. What we disagree about is how best to achieve it (and I think this is due in large part to disagreements about the realities of human nature). It’s wrong of you to impugn our motives so, and I wish you’d stop doing it.

I had also added, but then deleted shortly after posting, an extra passage in which I pointed out an asymmetry between the ways conservatives and liberals view each other — namely that while conservatives generally think that liberals are misguided, and live in deep denial of obvious truths about human nature and the way the world actually works (as opposed to the way they think it ought to work), liberals view conservatives not just as misguided, but as morally evil. From there, it is easy to demonize and dehumanize them.

It may well be that our differences are truly intractable (indeed, I think they are), and that what we need is some sort of divorce, or disaggregation — but at the very least it would be nice to accomplish this without bloodshed, and if history teaches us anything at all, it teaches us that framing your political opponents not only as different, but as evil, has a worrisome track record.

Shortly after I posted this abridged comment, our reader Henry sent along a link to a column by Thomas Sowell, published just today on this very subject.

We read:

From the 18th century to today, many leading thinkers on the left have regarded those who disagree with them as being not merely factually wrong but morally repugnant. And again, this pattern is far less often found among those on the opposite side of the ideological spectrum.

The visceral hostility toward Sarah Palin by present day liberals, and the gutter level to which some descend in expressing it, is just one sign of a mindset on the left that goes back more than two centuries.

T.R. Malthus was the target of such hostility in the 18th and early 19th centuries. When replying to his critics, Malthus said, “I cannot doubt the talents of such men as Godwin and Condorcet. I am unwilling to doubt their candor.”

But William Godwin’s vision of Malthus was very different. He called Malthus “malignant,” questioned “the humanity of the man,” and said “I profess myself at a loss to conceive of what earth the man was made.”

This asymmetry in responses to people with different opinions has been too persistent, for too many years, to be just a matter of individual personality differences.

Although Charles Murray has been a major critic of the welfare state and of the assumptions behind it, he recalled that before writing his landmark book, “Losing Ground,” he had been “working for years with people who ran social programs at street level, and knew the overwhelming majority of them to be good people trying hard to help.”

Can you think of anyone on the left who has described Charles Murray as “a good person trying hard to help”? He has been repeatedly denounced as virtually the devil incarnate — far more often than anyone has tried seriously to refute his facts.

Such treatment is not reserved solely for Murray. Liberal writer Andrew Hacker spoke more sweepingly when he said, “conservatives don’t really care whether black Americans are happy or unhappy.”

Even in the midst of an election campaign against the British Labour Party, when Winston Churchill said that there would be dire consequences if his opponents won, he said that this was because “they do not see where their theories are leading them.”

But, in an earlier campaign, Churchill’s opponent said that he looked upon Churchill “as such a personal force for evil that I would take up the fight against him with a whole heart.”

In today’s Best of the Web James Taranto also picked up the same theme. Mr. Taranto commented on New York governor Andrew Cuomo’s harsh remarks about traditional conservatives (he said on TV the other day that they had “no place in the state of New York”) — an opinion that was quickly and enthusiastically seconded by Gotham’s radically left-wing Mayor, Bill de Blasio. Mr. Taranto wrote (my emphasis):

[T]here’s something a bit puzzling about the sheer viciousness of the governor’s and the mayor’s rhetoric. Liberals, after all, pride themselves on their toleration and open-mindedness, but often they sound like Michael Caine’s character in “Austin Powers in Goldmember” who said: “There’s only two things I hate in this world. People who are intolerant of other people’s cultures and the Dutch.” Cuomo and de Blasio, unlike Caine, don’t understand they’re the butt of the joke.

One explanation for this phenomenon comes from social psychologist Jonathan Haidt, author of “The Righteous Mind: Why Good People Are Divided by Politics and Religion.” Todd Zywicki, coincidentally on the same day Cuomo made his remark, summed up the relevant finding in a Volokh Conspiracy post:

Haidt reports on the following experiment: after determining whether someone is liberal or conservative, he then has each person answer the standard battery of questions as if he were the opposite ideology. So, he would ask a liberal to answer the questions as if he were a “typical conservative” and vice-versa. What he finds is quite striking: “The results were clear and consistent. Moderates and conservatives were most accurate in their predictions, whether they were pretending to be liberals or conservatives. Liberals were the least accurate, especially those who describe themselves as ‘very liberal.’ The biggest errors in the whole study came when liberals answered the Care and Fairness questions while pretending to be conservatives.” In other words, moderates and conservatives can understand the liberal worldview and liberals are unable to relate to the conservative worldview, especially when it comes to questions of care and fairness.

In short, Haidt’s research suggests that many liberals really do believe that conservatives are heartless bastards–or as a friend of mine once remarked, “Conservatives think that liberals are good people with bad ideas, whereas liberals think conservatives are bad people”–and very liberal people think that especially strongly. Haidt suggests that there is some truth to this.

Haidt has a theory that moral reasoning is driven by, as Zywicki writes, “five key vectors or values of psychological morality: (1) care/harm, (2) fairness, (3) loyalty, (4) authority, and (5) sanctity.” Haidt posits that “conservative values are more overlapping than liberals–conservatives have a ‘thicker’ moral worldview that includes all five values, whereas liberals have a ‘thinner’ view that rests on only two variables,” in Zywicki’s summary.

I don’t know much about Haidt’s research, but I do know, from extensive personal experience, the vicious antipathy — it hardly seems excessive to call it hatred — that “progressives” so often feel toward conservatives. Again and again, when I have been with them in social settings, and the conversation turns to politics, I have heard truly intemperate expressions of moral condemnation and venomous loathing expressed about conservatives, with the general agreement of those assembled — and when I have chosen not to sit in silence, but to engage their ideas from a conservative viewpoint, I have seen, again and again, their faces darken with resentment as their eyes narrow with a look of barely contained fury. It is altogether familiar and predictable. It is clear that to them I am not just someone with a different opinion about what makes for happy, harmonious societies: I am a heretic, a malefactor, a threat, a devil. It is distinctly unpleasant to be on the receiving end of such malignance, and for some folks it must even be frightening. I can easily see why some people are afraid to speak up.

Thomas Sowell concludes:

Examples of this asymmetry between those on opposite sides of the ideological divide could be multiplied almost without limit. It is not solely a matter of individual personality differences.

The vision of the left is not just a vision of the world. For many, it is also a vision of themselves — a very flattering vision of people trying to save the planet, rescue the exploited, create “social justice” and otherwise be on the side of the angels. This is an exalting vision that few are ready to give up, or to risk on a roll of the dice, which is what submitting it to the test of factual evidence amounts to. Maybe that is why there are so many fact-free arguments on the left, whether on gun control, minimum wages, or innumerable other issues — and why they react so viscerally to those who challenge their vision.

I’m sure that’s true. But the intensity of this reaction seems to go beyond mere defensiveness about one’s self-image. It seems downright… religious.

In Harm’s Way

A common response from those who wish to inoculate the Obama administration, and in particular Hillary Clinton, from charges of negligence and malfeasance in the Benghazi murders, is to suggest that Ambassador Chris Stevens was in large part responsible for the absence of security at the diplomatic compound. In our own comment thread, for example, we saw:

What looks obvious in hindsight appears much less so in real time: we know that Ambassador Stevens twice declined offers of military assistance shortly before the attacks. If the Ambassador on the ground considers the consulate to be safe, one could forgive people in Washington for minimizing the risk.

Right, there’s Hillary off the hook. Anyway, what difference, at this point, does it make?

Well, not so fast, says Gregory Hicks. He should know; he was there.

It’s good to see someone sticking up for Mr. Stevens, even now. Too bad nobody did when it mattered.

Let’s Roll

The venerable liberal journalist Nat Hentoff joins the chorus calling for Barack Obama’s impeachment.

Worth a try, say I. The only plausible objection I can think of is Joe Biden, but at this point that’s a trade I’d be glad to make.

Links

Boy, do these things pile up quickly.

Ice balls.

I can almost hear him say “I told you so.”

– Maybe we can get the Dems to primary this guy in 2016.

Well, whaddya know?

Satan’s Kimchi.

Common sense from Jim Cramer.

Even more personality tests.

Race trumps sex, gals. (There’s plenty of room over here on the right.)

– 102 years old and counting: inside the 1911.

– Coming soon: the 28th Amendment. Not.

Bundle up, folks, and throw some more carbon on the fire.

– More on this one later.

Windmills, schwindmills.

– The UK: as good as dead.

Best Rickroll ever.

The New York Times And Benghazi

A lot has been made of The New York Times’s recent article, by David Kirkpatrick, about the sacking of the U.S. diplomatic mission in Benghazi on September 11th of 2012, in which four men, including our ambassador, were killed. The Obama administration’s partisans have given the article a triumphal reception, and have announced repeatedly that it ‘debunks’ the Right’s criticism of the administration’s handling of the affair. I was happy to let this one pass by in silence — there has been ample commentary from all sides, and most people’s minds are already made up when it comes to this shameful affair — but having had some requests from readers and commenters, I thought I should put together a post. I’m sorry that it’s a little “after the fact”.

What, then, does the article say? I’ve read it several times now, and as far as I can see, it makes only two claims that might be understood as ‘debunking’ anything. These claims are:

1) That the attack was not an al-Qaeda operation, and
2) That the attack was in fact a reaction to a provocative video, as the Obama administration claimed in the days and weeks after the attack.

The first point was made clearly and explicitly by Mr. Kirkpatrick:

Months of investigation by The New York Times, centered on extensive interviews with Libyans in Benghazi who had direct knowledge of the attack there and its context, turned up no evidence that Al Qaeda or other international terrorist groups had any role in the assault.

This assertion was disputed immediately and effectively by a great many sources. The Washington Post, for example, gave us the following:

Militiamen under the command of Abu Sufian bin Qumu, the leader of Ansar al-Sharia in the Libyan city of Darnah, participated in the attack that killed U.S. Ambassador J. Christopher Stevens and three other Americans, U.S. officials said…

Qumu, 54, a Libyan from Darnah, is well known to U.S. intelligence officials. A former tank driver in the Libyan army, he served 10 years in prison in the country before fleeing to Egypt and then to Afghanistan.

According to U.S. military files disclosed by the anti-secrecy group WikiLeaks, Qumu trained in 1993 at one of Osama bin Laden’s terrorist camps in Afghanistan and later worked for a bin Laden company in Sudan, where the al-Qaeda leader lived for three years.

Qumu fought alongside the Taliban against the United States in Afghanistan; he then fled to Pakistan and was later arrested in Peshawar. He was turned over to the United States and held at Guantanamo Bay.

He has a “long-term association with Islamic extremist jihad and members of al-Qaida and other extremist groups,” according to the military files. “Detainee’s alias is found on a list of probable al-Qaida personnel receiving monthly stipends.”

Qumu also had links to Zayn al-Abidin Muhammed Hussein, known by his alias Abu Zubaida, a key al-Qaeda facilitator who is being held indefinitely at Guantanamo.

Even the New York Times itself had previously acknowledged Mr. Qumu’s al-Qaeda connections.

After all this came out, the Times tried to backpedal. In particular it denied that Qumu was involved. The Weekly Standard stops this cold. (Why were Ansar al-Sharia Derna’s men in town that day?)

The Times also reported, on October 29th of 2012, that one Muhammad Jamal was involved in the attack. Mr. Jamal, who trained with al-Qaeda, is a long-standing acolyte of Ayman al-Zawahiri, al-Qaeda’s global leader. Here are some of Mr. Jamal’s letters to Zawahiri. Both the State Department and the UN have certified Jamal’s link to al-Qaeda.

Regarding Mr. Jamal, the Weekly Standard reports:

Jamal was arrested in November 2012 by Egyptian authorities and identified as a leader of the so-called Nasr City cell, which has multiple ties to al Qaeda.

Jamal is not the only key suspect omitted by Kirkpatrick. Another suspect is Faraj al-Shibli, a Libyan who, according to U.S. intelligence officials contacted by THE WEEKLY STANDARD, served as Osama bin Laden’s bodyguard during the 1990s. According to these same officials, al-Shibli is suspected of bringing materials from the Benghazi compound to senior al Qaeda leadership in Pakistan. Al-Shibli was detained in Pakistan and then Libya. Al Shibli did not immediately admit his involvement in the Benghazi attacks and was subsequently released. But U.S. officials continue to believe he played a role.

Far from being ‘on the run’, as the Obama administration would have us believe, al-Qaeda is doing very well, and its web of influence appears to be expanding throughout the Mideast and the Maghreb. In August, Reuters reported on connections between the Benghazi attackers and the jihadis who made a deadly January 2013 assault on a gas plant in Algeria, in which 39 foreigners were killed. Reuters added:

At the center of the web is Al Qaeda in the Islamic Maghreb (AQIM), which has expanded far from its Algerian birthplace and now has links to other jihadi groups in Maghreb countries, including Tunisia and Libya. Their shared ideology combines with other, often financial, interests.

Just a week or so ago, CNN reported that al-Qaeda now controls more territory than ever in the Middle East.

Besides its claim that there was no al-Qaeda involvement in the attack at Benghazi, the Times’s article tried to shore up the Obama administration’s early insistence that the massive assault was simply a spontaneous demonstration about a blasphemous movie, Innocence of Muslims. To support this assertion, the article featured a photo taken on September 11th at the U.S. embassy in Cairo, which showed angry protestors (waving al-Qaeda flags, by the way), and bore the following caption:

Egyptian protesters tearing down the United States flag at the American Embassy in Cairo on Sept. 11, 2012, during a demonstration against “Innocence of Muslims,” a video offensive to Islam.

As multiple sources have reported, though, the Cairo demonstration wasn’t about a movie, but was about the continued imprisonment of the “blind sheik” Omar Rahman, who remains incarcerated for his role in the 1993 World Trade Center bombing. Aaron Klein reminds us that CNN covered the story that day:

On the day of the Sept. 11, 2012, protests in Cairo, CNN’s Nic Robertson interviewed the son of Rahman, who described the protest as being about freeing his father. No Muhammad film was mentioned.

In the wake of the Obama administration’s media blitz about the movie, Fox News commissioned Agincourt Solutions (now Babel Street), a global social-media analysis firm, to look for signs of a groundswell of resentment about the film. They found none:

As the State Department began Tuesday to circulate a highly anticipated report into what happened in the Sept. 11 Libya consulate attack, a separate analysis found that the first reference to the anti-Islam film that was initially blamed for sparking the attack was not detected on social media until a day later.

The independent review of more than 4,000 postings was conducted by a leading social media monitoring firm.

“From the data we have, it’s hard for us to reach the conclusion that the consulate attack was motivated by the movie. Nothing in the immediate picture – surrounding the attack in Libya — suggests that,” Jeff Chapman, chief executive with Agincourt Solutions told Fox News.

This attack employed arson, small arms, vehicle-mounted machine guns, rocket-propelled grenades, and mortars. (Much of this weaponry, by the way, was under the control of our former ally Muammar Qaddafi — until we threw him to the dogs. Our military intervention, made entirely without Congressional approval, allowed his arsenals to fall into the hands of jihadist militias.) There is also evidence that the attackers knew the layout of the compound in advance, including the location of the “safe room” in which Ambassador Stevens was asphyxiated. There is simply no compelling reason for any thinking person to doubt that it was a premeditated assault. The Times invests a lot of ink in parsing the distinction between the al-Qaeda core, its major affiliates, and the many sympathetic warlords and militias who carry out jihadist warfare throughout the bloody Muslim world, but at bottom, to quote the woman for whose sake the article was obviously written: “what difference does it make?” Our people were put in harm’s way, with grotesquely insufficient security, in a failed state that had become a viper’s nest of violent and heavily armed jihadists. How could something awful not have happened?

Leaving aside its claims about al-Qaeda and the video, there is much in the Times’s article that is damning indeed, and indicates serious incompetence, and astonishing unwisdom, at the State Department. Again and again Mr. Kirkpatrick exposes the naive optimism that led to this catastrophe. In the first section of the article, Mr. Kirkpatrick refers to the State Department’s “months of American misunderstandings and misperceptions about Libya and especially Benghazi”:

The United States waded deeply into post-Qaddafi Libya, hoping to build a beachhead against extremists, especially Al Qaeda. It believed it could draw a bright line between friends and enemies in Libya. But it ultimately lost its ambassador in an attack that involved both avowed opponents of the West and fighters belonging to militias that the Americans had taken for allies.

Bloody fools.

A fuller accounting of the attacks suggests lessons for the United States that go well beyond Libya. It shows the risks of expecting American aid in a time of desperation to buy durable loyalty, and the difficulty of discerning friends from allies of convenience in a culture shaped by decades of anti-Western sentiment. Both are challenges now hanging over the American involvement in Syria’s civil conflict.

There simply are not words to express the historical, cultural, and strategic naiveté on display here. How could anyone not a child or an idiot be surprised by any of what happened? Yet this administration, and this reporter, having at last been “shown the risks” by the actual slaughter of our people, stand in dawning awareness only now.

As depressing as all of this is, far, far worse is the prospect that the architect of this folly — the heartless, ambitious woman who set the stage for this calamity, and who has declined, as both she and her husband always have throughout their disgraceful career in public life, to accept any responsibility for the wreckage she has left in her wake — may in a few short years ascend to the Presidency.

Note: so polarizing is this topic, and so unlikely to produce fruitful discussion, that it is with some reluctance that I leave the comment-box open. But comment away, if you like. I doubt I’ll join in.

This Isn’t Rocket Science

According to the New York Times, the “prolonged” execution of one Dennis McGuire — who had been condemned for the brutal murder of a young pregnant woman — has raised, once again, questions about the humaneness of various methods of execution. In Mr. McGuire’s case, the technique was lethal injection:

As the lethal drugs flowed into his veins in the Ohio death chamber, Dennis B. McGuire at first “went unconscious” and his body was still, his daughter, Amber McGuire, said Friday.

But a few minutes later, she said, she was horrified to see her father struggling, his stomach heaving, a fist clenching.

“He started making all these horrible, horrible noises, and at that point, that’s when I covered my eyes and my ears,” said Ms. McGuire, who watched the execution on Thursday at the Southern Ohio Correctional Facility, near Lucasville. “He was suffering.”

Mr. McGuire’s execution, conducted with a new and untested combination of drugs, took about 25 minutes from the time the drugs were started to the time death was declared. The process, several witnesses said, was accompanied by movement and gasping, snorting and choking sounds.

It has not been established whether Mr. McGuire was conscious of pain or whether the drugs that were used were responsible for his prolonged death. But at a time when the drugs once routinely used in executions are in short supply and states are scrambling to find new formulas, the execution is stirring intense debate about the obligations of the state toward those it kills.

Let’s leave aside the not-uninteresting question of the “obligations of the state toward those it kills” (some might argue, after all, that flaying and exposure would be a more suitable response, and a more effective deterrent, for the kind of crime Mr. McGuire committed). We will assume, arguendo, that what we want is a means of execution that is sudden, 100% reliable, and can safely be assumed to cause no lingering discomfort.

Is this really so difficult to produce? How about, say, three simultaneous shotgun blasts to the back of the head, at point-blank range? That ought to satisfy our criteria. What about dropping a ten-ton flat steel slab from a height of about fifteen feet? Why not a Semtex neck-pillow? (I could go on, but you get the point: this is easy.)

What am I missing here? Why all this fussing? Given what we are trying to accomplish, this should be, if you will forgive me, a no-brainer.

Here’s the only answer I can think of: we’re squeamish. All the methods I’ve suggested are messy, and create a lot of splatter. I think that we prefer silly, complicated arrangements — gas chambers, electrocution, and lethal injection — because they make killing the condemned more painless for us.

Is All Inequality Created Equal?

For today’s reading, we have an essay on income inequality by tech entrepreneur Paul Graham.

Mr. Graham makes two key points:

First, he reminds us that in a free society, the natural diversity of human characteristics, talents, and dispositions will always result in inequalities of wealth:

When people care enough about something to do it well, those who do it best tend to be far better than everyone else. There’s a huge gap between Leonardo and second-rate contemporaries like Borgognone. You see the same gap between Raymond Chandler and the average writer of detective novels. A top-ranked professional chess player could play ten thousand games against an ordinary club player without losing once.

Like chess or painting or writing novels, making money is a very specialized skill. But for some reason we treat this skill differently. No one complains when a few people surpass all the rest at playing chess or writing novels, but when a few people make more money than the rest, we get editorials saying this is wrong.

Why? The pattern of variation seems no different than for any other skill. What causes people to react so strongly when the skill is making money?

I think there are three reasons we treat making money as different: the misleading model of wealth we learn as children; the disreputable way in which, till recently, most fortunes were accumulated; and the worry that great variations in income are somehow bad for society. As far as I can tell, the first is mistaken, the second outdated, and the third empirically false. Could it be that, in a modern democracy, variation in income is actually a sign of health?
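To put a rough number on that chess gap, here is a minimal sketch using the Elo rating system (which Mr. Graham does not invoke; I bring it in, with assumed ratings, purely for illustration):

```python
# Back-of-the-envelope Elo arithmetic for the "huge gap" claim above.
# The ratings (2800 for a top professional, 1500 for an ordinary club player)
# are assumptions for the sake of illustration, not figures from the essay.

def elo_expected_score(rating_a: float, rating_b: float) -> float:
    """Expected score (win = 1, draw = 0.5, loss = 0) of A against B."""
    return 1.0 / (1.0 + 10.0 ** ((rating_b - rating_a) / 400.0))

top_pro, club_player = 2800, 1500
per_game = elo_expected_score(club_player, top_pro)  # club player's expected score

print(f"Club player's expected score per game: {per_game:.5f}")
print(f"Expected points over 10,000 games:     {per_game * 10_000:.1f}")
# A 1,300-point gap puts the weaker player's expected score at roughly 0.0006
# per game, much of it likely draws; that is the order of magnitude behind the claim.
```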

Mr. Graham kindly (or prudently) frames the cause of these inequalities — that, say, between a Kasparov and the average patzer — only in terms of “caring enough about something to do it well”. Obviously, though, there is more to the story than that: “care” as they might, Danny DeVito would never have been a LeBron James, nor Rachel Jeantel a Feynman. Readers will recall that in their book The Lessons of History, Will and Ariel Durant expressed the same idea, in slightly darker terms:

Nature smiles at the union of freedom and equality in our utopias. For freedom and equality are sworn and everlasting enemies, and when one prevails the other dies. Leave men free, and their natural inequalities will multiply almost geometrically, as in England and America in the nineteenth century under laissez-faire. To check the growth of inequality, liberty must be sacrificed, as in Russia after 1917. Even when repressed, inequality grows; only the man who is below the average in economic ability desires equality; those who are conscious of superior ability desire freedom; and in the end superior ability has its way. Utopias of equality are biologically doomed, and the best that the amiable philosopher can hope for is an approximate equality of legal justice and educational opportunity. A society in which all potential abilities are allowed to develop and function will have a survival advantage in the competition of groups. This competition becomes more severe as the destruction of distance intensifies the confrontation of states.

Mr. Graham’s second point is that where once the only way to get rich was through ‘zero-sum’ appropriation of the wealth of others — by military conquest, taxation, or political cronyism — most of today’s wealth is created by offering desirable goods or services that also enrich the lives of others (see the original for footnotes):

[F]or most of human history the usual way to accumulate a fortune was to steal it: in pastoral societies by cattle raiding; in agricultural societies by appropriating others’ estates in times of war, and taxing them in times of peace.

In conflicts, those on the winning side would receive the estates confiscated from the losers. In England in the 1060s, when William the Conqueror distributed the estates of the defeated Anglo-Saxon nobles to his followers, the conflict was military. By the 1530s, when Henry VIII distributed the estates of the monasteries to his followers, it was mostly political. [9] But the principle was the same. Indeed, the same principle is at work now in Zimbabwe.

In more organized societies, like China, the ruler and his officials used taxation instead of confiscation. But here too we see the same principle: the way to get rich was not to create wealth, but to serve a ruler powerful enough to appropriate it.

This started to change in Europe with the rise of the middle class. Now we think of the middle class as people who are neither rich nor poor, but originally they were a distinct group. In a feudal society, there are just two classes: a warrior aristocracy, and the serfs who work their estates. The middle class were a new, third group who lived in towns and supported themselves by manufacturing and trade.

Starting in the tenth and eleventh centuries, petty nobles and former serfs banded together in towns that gradually became powerful enough to ignore the local feudal lords. [10] Like serfs, the middle class made a living largely by creating wealth. (In port cities like Genoa and Pisa, they also engaged in piracy.) But unlike serfs they had an incentive to create a lot of it. Any wealth a serf created belonged to his master. There was not much point in making more than you could hide. Whereas the independence of the townsmen allowed them to keep whatever wealth they created.

Once it became possible to get rich by creating wealth, society as a whole started to get richer very rapidly. Nearly everything we have was created by the middle class. Indeed, the other two classes have effectively disappeared in industrial societies, and their names been given to either end of the middle class. (In the original sense of the word, Bill Gates is middle class.)

But it was not till the Industrial Revolution that wealth creation definitively replaced corruption as the best way to get rich. In England, at least, corruption only became unfashionable (and in fact only started to be called “corruption”) when there started to be other, faster ways to get rich.

Seventeenth-century England was much like the third world today, in that government office was a recognized route to wealth. The great fortunes of that time still derived more from what we would now call corruption than from commerce. [11] By the nineteenth century that had changed. There continued to be bribes, as there still are everywhere, but politics had by then been left to men who were driven more by vanity than greed. Technology had made it possible to create wealth faster than you could steal it. The prototypical rich man of the nineteenth century was not a courtier but an industrialist.

With the rise of the middle class, wealth stopped being a zero-sum game. Jobs and Wozniak didn’t have to make us poor to make themselves rich. Quite the opposite: they created things that made our lives materially richer. They had to, or we wouldn’t have paid for them.

But since for most of the world’s history the main route to wealth was to steal it, we tend to be suspicious of rich people. Idealistic undergraduates find their unconsciously preserved child’s model of wealth confirmed by eminent writers of the past. It is a case of the mistaken meeting the outdated.

He concludes:

If I had a choice of living in a society where I was materially much better off than I am now, but was among the poorest, or in one where I was the richest, but much worse off than I am now, I’d take the first option. If I had children, it would arguably be immoral not to. It’s absolute poverty you want to avoid, not relative poverty. If, as the evidence so far implies, you have to have one or the other in your society, take relative poverty.

You need rich people in your society not so much because in spending their money they create jobs, but because of what they have to do to get rich. I’m not talking about the trickle-down effect here. I’m not saying that if you let Henry Ford get rich, he’ll hire you as a waiter at his next party. I’m saying that he’ll make you a tractor to replace your horse.

A final thought: Mr. Graham makes a perfectly rational choice in the passage just above, but he is mistaken to think that everyone would. Relative status is a powerful social, psychological, and historical force. To quote the Durants again, also from The Lessons of History:

The experience of the past leaves little doubt that every economic system must sooner or later rely upon some form of the profit motive to stir individuals and groups to productivity. Normally and generally men are judged by their ability to produce — except in war, when they are ranked according to their ability to destroy.

Since practical ability differs from person to person, the majority of such abilities, in nearly all societies, is concentrated in a minority of men. The concentration of wealth is a natural result of this concentration of ability, and regularly recurs in history … the concentration may reach a point where the strength of number in the many poor rivals the strength of ability in the few rich; then the unstable equilibrium generates a critical situation, which history has diversely met by legislation redistributing wealth or by revolution distributing poverty.

… We conclude that the concentration of wealth is natural and inevitable, and is periodically alleviated by violent or peaceable partial redistribution. In this view all economic history is the slow heartbeat of the social organism, a vast systole and diastole of concentrating wealth and compulsive recirculation.

For Mr. Graham, because the process that creates inequality today has, in absolute terms, a beneficial effect for everyone, it should no longer be seen as a social evil to be eradicated, and we should be able, finally, to escape the ancient cycle described by the Durants. But is this rational argument really enough to override our ancient, universal concern for relative status? As my mother used to say: I hae me doots.

Be Very Afraid

The psychotic, pestiferous torrent of cultural-Marxist sludge excerpted below is, apparently, what a college education in the humanities produces nowadays.

This is the moment when you make of your fist the same clench in your teeth, make of your tongue all the textbooks your school was not funded enough to provide you with, make of your fingerprints the first draft of a revolt, when the follicles of capitalism’s hips falls on your school like angel dust, an army of unbranded jungle mouths will shout and yell and the trinity will linger there too like an arena of thunderclaps, and I will be there too and I will wonder how a genocide of shredded trees somehow makes some people think they can somehow play god, and then I will further wonder what made white people so insecure that they will do everything to put us in the lion’s den, and call us everything opposite of Daniel, which is to say the real reason for underfunded city schools is because deep down those in power know of the prophet that hugs our skeleton, know of the Solomon in our gene pools, know of the burning bush our spirit is, and if that takes flight…

For an extra frisson of horror, reflect on the fact that it was read at City Hall yesterday, at the request of Gotham’s new chief magistrate, Nyarlathotep Mayor Bill De Blasio. Know, too, that the author, a student at St. John’s, is the “2014 Youth Poet Laureate of New York City”. And despair.

The Finger Pointing At The Moon

Here is a good example, from the Huffington Post, of a modern Eloi woman: a psychotherapist who responds to her young son’s naturally boyish play by wondering where she “went wrong”. (As one commenter remarks, it’s as if she sees her normal boy as a “defective girl”.) In particular, she is horrified that he might be interested in guns. I’d have thought that, as a psychologist, she might have engaged in sufficient introspection to examine her own hoplophobia, but I suppose that’s hardly realistic.

The safety and security that make possible this young mother’s pacifism and loathing of weapons are luxuries she enjoys only because, to paraphrase Orwell, “rough men stand ready to do violence on her behalf”. She acknowledges this in passing, but clearly it is not the sort of thing she would want her own boy to have anything to do with; presumably her defenders are of another species altogether, and are no mother’s sons. The “very thought” of guns evokes in her no recognition of the essential role they play as a means of safeguarding her family from the predatory evil that is always at large in the world. It evokes only emotion: “a wave of sadness and despondence.”

It is as if she thinks that weapons create the violence within us. Rather than confront and understand the stubborn and comfortless truth of our nature, she prefers instead to fetishize the artifacts it has created, and imagines that by rejecting these potent objects, she can transform human nature itself.

I’m glad, of course, that she and her child are able to feel so secure in their home. It would be nice, though, if she had some appreciation of what makes this so — and some measure of respect for the sort of people who will fight and die to preserve and protect her blithe existence, and for the tools they must use to do so.

What is gratifying, however, about this article is the near-unanimous criticism, in the comment thread, of the author’s ideological enthrallment. Perhaps the wind is changing.

Service Notice

I’m having a busy stretch here; the blog might be rather quieter than usual for a few days. Back to normal soon.

Implosion

Today we have an interesting piece by Nick Land on John Smart’s novel approach to the Fermi Paradox (see here for more about the Fermi Paradox, if you aren’t familiar with the term): that advanced civilizations, rather than expanding into space, relentlessly turn inward.

We read:

John M. Smart’s solution to the Fermi Paradox is integral to his broader ‘Speculations on Cosmic Culture’ and emerges naturally from compressive development. Advanced intelligences do not expand into space, colonizing vast galactic tracts or dispersing self-replicating robot probes in a program of exploration. Instead, they implode, in a process of ‘transcension’ — resourcing themselves primarily through the hyper-exponential efficiency gains of extreme miniaturization (through micro- and nano- to femto-scale engineering, of subatomic functional components). Such cultures or civilizations, nucleated upon self-augmenting technological intelligence, emigrate from the extensive universe in the direction of abysmal intensity, crushing themselves to near-black-hole densities at the edge of physical possibility. Through transcension, they withdraw from extensive communication (whilst, perhaps, leaving ‘radio fossils’ behind, before these blink-out into the silence of cosmic escape).

If Smart’s speculations capture the basic outlines of a density-attracted developmental system, then cities should be expected to follow a comparable path, characterized by an escape into inwardness, an interior voyage, involution, or implosion. Approaching singularity on an accelerating trajectory, each city becomes increasingly inwardly directed, as it falls prey to the irresistible attraction of its own hyperbolic intensification, whilst the outside world fades to irrelevant static. Things disappear into cities, on a path of departure from the world. Their destination cannot be described within the dimensions of the known – and, indeed, tediously over-familiar – universe. Only in the deep exploratory interior is innovation still occurring, but there it takes place at an infernal, time-melting rate.

Read the rest here.

Links

I haven’t anything substantial prepared for tonight, so just a brief salmagundi:

Fun with sound waves.

A lexical-distance graph of European languages. (Where’s Basque? So far out it’s off the chart, maybe.)

– “Exponential medicine”.

VDH vs. Pajama Boy.

– Amazing to see this, from the Times’s senior science writer. Is the edifice starting to crumble under the weight of too much reality?

The strange art of Zdzislaw Beksinski.

MSLSD.

A puffin with its beak full of eels. And more.

– What? You aren’t reading The Fortean Times?

– Now this I’ll pay taxes for.

– Yet another political-personality test. (I’m 85% you-know-what.)

AP computer-science exam race/sex data.

Some awkward Father’s Days on the way.

A thousand-year-old chess problem.

The DOJ’s idea of justice, or Auster’s First Law.

Heather Mac Donald discusses the high culture of the West, and where it’s headed.

Rawls And Abortion

In the comment-thread to our post about Duck Dynasty a few weeks back, the discussion turned to abortion rights. I wrote this:

Are the not-yet-born rights-bearing persons, deserving of moral consideration? One would think that in a morally consistent ethics this would be an attribute inhering in the unborn person — but apparently in many people’s opinion it depends merely upon the whim of the mother. From the perspective of the unborn, that’s a mighty precarious position to be in — and rather unfairly so, it seems to me.

It’s Schrödinger’s Kid.

Later on I added:

To leave that ontological determination up to the whim of the person doing the killing is a unique moral and philosophical ambiguity. If you were a developing fetus, would you want your fate to be in such a precarious position?

Given that we are all brought into the world this way, one could say this is the ultimate Rawlsian question.

In John Rawls’s theory of justice, the “original position” is the position, behind the “veil of ignorance”, of people not yet placed into the world, who have no idea into what circumstances they will be delivered. The idea is that the optimally just society would be one designed by those behind the veil, because without foreknowledge of the station they are about to occupy, they will be maximally impartial.

What better instantiates the “original position” than our situation in utero? One might find oneself in the belly of a pleasure-seeking, atheistic NYU sophomore (a precarious situation!), or that of a pious, married Catholic. Under our law, your very personhood — your existential status, your chance at moral consideration, and most important of all, your chance at ever getting past the original position — depends upon nothing intrinsic to yourself. Indeed, it depends upon no consistent principle whatsoever, but rather upon the caprice of the woman into whose womb you happen to have been deployed.

Before I go looking around online, I’ll ask the question here: how do Rawlsians address this issue?

Moscow On The Hudson

Here’s an outstanding piece by Heather Mac Donald on our new, Marxist mayor. I had begun to prepare some excerpts and commentary, but you should really just go read the whole thing, here.

Memento Mori

It’s a sad day here at waka waka, where we’ve just heard that an old friend, Dr. Clive Sell of Phoenix, Arizona, has died unexpectedly of a heart attack.

I got to know Clive many years ago, and hadn’t seen him in a long time, but he was a fine man: a charming Southerner, exceptionally intelligent, tall and handsome, a gifted athlete, and one of the cheeriest, friendliest fellows you might ever hope to meet.

Our sympathy goes out in particular to our high-school chum, longtime commenter and regular sparring partner Peter, AKA ‘The One Eyed Man’. Clive and Peter had been best friends since they were roommates together at Amherst College in the early 1970s, and I am sure that Clive’s death is for Peter a grievous loss.

It was good to know you, Clive, and you should have been with us many a long year more. May flights of angels sing thee to thy rest.

Foggy-Bottom Baksheesh For The Ikhwan?

Here’s a story that might get interesting. I’ll let you know.

Noblesse, Sans Oblige

In part 3 of his “Gentle introduction”, the reactionary and monarchist writer Mencius Moldbug examines a possible framework for the creation of a new ‘noble’ class:

Let’s say you were a person who didn’t care at all about the Constitution, and you wanted to take America back to the past and establish a new order of hereditary nobility. What could be more deliciously reactionary than that? Real, live nobles, walking around on the street. So let’s see what it would take to make it happen.

First, we need to define noble status. Our rule is simple: if either of your parents was a noble, you’re a noble. While this is unusually inclusive for a hereditary order, it is the 21st century, after all. We can step out a little. And nobility remains a biological quality – a noble baby adopted by common parents is noble, a common baby adopted by noble parents is common.

Fine. What are the official duties and privileges of our new nobility? Obviously, we can’t really call it a noble order unless it has duties and privileges.

Well, privileges, anyway. Who needs duties? What’s the point of being a noble, if you’re going to have all these duties? Screw it, it’s the 21st century. We’ve transcended duties. On to the privileges.

The basic quality of a noble is that he or she is presumed to be better than commoners. Of course, both nobles and commoners are people. And people do vary. Individual circumstances must always be considered. However, the official presumption is that, in any conflict between a noble and a commoner, the noble is right and the commoner is wrong. Therefore, by default, the noble should win. This infallible logic is the root of our system of noble privilege.

For example, if a noble attacks a commoner, we can presume that the latter has in some way provoked or offended the former. The noble may of course be guilty of an offense, but the law must be extremely careful about establishing this. If there is a pattern of noble attacks on commoners, there is almost certainly a problem with the commoners, whose behavior should be examined and who may need supplemental education.

If a commoner attacks a noble, however, it is an extremely serious matter. And a pattern of commoner attacks on nobles is unthinkable – it is tantamount to the total breakdown of civilization. In fact, one way to measure the progress that modern society has made is that, in the lifetime of those now living, it was not at all unusual for mobs of commoners to attack and kill nobles! Needless to say, this doesn’t happen anymore.

This intentional disparity in the treatment of unofficial violence creates the familiar effect of asymmetric territorial dominance. A noble can stroll anywhere he wants, at any time of day or night, anywhere in the country. Commoners are advised not to let the sun set on them in noble neighborhoods, and if they go there during the day they should have a good reason for doing so.

One of the main safeguards for our system of noble authority is a systematic effort to prevent the emergence of commoner organizations which might exercise military or political power. Commoners may of course have friends who are other commoners, but they may not network on this basis. Nobles may and of course do form exclusive social networks on the basis of nobility.

Most interactions between commoners and nobles, of course, do not involve violence or politics. Still, by living in the same society, commoners and nobles will inevitably come into conflict. Our goal is to settle these conflicts, by default, in favor of the noble.

For example, if a business must choose whether to hire one of two equally qualified applicants, and one is a noble while the other is a commoner, it should of course choose the noble. The same is true for educational admissions and any other contest of merit. Our presumption is that while nobles are intrinsically, inherently and immeasurably superior to commoners, any mundane process for evaluating individuals will fail to detect these ethereal qualities – for which the outcome must therefore be adjusted.

Speaking of the workplace, it is especially important not to let professional circles of commoner resistance develop. Therefore, we impose heavy fines on corporations whose internal or external policies or practices do not reflect a solid pro-noble position. For example, a corporation which permits its commoner employees to express insolence or disrespect toward its noble employees, regardless of their relationship in the corporate hierarchy, is clearly liable. Any such commoner must be fired at once if the matter is brought to the management’s attention.

No need to worry, though, about anything remotely resembling this ever happening in the modern West. Moldbug concludes, and I’m sure you’d agree, that such a system would be “profoundly unhinged and bizarre, and completely inappropriate in anything like a sane, civilized society.” He notes also his confidence “that this bizarre version of what we can call ignoble privilege would take no more than two generations to produce a culture of worthless, unredeemable scoundrels.”

We’re From The Government, And We’re Here To Help

Here’s a little late-night reading to make you love your local Leviathan just a little bit more: an Obamacare threefer, and then an NSA nightcap.

First, this was too much even for Sonia Sotomayor.

Second, some number-crunching from James Taranto.

Third, a little historical perspective from Jay Cost.

Last, this comforting item by way of our own JK.

Sweet dreams!

Know Your Limitations

The computer scientist David Gelernter has just posted an essay about the aggressiveness and overreach of contemporary scientism and transhumanism. In particular, he focuses on what he perceives to be an assault on the essence of our humanity — our subjectivity, which so far remains an impenetrable mystery.

We read:

Today science and the “philosophy of mind”—its thoughtful assistant, which is sometimes smarter than the boss—are threatening Western culture with the exact opposite of humanism. Call it roboticism. Man is the measure of all things, Protagoras said. Today we add, and computers are the measure of all men.

Many scientists are proud of having booted man off his throne at the center of the universe and reduced him to just one more creature—an especially annoying one—in the great intergalactic zoo. That is their right. But when scientists use this locker-room braggadocio to belittle the human viewpoint, to belittle human life and values and virtues and civilization and moral, spiritual, and religious discoveries, which is all we human beings possess or ever will, they have outrun their own empiricism. They are abusing their cultural standing. Science has become an international bully.

Nowhere is its bullying more outrageous than in its assault on the phenomenon known as subjectivity.

Your subjective, conscious experience is just as real as the tree outside your window or the photons striking your retina—even though you alone feel it. Many philosophers and scientists today tend to dismiss the subjective and focus wholly on an objective, third-person reality—a reality that would be just the same if men had no minds. They treat subjective reality as a footnote, or they ignore it, or they announce that, actually, it doesn’t even exist.

Dr. Gelernter seems to have the same view of all this that I do, namely that a) our subjectivity is ineliminably, ontologically real; b) the conscious mind is, somehow, the product of the substance and activity of our brains; and c) science and philosophy have as yet absolutely no idea, within existing paradigms, as to how, or in virtue of what quality or property, our brains can give rise to our consciousness.

In this context he talks about the harrying of philosopher Thomas Nagel for his recent book Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False:

The modern “mind fields” encompass artificial intelligence, cognitive psychology, and philosophy of mind. Researchers in these fields are profoundly split, and the chaos was on display in the ugliness occasioned by the publication of Thomas Nagel’s Mind & Cosmos in 2012. Nagel is an eminent philosopher and professor at NYU. In Mind & Cosmos, he shows with terse, meticulous thoroughness why mainstream thought on the workings of the mind is intellectually bankrupt. He explains why Darwinian evolution is insufficient to explain the emergence of consciousness—the capacity to feel or experience the world. He then offers his own ideas on consciousness, which are speculative, incomplete, tentative, and provocative—in the tradition of science and philosophy.

Nagel was immediately set on and (symbolically) beaten to death by all the leading punks, bullies, and hangers-on of the philosophical underworld. Attacking Darwin is the sin against the Holy Ghost that pious scientists are taught never to forgive. Even worse, Nagel is an atheist unwilling to express sufficient hatred of religion to satisfy other atheists. There is nothing religious about Nagel’s speculations; he believes that science has not come far enough to explain consciousness and that it must press on. He believes that Darwin is not sufficient.

The intelligentsia was so furious that it formed a lynch mob.

Dr. Gelernter next takes up transhumanism — a central theme of Ray Kurzweil’s Singularity University, where I have friends and connections (and where I spent a week-long session, in 2012, as the resident pessimist):

The voice most strongly associated with what I’ve termed roboticism is that of Ray Kurzweil, a leading technologist and inventor. The Kurzweil Cult teaches that, given the strong and ever-increasing pace of technological progress and change, a fateful crossover point is approaching. He calls this point the “singularity.” After the year 2045 (mark your calendars!), machine intelligence will dominate human intelligence to the extent that men will no longer understand machines any more than potato chips understand mathematical topology. Men will already have begun an orgy of machinification—implanting chips in their bodies and brains, and fine-tuning their own and their children’s genetic material. Kurzweil believes in “transhumanism,” the merging of men and machines. He believes human immortality is just around the corner. He works for Google.

Whether he knows it or not, Kurzweil believes in and longs for the death of mankind. Because if things work out as he predicts, there will still be life on Earth, but no human life. To predict that a man who lives forever and is built mainly of semiconductors is still a man is like predicting that a man with stainless steel skin, a small nuclear reactor for a stomach, and an IQ of 10,000 would still be a man. In fact we have no idea what he would be.

Each change in him might be defended as an improvement, but man as we know him is the top growth on a tall tree in a large forest: His kinship with his parents and ancestors and mankind at large, the experience of seeing his own reflection in human history and his fellow man—those things are the crucial part of who he is. If you make him grossly different, he is lost, with no reflection anywhere he looks. If you make lots of people grossly different, they are all lost together—cut adrift from their forebears, from human history and human experience. Of course we do know that whatever these creatures are, untransformed men will be unable to keep up with them. Their superhuman intelligence and strength will extinguish mankind as we know it, or reduce men to slaves or dogs. To wish for such a development is to play dice with the universe.

This was my own worry at SU. I did not doubt that the radical developments they forecast are coming — and I agree that many of them hold great promise — but I was deeply concerned by the blithe and all-encompassing optimism about them that everyone at SU seemed to share. In discussion after discussion there I was the skunk at the garden party, the turd in the punchbowl, the scowling reactionary raining on every parade. (The ambient optimism at SU is highly infectious and energizing, and I certainly absorbed some of that contagious enthusiasm myself — but now, almost two years later, what lingers more in my mind is the worry.)

On to subjectivity itself, and how by its intrinsically private, unshareable nature it irritates the modern, scientistic mind:

Many wish to banish subjectivity altogether. “The history of philosophy of mind over the past one hundred years,” the eminent philosopher John Searle has written, “has been in large part an attempt to get rid of the mental”—i.e., the subjective—“by showing that no mental phenomena exist over and above physical phenomena.”

Why bother? Because to present-day philosophers, Searle writes, “the subjectivist ontology of the mental seems intolerable.” That is, your states of mind (your desire for adventure, your fear of icebergs, the ship you imagine, the girl you recall) exist only subjectively, within your mind, and they can be examined and evaluated by you alone. They do not exist objectively. They are strictly internal to your own mind. And yet they do exist. This is intolerable! How in this modern, scientific world can we be forced to accept the existence of things that can’t be weighed or measured, tracked or photographed—that are strictly private, that can be observed by exactly one person each? Ridiculous! Or at least, damned annoying.

And yet your mind is, was, and will always be a room with a view. Your mental states exist inside this room you can never leave and no one else can ever enter. The world you perceive through the window of mind (where you can never go—where no one can ever go) is the objective world. Both worlds, inside and outside, are real.

Dr. Gelernter then addresses functionalism and computationalism, neither of which I have ever found compelling. He also raises the “zombie argument”:

By zombie, philosophers mean a creature who looks and behaves just like a human being, but happens to be unconscious. He does everything an ordinary person does: walks and talks, eats and sleeps, argues, shouts, drives his car, lies on the beach. But there’s no one home: He (meaning it) is actually a robot with a computer for a brain. On the outside he looks like any human being: This robot’s behavior and appearance are wonderfully sophisticated.

No evidence makes you doubt that your best friend is human, but suppose you did ask him: Are you human? Are you conscious? The robot could be programmed to answer no. But it’s designed to seem human, so more likely its software makes an answer such as, “Of course I’m human, of course I’m conscious!—talk about stupid questions. Are you conscious? Are you human, and not half-monkey? Jerk.”

So that’s a robot zombie. Now imagine a “human” zombie, an organic zombie, a freak of nature: It behaves just like you, just like the robot zombie; it’s made of flesh and blood, but it’s unconscious. Can you imagine such a creature? Its brain would in fact be just like a computer: a complex control system that makes this creature speak and act exactly like a man. But it feels nothing and is conscious of nothing.

Many philosophers (on both sides of the argument about software minds) can indeed imagine such a creature. Which leads them to the next question: What is consciousness for? What does it accomplish? Put a real human and the organic zombie side by side. Ask them any questions you like. Follow them over the course of a day or a year. Nothing reveals which one is conscious. (They both claim to be.) Both seem like ordinary humans.

So why should we humans be equipped with consciousness? Darwinian theory explains that nature selects the best creatures on wholly practical grounds, based on survivable design and behavior. If zombies and humans behave the same way all the time, one group would be just as able to survive as the other. So why would nature have taken the trouble to invent an elaborate thing like consciousness, when it could have got off without it just as well?

Such questions have led the Australian philosopher of mind David Chalmers to argue that consciousness doesn’t “follow logically” from the design of the universe as we know it scientifically. Nothing stops us from imagining a universe exactly like ours in every respect except that consciousness does not exist.

I’ve always thought that the zombie argument goes too far, because the fact that we can imagine something doesn’t mean it’s actually possible; it might just mean that we don’t fully understand the relevant physical facts. (It may in fact be impossible for biological “zombies”, creatures physically indistinguishable from ordinary humans, to exist. See my own mention of the “Floating Iron Bar” argument in this same context, back in 2007.) But I do believe that human consciousness is not “on or off”, and that we can do almost everything we do quite unconsciously, including maintaining intentional states — so the question itself, of what consciousness is for, and how it could be the result of natural selection, is very much a valid one.

I won’t excerpt any more of Dr. Gelernter’s article; you should go and read it yourself, here.

Links

They’ve been piling up a bit, I’m afraid.

Graphene: the gift that keeps on giving.

Remember what happened to the Shakers, kitten.

How to keep your man.

The Daily Telegraph, just a century ago.

– “You will know us by the trail of dead.”

Ice, Ice, Baby.

Eagle grinders: the downside.

Duh.

Also duh.

And then they roll in it.

Works for me!

Duh #3.

I want one.

Sharks who tweet.

Faith in humanity restored.

This is why we rule the planet.

Thomas Sowell on income inequality.

Apparently not satire.

Old school.

I’d buy this on looks alone.

Radical chic.

Well, whaddya know?

Passweird.

Does this work?

Trick shot Titus.

The Weather Outside Is Frightful

As you may have heard, there’s quite a storm on in the Northeast tonight. The lovely Nina and I are riding it out in our snug little dacha at the far end of Cape Cod, twenty-five miles out in the Atlantic. It’s been snowing all day, and we’re supposed to end up with about two feet of the stuff by the time it’s all over, tomorrow evening.

With luck, the power won’t go out; that would be bad. But it’s getting much colder now, and the snow is already fluffy enough to blow out of the trees, and not bring down many branches. The car is safely parked on the dirt road at the bottom of our little hill, and we have ample provisions and plenty of firewood. (I hope you are all safe and warm, wherever you are.)

Let it snow!

Search Me!

Once again, here’s our New Year’s selection of some of the search-engine keyphrases that have brought visitors our way in the past year:

dark enlightenment
mola mola
compelling natural force
washington monument syndrome
freedom go to hell
what is a moral fact
he’s no fun he fell right over
hirsutative
nipples
brooklyn outwash moraine
fools rush in
baxters glass eye
type c materialism
testability is a virtue
what does close up those barren leaves mean
hot chip gurdjieff
definition of cumasta
shfmdukiah
something fascinating
hog swaddled
chitty chitty bang bang vocabulary
tab hoarding
eliane elias bikini
zombie soldiers
waka inane
holly diwali
malignant omphaloskepsis
brevity is for the weak
thin pine tree
cape cod glaciation
totally exposed breasts
drool britannia
wakka wakka is a form of enculturation
marshy soil
tocqueville civic engagement
lift your chin up and stand straight
tiger fork
10 voiders of islam
let us therefore brace ourselves to our duties
college inn collagen
atheist pelicans
sequester cuts how will they affect cobblestone near fort carson?
washington irving statue in prospect park
knees d’antan
need a picture of a pig
rash around nipples
plural of waka
fat grubs immodest ass
mattress grave
dank buds
schwingtime for hitler
onomastic lizards
there is something fascinating about science
never fall in love with a government
gravy siphon
llap goch
historic hardasses
apostatic psychosis
he was a beautiful wiry slender athlete
references foot notes drunk tank pink
how many people died on the waka
lichens of the southeast
jesus in his flivver

Here’s To You

Happy New Year, everybody. And have a rip-roaring Hogmanay.

Thanks again. See you in 2014!

It’s Different For Girls

In this blog post, a New York venture capitalist expresses his concern about an urgent national problem: the underrepresentation of women in software engineering.

Why this would, by itself, be an urgent national problem is hard to imagine. From an end-user’s perspective, what matters is that software does what it’s supposed to, reliably and without risk. (The genital anatomy of the programmer, as far as I am aware, is ignored by most compilers.)

No, this is seen as a social problem, of considerable urgency. Why would that be? I can think of two possible reasons.

The first is the belief that the relatively low numbers of female programmers must be due to pernicious social oppression and obstruction. This is a serious charge! In order to convict, however, one should establish its truth beyond a reasonable doubt. To do that, one must rule out other possible causes.

First among the possibilities to eliminate, of course, is that the innate cognitive and dispositional qualities that confer programming talent, and that attract people to this peculiar and highly abstract profession, might be distributed differently among males and females, as a result of our evolutionary history. There will always, nevertheless, be women who love solving the sort of problems that programmers solve, and who don’t mind spending long hours at the computer every day solving them. I’m a professional software engineer myself, and I have known some gifted female coders. But might it not be the case that software engineering is simply the sort of thing to which more males are attracted, and for which more males have the talent to succeed? One thing I know about programming: if you don’t love it, you’ll soon come to hate it. The hours are long and sedentary, the work can be terribly frustrating, and the workplace pressure can be almost unbearable. Maybe males are just, on average, more likely to be the ones who love it enough to want to do it for a living. Maybe they are also just somewhat more likely to have the kind of cognitive architecture it takes to do it well.

Do we have any compelling reason to eliminate this hypothesis? No. What we have instead is, rather, a growing body of evidence that coherently and consistently supports it, as well as the simple common sense of the ages, and the plain, overwhelming fact that all human societies, always and everywhere, have divided themselves into sexually differentiated roles.

Does this mean that some sort of social obstruction doesn’t exist, or has never existed, regarding women programmers? Of course not. But the numbers are stubborn, even as society ties itself into knots trying to equalize them. The author of the post in question, for example, speaks about a training program that admits qualified applicants purely by lottery. It’s still mostly males, nevertheless:

At The Academy For Software Engineering (AFSE), we use a “limited unscreened” model to accept students. It’s limited because you have to attend an open house and make AFSE your first choice, but once you do those two things, it’s a lottery system to get in. So effectively the distribution of students admitted is going to be very similar to the distribution of students who apply and make the school their first choice. In our first year, we admitted 24% young women. In our second year, the percentage was less, I believe below 20%.

So things are getting worse, not better. (As I said, we are looking at a national emergency here.)
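To make the point about the lottery concrete, here is a minimal sketch; the applicant-pool composition below is an assumed figure for illustration, not AFSE’s actual numbers:

```python
import random

# Minimal sketch: a "limited unscreened" lottery simply reproduces the
# applicant pool. The pool below (20% young women among 1,000 applicants)
# is an assumption for illustration, not AFSE data.

random.seed(0)
applicants = ["F"] * 200 + ["M"] * 800     # assumed applicant pool
admitted = random.sample(applicants, 100)  # pure lottery, no screening

share_female = admitted.count("F") / len(admitted)
print(f"Female share of admitted class: {share_female:.0%}")
# Run after run, the admitted share hovers near the applicant share (~20%):
# the lottery cannot produce a distribution the applicant pool doesn't contain.
```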

If you insist on ruling out the possibility of natural male/female asymmetries, however, then the only explanation that remains is the one that consumes “progressive” sorts as an unquenchable flame: systematic cultural oppression, the eradication of which, down to the last “microaggression”, is the primary duty of a just society. (We note that to do so requires assuming power, and exerting control. This has an obvious, and timeless, appeal.)

The other possibility is that such differences are, in fact, innate, but that they must be eliminated (admittedly, a minority view; the blank-slate/cultural-oppression tine of this fork is far more popular). Fortunately, human nature itself being but clay in the hands of the wise and the just, this can be fixed, too, if the right sort of pressure is applied.

By a happy convergence, both problems are amenable to the same remedies. Regarding the declining numbers quoted above, we read:

This is very upsetting to me and we are working on a number of things to change this.

Uh-oh.

It will require working hard on the parents of the young women and the middle school guidance counselors.

All I can say is that were I such a parent, to learn that I was about to be “worked hard on” by some busybody on a mission to adjust my daughter’s ambitions to fit his Procrustean template would have me reaching for my revolver.

We see here, yet again, the chief feature of the “progressive” way of looking at the world: to reverse cause and effect, and so to imagine that our nature is the product of our culture, rather than the other way around.

As it happens, the maverick feminist Camille Paglia took up the same topic in an astringent interview published just yesterday in the Wall Street Journal. “What you’re seeing”, she begins, “is how a civilization commits suicide.”

Right she is, and so we are. Read more here.

Not So Fast

From our reader Henry, here’s an interesting item: geneticists studying the rate at which biological complexity has increased over time have arrived at a provocative extrapolation.

Merry Christmas!

A warm and happy holiday to each and every one of you.

Thank you all, as always, for reading and commenting.

Bill On Phil

We’ve already discussed the Duck Dynasty brouhaha at sufficient length (and then some), but I wouldn’t want you to miss Bill Vallicella’s recent post about it: Some Points on Homosexuality in the Context of the Culture War.

Bull Goose Loony

We haven’t mentioned the Norks much in these pages lately (or, for that matter, foreign affairs generally; we’ve been taking a bit of a breather there). Time to catch up a bit.

As I’m sure most of you know, the doughy, degenerate, dipsomaniacal despot Kim Jong-un has recently executed his girlfriend, his uncle’s closest advisers, and his uncle himself, the latter being the man who groomed him for his current position.

Today, NightWatch reports that Kim was drunk at the time he offed his uncle’s posse:

North Korea: New information indicates that Kim Jong Un was thoroughly drunk when he ordered the execution of the aides of his uncle, Chang Sung-taek in November. Eight of Chang’s aides were executed before Chang. They were among the best and brightest economists in North Korea.

Comment:  Kim Jong Un has his father’s — Kim Chong-il’s — strong despotic instinct for survival, but even less insight about how to govern anything, much less a country, such as North Korea.

Kim Chong-il was a drunk and voyeur of pornography who executed those who disagreed with him. Kim Jong Un is following his father’s practices of making important decisions while drunk and executing those whom he is told disagree with him.

The key difference is that Kim Chong-il knew who his enemies were because he grew up in North Korea and studied with the veterans of the three Korean wars.

Kim Jong Un is essentially a foreigner; educated in Switzerland, who has no memories of the three wars; no military service, and has to be told who his enemies are by communist party hacks who are now wearing military uniforms. When Kim is drunk, the Army is always on duty.

Can you imagine what it must be like to be a member of this man’s inner circle? To be his wife? John Derbyshire does just that, here.

Finally, George Friedman of STRATFOR examines North Korea’s strategy for long-term survival: to be “ferocious, weak and crazy“.

The Flood

By now, I imagine, you are all familiar with the smug, epicene Eloi putz that the Obamacare P.R. machine recently chose to promote their product. He has entered the popular culture as “Pajama Boy”, and to an awful lot of people he has become an icon of American decadence and enfeeblement: a vain and useless girly-man, wearing infant’s clothing, nuzzling at the federal bosom for his most basic needs, and too weak even to lift his mug of chocolate with one hand.

If by some chance you haven’t seen him — you’ve just been released from solitary confinement, perhaps, or the bandages have just come off after your bilateral cornea transplant — here he is:

I find his image deeply disturbing: not only because it makes plain the accelerating infantilization and effeminization of Western culture, and the death of the virile self-sufficiency that once made America a mighty nation and the envy of the world, but also because it reveals the dangerous extent to which our ruling elites have learned to despise and reject the capable and hard-working people who, by the sweat of their brow and the skill of their hands, sustain their existence. To say that there are “two Americas”, as we’ve heard so often lately, misses the mark: I’d say, rather, that there is still just America — but attached to the base of its skull is this pale and parasitic thing, with its hyphae extending deep into our veins and central nervous system.

Anyway, I had been meaning to scribble an extended complaint about all of this, but just today I saw that Victor Davis Hanson had already done the job, and very well indeed. Go and read his essay, here.

Memo To Mo

Making the rounds today is a rant by one of the online community’s preeminent dyspeptics, Fred Reed. In it he responds to the New York Times columnist Maureen Dowd’s opinion that men are no longer necessary.

Ms. Dowd sums up her little idea as follows:

So now that women don’t need men to reproduce and refinance, the question is, will we keep you around? And the answer is, ‘You know we need you in the way we need ice cream — you’ll be more ornamental.’

An excerpt from Mr. Reed’s reply:

Listen, Corn Flower. Let’s think over this business of obsolete men. Reflect. You live in New York, in which every building was designed and built by men. You perhaps use the subway, designed, built, and maintained by men. You travel in a car, invented, designed, and built by men—a vehicle that you don’t understand (what is a cam lobe?) and couldn’t maintain (have you ever changed a tire? Could you even find the tires?), and you do this on roads designed, built, and maintained by men. You fly in aircraft designed, built, and maintained by men, which you do not understand (what, Moon Pie, is a high-bypass turbofan?)

In short, as you run from convention to convention, peeing on hydrants, you depend utterly on men to keep you fed (via tractors designed by men, guided by GPS invented, designed, and launched by men, on farms run by men), and comfy (air conditioning invented…but need I repeat myself?)

I do not want to be unjust. It is not in my nature. While men may be obsolete (unless you want to eat) I cannot say, Apple Cheeks, that feminists are obsolete. They are not. Obsoleteness implies having passed through a period of usefulness.

A jot—an iota, a tittle, a scintilla—of gratitude might be in order. Should you look around you, you will note that everything that keeps you and the sisterhood from squatting in caves and picking lice from each other’s hair was provided for you by—the horror—men.

Mr. Reed singled out one field — computer science — for particular attention. To shorten the comment thread, I will pre-emptively acknowledge that this is in fact a field in which some women have indeed made significant contributions, so you needn’t chime in with indignant references to Ada Lovelace, Grace Hopper, et al. It is also, of course, this website’s editorial position that any woman (or for that matter, anyone at all) who has any talent or interest in pretty much any human endeavor whatsoever should be able freely to pursue it without institutional obstruction. The pleasure of Mr. Reed’s essay is simply to see him give the smug and insufferable Ms. Dowd a poke in the eye.

You can read the rest here.