A Flea in the Fur of the Beast

“Death, fire, and burglary make all men equals.” —Dickens

Newt Gingrich Takes A Break From The Crazy To Make A Decent Point About Legislating

by evanmcmurry

Take it from a veteran observer: Newt Gingrich’s id-consumed-extemporaneous-word-salad-stream-of-consciousness-zoogastic performance art occasionally yields dividends.

Today, Gingrich was asked about John Roberts’s switcheroo to simultaneously uphold the individual mandate, save health care, balloon Obama’s election chances, deflate conservative momentum in the Supreme Court, and completely alter the political arithmetic of the judiciary. I’ve been over the various ways conservatives are attempting to spin this to their advantage, with less and less success. But Newt’s never been one for the script. He’s the GOP’s method actor, and he just goes with the moment, man:

What Roberts has said is, “Yes, it’s constitutional because of a gigantic tax increase, and if you don’t want the gigantic tax increase you’ve got to beat Obama.” You don’t just get to come to the Supreme Court to bail you out. And I happen to think that part of it is probably healthy for the country to be forced to confront, that it’s their burden.

That’s…not a bad point. You know something’s up when Gingrich and Gin & Tacos agree on a basic point. Here’s G&T on the same subject:

Simply put, there is a good argument to be made that the Supreme Court is resolving a greater number of political issues because the actual political process – Congress and state legislatures, presidents and governors – refuses to do so. Our elected officials, rather than make decisions about hot button issues and risk infuriating half of their constituents, willingly punt to the guys who can’t be punished on Election Day.

Consider the choice facing members of Congress. One option is to introduce a bill about some controversial topic – abortion, gay marriage, healthcare reform, etc. – and then go on record for or against it. Another is to tread water, maintain the status quo, talk out of both sides of one’s mouth on the issue, and wait for the Supreme Court to issue a decision that may end up being unpopular. Rational self-interest suggests that the second option is superior for most elected officials. Consider the Republican House majority after 2010, which could very well have debated and voted on one of the “repeal and replace” bills for “Obamacare” that candidates talked about so much during the election. In practice, and recognizing how popular some (but not all) parts of the law are among the public, they decided to wait and let the Supreme Court strike it down. Obviously that strategy failed…

It is popular in recent years to write about the failure of leadership in today’s political class, often by resorting to sophomoric references to “common sense” and “guts”…Perhaps it is a lack of resolve; perhaps it is simply a rational response to the incentives laid out in our elections, particularly the financial incentive to placate the greatest number of interest groups to the greatest possible extent. Regardless, the Federal bench and the Supreme Court in particular resolve contentious political questions for an uncomplicated reason: someone has to, and the lawmakers won’t.

Gingrich goes on to spout words in no particular order about how Roberts’s decision is a victory for Grover Norquist, and how this is going to be Obama’s worst nightmare (Obama does often dream of being reelected), and all in all resumes his place as a semi-irrelevant upside-down slam poet of the right.

But credit where credit’s due: Gingrich’s point, that the judiciary can’t be counted on to do the work of dysfunctional legislatures anymore, is a solid one. And he can put all the conservative english on it he wants, but only one party is responsible, on a state and federal level, for that dysfunction. Which means Republican lawmakers should think twice before voting along intractable party lines against any and every bill that doesn’t fit their tea party pledges, as there might not be a judicial escape hatch later in the process. And absenting yourself from the legislative process out of ideological petulance, as David Frum told them oh so many years ago, is a good way to get booted out of governing entirely.

Gingrich may or may not have intended his comment as a warning to younger conservatives; either way, it should be taken as one.

The Right And Welfare Reform: Or, The Problem With Work Requirements When There Are No Jobs

by evanmcmurry

The right wing blogotwittersphere is agog at a Health and Human Services directive released late Thursday that grants states greater leeway in apportioning welfare payments with regard to an individual’s ability to work. Specifically, they’re up in arms over the directive’s elimination of the reporting mandate for welfare’s work requirement—the part of the 1996 welfare reform package requiring that welfare recipients participate in the labor force as a condition of aid.

What’s the big deal about a reporting requirement? Conservatives see it as an end run around the work requirement itself (if you don’t have to report it, you don’t have to do it), allowing bums to lie around on the government’s dime without so much as glancing at the want ads. Via Mickey Kaus, from the Daily Caller’s Day Center For Cranky Bloggers:

Rector and Bradley of Heritage (among the first to attack Obama’s action) make the case that the law’s work requirements were specifically designed to not be waivable, and that Obama is using HHS’s authority to waive state reporting requirements as a tricky way of voiding the underlying substantive requirements that are to be reported about.

Or, if you’re the National Review: “Obama Ends Welfare Reform As We Know It.”

Welfare reform replaced the old Aid to Families with Dependent Children with a new program, Temporary Assistance for Needy Families (TANF). The underlying concept of welfare reform was that able-bodied adults should be required to work or prepare for work as a condition of receiving welfare aid.

The welfare reform law was very successful. In the four decades prior to welfare reform, the welfare caseload never experienced a significant decline. But, in the four years after welfare reform, the caseload dropped by nearly half. Employment surged and child poverty among blacks and single mothers plummeted to historic lows. What was the catalyst for these improvements? Rigorous new federal work requirements contained in TANF.

To the right, this is proof of their strange and consistent belief that Obama is creating an entitlement society not for any policy-based long-term societal improvement, but simply as an end in and of itself, as if he needs something for show and tell at the Democratic National Convention.

Or something else is up—like unemployment. The 1996 Welfare Reform Act no doubt helped a lot of people off the dole and into the workforce, but it was only able to do so because there was a workforce with room for them. The late 90s were an excellent time to be looking for a job in the United States:

Employment Growth 1960-2000

(via BLS)

When there’s a rapidly growing labor market, it’s easy to chide able-bodied people on welfare; in 1998, you could get a job hanging Help Wanted signs.

Since then:

Employment growth, 2000-2010

(via The Big Picture)

Even at its bubbly peak, the labor market vastly underperformed during Bush’s administration, to say nothing of the financial collapse he left on our doorstep that shed more than 3,000,000 jobs in six months. This not only ballooned the number of people in need of government assistance, but drastically shrank the labor market they were supposed to be joining as a condition for that assistance. In 2008, you couldn’t get a job taking down Help Wanted signs.

Conservatives don’t want to bring up contextual economic factors in the viability of work requirements because it troubles their ideological cleaving of the world into Producers and Takers; there’s simply no ability for the rigid Randian worldview to accommodate people who want to work, can’t find a job, and need assistance.

But there’s also no eliding that the 1996 work requirement was designed with the late-90s booming labor market in mind. A weak economy turns this policy reform into a contradiction: how do you maintain a work requirement for people who can’t find work in order to get the subsidy they need precisely because they can’t find work?

Kaus knows this, though he buries it in his unusually wordy-for-him post:

Thanks to the prolonged recession, there aren’t enough jobs for welfare recipients to take. Even if there is a job shortage, the answer isn’t to get rid of the work requirements but to provide useful, public jobs (that recipients would then be required to perform, on pain of losing their checks, just like regular workers). You could call such jobs “workfare,” but in effect they would be something like a backdoor WPA.

Well, what a great idea. Federal stimulus in the form of employment? Obama should immediately propose that nine months ago. By the by, what do you figure the odds are of a massive WPA-style public employment program passing through the House of Representatives right now?

I’m with Kaus when he criticizes the directive for being open-ended. Obama loses nothing by giving the waiver a duration of, say, twelve months, with the option of renewal for another year. If nothing else, it would paint the effort more as the stopgap that it actually is; and I generally agree with Reagan’s quip that a government program, once created, can never be destroyed.

But this isn’t the end of welfare reform as we know it; it’s an acknowledgement that welfare reform assumed a robust economy, and the absence of that strength created a contradiction that left the very people who needed assistance without it. This directive should provide some of that assistance. The free market will survive; and so, hopefully, will the families suffering under it.

Michael Cunningham’s Fascinating And Completely Unsatisfying Pulitzer Defense

by evanmcmurry

“The Pulitzer Prize in fiction,” William Gass once said, “takes dead aim at mediocrity and almost never misses.”

Something besides mediocrity was afoot in the non-selection of the 2012 Pulitzer, and we now have our first testimony from someone involved in the inner process of this year’s aiming. Michael Cunningham, one of the three fiction jurors and a former Pulitzer winner himself, penned a well-written explanation of the process behind the odd selection of 2012’s three finalists, though he is just as much in the dark as the rest of us as to why the Pulitzer committee ultimately did not select any of them.

Quick background, skip if you don’t need it: for the first time since 1977, no Pulitzer was awarded this year, a move that surprised and angered a good number of people, not least of whom were independent bookstore owners, who rely on the prize to boost sales of the otherwise boutique literary fiction market. Anger at the Pulitzer board soon pivoted to the fiction jurors, who seemed to have presented the board with an idiosyncratic and cumbersome selection of finalists: an uncompleted manuscript by David Foster Wallace, a novella by Denis Johnson that had been published, twice, at the other end of the last decade, and a debut novel by youngster Karen Russell. It was suggested that the board simply didn’t know what to do with a group that included at least two works that could arguably be disqualified, and called the whole thing off. This writer wondered why the tie didn’t go to Russell, for actually having written a book that could be given the award without an asterisk.

All caught up? On to Cunningham’s two-part letter-essay in the New Yorker (where else?), which, though it leaves absolutely no doubt that the three jurors approached their task with supreme gravity, is nonetheless completely unsatisfying—and not only because he ultimately has no more idea why the prize wasn’t awarded than we do.


Cunningham writes on behalf of the jury, which also included English professor Maureen Corrigan and former book editor Susan Larson:

We were, all three of us, shocked by the board’s decision (non-decision), because we were, in fact, thrilled, not only by the books we’d nominated but also by several other books that came within millimetres of the final cut. We never felt as if we were scraping around for books that were passable enough to slap a prize onto. We agreed, by the end of all our reading and discussion, that contemporary American fiction is diverse, inventive, ambitious, and (maybe most important) still a lively, and therefore living, art form.


The jury does not designate a winner, or even indicate a favorite. The jury provides the board with three equally ranked options. The members of the board can, if they’re unsatisfied with the three nominees, ask the jury for a fourth possibility. No such call was made.

That last sentence is probably the most important in Cunningham’s entire missive. Why the board didn’t simply ask for another recommendation will probably be the primary unanswered question from this mess. It sounds, at least to this reader, as if the judges would have provided an alternative selection, even if they might not have been thrilled about it.

Cunningham then takes us step by step through the process of winnowing 300+ novels down to 30 or so, and then down to three. The more agonizing part of the process, Cunningham says, was not the early stages of chucking the novels that obviously weren’t worthy, but the later rounds in which the judges had to somehow filter the great from the very good. Cunningham struck down novels that were otherwise laudable but contained too many lazy lines (take note, writers); other books were well-written but too obviously derivative; a thinly veiled Super Sad True Love Story was reluctantly struck for the simplicity of its love story. If you’re a writer, even a would-be one, this is fascinating stuff.

But then we get to this:

My own most dramatic reading experience occurred when, from the third shipment, I pulled Wallace’s “The Pale King.” I confess that I was not a huge fan of his novel “Infinite Jest,” and further confess that I thought, on opening “The Pale King,” that it was a long shot indeed, given that Wallace had not lived to complete it.

I was, as it happened, the first of us to read “The Pale King,” and well before I’d finished it I found myself calling Maureen and Susan and saying, “The first paragraph of the Wallace book is more powerful than any entire book we’ve read so far.”

Cunningham, the self-described “language crank,” was wowed by the Finnegans Wake-esque opener, and sold it to the other two judges. “It was a little like having heard a series of chamber pieces, and been pleased by them,” Cunningham says of the novel in comparison to the other entries, “until the orchestra started in on Beethoven.”

“The Pale King” was, of course, unfinished, but so are a number of great works of art. We have only fragments of Sappho’s poetry. Chaucer was a little more than halfway through “The Canterbury Tales” when he died. And, of course, there’s Haydn’s unfinished string quartet, and all those magnificent sculptures by Michelangelo, only half emerged from their blocks of marble.

So now we know whom to blame for the inclusion of The Pale King. But there’s an obvious problem with Cunningham’s logic: while he’s right that Sappho and Chaucer have been cemented into the canon despite the fragmented nature of their oeuvres, and that unfinished works—Schubert’s Unfinished Symphony comes to mind—are now classics, the distinction of each was bestowed by time, retrospect and scholarship, not the immediate recognition of awards; they were acts of accretion, not recognition. Sappho’s poetry was written in completion; it simply has not survived that way. The Canterbury Tales, classic of verse though it is, did not win the Pulitzer of 1475; centuries of readers and scholars have rendered it a classic as opposed to a half-text. Schubert’s Unfinished was not awarded a proto-Grammy, instead being declared a masterpiece only because we have the luxury of considering it in the full context of the composer’s career.

Only the last of these applies to The Pale King. It is possible that, in twenty or forty years’ time, Wallace will be enshrined as the artificing frontiersman of fiction many claim him to be; at that point, if we want to declare The Pale King an unfinished masterpiece—a could-have-been in the way that, say, The Last Tycoon is—we may. This context, one of epochal movements and shifts in taste, is the opposite of the forces that govern a yearly prize, which asks merely to name the best work of fiction within a given twelve-month period. What he’d written of The Last Tycoon at the point of his too-early death suggests Fitzgerald was entering a midlife return to form; but no way did it deserve 1940’s Pulitzer, and it didn’t receive it. The quality we read into it must, by definition, be read into it, almost entirely through hindsight; The Pale King has no such hindsight yet from which to benefit.

Then there’s this:

It seemed, too, that a Pulitzer for “The Pale King” would be, by implication, an acknowledgement not only of Wallace but also of Michael Pietsch, the editor. As a novelist, I well know how much difference an editor can make—and there’s no major prize given to editors. The best an editor can hope for is mention on the acknowledgments page, when, sometimes, that editor has literally rescued the book.

That’s obnoxious. Cunningham is no doubt correct about the underappreciated role of editors, and if he ever wants to rant about it over a single malt, I’ll let him buy me a Balvenie. But the Pulitzer Prize isn’t for righting the structural wrongs of the publishing industry; all Cunningham succeeded in doing by including this reasoning is making the process seem like an insular club of literati backslapping each other, rather than the celebration of genuine achievement.

Cunningham wrote less of the other two texts. Here’s his say on Train Dreams:

Denis Johnson’s “Train Dreams” had been written ten years earlier and been published as a long short story in The Paris Review. It was, however, magnificently written, stylistically innovative, and—in its exhilarating, magical depiction of ordinary life in the much romanticized Wild West—a profoundly American book.

[…] “Train Dreams” had only been published as a novel in 2012, which made it eligible, for the first time, for a Pulitzer. We checked with the Pulitzer administrator about that. He gave us the O.K.

…which is permission, not an explanation. I’m 100% with the jury that Train Dreams is excellent. I also thought so in 2002, when I first read it in the Paris Review, and again when it was anthologized in The O. Henry Story Collection in 2003. But as I wrote at the time of the Pulitzer announcement, a child born when Train Dreams was first published has a cell phone by now. Why, when one is looking for a single qualified text out of 300, was deference not given to “magnificent” and “innovative” and “American” texts not 10 years old?

Cunningham doesn’t explain; he merely tells us that the text technically qualified. In the wake of what followed, who cares? No one’s here to indict Cunningham et al.; the question isn’t whether it was technically permissible for one book or another to be considered, but why it was considered despite an obvious deficiency given the nature of the prize. Cunningham gives us nothing to this end.

The selection of Russell’s novel was not without its equivocations, either:

Karen Russell’s “Swamplandia!” was a first novel, and, like many first novels, it contained among its wonders certain narrative miscalculations—the occasional overreliance on endearingly quirky characters, certain scenes that should have been subtler. Was a Pulitzer a slightly excessive response to a fledgling effort?

However, it seemed very much like the initial appearance of an important writer, and its wonders were wonderful indeed. Other first novels, among them Harper Lee’s “To Kill a Mockingbird” and John Kennedy Toole’s “A Confederacy of Dunces,” have won the Pulitzer. One is not necessarily looking for perfection in a novel, or for the level of control that generally comes with more practice. One is looking, more than anything, for originality, authority, and verve, all of which “Swamplandia!” possessed in abundance.

All in all, it sounds as if the judges, or at least Cunningham, liked The Pale King the best, with Train Dreams as runner-up and Swamplandia! as a spunky bronze medal. Cunningham writes on the largely pointless second page that the jury was looking for The One:

The One would be the novel so monumental, so original and vast and funny and tragic, so clearly important, that only an idiot would deny it the Pulitzer Prize.

We wanted a foolproof book, a book about which we could be absolutely certain.

[…] But none of them was unquestionable, none so flawlessly and obviously great as to quell all doubts. Juries are assigned, in part, to doubt. To weigh and question, to wonder over the balance between virtue and lapse.

Cunningham clearly thought that The Pale King was as close to The One as this batch of 300 novels came. He also seems cognizant that Wallace, who never received a major prize in his lifetime, wouldn’t have another chance at literature’s most prominent prize. Cunningham even runs down, as if out of guilty conscience, the list of writers who were honored far too late in their careers by similar juries who bestowed the Pulitzer more as a lifetime achievement award than as a verdict on a single novel:

It’s true as well that a number of the authors of all those great but unselected books got the prize eventually, though most of us would agree that the prizes, when finally awarded, gave off a hint of redress, unless we believe that Hemingway’s “The Old Man and the Sea” (which won in 1953) outshines “The Sun Also Rises” and “A Farewell to Arms,” or that Faulkner’s “A Fable” (1955) and “The Reivers” (1963)…leave “The Sound and the Fury” and “Absalom, Absalom!” in the Mississippi dust.

Clearly, the Pulitzer committee didn’t buy any of this, or they would have given the award to Wallace. We have no idea why the committee decided the way it did, but perhaps they didn’t want one more book included on a list of the type Cunningham just gave, one more book that obviously didn’t belong. It’s too bad they didn’t reconvene the jury: for as much as Cunningham goes to bat for Wallace in this essay, he is not palpably convinced that The Pale King was the best of the bunch. If the committee had asked Cunningham to choose again, would he have looked back into the pile of books he’d so agonizingly turned down and selected another? Or would he have stuck by Wallace’s fragment, in deference to Wallace’s career or reputation, or, most likely, the idea that future generations would look back in judgment at our inability to recognize Wallace’s genius? In worrying so much about the opinion of the future, did he commit himself to a book that didn’t even survive the judgment of the present?

David Brooks And The Elites

by evanmcmurry

Congratulations to Giant Marxist Vagina Chris Hayes. Never mind a steady gig at the Nation or a weekend show on MSNBC: you know you’ve made it when David Brooks devotes a whole column to debunking you. From now on, Mr. Hayes drinks for free south of 14th St., in which I include most of the rest of the country, where most of the rest of the plebes live.

Brooks no doubt could see this portion of the superstructure from his house, if he didn’t have such a bad habit of mistaking mirrors for windows:

Through most of the 19th and 20th centuries, the Protestant Establishment sat atop the American power structure. A relatively small network of white Protestant men dominated the universities, the world of finance, the local country clubs and even high government service.

And things were great. Go on.

Over the past half–century, a more diverse and meritocratic elite has replaced the Protestant Establishment. People are more likely to rise on the basis of grades, test scores, effort and performance.

Or family pedigree, family prominence, family connections, or a combination of all three. Go on.

Yet, as this meritocratic elite has taken over institutions, trust in them has plummeted. It’s not even clear that the brainy elite is doing a better job of running them than the old boys’ network. Would we say that Wall Street is working better now than it did 60 years ago? Or government? The system is more just, but the outcomes are mixed. The meritocracy has not fulfilled its promise.

That’s unobjectionable until you catch the false dilemma slipped in at the end: “The system is more just, but the outcomes are mixed.” Note how justice is cannily posed in direct opposition to success, as if society were a zero-sum competition between the two: a more just society, to Brooks, necessarily means a less functional one. Thus do we return to Brooks’s running theme, that the world would work best if the people for whom it worked worst accepted their fate and got out of the way.

Would anybody like to pose a counterargument to that?

Christopher Hayes of MSNBC and The Nation believes that the problem is inherent in the nature of meritocracies. In his book, “Twilight of the Elites,” he argues that meritocratic elites may rise on the basis of grades, effort and merit, but, to preserve their status, they become corrupt. They create wildly unequal societies, and then they rig things so that few can climb the ladders behind them. Meritocracy leads to oligarchy.

Hayes points to his own elite training ground, Hunter College High School in New York City. You have to ace an entrance exam to get in, but affluent parents send their kids to rigorous test prep centers and now few poor black and Latino students can get in.

Baseball players get to the major leagues through merit, but then some take enhancement drugs to preserve their status. Financiers work hard to get jobs at the big banks, but then some rig the game for their own mutual benefit.

[…] Far from being the fairest of all systems, he concludes, the meritocracy promotes gigantic inequality and is fundamentally dysfunctional. No wonder institutional failure has been the leitmotif of our age.

Sounds right to me. If we agree, then why are we arguing?

It’s a challenging argument but wrong.


I’d say today’s meritocratic elites achieve and preserve their status not mainly by being corrupt but mainly by being ambitious and disciplined. They raise their kids in organized families. They spend enormous amounts of money and time on enrichment. They work much longer hours than people down the income scale, driving their kids to piano lessons and then taking part in conference calls from the waiting room.

Hold on a damn second. Where on god’s big green dumb earth did people come up with this idea that poor people don’t work hard? Leaving aside the problems of correlation v. causation inherent in the structure of such an “argument”—wanting as they do to see poverty as a moral failing, conservatives can never agree whether people are poor because they don’t work hard, or don’t work hard because they’re poor, and often end up claiming both, a sort of “let them have their cake and eat it, too” avoidance of logic—it just isn’t true.

Poor people tend to work more than the wealthy; they just receive exponentially less in wages for it. The main structural challenge of being poor—at least up until our most recent White Protestant president opposite-of-succeeded with our economy and eliminated the problem of work by eliminating available jobs altogether—is that no amount of work is enough to stimulate economic mobility in a structurally unequal system. When you must work your whole week to break even, there’s no time left over for study and no money left over for study aids (note how Brooks intentionally conflates discipline and money in the above examples), calcifying poverty over generations. Poor people aren’t choosing to watch Real Housewives of the Meritocracy instead of taking their children to piano recitals; often they can’t do either, because they’re clocked in. Brooks wants this to be a moral failing, not a structural-economic one that might be addressed by these concepts of “justice” running around mucking everything up.

Phenomena like the test-prep industry are just the icing on the cake, giving some upper-middle-class applicants a slight edge over other upper-middle-class applicants. The real advantages are much deeper and more honest.

Ah. He must mean about how nobody’s being honest about how the ethos of self interest that was supposed to gird our economy instead destroyed it, and how we have used the acidic rhetoric of the free market to dissolve the social contract.

The problem is that today’s meritocratic elites cannot admit to themselves that they are elites.

Or that. That also makes sense.

Everybody thinks they are countercultural rebels, insurgents against the true establishment, which is always somewhere else. This attitude prevails in the Ivy League, in the corporate boardrooms and even at television studios where hosts from Harvard, Stanford and Brown rail against the establishment.

(That last is a shot at Hayes and Co.) No, what the wealthy have mastered is using the matrices of late capitalism to appropriate counterculture icons and turn them into found status symbols. Nobody’s pretending that they are counterculture, they’re pretending that counterculture is just like anything else—it has an exchange value, and can be purchased for capital. It’s sort of like when a New York Times columnist buys a ticket to a Springsteen concert.

As a result, today’s elite lacks the self-conscious leadership ethos that the racist, sexist and anti-Semitic old boys’ network did possess. If you went to Groton a century ago, you knew you were privileged.

No, you knew you were better. There’s a difference, and it’s a difference to which Brooks obviously wants to return:

The difference between the Hayes view and mine is a bit like the difference between the French Revolution and the American Revolution. He wants to upend the social order. I want to keep the current social order, but I want to give it a different ethos and institutions that are more consistent with its existing ideals.

That ethos would be the one “the racist, sexist and anti-Semitic old boys’ network did possess.” Brooks never quite says as much, but every sneaky syllogism of his article points to it. Since the introduction of “justice” to the structure of our society, the “outcomes” have been “mixed.” We lost the self-awareness of elitism that allowed elites prior to the introduction of “justice” to conceive of themselves as stewards of institutions. And it just so happened that said self-awareness corresponded with a white, anti-Semitic (and anti-everything else Other) group of men.

Could it be that their self-awareness was formed exactly because their group was exclusive? Elites of yesterday thought they were in charge of institutions not because they were moral captains, but because they thought the institutions should function only for them; what Brooks sees as upstanding men guiding institutions was in fact insular groups of men protecting them against encroachment from the non-white, non-male portions of society. They acted in self-interest, exactly as the bankers of five years ago were acting when they brought down the economy.

The opposite of this self-interest, it turned out, was justice. David Brooks thinks it doesn’t work as well; turns out it just doesn’t work as well for his class. But when you think a mirror is a window, your class appears to be everybody. If only somebody would write a column about what happens when people “cannot admit to themselves that they are elites.”