Friday, August 9, 2013

Phony Fear Factor

August 8, 2013

By PAUL KRUGMAN

We live in a golden age of economic debunkery; fallacious doctrines have been dropping like flies. No, monetary expansion needn’t cause hyperinflation. No, budget deficits in a depressed economy don’t cause soaring interest rates. No, slashing spending doesn’t create jobs. No, economic growth doesn’t collapse when debt exceeds 90 percent of G.D.P.

And now the latest myth bites the dust: No, “economic policy uncertainty” — created, it goes without saying, by That Man in the White House — isn’t holding back the recovery.

I’ll get to the doctrine and its refutation in a minute. First, however, I want to recommend a very old essay that explains a great deal about the times we live in.

The Polish economist Michal Kalecki published “Political Aspects of Full Employment” 70 years ago. Keynesian ideas were riding high; a “solid majority” of economists believed that full employment could be secured by government spending. Yet Kalecki predicted that such spending would, nonetheless, face fierce opposition from business and the wealthy, even in times of depression. Why?

The answer, he suggested, was the role of “confidence” as a tool of intimidation. If the government can’t boost employment directly, it must promote private spending instead — and anything that might hurt the privileged, such as higher tax rates or financial regulation, can be denounced as job-killing because it undermines confidence, and hence investment. But if the government can create jobs, confidence becomes less important — and vested interests lose their veto power.

Kalecki argued that “captains of industry” understand this point, and that they oppose job-creating policies precisely because such policies would undermine their political influence. “Hence budget deficits necessary to carry out government intervention must be regarded as perilous.”

When I first read this essay, I thought it was over the top. Kalecki was, after all, a declared Marxist (although I don’t see much of Marx in his writings). But, if you haven’t been radicalized by recent events, you haven’t been paying attention; and policy discourse since 2008 has run exactly along the lines Kalecki predicted.

First came the “pivot” — the sudden switch to the view that budget deficits, not mass unemployment, were the crucial policy issue. Then came the Great Whine: the declaration by one leading business figure after another that President Obama was undermining confidence by saying mean things about businesspeople and doing outrageous things like helping the uninsured. Finally, just as happened with the claims that slashing spending is actually expansionary and terrible things happen if government debt rises, the usual suspects found an academic research paper to adopt as mascot: in this case, a paper by economists at Stanford and Chicago purportedly showing that rising levels of “economic policy uncertainty” were holding the economy back.

But, as I said, we live in a golden age of economic debunkery. The doctrine of expansionary austerity collapsed as evidence on the actual effects of austerity came in, with officials at the International Monetary Fund even admitting that they had severely underestimated the harm austerity does. The debt-scare doctrine collapsed once independent economists reviewed the data. And now the policy-uncertainty claim has gone the same way.

Actually, this happened in two stages. Soon after it became famous, the proposed measure of uncertainty was shown to be almost comically flawed; for example, it relied in part on press mentions of “economic policy uncertainty,” which meant that the index automatically surged once that phrase became a Republican talking point. Then the index itself plunged, back to levels not seen since 2008, but the economy didn’t take off. It turns out that uncertainty wasn’t the problem.
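To make the flaw concrete, here is a minimal sketch (in Python, with an invented function name and toy articles, none of it taken from the Stanford-Chicago paper) of how a news-mention index of this kind can be built. Because the index rises with the raw frequency of the phrase, a coordinated talking-point campaign moves it regardless of what actual uncertainty is doing:

```python
# Hypothetical sketch of a news-mention "uncertainty" index: for each
# month, take the share of articles containing the target phrase, then
# rescale so the whole sample averages 100. Not the published
# methodology; just enough machinery to show the failure mode.

PHRASE = "economic policy uncertainty"

def mention_index(articles_by_month, phrase=PHRASE):
    """articles_by_month maps 'YYYY-MM' to a list of article texts."""
    shares = {
        month: sum(phrase in text.lower() for text in texts) / len(texts)
        for month, texts in articles_by_month.items()
        if texts  # skip empty months
    }
    mean_share = sum(shares.values()) / len(shares)
    # Rescale so the sample mean is 100, a common index convention.
    return {month: 100 * share / mean_share for month, share in shares.items()}

# Toy data: the underlying news is similar, but in the later month the
# phrase has become a talking point and gets repeated constantly.
before = [
    "markets were mixed as the fed held rates steady",
    "factory orders rose modestly in the quarter",
    "economic policy uncertainty weighed on some investors",
    "retail sales came in slightly below forecasts",
]
after = before + [
    "senator blames economic policy uncertainty for weak hiring",
    "op-ed: economic policy uncertainty is strangling job creators",
]

print(mention_index({"2009-06": before, "2011-08": after}))
# The later month scores far higher purely because the phrase is
# repeated more often, not because anything real changed.
```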

The truth is that we understand perfectly well why recovery has been slow, and confidence has nothing to do with it. What we’re looking at, instead, is the normal aftermath of a debt-fueled asset bubble; the sluggish U.S. recovery since 2009 is more or less in line with many historical examples, running all the way back to the Panic of 1893. Furthermore, the recovery has been hobbled by spending cuts — cuts that were motivated by what we now know was completely wrongheaded deficit panic.

And the policy moral is clear: We need to stop talking about spending cuts and start talking about job-creating spending increases instead. Yes, I know that the politics of doing the right thing will be very hard. But, as far as the economics goes, the only thing we have to fear is fear-mongering itself.

Correction: In my column on Monday, I somehow misstated the Republican plan on food stamps, which was for a doubling of planned cuts — a significant cut but not, as I said, a halving of benefits.

Monday, July 15, 2013

Hunger Games, U.S.A.

July 14, 2013

By PAUL KRUGMAN

Something terrible has happened to the soul of the Republican Party. We’ve gone beyond bad economic doctrine. We’ve even gone beyond selfishness and special interests. At this point we’re talking about a state of mind that takes positive glee in inflicting further suffering on the already miserable.

The occasion for these observations is, as you may have guessed, the monstrous farm bill the House passed last week.

For decades, farm bills have had two major pieces. One piece offers subsidies to farmers; the other offers nutritional aid to Americans in distress, mainly in the form of food stamps (these days officially known as the Supplemental Nutrition Assistance Program, or SNAP).

Long ago, when subsidies helped many poor farmers, you could defend the whole package as a form of support for those in need. Over the years, however, the two pieces diverged. Farm subsidies became a fraud-ridden program that mainly benefits corporations and wealthy individuals. Meanwhile food stamps became a crucial part of the social safety net.

So House Republicans voted to maintain farm subsidies — at a higher level than either the Senate or the White House proposed — while completely eliminating food stamps from the bill.

To fully appreciate what just went down, listen to the rhetoric conservatives often use to justify eliminating safety-net programs. It goes something like this: “You’re personally free to help the poor. But the government has no right to take people’s money” — frequently, at this point, they add the words “at the point of a gun” — “and force them to give it to the poor.”

It is, however, apparently perfectly O.K. to take people’s money at the point of a gun and force them to give it to agribusinesses and the wealthy.

Now, some enemies of food stamps don’t quote libertarian philosophy; they quote the Bible instead. Representative Stephen Fincher of Tennessee, for example, cited the New Testament: “The one who is unwilling to work shall not eat.” Sure enough, it turns out that Mr. Fincher has personally received millions in farm subsidies.

Given this awesome double standard — I don’t think the word “hypocrisy” does it justice — it seems almost anti-climactic to talk about facts and figures. But I guess we must.

So: Food stamp usage has indeed soared in recent years, with the percentage of the population receiving stamps rising from 8.7 in 2007 to 15.2 in the most recent data. There is, however, no mystery here. SNAP is supposed to help families in distress, and lately a lot of families have been in distress.

In fact, SNAP usage tends to track broad measures of unemployment, like U6, which includes the underemployed and workers who have temporarily given up active job search. And U6 more than doubled in the crisis, from about 8 percent before the Great Recession to 17 percent in early 2010. It’s true that broad unemployment has since declined slightly, while food stamp numbers have continued to rise — but there’s normally some lag in the relationship, and it’s probably also true that some families have been forced to take food stamps by sharp cuts in unemployment benefits.

What about the theory, common on the right, that it’s the other way around — that we have so much unemployment thanks to government programs that, in effect, pay people not to work? (Soup kitchens caused the Great Depression!) The basic answer is, you have to be kidding. Do you really believe that Americans are living lives of leisure on $134 a month, the average SNAP benefit?

Still, let’s pretend to take this seriously. If employment is down because government aid is inducing people to stay home, reducing the labor force, then the law of supply and demand should apply: withdrawing all those workers should be causing labor shortages and rising wages, especially among the low-paid workers most likely to receive aid. In reality, of course, wages are stagnant or declining — and that’s especially true for the groups that benefit most from food stamps.
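To spell out that supply-and-demand step in textbook terms (a generic competitive labor-market sketch, not an estimate drawn from any of the data above): write labor demand as $L^{d}(w)$, decreasing in the wage $w$, and labor supply as $L^{s}(w, b)$, increasing in $w$ and, under the right-wing hypothesis, decreasing in benefits $b$. Differentiating the market-clearing condition gives

```latex
L^{d}(w^{*}) = L^{s}(w^{*}, b)
\quad\Longrightarrow\quad
\frac{dw^{*}}{db}
  = \frac{\partial L^{s}/\partial b}
         {\partial L^{d}/\partial w \;-\; \partial L^{s}/\partial w}
  \;>\; 0 .
```

With demand sloping down and supply sloping up, the denominator is negative, and the hypothesis makes the numerator negative too, so the ratio is positive: if aid were really shrinking the labor supply, wages for the affected workers would have to rise. Stagnant wages point the other way.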

So what’s going on here? Is it just racism? No doubt the old racist canards — like Ronald Reagan’s image of the “strapping young buck” using food stamps to buy a T-bone steak — still have some traction. But these days almost half of food stamp recipients are non-Hispanic whites; in Tennessee, home of the Bible-quoting Mr. Fincher, the number is 63 percent. So it’s not all about race.

What is it about, then? Somehow, one of our nation’s two great parties has become infected by an almost pathological meanspiritedness, a contempt for what CNBC’s Rick Santelli, in the famous rant that launched the Tea Party, called “losers.” If you’re an American, and you’re down on your luck, these people don’t want to help; they want to give you an extra kick. I don’t fully understand it, but it’s a terrible thing to behold.

Friday, June 7, 2013

The Spite Club

June 6, 2013

By PAUL KRUGMAN

House Republicans have voted 37 times to repeal ObamaRomneyCare — the Affordable Care Act, which creates a national health insurance system similar to the one Massachusetts has had since 2006. Nonetheless, almost all of the act will go fully into effect at the beginning of next year.

There is, however, one form of obstruction still available to the G.O.P. Last year’s Supreme Court decision upholding the law’s constitutionality also gave states the right to opt out of one piece of the plan, a federally financed expansion of Medicaid. Sure enough, a number of Republican-dominated states seem set to reject Medicaid expansion, at least at first.

And why would they do this? They won’t save money. On the contrary, they will hurt their own budgets and damage their own economies. Nor will Medicaid rejectionism serve any clear political purpose. As I’ll explain later, it will probably hurt Republicans for years to come.

No, the only way to understand the refusal to expand Medicaid is as an act of sheer spite. And the cost of that spite won’t just come in the form of lost dollars; it will also come in the form of gratuitous hardship for some of our most vulnerable citizens.

Some background: Obamacare rests on three pillars. First, insurers must offer the same coverage to everyone regardless of medical history. Second, everyone must purchase coverage — the famous “mandate” — so that the young and healthy don’t opt out until they get older and/or sicker. Third, premiums will be subsidized, so as to make insurance affordable for everyone. And this system is going into effect next year, whether Republicans like it or not.

Under this system, by the way, a few people — basically young, healthy individuals who don’t already get insurance from their employers, and whose incomes are high enough that they won’t benefit from subsidies — will end up paying more for insurance than they do now. Right-wingers are hyping this observation as if it were some kind of shocking surprise, when it was, in fact, well-known to everyone from the beginning of the debate. And, as far as anyone can tell, we’re talking about a small number of people who are, by definition, relatively well off.

Back to the Medicaid expansion. Obamacare, as I’ve just explained, relies on subsidies to make insurance affordable for lower-income Americans. But we already have a program, Medicaid, providing health coverage to very-low-income Americans, at a cost private insurers can’t match. So the Affordable Care Act, sensibly, relies on an expansion of Medicaid rather than the mandate-plus-subsidy arrangement to guarantee care to the poor and near-poor.

But Medicaid is a joint federal-state program, and the Supreme Court made it possible for states to opt out of the expansion. And it appears that a number of states will take advantage of that “opportunity.” What will that mean?

A new study from the RAND Corporation, a nonpartisan research institution, examines the consequences if 14 states whose governors have declared their opposition to Medicaid expansion do, in fact, reject the expansion. The result, the study concluded, would be a huge financial hit: the rejectionist states would lose more than $8 billion a year in federal aid, and would also find themselves on the hook for roughly $1 billion more to cover the losses hospitals incur when treating the uninsured.

Meanwhile, Medicaid rejectionism will deny health coverage to roughly 3.6 million Americans, with essentially all of the victims living near or below the poverty line. And since past experience shows that Medicaid expansion is associated with significant declines in mortality, this would mean a lot of avoidable deaths: about 19,000 a year, the study estimated.

Just think about this for a minute. It’s one thing when politicians refuse to spend money helping the poor and vulnerable; that’s just business as usual. But here we have a case in which politicians are, in effect, spending large sums, in the form of rejected aid, not to help the poor but to hurt them.

And as I said, it doesn’t even make sense as cynical politics. If Obamacare works (which it will), millions of middle-income voters — the kind of people who might support either party in future elections — will see major benefits, even in rejectionist states. So rejectionism won’t discredit health reform. What it might do, however, is drive home to lower-income voters — many of them nonwhite — just how little the G.O.P. cares about their well-being, and reinforce the already strong Democratic advantage among Latinos, in particular.

Rationally, in other words, Republicans should accept defeat on health care, at least for now, and move on. Instead, however, their spitefulness appears to override all other considerations. And millions of Americans will pay the price.

Monday, June 3, 2013

The Geezers Are All Right

June 2, 2013

By PAUL KRUGMAN

Last month the Congressional Budget Office released its much-anticipated projections for debt and deficits, and there were cries of lamentation from the deficit scolds who have had so much influence on our policy discourse. The problem, you see, was that the budget office numbers looked, well, O.K.: deficits are falling fast, and the ratio of debt to gross domestic product is projected to remain roughly stable over the next decade. Obviously it would be nice, eventually, to actually reduce debt. But if you’ve built your career around proclamations of imminent fiscal doom, this definitely wasn’t the report you wanted to see.

Still, we can always count on the baby boomers to deliver disaster, can’t we? Doesn’t the rising tide of retirees mean that Social Security and Medicare are doomed unless we radically change those programs now now now?

Maybe not.

To be fair, the reports of the Social Security and Medicare trustees released Friday do suggest that America’s retirement system needs some significant work. The ratio of Americans over 65 to those of working age will rise inexorably over the decades ahead, and this will translate into rising spending on Social Security and Medicare as a share of national income.

But the numbers aren’t nearly as overwhelming as you might have imagined, given the usual rhetoric. And if you look under the hood, the data suggest that we can, if we choose, maintain social insurance as we know it with only modest adjustments.

Start with Social Security. The retirement program’s trustees do foresee rising spending as the population ages, with total payments rising from 5.1 percent of G.D.P. now to 6.2 percent in 2035, at which point they stabilize. This means, by the way, that all the talk of Social Security going “bankrupt” is nonsense; even if nothing at all is done, the system will be able to pay most of its scheduled benefits as far as the eye can see.

Still, it does look as if there will eventually be a shortfall, and the usual suspects insist that we must move right now to reduce scheduled benefits. But I’ve never understood the logic of this demand. The risk is that we might, at some point in the future, have to cut benefits; to avoid this risk of future benefit cuts, we are supposed to act pre-emptively by ... cutting future benefits. What problem, exactly, are we solving here?

What about Medicare? For years, many people — myself included — have warned that Medicare is a much bigger problem than Social Security, and the latest report from the program’s trustees still shows spending rising from 3.6 percent of G.D.P. now to 5.6 percent in 2035. But that’s a smaller rise than in previous projections. Why?

The answer is that the long-term upward trend in health care costs — a trend that has affected private insurance as well as Medicare — seems to have flattened out significantly over the past few years. Nobody is quite sure why, but there are indications that some of the cost-reducing measures contained in the Affordable Care Act, a k a Obamacare, are actually starting to “bend the curve,” just as they were supposed to. And because there are a number of cost-reducing measures in the law that have not yet kicked in, there’s every reason to believe that this favorable trend will continue.

Furthermore, there’s plenty of room for more savings, if only because recent research confirms that Americans pay far more for health procedures than citizens of other advanced countries pay. That price premium can and should be brought down, and when it is, Medicare’s financial outlook will improve further.

So what are we looking at here? The latest projections show the combined cost of Social Security and Medicare rising by a bit more than 3 percent of G.D.P. between now and 2035, and that number could easily come down with more effort on the health care front. Now, 3 percent of G.D.P. is a big number, but it’s not an economy-crushing number. The United States could, for example, close that gap entirely through tax increases, with no reduction in benefits at all, and still have one of the lowest overall tax rates in the advanced world.
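As a quick back-of-the-envelope check (using only the trustees’ figures quoted above, 2035 values minus current values):

```latex
\underbrace{(6.2 - 5.1)}_{\text{Social Security}}
+ \underbrace{(5.6 - 3.6)}_{\text{Medicare}}
= 1.1 + 2.0
= 3.1 \ \text{percent of G.D.P.}
```

That arithmetic is where the “bit more than 3 percent” comes from.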

But haven’t all the great and the good been telling us that Social Security and Medicare as we know them are unsustainable, that they must be totally revamped — and made much less generous? Why yes, they have; they’ve also been telling us that we must slash spending right away or we’ll face a Greek-style fiscal crisis. They were wrong about that, and they’re wrong about the longer run, too.

The truth is that the long-term outlook for Social Security and Medicare, while not great, actually isn’t all that bad. It’s time to stop obsessing about how we’ll pay benefits to retirees in 2035 and focus instead on how we’re going to provide jobs to unemployed Americans in the here and now.

Friday, April 26, 2013

The 1 Percent’s Solution

April 25, 2013

By PAUL KRUGMAN

Economic debates rarely end with a T.K.O. But the great policy debate of recent years between Keynesians, who advocate sustaining and, indeed, increasing government spending in a depression, and austerians, who demand immediate spending cuts, comes close — at least in the world of ideas. At this point, the austerian position has imploded; not only have its predictions about the real world failed completely, but the academic research invoked to support that position has turned out to be riddled with errors, omissions and dubious statistics.

Yet two big questions remain. First, how did austerity doctrine become so influential in the first place? Second, will policy change at all now that crucial austerian claims have become fodder for late-night comics?

On the first question: the dominance of austerians in influential circles should disturb anyone who likes to believe that policy is based on, or even strongly influenced by, actual evidence. After all, the two main studies providing the alleged intellectual justification for austerity — Alberto Alesina and Silvia Ardagna on “expansionary austerity” and Carmen Reinhart and Kenneth Rogoff on the dangerous debt “threshold” at 90 percent of G.D.P. — faced withering criticism almost as soon as they came out.

And the studies did not hold up under scrutiny. By late 2010, the International Monetary Fund had reworked Alesina-Ardagna with better data and reversed their findings, while many economists raised fundamental questions about Reinhart-Rogoff long before we knew about the famous Excel error. Meanwhile, real-world events — stagnation in Ireland, the original poster child for austerity; falling interest rates in the United States, which was supposed to be facing an imminent fiscal crisis — quickly made nonsense of austerian predictions.

Yet austerity maintained and even strengthened its grip on elite opinion. Why?

Part of the answer surely lies in the widespread desire to see economics as a morality play, to make it a tale of excess and its consequences. We lived beyond our means, the story goes, and now we’re paying the inevitable price. Economists can explain ad nauseam that this is wrong, that the reason we have mass unemployment isn’t that we spent too much in the past but that we’re spending too little now, and that this problem can and should be solved. No matter; many people have a visceral sense that we sinned and must seek redemption through suffering — and neither economic argument nor the observation that the people now suffering aren’t at all the same people who sinned during the bubble years makes much of a dent.

But it’s not just a matter of emotion versus logic. You can’t understand the influence of austerity doctrine without talking about class and inequality.

What, after all, do people want from economic policy? The answer, it turns out, is that it depends on which people you ask — a point documented in a recent research paper by the political scientists Benjamin Page, Larry Bartels and Jason Seawright. The paper compares the policy preferences of ordinary Americans with those of the very wealthy, and the results are eye-opening.

Thus, the average American is somewhat worried about budget deficits, which is no surprise given the constant barrage of deficit scare stories in the news media, but the wealthy, by a large majority, regard deficits as the most important problem we face. And how should the budget deficit be brought down? The wealthy favor cutting federal spending on health care and Social Security — that is, “entitlements” — while the public at large actually wants to see spending on those programs rise.

You get the idea: The austerity agenda looks a lot like a simple expression of upper-class preferences, wrapped in a facade of academic rigor. What the top 1 percent wants becomes what economic science says we must do.

Does a continuing depression actually serve the interests of the wealthy? That’s doubtful, since a booming economy is generally good for almost everyone. What is true, however, is that the years since we turned to austerity have been dismal for workers but not at all bad for the wealthy, who have benefited from surging profits and stock prices even as long-term unemployment festers. The 1 percent may not actually want a weak economy, but they’re doing well enough to indulge their prejudices.

And this makes one wonder how much difference the intellectual collapse of the austerian position will actually make. To the extent that we have policy of the 1 percent, by the 1 percent, for the 1 percent, won’t we just see new justifications for the same old policies?

I hope not; I’d like to believe that ideas and evidence matter, at least a bit. Otherwise, what am I doing with my life? But I guess we’ll see just how much cynicism is justified.

Monday, April 8, 2013

Insurance and Freedom

April 7, 2013

By PAUL KRUGMAN

President Obama will soon release a new budget, and the commentary is already flowing fast and furious. Progressives are angry (with good reason) over proposed cuts to Social Security; conservatives are denouncing the call for more revenues. But it’s all Kabuki. Since House Republicans will block anything Mr. Obama proposes, his budget is best seen not as policy but as positioning, an attempt to gain praise from “centrist” pundits.

No, the real policy action at this point is in the states, where the question is, How many Americans will be denied essential health care in the name of freedom?

I’m referring, of course, to the question of how many Republican governors will reject the Medicaid expansion that is a key part of Obamacare. What does that have to do with freedom? In reality, nothing. But when it comes to politics, it’s a different story.

It goes without saying that Republicans oppose any expansion of programs that help the less fortunate — along with tax cuts for the wealthy, such opposition is pretty much what defines modern conservatism. But they seem to be having more trouble than in the past defending their opposition without simply coming across as big meanies.

Specifically, the time-honored practice of attacking beneficiaries of government programs as undeserving malingerers doesn’t play the way it used to. When Ronald Reagan spoke about welfare queens driving Cadillacs, it resonated with many voters. When Mitt Romney was caught on tape sneering at the 47 percent, not so much.

There is, however, an alternative. From the enthusiastic reception American conservatives gave Friedrich Hayek’s “Road to Serfdom,” to Reagan, to the governors now standing in the way of Medicaid expansion, the U.S. right has sought to portray its position not as a matter of comforting the comfortable while afflicting the afflicted, but as a courageous defense of freedom.

Conservatives love, for example, to quote from a stirring speech Reagan gave in 1961, in which he warned of a grim future unless patriots took a stand. (Liz Cheney used it in a Wall Street Journal op-ed article just a few days ago.) “If you and I don’t do this,” Reagan declared, “then you and I may well spend our sunset years telling our children and our children’s children what it once was like in America when men were free.” What you might not guess from the lofty language is that “this” — the heroic act Reagan was calling on his listeners to perform — was a concerted effort to block the enactment of Medicare.

These days, conservatives make very similar arguments against Obamacare. For example, Senator Ron Johnson of Wisconsin has called it the “greatest assault on freedom in our lifetime.” And this kind of rhetoric matters, because when it comes to the main obstacle now remaining to more or less universal health coverage — the reluctance of Republican governors to allow the Medicaid expansion that is a key part of reform — it’s pretty much all the right has.

As I’ve already suggested, the old trick of blaming the needy for their need doesn’t seem to play the way it used to, and especially not on health care: perhaps because the experience of losing insurance is so common, Medicaid enjoys remarkably strong public support. And now that health reform is the law of the land, the economic and fiscal case for individual states to accept Medicaid expansion is overwhelming. That’s why business interests strongly support expansion just about everywhere — even in Texas. But such practical concerns can be set aside if you can successfully argue that insurance is slavery.

Of course, it isn’t. In fact, it’s hard to think of a proposition that has been more thoroughly refuted by history than the notion that social insurance undermines a free society. Almost 70 years have passed since Friedrich Hayek predicted (or at any rate was understood by his admirers to predict) that Britain’s welfare state would put the nation on the slippery slope to Stalinism; 46 years have passed since Medicare went into effect; as far as most of us can tell, freedom hasn’t died on either side of the Atlantic.

In fact, the real, lived experience of Obamacare is likely to be one of significantly increased individual freedom. For all our talk of being the land of liberty, those holding one of the dwindling number of jobs that carry decent health benefits often feel anything but free, knowing that if they leave or lose their job, for whatever reason, they may not be able to regain the coverage they need. Over time, as people come to realize that affordable coverage is now guaranteed, it will have a powerful liberating effect.

But what we still don’t know is how many Americans will be denied that kind of liberation — a denial all the crueler because it will be imposed in the name of freedom.

Monday, March 18, 2013

Marches of Folly

March 17, 2013

By PAUL KRUGMAN

Ten years ago, America invaded Iraq; somehow, our political class decided that we should respond to a terrorist attack by making war on a regime that, however vile, had nothing to do with that attack.

Some voices warned that we were making a terrible mistake — that the case for war was weak and possibly fraudulent, and that far from yielding the promised easy victory, the venture was all too likely to end in costly grief. And those warnings were, of course, right.

There were, it turned out, no weapons of mass destruction; it was obvious in retrospect that the Bush administration deliberately misled the nation into war. And the war — having cost thousands of American lives and scores of thousands of Iraqi lives, having imposed financial costs vastly higher than the war’s boosters predicted — left America weaker, not stronger, and ended up creating an Iraqi regime that is closer to Tehran than it is to Washington.

So did our political elite and our news media learn from this experience? It sure doesn’t look like it.

The really striking thing, during the run-up to the war, was the illusion of consensus. To this day, pundits who got it wrong excuse themselves on the grounds that “everyone” thought that there was a solid case for war. Of course, they acknowledge, there were war opponents — but they were out of the mainstream.

The trouble with this argument is that it was and is circular: support for the war became part of the definition of what it meant to hold a mainstream opinion. Anyone who dissented, no matter how qualified, was ipso facto labeled as unworthy of consideration. This was true in political circles; it was equally true of much of the press, which effectively took sides and joined the war party.

CNN’s Howard Kurtz, who was at The Washington Post at the time, recently wrote about how this process worked, how skeptical reporting, no matter how solid, was discouraged and rejected. “Pieces questioning the evidence or rationale for war,” he wrote, “were frequently buried, minimized or spiked.”

Closely associated with this taking of sides was an exaggerated and inappropriate reverence for authority. Only people in positions of power were considered worthy of respect. Mr. Kurtz tells us, for example, that The Post killed a piece on war doubts by its own senior defense reporter on the grounds that it relied on retired military officials and outside experts — “in other words, those with sufficient independence to question the rationale for war.”

All in all, it was an object lesson in the dangers of groupthink, a demonstration of how important it is to listen to skeptical voices and separate reporting from advocacy. But as I said, it’s a lesson that doesn’t seem to have been learned. Consider, as evidence, the deficit obsession that has dominated our political scene for the past three years.

Now, I don’t want to push the analogy too far. Bad economic policy isn’t the moral equivalent of a war fought on false pretenses, and while the predictions of deficit scolds have been wrong time and again, there hasn’t been any development either as decisive or as shocking as the complete failure to find weapons of mass destruction. Best of all, these days dissenters don’t operate in the atmosphere of menace, the sense that raising doubts could have devastating personal and career consequences, that was so pervasive in 2002 and 2003. (Remember the hate campaign against the Dixie Chicks?)

But now as then we have the illusion of consensus, an illusion based on a process in which anyone questioning the preferred narrative is immediately marginalized, no matter how strong his or her credentials. And now as then the press often seems to have taken sides. It has been especially striking how often questionable assertions are reported as fact. How many times, for example, have you seen news articles simply asserting that the United States has a “debt crisis,” even though many economists would argue that it faces no such thing?

In fact, in some ways the line between news and opinion has been even more blurred on fiscal issues than it was in the march to war. As The Post’s Ezra Klein noted last month, it seems that “the rules of reportorial neutrality don’t apply when it comes to the deficit.”

What we should have learned from the Iraq debacle was that you should always be skeptical and that you should never rely on supposed authority. If you hear that “everyone” supports a policy, whether it’s a war of choice or fiscal austerity, you should ask whether “everyone” has been defined to exclude anyone expressing a different opinion. And policy arguments should be evaluated on the merits, not by who expresses them; remember when Colin Powell assured us about those Iraqi W.M.D.’s?

Unfortunately, as I said, we don’t seem to have learned those lessons. Will we ever?