Monday, March 18, 2013

Marches of Folly

March 17, 2013

By PAUL KRUGMAN

Ten years ago, America invaded Iraq; somehow, our political class decided that we should respond to a terrorist attack by making war on a regime that, however vile, had nothing to do with that attack.

Some voices warned that we were making a terrible mistake — that the case for war was weak and possibly fraudulent, and that far from yielding the promised easy victory, the venture was all too likely to end in costly grief. And those warnings were, of course, right.

There were, it turned out, no weapons of mass destruction; it was obvious in retrospect that the Bush administration deliberately misled the nation into war. And the war — having cost thousands of American lives and scores of thousands of Iraqi lives, having imposed financial costs vastly higher than the war’s boosters predicted — left America weaker, not stronger, and ended up creating an Iraqi regime that is closer to Tehran than it is to Washington.

So did our political elite and our news media learn from this experience? It sure doesn’t look like it.

The really striking thing, during the run-up to the war, was the illusion of consensus. To this day, pundits who got it wrong excuse themselves on the grounds that “everyone” thought that there was a solid case for war. Of course, they acknowledge, there were war opponents — but they were out of the mainstream.

The trouble with this argument is that it was and is circular: support for the war became part of the definition of what it meant to hold a mainstream opinion. Anyone who dissented, no matter how qualified, was ipso facto labeled as unworthy of consideration. This was true in political circles; it was equally true of much of the press, which effectively took sides and joined the war party.

CNN’s Howard Kurtz, who was at The Washington Post at the time, recently wrote about how this process worked, how skeptical reporting, no matter how solid, was discouraged and rejected. “Pieces questioning the evidence or rationale for war,” he wrote, “were frequently buried, minimized or spiked.”

Closely associated with this taking of sides was an exaggerated and inappropriate reverence for authority. Only people in positions of power were considered worthy of respect. Mr. Kurtz tells us, for example, that The Post killed a piece on war doubts by its own senior defense reporter on the grounds that it relied on retired military officials and outside experts — “in other words, those with sufficient independence to question the rationale for war.”

All in all, it was an object lesson in the dangers of groupthink, a demonstration of how important it is to listen to skeptical voices and separate reporting from advocacy. But as I said, it’s a lesson that doesn’t seem to have been learned. Consider, as evidence, the deficit obsession that has dominated our political scene for the past three years.

Now, I don’t want to push the analogy too far. Bad economic policy isn’t the moral equivalent of a war fought on false pretenses, and while the predictions of deficit scolds have been wrong time and again, there hasn’t been any development either as decisive or as shocking as the complete failure to find weapons of mass destruction. Best of all, these days dissenters don’t operate in the atmosphere of menace, the sense that raising doubts could have devastating personal and career consequences, that was so pervasive in 2002 and 2003. (Remember the hate campaign against the Dixie Chicks?)

But now as then we have the illusion of consensus, an illusion based on a process in which anyone questioning the preferred narrative is immediately marginalized, no matter how strong his or her credentials. And now as then the press often seems to have taken sides. It has been especially striking how often questionable assertions are reported as fact. How many times, for example, have you seen news articles simply asserting that the United States has a “debt crisis,” even though many economists would argue that it faces no such thing?

In fact, in some ways the line between news and opinion has been even more blurred on fiscal issues than it was in the march to war. As The Post’s Ezra Klein noted last month, it seems that “the rules of reportorial neutrality don’t apply when it comes to the deficit.”

What we should have learned from the Iraq debacle was that you should always be skeptical and that you should never rely on supposed authority. If you hear that “everyone” supports a policy, whether it’s a war of choice or fiscal austerity, you should ask whether “everyone” has been defined to exclude anyone expressing a different opinion. And policy arguments should be evaluated on the merits, not by who expresses them; remember when Colin Powell assured us about those Iraqi W.M.D.’s?

Unfortunately, as I said, we don’t seem to have learned those lessons. Will we ever?

Friday, February 15, 2013

Rubio and the Zombies

February 14, 2013

By PAUL KRUGMAN

The State of the Union address was not, I’m sorry to say, very interesting. True, the president offered many good ideas. But we already know that almost none of those ideas will make it past a hostile House of Representatives.

On the other hand, the G.O.P. reply, delivered by Senator Marco Rubio of Florida, was both interesting and revelatory. And I mean that in the worst way. For Mr. Rubio is a rising star, to such an extent that Time magazine put him on its cover, calling him “The Republican Savior.” What we learned Tuesday, however, was that zombie economic ideas have eaten his brain.

In case you’re wondering, a zombie idea is a proposition that has been thoroughly refuted by analysis and evidence, and should be dead — but won’t stay dead because it serves a political purpose, appeals to prejudices, or both. The classic zombie idea in U.S. political discourse is the notion that tax cuts for the wealthy pay for themselves, but there are many more. And, as I said, when it comes to economics it appears that Mr. Rubio’s mind is zombie-infested.

Start with the big question: How did we get into the mess we’re in?

The financial crisis of 2008 and its painful aftermath, which we’re still dealing with, were a huge slap in the face for free-market fundamentalists. Circa 2005, the usual suspects — conservative publications, analysts at right-wing think tanks like the American Enterprise Institute and the Cato Institute, and so on — insisted that deregulated financial markets were doing just fine, and dismissed warnings about a housing bubble as liberal whining. Then the nonexistent bubble burst, and the financial system proved dangerously fragile; only huge government bailouts prevented a total collapse.

Instead of learning from this experience, however, many on the right have chosen to rewrite history. Back then, they thought things were great, and their only complaint was that the government was getting in the way of even more mortgage lending; now they claim that government policies, somehow dictated by liberals even though the G.O.P. controlled both Congress and the White House, were promoting excessive borrowing and causing all the problems.

Every piece of this revisionist history has been refuted in detail. No, the government didn’t force banks to lend to Those People; no, Fannie Mae and Freddie Mac didn’t cause the housing bubble (they were doing relatively little lending during the peak bubble years); no, government-sponsored lenders weren’t responsible for the surge in risky mortgages (private mortgage issuers accounted for the vast majority of the riskiest loans).

But the zombie keeps shambling on — and here’s Mr. Rubio Tuesday night: “This idea — that our problems were caused by a government that was too small — it’s just not true. In fact, a major cause of our recent downturn was a housing crisis created by reckless government policies.” Yep, it’s the full zombie.

What about responding to the crisis? Four years ago, right-wing economic analysts insisted that deficit spending would destroy jobs, because government borrowing would divert funds that would otherwise have gone into business investment, and also insisted that this borrowing would send interest rates soaring. The right thing, they claimed, was to balance the budget, even in a depressed economy.

Now, this argument was obviously fallacious from the beginning. As people like me tried to point out, the whole reason our economy was depressed was that businesses weren’t willing to invest as much as consumers were trying to save. So government borrowing would not, in fact, drive up interest rates — and trying to balance the budget would simply deepen the depression.

Sure enough, interest rates, far from soaring, are at historic lows — and countries that slashed spending have also seen sharp job losses. You rarely get this clear a test of competing economic ideas, and the right’s ideas failed.

But the zombie still shambles on. And here’s Mr. Rubio: “Every dollar our government borrows is money that isn’t being invested to create jobs. And the uncertainty created by the debt is one reason why many businesses aren’t hiring.” Zombies 2, Reality 0.

In fairness to Mr. Rubio, what he’s saying isn’t any different from what everyone else in his party is saying. But that, of course, is what’s so scary.

For here we are, more than five years into the worst economic slump since the Great Depression, and one of our two great political parties has seen its economic doctrine crash and burn twice: first in the run-up to crisis, then again in the aftermath. Yet that party has learned nothing; it apparently believes that all will be well if it just keeps repeating the old slogans, but louder.

It’s a disturbing picture, and one that bodes ill for our nation’s future.

Monday, February 4, 2013

Friends of Fraud

By PAUL KRUGMAN

Like many advocates of financial reform, I was a bit disappointed in the bill that finally emerged. Dodd-Frank gave regulators the power to rein in many financial excesses; but it was and is less clear that future regulators will use that power. As history shows, the financial industry’s wealth and influence can all too easily turn those who are supposed to serve as watchdogs into lap dogs instead.

There was, however, one piece of the reform that was a shining example of how to do it right: the creation of a Consumer Financial Protection Bureau, a stand-alone agency with its own funding, charged with protecting consumers against financial fraud and abuse. And sure enough, Senate Republicans are going all out in an attempt to kill that bureau.

Why is consumer financial protection necessary? Because fraud and abuse happen.

Don’t say that educated and informed consumers can take care of themselves. For one thing, not all consumers are educated and informed. Edward Gramlich, the Federal Reserve official who warned in vain about the dangers of subprime, famously asked, “Why are the most risky loan products sold to the least sophisticated borrowers?” He went on, “The question answers itself — the least sophisticated borrowers are probably duped into taking these products.”

And even well-educated adults can have a hard time understanding the risks and payoffs associated with financial deals — a fact of which shady operators are all too aware. To take an area in which the bureau has already done excellent work, how many of us know what’s actually in our credit-card contracts?

Now, you might be tempted to say that while we need protection against financial fraud, there’s no need to create another bureaucracy. Why not leave it up to the regulators we already have? The answer is that existing regulatory agencies are basically concerned with bolstering the banks; as a practical, cultural matter they will always put consumer protection on the back burner — just as they did when they ignored Mr. Gramlich’s warnings about subprime.

So the consumer protection bureau serves a vital function. But as I said, Senate Republicans are trying to kill it.

How can they do that, when the reform is already law and Democrats hold a Senate majority? Here as elsewhere, they’re turning to extortion — threatening to filibuster the appointment of Richard Cordray, the bureau’s acting head, and thereby leave the bureau unable to function. Mr. Cordray, whose work has drawn praise even from the bankers, is clearly not the issue. Instead, it’s an open attempt to use raw obstructionism to overturn the law.

What Republicans are demanding, basically, is that the protection bureau lose its independence. They want its actions subjected to a veto by other, bank-centered financial regulators, ensuring that consumers will once again be neglected, and they also want to take away its guaranteed funding, opening it to interest-group pressure. These changes would make the agency more or less worthless — but that, of course, is the point.

How can the G.O.P. be so determined to make America safe for financial fraud, with the 2008 crisis still so fresh in our memory? In part it’s because Republicans are deep in denial about what actually happened to our financial system and economy. On the right, it’s now complete orthodoxy that do-gooder liberals, especially former Representative Barney Frank, somehow caused the financial disaster by forcing helpless bankers to lend to Those People.

In reality, this is a nonsense story that has been extensively refuted; I’ve always been struck in particular by the notion that a Congressional Democrat, holding office at a time when Republicans ruled the House with an iron fist, somehow had the mystical power to distort our whole banking system. But it’s a story conservatives much prefer to the awkward reality that their faith in the perfection of free markets was proved false.

And as always, you should follow the money. Historically, the financial sector has given a lot of money to both parties, with only a modest Republican lean. In the last election, however, it went all in for Republicans, giving them more than twice as much as it gave to Democrats (and favoring Mitt Romney over the president almost three to one). All this money wasn’t enough to buy an election — but it was, arguably, enough to buy a major political party.

Right now, all the media focus is on the obvious hot issues — immigration, guns, the sequester, and so on. But let’s try not to let this one fall through the cracks: just four years after runaway bankers brought the world economy to its knees, Senate Republicans are using every means at their disposal, violating all the usual norms of politics in the process, in an attempt to give the bankers a chance to do it all over again.

Monday, November 19, 2012

The Twinkie Manifesto

November 18, 2012

By PAUL KRUGMAN

The Twinkie, it turns out, was introduced way back in 1930. In our memories, however, the iconic snack will forever be identified with the 1950s, when Hostess popularized the brand by sponsoring “The Howdy Doody Show.” And the demise of Hostess has unleashed a wave of baby boomer nostalgia for a seemingly more innocent time.

Needless to say, it wasn’t really innocent. But the ’50s — the Twinkie Era — do offer lessons that remain relevant in the 21st century. Above all, the success of the postwar American economy demonstrates that, contrary to today’s conservative orthodoxy, you can have prosperity without demeaning workers and coddling the rich.

Consider the question of tax rates on the wealthy. The modern American right, and much of the alleged center, is obsessed with the notion that low tax rates at the top are essential to growth. Remember that Erskine Bowles and Alan Simpson, charged with producing a plan to curb deficits, nonetheless somehow ended up listing “lower tax rates” as a “guiding principle.”

Yet in the 1950s incomes in the top bracket faced a marginal tax rate of 91, that’s right, 91 percent, while taxes on corporate profits were twice as large, relative to national income, as in recent years. The best estimates suggest that circa 1960 the top 0.01 percent of Americans paid an effective federal tax rate of more than 70 percent, twice what they pay today.
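
To see how an effective rate above 70 percent squares with a 91 percent top marginal rate, here is a minimal back-of-the-envelope sketch; the income and brackets are hypothetical, chosen only to illustrate the arithmetic, not the actual 1960 tax schedule:

$$
% Hypothetical: \$1{,}000{,}000 of income, the first \$400{,}000 taxed at an average of 40\%, the rest at the 91\% top rate.
\text{tax} = 0.40 \times 400{,}000 + 0.91 \times 600{,}000 = 706{,}000,
\qquad
\text{effective rate} = \frac{706{,}000}{1{,}000{,}000} \approx 71\%.
$$

Only income above the top threshold faces the 91 percent rate, which is why the effective rate sits well below the marginal one.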

Nor were high taxes the only burden wealthy businessmen had to bear. They also faced a labor force with a degree of bargaining power hard to imagine today. In 1955 roughly a third of American workers were union members. In the biggest companies, management and labor bargained as equals, so much so that it was common to talk about corporations serving an array of “stakeholders” as opposed to merely serving stockholders.

Squeezed between high taxes and empowered workers, executives were relatively impoverished by the standards of either earlier or later generations. In 1955 Fortune magazine published an essay, “How top executives live,” which emphasized how modest their lifestyles had become compared with days of yore. The vast mansions, armies of servants, and huge yachts of the 1920s were no more; by 1955 the typical executive, Fortune claimed, lived in a smallish suburban house, relied on part-time help and skippered his own relatively small boat.

The data confirm Fortune’s impressions. Between the 1920s and the 1950s real incomes for the richest Americans fell sharply, not just compared with the middle class but in absolute terms. According to estimates by the economists Thomas Piketty and Emmanuel Saez, in 1955 the real incomes of the top 0.01 percent of Americans were less than half what they had been in the late 1920s, and their share of total income was down by three-quarters.

Today, of course, the mansions, armies of servants and yachts are back, bigger than ever — and any hint of policies that might crimp plutocrats’ style is met with cries of “socialism.” Indeed, the whole Romney campaign was based on the premise that President Obama’s threat to modestly raise taxes on top incomes, plus his temerity in suggesting that some bankers had behaved badly, were crippling the economy. Surely, then, the far less plutocrat-friendly environment of the 1950s must have been an economic disaster, right?

Actually, some people thought so at the time. Paul Ryan and many other modern conservatives are devotees of Ayn Rand. Well, the collapsing, moocher-infested nation she portrayed in “Atlas Shrugged,” published in 1957, was basically Dwight Eisenhower’s America.

Strange to say, however, the oppressed executives Fortune portrayed in 1955 didn’t go Galt and deprive the nation of their talents. On the contrary, if Fortune is to be believed, they were working harder than ever. And the high-tax, strong-union decades after World War II were in fact marked by spectacular, widely shared economic growth: nothing before or since has matched the doubling of median family income between 1947 and 1973.
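
As a rough check on what that doubling implies, assuming steady compound growth over the 26 years from 1947 to 1973:

$$
% 1947 to 1973 is 26 years; solve (1+g)^{26} = 2 for the annual real growth rate g.
(1+g)^{26} = 2 \quad\Longrightarrow\quad g = 2^{1/26} - 1 \approx 0.027,
$$

that is, real median family income rising at roughly 2.7 percent a year, every year, for a generation.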

Which brings us back to the nostalgia thing.

There are, let’s face it, some people in our political life who pine for the days when minorities and women knew their place, gays stayed firmly in the closet and congressmen asked, “Are you now or have you ever been?” The rest of us, however, are very glad those days are gone. We are, morally, a much better nation than we were. Oh, and the food has improved a lot, too.

Along the way, however, we’ve forgotten something important — namely, that economic justice and economic growth aren’t incompatible. America in the 1950s made the rich pay their fair share; it gave workers the power to bargain for decent wages and benefits; yet contrary to right-wing propaganda then and now, it prospered. And we can do that again.

Friday, November 16, 2012

Life, Death and Deficits

November 15, 2012

By PAUL KRUGMAN

America’s political landscape is infested with many zombie ideas — beliefs about policy that have been repeatedly refuted with evidence and analysis but refuse to die. The most prominent zombie is the insistence that low taxes on rich people are the key to prosperity. But there are others.

And right now the most dangerous zombie is probably the claim that rising life expectancy justifies a rise in both the Social Security retirement age and the age of eligibility for Medicare. Even some Democrats — including, according to reports, the president — have seemed susceptible to this argument. But it’s a cruel, foolish idea — cruel in the case of Social Security, foolish in the case of Medicare — and we shouldn’t let it eat our brains.

First of all, you need to understand that while life expectancy at birth has gone up a lot, that’s not relevant to this issue; what matters is life expectancy for those at or near retirement age. When, to take one example, Alan Simpson — the co-chairman of President Obama’s deficit commission — declared that Social Security was “never intended as a retirement program” because life expectancy when it was founded was only 63, he was displaying his ignorance. Even in 1940, Americans who made it to age 65 generally had many years left.

Now, life expectancy at age 65 has risen, too. But the rise has been very uneven since the 1970s, with only the relatively affluent and well-educated seeing large gains. Bear in mind, too, that the full retirement age has already gone up to 66 and is scheduled to rise to 67 under current law.

This means that any further rise in the retirement age would be a harsh blow to Americans in the bottom half of the income distribution, who aren’t living much longer, and who, in many cases, have jobs requiring physical effort that’s difficult even for healthy seniors. And these are precisely the people who depend most on Social Security.

So any rise in the Social Security retirement age would, as I said, be cruel, hurting the most vulnerable Americans. And this cruelty would be gratuitous: While the United States does have a long-run budget problem, Social Security is not a major factor in that problem.

Medicare, on the other hand, is a big budget problem. But raising the eligibility age, which means forcing seniors to seek private insurance, is no way to deal with that problem.

It’s true that thanks to Obamacare, seniors should actually be able to get insurance even without Medicare. (Although what happens if a number of states block the expansion of Medicaid, which is a crucial piece of the program?) But let’s be clear: Government insurance via Medicare is better and more cost-effective than private insurance.

You might ask why, in that case, health reform didn’t just extend Medicare to everyone, as opposed to setting up a system that continues to rely on private insurers. The answer, of course, is political realism. Given the power of the insurance industry, the Obama administration had to keep that industry in the loop. But the fact that Medicare for all may have been politically out of reach is no reason to push millions of Americans out of a good system into a worse one.

What would happen if we raised the Medicare eligibility age? The federal government would save only a small amount of money, because younger seniors are relatively healthy and hence low-cost. Meanwhile, however, those seniors would face sharply higher out-of-pocket costs. How could this trade-off be considered good policy?

The bottom line is that raising the age of eligibility for either Social Security benefits or Medicare would be destructive, making Americans’ lives worse without contributing in any significant way to deficit reduction. Democrats, in particular, who even consider either alternative need to ask themselves what on earth they think they’re doing.

But what, ask the deficit scolds, do people like me propose doing about rising spending? The answer is to do what every other advanced country does, and make a serious effort to rein in health care costs. Give Medicare the ability to bargain over drug prices. Let the Independent Payment Advisory Board, created as part of Obamacare to help Medicare control costs, do its job instead of crying “death panels.” (And isn’t it odd that the same people who demagogue attempts to help Medicare save money are eager to throw millions of people out of the program altogether?) We know that we have a health care system with skewed incentives and bloated costs, so why don’t we try to fix it?

What we know for sure is that there is no good case for denying older Americans access to the programs they count on. This should be a red line in any budget negotiations, and we can only hope that Mr. Obama doesn’t betray his supporters by crossing it.

Friday, November 9, 2012

Let’s Not Make a Deal

November 8, 2012

By PAUL KRUGMAN

To say the obvious: Democrats won an amazing victory. Not only did they hold the White House despite a still-troubled economy, in a year when their Senate majority was supposed to be doomed, they actually added seats.

Nor was that all: They scored major gains in the states. Most notably, California — long a poster child for the political dysfunction that comes when nothing can get done without a legislative supermajority — not only voted for much-needed tax increases, but elected, you guessed it, a Democratic supermajority.

But one goal eluded the victors. Even though preliminary estimates suggest that Democrats received somewhat more votes than Republicans in Congressional elections, the G.O.P. retains solid control of the House thanks to extreme gerrymandering by courts and Republican-controlled state governments. And Representative John Boehner, the speaker of the House, wasted no time in declaring that his party remains as intransigent as ever, utterly opposed to any rise in tax rates even as it whines about the size of the deficit.

So President Obama has to make a decision, almost immediately, about how to deal with continuing Republican obstruction. How far should he go in accommodating the G.O.P.’s demands?

My answer is, not far at all. Mr. Obama should hang tough, declaring himself willing, if necessary, to hold his ground even at the cost of letting his opponents inflict damage on a still-shaky economy. And this is definitely no time to negotiate a “grand bargain” on the budget that snatches defeat from the jaws of victory.

In saying this, I don’t mean to minimize the very real economic dangers posed by the so-called fiscal cliff that is looming at the end of this year if the two parties can’t reach a deal. Both the Bush-era tax cuts and the Obama administration’s payroll tax cut are set to expire, even as automatic spending cuts in defense and elsewhere kick in thanks to the deal struck after the 2011 confrontation over the debt ceiling. And the looming combination of tax increases and spending cuts looks easily large enough to push America back into recession.

Nobody wants to see that happen. Yet it may happen all the same, and Mr. Obama has to be willing to let it happen if necessary.

Why? Because Republicans are trying, for the third time since he took office, to use economic blackmail to achieve a goal they lack the votes to achieve through the normal legislative process. In particular, they want to extend the Bush tax cuts for the wealthy, even though the nation can’t afford to make those tax cuts permanent and the public believes that taxes on the rich should go up — and they’re threatening to block any deal on anything else unless they get their way. So they are, in effect, threatening to tank the economy unless their demands are met.

Mr. Obama essentially surrendered in the face of similar tactics at the end of 2010, extending low taxes on the rich for two more years. He made significant concessions again in 2011, when Republicans threatened to create financial chaos by refusing to raise the debt ceiling. And the current potential crisis is the legacy of those past concessions.

Well, this has to stop — unless we want hostage-taking, the threat of making the nation ungovernable, to become a standard part of our political process.

So what should he do? Just say no, and go over the cliff if necessary.

It’s worth pointing out that the fiscal cliff isn’t really a cliff. It’s not like the debt-ceiling confrontation, where terrible things might well have happened right away if the deadline had been missed. This time, nothing very bad will happen to the economy if agreement isn’t reached until a few weeks or even a few months into 2013. So there’s time to bargain.

More important, however, is the point that a stalemate would hurt Republican backers, corporate donors in particular, every bit as much as it hurt the rest of the country. As the risk of severe economic damage grew, Republicans would face intense pressure to cut a deal after all.

Meanwhile, the president is in a far stronger position than in previous confrontations. I don’t place much stock in talk of “mandates,” but Mr. Obama did win re-election with a populist campaign, so he can plausibly claim that Republicans are defying the will of the American people. And he just won his big election and is, therefore, far better placed than before to weather any political blowback from economic troubles — especially when it would be so obvious that these troubles were being deliberately inflicted by the G.O.P. in a last-ditch attempt to defend the privileges of the 1 percent.

Most of all, standing up to hostage-taking is the right thing to do for the health of America’s political system.

So stand your ground, Mr. President, and don’t give in to threats. No deal is better than a bad deal.

Friday, October 12, 2012

This Must Be Heaven

Book review by Sam Harris, neuroscientist and author

Once upon a time, a neurosurgeon named Eben Alexander contracted a bad case of bacterial meningitis and fell into a coma. While immobile in his hospital bed, he experienced visions of such intense beauty that they changed everything—not just for him, but for all of us, and for science as a whole. According to Newsweek, Alexander’s experience proves that consciousness is independent of the brain, that death is an illusion, and that an eternity of perfect splendor awaits us beyond the grave—complete with the usual angels, clouds, and departed relatives, but also butterflies and beautiful girls in peasant dress. Our current understanding of the mind “now lies broken at our feet”—for, as the doctor writes, “What happened to me destroyed it, and I intend to spend the rest of my life investigating the true nature of consciousness and making the fact that we are more, much more, than our physical brains as clear as I can, both to my fellow scientists and to people at large.”

Well, I intend to spend the rest of the morning sparing him the effort. Whether you read it online or hold the physical object in your hands, this issue of Newsweek is best viewed as an archaeological artifact that is certain to embarrass us in the eyes of future generations. Its existence surely says more about our time than the editors at the magazine meant to say—for the cover alone reveals the abasement and desperation of our journalism, the intellectual bankruptcy and resultant tenacity of faith-based religion, and our ubiquitous confusion about the nature of scientific authority. The article is the modern equivalent of a 14th-century woodcut depicting the work of alchemists, inquisitors, Crusaders, and fortune-tellers. I hope our descendants understand that at least some of us were blushing.

As many of you know, I am interested in “spiritual” experiences of the sort Alexander reports. Unlike many atheists, I don’t doubt the subjective phenomena themselves—that is, I don’t believe that everyone who claims to have seen an angel, or left his body in a trance, or become one with the universe, is lying or mentally ill. Indeed, I have had similar experiences myself in meditation, in lucid dreams (even while meditating in a lucid dream), and through the use of various psychedelics (in times gone by). I know that astonishing changes in the contents of consciousness are possible and can be psychologically transformative.

And, unlike many neuroscientists and philosophers, I remain agnostic on the question of how consciousness is related to the physical world. There are, of course, very good reasons to believe that it is an emergent property of brain activity, just as the rest of the human mind obviously is. But we know nothing about how such a miracle of emergence might occur. And if consciousness were, in fact, irreducible—or even separable from the brain in a way that would give comfort to Saint Augustine—my worldview would not be overturned. I know that we do not understand consciousness, and nothing that I think I know about the cosmos, or about the patent falsity of most religious beliefs, requires that I deny this. So, although I am an atheist who can be expected to be unforgiving of religious dogma, I am not reflexively hostile to claims of the sort Alexander has made. In principle, my mind is open. (It really is.)

But Alexander’s account is so bad—his reasoning so lazy and tendentious—that it would be beneath notice if not for the fact that it currently disgraces the cover of a major newsmagazine. Alexander is also releasing a book at the end of the month, Proof of Heaven: A Neurosurgeon’s Journey into the Afterlife, which seems destined to become an instant bestseller. As much as I would like to simply ignore the unfolding travesty, it would be derelict of me to do so.

But first things first: You really must read Alexander’s article.

I trust that doing so has given you cause to worry that the good doctor is just another casualty of American-style Christianity—for though he claims to have been a nonbeliever before his adventures in coma, he presents the following self-portrait:

Although I considered myself a faithful Christian, I was so more in name than in actual belief. I didn’t begrudge those who wanted to believe that Jesus was more than simply a good man who had suffered at the hands of the world. I sympathized deeply with those who wanted to believe that there was a God somewhere out there who loved us unconditionally. In fact, I envied such people the security that those beliefs no doubt provided. But as a scientist, I simply knew better than to believe them myself.

What it means to be a “faithful Christian” without “actual belief” is not spelled out, but few nonbelievers will be surprised when our hero’s scientific skepticism proves no match for his religious conditioning. Most of us have been around this block often enough to know that many “former atheists”—like Francis Collins—spent so long on the brink of faith, and yearned for its emotional consolations with such vampiric intensity, that the slightest breeze would send them spinning into the abyss. For Collins, you may recall, all it took to establish the divinity of Jesus and the coming resurrection of the dead was the sight of a frozen waterfall. Alexander seems to have required a ride on a psychedelic butterfly. In either case, it’s not the perception of beauty we should begrudge but the utter absence of intellectual seriousness with which the author interprets it.

Everything—absolutely everything—in Alexander’s account rests on repeated assertions that his visions of heaven occurred while his cerebral cortex was “shut down,” “inactivated,” “completely shut down,” “totally offline,” and “stunned to complete inactivity.” The evidence he provides for this claim is not only inadequate—it suggests that he doesn’t know anything about the relevant brain science. Perhaps he has saved a more persuasive account for his book—though now that I’ve listened to an hour-long interview with him online, I very much doubt it. In his Newsweek article, Alexander asserts that the cessation of cortical activity was “clear from the severity and duration of my meningitis, and from the global cortical involvement documented by CT scans and neurological examinations.” To his editors, this presumably sounded like neuroscience.

The problem, however, is that “CT scans and neurological examinations” can’t determine neuronal inactivity—in the cortex or anywhere else. And Alexander makes no reference to functional data that might have been acquired by fMRI, PET, or EEG—nor does he seem to realize that only this sort of evidence could support his case. Obviously, the man’s cortex is functioning now—he has, after all, written a book—so whatever structural damage appeared on CT could not have been “global.” (Otherwise, he would be claiming that his entire cortex was destroyed and then grew back.) Coma is not associated with the complete cessation of cortical activity, in any case. And to my knowledge, almost no one thinks that consciousness is purely a matter of cortical activity. Alexander’s unwarranted assumptions are proliferating rather quickly. Why doesn’t he know these things? He is, after all, a neurosurgeon who survived a coma and now claims to be upending the scientific worldview on the basis of the fact that his cortex was totally quiescent at the precise moment he was enjoying the best day of his life in the company of angels. Even if his entire cortex had truly shut down (again, an incredible claim), how can he know that his visions didn’t occur in the minutes and hours during which its functions returned?

I confess that I found Alexander’s account so alarmingly unscientific that I began to worry that something had gone wrong with my own brain. So I sought the opinion of Mark Cohen, a pioneer in the field of neuroimaging who holds appointments in the Departments of Psychiatry & Biobehavioral Science, Neurology, Psychology, Radiological Science, and Bioengineering at UCLA. (He was also my thesis advisor.) Here is part of what he had to say:

This poetic interpretation of his experience is not supported by evidence of any kind. As you correctly point out, coma does not equate to “inactivation of the cerebral cortex” or “higher-order brain functions totally offline” or “neurons of [my] cortex stunned into complete inactivity”. These describe brain death, a one hundred percent lethal condition. There are many excellent scholarly articles that discuss the definitions of coma. (For example: 1 & 2)

We are not privy to his EEG records, but high alpha activity is common in coma. Also common is “flat” EEG. The EEG can appear flat even in the presence of high activity, when that activity is not synchronous. For example, the EEG flattens in regions involved in direct task processing. This phenomenon is known as event-related desynchronization (hundreds of references).

As is obvious to you, this is truth by authority. Neurosurgeons, however, are rarely well-trained in brain function. Dr. Alexander cuts brains; he does not appear to study them. “There is no scientific explanation for the fact that while my body lay in coma, my mind—my conscious, inner self—was alive and well. While the neurons of my cortex were stunned to complete inactivity by the bacteria that had attacked them, my brain-free consciousness ...” True, science cannot explain brain-free consciousness. Of course, science cannot explain consciousness anyway. In this case, however, it would be parsimonious to reject the whole idea of consciousness in the absence of brain activity. Either his brain was active when he had these dreams, or they are a confabulation of whatever took place in his state of minimally conscious coma.

There are many reports of people remembering dream-like states while in medical coma. They lack consistency, of course, but there is nothing particularly unique in Dr. Alexander’s unfortunate episode.

Okay, so it appears that my own cortex hasn’t completely shut down. In fact, there are further problems with Alexander’s account. Not only does he appear ignorant of the relevant science, but he doesn’t realize how many people have experienced visions similar to his while their brains were operational. In his online interview we learn about the kinds of conversations he’s now having with skeptics:

I guess one could always argue, “Well, your brain was probably just barely able to ignite real consciousness and then it would flip back into a very diseased state,” which doesn’t make any sense to me. Especially because that hyper-real state is so indescribable and so crisp. It’s totally unlike any drug experience. A lot of people have come up to me and said, “Oh that sounds like a DMT experience,” or “That sounds like ketamine.” Not at all. That is not even in the right ballpark.

Those things do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding and of lessons taught by deceased loved ones and spiritual beings.

“Not in the right ballpark”? His experience sounds so much like a DMT trip that we are not only in the right ballpark, we are talking about the stitching on the same ball. Here is Alexander’s description of the afterlife:

I was a speck on a beautiful butterfly wing; millions of other butterflies around us. We were flying through blooming flowers, blossoms on trees, and they were all coming out as we flew through them… [there were] waterfalls, pools of water, indescribable colors, and above there were these arcs of silver and gold light and beautiful hymns coming down from them. Indescribably gorgeous hymns. I later came to call them “angels,” those arcs of light in the sky. I think that word is probably fairly accurate….

Then we went out of this universe. I remember just seeing everything receding and initially I felt as if my awareness was in an infinite black void. It was very comforting but I could feel the extent of the infinity and that it was, as you would expect, impossible to put into words. I was there with that Divine presence that was not anything that I could visibly see and describe, and with a brilliant orb of light….

They said there were many things that they would show me, and they continued to do that. In fact, the whole higher-dimensional multiverse was this incredibly complex corrugated ball and all these lessons coming into me about it. Part of the lessons involved becoming all of what I was being shown. It was indescribable.

But then I would find myself—and time out there I can say is totally different from what we call time. There was access from out there to any part of our space/time and that made it difficult to understand a lot of these memories because we always try to sequence things and put them in linear form and description. That just really doesn’t work.

Everything that Alexander describes here and in his Newsweek article, including the parts I have left out, has been reported by DMT users. The similarity is uncanny. Here is how the late Terence McKenna described the prototypical DMT trance:

Under the influence of DMT, the world becomes an Arabian labyrinth, a palace, a more than possible Martian jewel, vast with motifs that flood the gaping mind with complex and wordless awe. Color and the sense of a reality-unlocking secret nearby pervade the experience. There is a sense of other times, and of one’s own infancy, and of wonder, wonder and more wonder. It is an audience with the alien nuncio. In the midst of this experience, apparently at the end of human history, guarding gates that seem surely to open on the howling maelstrom of the unspeakable emptiness between the stars, is the Aeon.

The Aeon, as Heraclitus presciently observed, is a child at play with colored balls. Many diminutive beings are present there—the tykes, the self-transforming machine elves of hyperspace. Are they the children destined to be father to the man? One has the impression of entering into an ecology of souls that lies beyond the portals of what we naively call death. I do not know. Are they the synesthetic embodiment of ourselves as the Other, or of the Other as ourselves? Are they the elves lost to us since the fading of the magic light of childhood? Here is a tremendum barely to be told, an epiphany beyond our wildest dreams. Here is the realm of that which is stranger than we can suppose. Here is the mystery, alive, unscathed, still as new for us as when our ancestors lived it fifteen thousand summers ago. The tryptamine entities offer the gift of new language, they sing in pearly voices that rain down as colored petals and flow through the air like hot metal to become toys and such gifts as gods would give their children. The sense of emotional connection is terrifying and intense. The Mysteries revealed are real and if ever fully told will leave no stone upon another in the small world we have gone so ill in.

This is not the mercurial world of the UFO, to be invoked from lonely hilltops; this is not the siren song of lost Atlantis wailing through the trailer courts of crack-crazed America. DMT is not one of our irrational illusions. I believe that what we experience in the presence of DMT is real news. It is a nearby dimension—frightening, transformative, and beyond our powers to imagine, and yet to be explored in the usual way. We must send fearless experts, whatever that may come to mean, to explore and to report on what they find.  (Terence McKenna, Food of the Gods, pp. 258-259.)

Alexander believes that his E. coli-addled brain could not have produced his visions because they were too “intense,” too “hyper-real,” too “beautiful,” too “interactive,” and too drenched in significance for even a healthy brain to conjure. He also appears to think that despite their timeless quality, his visions could not have arisen in the minutes or hours during which his cortex (which surely never went off) switched back on. He clearly knows nothing about what people with working brains experience under the influence of psychedelics. Nor does he know that visions of the sort that McKenna describes, although they may seem to last for ages, require only a brief span of biological time. Unlike LSD and other long-acting psychedelics, DMT alters consciousness for merely a few minutes. Alexander would have had more than enough time to experience a visionary ecstasy as he was coming out of his coma (whether his cortex was rebooting or not).

Does Alexander know that DMT already exists in the brain as a neurotransmitter? Did his brain experience a surge of DMT release during his coma? This is pure speculation, of course, but it is a far more credible hypothesis than that his cortex “shut down,” freeing his soul to travel to another dimension. As one of his correspondents has already informed him, similar experiences can be had with ketamine, which is a surgical anesthetic that is occasionally used to protect a traumatized brain. Did Alexander by any chance receive ketamine while in the hospital? Would he even think it relevant if he had? His assertion that psychedelic compounds like DMT and ketamine “do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding” he experienced is perhaps the most amazing thing he has said since he returned from heaven. Such compounds are universally understood to do the job. And most scientists believe that the reliable effects of psychedelics indicate that the brain is at the very least involved in the production of visionary states of the sort Alexander is talking about.

Again, there is nothing to be said against Alexander’s experience. It sounds perfectly sublime. And such ecstasies do tell us something about how good a human mind can feel. The problem is that the conclusions Alexander has drawn from his experience—he continually reminds us, as a scientist—are based on some very obvious errors in reasoning and gaps in his understanding.

Let me suggest that, whether or not heaven exists, Alexander sounds precisely how a scientist should not sound when he doesn’t know what he is talking about. And his article is not the sort of thing that the editors of a once-important magazine should publish if they hope to reclaim some measure of respect for their battered brand.