Monday, November 19, 2012

The Twinkie Manifesto

November 18, 2012

By PAUL KRUGMAN

The Twinkie, it turns out, was introduced way back in 1930. In our memories, however, the iconic snack will forever be identified with the 1950s, when Hostess popularized the brand by sponsoring “The Howdy Doody Show.” And the demise of Hostess has unleashed a wave of baby boomer nostalgia for a seemingly more innocent time.

Needless to say, it wasn’t really innocent. But the ’50s — the Twinkie Era — do offer lessons that remain relevant in the 21st century. Above all, the success of the postwar American economy demonstrates that, contrary to today’s conservative orthodoxy, you can have prosperity without demeaning workers and coddling the rich.

Consider the question of tax rates on the wealthy. The modern American right, and much of the alleged center, is obsessed with the notion that low tax rates at the top are essential to growth. Remember that Erskine Bowles and Alan Simpson, charged with producing a plan to curb deficits, nonetheless somehow ended up listing “lower tax rates” as a “guiding principle.”

Yet in the 1950s incomes in the top bracket faced a marginal tax rate of 91, that’s right, 91 percent, while taxes on corporate profits were twice as large, relative to national income, as in recent years. The best estimates suggest that circa 1960 the top 0.01 percent of Americans paid an effective federal tax rate of more than 70 percent, twice what they pay today.

Nor were high taxes the only burden wealthy businessmen had to bear. They also faced a labor force with a degree of bargaining power hard to imagine today. In 1955 roughly a third of American workers were union members. In the biggest companies, management and labor bargained as equals, so much so that it was common to talk about corporations serving an array of “stakeholders” as opposed to merely serving stockholders.

Squeezed between high taxes and empowered workers, executives were relatively impoverished by the standards of either earlier or later generations. In 1955 Fortune magazine published an essay, “How top executives live,” which emphasized how modest their lifestyles had become compared with days of yore. The vast mansions, armies of servants, and huge yachts of the 1920s were no more; by 1955 the typical executive, Fortune claimed, lived in a smallish suburban house, relied on part-time help and skippered his own relatively small boat.

The data confirm Fortune’s impressions. Between the 1920s and the 1950s real incomes for the richest Americans fell sharply, not just compared with the middle class but in absolute terms. According to estimates by the economists Thomas Piketty and Emmanuel Saez, in 1955 the real incomes of the top 0.01 percent of Americans were less than half what they had been in the late 1920s, and their share of total income was down by three-quarters.

Today, of course, the mansions, armies of servants and yachts are back, bigger than ever — and any hint of policies that might crimp plutocrats’ style is met with cries of “socialism.” Indeed, the whole Romney campaign was based on the premise that President Obama’s threat to modestly raise taxes on top incomes, plus his temerity in suggesting that some bankers had behaved badly, were crippling the economy. Surely, then, the far less plutocrat-friendly environment of the 1950s must have been an economic disaster, right?

Actually, some people thought so at the time. Paul Ryan and many other modern conservatives are devotees of Ayn Rand. Well, the collapsing, moocher-infested nation she portrayed in “Atlas Shrugged,” published in 1957, was basically Dwight Eisenhower’s America.

Strange to say, however, the oppressed executives Fortune portrayed in 1955 didn’t go Galt and deprive the nation of their talents. On the contrary, if Fortune is to be believed, they were working harder than ever. And the high-tax, strong-union decades after World War II were in fact marked by spectacular, widely shared economic growth: nothing before or since has matched the doubling of median family income between 1947 and 1973.

Which brings us back to the nostalgia thing.

There are, let’s face it, some people in our political life who pine for the days when minorities and women knew their place, gays stayed firmly in the closet and congressmen asked, “Are you now or have you ever been?” The rest of us, however, are very glad those days are gone. We are, morally, a much better nation than we were. Oh, and the food has improved a lot, too.

Along the way, however, we’ve forgotten something important — namely, that economic justice and economic growth aren’t incompatible. America in the 1950s made the rich pay their fair share; it gave workers the power to bargain for decent wages and benefits; yet contrary to right-wing propaganda then and now, it prospered. And we can do that again.

Friday, November 16, 2012

Life, Death and Deficits

November 15, 2012

By PAUL KRUGMAN

America’s political landscape is infested with many zombie ideas — beliefs about policy that have been repeatedly refuted with evidence and analysis but refuse to die. The most prominent zombie is the insistence that low taxes on rich people are the key to prosperity. But there are others.

And right now the most dangerous zombie is probably the claim that rising life expectancy justifies a rise in both the Social Security retirement age and the age of eligibility for Medicare. Even some Democrats — including, according to reports, the president — have seemed susceptible to this argument. But it’s a cruel, foolish idea — cruel in the case of Social Security, foolish in the case of Medicare — and we shouldn’t let it eat our brains.

First of all, you need to understand that while life expectancy at birth has gone up a lot, that’s not relevant to this issue; what matters is life expectancy for those at or near retirement age. When, to take one example, Alan Simpson — the co-chairman of President Obama’s deficit commission — declared that Social Security was “never intended as a retirement program” because life expectancy when it was founded was only 63, he was displaying his ignorance. Even in 1940, Americans who made it to age 65 generally had many years left.

Now, life expectancy at age 65 has risen, too. But the rise has been very uneven since the 1970s, with only the relatively affluent and well-educated seeing large gains. Bear in mind, too, that the full retirement age has already gone up to 66 and is scheduled to rise to 67 under current law.

This means that any further rise in the retirement age would be a harsh blow to Americans in the bottom half of the income distribution, who aren’t living much longer, and who, in many cases, have jobs requiring physical effort that’s difficult even for healthy seniors. And these are precisely the people who depend most on Social Security.

So any rise in the Social Security retirement age would, as I said, be cruel, hurting the most vulnerable Americans. And this cruelty would be gratuitous: While the United States does have a long-run budget problem, Social Security is not a major factor in that problem.

Medicare, on the other hand, is a big budget problem. But raising the eligibility age, which means forcing seniors to seek private insurance, is no way to deal with that problem.

It’s true that thanks to Obamacare, seniors should actually be able to get insurance even without Medicare. (Although, what happens if a number of states block the expansion of Medicaid that’s a crucial piece of the program?) But let’s be clear: Government insurance via Medicare is better and more cost-effective than private insurance.

You might ask why, in that case, health reform didn’t just extend Medicare to everyone, as opposed to setting up a system that continues to rely on private insurers. The answer, of course, is political realism. Given the power of the insurance industry, the Obama administration had to keep that industry in the loop. But the fact that Medicare for all may have been politically out of reach is no reason to push millions of Americans out of a good system into a worse one.

What would happen if we raised the Medicare eligibility age? The federal government would save only a small amount of money, because younger seniors are relatively healthy and hence low-cost. Meanwhile, however, those seniors would face sharply higher out-of-pocket costs. How could this trade-off be considered good policy?

The bottom line is that raising the age of eligibility for either Social Security benefits or Medicare would be destructive, making Americans’ lives worse without contributing in any significant way to deficit reduction. Democrats, in particular, who even consider either alternative need to ask themselves what on earth they think they’re doing.

But what, ask the deficit scolds, do people like me propose doing about rising spending? The answer is to do what every other advanced country does, and make a serious effort to rein in health care costs. Give Medicare the ability to bargain over drug prices. Let the Independent Payment Advisory Board, created as part of Obamacare to help Medicare control costs, do its job instead of crying “death panels.” (And isn’t it odd that the same people who demagogue attempts to help Medicare save money are eager to throw millions of people out of the program altogether?) We know that we have a health care system with skewed incentives and bloated costs, so why don’t we try to fix it?

What we know for sure is that there is no good case for denying older Americans access to the programs they count on. This should be a red line in any budget negotiations, and we can only hope that Mr. Obama doesn’t betray his supporters by crossing it.

Friday, November 9, 2012

Let’s Not Make a Deal

November 8, 2012

By PAUL KRUGMAN

To say the obvious: Democrats won an amazing victory. Not only did they hold the White House despite a still-troubled economy, in a year when their Senate majority was supposed to be doomed, they actually added seats.

Nor was that all: They scored major gains in the states. Most notably, California — long a poster child for the political dysfunction that comes when nothing can get done without a legislative supermajority — not only voted for much-needed tax increases, but elected, you guessed it, a Democratic supermajority.

But one goal eluded the victors. Even though preliminary estimates suggest that Democrats received somewhat more votes than Republicans in Congressional elections, the G.O.P. retains solid control of the House thanks to extreme gerrymandering by courts and Republican-controlled state governments. And Representative John Boehner, the speaker of the House, wasted no time in declaring that his party remains as intransigent as ever, utterly opposed to any rise in tax rates even as it whines about the size of the deficit.

So President Obama has to make a decision, almost immediately, about how to deal with continuing Republican obstruction. How far should he go in accommodating the G.O.P.’s demands?

My answer is, not far at all. Mr. Obama should hang tough, declaring himself willing, if necessary, to hold his ground even at the cost of letting his opponents inflict damage on a still-shaky economy. And this is definitely no time to negotiate a “grand bargain” on the budget that snatches defeat from the jaws of victory.

In saying this, I don’t mean to minimize the very real economic dangers posed by the so-called fiscal cliff that is looming at the end of this year if the two parties can’t reach a deal. Both the Bush-era tax cuts and the Obama administration’s payroll tax cut are set to expire, even as automatic spending cuts in defense and elsewhere kick in thanks to the deal struck after the 2011 confrontation over the debt ceiling. And the looming combination of tax increases and spending cuts looks easily large enough to push America back into recession.

Nobody wants to see that happen. Yet it may happen all the same, and Mr. Obama has to be willing to let it happen if necessary.

Why? Because Republicans are trying, for the third time since he took office, to use economic blackmail to achieve a goal they lack the votes to achieve through the normal legislative process. In particular, they want to extend the Bush tax cuts for the wealthy, even though the nation can’t afford to make those tax cuts permanent and the public believes that taxes on the rich should go up — and they’re threatening to block any deal on anything else unless they get their way. So they are, in effect, threatening to tank the economy unless their demands are met.

Mr. Obama essentially surrendered in the face of similar tactics at the end of 2010, extending low taxes on the rich for two more years. He made significant concessions again in 2011, when Republicans threatened to create financial chaos by refusing to raise the debt ceiling. And the current potential crisis is the legacy of those past concessions.

Well, this has to stop — unless we want hostage-taking, the threat of making the nation ungovernable, to become a standard part of our political process.

So what should he do? Just say no, and go over the cliff if necessary.

It’s worth pointing out that the fiscal cliff isn’t really a cliff. It’s not like the debt-ceiling confrontation, where terrible things might well have happened right away if the deadline had been missed. This time, nothing very bad will happen to the economy if agreement isn’t reached until a few weeks or even a few months into 2013. So there’s time to bargain.

More important, however, is the point that a stalemate would hurt Republican backers, corporate donors in particular, every bit as much as it hurt the rest of the country. As the risk of severe economic damage grew, Republicans would face intense pressure to cut a deal after all.

Meanwhile, the president is in a far stronger position than in previous confrontations. I don’t place much stock in talk of “mandates,” but Mr. Obama did win re-election with a populist campaign, so he can plausibly claim that Republicans are defying the will of the American people. And he just won his big election and is, therefore, far better placed than before to weather any political blowback from economic troubles — especially when it would be so obvious that these troubles were being deliberately inflicted by the G.O.P. in a last-ditch attempt to defend the privileges of the 1 percent.

Most of all, standing up to hostage-taking is the right thing to do for the health of America’s political system.

So stand your ground, Mr. President, and don’t give in to threats. No deal is better than a bad deal.

Friday, October 12, 2012

This Must Be Heaven

Book review by Sam Harris, neuroscientist and author


Once upon a time, a neurosurgeon named Eben Alexander contracted a bad case of bacterial meningitis and fell into a coma. While immobile in his hospital bed, he experienced visions of such intense beauty that they changed everything—not just for him, but for all of us, and for science as a whole. According to Newsweek, Alexander’s experience proves that consciousness is independent of the brain, that death is an illusion, and that an eternity of perfect splendor awaits us beyond the grave—complete with the usual angels, clouds, and departed relatives, but also butterflies and beautiful girls in peasant dress. Our current understanding of the mind “now lies broken at our feet”—for, as the doctor writes, “What happened to me destroyed it, and I intend to spend the rest of my life investigating the true nature of consciousness and making the fact that we are more, much more, than our physical brains as clear as I can, both to my fellow scientists and to people at large.”

Well, I intend to spend the rest of the morning sparing him the effort. Whether you read it online or hold the physical object in your hands, this issue of Newsweek is best viewed as an archaeological artifact that is certain to embarrass us in the eyes of future generations. Its existence surely says more about our time than the editors at the magazine meant to say—for the cover alone reveals the abasement and desperation of our journalism, the intellectual bankruptcy and resultant tenacity of faith-based religion, and our ubiquitous confusion about the nature of scientific authority. The article is the modern equivalent of a 14th-century woodcut depicting the work of alchemists, inquisitors, Crusaders, and fortune-tellers. I hope our descendants understand that at least some of us were blushing.

As many of you know, I am interested in “spiritual” experiences of the sort Alexander reports. Unlike many atheists, I don’t doubt the subjective phenomena themselves—that is, I don’t believe that everyone who claims to have seen an angel, or left his body in a trance, or become one with the universe, is lying or mentally ill. Indeed, I have had similar experiences myself in meditation, in lucid dreams (even while meditating in a lucid dream), and through the use of various psychedelics (in times gone by). I know that astonishing changes in the contents of consciousness are possible and can be psychologically transformative.

And, unlike many neuroscientists and philosophers, I remain agnostic on the question of how consciousness is related to the physical world. There are, of course, very good reasons to believe that it is an emergent property of brain activity, just as the rest of the human mind obviously is. But we know nothing about how such a miracle of emergence might occur. And if consciousness were, in fact, irreducible—or even separable from the brain in a way that would give comfort to Saint Augustine—my worldview would not be overturned. I know that we do not understand consciousness, and nothing that I think I know about the cosmos, or about the patent falsity of most religious beliefs, requires that I deny this. So, although I am an atheist who can be expected to be unforgiving of religious dogma, I am not reflexively hostile to claims of the sort Alexander has made. In principle, my mind is open. (It really is.)

But Alexander’s account is so bad—his reasoning so lazy and tendentious—that it would be beneath notice if not for the fact that it currently disgraces the cover of a major newsmagazine. Alexander is also releasing a book at the end of the month, Proof of Heaven: A Neurosurgeon’s Journey into the Afterlife, which seems destined to become an instant bestseller. As much as I would like to simply ignore the unfolding travesty, it would be derelict of me to do so.

But first things first: You really must read Alexander’s article.

I trust that doing so has given you cause to worry that the good doctor is just another casualty of American-style Christianity—for though he claims to have been a nonbeliever before his adventures in coma, he presents the following self-portrait:

Although I considered myself a faithful Christian, I was so more in name than in actual belief. I didn’t begrudge those who wanted to believe that Jesus was more than simply a good man who had suffered at the hands of the world. I sympathized deeply with those who wanted to believe that there was a God somewhere out there who loved us unconditionally. In fact, I envied such people the security that those beliefs no doubt provided. But as a scientist, I simply knew better than to believe them myself.

What it means to be a “faithful Christian” without “actual belief” is not spelled out, but few nonbelievers will be surprised when our hero’s scientific skepticism proves no match for his religious conditioning. Most of us have been around this block often enough to know that many “former atheists”—like Francis Collins—spent so long on the brink of faith, and yearned for its emotional consolations with such vampiric intensity, that the slightest breeze would send them spinning into the abyss. For Collins, you may recall, all it took to establish the divinity of Jesus and the coming resurrection of the dead was the sight of a frozen waterfall. Alexander seems to have required a ride on a psychedelic butterfly. In either case, it’s not the perception of beauty we should begrudge but the utter absence of intellectual seriousness with which the author interprets it.

Everything—absolutely everything—in Alexander’s account rests on repeated assertions that his visions of heaven occurred while his cerebral cortex was “shut down,” “inactivated,” “completely shut down,” “totally offline,” and “stunned to complete inactivity.” The evidence he provides for this claim is not only inadequate—it suggests that he doesn’t know anything about the relevant brain science. Perhaps he has saved a more persuasive account for his book—though now that I’ve listened to an hour-long interview with him online, I very much doubt it. In his Newsweek article, Alexander asserts that the cessation of cortical activity was “clear from the severity and duration of my meningitis, and from the global cortical involvement documented by CT scans and neurological examinations.” To his editors, this presumably sounded like neuroscience.

The problem, however, is that “CT scans and neurological examinations” can’t determine neuronal inactivity—in the cortex or anywhere else. And Alexander makes no reference to functional data that might have been acquired by fMRI, PET, or EEG—nor does he seem to realize that only this sort of evidence could support his case. Obviously, the man’s cortex is functioning now—he has, after all, written a book—so whatever structural damage appeared on CT could not have been “global.” (Otherwise, he would be claiming that his entire cortex was destroyed and then grew back.) Coma is not associated with the complete cessation of cortical activity, in any case. And to my knowledge, almost no one thinks that consciousness is purely a matter of cortical activity. Alexander’s unwarranted assumptions are proliferating rather quickly. Why doesn’t he know these things? He is, after all, a neurosurgeon who survived a coma and now claims to be upending the scientific worldview on the basis of the fact that his cortex was totally quiescent at the precise moment he was enjoying the best day of his life in the company of angels. Even if his entire cortex had truly shut down (again, an incredible claim), how can he know that his visions didn’t occur in the minutes and hours during which its functions returned?

I confess that I found Alexander’s account so alarmingly unscientific that I began to worry that something had gone wrong with my own brain. So I sought the opinion of Mark Cohen, a pioneer in the field of neuroimaging who holds appointments in the Departments of Psychiatry & Biobehavioral Science, Neurology, Psychology, Radiological Science, and Bioengineering at UCLA. (He was also my thesis advisor.) Here is part of what he had to say:

This poetic interpretation of his experience is not supported by evidence of any kind. As you correctly point out, coma does not equate to “inactivation of the cerebral cortex” or “higher-order brain functions totally offline” or “neurons of [my] cortex stunned into complete inactivity”. These describe brain death, a one hundred percent lethal condition. There are many excellent scholarly articles that discuss the definitions of coma. (For example: 1 & 2)

We are not privy to his EEG records, but high alpha activity is common in coma. Also common is “flat” EEG. The EEG can appear flat even in the presence of high activity, when that activity is not synchronous. For example, the EEG flattens in regions involved in direct task processing. This phenomenon is known as event-related desynchronization (hundreds of references).

As is obvious to you, this is truth by authority. Neurosurgeons, however, are rarely well-trained in brain function. Dr. Alexander cuts brains; he does not appear to study them. “There is no scientific explanation for the fact that while my body lay in coma, my mind—my conscious, inner self—was alive and well. While the neurons of my cortex were stunned to complete inactivity by the bacteria that had attacked them, my brain-free consciousness ...” True, science cannot explain brain-free consciousness. Of course, science cannot explain consciousness anyway. In this case, however, it would be parsimonious to reject the whole idea of consciousness in the absence of brain activity. Either his brain was active when he had these dreams, or they are a confabulation of whatever took place in his state of minimally conscious coma.

There are many reports of people remembering dream-like states while in medical coma. They lack consistency, of course, but there is nothing particularly unique in Dr. Alexander’s unfortunate episode.

Okay, so it appears that my own cortex hasn’t completely shut down. In fact, there are further problems with Alexander’s account. Not only does he appear ignorant of the relevant science, but he doesn’t realize how many people have experienced visions similar to his while their brains were operational. In his online interview we learn about the kinds of conversations he’s now having with skeptics:

I guess one could always argue, “Well, your brain was probably just barely able to ignite real consciousness and then it would flip back into a very diseased state,” which doesn’t make any sense to me. Especially because that hyper-real state is so indescribable and so crisp. It’s totally unlike any drug experience. A lot of people have come up to me and said, “Oh that sounds like a DMT experience,” or “That sounds like ketamine.” Not at all. That is not even in the right ballpark.

Those things do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding and of lessons taught by deceased loved ones and spiritual beings.

“Not in the right ballpark”? His experience sounds so much like a DMT trip that we are not only in the right ballpark, we are talking about the stitching on the same ball. Here is Alexander’s description of the afterlife:

I was a speck on a beautiful butterfly wing; millions of other butterflies around us. We were flying through blooming flowers, blossoms on trees, and they were all coming out as we flew through them… [there were] waterfalls, pools of water, indescribable colors, and above there were these arcs of silver and gold light and beautiful hymns coming down from them. Indescribably gorgeous hymns. I later came to call them “angels,” those arcs of light in the sky. I think that word is probably fairly accurate….

Then we went out of this universe. I remember just seeing everything receding and initially I felt as if my awareness was in an infinite black void. It was very comforting but I could feel the extent of the infinity and that it was, as you would expect, impossible to put into words. I was there with that Divine presence that was not anything that I could visibly see and describe, and with a brilliant orb of light….

They said there were many things that they would show me, and they continued to do that. In fact, the whole higher-dimensional multiverse was this incredibly complex corrugated ball and all these lessons coming into me about it. Part of the lessons involved becoming all of what I was being shown. It was indescribable.

But then I would find myself—and time out there I can say is totally different from what we call time. There was access from out there to any part of our space/time and that made it difficult to understand a lot of these memories because we always try to sequence things and put them in linear form and description. That just really doesn’t work.

Everything that Alexander describes here and in his Newsweek article, including the parts I have left out, has been reported by DMT users. The similarity is uncanny. Here is how the late Terence McKenna described the prototypical DMT trance:

Under the influence of DMT, the world becomes an Arabian labyrinth, a palace, a more than possible Martian jewel, vast with motifs that flood the gaping mind with complex and wordless awe. Color and the sense of a reality-unlocking secret nearby pervade the experience. There is a sense of other times, and of one’s own infancy, and of wonder, wonder and more wonder. It is an audience with the alien nuncio. In the midst of this experience, apparently at the end of human history, guarding gates that seem surely to open on the howling maelstrom of the unspeakable emptiness between the stars, is the Aeon.

The Aeon, as Heraclitus presciently observed, is a child at play with colored balls. Many diminutive beings are present there—the tykes, the self-transforming machine elves of hyperspace. Are they the children destined to be father to the man? One has the impression of entering into an ecology of souls that lies beyond the portals of what we naively call death. I do not know. Are they the synesthetic embodiment of ourselves as the Other, or of the Other as ourselves? Are they the elves lost to us since the fading of the magic light of childhood? Here is a tremendum barely to be told, an epiphany beyond our wildest dreams. Here is the realm of that which is stranger than we can suppose. Here is the mystery, alive, unscathed, still as new for us as when our ancestors lived it fifteen thousand summers ago. The tryptamine entities offer the gift of new language, they sing in pearly voices that rain down as colored petals and flow through the air like hot metal to become toys and such gifts as gods would give their children. The sense of emotional connection is terrifying and intense. The Mysteries revealed are real and if ever fully told will leave no stone upon another in the small world we have gone so ill in.

This is not the mercurial world of the UFO, to be invoked from lonely hilltops; this is not the siren song of lost Atlantis wailing through the trailer courts of crack-crazed America. DMT is not one of our irrational illusions. I believe that what we experience in the presence of DMT is real news. It is a nearby dimension—frightening, transformative, and beyond our powers to imagine, and yet to be explored in the usual way. We must send fearless experts, whatever that may come to mean, to explore and to report on what they find.  (Terence McKenna, Food of the Gods, pp. 258-259.)

Alexander believes that his E. coli-addled brain could not have produced his visions because they were too “intense,” too “hyper-real,” too “beautiful,” too “interactive,” and too drenched in significance for even a healthy brain to conjure. He also appears to think that despite their timeless quality, his visions could not have arisen in the minutes or hours during which his cortex (which surely never went off) switched back on. He clearly knows nothing about what people with working brains experience under the influence of psychedelics. Nor does he know that visions of the sort that McKenna describes, although they may seem to last for ages, require only a brief span of biological time. Unlike LSD and other long-acting psychedelics, DMT alters consciousness for merely a few minutes. Alexander would have had more than enough time to experience a visionary ecstasy as he was coming out of his coma (whether his cortex was rebooting or not).

Does Alexander know that DMT already exists in the brain as a neurotransmitter? Did his brain experience a surge of DMT release during his coma? This is pure speculation, of course, but it is a far more credible hypothesis than that his cortex “shut down,” freeing his soul to travel to another dimension. As one of his correspondents has already informed him, similar experiences can be had with ketamine, which is a surgical anesthetic that is occasionally used to protect a traumatized brain. Did Alexander by any chance receive ketamine while in the hospital? Would he even think it relevant if he had? His assertion that psychedelic compounds like DMT and ketamine “do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding” he experienced is perhaps the most amazing thing he has said since he returned from heaven. Such compounds are universally understood to do the job. And most scientists believe that the reliable effects of psychedelics indicate that the brain is at the very least involved in the production of visionary states of the sort Alexander is talking about.

Again, there is nothing to be said against Alexander’s experience. It sounds perfectly sublime. And such ecstasies do tell us something about how good a human mind can feel. The problem is that the conclusions Alexander has drawn from his experience—he continually reminds us, as a scientist—are based on some very obvious errors in reasoning and gaps in his understanding.

Let me suggest that, whether or not heaven exists, Alexander sounds precisely how a scientist should not sound when he doesn’t know what he is talking about. And his article is not the sort of thing that the editors of a once-important magazine should publish if they hope to reclaim some measure of respect for their battered brand.

Monday, October 8, 2012

Truth About Jobs

By PAUL KRUGMAN

If anyone had doubts about the madness that has spread through a large part of the American political spectrum, the reaction to Friday’s better-than-expected report from the Bureau of Labor Statistics should have settled the issue. For the immediate response of many on the right — and we’re not just talking fringe figures — was to cry conspiracy.

Leading the charge of what were quickly dubbed the “B.L.S. truthers” was none other than Jack Welch, the former chairman of General Electric, who posted an assertion on Twitter that the books had been cooked to help President Obama’s re-election campaign. His claim was quickly picked up by right-wing pundits and media personalities.

It was nonsense, of course. Job numbers are prepared by professional civil servants, at an agency that currently has no political appointees. But then maybe Mr. Welch — under whose leadership G.E. reported remarkably smooth earnings growth, with none of the short-term fluctuations you might have expected (fluctuations that reappeared under his successor) — doesn’t know how hard it would be to cook the jobs data.

Furthermore, the methods the bureau uses are public — and anyone familiar with the data understands that they are “noisy,” that especially good (or bad) months will be reported now and then as a simple consequence of statistical randomness. And that in turn means that you shouldn’t put much weight on any one month’s report.

In that case, however, what is the somewhat longer-term trend? Is the U.S. employment picture getting better? Yes, it is.

Some background: the monthly employment report is based on two surveys. One asks a random sample of employers how many people are on their payroll. The other asks a random sample of households whether their members are working or looking for work. And if you look at the trend over the past year or so, both surveys suggest a labor market that is gradually on the mend, with job creation consistently exceeding growth in the working-age population.

On the employer side, the current numbers say that over the past year the economy added 150,000 jobs a month, and revisions will probably push that number up significantly. That’s well above the 90,000 or so added jobs per month that we need to keep up with population. (This number used to be higher, but underlying work force growth has dropped off sharply now that many baby boomers are reaching retirement age.)
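
The gap between those two figures is what drives the improvement. Here is a minimal back-of-the-envelope sketch in Python, using only the 150,000 and 90,000 figures quoted above; the twelve-month horizon is simply an assumption for illustration, not a number from the column.

    # Back-of-the-envelope check of the jobs arithmetic quoted above.
    jobs_added_per_month = 150_000   # average monthly payroll gain over the past year, per the column
    jobs_needed_per_month = 90_000   # roughly what is needed to keep up with population growth
    months = 12                      # illustrative horizon (an assumption, not from the column)

    monthly_surplus = jobs_added_per_month - jobs_needed_per_month
    yearly_surplus = monthly_surplus * months
    print(f"Monthly gain beyond population growth: {monthly_surplus:,}")   # 60,000
    print(f"Implied gain over a year: {yearly_surplus:,}")                 # 720,000

On these numbers the labor market absorbs population growth with roughly 60,000 jobs a month to spare, which is why the unemployment trend discussed next points down rather than up.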

Meanwhile, the household survey produces estimates of both the number of Americans employed and the number unemployed, defined as people who are seeking work but don’t currently have a job. The eye-popping number from Friday’s report was a sudden drop in the unemployment rate to 7.8 percent from 8.1 percent, but as I said, you shouldn’t put too much emphasis on one month’s number. The more important point is that unemployment has been on a sustained downward trend.

But isn’t that just because people have given up looking for work, and hence no longer count as unemployed? Actually, no. It’s true that the employment-population ratio — the percentage of adults with jobs — has been more or less flat for the past year. But remember those aging baby boomers: the fraction of American adults who are in their prime working years is falling fast. Once you take the effects of an aging population into account, the numbers show a substantial improvement in the employment picture since the summer of 2011.

None of this should be taken to imply that the situation is good, or to deny that we should be doing better — a shortfall largely due to the scorched-earth tactics of Republicans, who have blocked any and all efforts to accelerate the pace of recovery. (If the American Jobs Act, proposed by the Obama administration last year, had been passed, the unemployment rate would probably be below 7 percent.) The U.S. economy is still far short of where it should be, and the job market has a long way to go before it makes up the ground lost in the Great Recession. But the employment data do suggest an economy that is slowly healing, an economy in which declining consumer debt burdens and a housing revival have finally put us on the road back to full employment.

And that’s the truth that the right can’t handle. The furor over Friday’s report revealed a political movement that is rooting for American failure, so obsessed with taking down Mr. Obama that good news for the nation’s long-suffering workers drives its members into a blind rage. It also revealed a movement that lives in an intellectual bubble, dealing with uncomfortable reality — whether that reality involves polls or economic data — not just by denying the facts, but by spinning wild conspiracy theories.

It is, quite simply, frightening to think that a movement this deranged wields so much political power.

Monday, August 20, 2012

An Unserious Man

August 19, 2012

By PAUL KRUGMAN

Mitt Romney’s choice of Paul Ryan as his running mate led to a wave of pundit accolades. Now, declared writer after writer, we’re going to have a real debate about the nation’s fiscal future. This was predictable: never mind the Tea Party, Mr. Ryan’s true constituency is the commentariat, which years ago decided that he was the Honest, Serious Conservative, whose proposals deserve respect even if you don’t like him.

But he isn’t and they don’t. Ryanomics is and always has been a con game, although to be fair, it has become even more of a con since Mr. Ryan joined the ticket.

Let’s talk about what’s actually in the Ryan plan, and let’s distinguish in particular between actual, specific policy proposals and unsupported assertions. To focus things a bit more, let’s talk — as most budget discussions do — about what’s supposed to happen over the next 10 years.

On the tax side, Mr. Ryan proposes big cuts in tax rates on top income brackets and corporations. He has tried to dodge the normal process in which tax proposals are “scored” by independent auditors, but the nonpartisan Tax Policy Center has done the math, and the revenue loss from these cuts comes to $4.3 trillion over the next decade.

On the spending side, Mr. Ryan proposes huge cuts in Medicaid, turning it over to the states while sharply reducing funding relative to projections under current policy. That saves around $800 billion. He proposes similar harsh cuts in food stamps, saving a further $130 billion or so, plus a grab-bag of other cuts, such as reduced aid to college students. Let’s be generous and say that all these cuts would save $1 trillion.

On top of this, Mr. Ryan includes the $716 billion in Medicare savings that are part of Obamacare, even though he wants to scrap everything else in that act. Despite this, Mr. Ryan has now joined Mr. Romney in denouncing President Obama for “cutting Medicare”; more on that in a minute.

So if we add up Mr. Ryan’s specific proposals, we have $4.3 trillion in tax cuts, partially offset by around $1.7 trillion in spending cuts — with the tax cuts, surprise, disproportionately benefiting the top 1 percent, while the spending cuts would primarily come at the expense of low-income families. Over all, the effect would be to increase the deficit by around two and a half trillion dollars.
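
A minimal Python sketch, using only the dollar figures quoted in the preceding paragraphs, confirms the tally; the split between Medicaid and the other program cuts follows the column's own generous rounding to $1 trillion in total program savings.

    # Tallying the Ryan-plan figures quoted above, all in trillions of dollars over ten years.
    tax_cuts = 4.3            # revenue loss from the rate cuts, per the Tax Policy Center
    medicaid_cuts = 0.8       # savings from block-granting Medicaid
    other_program_cuts = 0.2  # food stamps (~$0.13T) plus a grab-bag of other cuts, rounded up generously
    medicare_savings = 0.716  # the Obamacare Medicare savings the plan keeps

    spending_cuts = medicaid_cuts + other_program_cuts + medicare_savings
    net_deficit_increase = tax_cuts - spending_cuts
    print(f"Total spending cuts: about ${spending_cuts:.1f} trillion")          # about $1.7 trillion
    print(f"Net deficit increase: about ${net_deficit_increase:.1f} trillion")  # about $2.6 trillion

The roughly $2.6 trillion result is the "around two and a half trillion dollars" in the paragraph above.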

Yet Mr. Ryan claims to be a deficit hawk. What’s the basis for that claim?

Well, he says that he would offset his tax cuts by “base broadening,” eliminating enough tax deductions to make up the lost revenue. Which deductions would he eliminate? He refuses to say — and realistically, revenue gain on the scale he claims would be virtually impossible.

At the same time, he asserts that he would make huge further cuts in spending. What would he cut? He refuses to say.

What Mr. Ryan actually offers, then, are specific proposals that would sharply increase the deficit, plus an assertion that he has secret tax and spending plans that he refuses to share with us, but which will turn his overall plan into deficit reduction.

If this sounds like a joke, that’s because it is. Yet Mr. Ryan’s “plan” has been treated with great respect in Washington. He even received an award for fiscal responsibility from three of the leading deficit-scold pressure groups. What’s going on?

The answer, basically, is a triumph of style over substance. Over the longer term, the Ryan plan would end Medicare as we know it — and in Washington, “fiscal responsibility” is often equated with willingness to slash Medicare and Social Security, even if the purported savings would be used to cut taxes on the rich rather than to reduce deficits. Also, self-proclaimed centrists are always looking for conservatives they can praise to showcase their centrism, and Mr. Ryan has skillfully played into that weakness, talking a good game even if his numbers don’t add up.

The question now is whether Mr. Ryan’s undeserved reputation for honesty and fiscal responsibility can survive his participation in a deeply dishonest and irresponsible presidential campaign.

The first sign of trouble has already surfaced over the issue of Medicare. Mr. Romney, in an attempt to repeat the G.O.P.’s successful “death panels” strategy of the 2010 midterms, has been busily attacking the president for the same Medicare savings that are part of the Ryan plan. And Mr. Ryan’s response when this was pointed out was incredibly lame: he only included those cuts, he says, because the president put them “in the baseline,” whatever that means. Of course, whatever Mr. Ryan’s excuse, the fact is that without those savings his budget becomes even more of a plan to increase, not reduce, the deficit.

So will the choice of Mr. Ryan mean a serious campaign? No, because Mr. Ryan isn’t a serious man — he just plays one on TV.

Monday, August 6, 2012

The Science of Genocide

 

Posted on Aug 6, 2012

By Chris Hedges

On this day in 1945 the United States demonstrated that it was as morally bankrupt as the Nazi machine it had recently vanquished and the Soviet regime with which it was allied. Over Hiroshima, and three days later over Nagasaki, it exploded an atomic device that was the most efficient weapon of genocide in human history. The blast killed tens of thousands of men, women and children. It was an act of mass annihilation that was strategically and militarily indefensible. The Japanese had been on the verge of surrender. Hiroshima and Nagasaki had no military significance. It was a war crime for which no one was ever tried. The explosions, which marked the culmination of three centuries of physics, signaled the ascendancy of the technician and scientist as our most potent agents of death.

“In World War II Auschwitz and Hiroshima showed that progress through technology has escalated man’s destructive impulses into more precise and incredibly more devastating form,” Bruno Bettelheim said. “The concentration camps with their gas chambers, the first atomic bomb … confronted us with the stark reality of overwhelming death, not so much one’s own—this each of us has to face sooner or later, and however uneasily, most of us manage not to be overpowered by our fear of it—but the unnecessary and untimely death of millions. … Progress not only failed to preserve life but it deprived millions of their lives more effectively than had ever been possible before. Whether we choose to recognize it or not, after the second World War Auschwitz and Hiroshima became monuments to the incredible devastation man and technology together bring about.”

The atomic blasts, ignited in large part to send a message to the Soviet Union, were a reminder that science is morally neutral. Science and technology serve the ambitions of humankind. And few in the sciences look beyond the narrow tasks handed to them by corporations or government. They employ their dark arts, often blind to the consequences, to cement into place systems of security and surveillance, as well as systems of environmental destruction, that will result in collective enslavement and mass extermination. As we veer toward environmental collapse we will have to pit ourselves against many of these experts, scientists and technicians whose loyalty is to institutions that profit from exploitation and death.

Scientists and technicians in the United States over the last five decades built 70,000 nuclear weapons at a cost of $5.5 trillion. (The Soviet Union had a nuclear arsenal of similar capability.) By 1963, according to the Columbia University professor Seymour Melman, the United States could overkill the 140 principal cities in the Soviet Union more than 78 times. Yet we went on manufacturing nuclear warheads. And those who publicly questioned the rationality of the massive nuclear buildup, such as J. Robert Oppenheimer, who at the government lab at Los Alamos, N.M., had overseen the building of the two bombs used on Japan, often were zealously persecuted on suspicion of being communists or communist sympathizers. It was a war plan that called for a calculated act of enormous, criminal genocide. We built more and more bombs with the sole purpose of killing hundreds of millions of people. And those who built them, with few exceptions, never gave a thought to their suicidal creations.

“What are we to make of a civilization which has always regarded ethics as an essential part of human life [but] which has not been able to talk about the prospect of killing almost everyone except in prudential and game-theoretical terms?” Oppenheimer asked after World War II.

Max Born, the great German-British physicist and mathematician who was instrumental in the development of quantum mechanics, in his memoirs made it clear he disapproved of Oppenheimer and the other physicists who built the atomic bombs. “It is satisfying to have had such clever and efficient pupils,” Born wrote, “but I wish they had shown less cleverness and more wisdom.” Oppenheimer wrote his old teacher back. “Over the years, I have felt a certain disapproval on your part for much that I have done. This has always seemed to me quite natural, for it is a sentiment that I share.” But of course, by then, it was too late.

It was science, industry and technology that made possible the 20th century’s industrial killing. These forces magnified innate human barbarity. They served the immoral. And there are numerous scientists who continue to work in labs across the country on weapons systems that have the capacity to exterminate millions of human beings. Is this a “rational” enterprise? Is it moral? Does it advance the human species? Does it protect life?

For many of us, science has supplanted religion. We harbor a naive faith in the godlike power of science. Since scientific knowledge is cumulative, albeit morally neutral, it gives the illusion that human history and human progress also are cumulative. Science is for us what totems and spells were for our premodern ancestors. It is magical thinking. It feeds our hubris and sense of divine empowerment. And trusting in its fearsome power will mean our extinction.

The 17th century Enlightenment myth of human advancement through science, reason and rationality should have been obliterated forever by the slaughter of World War I. Europeans watched the collective suicide of a generation. The darker visions of human nature embodied in the works of Fyodor Dostoevsky, Leo Tolstoy, Thomas Hardy, Joseph Conrad and Friedrich Nietzsche before the war found modern expression in the work of Sigmund Freud, James Joyce, Marcel Proust, Franz Kafka, D.H. Lawrence, Thomas Mann and Samuel Beckett, along with atonal and dissonant composers such as Igor Stravinsky and painters such as Otto Dix, George Grosz, Henri Matisse and Pablo Picasso. Human progress, these artists and writers understood, was a joke. But there were many more who enthusiastically embraced new utopian visions of progress and glory peddled by fascists and communists. These belief systems defied reality. They fetishized death. They sought unattainable utopias through violence. And empowered by science and technology, they killed millions.

Human motives often are irrational and, as Freud pointed out, contain powerful yearnings for death and self-immolation. Science and technology have empowered and amplified the ancient lusts for war, violence and death. Knowledge did not free humankind from barbarism. The civilized veneer only masked the dark, inchoate longings that plague all human societies, including our own. Freud feared the destructive power of these urges. He warned in “Civilization and Its Discontents” that if we could not regulate or contain these urges, human beings would, as the Stoics predicted, consume themselves in a vast conflagration. The future of the human race depends on naming and controlling these urges. To pretend they do not exist is to fall into self-delusion.

The breakdown of social and political control during periods of political and economic turmoil allows these urges to reign supreme. Our first inclination, Freud noted correctly, is not to love one another as brothers or sisters but to “satisfy [our] aggressiveness on [our fellow human being], to exploit his capacity for work without compensation, to use him sexually without his consent, to seize his possessions, to humiliate him, to cause him pain, to torture and to kill him.” The war in Bosnia, with rampaging Serbian militias, rape camps, torture centers, concentration camps, razed villages and mass executions, was one of numerous examples of Freud’s wisdom. At best, Freud knew, we can learn to live with, regulate and control our inner tensions and conflicts. The structure of civilized societies would always be fraught with this inner tension, he wrote, because “… man’s natural aggressive instinct, the hostility of each against all and of all against each, opposes this program of civilization.” The burden of civilization is worth it. The alternative, as Freud knew, is self-destruction.

A rational world, a world that will protect the ecosystem and build economies that learn to distribute wealth rather than allow a rapacious elite to hoard it, will never be handed to us by the scientists and technicians. Nearly all of them work for the enemy. Mary Shelley warned us about becoming Prometheus as we seek to defy fate and the gods in order to master life and death. Her Victor Frankenstein, when his 8-foot-tall creation made partly of body pieces from graves came to ghastly life, had the same reaction as Oppenheimer when the American scientist discovered that his bomb had incinerated Japanese schoolchildren. The scientist Victor Frankenstein watched the “dull yellow eye” of his creature open and “breathless horror and disgust” filled his heart. Oppenheimer said after the first atomic bomb was detonated in the New Mexican desert: “I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the Prince that he should do his duty and to impress him takes on his multi-armed form and says, ‘Now I am become Death, the destroyer of worlds.’ I suppose we all thought that, in one way or another.” The critic Harold Bloom, in words that could be applied to Oppenheimer, called Victor Frankenstein “a moral idiot.”

All attempts to control the universe, to play God, to become the arbiters of life and death, have been carried out by moral idiots. They will relentlessly push forward, exploiting and pillaging, perfecting their terrible tools of technology and science, until their creation destroys them and us. They make the nuclear bombs. They extract oil from the tar sands. They turn the Appalachians into a wasteland to extract coal. They serve the evils of globalism and finance. They run the fossil fuel industry. They flood the atmosphere with carbon emissions, doom the seas, melt the polar ice caps, unleash the droughts and floods, the heat waves, the freak storms and hurricanes.

Now I am become Death, the destroyer of worlds.