Monday, November 19, 2012

The Twinkie Manifesto

November 18, 2012

By PAUL KRUGMAN

The Twinkie, it turns out, was introduced way back in 1930. In our memories, however, the iconic snack will forever be identified with the 1950s, when Hostess popularized the brand by sponsoring “The Howdy Doody Show.” And the demise of Hostess has unleashed a wave of baby boomer nostalgia for a seemingly more innocent time.

Needless to say, it wasn’t really innocent. But the ’50s — the Twinkie Era — do offer lessons that remain relevant in the 21st century. Above all, the success of the postwar American economy demonstrates that, contrary to today’s conservative orthodoxy, you can have prosperity without demeaning workers and coddling the rich.

Consider the question of tax rates on the wealthy. The modern American right, and much of the alleged center, is obsessed with the notion that low tax rates at the top are essential to growth. Remember that Erskine Bowles and Alan Simpson, charged with producing a plan to curb deficits, nonetheless somehow ended up listing “lower tax rates” as a “guiding principle.”

Yet in the 1950s incomes in the top bracket faced a marginal tax rate of 91, that’s right, 91 percent, while taxes on corporate profits were twice as large, relative to national income, as in recent years. The best estimates suggest that circa 1960 the top 0.01 percent of Americans paid an effective federal tax rate of more than 70 percent, twice what they pay today.

Nor were high taxes the only burden wealthy businessmen had to bear. They also faced a labor force with a degree of bargaining power hard to imagine today. In 1955 roughly a third of American workers were union members. In the biggest companies, management and labor bargained as equals, so much so that it was common to talk about corporations serving an array of “stakeholders” as opposed to merely serving stockholders.

Squeezed between high taxes and empowered workers, executives were relatively impoverished by the standards of either earlier or later generations. In 1955 Fortune magazine published an essay, “How top executives live,” which emphasized how modest their lifestyles had become compared with days of yore. The vast mansions, armies of servants, and huge yachts of the 1920s were no more; by 1955 the typical executive, Fortune claimed, lived in a smallish suburban house, relied on part-time help and skippered his own relatively small boat.

The data confirm Fortune’s impressions. Between the 1920s and the 1950s real incomes for the richest Americans fell sharply, not just compared with the middle class but in absolute terms. According to estimates by the economists Thomas Piketty and Emmanuel Saez, in 1955 the real incomes of the top 0.01 percent of Americans were less than half what they had been in the late 1920s, and their share of total income was down by three-quarters.

Today, of course, the mansions, armies of servants and yachts are back, bigger than ever — and any hint of policies that might crimp plutocrats’ style is met with cries of “socialism.” Indeed, the whole Romney campaign was based on the premise that President Obama’s threat to modestly raise taxes on top incomes, plus his temerity in suggesting that some bankers had behaved badly, were crippling the economy. Surely, then, the far less plutocrat-friendly environment of the 1950s must have been an economic disaster, right?

Actually, some people thought so at the time. Paul Ryan and many other modern conservatives are devotees of Ayn Rand. Well, the collapsing, moocher-infested nation she portrayed in “Atlas Shrugged,” published in 1957, was basically Dwight Eisenhower’s America.

Strange to say, however, the oppressed executives Fortune portrayed in 1955 didn’t go Galt and deprive the nation of their talents. On the contrary, if Fortune is to be believed, they were working harder than ever. And the high-tax, strong-union decades after World War II were in fact marked by spectacular, widely shared economic growth: nothing before or since has matched the doubling of median family income between 1947 and 1973.

Which brings us back to the nostalgia thing.

There are, let’s face it, some people in our political life who pine for the days when minorities and women knew their place, gays stayed firmly in the closet and congressmen asked, “Are you now or have you ever been?” The rest of us, however, are very glad those days are gone. We are, morally, a much better nation than we were. Oh, and the food has improved a lot, too.

Along the way, however, we’ve forgotten something important — namely, that economic justice and economic growth aren’t incompatible. America in the 1950s made the rich pay their fair share; it gave workers the power to bargain for decent wages and benefits; yet contrary to right-wing propaganda then and now, it prospered. And we can do that again.

Friday, November 16, 2012

Life, Death and Deficits

November 15, 2012

By PAUL KRUGMAN

America’s political landscape is infested with many zombie ideas — beliefs about policy that have been repeatedly refuted with evidence and analysis but refuse to die. The most prominent zombie is the insistence that low taxes on rich people are the key to prosperity. But there are others.

And right now the most dangerous zombie is probably the claim that rising life expectancy justifies a rise in both the Social Security retirement age and the age of eligibility for Medicare. Even some Democrats — including, according to reports, the president — have seemed susceptible to this argument. But it’s a cruel, foolish idea — cruel in the case of Social Security, foolish in the case of Medicare — and we shouldn’t let it eat our brains.

First of all, you need to understand that while life expectancy at birth has gone up a lot, that’s not relevant to this issue; what matters is life expectancy for those at or near retirement age. When, to take one example, Alan Simpson — the co-chairman of President Obama’s deficit commission — declared that Social Security was “never intended as a retirement program” because life expectancy when it was founded was only 63, he was displaying his ignorance. Even in 1940, Americans who made it to age 65 generally had many years left.

Now, life expectancy at age 65 has risen, too. But the rise has been very uneven since the 1970s, with only the relatively affluent and well-educated seeing large gains. Bear in mind, too, that the full retirement age has already gone up to 66 and is scheduled to rise to 67 under current law.

This means that any further rise in the retirement age would be a harsh blow to Americans in the bottom half of the income distribution, who aren’t living much longer, and who, in many cases, have jobs requiring physical effort that’s difficult even for healthy seniors. And these are precisely the people who depend most on Social Security.

So any rise in the Social Security retirement age would, as I said, be cruel, hurting the most vulnerable Americans. And this cruelty would be gratuitous: While the United States does have a long-run budget problem, Social Security is not a major factor in that problem.

Medicare, on the other hand, is a big budget problem. But raising the eligibility age, which means forcing seniors to seek private insurance, is no way to deal with that problem.

It’s true that thanks to Obamacare, seniors should actually be able to get insurance even without Medicare. (Although, what happens if a number of states block the expansion of Medicaid that’s a crucial piece of the program?) But let’s be clear: Government insurance via Medicare is better and more cost-effective than private insurance.

You might ask why, in that case, health reform didn’t just extend Medicare to everyone, as opposed to setting up a system that continues to rely on private insurers. The answer, of course, is political realism. Given the power of the insurance industry, the Obama administration had to keep that industry in the loop. But the fact that Medicare for all may have been politically out of reach is no reason to push millions of Americans out of a good system into a worse one.

What would happen if we raised the Medicare eligibility age? The federal government would save only a small amount of money, because younger seniors are relatively healthy and hence low-cost. Meanwhile, however, those seniors would face sharply higher out-of-pocket costs. How could this trade-off be considered good policy?

The bottom line is that raising the age of eligibility for either Social Security benefits or Medicare would be destructive, making Americans’ lives worse without contributing in any significant way to deficit reduction. In particular, Democrats who would even consider either alternative need to ask themselves what on earth they think they’re doing.

But what, ask the deficit scolds, do people like me propose doing about rising spending? The answer is to do what every other advanced country does, and make a serious effort to rein in health care costs. Give Medicare the ability to bargain over drug prices. Let the Independent Payment Advisory Board, created as part of Obamacare to help Medicare control costs, do its job instead of crying “death panels.” (And isn’t it odd that the same people who demagogue attempts to help Medicare save money are eager to throw millions of people out of the program altogether?) We know that we have a health care system with skewed incentives and bloated costs, so why don’t we try to fix it?

What we know for sure is that there is no good case for denying older Americans access to the programs they count on. This should be a red line in any budget negotiations, and we can only hope that Mr. Obama doesn’t betray his supporters by crossing it.

Friday, November 9, 2012

Let’s Not Make a Deal

November 8, 2012

By PAUL KRUGMAN

To say the obvious: Democrats won an amazing victory. Not only did they hold the White House despite a still-troubled economy, in a year when their Senate majority was supposed to be doomed, they actually added seats.

Nor was that all: They scored major gains in the states. Most notably, California — long a poster child for the political dysfunction that comes when nothing can get done without a legislative supermajority — not only voted for much-needed tax increases, but elected, you guessed it, a Democratic supermajority.

But one goal eluded the victors. Even though preliminary estimates suggest that Democrats received somewhat more votes than Republicans in Congressional elections, the G.O.P. retains solid control of the House thanks to extreme gerrymandering by courts and Republican-controlled state governments. And Representative John Boehner, the speaker of the House, wasted no time in declaring that his party remains as intransigent as ever, utterly opposed to any rise in tax rates even as it whines about the size of the deficit.

So President Obama has to make a decision, almost immediately, about how to deal with continuing Republican obstruction. How far should he go in accommodating the G.O.P.’s demands?

My answer is, not far at all. Mr. Obama should hang tough, declaring himself willing, if necessary, to hold his ground even at the cost of letting his opponents inflict damage on a still-shaky economy. And this is definitely no time to negotiate a “grand bargain” on the budget that snatches defeat from the jaws of victory.

In saying this, I don’t mean to minimize the very real economic dangers posed by the so-called fiscal cliff that is looming at the end of this year if the two parties can’t reach a deal. Both the Bush-era tax cuts and the Obama administration’s payroll tax cut are set to expire, even as automatic spending cuts in defense and elsewhere kick in thanks to the deal struck after the 2011 confrontation over the debt ceiling. And the looming combination of tax increases and spending cuts looks easily large enough to push America back into recession.

Nobody wants to see that happen. Yet it may happen all the same, and Mr. Obama has to be willing to let it happen if necessary.

Why? Because Republicans are trying, for the third time since he took office, to use economic blackmail to achieve a goal they lack the votes to achieve through the normal legislative process. In particular, they want to extend the Bush tax cuts for the wealthy, even though the nation can’t afford to make those tax cuts permanent and the public believes that taxes on the rich should go up — and they’re threatening to block any deal on anything else unless they get their way. So they are, in effect, threatening to tank the economy unless their demands are met.

Mr. Obama essentially surrendered in the face of similar tactics at the end of 2010, extending low taxes on the rich for two more years. He made significant concessions again in 2011, when Republicans threatened to create financial chaos by refusing to raise the debt ceiling. And the current potential crisis is the legacy of those past concessions.

Well, this has to stop — unless we want hostage-taking, the threat of making the nation ungovernable, to become a standard part of our political process.

So what should he do? Just say no, and go over the cliff if necessary.

It’s worth pointing out that the fiscal cliff isn’t really a cliff. It’s not like the debt-ceiling confrontation, where terrible things might well have happened right away if the deadline had been missed. This time, nothing very bad will happen to the economy if agreement isn’t reached until a few weeks or even a few months into 2013. So there’s time to bargain.

More important, however, is the point that a stalemate would hurt Republican backers, corporate donors in particular, every bit as much as it would hurt the rest of the country. As the risk of severe economic damage grew, Republicans would face intense pressure to cut a deal after all.

Meanwhile, the president is in a far stronger position than in previous confrontations. I don’t place much stock in talk of “mandates,” but Mr. Obama did win re-election with a populist campaign, so he can plausibly claim that Republicans are defying the will of the American people. And he just won his big election and is, therefore, far better placed than before to weather any political blowback from economic troubles — especially when it would be so obvious that these troubles were being deliberately inflicted by the G.O.P. in a last-ditch attempt to defend the privileges of the 1 percent.

Most of all, standing up to hostage-taking is the right thing to do for the health of America’s political system.

So stand your ground, Mr. President, and don’t give in to threats. No deal is better than a bad deal.

Friday, October 12, 2012

This Must Be Heaven

Book Review by Sam Harris, neuroscientist

Once upon a time, a neurosurgeon named Eben Alexander contracted a bad case of bacterial meningitis and fell into a coma. While immobile in his hospital bed, he experienced visions of such intense beauty that they changed everything—not just for him, but for all of us, and for science as a whole. According to Newsweek, Alexander’s experience proves that consciousness is independent of the brain, that death is an illusion, and that an eternity of perfect splendor awaits us beyond the grave—complete with the usual angels, clouds, and departed relatives, but also butterflies and beautiful girls in peasant dress. Our current understanding of the mind “now lies broken at our feet”—for, as the doctor writes, “What happened to me destroyed it, and I intend to spend the rest of my life investigating the true nature of consciousness and making the fact that we are more, much more, than our physical brains as clear as I can, both to my fellow scientists and to people at large.”

Well, I intend to spend the rest of the morning sparing him the effort. Whether you read it online or hold the physical object in your hands, this issue of Newsweek is best viewed as an archaeological artifact that is certain to embarrass us in the eyes of future generations. Its existence surely says more about our time than the editors at the magazine meant to say—for the cover alone reveals the abasement and desperation of our journalism, the intellectual bankruptcy and resultant tenacity of faith-based religion, and our ubiquitous confusion about the nature of scientific authority. The article is the modern equivalent of a 14th-century woodcut depicting the work of alchemists, inquisitors, Crusaders, and fortune-tellers. I hope our descendants understand that at least some of us were blushing.

As many of you know, I am interested in “spiritual” experiences of the sort Alexander reports. Unlike many atheists, I don’t doubt the subjective phenomena themselves—that is, I don’t believe that everyone who claims to have seen an angel, or left his body in a trance, or become one with the universe, is lying or mentally ill. Indeed, I have had similar experiences myself in meditation, in lucid dreams (even while meditating in a lucid dream), and through the use of various psychedelics (in times gone by). I know that astonishing changes in the contents of consciousness are possible and can be psychologically transformative.

And, unlike many neuroscientists and philosophers, I remain agnostic on the question of how consciousness is related to the physical world. There are, of course, very good reasons to believe that it is an emergent property of brain activity, just as the rest of the human mind obviously is. But we know nothing about how such a miracle of emergence might occur. And if consciousness were, in fact, irreducible—or even separable from the brain in a way that would give comfort to Saint Augustine—my worldview would not be overturned. I know that we do not understand consciousness, and nothing that I think I know about the cosmos, or about the patent falsity of most religious beliefs, requires that I deny this. So, although I am an atheist who can be expected to be unforgiving of religious dogma, I am not reflexively hostile to claims of the sort Alexander has made. In principle, my mind is open. (It really is.)

But Alexander’s account is so bad—his reasoning so lazy and tendentious—that it would be beneath notice if not for the fact that it currently disgraces the cover of a major newsmagazine. Alexander is also releasing a book at the end of the month, Proof of Heaven: A Neurosurgeon’s Journey into the Afterlife, which seems destined to become an instant bestseller. As much as I would like to simply ignore the unfolding travesty, it would be derelict of me to do so.

But first things first: You really must read Alexander’s article.

I trust that doing so has given you cause to worry that the good doctor is just another casualty of American-style Christianity—for though he claims to have been a nonbeliever before his adventures in coma, he presents the following self-portrait:

Although I considered myself a faithful Christian, I was so more in name than in actual belief. I didn’t begrudge those who wanted to believe that Jesus was more than simply a good man who had suffered at the hands of the world. I sympathized deeply with those who wanted to believe that there was a God somewhere out there who loved us unconditionally. In fact, I envied such people the security that those beliefs no doubt provided. But as a scientist, I simply knew better than to believe them myself.

What it means to be a “faithful Christian” without “actual belief” is not spelled out, but few nonbelievers will be surprised when our hero’s scientific skepticism proves no match for his religious conditioning. Most of us have been around this block often enough to know that many “former atheists”—like Francis Collins—spent so long on the brink of faith, and yearned for its emotional consolations with such vampiric intensity, that the slightest breeze would send them spinning into the abyss. For Collins, you may recall, all it took to establish the divinity of Jesus and the coming resurrection of the dead was the sight of a frozen waterfall. Alexander seems to have required a ride on a psychedelic butterfly. In either case, it’s not the perception of beauty we should begrudge but the utter absence of intellectual seriousness with which the author interprets it.

Everything—absolutely everything—in Alexander’s account rests on repeated assertions that his visions of heaven occurred while his cerebral cortex was “shut down,” “inactivated,” “completely shut down,” “totally offline,” and “stunned to complete inactivity.” The evidence he provides for this claim is not only inadequate—it suggests that he doesn’t know anything about the relevant brain science. Perhaps he has saved a more persuasive account for his book—though now that I’ve listened to an hour-long interview with him online, I very much doubt it. In his Newsweek article, Alexander asserts that the cessation of cortical activity was “clear from the severity and duration of my meningitis, and from the global cortical involvement documented by CT scans and neurological examinations.” To his editors, this presumably sounded like neuroscience.

The problem, however, is that “CT scans and neurological examinations” can’t determine neuronal inactivity—in the cortex or anywhere else. And Alexander makes no reference to functional data that might have been acquired by fMRI, PET, or EEG—nor does he seem to realize that only this sort of evidence could support his case. Obviously, the man’s cortex is functioning now—he has, after all, written a book—so whatever structural damage appeared on CT could not have been “global.” (Otherwise, he would be claiming that his entire cortex was destroyed and then grew back.) Coma is not associated with the complete cessation of cortical activity, in any case. And to my knowledge, almost no one thinks that consciousness is purely a matter of cortical activity. Alexander’s unwarranted assumptions are proliferating rather quickly. Why doesn’t he know these things? He is, after all, a neurosurgeon who survived a coma and now claims to be upending the scientific worldview on the basis of the fact that his cortex was totally quiescent at the precise moment he was enjoying the best day of his life in the company of angels. Even if his entire cortex had truly shut down (again, an incredible claim), how can he know that his visions didn’t occur in the minutes and hours during which its functions returned?

I confess that I found Alexander’s account so alarmingly unscientific that I began to worry that something had gone wrong with my own brain. So I sought the opinion of Mark Cohen, a pioneer in the field of neuroimaging who holds appointments in the Departments of Psychiatry & Biobehavioral Science, Neurology, Psychology, Radiological Science, and Bioengineering at UCLA. (He was also my thesis advisor.) Here is part of what he had to say:

This poetic interpretation of his experience is not supported by evidence of any kind. As you correctly point out, coma does not equate to “inactivation of the cerebral cortex” or “higher-order brain functions totally offline” or “neurons of [my] cortex stunned into complete inactivity”. These describe brain death, a one hundred percent lethal condition. There are many excellent scholarly articles that discuss the definitions of coma. (For example: 1 & 2)

We are not privy to his EEG records, but high alpha activity is common in coma. Also common is “flat” EEG. The EEG can appear flat even in the presence of high activity, when that activity is not synchronous. For example, the EEG flattens in regions involved in direct task processing. This phenomenon is known as event-related desynchronization (hundreds of references).

As is obvious to you, this is truth by authority. Neurosurgeons, however, are rarely well-trained in brain function. Dr. Alexander cuts brains; he does not appear to study them. “There is no scientific explanation for the fact that while my body lay in coma, my mind—my conscious, inner self—was alive and well. While the neurons of my cortex were stunned to complete inactivity by the bacteria that had attacked them, my brain-free consciousness ...” True, science cannot explain brain-free consciousness. Of course, science cannot explain consciousness anyway. In this case, however, it would be parsimonious to reject the whole idea of consciousness in the absence of brain activity. Either his brain was active when he had these dreams, or they are a confabulation of whatever took place in his state of minimally conscious coma.

There are many reports of people remembering dream-like states while in medical coma. They lack consistency, of course, but there is nothing particularly unique in Dr. Alexander’s unfortunate episode.

Okay, so it appears that my own cortex hasn’t completely shut down. In fact, there are further problems with Alexander’s account. Not only does he appear ignorant of the relevant science, but he doesn’t realize how many people have experienced visions similar to his while their brains were operational. In his online interview we learn about the kinds of conversations he’s now having with skeptics:

I guess one could always argue, “Well, your brain was probably just barely able to ignite real consciousness and then it would flip back into a very diseased state,” which doesn’t make any sense to me. Especially because that hyper-real state is so indescribable and so crisp. It’s totally unlike any drug experience. A lot of people have come up to me and said, “Oh that sounds like a DMT experience,” or “That sounds like ketamine.” Not at all. That is not even in the right ballpark.

Those things do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding and of lessons taught by deceased loved ones and spiritual beings.

“Not in the right ballpark”? His experience sounds so much like a DMT trip that we are not only in the right ballpark, we are talking about the stitching on the same ball. Here is Alexander’s description of the afterlife:

I was a speck on a beautiful butterfly wing; millions of other butterflies around us. We were flying through blooming flowers, blossoms on trees, and they were all coming out as we flew through them… [there were] waterfalls, pools of water, indescribable colors, and above there were these arcs of silver and gold light and beautiful hymns coming down from them. Indescribably gorgeous hymns. I later came to call them “angels,” those arcs of light in the sky. I think that word is probably fairly accurate….

Then we went out of this universe. I remember just seeing everything receding and initially I felt as if my awareness was in an infinite black void. It was very comforting but I could feel the extent of the infinity and that it was, as you would expect, impossible to put into words. I was there with that Divine presence that was not anything that I could visibly see and describe, and with a brilliant orb of light….

They said there were many things that they would show me, and they continued to do that. In fact, the whole higher-dimensional multiverse was this incredibly complex corrugated ball and all these lessons coming into me about it. Part of the lessons involved becoming all of what I was being shown. It was indescribable.

But then I would find myself—and time out there I can say is totally different from what we call time. There was access from out there to any part of our space/time and that made it difficult to understand a lot of these memories because we always try to sequence things and put them in linear form and description. That just really doesn’t work.

Everything that Alexander describes here and in his Newsweek article, including the parts I have left out, has been reported by DMT users. The similarity is uncanny. Here is how the late Terence McKenna described the prototypical DMT trance:

Under the influence of DMT, the world becomes an Arabian labyrinth, a palace, a more than possible Martian jewel, vast with motifs that flood the gaping mind with complex and wordless awe. Color and the sense of a reality-unlocking secret nearby pervade the experience. There is a sense of other times, and of one’s own infancy, and of wonder, wonder and more wonder. It is an audience with the alien nuncio. In the midst of this experience, apparently at the end of human history, guarding gates that seem surely to open on the howling maelstrom of the unspeakable emptiness between the stars, is the Aeon.

The Aeon, as Heraclitus presciently observed, is a child at play with colored balls. Many diminutive beings are present there—the tykes, the self-transforming machine elves of hyperspace. Are they the children destined to be father to the man? One has the impression of entering into an ecology of souls that lies beyond the portals of what we naively call death. I do not know. Are they the synesthetic embodiment of ourselves as the Other, or of the Other as ourselves? Are they the elves lost to us since the fading of the magic light of childhood? Here is a tremendum barely to be told, an epiphany beyond our wildest dreams. Here is the realm of that which is stranger than we can suppose. Here is the mystery, alive, unscathed, still as new for us as when our ancestors lived it fifteen thousand summers ago. The tryptamine entities offer the gift of new language, they sing in pearly voices that rain down as colored petals and flow through the air like hot metal to become toys and such gifts as gods would give their children. The sense of emotional connection is terrifying and intense. The Mysteries revealed are real and if ever fully told will leave no stone upon another in the small world we have gone so ill in.

This is not the mercurial world of the UFO, to be invoked from lonely hilltops; this is not the siren song of lost Atlantis wailing through the trailer courts of crack-crazed America. DMT is not one of our irrational illusions. I believe that what we experience in the presence of DMT is real news. It is a nearby dimension—frightening, transformative, and beyond our powers to imagine, and yet to be explored in the usual way. We must send fearless experts, whatever that may come to mean, to explore and to report on what they find.  (Terence McKenna, Food of the Gods, pp. 258-259.)

Alexander believes that his E. coli-addled brain could not have produced his visions because they were too “intense,” too “hyper-real,” too “beautiful,” too “interactive,” and too drenched in significance for even a healthy brain to conjure. He also appears to think that despite their timeless quality, his visions could not have arisen in the minutes or hours during which his cortex (which surely never went off) switched back on. He clearly knows nothing about what people with working brains experience under the influence of psychedelics. Nor does he know that visions of the sort that McKenna describes, although they may seem to last for ages, require only a brief span of biological time. Unlike LSD and other long-acting psychedelics, DMT alters consciousness for merely a few minutes. Alexander would have had more than enough time to experience a visionary ecstasy as he was coming out of his coma (whether his cortex was rebooting or not).

Does Alexander know that DMT already exists in the brain as a neurotransmitter? Did his brain experience a surge of DMT release during his coma? This is pure speculation, of course, but it is a far more credible hypothesis than that his cortex “shut down,” freeing his soul to travel to another dimension. As one of his correspondents has already informed him, similar experiences can be had with ketamine, which is a surgical anesthetic that is occasionally used to protect a traumatized brain. Did Alexander by any chance receive ketamine while in the hospital? Would he even think it relevant if he had? His assertion that psychedelic compounds like DMT and ketamine “do not explain the kind of clarity, the rich interactivity, the layer upon layer of understanding” he experienced is perhaps the most amazing thing he has said since he returned from heaven. Such compounds are universally understood to do the job. And most scientists believe that the reliable effects of psychedelics indicate that the brain is at the very least involved in the production of visionary states of the sort Alexander is talking about.

Again, there is nothing to be said against Alexander’s experience. It sounds perfectly sublime. And such ecstasies do tell us something about how good a human mind can feel. The problem is that the conclusions Alexander has drawn from his experience—he continually reminds us, as a scientist—are based on some very obvious errors in reasoning and gaps in his understanding.

Let me suggest that, whether or not heaven exists, Alexander sounds precisely how a scientist should not sound when he doesn’t know what he is talking about. And his article is not the sort of thing that the editors of a once-important magazine should publish if they hope to reclaim some measure of respect for their battered brand.

Monday, October 8, 2012

Truth About Jobs

By PAUL KRUGMAN

If anyone had doubts about the madness that has spread through a large part of the American political spectrum, the reaction to Friday’s better-than-expected report from the Bureau of Labor Statistics should have settled the issue. For the immediate response of many on the right — and we’re not just talking fringe figures — was to cry conspiracy.

Leading the charge of what were quickly dubbed the “B.L.S. truthers” was none other than Jack Welch, the former chairman of General Electric, who posted an assertion on Twitter that the books had been cooked to help President Obama’s re-election campaign. His claim was quickly picked up by right-wing pundits and media personalities.

It was nonsense, of course. Job numbers are prepared by professional civil servants, at an agency that currently has no political appointees. But then maybe Mr. Welch — under whose leadership G.E. reported remarkably smooth earnings growth, with none of the short-term fluctuations you might have expected (fluctuations that reappeared under his successor) — doesn’t know how hard it would be to cook the jobs data.

Furthermore, the methods the bureau uses are public — and anyone familiar with the data understands that they are “noisy,” that especially good (or bad) months will be reported now and then as a simple consequence of statistical randomness. And that in turn means that you shouldn’t put much weight on any one month’s report.
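
To see why a single month’s figure deserves so little weight, here is a minimal simulation of the point; the trend and noise numbers are assumptions chosen for illustration, not actual B.L.S. parameters:

```python
import random
import statistics

random.seed(0)

# Hypothetical illustration, not BLS methodology: suppose the economy truly
# adds 150,000 jobs every month, but sampling error adds noise with a
# standard deviation of 60,000 jobs (both figures assumed for this sketch).
TRUE_TREND = 150_000
NOISE_SD = 60_000

reports = [random.gauss(TRUE_TREND, NOISE_SD) for _ in range(24)]

print(f"best single month:  {max(reports):>10,.0f}")   # can look like a boom
print(f"worst single month: {min(reports):>10,.0f}")   # can look like a slump
print(f"12-month average:   {statistics.mean(reports[-12:]):>10,.0f}")  # much closer to the trend
```

Any one month can look like a boom or a slump, but the 12-month average stays close to the underlying trend.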

In that case, however, what is the somewhat longer-term trend? Is the U.S. employment picture getting better? Yes, it is.

Some background: the monthly employment report is based on two surveys. One asks a random sample of employers how many people are on their payroll. The other asks a random sample of households whether their members are working or looking for work. And if you look at the trend over the past year or so, both surveys suggest a labor market that is gradually on the mend, with job creation consistently exceeding growth in the working-age population.

On the employer side, the current numbers say that over the past year the economy added 150,000 jobs a month, and revisions will probably push that number up significantly. That’s well above the 90,000 or so added jobs per month that we need to keep up with population. (This number used to be higher, but underlying work force growth has dropped off sharply now that many baby boomers are reaching retirement age.)

Meanwhile, the household survey produces estimates of both the number of Americans employed and the number unemployed, defined as people who are seeking work but don’t currently have a job. The eye-popping number from Friday’s report was a sudden drop in the unemployment rate to 7.8 percent from 8.1 percent, but as I said, you shouldn’t put too much emphasis on one month’s number. The more important point is that unemployment has been on a sustained downward trend.

But isn’t that just because people have given up looking for work, and hence no longer count as unemployed? Actually, no. It’s true that the employment-population ratio — the percentage of adults with jobs — has been more or less flat for the past year. But remember those aging baby boomers: the fraction of American adults who are in their prime working years is falling fast. Once you take the effects of an aging population into account, the numbers show a substantial improvement in the employment picture since the summer of 2011.
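
A toy calculation, with every number invented purely for illustration, shows how a flat headline employment-population ratio can conceal genuine improvement once the shrinking prime-age share is taken into account:

```python
# Invented numbers, purely to illustrate the adjustment described above:
# the headline employment-population ratio can stay flat even while the
# job market improves, if the prime-working-age share of adults is falling.

def overall_ratio(prime_share: float, prime_rate: float, other_rate: float) -> float:
    """Employment-population ratio as a weighted mix of two age groups."""
    return prime_share * prime_rate + (1 - prime_share) * other_rate

# Year 1: 62% of adults in their prime working years, 75% of them employed;
# 40% employment among everyone else.
year1 = overall_ratio(0.62, 0.75, 0.40)

# Year 2: boomers age out, prime share falls to 60%. Suppose the headline
# ratio is unchanged; what prime-age employment rate does that imply?
prime_rate_2 = (year1 - 0.40 * 0.40) / 0.60

print(f"headline ratio, both years: {year1:.3f}")
print(f"prime-age rate, year 1: 0.750")
print(f"prime-age rate, year 2: {prime_rate_2:.3f}")  # higher: real improvement
```

If the prime-age share falls while the headline ratio holds steady, the employment rate among prime-age adults must have risen.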

None of this should be taken to imply that the situation is good, or to deny that we should be doing better — a shortfall largely due to the scorched-earth tactics of Republicans, who have blocked any and all efforts to accelerate the pace of recovery. (If the American Jobs Act, proposed by the Obama administration last year, had been passed, the unemployment rate would probably be below 7 percent.) The U.S. economy is still far short of where it should be, and the job market has a long way to go before it makes up the ground lost in the Great Recession. But the employment data do suggest an economy that is slowly healing, an economy in which declining consumer debt burdens and a housing revival have finally put us on the road back to full employment.

And that’s the truth that the right can’t handle. The furor over Friday’s report revealed a political movement that is rooting for American failure, so obsessed with taking down Mr. Obama that good news for the nation’s long-suffering workers drives its members into a blind rage. It also revealed a movement that lives in an intellectual bubble, dealing with uncomfortable reality — whether that reality involves polls or economic data — not just by denying the facts, but by spinning wild conspiracy theories.

It is, quite simply, frightening to think that a movement this deranged wields so much political power.

Monday, August 20, 2012

An Unserious Man

August 19, 2012

By PAUL KRUGMAN

Mitt Romney’s choice of Paul Ryan as his running mate led to a wave of pundit accolades. Now, declared writer after writer, we’re going to have a real debate about the nation’s fiscal future. This was predictable: never mind the Tea Party, Mr. Ryan’s true constituency is the commentariat, which years ago decided that he was the Honest, Serious Conservative, whose proposals deserve respect even if you don’t like him.

But he isn’t and they don’t. Ryanomics is and always has been a con game, although to be fair, it has become even more of a con since Mr. Ryan joined the ticket.

Let’s talk about what’s actually in the Ryan plan, and let’s distinguish in particular between actual, specific policy proposals and unsupported assertions. To focus things a bit more, let’s talk — as most budget discussions do — about what’s supposed to happen over the next 10 years.

On the tax side, Mr. Ryan proposes big cuts in tax rates on top income brackets and corporations. He has tried to dodge the normal process in which tax proposals are “scored” by independent auditors, but the nonpartisan Tax Policy Center has done the math, and the revenue loss from these cuts comes to $4.3 trillion over the next decade.

On the spending side, Mr. Ryan proposes huge cuts in Medicaid, turning it over to the states while sharply reducing funding relative to projections under current policy. That saves around $800 billion. He proposes similar harsh cuts in food stamps, saving a further $130 billion or so, plus a grab-bag of other cuts, such as reduced aid to college students. Let’s be generous and say that all these cuts would save $1 trillion.

On top of this, Mr. Ryan includes the $716 billion in Medicare savings that are part of Obamacare, even though he wants to scrap everything else in that act. Despite this, Mr. Ryan has now joined Mr. Romney in denouncing President Obama for “cutting Medicare”; more on that in a minute.

So if we add up Mr. Ryan’s specific proposals, we have $4.3 trillion in tax cuts, partially offset by around $1.7 trillion in spending cuts — with the tax cuts, surprise, disproportionately benefiting the top 1 percent, while the spending cuts would primarily come at the expense of low-income families. Over all, the effect would be to increase the deficit by around two and a half trillion dollars.
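
For readers keeping score, the arithmetic of the preceding paragraphs can be checked directly; the small “other cuts” entry below is an assumed filler so that the non-Medicare spending cuts round generously to the column’s $1 trillion:

```python
# Adding up the column's figures (all amounts in trillions of dollars over
# the next decade).
tax_cuts = 4.3            # Tax Policy Center estimate of the revenue loss
medicaid = 0.8            # block-granting Medicaid to the states
food_stamps = 0.13        # harsh cuts to food stamps
other_cuts = 0.07         # assumed filler: student aid and other grab-bag items
medicare_savings = 0.716  # the Obamacare Medicare savings Mr. Ryan keeps

total_cuts = medicaid + food_stamps + other_cuts + medicare_savings
print(f"spending cuts: ~${total_cuts:.1f} trillion")                    # ~1.7
print(f"net deficit increase: ~${tax_cuts - total_cuts:.1f} trillion")  # ~2.6
```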

Yet Mr. Ryan claims to be a deficit hawk. What’s the basis for that claim?

Well, he says that he would offset his tax cuts by “base broadening,” eliminating enough tax deductions to make up the lost revenue. Which deductions would he eliminate? He refuses to say — and realistically, revenue gain on the scale he claims would be virtually impossible.

At the same time, he asserts that he would make huge further cuts in spending. What would he cut? He refuses to say.

What Mr. Ryan actually offers, then, are specific proposals that would sharply increase the deficit, plus an assertion that he has secret tax and spending plans that he refuses to share with us, but which will turn his overall plan into deficit reduction.

If this sounds like a joke, that’s because it is. Yet Mr. Ryan’s “plan” has been treated with great respect in Washington. He even received an award for fiscal responsibility from three of the leading deficit-scold pressure groups. What’s going on?

The answer, basically, is a triumph of style over substance. Over the longer term, the Ryan plan would end Medicare as we know it — and in Washington, “fiscal responsibility” is often equated with willingness to slash Medicare and Social Security, even if the purported savings would be used to cut taxes on the rich rather than to reduce deficits. Also, self-proclaimed centrists are always looking for conservatives they can praise to showcase their centrism, and Mr. Ryan has skillfully played into that weakness, talking a good game even if his numbers don’t add up.

The question now is whether Mr. Ryan’s undeserved reputation for honesty and fiscal responsibility can survive his participation in a deeply dishonest and irresponsible presidential campaign.

The first sign of trouble has already surfaced over the issue of Medicare. Mr. Romney, in an attempt to repeat the G.O.P.’s successful “death panels” strategy of the 2010 midterms, has been busily attacking the president for the same Medicare savings that are part of the Ryan plan. And Mr. Ryan’s response when this was pointed out was incredibly lame: he only included those cuts, he says, because the president put them “in the baseline,” whatever that means. Of course, whatever Mr. Ryan’s excuse, the fact is that without those savings his budget becomes even more of a plan to increase, not reduce, the deficit.

So will the choice of Mr. Ryan mean a serious campaign? No, because Mr. Ryan isn’t a serious man — he just plays one on TV.

Monday, August 6, 2012

The Science of Genocide

Posted on Aug 6, 2012

By Chris Hedges

On this day in 1945 the United States demonstrated that it was as morally bankrupt as the Nazi machine it had recently vanquished and the Soviet regime with which it was allied. Over Hiroshima, and three days later over Nagasaki, it exploded an atomic device that was the most efficient weapon of genocide in human history. The blast killed tens of thousands of men, women and children. It was an act of mass annihilation that was strategically and militarily indefensible. The Japanese had been on the verge of surrender. Hiroshima and Nagasaki had no military significance. It was a war crime for which no one was ever tried. The explosions, which marked the culmination of three centuries of physics, signaled the ascendancy of the technician and scientist as our most potent agents of death.

“In World War II Auschwitz and Hiroshima showed that progress through technology has escalated man’s destructive impulses into more precise and incredibly more devastating form,” Bruno Bettelheim said. “The concentration camps with their gas chambers, the first atomic bomb … confronted us with the stark reality of overwhelming death, not so much one’s own—this each of us has to face sooner or later, and however uneasily, most of us manage not to be overpowered by our fear of it—but the unnecessary and untimely death of millions. … Progress not only failed to preserve life but it deprived millions of their lives more effectively than had ever been possible before. Whether we choose to recognize it or not, after the second World War Auschwitz and Hiroshima became monuments to the incredible devastation man and technology together bring about.”

The atomic blasts, ignited in large part to send a message to the Soviet Union, were a reminder that science is morally neutral. Science and technology serve the ambitions of humankind. And few in the sciences look beyond the narrow tasks handed to them by corporations or government. They employ their dark arts, often blind to the consequences, to cement into place systems of security and surveillance, as well as systems of environmental destruction, that will result in collective enslavement and mass extermination. As we veer toward environmental collapse we will have to pit ourselves against many of these experts, scientists and technicians whose loyalty is to institutions that profit from exploitation and death.

Scientists and technicians in the United States over the last five decades built 70,000 nuclear weapons at a cost of $5.5 trillion. (The Soviet Union had a nuclear arsenal of similar capability.) By 1963, according to the Columbia University professor Seymour Melman, the United States could overkill the 140 principal cities in the Soviet Union more than 78 times. Yet we went on manufacturing nuclear warheads. And those who publicly questioned the rationality of the massive nuclear buildup, such as J. Robert Oppenheimer, who at the government lab at Los Alamos, N.M., had overseen the building of the two bombs used on Japan, often were zealously persecuted on suspicion of being communists or communist sympathizers. It was a war plan that called for a calculated act of enormous, criminal genocide. We built more and more bombs with the sole purpose of killing hundreds of millions of people. And those who built them, with few exceptions, never gave a thought to their suicidal creations.

“What are we to make of a civilization which has always regarded ethics as an essential part of human life [but] which has not been able to talk about the prospect of killing almost everyone except in prudential and game-theoretical terms?” Oppenheimer asked after World War II.

Max Born, the great German-British physicist and mathematician who was instrumental in the development of quantum mechanics, in his memoirs made it clear he disapproved of Oppenheimer and the other physicists who built the atomic bombs. “It is satisfying to have had such clever and efficient pupils,” Born wrote, “but I wish they had shown less cleverness and more wisdom.” Oppenheimer wrote his old teacher back. “Over the years, I have felt a certain disapproval on your part for much that I have done. This has always seemed to me quite natural, for it is a sentiment that I share.” But of course, by then, it was too late.

It was science, industry and technology that made possible the 20th century’s industrial killing. These forces magnified innate human barbarity. They served the immoral. And there are numerous scientists who continue to work in labs across the country on weapons systems that have the capacity to exterminate millions of human beings. Is this a “rational” enterprise? Is it moral? Does it advance the human species? Does it protect life?

For many of us, science has supplanted religion. We harbor a naive faith in the godlike power of science. Since scientific knowledge is cumulative, albeit morally neutral, it gives the illusion that human history and human progress also are cumulative. Science is for us what totems and spells were for our premodern ancestors. It is magical thinking. It feeds our hubris and sense of divine empowerment. And trusting in its fearsome power will mean our extinction.

The 17th century Enlightenment myth of human advancement through science, reason and rationality should have been obliterated forever by the slaughter of World War I. Europeans watched the collective suicide of a generation. The darker visions of human nature embodied in the works of Fyodor Dostoevsky, Leo Tolstoy, Thomas Hardy, Joseph Conrad and Friedrich Nietzsche before the war found modern expression in the work of Sigmund Freud, James Joyce, Marcel Proust, Franz Kafka, D.H. Lawrence, Thomas Mann and Samuel Beckett, along with atonal and dissonant composers such as Igor Stravinsky and painters such as Otto Dix, George Grosz, Henri Matisse and Pablo Picasso. Human progress, these artists and writers understood, was a joke. But there were many more who enthusiastically embraced new utopian visions of progress and glory peddled by fascists and communists. These belief systems defied reality. They fetishized death. They sought unattainable utopias through violence. And empowered by science and technology, they killed millions.

Human motives often are irrational and, as Freud pointed out, contain powerful yearnings for death and self-immolation. Science and technology have empowered and amplified the ancient lusts for war, violence and death. Knowledge did not free humankind from barbarism. The civilized veneer only masked the dark, inchoate longings that plague all human societies, including our own. Freud feared the destructive power of these urges. He warned in “Civilization and Its Discontents” that if we could not regulate or contain these urges, human beings would, as the Stoics predicted, consume themselves in a vast conflagration. The future of the human race depends on naming and controlling these urges. To pretend they do not exist is to fall into self-delusion.

The breakdown of social and political control during periods of political and economic turmoil allows these urges to reign supreme. Our first inclination, Freud noted correctly, is not to love one another as brothers or sisters but to “satisfy [our] aggressiveness on [our fellow human being], to exploit his capacity for work without compensation, to use him sexually without his consent, to seize his possessions, to humiliate him, to cause him pain, to torture and to kill him.” The war in Bosnia, with rampaging Serbian militias, rape camps, torture centers, concentration camps, razed villages and mass executions, was one of numerous examples of Freud’s wisdom. At best, Freud knew, we can learn to live with, regulate and control our inner tensions and conflicts. The structure of civilized societies would always be fraught with this inner tension, he wrote, because “… man’s natural aggressive instinct, the hostility of each against all and of all against each, opposes this program of civilization.” The burden of civilization is worth it. The alternative, as Freud knew, is self-destruction.

A rational world, a world that will protect the ecosystem and build economies that learn to distribute wealth rather than allow a rapacious elite to hoard it, will never be handed to us by the scientists and technicians. Nearly all of them work for the enemy. Mary Shelley warned us about becoming Prometheus as we seek to defy fate and the gods in order to master life and death. Her Victor Frankenstein, when his 8-foot-tall creation made partly of body pieces from graves came to ghastly life, had the same reaction as Oppenheimer when the American scientist discovered that his bomb had incinerated Japanese schoolchildren. The scientist Victor Frankenstein watched the “dull yellow eye” of his creature open, and “breathless horror and disgust” filled his heart. Oppenheimer said after the first atomic bomb was detonated in the New Mexican desert: “I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the Prince that he should do his duty and to impress him takes on his multi-armed form and says, ‘Now I am become Death, the destroyer of worlds.’ I suppose we all thought that, in one way or another.” The critic Harold Bloom, in words that could be applied to Oppenheimer, called Victor Frankenstein “a moral idiot.”

All attempts to control the universe, to play God, to become the arbiters of life and death, have been carried out by moral idiots. They will relentlessly push forward, exploiting and pillaging, perfecting their terrible tools of technology and science, until their creation destroys them and us. They make the nuclear bombs. They extract oil from the tar sands. They turn the Appalachians into a wasteland to extract coal. They serve the evils of globalism and finance. They run the fossil fuel industry. They flood the atmosphere with carbon emissions, doom the seas, melt the polar ice caps, unleash the droughts and floods, the heat waves, the freak storms and hurricanes.

Now I am become Death, the destroyer of worlds.

Monday, July 23, 2012

Loading the Climate Dice

July 22, 2012

By PAUL KRUGMAN

A couple of weeks ago the Northeast was in the grip of a severe heat wave. As I write this, however, it’s a fairly cool day in New Jersey, considering that it’s late July. Weather is like that; it fluctuates.

And this banal observation may be what dooms us to climate catastrophe, in two ways. On one side, the variability of temperatures from day to day and year to year makes it easy to miss, ignore or obscure the longer-term upward trend. On the other, even a fairly modest rise in average temperatures translates into a much higher frequency of extreme events — like the devastating drought now gripping America’s heartland — that do vast damage.

On the first point: Even with the best will in the world, it would be hard for most people to stay focused on the big picture in the face of short-run fluctuations. When the mercury is high and the crops are withering, everyone talks about it, and some make the connection to global warming. But let the days grow a bit cooler and the rains fall, and inevitably people’s attention turns to other matters.

Making things much worse, of course, is the role of players who don’t have the best will in the world. Climate change denial is a major industry, lavishly financed by Exxon, the Koch brothers and others with a financial stake in the continued burning of fossil fuels. And exploiting variability is one of the key tricks of that industry’s trade. Applications range from the Fox News perennial — “It’s cold outside! Al Gore was wrong!” — to the constant claims that we’re experiencing global cooling, not warming, because it’s not as hot right now as it was a few years back.

How should we think about the relationship between climate change and day-to-day experience? Almost a quarter of a century ago James Hansen, the NASA scientist who did more than anyone to put climate change on the agenda, suggested the analogy of loaded dice. Imagine, he and his associates suggested, representing the probabilities of a hot, average or cold summer by historical standards as a die with two faces painted red, two white and two blue. By the early 21st century, they predicted, it would be as if four of the faces were red, one white and one blue. Hot summers would become much more frequent, but there would still be cold summers now and then.

And so it has proved. As documented in a new paper by Dr. Hansen and others, cold summers by historical standards still happen, but rarely, while hot summers have in fact become roughly twice as prevalent. And 9 of the 10 hottest years on record have occurred since 2000.

But that’s not all: really extreme high temperatures, the kind of thing that used to happen very rarely in the past, have now become fairly common. Think of it as rolling two sixes, which happens less than 3 percent of the time with fair dice, but more often when the dice are loaded. And this rising incidence of extreme events, reflecting the same variability of weather that can obscure the reality of climate change, means that the costs of climate change aren’t a distant prospect, decades in the future. On the contrary, they’re already here, even though so far global temperatures are only about 1 degree Fahrenheit above their historical norms, a small fraction of their eventual rise if we don’t act.
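
Hansen’s loaded-dice analogy, and the two-sixes point about extreme events, can be made concrete in a few lines; the 1/3 loading used in the final step is an assumption for illustration, not a figure from the column:

```python
import random

random.seed(0)
TRIALS = 100_000

# Hansen's analogy: a six-sided climate die. Historically two faces were
# red (hot summer), two white (average) and two blue (cold); by the early
# 21st century it is as if four faces were red, one white and one blue.
historical = ["red"] * 2 + ["white"] * 2 + ["blue"] * 2
loaded = ["red"] * 4 + ["white"] + ["blue"]

def share(die, color):
    """Estimate how often the die comes up a given color."""
    return sum(random.choice(die) == color for _ in range(TRIALS)) / TRIALS

print(f"hot summers, old climate: {share(historical, 'red'):.2f}")  # ~0.33
print(f"hot summers, new climate: {share(loaded, 'red'):.2f}")      # ~0.67, twice as common
print(f"cold summers still occur: {share(loaded, 'blue'):.2f}")     # ~0.17, but rarely

# Extreme events as "two sixes". With fair numbered dice the chance is
# 1/36, under 3 percent; load each die so a six comes up with probability
# 1/3 (an assumed loading) and the extreme outcome becomes four times
# as likely.
print(f"fair dice, double six:   {(1 / 6) ** 2:.3f}")   # 0.028
print(f"loaded dice, double six: {(1 / 3) ** 2:.3f}")   # 0.111
```

Modest loading leaves cold summers possible, but it multiplies the frequency of the once-rare extremes.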

The great Midwestern drought is a case in point. This drought has already sent corn prices to their highest level ever. If it continues, it could cause a global food crisis, because the U.S. heartland is still the world’s breadbasket. And yes, the drought is linked to climate change: such events have happened before, but they’re much more likely now than they used to be.

Now, maybe this drought will break in time to avoid the worst. But there will be more events like this. Joseph Romm, the influential climate blogger, has coined the term “Dust-Bowlification” for the prospect of extended periods of extreme drought in formerly productive agricultural areas. He has been arguing for some time that this phenomenon, with its disastrous effects on food security, is likely to be the leading edge of damage from climate change, taking place over the next few decades; the drowning of Florida by rising sea levels and all that will come later.

And here it comes.

Will the current drought finally lead to serious climate action? History isn’t encouraging. The deniers will surely keep on denying, especially because conceding at this point that the science they’ve trashed was right all along would be to admit their own culpability for the looming disaster. And the public is all too likely to lose interest again the next time the die comes up white or blue.

But let’s hope that this time is different. For large-scale damage from climate change is no longer a disaster waiting to happen. It’s happening now.

Friday, July 20, 2012

Pathos of the Plutocrat

July 19, 2012

By PAUL KRUGMAN

“Let me tell you about the very rich. They are different from you and me.” So wrote F. Scott Fitzgerald — and he didn’t just mean that they have more money. What he meant instead, at least in part, was that many of the very rich expect a level of deference that the rest of us never experience and are deeply distressed when they don’t get the special treatment they consider their birthright; their wealth “makes them soft where we are hard.”

And because money talks, this softness — call it the pathos of the plutocrats — has become a major factor in America’s political life.

It’s no secret that, at this point, many of America’s richest men — including some former Obama supporters — hate, just hate, President Obama. Why? Well, according to them, it’s because he “demonizes” business — or as Mitt Romney put it earlier this week, he “attacks success.” Listening to them, you’d think that the president was the second coming of Huey Long, preaching class hatred and the need to soak the rich.

Needless to say, this is crazy. In fact, Mr. Obama always bends over backward to declare his support for free enterprise and his belief that getting rich is perfectly fine. All that he has done is to suggest that sometimes businesses behave badly, and that this is one reason we need things like financial regulation. No matter: even this hint that sometimes the rich aren’t completely praiseworthy has been enough to drive plutocrats wild. For two years or more, Wall Street in particular has been crying: “Ma! He’s looking at me funny!”

Wait, there’s more. Not only do many of the superrich feel deeply aggrieved at the notion that anyone in their class might face criticism, they also insist that their perception that Mr. Obama doesn’t like them is at the root of our economic problems. Businesses aren’t investing, they say, because business leaders don’t feel valued. Mr. Romney repeated this line, too, arguing that because the president attacks success “we have less success.”

This, too, is crazy (and it’s disturbing that Mr. Romney appears to share this delusional view about what ails our economy). There’s no mystery about the reasons the economic recovery has been so weak. Housing is still depressed in the aftermath of a huge bubble, and consumer demand is being held back by the high levels of household debt that are the legacy of that bubble. Business investment has actually held up fairly well given this weakness in demand. Why should businesses invest more when they don’t have enough customers to make full use of the capacity they already have?

But never mind. Because the rich are different from you and me, many of them are incredibly self-centered. They don’t even see how funny it is — how ridiculous they look — when they attribute the weakness of a $15 trillion economy to their own hurt feelings. After all, who’s going to tell them? They’re safely ensconced in a bubble of deference and flattery.

Unless, that is, they run for public office.

Like everyone else following the news, I’ve been awe-struck by the way questions about Mr. Romney’s career at Bain Capital, the private-equity firm he founded, and his refusal to release tax returns have so obviously caught the Romney campaign off guard. Shouldn’t a very wealthy man running for president — and running specifically on the premise that his business success makes him qualified for office — have expected the nature of that success to become an issue? Shouldn’t it have been obvious that refusing to release tax returns from before 2010 would raise all kinds of suspicions?

By the way, while we don’t know what Mr. Romney is hiding in earlier returns, the fact that he is still stonewalling despite calls by Republicans as well as Democrats to come clean suggests that it could be something seriously damaging.

Anyway, what’s now apparent is that the campaign was completely unprepared for the obvious questions, and it has reacted to the Obama campaign’s decision to ask those questions with a hysteria that surely must be coming from the top. Clearly, Mr. Romney believed that he could run for president while remaining safe inside the plutocratic bubble and is both shocked and angry at the discovery that the rules that apply to others also apply to people like him. Fitzgerald again, about the very rich: “They think, deep down, that they are better than we are.”

O.K., let’s take a deep breath. The truth is that many, and probably most, of the very rich don’t fit Fitzgerald’s description. There are plenty of very rich Americans who have a sense of perspective, who take pride in their achievements without believing that their success entitles them to live by different rules.

But Mitt Romney, it seems, isn’t one of those people. And that discovery may be an even bigger issue than whatever is hidden in those tax returns he won’t release.

Friday, July 13, 2012

Who’s Very Important?

July 12, 2012

By PAUL KRUGMAN

“Is there a V.I.P. entrance? We are V.I.P.” That remark, by a donor waiting to get into one of Mitt Romney’s recent fund-raisers in the Hamptons, pretty much sums up the attitude of America’s wealthy elite. Mr. Romney’s base — never mind the top 1 percent, we’re talking about the top 0.01 percent or higher — is composed of very self-important people.

Specifically, these are people who believe that they are, as another Romney donor put it, “the engine of the economy”; they should be cherished, and the taxes they pay, which are already at an 80-year low, should be cut even further. Unfortunately, said yet another donor, the “common person” — for example, the “nails ladies” — just doesn’t get it.

O.K., it’s easy to mock these people, but the joke’s really on us. For the “we are V.I.P.” crowd has fully captured the modern Republican Party, to such an extent that leading Republicans consider Mr. Romney’s apparent use of multimillion-dollar offshore accounts to dodge federal taxes not just acceptable but praiseworthy: “It’s really American to avoid paying taxes, legally,” declared Senator Lindsey Graham, Republican of South Carolina. And there is, of course, a good chance that Republicans will control both Congress and the White House next year.

If that happens, we’ll see a sharp turn toward economic policies based on the proposition that we need to be especially solicitous toward the superrich — I’m sorry, I mean the “job creators.” So it’s important to understand why that’s wrong.

The first thing you need to know is that America wasn’t always like this. When John F. Kennedy was elected president, the top 0.01 percent was only about a quarter as rich compared with the typical family as it is now — and members of that class paid much higher taxes than they do today. Yet somehow we managed to have a dynamic, innovative economy that was the envy of the world. The superrich may imagine that their wealth makes the world go round, but history says otherwise.

To this historical observation we should add another note: quite a few of today’s superrich, Mr. Romney included, make or made their money in the financial sector, buying and selling assets rather than building businesses in the old-fashioned sense. Indeed, the soaring share of the wealthy in national income went hand in hand with the explosive growth of Wall Street.

Not long ago, we were told that all this wheeling and dealing was good for everyone, that it was making the economy both more efficient and more stable. Instead, it turned out that modern finance was laying the foundation for a severe economic crisis whose fallout continues to afflict millions of Americans, and that taxpayers had to bail out many of those supposedly brilliant bankers to prevent an even worse crisis. So at least some members of the top 0.01 percent are best viewed as job destroyers rather than job creators.

Did I mention that those bailed-out bankers are now overwhelmingly backing Mr. Romney, who promises to reverse the mild financial reforms introduced after the crisis?

To be sure, many and probably most of the rich do, in fact, contribute positively to the economy. However, they also receive large monetary rewards. Yet somehow $20 million-plus in annual income isn’t enough. They want to be revered, too, and given special treatment in the form of low taxes. And that is more than they deserve. After all, the “common person” also makes a positive contribution to the economy. Why single out the rich for extra praise and perks?

What about the argument that we must keep taxes on the rich low lest we remove their incentive to create wealth? The answer is that we have a lot of historical evidence, going all the way back to the 1920s, on the effects of tax increases on the rich, and none of it supports the view that the kinds of tax-rate changes for the rich currently on the table — President Obama’s proposal for a modest rise, Mr. Romney’s call for further cuts — would have any major effect on incentives. Remember when all the usual suspects claimed that the economy would crash when Bill Clinton raised taxes in 1993?

Furthermore, if you’re really concerned about the incentive effects of public policy, you should be focused not on the rich but on workers making $20,000 to $30,000 a year, who are often penalized for any gain in income because they end up losing means-tested benefits like Medicaid and food stamps. I’ll have more to say about that in another column. By the way, in 2010, the average annual wage of manicurists — “nails ladies,” in Romney-donor speak — was $21,760.

So, are the very rich V.I.P.? No, they aren’t — at least no more so than other working Americans. And the “common person” will be hurt, not helped, if we end up with government of the 0.01 percent, by the 0.01 percent, for the 0.01 percent.

Friday, June 29, 2012

The Real Winners

June 28, 2012

By PAUL KRUGMAN

So the Supreme Court — defying many expectations — upheld the Affordable Care Act, a k a Obamacare. There will, no doubt, be many headlines declaring this a big victory for President Obama, which it is. But the real winners are ordinary Americans — people like you.

How many people are we talking about? You might say 30 million, the number of additional people the Congressional Budget Office says will have health insurance thanks to Obamacare. But that vastly understates the true number of winners because millions of other Americans — including many who oppose the act — would have been at risk of being one of those 30 million.

So add in every American who currently works for a company that offers good health insurance but is at risk of losing that job (and who isn’t in this world of outsourcing and private equity buyouts?); every American who would have found health insurance unaffordable but will now receive crucial financial help; every American with a pre-existing condition who would have been flatly denied coverage in many states.

In short, unless you belong to that tiny class of wealthy Americans who are insulated and isolated from the realities of most people’s lives, the winners from that Supreme Court decision are your friends, your relatives, the people you work with — and, very likely, you. For almost all of us stand to benefit from making America a kinder and more decent society.

But what about the cost? Put it this way: the budget office’s estimate of the cost over the next decade of Obamacare’s “coverage provisions” — basically, the subsidies needed to make insurance affordable for all — is only about a third of the cost of the tax cuts, overwhelmingly favoring the wealthy, that Mitt Romney is proposing over the same period. True, Mr. Romney says that he would offset that cost, but he has failed to provide any plausible explanation of how he’d do that. The Affordable Care Act, by contrast, is fully paid for, with an explicit combination of tax increases and spending cuts elsewhere.

So the law that the Supreme Court upheld is an act of human decency that is also fiscally responsible. It’s not perfect, by a long shot — it is, after all, originally a Republican plan, devised long ago as a way to forestall the obvious alternative of extending Medicare to cover everyone. As a result, it’s an awkward hybrid of public and private insurance that isn’t the way anyone would have designed a system from scratch. And there will be a long struggle to make it better, just as there was for Social Security. (Bring back the public option!) But it’s still a big step toward a better — and by that I mean morally better — society.

Which brings us to the nature of the people who tried to kill health reform — and who will, of course, continue their efforts despite this unexpected defeat.

At one level, the most striking thing about the campaign against reform was its dishonesty. Remember “death panels”? Remember how reform’s opponents would, in the same breath, accuse Mr. Obama of promoting big government and denounce him for cutting Medicare? Politics ain’t beanbag, but, even in these partisan times, the unscrupulous nature of the campaign against reform was exceptional. And, rest assured, all the old lies and probably a bunch of new ones will be rolled out again in the wake of the Supreme Court’s decision. Let’s hope the Democrats are ready.

But what was and is really striking about the anti-reformers is their cruelty. It would be one thing if, at any point, they had offered any hint of an alternative proposal to help Americans with pre-existing conditions, Americans who simply can’t afford expensive individual insurance, Americans who lose coverage along with their jobs. But it has long been obvious that the opposition’s goal is simply to kill reform, never mind the human consequences. We should all be thankful that, for the moment at least, that effort has failed.

Let me add a final word on the Supreme Court. Before the arguments began, the overwhelming consensus among legal experts who aren’t hard-core conservatives — and even among some who are — was that Obamacare was clearly constitutional. And, in the end, thanks to Chief Justice John Roberts Jr., the court upheld that view. But four justices dissented, and did so in extreme terms, proclaiming not just the much-disputed individual mandate but the whole act unconstitutional. Given prevailing legal opinion, it’s hard to see that position as anything but naked partisanship.

The point is that this isn’t over — not on health care, not on the broader shape of American society. The cruelty and ruthlessness that made this court decision such a nail-biter aren’t going away. But, for now, let’s celebrate. This was a big day, a victory for due process, decency and the American people.

Friday, June 22, 2012

Prisons, Privatization, Patronage

June 21, 2012

By PAUL KRUGMAN

Over the past few days, The New York Times has published several terrifying reports about New Jersey’s system of halfway houses — privately run adjuncts to the regular system of prisons. The series is a model of investigative reporting, which everyone should read. But it should also be seen in context. The horrors described are part of a broader pattern in which essential functions of government are being both privatized and degraded.

First of all, about those halfway houses: In 2010, Chris Christie, the state’s governor — who has close personal ties to Community Education Centers, the largest operator of these facilities, and who once worked as a lobbyist for the firm — described the company’s operations as “representing the very best of the human spirit.” But The Times’s reports instead portray something closer to hell on earth — an understaffed, poorly run system, with a demoralized work force, from which the most dangerous individuals often escape to wreak havoc, while relatively mild offenders face terror and abuse at the hands of other inmates.

It’s a terrible story. But, as I said, you really need to see it in the broader context of a nationwide drive on the part of America’s right to privatize government functions, very much including the operation of prisons. What’s behind this drive?

You might be tempted to say that it reflects conservative belief in the magic of the marketplace, in the superiority of free-market competition over government planning. And that’s certainly the way right-wing politicians like to frame the issue.

But if you think about it even for a minute, you realize that the one thing the companies that make up the prison-industrial complex — companies like Community Education or the private-prison giant Corrections Corporation of America — are definitely not doing is competing in a free market. They are, instead, living off government contracts. There isn’t any market here, and there is, therefore, no reason to expect any magical gains in efficiency.

And, sure enough, despite many promises that prison privatization will lead to big cost savings, such savings — as a comprehensive study by the Bureau of Justice Assistance, part of the U.S. Department of Justice, concluded — “have simply not materialized.” To the extent that private prison operators do manage to save money, they do so through “reductions in staffing patterns, fringe benefits, and other labor-related costs.”

So let’s see: Privatized prisons save money by employing fewer guards and other workers, and by paying them badly. And then we get horror stories about how these prisons are run. What a surprise!

So what’s really behind the drive to privatize prisons, and just about everything else?

One answer is that privatization can serve as a stealth form of government borrowing, in which governments avoid recording upfront expenses (or even raise money by selling existing facilities) while raising their long-run costs in ways taxpayers can’t see. We hear a lot about the hidden debts that states have incurred in the form of pension liabilities; we don’t hear much about the hidden debts now being accumulated in the form of long-term contracts with private companies hired to operate prisons, schools and more.

Another answer is that privatization is a way of getting rid of public employees, who do have a habit of unionizing and tend to lean Democratic in any case.

But the main answer, surely, is to follow the money. Never mind what privatization does or doesn’t do to state budgets; think instead of what it does for both the campaign coffers and the personal finances of politicians and their friends. As more and more government functions get privatized, states become pay-to-play paradises, in which both political contributions and contracts for friends and relatives become a quid pro quo for getting government business. Are the corporations capturing the politicians, or the politicians capturing the corporations? Does it matter?

Now, someone will surely point out that nonprivatized government has its own problems of undue influence, that prison guards and teachers’ unions also have political clout, and that this clout sometimes distorts public policy. Fair enough. But such influence tends to be relatively transparent. Everyone knows about those arguably excessive public pensions; it took a monthslong investigation by The Times to bring the account of New Jersey’s halfway-house hell to light.

The point, then, is that you shouldn’t imagine that what The Times discovered about prison privatization in New Jersey is an isolated instance of bad behavior. It is, instead, almost surely a glimpse of a pervasive and growing reality, of a corrupt nexus of privatization and patronage that is undermining government across much of our nation.

Monday, June 18, 2012

Greece as Victim

June 17, 2012

By PAUL KRUGMAN

Ever since Greece hit the skids, we’ve heard a lot about what’s wrong with everything Greek. Some of the accusations are true, some are false — but all of them are beside the point. Yes, there are big failings in Greece’s economy, its politics and no doubt its society. But those failings aren’t what caused the crisis that is tearing Greece apart, and threatens to spread across Europe.

No, the origins of this disaster lie farther north, in Brussels, Frankfurt and Berlin, where officials created a deeply — perhaps fatally — flawed monetary system, then compounded the problems of that system by substituting moralizing for analysis. And the solution to the crisis, if there is one, will have to come from the same places.

So, about those Greek failings: Greece does indeed have a lot of corruption and a lot of tax evasion, and the Greek government has had a habit of living beyond its means. Beyond that, Greek labor productivity is low by European standards — about 25 percent below the European Union average. It’s worth noting, however, that labor productivity in, say, Mississippi is similarly low by American standards — and by about the same margin.

On the other hand, many things you hear about Greece just aren’t true. The Greeks aren’t lazy — on the contrary, they work longer hours than almost anyone else in Europe, and much longer hours than the Germans in particular. Nor does Greece have a runaway welfare state, as conservatives like to claim; social expenditure as a percentage of G.D.P., the standard measure of the size of the welfare state, is substantially lower in Greece than in, say, Sweden or Germany, countries that have so far weathered the European crisis pretty well.

So how did Greece get into so much trouble? Blame the euro.

Fifteen years ago Greece was no paradise, but it wasn’t in crisis either. Unemployment was high but not catastrophic, and the nation more or less paid its way on world markets, earning enough from exports, tourism, shipping and other sources to cover its imports.

Then Greece joined the euro, and a terrible thing happened: people started believing that it was a safe place to invest. Foreign money poured into Greece, some but not all of it financing government deficits; the economy boomed; inflation rose; and Greece became increasingly uncompetitive. To be sure, the Greeks squandered much if not most of the money that came flooding in, but then so did everyone else who got caught up in the euro bubble.

And then the bubble burst, at which point the fundamental flaws in the whole euro system became all too apparent.

Ask yourself, why does the dollar area — also known as the United States of America — more or less work, without the kind of severe regional crises now afflicting Europe? The answer is that we have a strong central government, and the activities of this government in effect provide automatic bailouts to states that get in trouble.

Consider, for example, what would be happening to Florida right now, in the aftermath of its huge housing bubble, if the state had to come up with the money for Social Security and Medicare out of its own suddenly reduced revenues. Luckily for Florida, Washington rather than Tallahassee is picking up the tab, which means that Florida is in effect receiving a bailout on a scale no European nation could dream of.

Or consider an older example, the savings and loan crisis of the 1980s, which was largely a Texas affair. Taxpayers ended up paying a huge sum to clean up the mess — but the vast majority of those taxpayers were in states other than Texas. Again, the state received an automatic bailout on a scale inconceivable in modern Europe.

So Greece, although not without sin, is mainly in trouble thanks to the arrogance of European officials, mostly from richer countries, who convinced themselves that they could make a single currency work without a single government. And these same officials have made the situation even worse by insisting, in the teeth of the evidence, that all the currency’s troubles were caused by irresponsible behavior on the part of those Southern Europeans, and that everything would work out if only people were willing to suffer some more.

Which brings us to Sunday’s Greek election, which ended up settling nothing. The governing coalition may have managed to stay in power, although even that’s not clear (the junior partner in the coalition is threatening to defect). But the Greeks can’t solve this crisis anyway.

The only way the euro might — might — be saved is if the Germans and the European Central Bank realize that they’re the ones who need to change their behavior, spending more and, yes, accepting higher inflation. If not — well, Greece will basically go down in history as the victim of other people’s hubris.

Friday, June 15, 2012

We Don’t Need No Education

June 14, 2012

By PAUL KRUGMAN

Hope springs eternal. For a few hours I was ready to applaud Mitt Romney for speaking honestly about what his calls for smaller government actually mean.

Never mind. Soon the candidate was being his normal self, denying having said what he said and serving up a bunch of self-contradictory excuses. But let’s talk about his accidental truth-telling, and what it reveals.

In the remarks Mr. Romney later tried to deny, he derided President Obama: “He says we need more firemen, more policemen, more teachers.” Then he declared, “It’s time for us to cut back on government and help the American people.”

You can see why I was ready to give points for honesty. For once, he actually admitted what he and his allies mean when they talk about shrinking government. Conservatives love to pretend that there are vast armies of government bureaucrats doing who knows what; in reality, a majority of government workers are employed providing either education (teachers) or public protection (police officers and firefighters).

So would getting rid of teachers, police officers, and firefighters help the American people? Well, some Republicans would prefer to see Americans get less education; remember Rick Santorum’s description of colleges as “indoctrination mills”? Still, neither less education nor worse protection is an issue the G.O.P. wants to run on.

But the more relevant question for the moment is whether the public job cuts Mr. Romney applauds are good or bad for the economy. And we now have a lot of evidence bearing on that question.

First of all, there’s our own experience. Conservatives would have you believe that our disappointing economic performance has somehow been caused by excessive government spending, which crowds out private job creation. But the reality is that private-sector job growth has more or less matched the recoveries from the last two recessions; the big difference this time is an unprecedented fall in public employment, which is now about 1.4 million jobs below what it would be if it had grown as fast as it did under President George W. Bush.

And, if we had those extra jobs, the unemployment rate would be much lower than it is — something like 7.3 percent instead of 8.2 percent. It sure looks as if cutting government when the economy is deeply depressed hurts rather than helps the American people.
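
The 7.3 percent figure is easy to reproduce. In the sketch below, the 1.4 million jobs gap and the 8.2 percent actual rate come from the column; the roughly 155 million labor force is a round figure I am assuming for mid-2012, and the calculation assumes the missing jobs would have gone to people currently counted as unemployed.

    # Back-of-the-envelope counterfactual unemployment rate.
    # Jobs gap and actual rate are from the column; the labor force
    # is an assumed round figure for mid-2012.
    labor_force = 155.0e6
    rate_actual = 0.082
    jobs_gap = 1.4e6

    unemployed = rate_actual * labor_force  # ~12.7 million people
    # Assume the missing public jobs would have been filled from the
    # ranks of the unemployed, with the labor force itself unchanged:
    rate_counterfactual = (unemployed - jobs_gap) / labor_force
    print(f"{rate_counterfactual:.1%}")  # -> 7.3%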

The really decisive evidence on government cuts, however, comes from Europe. Consider the case of Ireland, which has reduced public employment by 28,000 since 2008 — the equivalent, as a share of population, of laying off 1.9 million workers here. These cuts were hailed by conservatives, who predicted great results. “The Irish economy is showing encouraging signs of recovery,” declared Alan Reynolds of the Cato Institute in June 2010.

But recovery never came; Irish unemployment is currently more than 14 percent. Ireland’s experience shows that austerity in the face of a depressed economy is a terrible mistake to be avoided if possible.
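
The population-equivalence claim two paragraphs back is a simple proportion. In this sketch the 28,000 job cuts come from the column, while both population figures, about 4.6 million for Ireland and 312 million for the United States around this time, are approximations I am supplying.

    # Scaling Ireland's public-employment cuts to U.S. population size.
    # The 28,000 figure is from the column; both populations are
    # approximate circa-2012 assumptions.
    irish_cuts = 28_000
    pop_ireland = 4.6e6
    pop_us = 312e6

    us_equivalent = irish_cuts * (pop_us / pop_ireland)
    print(f"{us_equivalent / 1e6:.1f} million")  # -> ~1.9 million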

And the point is that in America it is possible. You can argue that countries like Ireland had and have very limited policy choices. But America — which unlike Europe has a federal government — has an easy way to reverse the job cuts that are killing the recovery: have the feds, who can borrow at historically low rates, provide aid that helps state and local governments weather the hard times. That, in essence, is what the president was proposing and Mr. Romney was deriding.

So the former governor of Massachusetts was telling the truth the first time: by opposing aid to beleaguered state and local governments, he is, in effect, calling for more layoffs of teachers, policemen and firemen.

Actually, it’s kind of ironic. While Republicans love to engage in Europe-bashing, they’re actually the ones who want us to emulate European-style austerity and experience a European-style depression.

And that’s not just an inference. Last week R. Glenn Hubbard of Columbia University, a top Romney adviser, published an article in a German newspaper urging the Germans to ignore advice from Mr. Obama and continue pushing their hard-line policies. In so doing, Mr. Hubbard was deliberately undercutting a sitting president’s foreign policy. More important, however, he was throwing his support behind a policy that is collapsing as you read this.

In fact, almost everyone following the situation now realizes that Germany’s austerity obsession has brought Europe to the edge of catastrophe — almost everyone, that is, except the Germans themselves and, it turns out, the Romney economic team.

Needless to say, this bodes ill if Mr. Romney wins in November. For all indications are that his idea of smart policy is to double down on the very spending cuts that have hobbled recovery here and sent Europe into an economic and political tailspin.

Friday, June 8, 2012

Reagan Was a Keynesian

June 7, 2012

By PAUL KRUGMAN

There’s no question that America’s recovery from the financial crisis has been disappointing. In fact, I’ve been arguing that the era since 2007 is best viewed as a “depression,” an extended period of economic weakness and high unemployment that, like the Great Depression of the 1930s, persists despite episodes during which the economy grows. And Republicans are, of course, trying — with considerable success — to turn this dismal state of affairs to their political advantage.

They love, in particular, to contrast President Obama’s record with that of Ronald Reagan, who, by this point in his presidency, was indeed presiding over a strong economic recovery. You might think that the more relevant comparison is with George W. Bush, who, at this stage of his administration, was — unlike Mr. Obama — still presiding over a large loss in private-sector jobs. And, as I’ll explain shortly, the economic slump Reagan faced was very different from our current depression, and much easier to deal with. Still, the Reagan-Obama comparison is revealing in some ways. So let’s look at that comparison, shall we?

For the truth is that on at least one dimension, government spending, there was a large difference between the two presidencies, with total government spending adjusted for inflation and population growth rising much faster under one than under the other. I find it especially instructive to look at spending levels three years into each man’s administration — that is, in the first quarter of 1984 in Reagan’s case, and in the first quarter of 2012 in Mr. Obama’s — compared with four years earlier, which in each case more or less corresponds to the start of an economic crisis. Under one president, real per capita government spending at that point was 14.4 percent higher than four years previously; under the other, less than half as much, just 6.4 percent.

O.K., by now many readers have probably figured out the trick here: Reagan, not Obama, was the big spender. While there was a brief burst of government spending early in the Obama administration — mainly for emergency aid programs like unemployment insurance and food stamps — that burst is long past. Indeed, at this point, government spending is falling fast, with real per capita spending falling over the past year at a rate not seen since the demobilization that followed the Korean War.
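
For readers curious about the mechanics of “adjusted for inflation and population growth,” here is a generic sketch. The function is simply the definition of real per capita spending; the sample inputs are placeholders chosen so the output matches the 14.4 percent figure quoted above, and are not the actual fiscal series.

    # How a "real per capita government spending" comparison is built:
    # deflate nominal spending by a price index, divide by population,
    # then compare the two dates. Inputs are placeholders chosen to
    # reproduce the column's 14.4 percent figure, not actual data.
    def real_per_capita(nominal_spending, price_index, population):
        """Spending in base-period dollars, per person."""
        return nominal_spending / price_index / population

    then = real_per_capita(1.000e12, price_index=1.00, population=230e6)
    now = real_per_capita(1.313e12, price_index=1.10, population=240e6)
    print(f"{now / then - 1:.1%}")  # -> 14.4%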

Why was government spending much stronger under Reagan than in the current slump? “Weaponized Keynesianism” — Reagan’s big military buildup — played some role. But the big difference was real per capita spending at the state and local level, which continued to rise under Reagan but has fallen significantly this time around.

And this, in turn, reflects a changed political environment. For one thing, states and local governments used to benefit from revenue-sharing — automatic aid from the federal government, a program that Reagan eventually killed, but only after the slump was past. More important, in the 1980s, anti-tax dogma hadn’t taken hold to the extent it has today, so state and local governments were much more willing than they are now to cover temporary deficits with temporary tax increases, thereby avoiding sharp spending cuts.

In short, if you want to see government responding to economic hard times with the “tax and spend” policies conservatives always denounce, you should look to the Reagan era — not the Obama years.

So does the Reagan-era economic recovery demonstrate the superiority of Keynesian economics? Not exactly. For, as I said, the truth is that the slump of the 1980s — which was more or less deliberately caused by the Federal Reserve, as a way to bring down inflation — was very different from our current depression, which was brought on by private-sector excess: above all, the surge in household debt during the Bush years. The Reagan slump could be and was brought to a rapid end when the Fed decided to relent and cut interest rates, sparking a giant housing boom. That option isn’t available now because rates are already close to zero.

As many economists have pointed out, America is currently suffering from a classic case of debt deflation: all across the economy people are trying to pay down debt by slashing spending, but, in so doing, they are causing a depression that makes their debt problems even worse. This is exactly the situation in which government spending should temporarily rise to offset the slump in private spending and give the private sector time to repair its finances. Yet that’s not happening.

The point, then, is that we’d be in much better shape if we were following Reagan-style Keynesianism. Reagan may have preached small government, but in practice he presided over a lot of spending growth — and right now that’s exactly what America needs.

Monday, June 4, 2012

This Republican Economy

June 3, 2012

By PAUL KRUGMAN

What should be done about the economy? Republicans claim to have the answer: slash spending and cut taxes. What they hope voters won’t notice is that that’s precisely the policy we’ve been following the past couple of years. Never mind the Democrat in the White House; for all practical purposes, this is already the economic policy of Republican dreams.

So the Republican electoral strategy is, in effect, a gigantic con game: it depends on convincing voters that the bad economy is the result of big-spending policies that President Obama hasn’t followed (in large part because the G.O.P. wouldn’t let him), and that our woes can be cured by pursuing more of the same policies that have already failed.

For some reason, however, neither the press nor Mr. Obama’s political team has done a very good job of exposing the con.

What do I mean by saying that this is already a Republican economy? Look first at total government spending — federal, state and local. Adjusted for population growth and inflation, such spending has recently been falling at a rate not seen since the demobilization that followed the Korean War.

How is that possible? Isn’t Mr. Obama a big spender? Actually, no; there was a brief burst of spending in late 2009 and early 2010 as the stimulus kicked in, but that boost is long behind us. Since then it has been all downhill. Cash-strapped state and local governments have laid off teachers, firefighters and police officers; meanwhile, unemployment benefits have been trailing off even though unemployment remains extremely high.

Over all, the picture for America in 2012 bears a stunning resemblance to the great mistake of 1937, when F.D.R. prematurely slashed spending, sending the U.S. economy — which had actually been recovering fairly fast until that point — into the second leg of the Great Depression. In F.D.R.’s case, however, this was an unforced error, since he had a solidly Democratic Congress. In President Obama’s case, much though not all of the responsibility for the policy wrong turn lies with a completely obstructionist Republican majority in the House.

That same obstructionist House majority effectively blackmailed the president into continuing all the Bush tax cuts for the wealthy, so that federal taxes as a share of G.D.P. are near historic lows — much lower, in particular, than at any point during Ronald Reagan’s presidency.

As I said, for all practical purposes this is already a Republican economy.

As an aside, I think it’s worth pointing out that although the economy’s performance has been disappointing, to say the least, none of the disasters Republicans predicted have come to pass. Remember all those assertions that budget deficits would lead to soaring interest rates? Well, U.S. borrowing costs have just hit a record low. And remember those dire warnings about inflation and the “debasement” of the dollar? Well, inflation remains low, and the dollar has been stronger than it was in the Bush years.

Put it this way: Republicans have been warning that we were about to turn into Greece because President Obama was doing too much to boost the economy; Keynesian economists like myself warned that we were, on the contrary, at risk of turning into Japan because he was doing too little. And Japanification it is, except with a level of misery the Japanese never had to endure.

So why don’t voters know any of this?

Part of the answer is that far too much economic reporting is still of the he-said, she-said variety, with dueling quotes from hired guns on either side. But it’s also true that the Obama team has consistently failed to highlight Republican obstruction, perhaps out of a fear of seeming weak. Instead, the president’s advisers keep turning to happy talk, seizing on a few months’ good economic news as proof that their policies are working — and then ending up looking foolish when the numbers turn down again. Remarkably, they’ve made this mistake three times in a row: in 2010, 2011 and now once again.

At this point, however, Mr. Obama and his political team don’t seem to have much choice. They can point with pride to some big economic achievements, above all the successful rescue of the auto industry, which is responsible for a large part of whatever job growth we are managing to get. But they’re not going to be able to sell a narrative of overall economic success. Their best bet, surely, is to do a Harry Truman, to run against the “do-nothing” Republican Congress that has, in reality, blocked proposals — for tax cuts as well as more spending — that would have made 2012 a much better year than it’s turning out to be.

For that, in the end, is the best argument against Republicans’ claims that they can fix the economy. The fact is that we have already seen the Republican economic future — and it doesn’t work.