Wholesale prices jumped last month by the most in nearly two years due to higher energy costs and the steepest rise in food prices in 36 years. Excluding those volatile categories, inflation was tame.
The Labor Department said Wednesday that the Producer Price Index rose a seasonally adjusted 1.6 percent in February — double the 0.8 percent rise in the previous month. Outside of food and energy costs, the core index ticked up 0.2 percent, less than January’s 0.5 percent rise.
Food prices soared 3.9 percent last month, the biggest gain since November 1974. Most of that increase was due to a sharp rise in vegetable costs, which jumped nearly 50 percent, the most in almost a year. Meat and dairy products also rose.
I believe that food inflation is in the midst of its greatest run-up (by one measurement of a basket of basic foodstuffs) since 1974. The lead story on Drudge reports on the most recent data.
Under the rubric of QE2, the Federal Reserve Bank is engaged in the venture of increasing the money supply with the goal of moderately increasing inflation. I fear that this venture is misguided and destructive. I believe it will result in inflation exceeding the Fed’s goal, if it has not done so already, and that the Fed will apply the brakes well after the damage has been done, as is its style.
Scott cleverly titles his post, “Let them eat iPads.” I’m not sure I’d draw a line between QE2 and what has happened in food and oil prices, at least not as a primary factor. The effect of QE2 will be to weaken the dollar, which will hike the cost of imports, to be sure, and that may account for a little of the large price jump. If it was the main factor — if the dollar had been weakened to that extent — then prices would be up across the board, especially on imports. At least according to today’s report from the BEA on the trade deficit, that doesn’t appear to be the case.
The real source of this problem is America’s continuing refusal to exploit its own energy sources. We remain too dependent on imports for energy while deliberately sidelining at least hundreds of thousands of potential high-paying jobs by refusing to extract our own oil and natural gas. When the unstable countries that produce oil go through political paroxysms, it spooks investors and sends commodity prices soaring on the increased risk to distribution. Those price increases mean higher transportation costs, which affect all goods and services that require transport to get to consumers. It’s a multiplier effect that we have seen a number of times over the last four decades, and which our political class continues to pretend doesn’t exist.
In the year ahead, expect to see the largest food price increases in the protein group: chicken, beef, and pork, as well as dairy items. One key reason: The price of corn, used as feed by ranchers and farmers, has doubled in the past year. But vegetarians won’t get off easy: Produce and orange juice are rising sharply, as well.
Higher food prices have wide economic ramifications and are being watched closely by the Federal Reserve. From a business standpoint, food producers – from agricultural giants to the corner pizza parlor – must raise prices or watch their profit margins evaporate. Many middle-class households are financially stretched to the limit, so any extra expense for such basics as milk or bread makes their life even tougher. Organizations that help the poor with food, moreover, find they can’t help as many people because their dollar doesn’t go as far.
“The more you have to spend on a loaf of bread and a pound of ground beef, the less you have to spend on everything else,” says Mark Zandi, chief economist at Moody’s Analytics in West Chester, Pa. “It’s like a tax increase, although it’s not quite as bad as rising oil prices, since at least the revenues go to US farmers, truckers, and ag-equipment manufacturers.”
The US Department of Agriculture expects the average price of food in 2011 to be 4 percent higher than last year. Some private forecasters say that, by December, prices could be as much as 6 percent higher than in December 2010.
“If food inflation comes in at 6 percent, it would be the most dramatic increase since 1982,” says William Lapp, a consumer foods economist with his own firm, Advanced Economic Solutions in Omaha, Neb. “We had a 10-year period, from 1972 to 1981, when annual food prices rose sharply – including a two-year period when increases averaged 8.7 percent.”
When you factor crude foodstuffs and feedstuffs into producers’ food costs, prices rose at the fastest rate since 1974, when the U.S. economy was in the grip of what was known as “stagflation”: prices rising rapidly despite little or no growth in the economy.
There’s lots of interesting stuff in Ed Glaeser’s new book, “The Triumph of the City.” One of Glaeser’s themes, for instance, is the apparent paradox of cities becoming more expensive and more crowded even as the cost of communicating over great distances has fallen dramatically. New York is a good example of this, but Silicon Valley is a better one.
[…]
The overarching theme of Glaeser’s book is that cities make us smarter, more productive and more innovative. To put it plainly, they make us richer. And the evidence in favor of this point is very, very strong. But it would of course be political suicide for President Obama to say that part of winning the future is ending the raft of subsidies we devote to sustaining rural living. And the U.S. Senate is literally set up to ensure that such a policy never becomes politically plausible.
Yesterday afternoon, I got an e-mail from a “usda.gov” address. “Secretary Vilsack read your blog post ‘Why we still need cities’ over the weekend, and he has some thoughts and reflections, particularly about the importance of rural America,” it said. A call was set for a little later in the day. I think it’s safe to say Vilsack didn’t like the post. A lightly edited transcript of our discussion about rural America, subsidies and values follows.
Ezra Klein: Let’s talk about the post.
Tom Vilsack: I took it as a slam on rural America. Rural America is a unique and interesting place that I don’t think a lot of folks fully appreciate and understand. They don’t understand that while it represents 16 percent of America’s population, 44 percent of the military comes from rural America. It’s the source of our food, fiber and feed, and 88 percent of our renewable water resources. One of every 12 jobs in the American economy is connected in some way to what happens in rural America. It’s one of the few parts of our economy that still has a trade surplus. And sometimes people don’t realize that 90 percent of the persistent poverty counties are located in rural America.
EK: Let me stop you there for a moment. Are 90 percent of the people in persistent poverty in rural America? Or just 90 percent of the counties?
TV: Well, I’m sure that more people live in cities who are below the poverty level. In terms of abject poverty and significant poverty, there’s a lot of it in rural America.
The other thing people don’t understand is how difficult farming is. There are really three different kinds of farmers. Of the 2.1 million people who are counted as farmers, about 1.3 million of them live in a farmstead in rural America. They don’t really make any money from their operation. Then there are 600,000 people who, if you ask them what they do for a living, they’re farmers. They produce more than $10,000 but less than $250,000 in sales. Those folks are good people, they populate rural communities and support good schools and serve important functions. And those are the folks for whom I’m trying to figure out how to diversify income opportunities, help them spread out into renewable fuel sources. And then the balance of farmers, roughly 200,000 to 300,000, are commercial operations, and they do pretty well, particularly when commodity prices are high. But they have a tremendous amount of capital at risk. And they’re aging at a rapid rate, with 37 percent over 65. Who’s going to replace those folks?
EK: You keep saying that rural Americans are good and decent people, that they work hard and participate in their communities. But no one is questioning that. The issue is that people who live in cities are also good people. People who live in exurbs work hard and mow their lawns. So what does the character of rural America have to do with subsidies for rural America?
TV: It is an argument. There is a value system that’s important to support. If there’s not economic opportunity, we can’t utilize the resources of rural America. I think it’s a complicated discussion and it does start with the fact that these are good, hardworking people who feel underappreciated. When you spend 6 or 7 percent of your paycheck for groceries and people in other countries spend 20 percent, that’s partly because of these farmers.
IN THIS chat with Ezra Klein, Tom Vilsack, the secretary of agriculture, offers a pandering defence of agricultural subsidies so thoroughly bereft of substance I began to fear that Mr Vilsack would be sucked into the vacuum of his mouth and disappear. When Mr Klein first raises the subject of subsidies for sugar and corn, Mr Vilsack admirably says, “I admit and acknowledge that over a period of time, those subsidies need to be phased out.” But not yet! Mr Vilsack immediately thereafter scrambles to defend the injurious practice. Ethanol subsidies help to wean us off foreign fuels and dampen price volatility when there is no peace in the Middle East, Mr Vilsack contends. Anyway, he continues, undoing the economic dislocation created by decades of corporate welfare for the likes of ADM and Cargill will create economic dislocation. Neither of these points is entirely lacking in merit, but they at best argue for phasing out subsidies slowly starting now.
Mr Vilsack should have stopped here, since this is as strong as his case is ever going to be, but instead he goes on to argue that these subsidies sustain rural culture, which is a patriotic culture that honours and encourages vital military service:
[S]mall-town folks in rural America don’t feel appreciated. They feel they do a great service for America. They send their children to the military not just because it’s an opportunity, but because they have a value system from the farm: They have to give something back to the land that sustains them.
Mr Klein follows up sanely:
It sounds to me like the policy you’re suggesting here is to subsidize the military by subsidizing rural America. Why not just increase military pay? Do you believe that if there was a substantial shift in geography over the next 15 years, that we wouldn’t be able to furnish a military?
To which Mr Vilsack says:
I think we would have fewer people. There’s a value system there. Service is important for rural folks. Country is important, patriotism is important. And people grow up with that. I wish I could give you all the examples over the last two years as secretary of agriculture, where I hear people in rural America constantly being criticized, without any expression of appreciation for what they do do.
In the end, Mr Vilsack’s argument comes down to the notion that the people of rural America feel that they have lost social status, and that subsidies amount to a form of just compensation for this injury. I don’t think Mr Vilsack really believes that in the absence of welfare for farmers, the armed services would be hard-pressed to find young men and women willing to make war for the American state. He’s using willingness-to-volunteer as proof of superior patriotism, and superior patriotism is the one claim to status left to those who have no other.
I’ll add a few comments. First, it may be that the economists who understand the economic virtues of city life aren’t doing a sufficiently good job explaining that it’s not the people in cities that contribute the extra economic punch; it’s the cities or, more exactly, the interactions between the people cities facilitate. It’s fine to love the peace of rural life. Just understand that the price of peace is isolation, which reduces productivity.
Second, the idea that economically virtuous actors deserve to be rewarded not simply with economic success but with subsidies is remarkably common in America (and elsewhere) and is not by any means a characteristic limited to rural people. I also find it strange how upset Mr Vilsack is by the fact that he “ha[s] a hard time finding journalists who will speak for them”. Agricultural interests are represented by some of the most effective lobbyists in the country, but their feelings are hurt by the fact that journalists aren’t saying how great they are? This reminds me of the argument that business leaders aren’t investing because they’re put off by the president’s populist rhetoric. When did people become so sensitive? When did hurt feelings become a sufficient justification for untold government subsidies?
Finally, what Mr Klein doesn’t mention is that rural voters are purchasing respect or dignity at the price of livelihoods in much poorer places. If Americans truly cared for the values of rural life and truly wished to address rural poverty, they’d get rid of agricultural policies that primarily punish farmers in developing economies.
Essentially, Vilsack justifies subsidizing farmers on the basis that rural America is the storehouse of our values, for which he has no evidence. And he’s befuddled when confronted with someone who doesn’t take his homilies as obvious facts.
Nobody argues that America’s farmers aren’t a vital part of our economy or denies that rural areas provide a disproportionate number of our soldiers. But the notion that country folks are somehow better people or even better Americans has no basis in reality.
Why is it so common to praise the character of rural America? Part of it is doubtless that rural life represents the past, and we think of the past as a simpler and more honest time. But surely another element is simply that rural America is overwhelmingly white and Protestant. And completely aside from the policy ramifications, the deep-seated veneration of rural America reflects, at bottom, a prejudice few would be willing to openly spell out.
Mike Huckabee, the former governor of Arkansas and a potential 2012 presidential candidate, has been getting lots of press recently for his comments on radio shows. The latest? This week, as first reported by Politico, he went after Hollywood star Natalie Portman.
“People see a Natalie Portman or some other Hollywood starlet who boasts, ‘we’re not married but we’re having these children and they’re doing just fine,’” Huckabee told conservative radio host Michael Medved Monday. “I think it gives a distorted image. It’s unfortunate that we glorify and glamorize the idea of out-of-wedlock children.”
In framing the question to Huckabee, Medved had noted that Portman had said during her acceptance speech that she wanted to thank the father of her child for giving her “the most wonderful gift,” and argued that Portman’s message was “problematic.”
“I think it gives a distorted image that yes, not everybody hires nannies, and caretakers, and nurses,” Huckabee said. “Most single moms are very poor, uneducated, can’t get a job, and if it weren’t for government assistance, their kids would be starving to death and never have health care. And that’s the story that we’re not seeing, and it’s unfortunate that we glorify and glamorize the idea of out of wedlock children.”
“You know, right now, 75 percent of black kids in this country are born out of wedlock,” he continued. “Sixty-one percent of Hispanic kids — across the board, 41 percent of all live births in America are out of wedlock births. And the cost of that is simply staggering.”
During her Oscar acceptance speech Sunday, Portman thanked her fiancé, Benjamin Millepied, saying he gave her “the most important role” of her life.
Medved responded that Millepied “didn’t give her the most wonderful gift, which would be a wedding ring!”
People Magazine reported at the end of last year that Portman and Millepied were engaged. Us Weekly revealed Portman’s engagement ring photos at the beginning of this year. They’re currently still engaged.
Here’s one humble suggestion. Maybe there would be fewer out-of-wedlock pregnancies if there were more sex education, including abstinence and safer sex. Even Bristol knows that.
Also, stop calling it “wedlock.” Sounds like something you get from stepping on a rusty nail.
But in the larger context, hearing about Huckabee’s criticism reinforces the notion that we really are stuck in the 1990s. After all, are there any substantive differences between what Huckabee said yesterday about Natalie Portman and what Dan Quayle said about Murphy Brown in 1992? Other than the fact that Brown was a fictional character, the remarks are remarkably similar.
Indeed, I feel like this keeps coming up. What do we see on the political landscape? Republicans are talking about shutting down the government and impeaching the president; Newt Gingrich is talking about running for president; a Democratic president saw his party get slammed in the midterms; the right wants a balanced budget amendment to the Constitution; conservatives are falsely labeling a moderate health care reform plan “socialized medicine”; and some national GOP leaders are preoccupied with Hollywood and out-of-wedlock births.
The general point about the importance of two parents and marriage for children in poverty is well taken. But using Portman as an object of scorn? A woman who is in a loving relationship, is engaged to be married, and who publicly called her impending motherhood “the most important role of my life”?
She seems an unlikely culture war target. And a hopelessly tone-deaf one. Huckabee seems unready to me, or unwilling, to enter the race. And if he doesn’t, we all know what that means …
Everybody loves Princess Amidala. Luke Skywalker’s mom, for crying out loud! And why would a conservative trash a woman who just called motherhood “the most important role of my life“?
Oh, wait. I forgot.
Mike Huckabee isn’t a conservative. Just ask Ann Coulter.
Frank W. Buckles died Sunday, sadly yet not unexpectedly at age 110, having achieved a singular feat of longevity that left him proud and a bit bemused.
In 1917 and 1918, close to 5 million Americans served in World War I, and Mr. Buckles, a cordial fellow of gentle humor, was the last known survivor. “I knew there’d be only one someday,” he said a few years back. “I didn’t think it would be me.”
Mr. Buckles, a widower, died on his West Virginia farm, said his daughter, Susannah Buckles Flanagan, who had been caring for him there.
Flanagan, 55, said her father had recently recovered from a chest infection and seemed in reasonably good health for a man his age. At 12:15 a.m. Sunday, he summoned his live-in nurse to his bedroom. As the nurse looked on, Flanagan said, Mr. Buckles drew a breath, and his eyes fell shut.
There used to be a newsletter for American veterans of World War I. When I first saw it some two decades or more ago, it noted there were some 4,000 of them still alive. I haven’t seen it in many years — I don’t recall its name, but it might have been The Torch. Amazing that any were still alive, given that their war began in this same decade a century ago. Alas, its subscriber base dwindled to zero over the weekend with the death of Frank Buckles of West Virginia at 110.
The last American veteran of World War I has died.
At first, it didn’t seem like the Missouri-born Frank Buckles would ever go to war. He was repeatedly turned down by military recruiters on account of his age (he was only 16 when the war broke out) but successfully enlisted when he convinced an Army captain he was 18.
“A boy of [that age], he’s not afraid of anything,” said Buckles, who had first tried to join the Marines. “He wants to get in there.”
“I went to the state fair up in Wichita, Kansas, and while there, went to the recruiting station for the Marine Corps,” he told the AP in 2007. “The nice Marine sergeant said I was too young when I gave my age as 18, said I had to be 21.” A week later, Buckles returned to tell the Marine recruiter he was 21, only to be informed that he wasn’t heavy enough.
Buckles then tried for the Navy, but was turned down on account of his flat feet. Finally, he tried for the Army. When a captain asked for his birth certificate, Buckles said they weren’t issued in Missouri at the time of his birth, but that there was a record in the family Bible. “I said, ‘You don’t want me to bring the family Bible down, do you?’” Buckles remembered with a laugh. “He said, ‘OK, we’ll take you.’”
After leaving the Army as a corporal, he ended up getting a job with a shipping company and traveling all over the world. As luck would have it, he was in Manila when the Japanese attacked Pearl Harbor a few hours before bombing and invading the Philippines. He ended up in Japanese POW camps until 1945, when he was liberated.
He got married several years later and moved to a farm in West Virginia, where he still drove his own car and tractor until he was 102. His wife died in 1999, the same year he was awarded the French Legion of Honor.
In 2008 he became the oldest surviving WWI vet, which of course got him some attention in Washington (including a visit to the White House with George W. Bush) and beyond. George Will wrote a nice column about him. Not everything in WV is named after Robert C. Byrd – then-Gov Joe Manchin named a section of WV Route 9 in his honor at the time.
Sounds like he was a good-natured, amiable sort who did not take his status as the last remaining US WWI veteran as being anything except a testament to his longevity… and as an opportunity to push for refurbishing and rededicating DC’s WWI memorial as a national one.
A leaked manuscript by one of Sarah Palin’s closest aides from her time as governor charges that Palin broke state election law in her 2006 gubernatorial campaign and was consumed by petty grievances up until she resigned.
The unpublished book by Frank Bailey was leaked to the media and widely circulated on Friday.
The manuscript opens with an account of Palin sending Bailey a message saying “I hate this damn job” shortly before she resigned as Alaska’s governor in July 2009, less than three years into her four-year term. The manuscript goes on for nearly 500 pages, a mixture of analysis, gossip and allegation.
Copies of the manuscript were forwarded around Alaska political circles on Friday. The Daily News received copies from multiple sources, the first from author Joe McGinniss, who is working on his own Palin book. McGinniss didn’t respond to a message asking where he obtained the manuscript and the reason he circulated it.
Bailey, a political insider who joined Palin’s 2006 campaign for governor and became part of her inner circle, has never before told his version of the Palin story. Bailey has consistently refused requests for interviews and did so again Friday. The book was co-written with California author Ken Morris and Jeanne Devon of Anchorage, who publishes the popular anti-Palin website Mudflats.
The book comes with all sorts of caveats–it’s not yet published, there’s been no outside verification, and Palin has yet to comment–but these are the new nuggets that Palin obsessives are digesting:
Palin may have violated Alaska’s state election law by collaborating with the Republican Governors Association on a campaign ad. “State candidates can’t team up with soft-money groups such as the Republican Governors Association, which paid for TV commercials and mailers in Alaska during the election in a purported ‘independent’ effort,” the Anchorage Daily News’ Sean Cockerham and Kyle Hopkins explain.
Bailey claims he was “recruited” by Palin’s husband, Todd, to take down Mike Wooten, a state trooper who was engaged in a child custody battle with Palin’s sister, his ex-wife. According to Bailey, “Todd Palin kept feeding him information on Wooten, which he passed on to troopers.” Bailey also asserts that the selection of Superior Court Judge Morgan Christen as one of the top two judges considered for Supreme Court appointment by the governor was directly influenced by Christen’s ruling against Wooten in the custody fight with Palin’s sister.
Palin supposedly abandoned a commitment to work with the Alaska Family Council to promote a ballot initiative outlawing abortions for teens because she was working on her book. In the manuscript, Bailey writes that this was the final straw, as he had realized Palin was motivated primarily by the prospect of making money.
Bailey claims that the campaign trail revealed Palin’s widespread support was less than genuine. Bailey recalls, “we set our sights and went after opponents in coordinated attacks, utilizing what we called ‘Fox News surrogates,’ friendly blogs, ghost-written op-eds, media opinion polls (that we often rigged), letters to editors, and carefully edited speeches.”
Frank Bailey’s co-authored manuscript, “Blind Allegiance To Sarah Palin,” which leaked out via his agent’s emails to potential publishers, is dynamite. Why? Because Bailey was as close to the Palins as anyone from Palin’s first race for governor to the bitter end, is a rock-ribbed Fox News Republican, has vast amounts of firsthand data (the emails he has published alone reveal a lot), has contempt for Trig skeptics like yours truly, and comes to a simple conclusion in retrospect: Palin is a dangerous, vindictive, incompetent, congenital liar who has no business in any public office. Any publisher interested in the truth about Palin (Harper Collins therefore need not apply) should fight to publish it.
There’s a useful summary of its contents at the Anchorage Daily News, and some notes from the paper’s gossip column with this tart truth:
In the end, what makes Bailey’s manuscript worth more than other Sarah books is his liberal use of contemporaneous records — long quotes from e-mails written at the time by the actual participants. If you want to understand who Sarah really is, you can’t beat her own words.
There are also just, well, nutritious nuggets like the following. Bailey describes Palin’s eventual media strategy: avoid any MSM interviews and get talking points out through surrogates. Who were they? Bailey names names: Bill Kristol, Mary Matalin, former Bush aides Jason Recher and Steve Biegun, GOP officials Nick Ayers and Michael Steele, Rush Limbaugh, Laura Ingraham, Glenn Beck, Greta Van Susteren, Sean Hannity, and Bill O’Reilly.
Unlike that other Palin book in the pipeline, Bailey wasn’t just geographically close to his subject (strangely, the Anchorage Daily News reports that author and Palin-neighbor Joe McGinniss was one of the people to pass them the leaked manuscript), he was actually a close confidant to both Palin and her husband, Todd. The book was reportedly put together with the help of 60,000 emails back and forth between him and the former governor. It actually opens with a quote from one of those emails as Palin tells Bailey she “hate[s] this damn job,” shortly before her resignation.
But, everyone’s wondering, what’s the dirtiest “all” that this tell-all “tells?”
The article quotes several passages from Bailey’s book, but none of them seem to rise to a level of scandalous behavior or shocking revelation. Palin obsesses over her media image? Well, maybe, but few politicians at the national level don’t. Palin confidentially told Bailey “I hate this damn job”? Even people who love their jobs have those moments, especially jobs with large responsibilities. Bailey wonders why Palin decided to get caught up in the Carrie Prejean controversy in May 2009:
Concludes Bailey after the episode: “The question we failed to ask was: What does this possibly have to do with being governor of Alaska? While it had nothing to do with Alaska, it had plenty to do with publicity. Fox News made this an ongoing story, giving it wall-to-wall coverage. Sean Hannity in particular latched on with both hands. With Sarah suddenly an outspoken supporter, he had gorgeous Prejean on one arm and sparkling Governor Palin on the other. He appeared a happy man.”
It’s not exactly an unfair question, but it also presumes that every other governor ignores national stories and keeps themselves insulated, which is hardly the case. Palin by this time had already become a national political figure, especially on conservative issues through the burgeoning Tea Party movement, and had been outspoken on social issues since the presidential election. It’s hardly surprising that Palin would want to work to keep up a national profile, which is harder to do from Alaska, both for the grassroots leadership she wanted to provide and for her own political ambitions. While it’s a fair point for criticism from the perspective of Alaskans, it’s hardly the mystery or the anomaly Bailey suggests.
“A leaked manuscript by one of Sarah Palin’s closest aides from her time as governor charges that Palin broke state election law in her 2006 gubernatorial campaign and was consumed by petty grievances up until she resigned.” Nah, that doesn’t sound like her. Must be a governor of another unpopulated northern meth-and-jerky wasteland they’re thinking of. On the other hand, it appears this book has been leaked to Wonkette at least twice, by somebody with a South African e-mail address. And the publisher is said to be upset. Fine. Anyway, here is the good quote holding everything together, dating to right before her resignation as governor: “I hate this damn job.” If she didn’t like that job, she must be very happy she will never be president!
When I was selected as one of the two human players to be pitted against IBM’s “Watson” supercomputer in a special man-vs.-machine Jeopardy! exhibition match, I felt honored, even heroic. I envisioned myself as the Great Carbon-Based Hope against a new generation of thinking machines—which, if Hollywood is to be believed, will inevitably run amok, build unstoppable robot shells, and destroy us all. But at IBM’s Thomas J. Watson Research Lab, an Eero Saarinen-designed fortress in the snowy wilds of New York’s Westchester County, where the shows taped last month, I wasn’t the hero at all. I was the villain.
This was to be an away game for humanity, I realized as I walked onto the slightly-smaller-than-regulation Jeopardy! set that had been mocked up in the building’s main auditorium. In the middle of the floor was a huge image of Watson’s on-camera avatar, a glowing blue ball crisscrossed by “threads” of thought—42 threads, to be precise, an in-joke for Douglas Adams fans. The stands were full of hopeful IBM programmers and executives, whispering excitedly and pumping their fists every time their digital darling nailed a question. A Watson loss would be invigorating for Luddites and computer-phobes everywhere, but bad news for IBM shareholders.
The IBM team had every reason to be hopeful. Watson seems to represent a giant leap forward in the field of natural-language processing—the ability to understand and respond to everyday English, the way Ask Jeeves did (with uneven results) in the dot-com boom. Jeopardy! clues cover an open domain of human knowledge—every subject imaginable—and are full of booby traps for computers: puns, slang, wordplay, oblique allusions. But in just a few years, Watson has learned—yes, it learns—to deal with some of the myriad complexities of English. When it sees the word “Blondie,” it’s very good at figuring out whether Jeopardy! means the cookie, the comic strip, or the new-wave band.
I expected Watson’s bag of cognitive tricks to be fairly shallow, but I felt an uneasy sense of familiarity as its programmers briefed us before the big match: The computer’s techniques for unraveling Jeopardy! clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson’s case, a 15-terabyte data bank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels “sure” enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing.
Indeed, playing against Watson turned out to be a lot like any other Jeopardy! game, though out of the corner of my eye I could see that the middle player had a plasma screen for a face. Watson has lots in common with a top-ranked human Jeopardy! player: It’s very smart, very fast, speaks in an uneven monotone, and has never known the touch of a woman. But unlike us, Watson cannot be intimidated. It never gets cocky or discouraged. It plays its game coldly, implacably, always offering a perfectly timed buzz when it’s confident about an answer. Jeopardy! devotees know that buzzer skill is crucial—games between humans are more often won by the fastest thumb than the fastest brain. This advantage is only magnified when one of the “thumbs” is an electromagnetic solenoid triggered by a microsecond-precise jolt of current. I knew it would take some lucky breaks to keep up with the computer, since it couldn’t be beaten on speed.
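The clue-handling loop Jennings describes (pick out key words, gather association clusters, check them against contextual hints, and buzz only when the confidence is high enough) can be sketched in a few lines of Python. This is a toy illustration of that decision structure, not IBM's DeepQA; the tiny knowledge base, the scoring rule, and the 0.7 threshold are all invented for the example.

```python
# Toy sketch of the loop described above: extract key words, pull associated
# candidates from a (tiny, stand-in) knowledge base, score them against
# contextual hints like the category, and buzz only when confidence clears a
# threshold. Names, weights, and the 0.7 cutoff are illustrative assumptions.
from collections import Counter

KNOWLEDGE = {  # stand-in for Watson's 15-terabyte data bank
    "blondie": ["comic strip", "new-wave band", "dessert bar"],
    "debbie": ["new-wave band"],
    "harry": ["new-wave band"],
    "dagwood": ["comic strip"],
}

def candidate_scores(clue):
    """Count how many clue key words point at each candidate answer."""
    words = [w.strip('?",.').lower() for w in clue.split()]
    hits = Counter()
    for word in words:
        for answer in KNOWLEDGE.get(word, []):
            hits[answer] += 1
    return hits

def decide(clue, category, threshold=0.7):
    hits = candidate_scores(clue)
    if not hits:
        return None  # no associations at all: stay silent
    best, score = hits.most_common(1)[0]
    confidence = score / sum(hits.values())
    # Crude contextual check: boost confidence if the category echoes
    # the kind of answer we are about to give.
    if any(w in category.lower() for w in best.split()):
        confidence = min(1.0, confidence + 0.2)
    return best if confidence >= threshold else None  # buzz or hold back

if __name__ == "__main__":
    clue = "Debbie Harry fronted this Blondie act in the late 1970s"
    print(decide(clue, "New Wave Bands"))  # -> "new-wave band"
```

The point of the sketch is the shape of the decision, not the machinery: lots of weak associations, a contextual sanity check, and a hard confidence threshold that decides whether to hit the buzzer at all.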
DID THE SINGULARITY just happen on Jeopardy? The Singularity is a process, more than an event, even if, from a long-term historical perspective, it may look like an event. (Kind of like the invention of agriculture looks to us now). So, yeah. “In the CNN story one of the machine’s creators admitted that he was a very poor Jeopardy player. Somehow he was able to make a machine that could do better than himself in that contest. The creators aren’t even able to follow the reasoning of the computer. The system is showing emergent complexity.”
I’m not a big Jeopardy geek, but my understanding is that players are surprised at how big a role button management plays in winning or losing a round. In the few minutes of the Watson game that I watched, it was pretty clear that Watson was excellent at pressing the button at exactly the right moment if it knew the answer, which is more a measure of electromechanical reflex than human-like intelligence.
To the credit of IBM engineers, Watson almost always did know the right answer. Still, there were a few bloopers, such as the final Jeopardy question from yesterday (paraphrasing): “This city has two airports, one named after a World War II hero, and the other named after a World War II battle.” Watson’s guess, “Toronto”, was just laughably bad—Lester Pearson and Billy Bishop fought in World War I, and neither person is a battle. The right answer, “Chicago”, was pretty obvious, but apparently Watson couldn’t connect Midway or O’Hare with WW II.
I was on the show, in 1996 or ’97, and success is based almost entirely on your reflexes — i.e., pushing the buzzer as soon as Trebek finishes reading the question, er, the answer. (I came in second, winning a dining-room set and other fabulous parting gifts, which I had to sell to pay the taxes on them.) The benefit to society would come if we could turn Alex Trebek into Captain Dunsel.
If I owned a gun, it would probably be in my mouth as I type this. I don’t know how the physics of that arrangement would work, but the mood in Chez Jim is darker than Mothra’s hairy crotch. I’ve just been sitting here listening to Weird Al’s weirdly prescient “I Lost on Jeopardy” in the dark, cuddling with a tapped-out bottle of WD-40. Humanity took a hit tonight. Our valiant human heroes made it close, but that Watson tore us new assholes in our foreheads. ALL OF US. That noise you heard driving to work was your GPS system laughing at you. While you were sneezing on the D train this morning your Kindle was giving you the finger. There is blood in the water this morning and this afternoon and forever more. This wasn’t like losing some Nerdgame like chess. Who the hell even knows how to play chess? The horsies go in little circles, right? “Jeopardy!” is the game that makes dumb people feel smart. Like National Public Radio, it’s designed to make people feel superior. And we just found out that people are not superior. No, not at all. I might personally call the whole thing a draw. I read Ken Jennings’ piece in Slate and I can tell the machine was just better at ringing the buzzer than him. If it was truly a battle of Humanity versus Accursed Frankensteinian Monstrosity there should have been one human and one monstrosity. Or one smart human, one machine and me. I could answer sportsy questions. And the rest of the time stay out of Ken’s way. No disrespect to Brad, but this is one fight that ought to have been fought one-on-one. Don’t make humans battle each other to save the world from machines. It’s too cruel. I’d sit back and let the goddamned human expert answer the tough questions. I’d just be there to figure out a way to unplug the fucking thing when no one was watching. So, here’s the lineup for this Rematch that I demand, formally, right here on The Awl—which I know everyone at IBM reads—Me, Ken and your little Betamax.
And you have to put a little more at stake than just money. For Ken, Me and the Watson. Why did they call it Watson, anyway? Wasn’t Watson just Sherlock Holmes’ butler? And Alexander Graham Bell’s friend who was in the other room and got the first phone call. Why not call the thing what it is: HYDE. Or LILITH. Or Beelzebub of the Underland? Its dark, soulless visage no doubt crushed the very spirit of our human champions. Maybe force it to wear a blonde wig. And talk in Valley Girl language. “Like Oh My God, Gag Me with a Spoon, Alex. I’ll like take like Potpourri for like $800!”
This rematch should happen on Neutral Ground. I suggest Indianapolis. Halftime at the next Super Bowl. This gives Ken a chance to put the pieces of his broken ego back together. And for me to eat some Twinkies. There probably won’t even be a Super Bowl because of the Looming Lockout, so America will just be watching commercials and various superstars mangling America’s Favorite Patriotic songs. Make IBM take their little Cabinet of Wonders on the Road. Get the military involved to make sure there are no shenanigans this time like plugging it into the Internet or texting it answers from the audience. Also, I want the damned thing to NOT be plugged into the Jeopardy game. It needs to be able to hear Alex and to read the hint on the little blue screen. How much time does it take a human to hear Alex and see it printed out and understand just what the hell the half-idiot writers of “Jeopardy!” were getting at? (Was a Dave Eggers mention really necessary during Wednesday night’s episode? The category was Non-fiction. And it’s obvious that Watson has some kind of super Amazon app embedded in its evil systems. The first 200 pages of Dave’s Heartbreaking Work of Staggering Genius were pretty good. Everything else is Twee Bullshit. “I am a dog from a short story. I am fast and strong. Too bad you know I die in the river from the title of this short story. Woooof!” I mean, seriously, “Jeopardy!” Get a library card. There are billions of other writers and I’ve seen at least 5 shows in which you’ve used some form of Dave Eggers.)
The computer-science department at the University of Texas at Austin hosted viewing parties for the first two nights of the competition.
“People were cheering for Watson,” says Ken Barker, a research scientist at Texas. “When they introduced Brad and Ken, there were a few boos in the audience.”
Texas is one of eight universities whose researchers helped develop the technology on which Watson is based. Many of the other universities hosted viewing parties for the three days of competition as well.
Mr. Barker says he was blown away by Watson’s performance on the show, particularly the computer’s ability to make sense of Jeopardy!’s cleverly worded clues.
But the computer did make a few mistakes along the way.
Most notably, Watson incorrectly wrote “Toronto” in response to a Final Jeopardy clue in the category of U.S. Cities. Both Mr. Jennings and Mr. Rutter returned the correct response, which was Chicago.
Mr. Barker says Watson may have considered U.S. to be a synonym of America and, as such, considered Toronto, a North American city, to be a suitable response.
Raymond J. Mooney, a computer-science professor at Texas, says Final Jeopardy is the Achilles heel of the computer.
“If it didn’t have to answer that question, it wouldn’t have,” he says.
Clues in that final round are often more complicated than others in the show because they involve multiple parts.
The phrasing of the question Watson got wrong included what linguists refer to as an ellipsis, an omitted phrase whose meaning is implicit from other parts of the sentence. The clue that tripped up Watson, “Its largest airport is named for a World War II hero; its second largest, for a World War II battle,” left out “airport is named” in the second clause.
Mr. Mooney says it will be some time before the average person will be using a computer with the capabilities of Watson, but he did see one potential immediate impact from the show.
The sentient computers of the future are going to think it pretty hilarious that a knowledge-based showdown between one of their own and a creature with a liver was ever considered a fair fight.
FreedomWorks will host a premiere of the trailer for the film adaptation of Atlas Shrugged at the Conservative Political Action Conference.
Since the novel by Ayn Rand was published in 1957, there have been repeated efforts to produce a film version. All failed due to a variety of legal and editorial disputes.
Protagonist Dagny Taggart will be played by actress Taylor Schilling, who previously played the lead in the NBC medical drama Mercy.
Atlas Shrugged has been highly influential within conservative and libertarian circles for its support of laissez-faire economics.
FreedomWorks has been distributing “Who Is John Galt?” signs and merchandise at CPAC, part of an advertisement for a faithful adaptation of Ayn Rand’s “Atlas Shrugged.” Clips from the movie have been playing at CPAC. I haven’t watched them all. I have seen the trailer.
Last night, the CPAC Bloggers Bash attendees were “treated” to an excruciatingly long preview of the forthcoming “Atlas Shrugged” movie, which will hit a theater near you on April 15. Actually, it’ll just be Part I. Like the Lord of the Rings, this will be a trilogy.
Judging by the preview, I can fully understand why it took more than two decades to find a studio to produce the flick. This is quite possibly the most boring film ever made — and I include documentaries that are shown in grammar school so that children can request to view them backwards.
Put it this way: I simply do not know enough expletives to adequately express how truly horrible this film was. I would rather be subjected to the “Clockwork Orange” treatment than sit through one part of this. I might well prefer death to enduring the trilogy.
As a long-time fan of the novel and a very discriminating movie viewer, I’ll admit that I’ve had my doubts about this project all along, given its low budget and rushed production schedule. Viewing the scenes that I did – albeit a small sample size – did not assuage my early concerns.
Like the book, the film is set in the near future, though now it’s given the date of 2016. The filmmakers went for a “ripped from the headlines” type vibe, with images of the economy tanking, the country’s infrastructure collapsing, protests raging in the streets, Congress passing statist legislation, and a TV news anchor leading a panel discussion between some of the book’s characters.
The dramatic scenes were true to the book. The problem is that Rand’s characters don’t really speak like normal people, and this can be particularly jarring on film if not handled correctly. I found the dialogue in the scenes between Dagny Taggart and Hank Rearden to be unnatural and their acting subpar.
I spoke with some fellow bloggers afterward who thought I was being too harsh and others who were outright enthusiastic about what they saw. I felt compelled to write something given the immense interest in this film, but I’ll withhold further judgment until I see the entire movie, which is the first of a planned three-part series.
If anything, to me it feels too generic, like a promo for some new Fox primetime soap about young, beautiful businesspeople. Think “Melrose Place” meets “Wall Street.” Or isn’t that what “Atlas Shrugged” basically is, plus some loooooooong didactic passages about libertarianism? (Haven’t read it!)
For all I know, “meh” is not actually a word, but somehow it perfectly describes the new Atlas Shrugged trailer. This movie has been through true development hell – detailing every incarnation would be a long, strange trip, but for some reason, no one’s ever pulled the trigger on it until now.
Its 40-year ride through development, through Brad Pitt and Russell Crowe, through Angelina Jolie and Charlize Theron, has deposited it here – without any big stars and split up into three films.
Before I begin, no post about Atlas Shrugged is complete if it does not include this:
There are two novels that can change a bookish fourteen-year-old’s life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs. - KF Monkey
So, for your Friday evening’s viewing pleasure (courtesy of DougJ aka A Writer At Balloon Juice, LLC) and pretty much everyone else who has stumbled across this youtube nugget: Atlas Shrugged: The Movie, coming soon to cinema emporium hopefully nearer to you than me.
Trains. Who in America doesn’t want to see more movies with lots and lots of trains in them? And industrialists talking about money and profits. And trains. Let’s go to the imdb description:
A powerful railroad executive, Dagny Taggart, struggles to keep her business alive while society is crumbling around her.
As we can see from the preview, Dagny is going to shut down her train business and that will make America fail. Because America’s trains …. well, I guess they power iPhones or make porn or something. And we all know that America cannot live without those things.
According to someone at imdb who seems to be in the know:
Rand’s dramatic classic comes to the screen after decades of endeavor. Although on a tight budget, it is well cast, and the story is given a modern setting to appeal more to today’s audiences.
If they wanted to update it to appeal to modern audiences then the trains would change into big robots and start fighting each other amidst shit blowing up. Then they could have gotten Michael Bay to make this film. It would still be shitty, but at least it would make money.
That’s the title and it’s by me, the Amazon link is here, Barnes&Noble here. That’s an eBook only, about 15,000 words, and it costs $4.00. If you wish, think of it as a “Kindle single.”
Your copy will arrive on January 25 and loyal MR readers are receiving the very first chance to buy it. Very little of the content has already appeared on MR.
Many of you have read my article “The Inequality that Matters,” but there I hardly touched on median income growth. That is because I was writing this eBook.
Has median household income really stagnated in the United States? If so, why? Are the causes political or something deeper? What are the important biases in how we are measuring national income and productivity and why do they matter for economic policy? Are we getting enough value for all the extra money we are spending on the health care and education sectors? What do some major right-wing and left-wing thinkers miss about this phenomenon?
How does all this relate to our recent financial crisis?
I dedicated this book to Michael Mandel and Peter Thiel, two major influences on some of the arguments.
Why did big government arise in the late 19th and early 20th centuries, what is its future, and why is science so important for macroeconomics? How can we fix the current mess we are in?
How great was Tyler Cowen’s marketing coup? Well he forced a technophobe like me to actually learn how to use Kindle. I wasn’t too happy about that, which makes me inclined to write a very negative review. But that’s kind of hard to do credibly when I agree with the central proposition of the book: that technological progress (at least as traditionally measured) has slowed dramatically, and will continue to be disappointing for the foreseeable future.
In an earlier post I argued that my grandma’s generation (1890-1969) saw the biggest increase in living standards: most notably a longer lifespan (due to diet/sanitation/health care), indoor plumbing and electric lights. Less important inventions included home appliances, cars and airplanes, and TVs. From the horse and buggy era to the moon landing in one life. And all I’ve seen is the home computer revolution. Not much consolation for a technophobe like me. I’m probably even more pessimistic than Tyler.
The parts of the book I liked best were those that discussed governance. I had noticed that there was a correlation between cultures that are good at governance, and cultures that are good at running big corporations. But Tyler added an interesting perspective, arguing that the technologies that facilitated the growth of big corporations also facilitated the growth of big government. I don’t recall if he made this point, but I couldn’t help thinking that the neoliberal revolution, which led to some shrinkage in government size, was also associated with a move away from the big corporate conglomerates of the 1960s, towards smaller and more nimble businesses.
Tyler has a long list of complaints about the wasteful nature of our government/education/health care sectors, which he hinted is really just one big sector. While reading this section I kept wondering when he was going to mention Singapore, which has constructed a fiscal regime ideally suited for the Great Stagnation. When he finally did, on “Page” 830-37, he did so in an unexpected context, as an example of a society that reveres scientists and engineers. He had just suggested that the most important thing we could do to overcome the stagnation was:
Raise the social status of scientists.
My initial reaction was skepticism. First, how realistic is it to expect something like this to happen? I suppose the counterargument is that every new idea seems unrealistic, until it actually occurs. But even if it did, would it really speed up the rate of scientific progress? My hunch is that if we doubled the number of people going into science, there would be very little acceleration in scientific progress. First, because the best scientists (think Einstein) are already in science, driven by a love of the subject. Second, with a reasonably comprehensive research regime, progress in finding a cure for cancer may require a certain set of interconnected discoveries in biochemistry that simply can’t be rushed by throwing more money and people at the problem. Similarly, progress in info tech may play out at a pace dictated by Moore’s law. Given Moore’s law, no amount of research could have produced a Kindle in 1983. Could more scientists speed up Moore’s law? Perhaps, I’m not qualified to say. But that’s certainly not the impression I get from reading others talk about information technology.
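To put a rough number on the Moore's law point, here is a back-of-the-envelope sketch. It assumes the conventional rule of thumb that transistor density doubles roughly every two years; the two-year period and the specific years are illustrative assumptions, not figures from Sumner or Cowen.

```python
# Back-of-the-envelope sketch: how much denser chips got between 1983 and 2007
# (the Kindle's launch year) if transistor counts double roughly every 2 years.
# The 2-year doubling period is an assumed rule of thumb, not a measured value.
DOUBLING_PERIOD_YEARS = 2

def density_ratio(start_year, end_year, period=DOUBLING_PERIOD_YEARS):
    doublings = (end_year - start_year) / period
    return 2 ** doublings

if __name__ == "__main__":
    print(f"{density_ratio(1983, 2007):,.0f}x more transistors per chip")  # ~4,096x
```

On that rough accounting, 2007-era hardware packs thousands of times more transistors onto a chip than 1983-era hardware, which is the sense in which no research budget, however large, could have conjured a Kindle in 1983.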
Here’s another exhortation that caught my eye:
Be tolerant, and realize there are some pretty deep-seated reasons for all the political strife and all the hard feelings and all the polarization.
I couldn’t help thinking of Paul Krugman and Tyler Cowen, the two brightest stars of the economic blogosphere. If only one of those two is able to have this sort of dispassionate take on policy strife, how likely are the rest of us mere mortals to be able to keep a clear head and remain above the fray? Still, it’s great advice.
Mr Cowen’s book can be very briefly (too briefly) summarised as follows. The rich world faces two problems. The first is that a decline in innovation has reduced the growth rate of output and median incomes, making it hard for rich countries to meet obligations accepted when expectations were higher. The second is that a lot of recent innovation is occurring in places like the internet, where new products are cheap or free and create very few jobs.
Mr Sumner’s response is a good one. What Mr Cowen is essentially saying, he suggests, is that the actual price level is tumbling. Technology has created a lot of great things that are available for free, and so the price of a typical basket of household consumption is dropping like a rock. People used to spend a lot of money going to movies, buying books and records, making expensive long-distance phone calls, paying for word processing software, and so on. Now, a lot of that can be done at almost no cost. Prices are falling.
That has a couple of implications. It suggests that real incomes are actually rising, at least for those consuming the bulk of the free online content. And perhaps real incomes are too high, in some cases, for labour markets to clear. Given broader disinflation (understated because non-purchased goods aren’t included in price indexes) both prices and wages may need to adjust, but if they’re sticky, then they won’t. What’s needed is reinflation.
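A stylized worked example of the sticky-wage mechanics, with made-up numbers rather than anything from Mr Cowen or Mr Sumner: suppose the true price level, once freely consumed digital goods are counted, falls 5 percent while the nominal wage W stays fixed. Then the real wage moves from

$$\frac{W}{P_0} \quad\text{to}\quad \frac{W}{0.95\,P_0} \approx 1.053\,\frac{W}{P_0},$$

an effective rise of about 5 percent. If that leaves real wages above the market-clearing level and nominal wages will not fall, employment adjusts downward instead; reinflating the price level is the other way to close the gap.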
To a certain extent, Mr Cowen is concerned about society’s ability to pay off old obligations, and one reason society might struggle to do this is that new innovations deliver value through non-monetary transactions. But the value is still there, and that’s what should really matter for the paying-off of obligations. When you borrow, you’re offering to compensate the lender with more utility tomorrow for less utility today. Thanks to the internet, utility today is cheap, and that’s only a problem because the obligations we acquired yesterday were denominated in dollars. But we can print enough money to meet yesterday’s obligations. Indeed, we should, in order to offset the deflationary pressures from the cheap innovations.
Imagine a world in which technology has advanced to the point that robots can build robots that operate at basically no cost, such that people can have anything they want anytime for free; the only constraint on consumption is the time available. That would be a cashless economy, and as a result, debtors would be totally unable to pay creditors. But does that matter?
Tyler Cowen argues that technological change since the early 1960s hasn’t been as transformative for ordinary people’s lives as the change that went before.
Better yet, think about how a typical middle-class family lives today compared with 40 years ago — and compare those changes with the progress that took place over the previous 40 years.
I happen to be an expert on some of those changes, because I live in a house with a late-50s-vintage kitchen, never remodelled. The nonself-defrosting refrigerator, and the gas range with its open pilot lights, are pretty depressing (anyone know a good contractor?) — but when all is said and done it is still a pretty functional kitchen. The 1957 owners didn’t have a microwave, and we have gone from black and white broadcasts of Sid Caesar to off-color humor on The Comedy Channel, but basically they lived pretty much the way we do. Now turn the clock back another 39 years, to 1918 — and you are in a world in which a horse-drawn wagon delivered blocks of ice to your icebox, a world not only without TV but without mass media of any kind (regularly scheduled radio entertainment began only in 1920). And of course back in 1918 nearly half of Americans still lived on farms, most without electricity and many without running water. By any reasonable standard, the change in how America lived between 1918 and 1957 was immensely greater than the change between 1957 and the present.
Now, you can overstate this case; medical innovations, in particular, have made a huge difference to some people’s lives, mine included (I have a form of arthritis that would have crippled me in the 1950s, and in fact almost did 20 years ago until it was properly diagnosed, but barely affects my life now thanks to modern anti-inflammatories.) But the general sense that the future isn’t what it used to be seems right.
Tyler Cowen’s celebrated Kindle publication “The Great Stagnation” has received a lot of attention from the Web community. The New York Times’ David Leonhardt gets the author to sit for an e-interview on his e-book and asks a good first question: If our innovation motor is broken, what should we do now?
Cowen responds that we should double down on science…
The N.I.H. has done a very good job in promoting medical innovation and this is in large part because it allocates funds on a relatively meritocratic basis; Congress doesn’t control particular grants and on many important fronts the N.I.H. has autonomy. It is one reason why the United States is the world leader in medical research and development and I would expand its funding, provided it retains this autonomy. Basic research is often what economists call a “public good” and it offers economic and health returns for many years to come.
… and get realistic about clean energy.
“Clean energy” is a very important issue, for reasons of climate change, but it won’t be a job creator in a useful sense. In terms of energy production, fossil fuels are quite powerful. With green energy, at this point, we are simply looking to break even, namely to receive some of our current power but without the negative environmental consequences which accrue from carbon. That’s a worthy goal, but we shouldn’t start thinking about green energy as speeding up economic growth or creating jobs. It’s more like a necessary burden we will have to bear and the fact that these costs lie in front of us – from both the climate change and from the technological adjustments — is a sobering thought.
These are smart thoughts from a very smart guy. But let’s think about NIH funding from a jobs perspective. If the government increases science funding and this results in more pharmaceutical drugs coming online, that’s a great thing for the pharmaceutical industry. But new drugs, like any new technology, can be disruptive. For example, a drug to ease the side-effects of end-of-life diseases might replace the need for home health aides, one of the fastest-growing jobs in the country for low-skilled workers. That’s not a reason not to develop a totally useful drug! But it throws a wrench into a claim (one that I’ve often made, too) that innovations in biosciences are pure job-creators.
Cowen’s characterization of plumbing, fossil fuels, public education systems, penicillin and so forth as “low-hanging fruit” bugs me a bit. It took human beings quite a while to figure all that out. But Cowen is right to say that once discovered, those innovations produced extremely high returns. From the economy’s perspective, the difference between having cars and not having cars is a lot larger than the difference between having cars and having slightly better cars. A 1992 Honda Accord and a 2010 Honda Accord aren’t the same, but they’re pretty close.
The obvious rejoinder to this is, “What about the internet?” The problem, as Cowen points out, is that the Internet is not yet employing many people or creating much growth. We needed a lot of people to build cars. We don’t need many people to program Facebook. It’s possible, Cowen thinks, that the Internet is just a different type of innovation, at least so far as its ripples in the labor market are concerned. “We have a collective historical memory that technological progress brings a big and predictable stream of revenue growth across most of the economy,” he writes. “When it comes to the web, those assumptions are turning out to be wrong or misleading. The revenue-intensive sectors of our economy have been slowing down and the big technological gains are coming in revenue-deficient sectors.”
Maybe the Internet just needs some time to come into its growth-accelerating own. Or maybe the Internet is going to be an odd innovation in that its gains to human knowledge and enjoyment and well-being will serve to demonstrate that GDP and even median wage growth are insufficient proxies for living standards. Either way, we’re still left with a problem: Stagnant wages are a bad thing even if Wikipedia is a big deal.
And it’s not just the Internet. Even when we’re growing, things look bad. The sectors that are expanding fastest are dysfunctional. We spend a lot of money on education and health care, but seem to be getting less and less back. The public sector is getting bigger, but it’s not at all clear it’s getting better. For much of the last few decades, the financial sector was generating amazing returns — but that turned out to be a particularly damaging scam. And economic malaise is polarizing our politics, leaving us less able to respond to these problems in an effective or intelligent way.
Tyler makes a bunch of other arguments in “The Great Stagnation” too, some more persuasive than others. Like some other critics, I’m not sure why he uses median wage growth as a proxy for economic growth. It’s important, but it’s just not the same thing. Besides, median wage growth in the United States slowed very suddenly in 1973, and it’s really not plausible that our supply of low hanging fruit just suddenly dropped by half over the space of a few years. I also had a lot of problems with his arguments about whether GDP generated by government, education, and healthcare is as “real” as other GDP. For example, he suggests that as government grows, its consumption is less efficient, but that’s as true of the private sector as it is of the public sector. A dollar of GDP spent on an apple is surely more “real” than a dollar spent on a pet rock, but there’s simply no way to judge that. So we just call a dollar a dollar, and figure that people are able to decide for themselves whether they’re getting the same utility from one dollar as they do from the next.
The healthcare front is harder to judge. I agree with Tyler that we waste a lot of money on healthcare, but at the same time, I think a lot of people seriously underrate the value of modern improvements in healthcare. It’s not just vaccines, antibiotics, sterilization and anesthesia. Hip replacements really, truly improve your life quality, far more than a better car does. Ditto for antidepressants, blood pressure meds, cancer treatments, arthritis medication, and much more. The fact that we waste lots of money on useless end-of-life treatments doesn’t make this other stuff any less real.
To summarize, then: I agree that the pace of fundamental technological improvements has slowed, and I agree with Tyler’s basic point that this is likely to usher in an era of slower economic growth in advanced countries. At the same time, improvements in managerial and organizational efficiency thanks to computerization shouldn’t be underestimated. Neither should the fact that other countries still have quantum leaps in education to make, and that’s going to help us, not just the countries trying to catch up to us. After all, an invention is an invention, no matter where it comes from. And finally, try to keep an even keel about healthcare. It’s easy to point out its inefficiencies, but it’s also easy to miss its advances if they happen to be in areas that don’t affect you personally.
Given its history of sending spandex-clad stunt doubles hurtling toward the earth, and its terrible buzz, there was little suspense about how the nation’s top theater critics would review Julie Taymor’s latest musical, Spider-Man: Turn Off the Dark. On Monday night, they posted their reviews, breaking an embargo that was supposed to last until the show opens on March 15, and it became clear that the true contest was to see which critic could craft the most withering put-down.
“Spider-Man” has not even officially opened yet. The date has been delayed five times to fix myriad problems, with Sunday afternoon being preview performance No. 66 and the opening planned for Monday night being pushed back five more weeks to March 15. But this $65 million musical has become a national object of pop culture fascination — more so, perhaps, than any show in Broadway history.
Starting with Conan O’Brien’s spoof of Spider-Man warbling in rhyme on Nov. 30, two nights after the musical’s problem-plagued first preview, the show has been lampooned on every major late-night comedy show and by The Onion, which portrayed the producers as still being optimistic about the show despite a nuclear bomb’s detonating during a preview. Recently, Steve Martin slyly referred to it in a series of tweets about watching the “Spider-Man” movies at home.
“Settling in to watch Spiderman 3 on deluxe edition DVD, but I fell from hanging cables in screening room. 2 hour delay,” he wrote.
Media celebrities like Oprah Winfrey, Glenn Beck and the hosts of “Morning Joe” have all raved about the musical, especially Mr. Beck, who said in an interview on Friday that he had seen it four times.
Mr. Beck has framed its appeal on his radio broadcast as a face-off between regular Americans and cultural snobs (i.e., liberals). In the interview, however, he was more fanboy than fire breather, rattling off plot points and design elements with the practiced eye of a Sardi’s regular.
“The story line is right on the money for today, which is to be your better self, that you can spiral into darkness or — ” here he quoted one of the show’s anthemic songs — “you can rise above,” said Mr. Beck, who estimated that he sees a dozen shows a year. “In fact, I just wrote an e-mail to Julie” — Ms. Taymor — “about how much I loved the new ending.”
Last month, “Spider-Man” became the first Broadway show since “The Producers” to land on the cover of The New Yorker; the cartoon, by Barry Blitt, who also did “The Producers” cover in 2001, showed several injured Spider-Men in a hospital ward.
“For our cover we always ask ourselves, would our one million readers know what we were making reference to?” said Francoise Mouly, art editor of The New Yorker. “But in no time at all, ‘Spider-Man’ has gotten enough notoriety that we knew the cover would make people laugh. Even the show’s producers laughed; they’ve been hounding us to buy copies of the artwork.”
Reading through the reviews this morning, it became clear that the main character in this drama isn’t Peter Parker—it’s Julie Taymor. Theater directors rarely receive the kind of mainstream attention that their Hollywood brethren do. (Do you know who Daniel Sullivan is?) But in this case, the specter of steely, uncompromising Taymor looms large over the critical discussion.
There’s a reason for this: Spider-Man is very clearly Taymor’s production, stamped with her trademark mix of spectacle and folklore. (She first gained widespread fame for her shadow-puppets-on-the-savannah production of The Lion King.) And she seems to have created a proxy for herself with Arachne, Spider-Man’s ancient, eight-legged antagonist.
Some of my colleagues have wondered aloud whether Spider-man will ever be finished — whether it is, in fact, finishable. I think they’re onto something: I saw the show on Saturday night, and found it predictably unfinished, but unpredictably entertaining, perhaps on account of this very quality of Death Star–under–construction inchoateness. Conceptually speaking, it’s closer to a theme-park stunt spectacular than “circus art,” closer to a comic than a musical, closer to The Cremaster Cycle than a rock concert. But “closer” implies proximity to some fixed point, and Spider-man is faaaar out, man. It’s by turns hyperstimulated, vivid, lurid, overeducated, underbaked, terrifying, confusing, distracted, ridiculously slick, shockingly clumsy, unmistakably monomaniacal and clinically bipolar.
But never, ever boring. The 2-D comic art doesn’t really go with Julie Taymor’s foamy, tactile puppetry, just as U2’s textural atmo-rock score doesn’t really go with the episodic Act One storytelling. Yet even in the depths of Spider-man’s certifiably insane second act, I was riveted. Riveted, yes, by what was visible onstage: the inverted Fritz Lang cityscapes, the rag doll fly-assisted choreography, the acid-Skittle color scheme and Ditko-era comic-art backdrops. But often I was equally transfixed by the palpable offstage imagination willing it all into existence. See, Spider-man isn’t really about Spider-man. It’s about an artist locked in a death grapple with her subject, a tumultuous relationship between a talented, tormented older woman and a callow young stud. Strip out the $70 million in robotic guywires, Vari-lites, and latex mummery, and you’re basically looking at a Tennessee Williams play.
We loved the show, and here is why we think people will see it:
• Flying is awesome.
There are aerial acrobatics and airborne fight scenes; the actors fly up and land among the audience. The wires are visible but don’t obstruct the view or the actors’ movements.
• The story is familiar, yet fresh.
It is based on the classic comic books, and the movie, so the audience knows what to expect — nerdy Peter Parker gets bitten by a mutating spider and acquires superpowers. After his uncle is killed, he becomes a crusader against crime. And, of course, Peter is in love with aspiring actress Mary Jane, who is in love with Spiderman.
Spiderman faces off with a bunch of villains, most notably the Green Goblin.
There are only two new story elements that the writers have introduced: the Geek Chorus — four teenagers who are obviously creating/narrating the story of Spiderman that unfolds before our eyes; and a new villain — Arachne, a character from Greek mythology, who tempts Spiderman to give in to his powers and cross over to some abstract dimension to become her boyfriend.
These new elements make Spiderman: The Musical fresh and different from the usual Spiderman adaptation. And who is to complain about an old-fashioned love-triangle plot?
• The sets are creative.
Unfolding backdrops, huge video screens; most of the set evokes the theme that this is a comic book story. The sets move surprisingly quickly, given how massive and detailed they are.
• The music is by Bono and The Edge.
The songs are very U2 and very rock at times, and it’s loud. As it should be.
• The cast
My favorites were the villains — the Green Goblin and Arachne.
• The choreography
Cool slow motion sequences.
• It’s the most expensive show ever.
With a price tag of $65 million, this is indeed the most expensive Broadway show ever produced — which is another reason why tourists and locals alike would flock to see it and judge it for themselves.
So if there are no more injuries, and the production irons out the technical glitches that still occur (tolerable during previews but unacceptable once the show opens), Spiderman should pull through for its investors, who include theater veterans like James Nederlander and Terry Allan Kramer, as well as Disney, via its acquisition of Marvel, which owns the Spiderman franchise.
Sorry, esteemed Broadway critics, but we are with Glenn Beck on this one.
And so, while we usually reserve our “Most Scathing Reviews” feature for movies, we’ll make an exception for this Broadway production that seems to wish it was a movie.
9. “Never mind turning off the dark. I spent much of this dreadful new musical muttering Please, Lord, make it stop.” — Charles Spencer, The Telegraph
8. “For without a book with consistent rules that a mainstream audience can follow and track, without characters in whom one can invest emotionally, without a sense of the empowering optimism that should come from time spent in the presence of a good, kind man who can walk up buildings and save our lousy world from evil, it is all just clatter and chatter.” — Chris Jones, The Chicago Tribune
7. “Spider-Man is chaotic, dull and a little silly. And there’s nothing here half as catchy as the 1967 ABC cartoon theme tune.” — David Rooney, The Hollywood Reporter
6. “More dispiriting is the music… [Bono and the Edge] transformed their sound into stock Broadway schlock pop—sentimental wailing from the early Andrew Lloyd Webber playbook, winceable lyrics and the kind of thumpa-thumpa music that passes for suspense in action flicks.” — Linda Winer, Newsday
5. “Or wait, maybe the bottom of the barrel is a weird on-the-runway sequence, in which a cadre of second-tier villains with names like Swiss Miss and Carnage do a bit of high-fashion sashaying. In the running, too, is a bizarre military number, as well as the first-act closer, a rip-off of a Rodgers and Hart song. The latter is sung by – get out your score cards – the other main-event evildoer, the Green Goblin, a former scientist played by the talented classical actor Patrick Page.” — Peter Marks, The Washington Post
4. “Who exactly is “Spider-Man: Turn Off the Dark” for anyway? The only answer I can come up with is an audience of Julie Taymor types who care only about panoramic sensibility — a bit of slow-mo choreography here, a smattering of diabolical mask work there. Much as I enjoyed the clever shifts in perspective during the skyscraper scenes, it was hard for me to picture adults or young people yearning for a second visit, never mind critics who may feel obliged to check back in with the production when (or should I say if?) it officially opens. Nothing cures the curiosity about “Spider-Man” quite like seeing it.” — Charles McNulty, The LA Times
3. “After all this expenditure of talent and money, “Spider-Man” is probably unfixable because too much has gone into making humans fly, which is not what they are good at. It imitates poorly what the “Spider-Man” movies do brilliantly with computer graphics — and without putting live actors in jeopardy.” — Jeremy Gerard, Bloomberg
2. “This production should play up regularly and resonantly the promise that things could go wrong. Because only when things go wrong in this production does it feel remotely right — if, by right, one means entertaining. So keep the fear factor an active part of the show, guys, and stock the Foxwoods gift shops with souvenir crash helmets and T-shirts that say “I saw ‘Spider-Man’ and lived.” Otherwise, a more appropriate slogan would be “I saw ‘Spider-Man’ and slept.” — Ben Brantley, New York Times
1. “It’s by turns hyperstimulated, vivid, lurid, overeducated, underbaked, terrifying, confusing, distracted, ridiculously slick, shockingly clumsy, unmistakably monomaniacal and clinically bipolar…At this point, I honestly hope they never fix the (non-injurious) glitches: They puncture the show’s pretense and furnish meta-theatrical opportunities that can’t be staged. We’ve had Epic Theater, we’ve had Poor Theater — is this the dawn of Broken Theater?” — Scott Brown, from his review in New York Magazine, which is actually neither negative nor positive, nor even neutral, but seems to sum up the irrationality of the whole enterprise better than any other.
Donald Rumsfeld’s memoir, “Known and Unknown,” isn’t set to be released until next week, but several news sites have obtained early copies. Previews of the book give insight into Rumsfeld’s negative opinion of several of his colleagues, his regrets (or lack thereof) from his years as defense secretary, as well as personal struggles within his own family.
Just 15 days after the terrorist attacks of Sept. 11, 2001, President George W. Bush invited his defense secretary, Donald H. Rumsfeld, to meet with him alone in the Oval Office. According to Mr. Rumsfeld’s new memoir, the president leaned back in his leather chair and ordered a review and revision of war plans — but not for Afghanistan, where the Qaeda attacks on New York and Washington had been planned and where American retaliation was imminent.
“He asked that I take a look at the shape of our military plans on Iraq,” Mr. Rumsfeld writes.
“Two weeks after the worst terrorist attack in our nation’s history, those of us in the Department of Defense were fully occupied,” Mr. Rumsfeld recalls. But the president insisted on new military plans for Iraq, Mr. Rumsfeld writes. “He wanted the options to be ‘creative.’ ”
When the option of attacking Iraq in post-9/11 military action was raised first during a Camp David meeting on Sept. 15, 2001, Mr. Bush said Afghanistan would be the target. But Mr. Rumsfeld’s recollection in the memoir, “Known and Unknown,” to be published Tuesday, shows that even then Mr. Bush was focused as well on Iraq. A copy was obtained Wednesday by The New York Times.
But Rumsfeld still can’t resist – in a memoir due out next week – taking a few pops at former secretaries of state Colin L. Powell and Condoleezza Rice as well as at some lawmakers and journalists. He goes so far as to depict former president George W. Bush as presiding over a national security process that was marked by incoherent decision-making and policy drift, most damagingly on the war in Iraq.
Much of Rumsfeld’s retrospective reinforces earlier accounts of a dysfunctional National Security Council riven by tensions between the Pentagon and State Department, which many critics outside and within the Bush administration have blamed on him. Speaking out for the first time since his departure from office four years ago, the former Pentagon leader offers a vigorous explanation of his own thoughts and actions and is making available on his Web site (www.rumsfeld.com) many previously classified or private documents.
Sounding characteristically tough and defiant in the 800-page autobiography “Known and Unknown,” Rumsfeld remains largely unapologetic about his overall handling of the Iraq conflict and concludes that the war has been worth the costs. Had the government of Saddam Hussein remained in power, he says, the Middle East would be “far more perilous than it is today.”
Addressing charges that he failed to provide enough troops for the war, he allows that, “In retrospect, there may have been times when more troops could have helped.” But he insists that if senior military officers had reservations about the size of the invading force, they never informed him. And as the conflict wore on, he says, U.S. commanders, even when pressed repeatedly for their views, did not ask him for more troops or disagree with the strategy.
Much of his explanation of what went wrong in the crucial first year of the occupation of Iraq stems from a prewar failure to decide how to manage the postwar political transition. Two differing approaches were debated in the run-up to the war: a Pentagon view that power should be handed over quickly to an interim Iraqi authority containing a number of Iraqi exiles, and a State Department view favoring a slower transition that would allow new leaders to emerge from within the country.
Shortly after the Abu Ghraib scandal broke in 2004, Secretary of Defense Donald Rumsfeld offered President George W. Bush his resignation. Bush refused. Five days later, just so there was no confusion, Rumsfeld offered again, and once again, Bush refused. It was another two and a half years until Rumsfeld was finally canned. But in his upcoming 800-page memoir, Known and Unknown, Rumsfeld writes that he really wishes Bush had just let him go earlier.
One of the few personal anecdotes in the 815-page volume takes place more than 12 hours after hijacked planes struck not only the World Trade Center but the Pentagon, filling his office with heavy smoke and forcing him to evacuate with other employees, some of them wounded. His spokeswoman, Torie Clarke, asked if he had called his wife of 47 years, Joyce. Rumsfeld replied that he had not.
“You son of a bitch,” Clarke said with a hard stare.
“I respect Secretary Rumsfeld. He and I had a very, very strong difference of opinion about the strategy that he was employing in Iraq which I predicted was doomed to failure,” the Arizona Republican said on “GMA.”
McCain and Rumsfeld had clashed over troop levels.
“And thank God he was relieved of his duties and we put the surge in otherwise we would have had a disastrous defeat in Iraq,” McCain told me.
Rumsfeld is also going to release a website full of “primary documents” that he thinks will prove his point. It will be like the WikiLeaks, only instead of pulling back the curtain and exposing American diplomatic and military secrets, they will probably just be a bunch of memos about how much Rumsfeld was “concerned” about the security situation in post-invasion Baghdad. Also I bet there will be a document that says “I promise Donald Rumsfeld had no idea that we were torturing and killing prisoners, signed, everyone at Abu Ghraib.”
Speaking of! Rumsfeld says Bill Clinton called him once and said: “No one with an ounce of sense thinks you had any way in the world to know about the abuse taking place that night in Iraq.” Yes, well, the people with ounces of sense are completely wrong.
Rumsfeld also apparently devotes a lot of space to rewaging various long-forgotten bureaucratic disputes. There is something about George H. W. Bush, whom he clearly hates. Rumsfeld also wants everyone to know that former Vice President Nelson Rockefeller was “bullying” and an “imperial vice president,” which is hilarious for many reasons, including Rumsfeld’s closeness to Dick Cheney and the fact that as Gerald Ford’s chief of staff, Rumsfeld basically blocked Rockefeller from doing anything.
Now let’s enjoy the attempted rehabilitation of Rumsfeld in the press, where his awfulness has probably been entirely forgotten.
Rummy says Defense was preparing for offense on Afghanistan at the time, but Bush asked him to be “creative.” Creative! Perhaps the military could stage a production of Grease for the people of Iraq before taking a bow and dropping a bomb on them?
The book mixes the policy and the personal; at the end of the same Oval Office session in which Mr. Bush asked for an Iraq war plan, Mr. Rumsfeld recounts, the president asked about Mr. Rumsfeld’s son, Nick, who struggled with drug addiction, had relapsed and just days before had entered a rehabilitation center. The president, who has written of his own battles to overcome a drinking problem, said that he was praying for Mr. Rumsfeld, his wife, Joyce, and all their children.
“What had happened to Nick — coupled with the wounds to our country and the Pentagon — all started to hit me,” Mr. Rumsfeld writes. “At that moment, I couldn’t speak. And I was unable to hold back the emotions that until then I had shared only with Joyce.”
Ah, there you have it. Rumsfeld could have said, “What the fuck are you talking about going to war with Iraq for? Our country was just attacked by a foreign terrorist organization we need to go try to destroy. Iraq has nothing to do with this. Aren’t you more concerned with winning this war we haven’t even begun yet?” But instead, his son had done some drugs. Sure thing, Rumsfeld. Perfectly good excuse. You should drop some leaflets on the families of people, American and Iraqi, whose children have died in that war. “Sorry, my son was doing drugs. I was emotional at the time. Not my fault.”
So here you have it: There’s finally someone to blame the entire Iraq War on: Nick Rumsfeld. HOPE YOU LIKED THOSE DRUGS, ASSHOLE!