About Me


A good day begins with the NYTimes, NPR, Arts & Letters Daily, Sacred Space & good coffee; it ends with a Grand Marnier. A brilliant day would be spent in London, New York or San Francisco -- although Sydney would be right up there. Unwinding in Carmel or Antibes. Daytime spent in a library (the Morgan, LOC or Widener) or a museum (the Frick, the Louvre, the British) with a healthy walk (around Lake Annecy); evening -- theatre (West End), or music (Carnegie Hall). A nice last meal: perhaps the French Laundry or Frédy Girardet, or Quenelles de Brochet from Taillevent, Cassoulet from Café des Artistes, Peking Duck from Le Tsé-Fung, Lobster Savannah from Locke-Ober, Sacher Torte from Demel and Café Brûlot from Antoine. Sazerac as an apéritif, Le Montrachet in the beginning, Stag's Leap Cabernet in the middle, Veuve Clicquot to conclude. Desert island: iMac, iPod (I know, generator and dish necessary), Johnnie Walker Blue Label, wife & Adler's Great Books.


Switching on to a new generation gap » The Spectator


Obama & the Coming Election by Elizabeth Drew


Obama & the Coming Election

Win McNamee/Getty Images
President Obama with Republican congressional leaders Mitch McConnell and John Boehner at the Capitol, Washington, D.C., February 2013
Thus far, interest in this year’s midterm elections is in almost inverse proportion to their importance. The most important question is whether the Republicans will gain control of the Senate while retaining their majority in the House. That could make Congress still more belligerent toward the president. It would not only continue to block progress on pressing national needs but also prevent him from shoring up the progressive faction on the Supreme Court against what a possible Republican successor would do.
Also uncertain is to what extent the Democrats can reverse the enormous gains the Republicans made in 2010, when they took over both the governorships and the legislatures of twelve formerly Democratic states. They now control twenty-six states, which has had major substantive effects on national policy. For example, twenty Republican-dominated states have refused to expand Medicaid coverage to their poorest citizens or to set up their own health insurance exchanges under the Affordable Care Act.
The outcomes in the state elections will affect the extent to which redistricting will result in an even more unrepresentative system and exacerbate gridlock and hyperpartisanship in Washington. As of now, the turnout this November is predicted to be uncommonly low, even for midterms, which traditionally attract fewer voters than do presidential elections. Midterm voters are older, whiter, and, since they include fewer and fewer veterans of the New Deal era, over time they have come to represent more conservative values than the voters in presidential contests. The political analyst Charlie Cook says:
In effect, seniors, who have always had a disproportionate influence in midterms because they have a higher participating rate than any other group, have switched sides and are more conservative than before. As this has happened, the difference between midterm and presidential electorates has widened.
We have in effect two different electorates.
Republicans have been remarkably successful in blocking bills supported by Obama, and this in turn has helped convince voters that his accomplishments are meager. Frustration with the gridlock in Washington and feelings of discouragement about the future have led to a particularly sour electorate, which also takes a dim view of the Republican Congress. (In recent polling, no more than 19 percent approved of it.) The sour mood could well affect the turnout; and a small number of voters could determine how the country is governed for the next two years.
With the president’s job approval dropping to 40 percent, according to the latest NBC/Wall Street Journal poll (other polls were slightly higher), Republicans are trying to identify his party’s candidates with him. And since for the first time his rating for likability is below 50 percent, the president now has less to fall back on. It’s often difficult for politicians of the president’s party to deflect the attacks on him. It’s even more unlikely to happen if they don’t try.
One of the most potent issues the Democrats might have had was the right wing’s engineering of the highly unpopular government shutdown last fall. But as it happened, the shutdown began on the same day, October 1, as the rollout of Obamacare, which crashed by day’s end. The shutdown ended but the problems plaguing the federal health care exchange continued a good deal longer, and left a lasting poor impression of the Affordable Care Act and of the president as a manager—a fair charge in this instance.
As expected, the Republicans are attacking incumbents who supported Obamacare—or they are demanding that would-be Democratic senators say whether they would support it, but the fact is that congressional Republicans have given up even pretending that they would repeal it. Though Obamacare is by now generally working, it remains deeply unpopular. Yet voters don’t list it as among their top concerns. It turns out that it’s the president’s name in the nickname for the law—Obamacare, a Republican invention that the president had no choice but to embrace—that’s highly unpopular, and even Republicans aren’t challenging the health care law’s most popular reforms. So Democratic candidates are loath to extoll Obamacare as such, and many of them are offering up the less than rousing line that it needs to be fixed but not ended.
Probably not since Richard Nixon have so many candidates shied away from being in the presence of their party’s president when he shows up in their states—though they welcome his strenuous fund-raising efforts on their behalf. It’s often said that the president should socialize more with Republicans, but they, too, don’t want to be seen in his presence and often turn down White House invitations; John Boehner has been forbidden by the House Republican caucus to negotiate with Obama on his own. Yet the public perception is that the failure of Washington to solve major problems during the past six years falls on the president as well as on those actually responsible—the Republicans. In fact, no president in history has faced such intransigence from the opposition party. It’s undeniable that the president’s race has a significant part in the destructive ways in which he is talked about and opposed.
Obama has on occasion fretted aloud that the focus in the news on the gridlock and dysfunction in Washington diverts attention from what he’s been able to achieve. When he’s long gone from the White House it could well become apparent that despite the odds Obama was responsible for notable achievements, among them Obamacare; getting gay marriage widely accepted; beginning to turn federal energy policy toward a more environmentally conscious set of policies; the Dodd-Frank bill’s restraints on Wall Street, however limited, with its rules still being argued over; and the establishment of the Consumer Financial Protection Agency championed by Elizabeth Warren.
Obama did much to pull the country out of the deep recession he inherited, including a rescue of the automobile industry, but a lot of people still don’t benefit from the improved economy, or have dropped out of the labor market, or have been forced into part-time jobs and lower wages.
No doubt it would have been beneficial if more money had been approved for rebuilding the nation’s crumbling infrastructure, but the votes in Congress weren’t there, just as they weren’t for a single-payer health system, and no amount of presidential rhetoric or arm-twisting—about which there is a fair amount of mythology—would have made a difference.
Obama’s one great disappointment was the failure to win comprehensive immigration reform. After the 2012 election the Republicans were panicking that if they didn’t back immigration reform, Hispanics would punish them mightily in 2016. But then they panicked that if they did back it, Tea Party candidates would upset them in their primaries in 2014.
It’s been evident for quite a while that a certain chilliness on Obama’s part has affected his relations with Congress, but it’s also questionable how much substantive difference this has made. A Cabinet officer said to me, “He’s a loner, and one result is that few Democrats are willing to take the hill for him.” Obama rose swiftly in politics and essentially on his own—he’d been on his own for most of his life—and political camaraderie is of little interest to him. His golfing foursomes are most often made up of junior White House staff and close nonpolitical friends from Chicago. This might not make much difference in the number of bills passed but it has had one very serious effect on his presidency: the Democrats’ unwillingness to praise, defend, much less celebrate the president has left the field clear to his multitude of attackers.
Obama tended to proceed on the theory that if he made some concessions to the Republicans—say, by speeding up deportations of undocumented immigrants—they might be more cooperative; but this hasn’t worked out. It’s true that he is innately cautious, and it’s also true that it is a lot easier to declare what he should have done than to show how he could actually have gotten the votes for that. Little is as simple in the Oval Office as it is to outside critics.
Obama has been beset by the same problem on foreign policy. And as a result of his own actions (or inactions), the NBC/Wall Street Journal poll this past June showed that a mere 37 percent approved of his handling of it and 57 percent disapproved. Just a year before it had been even, at 46 percent to 46 percent. What has happened in the intervening time? Obama is accused of often overthinking an issue until too late, of being too slow to act, of allowing events to dictate his responses. It might seem that after eight years of George W. Bush’s rash and disastrous actions, caution would be welcome.
But the Ronald Reagan–John Wayne myth of bold, simple solutions lies deep in the American psyche. It was all so much simpler during the cold war; and the country became accustomed to simpler rhetoric. When Obama acts, or declines to, his critics—be they John McCain or an editorial writer or one of a myriad of foreign or defense policy “experts” who pop up on television—can urge from their comfortable perches that he should do more. But when McCain and his pal Lindsey Graham argue that the president should use greater force in Libya, Syria, Iraq, Ukraine, or wherever, they aren’t required to explain the downside risks, or what they would do next if their strategy failed.
When the president authorized air strikes against ISIS in Iraq in August, the usual Republicans inevitably said he wasn’t going far enough and some Democrats began to fret aloud that he might get too involved. Though some leading Democrats quickly drew a line at the use of American ground troops, the president is as reluctant as anyone else to use them. An official who has dealt with him on policy in the Middle East says, “Avoiding another Iraq is his guiding principle.”
The difficult situation Obama was in, politically as well as militarily, over ISIS made all the more jarring Hillary Clinton’s comment that if he had taken her advice and armed the “moderate” Syrian rebels, ISIS might not have developed. It also raised serious questions about both her political and strategic judgment.
An oddity about Mrs. Clinton’s complaint that the president allowed a vacuum in Syria in which ISIS could develop is that ISIS is an offshoot of al-Qaeda in Iraq (AQI) and it first emerged there as a result of Iraq’s dysfunction; so it’s questionable whether it could have been stamped out in Syria, much less by arming “moderate” forces. Bruce Riedel, a former high-level CIA official specializing in the Middle East and North Africa, a presidential adviser, and now with the Brookings Institution, told me, “ISIS’s base and stronghold is still in Iraq—the critics are in the wrong battlefield when they claim helping in Syria would have prevented ISIS.”
Mrs. Clinton’s efforts in the face of widespread criticism to smooth things over with the president weren’t likely to cause him to forget the whole thing. He can do a slow burn with the best of them. Moreover, the Clinton camp had been trying for weeks to call attention to her disagreement with the president over Syria, among other differences with him. Obama may recall that when he was first elected president and it became known that he was considering Clinton for the nomination for secretary of state—undoubtedly on the theory of “keep ’em in the corral”—Senator Edward Kennedy warned him that he was about to make a very serious mistake that he would come to regret: that the Clintons are about themselves.
David Tulis/AP Images
Democrat Michelle Nunn (center), who is running for a Senate seat from Georgia, at a debate during the Democratic primaries, May 2014
Bruce Riedel reaffirms the president’s view of the risks of arming “moderates” in Syria. Riedel said in a recent Brookings forum: “If you think you can give weapons only to the good guys, forget it. The bad guys will get them.” Later, he told me, “The president has had a very clear policy toward Syria: stay out of it at any cost. His governing policy is to avoid getting tangled up in situations in the Middle East and North Africa that can turn out to be disasters.” But ISIS may force his hand to get more and more involved in Syria with air strikes and special forces and perhaps drones, as he has already done in Iraq.
A problem for the public is that the president occasionally sends confusing signals—doing a little of what he’d adamantly said shouldn’t be done, or feinting in the direction of more involvement without wanting to follow through. The president more than once moved toward greater involvement in Syria while at the same time seeking to make sure that it wouldn’t happen. In 2012 he drew a “red line” on the Assad regime’s use of chemical weapons against its own citizens and then was much criticized when he didn’t follow through after Assad used them.
Unfortunately for the president, such criticism is based on a partial recollection of what happened. After Assad defied him and used chemical weapons, Obama felt pressed to respond. But rather than go ahead with bombing in Syria, with all the risks of getting further drawn into a civil war he was trying to avoid, he took the famous long walk on the White House grounds with his chief of staff, Denis McDonough, to whom he’s said to feel closer than anyone else he works with—other than, of course, the ever-present Valerie Jarrett—and decided to put the issue to Congress by asking its permission to bomb in Syria.
There’s little reason to doubt that he did this in the knowledge that the permission was unlikely to be forthcoming. But the outcome was more felicitous than that. Obama accepted an offer by the Russians to negotiate the removal of the chemical weapons from Syrian hands. Since the Russians are allied with the Syrian government, Obama’s threat seems to have been more credible to Assad than to his American critics.
Another example of Obama fuzzing his declared policy actually concerns supplying weapons to the Syrian rebels. On two occasions—once in 2012, under pressure from Hillary Clinton, CIA Director General David Petraeus, and Defense Secretary Leon Panetta to arm the rebels, and again in June of this year—the president, rather than issuing a formal statement from the White House, had the CIA e-mail halfhearted requests to Congress for relatively small amounts for arms for “moderate” rebels fighting the Assad regime.
Predictably, on both occasions, Republican and Democratic members of the intelligence and foreign relations committees were skeptical, asking such questions as: How do you know whom to give the weapons to, and how does this fit our general policy of not getting drawn into the Syrian civil war? The administration had no good answers, and as the president appeared to hope, only a small and insignificant number of weapons were sent to Syrian rebels.
As when he said “Assad must go,” Obama’s occasional resorting to unsupported rhetoric contributed to the impression of a weak and indecisive leader. The improvised nature of the president’s foreign policy is only partially of his own doing. McCain and Graham notwithstanding, there can be no one-size-fits-all foreign policy now (nor do they represent the views of even the majority of Republicans on Capitol Hill). The disparate nature of the challenges—from Putin’s adventurism to ISIS’s rise—makes it difficult for a president to enunciate a clear, single policy. As Riedel put it, “‘Don’t do stupid stuff’ is as smart an organizing first principle as any.”
But it’s the sense of ad hoc policy-making that causes the public to wonder if the president knows what he’s doing. The former defense and foreign policy official Leslie Gelb wrote recently in The Daily Beast:
Mr. Obama always says a lot of smart things…. Much more than most foreign policy blabbermouths, he is attuned to the underlying centrality of politics in most world problems, and to the need to seek diplomatic solutions…. Once there is any kind of crisis, he doles out little pieces of policy daily…. Obama may view this as making sensible decisions in a step-by-step manner. To those trying to understand what he’s doing, they simply can’t follow him, let alone understand how the pieces and the day-to-day changes mesh.
Around early August the consensus among political observers was that the contours of the midterm elections were set. Chuck Todd of NBC said, “August 1st is the new Labor Day.” The consensus since then has been that the Republicans are odds-on favorites to take over the Senate. The only question has been the percentage of that likelihood. Also it’s generally agreed that no great issue evolved to dominate the fall contests. No one is seriously predicting that the Democrats can win back the House. Of 435 congressional seats, only thirty-eight—twenty-six held by Democrats and twelve by Republicans—are considered in play; and through the Republicans’ control of most states they’ve managed to redistrict in a way that gives them a significant advantage in the House.
With rare exceptions, moreover, the sixth year of a presidency is usually one that favors the opposition party. People have tired of the man in the White House. The Democratic pollster Peter Hart says that people have made up their minds about Obama and are unlikely to change them before November. Finally, by various measurements Republicans are more fired up than Democrats about voting this time. This could be the decisive factor in many if not all of the races.
Though a few of the twenty-nine Republican governorships might change hands, Republicans will still dominate the statehouses; but the rightward trend at the state level has already been blunted, and may be more so as of this election. As of now, at least one Democratic governor, Pat Quinn of Illinois, is seen to be in serious trouble. Illinois’s crisis of overpromised and underfunded pensions is the most acute in the country, and the state is nearly bankrupt.
In most of the Democratic-controlled states that the Republicans took over in 2010, they adopted the agenda of the pro-business organization ALEC, which included tax cuts, reduced spending, particularly on education, and also model laws for voter ID and relaxed gun control. But John Kasich, the Republican governor of Ohio, for example, wised up and began to move away from this essentially unpopular agenda, and so he is in a strong reelection position. Scott Walker, of Wisconsin, who has demonstrated presidential ambitions, hasn’t been quite as agile and is in a tight race, though his Democratic opponent is at a serious funding disadvantage. The two deeply conservative Republican governors in eastern states—Tom Corbett of Pennsylvania and Tea Party member Paul LePage of Maine—are highly unpopular (Corbett has the distinction of being the most unpopular governor in the country) and widely expected to go down to defeat.
Rick Scott of Florida is in a close race with Charlie Crist, a Republican turned Democrat. But probably the most interesting governorship race is in Kansas, where the incumbent Sam Brownback gave full vent to his extremely conservative fiscal and social views. Kansas is now deeply in debt. Brownback also tried to purge the more moderate Republicans in his state legislature. This caused over a hundred leading Republicans to oppose him for reelection this year. If Brownback loses, this would confirm that the country simply isn’t ready to be governed by a highly conservative agenda.
But there are reasons to hold back on prognosticating what will happen in November. There’s still plenty of time for an issue to blow up and have an impact on the outcome. In 1980 the race between Ronald Reagan and Jimmy Carter was quite close heading into the final weekend. Then, going into that weekend, it suddenly became clear that the Iranians would not release the American hostages they had been holding captive for over a year. This failure lit the fuse under a growing frustration with Carter, with the result that Reagan carried forty-four states. Moreover, nine incumbent Democratic senators were defeated in the undertow of the last-minute “wave.” Since the president is on the defensive over a number of issues, his party is more vulnerable to a wave of opposition votes that can still develop at any time up to election day.
One reason for the widespread view that the Republicans would likely take over the Senate is that the election map and math in 2014 favor them. The Democrats have twenty-one incumbent senators up for reelection, several in red or purple states, while the Republicans have fifteen, almost all of them in safe Republican states. But some races count for more than the others, the most-watched one being the reelection effort of Mitch McConnell in Kentucky, which has national implications.
Should the Republicans take over the Senate, then McConnell, particularly loathed by Democrats for his obstructionist tactics and his wintry personality, would become majority leader. To appeal to the Republican base, McConnell recently said that were he to become majority leader he would favor more government shutdowns—a total reversal of his previous position against them for fear they would hurt his party. As of August, McConnell was facing a stiff challenge by Alison Lundergan Grimes, though he has a record of pulling out victories at the last minute, sometimes with ads that are particularly nasty. But his popularity in Kentucky has hit an all-time low. Of the six Senate seats the Republicans need to pick up in order to capture a majority, three seats held by Democrats who have chosen to retire have for some time been considered by pollsters and analysts to be lost to the Republicans: South Dakota, West Virginia, and Montana. There’s no reason to doubt them on this. In the remaining seven close races where the Democratic incumbent faces a strong challenge or there’s an open seat—Louisiana, Arkansas, North Carolina, Colorado, Iowa, Alaska, and Michigan—the analyses have gone back and forth on how the Democrat is doing. At times Mark Pryor of Arkansas, Mary Landrieu of Louisiana, Mark Udall of Colorado, and Mark Begich of Alaska have been believed to be in peril, only to be resuscitated as “doing better.”
The Democrats’ highest hopes of capturing a previously held Republican seat have been placed on Michelle Nunn, the former executive director of George H.W. Bush’s Points of Light volunteer association and daughter of the popular former senator Sam Nunn. But Michelle Nunn faces another scion of Georgia’s political aristocracy, David Perdue. While the demography of Georgia has been moving toward the Democrats, the most reputable analysts now say that the state hasn’t yet changed enough for a Democrat to win it this year.
Whether or not the Republicans take control of the Senate, the ground there has already shifted to the right. While national Republican officials boast that not one of their incumbents was defeated by a Tea Party challenger—and unlike in the last two elections they had avoided nominating any goofballs (doing so had cost the party six seats)—the victories of what are called “mainstream” Republicans over Tea Party challengers haven’t been without cost to the party’s standing in the next presidential election. For one thing, some of the victories weren’t so thumping as to warrant discounting the Tea Party’s effect on the GOP. In most cases the incumbent had to move to the right in order to prevail.
The Republicans are so uncertain of victory in elections to federal offices that they’re still resorting in several states to passing laws that make voting more difficult for minorities and other groups who would ordinarily vote for the Democrats. Some of these laws are even stricter than those adopted in 2012. Democrats might appear to have issues that could drive their voters to the polls. These would include Republican efforts to deprive women of their own reproductive decisions and opposition to such measures as raising the minimum wage and making unemployment insurance last longer.
Still, largely because of the president’s unpopularity, the Democratic candidates have been having problems finding their voice. Most of their races are focused on the vulnerabilities of their opponents, making for a thus far unedifying election. The result is that a midterm election with national implications so far has no overall national theme.
Unknown at this point is the effect of the unprecedented amounts of outside money being poured into many of the races. It’s estimated that the Kentucky race alone will cost $100 million, the highest amount ever for a state contest. In addition, numerous members of the more militantly liberal Democratic wing have been holding back support of their party’s candidate because of impurities they find in the president’s or candidate’s positions. Democrats “disappointed” in Obama could help elect a Republican Senate. The odds may be stacked against the Democrats this November, but whether they can stave off a loss of control of one half of Congress is still up to them and their would-be supporters.
—August 27, 2014


No Easy Remedies

America in Decay

The Circus

The Ordinary Acrobat by Duncan Wall. P.T. Barnum was one of the best-known men in the world in the nineteenth century. Though he is best remembered today for his reinvention of the circus, that came late in his life. His claim to fame during most of his career was the American Museum in lower Manhattan. There, among other things, he assembled the greatest collection of "freaks" in the world, and thirty-eight million paying visitors passed through his museum's doors during a time when the nation's population was just above thirty million:

"In his museum, ... a person could experience in a day all the wonders of the world -- animals, spectacle, and adventure. This similarity to the fairgrounds [of Barnum's youth] is particularly evident in Barnum's relationship with 'freaks' (a.k.a. human oddities, living curiosities). As a concept, 'freaks' date back to the ancient period. African Pygmies entertained the royals of Egypt, and Roman emperors delighted themselves with midgets dueling obese women. There were self-made and congenital or natural 'freaks.' Those self-made altered themselves through body modification -- most frequently tattooing or piercing -- but also through weight gain or starvation, such as the Fat Boy of Peckham and Giuseppe Sacco-Homann, the famous World Champion Fasting Man, both celebrities on the English fairgrounds. Natural 'freaks' were usually born with some kind of deformity or genetic condition -- dwarfs, conjoined twins, and people with secondary sexual characteristics of the opposite gender (e.g., bearded women). Often they had a skill to complement their abnormality. Matthias Buchinger was born on June 3, 1674, in Nuremberg without arms or legs, but later learned to play a half-dozen instruments and perform calligraphy displays, which he did for the kings and queens of Europe.

"The first 'freak' display in the United States occurred in 1771, when Emma Leach, a dwarf, was shown in Boston. Around 1840, full 'freak shows' began to emerge, traveling with menageries or in the company of 'handlers' who managed the promotion and exhibition of the stars, enhancing their natural deformities with a story or an exotic medical explanation. (As Tom Norman, Barnum's English equivalent and the handler of the Elephant Man, wrote in his autobiography, 'It was not the show, it was the tale that you told.')

P.T. Barnum and General Tom Thumb
"Barnum was in this tradition, and he excelled at it. According to his biographer, A. H. Saxon, nearly every famous freak of the period spent a few weeks in the showman's employ: R. O. Wickward, the skeleton man; Jane Campbell, 'the largest Mountain of Human Flesh ever seen in the form of a woman'; S. K. G. Nellis, the armless wonder, who could shoot a bow and arrow with his toes. Many of the freaks appeared as stars in his museum, either as roving attractions, as part of special exhibitions, or as spectacles in the theater in back. Sometimes Barnum toured with them as well. General Tom Thumb (Charles Stratton) was a twenty-five-inch-tall four-year-old midget, who Barnum claimed was eleven. Barnum coached the boy to perform impersonations of various heads of state, including Queen Victoria, whom he visited on three separate occasions. In Paris, the duo played to Napoleon III and in a series of shows at the Salle Musard that sold out months in advance. 'The French are exceedingly impressible,' Barnum wrote of the visit in his 1896 autobiography Struggles and Triumphs, 'and what in London is only excitement in Paris becomes furor.'

"Given our modern mores and science, most people -- circus historians included -- lament these displays. At best they were grossly lowbrow, at worst debauched. Russian circus-historian Yuri Dimitriev once called them 'a disgrace to human dignity.' 'They were an insult to the very essence of the circus where the skill and beauty of the human body are celebrated,' he alleged, 'playing on the basest instincts of the gawking crowd.' But it's also important to consider the context. Though much of the interest in 'freaks' indeed derived from inconsiderate or malicious instincts, the 1850s were an age before photographs, cultural museums, or widespread literacy. Audiences were curious about the world, and Barnum played to this curiosity in his exhibits. He advertised his museum as an 'encyclopedic synopsis of everything worth seeing in this curious world.' He presented his artifacts, however strange, as part of the scientific revolution sweeping the globe. For example, he called his ape-man the 'missing link' in Darwin's theories of evolution. Barnum succeeded in this presentation because the museum's atmosphere was consistently middlebrow. A lifelong teetotaler, he prohibited profanity, sexuality, and liquor. In letters he referred to himself as the 'Director of Moral and Refined Exhibitions for the Amusement and Instruction of the Public.' 'Barnum's genius was in developing popular potential,' Bluford Adams, a Barnum scholar, told me. 'He would take an idea, make it safe for the middle classes, and then commercialize it to the hilt.'

"Approximately thirty-eight million paying visitors passed through Barnum's doors in that time span. This figure is particularly astounding given America's population at the time: thirty-two million just before the Civil War. The American Museum made P. T. Barnum rich."

The Ordinary Acrobat: A Journey into the Wondrous World of the Circus, Past and Present
Author: Duncan Wall
Publisher: Alfred A. Knopf, a division of Random House
Copyright 2013 by Duncan Wall


If Mike Trout walked into your neighborhood bar, would you recognize him? Let me rephrase: If the baseball player who is widely considered the best in the world—a once-in-a-generation talent, the greatest outfielder since Barry Bonds, the most accomplished twenty-two-year-old that the activity formerly known as the national pastime has ever known—bent elbows over a stool and ordered an I.P.A., would anyone notice? A few weeks ago, Trout, who plays center field for the Angels, hit a ball nearly five hundred feet. At the All-Star Game, he was clocked at twenty miles per hour—rounding the bases, on foot. Yet his Q rating is about on par with that of Jim, the guy in South Jersey whose burgers Trout’s mother sometimes mails, frozen, to her superhuman son in Anaheim, to keep him rooted in the tastes and comforts of home. The pride of Millville: a chubby-cheeked mama’s boy with a haircut certified by the Marine Corps. He strides among us like a colossus, anonymous.
“Is baseball in trouble?” is one of those questions—like “Is football too violent?” or “Is golf too boring?”—that is both everlasting and newly inescapable, symptomatic of an era in which the games we watch, ostensibly to amuse ourselves, are commonly analyzed like brands. It’s the wide world of sports as a high-school cafeteria, surrounded by bleachers. “Will the center table make room for soccer today?” we all ask ourselves, while keeping a suspicious eye on the commotion that’s quietly been gathering around lacrosse, in what used to be the preppie corner. “Is football drunk? If he’s not careful he might soon be expelled.” Meanwhile, “Poor Nascar. His parents got whacked by subprime, and now, after a brief flirtation with some cheerleaders, he appears destined for trenchcoat mafia.”
The discussion of baseball’s health, pro and con, generally peels one of two ways: economic or cultural. The economic argument points to the league’s paid attendance, for instance, which is very high. Thirty thousand people attend the average ballgame today, compared to fifteen thousand in the early nineteen-seventies. Annual revenues, too, are strong: eight billion, more than double what they were when McGwire and Sosa were chasing Maris, owing to ever more lucrative TV-rights deals. How bad can the situation really be if it’s substantially better than it was back when home runs were on everyone’s mind? (Consider that the N.F.L.’s annual ten billion is often cited casually as prima-facie evidence of football’s invulnerability: concussions consmussions.) In absolute terms, baseball is doing just fine—thriving, even.
The cultural argument returns us to the cafeteria, and it begins by noting that nobody seems to be discussing home runs any longer. It’s a relativist perspective, whereby the pecking order is foremost. On television nowadays, the World Series can hardly compete with Browns versus Jaguars, Week Seven. The so-called Fall Classic’s ratings have been declining in recent decades, roughly mirroring the vaunted gains in midsummer attendance. It’s not baseball that’s doing fine, in other words; it’s the Yankees, the Red Sox, the Cardinals, the Dodgers, the Giants, the Brewers—everyone except South Florida, basically. You watch your team, but not mine—an arrangement befitting our partisan moment. What’s more, the other major team sports have made similar or, in some cases, greater proportional gains in attendance during the same supposedly triumphant period in baseball. This suggests, or at least raises the possibility, that M.L.B. owes its economic boom to little more than the Baby Boom: more people, with more disposable income. Who will fill the seats vacated by Boomers after they come up lame? Relatively speaking, baseball fans are geriatric. (And white—let’s not forget the waning African-American enthusiasm for the sport, to the extent that stories like this one, about the success of an all-black team in the Little League World Series, are meant to be read as surprising good news.)
As a fan of not just baseball but hockey (good revenues and great attendance, by the way!), I’ve long since grown hardened to the bullying implicit in the relativist argument. As long as the athletes we admire are paid enviable wages, and as long as the games we want to watch are broadcast on TV or streamed on the Internet, who cares what the smirking zeitgeist surfers think? But the Trout conundrum strikes me as a significant milestone in baseball doomsaying—more problematic, say, than the demise of corporate slow-pitch leagues, which the Wall Street Journal recently foretold. When was the last time baseball’s reigning king was a cultural nonentity, someone you can’t even name-drop without a non-fan giving you a patronizing smile?
I’ve been thinking about Trout lately, because of the interminable retirement parade for Derek Jeter, and because of Bud Selig’s planned departure from the commissioner’s office in January. In a few months, Red Sox Nation will toast David Ortiz on the occasion of his thirty-ninth birthday. Soon enough, Big Papi, too, will be gone—and baseball under Commissioner Rob Manfred may be looking at a horizon devoid of personalities who exist beyond the realm of fantasy leagues. (Barroom debates are at their best amid the buzz of a couple of beers, so let’s set aside the Puig factor, as well as the Mo’ne phenomenon, which produced better ratings for ESPN last week than any adult game since 2007.)
“It feels as if he rolled into baseball out of the pages of a W.P. Kinsella novel,” ESPN’s Jayson Stark wrote of Trout last month. Kinsella’s “Shoeless Joe” inspired “Field of Dreams,” a movie whose appeal (however treacly, no matter the Costner) derives from romantic, rather than economic, assumptions about baseball’s role in the national consciousness. It débuted in the spring of 1989, the rookie year of Ken Griffey, Jr., a prospect so highly anticipated that he had his own branded candy bar. The quarter century that followed gave us Bonds and A-Rod, Pedro and the Rocket—and now Mo’ne Davis, a thirteen-year-old girl whose dream is to play point guard for UConn.


Syllabus Tyrannus 

The decline and fall of the American university is written in 25-page course syllabi.

Illustration by Alex Eben Meyer.
When I was an undergrad in the ’90s, there was little more exciting than the first day of class. What will my professor be like? What books will I be reading? How many papers will I have to write? Answers came readily, in the form of a tidy one-page document that consisted solely of the professor’s name and office hours, a three-sentence course description, a list of books, and, finally, a very brief rundown of the assignments (papers, exams) and their relevant dates. This was a course syllabus in 1996, and it was good.
Rebecca Schuman is an education columnist for Slate.
If, like me, you haven’t been a college student since the Clinton administration—but, unlike me, you also aren’t a professor today—then you might be equal parts impressed and aghast at what is required for a course syllabus now. Ten, 15, even 20 pages of policies, rubrics, and required administrative boilerplate, some so ludicrous (“course-specific expected learning outcomes”) that I myself have never actually read parts of my own syllabi all the way through.
The texts? The assignments? Unsurprisingly, these are still able to fit on a page or two. The meticulous explanations of our laptop policies, or why, exactly, it’s inadvisable to begin course-related emails with “heyyyyyyyyy,” or why we will not necessarily return said emails at 3:15 a.m.? A novel’s worth. Today’s college syllabus is longer than many of the assignments it allegedly lists—and it’s about as thoroughly read as an end-user license agreement for the latest update of MS Word.
As any professor can tell you—or, possibly, show you on T-shirts both clever and profane—endless syllabi result in a semester-long litany of questions whose answers are actually readily available on that most-unread of documents. Today’s ever-creative professors have been compelled to institute syllabus quizzes that a student must pass before she may turn in any assignments. My own method is to simply assign my syllabus as the course’s first reading, with the warning: “I will know if you haven’t read it.” Half of my students think I’m bluffing, so they don’t read all the way to the end, where I’ve put both sincere congratulations and a directive to email me with a question, for credit. Imagine their horror when their first grade in my course is an F for an assignment they didn’t even know existed. (Since my syllabus explains that I accept late assignments, though, the F is fleeting.)
Syllabus bloat is more than an annoyance. It’s a textual artifact of the decline and fall of American higher education. Once the symbolic one-page tickets for epistemic trips filled with wonder and possibility, course syllabi are now but exhaustive legal contracts that seek to cover every possible eventuality, from fourth instances of plagiarism to tornadoes. The syllabus now merely exists to ensure a “customer experience” wherein if every box is adequately checked, the end result—a desired grade—is inevitable and demanded, learning be damned. You want to know why, how, and to what extent the university has undergone a full corporate metamorphosis? In the words of every exasperated professor ever, “It’s on the syllabus.”
So how did this happen? Sometime between 1998, when I finished my undergrad degree and one-pagers were still standard, and now, when the average length of my academic friends’ syllabi is 15 pages, several important changes took place at this country’s colleges and universities.
First, the helicopter generation—raised on both suffocating parental pressure and the teach-to-the-test mandates of No Child Left Behind—started coming to college. Everyone needed A’s, and everyone needed to know exactly what needed to be done to get one. When that wasn’t abundantly clear, schools became vulnerable to lawsuits.
Second, syllabi went from print to online, thus freeing the entire professoriate to capitulate to the aforementioned demands for everything from grading rubrics to the day-by-day breakdown of late assignment policies, without worrying about sacrificing trees or intimidating the class with a first-day handout they could barely lift, much less peruse in a mere 75 minutes.
Third, the skyrocketing percentage of hired-gun adjuncts—as opposed to tenure-track faculty, who have both a modicum of security and a minuscule say in university governance—meant that a substantial number of instructors were taking on courses a matter of weeks (sometimes days) before they began. Thus, they relied heavily on extensive syllabi already in existence.
And, finally, universities—especially public institutions, ever-starved of tax revenue and ever-more-dependent upon corporate partnerships and tuition—started hiring CEOs as administrators, most of whom gleefully explained that they would start running these public, nonprofit entities like businesses.
With corporatization came prioritization of the student “customer experience”: climbing walls, luxury dorms, and coursework that is transactional rather than educational. To facilitate the optimal experience for these customers, administrators began to increase oversight of their faculty, which, with an ever-adjunctifying professoriate unable to fight back, became ever easier to do. And so the instructors—wary of lawsuits and poor evaluations that would cost them their jobs—had little choice but to pass that micromanagement on to the students.
Obviously, the only real solution would be for the entire system to shake some sense into its head, like a basset hound coming in from a driving rainstorm. Oh, hey, the basset hound would realize, corporatized education hurts almost everyone it touches. It hurts the students who go into lifelong debt to be taught by adjuncts making $17,000 a year; it hurts the staff on forced furlough; it hurts the alumni, who learn little more than how to fulfill a meticulously circumscribed contract, and who are foisted, unprepared, upon an intransigent job market. Really, it hurts everyone but the administrators and the for-profit textbook publishers—but, of course, they run the show.
So my recommendation is something at which we intellectuals excel: a subtle war of passive aggression. Go ahead and include that admin boilerplate, but do it at the end, in six-point type, and label it “Appendix A: Boilerplate”—or, even better, “tl;dr,” since the executive vice dean in charge of micromanaging your syllabus probably won’t know what that means. Make it very clear, simply through the use of placement and typeface, what you think is important for students to read and what you don’t.
Finally, explain to your students, face-to-face, that even though a syllabus is a contract, it’s an inappropriately developed one, composed of transparent ass-covering and bad intentions, and that any college course actually worth attending is going to begin with at least some air of mystery about what you “need” to get an A. Because, you’ll explain, what you need is to learn and learn well, and if you already knew what you needed to know, you wouldn’t be in the class in the first place. The students probably won’t be paying attention, because they’ll be texting—and they won’t know they’re not allowed to, because they won’t have read the texting rules on the syllabus.

Stefan Zweig

The careers of Stefan Zweig and Walter Benjamin offer a contrast so perfect as to become almost a parable. The two writers were contemporaries (Benjamin was born in 1892, Zweig in 1881), and both operated in the same German literary ecosystem, though Benjamin was from Berlin and Zweig from Vienna. Both reached their height of productivity and reputation during the Weimar Republic, and as Jews both were forbidden from publishing in Germany once Hitler took power. And both ended darkly as suicides: Benjamin took his life in 1940 while trying to flee from France to Spain, and Zweig died a year and a half later in Brazil, where he sought refuge after unhappy sojourns in England and America.
Yet the similarities end with their biographies. As writers, they could not have been more different, and their literary destinies were exact opposites. Zweig flourished during his lifetime, enjoying huge sales of his psychologically charged novels and his popular historical biographies. Born with a fortune (his father was a textile manufacturer in Bohemia), he earned another fortune through his books, carrying into literature the bourgeois discipline and regularity that he inherited from his businessman ancestors. Three Lives, the definitive biography of Zweig by Oliver Matuschek, describes his annual production of books during the 1920s:
Over time Zweig had evolved a taut and effective work schedule for the production of his books. The winter months were spent in assembling the material, the spring was used for working up the early drafts, so that the final draft could be completed during the summer and the manuscript then sent off to the publisher as soon as possible. This allowed the typesetting and proofreading to be completed in good time by the autumn, in order to get the printed and bound copies into the bookshops to catch the Christmas trade.
Benjamin, by contrast, was not remotely as popular, nor would he have wanted to be. His audience was not the public at large but his fellow writers and intellectuals, who held him in the highest esteem; Brecht, Hofmannsthal, Adorno, and Scholem were among his friends and patrons. Zweig, whose books were bestsellers in several languages, was able to survive the loss of his German market and remain fairly prosperous; but for Benjamin the exile from Germany was devastating, and he spent the rest of his life in dire poverty. When the two men died, Zweig was one of the most famous writers in the world, Benjamin one of the most obscure.
Yet today there has been a reversal of their fortunes. It is Benjamin who has been canonized as one of the most important theorists of modernism, his works studied and debated and interpreted endlessly. He has become an emblem of the fate of the mind under fascism, not just a thinker but, in the hands of admirers such as Susan Sontag, also a kind of saint. Zweig, on the other hand, was until very recently a cipher on the American scene, a name from history rather than a living literary presence. It is a literary tortoise-and-hare fable, whose familiar if unwelcome lesson is that the most serious, most difficult, most “highbrow” writing is usually what wins in the end.
What are we to make, then, of the current burst of interest in Zweig’s work? Thanks almost entirely to two publishers (New York Review Classics in the United States and Pushkin Press in Great Britain), novels and novellas from Zweig’s lengthy catalogue are pouring back into print at a fast clip. Zweig is written about in The New York Times; his extraordinary memoir, The World of Yesterday, which has just been reissued in a new translation by Anthea Bell, is cited by Wes Anderson as an inspiration for his film The Grand Budapest Hotel. And now The Impossible Exile, George Prochnik’s fine study of Zweig’s last years, brings the melancholy tale of his emigration and death to a new generation of readers.
It is not clear, however, that this surge of interest has been accompanied by any increase in his critical standing. Zweig remains today, as he was during his lifetime, the tragic German Jewish émigré writer whom it is acceptable to disdain. A few years ago Michael Hofmann caused a minor sensation with an essay in the London Review of Books when he attacked the long-dead and almost forgotten writer with as much passion and invective as if he had been, say, Jonathan Franzen. “Stefan Zweig just tastes fake,” Hofmann acidly quipped. “He’s the Pepsi of Austrian writing.”
In doing so, he was reviving an old tradition of intellectual sniping. Three Lives is packed with the nasty things that other writers had to say about Zweig, who was less gifted than they were but, infuriatingly, much more successful. To Hofmannsthal, he was a “sixth-rate talent.” Karl Kraus, told that Zweig had triumphed in all the languages of the world, replied, “Except one,” a jibe at his less-than-perfect German style. A satire published in 1920 described a creature called “the Steffzweig”: “there are a few who still regard it as a living being. However the Steffzweig is an artificial creation, constructed for a writer’s conference in Vienna from feathers, skin, hair etc. taken from all manner of European animals.” Kurt Tucholsky summoned a whole world of pathetic mediocrity when he described a character this way: “Frau Steiner was from Frankfurt am Main, no longer in the first flush of youth, quite alone and dark-haired. She wore a different dress every evening, and sat quietly at her table reading refined books. In a word, she belonged to the readership of Stefan Zweig. Enough said? Enough said.”
Imagno/Getty Images
Stefan Zweig in front of his house in Salzburg.

The saddest thing about all this abuse is that no one was quicker to acknowledge the proper scale of his gifts, or to defer to writers of superior talent, than Zweig himself. In 1933, when the Nazis started holding bonfires of books, Zweig was one of the authors consigned to the flames. In The Impossible Exile, Prochnik quotes his reaction: it was, Zweig said, “an honor [rather] than a disgrace to be permitted to share this fate of the complete destruction of literary existence in Germany with such eminent contemporaries as Thomas Mann, Heinrich Mann, Franz Werfel... and many others whose work I consider incomparably more important than my own.” Hofmannsthal’s references to Zweig drip with contempt, and on several occasions he actively tried to sabotage Zweig’s career. But in his memoir Zweig compares Hofmannsthal to Keats and Leopardi, and recalls with awe the first time he heard him speak: “I have never known conversation on such an intellectual level with anyone else.”
One of the most important facts about Zweig is that he was perhaps even more passionate a collector than he was a writer; and what he collected were the literary and musical manuscripts of Mozart and Beethoven and Goethe, the highest peaks of human genius. Zweig did not claim to dwell at that height himself, but he was gifted enough (and sufficiently imbued with the German-Jewish passion for Bildung, or personal cultivation) to worship at the shrine of art. In The World of Yesterday, he writes rather more enthusiastically about his acquisitions (a manuscript page of Faust, the handwritten score of Schubert’s “An die Musik”) than he does about his own books: “I was aware that in this collection I had created something that in itself was worthier to last than my own works.” He declared quite straightforwardly that nothing he produced before the end of World War I, when he was thirty-seven, is worth preserving.
Yet even this love of culture can become a charge in the indictment against Zweig. For one thing, it can make him look like what Proust contemptuously called a “célibataire d’art,” one whose relations with art become passionate to the point of abjection, precisely because they are not truly creative. Certainly there is something alarming about the passage in The World of Yesterday where Zweig proposes, in all earnestness, that the great books are too long and need cutting:
I could not help wondering what exactly it was that made my books so unexpectedly popular... I think it arose from a personal flaw in me: I am an impatient, temperamental reader. Anything long-winded, high-flown or gushing irritates me, so does everything that is vague and indistinct, in fact anything that unnecessarily holds the reader up... why not bring out a series of the great works of international literature, from Homer through Balzac and Dostoevsky to Mann’s The Magic Mountain, with the unnecessary parts cut?
Balzac and Dostoevsky to one side, it is certainly true that the strength of Zweig’s fiction is its compression and intensity. He specialized in short novellas, the best of which have titles that get straight to the emotional point: Burning Secret, Confusion. (It is no accident that many of these stories were made into successful movies, most famously Ophüls’s Letter from an Unknown Woman.) Zweig, who was a friend of Freud and wrote a biographical essay about him, is fascinated by primal scenes, moments when naive young people are initiated into the powerful and perverse forms that sexuality can take. In Burning Secret, a young boy accompanies his mother to a resort, where she enters into a flirtation with a predatory nobleman. Like a detective, the boy is determined to figure out exactly what is going on with these strange grown-ups: what is the force that draws them together, and why do they seem not to want him around? The story’s climax comes when the boy eavesdrops on his mother and the baron in the hallway at night, and misinterprets their erotic struggle as an attempted murder, a primal scene gone terribly awry.
Similarly, in Confusion, a handsome young college student falls under the intellectual spell of his professor, and decides that he will help the old man complete his long-unfinished magnum opus on the Elizabethan drama. But the student cannot understand why the professor blows hot and cold, alternately encouraging him and holding him at an ironic distance. Not until the student ends up in bed with the professor’s young wife does his confusion begin to clear up: the professor, he realizes, is gay, and he is fighting his own attraction to the young man. The book concludes with the teacher’s passionate confession to his protégé, in which Zweig combines an acute analysis of the erotics of teaching with a remarkably forward-thinking plea for sexual toleration.
Even Zweig’s only full-length novel, Beware of Pity (written in the late 1930s, when he had fled Austria for England), is a novella at heart, focusing, again, on the romantic education of a naive young man. In this case, Lieutenant Hofmiller is a cavalry officer in a small Hungarian town on the eve of World War I. At home in the barracks but ill at ease in society, Hofmiller finds himself horrified to have committed a fairly innocent social blunder: at a party at the local nobleman’s mansion, he invites the daughter of the house to dance, not realizing that her legs are paralyzed. Overcome with shame and pity, he enters into a strange and ultimately destructive relationship with the girl, Edith, allowing her to believe that he wants to marry her when in fact the idea terrifies him. Here it is emotional immaturity, rather than sexual immaturity, that must be outgrown during the painful transition to adult understanding.

Not for nothing, clearly, was Zweig a product of the Vienna of Freud, Schnitzler, and Schiele. And if The World of Yesterday turns out, as it now seems, to be Zweig’s most lasting and important book, it is in large part because of his rich evocation of Vienna’s turn-of-the-century culture. The city that Zweig describes, the one in which he grew up and eventually triumphed, was the Vienna of the educated Jewish haute bourgeoisie. It was only the money and the curiosity of this class, Zweig argues, that made the city’s golden age possible: “the part played by the Jewish bourgeoisie in Viennese culture, through the aid and patronage it offered, was immeasurable. They were the real public, they filled seats at the theater and in concert halls, they bought books and pictures, visited exhibitions, championed and encouraged new trends everywhere with minds that were more flexible, less weighed down by tradition.” His own movement from the mercantile middle class to the cultural elite, Zweig writes, was the ideal trajectory of all German Jewish families, instancing Aby Warburg and Ernst Cassirer, and he could have added Ludwig Wittgenstein or, indeed, Walter Benjamin.
This Jewish cultural assimilation was made possible, Zweig explains, by Vienna’s love of the arts and its tradition of toleration: “Poor and rich, Czechs and Germans, Christians and Jews peacefully lived together in spite of the occasional needling remark.” At such moments, however, the rose color of Zweig’s nostalgia is impossible to ignore. For the Vienna he idealizes was the same city where anti-Semitism flourished, where Karl Lueger became mayor on an explicitly anti-Semitic platform, and where the young Hitler laid the groundwork for his plan to exterminate the Jews. If you take Zweig at face value, it is inexplicable how Vienna became, in the interwar period, the site of a virtual civil war between Social Democrats and fascists, the city that wildly applauded the Anschluss in 1938 and happily sent its culture-loving Jews to concentration camps. As Prochnik recounts, Zweig wrote The World of Yesterday in a feverish few weeks in the summer of 1941 in, of all places, Ossining, New York, where his peregrinations had briefly taken him. If Vienna was really, as Zweig writes, the home of “live and let live” (“a principle that still seems to me more humane than any categorical imperative”), why did he end up writing about it in Ossining?

Hannah Arendt zeroed in on these questions when she stingingly reviewed The World of Yesterday in 1943. Zweig, Arendt believed, was the victim of the same illusion that had plagued German Jewry since the Enlightenment, and eventually led to its destruction: the belief that culture could replace politics, that individual artistic achievement would do the work of collective political consciousness. Despite what he thought, Arendt writes, Zweig never really belonged to Austrian society, because no Jew was allowed to belong to it. Instead he belonged to an international society of the famous, and he believed that membership in this cultural elite would protect him. But Zweig’s fate (his memoir was published after his suicide) made clear that he had wagered his life on an illusion. “Without the protective honor of fame, naked and disrobed,” Arendt concluded, “Stefan Zweig was confronted with the reality of the Jewish people.”
One of the strengths of Prochnik’s book, which focuses on the last few years of Zweig’s life, is that it pays due attention to his feelings about Jewishness, a subject that Matuschek’s fuller biography tends to scant. The Impossible Exile is a highly personal work, not a chronological biography so much as a meditation, through the lens of Zweig, on the whole experience of the German Jewish emigration. “Bruno Walter attributed the secret of a happy exile to remembering the distinction between ‘here’ and ‘there,’ ” Prochnik writes, but Zweig “offers a formula for toxic migration, what might be called Lot’s wife syndrome... he could not stop looking over his shoulder.”
The distinction is important to Prochnik because his own ancestors were among those impossible exiles. His father was a boy when he fled Vienna with his parents in 1938, eventually landing in the United States. The Prochnik family’s emigration was a success story, or at least that is the way he grew up hearing about it: “And though times were hard at first, living in a grim New York building, eventually they made their way to Boston, where your grandfather was able to resume his medical practice and get both his sons into Boston Latin School and Harvard. The End.” Only as an adult did Prochnik come to realize how much was left out of this comforting version, “how much was irrecoverably lost over the course of the family’s harrowing flight.” By writing about Zweig, he is clearly trying to grasp the reality of the world his own family left behind, their world of yesterday.

Vienna at the turn of the century was the birthplace of many intellectual movements, none of which turned out to be more consequential than Zionism. Zweig, as he writes in The World of Yesterday, started his career as a protégé of none other than Theodor Herzl. But the Herzl who mattered to Zweig was the literary editor of the Neue Freie Presse, not the founder of Jewish nationalism. When Herzl published Zweig’s first essay on the front page of Vienna’s leading daily, Zweig, still a teenager, felt that he had reached the summits of literature. In his memoir, however, Zweig is oddly evasive about the reasons why Herzl’s Zionism made little impression on him. He observes only that he was turned off by the way Herzl, whom he saw as a king-like personality, was disrespected by his own followers: “the quarrelsome, opinionated spirit of constant opposition, the lack of honest, heartfelt acceptance in the Zionist circle, estranged me from a movement that I would willingly have approached with curiosity, if only for Herzl’s sake.”
The truth is that Zionism of any kind was never in the cards for Zweig, because of his deep conviction that being Jewish meant being in the vanguard of cosmopolitanism. “I see it as the mission of the Jews in the political sphere to uproot nationalism in every country, in order to bring about an attachment that is purely spiritual and intellectual,” he wrote in 1919. “This is also why I reject Jewish nationalism. . . . Our spirit is cosmopolitan; that’s how we have become what we are, and if we have to suffer for it, then so be it: it is our destiny.” Zweig’s hatred of all kinds of nationalism solidified during World War I (the latter years of which he spent in contented exile in Switzerland), and he came to see the arts as the only existing manifestation of the international brotherhood for which he yearned. When Zweig writes in his memoir about his friendships with everyone from Romain Rolland to Rainer Maria Rilke to Émile Verhaeren, he is not boasting so much as demonstrating the living possibility of a fellowship that transcends borders, a modern republic of letters. What Arendt derided as a mere society of the famous was, in Zweig’s view, a society of the idealistic and talented, a humanistic elite that preserved the highest values of liberalism.

The fact that Zweig lived to see this version of liberalism utterly defeated in Europe was, depending on how sympathetically you look at it, either his tragedy or his comeuppance. For how could any liberalism hope to survive divorced from democracy? A liberalism of the elite was doomed to be a social luxury; yet Zweig's historical experience as an Austrian Jew left him convinced that the masses had no love for toleration, free speech, and pacifism. On the contrary, the energy of the times seemed wholly opposed to these things, as Zweig recognized when he described the outbreak of popular enthusiasm in the first days of World War I: "The great wave broke over humanity so suddenly, with such violence, that as it foamed over the surface it brought up from the depths the dark, unconscious, primeval urges and instincts of the human animal: what Freud perceptively described as a rejection of civilization, a longing to break out of the bourgeois world of laws and their precepts for once and indulge the ancient bloodlust of humanity."
The profound pessimism of this view of humanity, and its implications for the liberalism that Zweig cherished, were not lost on him. Zweig’s nonfiction is today much less read than his fiction; none of it has yet been republished, though many of these books were translated into English in the 1920s and 1930s, at the height of his fame. The most significant for understanding Zweig’s political dilemma is Erasmus of Rotterdam, which he wrote in 1933, in the months after Hitler came to power, and just before he himself fled Austria. (It was the first of Zweig’s books not to be published by his lifelong publisher, Insel Verlag, which had purged its list of Jewish authors.)
Zweig’s Erasmus does not fulfill many of the duties of a biography. His historical works were always more about conjuring an atmosphere of intellectual drama than about telling a comprehensive story. As an essay on the fate of liberalism in an age of fanaticism, however, Erasmus is a powerful witness to its moment. The contrast between Erasmus, the moderate reformer and peacemaker, and Martin Luther, the belligerent and uncompromising man of faith, is transparently Zweig’s way of contrasting his own cherished ideals with those of triumphant fascism. As he says in his memoir, the book “presented my own views in veiled form through the person of Erasmus.”
Given this close identification, then, it is all the more remarkable that Zweig does not make a hero of Erasmus. Every word of praise for his subject's irenic, cosmopolitan humanism is balanced by a word of censure for his timidity and his abstemiousness. Sixteenth-century humanists, like twentieth-century liberals, were out of touch with the people and unable to grapple with real-world problems: "though their realm was extensive," with outposts among the intelligentsia of every nation, "its roots did not go deep, it only influenced the most superficial layers, having but feeble relations with reality." Erasmus himself is sharply blamed for declining to attend the Diet of Worms, where he might have done something to bridge the gulf between Luther and the Catholic Church. "The absent are always wrong," Zweig concludes, and the rebuke is directed also at himself. He had made Arendt's indictment before she did.
The cowardice of Erasmus, if such it was, finally seems to Zweig like an inseparable component of a civilized character. It is for the barbaric Luthers to make war, while the civilized Erasmuses carve out niches of peace. No wonder he concludes that “the humanistic ideal, that ideal grounded upon breadth of vision and clarity of mind, is destined to remain a spiritual and aristocratic dream which few mortals are capable of dreaming.” Sounding surprisingly like Benjamin, Zweig places his trust in defeated ideas as the only possible source of redemption: “An idea which does not take on material shape is not necessarily a conquered idea or a false idea; it may represent a need which, though its gratification be postponed, is and remains a need. Nay, more: an ideal which, because it has failed to secure embodiment in action, is neither worn out nor compromised in any way, continues to work as a ferment in subsequent generations, urging them to the achievement of a higher morality.”
It is only when seen in the light of his political ideas that Zweig’s suicide becomes more than a personal tragedy. When Benjamin took an overdose of morphine in 1940, it was because, having been sent back across the border to conquered France, he believed that he was about to fall into the hands of the Gestapo. In 1942, however, Zweig had been out of the direct path of Nazism for eight years. Wisely or luckily, he had chosen not Paris but London as his refuge, which meant that he was not swept up in the fall of France like Arendt, Benjamin, and so many other Jewish refugees. Still, that disaster terrified him sufficiently to make him book passage to America in June 1940. Here, too, he could have found shelter, like Thomas Mann and Bertolt Brecht and many other writers he knew. But as Prochnik shows, by this point Zweig was so off-kilter, so traumatized by exile and terrified by war, that he made the irrational decision to move to Brazil, where he had earlier been given a hero’s welcome when he visited on a tour.
Finally even Brazil did not feel safe. Zweig was convinced that even if Hitler lost the war (and after Pearl Harbor, this did not seem unlikely), the world would never again be "the world of yesterday." What Zweig needed for his peace of mind, for his sanity, was the ability to forget the world crisis, to withdraw like Erasmus into a private sphere of intellect and decency. But as he mournfully writes in his memoir, the twentieth century had made this impossible: "The greatest curse brought down on us by technology is that it prevents us from escaping the present even for a brief time. Previous generations could retreat into solitude and seclusion when disaster struck; it was our fate to be aware of everything catastrophic happening everywhere in the world at the hour and the second when it happened." Zweig came to believe that there was nowhere left to escape to, no place where the values he cherished could survive. His curse was that he died believing this; our good fortune is that he was wrong.