About Me


A good day begins with the NYTimes, NPR, Arts & Letters Daily, Sacred Space & good coffee; it ends with a Grand Marnier. A brilliant day would be spent in London, New York or San Francisco -- although Sydney would be right up there. Unwinding in Carmel or Antibes. Daytime spent in library (the Morgan, LOC or Widener) or museum (the Frick, the Louvre, British) with a healthy walk (around Lake Annecy); evening -- theatre (West End), or music (Carnegie Hall). A nice last meal: Perhaps the French Laundry or Frédy Girardet or Quenelles de Brochet from Taillevent, Cassoulet from Café des Artistes, Peking Duck from Le Tsé-Fung, Lobster Savannah from Locke-Ober, Sacher Torte from Demel and Café Brûlot from Antoine's. Sazerac as an apéritif, Le Montrachet in the beginning, Stag's Leap Cabernet in the middle, Veuve Clicquot to conclude. Desert island: iMac, iPod (I know, generator and dish necessary), Johnnie Walker Blue Label, wife & Adler's Great Books.

25.11.14

Stuff

Seven Questions for Collections Dean Scott Schaefer

Ichthyologist Scott Schaefer’s research specialty is rare catfishes, but on any given day he might be thinking about meteorites, textiles, or wasps.
That’s because in his other role, he oversees the Museum’s scientific collections, which passed the 33-million-holdings mark earlier this year. (For an inside look at the collections, see our new series Shelf Life, with monthly original videos, featuring Dr. Schaefer in Episode 1: 33 Million Things).
We recently caught up with Dr. Schaefer to talk about the importance of physical specimens, how collecting has changed, and his own favorite contribution to the 33 million (hint: it’s rare catfishes).
Scott Schaefer holds a jar of about 65 specimens of the Vandellia catfish, or candiru, from the Museum's ichthyology collections. This parasitic species has a legendary reputation for entering the urethras of unwary humans, but it more typically finds a meal by swimming into the gill chambers of larger fish.  
©AMNH/E. Chapman
Q: Most of the Museum’s 33-million-plus specimens and artifacts are behind the scenes. Was that always the case?
A: At the very beginning, the collections were mostly acquired for exhibition. But over time, research became an important focus, and more and more collections never did make it onto public display. Today, more than 99 percent of the Museum’s collections are not on display—they’re in repositories, principally used in research. 
Q: What’s your role as the Museum’s dean of collections?
A:  On a day-to-day basis, I’m responding to problems that weren’t there yesterday. A flood in a collection. An outside request from the U.S. Fish and Wildlife Service, answering a question related to their law enforcement activities on wildlife. Perhaps there’s a situation where a loan is recalled, so I might lend a hand.
Q: So you’re the collections sheriff.
A: Collections sheriff is one way of putting it, right.
Q: With a growing jurisdiction, since collections increase every year. Why is it still important to bring back physical specimens?
A:  Because they often represent the only tangible snapshot we have of life on Earth. You might say, “You can sample the genome of a specimen. You can take a photograph of a specimen, won’t that be sufficient?”
Well, the answer is no. It might be adequate. Those might be excellent photographs. That might be one kind of representation, if you talk about a genome sequence, for example. But it isn’t necessarily sufficient to answer all the types of questions that could potentially be asked about that biodiversity at that place and at that time. So today, it’s just as essential to collect and acquire information about the remaining biodiversity of life on earth as it was 145 years ago when the Museum began building collections. 
Q: Has the process of collecting changed since those early days?
A: We’re now more focused. We are no longer taking large, very large samples from natural communities because we’re much more sensitive now as to the impact of collecting activity on the health and the welfare of natural populations. We need to be more focused on those aspects because we’re also advocates for conservation.
Q: Visitors know the Museum has dinosaur fossils—lots of them. What might they be surprised to learn?
A: We have one of the world’s largest collections of fossil mammals, approximately 4 million of our paleontological specimens. We’re one of the strongest collections in the world.
We also have one of the world’s largest collections of butterflies, with strength in numbers as well as in geographic coverage. When scientists talk about the relative importance of collections, it’s about coverage and depth. Both of those things speak to the relative importance as research tools of that material for the science questions being asked. 
Scott Schaefer holds the cranium of the Vampire tetra, a voracious predator that is a close relative of the neon and cardinal tetra popular in the aquarium trade. This specimen, one of more than 2 million in the Museum's ichthyology collection, is from Venezuela. 
©AMNH/E. Chapman
Q: Do you have a favorite specimen you’ve contributed to the collection?
A:  My personal favorite is a series of specimens of a new species of a very rare genus of catfish, Lithogenes, that I spent five years looking for over three different expeditions to Venezuela and ultimately found. The specimens are represented nowhere else. We don’t even know if those populations still exist in Venezuela.  It’s from a very remote place that very few people get to visit, so I have some confidence that this discovery and those collections will hold a special place for a long time.
Tags: Shelf Life

Trees

Carving

Ideas 
Cutting Remarks
Today, all you need to carve the turkey is an electric knife. In the 1600s, you needed a serious education.


Pity the turkey. Capons are sauced, cranes are lifted, partridges are allayed, geese are reared. Turkeys are, to use the proper historical carving vocabulary, simply cut up. The ritual carving of the turkey is one of the few vestiges of a past, glorious tradition that once wowed diners at spectacular feasts, and yet, the prosaic words for slicing up the turkey do not seem to match the grandeur of the deed.

Once, carving was held in high esteem. It was less about serving base bodily needs for nourishment and more concerned with spectacle and performance. Those who carved (and those who had carving done for them) were not concerned with where their next meal was coming from. It was a demonstration of power: the ability to muster a bountiful feast and an exhibition of control of the body (both that of the carver and of the animal carcass to be consumed). In full view of the diners assembled at the table, the carver hoisted the bird aloft with one hand, while wielding a razor-sharp knife in the other. Slices from the cooked carcass floated down to the plate. 

The skill of the carver governed enjoyment of the meal. In the Merry Wives of Windsor, Mistress Ford could entrance Falstaff with her skills with the knife, giving “the leer of entertainment.” William Shakespeare’s later editors have quibbled over the meaning of carving in that passage, wondering if slicing and serving could really be so seductive. (Yes. It can.) And while carving could possibly prove alluring — adding a frisson of fear, the loving housewife could turn her knife in threatening ways — unrefined skills could ruin the appetites of the diners. Samuel Pepys recorded his disgust at his aunt’s “greasy” manner of carving. Shame and social disgrace clung to those incapable of performing the duty well. Philip Dormer Stanhope, fourth Earl of Chesterfield, the eighteenth-century fop whose letters to his bastard son were posthumously published to advise the socially uncertain in all manner of behavior, declared “a man who tells you gravely he cannot carve, may as well tell you he cannot blow his nose; it is both as easy and as necessary.”

Necessary, yes. Easy? Not quite. Carving required practice, but the results of that study were to be hidden behind a mask of boredom and insouciance. Baldassare Castiglione dubbed this kind of refined and mannered ease sprezzatura. It was a casual effortlessness that left observers wondering what magic could be brought forth if only a bit of effort was applied. A man who could cut up a turkey as easily as slicing through butter certainly had other skills. Was he formidable on the fencing court? Thrillingly dexterous in the bedroom? (Maybe some things are best not imagined when a close blood relative does the honors at Thanksgiving.) What other secret powers did that body possess?

Not everyone could learn from a master. And self-help books advised uncertain readers, from lower down the social order, in the proper process of carving (and many other things). Some of these books were more practical than others. A whole group of English cookery books revealed “the terms for the art of carving,” but not much else. Verb after verb piled up, as if words alone could sate hunger. Others provided diagrams, enumerated with step-by-step cuts. One seventeenth-century manual printed the diagrams on playing cards, dividing the various foodstuffs (fowl, fish, butcher’s meats like beef and lamb, and meat pies) into the various suits.

Turkey took pride of place as the King of Diamonds, displacing capons and the other grand birds that had ruled earlier tables. The turkey had the advantage of not just being an exotic import from the New World, but also having tastier flesh. Turkey quickly found its way into European dining thanks to its similarity to other large birds. Cooks knew how to cook it. (Tomatoes and potatoes had a rougher assimilation, before becoming staples in cuisine.) The later arrival of the turkey from the Americas onto the English table, in the mid-sixteenth century, perhaps explains its quotidian carving vocabulary: the terms to describe cranes, herons, and peacocks all derived from medieval court practice and, specifically, the exclusionary language of hunting that was codified in manuscripts and printed books in the fifteenth and sixteenth centuries. Arcane and exceedingly precise vocabulary was one tool for keeping social climbers at bay. Even in the early eighteenth century, poet William King mocked his contemporaries who — guided by the lists of obsolete words published in cookery books — tried to present themselves as more exalted than they were:
I am sure Poets, as well as Cooks, are for having all Words nicely chosen, and are properly adapted; and therefore I believe they would shew the same Regret that I do, to hear Persons of some Rank, and Quality, say, Pray cut up that Goose; Help me to some of that Chicken, Hen, or Capon, or half that Plover, not considering how indiscreetly they talk, before Men of Art, whose proper Terms are Break that Goose, frust that Chicken; spoil that Hen; sauce that Capon; mince that Plover. If they are so much out in common things, how much more will they be with Bitterns, Herons, Cranes, and Peacocks?
It was a vocabulary meant for snobs (although that word came later, in the nineteenth century, when W. M. Thackeray ridiculed the “dinner-giving snobs” to great effect). In England, the noble Office of the Carver (alongside Butlers, Pantlers, Chamberlains, and other service positions) gave way to housewives, sons, and eventually servants doing the so-called “honors of the table,” until even that mostly disappeared from the table and happened, out of sight, in the kitchen, except for the Sunday roast or holiday meal. The American practice of carving the turkey at the table on Thanksgiving nods to this richer tradition of sanctioning a special meal with ritual and spectacle. Getting meat off the bone to eat — rather than putting on a feast for the eyes — otherwise rules the day.

The following instructions for cutting up a turkey first appeared in The Family Dictionary, or Household Companion (London, 1695), and were repeated verbatim in cookery books marketed at English housewives throughout the eighteenth century. Why not take a lesson from history this Thanksgiving?
Raise up the leg fairly, and open the Joint with the Point of your Knife, but take not off the Leg; then with your Knife lace down both Sides of the Breast, and open the Breast-pinion, but do not take it off; then raise the Merry-Thought betwixt the Breast-bone, and the top of it; then raise up the Brawn; then turn it outward upon both Sides, but not break it, nor cut it off; then cut off the Wing Pinions at the Joint, next the Body, and stick each Pinion, in the Place you turn’d the Brawn out; but cut off the sharp End of the Pinion, and take the middle Piece, and that will just fit in the Place.
If that doesn’t work, there is always the electric knife. • 15 November 2013


Heather Hess is an art historian specializing in the aesthetics of dining and banqueting since the Renaissance.

Some Thoughts

Noncanonical 
A Blessing
Nathaniel Hawthorne's "John Inglefield's Thanksgiving."


It takes Satan to bring out the true spirit of Thanksgiving. That's because it can be hard to give thanks unless you know why you are doing it. Plenitude is lovely. Abundance is a delight. I think of the famous painting by Norman Rockwell. A large American family sits around a comfortable table as the venerable mother carries in a moose-sized turkey, the centerpiece of the meal. The painting was originally titled "Freedom from Want" and was part of Rockwell's Four Freedoms series, meant to promote the buying of war bonds during World War II. If there is an unsettling message hidden in the Rockwellian sentimentality, though, it's that these people, this nice American family, know nothing of want. They are giving thanks for an abundance that is taken for granted.
When the devil is on your doorstep, however, thanks takes on a different timbre. The American most consistently preoccupied with thoughts of Satan was probably Nathaniel Hawthorne. Hawthorne never trusted in the good times. He saw the devil lurking in every moment of pleasure, waiting for the chance to pounce on the unsuspecting reveler when his guard was down. Hawthorne's story, "John Inglefield's Thanksgiving," is appropriately evil-obsessed. Utterly bleak, it is a difficult fit in the traditional American story of goods asked for, goods delivered, thanks given.

In Hawthorne's story, a blacksmith named John Inglefield is sitting with his family around the table just after a Thanksgiving Day's meal. A stern man, he has nevertheless carefully prepared a table setting for his wife, recently dead. Suddenly, the door opens and in walks his second daughter, Prudence Inglefield. We are made to understand that she has left home under trying circumstances, the specifics of which are never explained. John welcomes her, saying, "Your mother would have rejoiced to see you, but she has been gone from us these four months."

Prudence greets her younger sister and a former love interest, Robert Moore. The scene is tense. Why has Prudence returned? Still, the warmth of the home and the hearth draws everyone together. It was, Hawthorne writes, "one of those intervals when sorrow vanishes in its own depth of shadow, and joy starts forth in transitory brightness." Suddenly, as the family is getting ready for a nightly prayer, Prudence makes for the door. John calls after her. She hesitates, "her countenance wore almost the expression as if she were struggling with a fiend, who had power to seize his victim even within the hallowed precincts of her father's hearth." And then she leaves without a word. The visit home was only a temporary pause. She is called away by "some dark power," something she cannot resist.

It is a strange story by any standard; for a Thanksgiving story it is stranger still. But Hawthorne was committed to that strangeness in everything he wrote. He wanted to produce an American literature that was deeply moral without being moralistic. It would show human beings as the inscrutable creatures that they are, struggling to make decisions in situations they can never fully comprehend. Why does Prudence come home to the family she has left behind? Who knows, but we have all wanted to go home. When she gets there, something drives her away again. She can't go back, even though she desperately wants to go back. It is irresolvable. Alfred Kazin once wrote of Hawthorne that "by emphasizing the uncertainty and ambiguity that are attached to human relations, he incorporated into his fictions the strangeness, the ultimate causelessness, which we attribute to human nature as the subject of literature."

Perhaps, then, Hawthorne is a nice antidote to the blithe triteness that creeps inevitably into a holiday like Thanksgiving (thanks, God, for all the nice stuff!). Gratitude, for Hawthorne, happens under the eye of a creeping evil. Happiness is found only in moments of warmth snatched from the otherwise unaccountable stories of what people do to one another. But in the strangeness of human actions is also the indefinable uniqueness that makes us who we are. Humanness and strangeness are tied together, tragically perhaps, but inextricably. So it is for our strangeness, maybe, that we ought, with Hawthorne, to give thanks most of all. • 22 November 2010

Thanksgiving

Pertinent & Impertinent 
Verity Vs. Verisimilitude
In Ferris' The First Thanksgiving, the eternal battle between the history and the meaning of Thanksgiving rages on.

“A day of public thanksgiving and prayer to be observed by acknowledging with grateful hearts the many and signal favors of Almighty God.” — George Washington, the first President of the United States, delivering the National Thanksgiving Proclamation, on October 3rd, 1789 
For the last thirty years of his life — the first thirty years of 20th century America — the historical genre painter Jean Leon Gerome Ferris devoted himself to capturing the history of America in paint. He called the result of this enormous task — a series of 78 scenes from the landing of Columbus to the start of World War I — The Pageant of a Nation. No one had ever seen America like this: All her heroes, great and small, presented in one complete and satisfying narrative — the most extensive series of American historical paintings by an individual artist before or since.
The paintings, wrote the authors of History of the Portrait Collection, Independence National Historical Park, were “expertly executed reproductions of the past”; Ferris was on par with the best historical genre painters of his time. They were meticulously researched and crafted. And yet, the authors wrote, Ferris’ paintings said more about the era in which they were created than the events they portrayed. The paintings “formed a bridge between fact and fiction over which the viewers…were willing travelers.” The Pageant of a Nation, wrote the authors, confused “verity with verisimilitude.”
In one of the most widely reproduced Ferris paintings, The First Thanksgiving, the characters are just as they exist in our Thanksgiving myth. The Pilgrims don the familiar tall hats and buckles. The Wampanoag wear feathered headdresses. The women smile. The Wampanoag are serious. The Pilgrims stand around a well-crafted outdoor table. The Wampanoag sit on the ground. In the foreground is a colonist in armor. One of the Wampanoag wears a Pilgrim’s hat with a feather stuck in the brim. There are cakes, and oysters, and porridge, and grace. There is a spirit of sharing and thanks. The weather seems to be holding.
But we know that it could not have been this way. That it was cold and the Pilgrims were starving. That they would have been wearing furs and skins if they had any clothes left on them at all. That the Pilgrims and Wampanoag would not have shared Pilgrim prayers to the One Almighty God. That the apocalyptic Pilgrims came to America largely to see it, and the rest of the cosmos, burn. That the Wampanoag would not be wearing headdresses, and would not have been sitting on the ground. We know that, a few decades on, the English colonists and Wampanoag would fight for the other’s annihilation, and that one of them would win.

The closer we look at this painting, the more ridiculous it appears. If paintings can be held responsible for distortions of history, this one is practically criminal.
But there is another painting in The First Thanksgiving that has nothing to do with historical verity. It is the painting of verisimilitude, the painting of the truth as-if. It’s what the Romantic poet Coleridge called “poetic truth.” This is the truth beneath the surface of reality, the facts taken up by the imagination. In the imagination, verity is transformed by verisimilitude. Our imagination can extract “hidden ideas and meaning” from the information we cannot escape. Imagination “dissolves, diffuses, dissipates” the stuff of life, which is essentially fixed and dead, in order to recreate it. And when it is impossible to recreate, said Coleridge, still the imagination struggles to idealize the dead lifestuff, to exalt it, to form a bridge between the perfect image and reality.
In the Ferris painting of verisimilitude, America celebrates a feast day. Everyone is American, and all Americans are grateful. This painting has little to do with the history of the early settlers. It is an allegory of gratitude. This painting is not trying to fix the facts. It is trying to transcend them. It asks, as George Washington did in his Thanksgiving Proclamation so many years ago, what does it mean to have a national language of gratitude? The myth of the first Thanksgiving aside, it is a fair question. • 24 November 2014

Stefany Anne Golberg is an artist, writer, musician, and professional dilettante. She's a founding member of the arts collective Flux Factory. She can be reached at stefanyanne@gmail.com.

Lives Well Lived

Klara and Drago Cvitanovich

GREG MILES PHOTOGRAPH

Giving Something Back


Like many of us, Klara Cvitanovich has powerful food memories from childhood, but hers aren’t associated with rose-colored recollections of Thanksgiving feasts. When she was 8 years old and living in post-World War II Yugoslavia, her school lunches provided by the Marshall Plan were the most important (and sometimes only) meal of the day. “America was sending us cheese, eggs and milk. And that was the best meal of the day,” she recalls. “Not just for me, but for a lot of people.”

Fast forward to New Orleans in early September 2005. In those first days after Hurricane Katrina, Klara and her husband Drago set up a base of operations on Harrison Avenue in Lakeview. Over the course of the next four weeks, they gave away more than 80,000 meals. During that time, Klara’s childhood experiences were never far from her mind. “I just felt that we had it, other people didn’t, so we had to give it away,” she says. “For me it was a moral obligation, and also my way of saying ‘thank you’ to the city of New Orleans and the United States for all they had done for us.”

Gratitude, compassion and generosity – these three words best describe Klara and Drago Cvitanovich. So this year, when the New Orleans Wine & Food Experience’s Board of Directors met to decide who would receive the Ella Brennan Lifetime Achievement Award, their choice was easy. “Klara and Drago truly epitomize hospitality in every aspect of the word,” says NOWFE chairperson Cameron Benson Perry. “They have a way of just transforming people. Their love for this country and their love for the city is just inspiring.”

They are also successful businesspeople. When Drago and Klara first arrived in New Orleans in 1961, they came with just two suitcases, a little bit of money and their two young sons, Tommy and Gerry. By ’69 they had saved enough to open Drago’s in Fat City. The early years were tough. Drago worked round the clock to get the business up and running. Klara worked full time at DH Holmes as a travel agent, and when she was done there went straight to Drago’s, where she managed the front and did bookkeeping. After close they would wash and fold the linens and then start all over again the next day. They did this for several years, all while raising their two children.

Four expansions later, Drago’s has blossomed from 70 seats to over 300, with banquet facilities and their famous repurposed oyster-roasting fire truck to boot. They opened a second location in the Riverside Hilton a few years after Katrina, which has since become the international hotel chain’s highest-grossing restaurant among all its properties worldwide. Its success set the stage for a third Drago’s, set to open in Jackson, Mississippi, at the end of this year.

Their signature dish, Charbroiled Oysters, was invented by their son Tommy, who perceived an opportunity at a time when most other oyster establishments were facing a downturn due to a negative perception of raw oysters. “We were thinking what we could do for sales, when Tommy said, ‘Mom, what do you think about putting the oysters on the grill with our garlic-butter sauce?’ I said we can try, and a star was born,” Klara says about the genesis of that famous dish.

Charbroiled oysters have since gone on to join the rarefied menu of dishes synonymous with New Orleans, such as Paul Prudhomme’s Blackened Redfish. On a busy day, they serve over 10,000 of those oysters. The dish has also single-handedly turned the tables on the negative perception of the bivalve. “People who ‘don’t eat oysters’ eat their oysters,” Perry points out. “For this year’s award, we just really wanted to knock it out of the park. We don’t think we could have done any better than Miss Klara and Mr. Drago.”


24.11.14

Byron

The Innovators by Walter Isaacson. George Gordon Byron (1788-1824), commonly known as Lord Byron, was one of the greatest British poets, an aristocratic and flamboyant celebrity known for huge debts and numerous love affairs with both sexes. Romance was absent, though, when it came to his marriage. He needed money and married a wealthy aristocrat named Annabella Milbanke. The marriage crumbled when Lady Byron discovered her husband's infidelity:

"Lord Byron ... noticed a reserved young woman who was, he recalled, 'more simply dressed.' Annabella Milbanke, nineteen, was from a wealthy and multi-titled family. The night before the party, she had read [his poem] Childe Harold and had mixed feelings. 'He is rather too much of a mannerist,' she wrote. 'He excels most in the delineation of deep feeling.' Upon seeing him across the room at the party, her feelings were conflicted, dangerously so. 'I did not seek an introduction to him, for all the women were absurdly courting him, and trying to deserve the lash of his Satire,' she wrote her mother. 'I am not desirous of a place in his lays. I made no offering at the shrine of Childe Harold, though I shall not refuse the acquaintance if it comes my way.' 
Anne Isabella Milbanke
Lord Byron
"That acquaintance, as it turned out, did come her way. After he was introduced to her formally, Byron decided that she might make a suitable wife. It was, for him, a rare display of reason over romanticism. Rather than arousing his passions, she seemed to be the sort of woman who might tame those passions and protect him from his excesses -- as well as help payoff his burdensome debts. He proposed to her halfheartedly by letter. She sensibly declined. He wandered off to far less appropriate liaisons, including one with his half sister, Augusta Leigh. But after a year, Annabella rekindled the courtship. Byron, falling more deeply in debt while grasping for a way to curb his enthusiasms, saw the rationale if not the romance in the possible relationship. 'Nothing but marriage and a speedy one can save me,' he admitted to Annabella's aunt. 'If your niece is obtainable, I should prefer her; if not, the very first woman who does not look as if she would spit in my face.' There were times when Lord Byron was not a romantic. He and Annabella were married in January 1815.  
Augusta Maria Leigh

"Byron initiated the marriage in his Byronic fashion. 'Had Lady Byron on the sofa before dinner,' he wrote about his wedding day. Their relationship was still active when they visited his half sister two months later, because around then Annabella got pregnant. However, during the visit she began to suspect that her husband's friendship with Augusta went beyond the fraternal, especially after he lay on a sofa and asked them both to take turns kissing him. The marriage started to unravel.

"Annabella had been tutored in mathematics, which amused Lord Byron, and during their courtship he had joked about his own disdain for the exactitude of numbers. ... Early on, he affectionately dubbed her the 'Princess of Parallelograms.' But when the marriage began to sour, he refined that mathematical image: 'We are two parallel lines prolonged to infinity side by side but never to meet.' Later, in the first canto of his epic poem Don Juan, he would mock her: 'Her favourite science was the mathematical .... She was a walking calculation.'

"The marriage was not saved by the birth of their daughter on December 10, 1815. She was named Augusta Ada Byron, her first name that of Byron's too-beloved half sister. When Lady Byron became convinced of her husband's perfidy, she thereafter called her daughter by her middle name. Five weeks later she packed her belongings into a carriage and fled to her parents' country home with the infant Ada.

"Ada never saw her father again. Lord Byron left the country that April after Lady Byron, in letters so calculating that she earned his sobriquet of 'Mathematical Medea,' threatened to expose his alleged incestuous and homosexual affairs as a way to secure a separation agreement that gave her custody of their child." 

The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution
Author: Walter Isaacson
Publisher: Simon & Schuster
Copyright 2014 by Walter Isaacson
Pages 10-12

Markets

GUY SORMAN
Markets Everywhere
How economist Gary Becker changed our lives
Autumn 2014
JOSHUA LOTT/AP PHOTO
The Nobel Prize winner, who died earlier this year, in his University of Chicago office
“The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.” So famously observed John Maynard Keynes in 1936, in the conclusion of his opus The General Theory of Employment, Interest, and Money. Keynes proved prophetic, at least in his own case. “Keynesianism” as a doctrine came into its own during the 1970s, long after the man himself had disappeared; President Nixon, who declared, “I am now a Keynesian in economics,” probably had read little of the economist’s work.
Keynes’s aphorism has been reliable when applied to free-market economists, too—and some of them haven’t had to wait for posthumous vindication. Friedrich Hayek became influential while he was still alive, helping inspire the economic policies of Margaret Thatcher and Ronald Reagan. “This is because I lived too long,” quipped Hayek. Milton Friedman stuck around long enough to see the end of hyperinflation, thanks to the application of his monetary theories. A third Nobel Prize–winning University of Chicago economist, Gary S. Becker, who died this past April, may have been less famous during his lifetime than Hayek and Friedman, but his ideas lie behind some of the most striking policy innovations of the last few decades. The politicians implementing those policies probably hadn’t worked their way through The Economic Approach to Human Behavior, Human Capital, A Treatise on the Family, or other dense Becker studies. But they could easily have become acquainted with his thinking through his clearly argued Businessweek columns, collected in the 1998 book The Economics of Life: From Baseball to Affirmative Action to Immigration, How Real-World Issues Affect Our Everyday Life, or the popular blog he cowrote for several years with Judge Richard Posner, the highlights of which appeared in book form as Uncommon Sense in 2009. And some of Becker’s Chicago disciples turned his theories into recent bestsellers. Just about everyone has heard of Freakonomics, the wildly successful book by Steven Levitt and Stephen Dubner that popularized Becker’s view that criminals are like entrepreneurs, responding to market incentives. In many ways, then, Becker’s star is ascending—and that’s a good thing for economics as a discipline and, more broadly, for our public policy debate.
Becker was astonishingly ambitious, bringing market-based thinking and solutions into fields—from crime to job discrimination to medicine to traffic flows—where no economist had gone before. Becker dubbed his approach “rational action theory.” In economics, as in other sciences, theory is necessary. It will offer an imperfect description of the real world, yes, but without one, the world can’t be seen at all. Indeed, all major breakthroughs in economics have been built on new theories, Becker’s included, though he agreed that no theory was final. He spent his academic career responding to critics, often from other disciplines like sociology or psychology, who felt attacked on their own turf, and sometimes he incorporated their concerns into his own thinking. Such intellectual generosity revealed him to be a classical liberal, not an authoritarian ideologue, as he was sometimes caricatured.
Rational action theory holds that all human beings, in all civilizations, act “as if” they are rational. For Becker, this meant that they acted as if they sought to maximize their profit in a market. “Nobody ever claimed that people were perfectly rational all the time,” Becker told me in a 2009 conversation. “People make mistakes. They may get under the sway of emotions. My theory has recognized that from the beginning. What I have argued, however, is that when you put the collection of individuals together in markets, the markets perform better than any alternative that has ever been devised.”
We’re entrepreneurs in all walks of life, Becker believed, responding to incentives and behaving, more often than not, rationally. In his work on the family, to take a controversial example, Becker showed how parents tended to choose the number of children they had in order to maximize the overall welfare of the family, as if it were an economic firm. Parents invest in their children’s education as well, seeking to boost their offspring’s “human capital,” an investment that they hope will bring positive returns. Becker didn’t invent the concept of human capital—the cognitive habits, know-how, creativity, and other mental attributes that generate economic wealth in a modern economy—but he was the first economist to use it systematically, as a basis for his reasoning and policy arguments. He was keenly aware that we live in a time—in the developed world, anyway—when human capital has superseded physical capital, and he invited his fellow economists to adapt to this new reality.
Nowhere was rational action theory more persuasive than in analyzing criminality. As long ago as the late 1950s, Becker was proving that potential criminals adapt their behavior to the rewards and risks that they meet with in society. If committing crime carries little risk, wrongdoers will feel emboldened. The result will be more crime. Conversely, heavier enforced sanctions—arrests, fines, sentences—make crime appear more expensive for potential bad actors than any expected benefits that it might bring them, and thus crime is deterred. (If sanctions are too severe, however, some criminals might commit worse crimes because, in Becker’s words, “they have nothing to lose.”)
Becker’s views on crime weren’t born from his imagination but from his assembled statistical data, expressed in mathematical models. He popularized his findings in newspaper columns, conferences, and other outlets. Becker’s research has had a significant impact on crime policy, which has put far greater emphasis on the behavioral incentives that surround potential criminals—as seen in the quality-of-life policing, tougher sentencing, and other crime-fighting innovations that have helped drive crime rates down dramatically in New York and other cities over the last two decades.
Becker’s ideas—in many cases, unacknowledged—have also transformed the debate over another urban plague: traffic congestion. The planner’s old answer to congestion was to widen streets, add traffic lights, and build new tunnels and overpasses, all in the hope of making traffic more fluid. To no avail: the expensive public works actually worsened congestion by encouraging more drivers to take cars to the city, overwhelming the new infrastructure. Becker instead applied a strict market calculus to urban traffic. Any commuter or visitor who chose to enter a city center with his own car or truck, Becker explained, had an interest in doing so. The individual driver’s choice looks rational to him; he thinks that he will gain in comfort, access, and time. His choice, though, exacts a cost on other drivers in the form of greater congestion, producing what economists call a negative externality; he, too, becomes a victim of the congestion he helped cause. At the end of the day, all the drivers lose. “Time spent in traffic is an inefficient ‘price’ since it wastes the time of drivers without providing benefits to anyone else,” argued Becker. He estimated congestion’s cost to be $50 billion a year in the U.S.—a massive tax on time.
A market solution to the problem, easier to apply than any massive new road-construction project or bureaucratic regulation, would encourage drivers to make more rational choices by requiring payment for vehicle access to downtown areas and city centers. A version of such “congestion pricing,” as the idea has come to be known, was implemented in London several years ago by then-mayor Ken Livingstone; road congestion in London dropped by 20 percent, as many commuters opted for cheaper public transportation, leaving their cars at home. Former New York City mayor Michael Bloomberg tried to impose a similar entry toll in Manhattan during the mid-2000s, but the initiative, opposed by neighborhood officials and downtown shopkeepers, was defeated. Becker felt that Bloomberg had failed to explain how traffic congestion represented an economic loss for all New Yorkers. He wasn’t fully satisfied with the London system, either. Traffic congestion varies by the hour and day, he noted, so, as with airline tickets, the price of driving into the city should depend on the time—something the London plan, which had a uniform rate, didn’t take into account. But congestion pricing of some kind will be adopted by more cities in the future: mayors of large cities these days, even if they have never heard of Gary Becker, know that spending a ton of money on new arteries or underpasses to ease traffic is likely to be ineffective—a victory for Becker’s market-based thinking.
Always consistent, never afraid to appear as a provocateur, and, above all, an independent thinker, Becker was convinced that market solutions could help solve many other seemingly intractable public problems, among them devising a sensible immigration system. Immigrants know that they will find a better life in the United States. Most will work hard at achieving that better life, Becker believed, because the American welfare state today doesn’t provide all that much support for those unwilling to work (unlike in Western Europe, where many immigrants live permanently on welfare). A Mexican or Chinese immigrant working in America will usually multiply his income by five or even ten times over what it would be in his country of origin. Given that reality, Becker thought, such immigrants, legal or illegal, would be willing to pay for such an opportunity—just as students and their families pay for college, perceiving it, rightly, as an investment that will bring greater earning power. “Visa seekers,” he wrote, “are comparable to college degree seekers”: they’re entrepreneurs, investing in human capital. Thus, Becker proposed, all visas should come with a price tag attached, set by the market. For American taxpayers, Becker claimed, the benefits of such a visa-for-money system would be substantial. Border control would cost less, for starters, since some immigrants who are tempted to sneak into America illegally (which costs them time and money) could now buy their way in legally. And immigrants ready to pay for visas would have an even stronger incentive to work to recoup their investment.
In a Beckerian system, wouldn’t wealthy immigrants be favored over the deserving poor? This is already the case, he replied: the rich can often obtain U.S. residency permits if they invest in the country. Becker wanted to extend the market for visas, now enjoyed by the wealthy, to the hardworking poor. If they didn’t have the money up front, aspirational immigrants should be able to borrow it, just as American students and their families do. In Becker’s view, the only losers in an open market for visas would be the often unsavory “coyotes” paid to transport Latin American migrants across the Texas border.
A visa market would remain imperfect, Becker admitted; he wasn’t a free-market fundamentalist (a breed that exists more in the liberal imagination than on the University of Chicago campus). Not all foreign workers purchasing a visa would earn back their investments. Some might fail completely, costing American society more than what they generate. Such realities illustrate the limitations of economics as a discipline: the individual’s personal fate is hidden in the data and models that the economist proposes. On average, though, Becker predicted, his visa plan would work far better than the dysfunctional current immigration system.
Another area in need of market solutions, Becker held, was the medical availability of replacement organs. In the United States, as in other advanced societies, the law bans the outright sale of organs, part of a broader limit on the commodification of certain essential goods and services. “The prohibited transactions are prohibited because they are highly offensive to non-participants,” Becker wrote in 2006. “Why they are offensive remains to be explained.” Cultural attitudes are doubtless powerfully at work. In France, to sell one’s blood is illegal and, what’s more, culturally unthinkable: blood donation is a citizen’s duty. One result of this attitude is that French hospitals often find themselves short of fresh blood. Americans are more comfortable with the idea of selling their blood, which is currently legal to do. American hospitals, unsurprisingly, have more voluminous blood supplies.
If organs were similarly part of a legitimate market, American patients needing a new kidney or a liver would not have to wait for years to get one, as they do today, if they live long enough. “In 2012, 95,000 American men, women and children were on the waiting list for new kidneys, the most commonly transplanted organ,” Becker noted shortly before his death, in a Wall Street Journal op-ed coauthored with Julio Elias. “Yet only about 16,500 kidney transplants were performed that year.” Paying living donors for their organs—Becker estimated the going rate for a kidney to be $15,000—would swiftly eliminate the tragic gap between supply and demand. Safeguards could be built into such a system of exchange, he pointed out, to protect against exploitation or rash or unsafe donations.
Becker recognized that many would be shocked at the notion of paying living donors and that such a system was a long shot to be adopted. A more feasible market-based idea to increase the organ supply, he suggested, would be to pay people up front to donate their organs upon death to an organ depository. The alternative to organ markets is the impaired lives and premature deaths of many patients. Moreover, those who currently benefit from rare transplants typically get chosen in secrecy by medical personnel or through black-market transactions inside and outside America, an arrangement that favors the wealthy and well connected. Becker believed that organ markets would be more transparent and efficient—and moral. He always looked beyond common wisdom when that common wisdom had negative consequences, usually for the weakest people.
Becker not only came up with market-based solutions to public problems; he also debunked government efforts to use extensive regulations and spending to address those problems. This was a critical task, since the regulate-and-spend nanny-state approach, which denies the rationality of individuals and their capacity to take care of themselves, is seductive to many politicians and even to the public, in part because its unintended negative consequences, both moral and fiscal, aren’t always evident at first.
Becker viewed the Bloomberg administration’s 2006 ban on trans fats in restaurants as a classic example of overreaching regulation. The administration presumed that New Yorkers were too ignorant to make decisions in their own health interests. But were they? Yes, the evidence suggested that trans fats contributed to heart disease—though the degree of harm remained unclear. But before the ban, half of the city’s restaurants didn’t use trans fats, so health-conscious consumers could already easily avoid them if they wished. Further, the ban likely raised the cost of eating out in the city. Could such a price increase lead some New Yorkers to eat more at home—and perhaps eat more trans fats, too? Policymakers ignored such a possibility. Some customers, of course, may really love trans fats and want to consume them, even knowing that they could have bad health effects in the future. Defenders of the ban would say that making that choice could increase the incidence of heart disease in the city, which would burden Medicare and hence the taxpayer—a negative externality. If this were true, though, why not just let insurers require individuals who want to eat unhealthily to pay higher premiums? Why should the government impose a new regulation that diminishes freedom?
Public policies that curb personal liberty, Becker argued, too often are based on insufficient data; politicians regularly put them into effect without considering all their potential consequences or exploring alternatives. And such prohibitions are politically hard to remove, he added, meaning that the sphere of freedom continues to shrink.
Becker’s rational action theory is based on facts, not wishful thinking, and this is particularly evident in its application to the problem of racism and discrimination. For Becker, government affirmative-action programs to help disadvantaged minorities were counterproductive and even destructive. If people get the idea that they’re receiving special favors thanks to their race (or gender) and not getting ahead on their own efforts or talent, they will tend to stop pushing themselves to improve. Nobody works hard if he doesn’t have to.
On the use of racial preferences in college admissions, Becker embraced the “mismatch” argument made by economist Thomas Sowell and UCLA law professor Richard Sander. “If lower admission standards are used to admit African Americans or other groups, then good colleges would accept average minority students, good minority students would be accepted by very good colleges, and quite good students would be accepted by the most outstanding universities, like Harvard or Stanford,” Becker wrote in 2005. “This means that at all these types of schools, the qualifications of minority students would on average be below those of other students. As a result, they tend to rank at the lower end of their classes, even when they are good students, because affirmative action makes them compete against even better students.” The outcome could only be higher college-dropout rates and frustration. When minority graduates of elite universities enter the job market, he continued, they often suffer from the perception that their academic achievements stem from special preferences only, not from merit, corroding their sense of self-worth. And firms may elect not to hire applicants from such groups at all, out of fear of future legal action if minority employees are not promoted frequently or rapidly enough. This is how affirmative action produces, in Becker’s words, “a less progressive economy.”
Becker worried that opposition to affirmative-action programs was too often confused with support for discrimination, which he staunchly opposed. He viewed affirmative action and discrimination as two sides of the same coin. Both give special advantages to those who have done nothing to earn them. A robust free-market economy helps defeat discrimination far more than any government social engineering, pushing racist employers to hire minorities in order to stay competitive. Even an equal-pay provision for minorities could be counterproductive in the fight against discrimination, Becker discovered in his research on job markets, published in his 1957 study, The Economics of Discrimination. An employer who preferred to hire within his majority racial or ethnic group would be more willing to hire a disadvantaged minority if it cost him less, Becker found. Joining the workforce, the minority worker would then prove that his ability was equal to that of employees from other ethnic communities. “More than half a century after Professor Becker’s landmark work on the economics of discrimination, most controversies on that subject, both in the media and in politics, go on in utter ignorance of his penetrating insights. So do laws and policies that make discrimination worse,” lamented Sowell in a tribute published after Becker’s death.
If we really wanted to help minorities, especially young black men, the decriminalization of drugs should be a priority, Becker maintained. “Trafficking in drugs,” he wrote in a 2003 Businessweek column, “attracts young blacks mainly because it offers much better pay (provided they don’t get caught) than do the legal alternatives, which tend to be low-wage jobs. Even conservatives and liberals who are reluctant to make drugs legal have to recognize that the present system does enormous damage to the black community, especially to the many black men who spend years in prison on drug charges.” A well-calibrated tax on drugs could regulate their use at a tolerable level, while not making them so expensive that an illegal market again emerged. Any illegal acts committed while under the influence of drugs would have to be severely punished, of course.
Anti–affirmative action and pro–drug decriminalization—Becker’s arguments often ran against conventional thinking. This was certainly the case with another position he took: defending the pharmaceutical industry, which he believed would improve our lives, reduce our health-care expenses, and eventually cut the Medicare deficit—if we let drug markets flourish. He gave antidepressant medication as an example. New pills cost more, he acknowledged in a mid-2000s editorial, “but hospital stays declined by so much that total spending per depressed person fell. These drugs enormously improved the quality of patients’ lives, since most people who were suffering from serious depression can now function reasonably well at work and home.”
It might be politically more effective to denounce Big Pharma than to bet on a better future thanks to its products. But Gary Becker wasn’t running for political office; he was seeking the truth, which he did tirelessly until his death, at age 85.