Free Lunch and Iraq

Two very different articles highlight just how little cost-benefit analysis matters sometimes, whether at the highest policy levels or in the most mundane of circumstances.  Humans evolved in a world of threats and status, and oftentimes that runs counter to any sort of logic.  And so we face many opportunities lost and much damage done. 

Bob Herbert writes today about “The $2 Trillion Nightmare,” the running estimate of the total cost of the Iraq war.  He notes the lack of public discussion of the “consequences of these costs, which are like a cancer inside the American economy.”  Then he discusses the testimony of a Nobel Prize-winning economist, Joseph Stiglitz, and the vice chairman of Goldman Sachs, Robert Hormats: “Both men talked about large opportunities lost because of the money poured into the war. ‘For a fraction of the cost of this war,’ said Mr. Stiglitz, ‘we could have put Social Security on a sound footing for the next half-century or more.’”

Carol Pogash wrote recently about “Free Lunch Isn’t Cool, So Some Students Go Hungry.”  Many students who qualify for federally subsidized lunches go without:  “Lunchtime ‘is the best time to impress your peers,’ said Lewis Geist, a senior at Balboa and its student body president. Being seen with a subsidized meal, he said, lowers ‘your status.’”

Pogash writes later, “Ann Cooper, director of nutrition services for the public schools in Berkeley, Calif., said that attention to school cafeterias had traditionally focused on nutrition, but that the separation of students who pay and those who receive free meals was an important ‘social justice issue’.”

Beyond threats and status, cultural distinctions matter in these sorts of decisions.  The war on terror was framed, from the very first moment, as a war of civilization against barbarians—our very way of life seems to be under threat.  And students know what eating a subsidized meal signifies, that all that effort in having “spiky hair and sunglasses” goes to waste in that moment of being seen on the wrong side of the American Dream. 

In the end the costs do matter, particularly in opportunities lost, as our biological and cultural heritages conspire.  That’s more than the market, more than being predictably irrational; it’s the tragic acting out of our own selves at the smallest and largest of scales.  But these are dramas we ourselves write, and so can change.

But it won’t be easy.  Write what you know best, one writer’s rule goes.  In everyday life it’s what we do all the time.  Breaking free from that, from lamenting what might have been to seizing what could be, will take courage and vision and work.

Decision Making and Emotion

Economists and policy makers are coming to the realization that rationality, in its multiple forms, doesn’t always explain why people make the decisions that they do.  By rationality, I mean both the assumption of “economic man” (a utilitarian cost/benefit analyzer) and the emphasis on education and knowledge as the privileged means of shaping behavior.   

Let’s take three recent headlines: “Why Sadness Increases Spending,” “Craving the High That Risky Trading Can Bring” and “Teenage Risks, and How to Avoid Them.”  All point to the role of emotion in decision making (any surprise here?). 

The first article states, “A research team [of Cynthia Cryder, Jennifer Lerner, and colleagues] finds that people feeling sad and self-focused spend more money to acquire the same commodities than those in a neutral emotional state.” 

The second provides an Aristotelian summary: “The findings, while preliminary, suggest — perhaps unsurprisingly — that traders who let their emotions get the best of them tend to fare poorly in the markets. But traders who rely on logic alone don’t do that well either. The most successful ones use their emotions to their advantage without letting the feelings overwhelm them.” 

The third tells us, “Scientific studies have shown that adolescents are very well aware of their vulnerability and that they actually overestimate their risk of suffering negative effects from activities like drinking and unprotected sex…  ‘It now becomes clearer why traditional intervention programs fail to help many teenagers,’ Dr. Valerie Reyna and Dr. Frank Farley wrote. ‘Although the programs stress the importance of accurate risk perception, young people already feel vulnerable and overestimate their risks.’  In Dr. Reyna’s view, inundating teenagers with factual risk information could backfire, leading them to realize that behaviors like unprotected sex are less risky than they thought. Using an analytical approach of weighing risks versus benefits is ‘a slippery slope that all too often results in teens’ thinking that the benefits outweigh the risks,’ she said.” 

This type of research provides small steps forward vis-à-vis traditional Western assumptions about decision making and rationality.  But my question is, Why doesn’t it go further?  Why does it simply seem to affirm our common-sense view of the world?

Continue reading “Decision Making and Emotion”

Postmodern Medicine

Harvard Magazine has an excerpt from Charles Rosenberg’s new book, Our Present Complaint: American Medicine, Then and Now, in this month’s issue.  I have pasted the entire article below, as I find it a strong evocation of how disease is as much a social entity as a biological one.  It captures much of what is difficult to understand: that our biology is inevitably and essentially social.

Postmodern Medicine

We are all “medical citizens,” embedded as potential or actual patients, with physicians, in a system of social, moral, and organizational understandings. So writes Monrad professor of the social sciences Charles E. Rosenberg in Our Present Complaint: American Medicine, Then and Now (Johns Hopkins, $50; $19.95 paper), touching on sources of unease.


Disease has become a bureaucratic—and, thus, social and administrative—as well as biological and conceptual—entity. What do I mean when I describe disease as a “social entity”? I refer to a web of practice guidelines, disease protocols, laboratory and imaging results, meta-analyses, and consensus conferences. These practices and procedures have over time come to constitute a seemingly objective and inescapable framework of disease categories, a framework that increasingly specifies diagnostic criteria and dictates appropriate therapeutic choices. In America’s peculiar hybrid health-care system, layers of hospital and managed care administrators enforce these disease-based guidelines. The past generation’s revolution in information technology has only exacerbated and intensified these trends—in parallel with the research and marketing strategies of major pharmaceutical companies…. This web of complex relationships has created a new reality for practitioners and patients alike. Physicians have had their choices increasingly constrained—if, in some ways, enhanced. For the sick, such ways of conceptualizing and treating disease have come to constitute a tangible aspect of their illness experience.

Of course, every society has entertained ideas about disease and its treatment; patients have never been blank slates.…Think of the generations of sufferers who were bled, sweated, puked, or purged to balance their humors. But never has the infrastructure of ideas, practices, thresholds, and protocols that comes between agreed-upon knowledge and the individual patient been so tightly woven and bureaucratically crafted.…

Yet, as I have emphasized, we are left with that inconveniently subjective object, the patient—in whose body these abstract entities manifest themselves. This is the characteristic split screen that faces today’s clinician: the tension between illness in the individual patient and disease as crystallized and made socially real in the laboratory’s and the epidemiologist’s outputs and inscriptions, practice guidelines, and algorithms.…Bedside, clinic, and physician’s office are the points at which the mandates of best—and increasingly most economically rational—practice bump up against the unique reality of the individual patient and challenge the physician’s traditional autonomy.…

It engenders a feeling of paradox, the juxtaposition of a powerful faith in scientific medicine with a widespread discontent at the circumstances in which it is made available. It is a set of attitudes and expectations postmodern as well as quintessentially modern.

Closing Some Doors

Too many choices?  The New York Times has an article by John Tierney, “The Advantages of Closing a Few Doors.”  Tierney discusses Dan Ariely’s new book, Predictably Irrational: The Hidden Forces That Shape Our Decisions: “an entertaining look at human foibles like the penchant for keeping too many options open.”

Here’s one section about an experiment in which players clicked on one of three doors to earn cash, with each door having a set payoff and any neglected door slowly disappearing over time unless the player spent a click to keep it open (thus wasting a click that could have earned more money):

[The researchers] plumbed the players’ motivations by introducing yet another twist. This time, even if a door vanished from the screen, players could make it reappear whenever they wanted. But even when they knew it would not cost anything to make the door reappear, they still kept frantically trying to prevent doors from vanishing. Apparently they did not care so much about maintaining flexibility in the future. What really motivated them was the desire to avoid the immediate pain of watching a door close. “Closing a door on an option is experienced as a loss, and people are willing to pay a price to avoid the emotion of loss,” Dr. Ariely says.

That paragraph strikes me as in need of some good ethnography—that “apparently” looms too large in my imagination. 
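To make the tradeoff concrete, here is a minimal sketch in Python of the game as described above.  It is not the researchers’ actual task code: the door payoffs, the click budget, and the rule for how quickly a neglected door vanishes are all invented for illustration.  The arithmetic it puts on display is simple: every click spent propping a door open is a click that earns nothing.

```python
# A minimal sketch (not the researchers' code) of the door game described above:
# three doors, each paying a set amount per click, with any neglected door
# vanishing unless the player "wastes" a click to restore it.  Payoffs, click
# budget, and decay rule are made-up illustrative values.

PAYOFFS = {"A": 3, "B": 5, "C": 9}    # hypothetical cents earned per click on each door
TOTAL_CLICKS = 100                    # total clicks the player gets to spend
VANISH_AFTER = 12                     # a door disappears after this many clicks spent elsewhere
RESCUE_AT = VANISH_AFTER - 2          # the keep-open strategy rescues a door this close to vanishing


def play(keep_doors_open: bool) -> int:
    """Return total earnings for one simple strategy.

    If keep_doors_open is True, the player spends a click (earning nothing)
    to restore any door about to vanish; otherwise the player lets doors
    close and keeps clicking the best door still available.
    """
    neglect = {door: 0 for door in PAYOFFS}   # clicks since each door was last clicked
    alive = set(PAYOFFS)
    earnings = 0

    for _ in range(TOTAL_CLICKS):
        endangered = [d for d in sorted(alive) if neglect[d] >= RESCUE_AT]
        if keep_doors_open and endangered:
            choice, payoff = endangered[0], 0       # rescue click: resets the door, earns nothing
        else:
            choice = max(alive, key=PAYOFFS.get)    # otherwise exploit the best door still open
            payoff = PAYOFFS[choice]

        earnings += payoff
        for d in alive:
            neglect[d] = 0 if d == choice else neglect[d] + 1
        alive = {d for d in alive if neglect[d] < VANISH_AFTER}   # long-neglected doors vanish

    return earnings


if __name__ == "__main__":
    print("Frantically keeping every door open:", play(keep_doors_open=True))
    print("Letting the low-value doors close:  ", play(keep_doors_open=False))
```

Run as written, the door-saving strategy ends up with noticeably less money than simply letting the two weaker doors close, which is the price of “keeping options open” that Ariely describes.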

Continue reading “Closing Some Doors”

Autism and Understanding Others

Amanda Baggs presents her own life and thoughts in her YouTube video, In My Language, her translation of how she is in a constant conversation with the world around her.  She is autistic and does not speak.  But she can type, and after three minutes that show her interacting with her environment, she uses computer technology to explain herself to us.


I came across this video through Tara Parker-Pope’s post, The Language of Autism.  As Parker-Pope relates, “Ms. Baggs does far more than give us a vivid glimpse into her mind. Her video is a clarion call on behalf of people with cognitive disabilities whose way of communicating isn’t understood by the rest of the world.”

Continue reading “Autism and Understanding Others”

Cognitive Science and the Advance of Ideas

Here’s a link to the Center for Cognitive Science at the University of Minnesota’s list of the top 100 cognitive science papers of the last century.  Definitely a useful reference.  Debates about modularity, connectionism, the mind as computational, limits on human rationality, and so forth all emerged from these papers.  Not a lot of culture, inequality, or anthropology in the bunch, and a definite bias towards psychology as universal rather than also being variable and contextual. But, hey, this site has to work on something…

And if you haven’t seen it, Edge asked top scholars in 2008, “What Have You Changed Your Mind About? Why?”

In looking at the first page of answers, I am struck by how much scientists are now reworking the views developed in those top 100 cognitive science papers.

So, Joseph LeDoux: “Like many scientists in the field of memory, I used to think that a memory is something stored in the brain and then accessed when used. Then, in 2000, a researcher in my lab, Karim Nader, did an experiment that convinced me, and many others, that our usual way of thinking was wrong. In a nutshell, what Karim showed was that each time a memory is used, it has to be restored as a new memory in order to be accessible later. The old memory is either not there or is inaccessible. In short, your memory about something is only as good as your last memory about it.”

Continue reading “Cognitive Science and the Advance of Ideas”