Neuroanthropology Session at the AAA Conference

Greg and I are organizing a session for the annual American Anthropological Association meeting, held this year in San Francisco from November 19 to 23.  The session is called “The Encultured Brain: Neuroanthropology and Interdisciplinary Engagement.” 

We still have one or two spots that might be open for people interested in presenting on neuroanthropology at the AAAs.  So please contact either me (dlende@nd.edu) or Greg (greg.downey@scmp.mq.edu.au) as soon as possible, as we need complete abstracts before March 10th.  Please let us know what you’d like to present on! 

Here’s our session abstract: 

As a collaborative endeavor, neuroanthropology aims to better integrate anthropology, social theory, and the brain sciences.  In this panel, we explore the implications of new findings in the neurosciences for our understanding of culture, human development, and behavior. Neuroanthropology can help to revitalize psychological anthropology, promote links between biological and cultural anthropology, and strengthen work in medical and linguistic anthropology.  However, recent anthropology has not engaged neuroscience to produce the sort of synthesis that began when Franz Boas built cultural anthropology from psychophysics. 

Neuroscience has increasingly produced basic research and theoretical models that are surprisingly amenable to anthropology.  Rather than taking “neuro-reductionist” or determinist approaches, researchers have increasingly emphasized the role of environment, body, experience, evolution, and behavior in shaping, even driving, organic brain development and function.  At the same time, the complexity of the brain makes a mockery of attempts to pry apart “nature” from “nurture,” or to apportion credit for specific traits.  Research on gene expression, endocrine variability, mirror neurons, and neural plasticity begs for comparative data from across the range of human variation — biological and cultural. 

Neuroscientists and other social scientists are already actively working on these sorts of integrated models; books like Wexler’s Brain and Culture and Quartz and Sejnowski’s Liars, Lovers and Heroes actively incorporate anthropological materials.  In the social sciences, books like Turner’s Brains/Practices/Relativism aim to bring neuroscience into social theory, often with critical intent. 

However, these works often leave out the best of anthropology.  Although our research is being borrowed, we are being left out of the conversation precisely at a time when we should speak with authority.  In the present round of integration, simplistic understandings of culture dominate, and, at times, outside authors read our research through unsettling ideological lenses.  And, given the emphasis on experience, behavior, context and development, the absence of ethnographic research and insight into precisely those domains that impact our neural function is startling. 

Anthropology has much to offer to, and much to learn from, engagement with neuroscience.  An apt model is how important genetics has become in anthropology, cutting across the entire discipline.  A similar revolution awaits with neurobiology, if we can draw on our strengths and build neuroanthropology on inclusion, collaboration and engagement, both within and outside anthropology.  To this end, this session explores areas of anthropological research related to the brain where heredity, environment, culture and biology are in complex relations, with human variation emerging from their nexus rather than being determined by a single variable.  Participants explore addiction, motor skill, XXXX, XXXX — brain-related phenomena that can only be explained by dynamic models including both “bottom-up” (biological, neural, and psychological levels) and “top-down” (cultural, social, and ideological) factors.  Participants highlight that no single model of the biological-cultural interface holds for all cases.  The papers in this panel also suggest ways in which anthropologists might intervene in public discussions of crucial human characteristics and make our concerns more persuasive for other academic disciplines exploring the complexity of the human brain.

Free Lunch and Iraq

Two very different articles highlight just how little cost-benefit analysis matters sometimes, whether at the highest policy levels or in the most mundane of circumstances.  Humans evolved in a world of threats and status, and oftentimes that runs counter to any sort of logic.  And so we face many opportunities lost and much damage done. 

Bob Herbert writes today about “The $2 Trillion Nightmare,” the running estimate of the total cost of the Iraq war.  He notes the lack of public discussion of the “consequences of these costs, which are like a cancer inside the American economy.”  Then he discusses the testimony of a Nobel Prize-winning economist, Joseph Stiglitz, and the vice chairman of Goldman Sachs, Robert Hormats: “Both men talked about large opportunities lost because of the money poured into the war. ‘For a fraction of the cost of this war,’ said Mr. Stiglitz, ‘we could have put Social Security on a sound footing for the next half-century or more’.” 

Carol Pogash wrote recently about “Free Lunch Isn’t Cool, So Some Students Go Hungry.”  Many students who qualify for federally subsidized lunches go without: “Lunchtime ‘is the best time to impress your peers,’ said Lewis Geist, a senior at Balboa and its student body president. Being seen with a subsidized meal, he said, ‘lowers your status’.” 

Pogash writes later, “Ann Cooper, director of nutrition services for the public schools in Berkeley, Calif., said that attention to school cafeterias had traditionally focused on nutrition, but that the separation of students who pay and those who receive free meals was an important ‘social justice issue’.”

Beyond threats and status, cultural distinctions matter in these sorts of decisions.  The war on terror was framed, from the very first moment, as a war of civilization against barbarians—our very way of life seems to be under threat.  And students know what eating a subsidized meal signifies: all that effort in having “spiky hair and sunglasses” goes to waste in that moment of being seen on the wrong side of the American Dream. 

In the end the costs do matter, particularly in opportunities lost, as our own biological and cultural heritages conspire.  That’s more than the market, more than being predictably irrational; it’s the tragic acting out of our own selves at the smallest and largest of scales.  But they are dramas we ourselves write, and so can change. 

But it won’t be easy.  Write what you know best, one writer’s rule goes.  In everyday life it’s what we do all the time.  Breaking free from that, from lamenting what might have been to seizing what could be, will take courage and vision and work.

Decision Making and Emotion

Economists and policy makers are coming to the realization that rationality, in its multiple forms, doesn’t always explain why people make the decisions that they do.  By rationality, I mean both the assumption of “economic man” (a utilitarian cost/benefit analyzer) and the emphasis on education and knowledge as the privileged means of shaping behavior.   

Let’s take three recent headlines: “Why Sadness Increases Spending,” “Craving the High That Risky Trading Can Bring” and “Teenage Risks, and How to Avoid Them.”  All point to the role of emotion in decision making (any surprise here?). 

The first article states, “A research team [of Cynthia Cryder, Jennifer Lerner, and colleagues] finds that people feeling sad and self-focused spend more money to acquire the same commodities than those in a neutral emotional state.” 

The second provides an Aristotelian summary: “The findings, while preliminary, suggest — perhaps unsurprisingly — that traders who let their emotions get the best of them tend to fare poorly in the markets. But traders who rely on logic alone don’t do that well either. The most successful ones use their emotions to their advantage without letting the feelings overwhelm them.” 

The third tells us, “Scientific studies have shown that adolescents are very well aware of their vulnerability and that they actually overestimate their risk of suffering negative effects from activities like drinking and unprotected sex…  ‘It now becomes clearer why traditional intervention programs fail to help many teenagers,’ Dr. Valerie Reyna and Dr. Frank Farley wrote. ‘Although the programs stress the importance of accurate risk perception, young people already feel vulnerable and overestimate their risks.’  In Dr. Reyna’s view, inundating teenagers with factual risk information could backfire, leading them to realize that behaviors like unprotected sex are less risky than they thought. Using an analytical approach of weighing risks versus benefits is ‘a slippery slope that all too often results in teens’ thinking that the benefits outweigh the risks,’ she said.” 

This type of research provides small steps forward vis-à-vis traditional Western assumptions about decision making and rationality.  But my question is, Why don’t they go further?  Why do they simply seem to affirm our common sense view of the world? 

Postmodern Medicine

Harvard Magazine has an excerpt from Charles Rosenberg’s new book, Our Present Complaint: American Medicine, Then and Now, in this month’s issue.  I have pasted the entire article below, as I find it a strong evocation of how disease is as much a social entity as a biological phenomenon.  It captures much of what is difficult to understand, that our biology is inevitably and essentially social.

Postmodern Medicine

We are all “medical citizens,” embedded as potential or actual patients, with physicians, in a system of social, moral, and organizational understandings. So writes Monrad professor of the social sciences Charles E. Rosenberg in Our Present Complaint: American Medicine, Then and Now (Johns Hopkins, $50; $19.95 paper), touching on sources of unease.


Disease has become a bureaucratic—and, thus, social and administrative—as well as biological and conceptual—entity.

What do I mean when I describe disease as a “social entity”? I refer to a web of practice guidelines, disease protocols, laboratory and imaging results, meta-analyses, and consensus conferences. These practices and procedures have over time come to constitute a seemingly objective and inescapable framework of disease categories, a framework that increasingly specifies diagnostic criteria and dictates appropriate therapeutic choices. In America’s peculiar hybrid health-care system, layers of hospital and managed care administrators enforce these disease-based guidelines. The past generation’s revolution in information technology has only exacerbated and intensified these trends—in parallel with the research and marketing strategies of major pharmaceutical companies…. This web of complex relationships has created a new reality for practitioners and patients alike. Physicians have had their choices increasingly constrained—if, in some ways, enhanced. For the sick, such ways of conceptualizing and treating disease have come to constitute a tangible aspect of their illness experience.

Of course, every society has entertained ideas about disease and its treatment; patients have never been blank slates.…Think of the generations of sufferers who were bled, sweated, puked, or purged to balance their humors. But never has the infrastructure of ideas, practices, thresholds, and protocols that comes between agreed-upon knowledge and the individual patient been so tightly woven and bureaucratically crafted.…

Yet, as I have emphasized, we are left with that inconveniently subjective object, the patient—in whose body these abstract entities manifest themselves. This is the characteristic split screen that faces today’s clinician: the tension between illness in the individual patient and disease as crystallized and made socially real in the laboratory’s and the epidemiologist’s outputs and inscriptions, practice guidelines, and algorithms.…Bedside, clinic, and physician’s office are the points at which the mandates of best—and increasingly most economically rational—practice bump up against the unique reality of the individual patient and challenge the physician’s traditional autonomy.…

It engenders a feeling of paradox, the juxtaposition of a powerful faith in scientific medicine with a widespread discontent at the circumstances in which it is made available. It is a set of attitudes and expectations postmodern as well as quintessentially modern.

Closing Some Doors

Too many choices?  The New York Times has an article by John Tierney, “The Advantages of Closing a Few Doors.”  Tierney discusses Dan Ariely’s new book, Predictably Irrational: The Hidden Forces That Shape Our Decisions: “an entertaining look at human foibles like the penchant for keeping too many options open.” 

Here’s one section about an experiment in which players looked for cash behind one of three doors on a screen; each door had a set pay-off, and unvisited doors slowly disappeared unless the player spent a click to keep them open (thus wasting a click that could have earned more money): 

[The researchers] plumbed the players’ motivations by introducing yet another twist. This time, even if a door vanished from the screen, players could make it reappear whenever they wanted. But even when they knew it would not cost anything to make the door reappear, they still kept frantically trying to prevent doors from vanishing. Apparently they did not care so much about maintaining flexibility in the future. What really motivated them was the desire to avoid the immediate pain of watching a door close. “Closing a door on an option is experienced as a loss, and people are willing to pay a price to avoid the emotion of loss,” Dr. Ariely says.

That paragraph strikes me as in need of some good ethnography—that “apparently” looms too large in my imagination. 

Autism and Understanding Others

Amanda Baggs presents her own life and thoughts in her YouTube video, In My Language, her translation of how she is in constant conversation with the world around her.  She is autistic and does not speak.  But she can type, and after three minutes of footage showing her interacting with her environment, she uses computer technology to explain herself to us.


I came across this video through Tara Parker-Pope’s post, The Language of Autism.  As Parker-Pope relates, “Ms. Baggs does far more than give us a vivid glimpse into her mind. Her video is a clarion call on behalf of people with cognitive disabilities whose way of communicating isn’t understood by the rest of the world.”
