Postmodern Medicine

Harvard Magazine has an excerpt from Charles Rosenberg’s new book, Our Present Complaint: American Medicine, Then and Now, in this month’s issue.  I have pasted the entire article below, as I find it a strong evocation of how disease is as much a social entity as a biological phenomenon.  It captures much of what is difficult to understand: that our biology is inevitably and essentially social.

Postmodern Medicine

We are all “medical citizens,” embedded as potential or actual patients, with physicians, in a system of social, moral, and organizational understandings. So writes Monrad professor of the social sciences Charles E. Rosenberg in Our Present Complaint: American Medicine, Then and Now (Johns Hopkins, $50; $19.95 paper), touching on sources of unease.


Disease has become a bureaucratic—and, thus, social and administrative—as well as biological and conceptual—entity. What do I mean when I describe disease as a “social entity”? I refer to a web of practice guidelines, disease protocols, laboratory and imaging results, meta-analyses, and consensus conferences. These practices and procedures have over time come to constitute a seemingly objective and inescapable framework of disease categories, a framework that increasingly specifies diagnostic criteria and dictates appropriate therapeutic choices. In America’s peculiar hybrid health-care system, layers of hospital and managed care administrators enforce these disease-based guidelines. The past generation’s revolution in information technology has only exacerbated and intensified these trends—in parallel with the research and marketing strategies of major pharmaceutical companies…. This web of complex relationships has created a new reality for practitioners and patients alike. Physicians have had their choices increasingly constrained—if, in some ways, enhanced. For the sick, such ways of conceptualizing and treating disease have come to constitute a tangible aspect of their illness experience.

Of course, every society has entertained ideas about disease and its treatment; patients have never been blank slates.…Think of the generations of sufferers who were bled, sweated, puked, or purged to balance their humors. But never has the infrastructure of ideas, practices, thresholds, and protocols that comes between agreed-upon knowledge and the individual patient been so tightly woven and bureaucratically crafted.…

Yet, as I have emphasized, we are left with that inconveniently subjective object, the patient—in whose body these abstract entities manifest themselves. This is the characteristic split screen that faces today’s clinician: the tension between illness in the individual patient and disease as crystallized and made socially real in the laboratory’s and the epidemiologist’s outputs and inscriptions, practice guidelines, and algorithms.…Bedside, clinic, and physician’s office are the points at which the mandates of best—and increasingly most economically rational—practice bump up against the unique reality of the individual patient and challenge the physician’s traditional autonomy.…

It engenders a feeling of paradox, the juxtaposition of a powerful faith in scientific medicine with a widespread discontent at the circumstances in which it is made available. It is a set of attitudes and expectations postmodern as well as quintessentially modern.

Closing Some Doors

Too many choices?  The New York Times has an article by John Tierney, “The Advantages of Closing a Few Doors.”  Tierney discusses Dan Ariely’s new book, Predictably Irrational: The Hidden Forces That Shape Our Decisions: “an entertaining look at human foibles like the penchant for keeping too many options open.”

Here’s one section about an experiment in which players looked for cash behind one of three doors on a screen.  Each door had a set pay-off, and unused doors slowly disappeared over time unless a player spent a click to keep them open (thus wasting a click that could have earned more money):

[The researchers] plumbed the players’ motivations by introducing yet another twist. This time, even if a door vanished from the screen, players could make it reappear whenever they wanted. But even when they knew it would not cost anything to make the door reappear, they still kept frantically trying to prevent doors from vanishing. Apparently they did not care so much about maintaining flexibility in the future. What really motivated them was the desire to avoid the immediate pain of watching a door close. “Closing a door on an option is experienced as a loss, and people are willing to pay a price to avoid the emotion of loss,” Dr. Ariely says.

That paragraph strikes me as in need of some good ethnography—that “apparently” looms too large in my imagination. 


Autism and Understanding Others

Amanda Baggs presents her own life and thoughts in her YouTube video, In My Language, her translation of how she is in a constant conversation with the world around her.  She is autistic and does not speak.  But she can type, and after three minutes showing her interacting with her environment, she uses computer technology to explain herself to us.


I came across this video through Tara Parker-Pope’s post, The Language of Autism.  As Parker-Pope relates, “Ms. Baggs does far more than give us a vivid glimpse into her mind. Her video is a clarion call on behalf of people with cognitive disabilities whose way of communicating isn’t understood by the rest of the world.”


Cognitive Science and the Advance of Ideas

Here’s a link to the Center for Cognitive Science at the University of Minnesota’s list of the top 100 cognitive science papers of the last century.  Definitely a useful reference.  Debates about modularity, connectionism, the mind as computational, limits on human rationality, and so forth all emerged from these papers.  Not a lot of culture, inequality, or anthropology in the bunch, and a definite bias towards psychology as universal rather than also variable and contextual.  But, hey, this site has to work on something…

And if you haven’t seen it, Edge asked top scholars in 2008, “What Have You Changed Your Mind About? Why?”

In looking at the first page of answers, I am struck by how much scientists are now reworking the views developed in those top 100 cognitive science papers.

So, Joseph LeDoux: “Like many scientists in the field of memory, I used to think that a memory is something stored in the brain and then accessed when used. Then, in 2000, a researcher in my lab, Karim Nader, did an experiment that convinced me, and many others, that our usual way of thinking was wrong. In a nutshell, what Karim showed was that each time a memory is used, it has to be restored as a new memory in order to be accessible later. The old memory is either not there or is inaccessible. In short, your memory about something is only as good as your last memory about it.”


Kwame Appiah

Kwame Appiah is a professor of philosophy at Princeton University and has a new book, Experiments in Ethics.  The book is interesting to me both because of his use of data, rather than just analysis, to think about ethics, and because of his emphasis on the contextual nature of morality.  NPR has an entertaining radio interview with Appiah, where he discusses his approach to “empirical philosophy.”

There’s also a discussion of Appiah’s book in the NY Times, which presents a different take on trolleyology (discussed in our critical take on Pinker’s essay on morality).  Here’s what Paul Bloom writes in “Morality Studies”:

[T]his book has teeth, particularly when Appiah looks hard at the emphasis on moral dilemmas like the trolley problems. These were originally developed to tap our intuitions about agency and responsibility, and are thought to bear on real-world issues like abortion and just war. But the dense trolley literature “makes the Talmud look like Cliffs Notes” even as its complexity fails, he argues, to capture the richness of morality in our everyday lives. Real moral problems don’t come in the form of SAT questions, and being a good person often requires figuring out for yourself just what the options are: “In life, the challenge is not so much to figure out how best to play the game; the challenge is to figure out what game you’re playing.”

Here’s a review blurb by Cass Sunstein: “This dazzlingly written book argues for reconnecting moral philosophy with the sciences, both natural and social–and demonstrates that the reconnection, while in a sense overdue, reconnects philosophy with its ancient interest in empirical issues. Appiah’s important argument promises to transform more than one field. It is not only wise and subtle; it is also inspiring.”

And a summary from an Amazon reviewer raising a few critical points:

1. In his chapter on “the varieties of moral experience,” the author discusses a number of “modules” that he feels characterize the human psyche: compassion, reciprocity, hierarchy, and so forth. He draws on other scholars who have posited such proclivities, and he also mentions Chomsky who, he says, has proposed a similar, presumably innate, human capacity for language. I do not find these “modules” persuasive as being human universals. There is very little in this discussion that would connect it to empirical science, for example to anthropology, not to speak of the findings of modern neuroscience. Indeed, the descriptions of modules are reminiscent of pre-scientific speculations concerning “four humors.”

2. The second chapter, “the case against character,” gives us a stimulating and challenging rundown of experiments that suggest that ethical choice is very much influenced by the immediate situation. So we learn, for example, that if you have just smelled the delicious odor of fresh-baked bread, you are more likely to be generous than you would be without such olfactory stimulus. The author seems to conclude (he does hedge this a bit) that there is no such thing as character, that everything depends on the situation.

The problem here is that in any of these situations there are minorities of subjects who don’t act as expected. Even with all that good smelling bread, some remain stingy; even without great smells, some are generous. So it would appear that these experimental situations explain some of the variance but not all.

Alice in Wonderland Syndrome

Vania Smith-Oka, my colleague at Notre Dame, pointed out this NY Times article “Curiouser and Curiouser” by Siri Hustvedt.  The piece starts by exploring changes in body image: “The afflicted person perceives herself, or parts of herself, ballooning or diminishing in size. The neurological terms for the peculiar sensations of growing and shrinking are macroscopy and microscopy.”  Equally interesting is how the article examines “medical materialism,” a tendency to view the varieties of our lived experience in a pathological and materialist light, as the result of nerve cells and associated molecules run amok.

The essay argues eloquently for the need to see complexity as the way to understand ourselves, overcoming dichotomies such as nature/nurture or materialist/subjective:

The human infant is born immature, and in the first six years of its life, the front part of its brain (the prefrontal cortex) develops enormously. It develops through experience and continues to do so, although not as dramatically… A child who has good parental care — is stimulated, talked to, held, whose needs are answered — is materially affected by that contact, as is, conversely, the child who suffers shocks and deprivations. What happens to you is decisive in determining which neural networks are activated and kept. Since we are born with far too many neurons, the ones that aren’t used are “pruned”; they wither away. This explains why so-called “wild children” are unable to acquire anything but the most primitive form of language. It’s too late. It also demonstrates how nurture becomes nature and to make simple distinctions between them is absurd. A baby with a hypersensitive genetic makeup that predisposes him to anxiety can end up as a reasonably calm adult if he grows up in a soothing environment.

Hustvedt also speaks to the importance of an interpretative approach to understanding human phenomena, something that many anthropologists would echo: “Crick’s reductionism does not provide an adequate answer to Alice’s question. It’s rather like saying that Vermeer’s “Girl (or Woman or Maidservant) Pouring Milk” is a canvas with paint on it or that Alice herself is words on a page. These are facts, but they don’t explain my subjective experience of either of them or what the two girls mean to me.”

Another quote, one that resonates with much of what we’ve written on this site:

It is human to clutch at simple answers and shunt aside ambiguous, shifting realities. The fact that genes are expressed through environment, that however vital they may be in determining vulnerability to an illness, they cannot predict it, except in rare cases, such as Huntington’s disease; that the brain is not a static but a plastic organ, which forms itself long after birth through our interactions with others; that any passionate feeling, whether it’s about politics or tuna fish, will appear on scans as activated emotional circuits in the brain; that scientific studies on weight and longevity tell us mostly about correlations, not causes; that the feelings evoked by the so-called “God spot” may be interpreted by the person having them as religious or as something entirely different — all this is forgotten or misunderstood.

Hustvedt ends with a similar call to our own: “We are all prisoners of our mortal minds and bodies, vulnerable to various kinds of perceptual transfigurations. At the same time, as embodied beings we live in a world that we explore, absorb, and remember — partially, of course. We can only find the out there through the in here… Our thinking, feeling minds are made not only by our genes, but through our language and culture.”