In the previous post Carol Worthman: From Human Development to Habits of the Heart, I covered two of Carol’s recent papers. Just after that I discovered a great lecture by Carol, where she covers her work on “Habits of the Heart: Life History and the Developmental Neuroendocrinology of Emotion Regulation.” So now you can see her in action!
Carol Worthman, a mentor of mine at Emory University and a real leader in doing neuroanthropological research (even if she might call it “biocultural”), has two recent articles out that I really want to highlight.
The first is The Ecology of Human Development: Evolving Models for Cultural Psychology. Here is the abstract, part of a whole special issue in the Journal of Cross-Cultural Psychology on the work of the husband-wife team John Whiting and Beatrice Whiting:
The Whiting model aimed to provide a blueprint for psychocultural research by generating testable hypotheses about the dynamic relationships of a culture with the psychology and behavior of its members. This analysis identifies reasons why the model was so effective at generating hypotheses borne out in empirical research, including its foundational insight that integrated nature and nurture, its reconceptualization of the significance of early environments, and its attention to biopsychocultural dynamics active in those environments.
Implications and the evolution of the ecological paradigm are tracked through presentations of three current models (developmental niche, ecocultural theory, bioecocultural microniche) and discussion of their related empirical literatures. Findings from these literatures converge to demonstrate the power of a developmental, cultural, ecological framework for explaining within- and between-population variation in cultural psychology.
The figure above is from this paper, and represents Carol’s own model for understanding human development. But the real point that Carol wants to make in emphasizing these three models goes as follows:
All of these models share a concern for how the cultural ecology of affect and affect regulation drive psychobehavioral development, competence, and well-being or health. Whoever has looked has found linkages among cultural practices, stress physiology, and emotion regulation. Note that each of these models foregrounds the development of emotion and emotion regulation and de-emphasizes classic knowledge acquisition. Although there are important reasons for this emphasis (Damasio, 2005), a reconsideration of what constitutes “knowledge” and more systematic investigation of the linkages between emotion and knowledge might prove valuable (588).
The second article is Habits of the Heart: Life History and the Developmental Neuroendocrinology of Emotion. This article was part of a special issue on Advances in Evolutionary Endocrinology in the American Journal of Human Biology. Here is Carol’s abstract:
The centrality of emotion in cognition and social intelligence as well as its impact on health has intensified investigation into the causes and consequences of individual variation in emotion regulation. Central processing of experience directly informs regulation of endocrine axes, essentially forming a neuro-endocrine continuum integrating information intake, processing, and physiological and behavioral response. Two major elements of life history—resource allocation and niche partitioning—are served by linking cognitive-affective with physiologic and behavioral processes. Scarce cognitive resources (attention, memory, and time) are allocated under guidance from affective co-processing. Affective-cognitive processing, in turn, regulates physiologic activity through neuro-endocrine outflow and thereby orchestrates energetic resource allocation and trade-offs, both acutely and through time. Reciprocally, peripheral activity (e.g., immunologic, metabolic, or energetic markers) influences affective-cognitive processing.
By guiding attention, memory, and behavior, affective-cognitive processing also informs individual stances toward, patterns of activity in, and relationships with the world. As such, it mediates processes of niche partitioning that adaptively exploit social and material resources. Developmental behavioral neurobiology has identified multiple factors that influence the ontogeny of emotion regulation to form affective and behavioral styles. Evidence is reviewed documenting roles for genetic, epigenetic, and experiential factors in the development of emotion regulation, social cognition, and behavior with important implications for understanding mechanisms that underlie life history construction and the sources of differential health. Overall, this dynamic arena for research promises to link the biological bases of life history theory with the psychobehavioral phenomena that figure so centrally in quotidian experience and adaptation, particularly, for humans.
In this second article, Carol ties her work back into evolutionary theory. If the first article took up the cultural/psychological side, here we are grounded in the mechanisms and ideas of biological anthropology. She writes:
Given the evidence of gene-environment interactions and developmental effects discussed above, combinations of history and circumstance will condition the phenotypes generated from the genetic structure, and thus influence the impact of that structure on corresponding experience, welfare, behavior, and the balance of selective pressures upon genetic diversity. Such gene-environment interactions and their consequences for function and welfare deserve investigation across a wide range of human cultures and conditions. Such study bears exciting possibility for unlocking dynamics among culture, social conditions, the nature and distribution of social niches, and selection pressures operating on allelic variants (779).
Link to citation/abstract for Carol Worthman’s The Ecology of Human Development: Evolving Models for Cultural Psychology.
Link to citation/abstract for Carol Worthman’s Habits of the Heart: Life History and the Developmental Neuroendocrinology of Emotion.
The United States recently ranked 20th out of 21 rich countries in a UNICEF study of child well-being. The effects of childhood can last a lifetime. Darcia Narvaez, writing with Jaak Panksepp and Allan Schore, argues in the post The Decline of Children and the Moral Sense:
American culture may be deviating increasingly from traditional social practices that emerged in our ancestral “environment of evolutionary adaptedness” (EEA). Empathy, the backbone of compassionate moral behavior, is decreasing…
In fact, given the way we raise our children, the USA seems to be increasingly depriving them of the practices that lead to well-being and a moral sense.
Together Narvaez and Panksepp are organizing a conference on Human Nature and Early Experience: Addressing the “Environment of Evolutionary Adaptedness”, where Schore will be one of the featured speakers.
Charles Darwin had high hopes for humanity. He pointed to the unique way that human evolution was driven in part by a “moral sense.” Its key evolutionary features are the social instincts, taking pleasure in the company of others, and feeling sympathy for fellow humans. It was promoted by intellectual abilities, such as memory for the past and the ability to contrast one’s desires with the intentions of others, leading to conscience development, and, after language acquisition, concern for the opinion of others and the community at large…
What Darwin considered the moral-engine of positive human thriving may be under threat. Ill-advised practices and beliefs have become normalized without much fanfare, such as the common use of infant formula, the isolation of infants in their own rooms, the belief that responding too quickly to a fussing baby is spoiling it, the placing of infants in impersonal daycare, and so on. We recommend that scientists and citizens step back from and reexamine these common culturally accepted practices and pay attention to potential life-time effects on people. It is an ethical issue.
Intellectual labels are always a tricky business, necessary for talking about ideas and suggesting that a theorist is in a particular ideological neighborhood. Yet, they can drag along so much baggage that they become self-defeating, evoking instant resistance or inevitable misinterpretation if poorly used. In the best of cases, they can help to create a clear identity for innovative work in an academic field, speeding the effort to carve out a space for ideas in a cluttered terrain of thought. Deployed well, they can help to clarify and orient us; applied clumsily, they become intellectual invective, prematurely closing off discussion or debate and substituting labeling for thinking.
Today, I want to write briefly about ‘neuroanthropology’ as a badge, but spend more time on ‘neuroconstructivism,’ as it’s a term that sometimes gets associated with the sort of research and thinking that we are advocating here at Neuroanthropology.net. In a sense, this piece is written for non-anthropologists, to help them understand why they might get a really strange reaction from an anthropologist colleague if they start talking excitedly about new ‘neuroconstructivist’ perspectives.
We’ve obviously decided that ‘neuroanthropology’ is one of the labels that we find helpful. We stand by the neologism, even though some of our readers have described our choice of terms as ‘deplorable,’ and we’ve sometimes had to struggle against the term’s use elsewhere. For example, Oliver Sacks, the wonderful chronicler of the lived worlds of people with severe brain lesions, often calls himself a ‘neuroanthropologist,’ as Jovan Maud at Culture Matters pointed out to me and Daniel highlights in a recent, more thorough post on the relation of what we’re doing to what Sacks has done (see also Neuroanthropology).
Nicholas Kristof has an op-ed today, How to Raise Our I.Q. He opens with a standard version of the individual meritocracy argument, that IQ is largely inherited:
Poor people have I.Q.’s significantly lower than those of rich people, and the awkward conventional wisdom has been that this is in large part a function of genetics. After all, a series of studies seemed to indicate that I.Q. is largely inherited. Identical twins raised apart, for example, have I.Q.’s that are remarkably similar. They are even closer on average than those of fraternal twins who grow up together.
If intelligence were deeply encoded in our genes, that would lead to the depressing conclusion that neither schooling nor antipoverty programs can accomplish much. Yet while this view of I.Q. as overwhelmingly inherited has been widely held, the evidence is growing that it is, at a practical level, profoundly wrong.
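For readers curious how twin correlations like those get converted into a single "heritability" number, the classic back-of-the-envelope calculation is Falconer's formula: twice the gap between identical-twin and fraternal-twin correlations. Here is a minimal sketch of that arithmetic; the correlation values are hypothetical placeholders, not figures from the studies Kristof cites:

```python
# Falconer's formula: a rough heritability estimate from twin correlations.
# The correlation values below are hypothetical, chosen only to illustrate
# the arithmetic -- not data from Kristof's column or Nisbett's book.

def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Broad heritability estimate: twice the MZ-DZ correlation gap."""
    return 2 * (r_mz - r_dz)

# If identical (MZ) twins correlate at 0.80 and fraternal (DZ) twins at 0.50:
h2 = falconer_h2(r_mz=0.80, r_dz=0.50)
print(round(h2, 2))  # 0.6
```

Note that this formula only works if MZ and DZ twins share environments to the same degree, which is precisely the assumption that critics like Nisbett dispute.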
Kristof cites Richard Nisbett’s new book Intelligence and How to Get It: Why Schools and Cultures Count. I covered some of Nisbett’s work in the post IQ, Environment and Anthropology, and Jim Holt gave a strong review of the book recently in the NY Times. The publisher’s home page simply says that this book is a “bold refutation of the belief that genes determine intelligence.”
Poverty Poisons the Brain was one of our most popular posts last year. Recent research has brought that topic back into public light. It’s good research, but today I will get critical about what really matters in our emerging realization that social disadvantage results in neurological disadvantage.
Gary Evans and Michelle Schamberg recently published a PNAS paper, Childhood Poverty, Chronic Stress and Working Memory (pdf). Here’s the abstract:
The income–achievement gap is a formidable societal problem, but little is known about either neurocognitive or biological mechanisms that might account for income-related deficits in academic achievement. We show that childhood poverty is inversely related to working memory in young adults. Furthermore, this prospective relationship is mediated by elevated chronic stress during childhood. Chronic stress is measured by allostatic load, a biological marker of cumulative wear and tear on the body that is caused by the mobilization of multiple physiological systems in response to chronic environmental demands.
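The mediation claim in that abstract — that the poverty-to-memory link runs through chronic stress — has a simple statistical shape: the effect of poverty on working memory should shrink toward zero once allostatic load is controlled. A minimal sketch with simulated, entirely hypothetical data (this is not Evans and Schamberg's analysis or their numbers):

```python
# Sketch of the mediation logic: poverty -> chronic stress (allostatic
# load) -> working memory. All data are simulated and hypothetical.
import random

random.seed(0)

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def slope(x, y):
    """Simple-regression slope of y on x."""
    return cov(x, y) / cov(x, x)

def slopes2(x1, x2, y):
    """OLS slopes of y on two predictors, via the normal equations."""
    s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
    s1y, s2y = cov(x1, y), cov(x2, y)
    det = s11 * s22 - s12 ** 2
    return (s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det

# Simulate full mediation: poverty raises allostatic load,
# and load (not poverty directly) lowers working memory.
n = 2000
poverty = [random.gauss(0, 1) for _ in range(n)]
load = [0.6 * p + random.gauss(0, 1) for p in poverty]
memory = [-0.5 * l + random.gauss(0, 1) for l in load]

total = slope(poverty, memory)              # total poverty effect
direct, _ = slopes2(poverty, load, memory)  # effect controlling for load

print(f"total effect: {total:.2f}, direct effect: {direct:.2f}")
# With full mediation built in, the direct effect lands near zero.
```

The comparison between the two slopes is the whole argument: if stress is the pathway, controlling for it should absorb the poverty effect, which is the pattern the paper reports.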
The Evans and Schamberg paper has gotten prominent media attention. Over at Wired, Poverty Goes Straight to the Brain got an enormous number of diggs. Brandon Keim’s opening lines are, “Growing up poor isn’t merely hard on kids. It might also be bad for their brains. A long-term study of cognitive development in lower- and middle-class students found strong links between childhood poverty, physiological stress and adult memory.”
The Guardian (UK) brings us a recent example of technophobia based on comments by neuroscientist Lady Susan Adele Greenfield, this time about the latest prime suspects for ‘rotting the brains of our youth’: Facebook and social networking sites. Patrick Wintour offers us Facebook and Bebo risk ‘infantilising’ the human mind, suggesting that social networking websites might be responsible for ‘short attention spans, sensationalism, inability to empathise and a shaky sense of identity.’ The article quotes at length from a statement to the House of Lords by Baroness Greenfield, Professor of Synaptic Pharmacology at Lincoln College, Oxford, and Director of the Royal Institution of Great Britain.
Baroness Greenfield has written a stack of books, including a best-seller on the brain, earned a peerage for her outstanding career, and has so many titles and honours that I’m not even sure what to call her (Prof? Lady?). Browsing her homepage and publications list, there’s a range of interesting stuff on consciousness, analgesia, dopamine, and a fair number of subjects on which I don’t even have the expertise to comment. The only problem is that her fears, closely examined, reveal that she doesn’t know what to be afraid of, adopting a ‘one-paranoia-fits-all’ approach to technological change.
The Guardian article seems a bit overwrought, and I don’t have the transcript of Greenfield’s presentation to the House of Lords, so I’m hesitant to attribute too much of the phobia to the original speech (for a critique of Greenfield’s habit of alarmism, however, see Ben Goldacre’s weblog). As we’ve seen repeatedly, the transition from scientist presenting to science writer submitting the story to editor reworking to press printing can be really rough, transforming subtle and measured analysis into formulaic, exaggerated soundbites. However, there are some extensive quotes, so in this piece, I’ll do my best to analyze what we have. In another post, I want to move beyond the fear of Facebook, using Lady Greenfield’s comments to think about how we might actually do research on the effects of technological change among developmental influences, but I won’t get to that in this post, as it’s already too long.
I’m not blasé about the developmental consequences of heavy exposure to screen technology, but I think that a legitimate interest in the possible effects of significant technological change in our daily lives can inadvertently dovetail seamlessly into a ‘kids these days’ curmudgeonly sense of generational degeneration, which is hardly new. That is, we have to be careful when we look at the research as it’s easy to annex our popular understandings of generational dynamics, even frustrations with our own children, students, and other young people, into a snowballing sense that everything’s going to hell.
Is new technology affecting our brain development and how? Is the recent change in the developmental environment much greater than previous changes in childhood ecology? And what specifically can we say about social networking sites as a factor in cognitive development? Obviously, these are huge questions, and it’s not my area of research specialty exactly, so I’m not going to bring fresh unpublished data to the table. But I do have some thoughts on the subject nonetheless, as our regular readers might imagine… but here’s the first part, where I deal with the concerns voiced by Greenfield and others.
So here’s a recent New Scientist title: “Bad Boys Can Blame Their Behaviour on Hormones.”
All I can think is: New Scientist, Old School. Old, as in nature-nurture old and biological determinism old. Old as in moldy, rusted, failing ideas old.
But it’s not just New Scientist. Discover matches New Scientist with, “Teenage Hoodlums Can Blame Bad Behavior on Hormones.” And The Daily Mail delivers “Now Teenage Thugs Can Blame Their Hormones for Bad Behaviour.”
So what’s the problem? Well, it’s two-fold. First, journalists play out a cultural script as if they subscribed to old-school cultural determinism. And second, there’s some bad research that, not coincidentally, helps the journalists act like cultural automatons.
The cultural model goes like this: stereotypes, then blame, then biology. Take a stereotype we fear (“we” meaning journalists and readers alike). Bring in the politics and ideology of blame – hey, there’s a reason they are not like us, and why they threaten us. Invoke a cause, generally biological (though cultural causes come up too), outside of our particular realm of control. Hormones, nothing we can do about that, it means they were bad from the get-go. So we’re right to fear them and better make sure they don’t hurt us, whatever it takes.
Don’t believe me? Just look at the photos that accompany the articles. At the Daily Mail, a hooded guy points his hand like a gun at us, the reader. Over at Discover, a crazed man with a clenched fist yells in our faces.
We all know journalists will play to stereotypes and will get research wrong and so forth. But in this case, as in most biologically oriented research about complex human phenomena, the research only feeds into the journalists’ typing out the normal crap.
The article in question is “Cortisol Diurnal Rhythm and Stress Reactivity in Male Adolescents with Early-Onset or Adolescence-Onset Conduct Disorder” (full access) by Graeme Fairchild, Stephanie van Goozen et al. and appears in the October 2008 issue of Biological Psychiatry. Neurocritic gives us the overview of the article if you don’t want to read the whole thing. (While I liked the Bad Boys music, I could have done with some more criticism in this particular Neurocritic post – but that’s okay, I’m going to play the bad boy this time.) Here’s the popular take from New Scientist on the article:
Out-of-control boys facing spells in detention or anti-social behaviour orders can now blame it all on their hormones. The “stress hormone” cortisol – or low levels of it – may be responsible for male aggressive antisocial behaviour, according to new research. The work suggests that the hormone may restrain aggression in stressful situations. Researchers found that levels of cortisol fell when delinquent boys played a stressful video game, the opposite of what was seen in control volunteers playing the same game.
Harvard Magazine has a short piece this month on the work of neurologists Frances Jensen and David Urion to popularize information about the “teen brain” for lay audiences. As Jensen says, “This is the first generation of teenagers that has access to this information, and they need to understand some of their vulnerabilities.”
That information? That, given the way their brain is maturing (both fast-growing synapses and other sections relatively unconnected), adolescents are more “easily influenced by their environment and more prone to impulsive behavior.” As expected, there follows a typical line of parental angst: the sexes are different, drugs harm brains, kids need to sleep and get exercise, they are suffering from sensory overload from all the new technology. By implication, it is all due to being in “this paradoxical period in brain development.”
Certainly there are some intriguing results about brain development in adolescents related to differential brain maturation, developmental plasticity, and the like. Some early findings from longitudinal research are summarized here in an NIMH press release, which concludes in better fashion: “the teenage brain is a very complicated and dynamic arena, one that is not easily understood,” whether for parents or for researchers. But as I covered earlier in a post on emotion and decision making, teenagers can actually be seen as rather good decision makers, just focused on different goals and contexts than most adults.
And come on, teenagers are overwhelmed by information and multitasking in today’s “brave new world”? I wish I had half the skills that my incoming freshmen display in this arena; I’m the one who doesn’t quite know how to handle the sensory overload…
Another graphic accompanies the Harvard article (only in the pdf, though), an illustration by Leslie Cober-Gentry. For me, it shows the enormous gap between the brain imaging graphic and this more cultural graphic. As with all imaging research, there can only be correlations between level of activity and a particular task at hand. But that equation leaves out all the other important correlations that exist between, say, being impulsive and a particular environmental context. The juxtaposition of the two images captures perfectly what Urion and Jensen do: project our everyday life and concerns onto our newest explanatory cause, the brain.
The Telegraph yesterday ran an article, Brain downloads ‘will make lessons pointless,’ about some comments made by Chris Parry, former Rear Admiral and CEO of the Independent Schools Council. Parry believes that ‘“Matrix-style” technology would render traditional lessons obsolete,’ because we’ll soon be beaming knowledge into kids’ brains. Parry told the Times Educational Supplement: “It’s a very short route from wireless technology to actually getting the electrical connections in your brain to absorb that knowledge.”
Okay, you all need to help me: do I file this under ‘hokum,’ ‘malarkey,’ or ‘balderdash’? Rear Admiral Parry, sir, will the wireless technology use the brain’s Bluetooth or WiFi receptors? Which part of the brain’s RAM will you use when you install the new ‘human operating system’?
Okay, Admiral Parry, repeat after me: The brain is not a computer.