The Guardian (UK) brings us a recent example of technophobia based on comments by neuroscientist Lady Susan Adele Greenfield, this time about the latest prime suspects for ‘rotting the brains of our youth': Facebook and social networking sites. Patrick Wintour offers us Facebook and Bebo risk ‘infantilising’ the human mind, suggesting that social networking websites might be responsible for ‘short attention spans, sensationalism, inability to empathise and a shaky sense of identity.’ The article quotes at length from a statement to the House of Lords by Baroness Greenfield, Professor of Synaptic Pharmacology at Lincoln College, Oxford, and Director of the Royal Institution of Great Britain.
Baroness Greenfield has written a stack of books, including a best-seller on the brain, earned a peerage for her outstanding career, and has so many titles and honours that I’m not even sure what to call her (Prof? Lady?). Browsing her homepage and publications list, there’s a range of interesting stuff on consciousness, analgesia, dopamine, and a fair number of subjects on which I don’t even have the expertise to comment. The only problem is that her fears, closely examined, reveal that she doesn’t know what to be afraid of, adopting a ‘one-paranoia-fits-all’ approach to technological change.
The Guardian article seems a bit overwrought, and I don’t have the transcript of Greenfield’s presentation to the House of Lords, so I’m hesitant to attribute too much of the phobia to the original speech (for a critique of Greenfield’s habit of alarmism, however, see Ben Goldacre’s weblog). As we’ve seen repeatedly, the chain from scientist presenting, to science writer submitting the story, to editor reworking, to press printing can be really rough, transforming subtle and measured analysis into formulaic, exaggerated soundbites. However, there are some extensive quotes, so in this piece, I’ll do my best to analyze what we have. In another post, I want to move beyond the fear of Facebook, using Lady Greenfield’s comments to think about how we might actually do research on the effects of technological change among developmental influences, but I won’t get to that in this post, as it’s already too long.
I’m not blasé about the developmental consequences of heavy exposure to screen technology, but I think that a legitimate interest in the possible effects of significant technological change in our daily lives can inadvertently dovetail seamlessly into a ‘kids these days’ curmudgeonly sense of generational degeneration, which is hardly new. That is, we have to be careful when we look at the research as it’s easy to annex our popular understandings of generational dynamics, even frustrations with our own children, students, and other young people, into a snowballing sense that everything’s going to hell.
Is new technology affecting our brain development, and if so, how? Is the recent change in the developmental environment much greater than previous changes in childhood ecology? And what specifically can we say about social networking sites as a factor in cognitive development? Obviously, these are huge questions, and it’s not my area of research specialty exactly, so I’m not going to bring fresh unpublished data to the table. But I do have some thoughts on the subject nonetheless, as our regular readers might imagine… but here’s the first part, where I deal with the concerns voiced by Greenfield and others.
Fear of Facebook & video games & Bebo…
According to the Guardian story, Greenfield told the House of Lords that children’s experiences on social networking sites are devoid of ‘cohesive narrative and long-term significance,’ leading to degeneration of attention span, lost empathy, undermining of identity, sensationalism, and even infantilization. Although she points to social networking sites, she focuses some of her comments on the pace of user-screen interaction. Greenfield testified:
If the young brain is exposed from the outset to a world of fast action and reaction, of instant new screen images flashing up with the press of a key, such rapid interchange might accustom the brain to operate over such timescales. Perhaps when in the real world such responses are not immediately forthcoming, we will see such behaviours and call them attention-deficit disorder.
It might be helpful to investigate whether the near total submersion of our culture in screen technologies over the last decade might in some way be linked to the threefold increase over this period in prescriptions for methylphenidate, the drug prescribed for attention-deficit hyperactivity disorder.
In her comments, quoted at length in the original article (worth checking out), Greenfield frets about a number of potential impacts of online interaction on childhood development: desire for immediate gratification, failure to consider consequences, assuming all outcomes are reversible (like ‘dying’ in a video game), lack of concern about context (because video game narratives are decontextualized), compulsive reward seeking (virtual rewards), disregard for others’ emotions (because we do not read novels), and identity erosion.
Greenfield worries that social networking websites might replace face-to-face interaction, which requires greater interpersonal sensitivity and offers less time to think up ‘clever or witty responses.’ She wonders whether people in the future might recoil from the ‘messiness, unpredictability and immediate personal involvement of a three-dimensional, real-time interaction’ just as, she alleges, ‘we’ now eat meat that is processed before ‘we’ get it (presumably, farmers, ranchers, butchers and meat-packers are not part of the ‘we’ to which she is referring — sorry, it’s petty, but I live in the country, and future steaks and burgers are wandering around outside my dining room window…).
Whereas children were once safe to interact in realtime, outside, they now are trapped indoors, and ‘a child confined to the home every evening may find at the keyboard the kind of freedom of interaction and communication that earlier generations took for granted in the three-dimensional world of the street.’ Given the insulation from direct interaction, Greenfield wonders whether young people will reveal too much, unrestrained by embarrassment, inhibition, or concern for being evaluated.
Untangling the anxieties
As you might be able to tell from the way that I’ve written this up, I find Greenfield’s critique of online interaction internally inconsistent and contradictory, in part because Greenfield is critiquing in one slather (at least the way it’s written up in the story) a number of distinct computer-based activities, some of which aren’t even really online. For example, the effects of violent first-person video games on a user would likely be significantly different than self-presentational social networking websites like Facebook or flash communication technologies like texting or Twittering.
So I’ll try to sort out what I think Lady Greenfield’s primary fears are and see how they square with each other:
1) Some online experiences are ‘devoid of cohesive narrative and long-term significance,’ which may have detrimental effects on cognitive development.
2) Fast action and reaction onscreen might ‘accustom the brain to operate over such timescales,’ which might lead to attention-deficit disorder if non-screen responses do not live up to these accelerated expectations. Does immersion in ‘screen technologies’ in the last decade help to explain the threefold increase in diagnoses of attention-deficit disorder?
3) Social network users might develop a preference for immediacy where there are no long-term consequences (this may actually be two fears).
4) The clear rewards of video games are potentially addictive, especially because they are so reliable and immediate.
5) The de-contextualized situations in online interactions will lead to a decrease in empathy, because unlike novels, in which the goal is to understand participants, online media do not offer avenues for developing insight into characters.
6) Social networking sites might lead to an erosion of a person’s sense of identity because people might become more dependent upon the reactions of others to understand their own identities.
7) Because online interaction is easy, people may become lured into greater dependence on online interaction rather than risk the perils of face-to-face interaction. Children deprived of a chance to interact with each other in real time (perhaps because of safety concerns or longer commutes) now do so online, and they may like it more than interacting face-to-face.
8) Without the intensity of face-to-face interaction, children might become less inhibited about revealing things about themselves.
Let’s all take a deep breath…
1) Anthropologists and others have long argued that ‘narrative’ is imposed on events, and may be created by the narrator or by the listener. For example, I might invent a ‘narrative’ about my blogging ‘career,’ creating coherence out of non-cumulative and loosely connected events. ‘Narrative’ is not inherent in events themselves but in the significance that we ascribe to those events (for example, see Erin Finley’s excellent post, Cultural Aspects of PTSD, Part II: Narrative and Healing).
And online interaction often has extensive narrative elements, in the sense that people often capture a sense of time, change, key events, and the like on personal Facebook pages. Far from evacuating narrative, some social networking sites might be said to cause users to ‘narrativize’ their experience, engaging with everyday life already with an eye toward how they will represent it on their personal pages. For example, I find visiting students from the US annoying in Australia because they’re always looking for opportunities for stupid posed photos to put on their websites rather than just engaging with Australia; they are prematurely narrativizing their experience, in my opinion, treating what they find as an elaborate set for photo opportunities.
2) Do fast action-reaction expectations cause attention-deficit disorder? I don’t know of any research that supports this theory, but I don’t think we can rule it out. Correlation, of course, is not causation, however, so I wouldn’t be too quick to jump on this particular theoretical bus.
Fast action-reaction patterns exist in other activities that are not treated as suspect for the historical spike in ADD: for example, sports or games of quickness and reflex, like four-square or even dodge ball. In addition, having watched children engrossed with onscreen interaction, it’s not immediately obvious to me how screen obsession, a highly focused almost trance-like state, is linked to the scattered attention and inability to focus more typical of ADD.
If ‘screen technologies,’ the general term Greenfield employs, are the suspect for attention-deficit disorder, I suspect that some old culprits — manic editing of television, over-stimulation by children’s programming, advertising, violence — are more likely candidates for the primary cause.
Of course, there are other theories about the increasing diagnosis of attention-deficit disorder (and ADHD), including that these disorders are substantially over-diagnosed (for a similar argument, see our Psychiatry affects human psychology: e.g., ‘bipolar’ children or Neurotosh, Neurodosh and Neurodash). But one interesting note of caution would be that some observers have argued that the increasingly sedentary demands we place on children — in activities like sitting and reading or working on a computer — are a significant challenge to active children, leading to their diagnosis as ‘pathological,’ rather than the stimulus that makes them over-active.
3) It’s also not clear to me that video games are all that different in the narrative department (point 1) and long-term consequences department from any other games, either of the Monopoly variety or of the backyard soccer sort. In fact, as I understand Dutch historian Johan Huizinga’s classic Homo Ludens, one of the defining characteristics of play, in general, is that it is outside everyday life, including the normally rigid rules and the social consequences of actions. The more a game has long-term consequences, the less play-like it is.
Prof. Greenfield seems to be in favour of more novel reading and less online interaction and video games for children (fair enough, and I’d probably agree with her), but I think she’d be very hard pressed to argue that novels have much greater long-term consequences than social interaction online.
4) Can anything with a clear and immediate reward become addictive? If so, then can gurgling at a baby, throwing a ball with a Labrador, or giving money to strangers on the street become addictive? Daniel’s written far better points on addiction on this website (see, for example, One Day at Kotaku: Understanding Video Games and Other Modern Obsessions and Studying Sin), but the idea that anything with clear, immediate rewards might become debilitatingly addictive doesn’t explain the negative cases very well, when clearly rewarding things don’t become compulsive.
5) If novels were necessary for empathy, we’d be in far deeper trouble than we are. I read somewhere that only 20% of the US population read a book last year (or something like that). Most serious studies of empathy don’t see Facebook as a serious problem, although there is some well-founded concern about desensitization to violence through some forms of first-person, realistic, graphically violent video games.
A lot of media, and even forms of social life, don’t allow us to investigate the motives of others, so I’m not persuaded that this is sufficient reason to be afraid of social networking websites. In addition, I think that this critique is more of video games than of Facebook or its ilk. This is one place where I think that the anxieties are being mixed to create hybrid technophobia.
6) Many social theorists argue that identity is inherently interactional (for example, George Herbert Mead or Herbert Blumer).
And it’s not clear how the extraordinarily self-obsessed medium of Facebook might lead a person to lose their sense of identity; on the contrary, one could argue that the genre contributes to a generational narcissism, an excessive emphasis on working on one’s own identity performance. Prof. Jean Twenge of San Diego State University found that 30% more college students scored high on the Narcissistic Personality Inventory in 2006 than in 1982 (see her book, Generation Me and the accompanying website).
7) I’m not sure if online interaction is actually ‘easier’ than face-to-face interaction, nor am I convinced that young people are not interacting face-to-face, although they may be interacting remotely when they are in close proximity to us (but I doubt that they would want to talk to us geezers even if they were stripped of their mobile phones). When I see young people, they seem to be interacting a lot, even the ones with Facebook pages. The debasement of the word ‘friend’ by the website’s use of it should not make us assume that users can’t tell the difference between friends and Facebook ‘friends.’ As Vaughan at Mind Hacks discusses, research on the use of social networking sites by young people often finds that they use these technologies to maintain social relations established in non-virtual interaction.
Although Greenfield describes this as a danger, I think it’s fascinating that we once were told to worry about how social isolation, borne of safety concerns and latchkey lives, would stunt our children’s social development. Now that they have technology that reaches around the barriers between people, we’re worried about their forms of interaction.
If I’m afraid of anything related to this pervasive embedding in electronic communication, it’s the way that bullying can follow victims anywhere and the sheer banality of so much of the communication. But I’m also afraid of face-to-face bullying, and the banality of most conversation is also pretty frustrating; that’s probably why I’m an academic.
8) Clearly, I would not advocate telling one’s children to disclose private information about themselves online, but this fear of excessive intimacy seems to be contradictory to fear #6. If online social networkers are growing more shallow and dependent upon others’ opinions, then their intimate selves are already less private as they are public performances of a virtual sort.
This may be a case where fear of online grooming of children by predators is being amalgamated with a fear of excessive self-disclosure; since the majority of people on Facebook are not children, the fear of being targeted by pedophiles doesn’t apply to most users. Grooming is a real fear, but again, I think it needs to be separated from Facebook or any other specific social networking site and applied to the context in which it actually occurs: children online being targeted for recruiting by pedophiles. No one in their right mind who has read a newspaper or watched television news could think that this concern is not pervasive in our communities (whether or not it is proportionate to the level of threat).
But this fear is not really about the effect of technology on the cognitive development of the users; this is a different fear about social threats and dangers in the community.
Technophobia in each generation
If we search for analogies, we can think of countless previous techno-moral panics that now seem positively quaint: the dangerous effects of rock ‘n’ roll, comic books, music videos, television, the wireless, air conditioning, trains… Mesopotamian parents were probably fearful of the impact of the newfangled chariot, and German parents no doubt fretted about what horrors Gutenberg’s movable type was about to introduce into their homes.
‘Every generation is phobic about the effects of new technology on the next,’ Boris Johnson, the Mayor of London, cautions on his website (well worth a read). He goes on to offer his own diagnosis:
I don’t like the idea of kids spending hours on the web, probably being groomed by paedophiles from Liège; and yet all the kids I know – whatever they have been goggling at – seem remarkably unruffled, and surprisingly moralistic. No matter how sordid the programmes, they disapprove vehemently of swearing. Anything remotely racist or homophobic sounds much more profane, to their ears, than it did to children 30 years ago. I could direct you to an 11-year‑old who certainly likes Desperate Housewives, but the show she really loves is called High School Musical and is so clean as to be positively emetic.
The young people I know may not disapprove of swearing, but like Johnson, I think we could find ample evidence to support an ‘our children are growing *more* moralistic/upstanding/square’ narrative, if that were our inclination, our dominant lens for viewing cultural change. For example, I remember long discussions in grad school and when I was teaching at an American university in which members of my generation (X) and the older generation (young Boomers, I suppose) wondered at the naïveté, androgynous relationships, and overall squareness of younger generations of students. We had to bite our tongues about our own pasts of experimentation with drugs and sex as they offered much more timid, fear-filled visions of the world of choices that they faced.
The point is not whether young people are more or less moral, more or less imperiled, or whether their fears, if greater, have sound grounding, but rather, why are we older folks so worried about their moral degeneration? Johnson thinks it’s less about protecting them from dangerous influences and more about our own shame: ‘Sometimes I think our censoriousness is not so much about protecting children as it is about preventing them from seeing the embarrassing silliness of adult behaviour.’
Johnson points to a bigger issue, in my opinion, when he suggests that overall consumption itself is frightening, whether or not it’s leading kids to depravity: ‘The real trouble is that they watch too much blasted electronic media altogether…’
Certainly, ADD might be, in some cases, a socially and technologically induced or at least exacerbated health problem, but so, in some cases, are obesity, type-2 diabetes, heart problems, stress, and a host of other issues. Although Greenfield may see online social networking as a significant new threat, from a health perspective, it’s likely more an incremental change on what has already been a seismic shift in developmental environment: the spread of television and ‘screen technologies’ into households and their increasing dominance of daily activities.
From a whole-body or even an evolutionary perspective, I’m not convinced that online social networking is all that different from other sedentary activities, which have heralded a profound change in developmental activity patterns for children. If anything, texting and Internet surfing are more active and responsive than passive media consumption, and probably not as compelling (I’m looking at my computer screen right now, and the large-screen TV in the room is far louder, faster-changing, more colourful, and more ‘instantly rewarding’).
I do think that Greenfield’s final point makes for a great set of questions (well, except for the laboured distinction between ‘mind’ and ‘brain’): ‘It is hard to see how living this way on a daily basis will not result in brains, or rather minds, different from those of previous generations. We know that the human brain is exquisitely sensitive to the outside world.’
I’m not sure that she’s right: the brain may be ‘exquisitely sensitive,’ but that doesn’t mean that the resulting neural architecture will necessarily be so alien. A developmental system can be exquisitely sensitive without being unstable or prone to erratic change. Part of the problem is an over-estimation of the fragility of the brain as a developmental system.
And the other possible blind spot is an over-estimation of the degree to which recent developments, like Facebook or Twitter, are actually a changed environment, or the assumption that we automatically perceive the fundamental relevance of the change to ourselves as organisms. To understand how the ‘cognitive environment’ is changing, we really need a longer-term evolutionary perspective on our environment as a species.
For example, the ‘degeneration’ narrative about social relations may make assumptions about what our social relations look like as a species, over evolutionary time. That is, some institutions that we might identify as ‘long-standing’ might actually be relatively recent innovations, and their disappearance might be less of a crisis than the removal of a perturbation to the patterned development of our brains. For example, over evolutionary time, our recent explosion of reproduction and heightened social density may be an aberration to a pattern of less extensive social networking; maybe kids spending a lot of time alone with objects is actually pretty normal if we take the long perspective.
It’s these questions I want to pick up in the next part of this post (well, that, and this technophobic discussion of the brain-rotting effects of Twitter: Twitter Nation Has Arrived: How Scared Should We Be?).
H/t: Chris Gilbey at Perceptric (and my neighbour here in Berry, NSW) pointed out the original article in the Guardian that started me down this particular path.
There’s an audio clip from an interview with Lady Greenfield available here.
Vaughan at Mind Hacks has a piece on a television debate between Greenfield and Aric Sigman, both concerned about the effects of Internet use on children, and Ben Goldacre, author of Bad Science: Think of the children, not the evidence. For more on Aric Sigman at Mind Hacks, see Facebook causes marble loss.
Goldacre writes about his televised encounter with Sigman on his blog: “Facebook causes cancer”.
Vaughan linked to the BBC Newsnight episode (24 Feb 2009) on YouTube, in which Goldacre debates with Sigman, who echoes many of Greenfield’s fears, and even trumps them, painting a picture of 5- and 6-year-olds spending half their waking hours on social networking sites (one caveat: apparently I have a Facebook page, but it was put up by a student and I don’t even know how to use it, so I can’t tell you firsthand if there are numerous 6-yr-olds out there trawling for ‘friends’).