They came up with equations to describe how the brain might, in theory, encode time indirectly. In their scheme, as sensory neurons fire in response to an unfolding event, the brain maps the temporal component of that activity to some intermediate representation of the experience — a Laplace transform, in mathematical terms. That representation allows the brain to preserve information about the event as a function of some variable it can encode (like a neuron’s decay rate) rather than as a function of time (which it can’t). The brain can then map the intermediate representation back into neural activity that recovers the temporal structure of the experience — an inverse Laplace transform — reconstructing a compressed record of what happened when.
Just a few months after Howard and Shankar started to flesh out their theory, other scientists independently uncovered neurons, dubbed “time cells,” that were “as close as we can possibly get to having that explicit record of the past,” Howard said. These cells were each tuned to certain points in a span of time, with some firing, say, one second after a stimulus and others after five seconds, essentially bridging time gaps between experiences.
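The encode/decode scheme can be sketched numerically. Below is a minimal illustration, not the authors’ actual model: it assumes a bank of leaky integrators whose decay rates `s` jointly hold the running Laplace transform of the input, and it uses Post’s approximation formula for the inverse. The grid of `s` values, the step size `dt`, and the order `k` are all illustrative choices.

```python
import math
import numpy as np

def encode(signal, dt, s_values):
    """Bank of leaky integrators: after the loop, F[i] holds the Laplace
    transform of the input history at decay rate s_values[i]."""
    F = np.zeros_like(s_values)
    for f_t in signal:
        F = F * np.exp(-s_values * dt) + f_t * dt
    return F

def decode(F, s_values, k=8):
    """Approximate inverse Laplace transform via Post's formula:
    f(tau) ~ ((-1)^k / k!) * s^(k+1) * d^k F / ds^k, read out at tau = k/s.
    Each decoded unit behaves like a 'time cell' tuned to k/s seconds ago."""
    ds = s_values[1] - s_values[0]                    # grid must be evenly spaced
    deriv_k = np.diff(F, n=k) / ds**k                 # k-th finite difference
    s_mid = s_values[k // 2 : k // 2 + len(deriv_k)]  # center each estimate on the grid
    taus = k / s_mid                                  # the time-in-the-past each unit prefers
    activity = ((-1) ** k / math.factorial(k)) * s_mid ** (k + 1) * deriv_k
    return taus, activity

# A single impulse event 2 seconds in the past: the decoded activity should
# peak near tau = 2, i.e. the network "remembers" roughly when it happened.
dt = 0.001
t = np.arange(0.0, 3.0, dt)
signal = np.where(np.isclose(t, 1.0), 1.0 / dt, 0.0)  # unit impulse at t = 1 s
s_values = np.linspace(0.5, 20.0, 400)
taus, activity = decode(encode(signal, dt, s_values), s_values)
print(taus[np.argmax(activity)])  # peaks slightly before the true 2 s lag;
                                  # Post's formula is exact only as k -> infinity
```

The systematic early bias of the peak is the “compression” in the compressed record: the reconstruction blurs more, and skews more, the further back in time the event lies.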
The cost, ecologists think, is that ants trapped in bridges aren’t available for other tasks, like foraging. At any time on a march, a colony might be maintaining 40 to 50 bridges, with as few as one and as many as 50 ants per bridge. In a 2015 paper, Garnier and his colleagues calculated that as much as 20 percent of the colony can be locked into bridges at a time. At this point, a shorter route just isn’t worth the extra ants it would take to create a longer bridge.
Except, of course, individual ants have no idea how many of their colony-mates are holding fast over a gap. And this is where the second rule kicks in. As individual ants run the “bridging” algorithm, they have a sensitivity to being stampeded. When traffic over their backs is above a certain level, they hold in place, but when it dips below some threshold — perhaps because too many other ants are now occupied in bridge-building themselves — the ant unfreezes and rejoins the march.
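The two rules can be sketched as a toy feedback loop. This is a hypothetical illustration, not Garnier’s model: the function names, the per-marcher traffic contribution, and the threshold value are invented (tuned so the toy settles near the 20 percent figure from the 2015 paper), and the real ants presumably sense traffic in a far noisier, graded way.

```python
TRAFFIC_THRESHOLD = 40.0  # crossings per tick an ant tolerates; an assumed value

def wants_to_hold(traffic_over_back: float,
                  threshold: float = TRAFFIC_THRESHOLD) -> bool:
    """The hypothesized individual rule: hold in place while traffic over
    your back stays at or above threshold; unfreeze when it dips below."""
    return traffic_over_back >= threshold

def simulate(colony_size=200, traffic_per_marcher=0.25, steps=100):
    """Toy colony-level feedback: frozen ants can't march, so each ant that
    joins a bridge lowers the traffic felt by the rest. Bridge size settles
    where the two effects balance -- no ant ever counts its nest-mates."""
    frozen = 0
    for _ in range(steps):
        traffic = (colony_size - frozen) * traffic_per_marcher
        if wants_to_hold(traffic):
            frozen += 1   # one more ant freezes into a bridge
        elif frozen > 0:
            frozen -= 1   # an ant unfreezes and rejoins the march
    return frozen

print(simulate())  # → 40 of 200 ants (20%), echoing Garnier's figure
```

The point of the sketch is that the equilibrium emerges from purely local sensing: as bridges absorb more workers, traffic everywhere drops, which is exactly the signal that tells the marginal bridge ant to let go.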
In the seventeen years between 1992 and 2009, the Russian population declined by almost seven million people, or nearly 5 percent—a rate of loss unheard of in Europe since World War II. Moreover, much of this appears to be caused by rising mortality. By the mid-1990s, the average St. Petersburg man lived for seven fewer years than he did at the end of the Communist period; in Moscow, the dip was even greater, with death coming nearly eight years sooner.
In 2006 and 2007, Michelle Parsons, an anthropologist who teaches at Emory University and had lived in Russia during the height of the population decline in the early 1990s, set out to explore what she calls “the cultural context of the Russian mortality crisis.” Her method was a series of long unstructured interviews with average Muscovites—what amounted to immersing herself in a months-long conversation about what made life, for so many, no longer worth living. The explanation that Parsons believes she has found is in the title of her 2014 book, Dying Unneeded.
In a political season of dog whistles, we must be attentive to how talk of American freedom has long been connected to the presumed right of whites to dominate everyone else.
“Segregation now, segregation tomorrow, segregation forever!” Alabama governor George Wallace’s most famous sentence fired through the frigid air on the coldest day anyone in the state could remember. His 1963 inaugural address—written by a Klansman, no less—served as the war cry for the massive, violent response to the nonviolent civil rights movements of the 1960s. Wallace’s brand of right-wing populism would reconfigure U.S. party politics, making him, as his biographer put it, the “invisible founding father” of modern conservatism. As so many pundits have pointed out, when Donald Trump talks about “domination” today, he is talking the language and politics of Wallace.
Yet Wallace’s famous speech was less about segregation than it was about freedom—white freedom. Beyond its infamous applause line, the inaugural mentions “segregation” only once more. In contrast, it invokes “freedom” twenty-four times—more times than Martin Luther King, Jr., used the word during his “I Have a Dream” address the following summer at the 1963 March on Washington. Freedom is this nation’s ill-defined but reflexive ideological commitment. Winding through the heart of that complex political idea, however, is a dark and visceral current of freedom as the unrestrained capacity to dominate.
Tocqueville’s writings illuminate a deep paradox arising from modern forms of democracy—as is evident in common misconceptions of his critique of the tyranny of the majority. For Tocqueville, the real tyrant in democracy is not so much the group as the individual; or rather individualism as we know it—entitled, selfish, envious, consumerist, insatiable—which arises when certain conditions of collectivist populism are in place. The erosion of extended kinship structures, religion, and broader systems of ritual and meaning—which afford both a source of support and a sense of duty to others and to a project greater than oneself—is certainly partly to blame.
But Tocqueville also directs our attention to the most perverse level at which modern individuation operates: that of what becomes imaginable, desirable but ultimately unattainable in the democracy of the masses. You might call this the cognitive-affective dimension of democracy. Once a certain ideal of equality—however ill-defined as a normative goal—is in place, envy and upward social comparison become the norm. Since anyone can become more of anything or anyone at any time, something akin to entropy increases. In affective terms, social and psychological entropy become something we now call anxiety.
There’s an interesting paradox in Language Unlimited. You write that language is endlessly creative but also our cognition is constrained by the structure of language. What does that paradox say about human beings?
You’re right. There is a wee paradox in there. That’s a nice thing to pick up on. I think about it like even numbers. There’s an unlimited number of even numbers but obviously they’re limited, right? Because 3s and 7s aren’t in there. Language is like that. There’s an unlimited number of possible things we can say, of sentence structures, but not anything can be a sentence structure.
So you’re absolutely right. Language is unlimited, but it’s unlimited in a limited way.
What does that say about us as human beings? That’s a gigantic and fascinating question. It probably means our cognition is limited. There may be things that we can never solve because we don’t have the cognitive structures that will help to solve them.
These lamentations continued unabated throughout 2017. Just two weeks ago, Facebook said it would no longer flag phony links with red-box warnings, since pointing to a lie only makes it stronger. The truth, this move implied, does more harm than good.
But there’s a problem with these stories about the end of facts. In the past few years, social scientists armed with better research methods have been revisiting some classic work on the science of post-truth. Based on their results, the most surprising and important revelations from this research—the real lol-nothing-matters stuff—now seem overstated. It may be that the internet does not divide us, that facts don’t make us dumber than we were before, and that debunking doesn’t really lead to further bunk.
It may be time, in fact, that we gave up on the truth-y notion that we’re living in a post-truth age. It may even be time that we debunked the whole idea.
If traumatic brain injuries can impact the parts of the brain responsible for personality, judgment, and impulse control, maybe injury should be a mitigating factor in criminal trials — but one neuroscientist discovers that assigning crime a biological basis creates more issues than it solves.
“In our study, socializing was just as effective as more traditional kinds of mental exercise in boosting memory and intellectual performance,” said Oscar Ybarra, a psychologist at the U-M Institute for Social Research (ISR) and a lead author of the study with ISR psychologist Eugene Burnstein and psychologist Piotr Winkielman from the University of California, San Diego.
The lack of true personal interaction limits the brain’s opportunities to make better connections. It can also lead to loneliness and depression — mental conditions that contribute significantly to reduced brain health.