On Tuesday at the British Royal Society of Music, the New London Chamber Choir will publicly perform a new choral piece with the lilting but jargony name "Allele." The genetic allusion isn't a superficial conceit: it is genuinely genomic music. Each of the 40 members of the chorus will be singing a score based on part of his or her own DNA.
The project began with geneticist Andrew Morley and the Wellcome Trust's "Music from the Genome" project, which had sequenced the DNA of 40 gifted singers to learn whether they had any distinctive genetic commonalities that might be indicative of musical ability. The findings of that genomic study have not yet been published. In the interim, however, Morley—who had sung with choirs in his youth, according to the BBC—decided to use the genomic sequences as the raw material for an artistic work.
He turned the data over to composer Michael Zev Gordon, who first translated the strings of nucleotides into notes, then rendered them musical through his selection and rhythmic arrangement of them. The poet Ruth Padel provided the lyrics for the singers. As Pallab Ghosh of the BBC writes:
To begin with, there is a single voice singing a simple rhythmic phrase; but as the piece develops, more voices join in - conveying the biological idea of replication and reproduction. At its climax, each member of the choir is singing their own unique genetic code - resulting in everyone singing a subtly different song.
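The basic translation step, nucleotides to notes, can be sketched in a few lines. The pitch assignments below are invented purely for illustration; Gordon's actual mapping, and his selection and rhythmic treatment, were far more considered.

```python
# A minimal nucleotide-to-pitch mapping (assumed for illustration only;
# not Gordon's actual scheme). T has no letter-name note, so we pick E.
NOTE_FOR = {"A": "A4", "C": "C4", "G": "G4", "T": "E4"}

def sequence_to_notes(seq):
    """Turn a DNA string into a list of note names, skipping unknown bases."""
    return [NOTE_FOR[base] for base in seq.upper() if base in NOTE_FOR]

print(sequence_to_notes("GATTACA"))
# ['G4', 'A4', 'E4', 'E4', 'A4', 'C4', 'A4']
```

Even this crude mapping shows why every chorister's part comes out "subtly different": any variation between two singers' sequences becomes a different run of pitches.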
Morley and Gordon seem not to be the first to think of translating genome sequences into music. Indeed, some artist-scientists have attempted the maybe even more intriguing trick of turning music into DNA and inserting it into living cells.
Research fellow Gil Alterovitz at M.I.T. and Harvard Medical School has developed a computer program that translates information about cells' gene and protein expression into musical sequences. His purpose is scientific rather than aesthetic, however. Because our brains are particularly adept at picking up patterns in the sounds we hear, Alterovitz hopes that his system could help researchers identify subtle derangements in the synchrony of gene expression that might underlie disease states.
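The general idea of expression-level sonification might be sketched as follows; the pentatonic scale and the mapping rule here are my assumptions for illustration, not Alterovitz's actual program.

```python
# Hypothetical expression-to-pitch sonification sketch (not Alterovitz's
# actual software): each gene's normalized expression level selects a note
# from a pentatonic scale, so shifts in expression audibly shift the melody.
SCALE = ["C4", "D4", "E4", "G4", "A4"]  # assumed pentatonic scale

def sonify(expression):
    """Map {gene: expression level in [0, 1]} onto note names."""
    notes = {}
    for gene, level in expression.items():
        idx = min(int(level * len(SCALE)), len(SCALE) - 1)
        notes[gene] = SCALE[idx]
    return notes

print(sonify({"TP53": 0.1, "MYC": 0.95}))
# {'TP53': 'C4', 'MYC': 'A4'}
```

The hoped-for payoff is that a listener would notice when one "instrument" in this ensemble drifts out of step with the others, even if the same anomaly is hard to spot in a spreadsheet.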
Among those artist-scientists: since the 1990s, musician Peter Gena has been developing compositions based on DNA, with the assistance of geneticist Charles Strom. For the multimedia installation Genesis, they worked with Eduardo Kac, who created a synthetic gene sequence based on a Morse code translation of a line from the eponymous chapter of the Bible: "Let man have dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moves upon the earth." The DNA representing that sentence was inserted into bacteria and grown in a petri dish; fluorescent proteins helped to show how the plasmids holding the synthetic DNA spread throughout the cell population, moved horizontally into other cell lines and mutated over time.
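Kac's conversion has been described as substituting T for a dash, C for a dot, G for the gap between letters and A for the gap between words. A sketch of that scheme (with a Morse table trimmed to the demo letters) might look like this:

```python
# Morse-to-DNA substitution in the spirit of Kac's Genesis piece:
# dash -> T, dot -> C, letter gaps -> G, word gaps -> A.
# (The Morse table is trimmed to just the letters used below.)
MORSE = {"L": ".-..", "E": ".", "T": "-", "G": "--.", "O": "---", "D": "-.."}

def morse_to_dna(text):
    words = []
    for word in text.upper().split():
        letters = [MORSE[ch].replace(".", "C").replace("-", "T") for ch in word]
        words.append("G".join(letters))   # letter gaps become G
    return "A".join(words)                # word gaps become A

print(morse_to_dna("LET GOD"))
# CTCCGCGTATTCGTTTGTCC
```

Because the output uses only A, C, G and T, the string can be synthesized directly as DNA, which is what made the bacterial half of the installation possible.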
In the past decade, fungal microbiologist Aurora Sánchez Sousa of Madrid’s Ramón y Cajal Hospital and Richard Krull have similarly used genetic sequences as the basis for musical compositions, which were then elaborated further according to Krull's inspirations. The results of their labors are available on the Genoma Music site.
Back in 1998, musician Susan Alexander of Sacramento worked with biologist David Deamer to make music out of the molecular vibrations of DNA in response to illumination by various wavelengths of light.
And currently, poet Christian Bök is working on writing poetry that can be translated into DNA and then inserted into bacteria as a working gene (not just junk DNA filler). Just to add an additional level of difficulty to the project, though, he also wants the amino acid sequence in the protein made by this gene to itself be comprehensible as a poem different from the one in the DNA. Yikes.
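The double constraint can be made concrete with a toy example. The two ciphers below are invented for illustration; Bök's actual mappings are vastly more constrained, since every letter pairing has to work across an entire poem.

```python
# Toy two-layer encoding in the spirit of Bök's project (invented ciphers,
# not his actual scheme). One cipher writes a poem into codons; a second
# cipher reads a different poem out of the translated protein.
CODON_TO_AA = {"GCA": "A", "TGG": "W"}   # standard genetic code: Ala, Trp
ENCODE = {"N": "GCA", "O": "TGG"}        # cipher 1: poem letter -> codon
DECODE = {"A": "H", "W": "I"}            # cipher 2: amino acid -> poem letter

def to_dna(poem):
    return "".join(ENCODE[ch] for ch in poem)

def translate(dna):
    """Read the DNA three bases at a time, as a ribosome would."""
    return "".join(CODON_TO_AA[dna[i:i+3]] for i in range(0, len(dna), 3))

dna = to_dna("NO")                                        # DNA spells poem one
poem_two = "".join(DECODE[aa] for aa in translate(dna))   # protein spells poem two
print(dna, poem_two)
# GCATGG HI
```

Here the gene reads "NO" under one cipher while its protein reads "HI" under the other; scaling that trick up to two full, mutually readable poems is what makes the project so fiendish.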
By the way, the bacterium into which Bök wants to insert his multilayered genetic composition? It's the extremophile Deinococcus radiodurans, well known for its near invulnerability to radiation damage. D. radiodurans is a good choice because the DNA-repair properties that give it high radioresistance also make it extremely resistant to mutation—which means that if Bök succeeds, he won't have to worry as much about mutation reducing his genomic art to gibberish.
Of course, that will also mean that Bök's creation should be able to survive for a long, long time. That's one way to achieve artistic immortality.
A personal request: Perhaps one of you readers can help me with a request that no amount of racking my memory or searching online has yet solved. While writing this article, I was reminded of an anthologized science fiction story that I read, oh, probably 35 years ago. It concerned an inventive genius and music lover who feared that when civilization collapsed (as it surely would), all the beautiful musical works of genius would be lost forever, because recordings would perish and musical notation would be inscrutable. He therefore invented a machine that, when it received musical input, produced living creatures; his idea was that these creatures would survive in the wild and somehow preserve the music until such time as someone invented a similar machine that could change the beasties back into music again. The creatures in question all had traits that somewhat reflected the qualities of the music or its composers: I remember that the "wagner beasts" were threatening and wolflike. As you might imagine, this scheme did not work out well.
Can anyone identify the name and writer of this story? Not remembering it has bothered me for eons. Thanks.
Who sounds stronger: James Earl Jones or Mike Tyson? Who do you think would win in a fight? Most of us might rather have Darth Vader narrate a documentary, but the smart money says that in an actual knockdown drag-out, bet on the guy who bit off Evander Holyfield's ear. But apparently not all evolutionary psychologists would agree.
A new paper published in the Proceedings of the Royal Society by Aaron Sell, Leda Cosmides, John Tooby and their colleagues argues that we humans have evolved a faculty for assessing people's physical strength—and indirectly, their fighting prowess—just from the sounds of their voice. The idea sounds provocative and has a certain prima facie credibility, but unfortunately, my assessment is that this amounts to just one more in the long line of speciously reasoned papers that gives evolutionary psychology a dubious reputation. Here's why.
First, the starting premise of the study is that because natural selection would have favored strong fighting abilities among our ancestors, any individuals who could assess others' fighting prowess quickly would have had a big advantage: depending on their inclinations, they could then either hide from the strong, pound on the weak or mate with the muy macho. For most of us still recovering from Post Traumatic Playground Disorder, this may make intuitive sense. And the authors cite a variety of papers representing "[m]ultiple converging lines of evidence," all of which might support the assumption (though their support still sounds circumstantial and debatable to me). Nevertheless, it is still just an assumption, and one of which we should be suspicious precisely because it does conform so easily to our current social experience.
Sure, if one's view of early human evolution is dominated by "nature red in tooth and claw" concepts of competition, in which Fred Flintstone has to wrestle a sabertooth and punch out Barney Rubble before dragging Wilma away by the hair, fighting ability would unquestionably be a plus. But combativeness also has its downsides. People who are good at fighting may be more likely to be in fights, which means that they may be killed or disabled sooner on average than those with more moderate skills and aggression. If early humans hunted in groups with weapons, the best hunters might not actually be the strongest: they might do better by virtue of their leadership, their strategies, their ability to maintain peace in the tribe. Not to mention that the women might be looking for something other than fighting prowess in a mate.
None of these points disqualifies physical strength or fighting ability as an important selective advantage. But they do mean that one can't simply say that strength and fighting ability automatically constitute important criteria for our ancestors' success.
Next, the researchers focus on upper body strength as a proxy for fighting ability. They offer reasons why the two correlate reasonably well, and I don't have any objection to using upper body strength that way. Bear in mind, however, that this strength is only a proxy: any statement the researchers ultimately make about the relation of the voice to fighting ability is no stronger than that connection.
So then the researchers asked test subjects to rank unseen speakers by perceived strength, and found that the subjects were indeed adept at doing so, even when controlling for height and weight, and even when the voices spoke foreign languages. In a further apparent corroboration of their hypothesis, the researchers also found that people were better at assessing the strength of males' voices than females'—because, of course, it would have been so much less important to assess women's physical strength or combativeness. (Of course, if there had been no difference in the faculties for judging men's and women's strengths from the sound of their voices, that might have only established just how fundamental and important this capability must be. So....)
Yet, again, what do these results truly show? At best, they establish only that people's voices provide good cues to how others perceive their physical strength. The leap from there to a just-so story about natural selection favoring the ability to make this discrimination is still a big one. People's voices may correlate well with their perceived strength, but people's voices also correlate well with lots of characteristics that may in turn correlate (causally or otherwise) with fighting prowess. For example, perhaps our perceptions of others' strength also correspond to our sense of their self-confidence or their social standing. Those may or may not be driven in part by physical strength; one therefore doesn't know for certain which of many such traits might have figured into natural selection (if any did).
I might be inclined to give this study more benefit of the doubt if it did not fall so squarely in line with many other evolutionary psychology papers—including ones from Leda Cosmides and John Tooby—that gravitate back to support for a view of human origins that heavily emphasizes the dominance of violent conflict, aggressive hunter males, more passive females and other clichés. Almost every other exploration of evolutionary biology seems to turn up surprises hidden from us in part by our cultural blinders; and yet human evolutionary psychology usually seems to shore up the cultural status quo. Doesn't that seem the least bit suspicious?
I'll repeat something I've said before: I completely want to believe in evolutionary psychology. I fully believe that someday we will have a strong, sound science of evolutionary psychology that offers real insights into human behavior. I support and encourage researchers' efforts today to start making that science a reality. But unfortunately, right now, I don't think we know enough in detail about either human psychology or human evolution to produce very persuasive theories in this area.
Oh, the allure of a misleading headline. When I saw this story on BBC news, “Mammoths had 'anti-freeze blood', gene study finds,” I thought for a moment that scientists had discovered those Ice Age behemoths had secreted some glycerol-like compound into their tissues to prevent freezing. Such a finding would have been genuinely astounding because, as far as I know, that trick is seen only among certain polar fish and overwintering insects; it would have been a first among mammals. In reality, the surprise is not that the mammoths’ blood resisted freezing but rather that it cleverly continued to do what blood vitally must do: transport oxygen to needy tissues, even at low temperatures. Kevin L. Campbell of the University of Manitoba and his colleagues reached this conclusion through a nifty piece of paleogenomic molecular biology, as they reported in Nature Genetics. Their technique’s fascinating potential to help biologists learn about the physiologies of extinct creatures has already drawn considerable attention, but the mechanism of the hardiness of the mammoths’ blood also helps to highlight a common way in which evolution innovates.
Mammoths display obvious features that must have helped them stay warm in the brutal subzero temperatures of the Pleistocene ice ages, such as long, shaggy coats and small ears. They may well have had less obvious ones, too, like the arrangement of blood vessels in the legs of caribou that allows countercurrent heat exchange to minimize the loss of body heat while they stand in snow. Nevertheless, Campbell wondered whether the mammoths’ blood might have been adapted as well, because hemoglobin releases oxygen into tissues only sluggishly at low temperatures.
By extracting the hemoglobin gene from DNA in well-preserved mammoth remains and inserting it into bacteria, Campbell and Alan Cooper of the University of Adelaide were able to replicate samples of the mammoth’s hemoglobin. And sure enough, in subsequent tests, the resurrected hemoglobin proved to release oxygen much more consistently across a wide range of temperatures—even glacially low ones.
Perhaps it sounds surprising that something so fundamental to mammalian physiology as hemoglobin chemistry would be subject to evolutionary revision. Surely the mammoths might have survived the cold just as well by evolving more hair or thicker insulation. Yet hemoglobin chemistry is actually a feature particularly well suited to modification—one that has been modified many times throughout evolutionary history. The key is that the genes making the globin proteins have lent themselves to frequent duplication, which opens up the opportunity for variation among the copies and specialization in their activities.
Humans, for example, have several different types of globin genes (designated alpha, beta, gamma and so on) on chromosomes 11 and 16 that may be used in various combinations to manufacture variant forms of the tetrameric (four-chain) oxygen-transport protein in red blood cells. For our first 12 weeks or so in utero, our bodies make embryonic hemoglobin, then switch to fetal hemoglobin, which can remain a major component of babies’ blood for six months after birth. Mammalian fetuses need hemoglobin that takes up oxygen very avidly because they must steal it away from their mothers’ blood as it circulates through the placenta (see graph). That is why fetal hemoglobin does not respond to 2,3-bisphosphoglycerate, a chemical that reduces the oxygen affinity of adult hemoglobin.
Fetal hemoglobin has actually been the salvation of many adults who suffer from sickle-cell anemia. These people make a defective form of adult hemoglobin that distorts their red blood cells into the elongated, sickle shape that gives their condition its name; these sickle cells can clump together and block blood vessels, leading to painful and frequently fatal complications. A treatment for sickle-cell anemia, however, is to give a patient hydroxyurea and recombinant erythropoietin, which stimulate the body to begin making fetal hemoglobin again, alleviating some of the problems.
Of course, sickle-cell anemia is itself the result of a mutant variation in the copies of the genes that make the beta subunit of hemoglobin. As is well known today, the genetic trait for sickle cell seems to have originally taken root in populations in sub-Saharan Africa and other parts of the world where mosquito-borne malaria is endemic: people who are carriers of the trait are somewhat more resistant to malaria infection. The prevalence of sickle-cell genes in those populations therefore may represent an adaptive, evolved response—albeit a cruel one—to the harsh burden of malaria. Indeed, the genetic blood disorders called thalassemias, which are prevalent in many Mediterranean ethnic groups, may similarly represent the result of natural selection for improved survival against malaria (although unlike sickle-cell anemia, which involves an abnormal hemoglobin, thalassemias are caused by underproduction of a normal hemoglobin).
Looking outside the human species, one finds that oxygen-carrying blood pigments have a staggeringly wide range of forms and chemistries, representing an incredible amount of evolutionary experimentation that long preceded our species’ own tidy system. Horseshoe crabs aren’t aristocrats, but they are literally blue bloods: their pigment carries copper instead of iron. Sea squirts have a green blood pigment based on vanadium.
Many animals (including humans), in addition to some form of hemoglobin, also have myoglobin, a single-chain globular protein that helps working muscle tissue hold onto the oxygen it needs. (Myoglobin is the reason that meat is red, my fellow steak lovers.) Some dopaminergic neurons and glia in the brain also seem to contain hemoglobin, possibly to ensure the brain’s own access to oxygen under suffocating conditions. And hemoglobin itself also seems to have functions quite apart from oxygen transport: its antioxidant properties and interactions with iron seem to be used by cells in the kidneys and immune system.
All this variation suggests that hemoglobin might not have been such an odd target for natural selection in the evolution of mammoths: it is an extremely malleable protein with diverse capabilities, and because organisms often carry multiple copies of its genes, variants can creep into a population and survive long enough to be tested by selection.
In fact, another example not unlike the mammoths was reported in the scientific literature last year. Jay Storz of the University of Nebraska-Lincoln and an international team of collaborators reported last August in the Proceedings of the National Academy of Sciences on their genetic comparison of deer mice living in lowland prairies and in the mountains. The mice were identical in every respect except for just four genes—all of which boosted the oxygen capacity of the mountain mice’s hemoglobin.
Mammoths and deer mice may sit at opposite ends of the spectrum for mammalian size, but similar principles of evolution apply at all scales.
As they reported in the Proceedings of the National Academy of Sciences, Ya-Ping Zhang of the Kunming Institute of Zoology and colleagues have found evidence that mitochondrial genes in bats, which are responsible for their metabolic use of energy, have been under heavy selection pressure since early in their evolutionary history. Such a conclusion makes considerable sense on its face because bats’ way of life is highly energy intensive.
Nevertheless, the finding also helps to fill in some of the mysteries of the evolution of bats. Bats are hard to study in the fossil record because they don’t fossilize well: they appear suddenly in Eocene strata from about 50 million years ago looking much as they do now, leaving many of the steps up to that point obscure.
Over the past decade, several discoveries have helped to firm up the view that the ancestors of bats spent their lives leaping and hopping through vegetation before they could fly; that the evolution of their wings seemed associated with key changes in certain genes for digits; that their ability to echolocate seems to have emerged after their capacity for flight; that the larger bats called Megachiroptera are indeed descendants of the smaller Microchiroptera and not the convergently evolved cousins of primates. The place of the metabolic changes in that scheme, though, was largely speculative. The demands on bats’ mitochondria might have been expected to emerge only once they started flying, but instead it seems that their ancestors already made a living as an exceptionally hyperkinetic group of insectivores leaping through the trees and shrubs.
All that energy expenditure comes at some cost to the bats, however. Most obviously, they need to eat colossal numbers of insects to satisfy their energy needs. During the cold winter months when insects become more sparse, bat species that don’t migrate to warmer climes must overwinter in caves, suspended in a state of torpor with their metabolisms at a crawl.
Unexpectedly, those energy demands may also figure into the catastrophic epidemic of white nose syndrome that has been killing bats in North America in recent years. The bodies of the dead bats studied by researchers have commonly been marked by heavy accumulations of the fungus, which in severe cases can damage the overwintering animals' skin and wings, leaving them debilitated. Yet the observed level of infection and the timing of the bats' deaths suggest that this obvious explanation isn't adequate: in many cases, the fungus should have been only about enough to pose an itchy nuisance, maybe akin to an aggressive case of athlete’s foot in humans.
That benign comparison may break down when applied to bats in torpor, however. Bats going into hibernation carry stockpiles of fat to sustain them, but those stockpiles are often just sufficient to get them through the winter season. Moreover, the bats do not consume those fat reserves at a consistently slow pace: every two or three weeks throughout their hibernation, the bats wake out of torpor, excrete, groom themselves, settle into genuine sleep for a while, then slow down their metabolisms again. These periods of activity seem to be essential to hibernating mammals, perhaps because they help to reinvigorate their immune systems. Eighty percent or more of the energy that the bats consume during their hibernation is spent during these brief intervals.
One theory therefore being investigated by researchers such as Craig Willis at the University of Winnipeg is that the fungus makes the animals rouse from torpor more often to groom themselves, and in the process makes them burn through their crucial fat reserves too quickly. Long before winter is over and insects are plentiful again, the bats starve.
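The arithmetic behind that theory is easy to sketch with made-up numbers. Everything below is an assumption for illustration, not a measured figure, but it shows how front-loading most of the winter budget into arousals makes the budget fragile.

```python
# Toy hibernation energy budget (all numbers invented for illustration).
# If most winter energy goes to brief arousals, extra fungus-driven
# arousals can overdraw a bat's fat reserve before spring.
def winter_energy(arousals, torpor_days=150, torpor_cost=0.1, arousal_cost=8.0):
    """Total energy spent over the winter, in arbitrary 'fat units'."""
    return torpor_days * torpor_cost + arousals * arousal_cost

reserve = 80.0                       # assumed fat stockpile, same units
normal = winter_energy(arousals=7)   # roughly one arousal every three weeks
sick = winter_energy(arousals=14)    # fungus doubles the arousal count
print(normal <= reserve, sick <= reserve)
# True False
```

In this sketch the arousals account for roughly 80 percent of the healthy bat's budget, matching the figure above, so merely doubling their frequency more than exhausts a reserve that would otherwise have lasted until spring.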
North American bat species may be more vulnerable to white nose fungus than their European counterparts are because they roost in far larger colonies and so may transmit the fungus more widely throughout their population. It’s entirely possible, however, that the smaller European colonies are in fact an evolutionary consequence of white nose fungus (or some equivalent affliction) selecting against larger gatherings in the distant past.
This is only my own speculation, but if a highly amped metabolism has long been an adaptive feature of bats and their immediate ancestors, then perhaps for as long as these creatures have been using torpor to survive the cold months, something like white nose fungus has been cropping back colony sizes periodically. Unfortunately, given the difficulty of finding good bat fossils, verifying such a hypothesis may never really be feasible. But who’s to say what other deductions might yet be possible from the DNA record?
(For the information about white nose syndrome and its possible mechanisms, I'm indebted to science writer Erik Ortlip, now of Environmental Health News, who wrote about this subject while he was my student in the Science, Health and Environmental Reporting Program at the Arthur L. Carter Journalism Institute of New York University. Thanks, Erik!)
Two heads are not necessarily better than one: A double-headed bobtail lizard born in Perth, Australia, has trouble crawling because its hind legs get conflicting signals from each brain. Also, the bigger head keeps trying to eat the smaller one.
Update: By the way... "Bicephalous"? "Dicephalous"? Sources I've consulted seem to accept both, and if anything accept the former as more common, but the latter seems more consistent in its use of a Greek prefix. Thoughts, my little wordbugs?