Lost genes may help explain how vampire bats survive on blood alone

Surviving on blood alone is no picnic. But a handful of genetic tweaks may have helped vampire bats evolve to become the only mammal known to feed exclusively on the stuff.

These bats have developed a range of physiological and behavioral strategies to exist on a blood-only diet. The genetic picture behind this sanguivorous behavior, however, is still blurry. But 13 genes that the bats appear to have lost over time could underpin some of the behavior, researchers report March 25 in Science Advances.

“Sometimes losing genes in evolutionary time frames can actually be adaptive or beneficial,” says Michael Hiller, a genomicist now at the Senckenberg Society for Nature Research in Frankfurt.

Hiller and his colleagues pieced together the genetic instruction book of the common vampire bat (Desmodus rotundus) and compared it with the genomes of 26 other bat species, including six from the same family as vampire bats. The team then searched for genes in D. rotundus that had either been lost entirely or inactivated through mutations.

Of the 13 missing genes, three had been previously reported in vampire bats. These genes are associated with sweet and bitter taste receptors in other animals, meaning vampire bats probably have a diminished sense of taste — all the better for drinking blood. The other 10 lost genes are newly identified in the bats, and the researchers propose several ideas about how the absence of these genes could support a blood-rich diet.

Some of the genes help to raise levels of insulin in the body and convert ingested sugar into a form that can be stored. Given the low sugar content of blood, this processing and storage system may be less active in vampire bats and the genes probably aren’t that useful anymore. Another gene is linked in other mammals to gastric acid production, which helps break down solid food. That gene may have been lost as the vampire bat stomach evolved to mostly store and absorb fluid.

One of the other lost genes inhibits the uptake of iron in gastrointestinal cells. Blood is low in calories yet rich in iron. Vampire bats must drink up to 1.4 times their own weight during each feed, and, in doing so, ingest a potentially harmful amount of iron. Gastrointestinal cells are regularly shed in the vampire bat gut, so by losing that gene, the bats may absorb huge amounts of iron into those short-lived cells and then excrete it as the cells slough off, avoiding an overload — an idea supported by previous research.
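For a rough sense of scale, here is an illustrative back-of-envelope calculation. The bat mass, blood density and blood iron content below are typical textbook values, not figures from the study:

```python
# Illustrative estimate of iron ingested in one blood meal.
# Assumed values (NOT from the study): a common vampire bat of
# roughly 35 g; mammalian blood at ~1.05 g/mL carrying roughly
# 0.5 mg of iron per mL (mostly bound in hemoglobin).
bat_mass_g = 35.0
meal_g = 1.4 * bat_mass_g      # up to 1.4 times body weight per feed
meal_ml = meal_g / 1.05        # grams of blood -> milliliters
iron_mg = meal_ml * 0.5        # ~0.5 mg iron per mL of blood

print(f"Blood per feed: about {meal_g:.0f} g ({meal_ml:.0f} mL)")
print(f"Iron per feed: about {iron_mg:.0f} mg")
```

Under these assumptions a single feed delivers on the order of 20-odd milligrams of iron, roughly an adult human's entire recommended daily intake, packed into an animal a thousandth the size. That is why an efficient route for dumping excess iron matters.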

One lost gene could even be linked to vampire bats’ remarkable cognitive abilities, the researchers suggest. Because the bats are susceptible to starvation, they share regurgitated blood and are more likely to do so with bats that have previously donated blood to them (SN: 11/19/15). Vampire bats also form long-term bonds and even feed with their friends in the wild (SN: 10/31/19; SN: 9/23/21). In other animals, this gene is involved in breaking down a compound produced by nerve cells that is linked to learning and memory — traits thought to be necessary for the vampire bats’ social abilities.

“I think there are some compelling hypotheses there,” says David Liberles, an evolutionary genomicist at Temple University in Philadelphia who wasn’t involved in the study. It would be interesting to see if these genes were also lost in the other two species of vampire bats, he says, as they feed more on the blood of birds, while D. rotundus prefers to imbibe from mammals.

Whether the diet caused these changes, or vice versa, isn’t known. Either way, it was probably a gradual process over millions of years, Hiller says. “Maybe they started drinking more and more blood, and then you have time to better adapt to this very challenging diet.”

How a virus turns caterpillars into zombies doomed to climb to their deaths

Higher and higher still, the cotton bollworm moth caterpillar climbs, its tiny body ceaselessly scaling leaf after leaf. Reaching the top of a plant, it will die, facilitating the spread of the virus that steered the insect there.

One virus behind this deadly ascent manipulates genes associated with caterpillars’ vision. As a result, the insects are more attracted to sunlight than usual, researchers report online March 8 in Molecular Ecology.

The virus involved in this caterpillar takeover is a type of baculovirus. These viruses may have been evolving with their insect hosts for 200 million to 300 million years, says Xiaoxia Liu, an entomologist at China Agricultural University in Beijing. Baculoviruses can infect more than 800 insect species, mostly the caterpillars of moths and butterflies. Once infected, the hosts exhibit “tree-top disease,” compelled to climb before dying and leaving their elevated, infected cadavers for scavengers to feast upon.

The clever trick of these viruses has been known for more than a century, Liu says. But how they turn caterpillars into zombies doomed to ascend to their own deaths wasn’t understood.

Previous research suggested that infected caterpillars exhibit greater “phototaxis,” meaning they are more attracted to light than uninfected insects. Liu and her team confirmed this effect in the laboratory using cotton bollworm moth caterpillars (Helicoverpa armigera) infected with a baculovirus called HearNPV.

The researchers compared infected and uninfected caterpillars’ positions in glass tubes surrounding a climbing mesh under an LED light. Uninfected caterpillars would wander up and down the mesh, but would return to the bottom before pupating. That behavior makes sense because in the wild, this species develops into adults underground. But infected hosts would end up dead at the top of the mesh. The higher the source of light, the higher infected hosts climbed.

The team moved to the horizontal plane to confirm that the hosts were responding to light rather than gravity, placing caterpillars in a hexagonal box with one of the side panels illuminated. By the second day after infection, host caterpillars crawled to the light about four times as often as the uninfected.

When the researchers surgically removed infected caterpillars’ eyes and put the insects in the box, the blinded insects were attracted to the light a quarter as often as unaltered infected hosts. That suggested that the virus was using a caterpillar’s vision against itself.

The team then compared how active certain genes were in various body parts of infected and uninfected larvae. Two genes for opsins, the light-sensitive proteins fundamental to vision, were detected mostly in the eyes and became more active after infection with the virus, as did another vision-associated gene called TRPL. It encodes a channel in cell membranes involved in converting light into electrical signals.

When the team used the gene-editing tool CRISPR/Cas9 to shut off the opsin genes and TRPL in infected caterpillars, the number of hosts attracted to the light in the box was cut roughly in half. Their height at death on the mesh was also reduced.

Baculoviruses appear capable of commandeering the genetic architecture of caterpillar vision, exploiting light’s ancient importance to insects, Liu says.

Light can cue crucial biological processes in insects, from directing their developmental timing to setting their migration routes.

These viruses were already known to be master manipulators in other ways, tweaking their hosts’ sense of smell, molting patterns and the programmed death of cells, says Lorena Passarelli, a virologist at Kansas State University in Manhattan, who was not involved with the study. The new research shows that the viruses manipulate “yet another physiological host process: visual perception.”

There’s still a lot to learn about this visual hijacking, Passarelli says. It’s unknown, for instance, which of the virus’s genes are responsible for turning caterpillars into sunlight-chasing zombies in the first place.

How scientists found an African bat lost to science for 40 years

Julius Nziza still remembers the moment vividly. Just before dawn on a chilly January morning in 2019, he and his team gently extracted a tiny brown bat from a net purposely strung to catch the nocturnal fliers. A moment later, the researchers’ whoops and hollers pierced the heavy mist blanketing Rwanda’s Nyungwe National Park. The team had just laid eyes on a Hill’s horseshoe bat (Rhinolophus hilli), which scientists hadn’t seen for nearly four decades.

Nziza, a wildlife veterinarian at Gorilla Doctors in Musanze, Rwanda, and a self-described “bat champion,” had been looking for the critically endangered R. hilli since 2013. For several years, Nziza and Paul Webala from Maasai Mara University in Narok, Kenya, with the help of Nyungwe park rangers, surveyed the forest for spots the bats might frequent. They didn’t find R. hilli, but the surveys helped them narrow down where to keep looking.

In 2019, the team decided to concentrate on roughly four square kilometers in a high-elevation region of the forest where R. hilli had last been spotted in 1981. Accompanied by an international team of researchers, Nziza and Webala set out on a 10-day expedition in search of the elusive bat. It wasn’t the rainy season yet, but the weather was already starting to turn. “It was very, very, very cold,” Nziza recalls.

Every night, from sunset until close to midnight, the researchers stretched nets across trails, where bats are most likely to fly, and kept watch. Then, after a few hours of rest, they woke early to check the traps again. It was cold enough that the bats could die if stuck too long.

At 4 a.m. on the fourth day, the researchers caught a bat with the distinctive horseshoe-shaped nose of all horseshoe bat species. But it looked slightly different from others they had captured. This one had darker fur and a pointed tip on its nose.

Everyone began shouting: “This is it!”

The researchers felt “almost 99 percent sure” they had found the lost bat. “We had a couple beers in the evening,” Nziza says. “It was worth celebration.” To be 100 percent sure, though, the team needed to compare its specimen with earlier specimens of R. hilli. Fortunately, there were two in museums in Europe.

That’s because this isn’t the first time R. hilli has been lost to science, then found. Victor van Cakenberghe, a retired taxonomist at the University of Antwerp in Belgium, rediscovered R. hilli in 1981, 17 years after it was first seen in 1964. He says he still remembers finding the bat tangled in a mist net strung across a river. He kept the specimen and brought it back to a Belgian museum.

Nearly 40 years later, Nziza and colleagues compared the measurements of their bat, which was released back into the wild, with those of the preserved specimen. At long last, it can be confidently said that R. hilli has been rediscovered again, researchers report March 11 in a preprint submitted to Biodiversity Data Journal.

And, for the first time ever, the scientists recorded R. hilli’s echolocation call. Now, the rangers can use acoustic detectors to keep an eye — or rather, an ear — on the bat (SN: 10/23/20). In nine months, they’ve already captured R. hilli calls from eight different locations in the same small area.

The team published its data to the open-access Global Biodiversity Information Facility in hopes of speeding up conservation efforts for the bat. Africa is home to over 20 percent of the world’s bats, but because research has long focused on species in Europe and the Americas, little is known about African bats.

“It’s a whole new thing,” Nziza says. “That’s why everybody’s excited.”

Wally Broecker divined how the climate could suddenly shift

It was the mid-1980s, at a meeting in Switzerland, when Wally Broecker’s ears perked up. Scientist Hans Oeschger was describing an ice core drilled at a military radar station in southern Greenland. Layer by layer, the 2-kilometer-long core revealed what the climate there was like thousands of years ago. Climate shifts, inferred from the amounts of carbon dioxide and of a form of oxygen in the core, played out surprisingly quickly — within just a few decades. It seemed almost too fast to be true.

Broecker returned home, to Columbia University’s Lamont-Doherty Earth Observatory, and began wondering what could cause such dramatic shifts. Some of Oeschger’s data turned out to be incorrect, but the seed they planted in Broecker’s mind flowered — and ultimately changed the way scientists think about past and future climate.

A geochemist who studied the oceans, Broecker proposed that the shutdown of a major ocean circulation pattern, which he named the great ocean conveyor, could cause the North Atlantic climate to change abruptly. In the past, he argued, melting ice sheets released huge pulses of water into the North Atlantic, turning the water fresher and halting circulation patterns that rely on salty water. The result: a sudden atmospheric cooling that plunged the region, including Greenland, into a big chill. (In the 2004 movie The Day After Tomorrow, an overly dramatized oceanic shutdown coats the Statue of Liberty in ice.)

It was a leap of insight unprecedented for the time, when most researchers had yet to accept that climate could shift abruptly, much less ponder what might cause such shifts.

Broecker not only explained the changes seen in the Greenland ice core, he also went on to found a new field. He prodded, cajoled and brought together other scientists to study the entire climate system and how it could shift on a dime. “He was a really big thinker,” says Dorothy Peteet, a paleoclimatologist at NASA’s Goddard Institute for Space Studies in New York City who worked with Broecker for decades. “It was just his genuine curiosity about how the world worked.”

Broecker was born in 1931 into a fundamentalist family who believed the Earth was 6,000 years old, so he was not an obvious candidate to become a pathbreaking geoscientist. Because of his dyslexia, he relied on conversations and visual aids to soak up information. Throughout his life, he did not use computers, a linchpin of modern science, yet became an expert in radiocarbon dating. And, contrary to the siloing common in the sciences, he worked expansively to understand the oceans, the atmosphere, the land, and thus the entire Earth system.

By the 1970s, scientists knew that humans were pouring excess carbon dioxide into the atmosphere, through burning fossil fuels and cutting down carbon-storing forests, and that those changes were tinkering with Earth’s natural thermostat. Scientists knew that climate had changed in the past; geologic evidence over billions of years revealed hot or dry, cold or wet periods. But many scientists focused on long-term climate changes, paced by shifts in the way Earth rotates on its axis and circles the sun — both of which change the amount of sunlight the planet receives. A highly influential 1976 paper referred to these orbital shifts as the “pacemaker of the ice ages.”

Ice cores from Antarctica and Greenland changed the game. In 1969, Willi Dansgaard of the University of Copenhagen and colleagues reported results from a Greenland ice core covering the last 100,000 years. They found large, rapid fluctuations in oxygen-18 that suggested wild temperature swings. Climate could oscillate quickly, it seemed — but it took another Greenland ice core and more than a decade before Broecker had the idea that the shutdown of the great ocean conveyor system could be to blame.

Broecker proposed that such a shutdown was responsible for a known cold snap that started around 12,900 years ago. As the Earth began to emerge from its orbitally influenced ice age, water melted off the northern ice sheets and washed into the North Atlantic. Ocean circulation halted, plunging Europe into a sudden chill, he said. The period, which lasted just over a millennium, is known as the Younger Dryas after an Arctic flower that thrived during the cold snap. It was the last hurrah of the last ice age.

Evidence that an ocean conveyor shutdown could cause dramatic climate shifts soon piled up in Broecker’s favor. For instance, Peteet found evidence of rapid Younger Dryas cooling in bogs near New York City — thus establishing that the cooling was not just a European phenomenon but also extended to the other side of the Atlantic. Changes were real, widespread and fast.

By the late 1980s and early ’90s, there was enough evidence supporting abrupt climate change that two major projects — one European, one American — began to drill a pair of fresh cores into the Greenland ice sheet. Richard Alley, a geoscientist at Penn State, remembers working through the layers and documenting small climatic changes over thousands of years. “Then we hit the end of the Younger Dryas and it was like falling off a cliff,” he says. It was “a huge change after many small changes,” he says. “Breathtaking.”

The new Greenland cores cemented scientific recognition of abrupt climate change. Though the shutdown of the ocean conveyor could not explain all abrupt climate changes that had ever occurred, it showed how a single physical mechanism could trigger major planet-wide disruptions. It also opened discussions about how rapidly climate might change in the future.

Broecker, who died in 2019, spent his last decades exploring abrupt shifts that are already happening. He worked, for example, with billionaire Gary Comer, who during a yacht trip in 2001 was shocked by the shrinking of Arctic sea ice, to brainstorm new directions for climate research and climate solutions.

Broecker knew more than almost anyone about what might be coming. He often described Earth’s climate system as an angry beast that humans are poking with sticks. And one of his most famous papers was titled “Climatic change: Are we on the brink of a pronounced global warming?”

It was published in 1975.

Grainy ice cream is unpleasant. Plant-based nanocrystals might help

You can never have too much ice cream, but you can have too much ice in your ice cream. Adding plant-based nanocrystals to the frozen treat could help solve that problem, researchers reported March 20 at the American Chemical Society spring meeting in San Diego.

Ice cream contains tiny ice crystals that grow bigger when natural temperature fluctuations in the freezer cause them to melt and recrystallize. Stabilizers in ice cream — typically guar gum or locust bean gum — help inhibit crystal growth, but don’t completely stop it. And once ice crystals hit 50 micrometers in diameter, ice cream takes on an unpleasant, coarse, grainy texture.

Cellulose nanocrystals, or CNCs, which are derived from wood pulp, have properties similar to the gums, says Tao Wu, a food scientist at the University of Tennessee in Knoxville. They also share similarities with antifreeze proteins, produced by some animals to help them survive subzero temperatures. Antifreeze proteins work by binding to the surface of ice crystals, inhibiting growth more effectively than gums — but they are also extremely expensive. CNCs might work similarly to antifreeze proteins but at a fraction of the cost, Wu and his colleagues thought.

An experiment with a sucrose solution — a simplified ice cream proxy — and CNCs showed that after 24 hours, the ice crystals completely stopped growing. A week later, the ice crystals remained at 25 micrometers, well beneath the threshold of ice crystal crunchiness. In a similar experiment with guar gum, ice crystals grew to 50 micrometers in just three days.

“That by itself suggests that nanocrystals are a lot more potent than the gums,” says Richard Hartel, a food engineer at the University of Wisconsin–Madison, who was not involved in the research. If CNCs do function the same way as antifreeze proteins, they’re a promising alternative to current stabilizers, he says. But that still needs to be proven.
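The reported figures alone make the contrast concrete. A minimal sketch using only the numbers above, and assuming, for simplicity, that the guar-gum crystals grew at a steady average rate:

```python
# Compare the two stabilizers using the figures reported above.
GRAINY_THRESHOLD_UM = 50      # crystals at/above this size feel coarse

# Guar gum: crystals reached 50 micrometers in 3 days.
guar_size_um, guar_days = 50, 3
guar_rate = guar_size_um / guar_days   # average growth, micrometers/day

# Cellulose nanocrystals: growth stopped within 24 hours; a week
# later the crystals still measured 25 micrometers.
cnc_size_um = 25

print(f"Guar gum: ~{guar_rate:.1f} um/day on average; "
      f"grainy threshold reached in {guar_days} days")
print(f"CNCs: stable at {cnc_size_um} um after a week, "
      f"{GRAINY_THRESHOLD_UM - cnc_size_um} um below the threshold")
```

In other words, the gum-stabilized crystals crossed the graininess threshold within days, while the CNC-stabilized crystals plateaued at half that size.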

Until that happens, you continue to have a good excuse to eat your ice cream quickly: You wouldn’t want large ice crystals to form, after all.