Virus triggers immune proteins to aid enemy

Crucial immune system proteins that make it harder for viruses to replicate might also help the attackers avoid detection, three new studies suggest. When faced with certain viruses, the proteins can set off a cascade of cell-to-cell messages that destroy antibody-producing immune cells. With those virus-fighting cells depleted, it’s easier for the invader to persist inside the host’s body.

The finding begins to explain a longstanding conundrum: how certain chronic viral infections can dodge the immune system’s antibody response, says David Brooks, an immunologist at the University of Toronto not involved in the research. The new studies, all published October 21 in Science Immunology, pin the blame on the same set of proteins: type 1 interferons.

Normally, type 1 interferons protect the body from viral siege. They snap into action when a virus infects cells, helping to activate other parts of the immune system. And they make cells less hospitable to viruses so that the foreign invaders can’t replicate as easily.

But in three separate studies, scientists tracked mice’s immune response when infected with lymphocytic choriomeningitis virus, or LCMV. In each case, type 1 interferon proteins masterminded the loss of B cells, which produce antibodies specific to the virus that is being fought. Normally, those antibodies latch on to the target virus, flagging it for destruction by other immune cells called T cells. With fewer B cells, the virus can evade capture for longer.

The proteins’ response “is driving the immune system to do something bad to itself,” says Dorian McGavern, an immunologist at the National Institute of Neurological Disorders and Stroke in Bethesda, Md., who led one of the studies.

The interferon proteins didn’t directly destroy the B cells; they worked through middlemen instead. These intermediaries differed depending on factors including the site of infection and how much of the virus the mice received.

T cells were one intermediary. McGavern and his colleagues filmed T cells actively destroying their B cell compatriots under the direction of the interferon proteins. When the scientists deleted those T cells, the B cells didn’t die off even though the interferons were still hanging around.

Another study found that the interferons were sending messages not just through T cells, but via a cadre of other immune cells, too. Those messages told B cells to morph into cells that rapidly produce antibodies for the virus. But those cells die off within a few days instead of mounting a longer-term defense.

That strategy could be helpful for a short-term infection, but less successful against a chronic one, says Daniel Pinschewer, a virologist at the University of Basel in Switzerland who led that study. Throwing the entire defense arsenal at the virus all at once leaves the immune system shorthanded later on.

But interferon activity could prolong even short-term viral infections, a third study showed. There, scientists injected lower doses of LCMV into mice’s footpads and used high-powered microscopes to watch the infection play out in the lymph nodes. In this case, the interferon stifled B cells by working through inflammatory monocytes, white blood cells that rush to infection sites.

“The net effect is beneficial for the virus,” says Matteo Iannacone, an immunologist at San Raffaele Scientific Institute in Milan who led the third study. Sticking around even a few days longer gives the virus more time to spread to new hosts.

Since all three studies looked at the same virus, it’s not yet clear whether the mechanism extends to other viral infections. That’s a target for future research, Iannacone says. But Brooks thinks it’s likely that other viruses that dampen antibody response (like HIV and hepatitis C) could also be exploiting type 1 interferons.

Protein mobs kill cells that most need those proteins to survive

Joining a gang doesn’t necessarily make a protein a killer, a new study suggests. This clumping gets dangerous only under certain circumstances.

A normally innocuous protein can be engineered to clump into fibers similar to those formed by proteins involved in Alzheimer’s, Parkinson’s and brain-wasting prion diseases such as Creutzfeldt-Jakob disease, researchers report in the Nov. 11 Science. Cells that rely on the protein’s normal function for survival die when the proteins glom together. But cells that don’t need the protein are unharmed by the gang activity, the researchers discovered. The finding may shed light on why clumping proteins that lead to degenerative brain diseases kill some cells, but leave others untouched.

Clumpy proteins known as prions or amyloids have been implicated in many nerve-cell-killing diseases (SN: 8/16/08, p. 20). Such proteins are twisted forms of normal proteins that can make other normal copies of the protein go rogue, too. The contorted proteins band together, killing brain cells and forming large clusters or plaques.

Scientists don’t fully understand why these mobs resort to violence or how they kill cells. Part of the difficulty in reconstructing the cells’ murder is that researchers aren’t sure what jobs, if any, many of the proteins normally perform (SN: 2/13/10, p. 17).

A team led by biophysicists Frederic Rousseau and Joost Schymkowitz of Catholic University Leuven in Belgium came up with a new way to dissect the problem. They started with a protein for which they already knew the function and engineered it to clump. That protein, vascular endothelial growth factor receptor 2, or VEGFR2, is involved in blood vessel growth. Rousseau and colleagues clipped off a portion of the protein that causes it to cluster with other proteins, creating an artificial amyloid.

Masses of the protein fragment, nicknamed vascin, could aggregate with and block the normal activity of VEGFR2, the researchers found. When the researchers added vascin to human umbilical vein cells grown in a lab dish, the cells died because VEGFR2 could no longer transmit hormone signals the cells need to survive. But human embryonic kidney cells and human bone cancer cells remained healthy. Those results suggest that some forms of clumpy proteins may not be generically toxic to cells, says biophysicist Priyanka Narayan of the Whitehead Institute for Biomedical Research in Cambridge, Mass. Instead, rogue clumpy proteins may target specific proteins and kill only cells that rely on those proteins for survival.

Those findings may also indicate that prion and amyloid proteins, such as Alzheimer’s nerve-killing amyloid-beta, normally play important roles in some brain cells. Those cells would be the ones vulnerable to attack from the clumpy proteins.

The newly engineered ready-to-rumble protein may open new ways to inactivate specific proteins in order to fight cancer and other diseases, says Salvador Ventura, a biophysicist at the Autonomous University of Barcelona. For instance, synthetic amyloids of overactive cancer proteins could gang up and shut down the problem protein, killing the tumor.

Artificial amyloids might also be used to screen potential drugs for anticlumping activity that could be used to combat brain-degenerating diseases, Rousseau suggests.

The fight against infectious diseases is still an uphill battle

It was barely more than half a century ago that the Nobel Prize–winning virologist Sir Frank Macfarlane Burnet mused about the demise of contagions. “To write about infectious disease,” he wrote in 1962, “is almost to write of something that has passed into history.”

If only. In the past several decades, over 300 infectious pathogens have either newly emerged or emerged in new places, causing a steady drumbeat of outbreaks and global pandemic scares.

Over the course of 2016, their exploits reached a crescendo. Just as the unprecedented outbreak of Ebola in West Africa was winding down in early 2016, the World Health Organization declared Zika virus, newly erupted in the Americas, an international public health emergency. What would balloon into the largest outbreak of yellow fever in Angola in 30 years had just begun. A few months later, scientists reported the just-discovered “superbug” mcr-1 gene in microbes collected from humans and pigs in the United States (SN Online: 5/27/16). The gene allows bacteria to resist the last-ditch antibiotic colistin, bringing us one step closer to a looming era of untreatable infections that would transform the practice of medicine. Its arrival presaged yet another unprecedented event: the convening of the United Nations General Assembly to consider the global problem of antibiotic-resistant bugs. It was only the fourth time in its 70-plus-year history that the assembly had been compelled to consider a health challenge. It’s “huge,” says University of Toronto epidemiologist David Fisman.

But even as UN delegates arrived for their meeting in New York City in September, another dreaded infection was making headlines again. The international community’s decades-long effort to end the transmission of polio had unraveled. In 2015, the WHO had declared Nigeria, one of the last three countries in the world where the infection still circulated, free of wild polio. By August 2016, it was back. Millions would have to be vaccinated to keep the infection from establishing a foothold.

Three fundamental, interrelated factors fuel the microbial comeback, experts say. Across the globe, people are abandoning the countryside for life in the city, leading to rapid, unplanned urban expansions. In crowded conditions with limited access to health care and poor sanitation, pathogens like Ebola, Zika and influenza enjoy lush opportunities to spread. With more infections mingling, there are also more opportunities for pathogens to share their virulence genes.

At the same time, global demand for meat has quadrupled over the last five decades by some estimates, driving the spread of industrial livestock farming techniques that can allow benign microbes to become more virulent. The use of colistin in livestock agriculture in China, for example, has been associated with the emergence of mcr-1, which was first discovered during routine surveillance of food animals there. Genetic analyses suggest that siting factory farms full of chickens and pigs in proximity to wild waterfowl has played a role in the emergence of highly virulent strains of avian influenza. Crosses of Asian and North American strains of avian influenza caused the biggest outbreak of animal disease in U.S. history in 2014–2015. Containing that virus required the slaughter of nearly 50 million domesticated birds and cost over $950 million. Worryingly, some strains of avian influenza, such as H5N1, can infect humans.

The thickening blanket of carbon dioxide in the atmosphere resulting from booming populations of people and livestock provides yet another opportunity for pathogens to exploit. Scientists around the world have documented the movement of disease-carrying creatures including mosquitoes and ticks into new regions in association with newly amenable climatic conditions. Climate scientists predict range changes for bats and other animals as well. As the organisms spread into new ranges, they carry pathogens such as Ebola, Zika and Borrelia burgdorferi (a bacterium responsible for Lyme disease) along with them.

Since we can rarely develop drugs and vaccines fast enough to stanch the most dangerous waves of disease, early detection will be key moving forward. Researchers have developed a welter of models and pilot programs showing how environmental cues such as temperature and precipitation fluctuations and the insights of wildlife and livestock experts can help pinpoint pathogens with pandemic potential before they cause outbreaks in people. Chlorophyll signatures, a proxy for the plankton concentrations that are associated with cholera bacteria, can be detected from satellite data, potentially providing advance notice of cholera outbreaks.

Even social media chatter can be helpful. Innovative financing methods, such as the World Bank’s recently launched Pandemic Emergency Financing Facility — a kind of global pandemic insurance policy funded by donor countries, the reinsurance market and the World Bank — could help ensure that resources to isolate and contain new pathogens are readily available, wherever they take hold. Right now, emerging disease expert Peter Daszak points out, “we wait for epidemics to emerge and then spend billions on developing vaccines and drugs.” The nonprofit organization that Daszak directs, EcoHealth Alliance, is one of a handful that instead aim to detect new pathogens at their source and proactively minimize the risk of their spread.

Burnet died in 1985, two years after the discovery of HIV, one of the first of the latest wave of new pathogens. His vision of a contagion-free society was that of a climber atop a foothill surrounded by peaks, mistakenly thinking he’d reached the summit. The challenge of surviving in a world of pathogens is far from over. In many ways, it’s only just begun.

Monsoon deluges turned ancient Sahara green

Thousands of years ago, it didn’t just rain on the Sahara Desert. It poured.

Grasslands, trees, lakes and rivers once covered North Africa’s now arid, unforgiving landscape. From about 11,000 to 5,000 years ago, much higher rainfall rates than previously estimated created that “Green Sahara,” say geologist Jessica Tierney of the University of Arizona in Tucson and her colleagues. Extensive ground cover, combined with reductions of airborne dust, intensified water evaporation into the atmosphere, leading to monsoonlike conditions, the scientists report January 18 in Science Advances.

Tierney’s team reconstructed western Saharan rainfall patterns over the last 25,000 years. Estimates relied on measurements of forms of carbon and hydrogen in leaf wax recovered from ocean sediment cores collected off the Sahara’s west coast. Concentrations of these substances reflected ancient rainfall rates.

Rainfall ranged from 250 to 1,670 millimeters annually during Green Sahara times, the researchers say. Previous estimates — based on studies of ancient pollen that did not account for dust declines — reached no higher than about 900 millimeters. Saharan rainfall rates currently range from 35 to 100 millimeters annually.

Leaf-wax evidence indicates that the Green Sahara dried out from about 8,000 to at least 7,000 years ago before rebounding. That’s consistent with other ancient climate simulations and with excavations suggesting that humans temporarily left the area around 8,000 years ago. Hunter-gatherers departed for friendlier locales, leaving cattle herders to spread across North Africa once the Green Sahara returned (SN Online: 6/20/12), the investigators propose.

Snooze patterns vary across cultures, opening eyes to evolution of sleep

Hunter-gatherers and farming villagers who live in worlds without lightbulbs or thermostats sleep slightly less at night than smartphone-toting city slickers, researchers say.

“Contrary to conventional wisdom, people in societies without electricity do not sleep more than those in industrial societies like ours,” says UCLA psychiatrist and sleep researcher Jerome Siegel, who was not involved in the new research.

Different patterns of slumber and wakefulness in each of these groups highlight the flexibility of human sleep — and also point to potential health dangers in how members of Western societies sleep, conclude evolutionary biologist David Samson of Duke University and colleagues. Compared with other primates, human evolution featured a shift toward sleeping more deeply over shorter time periods, providing more time for learning new skills and knowledge as cultures expanded, the researchers propose. Humans also evolved an ability to revise sleep schedules based on daily work schedules and environmental factors such as temperature.

Samson’s team describes sleep patterns in 33 East African Hadza hunter-gatherers over a total of 393 days in a paper published online January 7 in the American Journal of Physical Anthropology. The team’s separate report on slumber among 21 rural farmers in Madagascar over 292 days will appear later this year in the American Journal of Human Biology.

Sleep patterns in these groups were tracked with wrist devices that measure a person’s activity levels. Both Hadza and Malagasy volunteers slept an average of about 6.5 hours nightly, less than the roughly seven-hour average for most U.S. adults. Foragers and villagers, who slept in areas with various family and group members, awoke more frequently during the night than has been reported among Westerners. Scalp electrodes worn at night by nine villagers during nine nights revealed biological signs of relatively light sleep compared with Westerners, including shorter periods of slow-wave and rapid eye movement sleep.

But Hadza and Malagasy individuals often supplemented nighttime sleep with one or two daytime naps. Shut-eye breaks averaged 47.5 minutes for the Hadza and about 55 minutes for villagers. Critically, Samson says, foragers and villagers displayed more consistent daily cycles of sleep and wakefulness than are characteristic of Westerners. Hadza adults tended to hit the sack — or, more commonly, the mat — shortly after midnight and nap in the early afternoon. Malagasy villagers napped once or twice during the day’s hottest hours, usually starting around noon, and retired in the early evening. At night, they slept in two phases, awakening for around an hour shortly after midnight. Historical accounts describe a similar sleep pattern among Western Europeans between 500 and 200 years ago — two sleep segments, divided by a period of activity or reflection (SN: 9/25/99, p. 205).

Nighttime sleep in both populations became deeper and less fragmented as tropical humidity dipped.

Researchers also noted that hunter-gatherers and villagers got plenty of direct sunlight, unlike many Westerners. Several studies have linked inconsistent sleep-wake cycles and lack of sun exposure to health problems, including inflammation and heart problems, Samson says. “People in modern societies can take lessons from this research by attempting to get lots of light exposure during the day while reducing blue-wave light exposure after dark and dropping inside temperatures by a few degrees at night.” Smartphones and other digital devices emit blue-wave light, which can suppress melatonin production and delay sleep.

Effects of wayward sleep patterns or too little sleep on health vary across cultures and regions, says biomedical anthropologist Kristen Knutson of Northwestern University Feinberg School of Medicine in Chicago. For instance, sleeping less than six hours per night may increase appetite, as some studies suggest, but a sleep-deprived office worker surrounded by fast-food joints is more likely to become obese than a physically active hunter-gatherer faced with a limited food supply.

Samson’s research aligns with previous findings, reported by Knutson, that rural Haitians living without electricity sleep an average of about seven hours nightly. In addition, Siegel’s team recently reported that nightly sleep averages 5.7 to 7.1 hours in three hunter-gatherer societies, including the Hadza (SN: 11/14/15, p. 10).

‘Cannibalism’ chronicles grisly science of eating your own

Until recently, researchers thought cannibalism took place only among a few species in the animal kingdom and only under extraordinary circumstances. But as zoologist Bill Schutt chronicles in Cannibalism, plenty of creatures inhabit their own version of a dog-eat-dog world.

Over the last few decades, scientists have observed cannibalism — defined by Schutt as eating all or part of another individual of the same species — among all major groups of vertebrates. The practice seems to be even more prevalent, and less discriminating, among invertebrates such as mollusks, insects and spiders, whose eggs, larvae and young are often produced in profusion and are therefore readily available, not to mention nutritious.

Cannibalism, Schutt contends, makes perfect evolutionary sense, and not merely as a feeding strategy. When food supplies are low or living conditions are crowded, some mammals and birds may eat some or all of their offspring to terminate an expenditure of effort with poor chances of paying off. For birds, eating a dead or dying hatchling also is a way to get rid of a carcass that could spread infection or whose scent could attract flies or predators to the nest.

Switching to a historical and cultural perspective, Schutt tackles the various forms of human cannibalism, where, he admits, “the ick factor is high.” That includes medicinal cannibalism, from 17th and 18th century Europeans’ consumption of powdered mummies to modern moms’ ingestion of their newborns’ placentas to purportedly restore nutrients lost during childbirth. The author also explores survival cannibalism (think famine victims, people under siege, plane-crash survivors and the ill-fated Donner Party) and briefly addresses our natural shock and seemingly unnatural fascination with criminal cannibalism (à la Jeffrey Dahmer).

As Schutt explains, ritual cannibalism — the consumption of a foe or loved one to acquire the decedent’s strength, courage or wisdom — is a practice that has apparently taken place in different cultures throughout history. In an interesting aside, Schutt ponders whether people who consume wafers and wine during Communion, especially those who firmly believe these items are literally converted into the body and blood of Christ, are engaging in a form of ritual cannibalism.

Cannibalism is a wide-ranging, engaging and thoroughly fun read. The author’s numerous field trips and lab visits with scientists who study the phenomenon heartily enrich this captivating book.

Human gene editing therapies are OK in certain cases, panel advises

Human gene editing to prevent genetic diseases from being passed to future generations may be permissible under certain conditions, a panel of experts says.

Altering DNA in germline cells — embryos, eggs, and sperm, or cells that give rise to them — may be used to cure genetic diseases for future generations, provided it is done only to correct disease or disability, not to enhance people’s health or abilities, a report issued February 14 by the National Academies of Sciences and Medicine recommends. The decision contradicts earlier recommendations by organizers of a global summit on human gene editing, who concluded that gene editing with molecular scissors such as CRISPR/Cas9 should not be used to produce babies (SN: 12/26/15, p. 12).

Heritable gene editing is not yet ready to be done in people, says Alta Charo, a bioethicist at the University of Wisconsin–Madison Law School who cochaired the panel. “We are not trying to greenlight heritable germline editing. We’re trying to find that limited set of circumstances where its use is justified by a compelling need and its application is limited to that compelling need,” says Charo. “We’re giving it a yellow light.”

National Academies reports carry no legislative weight, but do often influence policy decisions in the United States and abroad. It will be up to Congress, regulatory agencies such as the U.S. Food and Drug Administration, and state and local governments to implement the recommendations.

Supporters of new genetic engineering technologies hailed the decision.

“It looks like the possibility of eliminating some genetic diseases is now more than a theoretical option,” says Sean Tipton, a spokesman for the American Society for Reproductive Medicine in Washington, D.C. “That’s what this sets up.” Diseases such as cystic fibrosis and Huntington’s, which are caused by mutations in single genes, could someday be corrected by gene editing. More complex diseases or disorders caused by changes in multiple genes, such as autism or schizophrenia, probably would not be the focus of genome editing.

Others worry that allowing any tinkering with the germline will inevitably lead to “designer babies” and other social ills. It raises fears of stigmatization of people with disabilities, exacerbation of inequalities between people who can afford such therapies and those who can’t, and even a new kind of eugenics, critics say.

“Once you approve any form of human germline modification you really open the door to all forms,” says Marcy Darnovsky, executive director of the Center for Genetics and Society in Berkeley, Calif.

Panelist Jeffrey Kahn, a bioethicist at Johns Hopkins University, says the door to heritable gene therapy remains closed until stringent requirements can be met. “It’s frankly more of a knock on the door,” he said at the public presentation of the report.

The report also changes the debate from whether to allow germline editing to instead focus on the line between therapy and enhancement, Darnovsky says. “I’m feeling very unsettled and disappointed by what they are recommending.”

Several clinical trials in the United States, China and other countries are already under way to do gene editing in people who have cancer or other diseases. But those therapies do not involve altering germline cells; instead they fix defects or make alterations to DNA in other body, or “somatic,” cells. The panel recommended that such somatic cell therapies should also be restricted to treating diseases, not allowing enhancements.

Researchers in the United Kingdom, Sweden and China have already done gene editing on early human embryos in the lab. Recent clinical trials in Mexico and Ukraine to produce “three-parent babies” are also seen as altering the germline because such children carry a small amount of DNA from an egg donor (SN Online: 10/18/16). But those children don’t have modifications of their nuclear DNA, where the genetic instructions that determine traits are stored.

Currently, researchers in the United States are effectively banned from conducting clinical trials that would produce heritable changes in the human genome, either by gene editing or making three-parent babies. The new recommendations could pave the way to allow such experiments.

But the panel lays out a number of hurdles that must be cleared before germline editing could move forward, ones that may be impossible to overcome, says Nita Farahany, a bioethicist at Duke Law School in Durham, N.C. “Some people could read into the stringency of the requirements to think that the benefits could never outweigh the risks,” she says.

One hurdle is a requirement to follow multiple generations of children who have gotten gene editing to determine whether the therapy has consequences for future generations. Researchers would never be able to guarantee that they could conduct such long-term studies, Farahany says. “You can’t bind your children and grandchildren to agree to be tracked by such studies.”

Distinctions between therapies and enhancements are also vague. Researchers may not be able to convincingly draw lines between them, says George Church, a Harvard University geneticist who has developed CRISPR/Cas9 for a variety of purposes. Virtually everything medicine has accomplished could be considered an enhancement of human life, he says. “Vaccines are advancements over our ancestors. If you could tell our ancestors they could walk into a smallpox ward and not even worry about it, that would be a superpower.”

But the new technology may be less useful for enhancing humans than drugs are, says Charo. Gene-editing technologies are so precise and specific that someone who does not carry a disease-causing mutation would probably not benefit from the technology, she says.

Anesthesia for youngsters is a tricky calculation

If your young child is facing ear tubes, an MRI or even extensive dental work, you’ve probably got a lot of concerns. One of them may be about whether the drugs used to render your child briefly unconscious can permanently harm his brain. Here’s the frustrating answer: No one knows.

“It’s a tough conundrum for parents of kids who need procedures,” says Mary Ellen McCann, a pediatric anesthesiologist at Boston Children’s Hospital. “Everything has risks and benefits,” but in this case, the decision to go ahead with surgery is made more difficult by an incomplete understanding of anesthesia’s risks for babies and young children. Some studies suggest that single, short exposures to anesthesia aren’t dangerous. Still, scientists and doctors say that we desperately need more data before we really understand what anesthesia does to developing brains.

It helps to know this nonanswer comes with a lot of baggage, a sign that a lot of very smart and committed people are trying to answer the question. In December, the FDA issued a drug safety communication about anesthetics that sounded alarming, beginning with a warning that “repeated or lengthy use of general anesthetic and sedation drugs during surgeries or procedures in children younger than 3 years or in pregnant women during their third trimester may affect the development of children’s brains.” The FDA recommended more conversations between parents and doctors, in the hopes of delaying surgeries that can safely wait and reducing the amount of anesthesia exposure in this potentially vulnerable population.

The trouble with that statement, though, is that it raises concerns without answering them, says pediatric anesthesiologist Dean Andropoulos of Texas Children’s Hospital in Houston. And that concern might lead to worse outcomes for the youngest patients. “Until reassuring new information from well-designed clinical trials is available, we are concerned that the FDA warning will cause delays for necessary surgical and diagnostic procedures that require anesthesia, resulting in adverse outcomes for patients,” Andropoulos and a colleague wrote February 8 in a New England Journal of Medicine perspective article.

By and large, surgeries in young children are done for good reasons. Surgery for serious heart disease and other life-threatening conditions can’t wait. Ear tubes need to be put in so that a child can hear and get auditory input that’s required early in life for normal language skills. Likewise, certain kinds of eye surgery and cleft palate repairs all lead to better developmental outcomes if done early.

That doesn’t leave many surgeries that can be put off. “The things that can be delayed are few and far between,” Andropoulos says. That’s why the FDA’s recent drug safety communication might cause extra parental worry about surgeries that ought to be done.

Scientists have lots of data showing that anesthetic drugs can cause long-lasting damage in a variety of species, from roundworms to rats to nonhuman primates. Anesthetics are “like any toxin,” says Andrew Davidson, an anesthesiologist at the Murdoch Childrens Research Center in Melbourne, Australia. “The more you have, the worse it is.”

Yet Davidson and others have uncovered some reassuring news for parents. Quick, single exposures to anesthesia, about an hour or less, don’t seem dangerous.

Davidson, McCann and colleagues recently compared children who, as babies, had undergone hernia repair surgery. Of these babies, 359 had brief general anesthesia and 363 instead received local anesthesia. At age 2, the children showed no differences in mental abilities, the researchers reported last year in The Lancet. That trial, called the GAS study, was particularly well-done because, unlike many other studies of this question, babies were randomly assigned to receive either general or local anesthesia. And the experiment isn’t over yet. Scientists will test the children again at age 5, when it will be easier to test more complex forms of thinking.

More encouraging news came from the PANDA study, which tracked over 100 children who had received a short dose of anesthesia (the median was 80 minutes) when they were younger than 3. When those same kids were 8 to 15 years old, their IQs and most other thinking skills were similar to those of their healthy siblings who had not received anesthesia when they were young.

Along with the GAS results, the PANDA study, published June 7 in the Journal of the American Medical Association, offers some reassurance to parents whose child might need surgery. “If it’s a short procedure, you don’t have to worry about it,” Davidson says.

For now, doctors are doing their best to talk through these complex questions with parents as they make medical decisions. “We face this issue essentially every day,” Andropoulos says, and at his institution, the FDA guidelines prompted even more conversations. Parents are largely appreciative of having these talks, he says. And hopefully scientists will soon have something more to tell parents about what Andropoulos calls “the most important problem we face in pediatric anesthesia.”

Spray-on mosquito repellents are more effective than other devices

Mosquitoes are more than an itchy nuisance. They can carry serious diseases, including Zika, West Nile, yellow fever and chikungunya. Now after testing 11 types of mosquito repellents, researchers say they’ve identified the products most effective at warding off the bloodsuckers.

Spray-on repellents with DEET or a refined tree extract called oil of lemon eucalyptus are most likely to keep you bite-free, the scientists report online February 16 in the Journal of Insect Science. Other tested repellents, such as a citronella candle, simply don’t work, says study coauthor Immo Hansen, an insect physiologist at New Mexico State University in Las Cruces.
One device, the OFF! Clip-On repellent, which puffs out a vapor of the chemical metofluthrin, killed every mosquito in the cage. But because the caged mosquitoes couldn’t escape, Hansen says, they probably got a higher dose than they would in a natural setting.

“There are a whole lot of different products out on the market that are sold as mosquito repellents, and most of them haven’t ever been tested in a scientific setting,” Hansen says.

To evaluate the repellents, the researchers used a person, safely protected from bites, as “bait.” The volunteer sat in a wind tunnel as her alluring scent — and repelling chemicals — were pulled toward a cage of Aedes aegypti mosquitoes.
The three-compartment cage allowed the mosquitoes to move toward or away from the volunteer. After 15 minutes, the researchers determined the proportion of mosquitoes that had moved into the compartment closest to the volunteer.
Three deterrents did little to dissuade the insects: bracelets with geraniol oil, a sound machine that buzzes like a dragonfly and a citronella candle (which appeared to slightly attract the mosquitoes). Burning a candle releases carbon dioxide, which might have drawn the insects; mosquitoes home in on a human meal by sensing exhaled CO2 (SN: 3/18/17, p. 10).

Repellent face-off
Researchers measured attraction rates of A. aegypti mosquitoes to a person one meter or three meters away who was wearing or seated next to the repellent. Attraction rates are the percentage of total mosquitoes, averaged over four tests, that flew toward the person.
These repellents were not significantly different from the no-repellent control: bracelets (Mosquito-NO!, Invisaband, Mosquitavert), Cutter Citro Guard candle and Personal Sonic Mosquito Repeller.

Genetic risk of getting second cancer tallied for pediatric survivors

WASHINGTON — A second cancer later in life is common for childhood cancer survivors, and scientists now have a sense of the role genes play when this happens. A project that mined the genetic data of a group of survivors finds that 11.5 percent carry mutations that increase the risk of a subsequent cancer.

“We’ve always known that among survivors, a certain population will experience adverse outcomes directly related to therapy,” says epidemiologist and team member Leslie Robison of St. Jude Children’s Research Hospital in Memphis. The project sought “to find out what contribution genetics may play.” The team presented its work at the American Association for Cancer Research meeting April 3.
“This is a nice first step,” says David Malkin, a pediatric oncologist at the University of Toronto. “The results validate the thoughts of those of us who believe there is a genetic risk that increases the risk of second malignancies.”

Five-year survival rates for kids with cancer have grown to more than 80 percent. But “there are long-term consequences for having been diagnosed and treated for cancer as a child,” notes Robison. Some survivors develop a later, second cancer due to the radiation or chemotherapy that treated the first cancer (SN: 3/10/07, p. 157).

The researchers examined 3,007 survivors of pediatric cancer who routinely undergo medical evaluation at St. Jude. About a third had leukemia as children. By age 45, 29 percent of this group had developed new tumors, often in the skin, breast or thyroid.

The team cataloged each survivor’s DNA and looked closely at 156 genes known as cancer predisposition genes. Of the survivors, 11.5 percent carried a problematic mutation in one of the 156 genes. Some genes on the list confer a higher risk than others, so the team looked further at a subset of 60 genes in which only one mutated copy in each cell is enough to cause disease. These 60 genes also have high penetrance, meaning that a mutated copy is highly likely to lead to a cancer. Nearly 6 percent of the survivors had a problematic mutation in one of these 60 genes.

The research team also separated the survivors based on whether or not they had received radiation therapy as children. Close to 17 percent of survivors not exposed to radiation therapy had a problematic mutation in the subset of 60 genes. These survivors had an increased risk for any second cancer. Those with both a mutation in one of the 60 genes and radiation in their treatment history had a higher risk for specific kinds of second cancers: breast, thyroid or sarcomas, tumors in connective tissues.
Based on the new estimates of genetic risk, the team suggests that survivors not given radiation therapy undergo genetic counseling if a second cancer develops. Counseling is also recommended “for survivors who develop a secondary breast cancer, thyroid cancer or sarcoma in a site that received prior radiation therapy,” says St. Jude epidemiologist and project team member Carmen Wilson. Counseling can provide guidance on health practices going forward, reproductive choices and the implications for immediate family members who may have inherited the mutation, notes Robison.

The extensive amount of medical and genomic information collected for the survivors could help with cancer prevention efforts in the future, Robison says. The team would like to create prediction models that consider treatment, genetics and other clinical information, in order to place survivors into different risk groups. “It’s eventually going to have clear implications for how these patients are clinically managed, and how we either prevent or ameliorate the adverse effects,” Robison says.

Malkin notes that not just “what you got for treatment, but when you got it” also influences a survivor’s risk profile for second cancers, as treatments and doses have changed over time. He also thinks the percentage of at-risk survivors reported by Robison’s team is lower than expected. “Expanding the pool of genes to look at will be very informative,” he says.