Potential causality between blood clot factors and migraine with aura

Nearly 15 percent of the U.S. population experiences migraine. One subtype of migraine that is not well understood is migraine with aura (MA). Individuals who experience MA often see flashing lights, blind spots, or jagged lines in their visual field prior to onset of their migraine headaches. Individuals who experience MA also face a heightened risk of stroke and cardiovascular disease, although scientists continue to explore why this correlation exists. In a new study from Brigham and Women’s Hospital, researchers used a technique in genetic analysis termed Mendelian randomization to examine 12 coagulation measures, uncovering four that are associated with migraine susceptibility. Interestingly, scientists only observed these associations in individuals who experience MA and did not observe such associations among individuals who experience migraine without aura (MO). Their research suggests that these hemostatic factors could potentially have a causal role in MA. Their results are published in Neurology.
“We’ve always wanted to know why people with MA have this higher risk of stroke and other cardiovascular diseases,” said corresponding author Daniel Chasman, PhD, of the Division of Preventive Medicine at the Brigham. “This study offers promising leads specific to MA. Finding a possible cause for migraine with aura has been an outstanding question in the field for a long time.”
There has been speculation in the field about relationships between coagulation and migraine susceptibility for some time, but previous research has been largely inconclusive. Most individuals first experience migraine at a young age, for example during childhood or young adulthood. Because previous study designs included only middle-aged and older adults, investigators have questioned whether coagulation causes migraine, or whether any causal link exists at all. In this study, leveraging Mendelian randomization, which can support or refute potential causal effects on a health outcome, scientists for the first time found evidence that hemostatic factors may contribute to risk of MA.
“Even if we see an association between migraine and these coagulation factors when we measure both factors in a population at the same time, we still wonder: Which one came first?” said co-author Pamela Rist, ScD, of the Division of Preventive Medicine at the Brigham. “One of the interesting parts of Mendelian randomization is that it allows you to examine potential causality.”
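In skeleton form, the logic of Mendelian randomization can be illustrated with a toy inverse-variance-weighted calculation. Each genetic variant's effect on the outcome is divided by its effect on the exposure (the Wald ratio), and the per-variant ratios are pooled. All numbers below are invented for illustration and are not from the study:

```python
# Inverse-variance-weighted Mendelian randomization sketch.
# Each variant's causal estimate is the Wald ratio: its effect on the
# outcome (e.g., MA risk) divided by its effect on the exposure
# (e.g., a coagulation factor level). All summary statistics are made up.
variants = [
    # (effect on exposure, effect on outcome, SE of outcome effect)
    (0.20, 0.05, 0.01),
    (0.10, 0.03, 0.01),
    (0.40, 0.08, 0.02),
]

def ivw_estimate(variants):
    num = den = 0.0
    for beta_x, beta_y, se_y in variants:
        ratio = beta_y / beta_x        # Wald ratio for this variant
        weight = (beta_x / se_y) ** 2  # inverse-variance weight
        num += weight * ratio
        den += weight
    return num / den

print(f"pooled causal estimate: {ivw_estimate(variants):.3f}")
```

Because the genetic variants are fixed at conception, a nonzero pooled estimate is harder to explain by reverse causation, which is the "which came first" problem Rist describes.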
Researchers used summary statistics from decades of previously collected data from individuals who experience migraine and individuals who do not experience migraine. Because the diagnostic criteria are different for MA versus MO, they could examine these two conditions separately.
Investigators found a strong association between four coagulation factors and migraine susceptibility. Genetically increased levels of three blood clotting factors (coagulation factor VIII, von Willebrand factor, and phosphorylated fibrinopeptide A) and genetically decreased levels of fibrinogen, a protein important in the late stages of the blood clotting process, were all associated with migraine susceptibility. Interestingly, scientists did not find these associations among individuals who experience migraine without aura (MO), indicating a specific relationship between these hemostatic factors and MA.
Scientists note that Mendelian randomization has its limitations. In the future, researchers could examine if the causal associations implied by genetics can be observed in clinical practice.
“It is very exciting that by using Mendelian randomization we were able to show that hemostatic factors are associated with MA,” said first author Yanjun Guo, MD, PhD, of the Division of Preventive Medicine at the Brigham. “And because in the observational studies we saw that MA patients have a higher risk of stroke, these findings may reveal a potential connection between MA and stroke.”
Story Source:
Materials provided by Brigham and Women’s Hospital. Note: Content may be edited for style and length.


Scientists discover five new species of listeria, improving food safety

While examining the prevalence of listeria in agricultural soil throughout the U.S., Cornell University food scientists have stumbled upon five previously unknown and novel relatives of the bacteria.
The discovery, researchers said, will help food facilities identify potential growth niches that until now, may have been overlooked — thus improving food safety.
“This research increases the set of listeria species monitored in food production environments,” said lead author Catharine R. Carlin, a doctoral student in food science. “Expanding the knowledge base to understand the diversity of listeria will save the commercial food world confusion and errors, as well as prevent contamination, explain false positives and thwart foodborne outbreaks.”
One of the novel species, L. immobilis, lacked motility, or the ability to move. Motility was thought to be common among listeria closely related to L. monocytogenes, a well-known foodborne pathogen, and it is used as a key test in listeria detection methods. This discovery effectively calls for a rewrite of the standard identification protocols issued by food safety regulators, Carlin said.
Because listeria species often co-exist in environments that support the growth of L. monocytogenes, food facilities monitor for all listeria species to verify their sanitation practices.
Listeria monocytogenes can contaminate food processing plants, which must therefore be kept scrupulously clean. Listeriosis, the infection the bacterium causes, has a mortality rate of 20% to 30% even with antibiotic treatment, according to the U.S. Food and Drug Administration.
The Centers for Disease Control and Prevention estimate that 1,600 people in the U.S. get listeriosis annually and nearly 260 die.
“This paper describes some unique characteristics of listeria species that are closely related to Listeria monocytogenes, which will be important from an evolutionary perspective and from a practical standpoint for the food industry,” said co-author Martin Wiedmann, professor of food safety and food science. “Likely, some tests will need to be re-evaluated.”
Understanding the different listeria species is key to comprehending their similarities. “This will help us get better at identifying Listeria monocytogenes,” Wiedmann said, “and not misidentifying it as something else.”
Since 2010, Wiedmann’s research group has discovered 13 of the 26 species classified in the genus listeria.
“When you’re inspecting the environments of food processing plants or restaurants, you need to know the pathogenic listeria from the non-pathogenic species,” Wiedmann said. “You need to tell the good guys from the bad guys.”
Story Source:
Materials provided by Cornell University. Original written by Blaine Friedlander. Note: Content may be edited for style and length.


COVID-19 testing method gives results within one second, researchers report

The COVID-19 pandemic made it clear that technological innovations were urgently needed to detect, treat, and prevent the SARS-CoV-2 virus. A year and a half into the pandemic, successive waves of outbreaks continue, as does the dire need for new medical solutions, especially testing.
In the Journal of Vacuum Science & Technology B, researchers from the University of Florida and Taiwan’s National Chiao Tung University report a rapid and sensitive testing method for COVID-19 biomarkers.
The researchers, who previously demonstrated detection of biomarkers relevant in epidemics and emergencies, such as the Zika virus, heart attacks, and cerebrospinal fluid leaks, leveraged their expertise to develop a sensor system that provides detection within one second, far faster than current COVID-19 detection methods.
“This could alleviate slow COVID-19 testing turnaround time issues,” said Minghan Xian, an author and a chemical engineering doctoral candidate at the University of Florida.
Detecting the presence of the virus requires amplification, either of the biomarker itself, as when copies of viral ribonucleic acid are multiplied in the common polymerase chain reaction technique for COVID-19 detection, or of the binding signal for a target biomarker. The group’s method takes the latter approach.
“Our biosensor strip is similar to commercially available glucose test strips in shape, with a small microfluidic channel at the tip to introduce our test fluid,” said Xian. “Within the microfluidic channel, a few electrodes are exposed to fluid. One is coated with gold, and COVID-relevant antibodies are attached to the gold surface via a chemical method.”
During measurement, sensor strips are connected to a circuit board via a connector, and a short electrical test signal gets sent between the gold electrode bonded with COVID antibody and another auxiliary electrode. This signal is then returned to the circuit board for analysis.
“Our sensor system, a circuit board, uses a transistor to amplify the electrical signal, which then gets converted into a number on the screen,” said Xian. “The magnitude of this number depends on the concentration of antigen, the viral protein, present within our test solution.”
While the system’s sensor strips must be discarded after use, the test circuit board is reusable, which could greatly reduce the cost of testing. The versatility of this technology also goes far beyond detecting COVID-19.
“By altering the type of antibodies attached to the gold surface, we can repurpose the system to detect other diseases,” said Xian. “The system can serve as a prototype for modularized, inexpensive protein biomarker sensors for expedient real-time feedback within clinical applications, operating rooms, or home use.”
Story Source:
Materials provided by American Institute of Physics. Note: Content may be edited for style and length.


Nanofiber filter captures almost 100 percent of coronavirus aerosols in experiment

A filter made from polymer nanofibers blew three kinds of commercial masks out of the water, capturing 99.9% of coronavirus aerosols in an experiment.
“Our work is the first study to use coronavirus aerosols for evaluating filtration efficiency of face masks and air filters,” said corresponding author Yun Shen, a UC Riverside assistant professor of chemical and environmental engineering. “Previous studies have used surrogates of saline solution, polystyrene beads, and bacteriophages — a group of viruses that infect bacteria.”
The study, led by engineers at UC Riverside and The George Washington University, compared the effectiveness of surgical and cotton masks, a neck gaiter, and electrospun nanofiber membranes at removing coronavirus aerosols to prevent airborne transmission. The cotton mask and neck gaiter only removed about 45%-73% of the aerosols. The surgical mask did much better, removing 98% of coronavirus aerosols. But the nanofiber filter removed almost all of the coronavirus aerosols.
The World Health Organization and the Centers for Disease Control and Prevention have both recognized aerosols as a major mechanism of COVID-19 virus transmission. Aerosols are tiny particles of water or other matter that can remain suspended in air for long periods of time and are small enough to penetrate the respiratory system.
People release aerosols whenever they breathe, cough, talk, shout, or sing. If they are infected with COVID-19, these aerosols can also contain the virus. Inhaling a sufficient quantity of coronavirus-laden aerosols can make people sick. Efforts to curb aerosol spread of COVID-19 focus on minimizing individual exposure and reducing the overall quantity of aerosols in an environment by asking people to wear masks and by improving indoor ventilation and air filtration systems.
Studying a contagious new virus is dangerous and done in labs with the highest biosecurity ratings, which are relatively rare. To date, all studies during the pandemic on mask or filter efficiency have used other materials thought to mimic the size and behavior of coronavirus aerosols. The new study improved on this by testing both aerosolized saline solution and an aerosol that contained a coronavirus in the same family as the virus that causes COVID-19, but only infects mice.
Shen and George Washington University colleague Danmeng Shuai produced a nanofiber filter by sending a high electrical voltage through a drop of liquid polyvinylidene fluoride to spin threads about 300 nanometers in diameter — about 167 times thinner than a human hair. The process created pores only a couple of micrometers in diameter on the nanofiber’s surfaces, which helped them capture 99.9% of coronavirus aerosols.
The production technique, known as electrospinning, is cost effective and could be used to mass produce nanofiber filters for personal protective equipment and air filtration systems. Electrospinning also leaves the nanofibers with an electrostatic charge that enhances their ability to capture aerosols, and their high porosity makes it easier to breathe wearing electrospun nanofiber filters.
“Electrospinning can advance the design and fabrication of face masks and air filters,” said Shen. “Developing new masks and air filters by electrospinning is promising because of its high performance in filtration, economic feasibility, and scalability, and it can meet on-site needs of the masks and air filters.”
The paper, “Development of electrospun nanofibrous filters for controlling coronavirus aerosols,” is published in Environmental Science & Technology Letters. Other authors include Hongchen Shen, Zhe Zhou, Haihuan Wang, Mengyang Zhang, Minghao Han, and David P. Durkin. This work is funded by the National Science Foundation.
Story Source:
Materials provided by University of California – Riverside. Original written by Holly Ober. Note: Content may be edited for style and length.


New model for infectious disease could better predict future pandemics

In the midst of a devastating global pandemic of wildlife origin and with future spillovers imminent as humans continue to come into closer contact with wildlife, infectious-disease models that consider the full ecological and anthropological contexts of disease transmission are critical to the health of all life. Existing models are limited in their ability to predict disease emergence, since they rarely consider the dynamics of the hosts and ecosystems from which pandemics emerge.
Published May 17 in Nature Ecology and Evolution, Smithsonian scientists and partners provide a framework for a new approach to modeling infectious diseases. It adapts established methods developed to study the planet’s natural systems, including climate change, ocean circulation and forest growth, and applies them to parasites and pathogens that cause disease.
Increased human-animal interactions lead to the emergence and spread of zoonotic pathogens, which cause about 75% of infectious diseases affecting human health. Predicting where, how and when people and animals are at risk from emerging pathogens — and the best ways to manage this — remains a significant challenge. Risks for spillover include, but are not limited to, habitat encroachment, illegal wildlife trade and bush meat consumption.
Despite incredible advances in the understanding of how infectious diseases are transmitted, the models these efforts are based on are relatively limited in scope, focusing on specific pathogens and often overlooking how pathogens interact within their hosts. While scientists and global health organizations are putting a lot of effort into studying the diversity of disease-causing organisms, existing models do not link this diversity to their roles within ecosystems.
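For reference, the conventional single-pathogen compartmental approach the authors want to move beyond can be sketched in a few lines. The SIR (susceptible-infected-recovered) model below uses illustrative parameters, not values from the paper; it tracks one pathogen in one host population, with no ecosystem context:

```python
# Minimal SIR model: the kind of narrow, single-pathogen model the
# authors argue cannot capture host and ecosystem dynamics on its own.
# All parameters are illustrative.
def simulate_sir(n=1000.0, i0=1.0, beta=0.3, gamma=0.1, dt=0.1, days=160):
    s, i, r = n - i0, i0, 0.0
    infected_history = []
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt  # transmission events this step
        new_rec = gamma * i * dt         # recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infected_history.append(i)
    return infected_history, (s, i, r)

history, (s, i, r) = simulate_sir()
print(f"peak infected: {max(history):.0f}, final recovered: {r:.0f}")
```

The "episystem" framework described below would, in effect, embed many such transmission processes inside a general ecosystem model, so that host abundance, food-web structure, and human disturbance feed back into pathogen dynamics.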
“Just as a mechanic must understand how a car’s components interact and how it’s been engineered in order to improve performance, the same applies to our ability to model infectious disease,” said first author Dr. James Hassell, wildlife veterinarian, epidemiologist and Keller Family Skorton Scholar for the Smithsonian Conservation Biology Institute’s (SCBI) Global Health Program. “Applying systems-level thinking to forecast disease emergence requires a fundamental change in how we conceptualize infectious diseases. This presents significant challenges, but in this article, we explain why they’re not insurmountable. When you weigh the cost of prevention versus remediation, the investment in our shared global health, particularly the connections between nature and human health, is vital.”
Researchers say this new model will require expertise and collaboration across fields such as veterinary and human medicine, disease ecology, biodiversity conservation, biotechnology and anthropology.
“Disease and health are predominantly viewed as a human construct and the role the environment plays in disease is often overlooked,” said Yvonne-Marie Linton, research director for the Walter Reed Biosystematics Unit for the Smithsonian’s National Museum of Natural History and Walter Reed Army Institute of Research. “The health of other organisms, from parasites and insects to birds and aquatic organisms, can alter the structure of ecosystems. What we’re proposing is a new approach to modeling infectious diseases that are circulating in nature, which would allow scientists to simulate the behavior of these pathogens in wildlife populations, how they respond to human activities and better determine the risk that they pose to people.”
General ecosystem models are essentially complex models that can predict how food chains are assembled — the processes of energy transfer between plants and animals are what structure ecosystems — and determine the plants and animals that compose an ecosystem. With the new version, general “episystem” models, the paper’s authors outline a framework for integrating disease agents (including parasites, viruses and bacteria) into these models. By identifying general rules for how food chains that include disease entities are structured, it should be possible to predict the types of pathogens that are present in any given ecosystem. This would allow scientists to better understand the characteristics of an ecosystem (such as disturbance) that would make it more likely to contain zoonotic pathogens, predict the threat it poses to people who interact with this ecosystem and even permit computer simulation and testing of interventions aimed at reducing these threats.
While the amount of data that would be required to create these models is daunting, long-term studies of intact ecosystems where parasite data has been collected are excellent places to initiate these studies. Efforts to refine them more broadly could then leverage large-scale ecological studies that span continents such as the Smithsonian’s ForestGEO and MarineGEO programs.
The potential impacts of this new model go beyond reducing disease spillover at the human-wildlife interface; they extend to economics. “You could use this new approach not only to look at human diseases, but also to look at the best way to conduct aquaculture or raise healthy livestock,” said Katrina M. Pagenkopp Lohan, a marine disease ecologist at the Smithsonian Environmental Research Center. “If you’re reintroducing a species into the wild, what do you need that ecosystem to look like for you to be successful? We could actually model that. It’s mind blowing.”
The cost of such a new approach is considerable, say researchers, and will take the global cooperation and commitment of scientists, communities, non-governmental organizations and nations. In an era of big data and massive advances in technology, this kind of approach is achievable but requires enhanced data collection, sharing and testing at far greater scales than currently occur.
The paper’s co-authors are Hassell, Global Health Program, SCBI and Department of Epidemiology of Microbial Disease, Yale School of Public Health; Tim Newbold, Centre for Biodiversity & Environment Research, Department of Genetics, Evolution and Environment, University College London; Andrew P. Dobson, Department of Ecology & Evolutionary Biology, Princeton University and Santa Fe Institute; Linton, doctorate in zoology, Walter Reed Biosystematics Unit, Smithsonian’s Museum Support Center, Department of Entomology, Smithsonian’s National Museum of Natural History and Walter Reed Army Institute of Research; Lydia H.V. Franklinos, Centre for Biodiversity & Environment Research, Department of Genetics, Evolution and Environment, University College London; Dr. Dawn Zimmerman, veterinarian, Global Health Program, SCBI and Department of Epidemiology of Microbial Disease, Yale School of Public Health; and Pagenkopp Lohan, doctorate in marine science, Marine Disease Ecology Laboratory, Smithsonian Environmental Research Center.


Scientists reveal role of genetic switch in pigmentation and melanoma

Despite accounting for only about 1 percent of skin cancers, melanoma causes the majority of skin cancer-related deaths. While treatments for this serious disease do exist, these drugs can vary in effectiveness depending on the individual.
A Salk study published on May 18, 2021, in the journal Cell Reports reveals new insights about a protein called CRTC3, a genetic switch that could potentially be targeted to develop new treatments for melanoma by keeping the switch turned off.
“We’ve been able to correlate the activity of this genetic switch to melanin production and cancer,” says Salk study corresponding author Marc Montminy, a professor in the Clayton Foundation Laboratories for Peptide Biology.
Melanoma develops when pigment-producing cells that give skin its color, called melanocytes, mutate and begin to multiply uncontrollably. These mutations can cause proteins, like CRTC3, to prompt the cell to make an abnormal amount of pigment or to migrate and be more invasive.
Researchers have known that the CRTC family of proteins (CRTC1, CRTC2, and CRTC3) is involved in pigmentation and melanoma, yet obtaining precise details about the individual proteins has been elusive. “This is a really interesting situation where different behaviors of these proteins, or genetic switches, can actually give us specificity when we start thinking about therapies down the road,” says first author Jelena Ostojic, a former Salk staff scientist and now a principal scientist at DermTech.
The researchers observed that eliminating CRTC3 in mice changed the animals’ coat color, demonstrating that the protein is needed for melanin production. Their experiments also revealed that when the protein was absent in melanoma cells, the cells migrated and invaded less, meaning they were less aggressive and suggesting that inhibiting the protein could be beneficial for treating the disease.
The team characterized, for the first time, the connection between two cellular communications (signaling) systems that converge on the CRTC3 protein in melanocytes. These two systems tell the cell to either proliferate or make the pigment melanin. Montminy likens this process to a relay race. Essentially, a baton (chemical message) is passed from one protein to another until it reaches the CRTC3 switch, either turning it on or off.
“The fact that CRTC3 was an integration site for two signaling pathways — the relay race — was most surprising,” says Montminy, who holds the J.W. Kieckhefer Foundation Chair. “CRTC3 makes a point of contact between them that increases specificity of the signal.”
Next, the team plans to further investigate the mechanism of how CRTC3 impacts the balance of melanocyte differentiation to develop a better understanding of its role in cancer.
Story Source:
Materials provided by Salk Institute. Note: Content may be edited for style and length.


Senators Visit NIH

Thanks to the group of U.S. senators that came out to visit NIH. It was a pleasure hosting the group, photographed here while touring the Dale and Betty Bumpers Vaccine Research Center (VRC) and learning more about NIH’s critical response to the COVID-19 pandemic. The visit took place on May 17, 2021. Credit: NIH

NIH blog post date: Tuesday, May 18, 2021


Colorectal cancer screening to begin at age 45, lowered from 50

Prompted by a recent alarming rise in cases of colorectal cancer in people younger than 50, an independent expert panel has recommended that individuals of average risk for the disease begin screening exams at 45 years of age instead of the traditional 50.
The guideline changes by the U.S. Preventive Services Task Force (USPSTF), published in the current issue of JAMA, update its 2016 recommendations and align them with those of the American Cancer Society, which lowered the age for initiating screening to 45 years in 2018.
Colorectal cancer (CRC) is one of the most preventable malignancies, owing to its long natural history of progression and the availability of screening tests that can intercept and detect the disease early. Overall incidence of CRC in individuals 50 years of age and older has declined steadily since the mid-1980s, largely because of increased screening and changing patterns of modifiable risk factors.
“A concerning increase in colorectal cancer incidence among younger individuals (ie, younger than 50 years; defined as young-onset colorectal cancer) has been documented since the mid-1990s, with 11% of colon cancers and 15% of rectal cancers in 2020 occurring among patients younger than 50 years, compared with 5% and 9%, respectively, in 2010,” said Kimmie Ng, MD, MPH, first author of an editorial in JAMA accompanying the article about the guideline change of the USPSTF. Ng is the director of the Young-Onset Colorectal Cancer Center at Dana-Farber Cancer Institute.
The causes of the increase in young-onset CRC aren’t currently known.
Lowering the recommended age to initiate screening “will make colorectal cancer screening, which is so important, available to millions more people in the United States, and hopefully many more lives will be saved by catching colorectal cancer earlier, as well as by preventing colorectal cancer,” said Ng.


Racial disparities in COVID-19 mortality wider than reported, study finds

More than a year into the pandemic, the disproportionate burden of COVID-19 among racial and ethnic minorities in the US has been well documented. But a new study by Boston University School of Public Health (BUSPH) reveals that previous research has underestimated the true extent of racial disparities in COVID-19 deaths — as well as the extent to which structural racism contributes to these deaths.
Published in the Journal of Racial and Ethnic Health Disparities, the paper is the first to quantify the state-level differences in racial disparities in COVID-19 mortality among Black and White populations, using directly standardized, age-adjusted death rates. When comparing these age-adjusted rates, the study found that the Black-White disparity in COVID-19 mortality rates across states was substantially greater than what has previously been reported.
Until now, few studies on COVID-19 deaths have taken the age distribution of different populations into account, or explicitly compared race-specific mortality rates at the state level. But age is perhaps the single most important predictor of COVID-19 mortality, says study lead author Dr. Michael Siegel, professor of community health sciences at BUSPH. Different racial and ethnic groups have different age distributions, and comparing crude COVID death rates instead of age-adjusted rates can be misleading, he says.
“Because of structural racism, chronic diseases are much more common among the Black population compared to the White population, and for this reason, life expectancy for Black people is substantially less than that for White people,” says Siegel. The shorter life expectancy means that the Black population of a state is going to be considerably younger than the White population for that state, he says.
“If you ignore this fact, then it is going to appear that the COVID death rate is much higher among the White population because there are so many more older people,” Siegel says. “To get an accurate idea of the true disparity in death rates, you need to compare the COVID death risk of Black people and White people in a state at the same age.”
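The effect Siegel describes can be made concrete with a toy direct-standardization calculation. All population shares and death rates below are invented for illustration (deaths per 100,000), not figures from the study:

```python
# Direct age standardization: apply each group's age-specific death
# rates to a common reference age distribution, so the comparison is
# no longer distorted by one population being younger. Toy numbers.
age_share = {  # fraction of each population in each age band (made up)
    "White": {"under_65": 0.40, "65_plus": 0.60},
    "Black": {"under_65": 0.60, "65_plus": 0.40},
}
death_rate = {  # age-specific death rates per 100,000 (made up)
    "White": {"under_65": 10, "65_plus": 200},
    "Black": {"under_65": 25, "65_plus": 500},
}
standard = {"under_65": 0.50, "65_plus": 0.50}  # reference population

def crude(group):
    # Weighted by the group's own age distribution.
    return sum(age_share[group][a] * death_rate[group][a] for a in standard)

def adjusted(group):
    # Weighted by the shared standard population.
    return sum(standard[a] * death_rate[group][a] for a in standard)

print("crude ratio:   ", crude("Black") / crude("White"))        # ~1.73
print("adjusted ratio:", adjusted("Black") / adjusted("White"))  # 2.5
```

Even though the Black population here has higher death rates at every age, its younger age structure pulls its crude rate down, so the crude ratio (about 1.73) understates the age-adjusted disparity (2.5), which is exactly the pattern the study reports across states.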
The study is also the first paper to measure structural racism at the state level, and model this racism as a direct, quantifiable predictor of racial disparities in COVID-19 deaths across states.
Siegel and a team of researchers used data from the Centers for Disease Control and Prevention to calculate both crude and age-adjusted COVID-19 mortality rates for non-Hispanic White and non-Hispanic Black populations in 35 states. They then compared these age-adjusted rates in order to quantify the racial disparity in mortality rates. Using linear regression analyses, they examined the potential relationship between a previously defined structural racism index and the racial disparity in COVID-19 mortality, and also explored potential mediating effects of occupational exposure, prevalence of underlying conditions, and disparities in healthcare access.

This analysis yielded three key findings. First, for all 35 states, the Black-White disparity in COVID-19 mortality rates was substantially greater when examining age-adjusted rates than when examining crude rates; relying on crude death rates therefore severely underestimates the true magnitude of the disparity. Second, a high level of structural racism was a robust predictor of increased racial disparities in COVID-19 deaths: all 35 states showed increasing racial disparities as the state structural racism index increased, and the five states with the highest structural racism indices had an average disparity ratio of 2.7, compared with 2.1 for the five states with the lowest indices. Third, structural racism appears to be a root cause of the Black-White disparity in COVID-19 mortality.

“Even if we could somehow equalize comorbidities between the White and Black populations, our results suggest that the racial disparity in COVID-19 death rates would still persist,” says Siegel. “These findings suggest that the only way to fully address the consequences of structural racism is to dismantle structural racism itself.”
The study was coauthored by Isabella Critchfield-Jain, a Boston University student; Matthew Boykin, a BUSPH student; and Alicia Owens, a Natick High School student.
Story Source:
Materials provided by Boston University School of Medicine. Original written by Jillian McKoy. Note: Content may be edited for style and length.

Read more →

Test detects childhood tuberculosis a year ahead

Researchers at Tulane University School of Medicine have developed a highly sensitive blood test that can find traces of the bacterium that causes tuberculosis (TB) in infants a year before they develop the deadly disease, according to a study published in BMC Medicine.
Using only a small blood sample, the test detects a protein secreted by Mycobacterium tuberculosis, which causes TB infection. It can screen for all forms of TB and rapidly evaluate a patient’s response to treatment, said lead study author Tony Hu, PhD, Weatherhead Presidential Chair in Biotechnology Innovation at Tulane University.
“This is a breakthrough for infants with tuberculosis because we don’t have this kind of screening technology to catch early infections among those youngest groups who are most likely to be undiagnosed,” Hu said. “I hope this method can be pushed forward quickly to reach these children as early as possible.”
Each year, nearly a million children develop TB and 205,000 die of TB-related causes. More than 80% of childhood TB deaths occur in children under the age of 5. Most of these deaths occur because the disease goes undiagnosed: children with TB, particularly infants, usually have symptoms that are not specific to the disease, and they also have difficulty producing the respiratory samples used for detection by the best TB tests now in use.
Even when it is possible to obtain these samples from children, they tend to be less effective for diagnosis, since they often contain much less of the bacteria than samples from adults, Hu said. His test’s assay, however, uses a small blood sample that can be easily obtained from children of any age to detect a specific protein (CFP-10) that the bacteria secrete to maintain the infection that develops into TB. Since this protein is present at very low levels in the blood, Hu’s assay uses an antibody specific for this protein to enrich it from other proteins in blood and a mass spectrometer to detect it with high sensitivity and accuracy.
Hu and his team used this test to screen stored blood samples collected from 284 children with HIV and 235 children without the virus who participated in a large clinical trial conducted between 2004 and 2008. The test identified children diagnosed with TB by the current gold-standard TB tests with 100% accuracy. The assay also detected 83.7% of TB cases that were missed by these tests but were later diagnosed using a standard checklist drawing on other information collected by each child’s physician (unconfirmed TB cases). Hu’s test also detected CFP-10 in 77% of blood samples collected 24 weeks before the children were diagnosed with TB by other methods, indicating its strong potential for early diagnosis. In some positive cases, the biomarker was detectable as early as 60 weeks before TB was confirmed.
The researchers are working to develop an inexpensive, portable instrument to read the test to allow it to be more easily used in resource-limited settings often encountered in areas where TB is prevalent.
The study was funded by the National Institute of Allergy and Infectious Diseases, the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the U.S. Department of Defense.
Story Source:
Materials provided by Tulane University. Original written by Carolyn Scofield. Note: Content may be edited for style and length.
