Frequent COVID-19 testing key to efficient, early detection, study finds

The chance of detecting the virus that causes COVID-19 increases with more frequent testing, no matter the type of test, a new study found. Both polymerase chain reaction and antigen tests, paired with rapid results reporting, can achieve 98% sensitivity if deployed at least every three days.
“This study shows that frequent testing can be really effective at catching COVID-19 infections and potentially blocking transmission,” said study leader Christopher Brooke, a virologist and professor of microbiology at the University of Illinois Urbana-Champaign. “There are many places where vaccination is not yet widespread. With the rise of variants, testing remains an important tool for blocking the spread of the virus.”
Part of the Rapid Acceleration of Diagnostics Tech program of the National Institutes of Health, the study brought together researchers at Illinois; the University of Massachusetts Medical School, Worcester; Johns Hopkins School of Medicine, Baltimore; and the NIH National Institute of Biomedical Imaging and Bioengineering. The researchers published their results in the Journal of Infectious Diseases.
Students and employees at the U. of I. who had tested positive for COVID-19 or were identified as close contacts of a person who tested positive were invited to participate. Because of the SHIELD Illinois screening program, which required students and employees to take multiple saliva-based tests each week and returned results in less than 24 hours, the university provided an ideal location for identifying cases before they became symptomatic, the researchers said.
The 43 study participants received three tests daily for 14 days: a PCR nasal swab, a PCR saliva test and an antigen nasal swab. The results of each were compared with live viral cultures taken from the PCR nasal swab, which show when a person is actively infectious. The study also examined how the frequency of testing affected each method’s efficacy at detecting an infection.
“Different tests have different advantages and limitations. Antigen tests are fast and cheap, but they are not as sensitive as PCR tests. PCR tests are the gold standard, but they take some time to return results and are more expensive,” said Rebecca Lee Smith, a professor of epidemiology at Illinois and the first author of the study. “This study was designed to show, based on real data, which test is best under which circumstances and for what purpose.”
The results showed that the PCR tests — particularly saliva-based ones — were best at detecting cases before the person had an infectious viral load, a key to isolating individuals before they can spread the virus, Smith said. For all three methods, testing every three days achieved 98% sensitivity for detecting infection.
If that testing frequency declined to once a week, the PCR methods maintained their high sensitivity but the antigen tests dropped to around 80%. That means organizations that wish to deploy antigen testing as part of a reopening strategy or individuals who wish to monitor their status at home should use antigen tests multiple times each week to achieve similar results to PCR testing, the researchers said.
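The intuition behind serial testing can be sketched with a toy calculation: if each test independently has sensitivity p, then n tests give a combined chance of 1 − (1 − p)^n of at least one positive. This is a deliberate simplification (repeat tests on the same person are not truly independent, and it is not the study's own model; the function name is hypothetical), but it shows why repetition compensates for a less sensitive test:

```python
def serial_sensitivity(per_test_sensitivity: float, n_tests: int) -> float:
    """Chance of at least one positive result across n tests, under the
    simplifying assumption that test outcomes are independent."""
    return 1 - (1 - per_test_sensitivity) ** n_tests

# A single antigen test at ~80% sensitivity vs. three in one week:
print(round(serial_sensitivity(0.80, 1), 3))  # 0.8
print(round(serial_sensitivity(0.80, 3), 3))  # 0.992
```

In practice, successive results from one person are correlated through the viral load trajectory, so the real-world gain is smaller; the study measured sensitivity empirically rather than assuming independence.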
“This work also shows how the PCR and antigen tests could be used in combination,” Smith said. “For example, I work with a lot of school districts, helping them to plan for fall, since vaccines are not yet available to those under 12 years old. If a student had a known exposure or comes to school symptomatic, give them both tests. Antigen tests are really good at finding those highly infectious people, so that can tell administrators right away if the child needs to be sent home, rather than waiting 24 hours for PCR results. If the antigen test is negative, the PCR test is a backup, as it may detect the infection earlier than an antigen test would, before the student becomes contagious.”
The results of the study helped inform the U.S. Food and Drug Administration’s recommendations and instructions on how to use at-home antigen tests that recently received emergency use authorization. The researchers said they hope the results assist schools, businesses and other organizations as they reopen.
“If you are in a situation where you have the resources and capacity to do large-scale PCR testing with rapid results reporting like we did here at Illinois, you can identify infections early and potentially isolate people before they become contagious,” Brooke said. “In places where PCR testing is not readily available or rapid results reporting is not possible, but the cheaper and more rapid tests are available, our data show how those tests can be deployed in a way that can increase their sensitivity — through repeated serial testing, ideally three times a week or more.”
The study was conducted with support from the NIH RADx-Tech program under grant U54 HL143541-02S2.


Human stem cells enable model to test drug impact on brain's blood barrier

Using an experimental model to simulate the blood-brain barrier, scientists in Sweden reported in unprecedented detail how antioxidants protect the brain from inflammation caused by neurodegenerative diseases such as Alzheimer’s and Parkinson’s.
The study, conducted as a proof of concept by brain model developers at KTH Royal Institute of Technology in Stockholm, showed in minute-by-minute detail how the blood-brain barrier reacts to high levels of inflammation after the administration of a next-generation derivative of NAC (N-acetylcysteine), a widely used anti-inflammatory drug.
The first test of NACA (N-acetylcysteine amide) with human stem cell-derived cells showed that the breakdown of the barrier under high loads of inflammation is “actually more complex than we thought,” says KTH researcher Thomas Winkler.
The findings were published in the journal Small.
“This was the first test of this NACA compound with human stem cells,” Winkler says. “The results show that we can use this to test other derivatives of the NAC compound — as well as different antioxidants — and see if we find anything that has even higher neural protection.”
Co-author Isabelle Matthiesen, a PhD student at KTH, says that the research is not meant to provide definitive proof of how anti-inflammatories affect the brain; yet the results provide encouraging evidence that the model could replace testing drugs on animals before clinical trials.


Employed individuals more likely to contract the flu, study shows

A University of Arkansas researcher and international colleagues found that employed individuals are, on average, 35.3% more likely than those outside the workforce to be infected with the flu virus.
The findings confirm a long-held assumption about one prevalent way illness spreads and could influence government policy on public health and several issues for private companies, from optimal design and management of physical work spaces to policy decisions about sick leave and remote work.
To track influenza incidence, Dongya “Don” Koh, assistant professor of economics in the Sam M. Walton College of Business, and colleagues relied on nationally representative data from the Medical Expenditure Panel Survey, which provides comprehensive health care information about families and individuals, their medical providers and U.S. employers. The survey is the most complete source of data on the cost and use of health care and health insurance coverage.
Koh and his colleagues found significant differences in flu incidence across occupations and industries. Among occupations, for example, people working in sales had a 40.5% higher probability of infection than farmers; among industries, education, health and social services showed a 52.2% higher probability of infection than mining. The results controlled for individual characteristics, including vaccination status, health insurance and other circumstances.
“Cross-industry differences in flu incidence cannot be fully explained by differences within an industry-specific occupational structure,” Koh said. “So we had to look at the extent of human contact and interaction at work as a potential mechanism for contagion.”
To do this, the researchers constructed a measure of occupation-specific and industry-specific human exposure and interaction, based on data gleaned from O*NET OnLine, a comprehensive source for the description of jobs, occupational information and workforce development. The researchers found that higher human contact at work was positively associated with higher contagion rates.
The effects were larger in years of high aggregate flu incidence and held consistently across firm size, number of jobs and hours worked.
“These results shouldn’t surprise anyone,” Koh said. “We hope they are relevant for an understanding of the spread of flu and other infectious diseases transmitted via respiratory droplets or close human contact, including SARS and COVID. The fact that contagion risk varies across occupations and industries opens the door for an assessment of nonpharmaceutical policies to combat contagion and possibly pandemics. In this sense, we think these results provide a basis for an organizational policy that both protects workers and optimizes production and efficiency.”
In addition to Koh, the research team included Anna Houstecka, research associate in labor economics at the Institute for Employment Research in Nuremberg, Germany; and Raul Santaeulalia-Llopis, a senior research professor of economics at Universitat Autònoma in Barcelona, Spain.
Story Source:
Materials provided by University of Arkansas. Original written by Matt McGowan. Note: Content may be edited for style and length.


Embryo freezing for IVF appears linked to blood pressure problems in pregnancy

A large cohort study drawn from the national IVF registry of France, which included almost 70,000 pregnancies delivered after 22 weeks gestation between 2013 and 2018, has found a higher risk of pre-eclampsia and hypertension in pregnancies derived from frozen-thawed embryos. The risk was significantly greater in treatments in which the uterus was prepared for implantation with hormone replacement therapy. The results confirm with real-life data what has been observed in sub-groups of patients in other studies.
The results are presented today by Dr Sylvie Epelboin from the Hôpital Bichat-Claude Bernard, Paris, at the online annual meeting of ESHRE. The study was performed on behalf of the Mother & Child Health after ART network of the French Biomedicine Agency. She said that the results highlight two important considerations in IVF: the potentially harmful effects on vascular pathologies of high and prolonged doses of the hormone replacement therapies used to prepare the uterus for the implantation of frozen-thawed embryos; and the protective effect of a corpus luteum (1), which is present in natural or stimulated cycles for embryo transfer. The hormone replacement therapy given to prepare the uterus for embryo transfer, explained Dr Epelboin, suppresses ovulation and therefore the formation of the corpus luteum.
The risk of pre-eclampsia and other pregnancy-related disorders has been raised in a growing number of studies of embryo freezing in IVF.(2) However, the overall risks of maternal morbidity are known to be generally lower in pregnancies resulting from frozen embryo transfer than in those from fresh transfers — except in relation to the risk of pre-eclampsia. While some studies have observed such risks in frozen embryo transfers, few studies, said Dr Epelboin, have compared these “maternal vascular morbidities with the two hormonal environments that preside over the early stages of embryonic development.”
This study divided the cohort of pregnancies from IVF and ICSI in the French national database into three groups of singletons for comparison: those derived from frozen embryo transfer in a natural “ovulatory” cycle (whether stimulated or not) (n = 9,500); those from frozen embryo transfer with hormone replacement therapy (n = 10,373); and conventional fresh transfers (n = 48,152).
Results showed a higher rate of pre-eclampsia with frozen embryos transferred in the artificial (ie, prepared with hormone therapy) frozen cycle (5.3%) than in an ovulatory cycle (2.3%) or in fresh cycles (2.4%). A similar pattern was seen for pregnancy-induced hypertension (4.7% vs 3.4% vs 3.3%). These differences remained statistically significant after adjusting for maternal characteristics (age, parity, tobacco use, obesity, history of diabetes, hypertension, endometriosis, polycystic ovaries, premature ovarian failure) to avoid bias.
Dr Epelboin and colleagues concluded that the study demonstrates that preparation of the uterus with hormones in an artificial cycle is significantly associated with a higher risk of vascular disorders than from cycles with preserved ovulation and fresh embryo transfers.
The use of frozen embryos in IVF has increased in recent years. Success rates with frozen-thawed embryo transfers are reported to be as good as or better than with fresh embryos, and because frozen transfer appears to reduce the risk of ovarian hyperstimulation, it also has safety advantages; the blood pressure risks observed in this study and others do not appear to outweigh these benefits, said Dr Epelboin.
Moreover, because preserving the ovulatory cycle appears not to affect the chance of pregnancy, it could be advocated as the first-line preparation for frozen embryo transfers whenever the choice is possible.
Presentation O-182, Wednesday 30 June: Higher risk of preeclampsia and pregnancy-induced hypertension with artificial cycle for frozen-thawed embryo transfer compared to ovulatory cycle or fresh transfer following in vitro fertilization.

Notes
1. The corpus luteum in pregnancy: The corpus luteum is a naturally developing cluster of cells which forms in the ovary during early pregnancy and pumps out a pulse of progesterone, a fertility hormone. Progesterone supports the lining of the uterus (endometrium) during pregnancy and improves blood flow.
2. Embryo freezing and the risk of pre-eclampsia in pregnancy: This is the first large-scale study to identify an association between a hormonally prepared uterus (artificial cycle) and a significantly raised risk of pre-eclampsia in pregnancies following the transfer of a frozen-thawed embryo. Several (but not all) randomised trials of freezing embryos generated from an initial egg collection (“freeze-all”) have observed such trends as a secondary endpoint. A substantial review of the literature (Maheshwari et al, Hum Reprod Update 2018) concluded that the evidence in favour of embryo freezing was “reassuring” while adding “a need for caution” over the increased risk of hypertension in pregnancy. Generally, embryo freezing allows several transfers from an initial egg collection treatment (thereby encouraging single embryo transfer and the avoidance of multiple pregnancies) and, in freeze-all protocols, avoids transfer in the same cycle in which the ovaries were stimulated.


Eating disorder behaviors alter reward response in brain

Researchers have found that eating disorder behaviors, such as binge-eating, alter the brain’s reward response process and food intake control circuitry, which can reinforce these behaviors. Understanding how eating disorder behaviors and neurobiology interact can shed light on why these disorders often become chronic and could aid in the future development of treatments. The study, published in JAMA Psychiatry, was supported by the National Institutes of Health.
“This work is significant because it links biological and behavioral factors that interact to adversely impact eating behaviors,” said Janani Prabhakar, Ph.D., of the Division of Translational Research at the National Institute of Mental Health, part of NIH. “It deepens our knowledge about the underlying biological causes of behavioral symptom presentation related to eating disorders and will give researchers and clinicians better information about how, when, and with whom to intervene.”
Eating disorders are serious mental illnesses that can lead to severe complications, including death. Common eating disorders include anorexia nervosa, bulimia nervosa, and binge-eating disorder. Behaviors associated with eating disorders can vary in type and severity and include actions such as binge-eating, purging, and restricting food intake.
In this study, Guido Frank, M.D., at the University of California San Diego, and colleagues wanted to see how behaviors across the eating disorder spectrum affect reward response in the brain, how changes in reward response alter food intake control circuitry, and if these changes reinforce eating disorder behaviors. The study enrolled 197 women with different eating disorders (including anorexia nervosa, bulimia nervosa, binge-eating disorder, and other specified feeding and eating disorders) and different body mass indexes (BMIs) associated with eating disorder behaviors, as well as 120 women without eating disorders.
The researchers used cross-sectional functional brain imaging to study brain responses during a taste reward task. During this task, participants received or were denied an unexpected, salient sweet stimulus (a taste of a sugar solution). The researchers analyzed a brain reward response known as “prediction error,” a dopamine-related signaling process that measures the degree of deviation from expectation, or how surprised a person was by receiving the unexpected stimulus. A higher prediction error indicates that the person was more surprised; a lower prediction error indicates they were less surprised. They also investigated whether this brain response was associated with ventral-striatal-hypothalamic circuitry, a neural system associated with food intake control.
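Conceptually, a reward prediction error is just the gap between the outcome received and the outcome expected. A minimal sketch with hypothetical values (an illustration of the concept only, not the study's imaging analysis; the function name is invented):

```python
def prediction_error(received_reward: float, expected_reward: float) -> float:
    """Simple reward prediction error: positive when the outcome exceeds
    expectation (a surprise), negative when an expected reward is omitted."""
    return received_reward - expected_reward

# Unexpected sugar taste delivered (expected nothing, got a reward):
print(prediction_error(1.0, 0.0))  # 1.0 (positive surprise)
# Expected sugar taste withheld (expected a reward, got nothing):
print(prediction_error(0.0, 1.0))  # -1.0 (negative surprise)
```

In the study, the magnitude of this dopamine-related signal was estimated from brain activity rather than computed from explicit reward values.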
The researchers found that there was no significant correlation between BMI, eating disorder behavior, and brain reward response in the group of women without eating disorders. In the group of women with eating disorders, higher BMI and binge-eating behaviors were associated with lower prediction error response. Further, for the women with eating disorders, the direction of ventral-striatal-hypothalamic connectivity was the reverse of that in women without eating disorders, with connectivity directed from the ventral striatum to the hypothalamus. This connectivity was positively related to the prediction error response and negatively related to feeling out of control after eating.
These results suggest that for the women with eating disorders, eating disorder behaviors and excessive weight loss or weight gain modulated the brain’s dopamine-related reward circuit response, altering brain circuitry associated with food intake control, and potentially reinforcing eating disorder behaviors. For example, women with anorexia nervosa, restrictive food intake, and low BMIs had a high prediction error response. This response may strengthen their food intake-control circuitry, leading these women to be able to override hunger cues. In contrast, the opposite seems to be the case for women with binge-eating episodes and higher BMIs.
“The study provides a model for how behavioral traits promote eating problems and changes in BMI, and how eating disorder behaviors, anxiety, mood, and brain neurobiology interact to reinforce the vicious cycle of eating disorders, making recovery very difficult,” said Dr. Frank.
Overall, this study suggests that behavioral traits, including food intake behavior, contribute to eating disorder maintenance and progression by modulating one’s internal reward response and altering food intake control circuitry. However, further research is needed to investigate treatments that could target and change behaviors for individuals with eating disorders to achieve lasting recovery.


Study finds changes in wealth tied to changes in cardiovascular health

A new study by investigators from Brigham and Women’s Hospital examines the associations between wealth mobility and long-term cardiovascular health. The multidisciplinary study borrowed methodology from the field of economics to analyze longitudinal changes in wealth. The team’s results indicate that negative wealth mobility is associated with an increased risk of cardiovascular events, while positive wealth changes are associated with a decreased risk of cardiovascular events. Their results are published in JAMA Cardiology.
“Low wealth is a risk factor that can dynamically change over a person’s life and can influence a person’s cardiovascular health status,” said Muthiah Vaduganathan, MD, MPH from the Brigham’s Division of Cardiovascular Medicine. “So, it’s a window of opportunity we have for an at-risk population. Buffering large changes in wealth should be an important focus for health policy moving ahead.”
The retrospective study leveraged data from the RAND Health and Retirement Study (HRS). While wealth data is infrequently categorized in most studies, the HRS uniquely captures detailed information about both housing (primary residence, mortgages, home loans and more) and non-housing (vehicles, businesses, stocks, mutual funds, checking and savings accounts and more) wealth across multiple interviews. The study examined 5,579 adults 50 years or older with no cardiovascular health concerns at baseline. Between January 1992 and December 2016, the HRS research team collected data through interviews with participants about any new diagnoses they had received. For deceased participants, next of kin were interviewed and the National Death Index was consulted for additional information.
“Income and wealth, while perhaps informally used interchangeably, actually provide different and complementary perspectives,” said Sara Machado, PhD, an economist at the Department of Health Policy at the London School of Economics. “Income reflects money received on a regular basis, while wealth is more holistic, encompassing both assets and debts. Could paying off one’s debt with a large relative wealth increase be important in promoting cardiovascular health, even without changes in income?”
For the purposes of this study, upward wealth mobility was defined as an increase of at least one wealth quintile and, similarly, downward wealth mobility was defined as a decrease of at least one quintile relative to peers of similar age. Participants who were in the same wealth quintile between interviews were classified as having stable wealth. Altogether, an increase in wealth was associated with protection against cardiovascular diseases and a decrease in wealth was associated with cardiovascular risk.
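The mobility rule described above can be sketched as a simple classification (an illustrative function, not the study's actual code; quintile numbering is assumed to run from 1 = lowest to 5 = highest):

```python
def classify_wealth_mobility(quintile_start: int, quintile_end: int) -> str:
    """Classify wealth mobility between two interview waves by quintile,
    following the rule described in the study: a change of at least one
    quintile relative to age-matched peers counts as upward or downward."""
    if quintile_end > quintile_start:
        return "upward"
    if quintile_end < quintile_start:
        return "downward"
    return "stable"

print(classify_wealth_mobility(2, 4))  # upward
print(classify_wealth_mobility(3, 3))  # stable
```

Note that quintiles here are relative ranks within an age cohort, so a participant's classification depends on peers' wealth as well as their own.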
“Decreases in wealth are associated with more stress, fewer healthy behaviors, and less leisure time, all of which are associated with poorer cardiovascular health,” said Andrew Sumarsono, MD from University of Texas Southwestern’s Division of Hospital Medicine. “It is possible that the inverse is true and may help to explain our study’s findings.”
In terms of limitations, all interviews and reports of new diagnoses were self-reported by the participants. Additionally, there are many factors that influence general cardiovascular health, including certain environmental and socio-demographic variables. Some of these factors were not collected by the HRS and therefore were not considered in the study.
The research team hopes that the findings of their research can inform the future of health policy and medical literature.
“Wealth and health are so closely integrated that we can no longer consider them apart,” said Vaduganathan. “In future investigations, we need to make dedicated efforts to routinely measure wealth and consider it a key determinant of cardiovascular health.”
HRS is sponsored by the National Institute on Aging. This secondary analysis was unfunded.
Story Source:
Materials provided by Brigham and Women’s Hospital.


Investigational malaria vaccine gives strong, lasting protection

Two U.S. Phase 1 clinical trials of a novel candidate malaria vaccine have found that the regimen conferred unprecedentedly high levels of durable protection when volunteers were later exposed to disease-causing malaria parasites. The vaccine combines live parasites with either of two widely used antimalarial drugs — an approach termed chemoprophylaxis vaccination. A Phase 2 clinical trial of the vaccine is now underway in Mali, a malaria-endemic country. If the approach proves successful there, chemoprophylaxis vaccination, or CVac, potentially could help reverse the stalled decline of global malaria. Currently, there is no vaccine in widespread use for the mosquito-transmitted disease.
The trials were conducted at the National Institutes of Health (NIH) Clinical Center in Bethesda, Maryland. They were led by Patrick E. Duffy, M.D., of the NIH National Institute of Allergy and Infectious Diseases (NIAID), and Stephen L. Hoffman, M.D., CEO of Sanaria Inc., Rockville, Maryland.
The Sanaria vaccine, called PfSPZ, is composed of sporozoites, the form of the malaria parasite transmitted to people by mosquito bites. Sporozoites travel through blood to the liver to initiate infection. In the CVac trials, healthy adult volunteers received PfSPZ along with either pyrimethamine, a drug that kills liver-stage parasites, or chloroquine, which kills blood-stage parasites. Three months later, under carefully controlled conditions, the volunteers were exposed to either an African malaria parasite strain that was the same as that in the vaccine (homologous challenge) or a variant South American parasite (heterologous challenge) that was more genetically distant from the vaccine strain than hundreds of African parasites. Exposure in both cases was via inoculation into venous blood, which infects all unvaccinated individuals.
At the lowest PfSPZ dosage, the CVac approach conferred modest protection: only two of nine volunteers (22.2%) who received the pyrimethamine combination were protected from homologous challenge. In contrast, seven out of eight volunteers (87.5%) who received the highest PfSPZ dosage combined with pyrimethamine were protected from homologous challenge, and seven out of nine volunteers (77.8%) were protected from heterologous challenge. In the case of the chloroquine combination, all six volunteers (100%) who received the higher PfSPZ dosage were completely protected from heterologous challenge. The high levels of cross-strain protection lasted at least three months (the time elapsed between vaccination and challenge) for both higher-dose regimens. One hundred percent protection for three months against heterologous variant parasites is unprecedented for any malaria vaccine in development, the authors note. These data suggest that CVac could be a promising approach for vaccination of travelers to and people living in malaria-endemic areas.
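The protection rates quoted above are simple proportions of challenged volunteers who remained free of infection; as a quick arithmetic check (illustrative code, not from the trial):

```python
def protection_rate(protected: int, total: int) -> float:
    """Protection expressed as the percentage of challenged volunteers
    who did not develop malaria infection."""
    return 100 * protected / total

# Figures reported in the trials:
print(round(protection_rate(2, 9), 1))   # 22.2 (low-dose pyrimethamine arm)
print(round(protection_rate(7, 8), 1))   # 87.5 (high-dose, homologous)
print(round(protection_rate(7, 9), 1))   # 77.8 (high-dose, heterologous)
print(protection_rate(6, 6))             # 100.0 (chloroquine, heterologous)
```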
Story Source:
Materials provided by NIH/National Institute of Allergy and Infectious Diseases.


Breakthrough for tracking RNA with fluorescence

Researchers at Chalmers University of Technology, Sweden, have succeeded in developing a method to label mRNA molecules and thereby follow their path through cells in real time under a microscope — without affecting the molecules’ properties or subsequent activity. The breakthrough could be of great importance in facilitating the development of new RNA-based medicines.
RNA-based therapeutics offer a range of new opportunities to prevent, treat and potentially cure diseases. But currently, the delivery of RNA therapeutics into the cell is inefficient. For new therapeutics to fulfil their potential, the delivery methods need to be optimised. Now, a new method, recently presented in the highly regarded Journal of the American Chemical Society, can provide an important piece of the puzzle of overcoming these challenges and take the development a major step forward.
“Since our method can help solve one of the biggest problems for drug discovery and development, we see that this research can facilitate a paradigm shift from traditional drugs to RNA-based therapeutics,” says Marcus Wilhelmsson, Professor at the Department of Chemistry and Chemical Engineering at Chalmers University of Technology, and one of the main authors of the article.
Making mRNA fluorescent without affecting its natural activity
The research behind the method was done in collaboration with chemists and biologists at Chalmers and the biopharmaceutical company AstraZeneca, through their joint research centre FoRmulaEx, as well as with a research group at the Pasteur Institute in Paris.
The method involves replacing one of the building blocks of RNA with a fluorescent variant that, apart from that feature, maintains the natural properties of the original base. The fluorescent units were developed with the help of specially designed chemistry, and the researchers have shown that they can be used to produce messenger RNA (mRNA) without affecting the mRNA’s ability to be translated into a protein at natural speed, something that had never before been achieved. The fluorescence furthermore allows the researchers to follow functional mRNA molecules in real time, watching under a microscope as they are taken up into cells.


Cross-generational consequences of lead poisoning

Japanese and Zambian scientists have shown that environmental lead poisoning in children affects not only their own health and wellbeing, but the vitality and mental health of their mothers, as well.
Lead poisoning is a common and easily preventable pediatric problem caused by environmental exposure. Due to their smaller size and mass, infants and children are at higher risk of negative effects than adults. Chronic lead poisoning leads to fatigue, sleeping problems, headaches, stupor, and anemia. The population of Kabwe, Zambia, is exposed to extremely high levels of lead, a direct result of the Broken Hill mine, which operated until 1994 and contaminated the surrounding area; a large number of citizens in Kabwe make a living working the mine tailings, further exposing themselves to heavy metal poisoning.
Recently, a team of scientists from Japan and Zambia, including Hokkaido University’s Professor Harukazu Tohyama and Dr. Hokuto Nakata, have established a significant negative correlation between chronic lead poisoning in children and health-related quality of life of their mothers. Their findings were published in the journal Chemosphere.
The health-related quality of life (HRQoL) comprehensively assesses the health and well-being of an individual. Children’s health and well-being strongly influence the HRQoL of their mothers, with both positive and negative outcomes having been documented. The effects of chronic lead poisoning of children on maternal HRQoL were assumed to be negative; however, this had not been investigated and the exact extent of the interrelation was unknown.
The study was carried out in 40 randomly selected areas in Kabwe, with 25 households tested in each area. The scientists combined data from blood sample tests, a health survey (SF-36) and an economic survey (KHSS 2017), and carried out statistical analyses to identify significant relationships between these three data sets.
The scientists demonstrated significant negative associations between the blood lead levels (BLLs) of the children in Kabwe and the HRQoL scores of their mothers, irrespective of the blood lead levels in the mothers. Mental health and vitality were particularly impacted. Previous studies reported that lead exposure may cause behavioral problems in children, which could explain the adverse effects on maternal vitality found in this study. Socio-economic factors and maternal age did affect the HRQoL scores, but only in some areas, unlike children’s BLLs. In addition, the BLLs of children were significantly higher than those of their parents.
The biggest limitation of the study was that not all of the 1000 selected households were able to provide data for all parameters examined; in fact, just 404 households provided data of sufficient quality to be analysed. Future work must focus on examining the relations between HRQoL scores, household incomes, and BLLs at a larger scale.
“Urgent medical intervention for the children with high BLL combined in parallel with environmental remediation in Kabwe would not just improve the health status of children in Kabwe, but could also improve the HRQoL of mothers,” says Hokuto Nakata.
Story Source:
Materials provided by Hokkaido University.


Thermal imaging offers early alert for chronic wound care

New research shows thermal imaging techniques can predict whether a wound needs extra management, offering an early alert system to improve chronic wound care.
In developed countries, an estimated 1-2% of the population will experience a chronic wound during their lifetime — in the US, chronic wounds affect about 6.5 million patients, with more than US$25 billion spent by the healthcare system each year on treating related complications.
The Australian study shows textural analysis of thermal images of venous leg ulcers (VLUs) can detect whether a wound needs extra management as early as week two for clients receiving treatment at home.
The clinical study by RMIT University and Bolton Clarke, published in the Nature journal Scientific Reports, is the first to investigate textural analysis on VLUs using thermal images that do not require physical contact with the wound.
Researchers found the method, which provides information on spatial heat distribution in a wound, could accurately predict whether VLUs would heal in 12 weeks by the second week after baseline assessment.
This is because wounds change significantly over the healing trajectory, with higher temperatures signalling potential inflammation or infection while lower temperatures can indicate a slower healing rate due to decreased oxygen in the region.
