Kids' blood pressure measurements different between arms, potential for misdiagnosis

Blood pressure measurements in children and adolescents should be taken from both arms after new research showed substantial differences could be seen depending on which arm was used.
The study, led by the Murdoch Children’s Research Institute (MCRI) and published in the Journal of Hypertension, found even a small difference in blood pressure measurements between arms could lead to a wrong diagnosis.
MCRI PhD candidate and study lead author Melanie Clarke said this was the first study worldwide to determine the size and frequency of inter-arm blood pressure differences in children and adolescents.
The study involved 118 participants, aged 7-18 years, recruited from a cardiology day clinic in Melbourne. It found that one in four healthy children had an inter-arm difference large enough to lead to misdiagnosis. This figure doubled in those with a history of aortic surgery, which is often performed in infants with congenital heart disease.
Ms Clarke said the high rates of misclassification occurred because the difference between a normal and hypertensive recording was so small.
“Misdiagnosis could occur when the blood pressure difference is greater than about 5 mmHg, but one in seven healthy children had a difference greater than 10 mmHg, which could lead to a failure to identify stage one or two hypertension,” she said.
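As a rough illustration of the thresholds Ms Clarke describes, the sketch below flags an inter-arm systolic difference above about 5 mmHg, and a larger one above 10 mmHg. It is not a clinical rule; the function name and example readings are invented for illustration.

```python
# Illustrative sketch only, not a clinical tool: describe an inter-arm systolic
# difference using the figures quoted in the article (~5 mmHg as enough to change
# a classification, >10 mmHg as a large difference). Names and readings are hypothetical.

def describe_inter_arm_difference(right_arm_mmhg: float, left_arm_mmhg: float) -> str:
    diff = abs(right_arm_mmhg - left_arm_mmhg)
    if diff > 10:
        return f"{diff:.0f} mmHg difference: large enough to miss stage one or two hypertension."
    if diff > 5:
        return f"{diff:.0f} mmHg difference: could change the diagnosis."
    return f"{diff:.0f} mmHg difference: unlikely to affect classification on its own."

# Example readings (hypothetical): 118 mmHg in the right arm, 107 mmHg in the left.
print(describe_inter_arm_difference(118, 107))
```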


“Given that blood pressure measured in a child’s right and left arms often differs, it’s important to take measurements in both arms to make a correct diagnosis. Accurate blood pressure assessment in kids is critical for identifying the potential risk for damage to the heart and blood vessels, which can lead to early-onset cardiovascular disease.”
High blood pressure is one of the primary risk factors for heart disease and stroke, the leading causes of death worldwide. Globally, almost 14 per cent of children, an average of three per school classroom, have elevated blood pressure or hypertension.
MCRI Associate Professor Jonathan Mynard said children with high blood pressure were more likely to develop hypertension at a relatively young age in adulthood, and the damage it caused to the heart and blood vessels started silently in childhood.
“Children with high blood pressure, many of whom appear to be healthy, have a greater risk of developing hypertension in adulthood, a major risk factor for cardiovascular disease,” he said.
The European Society of Hypertension and the American Academy of Pediatrics recommend blood pressure be measured in children and adolescents at least once a year. However, Associate Professor Mynard said in Australia it wasn’t common practice for GPs to measure blood pressure in children or in both arms.
“We know high blood pressure is common in adults but many people don’t realise how common it is in kids too,” he said. Parents can help by encouraging their kids to eat a healthy diet that is low in salt and sugary drinks, and high in fruit, vegetables, and whole grains, and to engage in lots of physical activity.
“More work needs to be done to draw attention to the problem of childhood hypertension and its long-term consequences. Australia would benefit from having its own set of clinical guidelines addressing high blood pressure in children, including how to obtain accurate measurements and avoid misclassification.”
Heart Foundation Chief Medical Advisor and cardiologist Professor Garry Jennings said: “There are good clinical reasons for measuring blood pressure in both arms in children and adolescents in the evaluation of hypertension and this study provides clear support for this approach.” Researchers from the University of Melbourne, The Royal Children’s Hospital and Slippery Rock University in Pennsylvania also contributed to the findings.


Learning to help the adaptive immune system

Scientists from the Institute of Industrial Science at The University of Tokyo demonstrated how the adaptive immune system uses a method similar to reinforcement learning to control the immune reaction to repeat infections. This work may lead to significant improvements in vaccine development and interventions to boost the immune system.
In the human body, the adaptive immune system fights germs by remembering previous infections so it can respond quickly if the same pathogens return. This complex process depends on the cooperation of many cell types. Among these are T helpers, which assist by coordinating the response of other parts of the immune system — called effector cells — such as T killer and B cells. When an invading pathogen is detected, antigen presenting cells bring an identifying piece of the germ to a T cell. Certain T cells become activated and multiply many times in a process known as clonal selection. These clones then marshal a particular set of effector cells to battle the germs. Although the immune system has been extensively studied for decades, the “algorithm” used by T cells to optimize the response to threats is largely unknown.
Now, scientists at The University of Tokyo have used an artificial intelligence framework to show that the abundances of T helper cells act like the “hidden layer” between inputs and outputs in an artificial neural network commonly used in adaptive learning. In this case, the antigens presented are the inputs, and the responding effector immune cells are the output.
“Just as a neural network can be trained in machine learning, we believe the immune network can reflect associations between antigen patterns and the effective responses to pathogens,” first author Takuya Kato says.
The main difference between the adaptive immune system and machine learning in a computer is that only the number of T helper cells of each type can be varied, rather than the connection weights between nodes in each layer. The team used computer simulations to predict the distribution of T cell abundances after adaptive learning, and these values were found to agree with experimental data based on the genetic sequencing of actual T helper cells.
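A loose sketch of the analogy (not the authors' model): antigen patterns are the inputs, effector responses the outputs, and learning adjusts only the abundance of each T helper clone while the wiring between layers stays fixed. The weights, dimensions and update rule below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_antigens, n_helpers, n_effectors = 6, 12, 3

# Fixed "wiring": which antigen features each T helper clone recognises (w_in)
# and which effector cells each clone recruits (w_out). In the analogy these
# connections do not change; only clone abundances do.
w_in = rng.normal(size=(n_helpers, n_antigens))
w_out = rng.normal(size=(n_effectors, n_helpers))
abundance = np.ones(n_helpers)          # the adjustable "hidden layer"

def immune_response(antigen, abundance):
    activation = np.maximum(w_in @ antigen, 0.0)    # clones activated by this antigen
    return w_out @ (abundance * activation)         # effector response they marshal

# Repeated exposure to one antigen pattern: nudge clone abundances so the
# effector response moves toward a desired target (a clonal-selection-like update).
antigen = rng.random(n_antigens)
target = np.array([1.0, 0.0, 0.5])
for _ in range(2000):
    activation = np.maximum(w_in @ antigen, 0.0)
    error = immune_response(antigen, abundance) - target
    grad = (w_out.T @ error) * activation                  # gradient w.r.t. abundances only
    abundance = np.maximum(abundance - 0.01 * grad, 0.0)   # abundances cannot be negative

print(np.round(immune_response(antigen, abundance), 2))    # should sit close to the target
```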
“Our theoretical framework may completely change our understanding of adaptive immunity as a real learning system,” says co-author Tetsuya Kobayashi. “This research can shed light on other complex adaptive systems, as well as ways to optimize vaccines to evoke a stronger immune response.”

Story Source:
Materials provided by Institute of Industrial Science, The University of Tokyo. Note: Content may be edited for style and length.


Variant B.1.1.7 of COVID-19 associated with a significantly higher mortality rate, research shows

The highly infectious variant of COVID-19 discovered in Kent, which swept across the UK last year before spreading worldwide, is between 30 and 100 per cent more deadly than previous strains, new analysis has shown.
A pivotal study, by epidemiologists from the Universities of Exeter and Bristol, has shown that the SARS-CoV-2 variant, B.1.1.7, is associated with a significantly higher mortality rate amongst adults diagnosed in the community compared to previously circulating strains.
The study compared death rates among people infected with the new variant and those infected with other strains.
It showed that the new variant led to 227 deaths in a sample of 54,906 patients — compared to 141 amongst the same number of closely matched patients who had the previous strains.
With the new variant already detected in more than 50 countries worldwide, the analysis provides crucial information to governments and health officials to help prevent its spread.
The study is published in the British Medical Journal on Wednesday, 10 March 2021.


Robert Challen, lead author of the study from the University of Exeter, said: “In the community, death from COVID-19 is still a rare event, but the B.1.1.7 variant raises the risk. Coupled with its ability to spread rapidly, this makes B.1.1.7 a threat that should be taken seriously.”
The Kent variant, first detected in the UK in September 2020, has been identified as being significantly quicker and easier to spread, and was behind the introduction of new lockdown rules across the UK from January.
The study shows that the higher transmissibility of the Kent strain meant that more people who would have previously been considered low risk were hospitalised with the newer variant.
Having analysed data from 54,906 matched pairs of patients, spanning all age groups and demographics and differing only in the strain detected, the team found that there were 227 deaths attributed to the new strain, compared to 141 attributable to earlier strains.
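A quick check of the arithmetic behind those figures, assuming the matched groups are the same size as described:

```python
# 227 deaths versus 141 deaths in matched groups of the same size corresponds to
# roughly a 61% higher death rate, which sits within the reported 30-100 per cent range.
deaths_b117, deaths_previous = 227, 141
ratio = deaths_b117 / deaths_previous
print(f"Mortality ratio: {ratio:.2f} (about {100 * (ratio - 1):.0f}% higher)")
```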
Leon Danon, senior author of the study from the University of Bristol, said: “We focussed our analysis on cases that occurred between November 2020 and January 2021, when both the old variants and the new variant were present in the UK. This meant we were able to maximise the number of ‘matches’ and reduce the impact of other biases. Subsequent analyses have confirmed our results.
“SARS-CoV-2 appears able to mutate quickly, and there is a real concern that other variants will arise with resistance to rapidly rolled out vaccines. Monitoring for new variants as they arise, measuring their characteristics and acting appropriately needs to be a key part of the public health response in the future.”
Ellen Brooks-Pollock from the University of Bristol expanded: “It was fortunate the mutation happened in a part of the genome covered by routine testing. Future mutations could arise and spread unchecked.”

Story Source:
Materials provided by University of Exeter. Note: Content may be edited for style and length.


Are 'bacterial probiotics' a game-changer for the biofuels industry?

In a study recently published in Nature Communications, scientists from The Novo Nordisk Foundation Center for Biosustainability (DTU) and Yale University have investigated how bacteria commonly found in sugarcane ethanol fermentation affect the industrial process. Based on a close study of the interactions between yeast and bacteria, they suggest that the industry could improve both its total yield and the cost of its fermentation processes by paying more attention to the diversity of the microbial communities and by distinguishing good bacteria from bad.
The scientists dissected yeast-bacteria interactions in sugarcane ethanol fermentation by reconstituting every possible combination of the microbial community structure, covering approximately 80% of the biodiversity found in industrial processes. One bacterium in particular deserves extra attention: Lactobacillus amylovorus. Why doesn’t this one fall into the category of “the bad guys”? The main reason is that it produces large amounts of the molecule acetaldehyde, which feeds the yeast and thus helps it grow. You could say that Lactobacillus amylovorus is generous by nature and shares the meal, whereas many of the other bacteria involved in these processes prefer simply to steal the food.
“It works almost in the same way as a probiotic, shielding the system from bad bacteria. And when this bacterium grows, it grows in a way that is almost symbiotic with the yeast, which is very beneficial for the industrial process,” says Felipe Lino, former PhD student at The Novo Nordisk Foundation Center for Biosustainability and Global R&D Manager at Anheuser-Busch InBev.
Significant improvement of yield
Companies could therefore benefit from selecting not only an ideal yeast strain for production, as they began doing in the 1990s, but also the best-suited bacteria, since it is impossible to rid the process of bacteria entirely. That effort could pay dividends even in the short term.
Using this probiotic bacterium in sugarcane ethanol fermentation is estimated to increase the fermentation yield by three percent. While three percent may sound like a low number, at industrial scale it is not. According to Brazil’s Biofuels Annual 2019, Brazil’s total ethanol production in 2019 was 34.5 billion liters, with domestic demand of 34 billion liters, making the country home to the largest fleet of cars that use sugarcane-derived ethanol as an alternative to fossil fuel-based petroleum.
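For a rough sense of scale, a three percent yield gain applied to the production figure quoted above works out to roughly a billion extra liters a year; this is a back-of-the-envelope illustration only.

```python
# Back-of-the-envelope scale of a three percent yield gain, using the production
# figure quoted above (34.5 billion liters of ethanol in Brazil in 2019).
annual_production_liters = 34.5e9
extra_liters = annual_production_liters * 0.03
print(f"Roughly {extra_liters / 1e9:.2f} billion extra liters per year")  # ~1.04 billion
```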
These numbers indicate that optimised fermentation processes hold great potential. One way to start ensuring more efficient industrial production of ethanol would be to apply more targeted approaches and shift away from a “one-size-fits-all” strategy in which sulfuric acid treatment is used, without further consideration, to lower the pH and kill bacteria so as to keep their population under a certain threshold. This would be beneficial both economically and environmentally, says Morten Sommer, Professor and Group Leader at The Novo Nordisk Foundation Center for Biosustainability.
“Instead of using a broad range of antibiotics, one should go for a more specific solution where you keep the good bacteria inside the fermenter. This is definitely a paradigm shift because you are not by definition fighting against all bacteria, since some of the bacteria are actually good and improve your final output significantly while also having a positive effect on production costs and the environmental footprint.”

Story Source:
Materials provided by Technical University of Denmark. Original written by Anders Østerby Mønsted, Bernadette Maria Grant. Note: Content may be edited for style and length.


New study links protein causing Alzheimer's disease with common sight loss

Newly published research has revealed a close link between proteins associated with Alzheimer’s disease and age-related sight loss. The findings could open the way to new treatments for patients with deteriorating vision, and the scientists believe the approach could also reduce the need for animals in future research into blinding conditions.
Amyloid beta (AB) proteins are the primary driver of Alzheimer’s disease but also begin to collect in the retina as people get older. Donor eyes from patients who suffered from age-related macular degeneration (AMD), the most common cause of blindness amongst adults in the UK, have been shown to contain high levels of AB in their retinas.
This new study, published in the journal Cells, builds on previous research showing that AB collects around a cell layer called the retinal pigment epithelium (RPE), and sets out to establish what damage these toxic proteins cause to RPE cells.
The research team exposed RPE cells, both in the eyes of normal mice and in culture, to AB. The mouse model enabled the team to look at the effect the protein has in living eye tissue, using non-invasive imaging techniques of the kind used in ophthalmology clinics. Their findings showed that the mouse eyes developed retinal pathology strikingly similar to AMD in humans.
Dr Arjuna Ratnayaka, a Lecturer in Vision Sciences at the University of Southampton, who led the study, said, “This was an important study which also showed that mouse numbers used for experiments of this kind can be significantly reduced in the future. We were able to develop a robust model to study AMD-like retinal pathology driven by AB without using transgenic animals, which are often used by researchers in the field. In transgenic or genetically engineered mice it typically takes a year or longer before AB causes pathology in the retina, whereas we can achieve this within two weeks. This reduces the need to develop more transgenic models and improves animal welfare.”
The investigators also used the cell models, which further reduced the use of mice in these experiments, to show that the toxic AB proteins entered RPE cells and rapidly collected in lysosomes, the cells’ waste disposal system. Whilst the cells performed their usual function of increasing enzymes within lysosomes to break down this unwanted cargo, the study found that around 85% of the AB still remained within lysosomes, meaning that over time the toxic molecules would continue to accumulate inside RPE cells.
Furthermore, the researchers discovered that once lysosomes had been invaded by AB, around 20 percent fewer lysosomes were available to break down photoreceptor outer segments, a role they routinely perform as part of the daily visual cycle.
Dr Ratnayaka added, “This is a further indication of how cells in the eye can deteriorate over time because of these toxic molecules collecting inside RPE cells. This could be a new pathway that no-one has explored before. Our discoveries have also strengthened the link between diseases of the eye and the brain. The eye is part of the brain, and we have shown how AB, which is known to drive major neurological conditions such as Alzheimer’s disease, can also cause significant damage to cells in the retina.”
The researchers hope that one of the next steps could be for anti-amyloid beta drugs, previously trialled in Alzheimer’s patients, to be re-purposed and trialled as a possible treatment for age-related macular degeneration. As the regulators in the USA and the European Union have already given approval for many of these drugs, this is an area that could be explored relatively quickly.
The study may also help wider efforts to bypass animal experimentation where possible, so that some aspects of testing new clinical treatments can move directly from cell models to patients.
This research was funded by the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs). Dr Katie Bates, Head of Research Funding at the NC3Rs, said:
“This is an impactful study that demonstrates the scientific, practical and 3Rs benefits to studying AMD-like retinal pathology in vitro.”

Story Source:
Materials provided by University of Southampton. Note: Content may be edited for style and length.


Face masks and the environment: Preventing the next plastic problem

Recent studies estimate that we use an astounding 129 billion face masks globally every month — that is 3 million a minute. Most of them are disposable face masks made from plastic microfibers.
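The conversion behind those figures is straightforward, assuming a 30-day month:

```python
# Sanity check on the figures above, assuming a 30-day month.
masks_per_month = 129e9
minutes_per_month = 30 * 24 * 60
print(f"{masks_per_month / minutes_per_month / 1e6:.1f} million masks per minute")  # ~3.0
```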
“With increasing reports on inappropriate disposal of masks, it is urgent to recognize this potential environmental threat and prevent it from becoming the next plastic problem,” researchers warn in a comment in the scientific journal Frontiers of Environmental Science & Engineering.
The researchers are Environmental Toxicologist Elvis Genbo Xu from University of Southern Denmark and Professor of Civil and Environmental Engineering Zhiyong Jason Ren from Princeton University.
No guidelines for mask recycling:
Disposable masks are plastic products that cannot readily biodegrade but may fragment into smaller plastic particles, namely micro- and nanoplastics, which are now widespread in ecosystems.
The enormous production of disposable masks is on a similar scale to that of plastic bottles, estimated at 43 billion per month.


However, unlike plastic bottles, of which approximately 25 per cent are recycled, there is no official guidance on mask recycling, making masks more likely to be disposed of as solid waste, the researchers write.
Greater concern than plastic bags:
If not collected for recycling, disposable masks can, like other plastic waste, end up in the environment, in freshwater systems, and in the oceans, where weathering can generate a large number of micro-sized particles (smaller than 5 mm) within a relatively short period (weeks) and further fragment into nanoplastics (smaller than 1 micrometer).
“A newer and bigger concern is that the masks are directly made from microsized plastic fibers (thickness of ~1 to 10 micrometers). When breaking down in the environment, the mask may release more micro-sized plastics, easier and faster than bulk plastics like plastic bags,” the researchers write, continuing:
“Such impacts can be worsened by a new-generation mask, nanomasks, which directly use nano-sized plastic fibers (with a diameter smaller than 1 micrometer) and add a new source of nanoplastic pollution.”
The researchers stress that they do not know how much masks contribute to the large number of plastic particles detected in the environment — simply because no data on mask degradation in nature exists.
“But we know that, like other plastic debris, disposable masks may also accumulate and release harmful chemical and biological substances, such as bisphenol A, heavy metals, as well as pathogenic micro-organisms. These may pose indirect adverse impacts on plants, animals and humans,” says Elvis Genbo Xu.
What can we do?
Elvis Genbo Xu and Zhiyong Jason Ren have the following suggestions for dealing with the problem:
Set up mask-only trash cans for collection and disposal
Consider standardization, guidelines, and strict implementation of waste management for mask waste
Replace disposable masks with reusable face masks, such as cotton masks
Consider developing biodegradable disposable masks

Story Source:
Materials provided by University of Southern Denmark. Original written by Birgitte Svennevig. Note: Content may be edited for style and length.


Psychedelic science holds promise for mainstream medicine

Psychedelic healing may sound like a fad from the Woodstock era, but it’s a field of study that’s gaining traction in the medical community as an effective treatment option for a growing number of mental health conditions.
While the study of psychedelics as medicine is inching toward the mainstream, it still remains somewhat controversial. Psychedelics have struggled to shake a “counterculture” perception that was born in the 1960s, a view that had stymied scientific study of them for more than 50 years.
But that perception is slowly changing.
Mounting research suggests that controlled treatment with psychedelics like psilocybin mushrooms, LSD, and MDMA — better known as ecstasy — may be an effective option for people suffering from PTSD, anxiety disorders, and depression. The U.S. Food & Drug Administration has recently granted “breakthrough therapy” status to accelerate studies of the medical benefits of psychedelics. And two years ago this month, the FDA approved a psychedelic drug — esketamine — to treat depression.
An increasing number of states and municipalities are also grappling with calls to decriminalize psychedelic drugs, a move that UNLV neuroscientist Dustin Hines says could further the recent renaissance in psychedelic science.
“The resurgence in interest in psychedelic medicine is likely related to multiple factors, including decreasing societal stigma regarding drugs like hallucinogens and cannabis, increasing awareness of the potential therapeutic compounds found naturally occurring in plants and fungi, and the growing mental health crisis our nation faces,” says Hines. “Because of the intersection between the great need for innovation and wider social acceptance, researchers have started to explore psychedelics as novel treatments for depressive disorders, including work with compounds that have been used for millennia.”
In the Hines lab at UNLV, husband and wife researchers Dustin and Rochelle Hines are uncovering how psychedelics affect brain activity. Their work, published recently in Scientific Reports, shows a strong connection in rodent models between brain activity and the behaviors resulting from psychedelic treatment, a step forward in the quest to better understand their potential therapeutic effects.

Story Source:
Materials provided by University of Nevada, Las Vegas. Original written by Tony Allen. Note: Content may be edited for style and length.


Bacteria know how to exploit quantum mechanics

Photosynthetic organisms harvest light from the sun to produce the energy they need to survive. A new paper published by University of Chicago researchers reveals their secret: exploiting quantum mechanics.
“Before this study, the scientific community saw quantum signatures generated in biological systems and asked the question, were these results just a consequence of biology being built from molecules, or did they have a purpose?” said Greg Engel, Professor of Chemistry and senior author on the study. “This is the first time we are seeing biology actively exploiting quantum effects.”
The scientists studied a type of microorganism called green sulfur bacteria. These bacteria need light to survive, but even small amounts of oxygen can damage their delicate photosynthetic equipment. So they must develop ways to minimize the damage when the bacterium does encounter oxygen.
To study this process, researchers tracked the movement of energy through a photosynthetic protein under different conditions — with oxygen around, and without.
They found that the bacterium uses a quantum mechanical effect called vibronic mixing to move energy between two different pathways, depending on whether or not there’s oxygen around. Vibronic mixing involves vibrational and electronic characteristics in molecules coupling to one another. In essence, the vibrations mix so completely with the electronic states that their identities become inseparable. This bacterium uses this phenomenon to guide energy where it needs it to go.
If there’s no oxygen around and the bacterium is safe, it uses vibronic mixing by matching the energy difference between two electronic states in an assembly of molecules and proteins called the FMO complex with the energy of a vibration of a bacteriochlorophyll molecule. This encourages the energy to flow through the “normal” pathway toward the photosynthetic reaction center, which is packed full of chlorophyll.
But if there is oxygen around, the organism has evolved to steer the energy through a less direct path where it can be quenched. (Quenching energy is similar to putting a palm on a vibrating guitar string to dissipate energy.) This way, the bacterium loses some energy but saves the entire system.
To achieve this effect, a pair of cysteine residues in the photosynthetic complex acts as a trigger: They each react with the oxygen in the environment by losing a proton, which disrupts the vibronic mixing. This means that energy now preferentially moves through the alternative pathway, where it can be safely quenched. This principle is a bit like blocking two lanes on a superhighway and diverting some traffic to local roads where there are many more exits.
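A minimal two-state sketch (not the authors' model) of how switching a coupling term on or off changes the mixing: with coupling present, the eigenstates of a simple two-level system are mixtures of both states, so excitation can move between them; with the coupling removed, the states stay essentially separate. The energies and coupling strength below are arbitrary illustrative values.

```python
import numpy as np

def mixing_fraction(e1, e2, coupling):
    """Weight of state 2 in the lowest-energy eigenstate of a two-state system."""
    h = np.array([[e1, coupling],
                  [coupling, e2]])
    _, vecs = np.linalg.eigh(h)        # eigenvectors in columns, lowest energy first
    return vecs[1, 0] ** 2

# Coupling on (vibration nearly resonant with the electronic gap): the states mix
# strongly, so excitation can move between the two pathways.
print(mixing_fraction(1.00, 1.02, 0.05))   # ~0.40
# Coupling off (as described when the cysteines lose a proton in the presence of oxygen):
print(mixing_fraction(1.00, 1.02, 0.00))   # 0.0 -> the states stay separate
```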
“What’s interesting about this result is that we are seeing the protein turn the vibronic coupling on and off in response to environmental changes in the cell,” said Jake Higgins, a graduate student in the Department of Chemistry and the lead author of the paper. “The protein uses the quantum effect to protect the organism from oxidative damage.”
These findings amount to an exciting revelation about biology: the use of an explicitly quantum mechanism to protect the system is an important adaptation, and it shows that quantum effects can be important for survival.
This phenomenon is likely not limited to green sulfur bacteria, the scientists said. As Higgins explained, “The simplicity of the mechanism suggests that it might be found in other photosynthetic organisms across the evolutionary landscape. If more organisms are able to dynamically modulate quantum mechanical couplings in their molecules to produce larger changes in physiology, there could be a whole new set of effects selected for by nature that we don’t yet know about.”
The research was funded by the Air Force Office of Scientific Research (AFOSR), the NSF, the DOE Office of Science, the Department of Defense (DoD), and the Arnold and Mabel Beckman Foundation.

Story Source:
Materials provided by University of Chicago Medical Center. Original written by Sheila Evans. Note: Content may be edited for style and length.


New lung cancer screening recommendation, starting at age 50, expands access but may not address inequities

Calling the U.S. Preventive Services Task Force’s newly released recommendation statement to expand eligibility for annual lung cancer screening with low-dose computed tomography a step forward, UNC Lineberger Comprehensive Cancer Center researchers say future changes should address equity and implementation issues.
In an editorial published in JAMA, Louise M. Henderson, PhD, professor of radiology at UNC School of Medicine, M. Patricia Rivera, MD, professor of medicine at UNC School of Medicine, and Ethan Basch, MD, MSc, the Richard M. Goldberg Distinguished Professor in Medical Oncology and chief of oncology at the UNC School of Medicine, outlined their concerns and offered potential approaches to make the screening recommendation more inclusive of populations that have been historically underserved.
“The revised U.S. Preventive Services Task Force’s recommendations are sound and based on well-conceived evidence and modeling studies, but they alone are not enough, as we have seen limited uptake of the prior recommendations,” Basch said. “Implementation will require broader efforts by payers, health systems and professional societies, and, in the future, a more tailored, individual risk prediction approach may be preferable.”
The task force has made two significant changes to the screening recommendation it issued in 2013: annual screening will now begin at age 50 instead of 55, and the smoking-intensity threshold has been lowered from a 30 pack-year to a 20 pack-year history. These more inclusive criteria could more than double the number of adults eligible for lung cancer screening, from 6.4 million to 14.5 million, according to some estimates.
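As a simplified sketch of the two threshold changes, the comparison below applies only the age and pack-year criteria mentioned here; the full recommendation includes additional criteria not captured in this illustration, and the function names are hypothetical.

```python
# Simplified sketch of the two threshold changes described above (age and pack-year
# history only); the full USPSTF recommendation includes additional criteria not
# captured here.

def meets_2021_thresholds(age: int, pack_years: float) -> bool:
    return age >= 50 and pack_years >= 20

def meets_2013_thresholds(age: int, pack_years: float) -> bool:
    return age >= 55 and pack_years >= 30

# A 52-year-old with a 25 pack-year history clears the 2021 thresholds but not the 2013 ones.
print(meets_2021_thresholds(52, 25), meets_2013_thresholds(52, 25))  # True False
```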
Henderson, Rivera and Basch are encouraged that lung cancer screening will be available to more people, but they point out that expanding access alone won’t reduce racial inequities, especially as measured by lung cancer deaths prevented and life-years gained.
It may be possible to counter this shortcoming, they said, by adding risk-prediction models that identify high-benefit individuals who do not meet USPSTF criteria. This could reduce or eliminate some, though not all, racial disparities, according to one study. Also, future research should explore risks such as family history of lung cancer and genetic susceptibility to develop risk assessment strategies that may identify individuals who never smoked and still have a high risk for lung cancer but currently are not eligible to be screened.


Financial barriers are also an issue. Expanding screening access to include people as young as 50 may lead to greater inequities for those enrolled in Medicaid, the state-based public health insurance program.
“Medicaid is not required to cover the USPSTF-recommended screenings, and even when screening is covered, Medicaid programs may use different eligibility criteria,” Henderson said. She adds this is problematic because people who receive Medicaid are more than twice as likely to be current smokers as those with private insurance (26.3% compared to 11.1%), and they are disproportionately affected by lung cancer. “This is a significant issue, particularly in the nine states where Medicaid does not cover lung cancer screening.”
Putting the screening recommendation into practice will be a substantial challenge, Rivera said. Primary care providers are critical to implementing the screening process because they initiate the conversation with their patients about the potential benefits and risk of lung cancer screening and make the screening referral. However, Rivera said many already have an overburdened workload, and it may be unrealistic to expect them to be able to spend the necessary time to have these complex conversations.
“A significant barrier to implementation of lung cancer screening is provider time. Many primary care providers do not have adequate time to have a shared decision-making conversation and to conduct a risk assessment,” Rivera said. “Although a lung cancer screening risk model that incorporates co-morbidities and clinical risk variables may be the best tool for selecting high risk individuals who are most likely to benefit from screening, such a model requires input of additional clinical information, thereby increasing the time a provider will spend; the use of such a model in clinical practice has not been established.”
Despite these limitations and challenges, the new recommendation can expand access to lung cancer screening, the researchers said in the editorial. “Beyond implementation challenges, the future of screening strategies lies in individualized risk assessment including genetic risk. The 2021 USPSTF recommendation statement represents a leap forward in evidence and offers promise to prevent more cancer deaths and address screening disparities. But the greatest work lies ahead to ensure this promise is actualized.”
Disclosures
Henderson reported receiving grants from the National Cancer Institute. Rivera reported receiving grants from the National Cancer Institute for research in lung cancer screening, serving on the advisory panel for Biodesix and bioAffinity, and serving as a research consultant to Johnson & Johnson, outside the submitted work. Basch reported receiving fees from AstraZeneca, CareVive Systems, Navigating Cancer, and Sivan Healthcare for serving as a scientific advisor/consultant, outside the submitted work.


Soft contact lenses eyed as new solutions to monitor ocular diseases

New contact lens technology to help diagnose and monitor medical conditions may soon be ready for clinical trials.
A team of researchers from Purdue University worked with biomedical, mechanical and chemical engineers, along with clinicians, to develop the novel technology. The team turned commercial soft contact lenses into a bioinstrumentation tool for unobtrusively monitoring clinically important information associated with underlying ocular health conditions.
The team’s work is published in Nature Communications. The Purdue Research Foundation Office of Technology Commercialization helped secure a patent for the technology and it is available for licensing.
“This technology will be greatly beneficial to the painless diagnosis or early detection of many ocular diseases, including glaucoma,” said Chi Hwan Lee, the Leslie A. Geddes Assistant Professor of Biomedical Engineering and assistant professor of mechanical engineering at Purdue, who is leading the development team. “Since the first conceptual invention by Leonardo da Vinci, there has been a great desire to utilize contact lenses for eye-wearable biomedical platforms.”
Sensors or other electronics previously couldn’t be used for commercial soft contact lenses because the fabrication technology required a rigid, planar surface incompatible with the soft, curved shape of a contact lens.
The team has developed a unique approach that enables the seamless integration of ultrathin, stretchable biosensors with commercial soft contact lenses via wet adhesive bonding. The biosensors embedded in the soft contact lenses record electrophysiological retinal activity from the corneal surface of human eyes, without the need for the topical anesthesia that has been required in current clinical settings for pain management and safety.
“This technology will allow doctors and scientists to better understand spontaneous retinal activity with significantly improved accuracy, reliability, and user comfort,” said Pete Kollbaum, the Director of the Borish Center for Ophthalmic Research and an associate professor of optometry at Indiana University who is leading clinical trials.

Story Source:
Materials provided by Purdue University. Original written by Chris Adam. Note: Content may be edited for style and length.
