The founding chair of the Biomedical Engineering Department at the University of Houston is reporting a new deep neural network architecture that enables early diagnosis of systemic sclerosis (SSc), a rare autoimmune disease marked by hardening and fibrosis of the skin and internal organs. The proposed network, implemented on a standard laptop computer (2.5 GHz Intel Core i7), can immediately differentiate between images of healthy skin and skin with systemic sclerosis.
“Our preliminary study, intended to show the efficacy of the proposed network architecture, holds promise in the characterization of SSc,” reports Metin Akay, John S. Dunn Endowed Chair Professor of biomedical engineering. The work is published in the IEEE Open Journal of Engineering in Medicine and Biology.
“We believe that the proposed network architecture could easily be implemented in a clinical setting, providing a simple, inexpensive and accurate screening tool for SSc.”
For patients with SSc, early diagnosis is critical, but often elusive. Several studies have shown that organ involvement can occur far earlier than expected in the early phase of the disease, but early diagnosis and determining the extent of disease progression pose a significant challenge for physicians, even at expert centers, resulting in delays in therapy and management.
In artificial intelligence, deep learning organizes algorithms into layers, forming an artificial neural network that can make its own decisions. To speed up the learning process, the new network was trained using the parameters of MobileNetV2, a lightweight network designed for mobile vision applications, pre-trained on the ImageNet dataset of 1.4 million images.
“By scanning the images, the network learns from the existing images and decides which new image is normal or in an early or late stage of disease,” said Akay.
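For readers who want a concrete picture of this transfer-learning setup, the sketch below shows one way it might be wired up in TensorFlow/Keras. It is a minimal illustration, not the authors' published code: the 224x224 input size, the frozen backbone, and the three-class head (normal, early, late) are assumptions based on the description above.

```python
# A minimal transfer-learning sketch, assuming 224x224 RGB inputs and a
# three-class output (normal / early / late SSc). This illustrates the
# general approach described in the article, not the authors' exact network.
import tensorflow as tf
from tensorflow.keras import layers, models

# MobileNetV2 backbone pre-trained on ImageNet (~1.4M images), loaded
# without its original classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    include_top=False,
    weights="imagenet",
)
base.trainable = False  # reuse the pre-trained features as-is at first

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(3, activation="softmax"),  # normal / early / late stage
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Starting from ImageNet weights rather than random initialization is what lets a network like this train in hours on a laptop instead of days on specialized hardware.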
Among several deep learning networks, Convolutional Neural Networks (CNNs) are the most commonly used in engineering, medicine and biology, but their success in biomedical applications has been limited by the size of the available training sets and networks.
To overcome these difficulties, Akay and his collaborator Yasemin Akay combined the UNet, a modified CNN architecture, with added layers, and developed a mobile training module. The results showed that the proposed deep learning architecture outperforms standard CNNs for the classification of SSc images.
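The exact way the UNet-style layers are combined with the MobileNetV2 backbone is not spelled out in this article, but the fine-tuning step mentioned in the next quote typically looks like the continuation below. The number of unfrozen layers and the `train_ds`/`val_ds` dataset objects are hypothetical placeholders, not details from the paper.

```python
# Continuing the sketch above: once the new head has converged, unfreeze
# the upper part of the backbone and fine-tune at a low learning rate.
base.trainable = True
for layer in base.layers[:-20]:  # how many layers to unfreeze is an assumption
    layer.trainable = False

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),  # small step size
    loss="sparse_categorical_crossentropy",                  # to preserve the
    metrics=["accuracy"],                                    # pre-trained features
)
# train_ds / val_ds are hypothetical tf.data.Dataset objects holding the
# labeled SSc skin images.
model.fit(train_ds, validation_data=val_ds, epochs=10)
```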
“After fine-tuning, our results showed that the proposed network reached 100% accuracy on the training image set, 96.8% accuracy on the validation image set, and 95.2% on the testing image set,” said Yasemin Akay, UH instructional associate professor of biomedical engineering.
The training time was less than five hours.
In addition to Metin Akay and Yasemin Akay, the paper was co-authored by Yong Du, Cheryl Shersen, Ting Chen and Chandra Mohan, all of the University of Houston; and Minghua Wu and Shervin Assassi of the University of Texas Health Science Center (UT Health).
Story Source:
Materials provided by University of Houston. Original written by Laurie Fickman.