Harnessing Machine Learning And Artificial Intelligence To Accelerate Discovery In Diagnostics

By Melissa Pandika

March 28, 2019 | Advances in diagnostics have enabled researchers to churn out massive amounts of data, but scaling up the analysis of that data remains a challenge. At the Molecular Medicine Tri-Conference in San Francisco, a multidisciplinary cadre of thought leaders tackled this issue, sharing their latest applications of machine learning and artificial intelligence to large-scale data to streamline diagnostics.

Looking at efforts to develop AI-automated diagnostics, Ryan Amelon from IDx Technologies shared findings from a clinical trial of IDx-DR, an AI system designed to detect diabetic retinopathy in adults, which led to FDA approval last April as the first-ever fully autonomous diagnostic for use without a specialist. Ophthalmologists “are not so great” at classifying diabetic retinopathy, a leading cause of blindness in the working-age population, with sensitivity ranging from 33% to 73%, Amelon said. The clinical trial of IDx-DR, which included 900 patients across 10 sites, found it to be 87.2% sensitive and 90.7% specific, although Amelon said he and his team remain blinded to the data. Highly experienced, certified retinal photographers obtained the Early Treatment Diabetic Retinopathy Study (ETDRS) reference standard, the “gold standard” for measuring diabetic retinopathy severity, which was then compared to IDx-DR’s output.
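
For readers less familiar with those metrics: sensitivity and specificity come from comparing the system’s per-patient calls against the reference-standard grades. The short Python sketch below illustrates that calculation with made-up labels; it is not IDx’s pipeline, and the toy data bear no relation to the trial results.

```python
# Minimal sketch of how sensitivity and specificity are computed when a
# diagnostic system's binary output (disease / no disease) is compared
# against a reference standard such as ETDRS grading. Labels are hypothetical.

def sensitivity_specificity(reference, predicted):
    """reference, predicted: lists of booleans (True = disease present)."""
    tp = sum(r and p for r, p in zip(reference, predicted))
    tn = sum((not r) and (not p) for r, p in zip(reference, predicted))
    fn = sum(r and (not p) for r, p in zip(reference, predicted))
    fp = sum((not r) and p for r, p in zip(reference, predicted))
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

# Toy example with invented grades, not trial data.
reference = [True, True, False, False, True, False]
predicted = [True, False, False, False, True, False]
print(sensitivity_specificity(reference, predicted))
```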

Amelon then outlined how IDx-DR meets the qualifications of a fully autonomous AI system. He highlighted its usability, noting that operators need only a high school diploma and no prior experience with a fundus camera. What’s more, the system determined that 96% of exams in the FDA study were of diagnostic quality, and IDx-DR directs operators to retake poor-quality images. Finally, the output is actionable, and the system has been rigorously validated. In fact, Amelon and his team have developed algorithms that separately detect various biomarkers of diabetic retinopathy, resulting in roughly 12 points of validation instead of just one. When it comes to developing a fully automated diagnostic to replace the physician, “there’s a lot more that goes into it besides just training the algorithm,” Amelon said.
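
To make the “multiple points of validation” idea concrete, here is a hedged sketch of scoring several independent biomarker detectors against reference annotations rather than validating only a single end-to-end disease call. The detector names, images, and reference labels are placeholders, not IDx-DR internals.

```python
# Hedged sketch: validate each biomarker detector separately, so the system
# accumulates several points of validation instead of one overall label.
from typing import Callable, Dict, List

def validate_detectors(
    detectors: Dict[str, Callable[[object], bool]],
    images: List[object],
    reference: Dict[str, List[bool]],
) -> Dict[str, float]:
    """Return per-biomarker agreement with reference annotations."""
    scores = {}
    for name, detect in detectors.items():
        predictions = [detect(img) for img in images]
        agree = sum(p == r for p, r in zip(predictions, reference[name]))
        scores[name] = agree / len(images)  # simple per-biomarker accuracy
    return scores

# Example usage with dummy always-negative detectors (hypothetical names).
detectors = {
    "microaneurysms": lambda img: False,
    "hemorrhages": lambda img: False,
    "exudates": lambda img: False,
}
images = [object() for _ in range(5)]
reference = {name: [False] * 5 for name in detectors}
print(validate_detectors(detectors, images, reference))
```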

Later in the session, Marina Sirota, an assistant professor at the Bakar Computational Health Sciences Institute at UCSF, described her lab’s use of integrative computational methods to identify the determinants of preterm birth. She detailed how she and her team have integrated pollution exposure datasets with birth record datasets, both within California, enabling them to pinpoint two water contaminants associated with preterm birth. Meanwhile, their meta-analysis of three studies using maternal blood samples identified a maternal transcriptomic signature of preterm birth, which showed an upregulation of innate immunity and downregulation of adaptive immunity. They saw the reverse in the fetal transcriptomic signature of preterm birth and are further investigating this inverse relationship between maternal and fetal transcriptomic signatures.
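
As a rough illustration of that kind of data integration, the sketch below joins a toy exposure table to toy birth records on a shared geographic key and compares preterm rates by exposure status. The column names, ZIP-code key, and values are assumptions for illustration only, not the lab’s actual datasets or statistical methods.

```python
# Illustrative record linkage: join an environmental-exposure table to birth
# records on a shared geographic key, then compare preterm rates by exposure.
import pandas as pd

births = pd.DataFrame({
    "zip": ["94110", "94110", "95814", "95814"],
    "preterm": [1, 0, 0, 0],          # 1 = birth before 37 weeks (toy values)
})
exposure = pd.DataFrame({
    "zip": ["94110", "95814"],
    "contaminant_detected": [1, 0],   # 1 = water contaminant above threshold
})

# Link each birth record to the exposure status of its geographic area.
merged = births.merge(exposure, on="zip", how="left")

# Compare preterm birth rates between exposed and unexposed areas.
rates = merged.groupby("contaminant_detected")["preterm"].mean()
print(rates)
```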

Renee Deehan Kenney of PatientsLikeMe described the development of the DigitalMe explorer and researcher platform, which helps researchers glean meaningful health insights from complex biological datasets via machine learning and other modeling techniques. Guergana Savova, an associate professor at Harvard Medical School, then discussed DeepPhe, software that uses natural language processing to extract deep phenotype information from cancer patients’ electronic medical records, including temporality: it places surgeries and other events on a timeline, which is crucial, for instance, for measuring disease activity.
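
To illustrate the temporality idea in the simplest possible terms, the toy sketch below pulls dated events out of a snippet of free text and orders them chronologically. DeepPhe’s actual NLP is far more sophisticated; the note text and regular expression here are purely illustrative and not part of that software.

```python
# Toy stand-in for temporal phenotype extraction: find "<event> <date>" pairs
# in free text and sort them into a timeline. Not DeepPhe's actual approach.
import re
from datetime import datetime

note = (
    "Lumpectomy performed on 2017-03-14. "
    "Chemotherapy initiated 2017-04-02. "
    "Recurrence noted on imaging 2018-01-20."
)

# Simplistic pattern: an event phrase followed by an ISO-format date.
pattern = re.compile(r"([A-Z][^.]*?)\s*(?:on|initiated)?\s*(\d{4}-\d{2}-\d{2})")

events = [
    (datetime.strptime(date, "%Y-%m-%d"), phrase.strip())
    for phrase, date in pattern.findall(note)
]

# Print the events in chronological order, i.e., a crude timeline.
for when, what in sorted(events):
    print(when.date(), "-", what)
```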