Enabling Precision Medicine Through AI
Precision medicine is of growing interest to healthcare providers and hospitals. The Precision Medicine Initiative defined precision medicine as an "emerging approach for disease prevention and treatment that takes into account people's individual variations in genes, environment, and lifestyle." At the Build Precision Medicine Summit earlier this year, there was a lengthy discussion on the definition of precision medicine. It became clear that beyond the Precision Medicine Initiative's high-level definition, implementing precision medicine potentially encompasses many things: more proactive care, use of broad data sets both initially and over time, and consideration of patient preferences.
Genomics is a key enabler of precision medicine, as knowledge of a patient's underlying genomics can unlock therapeutic approaches that significantly improve clinical outcomes. Using genomic data, however, means analyzing big data. Yes, there are well-known disease markers like the DeltaF508 mutation for cystic fibrosis. But many genetic markers are novel or personal, even when they occur in a gene known to be associated with a certain condition. Indeed, each of us carries 3-4 million mutations, or variants, in our genome, which means any two individuals typically differ at roughly 3 million sites. The majority of these variants are benign differences or are associated with traits such as hair color. So how do we identify the variants that matter for disease: the ones behind a condition a patient has, or is at risk for?
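To make the triage problem concrete, here is a minimal, purely illustrative sketch of the kind of filtering involved: starting from variant records and keeping only those that are rare in the population, fall in a gene associated with the condition under test, and are predicted to alter the protein. The record fields, thresholds, and gene panel below are hypothetical, not Fabric's actual pipeline.

```python
# Hypothetical sketch: triaging millions of variants down to a short
# candidate list. Fields, thresholds, and the gene panel are illustrative.

# Each variant: (gene, population allele frequency, predicted impact)
variants = [
    ("CFTR",  0.0001, "missense"),   # rare, in a cystic-fibrosis gene
    ("MC1R",  0.1200, "missense"),   # common; associated with hair color
    ("CFTR",  0.3500, "synonymous"), # common and silent -> likely benign
    ("BRCA1", 0.0002, "frameshift"), # rare, protein-disrupting
]

# Genes assumed to be associated with the condition under test
disease_genes = {"CFTR", "BRCA1"}

# Keep variants that are rare, in a disease gene, and protein-altering
candidates = [
    (gene, freq, impact)
    for gene, freq, impact in variants
    if freq < 0.01 and gene in disease_genes and impact != "synonymous"
]

for gene, freq, impact in candidates:
    print(f"{gene}\t{freq}\t{impact}")
```

Even this toy filter shows why manual review alone cannot keep up: each rule must be applied consistently across millions of variants per patient, which is exactly the kind of work computational pipelines are built for.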
For precision medicine to become a routine reality, big data analysis needs to be done quickly, comprehensively, and at scale. Traditional approaches to genomic analysis have relied on manual filtering through data and time-consuming analysis techniques. These manual techniques don't scale; computational methods are needed. Artificial intelligence (AI) has become a buzzword across industries, showing up in everything from cars to washing machines to online shopping sites. It is now critical in genomics as well, enabling accurate analysis of genomic big data at scale. The pool of trained molecular geneticists who can interpret tests is growing only incrementally, while the volume of patient testing is growing far faster. AI, combined with expert review, makes it possible to scale testing to many more patients.
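One way AI can combine with expert review, sketched below under invented assumptions, is to score and rank variants so that a molecular geneticist reviews the most promising candidates first rather than the whole list. The features, weights, and variant names here are made up for illustration, as if produced by a simple logistic-regression-style classifier; they do not represent Fabric AI's actual methods.

```python
# Illustrative sketch only: ranking variants by a classifier-style score
# so expert review focuses on the top of the list. Features and weights
# are invented, as if fit by a logistic-regression model.
import math

# Per-variant features: (conservation, predicted deleteriousness,
# rarity = 1 - population allele frequency). Values are made up.
variants = {
    "CFTR:p.F508del": (0.95, 0.98, 0.999),
    "MC1R:p.R151C":   (0.40, 0.30, 0.880),
    "TTN:p.V2T":      (0.20, 0.15, 0.990),
}

weights = (2.5, 3.0, 1.5)  # hypothetical learned weights
bias = -4.0

def score(features):
    """Logistic score in (0, 1); higher = more likely disease-relevant."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Rank variants from most to least likely disease-relevant
ranked = sorted(variants, key=lambda v: score(variants[v]), reverse=True)
for name in ranked:
    print(f"{name}\t{score(variants[name]):.3f}")
```

The design point is the division of labor: the model orders millions of variants in seconds, and the scarce expert time described above is spent confirming or rejecting the top-ranked candidates.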
Fabric Genomics has been applying algorithmic techniques to this problem of large-scale genomic data analysis since our inception in 2009. Fabric AI, the family of AI methods embedded in the Fabric Enterprise SaaS platform, supports panel testing as well as whole-exome and whole-genome testing, and enables labs to reliably scale interpretation of reimbursable tests.
Beyond AI, IT systems need to interoperate to streamline workflows so that high-volume testing can be supported. The recently announced integration of the XIFIN LIS with Fabric Enterprise enables end-to-end genomic testing workflows. This complete, integrated solution, spanning test requisition through data analysis to the clinical report, allows clinical labs to quickly adopt advanced molecular genomic testing, enabling healthcare providers to advance precision medicine programs and improve patient outcomes.