
Abstract

Recent advances in high-throughput sequencing, in concert with a greater understanding of the genetic basis of many diseases, have created an obvious avenue for “personalized” or “precision” medicine. On February 20, 2015, the U.S. Food and Drug Administration (FDA) held a public workshop to discuss standardization and oversight of diagnostic next generation sequencing (NGS) tests. The meeting was preceded by President Obama’s announcement of the Precision Medicine Initiative, which sets forth the challenge of creating an environment for clinically reliable NGS testing; the attendant challenges include clinical accuracy, standardization, interpretation, and oversight. Participants, including scientists, laboratory directors, and representatives of commercial entities, contributed short presentations, panel discussions, and public comments. The following review captures the comments and opinions of participants in three focus areas: the analytical performance of NGS-based tests, the clinical performance of NGS-generated results, and regulatory oversight.

Keywords: FDA, NGS, sequencing, laboratory, testing, genomic, genetic, precision, government

Abbreviations

FDA, Food and Drug Administration; NGS, next generation sequencing; CFTR, cystic fibrosis transmembrane conductance regulator; SNV, single nucleotide variant; indel, insertion or deletion

Introduction

The customization of healthcare to individual patients has gained public attention following the President’s Precision Medicine Initiative, announced earlier this year.1 Converging technologies, including NGS-based testing, drug therapies targeting specific human or pathogen variants, antisense nucleic acid therapies, and others, allow an unprecedented level of specificity in treating disease. However, while precision medicine offers a promising avenue for disease treatment, these new technologies require new approaches to oversight and public assurance. There is both public and commercial interest in understanding what processes and standards the FDA will use to monitor NGS testing and which areas of regulation it will implement.2,3

Opening remarks

Margaret Hamburg, Commissioner of the FDA, provided the opening remarks. Dr. Hamburg has served as Commissioner since 2009 and recently announced her upcoming retirement. She opened with an overview of the trajectory of next generation sequencing, including the landmark authorization of the Illumina MiSeq instrument platform and its cystic fibrosis assays, as well as two additional platforms, the Ion Torrent from Life Technologies and the Sentosa from Vela Diagnostics. Hamburg outlined the FDA’s responsibility in helping to advance innovative technologies that are both meaningful and safe. She also emphasized the complexity of evaluating technologies like NGS, for which the ideas of one test and one intended use are not easily defined. The generation of billions of results from a single NGS run poses unique standardization and validation challenges. She concluded by recognizing upcoming initiatives and appropriations to the FDA for advancing oversight of NGS quality and accuracy, including a crowd-sourced public platform for validating NGS results.

Jo Handelsman, Director of the Office of Science and Technology Policy, elaborated on the concepts and specifications for precision medicine. A portion of the President’s Precision Medicine Initiative has been earmarked for creating databases containing extensive genomic data from a cohort of one million participants.4 The initiative’s infrastructure will facilitate curation of this repository and allow multifactorial data analysis on a population scale. The resulting data will undoubtedly provide invaluable information that can further advance our knowledge of the genetics of human disease, and the initiative will accelerate targeted drug therapies to market. As an example, Handelsman discussed the FDA’s approval of the first targeted therapy, Herceptin, in 1998, which targets breast cancers carrying the HER2 biomarker. Since 2012, 30 targeted therapies have been approved, including 8 in 2014 alone.5 Similar targeted approaches can be led by NGS testing, including drugs targeting specific disease alleles, such as Kalydeco (ivacaftor) for certain patients with cystic fibrosis.

David Litwack, from the Office of In Vitro Diagnostics and Radiological Health, provided a framework for discussing the components and steps of NGS testing, each of which can affect the clinical performance of a test: library preparation, the instrument, its reagents, the analysis software, the technical interpretation, and the geneticist’s interpretation of the result. The current paradigm follows a framework of one test for one disease, its intended use. In this framework, FDA regulation is determined by the overall risk of the test. He described three risk classes, low, moderate, and high, with safety and efficacy requirements rising in the highest class. The evaluation examines whether the test reproducibly and accurately measures its intended target, whether an accurate result has clinical meaning, and whether the labeling and marketing reflect this evidence. He raised concerns about unexpected findings, both at the intended target gene and at unintended loci. Communicating NGS results also poses major challenges, as does keeping pace with the size and speed of NGS-generated discoveries. Litwack ended his opening remarks with the story behind the FDA’s clearance of the MiSeqDx platform in 2013, the first clearance of an NGS platform. The FDA focused on the instrument’s analytical performance, the performance of its reagents, and then on the assays for the whole CFTR gene and for 139 specific variants. The ramifications of FDA clearance are that cleared devices can be marketed with approved labeling, and follow-on tests that are substantially equivalent may be exempt from the premarket process. Under the Precision Medicine Initiative, the development of standards-based approaches such as the “precisionFDA” initiative and computational tools will provide platforms for evaluating NGS performance and accuracy.

Analytical performance of NGS tests

Benjamin Neale, from the Broad Institute of MIT and Harvard, discussed process standards such as platform validation, sequencing run validation, and variant validation. He added that internal data generated from whole genome or exome sequencing have a role in validating the performance of each sequencing run. Other validations utilize standards such as reference genomes, Genome in a Bottle reference materials, and other benchmarks. Variant validation addresses how likely a variant would be to be detected in an individual genome, i.e., the sensitivity and specificity of variant detection.
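To make the variant-validation metrics concrete, the sketch below compares a hypothetical call set against a truth set such as a Genome in a Bottle benchmark, reporting sensitivity and precision (the usual stand-in for specificity in variant calling, since true negatives are ill-defined across a whole genome). The variant representation and example coordinates are illustrative assumptions, not part of any pipeline presented at the workshop.

```python
# Minimal sketch: benchmarking a variant call set against a truth set
# (e.g., a Genome in a Bottle reference). Variants are represented as
# (chrom, pos, ref, alt) tuples; production benchmarking tools perform
# far more sophisticated, haplotype-aware comparison.

def benchmark(calls: set, truth: set) -> dict:
    """Compute basic concordance metrics between two variant sets."""
    true_pos = calls & truth    # called and present in truth
    false_pos = calls - truth   # called but absent from truth
    false_neg = truth - calls   # present in truth but missed
    return {
        "sensitivity": len(true_pos) / len(truth) if truth else 0.0,
        "precision": len(true_pos) / len(calls) if calls else 0.0,
        "false_positives": len(false_pos),
        "false_negatives": len(false_neg),
    }

# Hypothetical example: two true variants, one detected plus one spurious call
truth = {("chr7", 117559590, "G", "A"), ("chr17", 41276045, "T", "C")}
calls = {("chr7", 117559590, "G", "A"), ("chr2", 47641560, "C", "T")}
print(benchmark(calls, truth))  # sensitivity 0.5, precision 0.5
```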

John Pfeifer, from Washington University, discussed how technologies and methodologies vary to meet different sequencing needs. For example, amplification-based libraries are better suited for detecting single nucleotide variants and small indels, while capture methods can acquire larger target molecules and detect a broader range of DNA variants. Each method has its intrinsic biases and varies in sensitivity and specificity across use cases, and these differences inevitably affect the types of variants that can be identified. The quality of sequencing runs can also greatly influence variant calling: poor quality reads confound complex genomic targets and produce inaccurate or skewed results. Variant calling programs require their own validation processes, and the bioinformatics programs used to annotate variants can produce different outcomes depending on the alignment strategy, reference genome, and quality control criteria. These points highlight the need for different standards for different types of NGS applications.
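One source of the inter-pipeline variability Pfeifer described is the read-quality filtering applied before variant calling. The sketch below shows a minimal, hypothetical filter based on mean Phred-scaled base quality; real pipelines use more elaborate criteria (tail trimming, mapping-quality cutoffs, duplicate removal), and the threshold here is an illustrative assumption.

```python
# Minimal sketch: one hypothetical read-quality filter of the kind whose
# exact criteria differ between pipelines. The threshold is illustrative.

def mean_phred(quality_string: str) -> float:
    """Decode a FASTQ quality string (Phred+33 encoding) to its mean score."""
    return sum(ord(c) - 33 for c in quality_string) / len(quality_string)

def passes_qc(quality_string: str, min_mean_q: float = 20.0) -> bool:
    """Keep a read only if its mean base quality clears the threshold.
    Another pipeline might trim low-quality tails or filter on mapping
    quality instead, yielding a different surviving read set and,
    downstream, different variant calls."""
    return mean_phred(quality_string) >= min_mean_q

print(passes_qc("IIIIIIII"))  # 'I' encodes Q40: high quality, kept
print(passes_qc("########"))  # '#' encodes Q2: low quality, discarded
```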

Karl V. Voelkerding, Professor of Pathology at the University of Utah, reported on the implementation of NGS standards. An NGS Work Group was formed by the College of American Pathologists (CAP) to develop standards for NGS and basic bioinformatics.6 Separating wet lab standards from bioinformatics analysis standards allowed the work to be done in separate settings. Representative standards for SNVs and indels are needed for test validation. A standard genome, called the CAP genome, was developed for proficiency testing; this sample contains a known set of SNVs and indels, including 200 disease-relevant loci. To date, 130 laboratories have engaged in NGS proficiency testing using the genomic DNA standards developed by CAP.

Victor Velculescu, Professor of Oncology at the Johns Hopkins Sidney Kimmel Cancer Center, spoke about the difficulty of developing standards for cancer genetic testing. In addition to varying across tumor types, cancer DNA differs from germline DNA in its heterogeneity and in the varying allele fractions of its somatic mutations. As an example, Velculescu noted that an NGS test with a 99 percent predictive value may still lack the sensitivity needed to detect a low-frequency somatic mutation.
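Velculescu’s point can be made concrete with a simple binomial model: detection of a low-frequency somatic mutation requires that enough reads sample the mutant allele. The depth, allele fraction, and read-count threshold below are illustrative assumptions, not parameters of any specific assay discussed at the workshop.

```python
from math import comb

def detection_probability(depth: int, allele_fraction: float,
                          min_alt_reads: int) -> float:
    """Probability that at least `min_alt_reads` of `depth` reads carry the
    variant allele, modeling read sampling as a binomial process."""
    p_fewer = sum(
        comb(depth, k) * allele_fraction**k * (1 - allele_fraction)**(depth - k)
        for k in range(min_alt_reads)
    )
    return 1.0 - p_fewer

# A somatic mutation at 5% allele fraction, sequenced to 100x coverage,
# with a caller requiring at least 8 supporting reads:
print(f"{detection_probability(100, 0.05, 8):.2f}")  # ~0.13: usually missed
```

Under these assumed parameters, even a perfectly specific assay detects the mutation only about 13 percent of the time, illustrating why predictive value alone does not guarantee adequate sensitivity.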

Marc Salit, from the National Institute of Standards and Technology (NIST), added that reference DNA samples have also been generated in conjunction with the Genome in a Bottle Consortium, the FDA, and NIST.7 Salit also commented that common, shared standards can drive optimization and accelerate technologies.

Barbara Zehnbauer, from the Centers for Disease Control and Prevention, raised concerns about the final interpretation and actionability of results by the physician. She also noted the difficulty of interpreting results in the laboratory without visibility into the patient’s context. Disease, she stated, is not static, and patient response cannot be entirely inferred from the cancer genome. There was some consensus among panelists that the FDA must balance regulation and oversight with standards created by independent entities and with new innovation.

Clinical performance of NGS tests

Heidi Rehm, director of a clinical genetics laboratory at Harvard Medical School, discussed the use of evidence and database curation in NGS testing. The Clinical Genome Resource, also known as ClinGen, is a public collection that centralizes clinical variants for the purposes of data sharing and supporting evidence curation.8 The NIH-funded effort employs the Gene Curation Working Group’s clinical validity classifications to apply tiers of evidence in assigning the causative role of a gene in a disease. The group has defined several focus areas for evaluating the 77,000 variants in ClinVar.9

Garry Cutting, of Johns Hopkins and a major contributor to CFTR2, commented that some variants and rare disorders are not highly prevalent in the United States; 55 countries are participating in the CFTR2 project. Other international consortia, including COSMIC, cBioPortal, My Cancer Genome, the Leiden Open Variation Database, the Global Alliance for Genomics and Health, the Human Variome Project, and others, are also contributing to the collection and curation of variants from around the world.

Louis Staudt, Director of the Center for Cancer Genomics at the National Cancer Institute (NCI), discussed efforts in the curation and taxonomy of cancer genomes. The Cancer Genome Atlas (TCGA) and other efforts utilize multiple approaches, including gene expression, methylation, and copy number changes in addition to somatic mutations, to identify converging pathways of molecular aberrations. The NCI is establishing the Genomic Data Commons (GDC) to store, analyze, and distribute cancer genomics data generated by the NCI and other research organizations. The GDC will provide a system for researchers to access data and suggest potential therapeutic targets based on genomic information.

Sherri Bale, of GeneDx, discussed the reporting of variants identified through clinical sequencing. From a historical perspective, the American College of Medical Genetics and Genomics (ACMG) convened a workgroup in 2013, including members from the ACMG, CAP, and the Association for Molecular Pathology (AMP), to update the previous 2007 guidelines on interpreting specific variants.10 The ACMG/AMP consensus recommendation for a variant classification framework was published this year. The framework assigns different levels of pathogenicity based on clinical significance and tiers of supporting evidence. Reporting to the physician poses additional challenges, due not only to the complexity of the primary test results but also to secondary findings, which include the 56 genes recommended by the ACMG as well as other unintended discoveries. She reported that approximately a third of GeneDx reports contain ‘definitive’ level results, and that individual physicians request a wide range of reporting and interpretation, from minimal interpretation to delivery of the raw and intermediate data.
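The tiered structure of such a framework can be sketched in code, with the strong caveat that the published ACMG/AMP guideline combines named evidence codes (PVS1, PS1–PS4, PM1–PM6, PP1–PP5, and their benign counterparts) under specific rules; the simple counting thresholds below are illustrative stand-ins for those rules, not the guideline itself.

```python
# Deliberately simplified sketch of five-tier variant classification.
# The counting thresholds are illustrative assumptions, not the
# published ACMG/AMP combining rules.

def classify(pathogenic_items: int, benign_items: int) -> str:
    """Map counts of supporting evidence items to a five-tier call."""
    if pathogenic_items and benign_items:
        return "uncertain significance"   # conflicting evidence
    if pathogenic_items >= 3:
        return "pathogenic"
    if pathogenic_items >= 1:
        return "likely pathogenic"
    if benign_items >= 3:
        return "benign"
    if benign_items >= 1:
        return "likely benign"
    return "uncertain significance"       # insufficient evidence

print(classify(pathogenic_items=4, benign_items=0))  # pathogenic
print(classify(pathogenic_items=1, benign_items=1))  # uncertain significance
```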

Robert Penny, from the University of Michigan and founder of Paradigm and Caris Life Sciences, discussed the reporting of somatic mutations in cancer. His approach has been to report only actionable targets, pointing specifically to the U.S. Preventive Services Task Force guidelines. He shared success stories using next-generation sequencing, including a case in which the identification of epiregulin expression led to a successful treatment plan involving cetuximab and cisplatin.

Regulatory considerations

Neil Risch, Director of the Institute for Human Genetics at the University of California, San Francisco (UCSF) and President of the American Society of Human Genetics (ASHG), discussed the importance of clinical factors in evaluating regulatory oversight of NGS tests. He pointed out that the accuracy and clinical actionability of NGS testing are context-dependent: whether a test is used for prognostic or diagnostic purposes greatly affects the clinical relevance of a lab result and the course of action. The evaluation and regulation of a test might then vary depending on the clinical context and on several distinct intended uses. Context dependence is also important in understanding what types of information are relevant to the current need. For example, the primary need of a patient being tested for cystic fibrosis is to know whether they have a pathogenic mutation in CFTR; they may not want to manage secondary findings, such as a breast cancer mutation, in that clinical context. Another consideration for oversight is the source and quality of clinical genetic knowledge: great efforts are under way to improve the quality of curation for genes and variants, but these collections are not yet adequate.

Madhuri Hegde, Executive Director of Emory Genetics Laboratory, discussed CLIA certification, CAP accreditation, and state-level accreditation. These checks assess test validation, quality assurance, analytical performance, exception logs, version history, variant interpretation, handling of unexpected findings, and more. Proficiency testing covers both wet lab and dry lab work.

Mya Thomae, Vice President of Regulatory Affairs at Illumina, commented from the manufacturer’s perspective. Manufacturers are working toward producing more stable products with appropriate labeling, and a clear regulatory paradigm for in vitro diagnostic products would benefit all parties. The CytoScan, Myriad, cystic fibrosis mutation, and 23andMe assays have all cleared FDA review through the agency’s creative efforts. She sees a path to regulatory clearance for large disease panels and for whole genome and exome sequencing.

Conclusion and public comment

The meeting showed enthusiasm for standards, accuracy, and validation for instrumentation and reagents; however, many participants felt that existing professional and state organizations have already established standards and monitoring for NGS-based testing. There was consensus that instrumentation and reagents on the one hand, and clinical interpretation of results on the other, are separate entities, with the former treated as devices and reagents and the latter as laboratory developed tests and professional interpretation. Public collections of variants and their curation for clinical relevance were applauded for their transparency and accessibility, although some points were made about the unique value of private, commercial data. Non-professional entities supported efforts toward workforce education and readily interpretable public data sources. Finally, understanding the context in which results are used, i.e., the intended use and clinical context, was voiced as paramount in validating NGS test results and in providing safe and actionable information.

Acknowledgements

We regret that, to maintain brevity, many of the excellent comments and examples contributed by the participants could not be included in this review. We thank the FDA for providing this workshop for public discussion of the current state of NGS-based diagnostics and the challenges ahead.

Conflict of interest

B.D.Y. is an employee of Interpreta, Inc. The views expressed herein are those of the authors and do not necessarily reflect the official policy or position of Interpreta or its parent company, nor does the mention of trade names, commercial products, or organizations imply endorsement.

References
