
US FDA’s Khozin On Defining Big Data, Safety Signaling, And The Patient Experience In Cancer

Executive Summary

The challenge for users of big data "is to develop the human, organizational and technical capacity to turn the 'big' into the 'smart,' through applied analytics to personalize therapies around the distinctive disease characteristics of each patient," Khozin says.

This article accompanies a larger discussion of INFORMED and the perspectives of Sean Khozin, acting associate director of the FDA Oncology Center of Excellence and founding director of INFORMED. (Also see "Big Data And The FDA: To Mine The Value, First Mind The Gaps" - In Vivo, 21 May 2018.) FDA describes INFORMED as an "incubator for collaborative oncology regulatory science research focused on supporting innovations that enhance FDA’s mission of promotion and protection of the public health ... Special emphasis is placed on systems thinking in oncology regulatory science research to facilitate development and adoption of new solutions for improving efficiency, reliability, and productivity in a broad range of workflows related to oncology drug development and regulatory decision making."

In Vivo: What does the FDA mean when it references the term “big data”? Like so many issues involving advanced technologies today, the concept is fuzzy but the implications – particularly for patients – are profound. That’s especially true given that the focus of your work at INFORMED is cancer research.
FDA's Sean Khozin: The reason big data is so important today is its potential to better capture the actual experience of the patient in medicine. It is not just about the “big” factor – we don’t evaluate its promise solely as a volume-based exercise, although that tends to come up first in the conversation. A common definition of big data is built around four dimensions: (1) volume (data size); (2) variety (data type); (3) veracity (data noise and uncertainty); and (4) velocity (data flow and processing). At the FDA, most approval decisions are still based on data of limited variety, mainly from traditional randomized clinical trials, highly structured within data sets that are relatively small in size and processed intermittently as part of a regulatory submission.

The challenge for the FDA – and indeed all users of big data – is to develop the human, organizational and technical capacity to turn the 'big' into the 'smart,' through applied analytics to personalize therapies around the distinctive disease characteristics of each patient. What this means in practice is putting much more emphasis on that second dimension, data variety. This includes tracking the patient journey through the health system to accurately and consistently record the outcomes of treatment. But it also must incorporate data generated by the patients themselves, on an ongoing basis, through diverse web-based apps and wearable devices. The FDA is aware that this approach works: in one trial published in the Journal of the American Medical Association (JAMA) last year, patients with metastatic solid tumors who were given a web-based platform to record their side effects from chemotherapy for real-time evaluation by cancer care teams experienced a five-month improvement in overall survival versus those who did not. It was a simple experiment, but it nonetheless showed that involving the patient with data relevant to their own condition can produce a positive health effect.
INFORMED is embedded in the FDA’s Oncology Center of Excellence. Why the focus on cancer, and what impact will your work have on the pace of treatment for a disease that will strike nearly 2 million Americans this year alone?
Cancer is an extremely complex and varied condition, to the point where oncology drug development has largely become an exercise in evaluating huge volumes of data drawn from disparate sources. Our increased understanding of the genetic origins of individual cancers has led to the DNA sequencing of solid tumors, creating a data pool so vast it outpaces our technological capacity to analyze it. Big data in oncology also incorporates not just individuals’ genetic information but data drawn from the microbiome, as well as from that larger environment outside the body – the exposome of external and lifestyle exposures occurring from the prenatal period onward through life. These in turn drive the similarly endless variations in treatment response, where data is critical to providing insights on the potential of an increasingly diverse set of therapies, many of which work differently, focusing on an immune system response, or are administered in combination with both new and older drugs.

The important point is that this trend runs counter to the traditional reductionist approach to drug treatment, relying on a single drug to attack an undifferentiated set of tumor sites and characteristics. This approach is not scalable to what we know about the biology of cancer today. For example, the most common mutation in non-small cell lung cancer, in the epidermal growth factor receptor (EGFR) gene, is present in only about 15% of the patient population, which means that fishing for the therapy that’s right for the individual patient requires a much bigger net – and a more nuanced approach. It demands a holistic therapeutic strategy focused on the complex signatures identifiable through systems biology and the entire multiomic milieu of gene and protein-based analytics. Only the biggest data sets can help researchers do that, which makes cancer the place where an incubator like INFORMED has the potential to contribute to the science and benefit patients.
Collaboration is a key rationale for INFORMED. How would you assess the biopharma industry’s response to your work to date – is it ahead of you or slower than it should be in helping advance your objectives on digital transformation?
We strive to cast our net widely, and INFORMED welcomes the support we have earned from many big pharma players. One of INFORMED’s first projects was a pilot we conducted with four companies – AstraZeneca [AstraZeneca PLC], Genentech [Genentech Inc.], Merck & Co. [Merck & Co. Inc.] and Novartis [Novartis AG] – where we tested the feasibility of a new digital framework for reporting important safety events occurring in clinical trials subject to investigational new drug (IND) regulations. Instead of submissions being disaggregated on receipt and sorted in paper and PDF files, INFORMED put together a team that included technical experts from the FDA Office of Surveillance and Epidemiology to develop a new digital framework in which the reports were processed electronically as machine-readable data sets, amenable to standardized visualization and analytical tools, including AI-based methods to conduct safety signal detection and systematically identify gaps in meeting regulatory requirements. The overall aim was to uncover missing or inconclusive safety signals. Reports from the four companies were successfully registered in the new system, validating the new digital format.

The format is now being institutionalized here in the US as the FDA Premarket Digital Safety Program announced by Commissioner Gottlieb last month, beginning with oncology NDA submissions. The FDA Office of Oncology has concluded that digitization of the adverse event reporting process will also be a major productivity booster, saving the equivalent of 500 man-hours every month once the program is fully implemented. Overall, we see the four companies’ contributions to making our idea work in practice as a highlight of what can be achieved through collaboration, using the big data tools enabled by the technology revolution.

What’s important about this is that FDA reviewers used to have to read through cumbersome paper and PDF files to identify safety signals; there was no signal detection in the premarket setting based on an accessible, organized data-set approach. And it’s really a global issue: we may decide in the near future to put our framework forward as a foundation for the global harmonization of premarket safety event reporting.
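Khozin’s description of the pilot centers on turning adverse event reports into machine-readable data sets that standard analytical tools can screen. As a rough illustration of what that makes possible, the Python sketch below computes a proportional reporting ratio (PRR) – a basic disproportionality measure used in pharmacovigilance – over a handful of made-up (drug, event) pairs. The data, drug names and the 2.0 flagging threshold are hypothetical; the interview does not describe the FDA framework’s actual formats or methods at this level of detail.

# Illustrative sketch only: NOT the FDA Premarket Digital Safety Program's
# actual schema or methodology. It shows, in miniature, the kind of screen
# that structured, machine-readable adverse event data enables.

# Hypothetical structured reports: (drug, adverse_event) pairs.
reports = [
    ("drug_a", "neutropenia"), ("drug_a", "fatigue"), ("drug_a", "neutropenia"),
    ("drug_b", "fatigue"), ("drug_b", "nausea"), ("drug_b", "rash"),
    ("drug_a", "nausea"), ("drug_b", "fatigue"), ("drug_a", "neutropenia"),
]

def prr(reports, drug, event):
    """How often `event` is reported with `drug`, relative to how often it is
    reported with all other drugs (a simple disproportionality measure)."""
    a = sum(1 for d, e in reports if d == drug and e == event)   # drug + event
    b = sum(1 for d, e in reports if d == drug and e != event)   # drug, other events
    c = sum(1 for d, e in reports if d != drug and e == event)   # other drugs + event
    d_ = sum(1 for d, e in reports if d != drug and e != event)  # other drugs, other events
    # Add 0.5 to each cell (continuity correction) so sparse cells in this
    # toy data set do not cause division by zero.
    a, b, c, d_ = a + 0.5, b + 0.5, c + 0.5, d_ + 0.5
    return (a / (a + b)) / (c / (c + d_))

for drug, event in sorted(set(reports)):
    score = prr(reports, drug, event)
    flag = "  <- potential signal" if score > 2.0 else ""
    print(f"{drug:7s} {event:12s} PRR = {score:.2f}{flag}")

Running the sketch flags drug_a/neutropenia, the pairing deliberately over-represented in the toy data. A production system would of course work from validated report schemas and far more sophisticated statistical and AI-based methods than this single ratio.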
Biopharma companies sometimes cite mixed signals from the FDA as a reason for not moving more aggressively to innovate in the use of data and evidence to advance pipeline performance. Is this perception still valid or has the situation truly changed?
It’s no surprise that industry worries about what the world’s largest regulatory agency thinks. And siloed, insular thinking is a recurrent challenge for any large organization, including the FDA. I spend a good deal of time explaining to industry colleagues that the FDA today has a strong technology- and data-driven outlook toward innovation. We are by no means a barrier to the creative application of digital technology to generate more and better evidence to drive drug development. In fact, the agency is on the leading edge of change in this area, in large part due to efforts from the commissioner’s office to promote technology innovations and greater evidence diversity throughout the agency. Ironically, that top-level commitment is not always present in the private sector. It is particularly hard in large biopharma companies to sustain the seamless flow of ideas in which clinical development teams join forces with commercial leads to exploit novel evidence generation tools like real-world evidence (RWE); each group has a history of approaching the product development cycle from a different perspective. I see technology, data science and digital as a bridge across that divide, but it takes initiative and a willingness to take calculated risks outside organizational norms. Some organizations have been slower than others in confronting the disruptions this may entail, but I am confident both government and industry are moving in the right direction.

 
