Welcome to BC|PODCAST

From Data To Health

The BC Platforms podcast series, ‘From Data to Health’, brings together innovators representing the ecosystem of users, custodians, and contributors of healthcare data. Each guest sheds light on how they advance and envision the future of health through data.

This series is hosted by Dr. Tõnu Esko, Head of the Estonian Biobank Innovation Center, Chairman of the BC Platforms Scientific Advisory Board, and Vice Director of the Institute of Genomics, University of Tartu.

 


 

Episode #3 – Real World Data Access for Label Expansion

This episode is an open discussion on how real world data (RWD) can be applied in drug development to enhance research and accelerate discoveries. It sheds light on industry challenges and expectations, as well as innovative approaches to collaboration. The discussion features insights from Jacek Nowak, with a focus on supplementing regulatory filings with RWD and how it can be used as an external control arm in clinical trials.
 
 

Transcript

Ellen Sukharevsky  0:30  
Hello and welcome to the BC Platforms Podcast. My name is Ellen and I will be your moderator today. BC Platforms is the global leader in providing a powerful data discovery and analytics platform, as well as data science solutions for personalized healthcare. BC Platforms enables cross-functional collaboration through our global federated network of data partners. Today's podcast will focus on the topic of real world data access for label expansion. It is an open discussion with a few key leaders on how real world data can be applied in drug development to enhance research and accelerate discoveries. This podcast sheds light on industry challenges and expectations, as well as innovative approaches to collaboration. The discussion is led by Tõnu Esko, BC Platforms Scientific Advisory Board Chairman and Vice Director of the Institute of Genomics, University of Tartu, where he also holds the position of Professor of Human Genomics. He is the Head of the Estonian Biobank Innovation Center and focuses on public-private partnerships and innovation transfer. Dr. Esko is also a research scientist at the Broad Institute of Harvard and MIT. He acts as one of the senior leaders of the Estonian Personalized Medicine Program and serves as a scientific advisor to several companies. Our speaker today is Jacek Nowak, Amgen's Executive Medical Director for European Midsize Markets. Jacek has been active as a pharmaceutical industry leader for over 25 years. He lives in Vienna, Austria. Prior to joining Amgen in 2003, Jacek served as Medical Director for Central Europe at Wyeth. He also held the role of Marketing and Sales Manager with ?, and prior to that he worked as a Medical Doctor in the Cardiology and Intensive Care Unit at the University Hospital, Slaskie School of Medicine in Katowice, Poland. Jacek holds the degree of Medical Doctor and earned his doctorate for research on leptin receptor polymorphism. He also holds a Diploma in Pharmaceutical Medicine from the Royal College of Physicians in London. Jacek's expertise spans organ transplantation, oncology, cardiology, and metabolic disorders. Now, I will hand over to the speakers for a brief introduction to begin the discussion.
 
Tõnu Esko  2:36  
Hello, everybody. My name is Tõnu Esko and I will be leading the discussion today.
 
Jacek Nowak  2:42  
Hello, it's a pleasure to be here. Jacek Nowak. 
 
Tõnu Esko  2:44  
So let's jump into this topic, which is quite exciting in my mind: how to use real world data in drug discovery, but also how to get drugs to patients using that data. I think you, Jacek, are an excellent partner to talk about those topics. Maybe we start by discussing: what is this real world data that is used so much in the literature and in discussions these days?
 
Jacek Nowak  3:18  
Well, real world data is becoming a sort of buzzword, but the fact is that we kind of love the randomized design. It's a long-term relationship, and we stick to it even though we know by now how expensive and inefficient it is. And it doesn't even answer the very question it was designed to answer, which is: does this drug really work? Well, think about it. The basic question the doctor asks when seeing a patient is: is this intervention I'm going to apply going to work for this particular patient? If you think about the payer, the payer would ask: do I know what I'm paying for? Or, to paraphrase, can I predict the outcome in the patient I'm covering? And finally, the patient is interested in: am I going to be cured when this intervention is applied to me? Randomized clinical trials don't answer those questions, yet they are still around because randomization is an extremely simple and powerful way of removing background noise. Now, in an era where patient identification and prediction of outcome are the driving forces behind the efficiency of the healthcare system, we have to look for new solutions. What we observe right now is a sort of universal modification of the drug development process that goes beyond simple randomization. One example is an adjustment of the concept of the randomized clinical trial itself, and a good example here is complex innovative design, which is being piloted by the FDA right now. Part of that pilot covers so-called adaptive trials, or adaptive designs, where, in a nutshell, the data accumulating in the study informs the design, which can be changed in the course of the study. Another interesting example would be the conversion of a randomized comparative trial into a single-arm trial, with parameter estimation modeled using existing data, a so-called historical control. This is not an easy call, but there are at least two situations where we can think about it. One is where we see a large effect size in a life-threatening disease, which somehow overcomes our concern about bias. The other is a small population, where randomization is not feasible.
 
Tõnu Esko  5:42  
So picking up on your last points, the number of patients is one of the deciding factors for using real world data. But how important, and how complicated, is it to work with different countries, different healthcare systems, even different ways the data is collected? I reckon that data quality, or how the data is collected, is very, very important if real world data is to be used in decision making.
 
Jacek Nowak  6:15  
Yeah, totally, you bring up an important point. When we talk about historical controls, we might typically think about real world data, but actually, it doesn't have to be real world data. Quite often, historical data can come from different trials of the same program, from a meta-analysis, or from the placebo arm of a different trial. What speaks for including real world data in this kind of project is the volume. What we see right now is a growing number of electronic health record databases, and also tons of registries that we have around. So this speaks for using real world data, but still, we have to be conscious that the data coming from those assets should represent a population as close as possible to the one we have in the experimental arm. A historical control is inherently biased, and obviously it requires careful explanation when the applicant talks to whoever the data is generated for; if we talk about drug development programs, that will be regulators. So, in order to avoid dead ends or stumbling over roadblocks, it is a good habit to communicate where the data will be coming from and how it will be analyzed. And even better, to reverse the order somehow: instead of designing your experimental arm and then matching the historical controls, look at the current standard of care and the data we have, and then design the experimental arm to fit into it. Obviously, it's quite a challenging topic, and there are a couple of areas where we actually see it happening. One of those is hemato-oncology. Hemato-oncology somehow ticks the two boxes, ethical and technical. Hematology has traditionally, historically looked at large effect sizes in single-arm studies in pediatric leukemias. When we look at solid tumors right now, we see a shift from the anatomic definition of a tumor to a definition based on molecular profiling. Molecular profiling stratifies patients into smaller cohorts, which again makes randomization quite difficult. The second thing we have to consider is a relevant endpoint for this kind of design. Regulators like to see so-called time-to-event endpoints, such as disease progression or survival. Yet those endpoints create a challenge, because a lot of things can happen over time, so at the end of the day, factors other than the intervention can impact the outcome. We know by now that qualitative measures like durable response correlate with the outcome pretty well, and they also reflect the real world. I think we can comfortably run single-arm studies based on that endpoint, because remissions in that space are quite rare. This is a typical process applied in the early stages of drug development; it can be a basis for approval, or it may require confirmation by randomized trials. The last point I wanted to make is that once we have agreed on the endpoint, it is also extremely important to understand the threshold of clinical importance of that endpoint. And here again, real world data is extremely important, because what we would do is look at different kinds of meta-analyses or standards of care that deliver particular outcomes, and based on those outcomes, we can decide on a design. If, for example, the current standard of care offers long survival, there's probably no way around a randomized clinical trial. 
But again, if we see a good result in the early stages of development of an experimental molecule, we can think about alternative designs. So let me give you an example of a platform that was developed a few years ago for the treatment of acute lymphoblastic leukemia, which is quite a rare disease with an incidence of about one in 100,000. This platform is called BiTE, which stands for bispecific T-cell engager. In a nutshell, it's an immunotherapy. It reflects the concept of so-called induced proximity, where two molecules that exist in biology and might never come close to each other can be brought together by a molecule that engages receptors on both of them. In this case, we bring a T cell, which represents the immune system, to a cancer cell by means of a sort of hybrid antibody. What happens when those two come together resembles, if you know Harry Potter, the character called the Dementor that gave the kiss of death. You see a very efficient kiss of death given by the immune system to the cancer cell. So we knew the drug was working, but because of the rarity of the disease it was really difficult to design a large, randomized study. That study therefore used a historical control: the experimental arm was roughly 200 patients, and the historical control was 10 times bigger. The historical control is typically bigger because of the high level of diversity in the population brought into that control. So what you need to do is stratify that group into different strata based on their risk factors, and then you have to weight them. The analysis is quite complex, but at the end of the day, we were able to demonstrate a survival benefit in the experimental arm.
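As a rough illustration of the stratify-and-weight step described above, here is a minimal sketch in Python. The column names (arm, stratum, responded) and the data are hypothetical, and the direct-standardization weighting shown is only one of several ways such an analysis can be done; it is not the method used in the actual study.

```python
import pandas as pd

# Hypothetical patient-level data: treatment arm, risk stratum, binary outcome.
data = pd.DataFrame({
    "arm":       ["experimental"] * 4 + ["historical"] * 8,
    "stratum":   ["high", "high", "low", "low",
                  "high", "high", "high", "high", "low", "low", "low", "low"],
    "responded": [1, 0, 1, 1,
                  0, 1, 0, 0, 1, 0, 1, 0],
})

exp = data[data.arm == "experimental"]
hist = data[data.arm == "historical"]

# Target stratum mix taken from the experimental arm.
target_mix = exp["stratum"].value_counts(normalize=True)

# Response rate within each historical stratum.
hist_rates = hist.groupby("stratum")["responded"].mean()

# Re-weight the historical stratum rates to the experimental stratum mix.
weighted_hist_rate = (hist_rates * target_mix).sum()
exp_rate = exp["responded"].mean()

print(f"Experimental response rate:        {exp_rate:.2f}")
print(f"Weighted historical response rate: {weighted_hist_rate:.2f}")
```

The key idea is that the historical control's stratum-specific response rates are re-weighted to the stratum mix of the experimental arm before the two arms are compared.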
 
Tõnu Esko  12:01  
Thanks for this very good explanation and overview of the situation. This is also a very elegant example of how, if we produce really good evidence, it can help bring new medicines to the market. In my position as one of the heads of the Estonian Biobank, I am driving the healthcare system to become more genomics focused, and Estonia will hopefully be the first country to be completely genetically profiled within the next 10 years. Knowing that there will be a lot of genetic data available in healthcare systems, how does that play into this real world evidence model, and how do you, Jacek, see it being used or integrated into the processes you explained before?
 
Jacek Nowak  12:58  
Let me go back to what I mentioned before, which is patient identification and prediction of outcome as key drivers of success in an integrated healthcare model. I would hypothesize that if we nail those two well, we may bring enormous efficiency to healthcare, because we get the right treatment to the right patients at the right time. Obviously, there is no silver bullet here, but what you mentioned, Tõnu, probably offers a first step towards personalized medicine: assessing the risk related to the genetic architecture of the patient. When we talk about genetically driven disease, we typically think about monogenic mutations or structural changes like Down syndrome, phenylketonuria, or BRCA. BRCA is a good example: it's a risk that can be caused by a single mutation in a gene coding for a protein responsible for DNA repair. But we have two copies of that gene, one from the mother and one from the father. If we inherit one mutated copy, the other copy is quite often sufficient to produce functional protein. But as we go through life, the risk of the second copy becoming mutated rises, and at the end of the day, the people in whom the second copy is mutated develop cancer at a young age. In women, this would typically be ovarian or breast cancer, but it also happens in men. However, if you really look at the diseases that are heritable, that have a genetic background, most of them are actually related to what we call polygenic variants. This is a new concept that has arisen on the basis of improved computing power, which allows us to assess hundreds, thousands, or even millions of different variants; each single variant has a negligible impact on the trait, but collectively they do. So what we are able to do today is run a full genetic screen and stratify patients, or healthy people, into different risk groups. A good example here is cardiovascular medicine, where there are published, validated scores that allow us to identify about 8% of the healthy population who may become cardiology patients, with a risk three to four times higher than those at the other end of the spectrum. So certainly there is an opportunity to bring genomics into practical medicine. It's not happening yet, we are probably at the stage of hypothesis generation, but I believe that practical application is just around the corner. A good example here is a study published in 2019 by a research group from Boston, who genotyped the participants of an already published trial, trying to find out the impact of a calculated polygenic score on the outcomes of those patients. The study looked at response to a lipid-lowering agent in terms of cardiovascular outcomes. What they found is that the genetic score was an extremely powerful predictor of that response; it actually doubled the absolute response rate in the patient population, and it was even more relevant than the clinical factors. So we see how powerful genetic evaluation can be in predicting both outcomes and treatment effects in a particular population of patients. And there are more areas like this: we could think about metabolic disorders, inflammatory disorders, osteoporosis, even cancer. 
So there is growing evidence that this polygenic score, which will be calculated with ever higher precision as more and more variants are discovered, can become a very important part of both drug discovery and development, and of practical medicine.
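To make the arithmetic behind a polygenic score concrete: it is essentially a weighted sum of risk-allele dosages across many variants, with the weights taken from published effect sizes. The sketch below is a minimal illustration with made-up weights and genotypes, not a validated score.

```python
import numpy as np

# Hypothetical effect sizes (e.g. log odds ratios) for a handful of variants;
# real scores combine thousands to millions of variants.
weights = np.array([0.12, -0.05, 0.30, 0.08])

# Genotype dosages (0, 1, or 2 copies of the risk allele) for three people.
dosages = np.array([
    [0, 1, 2, 1],
    [1, 1, 0, 0],
    [2, 2, 1, 2],
])

# The polygenic risk score is the weighted sum of dosages per person.
prs = dosages @ weights

# Stratify into risk groups, e.g. flag the top of the score distribution.
high_risk = prs >= np.percentile(prs, 90)
print(prs, high_risk)
```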
 
Tõnu Esko  17:16  
I'm really glad you brought up these polygenic risk scores, because these days the discussion very much revolves around rare diseases and diagnostics, or genetic profiles for precision diagnostics in oncology. But I very much agree with you that there are so many more fields where genetics could be applied, with more focus on early detection and early prevention, and even on stratifying people or patients into high and low risk groups and taking action. Here in Estonia, we have worked on implementing polygenic risk scores in breast cancer, which you already described, to see whether, by focusing on the top carriers of the polygenic risk score, we can actually detect breast cancer earlier and in this way make the screening programs much more efficient. Genetics will indeed play a great and important role, and someone has said it will move medicine further from art towards science, because we can rely on the data from an early age, since genetics does not change. The next important topic to cover is expectations towards real world data. I myself am a data custodian for the Estonian Biobank, and I expect many of the listeners also manage big data sets or run hospital systems. So the question is: what are the key properties that those big data banks or healthcare systems should keep in mind when collecting data, or when trying to build partnerships with industry in this space?
 
Jacek Nowak  19:12  
I think what is happening right now is driven by developments in information technology rather than in medicine. That is the key driver that allows for the analysis of large datasets, but also for linking those data. There are obviously a number of policy and legal issues that need to be sorted out in order to get access to anonymized healthcare data in a way that serves society, but I have been observing the space of observational research, real world data, and real world evidence for quite a while, and I see quite an important transition. What we can do right now, and what we are observing, is a move from observational studies as we understand them in the classical way into what we call data science. It is a completely different area that brings a completely different value into healthcare. So let me start with the customer. Typically, when we thought about an observational study in the traditional sense, we were thinking about physicians. Physicians were interested in how patients are treated in real life versus what the clinical trial told them. Today, we are moving from the physician to the broader context of, let me call it, the healthcare administrator: whoever is responsible for appraising the outcomes of patients to whom a particular intervention is applied, and for deciding whether that intervention is going to be accessible to those patients, yes or no. The second thing is the data source. We used to generate data for a particular project, driven by a case report form. Today, we have huge existing data assets, databases, biobanks; there is a lot of data available for research. Also the size of the data set: typically it was small, perhaps a couple of hundred data points. Today we are talking about hundreds of thousands and millions of data points when we link the data and look into big data sources. The analysis has changed, too. We used to apply typical epidemiological methodology to analyze the data; today we are talking about artificial intelligence and machine learning, and we already have good examples of algorithms produced with those technologies in medical imaging, in genomics, and in streaming data from wearables on patients. Validity is another important factor. In simple observational studies, the data is sometimes obsolete by the time the study is completed. Now we can generate data in real time, so we can talk about real-time evidence on top of real world evidence. And finally, probably the most important part: what does the data feed when it comes out of this research? Typically, we would use the data for utility analysis, appraisal of the standard of care, or modeling for reimbursement. But today, we can produce all those predictive risk scores that we just discussed, which means we can bring medicine much closer to the individual patient rather than the population. So I'm extremely excited about what is happening in this space, and I'm absolutely convinced that as we move on, we will make medicine much more efficient and accessible for patients.
 
Tõnu Esko  22:27  
I also think it is important to stress the quality of the data: having it mapped to international standards, machine readable, structured, and linkable with different data sources. You mentioned that a lot of electronic health records are now becoming available, but much of the information in them is still quite chaotic; you have to be a field expert, or really use advanced analytics and machine learning, to extract meaningful datasets. I think it's really good to see that most healthcare systems are catching up and using more and more automated data collection in order to make this type of research possible. So, Jacek, you also mentioned wearables, or real-time data collection. Have you, in your work, planned to use, or do you have ideas for using, some kind of wearables for risk stratification, or even to learn something about medicine? Wearables are quite a hot topic these days.
 
Jacek Nowak  23:52  
Well, what I think is happening right now is that we are moving people from hospitals to home-based treatment. The hospital is the key element, the key cost item on the agenda of healthcare, so it has to change; 30% of hospital expenses is administration. I think it's a no-brainer that people will be moved from hospitals to places where delivering medicine is cheaper, and for that reason it is unavoidable that we will have to make sure we understand what happens to those patients. The technology is here; we have started using it in clinical trials, and you can apply a lot of telemedicine and analytical platforms that collect data from patients while bypassing hospital visits. I think the first step has been taken. Probably within the next three to five years, and we all know the pandemic is accelerating a lot of the technological solutions being applied in medicine right now, I have no doubt that this will become a standard part of medical care.
 
Tõnu Esko  24:55  
I'm really glad that we end our discussion with really forward-looking topics like wearables in medicine, telemedicine, and the digitalization of data. I'm really thankful, Jacek, that you brought up those topics. With that, I think we will conclude for the day, and I hand over to Ellen to close this episode.
 
Jacek Nowak  25:18  
Thank you so much, really a pleasure talking to you.
 
Ellen Sukharevsky  25:20  
Thank you, Tõnu and Jacek, for joining today, and thank you everyone for listening. Speakers, do you have any final comments?
 
Tõnu Esko  25:27  
I agree we had an excellent discussion on relevant points, and as I said, telemedicine and digitalization are the future of healthcare, and of research as well.
 
Jacek Nowak  25:36  
The topic of our discussion was the opportunity to replace randomized clinical trials with the application of real world data. In my view, it is coming, but obviously the application of real world data is much broader than that. It will certainly find its way into the drug development process. But what's more important, it will help us personalize medicine and bring it closer to the patient.
 
Ellen Sukharevsky  25:59  
Great. Thank you very much, Tõnu and Jacek, for those final points. And thank you everyone for tuning into our podcast. To connect with our company and learn more, please email sales@BCplatforms.com or visit our website www.BCplatforms.com. Thank you and we hope to stay connected with you.
 
Unknown Speaker  26:16  
Thank you for tuning into our podcast. To connect with our company and learn more, visit our website BCplatforms.com, and follow us on LinkedIn for more engaging content. Thank you and we hope to stay connected with you.