
Preparing for the Tsunami (1998)


 

As I write this, Armageddon is the top summer movie, high-profile drug withdrawals are calling into question whether our industry is doing enough pre-market testing, and the rapid changes in drug discovery mean that, in the words of a top FDA official, we are facing a “tsunami” of new therapeutic candidates. We seem to be at the forefront of the latest fashion for end-of-the-millennium disasters. Are you ready for this?

 

This is a serious question for all players in the development phase of the pharmaceutical industry. We are familiar with the statistics showing how the number of trials, patients in trials, and trial complexity have already been rising rapidly, accelerating the cost of clinical testing. This creates a feedback loop of questionable business strategy: companies are driven to find the blockbuster candidate, as quickly as possible, because who can afford to test a drug that will make “only” $100 million a year? They can’t all be blockbusters, and some important therapies that people need will never generate blockbuster revenues.

 

The recent market withdrawals highlight the gap between testing drugs in “real” use situations and in the “artificial”, or what we would call “scientific”, controlled conditions of a trial, and they also highlight the increasingly impossible task of testing for all possible drug interactions. Pfizer did not anticipate the need for a Phase IIIb study of Viagra in 80-year-old men with heart conditions. And as more and more of us take more and more drugs on a daily basis (our lipid-lowerer, our half-an-aspirin, our bone-loss inhibitor, our St. John’s Wort), the “normal” patient using a drug in real life will not be taking it in a vacuum. How do we anticipate the potential for problems?

 

Press attention will blow over, but the wave of discovery candidates will only build higher. Add genomics and pharmacogenetics to combinatorial chemistry and high-throughput screening, and not only are we pumping out many more potentially active substances, but we may end up developing 8 flavors of the same SSRI or COX-2 inhibitor instead of one. Are clinical development teams going to be faced with doing a suite of trials on all the flavors? Even without genetic tailoring, almost every company talks about significantly increasing the number of NDAs to be filed each year.

 

The first group to be hit will be the preclinical departments, a world until now relatively immune from re-engineering, time pressures, and the spotlight. But they are living right on the shoreline where discovery’s waves break. Will we be doing rats and dogs for the rest of our lifetimes, or will the very nature of drug discovery, at the molecular and gene level, enable us to devise preclinical testing methodologies radically different from today’s, faster and more productive?

 

Managing Armageddon

 

The trends we have described carry numerous implications for drug development. Drug interaction dilemmas can be eased considerably by better use of safety surveillance procedures and technologies that already exist. Numerous schemes are available for electronic transmission of MedWatch forms, including over the Web, for instance; the question is who will get physicians to use them, and how. A major initiative among industry, government and healthcare providers is needed to take this issue more seriously.

 

Preclinical will need to wake from its slumber and confront the changing demands it will be facing. Sponsors are likely to take the lead in blurring the line between late discovery and early nonclinical development in order to filter the volume of candidates to a more manageable size. Nonetheless, we will need to expand the capabilities of pre-human testing, and the implication of the blockbuster business strategy is that we must speed the results and analysis of preclinical testing to accelerate decision-making. Enhanced computer-based reporting tools will be critical to make this happen.

 

I think a new area of clinical research will have to open up: “real use” trials. In the foreseeable future we can imagine this becoming a regulatory advantage, if not a requirement. What are the implications? Will this be where organized healthcare providers demonstrate their unique contribution to clinical research, be they PPMs or SMOs or some new useful combination? How will we reconcile real, dirty data with the controlled research model at the heart of our traditional research? New statistical models, and information technologies that make real-use data collection easier, will be required.

 

And more than ever, we will need to continue to change the fundamentals of the clinical research process. We must push forward on processes and technologies which slash study startup time, accelerate enrollment, and eliminate paper: get the data right the first time, accelerate locking a clean database, and simplify electronic submission.

 

There’s no going back to an “easier” time. Public and business pressures are likely to increase. Even the causes of change are not what we thought they would be a few years ago. So continuous improvement in our processes, and creative application of ideas and information technology, will be essential to keep up with this spiral of change. Let’s keep the tsunami on the movie screen.
