
Slower than Evolution (Monitor, 2011)

“If only we were as adaptable as viruses. Our clinical research processes unfortunately evolve at a much slower rate.”

 

The flu virus evolves continuously, so much so that a new vaccine is needed every year. It is the fast replication of viral generations that produces such rapid mutation. Viruses change so quickly, through natural selection, that they can respond almost immediately to new environmental conditions. If only we were as adaptable as viruses. Our clinical research processes unfortunately evolve at a much slower rate; clinical research can take a decade or more to make far less dramatic “genomic” changes. Can we learn as fast as natural selection?

 

Excuses

I can hear the objections to this comparison already: biology is “objective,” process is “soft.” The excuses we have for the slow pace of change are many:

 

Regulatory requirements. Biopharma quickly takes refuge in the argument that things don’t change because we are a regulated industry. This is rarely relevant to process discussions. There are actually surprisingly few regulations governing clinical research process (nowhere, for instance, do the regulations say we must perform 100% SDV), and on more than one occasion (I can think of at least three in the past decade), FDA has actually led the industry by issuing guidances describing processes more advanced than industry was initially willing to accept.

 

It’s not broken, so why fix it? This is a typical reaction among those working in a long-lived, experienced, and successful (i.e., drugs approved) development organization. But it is broken, and if you look, you will find the facts to prove it: inefficient policies, overlapping responsibilities, unexploited technologies, and political speed bumps slowing your processes.

 

The “not broken” argument is used a lot when considering ePRO, for instance: “Well, I know the paper data is probably bogus, but it’s the standard of approval.” Well, no, not anymore, but how long does it take for that fact to filter through research departments that are still rejecting ePRO because it costs too much? That objection is itself misleading: what costs “too much”? Don’t they really mean that they didn’t adequately budget for the cost of reliable data?

 

We don’t have enough evidence yet to justify the upheaval of change. This is false rigor and laziness. When management resists change for lack of “data,” ask where the data is that justifies the current processes. Try taking a zero-based approach to how you work, and see what proportion of your current processes would survive the analysis.

 

It’s an unfair comparison: viruses are much simpler organisms than humans. Ah, now we are on to something. If only we could examine human behavior the way we can examine a genome.

 

A Method

Actually, we do have a method for accelerating human evolution: it’s called learning. In the jargon of process improvement, we call it “Lessons Learned.” Many biopharmas would point out that they do Lessons Learned exercises routinely. So the questions are: how do we do them, what do we do with them, and where can we see the impact?

 

One concern is that Lessons Learned exercises have become perfunctory and disrespected. Is the lesson from the “Lunch ‘n’ Learn” session in the cafeteria excreted later in the day? Is the Lessons Learned binder carefully shelved, accumulating dust like autumn leaves? Are those who run these sessions prepared to make them productive? We are neither born teachers nor born learners.

 

Is the whole Lessons Learned concept thought of as a training function (and therefore underfunded and mostly ignored)? At best it is probably a Clinical Operations function, so the lessons do not permeate the medical, biometrics, and planning departments. More importantly, where are the lessons coming from? That is, do we have the skills or training to properly analyze the experience before us and derive the lessons that need to be learned?

 

Information technology is an important tool ready to help us learn: data on clinical research performance has never been so plentiful or accessible. But no company does a comprehensive job of mining that data for the most relevant process indicators, and no technology vendor does as well as it should in providing easy tools for such analysis.

 

If we’ve outsourced research operations, do we think we can outsource learning too? Are those we have outsourced to actually learning from their experiences? Must we pay them to learn, or will they otherwise have no incentive to improve?

 

We can and must do much more to learn from our past work. First, learning needs to be made an expectation and requirement of every function, not shuffled off to some ancillary group. I am amazed, for instance, when study teams are not held accountable for documenting the good and the bad of their experiences. Who were the high-performing investigators? What approach worked best in getting drug supply produced on time? Which data manager should be promoted because she was so effective? And I am equally amazed when functional specialties do not take the time to assess their effectiveness, to learn from each other, and to routinely identify means of self-improvement.

 

Be Afraid, Be Very Afraid

The best source of lessons learned is in the heads of our experienced staff, and here is where we should be very concerned. At the moment when we most need to learn from past experience, that experience is walking out (or being shown) the door. As we send staff to CROs, retirement, or forced career changes, we are losing the chance to capture, define, and articulate the lessons they could be teaching us.

 

We have to care much more about learning. As always, that starts with upper management. It continues with allocating money and time. And it is capped by a willingness to learn (that is, to change) and a method for applying lessons quickly.

 

George Santayana famously said, “Those who cannot remember the past are condemned to repeat it.” That was more than a hundred years ago, and here we are, a century later, with that lesson still unlearned. Our industry cannot afford to wait that long to apply the many lessons available at our fingertips. We may not be able to match the adaptation rate of viruses, but we will fall victim to the disease of ignorance unless we organize the learning from our experience.

 

©Waife & Associates, Inc., 2011
