
Shortcuts to Nowhere (Monitor, April 2007)

Many of the readers of this magazine are clinical research monitors. For many of you, the technology revolution in clinical research has been a runaway train and you’ve been the tracks. Far too often, in the process of introducing electronic data capture (EDC), electronic patient-reported outcomes (ePRO), or a new clinical trial management system (CTMS), sponsors take the shortcut that runs across your backs.


How do sponsors do this? The shortcuts take many forms. It starts with not involving monitors and their management in the initial EDC and CTMS decisions, except perhaps in the most perfunctory manner. This is rooted in the tradition of running vendor selection teams out of the IT or data management organizations. Despite being well into the 21st century, these groups retain their historically narrow view of any tool that has something to do with “data”, and monitors are not in that view.


But the shortcuts further downstream are more insidious. Sponsors, and the CROs they depend on, find ways to skimp on end-user training, user mentoring, and internal monitoring expertise. The need for detailed planning for, and documentation of, changed work processes that respond intelligently and sensitively to monitoring realities is either not recognized as important or silently understood to be too expensive. And the widespread outsourcing of monitors, while offering many advantages to sponsors, CROs, and the many monitors who choose this work style, is too often a serious mismatch with technology use.


More mysteriously, why do sponsors take these shortcuts? The damaging irony is that monitors are primary users of CTMS and EDC systems – much more so than the IT and data management folks (or even project management folks, in the case of CTMS), who are selecting these tools, the vendors, and the processes which will be used to apply them. CTMS systems, for instance, are notoriously disappointing at most sponsors. They are powerful software applications, but too often the “data” they produce is known to be inaccurate, untimely, costly to collect, and at worst, misleading. The failure of these CTMS implementations is rooted in the causes cited above: not including monitors in the planning of the project, insufficient training, and the failure to grapple with the inherent challenges of expecting outsourced staff to effectively use an in-sourced tool. Unfortunately, it is also rooted in the application design itself; no one who has been a monitor would have ever come up with the interfaces, architectures and reporting mechanisms of common CTMS systems.


With a CTMS, bypassing monitors is all the more serious, because monitors are not only primary users, they are the primary source of the data in most CTMS designs (understandably). It is monitors who are expected to enter the core actions, events and facts which roll up to the beautiful charts for executive management at the end of the month. But the accuracy and timeliness of that data is undermined by shortcuts in training, user support, and staffing strategies.


EDC is similarly plagued. While sites may be the most important users of electronic data capture, monitors are not far behind in importance, and indeed they are expected to be the primary support for the sites themselves. And yet how much do we invest in monitor training and support in our overall EDC implementation projects? In story after story from sponsor after sponsor, we hear sheepish admissions that the study timeline, the project budget, clinical operations resistance, or interdepartmental politics got in the way of the best-laid plans for monitor preparation. In every one of these stories, the result has been frustrated sites and study managers, angry monitors, and watered-down benefits from the costly technology innovation. In the still-common situation where the enterprise is skeptical of innovation (and down at the operational level, where the real work gets done, resistance runs high in the heat of trial execution), this lack of support for monitors adds to the obstacles to speedy change.


What it Takes

What it takes to pave straight roads to research process change is straightforward. The cost and time for training and supporting monitors, help desks, and all those affected by the technology introduction must be planned for, committed to by upper management, and then executed through to the end by professional trainers (ideally in-house) – not shortcut, not skimped, not put off to next year, not sloughed off to so-called “super-users” (the ultimate cop-out, if not backed up by ubiquitous support). Note that it is not just training that is needed – something most companies assume to be a one-time effort – but ongoing support through staff identified as coaches, mentors, or a similar specialization.


It also means not short-circuiting clinical involvement when acquiring, budgeting for, or changing tactics in the software acquisition itself. The path may seem shorter to avoid involving those unused to technology selection, but the consequences of skipping along this shortcut will eventually be a “Bridge Out” sign in your path.


e-Ready or Just e-Willing?

The most common shortcut to “e-readiness” is for the sponsor to look at monitoring preparation and say, “hey, let’s outsource it along with everything else”. Let’s find CROs who promise that their staff already know EDC or use a CTMS. Job done; on to the next issue. There is no stopping the use of outsourced or contract monitors, or full-service outsourcing of all trial functions (nor should there be), but sponsors must re-examine the value proposition, and the cost-benefit assumptions about CRO usage they have made for many years, if and when new technologies are expected to be a linchpin in the clinical plan.


Most CROs are savvy enough to tell their customers they are ready and even eager to use electronic tools. They may even be sincere in describing themselves as “e-ready”. But in some cases this has become a new source of sponsor disappointment: the monitors who show up in a state of “e-readiness” may have used EDC once in 1997, or the CTMS they are used to may have been designed for a CRO’s business instead of a sponsor’s, and thus has considerably different functionality and interfaces. Similarly, you may be assured by a support vendor that their help desk is EDC-savvy, or clinical-research-savvy, and the assertion is taken at face value but fails in the execution. Too often, sponsors don’t discover these gaps in readiness until the investigator meeting, or some time after the first monitoring visit, when the knowledge chasm is too wide to be hidden and very dangerous to cross. So then another shortcut is taken: let’s use an online tool to train users in the online tools – what could be more modern? It’s not that e-learning doesn’t work; it is that when sponsors don’t think through what the information and support needs are, they will continue to rely on shortcuts wherever they can be found.


Sponsors will also turn to the software vendors themselves for this training and support; they all offer it, and who knows their product better? They certainly know their product, but they don’t know you. This “generic” systems training produces generic results: your staff will know what button to push, but not much about how to use these tools for the benefit of your trial and your clinical development productivity.


Following what is perceived as a shorter path to the target, sponsors commit the primary failing of outsourcing in the technology context: they do not consider what it implies, and what it will cost, to rely on contract employees, generic training, or the spare time of super-users for the success of EDC or the usefulness of CTMS data. In this way, sponsors carry a complex change management project to the brink of success, only to realize they have taken shortcuts to nowhere.


We can only hope that sponsors reform their approach to training and resourcing when introducing new technologies now and in the future. Instead of the light ahead being that of an oncoming train, you should insist that the next light you see will only be one of knowledge, respect, and intelligent strategy.
