Management Consulting for Clinical Research

Few areas in pharma can benefit more from a top ten list than pharmacovigilance (PV). Ongoing developments in areas like E2B formats, potential new FDA reporting requirements, data-mining, data safety monitoring board (DSMB) needs, EudraVigilance, and EDI gateways have made the world a complex and challenging place for PV professionals today. In that context, a top 10 list devoted to pharmacovigilance can be of great service, either as a planning tool – for those wondering how to respond to these challenges – or as an internal check-up – for those looking to make a mid-course correction. So here are my top 10 tips for pharmacovigilance. It is my way of organizing the PV world into an action list.

 

1. THINK PAPERLESS

Yes, ‘paperless’ can be a cliché, but within clinical development, it is more and more of a reality. EDC systems collect data at sites without paper case report forms. Field monitors produce trip reports ‘online’ within their company’s clinical trial management system. The same reports are reviewed and signed electronically by management. On the back-end of the clinical process, data managers and even medical monitors are reviewing the data with online tools that provide sorting, graphing and searching capabilities. Yet in the PV world, case processing still consists of piles of folders, passed physically from one desk to another. Red folders for ‘hurry’; blue folders for ‘no hurry’; the whole scene could come from an office in the 1950s. Searching the files of vacationing colleagues for needed documents is a typical story that only magnifies the sense of déjà vu. It is ironic that today’s adverse event systems allow electronic routing of cases, scanning and storage of external documents, and other capabilities that make the folders obsolete. Yes, you can still print a document or a listing as a working copy, but the official record remains within the validated, secure system. The technology supporting this is mature and ready to be used.

 

2. START MEASURING

Let’s face it, measuring takes time that many of us are not prepared to spare. How many cases do you process in a year? How many people do you need to do this? What is your average number of follow-ups to get a case closed? Is this number good or bad? How do you know? Before you go to your boss to ask for that new system or that additional headcount, it would be good to know how well you are doing with what you have now. One PV organization’s metrics claim a throughput of over 1,000 cases per year per FTE, while 250 cases per year per FTE is cited as the ‘industry benchmark’. If 1,000 is the upper bound and 250 is the mean, there must be a lower bound out there somewhere. Can you imagine a company where only 100 cases per year per FTE are processed? Can you then imagine the differing workflows and data flows that produce a four-fold, or even a 10-fold, difference in efficiency? Where do you fit into this scheme? Your boss should ask you these questions before she signs up for more money or headcount.
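The arithmetic behind these benchmarks is simple enough to script as a planning aid. The sketch below is illustrative only: the case volume and headcount are hypothetical, and the 250-cases-per-FTE figure is simply the benchmark quoted above.

```python
# Throughput arithmetic for the metrics discussed above.
# All volumes and headcounts here are hypothetical.
def cases_per_fte(cases_per_year, fte_count):
    """Annual case throughput per full-time equivalent (FTE)."""
    return cases_per_year / fte_count

INDUSTRY_BENCHMARK = 250  # cases/year/FTE, the figure quoted above

our_rate = cases_per_fte(cases_per_year=3000, fte_count=20)
print(f"Throughput: {our_rate:.0f} cases/year/FTE "
      f"({our_rate / INDUSTRY_BENCHMARK:.0%} of benchmark)")
```

Even this trivial calculation forces the two questions the boss will ask: how many cases, and how many people.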

 

3. BROADEN YOUR VIEW OF E2B

In today’s pharma landscape, marketing rights for a product can be divided across multiple companies, countries, indications and even categories of healthcare providers. One marketing partner may end up with the general practitioner market segment for a single indication across the Nordic countries and the UK, while five other companies will divide up the rest of the market across the EU, the US and Japan. Such marketing agreements, while presumably advantageous from a sales perspective, produce multiple, overlapping ICSR reporting responsibilities for the PV professionals at each company. How can a PV professional meet the interlocking demands for receiving and sending ICSRs across so many partners for a single product?

 

One answer lies in looking beyond E2B as a means of reporting to health authorities, and considering the use of both the common format and the gateway functionality to communicate with marketing partners. System vendors have promoted this for years, yet the typical sponsor is still struggling just to use the gateway to communicate with authorities. Many of you will know that the so-called E2B or ESTRI gateway is built on an electronic data interchange (EDI) format that is used by thousands of companies (including pharma companies) to process millions of transactions each year with their customers and supply partners. Yet, even in 2007, the number of ICSRs transmitted via a gateway is still measured in mere thousands, and the vast majority of these are between sponsors and agencies. Inter-partner exchange is still dominated by PDF attachments and retyped data. E2B and the gateway offer a new and better way forward.
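To make the idea concrete, here is a toy sketch of the ‘one format, many recipients’ pattern: a single case message, serialized once, routed to authorities and marketing partners alike. The element names only loosely echo an E2B ICSR and are not the real ICH DTD; the routing function is a stand-in for an actual EDI/AS2 gateway, and all identifiers are invented.

```python
# A toy illustration of 'one common format, many recipients'.
# Element names loosely echo an E2B ICSR but are NOT the real
# ICH DTD; route() is a stand-in for an EDI gateway transmit.
import xml.etree.ElementTree as ET

def build_icsr(case_id, reaction, product):
    """Serialize one simplified case to an XML string."""
    report = ET.Element("safetyreport")
    ET.SubElement(report, "safetyreportid").text = case_id
    ET.SubElement(report, "reaction").text = reaction
    ET.SubElement(report, "medicinalproduct").text = product
    return ET.tostring(report, encoding="unicode")

def route(message, recipients):
    """Deliver the same message to every agency and partner endpoint."""
    return {r: message for r in recipients}  # one message, many recipients

msg = build_icsr("US-EX-2007-001", "Headache", "Exampledrug")
out = route(msg, ["EMEA", "FDA", "PartnerCo-Nordics"])
```

The point is the shape of the solution, not the code: once a case exists in the common format, adding a marketing partner to the distribution list costs almost nothing, whereas each PDF-and-retype arrangement costs as much as the first.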

 

4. STOP CHECKING THE CHECKERS

Many PV processes are still marked by the idea that ‘checking and rechecking’ equals quality. One process I encountered produced the following workflow:

• ICSRs are printed out by clerical staff for medical monitors to review

• Medical monitors redline the ICSRs and send them back for data entry

• Clerical staff make the changes, reprint the ICSRs, and send both copies back to the medical monitor

• The medical monitor then checks to make sure the corrections have been ‘correctly’ made.

 

Such ideas of ‘quality’ ignore the fundamental advances made in this area by people like W. Edwards Deming and movements like TQM. Quality is now understood to be a good process followed by trained professionals. Gone are the days when we start searching for problems at the end of the assembly line. The same change needs to happen within your organization’s processes.

 

5. LOOK BEYOND REPORTING

In many PV organizations, the main focus of departmental work, and even departmental goals, remains satisfying regulatory deadlines for reporting. Did we meet the expedited targets? Are all the annual updates filed on time? Such a view minimizes the potential value of pharmacovigilance to the larger organization, and contributes to the view that PV professionals mainly push paper around for a living.

 

What do your data tell you about the profile of your company’s products? What ideas do your data generate about potential new studies? How can they inform on-going protocols? Getting things filed on time remains essential, but it is no longer sufficient. There are more ways to add value, and forward-looking PV departments are finding ways to move beyond reporting alone.

 

6. KEEP A DUAL FOCUS

While most of your staff are immersed in the specifics of each case, making sure to turn PV work around on time, someone in your group needs to be looking at the bigger picture. Where are the cases coming from? Are these numbers in balance with market share data? Are incident frequencies giving us clues we should pay attention to? Is this bigger picture review a stated part of your process? If so, which role is doing this? My repeated impression when visiting PV operations is that they are very busy, they focus on detail, and they are under pressure to get cases and reports done and ‘out the door’. Without planned, allocated time to review, consider and contemplate the big picture, an important part of the PV function is being missed.

 

7. MAXIMIZE THE BENEFITS OF YOUR SYSTEM

How much of your adverse event system are you actually using? It is amazing to find duplicate functionality being maintained by PV departments outside of the AE system. The most common example of this is tracking spreadsheets. Even though most adverse event systems have tracking and timing mechanisms that can display the status and timelines of cases, these features go unused in favor of a spreadsheet. Another common culprit is reporting. Some departments have built complete, parallel databases in MS Access to run reports. Data get double loaded, or even double entered, so that summary reports on the data can be used. The AE system’s own reporting capabilities are ignored, or simply not trusted. If this sounds like your situation, fixing this may become the number one idea on this list. Ignored functionality means more cost, more training, more risk and a lousy thing to admit when your boss is quizzing you on your budget increase requests.

 

8. UPDATE YOUR STANDARDS

The words ‘SAE reconciliation’ generate a universal response across the industry, and it is not necessarily a good one. Reconciliation is an ongoing job that creates burdens for both clinical data managers and PV professionals. The good news is that both sides now have a dominant standard: SDTM 3.x for the data managers and the E2B DTD 2.x for PV departments. The use of these standards can facilitate reconciliation by providing consistent, predictable relationships between the two domains. Automated reconciliation can become better than ever by capitalizing on this consistency across all clinical trials.
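The consistency these standards provide is exactly what makes automated reconciliation scriptable. Here is a minimal sketch, assuming simplified SDTM-like AE records on the clinical side (subject, term, start date, seriousness) and a hypothetical flat case structure on the safety side; a real implementation would match on the full SDTM AE domain variables and the safety system’s actual data model.

```python
# A minimal sketch of automated SAE reconciliation. Clinical
# records loosely follow the SDTM AE domain (subject/term/start);
# the safety-side structure is hypothetical.
def reconcile(clinical_aes, safety_cases):
    """Return serious AEs present in one system but not the other,
    keyed on (subject, uppercased term, start date)."""
    key = lambda r: (r["subject"], r["term"].upper(), r["start"])
    clin = {key(r) for r in clinical_aes if r["serious"]}
    safe = {key(r) for r in safety_cases}
    return {
        "missing_from_safety_db": sorted(clin - safe),
        "missing_from_clinical_db": sorted(safe - clin),
    }
```

The value of the standards shows up in the key function: when both sides code terms and dates the same way, the match key is stable across every trial, and the same script runs everywhere.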

 

9. GOOGLE YOUR DATA

The buzzword is data-mining, and it refers to sophisticated signal detection capabilities run against large amounts of data. Drug interactions and event interactions are among the key outcomes being produced. Data-mining, however, is heavy-duty, technical stuff compared to case processing, with a very different tool set and a demand for expertise that may go beyond the capabilities of your current staff and your IT support. It is, however, a big part of the future of pharmacovigilance, and the sooner you get started the better.
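One of the simplest signal detection measures used in this kind of data-mining is the proportional reporting ratio (PRR), computed from a 2x2 table of report counts. The sketch below shows only the core formula; production signal detection adds statistical adjustments and thresholds well beyond this.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 report table:
      a = reports with the drug of interest AND the event of interest
      b = reports with the drug, without the event
      c = reports with the event, for all other drugs
      d = reports with neither
    PRR = (a / (a + b)) / (c / (c + d)); values well above 1 suggest
    the event is reported disproportionately often for the drug."""
    return (a / (a + b)) / (c / (c + d))
```

For example, if 20 of 100 reports for a drug mention an event that appears in only 100 of 9,900 reports for all other drugs, the PRR is about 19.8, a strong disproportionality worth a pharmacovigilance review.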

 

10. START COMPARING

No, I don’t mean just comparing your operations with other companies; I mean comparing yourself against yourself. Contrary to widely held views, there is more value in knowing how much better (or worse!) you are getting over time, as compared to merely knowing whether you are keeping up with your neighbors. PV departments across the industry are organized and staffed quite differently, with big differences in their technology infrastructures as well. Some companies have large, centralized operations, while others, often due to mergers and other corporate history, are dispersed and decentralized, even using different legacy AE systems. So comparing yourself with the guy over the fence may be unhelpful and even misleading.

 

Much more valuable is both knowing and measuring the impact of your own improvement efforts. You bought that tool and rebuilt that process last year. Has it made a difference? Was the difference worth the cost? These are the real signals of process improvement. You can still trade stories with your PV colleagues at the DIA conference. Just don’t confuse that with measuring progress.

Frank Capra’s 1946 film, It’s a Wonderful Life, portrayed a common theme: George Bailey, like many of us, didn’t realize how fortunate he was to be living the life that he was. If you are a CRA fortunate enough to be using EDC to conduct clinical research, you too can have a wonderful life.

 

Hard to believe, since you are a busy CRA who is monitoring multiple sites for multiple trials, all at the same time? If you are “e-enabled”, your life consists of: 1) using the capabilities of EDC to make better use of your time; 2) enjoying increased flexibility as to when and where you do certain tasks; 3) allowing the computer to do mundane tasks for you; and 4) exploiting the EDC data to isolate potential problems before they can cause you trouble.

 

Let’s see how this works by looking at the wonderful life of Mike Monitor, a regional CRA, who is: 1) monitoring several sites for Study A; 2) completing and locking one site for Study B; and 3) initiating a new site for Study C. We’ll follow Mike for a week and see how he uses EDC to cope with his workload and optimize his life/work balance. (Note: If this week in Mike’s life doesn’t sound much like one of yours, it may be time to voice your concerns about how your company is using EDC.)

 

Monday, 9 A.M.

Cleaning data remotely. Mike’s company allocates regular office time for regional monitors, so they can review and clean data before the next site visit. Mike’s office day for this week is Monday. Mike begins by filtering the data in the EDC system to isolate all new data entered since his last cleaning session. He works his way through the data review items per his monitoring plan and raises several manual queries. He also checks for open queries and sees that nine out of ten from his last session have been answered. He reviews the answers on the nine and closes those queries. At the same time, he makes a note in the comment field of the data item with the open query, so that he can discuss this query with the site at his next visit.

 

Tracking site performance. Mike has set an expectation with the site that they will catch up on all data entry at least once a week. This expectation has been set during the training session and, beginning this year, is also expressed as an incentive clause in the site contracts for all new studies. Mike runs a standard report that shows the average time between a patient’s visit and the day on which the data for that visit are entered into the EDC system. An average time for each of his sites, as well as an overall average for each study, are displayed in the report. Two of his sites currently qualify for the incentive, while a third site is close to doing so. Mike makes a note in the system to discuss the incentive with this site at his next visit.
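The standard report Mike runs reduces to a group-and-average over entry lags. A sketch of that calculation, with an illustrative record structure (the field names are invented, not any particular EDC system’s API):

```python
# Average days between a patient's visit and EDC data entry,
# grouped by site. Record fields are illustrative.
from collections import defaultdict
from datetime import date

def entry_lag_report(records):
    """Return {site: mean lag in days} for visit-to-entry delays."""
    lags = defaultdict(list)
    for r in records:
        lags[r["site"]].append((r["entry_date"] - r["visit_date"]).days)
    return {site: sum(v) / len(v) for site, v in lags.items()}

example = [
    {"site": "101", "visit_date": date(2007, 6, 1), "entry_date": date(2007, 6, 3)},
    {"site": "101", "visit_date": date(2007, 6, 2), "entry_date": date(2007, 6, 6)},
    {"site": "102", "visit_date": date(2007, 6, 1), "entry_date": date(2007, 6, 2)},
]
```

A per-site average like this is exactly the number an incentive clause can be written against, which is why having the EDC system compute it beats tracking it by hand.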

 

Anticipating problems. Mike also examines a list of all his queries, sorting them by eCRF form and then by data item. By doing this, he can see the number of queries that have been issued for a single data item. He notices that he has generated a query on the same primary efficacy endpoint for 12 patients across all his sites for Study A. As this study still has two years to run, he puts a comment on that data item to retrain his sites on the data completion guidelines for that item.
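Mike’s sort-and-count review of queries by form and data item amounts to a frequency table. A sketch, with hypothetical field names standing in for whatever the EDC system exports:

```python
# Count queries per (eCRF form, data item) to surface items that
# repeatedly fail, as in the review described above.
from collections import Counter

def query_hotspots(queries, top_n=5):
    """Return the top_n (form, item) pairs by query count."""
    counts = Counter((q["form"], q["item"]) for q in queries)
    return counts.most_common(top_n)
```

An item that tops this list across sites, like Mike’s efficacy endpoint, points to a training or CRF-design problem rather than a site problem.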

 

He also runs a separate report on the percentage of missing items and sees that one site has twice the percentage of his other sites. He composes an email to the coordinator for that location, citing several examples of missing data elements and offering to discuss with the site how to avoid this going forward.

 

Finally, he filters his query list to display only queries that have fired automatically. He examines and closes most of them, but notices that his largest site in Study A has a habit of insisting on the original entry and providing rather weak explanations for why the data value, in spite of the query message, should be considered to be correct. Mike is particularly concerned with this trend, as this site has three times the enrollment of the other sites and is the leading enroller across the entire study. He considers what this means as far as the suitability of these patients and spends additional time reviewing the five enrollment waivers that have been granted to this site. He also makes a note to reexamine the inclusion/exclusion data in the source documents at his next visit. Mike knows that this high enroller will most likely be the target of a pre-approval inspection by Bioresearch Monitoring auditors and he wants to be proactive in addressing potential concerns that may arise then.

 

Tuesday, 7 P.M.

On Tuesday evening, Mike hops a plane for a two-day road trip: one day for a monitoring visit at a site in Study A, and a second day for an initiation visit at the first site for Study C.

 

Wednesday, 9 A.M.

Doing SDV. This morning, Mike is at the site for Study A. The site coordinator provides Mike with the patient files and Mike logs in to the EDC system with his own laptop and verifies the data against the source. Since he is not using the site’s computer, he can work freely without interruptions. Having cleaned these data on Monday, Mike’s activity is focused on checking for transcription errors against the source and also looking for unrecorded AEs. Only one transcription error is found and Mike chooses to raise a query about this within the system, as the site coordinator is unavailable for consultation at that moment. While in the EDC system, Mike also sees the note he created on Monday about the unanswered, open query. He finds the answer himself in the source documents and closes the open query.

 

Mike is done with his work by 1 P.M. This early finish is consistent across all of his EDC studies, where he finds that he spends about 50% less time working through patient files compared to a paper trial. Regularly viewing and querying these data prior to the site visit has made this acceleration possible.

 

Thursday, 6 P.M.

Catching the LPLV. On Thursday, Mike has traveled to a different site for an initiation visit for Study C. At the same time, Mike is anticipating the last visit of the last patient that afternoon for Study B. Fortunately, Mike has worked out an agreement with that site’s coordinator to expedite the entry of these data into the EDC system. Now Mike is at the airport on his way home, and he is able to log into the EDC system and view and clean these LPLV data. He also source verifies this final visit, using three anonymized source documents that the site has agreed to send via e-mail. Just before boarding the plane, Mike is able to flag this visit in the EDC system as SDVed and clean. He also sends an email to the data manager for the trial indicating that, from a monitoring perspective, the trial data are ready for locking.

 

Friday, 9 A.M.

Checking enrollment. On Friday morning, Mike logs in to the EDC system for each of his studies. He knows that enrollment updates are due in the corporate CTMS on Fridays. He also knows that the IVRS/EDC interface has done its weekly update overnight on Thursday and that all randomized patients now have an entry in the EDC system. He runs an online report showing the total enrollment for every study and the progress status for each patient. He has to update the CTMS with these data, but has figured out how to copy and paste the results from the online EDC report, so that he doesn’t have to retype them in the CTMS. He also knows that even this will eventually go away, as an EDC/CTMS interface will arrive in 2008 to complement the IVRS/EDC interface.

 

Having finished his EDC work for the week, Mike spends the rest of Friday on his many other duties, including writing trip reports, reviewing the protocol for Study C and participating via conference call on the project team for the new EDC/CTMS interface. At 5 P.M. on the dot, Mike is off into a restful weekend.

 

The Rest of the Story

In case you think this is sounding more like Sir Thomas More’s Utopia, let me quickly remind you of the rest of the story. EDC doesn’t mean that your sites will never make irritating mistakes; that site responses to your queries will never be confusing; or that the technology will never have its negative moments. EDC will also require, at least initially, more focus on training and a new perspective on defining and testing the EDC system from the site’s perspective, requirements that may involve you in activities you didn’t have to do in paper trials.

 

But good use of EDC will also always mean: 1) that you are more in control than ever; 2) that you now have new and flexible ways to do your work on your terms and according to your own schedule; and 3) that mundane tasks like query generation and resolution will be easier than ever, leaving more time for true clinical research and building positive site relationships.

 

A key part of our movie, It’s a Wonderful Life, is our hero George being shown what life in his hometown would have been like if he had never been born. Review Mike Monitor’s week and imagine it without EDC. Recall what it is like to do a paper trial: the mountains of binders; the queries coming from headquarters six months later; the forced timing of site visits to pick up CRFs; the pressure to finish your work at a site before they closed for the day; and the many data “surprises” that arose as you tried to lock the database. The good news is you don’t have to jump off the bridge: EDC is here to stay, and you really can have a relatively wonderful life – not in the movies, but working as an “e-enabled” monitor.

Biopharma executives regularly face a series of decisions beyond their original professional competency. This is a requirement for biopharma executives’ success and I imagine therein lies much of the appeal of the job. In all 21st century management, decisions are accompanied by choices in the use of information technology, and this is no less true in biopharma clinical development. This supplement covers a sampling of the myriad issues in using IT to support the process of human testing of new drugs.

 

IT decisions should be like any other decisions you are making – they must first be cast in the mold of your business strategy and business conditions. Both strategy and conditions vary widely from company to company, even when the companies may seem so similar in purpose and objectives. (This is why “benchmarking” is so dangerous.) As these circumstances vary, so do the parameters upon which IT decisions must be based. The issues are different and so too are the resolutions, depending on how your company is funded, staffed, experienced, pipelined, partnered, organized and led. Indeed, IT is entwined with company strategy and business conditions in a Gordian Knot of inseparable implications.

 

Biopharma execs are often impeded by their own backgrounds – commonly either academic or laboratory-based. Clinical research is unfamiliar territory and may appear to be misleadingly “simple” compared to bench science and the “genius of discovery”. While the cry of “eureka!” may be hard to hear during the long years of clinical trials, the science and rigor in clinical research are no less important to bringing a discovery into medical practice. So the first challenge for biopharma execs is to put the right people in charge of clinical development – those who understand and respect it. It may seem odd, but this is the first step to effective use of IT in clinical research.

 

Decisions, decisions; choices, choices

Managing IT for any purpose encompasses infrastructure, platforms (hardware/software), networking and security, quality management and validation, user support and maintenance, and the software applications which your staff actually use. In clinical development, there is a wide range of applications which can be employed, and a biopharma must determine which of these it really needs and when. A sampling of such applications includes:

• Data handling (EDC, ePRO, CDMS)

• Trial conduct (CTMS, IVRS, study portals)

• Safety surveillance and reporting (AES)

• Submission preparation (submission manufacturing systems)

• “Infrastructure” (document management, data warehouse).

 

Too many biopharmas jump right into this list, like do-it-yourselfers at a big-box hardware store, and start watching vendor demos and freaking out at the price tags. The place to start instead is the business strategy: how are you going to run clinical trials, when, and why?

 

The first constellation of choices revolves around how you will resource your clinical development. Are you going to outsource most or all of the functions (a common approach for young companies)? Are you going to selectively outsource by function (keep data management in house but outsource site monitoring, or vice versa?), and what about project management? If you choose to operate functions internally, are you staffed appropriately? Are you willing to bear the cost and maintenance of these staff? Can you find the staff you need?

 

What do your partners use in the way of IT? Most biopharmas have all kinds of partners — companies you are licensing to, licensing from, using for key development services (radiographic readings, core labs, patient recruitment, clinical supply packaging), and so on. Do your partners offer technology systems you can leverage, or do you have a more efficient strategy? How do you pull together these multiple sources of data?

 

And where is your business at this moment, or next year, or in five years? Are you heading for submission? Quickly? Ever?

 

Each of these questions, each of these choices, dramatically alters the appropriateness, ROI and operational impact of any particular clinical IT application choice. Ultimately it comes down to a practical, essential business question: how do you control your clinical development process? And some executives would add, how do I have control and flexibility simultaneously? How do I have both rigor (compliance) and the creativity of entrepreneurial nimbleness? And of course, how do I do this on a limited budget?

 

Three Areas to Focus On

It is probably helpful for a biopharma executive to focus at first on three main areas of clinical research IT, what I will call control, product data, and safety.

 

Control, in this context, means knowing how your trial(s) — not your subjects or your product — are doing: are they on time, on budget, experiencing bottlenecks? Are they experiencing site performance issues, compliance issues, supply issues? Are your partners performing as expected? What changes need to be made? These questions are naturals for IT support. In the pharma world the application used is some kind of CTMS (clinical trials management system). Often, small young companies will avoid this arena, because the best known applications are big and expensive, and the small ones may not be robust or mature enough, or may have been developed for a customer’s situation too dissimilar from your own.

 

But in our experience, obtaining control over clinical trial conduct through information is as important, or may be more so, to a young company than the traditional focus on patient data handling. What is particularly challenging, besides the complexity of managing information from diverse partner data sources, is that the design for the kind of system your company needs must come from your clinical staff (not data managers or IT staff), and your clinical staff may be your least pharma-experienced.

 

I use the term product data handling to be as generic as possible in referring to your patient/subject data as it relates to the effects of your product (drug, biologic, device, combination thereof). This encompasses traditional CRF data, but these days increasingly includes relevant “non-clinical” data, PRO data (“patient-reported outcomes”), images (radiographic, pictorial, motion video), and more. This is often where a biopharma starts its clinical IT journey, particularly since this is where people with “data” in their title seem to reside, and where most executives are more willing to spend dollars on technology.

 

Handling product data for all biopharmas is increasingly focused on usability – both for the end user (the site) and the business (i.e., for accelerated decision-making). This means access, rapid startup, and ease of reporting. When seeking to control and analyze product data, it is harder and harder in 2007 to accept a paper-based, backend-heavy application strategy. Thus a traditional CDMS gets hard to justify, especially considering the time to start up and staff the necessary support, and to run the accompanying paper processing. But are newer approaches (EDC plus an analytical backend, versus a storage-oriented backend) too risky for a new company? These newer approaches may be actually more appropriate for a new company: a) they are easier to implement in a “blank slate” environment; and b) the risk, such as it exists, is likely to be more than offset by timely data and the facilitation of interim analyses. Regardless, a number of critical staffing, process and infrastructure decisions have to be made to implement an effective data handling approach. Again, a business’ priorities should guide these choices.

 

Conservatism finds a home in young biopharmas when considering the monitoring and reporting of patient safety. Fortunately, a handful of similar software applications are available to choose from in this area, and because the number of your staff who will be using them is likely to be small, the cost of these applications is quite reasonable. Here the choices are much easier: pick an application, buy it and use it. Complex resourcing algorithms are not necessary; ROI pales in comparison to the cost of a safety crisis.

 

Nonetheless it is surprising how often biopharma executives (who have the most to lose, personally and professionally, by a safety crisis) will balk at the cost and perceived complexity of owning a safety monitoring and reporting tool. This is particularly ironic for those companies who are counting on multiple indications for their compound or biologic, and must have the means to detect safety indicators across the development stream to ensure an acceptable safety profile. Once again, the business strategy and the supporting IT needs are intertwined.

 

Just a Taste

This overview of control, product data, and safety is just the beginning of the IT issues which require decisions in support of clinical development. There is much else to consider, including where and how you equip your infrastructure, on what platforms, under appropriate quality management systems and with compliant validation. The key is that biopharma executives should not abdicate their involvement in these decisions because the issues seem too technical or too narrow. Precisely because of their inextricable connection to the business decisions executives are responsible for, clinical IT choices must be made with the help of senior management.

 

Innumerable issues must be considered and resolved as you prioritize your IT needs, schedule IT adoption, select your vendors and shoulder the mighty work of implementing these tools with your staff (or new staff), under the correct governance model, with efficiency, flexibility and compliance. Trying to cut through the Gordian Knot will lead to your operations falling in pieces. Embrace the conundrum and you will learn much about the complexity of clinical development, and through such learning will come excellence in biopharma leadership.

Many of the readers of this magazine are clinical research monitors. For many of you, the technology revolution in clinical research has been a runaway train and you’ve been the tracks. Far too often, in the process of introducing electronic data capture (EDC), electronic patient-reported outcomes (ePRO), or a new clinical trial management system (CTMS), sponsors take the shortcut that runs across your backs.

 

How do sponsors do this? The shortcuts take many forms. It starts with not involving monitors and their management in the initial EDC and CTMS decisions, except perhaps in the most perfunctory manner. This is rooted in the tradition of running the vendor selection teams from the IT or data management organizations. Though we are well into the 21st century, these groups continue their historically narrow view of any tools which have something to do with “data”, and monitors are not in that view.

 

But the shortcuts further downstream are more insidious. Sponsors, and the CROs they depend on, find ways to skimp on end-user training, user mentoring, and internal monitoring expertise. The need for detailed planning for, and documentation of, changed work processes which intelligently and sensitively respond to monitoring realities is either not recognized as important, or silently understood to be too expensive. And the widespread outsourcing of monitors, while offering many advantages to sponsors, CROs and the many monitors who choose this work style, is too often a serious mis-match with technology use.

 

More mysteriously, why do sponsors take these shortcuts? The damaging irony is that monitors are primary users of CTMS and EDC systems – much more so than the IT and data management folks (or even project management folks, in the case of CTMS), who are selecting these tools, the vendors, and the processes which will be used to apply them. CTMS systems, for instance, are notoriously disappointing at most sponsors. They are powerful software applications, but too often the “data” they produce is known to be inaccurate, untimely, costly to collect, and at worst, misleading. The failure of these CTMS implementations is rooted in the causes cited above: not including monitors in the planning of the project, insufficient training, and the failure to grapple with the inherent challenges of expecting outsourced staff to effectively use an in-sourced tool. Unfortunately, it is also rooted in the application design itself; no one who has been a monitor would have ever come up with the interfaces, architectures and reporting mechanisms of common CTMS systems.

 

With a CTMS, bypassing monitors is all the more serious, because monitors are not only primary users, they are the primary source of the data in most CTMS designs (understandably). It is monitors who are expected to enter the core actions, events and facts which roll up to the beautiful charts for executive management at the end of the month. But the accuracy and timeliness of that data is undermined by shortcuts in training, user support, and staffing strategies.

 

EDC is similarly plagued. While sites may be the most important users of electronic data capture, monitors are not far behind, and indeed they are expected to be the primary support for the sites themselves. And yet how much do we invest in monitor training and support in our overall EDC implementation projects? In story after story from sponsor after sponsor, we hear sheepish admissions that the study timeline, the project budget, clinical operations resistance, or interdepartmental politics got in the way of the best-laid plans for monitor preparation. In every one of these stories, the result has been frustrated sites and study managers, angry monitors, and watered-down benefits from the costly technology innovation. In the still common situation where the enterprise is skeptical of innovation – and at the operational level, where the real work gets done, resistance runs high in the heat of trial execution – this lack of support for monitors only adds to the obstacles to speedy change.

 

What it Takes

What it takes to pave straight roads to research process change is straightforward. The cost and time for training and supporting monitors, help desks, and all those affected by the technology introduction must be planned for, committed to by upper management, and then executed through to the end by professional trainers (ideally in-house) – not shortcut, not skimped, not put off to next year, not sloughed off to so-called “super-users” (the ultimate cop-out, if not backed up by ubiquitous support). Note that it is not just training that is needed – something most companies assume to be a one-time effort – but ongoing support through staff identified as coaches, mentors or similar specialists.

 

It also means not short-circuiting clinical involvement when acquiring, budgeting for, or changing tactics in the software acquisition itself. The path may seem shorter to avoid involving those unused to technology selection, but the consequences of skipping along this shortcut will eventually be a “Bridge Out” sign in your path.

 

e-Ready or Just e-Willing?

The most common shortcut to “e-readiness” is for the sponsor to look at monitoring preparation and say, “hey, let’s outsource it along with everything else”. Let’s find CROs who promise that their staff already know EDC or use a CTMS. Job done; on to the next issue. There is no stopping the use of outsourced or contract monitors, or of full-service outsourcing of all trial functions (nor should there be), but sponsors must re-examine the value proposition and the cost-benefit assumptions about CRO usage they have made for many years, if and when new technologies are expected to be a linchpin in the clinical plan.

 

Most CROs are savvy enough to tell their customers they are ready and even eager to use electronic tools. They may even be sincere in describing themselves as “e-ready”. But in some cases this has become a new source of sponsor disappointment: the monitors who show up in a state of “e-readiness” may have used EDC once in 1997, or the CTMS they are used to may have been designed for a CRO’s business, instead of a sponsor’s, and thus has considerably different functionality and interfaces. Similarly, you may be assured by a support vendor that their Help Desk is EDC savvy, or clinical research savvy, and the assertion is taken at face value but fails in the execution. Too often, sponsors don’t discover these gaps in readiness until the investigator meeting, or some time after the first monitoring visit, when the knowledge chasm is too wide to be hidden and very dangerous to cross. So then another shortcut is taken: let’s use an online tool to train users in the online tools – what could be more modern? It’s not that e-learning doesn’t work, it is that when sponsors don’t think through what the information and support needs are, they will continue to rely on shortcuts wherever they can be found.

 

Sponsors will also turn to the software vendors themselves for this training and support; they all offer it and who knows their product better? They certainly know their product, but they don’t know you. This “generic” systems training produces generic results: your staff will know what button to push, but not much about how to use these tools for the benefit of your trial and your clinical development productivity.

 

Following what is perceived as a shorter path to the target, sponsors are responsible for the primary failing of outsourcing in the technology context: they do not consider what it implies, and what it will cost, to rely on contract employees, generic training, or the spare time of super-users for the success of EDC or the usefulness of CTMS data. In this way, sponsors carry a complex change management project to the brink of success, only to realize they took shortcuts to nowhere.

 

We can only hope that sponsors reform their approach to training and resourcing when introducing new technologies now and in the future. Instead of the light ahead being that of an oncoming train, you should insist that the next light you see will only be one of knowledge, respect, and intelligent strategy.

How to write about information technology in the global scene of clinical research? Well, the Internet took care of that, didn’t it? Someone in Ecuador can bid on an item on eBay, posted by someone from Slovakia, so what can be more global than that? End of topic. Hmm, perhaps not.

 

It appears that biopharma’s global use of information technology in clinical research is no more advanced than its global approach to clinical research in general. That might seem logical, but biopharma’s “globalness” is far less advanced than the technology it could be using (or may already be using) to support it.

 

In part, biopharma’s use of technology varies in parallel with the individual company’s approach to being global. At one extreme is the company with multiple, heavily staffed clinical development centers around the world. Such a company must rely heavily on computer technology (modern or not) simply to function. Companies in the middle of the spectrum – what could be called “hub-and-spoke”, where there is a primary company site and the other country locations are quite small – also need technology to tie together effectively the many threads of the clinical research organization. Even an essentially “mono-national” company still finds itself running trials in multiple countries, and thus faces certain technology challenges. The same can be said for CROs, which range from the tiniest, serving only a local patron sponsor, to the huge multinationals teetering on the edge of fragmentation in process consistency.

 

Running global trials, or running a global organization, raises common challenges to the effective use of technology, and also creates needs and opportunities for great benefit.

 

The least technical factor can have quite an impact: basic cultural variation. For instance, we are familiar with how investigator sites differ importantly among the US, Europe, Japan, India and the many new countries where trials are being conducted. Sites vary in staffing (are there study coordinators available to enter EDC or CTMS data?), technical equipment (are Internet, versus intranet, connections available? Is there a local printer? Does the site need a computer provided?), and connectivity (is wireless more ubiquitous in a country than wired connections? Is Internet access really universal?). So-called “provisioning rates” (how often hardware or a high-speed connection must be provided to a site) vary widely even across a single border in the same region of the world, or indeed within a country. And sites can differ widely in their expectation of whether a sponsor-provided computer or ePRO PDA is to be returned to the sponsor or retained by the site after study close. These become very sensitive situations which are not necessarily best addressed by a single worldwide practice.

 

But subtler cultural variations will also impact clinical technology success. Countries and cultures that historically feel they are treated unequally will be particularly sensitive to situations like date fields in a form which accept only one date format (one order of year, month and day), or form instructions whose English is too idiomatic or culture-specific (even a Canadian or Australian can react negatively to an “Americanism” in wording). This kind of reaction can undermine the benefits and performance of the tool being used: if a CTMS relies on a Web-based portal designed around “first-world”, high-web-use conventions and language, but is expected to be used in developing countries or places with low web-consumer practices, the portal may simply be under- or un-utilized, denying critical information to the study team. In such a situation, the ability to localize the portal design, or the eCRF in an EDC study, would be an important consideration.
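The date-format problem is also solvable in software. As a minimal sketch – with hypothetical field and function names, not any vendor’s actual API – an eCRF date field could accept several regional conventions and normalize them to one internal value:

```python
from datetime import datetime

# Hypothetical list of regional conventions a date field might accept.
# Order matters: the unambiguous ISO 8601 form is tried first.
ACCEPTED_FORMATS = [
    "%Y-%m-%d",   # ISO 8601 (2024-03-01)
    "%d/%m/%Y",   # day-first, common in much of Europe
    "%m/%d/%Y",   # month-first, common in the US
]

def parse_visit_date(text: str) -> datetime:
    """Try each accepted regional format; store one normalized value."""
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {text!r}")

# The stored value is always normalized to ISO 8601.
print(parse_visit_date("15/03/2024").date().isoformat())  # 2024-03-15
```

Note the lurking ambiguity: an entry like “01/03/2024” parses as day-first only because of the ordering above. That is exactly why localizing the field per site or per country, rather than guessing, is the safer design.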

 

Data entry practices also vary globally, not only at sites but among sponsor/CRO field monitors. It is less true than it once was, but still true, that in some countries Principal Investigators may be the only ones entering data into an EDC eCRF. This in turn dictates a very different pattern of entry frequency and quality, which requires a different set of sponsor workflow expectations, even within the same study. Another example is the variation in electronic health record (EHR) use around the world: some countries are more advanced than others, and more standardized, or intend to be. This may be one of the most important challenges to eClinical trials in the coming years.

 

Perhaps nothing is more important on a day-to-day basis in eClinical use globally than how end users are supported on basic technical and clinical questions and problems. How the sponsor designs and supports its users through a Help Desk service is critical to the success of EDC, ePRO and CTMS, or even geographically distributed adverse event tracking systems. We all have our favorite hate-the-Help-Desk anecdotes; the challenge for sponsors pursuing eClinical processes is to make sure they don’t generate new anecdotes!

 

Even application integration is more of a challenge when conducting trials globally. First, of course, is the challenge of combining multiple instances or data stores of the same application with acceptable speed and data consistency. More difficult is integrating data sources from multiple countries and from the CROs and other third-party sources (such as core labs) usually used in global trials. More difficult still is reconciling disparate user interface (UI) approaches: it is easier to adjust a CTMS UI to local cultural conventions than an eCRF, which is necessarily driven more by a concern for standards. As EDC and CTMS applications increasingly need to be integrated, these interfaces (and the resulting data formats) may clash.
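At its simplest, combining multiple regional instances means mapping each export into one canonical schema before anything rolls up. A toy sketch, with every field name and layout hypothetical rather than any vendor’s real export format:

```python
# Hypothetical exports from two regional instances of the same EDC system.
us_export = [{"subj": "US-001", "visit_dt": "03/15/2024", "site": "101"}]
eu_export = [{"subject_id": "EU-002", "visit_date": "2024-03-15", "centre": "205"}]

def normalize_us(rec: dict) -> dict:
    """Map the (hypothetical) US-instance layout to the canonical schema."""
    m, d, y = rec["visit_dt"].split("/")
    return {"subject_id": rec["subj"],
            "visit_date": f"{y}-{m}-{d}",   # normalize month-first dates to ISO 8601
            "site_id": rec["site"]}

def normalize_eu(rec: dict) -> dict:
    """Map the (hypothetical) EU-instance layout to the canonical schema."""
    return {"subject_id": rec["subject_id"],
            "visit_date": rec["visit_date"],
            "site_id": rec["centre"]}

# One combined store, one schema, regardless of source instance.
combined = [normalize_us(r) for r in us_export] + \
           [normalize_eu(r) for r in eu_export]
```

The point is not the trivial code but the discipline it represents: agreeing on the canonical schema up front is exactly the multicultural requirements work the column argues is so often skipped.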

 

Last but not least, running global trials means juggling diverse regulatory requirements around the world. The challenges are myriad: ensuring your adverse event system (AES) generates all the different required reporting forms on the different reporting schedules, managing differences in data collection conventions (such as race definitions), ensuring compatibility with eSubmissions requirements, fulfilling local patient privacy protection requirements, and more.

 

All of these situations could perhaps be addressed through a robust set of requirements developed by a multicultural working team. Certainly, but that solution would be at least as challenging as the procedural questions raised above. As poorly designed as most requirements development efforts are, they particularly disappoint when trying to balance cultural differences, priority issues, and tight timelines effectively and respectfully.

 

Despite what Walt Disney may have led you to believe, when running global clinical trials with modern eClinical technologies, it is indeed still a large, diverse and hyper-sensitive world, after all. To realize the prodigious benefits of subject availability, disease distribution, market experience, and flexible labor pools, biopharmas will need to learn how to exploit information technology despite the global challenges.