Management Consulting for Clinical Research

Implementing Technology, Part I: Critical Success Factors(1997)

(Below is the Column from the Winter issue, January 1997)

 


 

The theme of this column is that information technology will be a primary source of competitive advantage for sponsors and sites in the coming years. But before you go out and buy some software, you should have a good understanding of your primary business objectives; how you do your work today, and where the major sources of delay and cost lie; where technology can help, and what the options are; and how to evaluate the vendors who will become important players in your business.

 

Critical Success Factors: Know Your Business

 

Table I lists what most system professionals agree to be the critical success factors for implementing new information technologies. The first success factor is to understand how the proposed use, or felt need, for a particular technology application fits with the overall business strategy of your company. Some projects may be pursued because they are someone’s favorite idea, or because competitors are pursuing something similar. But if your business doesn’t have a critical need for a given application, another may be more important for you. For instance, if you work at an investigative site whose business depends on enrolling patients on a timely basis, it may be much more important for you to deploy a patient recruitment application than a clinical data repository. On the other hand, if your site specializes in a therapeutic area with a large and stable patient base, you may not have a compelling need to computerize your patient information, and your IT opportunity may lie elsewhere.

 

Table I.
Critical Success Factors for Implementing Technology

Put technology use in the context of your competitive advantage
Analyze your business processes — workflow and dataflow
Discover and document your process costs — in time and dollars
Leadership: top-down and bottom-up
Involve your users up-front, and train them well
Know your politics
Know your people
Build an adequate technology infrastructure
Use cost-effective prototyping
Commitment & follow-through

 

Having identified an important application area, you need to understand what it is you are computerizing — a step often referred to as process or workflow analysis. Process analysis at a relatively high level can help you understand which parts of your operations are the most resource-intensive, and therefore perhaps most likely to benefit from technology. Let’s say you work in a clinical monitoring group at a pharmaceutical company: you will likely find that considerable time is spent cleaning and clarifying data on CRFs.

Then you need to look more closely at how that data-cleaning process works — who hands off to whom, how many people are involved in each step, what the physical or virtual distances between them are, how long each step usually takes. Challenge each answer — ask why, repeatedly.

 

It may also be revealing to determine if the people involved in the process understand it themselves. Often we find that even in a small organization, people make assumptions about their work – what is a priority, what their “internal customer” needs – that turn out to be incorrect.

 

Process effectiveness is particularly crucial in clinical research as critical functions are increasingly outsourced, often to multiple players. A clinical development project may depend most of all on how well you coordinate and integrate the various vendors, service providers, and internal departments who participate.

 

Ideally, you can quantify process steps, and entire strategies, in terms of their dollar cost to your company. Very few pharmaceutical companies have made the commitment to understand the costs of how they operate. This can mean that you assume a particular technique or process is effective, perhaps because “it’s always been done that way,” without knowing what it really costs you. If you can benchmark your performance against those who do similar work, you will have some context in which to compare your findings.

 

Leadership

 

Implementing a new technology application can have powerful effects — good and bad — on an organization. The success of the project often comes down to leadership: from the top, bottom, and middle of the organization. One can debate endlessly whether top-down or bottom-up initiative is more powerful, or longer lasting. Of course, both together are ideal. When a project is led from the top, you have the advantages of financial commitment, management oversight, and perhaps strategic alignment. But a project led only from the top will fail because the actual application users will not see the need, or may actively resist the initiative.

 

Within large pharmaceutical companies, we often see a technology initiative begin with a middle-level manager. The advantages are that she may have some discretionary funds, that whatever she does will necessarily be small and therefore unobtrusive, and that she is close to the real users and the real need. Where such initiatives have failed, it is because this manager does not articulate to her boss the strategic benefit of what she has demonstrated, and the project does not get further funding. This is another reason why every technology project must keep in mind both the end user’s needs and the corporate strategic advantage.

 

Whether top-down or bottom-up, a successful technology implementation project needs an internal champion: someone who infects others with enthusiasm and perseveres over inertia. The most important role of the champion is to clone herself; sometimes a project fails when the champion leaves the scene and no one else has really bought into it.

 

Politics

 

Internal corporate politics and personalities can have a great influence on technology implementation. We have often seen technology projects delayed for years because of interdisciplinary rivalries among data management, clinical monitoring, and biostatistics, or because of competition or conflict between therapeutic areas at a large company. This is where the internal champion can have a great impact. Often, by bringing the source of greatest opposition into the project early on, the team can defuse the objections, and these people become your strongest advocates. Many times when implementing electronic data collection at an investigative site, for instance, we’re told of a study nurse who would “never” use a computer. Yet such nurses often end up the staunchest defenders of the concept, because they are on the front line of operations, where the technology can help the most.

 

Nonetheless, there is no doubt that implementing technology changes people’s roles in the organization. Electronic data collection, for instance, changes the role of the data manager, the CRA, the CRC, the investigator, and even the patient. Change of this nature can be threatening, or seem illogical, or put a person “out of their depth” based on their professional training. These changes need to be documented and openly discussed, and alternatives should be explored which respect the current and future contributions of each staff member. Involving end users in the project early on, and taking special care in their training, can do much to alleviate these concerns.

 

Examples

 

Why have I spent so much time on this “touchy-feely” topic in a technology column? Because time and again it has been painfully proven how important people and process are to getting the business benefit out of new software.

 

Here are just a few examples from the experiences of companies beginning to implement electronic data collection (EDC):

 

 

The data is collected electronically at the site, but field monitors don’t have the electronic tools to review the data, so they revert to paper.

 

EDC is implemented by a sophisticated site with an electronic patient chart, but the sponsor’s clinical department insists on 100% source data verification on paper.

 

A clinical team is ready to pilot new EDC software in a small trial, but the QA department insists on a full system validation audit first.

 

A company deadlocks on technology implementation because one therapeutic area leader enthusiastically proves EDC is beneficial while another equally energetically rejects it.

 

The sponsor’s project team has worked for months to develop an EDC application they all think is great, but nobody obtained input from an investigative site and the application fails under real conditions.

 

In Part II of this article, we’ll look at the remaining critical success factors — building a technology infrastructure and prototyping your solution — and suggest some cost-saving strategies for exploring and implementing technology in clinical research.

 
