
Sometimes, when the resistance is greatest, the treasure being guarded is the most valuable.

 

“The path of least resistance” is a seductive phrase. It is used so often by those made nervous by change, or by those who simply want to avoid conflict. It is seductive because it sounds wise, mature, even efficient. And sometimes I suppose it is. But as often as not, the path of least resistance is very winding, indirect by definition, filled with backtracks, detours and roundabouts. It may have the least resistance, but it is sometimes the slowest path, with no guarantee that you won’t come to a dead end.

 

Boulders Aplenty

The path we take to arrive at change – in a process or strategy or the tools we use – is a key determinant of success. The resistance that we usually seek to avoid can take innumerable forms, and so these seem like daunting boulders in our way. Let’s list just a few:

• Inertia, the immutable law that states that an employee entrenched tends to stay entrenched

• Lack of a perceived demand for change, or “what’s everyone so upset about?”

• Distractions, as in “I’m too busy figuring out how the latest merger affects my retirement plan”

• Alternative priorities (“I have my own agenda, thanks very much”)

• Compliance, as in “I’m not sure which one, but I’m sure the regs don’t allow that”

• “We have a process for that,” where “process” means bureaucracy

• “We have channels for that,” as in “go see the contracts office”

• Personalities (take your pick, from passive or hostile to know-it-all)

• And the ultimate boulder, the budget (“sorry, we didn’t budget for that this decade”).

 

When faced with this landscape, it is no wonder we typically look for less resistance; but if we try to miss every boulder, how will we ever find our way?

 

Examples Aplenty

Throughout clinical development there are many boulder-strewn pathways to greater efficiency. Let’s think about just two examples: reducing source data verification (SDV) and improving protocol feasibility. First let’s look at the paths of least resistance.

 

SDV is a sacred cow milked by data management, monitoring, statistics and QA. And time-and-materials-based CROs endorse the labor-intensive policy wholeheartedly. The path of least resistance will lead us past these pools of quicksand on a route that nods empathetically to each of the resistant constituencies. We will concede that once you start looking at one field on the eCRF, you might as well look at them all. We will let statistics throw more queries on the truck. We won’t try fighting QA’s fears by taking the time to read the regulations more carefully. And we will give in when the CROs warn how expensive it will be if they have to change their SOPs just for us. The path of least resistance will lead us to a minor reduction in SDV, with virtually no efficiency benefit, but lots of arguments avoided.

 

With improving protocol feasibility as our destination, the path of least resistance will take a very wide turn around the medical affairs staff who just joined industry from academia yesterday, abdicating our responsibilities as clinical operations professionals. We will go miles out of our way to accommodate key opinion leaders essential to research paper authorship, but out of touch with patient populations and their health behaviors. And we will take the superhighway to the advertising agencies who will “rescue” our study after infeasible enrollment targets are missed.

 

There’s Gold in Them Thar Resistances

We cannot base our work on the paths of least resistance. Of course we want to work without acrimony, arguments, and escalations. But our industry is suffering from this ingrained fear. Complacency and inertia have led to a degree of ineffectualness that triggers executive dismissal of the clinical development function. So far, that means simply transposing the inefficiency from internal resources to external ones. A better analysis would recognize that sometimes, when the resistance is greatest, the treasure being guarded is the most valuable.

 

Facing the resistance should not be corporate suicide; instead it can take the lid off hidden misunderstandings and past grudges. For instance, challenging traditional SDV policies head-on, if done correctly, generates a healthy debate that puts the issue in the context of modern realities instead of 1990s assumptions. Similarly, maintaining weak protocol evaluation practices to protect interpersonal scientific relationships will only cost us time and money we do not have. A direct comparison of expert opinion with properly collected data on patient populations, distribution and attitudes can produce a lasting improvement in clinical development performance.

 

If we can get past entrenched self-interests defending the current SDV policy, by confronting the boulders head on, we will discover remarkable reductions in necessary data monitoring effort. If we face the rock cliff of infeasible protocol designs at the first gate, and dare to say that climbing it is more important than ego, then we may find a much shorter path to full enrollment on the other side.

 

The path of most resistance may be arduous at first, but as with all pathfinding, the more we travel it, the smoother and faster the path will become.

 

The cost of dysfunctional relationships is very high in time, money, reduced motivation, and reduced productivity.

 

This column marks the beginning of a new series of essays for the ACRP Monitor entitled “Operating Assumptions.” Its focus will be on the process of clinical research – the good and the less good, the way things are and the way they should be. We will range over the broad spectrum of clinical research conduct and look at the operational challenges – tactical and strategic – that we will be facing in the coming decade.

 

As the title indicates, we will be targeting “assumptions,” the sacred cows of operational conduct. One such bovine is the common phrase, “it’s not personal, it’s just business,” a phrase used most often to mean exactly the opposite (not unlike another classic, “with all due respect,” when no respect is being paid at all). Clinical operations success is all about the personal, but our skills in this arena, and our willingness to engage in interpersonal challenges, are limited.

 

It is Personal

Coping with the personal side of business is a daily task and the source of universal frustration. What is key to recognize is that dysfunctional people create dysfunctional processes. An analysis of poorly performing clinical research groups usually reveals an illogical or flawed process that has its roots in an accommodation to a dysfunctional key player, or flawed leadership, or fatal compromises designed to avoid political conflict.

When we say “it’s not personal,” we are trying to push away this most difficult part of our work, and we are trying to pretend, or hope, that somehow de-personalizing the process makes it “just business,” by which we mean that somehow personalities will be replaced by virtual machines. This wishful thinking is rarely satisfied.

Actually, it’s all personal. Let’s think about a not-so-exaggerated workday for a Clinical Project Manager. When she opens up her email in the morning she finds a hundred new messages. Her study’s CRO project manager has changed again, and she learns she has to provide an orientation to the new guy as soon as possible. Her boss returns an email she wrote yesterday, saying she is complaining too much and should fix her problems with her CRO by speaking directly with them. QA is writing to point out that one of her studies is not following SOPs. Contracting wants her to switch CROs on her program for the next trial because they have negotiated a better deal. Program management wants her to revise next quarter’s spend estimates.

She checks her calendar and sees that the department admin has her triple-booked all day. She looks up from her desk and some management consultant is waiting at her door for an interview she is already late for, but does not know why it is scheduled.

She goes to her first meeting of the day, a study team meeting. The clinical data manager wants to revisit the edit checks agreed to last week. The statistician is objecting to one of the eCRF designs, even though FPI is only four weeks away. The clinical supplies representative, who hadn’t been coming to these meetings until today, now announces they can’t possibly have study drug in time. The in-house CRA, who works for her, reports that she’s been too busy and the TMF isn’t up to date. Her BlackBerry is filling up with new emails and it’s only 9:30.

Not such a farfetched scenario, and most will say that this is just the reality of clinical research operations. But what may be behind this acid-churning hour of work? All kinds of pathologies.

Pathology is Personal

There are several common personal pathologies in clinical research organizations:

Passive-aggressive behavior. Does any of this definition sound familiar? “Passive-aggressive behavior is passive, sometimes obstructionist resistance to following through with expectations … marked by a pervasive pattern of negative attitudes and passive, usually disavowed resistance … It can manifest itself as learned helplessness, procrastination, stubbornness, resentment, sullenness, or deliberate/repeated failure to accomplish requested tasks for which one is (often explicitly) responsible…”

Avoidance of confrontation. For various reasons (discomfort, fear of reprimand or reprisal, corporate cultural taboo), we dance around what we want to say and never quite get the topic on the table, thus ensuring the dysfunction will occur again.

Pressure, fear, intimidation. Intertwined with the pathologies above is the pressure to perform with fewer resources, meet the trial timelines or the regulatory deliverables, and keep one eye always on regulatory compliance. The fear means we are afraid to sanction dysfunctional behavior and afraid to fire people who are damaging to the group environment. Many research managers say “it’s impossible to fire anyone around here.” Ironically, the fear factor also means we are afraid to be more inquisitive during the hiring process, which could help us avoid the dysfunction before it starts.

Conflicting agendas. If left without vocal and consistent executive leadership, various groups will pursue individual vertical goals or philosophies that clash with those of other groups. For instance, data management may assume it is the sole entity responsible for data cleaning, while statistics, CQA and drug safety may also feel equally, or more, responsible.

 

Obscurity and opacity. Each discipline in clinical research preserves its value and job security by speaking in its own professional language, jargon, and unnecessary detail. This enables internal and external staff to shield problems, underperformance or conflicting agendas from those who might object. It is particularly frustrating when all parties are sincerely trying to cooperate, but have lost the ability to communicate with one another.

Incompetence. I can feel readers cringe as they read that word – it is so politically incorrect. But what do we all talk about half the time? Incompetent leadership, incompetent service providers, incompetent staff, incompetent sister departments. Sometimes it’s true – they really are incompetent. And if you can’t fire them (see above) … well, talk about dysfunction!

These pathologies are ubiquitous in research organizations and yet the typical clinical operations manager or employee is not equipped with the training to recognize and improve on interpersonal dysfunction, nor do we commonly work in cultures that protect and nourish interpersonal effectiveness.

Essentially it’s willy-nilly: the people we work with like us or not, have another agenda or not, respect our authority or not. And thus our interpersonal success seems like the luck of the draw – a good boss or a bad boss, a good employee or a bad one. But leaving it to chance is very risky. The cost of dysfunctional relationships is very high in time, money, reduced motivation, and reduced productivity.

Have a Better Day

How do we fix this clinical project manager’s day? Any improvement will be a step forward, so it can be taken in pieces – how expectations are set with employees and CROs, how meetings are run (not for getting through the agenda, but for effective decision-making), writing consistency and cooperation into service provider contracts, and more. These steps may seem small, but they can have great impact.

The larger challenge is to improve the company culture. Many clinical research organizations, perhaps because they are science-based but required to be business-like, are passive-aggressive cultures. Other clinical research units, such as some CROs, are run in climates of intimidation. How do we change a culture? Company culture is a rollup of personal attributes, rewarded explicitly or implicitly over time.

Leadership can improve (if not “change”) culture. If you think you have these problems, share this column with your leadership, and start a dialogue. Be frank, which means you have to be prepared to hear frank things in return. It may not achieve what you are hoping for, but frank dialogue will cut through the Jell-O of animosity, incompetence and obstinacy.

 

What is wrong with the catchphrase we started with is its false dichotomy: clinical research is a business that is manifestly personal. We need to celebrate, learn about, and become expert in both. Our companies need to value, and professionalize, “the business of people” for clinical research to be a cost-effective success.

Clinical research professionals considering upcoming information technology investments are at a crossroads in 2010: complexity versus simplicity – which should we choose?

The decision is not at all obvious. The implications of the decision are profound: the resulting strategy will direct the spending of millions of dollars and thousands of hours, and decide the fate of dozens of vendors.

 

Simple Complexity

We all understand that information technology is an essential underpinning of every clinical development function. As a result, software applications have been installed for almost every function, with little ultimate regard for each other. Each individual decision has generally been justified, and paid for, by the function for which the software was written, and not primarily for how similar it is to other applications, or how well it may talk with others.

 

Many of us have argued for years that this “silo” approach to building and buying individual software packages, which are increasingly sophisticated servants of their intended users, is reinforcing bad organizational behavior. It enables individual micro-professions to make their data so specialized as to be unusable by others. It means multiple instances of re-entering identical data. It makes ensuring the currency of data very difficult, to say nothing of the challenge of finding the necessary linkages between silos. The answer would be, as much as possible, a single system: an interweaving of applications so transparent, based on common means of entry, storage and visualization, that the entire enterprise breathes data as a single organism.

 

So on the one hand, we have a path for functionally focused, deeply featured independent software, integration and cross-talk be damned. And on the other hand, we have a path based on the belief that integration is everything, realism be damned. You can see that we cannot have both. In order to integrate effectively – to make the mammoth effort and frequent compromises worthwhile – we cannot maintain the rich depth of vertical profession-specific applications. It is too much to tie together. And vice versa. We cannot keep digging deeper for richer and richer vertical functionality and expect to tie these unique entities together usefully.

 

Complex Simplicity

We have a magnificent example of complex simplicity in every facet of daily life today – the Internet, or more precisely, the Web. (Indeed, what better metaphor for complex simplicity than a spider’s work?) Everyone with access to some kind of Web-enabled device can easily learn how to use this vast network, which enables the simplest and most sophisticated exploration, transactions, learning, discovery, entertainment and influence. It seems so simple on its face, but of course it is enormously complex, as only something built by tens of thousands of people, and literally billions of dollars, can be. And besides people and money, the Web has needed repeated flashes of brilliance to invent it, sustain it and expand it. Think of the search algorithms, fiber optics, high-capacity storage, wireless access, global ubiquity, and on and on. This is highly complex simplicity.

 

And that’s the rub. Clinical research is not a universe of a billion users justifying billions of dollars for IT innovation. How can biopharma R&D mimic what the Web did with information? We in R&D want a confluence of basic science, applied biochemistry and physiology, genomics, real patient experience, epidemiology, macro- and micro-marketing, economic forecasting, regulatory and provider/payor policymaking, and more. All of this information, together, is what actually needs to be integrated. Integration in clinical research is not about having only one contact address per investigator, or even about being able to track a physiologic parameter from mice to Phase III. We want to know whether, if we invest millions of dollars in therapy X, it will help enough people at the right price to more than cover the cost of developing it. Again, this is highly complex simplicity.

 

One has to wonder if our industry would be ready for simple access to such a complex picture. Do we understand how to run an enterprise that knows so much? What would our flashes of brilliance be to make this flower? Outsourcing and deal-making are not the stuff of genius. What would be the R&D equivalent of search algorithms or wireless access?

 

And On the Other, Other Hand

Let us return to the R&D IT choice in front of us. Should we invest in the complex vision of a single integrated universe of R&D software, be it powered by SAS or Oracle or Microsoft? How far can they take us today? How long will it take to get to the point where their integrated system, having been irrevocably chosen back in 2010, will justify the commitment?

 

On the other hand, continuing on the path of collecting sophisticated (and expensive) vertical applications is in fact a dead end. But there is one adjustment that might make the choice of paths more distinct and dramatic. If we want an alternative to the integration nirvana that may never be reached, a better path would be to design and acquire simple vertical applications, optimized for the barest minimum of business requirements, with the leanest of function sets and the easiest of user interfaces. We would carefully but doggedly dismantle the thick infrastructure of the vertical applications we have today, by a merciless zero-sum examination of what we really need, knowing that the simpler the tool, the more likely it will be used and the more relevant the output will be. Such a collection of applications would be cheaper to acquire and easier to learn. We would worry much less about the commitment to these applications because we would intentionally plan on their obsolescence, and ruthlessly strip their purposes to the bone. If only we could find vendors willing to build and sell such things.

 

Remember when watching TV was really simple? There was almost nothing to watch and very little choice. Perhaps your little sister had to ground the antenna with her hands, but hey, that’s what little sisters were for. The picture resolution (a word we didn’t even know) was terrible. And the world of media was divided like the continents: TV, radio, movies, records, books, newspapers, libraries and stores were entirely separate and knew their place. Who wants to go back there again, when today all of those media and more fit in the palm of your hand? So the question is, can anyone really (and I mean really) get the world of R&D decision-making into the palm of our hands, or should we count on just turning on a TV-simple CTMS to hear the latest trial news?

 

The choice is a conundrum: simple complexity or complex simplicity? Which can we afford? Which do we need? Which can we make happen?

Delay in no way guarantees thoughtfulness, as some wishful thinking would have it….In the worst cases (which are not uncommon), delay leads to cynicism and passivity.

 

It has been said that one sure way not to win the lottery is not to play it. Without condoning indiscriminate gambling, and notwithstanding the parallels between buying software and gambling, we should be increasingly concerned that the caution with which biopharma approaches information technology may be turning into cynicism, and may be keeping our clinical research organizations from practical and much needed benefits.

 

Delay is Not Benign

Delay in IT selection and implementation is not benign. It is not simply a matter of “luxury” deferred. And delay in no way guarantees thoughtfulness, as some wishful thinking would have it. The dilemma of IT delay is that operational needs and software to fix them are increasingly fleeting in our rapidly changing business and technical environments. A need expressed one year may be gone (or greatly diminished) in 18 months. I have seen executives who count on this phenomenon, thereby proudly “saving” money. But these are false savings: if the need was truly there to begin with, the delay only represents a period of sub-efficiency and missed performance.

 

Similarly, the rapid pace of technical change is the serpent in the garden of software use. Delay in software acquisition and implementation means that the users’ needs are ultimately addressed by technology which may no longer be worth the investment (compared to newer, often cheaper, alternatives), and at the same time, the newly enabled users will be disappointed they are not using a platform or interface or function set which they find commonly available in their non-business lives.

 

Delays in IT projects have many causes, driven by sponsors as well as vendors:

• Cultural caution (“I want to analyze this some more”)

• Financial restraint (“We’ll wait until next year”)

• Mistrust (“I remember how badly the upgrade went last time, so let’s hold off awhile”)

• The grass is greener over there (“Gee look at that new vendor, or that new strategy”)

• Acquisitions (“Rumor is, we’ll get acquired in a couple years so let’s sit tight”)

• Personnel turnover (“The new VP doesn’t like that vendor” or “The vendor’s management changed and we have to wait and see”)

• Team management (“We have to have the input of all the stakeholders, and darn it’s hard to get them all together at the same time….”)

• Poor software development practices (“The new release is really buggy”)

• Excessive legalism (the downside of professionalizing, and isolating, contract administration)

• And so on.

 

But all these excellent reasons for inaction have to be balanced against the cost of delay. Besides not meeting the operational needs that prompted the project in the first place, delay has other insidious side-effects:

• Delay softens the demand (which may mean it wasn’t important to begin with, but may also mean your organization is losing willpower)

• Delay softens the energy behind the project, the project champions move on to other things, and the staff perceive a lack of drive for improvement from management

• Focus and talent drift away; like fielders on a baseball team behind a slow-working pitcher, it’s hard to keep your edge. The mind, and the project, drift.

• In the worst cases (which are not uncommon), delay leads to cynicism and passivity. Staff who put in considerable effort in helping the project leader analyze needs and specify requirements see that little or nothing comes from their efforts, creating a dangerous negative feedback loop that will undermine future efforts and poison the corporate culture.

 

Less Time, More Benefit

What are the antidotes to delay? The primary solution is one of cultural change: a corporate environment where impact is favored over cost-savings, where performance instead of caution still matters, where contributing to staff cynicism is a mortal sin.

 

Based on such a foundation, the research organization has to approach its IT projects with a sense of urgency, flexibility, and trust. Since trust is bred by success, quick successes should be sought out (the essential opposite of delay). All of those involved in identifying needs and solutions must understand that time is of the essence, and that some benefit is better than none. And as always, senior management support for this strategy must be vocal and prominent.

 

Speedier IT implementations can be obtained through:

• Focus (prioritize and re-prioritize, not just which projects to do, but what to do in each project; forgo nice-to-have features for those that drive key business objectives)

• Understand what is important (essential to the focus referred to above; every employee should always be clear as to the 3-4 things the research organization must achieve in the near term)

• Narrow team membership (nothing causes delay in the soup like too many cooks; up-end your corporate fetish for broadly populated project teams and include only the minimum number of contributors whom you can count on to participate knowledgeably)

• Set aggressive deadlines (pick a date which seems impossible, given your organization’s history, and then move it up a month or two)

• Ensure sustained senior management visibility and oversight (they have to stay involved and stay on message)

• Implement sustained, closely managed, tightly scheduled vendor management (do not confuse arm’s-length oversight with trust – you cannot set and re-set expectations with your vendors often enough)

• Trumpet your successes, and the features and benefits of the new software, through actual stories from your experience (the proof is in the pudding and everyone needs to hear a lot about it)

• Keep moving! (side-step obstacles, shorten your objectives list when necessary, and keep that sense of urgency).

 

Looking back at this list, I realize the same advice could be given to clinical trial study teams. Hmm. Does reducing delay have resonance for the essence of our work?

 

Speedier implementations will require important changes in the skills and perspective of research staff, in sponsor-vendor trust, and in project execution. An organization that tries to move in these directions may win the lottery after all.

 

©Waife & Associates, Inc., 2010

We all now know that the cost [of software acquisition] in money, time, headcount and disruption will be high….the benefit therefore must be high as well, or the cost reduced.

 

“Begin with the end in mind” is one of those classic business phrases which is no less valuable for the number of times it is ignored. Software developers and clinical research sponsors alike are guilty of sometimes fatal forgetfulness of this key concept when planning the development, implementation and use of software applications.

 

Clinical research sponsors generally start a software acquisition process not with the end in their mind, but with some stimulus in their back: a department is complaining that everybody else gets new tools except them; our competitors all use this tool so we should; I met a salesperson on the airplane; the vendor just announced an upgrade and they won’t support our version anymore; we just hired a new vice president and she prefers vendor “x” over vendor “y”. While some or all of these situations may justify a re-evaluation of our need for software, they do not in themselves sufficiently define the “why” and “what”.

 

Starting Isn’t the Hard Part

Sponsors may also start from some business trigger which gives them the illusion that the end is in mind: we need to save headcount, so let’s use EDC (electronic data capture); or we’re frustrated with having multiple overlapping and out-of-date investigator databases, so let’s buy a CTMS (clinical trial management system); or our new translational medicine VP says we’re going to have a flood of pharmacogenomics data coming in, so let’s get one of those “data warehouses.”

 

What’s missing from these situations is the company’s consideration of the strategic benefit, and of how the software’s users will actually have to use it and what they will get from it. What is the tie-in between the initial impetus – the needle in the back or the business trigger – and the actual output the new software will provide? This disconnect is particularly critical in enterprise software projects because we all now know that the cost in money, time, headcount and disruption will be high. The benefit therefore must be high as well, or the cost reduced, to be in line with the diminished (and realistic) results.

 

Analyzing a potential project’s end-user benefit compared to the initial impetus need not be fatally time-consuming, which is the usual reaction to the suggestion. But it can save a large amount of wasted time and money. We should recognize that it is very easy to fall into the disconnect trap. For instance, let’s consider the situation where clinical operations feels that frustration over the multiple investigator databases. The complaint is forwarded to the IT department (or worse, a naïf goes to a booth at a trade show), and the answer comes back: there is no “investigator database fixer” product out there, but there are these CTMS packages and boy, they do everything. Before you know it, you are installing a multi-million-dollar application over multiple years, you’ve doubled the amount of training everyone has to go through, and you have all this rich functionality that no one can or wants to use because it’s not relevant – neither to the original trigger nor to actual user circumstances.

 

I would suggest that even a good understanding of how the end user works, and what he or she needs, is not sufficient in today’s business environment. We have fewer and fewer in-house staff, we are narrowing our “distinctive competencies,” we have uncertain economic and reimbursement conditions, and we have unrelenting competitive pressures. All of this militates against expensive multi-year infrastructure projects unless we do more to predict and understand the future end-user business need. What are the future identity, purpose and constitution of our business, and therefore, what tools do we need to get there?

 

Even for projects where the pain and the solution appear clearer and more pragmatic, we are usually missing a robust and detailed visualization of how a tool will be used, and without this, we will mis-configure and mis-spend our time and money. For instance, how does a shift to outsourcing change who the users are for a CTMS, document management system, EDC, and similar programs? How does a thoroughly “e”-oriented sponsor exploit tools like ePRO (electronic patient-reported outcomes), which are still fundamentally device-based? Or alternatively, how useful are e-tools if the “back end” of the workflow stays “paper-minded” in its policies and procedures, reflected in unchanged workflows, double-checks, and review practices?

 

And Vendors Too

The developers of software used in clinical research are equally guilty of forgetting the context of how customers use their tools. Vendors have a great opportunity to add significant value to their customers by helping sponsors see the possibilities that their tools open up, and by knowing the clinical research business as deeply and broadly as possible. This knowledge should translate into more focused and anticipatory designs, creating more powerful and efficient tools.

 

Typical software development, even the industry-specific kind we encounter, falls into the trap of chasing customer-driven enhancement requests that are often shortsighted for all the reasons cited above (responding to the “needle in the back”). The result is needlessly complex software, with features even the requesting sponsor may forget they wanted! More damaging than needless complexity is that the effort to chase enhancements takes money away from the vendor’s literal “end” – the output of the tools they are developing, which is all a tool is really good for.

 

This irony plagues each aspect of the research software universe. Vendors may see the whole gamut of functionality possible, but as professional engineers, they see it, and build it, linearly (they begin at the beginning and end with the end). As a consequence, they inevitably run out of time and money before they reach the output function (reporting). How many times do we hear vendors do their demos this way: they start with the very first point of data entry, move through to the point everyone is waiting for (getting something back for all that entry), and then they say, “well, there was no point in re-inventing a report writer, so use something standard, off-the-shelf.” It is the “data out” that matters in the actual business context, but to a software engineer it looks like a data processing problem, not a business use problem. If this were true, and off-the-shelf reporting were adequate, then so too would be off-the-shelf entry – so actually, let’s forget the whole thing. And yet there really is utility in clinical research-specific software products, if they are built with the end in mind.

 

Today’s software vendors need a knowledge base and a discipline not commonly found. The need for vendor domain knowledge is greater than ever, as is the need for an understanding of, and vision for, where their customers are going. Certainly sponsors have the bulk of the responsibility for teaching this. For the vendors, the discipline lies in rejecting enhancements for enhancements’ sake and leading their customers toward being able to handle the future.

 

“Begin with the end in mind” is certainly the start of a solution. “Begin with an understanding of the end” is probably more profound. Identifying possible “ends” is one thing; understanding their meaning to the user and the enterprise requires more thought, breadth and management than most sponsors or vendors are used to supplying.