Management Consulting for Clinical Research

Almost everywhere you turn today, at almost any conference you attend, you are likely to encounter someone talking about Artificial Intelligence (AI) and the associated proclamation that AI's implementation is going to be disruptive to almost every aspect of business.  There are countless articles being written, talks being given, and startups emerging, all leveraging AI as a game changer and sometimes overselling it as a panacea.

The biopharma industry, typically a slow adopter of new technology, has moved rather quickly to grasp the potential of AI in bringing new drugs to market, albeit not yet its application, and like other industries is enthusiastic about the numerous opportunities AI affords in completely automating what were once manual tasks.  There is of course the regular chorus of caution against the use of artificial intelligence, with many good and salient arguments about why we should be careful in how we adopt and apply AI.

One of the most recent arguments I heard on exercising caution presented the premise of “benefit” and how that premise is vastly different in humans as opposed to machines.  The argument, in short, was that humans do tasks in order to gain some benefit which can take any number of forms – monetary, charity, goodwill, benevolence, personal growth, etc.  A machine, on the other hand, will never possess a conscience that dictates to it a diverse reasoning for why it is doing a particular task and for what purpose.  Instead, the argument claims, a machine’s primary focus will be to advance its own directive without emotion or thought of others.  This may very well have some veracity to it, but like it or not, the argument will not stop the advance of AI in the biopharma industry or others.

So, the next question becomes: how do we deal with AI once we begin to advance its application? I believe this is the particular area that requires a more direct and specific focus.  The potential benefits of applying AI to our industry are irrefutable.  If you can get machines to predict outcomes more effectively, analyze data more holistically, and pinpoint potential roadblocks to success more accurately, then all of those outcomes in the end add significant benefit to bringing new treatments to market for people who desperately need them.

But, are your organizations prepared for the integration of AI into your existing processes and structures? The answer is, probably not.  Most of the attention to date has been applied to the application of AI to a problem and how it can solve that problem, but very little has been applied to how to integrate AI into the company structure, culture and process.  There is a wonderful TED talk on this very issue by Matt Beane (professor at University of California at Santa Barbara), where he points out the devastating impact that a one-dimensional implementation of AI can have on the next generation of human knowledge and capability.  I recommend you take 9 minutes out of your day and listen to the talk, because it is poignant and thought-provoking.   Matt’s conclusions are equally applicable to the rush to implement AI at biopharma companies without taking some time to plan ahead and adjust your organization to accept it.  How will your company adapt when AI is leveraged to identify and target specific geographic areas for subject and site recruitment?  How will your governance structures change if AI is successful in predicting and analyzing safety projections, and how will this impact your PV departments and DSMBs?  How will your organization’s procedures and support structure adapt to an AI solution that automates a large portion of monitoring or protocol development?   As this technology matures, there is little doubt that you will gain efficiencies in a number of areas, but at what cost to the human, cultural and emotional intelligence portions of your organizations?

There is an analogous example, and good case study to draw from, in biopharma’s strong shift to outsourcing in clinical research, which continues to accelerate inexorably.  The shift started when executives at biopharma companies, with the advice of consultants, decided that because of the fluidity of clinical trials they should look at reducing their fixed costs (in-house resources) in favor of variable costs (outsourced resources).   The financials all made a lot of sense, and so in a relatively short time the resource models were drastically overhauled, and the few people who were retained at the biopharma companies were shifted almost overnight from the role of contributor to the role of overseer, with little more than a few days of training to help them along.  The result of this quick shift in role and organizational expertise culminated in relationships with CROs and other vendors fraught with friction and assignments of blame.  In addition, it had the unintended consequence of a perceptible decay of operational knowledge and expertise at the biopharma companies themselves, as those skills were no longer put into practice or fostered.  Equally unfortunate is that the projected savings, in both costs and efficiencies, have not materialized in a meaningful way, as evidenced by some recent studies conducted by the Tufts Center for the Study of Drug Development (CSDD).

So, as we embark upon the exciting and inspirational path of AI, and all that it can offer to the clinical research world, it would behoove us all to direct just a portion of our focus away from the technology itself and towards our organizations seeking to benefit. Preparing our organizations, and their people, to accept a technology that promises to be far more disruptive than anything we have encountered before may be the difference between rapid, successful adoption and a path strewn with impediments and tribulation.

© Waife & Associates, Inc., 2019

Life is complicated, at home and work, day in and day out. The complication has only been multiplied by the rapid infusion of technology in our lives. Despite streamlining and simplifying many things, technology creates an environment that can overwhelm us with an abundance of stimuli and a surfeit of information. So much so, that unless we manage our world well, our capacity to absorb that information is overwhelmed, and our attention span wanders.

It happens to all of us over and over again.  When working on completing a particular task, my mind wanders to other tasks, and then, wait – is that an alert I heard on my phone?  I check the message immediately, because I am helplessly predisposed to do so, and lo and behold I have another item requiring my immediate attention. I am off down that rabbit hole and have lost my focus once again.

In our biopharma work, not only does this plague us all in daily desk work, it burdens us at the enterprise level as well. In addition to technology-driven distractions, we have what I call “initiative overload”. In conjunction with a persistent influx of text messages, we have a steady stream of overarching stimuli: multiple concurrent corporate or department initiatives, each with a “high priority.”  How (and where) can we focus when:

  • These initiatives are competing for the same resources and attention.
  • These initiatives are additive, in that they are extra work on top of our daily jobs.
  • These initiatives often have overlapping goals and targets but are not aligned, resulting in duplication of effort and/or conflicting methods and solutions.

In biopharma operations, it is common for well-intentioned people to be working very hard on any number of initiatives which end up taking longer than planned, uncovering additional complexities, and/or exceeding budget forecasts and expectations.  I attribute many of these stumbles to a lack of “allowed” focus, the word “allowed” being stressed.  It is not that we do not focus, but rather that we are not allowed to contribute meaningful focus to a particular effort because we have too many other conflicting initiatives in which we are participating. We are constantly shifting our focus from what we were doing to the new “immediate” need and, in the end, none of the initiatives gets the full attention required to be successful.

I have tried to manage this condition in my personal life. I put my phone away at family dinners and during other conversations, so I can devote my full attention to those interactions in order to get the most out of them.  I turn off “push” notifications of the latest distressing news, and I try to evaluate competing priorities before agreeing to the next tempting activity. A consciousness of the practical limits of time and attention, and the confidence to challenge the expected and the politically correct, can bring order to the array of priorities. Putting some initiatives aside to focus on one (or fewer) at a time will likely mean a better, faster output of work, and bring forth the best ideas and effort.

When confronted with multiple initiatives, participation in all, and meaningful contribution to all, are inversely proportional to one another.  The more things you are doing at one time, the less well you are doing any one of them. Management has the responsibility of improving this focus problem, through clear leadership, triage, resource allocation and budgeting. But all of us need to be vigilant to create an environment capable of productive focus.

Steve Shevel

Senior Associate

© Waife & Associates, Inc., 2019

According to Wikipedia, Fear, Uncertainty and Doubt (FUD) is “a strategy to influence perception by disseminating negative and dubious or false information and a manifestation of the appeal to fear.” The phrase dates to at least the early 20th century. How does this relate to the present common usage of disinformation related to software, hardware and technology in the biopharma industry? Well, it is actually closer than you would think, so let’s have a look at some data management related issues, which are not new but obviously also not solved either.

There are quite a number of scientific evaluations that examine the impact of “wrong” data on study results. Whether errors slip through because Source Data Verification (SDV) didn’t catch them or because edit checks were insufficient, these studies show that collected data are highly robust against errors. Nevertheless, we are all afraid of missing or erroneous data (Fear), we are uncertain about the number and kind of edit checks we need (Uncertainty), and we doubt that our monitoring and data management forces are doing enough to provide us with data quality that matches regulatory expectations (Doubt).

It seems that there isn’t much motivation to change the situation. CROs have sponsors pay for “intensive” data checking, sponsors want to be “on the safe side”, and technology providers try to sell innovations which claim to solve everything. Meanwhile, we are trying to use new clinical research information technologies without taking into account the impact this may have on the overall data flow and corresponding processes. What started with EDC (applying paper processes in an electronic world) continues with eCOA and mHealth approaches.

In a recent client example, we observed five different data sources and multiple technology providers, plus a Data Management / Statistics CRO that was in charge of transferring, consolidating, cleaning, transforming and eventually analyzing the data. Primary endpoint data came from an eCOA device via a symptom score, and of course the sponsor was concerned about the completeness, consistency and overall quality of the data. Although the eCOA data was transferred instantly to the database at the eCOA vendor, the transfer of eCOA data to the DM CRO did not happen until two months after study start, by which time 80% of the overall data had already been collected. Likewise, errors within the eCOA system, or device handling issues at sites and with patients, could only be discovered relatively late in the process, after data transfer.

This leads to the ongoing and critical questions all sponsors have about proper data flow, processes and the single point of truth for all data. This is becoming more complicated as the number of different data sources keeps increasing. Does the technology solution need to be one single big repository? This topic has been discussed for many years, with quite different answers coming from technology providers (or CROs) on the one hand and company-specific solutions among sponsors on the other. Defining and consistently executing proper processes for these scenarios is at least as important as the technology itself. This requires people with technical understanding, flexibility and process thinking. The challenge will be to identify, develop, recruit and retain the people who will do the work in this evolving environment.

Ultimately, the combination of people, processes and technology (in this order!) can make a difference. It will enable us to gain better efficiency and could be the route that guides us from FUD to CCT: Confidence (we do it right!), Certainty (we do it in the most appropriate way) and Trust (we do it with the best people).

 

Detlef Nehrdich

Senior Associate

© Waife & Associates, Inc., 2019

The “M” (Management) in CTMS is no longer applicable to the sponsor… you need a CTOS – a Clinical Trial Oversight System.
In the not-too-distant past, if you were running clinical trials, you would never have expected this question to be posed: “do we really need a Clinical Trials Management System (CTMS)?” You needed a system that would capture and report information related to the operational progress of your trials, without question.  But these days, we do hear that assumption challenged more and more often.
For many years the scope and functionality of widely available CTMS systems have spread and grown unwieldy. Software vendors looked to incorporate additional tools in their products such as investigator databases, monitoring visit reports, safety letter distribution, and even Trial Master File and document compliance functions.  These tools, while not directly relevant to the day-to-day operations of a trial, served the hub of centralized clinical operations. Sponsors appreciated the idea of having one place to go to get their information, and bombarded vendors with a steady stream of requests for additional functionality that supported their particular way of operating, as distinct from the way another sponsor might work or be organized.
Of course, not all requests could be fulfilled, and so the rollout of new functionality was predicated upon variables such as common sponsor desire, vendor cost containment and technological feasibility.   But in recent years one key factor changed the CTMS landscape entirely: large-scale outsourcing.  Unfortunately, traditional CTMS solutions do not adapt well to a heavily outsourced environment, especially when sponsors are using multiple CROs.
The primary barrier to using traditional CTMS systems in an outsourced model is the question of who is doing the actual work.  Most CROs serve multiple clients, and a large part of the success of their business model relies on limiting fixed costs (non-billable resources).  What this means is that they need their workforce to be fluid and adaptable (while still performing to client satisfaction). To support this, they need their own internal management systems where knowledge and process are transferable and fungible regardless of trial or sponsor.  This is the only way CROs can achieve economies of scale, but both parties to the work environment (sponsor and CRO) end up with an investment in their own optimized CTMS, with compelling reasons for their investment. Rarely, however, can these systems be shared, talk to each other, align their metadata, or even allow one party access to the other’s. At best, each party can review reports or data extracts from the other, almost never in real time, and rarely in their preferred format.
Here in 2018, with approximately 50% of every research dollar going to third-party providers, what are sponsors to do?  We are still responsible for the trial, its subjects and the data it produces. But in an outsourced environment, sponsors are really no longer involved in the day-to-day operational aspects of a trial.  This is what they are paying CROs to do. And so the M (Management) in CTMS is no longer applicable to the sponsor.  Likewise, all of the associated functionality in traditional CTMS systems that focused on the M of a trial is of limited value to sponsor users.   CROs still require this functionality, since they are doing the operations, but they have their own CTMS systems that are configured around their unique (and consistent) processes.
Sponsors need to consider replacing the M in CTMS with an O, and pursue the development and implementation of a Clinical Trial Oversight System.  In an outsourced environment, oversight is really what sponsors are supposed to be doing, and indeed this is the regulatory expectation.  The complexity of changing the mindset at a sponsor from “doer” to “overseer” is perhaps a topic for another article, but clearly the overseer needs data summarized and consolidated for a trial to ensure they know what is going on and, more importantly, to be able to take action based on this information.
This is why it pains us to see sponsors who outsource most of their trials still spend large sums of money for, and devote enormous effort to, implementing a traditional CTMS system.  Inevitably, in this situation, a significant portion of the functionality they spent months discussing and configuring is underutilized, or worse, forgotten by their user community.  There are a number of vendors in the space that offer systems or software that can meet the oversight needs and requirements of a sponsor – they are not (and shouldn’t be called) CTMS systems. They are often simpler and faster to configure (less functionality) and less expensive. Furthermore, these solutions can integrate with CRO and other third-party systems, which could eliminate the requests for CROs to work in sponsor systems while still allowing sponsors to see and analyze the relevant operational data.  It is important to emphasize that oversight does not consist solely of looking at reports and metrics, but includes many other important factors such as communication/collaboration streams and issue management.   This locus of functionality gets at the true heart of sponsors’ responsibilities and business needs.
Do you really need a CTMS system?  The answer, as usual, is “it depends”. If you do not outsource the majority of your trials to CRO providers and do much of the operational and management work in house, then yes, you could benefit from a traditional CTMS solution.  If you do outsource a large portion of your trials, then you would be better served by seeking out options that focus on supporting trial oversight.
How you implement this oversight and associated process change is as important, or perhaps more important, than the technical solution you choose.  So, before you commit millions of dollars to your next CTMS initiative, ask yourself two questions: 1) Do we manage trials or oversee them?  2) What is it that we need to support that work?
For further discussion or assistance on this topic, reach out to Waife & Associates, Inc. at www.waife.com, or email Steve directly at shevel@waife.com
 

Our Senior Associate Steve Shevel shares a personal experience with unfortunate relevance.

I recently had an unfortunate experience in which a close relative was admitted to a hospital in East Africa with a life-threatening event. There were many frustrating aspects of the situation; however, the one that drew the closest analogy to the biopharma industry was the entrenchment of culture and just how difficult it is to navigate.

The culture in which I was embedded in this African city was one in which there was very little or no sense of urgency. While this may be typical and a mere annoyance in the daily routine of life around the world, it became almost unbearable in a hospital intensive care setting, where loved ones’ lives are at risk. I just could not get over how slowly people moved and how simple administrative tasks proceeded at a glacial pace. Some of this had to do with technological limitations, but the more significant part was attributable to the institutional culture that not only accepted this pace but expected it.

This experience gave me pause as I drew the analogy to our work. In almost all of our engagements with a biopharma client, our recommendation to address a problem and/or initiative is inevitably tailored to fit within the culture of that company. There are a few times when the recommendation requires a shift in culture. While in Africa, I spoke with a woman from the United States who at one point was in charge of training new nurses; she admitted that one of her biggest obstacles was changing their mindset to work and move at a quicker pace.

It surprises me then that this crucial aspect of organizational behavior is either overlooked or marginalized by management when implementing a new initiative, project or change. Sure, there is the obligatory change management discussion that occurs with either HR or a third-party vendor taking the lead to ease people into the change. While this is important, it is individual in its focus. In other words, “This is how the change will impact you individually and how you can learn to deal with it”. Some might argue that this is veering off into the psychological aspect of corporate citizenry, but there is a mechanical aspect to this that is visible if one reads between the lines.

When a new initiative, technology, or project is undertaken in the biopharma world, the typical practice is to look at the problem as entirely mechanical and rely on middle management to sell it to the larger organization. This strategy routinely fails, as there are competing biases amongst the participating departments and, further, it neglects to address a crucial aspect of the project’s success – how to mold the implementation into the existing corporate culture. An alternative strategy, which is equally unsuccessful, is to try to overhaul the culture through a standardized, templated implementation of a specific initiative.

The adaptation to, and even acknowledgement of, the cultural specificity of a project in clinical development is not a widely available skillset, nor one that can be commoditized as an offering on a website or pamphlet. It takes years of experience to hone and recognize the nuances of biopharma culture in order to adapt a project to meet the needs of the organization without being too disruptive (a fashionable buzzword these days, but often counterproductive).

Understanding and adapting organizational culture is even more relevant today given the large number of mergers and acquisitions in the industry. In those situations, the new entity has to reconcile two or more highly divergent cultures, resulting in change initiatives that consume enormous time and are financially burdensome.

A company’s culture is not defined by a mission statement or inspirational poster; it is deeply embedded in the people that have worked there for many years. It is time for executives to begin considering the culture variable in the success of their company’s operation. If they do, they will recognize greater success in their operational projects and initiatives.

For further discussion or assistance on this topic, reach out to Waife & Associates, Inc. at www.waife.com, or email Steve directly at shevel@waife.com