Research Efficiency in Hard Times: It’s Time To Adapt

Life Science Leader

By Michael Rosenberg, M.D., MPH

May 2009 - Nobody likes hard times, but they do have one redeeming quality: They focus attention on areas neglected in good times. The need for greater efficiency is a prime example, especially for small drug development companies. Even in the best of times, they must cope with the formidable combination of high costs, long timelines, and low success rates. And when hard times come, they feel it first and most acutely. These companies should be the first to ask: Is there a better way?

In a word, yes. Hard times make a strong case for adopting recent advances in clinical development that can greatly improve efficiency, reduce costs, shorten timelines, and maximize the impact of investment budgets. In addition, these techniques allow earlier acquisition of knowledge and the ability to optimize strategy in ways ranging from terminating futile programs earlier to moving directly to the next development phase.

The Current Approach: Guess, Wait, and Hope

Current methods leave study planners no alternative but to guess about important unknowns. These guesses are always wrong; the only question is “by how much?” When the answer is found at the end of a study, it is too late to make changes. This explains why failures in clinical research are so numerous and spectacular. It is common for years of work and millions of dollars to go to waste.

Learn and Adjust

The drug and biotech industries can no longer afford the wait-and-see approach to clinical research. Luckily, better research methods render the old approach obsolete. New tools allow clinical researchers to adjust both strategy and operations based on data collected during a study. Sometimes, these adjustments can transform a study from a wasted investment to the basis for bringing a new drug to market. In other cases, the best adjustment may be to terminate the study early and shift resources elsewhere. Whether or not the data pronounces a favorable verdict on the test drug, these tools allow timely action to make the best of the situation, and they represent a new approach to clinical development: adaptive research.

There are two types of adaptive tools: design and operational. Adaptive design tools address the big-picture elements: study size, length, endpoints measured, etc. Often overlooked but just as important, adaptive operational techniques address “little” things that can cause big problems, such as missed timelines, exceeded budgets, and failure to produce a conclusive result. Used in concert, design and operational adaptations can reduce study timelines by 10% to 25%, while improving profitability and accuracy of information.

Adaptive Design Tools

Perhaps the most important element of study design is sample size, or how big a study will be. Because the sample size required to show a difference between a test drug and the comparator is unknown, study planners often liberally increase the number of subjects to prevent expensive study failures. This over-building drives up study costs and is perhaps the most common source of waste in clinical research. As a rule of thumb, studies enroll about 20% more patients than estimated necessary.

The smarter alternative is an adaptive method known as sample size re-estimation (SSRE). This still involves guessing at the original parameters, but builds into the project a midcourse look at the actual data collected, including the treatment difference. Studies can use this process to arrive at a more precise determination of sample size without over- or underbuilding.
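The logic behind SSRE can be sketched in a few lines. The following is a minimal illustration, not a production design: it uses the standard normal-approximation sample-size formula for a two-arm comparison of means, and the planning numbers (a 5-point assumed treatment difference, standard deviation of 15) are hypothetical. Real SSRE designs also involve statistical safeguards, such as alpha-spending rules, that are omitted here.

```python
import math
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Per-arm sample size for a two-arm comparison of means
    (normal approximation): n = 2 * ((z_a/2 + z_b) * sd / delta)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = z.inv_cdf(power)            # desired statistical power
    return math.ceil(2 * ((z_alpha + z_beta) * sd / delta) ** 2)

# Planning guess (hypothetical): 5-point treatment difference, SD of 15
planned = n_per_arm(delta=5, sd=15)      # 142 patients per arm

# Interim look: observed difference is only 4 points -> re-estimate
reestimated = n_per_arm(delta=4, sd=15)  # 221 patients per arm
```

Because required sample size scales with the inverse square of the treatment difference, even a modest interim correction (5 points down to 4) changes the answer substantially — which is exactly why waiting until the end of the study to discover the true difference is so costly.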

Estimating the Cost of Not Using Adaptive Methods

In hard times, there is a tendency to avoid change for fear of straining an already tight budget. While it can be tempting to put off “risky” innovations until the worst is over, maintaining an inefficient status quo can exact even higher costs. Consider the cost of not doing SSRE. Suppose a study costs $5 million, including the cost of padding sample size. If the treatment difference turns out to be less than estimated, as is often the case, the project would likely fail. Conversely, if the difference is equal to or higher than estimated, the study would have wasted substantial funds by enrolling and treating at least 20% more patients than necessary.

But how great would these costs be? If the study fails due to underestimation of sample size, the cost of not doing SSRE would be the full $5 million. If the sample size estimate is accurate, the 20% pad means $1 million spent enrolling and treating unnecessary patients. If the sample size is overestimated, the waste could reach 40% extra patients, or $2 million.
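The article's figures pencil out as follows. Note the simplifying assumption, matching the article's round numbers, that extra patients cost a flat proportional share of the $5 million budget:

```python
# Back-of-envelope cost of NOT doing sample size re-estimation (SSRE),
# using the article's $5M example study and flat-fraction cost model.
study_cost = 5_000_000

# Scenario 1: sample size underestimated -> the study fails outright,
# and the entire budget is lost.
cost_if_fail = study_cost

# Scenario 2: estimate accurate -> the 20% safety pad is pure waste.
cost_if_accurate = int(0.20 * study_cost)       # $1,000,000

# Scenario 3: sample size overestimated -> roughly 40% extra patients.
cost_if_overestimated = int(0.40 * study_cost)  # $2,000,000
```

Under this simple model, every branch of the decision tree costs at least $1 million — the point being that "doing nothing" is never free.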

Adaptive Operational Tools

Similar to design adaptations, adaptive operational tools enable midcourse adjustments to a study’s day-to-day implementation processes. These tools track and analyze the performance of key study activities—enrollment, query resolution, data management, and site closeout—in real time, identifying areas where efficiency can be improved and enabling project teams to take the necessary action.

Though operational tools produce results of equal—and sometimes greater—value than design adaptations, they’re often overlooked by the industry as a source of efficiency.

For example, enrollment is often the primary driver of both direct and indirect study costs, yet studies seldom adjust enrollment strategy to take advantage of successful field practices. Reasons for screen failures, patient dropouts, and losses to follow-up should be closely tracked and analyzed for their implications on enrollment strategy.

Why do studies fail to optimize activities that so greatly influence costs and timelines? The simple answer is that conventional practices don’t include adjusting recruitment strategies, fine-tuning inclusion/exclusion criteria, or intervening promptly when a patient fails to show up for a visit. However, keeping close track of such activities is possible. New, adaptive clinical research systems make midcourse adjustments possible not only in isolated instances, but continuously and comprehensively. The result is a new standard of operational performance, including fastest-ever enrolling studies in CNS (Alzheimer’s disease), oncology, and other areas.
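As an illustration of the kind of tracking involved, here is a minimal sketch — with hypothetical site names, counts, and dates — that normalizes enrollment by each site's active time and flags sites enrolling at less than half the median rate as candidates for intervention or resource reallocation:

```python
from datetime import date

# Hypothetical per-site enrollment counts and activation dates
sites = {
    "Site A": {"enrolled": 24, "active_since": date(2009, 1, 5)},
    "Site B": {"enrolled": 3,  "active_since": date(2009, 1, 5)},
    "Site C": {"enrolled": 15, "active_since": date(2009, 2, 16)},
}

def enrollment_rate(info, as_of):
    """Patients enrolled per 30 active days."""
    days = max((as_of - info["active_since"]).days, 1)
    return info["enrolled"] / days * 30

as_of = date(2009, 4, 6)
rates = {name: enrollment_rate(info, as_of) for name, info in sites.items()}

# Flag sites running below half the median rate for follow-up
median = sorted(rates.values())[len(rates) // 2]
slow = [name for name, r in rates.items() if r < 0.5 * median]
```

Raw enrollment counts alone would make Site C look mediocre; rate-per-active-day reveals it is actually the fastest site, while Site B is the one that needs attention — the kind of insight continuous operational tracking is meant to surface.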

The Platform for Design and Operational Adaptations

The same technology platform supports both design and operational adaptations. To manage design adaptations, it is imperative to have a technology infrastructure in place that quickly and reliably provides information regarding whether and how extensively to enact design adaptations. Operational adaptations demand the same capabilities, including rapid, accurate data capture; prompt data cleaning and analysis; and support for timely decision making.

Current Web electronic data capture (EDC) systems and the processes associated with them seldom perform at the level required to support adaptive methods. In practice, Web EDC systems mimic paper processes. Site personnel collect data on paper forms and are often delayed in converting this data to electronic formats. Transcription also introduces inaccuracies and causes further delay. These and other problems explain why such systems fail to provide the anticipated improvements in clinical trial efficiency.

The Big Picture

The adaptive approach provides the best tools available for shortening timelines—typically by 10% to 40%—minimizing opportunity cost and enabling the most efficient possible use of study resources. While these financial implications are compelling with regard to a single project, the projected savings across an entire development program are staggering. The resulting improvements in internal rate of return (IRR) and net present value (NPV) can be a godsend for hard-pressed development programs in difficult times, much like an infusion of new capital.