Operational Adaptive Research

European Biopharmaceutical Review, Summer 2008

Michael Rosenberg of Health Decisions examines the operational aspects of adaptive research – a traditionally overlooked component of clinical trials

Adaptive clinical research, defined as research that allows changes in a trial or programme while it is underway, is increasingly used in pharma as a means of shortening development timelines and reducing costs – in short, improving efficiency. While there is much industry discussion of adaptive methods, the range of topics is restricted by an unduly narrow definition that limits the concept to midcourse changes in study design. Allowing such design changes is an important and valuable aspect of adaptive methods that encompasses techniques such as sample size re-estimation, pruning of treatment arms, alteration of randomisation ratios, Bayesian analyses, and similar methodologies. There can be little question about the benefit of such design-level changes based on their ability to incorporate knowledge as it is generated and to change how an evaluation proceeds. Two of the simplest techniques – sample size re-estimation and pruning of treatment arms during dose-finding – are so valuable that I believe they should be routinely considered in study planning – a sentiment echoed by the PhRMA working group on adaptive methodology (1). Failure to at least consider employing these techniques risks wasting time and money – and often substantial amounts of both.

However, there is an even more useful aspect of adaptive methods – allowing midcourse changes that affect how the study is actually run, while leaving the study design unchanged. Such adaptations at the operational level are at least as important as and often even more important than the design-level changes noted above. This is true because these operational elements of adaptive methods can add substantial value, in most cases reducing time and expense by 20 per cent or more, and do not require specific regulatory approval. As a result, adaptations at the operational level are high-impact tools available immediately to virtually all companies.

Perhaps even more importantly, operational-level adaptations serve as a foundation for the design-level adjustments noted above by providing the infrastructure for timely data capture, cleaning and reporting on which midcourse changes in study design depend. The operational aspects of adaptive methods enable decision making at all levels of a trial, from utilisation of resources to enrolment strategies to making clean data available before a patient walks out the door after visiting an investigational site. Design changes are based on trial data; operational changes are based on metadata, or the data about the data, which include many levels of performance metrics. Such performance metrics are in many respects the most fundamental and important study management tools. They are a prerequisite for successfully conducting any adaptive study or programme.

TIMELY INFORMATION

The availability of timely, validated information and performance metrics lies at the heart of any adaptive approach because mid-course corrections of any type, from refining enrolment strategies to re-assessing sample size, rely on the timely availability of accurate information as a basis for decision making. The ability to look ahead based on timely information increasingly allows management to think in terms of entire development programmes rather than considering each study in isolation, because new information can be incorporated as the programme moves forward.

Enrolment is an excellent example of just how valuable operational adaptations can be. Enrolment is a critical element of virtually every study and drives current efforts to globalise studies. The majority of studies – 80 per cent, according to one conservative survey – enrol late (2). A central reason for this is the lack of information needed to manage the enrolment process. The most fundamental component of any management system is timely status information.

However, such information is either partially or entirely lacking in almost all commercially available EDC and CTMS systems, with few exceptions. This is a leading cause of inefficiency in clinical studies. Indeed, modern management of complex projects such as pharmaceutical development is impossible without timely, accurate data on a broad range of performance metrics, yet such metrics are largely absent from this industry. The following challenge proves the point; a sketch of how easily such metrics can be computed, given timely data, follows the list.

  • Can you, within a matter of minutes, determine the number and frequency of screen failures in your study?
  • Can you determine the current enrolment rate, by site?
  • Can you identify the most and least successful sites, and the reasons for the differences, allowing field and internal staff to intervene promptly to bring laggards up to speed?
  • Can you tell when monitoring visits should be scheduled, based on quality as well as quantity of work performed?
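
None of these questions requires sophisticated analytics once timely, validated status data exist. As a purely illustrative sketch – the record layout, field names and sample values below are assumptions, not taken from any particular EDC or CTMS system – the first two questions reduce to a few lines of Python:

```python
from collections import defaultdict
from datetime import date

# Hypothetical screening log: (site_id, screening_date, enrolled?)
# Field names and sample values are illustrative only.
screening_log = [
    ("site_01", date(2008, 1, 7), True),
    ("site_01", date(2008, 1, 9), False),   # screen failure
    ("site_02", date(2008, 1, 8), True),
    ("site_02", date(2008, 1, 15), True),
    ("site_03", date(2008, 1, 20), False),  # screen failure
]

def screen_failure_summary(log):
    """Return overall screen-failure count and rate."""
    failures = sum(1 for _, _, enrolled in log if not enrolled)
    return failures, failures / len(log)

def enrolment_rate_by_site(log, study_start, as_of):
    """Enrolled patients per site per 30 days, from study start to the reporting date."""
    months_open = max((as_of - study_start).days, 1) / 30.0
    enrolled = defaultdict(int)
    for site, _, was_enrolled in log:
        if was_enrolled:
            enrolled[site] += 1
    return {site: n / months_open for site, n in enrolled.items()}

failures, rate = screen_failure_summary(screening_log)
print(f"Screen failures: {failures} ({rate:.0%} of screened patients)")
for site, per_month in sorted(
        enrolment_rate_by_site(screening_log, date(2008, 1, 1), date(2008, 2, 1)).items()):
    print(f"{site}: {per_month:.1f} patients/month")
```

The point is not the code itself but the prerequisite it exposes: status records must reach the study team promptly and in a usable form before any such metric can be produced.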

If you answered yes to any of these, you are among a distinct minority. In most cases, the industry continues to rely on outdated processes that fail to provide the kind of timely information so essential to effective management. Such a situation would be unthinkable in the highly competitive world of manufacturing, yet is the norm in our industry.

THE IMPORTANCE OF PROCESS

The pharmaceutical industry is not the first to underestimate the importance of operational processes. In the 1980s, in the quest to catch up with more efficient Japanese car manufacturers, US manufacturers invested billions of dollars in impressive new robotic assembly lines. Although industrial robots could improve production, US companies that leaped into robotics overlooked the more important underlying requirement for fundamentally efficient processes. Toyota, the world’s most efficient and profitable car manufacturer, was at the time making relatively little use of robots on its assembly lines. Rather, Toyota achieved its breakthrough in productivity not by replacing people with robots, but by optimising manufacturing processes – matching the flow of work to customer demand, replacing large standing inventories with ‘just in time’ component delivery, and perfecting the art of getting things right the first time rather than reworking vehicles that flunked inspection at the end of the assembly line (3).

Reworking the errors made along the assembly line accounted for an estimated 30 per cent of the cost of a typical American or European car. Porsche estimated that a problem costing around one mark to fix on the assembly line cost 10 marks to fix at the end of the line, 100 marks in the vehicle rectification area at the end of the plant, and 1,000 marks at the dealer under warranty (4). Porsche, like most leading manufacturers at the time, underestimated both the ingenuity involved in improving operational processes and the efficiency to be gained.

The pharmaceutical industry now finds itself in the same position: approximately 27 per cent of the budget for a Phase III trial, for example, goes to correcting protocol violations – work not done correctly the first time (5). For a typical small study, that means as much as €15,000 per patient × 500 patients × 30 per cent, or roughly €2.25 million, is wasted on fixing work completed incorrectly. Few industries can afford to sustain this level of waste, particularly in today’s world of tightening credit and pullbacks in funding. Using adaptive methods to accelerate enrolment, optimise monitoring efficiency and minimise site queries is an example of fixing errors promptly rather than waiting for them to proliferate and fixing them at much greater cost late in the study. In each case, the availability of timely, accurate data during the study allows the generation of performance metrics that enable study managers to make significant improvements in efficiency: less extra enrolment effort or time, and fewer queries, because more accurate data arrive sooner rather than unlimited resources being devoted to resolving queries at the end of the process.
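
The back-of-envelope arithmetic above generalises easily. A minimal sketch, using the article’s illustrative figures rather than data from any specific study:

```python
def rework_cost(cost_per_patient, n_patients, rework_fraction):
    """Estimated spend on correcting work that was not done right the first time."""
    return cost_per_patient * n_patients * rework_fraction

# Illustrative figures from the text: €15,000 per patient, 500 patients,
# roughly 30 per cent of effort spent correcting protocol violations.
waste = rework_cost(15_000, 500, 0.30)
print(f"Estimated rework cost: €{waste:,.0f}")  # €2,250,000
```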

Although current technology and communication platforms enable close tracking of many performance metrics, most study management systems make it difficult even to identify the ultimate source of an error or the proportion of resources routinely devoted to fixing particular errors – a basic, essential management capability.

SYSTEM REQUIREMENTS

For both operational and design-level adaptations, it is essential to have timely, validated information to inform management decisions.

Study systems must collect, process and make data available promptly. One system that achieves this goal uses an optical pen (SmartPen™) for simple, rapid collection of a variety of data and performance metrics (see Figures 1 and 2). While the SmartPen™ is the most visible component, the backbone of the system is the centralised processing capability that tracks data and generates a range of performance metrics (see Table 1). The final component of this management system is a means of rapid reporting. The effect of this end-to-end system is that CRFs can be completed quickly and data transmitted simply by ‘docking’ the pen, making both data and metadata available almost instantly through a variety of mechanisms, including full reports on a dedicated, secure website and continuously updated desktop widgets that summarise a range of variables customised for the informational needs of each project role (see Figure 4).
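
As a rough, hypothetical illustration of the data flow just described – capture, centralised validation, metric generation and reporting – the following sketch models the pipeline in outline. The class names, fields and edit check are assumptions made for illustration only and bear no relation to the actual SmartPen™ software:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CRFRecord:
    """A single captured CRF field value (hypothetical structure)."""
    site_id: str
    patient_id: str
    field_name: str
    value: str
    captured_at: datetime

@dataclass
class StudyPipeline:
    """Toy end-to-end flow: dock -> validate -> metrics -> report."""
    records: list = field(default_factory=list)
    queries: list = field(default_factory=list)

    def dock(self, new_records):
        """Receive records transmitted when the pen is docked."""
        for rec in new_records:
            if not self._validate(rec):
                # Edit check failed: raise a query immediately, not weeks later.
                self.queries.append((rec.site_id, rec.field_name))
            self.records.append(rec)

    def _validate(self, rec):
        # Placeholder edit check; real systems apply per-field range and logic rules.
        return rec.value != ""

    def metrics(self):
        """Performance metrics (metadata), distinct from the trial data themselves."""
        return {
            "records_received": len(self.records),
            "open_queries": len(self.queries),
            "query_rate": len(self.queries) / max(len(self.records), 1),
        }

pipeline = StudyPipeline()
pipeline.dock([
    CRFRecord("site_01", "P-001", "systolic_bp", "128", datetime.now()),
    CRFRecord("site_01", "P-001", "diastolic_bp", "", datetime.now()),  # triggers a query
])
print(pipeline.metrics())
```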

THE BENEFITS OF OPERATIONAL ADAPTATIONS

The value of a system with efficient data collection, processing and reporting became clear in a large-scale global CNS study involving seven countries and more than 1,200 patients observed for 12 months, producing more than 450,000 pages of data. Data and metadata were processed and queries returned to sites an average of 4.9 hours after receipt, providing rapid feedback to site staff.

Operational adaptive techniques enabled one oncology study to accelerate timelines and cut costs by $16 million. Even more important, the study reduced time-to-market by approximately one year. The consequences of such a reduction in time-to-market are far-reaching – in this case, generating a $366 million increase in net cash, a $133 million increase in net present value (NPV) and a 16 per cent increase in internal rate of return (IRR).

Each of these examples achieved striking improvements without altering the study design. Rather, they took advantage of adaptive-grade data capture and current performance metrics to make operational improvements. Since the necessary changes for making such operational improvements did not alter the design of the studies, the study sponsors did not have to seek regulatory approval before reaping the benefits of adaptive infrastructure and methods. Sponsors can implement such tactical or operational adaptations at will, often with benefits comparable to design-level adaptations such as sample-size re-estimation, adaptive pruning or adaptive randomisation. The following areas illustrate some of the processes for which operational adaptations can deliver the biggest improvements in efficiency.

Enrolment

Adaptive methods accelerate enrolment by allowing study managers to determine at an early stage which sites are enrolling patients most successfully, which techniques are responsible for their success, and which factors are inhibiting enrolment progress. This system has been used to establish industry records for rapid enrolment in four therapeutic areas: CNS, reproductive health, oncology and cardiovascular disease. For example, a large Phase III study of a new Alzheimer’s drug enrolled 1,264 patients in 12 months, and a global oncology project was completed three months ahead of schedule – the fastest-enrolling metastatic breast cancer study to date. What made such gains possible was the availability of timely performance metrics and processes built around continuous improvement. Study managers were able to compare enrolment performance and analyse enrolment practices, inclusion-exclusion criteria and the causes of screen failure. Continuous assessment and refinement, coupled with immediate communication to all sites of which strategies were most and least effective, provided significant gains in efficiency.

Adaptive Monitoring

The ability to allocate expensive field monitors according to need rather than a fixed, rigid schedule can easily save as much as €1 million in a typical €10 million study. First, numerous performance metrics, such as query rate, are continuously tracked, enabling much of the management normally performed at field visits to be carried out continuously in-house, so that issues can be identified earlier and corrected through telephone calls rather than personal visits. By closely monitoring such metrics, study managers can ensure that monitors intervene early to correct problems before they threaten timelines and budgets.

Furthermore, the capability to track the number of unmonitored fields (and other performance measures) allows monitoring resources to be allocated according to need rather than a fixed schedule. If a problem needs immediate on-site attention, monitors can be dispatched rather than the problem waiting to be identified at the next scheduled visit; similarly, well-performing sites may require fewer visits. The SmartPen™ system can also reduce the monitoring workload, since the digital CRFs can serve as source material, obviating the need to compare an original paper CRF with data transcribed or re-entered at a computer keyboard. This approach to source data verification eliminates the possibility of transcription errors and allows values in the study database to be compared with source electronic files – a much more efficient process than shuffling through stacks of paper forms. The net result is a saving of €1-2 million on a €10 million study, as well as secondary benefits such as simplifying database lock and providing more timely validated information – essential for the design-level adaptations referred to earlier.
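
A needs-based visit schedule of the kind described can be expressed very simply: rank sites on the metrics that matter and visit those whose backlog or error rate crosses a threshold. The sketch below is a minimal illustration; the metric names and thresholds are assumptions, and real triggers would be defined in the monitoring plan:

```python
# Hypothetical per-site performance metrics (names and thresholds are assumptions).
site_metrics = {
    "site_01": {"unmonitored_fields": 1200, "query_rate": 0.08},
    "site_02": {"unmonitored_fields": 150,  "query_rate": 0.01},
    "site_03": {"unmonitored_fields": 900,  "query_rate": 0.02},
}

def monitoring_priority(metrics, backlog_threshold=500, query_threshold=0.05):
    """Return sites ranked by need; only those above threshold get an on-site visit."""
    needs_visit = []
    for site, m in metrics.items():
        if m["unmonitored_fields"] > backlog_threshold or m["query_rate"] > query_threshold:
            # Simple composite score: backlog plus a weighted error term.
            score = m["unmonitored_fields"] + 10_000 * m["query_rate"]
            needs_visit.append((score, site))
    return [site for _, site in sorted(needs_visit, reverse=True)]

print("Schedule on-site visits for:", monitoring_priority(site_metrics))
# Well-performing sites (site_02 here) are handled by telephone and fewer visits.
```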

Queries

Resolving queries is an unavoidable activity in clinical studies, but rather than devoting resources to fixing problems after the event, it is far cheaper and faster to identify and resolve the problems that cause queries in the first place. For example, if continuously updated performance metrics indicate a high number of queries for a specific CRF field, this may suggest a flaw in the CRF design, an overlooked ambiguity in the instructions provided to investigational sites or, depending on how the errors are distributed between sites, inadequate training of site personnel. In many cases, prompt intervention can correct such problems, reducing the number of queries and the monitoring time required to resolve them, and increasing the likelihood of rapid site closeout.
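
The distinction drawn above – a form-design flaw versus a training problem at particular sites – can be made operational by examining how queries on a given CRF field are distributed across sites. A minimal sketch, with hypothetical query records and an arbitrary concentration threshold:

```python
from collections import Counter

# Hypothetical open queries as (crf_field, site_id) pairs.
open_queries = [
    ("ecg_date", "site_01"), ("ecg_date", "site_02"), ("ecg_date", "site_03"),
    ("ecg_date", "site_04"), ("conmed_dose", "site_02"), ("conmed_dose", "site_02"),
    ("conmed_dose", "site_02"),
]

def diagnose_field(queries, field_name, concentration_threshold=0.6):
    """Suggest whether a problem field points to form design or to specific sites."""
    sites = Counter(site for f, site in queries if f == field_name)
    total = sum(sites.values())
    if total == 0:
        return "no queries"
    top_site, top_count = sites.most_common(1)[0]
    if top_count / total >= concentration_threshold:
        return f"concentrated at {top_site}: likely a site training issue"
    return "spread across sites: likely a CRF design or instruction problem"

for fld in ("ecg_date", "conmed_dose"):
    print(fld, "->", diagnose_field(open_queries, fld))
```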

One cautionary note applies to both design-level and operational adaptations: do not attempt either unless you are using a data-capture method that is up to the task. The ability to carry out adaptive research, whether operational or design-oriented, rests on the ability to provide clean, actionable information on operational indices, as well as the data themselves, within minutes to hours, not days to weeks. An electronic data-capture system is a must, but conventional web-based EDC systems fail to meet these requirements: not only do they require hand entry of data, but they also focus on data to the exclusion of management metrics. The input of raw data by hand, often performed by clinical personnel, is slow and inaccurate, introducing errors that may not be detected (especially if they fall within range). Perhaps most important, web-based EDC often operates in isolation, dealing only with data rather than generating the performance metrics and other information essential for effective study management.

A REVOLUTION IN THE MAKING

Operational adaptations offer the potential to revolutionise monitoring and study management. Needs-based allocation of monitoring often reduces the number of site visits required, and thus cuts travel and its substantial associated expenses. The cost of conducting a study can drop substantially because more work can be performed in-house by less costly staff. Furthermore, identifying and correcting problems early in the process not only reduces costs and shortens timelines, but also improves quality. For clinical studies, simultaneous improvements in these three areas transform what has often been a sad tale of cost overruns, missed deadlines and error-plagued close-outs into a winning combination.

The potential for efficiency gains is greatest in studies that employ both design-level and operational adaptations. Application of the full spectrum of adaptive techniques, both design-level (strategic) and operational (tactical), can yield dramatic improvements in net present value (NPV) and internal rate of return (IRR) for development projects. For example, a programme with €50 million in development costs, a six-year timeline and peak sales of €300 million in year three has an NPV of €280 million and an IRR of 65 per cent. Using operational and design-level adaptive components in dose-finding (Phase II) alone can increase these figures by €80 million and 13 per cent, respectively. An interactive calculator is available online (7).
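
The financial leverage of shorter timelines can be checked with a simple discounted cash-flow calculation. The sketch below uses entirely hypothetical cash flows and a 10 per cent discount rate – it does not reproduce the article’s figures, whose underlying assumptions are not given – but it shows the mechanism: pulling revenue forward by a year raises both NPV and IRR:

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, first element at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical programme: development spend followed by a sales ramp (EUR millions).
base_case = [-20, -15, -15, 0, 100, 200, 300, 250, 200]
# Operational adaptations shave a year off development: revenue arrives one year earlier.
accelerated = [-20, -15, -15, 100, 200, 300, 250, 200, 0]

for label, flows in (("base", base_case), ("one year faster", accelerated)):
    print(f"{label}: NPV = €{npv(0.10, flows):.0f}m, IRR = {irr(flows):.0%}")
```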

Since the same infrastructure – electronic data capture, rapid validation, and prompt conversion of data into performance metrics and other information to support decisions – is the foundation that enables all types of adaptations, there is every reason to exploit it to maximise both sets of capabilities. Although decision-making differs between design and operational adaptations, the goal is the same: to arm study managers with timely information that allows them to continuously refine a broad range of study procedures. In an environment where time- and cost-efficiency are increasingly important, these benefits are extremely attractive. Can any investor or company afford to overlook such potential?

References

1. Chuang-Stein C, Anderson K, Gallo P and Collins S, Sample size re-estimation: a review and recommendations, Drug Information Journal 40(4): pp475-484, 2006
2. Quoted in Stover D, E-recruitment: trial by wire, Bio-ITWorld, http://www.bio-itworld.com/archive/031003/insights_recruitment.html, accessed 13th March 2008
3. Womack J, Jones D and Roos D, The Machine That Changed the World, New York: Harper Collins, p77, 1990
4. Womack JP and Jones DT, Lean Thinking: Banish Waste and Create Wealth in Your Corporation, New York: Free Press (Division of Simon & Schuster), p199, 2003
5. Bishop B, Predicting and preventing protocol violations, audio presentation, 14th June 2007, http://www.covance.com/library/audio_070614.php
6. Schoenberger C, An Alzheimer’s drug goes on trial, Forbes Magazine, pp94-96, 20th March 2000
7. www.healthdec.com/adaptive