
For Efficient Clinical Trials, Three’s Company

By Igor Altman  

February 18, 2013 | Contributed Commentary | In recent years, clinical sponsors have struggled to streamline their clinical operations. Studies have grown increasingly complex, with more procedures and eligibility criteria than ever before. The volume of data collected per trial has increased, and enrollment has become more difficult. Yet despite the added risk and effort, sponsors' return on investment remains unclear.

In light of current economic pressures on the life science industry, the need to control costs without reducing output has become more urgent. Recent research from the Tufts Center for the Study of Drug Development continues to highlight the need for cost efficiency, finding that up to a quarter of procedures performed in clinical trials are non-core—not supporting the primary efficacy, safety or regulatory objectives. Similarly, 2009 research from the Center demonstrated that 15-30 percent of the data collected in a trial is not submitted in a new drug application.

Responding to these challenges and pressures, many sponsors have embarked on efficiency initiatives focused on operational process standardization and optimization. These initiatives have rightly driven many towards organizational and vendor consolidation, as well as data and process standardization. However, few organizations have reaped the full benefits to be gained by reducing work and data volume without impacting a program's success. This is because few have focused on linking the very beginning and heart of a trial's execution—its design expressed in the protocol—to the resources, time and capital consumed during the operational phase.

The marriage of three key trial components enables a sponsor to cut cost and time from a trial without reducing its probability of success. These fundamental components are: a structured study design; operational metrics and benchmark data associated with that design; and a process leveraging the first two to rigorously vet and shape the study design early and often.

Structured Study Design 

According to the recent Tufts research, an effective approach to identifying non-core procedures is to structurally link study objectives to endpoint/outcome measures, and then to map those objectives to the corresponding procedures in each visit. This linkage allows study-authoring teams to quickly and easily understand the role of each procedure defined in the trial protocol. Structuring the study design separately from the full protocol document not only enables such linking, but also encourages rigorous analysis.

Rather than working from a long document, protocol authors should identify the key study design concepts as structured components—such as objectives, endpoints, eligibility criteria, indication, phase, test article, visits and procedures—and present those components in a logical, organized way via a table, spreadsheet or specialized technology tool. This organization enables researchers to immediately differentiate between procedures that collect and assess data needed to achieve a primary endpoint/objective and procedures that do not clearly link back to even a tertiary endpoint or other relevant data set. Clear differentiation allows research questions to be answered more easily, and decisions about the prioritization and necessity of various endpoints, criteria and procedures to be made more objectively.
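The linkage described above can be sketched as a simple data model. This is a minimal, hypothetical illustration—the class names, endpoints and procedures are invented for the example and do not reflect any real protocol standard or Medidata product:

```python
from dataclasses import dataclass

# Hypothetical data model: endpoints are defined once, and each scheduled
# procedure declares which endpoint(s) it supports. Procedures with no
# linkage are candidates for the "non-core" review described above.

@dataclass
class Endpoint:
    name: str
    tier: str  # e.g., "primary", "secondary", "tertiary/exploratory"

@dataclass
class Procedure:
    name: str
    visit: str
    supports: list  # names of the endpoints this procedure feeds

def unlinked_procedures(procedures, endpoints):
    """Return procedures that do not map to any defined endpoint."""
    known = {e.name for e in endpoints}
    return [p for p in procedures if not (set(p.supports) & known)]

endpoints = [
    Endpoint("change in HbA1c", "primary"),
    Endpoint("quality-of-life score", "tertiary/exploratory"),
]
procedures = [
    Procedure("HbA1c blood draw", "Visit 2", ["change in HbA1c"]),
    Procedure("QoL questionnaire", "Visit 2", ["quality-of-life score"]),
    Procedure("12-lead ECG", "Visit 3", []),  # no endpoint linkage
]

for p in unlinked_procedures(procedures, endpoints):
    print(f"{p.name} ({p.visit}): no linked endpoint")
```

Flagging a procedure this way does not automatically mean it should be cut; it means the team must either articulate the endpoint it serves or justify its inclusion.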

Metrics and Industry Benchmarking 

The second critical piece of study design optimization is the context provided by metrics and industry benchmarks for the cost, work effort and frequency of the procedures contained within a protocol. Layering this data onto a structured design enables organizations to immediately see the cost and work effort drivers of a study (e.g., when a study's exploratory endpoint is driving 50 percent of the cost and work effort). With this insight, an organization can have an objective discussion to determine whether it makes sense to include specific endpoints from both an economic and a risk perspective. Then, in the context of industry benchmarks, the study authors can compare the work effort, cost and duration of their study to other similar studies. Benchmarks equip authors to dig deeper and identify outliers, specifically in terms of the frequency of a planned procedure within a trial protocol. Examining why a study performs a given procedure more frequently than the typical industry study can often lead to leaner study schedules without impacting the key data. 
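The cost-driver and benchmark analysis above amounts to two simple aggregations over the structured design. The sketch below is hypothetical—all costs, counts and benchmark frequencies are invented numbers for illustration only:

```python
from collections import defaultdict

# Invented example schedule:
# (procedure, supporting endpoint tier, cost per performance, times performed)
schedule = [
    ("HbA1c blood draw", "primary", 120.0, 6),
    ("QoL questionnaire", "exploratory", 40.0, 6),
    ("MRI scan", "exploratory", 900.0, 4),
]

# Invented industry-benchmark frequencies for similar studies
benchmark_frequency = {
    "HbA1c blood draw": 6,
    "QoL questionnaire": 3,
    "MRI scan": 2,
}

# 1) Which endpoints drive the cost?
cost_by_endpoint = defaultdict(float)
for name, endpoint, unit_cost, count in schedule:
    cost_by_endpoint[endpoint] += unit_cost * count

total = sum(cost_by_endpoint.values())
for endpoint, cost in cost_by_endpoint.items():
    print(f"{endpoint}: {cost / total:.0%} of study cost")

# 2) Which procedures are planned more often than the benchmark?
for name, _, _, count in schedule:
    if count > benchmark_frequency.get(name, count):
        print(f"{name}: {count} planned vs. benchmark {benchmark_frequency[name]}")
```

In this invented example, the exploratory endpoint drives the large majority of study cost, and two procedures exceed their benchmark frequencies—exactly the kind of outliers the text suggests examining before finalizing a design.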


A Rigorous Challenge Process

Structured study designs with metrics and benchmarks are only impactful when the protocol author and organization rigorously leverage them. At Medidata Solutions, we have seen several sponsors take the lead in optimizing their studies through a "challenge" process, which gathers a group outside the study team to discuss and challenge the proposed study design. The goal is not to approve the study or move it forward. Rather, the process ensures that only required, valuable data is collected, only necessary procedures are planned and the included eligibility criteria make sense, all while achieving the study's key objectives.

As a result of this rigorous challenge process, a number of sponsors have seen fewer amendments and improved patient recruitment, among other benefits. Successful challenge processes separate key design concepts from the protocol document, challenge the study design draft as early as possible, incorporate participants from outside the study team, and include skilled process facilitators. Structured design that includes metrics and data analysis also makes the challenge process more informed, data-driven, objective and efficient.

By structuring protocol design, informing that design with quantitative and qualitative understanding of site cost, work burden and data management impact, and building a disciplined process around design optimization, organizations can capture cost and time savings without compromising a study's primary goals or quality.

Igor Altman is product manager at Medidata Solutions. He can be reached at  
