Q&A: A New Approach to Improving Clinical Research Workforce Guidelines

Beth Harper, ACRP Workforce Innovation Officer

ACRP’s Workforce Innovation Officer Beth Harper is committed to moving competency in clinical trial research from blackboard theory to roll-up-your-sleeves practice. ACRP’s Editor-in-Chief recently sat down with Harper to learn about the latest developments.

Q: How did you approach the development of the recently released Clinical Research Coordinator (CRC) guidelines, and what’s different in them from what’s been done previously?

A: Some great work has been done by groups such as the Joint Task Force for Clinical Trial Competency (JTF) and others. In fact, their work served as the core foundation for the development of ACRP’s CRC guidelines. There is a lot of merit to having general competencies that any clinical research professional should be able to demonstrate, as this creates a broad portfolio of capabilities that can be built upon and translated into diverse job roles.

What’s been challenging, however, is putting these into practice, as individuals and employers are often focused solely on how the competencies relate to a specific job role. That has been ACRP’s focus, first with the release of the study monitoring guidelines last year and now with this most recent effort, the CRC guidelines.

After the release of the study monitoring guidelines, we received a lot of feedback on the format. While our members, organizations, and others felt the role-based approach, with levels ranging from entry-level to more highly skilled monitors/clinical research associates (CRAs), was helpful, the format itself (a lengthy PDF document) was somewhat difficult to work with. Folks wanted something more usable, both in terms of organization and structure (which is why we converted it from a Word/PDF-based document to an Excel file) and as a way to better document and track progress against developing the competencies.


Download the Core Competency Guidelines for Clinical Research Coordinators: get your free copy today and learn how these new guidelines can support CRCs, research sites, sponsors, and CROs.


In other words, an individual could use this to self-assess where they are, what gaps they have, and which competencies they need to focus on building, so they can move into higher-level roles or demonstrate more transferable competencies for different types of roles. Similarly, a manager could use this for a performance assessment, to identify gaps and areas where training plans need to be developed.

So, a lot of effort went into really thinking about how the general JTF competency statements map to the specific roles of the CRC, and into organizing this into a format that is more actionable and usable. But to be honest, this is just the beginning. We know that the industry wants even better tools to incorporate into job descriptions, more sophisticated performance evaluation tools, and better ways to identify specific training programs to address the competency gaps.

Q: Let’s drill down a little. Can you talk about how some of these future enhancements might work?

A: Let’s take the example of a performance assessment tool. Currently, our Excel file allows an individual and his or her manager to simply check off whether a competency has been achieved. Totals for the domains and competency statements are calculated automatically, so someone can see what percentage of the 114 total competencies has been achieved and which domains have the biggest gaps.
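For readers who prefer to see the mechanics spelled out, here is a minimal sketch of the kind of tally the spreadsheet automates. The data structure, domain names, and statements below are illustrative assumptions for this example, not ACRP’s actual formulas or guideline content.

```python
from collections import defaultdict

# Illustrative (domain, competency statement, achieved?) entries.
# The real CRC guidelines contain 114 statements; only a few are sketched here.
checklist = [
    ("Scientific Concepts & Research Design", "Explains the key elements of a protocol", True),
    ("Ethical & Participant Safety Considerations", "Describes the informed consent process", True),
    ("Ethical & Participant Safety Considerations", "Reports adverse events per requirements", False),
    ("Data Management & Informatics", "Performs accurate data entry and query resolution", False),
]

total = len(checklist)
achieved = sum(1 for _, _, done in checklist if done)
print(f"Overall: {achieved}/{total} competencies achieved ({achieved / total:.0%})")

# Count remaining gaps per domain to show where the biggest development needs are.
gaps = defaultdict(int)
for domain, _, done in checklist:
    if not done:
        gaps[domain] += 1

for domain, gap_count in sorted(gaps.items(), key=lambda item: -item[1]):
    print(f"{domain}: {gap_count} competency gap(s)")
```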

This is a pretty rudimentary way to think about a performance assessment or competency gap analysis. What we aim to do in future releases is tie this to a more structured performance assessment ranking: not just whether the competency was achieved (yes or no), but the extent to which the individual has mastered it, for example graded on a scale from basic mastery to fully mastered.
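As a rough sketch of that direction (the scale and field names below are hypothetical, not a format ACRP has finalized), the same tally could record a mastery level for each competency and summarize it by domain:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical scale: 0 = not yet demonstrated, 1 = basic, 2 = proficient, 3 = fully mastered.
ratings = [
    ("Ethical & Participant Safety Considerations", "Describes the informed consent process", 3),
    ("Ethical & Participant Safety Considerations", "Reports adverse events per requirements", 1),
    ("Data Management & Informatics", "Performs accurate data entry and query resolution", 2),
]

# Group mastery levels by domain and report an average, so gaps show up as low scores
# rather than simple unchecked boxes.
by_domain = defaultdict(list)
for domain, _, level in ratings:
    by_domain[domain].append(level)

for domain, levels in by_domain.items():
    print(f"{domain}: average mastery {mean(levels):.1f} out of 3")
```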

Competency development is an evolutionary process of growth that depends on experience and on exposure to opportunities to practice. It may take many months or years, and work on many different types of clinical studies, to fully master a competency and identify where the gaps are, so having more precise tools to help distinguish various levels of mastery and proficiency is one area we hope to build upon.

Another example is to use the competency guidelines to create competency-based job descriptions. Whereas many job descriptions outline duties and tasks, they don’t necessarily describe the core knowledge, skills, and behaviors needed to be successful in each role, or they describe the competencies in a way that can’t be directly measured and assessed. Tying the competencies back to job descriptions and associated performance assessment tools, we believe, would help enable the standardization and quality enhancements that are lacking in our industry.

Q: Many are recognizing the need for these guidelines. How do you think people can use them today?

A: At the risk of sounding like a Nike commercial, we really want people to “just do SOMETHING” with them, to be honest. I would encourage every CRC to download the guidelines and do a personal competency self-assessment. What surprises do they find? Perhaps they will see a lot of areas they never really thought about, and this will prompt them to explore taking more classes or reading more about some of the competencies. Or perhaps the self-assessment will show that their competencies are at a higher level than their current job role requires, and the documentation of their competency levels can be used in discussions with their managers about a possible promotion.

Similarly, managers could do the same, perhaps selecting the competency domains they think are most relevant to their staff, assessing where their CRCs are in terms of achieving competency levels, and comparing current job levels and roles to the guidelines. Evaluating the similarities, differences, and trends might start a discussion about modifying job descriptions, pay grades, or more.

We fully expect that we will learn the most from the early adopters who try these out and tell us what is and isn’t working: whether the guidelines are too comprehensive and could be pared down, whether there are domains that aren’t really addressed at all, and whether and how the formatting can be improved. I welcome all comments and suggestions, and am eager to see how we can continue to support the adoption of these guidelines.