The intent of a certification program is to evaluate the knowledge and skills of professionals seeking a credential to ensure a desired level of competence. For a certification exam to serve this goal, it must assess the performance that the specific job requires. A number of widely recognized organizations have established guidelines and standards for how certification exams should be developed.
The Academy’s programs are accredited by the National Commission for Certifying Agencies (NCCA), which sets internationally recognized standards for the development and operation of certification programs. These standards, grounded in established exam-development processes, assure that a program is valid, reflects current practice, and treats candidates fairly.
The development of certification examinations begins with a Job Analysis: research into the actual work done by practicing professionals that produces a definitive description of the tasks a job role requires and the knowledge needed to complete those tasks. A Job Analysis is conducted via a survey, typically every five to seven years, to assure the exam tests current practice in a job role.
In 2010, the Academy updated its CRA, CRC, and PI Job Analyses with a survey sent to thousands of clinical research professionals, 3,636 of whom responded. The survey results identified for the Academy what to include on the Detailed Content Outline (DCO) for each job role. Each role has a different DCO reflecting the task and knowledge statements that the majority of CRAs, CRCs, and PIs, respectively, said were essential to their role. The results of the job analysis also dictate how much of each topic is covered: tasks performed more frequently or deemed most critical are tested more heavily on the exam. Each designation’s outline reflects the particular emphasis of its job role and function. The Academy will update its Job Analysis through a new survey in Spring 2015, with new exams reflecting those results expected in late 2016.
Individuals who are already certified as a CRA, CRC, or PI are then trained to write test questions based on current practice and conduct of clinical research. We call these volunteers “Subject Matter Experts,” or “SMEs.” All questions must test knowledge and skills as defined by the DCO. The correct answer for each item must be supported by at least one citation of a reference on the resource list, currently consisting of specific areas of the ICH Guidelines, as described in our Candidate Handbook. Once the SMEs have written draft questions, the questions go to the CCRA, CCRC, and CPI Exam Committees for review. This process is constantly in motion: new questions are written, current questions are reviewed, and older or nonperforming questions are “retired” from the item bank.
One Exam Committee exists for each exam. The CCRA, CCRC, and CPI Exam Committees each consist of a separate group of currently practicing, certified CRAs, CRCs, or PIs who review, edit, discuss, and rewrite the draft test questions. Many draft questions are discarded in the process; others are completely rewritten or heavily edited. Each question must meet minimum standards for applicability to the job role, and all committee members must agree that the answer keyed as correct is, in fact, the only correct answer possible. The committee also verifies that the content tested falls within the appropriate DCO and that the cited reference(s) support the correct answer.
Once an Exam Committee approves a draft question, it becomes a “pre-test question.” All questions are pre-tested before they count toward a candidate’s score: each exam is 125 questions long, of which 100 are scored and 25 are pre-test questions. The Academy collects statistical data on the pre-test items to determine whether they are well constructed enough to appear on the exam as scored items. Hundreds of candidates answer a pre-test question before it can be judged suitable for scoring.
Once enough data have been collected, the Academy and its professional test development partner analyze the item statistics to see whether items have performed well enough to be used. If they have not (for example, if many candidates choose a wrong answer; if each of the four answers is selected equally, which indicates test-takers are guessing; or if candidates who score well on the exam overall select a wrong answer), the questions are set aside for further review and rewriting, or they may be discarded. Only questions that prove fair to the test taker and that identify proficiency in a candidate are used.
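The statistics described above are standard classical item analysis: item difficulty (the proportion of candidates answering correctly; near 0.25 on a four-option item suggests guessing) and item discrimination (whether high-scoring candidates tend to get the item right). The sketch below illustrates these two calculations; the function name, data layout, and any thresholds are illustrative assumptions, not the Academy’s actual method.

```python
import math

def analyze_item(responses, key, item):
    """Classical item analysis for one item (illustrative sketch).

    responses: list of answer sheets, one list of chosen options per candidate
    key:       list of correct options, one per item
    item:      index of the item to analyze

    Returns (difficulty, point_biserial):
      difficulty     = proportion of candidates answering the item correctly
      point_biserial = correlation between answering this item correctly
                       and total exam score (higher is better discrimination)
    """
    # Total score for each candidate across all items.
    scored = [sum(1 for i, k in enumerate(key) if r[i] == k) for r in responses]
    # 1.0 if the candidate got this item right, else 0.0.
    right = [1.0 if r[item] == key[item] else 0.0 for r in responses]

    n = len(responses)
    p = sum(right) / n                     # item difficulty
    mean_s = sum(scored) / n
    sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in scored) / n)
    if sd_s == 0 or p in (0.0, 1.0):
        return p, 0.0                      # discrimination undefined; report 0

    # Mean total score among candidates who answered this item correctly.
    mean_right = sum(s for s, c in zip(scored, right) if c) / sum(right)
    # Point-biserial: do stronger candidates tend to answer correctly?
    rpb = (mean_right - mean_s) / sd_s * math.sqrt(p / (1 - p))
    return p, rpb
```

An item with near-chance difficulty or a low (or negative) point-biserial would be exactly the kind of question set aside for review under the process described above.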
Each time the exam is administered, multiple “exam forms” are used. This helps minimize cheating and assures that those retesting do not receive the same exam they took previously. All exam forms comply with the Detailed Content Outline, and all candidates must achieve a score of 600 or greater to demonstrate sufficient knowledge and skill to pass the exam and become certified.
Diverse groups of volunteers write the questions, review them, select questions to be pre-tested, and select and review the questions that actually appear on the exam. Each year, more than 160 volunteers from around the world participate in some aspect of test development for Academy certification exams. This helps assure the exam represents a variety of perspectives, practice settings, and types of research. The Academy follows a process that meets international standards for test development and works with a highly regarded testing partner.
To date, more than 30,000 clinical research professionals have been certified by the Academy. On average, 73 percent of first-time test takers pass their exam.