ACM Exam Scoring


The ACM certification exams are all pass/fail exams developed to measure minimum competence in case management practice. The certification is earned by passing both a Core (multiple choice) exam and a Specialty Simulation exam. A score report is issued for each portion of the exam indicating pass or fail. Additional detail is provided in the form of raw scores by major content category on the Core score report, and a raw score for Information Gathering (IG) and Decision Making (DM) on the Specialty Simulation score report. A raw score is the number of questions (points) answered correctly. Pass/fail status is determined by the raw score. Pass rates are reviewed at each NBCM meeting to look for abnormalities or trends.

Scores for test candidates who did not successfully pass both the Core and Specialty Simulation exams, but did pass one of them, are kept on file for six months from the date of issue. If retesting does not occur within six months, or if both portions of the exam are not successfully passed within six months, the original passing score cannot be applied toward certification and the candidate will be required to sit for the entire examination. Candidates can re-apply to take only the portion of the examination they did not pass, provided the retest occurs within the six (6) month period. There is no limit to the number of times a candidate can retest during the six (6) month period. Re-testers take a different form of the exam each time a retest is attempted.

Pass/Fail Score Determination: Core Exam

The minimum passing score is determined using a modified Angoff method, applied during a Passing Point Study conducted by a panel of experts in the field. This widely accepted psychometric procedure relies on content experts to estimate, for each item on the examination, the probability that a minimally competent candidate will answer it correctly; these estimates determine the number of correct answers necessary to demonstrate the knowledge and skills required to pass. A candidate’s ability to pass the examination depends on the knowledge and skill displayed, not on the performance of other candidates. Passing scores may vary slightly for each version of the examination. To ensure fairness to all candidates, a process of statistical equating is used. This involves selecting, for each version of the examination, a mix of individual questions that meets the content distribution requirements of the examination content outline. Because each question has been pre-tested, a difficulty level can be assigned. The process then considers the difficulty level of each question selected for each version of the examination, attempting to match the difficulty of the versions as closely as possible. Any slight remaining variation in difficulty is addressed by adjusting the passing score, based on the overall difficulty statistics of the scored questions that appear on a particular version of the examination.
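As a rough illustration only (not NBCM's actual procedure or data), a modified Angoff passing point can be sketched as the sum, across items, of the panel's averaged probability estimates. All ratings below are invented:

```python
# Hypothetical panel ratings: for each item, each expert's estimate of the
# probability that a minimally competent candidate answers it correctly.
expert_ratings = {
    "item_1": [0.70, 0.80, 0.75],
    "item_2": [0.60, 0.55, 0.65],
    "item_3": [0.90, 0.85, 0.95],
}

def angoff_passing_score(ratings):
    """Average the panel's estimates per item, then sum across all items.

    The result is the expected raw score of a minimally competent
    candidate, which serves as the passing point.
    """
    return sum(sum(r) / len(r) for r in ratings.values())

print(round(angoff_passing_score(expert_ratings), 2))  # 2.25
```

In practice the panel rates every scored item and the sum is rounded to a whole-number raw passing score; the three-item example above only shows the shape of the calculation.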

Pass/Fail Score Determination: Specialty Simulation Exam

The passing point for the Specialty Simulation portion was set by an examination committee using a similar criterion-referenced method. The exact passing point may vary from one form of the examination to another, depending on the scored problems included on the form attempted. The examination committee follows strict guidelines in selecting the problems for each form to ensure the versions of the examination are parallel in difficulty. Each option on the Specialty Simulation exam is assigned a specified number of points by the examination committee, corresponding to the degree to which the option contributes to appropriate management of the situation described. Options may be weighted from -3 to +3; most options are assigned a weight of either -1 for inappropriate selections or +1 for appropriate selections, while weights of ±2 or ±3 are reserved for the most serious errors or the most essential actions. A candidate’s total scores are computed by adding together the weights of the options selected; the scores are computed separately for the IG and DM sections. Each IG or DM section is evaluated by content experts during the problem’s development, and a minimum pass level (MPL) is established for the section using the scoring weights assigned to the options in that section.

Each Specialty Simulation exam has separate scores and minimum pass levels for the IG and DM questions. The overall IG minimum pass level is determined by adding the IG MPLs for all the problems, and the overall DM minimum pass level by adding the DM MPLs for all the problems. Passing only one of the two sections will not result in an overall passing score. To assist candidates in evaluating their performance on the Specialty Simulation examination, scores are provided for both the IG and DM sections. Candidates must achieve passing scores in both IG and DM across the entire exam to successfully complete this exam portion. To earn the ACM credential, candidates must achieve passing scores on the IG and DM sections as well as on the multiple-choice examination.
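The weighted-option scoring described above can be sketched as follows. The option names, weights, and minimum pass levels here are all hypothetical; real values are set by the examination committee during problem development:

```python
# Hypothetical option weights for one problem's IG and DM sections
# (range -3..+3; most real options are weighted -1 or +1).
ig_weights = {"order_chart_review": 2, "call_family": 1, "delay_discharge": -1}
dm_weights = {"arrange_home_health": 3, "discharge_without_plan": -3}

# Options a hypothetical candidate selected in each section.
selected_ig = ["order_chart_review", "call_family"]
selected_dm = ["arrange_home_health"]

def section_score(weights, selected):
    """A section score is the sum of the weights of the selected options."""
    return sum(weights[option] for option in selected)

ig_score = section_score(ig_weights, selected_ig)  # 2 + 1 = 3
dm_score = section_score(dm_weights, selected_dm)  # 3

# Hypothetical minimum pass levels; a candidate must meet BOTH.
ig_mpl, dm_mpl = 2, 2
passed = ig_score >= ig_mpl and dm_score >= dm_mpl
print(ig_score, dm_score, passed)  # 3 3 True
```

Note that because the IG and DM sections are scored and passed separately, a high score in one section cannot compensate for a below-MPL score in the other.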

The National Board of Case Management (NBCM) will verify to any individual or organization whether an individual is currently certified, but it does not report or disclose the actual scores of individual certificants or candidates.

Content Area Scoring

The content area scores on the score report received after you take the exam are not used to determine pass/fail outcomes. They are provided only to offer a general indication of candidate performance in each content area. The examination is designed to provide a consistent and precise determination of a candidate’s overall performance; it is not designed to provide complete information about a candidate’s performance in each content area. Candidates should remember that areas with a larger number of items (questions) affect the overall score more than areas with fewer items. The precision and consistency of scores diminish as the number of items decreases; sub-scores should therefore be interpreted with caution, especially those that correspond to content areas with very few items.

Delay/Cancelling of Test Scores

Test results are normally available immediately at the conclusion of an individual's testing session. Under some circumstances, scores may be delayed.

  • Scores for newly developed or substantially revised tests may be delayed in order to set passing score standards and/or perform post-administration statistical analyses.
  • Scores may be delayed due to problems with registration, failure to comply with the policies and procedures set forth in the Candidate Handbook, and/or failure to follow instructions given by the test administrator.
  • Scores may be permanently voided if it is found the certificant or candidate did not meet the eligibility requirements at the time of application for the ACM exam.
  • On occasion, events such as a computer malfunction or misconduct by a candidate may cause a score to be suspect. NBCM and PSI reserve the right to void or withhold examination results if, upon investigation, a violation of its regulations is discovered.

Pre-Test Questions

All ACM exams include unscored pre-test items, and examinees have no way of distinguishing these items from the scored items. This testing method allows for the collection of important statistics about how pre-test items perform on the exam, which informs the final decision about whether a particular question meets the standards for inclusion as a scored item on future exams. To ensure the best possible testing result, candidates should treat each question as if it is being used in the final exam computation.

Multiple Test Forms and Scoring

The ACM certification exam uses multiple forms containing different items to minimize item exposure and ensure the continued relevance of test items. The examination committee accounts for differences between forms by assessing the difficulty levels of individual items on different versions of the same test. When a candidate completes an exam, the testing software calculates a raw score: the actual number of correctly answered questions. Because raw scores can be affected by the difficulty of the individual items on a particular form, these slight variations are accounted for through an equating process. Equating adjusts the number of items a candidate must answer correctly up or down, depending on the difficulty of the particular exam form: fewer correct answers are needed to pass a more difficult form, and more correct answers are needed to pass an easier form. These statistical adjustments ensure that the overall knowledge and skill a candidate must demonstrate remains the same across all test forms. In other words, no one receives an advantage or disadvantage because of the test form administered. Additionally, because multiple exam forms are administered, there is no single passing score identifying an unchanging number of correctly answered items needed to pass.
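A minimal sketch of the equating idea, with entirely invented numbers (this is not NBCM's or PSI's actual equating model, which would use pre-test item statistics): shift a baseline passing score by the difference in average form difficulty, so a harder form needs fewer correct answers to pass.

```python
def equated_passing_score(base_pass, base_difficulty, form_difficulty,
                          n_items=100):
    """Adjust a baseline raw passing score for a form's difficulty.

    Difficulty here is the hypothetical mean proportion-correct across
    the form's scored items (higher = easier form). n_items is an
    assumed scored-item count.
    """
    # An easier form (higher mean proportion-correct) raises the cut
    # score; a harder form lowers it.
    shift = round((form_difficulty - base_difficulty) * n_items)
    return base_pass + shift

print(equated_passing_score(70, 0.65, 0.62))  # harder form -> 67
print(equated_passing_score(70, 0.65, 0.68))  # easier form -> 73
```

The key property the sketch illustrates is that the *ability* required to pass stays constant while the raw cut score moves with form difficulty, which is why no single "number correct to pass" can be quoted across forms.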


Mailing Address:
National Board of Case Management
c/o American Case Management Association
17200 Chenal Parkway Suite 300 #345
Little Rock, AR 72223

Email: certification@acmaweb.org

Phone: (501) 907-2262

Fax: (501) 227-4247

