NBME U Lesson Catalog • Self-guided, interactive lessons • Evidence-based best practices • Consistent and high-quality student assessments
The evidence is out there. NBME℠
YOUR PARTNER IN EVIDENCE-BASED ASSESSMENT
www.my.nbme.org
Table of Contents

NBME U Overview

Assessment Principles
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity

Method: Multiple-Choice Questions (MCQs)
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Method: Objective Structured Clinical Examinations (OSCEs)
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Method: Workplace-Based Assessment
• Conducting the Feedback Session
• Educational Frameworks for Assessing Competence
• Workplace Assessment: Encounter-Based Methods

Professionalism
• Introduction to the Construct of Professionalism and Its Assessment

Meet the Authors
Reaching new horizons in student assessments.
Five actions to create more confident student assessments
1. REGISTER for NBME U
2. ENROLL in NBME U lessons
3. ASSESS your skills
4. LEARN new skills
5. APPLY your new skills
engage, develop and improve
NBME U Lesson Catalog

NBME U is a new, evidence-based, digital resource to help educators more accurately, confidently and consistently assess their students. With NBME U, you can learn from the experts on student assessments. As a series of short, self-guided, interactive online lessons, NBME U helps educators create and deliver consistent, valid, reliable and high-quality assessments of their students. Learn anywhere, anytime and on any device.

By enrolling in NBME U online, healthcare professionals can learn more about:
• Assessment principles
• Scoring principles
• And other topics related to:
  - Objective structured clinical examinations (OSCEs)
  - Multiple-choice questions (MCQs)
  - Workplace-based assessments
  - Professionalism
This activity has been planned and implemented in accordance with the accreditation requirements and policies of the Accreditation Council for Continuing Medical Education (ACCME) through the joint providership of the Federation of State Medical Boards and the National Board of Medical Examiners. The Federation of State Medical Boards is accredited by the ACCME to provide continuing medical education for physicians. The Federation of State Medical Boards designates this enduring material for a maximum of 0.25 AMA PRA Category 1 Credit(s)™. Physicians should claim only the credit commensurate with the extent of their participation in the activity. © 2016 National Board of Medical Examiners® All Rights Reserved.
Reaching new horizons in student assessments.
NBME U leverages 100+ years of expertise in creating evidence-based student assessments for health professionals. Content for each lesson was created by a team of nationally recognized experts on assessment.

In this catalog you’ll find:
• A list of every lesson offered by NBME U,
• A brief description of each lesson’s objectives, and
• A biography of each lesson’s author(s).

Each of the 28 NBME U lessons can be completed on any device in 15-20 minutes. In addition, each successfully completed lesson earns 0.25 AMA PRA Category 1 Credits™ for physicians or a certificate of participation for other healthcare professionals.

The evidence is here, at NBME U. Just select the lessons of most interest to you, and begin strengthening your assessment skills today. Enroll today at www.my.nbme.org.
Assessment Principles
How to Create a Good Score

Lesson Objectives
1. Identify the characteristics of a good score.
2. Identify the necessary steps to create a good score.
Author
Marc Gessaroli, PhD
Suggested NBME U Companion Courses
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity
Purposes and Types of Assessments: An Overview

Lesson Objectives
1. Explain the common purposes of testing and the different types of decisions that can be made from test results.
2. Define the terminology commonly used to describe different types of tests in medical education.
3. Identify the steps for developing sound assessments.
Authors
Marc Gessaroli, PhD (Editor)
Mark Raymond, PhD
Suggested NBME U Companion Courses
• How to Create a Good Score
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity
Score Reporting

Lesson Objectives
1. Apply an evidence-driven framework to design and evaluate a score report.
2. Identify the directionality, or ordering, of test design and assembly based on an evidence-driven framework (purpose → claims → evidence → content).
3. Define the terms Claim, Evidence, and Score.
4. Write unambiguous claims that can be supported by evidence.
Authors
Amanda Clauser, MSEd, EdD
Marc Gessaroli, PhD (Editor)
Howard Wainer, PhD
Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity
Test Blueprinting I: Selecting an Assessment Method

Lesson Objectives
1. Recognize the role of learning objectives in deciding what knowledge and skills to assess.
2. Classify learning objectives according to skill domain (cognitive, affective and psychomotor) and level of learning (recognition/recall or application/critical thinking).
3. Consider a wide range of assessment methods and select a method that is optimal for the skill to be assessed.
Authors
Marc Gessaroli, PhD (Editor)
Joseph Grande, MD, PhD
Mark Raymond, PhD
Kathleen Short, MALS, MS
Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity
Test Blueprinting II: Creating a Test Blueprint

Lesson Objectives
1. Explain the benefits of using a test blueprint when developing assessments.
2. List the common frameworks used for blueprinting.
3. Describe how to create a blueprint that includes two topic dimensions.
Authors
Gail Furman, MSN, CHSE, PhD
Marc Gessaroli, PhD (Editor)
Joseph Grande, MD, PhD
Mark Raymond, PhD
Kathleen Short, MALS, MS
Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Score Reliability Overview
• Validity and Threats to Validity
Test Score Reliability Overview

Lesson Objectives
1. Explain what reliability is and why it’s important.
2. Identify which measures of reliability to use with different types of assessments.
3. Identify and interpret the different measures of reliability for different assessment purposes.
4. Identify the factors that can affect the reliability of a score.
Author
Marc Gessaroli, PhD
Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Validity and Threats to Validity
Validity and Threats to Validity

Lesson Objectives
1. Describe what validity means in the context of educational tests.
2. Explain what inferences are and how they relate to validity evidence.
3. Identify different threats to the validity of test scores.
4. List approaches to mitigate threats to validity.
Authors
Richard Feinberg, PhD
Marc Gessaroli, PhD (Editor)
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
Method: Multiple-Choice Questions (MCQs)
Incorporating Graphics and Multimedia into MCQs

Lesson Objectives
1. Determine when it makes sense to use media in your exam.
2. Determine which content areas are conducive to media.
3. Explain what a media blueprint shows.
Authors
Kathy Angelucci, MS
Kieran Hussie
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
Mark Raymond, PhD
Suggested NBME U Companion Courses
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Item Analysis and Key Validation

Lesson Objectives
1. Review multiple-choice question item analysis data to identify items that require review by a content expert.
2. Identify the key components of a key validation review.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
MCQ Flaws and How to Avoid Them

Lesson Objectives
1. Recognize and avoid common technical flaws related to testwiseness and irrelevant difficulty.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Purposes, Types and Educational Uses of MCQ Examinations

Lesson Objectives
1. Describe what information is best captured by a multiple-choice question (MCQ).
2. List four types of MCQ examinations.
3. Explain the different uses for each type of test.
4. Describe the inferences that can be supported based on the type of test.
Authors
Richard Feinberg, PhD
Daniel Jurich, PhD
Carol Morrison, PhD (Editor)
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Setting Pass/Fail Standards

Lesson Objectives
1. Describe important considerations when setting pass/fail standards for multiple-choice tests.
2. Outline the process for four standard-setting methods.
Authors
Carol Morrison, PhD (Editor)
David Swanson, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Strategies for Organizing Question Writing and Review

Lesson Objectives
1. Recognize the strengths and weaknesses of different strategies for organizing question writing and review for tests.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Structuring Multiple-Choice Questions

Lesson Objectives
1. Describe the three components of a well-structured single-best-answer multiple-choice question.
2. Evaluate whether questions follow the “rules” for well-structured single-best-answer multiple-choice questions.
3. Describe three types of single-best-answer questions.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
Writing MCQs to Assess Application of Basic Science Knowledge

Lesson Objectives
1. Write clinical and experimental vignettes, lead-ins and option sets that test examinees’ application of basic science knowledge.
2. Use a standard vignette structure to write consistently structured multiple-choice question stems with focused lead-ins that pose clearly defined tasks for examinees.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Clinical Science Knowledge
Writing MCQs to Assess Application of Clinical Science Knowledge

Lesson Objectives
1. Write vignettes, lead-ins and option sets that test examinees’ application of clinical science knowledge.
2. Use worksheets and templates to write consistently structured multiple-choice question stems with focused lead-ins that pose clearly defined clinical tasks for examinees.
3. Write homogeneous option sets with distractors that are appropriate for the examinees’ stage of training.
Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
Method: Objective Structured Clinical Examinations (OSCEs)
An Introduction to the Use of Generalizability Theory in OSCEs

Lesson Objectives
1. Identify basic concepts related to Generalizability Theory.
2. Explain how reliability is conceptualized in the Generalizability Theory framework.
3. Calculate two of the reliability coefficients used in the Generalizability Theory framework (sketched briefly after this entry).
4. Understand how the number of raters and the number of cases affect the reliability of an Objective Structured Clinical Examination (OSCE).
Authors
Gail Furman, MSN, CHSE, PhD (Editor)
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
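As a preview of this lesson’s third objective, here is a minimal sketch of the two coefficients most commonly reported in a Generalizability Theory framework. These are the standard textbook definitions, not material reproduced from the lesson itself:

$$E\rho^2 = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_\delta} \qquad \Phi = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_\Delta}$$

Here $\sigma^2_p$ is the score variance attributable to examinees, $\sigma^2_\delta$ is the relative error variance (examinee-by-rater and examinee-by-case interactions, divided by the numbers of raters and cases), and $\sigma^2_\Delta$ is the absolute error variance (all non-examinee variance sources). The generalizability coefficient $E\rho^2$ suits relative decisions such as rank-ordering examinees; the dependability index $\Phi$ suits absolute decisions such as pass/fail. Adding raters or cases shrinks both error terms, which is how an OSCE design gains reliability (the focus of the lesson’s fourth objective).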
Building a Clinical Skills Center

Lesson Objectives
1. List the features that need to be considered when designing a clinical skills center for teaching and assessment in a new or refurbished space.
2. Discuss the features required in your center with builders and architects.
3. Recognize the need to plan for future upkeep and improvements.
Authors
Nancy Ambrose, MBA
Gail Furman, MSN, CHSE, PhD (Editor)
Gayle Gliva-McConvey, PhD
Jessica McBride
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
Developing Rating Scales and Checklists for OSCEs

Lesson Objectives
1. Describe the differences between rating scales and checklists and identify scenarios in which each might be used.
2. Explain the implications of scale choice on rater biases and other factors that impact scoring in an Objective Structured Clinical Examination (OSCE).
Authors
Amanda Clauser, MSEd, EdD
Gail Furman, MSN, CHSE, PhD (Editor)
Kimberly Swygert, PhD
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
Provision of Feedback by Standardized Patients

Lesson Objectives
1. Describe the types of feedback that can be given by standardized patients.
2. Develop training materials for standardized patients to enable them to give effective feedback.
3. Develop a feedback quality assurance tool for monitoring standardized patient–student encounters.
Authors
Carol Pfeiffer, PhD
Gail Furman, MSN, CHSE, PhD (Editor)
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
Quality Assurance of Standardized Patients

Lesson Objectives
1. Explain the importance of implementing a quality assurance approach for standardized patients during OSCEs.
2. Develop case-specific quality assurance monitoring checklists.
Authors
David Disbrow, MACM
Gail Furman, MSN, CHSE, PhD
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
Reality Check: Promoting Realism in Standardized Patients

Lesson Objectives
1. Identify the vocabulary used to promote realism in standardized patient performance.
2. Recognize behaviors that standardized patients can employ to promote realism.
Authors
David Disbrow, MACM
Gail Furman, MSN, CHSE, PhD (Editor)
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
Training Physicians to Rate OSCE Patient Notes

Lesson Objectives
1. Describe three tools that help physicians rate OSCE post-encounter tasks (PETs).
2. List the steps needed to train physicians to effectively rate patient notes.
3. Identify potential rater biases.
Authors
Jessica Salt, MD
Ellen Turner, MD
Gail Furman, MSN, CHSE, PhD (Editor)
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Working with Standardized Patients for Assessment
Working with Standardized Patients for Assessment

Lesson Objectives
1. Name areas where standardized/simulated patients can be integrated into the curriculum for assessment.
2. Identify advantages of using standardized/simulated patients.
3. Describe a framework for understanding learner skill levels.
4. List the resources needed to develop a standardized/simulated patient program at your institution.
Authors
Gail Furman, MSN, CHSE, PhD
Henry Pohl, MD
Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
Method: Workplace-Based Assessment
Conducting the Feedback Session

Lesson Objectives
1. Identify the characteristics of effective feedback.
2. Describe the feedback process.
3. Critique a feedback session based on the characteristics of effective feedback.
Authors
M. Brownell Anderson, MEd (Editor)
Colleen Canavan, MS
Peter Katsufrakis, MD
Margaret Richmond, MS
Suggested NBME U Companion Courses
• Educational Frameworks for Assessing Competence
• Workplace Assessment: Encounter-Based Methods
Educational Frameworks for Assessing Competence

Lesson Objectives
1. Explain the purpose of frameworks in instructional design and assessment.
2. Describe the relationship between frameworks and different definitions of competence.
3. Describe current examples of frameworks.
Authors
M. Brownell Anderson, MEd
Louis Pangaro, MD
Suggested NBME U Companion Courses
• Conducting the Feedback Session
• Workplace Assessment: Encounter-Based Methods
Workplace Assessment: Encounter-Based Methods

Lesson Objectives
1. Describe the three most popular encounter-based methods of workplace assessment.
2. Explain the factors that contribute to the quality of workplace assessment.
3. List different types of feedback and their use in the context of workplace assessment.
Authors
John Norcini, PhD
Mark Raymond, PhD (Editor)
Suggested NBME U Companion Courses
• Conducting the Feedback Session
• Educational Frameworks for Assessing Competence
Professionalism
Introduction to the Construct of Professionalism and Its Assessment

Lesson Objectives
1. Articulate your own definition of professionalism.
2. Access resources for assessing professionalism.
3. Explain how you will integrate the assessment of one aspect of professionalism in your institution.
Author
M. Brownell Anderson, MEd
Meet the Authors
Nancy Ambrose, MBA
Lesson: Building a Clinical Skills Center

Nancy Ambrose was Assistant Director of Center Operations for the Educational Commission for Foreign Medical Graduates (ECFMG) and oversaw the evaluation centers where the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills assessment was administered. Ms. Ambrose earned her MBA from the Fox School of Management at Temple University, Philadelphia, PA.

M. Brownell Anderson, MEd
Vice President, International Programs, National Board of Medical Examiners
Lessons: Conducting the Feedback Session; Educational Frameworks for Assessing Competence; Introduction to the Construct of Professionalism and Its Assessment

M. Brownell “Brownie” Anderson is Vice President of International Programs with NBME, where she works to create a culture of assessment by aligning outcomes to student assessments with medical schools around the world. She is currently helping medical schools and organizations in Brazil, China and Kazakhstan to build a stronger culture of assessment. A frequent author, educator and speaker around the world, Ms. Anderson is the editor of Really Good Stuff, a biannual collection of medical education innovation reports. She was a member of the Arabian Gulf University faculty, Manama, Bahrain, for several years and has worked with the Foundation for Advancement of International Medical Education and Research. Ms. Anderson received her degrees from Washington University in St. Louis, St. Louis, MO, and the University of Illinois at Chicago and was employed at the Southern Illinois University School of Medicine, Springfield, IL, from 1978 to 1983.

Kathy Angelucci, MS
Managing Editor, International Programs, American Board of Medical Specialties
Lesson: Incorporating Graphics and Multimedia into MCQs

Kathy Angelucci is currently the Managing Editor, International Programs, at the American Board of Medical Specialties (ABMS), where she oversees the examination development process for the Singapore program, ensuring that all examinations meet quality standards and that participating specialty item banks contain high-quality test questions. Prior to joining ABMS, Ms. Angelucci worked at NBME for more than 20 years in several test development roles, most recently as Manager of Developmental Projects and System Enhancements. She has in-depth expertise and technical skills in examination development and publication, media development and acquisition, and item banking and web-based testing. Ms. Angelucci received a BA in English from La Salle University and an MS in Technical and Science Communication from Drexel University, Philadelphia, PA.
Colleen Canavan, MS
Lesson: Conducting the Feedback Session

Colleen Canavan contributed to the research and development of multiple observational assessment programs during her tenure at NBME. As part of this work, she delivered training on professionalism assessment and feedback to physicians across the US. Ms. Canavan holds a BA in Sociology from Vassar College, Poughkeepsie, NY, and an MS in Library and Information Science from Drexel University, Philadelphia, PA.

Amanda Clauser, MSEd, EdD
Psychometrician, National Board of Medical Examiners
Lessons: Score Reporting; Developing Rating Scales and Checklists for OSCEs

Amanda Clauser is a Psychometrician with NBME, where she specializes in test design, equating, score report development and other operational testing issues. Her research interests include evidence-centered design, applied generalizability theory and performance assessment. She earned an MSEd from the University of Pennsylvania, Philadelphia, and an EdD in Educational Measurement and Psychometrics from the University of Massachusetts.

David Disbrow, MACM
Center for Innovation Officer, National Board of Medical Examiners
Lessons: Quality Assurance of Standardized Patients; Reality Check: Promoting Realism in Standardized Patients

David Disbrow runs NBME’s Center for Innovation, which serves as an incubator for disruptive ideas and concepts that may extend the boundaries of NBME strategic principle areas. He started his career with the ECFMG in 1998, and he helped to train standardized patient trainers and patients at the Clinical Skills Evaluation Collaboration’s six exam centers across the country. He joined NBME in 2012 to help create the Educational Design and Development Program with Gail Furman. Mr. Disbrow holds an MACM degree from the Keck School of Medicine, University of Southern California, Los Angeles.
Richard Feinberg, PhD
Senior Psychometrician, National Board of Medical Examiners
Lessons: Validity and Threats to Validity; Purposes, Types and Educational Uses of MCQ Examinations

Richard Feinberg is a Senior Psychometrician with NBME, where he leads and oversees the data analysis and score reporting activities for large-scale, high-stakes licensure and credentialing examinations. He is also an Assistant Professor at the Philadelphia College of Osteopathic Medicine, Philadelphia, PA, where he teaches a course on Research Methods and Statistics. His research interests include psychometric applications in the fields of educational and psychological testing. He earned a PhD in Research Methodology and Evaluation from the University of Delaware, Newark, DE.

Gail Furman, MSN, CHSE, PhD
Director of Educational Design and Development, Clinical Skills Evaluation Collaboration, National Board of Medical Examiners
Lessons: Test Blueprinting II: Creating a Test Blueprint; An Introduction to the Use of Generalizability Theory in OSCEs; Building a Clinical Skills Center; Developing Rating Scales and Checklists for OSCEs; Provision of Feedback by Standardized Patients; Quality Assurance of Standardized Patients; Reality Check: Promoting Realism in Standardized Patients; Training Physicians to Rate OSCE Patient Notes; Working with Standardized Patients for Assessment

Gail Furman is Director of Educational Design and Development for USMLE’s Step 2 Clinical Skills examination, which uses standardized patients to evaluate the clinical skills of those seeking medical licensure in the US. Dr. Furman has more than 20 years of experience in medical education as a professor and director of a clinical skills center, working with standardized patients and designing Objective Structured Clinical Examinations (OSCEs). She earned a PhD from Saint Louis University, St. Louis, MO.

Marc Gessaroli, PhD
Principal Measurement Scientist, National Board of Medical Examiners
Lessons: How to Create a Good Score; Purposes and Types of Assessments: An Overview; Score Reporting; Test Blueprinting I: Selecting an Assessment Method; Test Blueprinting II: Creating a Test Blueprint; Test Score Reliability Overview; Validity and Threats to Validity

Marc Gessaroli is Principal Measurement Scientist at NBME, where his research focuses on the use of multidimensional models to address psychometric issues in testing. Dr. Gessaroli received a PhD in Educational Measurement and Applied Statistics at the Ontario Institute for Studies in Education at the University of Toronto. Before joining NBME, he was a faculty member at the University of Ottawa for 10 years, where he taught graduate courses in educational measurement, applied statistics and psychometrics.

Gayle Gliva-McConvey, PhD
Director, Center for Simulation & Immersive Learning, Eastern Virginia Medical School
Lesson: Building a Clinical Skills Center

Gayle Gliva-McConvey is Director of the Center for Simulation & Immersive Learning and the Professional Skills Teaching and Assessment Center at Eastern Virginia Medical School, Norfolk, VA.

Joseph Grande, MD, PhD
Course Director, Pathology and Cell Biology, Mayo Medical School
Lessons: Test Blueprinting I: Selecting an Assessment Method; Test Blueprinting II: Creating a Test Blueprint

Joseph Grande has served as director of the Pathology and Cell Biology course taught to first-year students at Mayo Medical School, Rochester, MN, for more than 25 years. His previous roles include Associate Dean for Academic Affairs at Mayo Medical School (2007-2013) and Chair of the Step 1 Committee for NBME (2008-2011). Dr. Grande currently serves on the Executive Board of NBME. He is a reviewer for many education-related journals and is currently on the editorial board of Biochemistry and Molecular Biology Education. He also chairs a study section for the National Institutes of Health that reviews educational conference grant applications.
Kathy Holtzman
Director of Assessment and International Operations, American Board of Medical Specialties
Lessons: Item Analysis and Key Validation; MCQ Flaws and How to Avoid Them; Strategies for Organizing Question Writing and Review; Structuring Multiple-Choice Questions; Writing MCQs to Assess Application of Basic Science Knowledge; Writing MCQs to Assess Application of Clinical Science Knowledge

Kathy Holtzman is Director of Assessment and International Operations for the ABMS, where she provides leadership and project management for international programs and examinations. Prior to joining ABMS, Ms. Holtzman worked for 35 years at NBME, most recently as Assistant Vice President in the Assessment Programs unit. She has extensive experience with assessment of medical decision-making skills with multiple-choice tests and simulation formats; methods for development and review of test material; design and introduction of computer-based and web-based tests; and development of new assessment formats, some utilizing multimedia. In addition, she has conducted item-writing workshops at medical schools, specialty boards and professional conferences nationally and internationally. Ms. Holtzman holds a bachelor’s degree from Tennessee Technological University, Cookeville.

Kieran Hussie
Manager of Multimedia Services, National Board of Medical Examiners
Lesson: Incorporating Graphics and Multimedia into MCQs

Kieran Hussie is the Manager of Multimedia Services for Test Development with NBME, where he has managed the development, acquisition, processing and standardization of multimedia formats and applications for use in NBME-produced assessments. After working for the Warner Brothers Network television station in Philadelphia, he joined the testing industry with Assessment Systems and Promissor, which became a part of Pearson. Mr. Hussie earned a BA in Communications from Temple University, Philadelphia, PA.

Daniel Jurich, PhD
Psychometrician, National Board of Medical Examiners
Lesson: Purposes, Types and Educational Uses of MCQ Examinations

Daniel Jurich is a Psychometrician with NBME, where he manages the psychometric activities for various licensure and in-training examinations. His primary research interests include improving the diagnostic utility of assessments to aid in tailoring remediation, and data forensic techniques to examine test security. He received a PhD in Assessment and Measurement from James Madison University, Harrisonburg, VA.
Peter J. Katsufrakis, MD, MBA
Family Physician; President and CEO, National Board of Medical Examiners
Lesson: Conducting the Feedback Session

Peter Katsufrakis is a board-certified family physician and Senior Vice President for Assessment Programs with NBME. His responsibilities at NBME include oversight of the Medical School Services and Medical Education and Health Profession Services programs, International Programs, the Post-Licensure Assessment Service, and the United States Medical Licensing Examination program. He is a past Associate Dean for Student Affairs at the Keck School of Medicine, University of Southern California. Dr. Katsufrakis received a BA from the University of California, Berkeley, an MBA from the University of Southern California, Los Angeles, and an MD from the University of California, San Diego. He served his internship and residency in family practice at Santa Monica Hospital, and is a Diplomate of the American Board of Family Medicine.

Jessica McBride
Operations Director, Clinical Skills Evaluation Collaboration, Educational Commission for Foreign Medical Graduates
Lesson: Building a Clinical Skills Center

Jessica McBride is an Operations Director with NBME and has more than 10 years of experience enhancing operational capabilities, building and leading top-performing teams and resolving ongoing issues. Ms. McBride is responsible for exam session scheduling at all sites. She oversees facility enhancement and development and maintenance issues (including AV, software and equipment), and acts as project manager for special projects. She holds a BA in psychology, is a certified PMP and is completing CPBPM certification at Villanova University, Villanova, PA.
Carol Morrison, PhD
Manager, Psychometrics, National Board of Medical Examiners
Lessons: Incorporating Graphics and Multimedia into MCQs; Item Analysis and Key Validation; MCQ Flaws and How to Avoid Them; Purposes, Types and Educational Uses of MCQ Examinations; Setting Pass/Fail Standards; Strategies for Organizing Question Writing and Review; Structuring Multiple-Choice Questions; Writing MCQs to Assess Application of Basic Science Knowledge; Writing MCQs to Assess Application of Clinical Science Knowledge

Carol Morrison is Manager of Psychometrics with NBME. Dr. Morrison supervises the scoring, equating, standard setting and other psychometric analyses for Step 1, Step 2 CK and Step 3 of the USMLE and other certification, in-training and self-assessment examinations. Dr. Morrison is an active member of several professional organizations, including the American Educational Research Association and the National Council on Measurement in Education. She earned a PhD in Educational Psychology (Quantitative Methods) from the University of Texas.

John J. Norcini, PhD
President and CEO, Foundation for Advancement of International Medical Education and Research
Lesson: Workplace Assessment: Encounter-Based Methods

John J. Norcini has been the President and CEO of the Foundation for Advancement of International Medical Education and Research (FAIMER®) since its inception in 2000. Before that, he held a number of senior positions at the American Board of Internal Medicine. His principal academic interest is in assessment. He has published extensively, has lectured and taught in many countries and is on the editorial boards of several peer-reviewed journals in health professions education. He is an honorary Fellow of the Royal College of General Practitioners and the Academy of Medical Educators, and has received numerous awards, including the Karolinska Institutet Prize for Research in Medical Education, Stockholm, Sweden.
Louis N. Pangaro, MD, MACP
Professor and Chair, Department of Medicine, Uniformed Services University of the Health Sciences
Lesson: Educational Frameworks for Assessing Competence

Louis Pangaro is a graduate of Georgetown University Medical School with subsequent training in endocrinology and metabolism. As a recognized expert in quantitative and descriptive evaluation of students’ progress, Dr. Pangaro created “standardized examinees” to calibrate the validity of the prototype clinical skills examination of the USMLE. He also created a “synthetic” framework, or RIME scheme (reporter-interpreter-manager-educator), for defining expectations of students and residents. He co-directs the annual Systems Approach to Assessment in Health Professions Education program at the Harvard Macy Institute, Boston, MA.

Miguel A. Paniagua, MD, FACP
Medical Advisor for Test Development Services, National Board of Medical Examiners
Lessons: Incorporating Graphics and Multimedia into MCQs; Item Analysis and Key Validation; MCQ Flaws and How to Avoid Them; Strategies for Organizing Question Writing and Review; Structuring Multiple-Choice Questions; Writing MCQs to Assess Application of Basic Science Knowledge; Writing MCQs to Assess Application of Clinical Science Knowledge

Miguel A. Paniagua currently serves as Medical Advisor for Test Development Services at NBME. His work at NBME includes the development of assessments of procedural skills, communication skills, interprofessional teamwork, and other innovations in computer-based examinations. He has served on multiple item writing and reviewing committees at NBME over the past 10 years, and has served as a representative member of the National Board from 2011 to 2014 and on the Executive Board from 2013 to 2014. Dr. Paniagua practices consultative hospice and palliative medicine at the Hospital of the University of Pennsylvania and holds adjunct appointments to the faculties of the Saint Louis University School of Medicine, St. Louis, MO, and the Perelman School of Medicine at the University of Pennsylvania, Philadelphia. Dr. Paniagua received an undergraduate degree from Saint Louis University and an MD at the University of Illinois College of Medicine, Chicago. Dr. Paniagua completed his internal medicine residency and fellowship in gerontology and geriatric medicine at the University of Washington, Seattle.
Carol Pfeiffer, PhD
Professor Emeritus, University of Connecticut School of Medicine
Lesson: Provision of Feedback by Standardized Patients

Carol Pfeiffer is Professor Emeritus at the University of Connecticut School of Medicine, Farmington, CT, where she had a 35-year career as a medical educator. She was a founding member of the Association of Standardized Patient Educators and received awards as an outstanding educator from both that organization and the University of Connecticut. She was a member of the Prototype Committee in the initial phases of the development of NBME’s Step 2 CS. She holds a PhD in Sociology from Washington University in St. Louis, St. Louis, MO, and her focus in medical education has been on communication skills.

Mark Raymond, PhD
Research Director and Principal Assessment Scientist, National Board of Medical Examiners
Lessons: Purposes and Types of Assessments: An Overview; Test Blueprinting I: Selecting an Assessment Method; Test Blueprinting II: Creating a Test Blueprint; Incorporating Graphics and Multimedia into MCQs; Workplace Assessment: Encounter-Based Methods

Mark Raymond is a Research Director and Principal Assessment Scientist with NBME, where he conducts and coordinates research on assessment and promotes scholarly interactions with external organizations. Over the past 25 years, Dr. Raymond has consulted with licensing agencies, professional associations and universities on test development and psychometrics. His scholarly interests include job analysis and test blueprinting, generalizability theory and performance-based assessments. He serves on the editorial boards of several journals in health care and testing, and recently relinquished his role as Associate Editor of Applied Measurement in Education. He is also the co-editor of The Handbook of Test Development, published by Routledge in 2015. He earned a PhD in Educational Psychology from Pennsylvania State University, State College.
Margaret Richmond, MSc
Process Expert, National Board of Medical Examiners
Lesson: Conducting the Feedback Session

Margaret Richmond is a Process Expert with the Strategic Planning and Institutional Effectiveness team at NBME. In this role, she manages and facilitates cross-functional project teams to implement and improve internal processes, focusing on new product development and marketing processes to enhance the effectiveness of the NBME. She previously worked at the Center for Innovation to diversify the products and services offered by the NBME, including the Assessment of Professional Behaviors (APB) program. Ms. Richmond earned an MSc in Information Science from the University of Michigan, Ann Arbor, MI.

Jessica Salt, MD, MBE
Medical Director and Patient Note Director, Clinical Skills Evaluation Collaboration, Educational Commission for Foreign Medical Graduates
Lesson: Training Physicians to Rate OSCE Patient Notes

Jessica Salt is the Medical Director and Director of the Patient Note Program for the Clinical Skills Evaluation Collaboration (CSEC), where she is responsible for recruitment, training and quality assurance of Patient Note Raters for the USMLE Step 2 CS exam. Prior to joining CSEC in 2012, Dr. Salt was Associate Program Director for the Internal Medicine Residency and Director of the Internal Medicine Clerkship at Jefferson University Hospital in Philadelphia. She is a board-certified internist and currently practices as a General Hospitalist at Lankenau Medical Center in Wynnewood, PA. Dr. Salt received an MBE in Bioethics from the University of Pennsylvania, Philadelphia, and an MD from Virginia Commonwealth University, Richmond. She completed her internal medicine residency at Brown University, Providence, RI.
Kathleen Short, MALS, MS
Program Officer, National Board of Medical Examiners
Lessons: Test Blueprinting I: Selecting an Assessment Method; Test Blueprinting II: Creating a Test Blueprint

Kathleen Short is a Program Officer with NBME. With a wide range of assessment experience, she leads a team that works on the USMLE, board certifications, self-assessments and educational tools, including NBME’s Customized Assessment Service. Her research interests include investigations into innovative assessments, including those using multimedia simulations in conjunction with multiple-choice questions. Ms. Short earned her MALS from the University of Pennsylvania, where she focused in Ethnographic Research, and her MS in Statistics, Measurement, Assessment and Research Technology from the University of Pennsylvania’s Graduate School of Education, Philadelphia.

David B. Swanson, PhD
Vice President, American Board of Medical Specialties
Lessons: Item Analysis and Key Validation; MCQ Flaws and How to Avoid Them; Setting Pass/Fail Standards; Strategies for Organizing Question Writing and Review; Structuring Multiple-Choice Questions; Writing MCQs to Assess Application of Basic Science Knowledge; Writing MCQs to Assess Application of Clinical Science Knowledge

David B. Swanson is currently a Vice President at the ABMS, where he works with committees to manage the certification and recertification processes of ABMS Member Boards. He previously worked with NBME, where he wrote about, spoke at and hosted seminars around the world on various topics related to student assessment. He also wrote Constructing Written Test Questions for the Basic and Clinical Sciences with Dr. Susan Case. He earned his PhD in Psychology from the University of Minnesota. In 2011, he received the Richard Farrow Gold Medal for Outstanding Contributions to Medical Education from the UK Association for the Study of Medical Education. He also holds an honorary professorial appointment in the University of Melbourne Medical School, Victoria, Australia.
Kimberly Swygert, PhD
Director, Research and Development in Test Development, National Board of Medical Examiners
Lessons: Validity and Threats to Validity; Item Analysis and Key Validation; MCQ Flaws and How to Avoid Them; Structuring Multiple-Choice Questions; An Introduction to the Use of Generalizability Theory in OSCEs; Developing Rating Scales and Checklists for OSCEs

Kimberly Swygert is Director of Research and Development in Test Development at NBME. Her work on performance assessments, examinee timing and pacing, examinee repeater behavior and score reporting has been presented at conferences and published in journals such as Academic Medicine, Advances in Health Sciences Education, the Journal of General Internal Medicine and the Journal of Educational Measurement. In addition to her work at NBME, Dr. Swygert has taught graduate courses in biostatistics and psychometrics at Drexel University, Philadelphia, PA, and the Uniformed Services University of the Health Sciences, Bethesda, MD, where she is an external advisory board member. She earned a PhD in Quantitative Psychology from the University of North Carolina, Chapel Hill.

Ellen Turner, MD
Assistant Director, Patient Note Program, Clinical Skills Evaluation Collaboration, Educational Commission for Foreign Medical Graduates
Lesson: Training Physicians to Rate OSCE Patient Notes

Ellen Turner serves as Assistant Director of the Patient Note Program at CSEC in Philadelphia, PA. Dr. Turner is a board-certified infectious diseases physician with more than 10 years of clinical and teaching experience. She is an adjunct faculty member at Drexel University College of Medicine, Philadelphia, PA. After receiving her medical degree from Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, Dr. Turner completed an internal medicine residency and infectious diseases fellowship at Temple University Hospital in Philadelphia. In addition to her experience as a clinical educator, she was also a Patient Note Rater at the ECFMG prior to joining the CSEC staff.
Howard Wainer, PhD
Distinguished Research Scientist, National Board of Medical Examiners
Lesson: Score Reporting

Howard Wainer is a Distinguished Research Scientist with NBME, where he acts as senior statistical/psychometric advisor and writes books. As a research scientist and former professor, he has published more than 400 articles and book chapters. His 20th book, Medical Illuminations: Using Evidence, Visualization & Statistical Thinking to Improve Healthcare, was published by Oxford University Press in 2014 and was a finalist for the Royal Society Winton Book Prize. His latest book, Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think like a Data Scientist, was published by Cambridge University Press in 2015. He is a Fellow of the American Statistical Association and the American Educational Research Association and has received several awards for his work in the industry. Dr. Wainer earned a PhD in Psychometrics from Princeton University, Princeton, NJ.
“You are most likely a faculty member because of your content and your expertise, not because of your educational skills. We have always assumed that people can teach if they understand the content. And I think that has changed because there is a greater recognition that education requires some skills unto themselves and not just the content area.”

Larry Gruppen, PhD
Professor, Master of Health Professions Education
University of Michigan
Reaching new horizons in student assessments.
National Board of Medical Examiners
NBME℠
YOUR PARTNER IN EVIDENCE-BASED ASSESSMENT