Practical Guide to the Evaluation of Clinical Competence E-Book

Eric S. Holmboe | Steven James Durning | Richard E. Hawkins

(2017)


Abstract

Designed to help medical educators implement better assessment methods, tools, and models directly into training programs, Practical Guide to the Evaluation of Clinical Competence, 2nd Edition, by Drs. Eric S. Holmboe, Steven J. Durning, and Richard E. Hawkins, is a hands-on, authoritative guide to outcomes-based assessment in clinical education. National and international experts present an organized, multifaceted approach and a diverse combination of methods to help you perform effective assessments. This thoroughly revised edition is a valuable resource for developing, implementing, and sustaining effective systems for evaluating clinical competence in medical school, residency, and fellowship programs.

  • Each chapter provides practical suggestions and assessment models that can be implemented directly into training programs, tools that can be used to measure clinical performance, overviews of key educational theories, and strengths and weaknesses of every method.
  • Guidelines that apply across the medical education spectrum allow you to implement the book’s methods in any educational situation.
  • New chapters cover high-quality assessment of clinical reasoning, assessment of procedural competence, and practical approaches to feedback.
  • Reorganized for ease of use, with expanded coverage of Milestones/Entrustable Professional Activities (EPAs), cognitive assessment techniques, work-based procedural assessments, and frameworks.
  • The expert editorial team, renowned leaders in assessment, is joined by Dr. Steven Durning, a global leader in medical education and clinical reasoning.

Table of Contents

Practical Guide to the Evaluation of Clinical Competence i
Practical Guide to the Evaluation of Clinical Competence iii
Copyright iv
Preface v
Contributors vii
Acknowledgments ix
Dedication ix
Contents xi
Video Contents xii
1 - Assessment Challenges in the Era of Outcomes-Based Education 1
The Rise of Competency-Based Medical Education 1
Outcomes and Competency-Based Medical Education 2
A Brief History of Assessment 3
Drivers of Change in Assessment 3
Accountability and Quality Assurance 4
Quality Improvement Movement 4
Technology 4
Psychometrics 4
Qualitative Assessment and Group Process 5
Framework for Assessment 5
Dimension 1: Competencies 5
Dimension 2: Levels of Assessment 6
Miller’s Pyramid 6
The Cambridge Model 6
Dimension 3: Assessment of Progression 7
Criteria for Choosing a Method 7
Elements of Effective Faculty Development 8
Overview of Assessment Methods 9
Traditional Measures 9
Methods Based on Observation 9
Simulation 9
Work 9
New Directions in Assessment 10
Milestones 10
Entrustable Professional Activities 11
Combining Milestones and Entrustable Professional Activities 12
Entrustable Professional Activities – Competencies – Skills 13
Entrustment Decision Making as Assessment 14
Systems of Assessment (See Chapter 16.) 15
Conclusion 16
Acknowledgment 16
References 16
1.1 - Developing an Entrustable Professional Activity 19
1.2 - Entrustable Professional Activities, Competencies, and Milestones: Pulling It All Together 21
2 - Issues of Validity and Reliability for Assessments in Medical Education 22
Historical Context 22
Kane’s View of Validity 25
Scoring 26
Example I: A Multiple-Choice Examination 26
Example II: Performance Assessment 26
Example III: Workplace-Based Assessment 27
Generalization 27
Generalizability Theory 29
Example I: A Multiple-Choice Examination 29
Example II: Performance Assessment 30
Example III: Workplace-Based Assessment 31
Extrapolation 32
Example I: A Multiple-Choice Examination 32
Example II: Performance Assessment 33
Example III: Workplace-Based Assessment 33
Decision/Interpretation 33
Example I: A Multiple-Choice Examination 34
Example II: Performance Assessment 34
Example III: Workplace-Based Assessment 34
Conclusion 34
Annotated Bibliography 35
References 35
Annotated Bibliography 35.e1
3 - Evaluation Frameworks, Forms, and Global Rating Scales 37
Introduction 37
Evaluation Forms and Frameworks 38
Analytic Frameworks 39
Developmental Frameworks 39
A Synthetic Model 40
Achieving Construct Alignment Through Simplicity 42
Descriptive Terminology for Evaluation 42
Complementary Frameworks – ACGME General Competencies and RIME 44
Frameworks: Concluding Thoughts 44
Rating Scales 44
Rating Scales: Basic Design 44
Purposes and Advantages of Evaluation Forms 46
Written Assessment 49
Evaluation Sessions 50
Psychometric Issues 50
Reliability 51
Validity 51
Rating Errors 52
Rater Accuracy 53
Faculty Development and Evaluation Forms 54
Performance Dimension Training and RIME 54
Conclusions 54
Annotated Bibliography 55
References 55
3.1 - The RIME Evaluation Framework: A Vocabulary of Professional Progress 57
Reporter 57
Interpreter 58
Manager 58
Educator 58
4 - Direct Observation 61
Introduction 61
Direct Observation as Workplace-Based Assessment 62
Reasons for Direct Observation 63
Importance of and State of Core Clinical Skills 63
Direct Observation as an Educational Tool for Deliberate Practice and Coaching 63
Direct Observation as an Assessment Method in Competency-Based Medical Education 64
Direct Observation as a Method to Guide Supervision 64
Overview 65
Increasing Faculty Buy-In for Direct Observation 65
Interactive Activities to Increase Buy-In for Direct Observation 65
Finding Time for Direct Observation 66
Interactive Activity to Identify Observation Snapshots 67
Preparing for and Performing the Observation 67
Interactive Activity for Better Preparation 68
Assessment Tools for Direct Observation 68
Assessment Tool Format 69
Global Ratings Versus Checklists 69
Scale Anchors 69
Overview 73
Reliability and Validity Concerns 73
Overview of Faculty Development Approaches to Improve Assessment Quality 75
Motivating Faculty to Participate in Rater Training 75
Performance Dimension Training 76
Frame-of-Reference Training 77
Practicing Skills in Direct Observation and Feedback 78
Opportunities for Additional Practice 79
Creating a System for Direct Observation at the Program Level 79
Timing and Purpose of Direct Observation 79
Assigning Responsibility for Direct Observation 79
Tracking Observations 80
Creating a Culture and System That Support High-Quality, Frequent Direct Observation 80
Key Messages About Faculty Development and Implementation 81
Annotated Bibliography 81
References 81
Sampling 85
Outpatient Setting Snapshots 85
Inpatient Setting Snapshots 85
Two (or Even Three) Birds With One Stone 85
Create a Simple System for Tracking 86
Other Tips for Direct Observation 86
4.2 - Examples of Rater Training Workshops 86
4.3 - Faculty Guide to Training Videos 88.e1
Medical Interviewing Tapes (Videos 4.1 to 4.3) 88.e1
5 - Direct Observation: Standardized Patients 91
Introduction 91
Components of a Typical Standardized Patient Encounter 92
Introduction to the Encounter 92
Standardized Patient Encounter 92
Recording or Scoring of the Standardized Patient Encounter 92
Postencounter Activities (Interstation Exercises) 92
Assessment 93
Psychometrics of Standardized Patient Assessment 93
Scoring of Standardized Patient Assessments 94
Checklists and Rating Scales 94
Training the Assessors 95
Score-Equating Strategies 96
Quality Assurance 96
Standard Setting 97
Identifying Threats to Validity 97
Development of Standardized Patient–Based Examinations 97
Examination Purpose 98
Examination Content 99
Case Development and Standardized Patient Training 101
Standardized Patient–Based Methods for Assessing Educational Outcomes 102
Assessment for Learning 102
Assessment of Learning 103
The Use of Unannounced Standardized Patients for Assessing Patient Care 104
Strengths and Limitations of Standardized Patient–Based Methods for Education and Assessment 105
New Developments and Future Directions 107
Scoring 107
Teamwork/Interprofessional Skills 107
Multifaceted Simulation 107
Residency Training, Certification, and Maintenance of Certification 107
Annotated Bibliography 108
References 108
Annotated Bibliography 108.e1
6 - Using Written Examinations to Assess Medical Knowledge and Its Application 113
Introduction 113
Roles for Assessment Before, During, and After Clinical Instruction 113
Assessment of Learning Before, During, and After Instruction 114
Test-Enhanced Learning and Use of Repeated, Spaced Examinations to Promote Retention 114
Programmatic Assessment and Assessment for Learning 116
Methods for Assessment of Knowledge With Written Examinations 117
Response Formats 117
Stimulus Formats 117
Selection of Stimulus and Response Formats 117
Reliability of Scores and Validity of Score Interpretations on Written Assessments 118
Reliability of Test Scores 119
Validity of Score Interpretations 120
Use of Written Examinations Within Educational Programs 121
Locally Developed Examinations 121
National Standardized Examinations 123
USMLE and NBME Subject Examinations 123
Assessment of Individual Students 124
Evaluation of Educational Programs 125
Selection of Residents 126
In-Training Examinations 126
Use of In-Training Examinations in Predicting Certifying Examination Performance 127
Comparison of In-Training Examination Results With Other Assessment Methods 128
Improving In-Training Examination Scores 128
Advantages of Written Examinations as Assessment Tools 131
Disadvantages of Written Examinations as Assessment Tools 131
Conclusions 131
Annotated Bibliography 132
References 132
Annotated Bibliography 132.e1
7 - Assessing Clinical Reasoning in the Workplace 140
Introduction 140
Background 140
Definition and a Theoretical Framework 141
“Expert” Assessments 142
SNAPPS 143
One-Minute Preceptor and IDEA 144
Direct Observation 144
Chart-Stimulated Recall 144
Work-Based Related Assessments 145
Objective Structured Clinical Examination and High-Fidelity Simulations 145
Emerging Strategies in Clinical Reasoning Assessment 145
Concept Mapping 146
Script Concordance Test 146
Self-Regulated Learning 146
Team-Based Diagnosis 146
Audio and Video Review of Diagnostic Reasoning 146
Conclusions 149
Annotated Bibliography 149
References 149
Annotated Bibliography 149.e1
7.1 - Lasater Clinical Judgment Rubric 151
8 - Workplace-Based Assessment of Procedural Skills 155
Purpose 155
Introduction to Structured Assessment Tools for Procedural Skills 155
“Validity” Lies in the Process of Assessment, Not in the Instrument Itself 156
Simplifying the Assessment Tools: Construct-Aligned Scales 157
Dimensions of Procedural Competence 158
Useful Tools for Assessing Procedural Skills in All Specialties 159
Practical Issues in the Design and Selection of Assessment Instruments for Procedural Skills 160
Conclusions 161
Take-Home Messages 161
Annotated Bibliography 161
References 161
Annotated Bibliography 161.e1
8.1 - Issues Concerning the Broader Context of Assessment 163
The Importance of Variance 163
Assessment in the Broader Context of Competency-Based Medical Education 163
Other Resources 164
9 - Evaluating Evidence-Based Practice 165
Introduction 165
General Issues in Evaluation in Medical Education 166
EBP Evaluation Domains 167
EBP Evaluation Instruments 168
Evaluating EBP Knowledge and Skills 169
Instruments With Multiple Types of Evidence for Validity, Including Discriminative Validity 169
Instruments With “Strong Evidence” for Responsive Validity 169
Additional EBP Knowledge and Skill Instruments 169
EBP Evaluation Objective Structured Clinical Examinations 169
Critically Appraised Topic 173
Evaluating Ask: Articulating Clinical Questions 173
Evaluating Acquire: Searching for Evidence 173
Evaluating Apply: Applying Evidence to Decision Making 173
Evaluating EBP Attitudes and Learning Climate 173
Evaluating EBP Behaviors (Performance) 174
Evaluating the Performance of EBP Steps in Practice 174
Evaluating the Performance of Evidence-Based Clinical Maneuvers and Affecting Patient Outcomes 175
Which Level of EBP Behaviors Should We Measure? 175
Recommendations 176
Annotated Bibliography 176
References 176
Annotated Bibliography 176.e1
9.1 - Internet EBP Education Resources 181
9.2 - Examples of Educational Prescriptions 182
10 - Clinical Practice Review 184
Background 184
A Systems and Quality Primer 185
What Is a System? 185
Components of a Clinical Microsystem 186
Population of Patients with Need 186
Clinical Processes 186
Outcomes of Care—Patient’s Needs Met 187
Supporting Processes 187
Supplier Microsystems 188
Systems and Adaptation 188
Clinical Practice Review to Assess Quality and Safety of Care 189
A Primer on Quality (Performance) Measures 189
Sources of Data for Practice Review 189
Paper-Based Medical Records 190
The Electronic Medical Record 190
Claims Data 190
Laboratory and Other Clinical Databases 191
Registries 191
The Review Process 191
Advantages of Clinical Practice Review 192
Availability 192
Feedback 192
Changing Clinical Behavior 192
Practicality 193
Evaluation of Clinical Reasoning 193
Reliability and Validity 193
Learning and Assessing by Doing 193
Self-Assessment and Reflection 195
Potential Disadvantages of Clinical Practice Reviews 195
Quality of the Documentation 196
Process Versus Outcomes 196
Assessment of Clinical Judgment 197
Time and Quantity of Review 197
Cost 197
Faculty Development 198
Unannounced Standardized Patients 198
Clinical Vignettes 199
Summary 199
Annotated Bibliography 199
References 199
Annotated Bibliography 199.e1
10.1 - List of Useful Resources for Quality Improvement and Patient Safety 203
10.2 - Sample Medical Record Abstraction Form for Diabetes 203.e1
11 - Multisource Feedback 204
Introduction 204
Use of the Tool for Learning 205
Getting Started 206
Validity and Reliability 207
Strengths and Weaknesses of Multisource Feedback as a Tool for Assessment 210
Summary 212
Annotated Bibliography 213
References 213
Annotated Bibliography 213.e1
Useful Websites 213.e2
12 - Simulation-Based Assessment 215
What Are Medical Simulations and Why Use Them? 215
Psychometric Properties and Related Considerations 216
Reliability 217
Validity 217
Fidelity 218
Feasibility 219
Scoring and Rating Instruments 219
Strengths and Best Applications 220
Weaknesses and Challenges 223
Available Technologies 224
Part Task Trainers 225
Computer-Enhanced Mannequin Simulators 227
Virtual Reality Simulators 229
Practical Suggestions for Use Now and Future Directions 234
Conclusion 237
Acknowledgment 238
Conflict of Interest Disclosure 238
Annotated Bibliography 238
References 238
Annotated Bibliography 238.e1
12.1 - List of Simulators and Their Characteristics 248
13 - Feedback and Coaching in Clinical Teaching and Learning 256
Setting the Stage 256
Provide a Framework That Positions Feedback Within the Central Activities of Teaching, Learning, and Assessing 257
Factors That Currently Interfere With Sharing Effective Performance Data and Engaging in Feedback Conversations 259
Programmatic Assessment 263
Competency-Based Medical Education 264
Coaching and Cocreation of Development Plans 264
R2C2 Evidence-Based Feedback Model 265
Encouraging Feedback Seeking 266
Taking Positive Steps to Change the Feedback Culture 267
Practical Exercises: Putting It All Into Practice 267
Exercise 2: Individual or Role-Play Practice Giving Feedback in a Specific Scenario 267
Exercise 3: Creating a Positive Culture of Feedback: Coaching 268
Acknowledgments 268
Annotated Bibliography 268
References 268
Annotated Bibliography 268.e1
14 - Portfolios 270
Background 270
Strengths of the Portfolio Process 271
Purpose of the Portfolio 272
Construction of a Portfolio 273
Challenges in the Assessment of Portfolios 273
Portfolio Content 275
Response Process 275
Internal Structure 275
Relationship to Other Variables 276
Consequences 276
Kane’s Framework 276
Implementation 276
Decision and Evaluator Consistency 276
Relationship of Portfolio Assessment to Other Measures 278
Oral Presentation of Portfolio 278
Reflection by Learners About Portfolio Evidence 278
The “Comprehensive Portfolio” 279
Characteristics of a Comprehensive Portfolio 279
The Learner’s Contribution 282
Dedication 285
Annotated Bibliography 285
References 285
Annotated Bibliography 285.e1
15 - The Learner With a Problem or the Problem Learner? Working With Dyscompetent Learners 288
Background: Setting the Stage and Definitions 288
Scope of the Problem: Dyscompetent Learners 290
Problem Identification 291
Problem Investigation and Classification 293
Problem Definition and Confirmation 294
Secondary Causes and Contributing Factors 295
Burnout 295
Impairment 296
Determining an Appropriate Intervention 296
Assessment of the Intervention 298
Professionalism 298
Legal Principles 298
Legal Issues: General Guidelines 300
Challenges for the Future 300
Annotated Bibliography 301
References 301
Annotated Bibliography 301.e1
16 - Program Evaluation 303
Introduction 303
Evaluation Purposes 303
Overview of Evaluation Models 304
Program Evaluation Models 305
Goal and Measure 305
Kirkpatrick Model and Moore’s Expanded Outcomes Framework 305
Logic Model 306
Before, During, After 308
Medical Research Council Model for Complex Interventions 308
CIPP 310
Realist Evaluation 311
Constructing an Evaluation Program 312
Define Evaluation Goals 314
Engage Stakeholders 314
Design and Methods 315
Borrowing From Various Methods 315
Use of Quantitative and Qualitative Methods 315
Measures 316
Structure and Process Measures 316
Outcome Measures 317
Program evaluation should include educational outcomes that are measured during the training program and those that reflect th... 318
National conversations regarding GME funding have raised questions regarding the role and contribution of the medical educatio... 323
Because the overarching purpose of medical education is to improve the quality of care delivered to patients, and in considera... 323
Learning Environment 324
Reporting and Feedback 325
Conclusions 325
Annotated Bibliography 325
Annotated Bibliography 325.e1
References 326
16.1 - Exercise in Program Evaluation 329
Index 331