Indian Pediatr 2014;51:713-717
 |
Direct Observation and Focused Feedback for
Clinical Skills Training
Tejinder Singh, Shaveta Kundra and *Piyush Gupta
From Department of Pediatrics, Christian Medical College, Ludhiana
and *University College of Medical Sciences, New Delhi, India.
Correspondence to: Dr Tejinder Singh, Professor, Department of
Pediatrics, Christian Medical College and Hospital, Ludhiana, India.
Email: [email protected]
Direct observation of the medical trainee by an expert assessor, combined with authentic feedback, is considered an important tool for the development of clinical and procedural skills. The mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) are two important tools to observe the trainee during a clinical encounter or a procedure, make an expert, standardized (though subjective) observation, and use it to provide developmental feedback. Both can be easily integrated into the routine work of clinical departments, and both provide reliable assessment if 6-8 such encounters are used.
Keywords: Assessment, Medical education, Skill evaluation.
Traditional case presentations have long been the mainstay of assessment in medicine. Despite this long use and wide acceptability, case presentations have certain inherent flaws [1]. The trainee is not observed during the interaction with the patient, and so the assessment is not based on history taking, physical examination or counselling skills; rather, most of the assessment focuses on presentation skills. Residents' poise, confidence and linguistic skills, in addition to the luck factor of getting an easy or a complicated case, affect such assessments. Sometimes, patients with rare diseases are used in the assessment, many of which the trainee is unlikely to see again during his/her career. Also, the trainee is not told about the strengths and weaknesses of the presentation and how it can be made better. All these factors make the assessment summative in nature, with no opportunity to use assessment as a learning tool. The objective structured clinical examination (OSCE) can compensate for these weaknesses to some extent, but again has certain limitations [2]: it de-constructs the task into smaller components, which may not necessarily add up to the whole, and preparations for an OSCE are even more elaborate than those required for a long case.
The importance of acquiring core clinical skills can never be overemphasized. In over three-fourths of cases at the primary-care level, a diagnosis can be made by a good history and clinical examination [3]. Still, none of our currently used assessment tools makes an attempt to assess the acquisition of skills. At some places, a few OSCE stations are used to assess skills, but these are set in an artificial, controlled environment. Many skills like endotracheal intubation, central line insertion, lumbar puncture, or intramuscular injection cannot be replicated in an OSCE setting, except on a dummy. Residents learn these from their seniors, and the mistakes are passed down through generations; there is no observation and feedback from faculty. Additionally, a physician also needs to learn a number of soft skills, which are important for the practice of medicine and patient safety [4]. Soft skills make the difference between a successful and a not-so-successful physician. When things go wrong in practice, it is generally attributable to a lack of soft skills rather than to a lack of technical knowledge [5]. In our assessments, soft skills are never assessed. This can be attributed to the non-availability of suitable tools and, more importantly, a fear of subjectivity.
Attributes of a good assessment tool: Deciding the right assessment tool for clinical competence is an enigma. There is enough material in the literature to pick assessment tools based on the notional concept of 'utility'. The utility of an assessment [6] is conceptualized as a product of its validity, reliability, feasibility, acceptability and educational impact. Viewed from this perspective, the long case may be high on validity, feasibility and acceptability but is low on reliability and educational impact. The key point of this notion is that an assessment low on one of the attributes can still be useful by virtue of being high on others [7]. Thus a tool with low reliability and high educational impact (e.g., essay questions, long case) would be considered as useful as a tool with low educational impact but high reliability [e.g., multiple choice questions (MCQs), OSCE]. This is in line with the contemporary thinking that assessment should not only tell us whether learning occurred, but should also help us improve it.
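This notion of utility, attributed to van der Vleuten [6], is often summarized as a multiplicative index. A conceptual sketch in the attributes listed above (the multiplication is heuristic, not a computable formula) is:

\[ U = V \times R \times F \times A \times E \]

where U is utility, V validity, R reliability, F feasibility, A acceptability and E educational impact. The multiplicative form captures both points made here: a tool weak on one attribute can still be useful if it is strong on the others, but a tool in which any attribute falls to zero has no utility at all.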
A good assessment of clinical competence should be valid, i.e., it should mimic the actual clinical encounter as closely as possible, and it should be reliable. It is now accepted that reliability is independent of the objectivity of a tool [8]. It should be easy to organize, should be acceptable to the stakeholders, and should positively impact learning.
Importance of feedback: Feedback is recognized as the single most important factor that impacts learning [9]. Veloski, et al. [10] have also demonstrated the utility of feedback in improving clinical learning. However, to be effective, the feedback has to be authentic, based on direct observation, and provided immediately. Less than one-third of clinical encounters are actually observed during training [11,12]. At the postgraduate level, up to 80% of residents may have had only one observed clinical encounter [13]. The situation in Indian medical schools is expected to be no better: direct observation of skills (structured observation of the trainee interacting with the patient, taking history, performing physical examination and giving advice, and not simply being present in the same room) is negligible.
Mini Clinical Evaluation Exercise (Mini-CEX)
This was introduced by the American Board of Internal Medicine [14] as one of a series of assessments to address these issues. Mini-CEX is a snapshot observation of a doctor-patient encounter in a real, authentic setting (outpatient department or wards), lasting 15-20 minutes. Its focus is on the core clinical skills that a resident should demonstrate during clinical encounters. For each mini-CEX, a single assessor observes and evaluates a resident who conducts a focused history and physical examination on a patient. Each encounter can focus on one or more of the competencies listed in Box 1 [15].
BOX 1 Competencies Assessable by Mini-CEX
1. Medical interviewing skills
2. Physical examination skills
3. Humanistic qualities/professionalism
4. Clinical judgment
5. Counseling skills
6. Organization/efficiency
7. Overall clinical competence
Not all competencies need to be tested during each encounter; a choice can be made depending on the case and the seniority of the resident (e.g., history taking during early residency, while counseling skills can be assessed during the later part). After asking the resident for a diagnosis and treatment plan, the faculty member completes a short evaluation form and gives feedback to the resident. The competencies chosen for that encounter are rated on a 9-point scale, where 1-3 are considered unsatisfactory, 4-6 satisfactory, and 7-9 superior. Mini-CEX uses global ratings and subjective expert judgment rather than checklists. The results are recorded on a generic form, which can be downloaded from the ABIM website [16]. The form also records the resident's identification data, the complexity of the case and the site where the encounter was held (outpatients, wards, emergency, etc.). It also lists the competencies with a brief description of each, to provide guidance in evaluation.
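For readers who prefer a concrete illustration, the information captured by one mini-CEX encounter can be thought of as a small structured record. The sketch below is purely hypothetical (the field names and example values are ours, not the ABIM form); it simply mirrors the scale bands and form contents described above:

from dataclasses import dataclass, field
from typing import Dict, Optional

def band(score: int) -> str:
    # Map a 9-point mini-CEX rating to the band described in the text.
    if not 1 <= score <= 9:
        raise ValueError("mini-CEX ratings lie on a 9-point scale (1-9)")
    if score <= 3:
        return "unsatisfactory"
    if score <= 6:
        return "satisfactory"
    return "superior"

@dataclass
class MiniCexEncounter:
    # One observed encounter; competencies not assessed are simply omitted.
    resident_id: str
    assessor: str
    site: str                       # e.g. "outpatients", "ward", "emergency"
    case_complexity: str            # e.g. "low", "moderate", "high"
    ratings: Dict[str, int] = field(default_factory=dict)  # competency -> 1..9
    feedback: Optional[str] = None  # developmental feedback given immediately

    def summary(self) -> Dict[str, str]:
        return {c: band(s) for c, s in self.ratings.items()}

# A junior resident observed in the outpatient department:
encounter = MiniCexEncounter(
    resident_id="R-01", assessor="Dr A", site="outpatients",
    case_complexity="moderate",
    ratings={"medical interviewing": 5, "physical examination": 4},
    feedback="Good rapport; summarize the history before giving your plan.",
)
print(encounter.summary())
# {'medical interviewing': 'satisfactory', 'physical examination': 'satisfactory'}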
To build generalizability and to provide reliable and valid assessment, 6-8 encounters per year are recommended [17]. Each encounter should be observed by a different assessor and entail a different clinical problem. The process can be initiated by the residents (so that each one completes six cases a year) or by the department (a designated day for the encounter, depending on the availability of patients and assessor). We find the outpatient department to be the best place for a mini-CEX: a resident working up a new case can be observed by an assessor and provided feedback on the clinical encounter. In either situation, it is the flexibility and ease of integrating mini-CEX into the routine working of the department, without any special preparation, that stands out as a positive point. The completed rating forms provide documentation of the resident's progress and can be stored either in a personal file or as part of a learning portfolio. A sample mini-CEX clip [18] and the method of providing feedback [19] (provided by St. George's University of London) are available for online viewing.
Mini-CEX uses a different assessor and a different case for each encounter. Over a year, each resident is assessed by 6-8 assessors on 6-8 different cases. This is considered the biggest strength of mini-CEX, as each teacher brings a distinct way of thinking about and approaching a patient [20]. Although many of the items on the rating form are subjective, the reliability of mini-CEX has been reported to be much higher than that of an OSCE [8]. Similarly, the assessment is based on a different patient and a different setting each time, further building the validity and reliability of the judgments. It may be pertinent to reiterate that the best way to augment the validity and reliability of assessments is to increase the breadth and depth of sampling across tasks and assessors [7].
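As an illustration (this is standard classical test theory, not part of the original studies cited here), the Spearman-Brown prophecy formula shows why sampling more encounters and assessors improves reliability:

\[ R_n = \frac{n\,r}{1 + (n-1)\,r} \]

where r is the reliability of a single observed encounter and R_n the reliability of a score averaged over n encounters. Under the purely illustrative assumption that one mini-CEX encounter has a reliability of about 0.2, averaging over 8 encounters with different assessors and cases gives roughly (8)(0.2)/[1 + (7)(0.2)] ≈ 0.67.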
Mini-CEX has several advantages over other forms of assessment of clinical competence. By assessing residents in real-life settings on a variety of cases and in a variety of settings, validity is ensured. Mini-CEX looks at the entirety of the clinical task rather than breaking it into components, which also contributes to its construct validity. A large number of assessments using different cases and different assessors ensures reliability and generalizability. It is easier to organize a mini-CEX than an OSCE or a case presentation. Residents see value in it by way of immediate feedback in a non-threatening situation, making it more acceptable. It contributes to better learning by aligning working and learning in the workplace. It also has the advantage of exposing residents to different ways of thinking about problems. A number of publications have established its utility in the West [14,15]; experiences about its applicability, acceptability and utility have also been reported from India [22]. In our experience, mini-CEX was found to be feasible and acceptable to faculty and trainees. Table I compares the three commonly used assessment tools.
TABLE I Comparison of Common Tools for Performance Assessment

                        | Long case               | OSCE                              | Mini-CEX
Purpose                 | Assessment              | Assessment (sometimes feedback)   | Assessment and feedback
Occasion                | Specific time allotment | Specific time allotment           | Integrated into daily clinical activities
Task                    | Real but limited        | Artificial                        | Real and authentic
Time                    | 1-2 hrs                 | 3-4 hrs for adequate reliability  | 6-8 encounters of 20 min each
Focus                   | Presentation skills     | Skills in isolation               | Performance
Basis of assessment     | Examiner judgment       | Checklists                        | Examiner judgment on global ratings
*Reported reliability   | 0.60, 0.75              | 0.47, 0.64                        | 0.46, 0.63

Modified from Singh and Norcini [23]; *for 1 and 3 hours of testing time.
Feedback has a very important role in the utility of mini-CEX. In line with the attributes of effective feedback, feedback in mini-CEX is based on direct observation rather than historical facts, and is available immediately after the encounter [9]. Assessors can enhance the value of feedback by using one of several models, such as Pendleton's framework [24]. Here, the assessor first asks the trainee to rate his/her own performance and consider how it could have been done better, then provides positive reinforcement for what was done right, corrective advice for what was wrong, and suggestions for improvement. The whole process takes about 5-7 minutes. The recording form also has a provision to ask the trainee about his/her satisfaction with the entire learning process, which provides feedback to the assessor as well.
There are certain limitations of mini-CEX. Residents are assessed on different patients by different assessors, which makes comparison between residents difficult. For this reason, mini-CEX is currently used only for formative rather than summative purposes [25]. Standardization is difficult with mini-CEX given its flexible logistics. Mini-CEX is not a replacement for other assessment tools; it only complements the information they generate. The results of mini-CEX also need to be supplemented by other measures of performance and knowledge like case presentations, OSCE and MCQs/essays.
While mini-CEX targets clinical, analytical and counseling skills, procedural skills are another area that is not adequately represented in current assessment. Simulations in skill laboratories can assess these to some extent, but not in a real-life setting. This leads to a generation of physicians who may have theoretical knowledge but are deficient in procedural skills. This led to the development of direct observation of procedural skills (DOPS) as another important tool for directly observing these skills as part of workplace-based assessment.
Direct Observation of Procedural Skills (DOPS)
This was developed by the Royal College of Physicians [26] as part of the assessment for its Foundation Programme. DOPS refers to the observation and evaluation of a procedural skill performed by a resident on a real patient. The assessor directly observes and assesses the resident's performance, usually focusing on a single procedural skill. Like mini-CEX, DOPS serves the twin purposes of assessment and enhancing skill learning.
The focus of DOPS is on common procedures that are usually performed by physicians in practice. A list of such procedures can be drawn up for each specialty. The assessor rates the procedure using a checklist or a global rating scale. Though both can be used, there is a possibility that a resident may perform a procedure that appears correct as per the checklist, yet contains technical errors or does not follow the required sequence. Unlike mini-CEX, the same assessor-trainee pair can have multiple encounters involving different skills.
The DOPS assessment is also recorded on a standard assessment form, which has space for the trainee's identification, the name of the procedure, its complexity and the place where it was performed. The procedure is graded on attributes like 'demonstrates understanding of the procedure', 'obtains informed consent', 'makes appropriate preparations', 'gives adequate analgesia/sedation' and 'uses aseptic technique', using a 6-point scale, with 1-2 indicating unsatisfactory, 3-4 satisfactory and 5-6 superior. The assessor can mark an attribute as unobserved if it is not witnessed during the encounter. An overall score is given for the technical aspects of the procedure (not for each of the steps). In order to rate a procedure as satisfactory, most (but not necessarily all) competencies should have been rated as satisfactory. Achieving a satisfactory level on one occasion does not confirm that the trainee is competent to perform that procedure unsupervised; this judgment requires repeated assessments by more than one assessor.
The type of procedures to be observed can be staggered, taking into account the progression of the trainee. During the early years, the emphasis can be on basic procedures (IV cannulation, endotracheal intubation, neonatal resuscitation, etc.) and, with increasing experience, more complex procedures (central line placement, ventilation, etc.) can be considered. The completed forms [27] are stored in personal files or in a portfolio and provide evidence of the resident's progression. Ideally, all residents should be observed on all procedures required for that course. Senior residents, or sometimes senior nurses, can also function as assessors.
There are no formal studies on the validity and reliability of DOPS. However, it appears to have face validity, and its reliability can be improved by increasing the number of procedures and assessors. Residents, however, feel that DOPS helps them learn the skills better [28]. To a great extent, this might be related to the feedback provided to the trainee. All the arguments regarding the utility and use of feedback advanced for mini-CEX are equally applicable to DOPS.
Faculty training: Both mini-CEX and DOPS rely heavily on examiner judgment; therefore, some form of training is required to obtain reliable results. The assessors need to be trained in direct observation and in the ability to discriminate between levels of performance. For initial iterations, rater accuracy and inter-rater reliability may need to be monitored. Assessors also need training in providing developmental feedback based on direct observation rather than on historical facts [29]. The residents also need sensitization regarding the potential benefits of these tools.
Direct observation of residents can go a long way in improving clinical competence. The major factor in this benefit is the provision of immediate feedback, based on direct observation, close in time to the assessment opportunity. This also helps to amalgamate learning and assessment, making assessment more valid. Both these tools can be integrated with the regular working of the clinical unit, without having to make any special preparations for assessment.
References
1. Norcini JJ. Workplace based assessment. In:
Swanwick T, editor. Understanding Medical Education: Evidence, Theory
and Practice. 2nd edition. West Sussex, UK: Wiley Blackwell; 2010. p.
232-45.
2. Gupta P, Dewan P, Singh T. OSCE revisited. Indian Pediatr. 2010;47:911-23.
3. Hampton JR, Harrison MJG, Mitchell JRA, Prichard JS, Seymour C. Relative contributions of history-taking, physical examination, and laboratory investigation to diagnosis and management of medical outpatients. BMJ. 1975;2:486-9.
4. Gordon M, Darbyshire D, Baker P. Non-technical
skills training to enhance patient safety: A systematic review. Med
Educ. 2012;46:1042-54.
5. Papadakis M, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244-9.
6. van der Vleuten CPM, Schuwirth L. Assessing
professional competence: from methods to programs. Med Educ.
2005;39:309-17.
7. Singh T. Basics of assessment. In: Singh T,
Anshu, editors. Principles of Assessment in Medical Education. 1st ed.
New Delhi: Jaypee Brothers; 2012: p 1-13.
8. Singh T. Student assessment: issues and dilemmas
regarding objectivity. Natl Med J India. 2012;25:287-90.
9. Hattie JA. Identifying the salient facets of a
model of student learning: A synthesis of meta-analyses. Int J Educ Res.
1987;11:187-212.
10. Veloski J, Boex JR, Grasberger MJ, Evans A,
Wolfson DB. Systematic review of the literature on assessment, feedback
and physicians’ clinical performance: BEME Guide No. 7. Med Teach.
2006;28:117-28.
11. Daelmans HE, Hoogenboom RJ, Donker AJ, Scherpbier
AJ, Stehouwer CD, van der Vleuten CPM. Effectiveness of clinical
rotations as a learning environment for achieving competences. Med
Teach. 2004;26:305-12.
12. Kogan JR, Hauer KE. Use of the mini-clinical
evaluation exercise in Internal Medicine core clerkships. J Gen Intern
Med. 2006;21:501-2.
13. Day SC, Grosso LG, Norcini JJ, Blank LL, Swanson
DB, Horne MH. Residents’ perceptions of evaluation procedures used by
their training program. J Gen Intern Med. 1990;5:421-6.
14. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-CEX (Clinical Evaluation Exercise): A preliminary investigation. Ann Intern Med. 1995;123:795-9.
15. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The Mini-CEX: A method for assessing clinical skills. Ann Intern Med. 2003;138:476-81.
16. American Board of Internal Medicine. Direct Observation Assessment Tool. Available from: http://www.abim.org/pdf/paper-tools/minicex.pdf. Accessed July 1, 2014.
17. Kogan J, Bellini L, Shea J. Feasibility, reliability and validity of the mini clinical evaluation exercise in a medicine core clerkship. Acad Med. 2003;78:S33-5.
18. St. George's University of London. Mini-CEX clip. Available from: https://www.youtube.com/watch?v=hwreA4DGvtw. Accessed July 1, 2014.
19. St. George's University of London. Providing Feedback clip. Available from: https://www.youtube.com/watch?v=nod7SKwIQhg. Accessed July 1, 2014.
20. Norcini JJ, Blank LL, Arnold GK, Kimball HR.
Examiner differences in the mini-CEX. Adv Health Sci Educ Theory Pract.
1997;2:27-33.
21. Singh T, Norcini JJ. Workplace based assessment. In: McGaghie W, editor. International Best Practices in Assessment. 1st ed. London: Radcliffe Publishers; 2013. p. 257-80.
22. Singh T, Sharma M. Mini clinical evaluation
exercise as a tool for formative assessment. Natl Med J India.
2010;23:100-3.
23. Singh T, Norcini JJ. The mini clinical evaluation exercise. In: Singh T, Anshu, editors. Principles of Assessment in Medical Education. 1st ed. New Delhi: Jaypee Brothers; 2012. p. 155-65.
24. Pendleton D, Schofield T, Tate P, Havelock P. The
consultation: an approach to learning and teaching. Oxford: Oxford
University Press; 2003.
25. Swanwick T, Chana N. Workplace-based assessment.
Br J Hosp Med. 2009;70:290-3.
26. Beard J, Strachan A, Davies H, Patterson F, Stark
P, Ball S, et al. Developing an education and assessment
framework for the Foundation Programme. Med Educ. 2005;39:841-51.
27. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29:855-71.
28. Kundra S, Singh T. Feasibility and acceptability
of direct observation of procedural skills to improve procedural skills.
Indian Pediatr. 2014;51:59-60.
29. Liao KC, Pu SJ, Liu MS, Yang CW, Kuo HP.
Development and implementation of a mini-Clinical Evaluation Exercise
(mini-CEX) program to assess the clinical competencies of internal
medicine residents: From faculty development to curriculum evaluation.
BMC Med Educ. 2013;13:31.