TABLE OF CONTENTS
ACRONYMS AND ABBREVIATIONS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT
CHAPTER ONE: INTRODUCTION
Background Information
Statement of the Problem
Study Objectives
CHAPTER TWO: LITERATURE REVIEW
History of the OSCE
What is an OSCE?
Advantages of using the OSCE
The impact of the OSCE on teaching and learning
Competences assessed in the OSCE
Perception of the OSCE by students
Perception of the OSCE by Tutors
The OSCE process
How to set up an OSCE
The number of stations used in the OSCE and the duration per station
Patients used in the OSCE
Standard setting in the OSCE
The Validity of the OSCE
Reliability of the OSCE
The Internal and Inter-rater reliability in the OSCE
Measuring the internal consistency in OSCE
Other considerations in the OSCE
Constraints/limitations of the OSCE
CHAPTER THREE: BACKGROUND TO THE STUDY AREA
General background about the NHTI-Maridi
Student assessment methods in the NHTI
Rationale for choosing the NHTI-Maridi for the Study
CHAPTER FOUR: METHODOLOGY
Study Design
Sampling methods
Data Collection Instruments
Data Analysis
Pre-testing methodology
Expected Study Outcomes
Study limitations
CHAPTER FIVE: RESULTS
Introduction
NHTI Tutors’ evaluation of the OSCEs
Student’s Evaluation of the OSCE
CHAPTER SIX: DISCUSSION
NHTI Tutors’ evaluation of the OSCE Practice in NHTI
NHTI Students’ evaluation of the OSCE Practice in NHTI
The Validity of the OSCEs in NHTI
The Reliability of the OSCEs in NHTI
The Impact of the OSCE on Teaching and Learning in NHTI
CHAPTER SEVEN: CONCLUSION & RECOMMENDATIONS
Conclusion
Recommendations
REFERENCES
ANNEXES
ACRONYMS AND ABBREVIATIONS
AMREF; African Medical and Research Foundation
CASC; Clinical Assessment of Skills and Competencies
CAT; Continuous Assessment Test(s)
CO; Clinical Officer(s)
CPR; Cardiopulmonary resuscitation
EPI; Expanded programme of immunisation
GYN; Gynaecology
JIPMER; Jawaharlal Institute of Postgraduate Medical Education and Research
NHTI; National Health Training Institute
OBS; Obstetrics
OBM; Objective Borderline Method
OSCA; Objective Structured Clinical Assessment
OSCE; Objective Structured Clinical Examination(s)
PA; Per abdomen examination
PBL; Practice Based Learning
SEs; Side effects
SBL; Systems Based Learning
SP; Simulated patient
SPSS; Statistical Package for Social Sciences
LIST OF TABLES
Table 1: NHTI Tutors’ knowledge of the OSCE by Definition
Table 2: Type of OSCE Tutors Participated in Previously
Table 3: NHTI Tutors’ attitudes towards OSCE practice in NHTI
Table 4: Tutors’ preferred format for assessing Clinical Competence in NHTI
Table 5: Identified Areas of weakness/limitations in OSCE practice by NHTI Tutors
Table 6: Tutors’ Recommendations for improving the OSCE practice in NHTI
Table 7: Impact of OSCE on teaching and learning in NHTI
Table 8: Third Year CO Students’ knowledge of the OSCE by definition
Table 9: Second Year Students’ knowledge of the OSCE by definition
Table 10: Third year CO Students’ comment on the previous OSCE content
Table 11: Second year CO Students’ comment on Previous OSCE content
Table 12: Third year CO Students’ Attitude towards OSCE practice in NHTI
Table 13: Second Year CO Students’ Attitudes towards OSCE practice in NHTI
Table 14: Third year CO Students’ Preferred Format for assessing Clinical Competence in NHTI
Table 15: Second Year CO Students’ Preferred Format for assessing Clinical Competence in NHTI
Table 16: Identified Areas of weakness/limitations in OSCE by NHTI students
Table 17: Students’ suggestions for improving OSCE practice in NHTI
Table 18: Ways in which OSCE had Impacted student learning in NHTI
Table 19: Blueprint for the November 2014 CO Final OSCE at NHTI
LIST OF FIGURES
Fig. 1: The Number of stations an ideal OSCE should have
Fig. 2: Tutor’s role in the Previous OSCE
Fig. 3: The Number of Stations used in previous OSCE
Fig. 4: Duration used per OSCE Stations
Fig. 5: Tutors’ rating of the level of the previous OSCE difficulty
Fig. 6: Tutors’ rating of the previous OSCE organization
Fig. 7: Tutors’ rating of Previous OSCE implementation
Fig. 8: Role played by Second year Students in the previous OSCEs in NHTI
Fig. 9: Adequacy of Time - Third year CO Students
Fig. 10: Adequacy of time -Second year CO students
Fig. 11: Level of OSCE difficulty -Third year CO Students
Fig. 12: Level of OSCE difficulty - Second year CO students
Fig. 13: Third year CO students’ comment about previous OSCE Organisation
Fig. 14: Second Year CO Students Comments about previous OSCE Organization
Fig. 15: Third year CO Students’ rating of previous OSCE implementation
Fig. 16: Second year CO Students Comments on previous OSCE implementation
ABSTRACT
The National Health Training Institute (NHTI)-Maridi, in South Sudan, adopted the Objective Structured Clinical Examination (OSCE) for assessing clinical competence in 2008. However, since then no formal evaluation of the OSCE for the Clinical Officers programme has been conducted.
The objectives of this educational evaluation study were to: assess the tutors’ and students’ knowledge of the OSCE, evaluate the practice/implementation of the OSCE, assess the attitudes of the NHTI tutors and students towards the OSCE, assess the validity and reliability of the previous OSCEs, and determine the impact of the OSCE on teaching and learning in the Clinical Officer (CO) programme.
This was a cross-sectional educational study that employed both quantitative and qualitative techniques as well as documentary analysis of past OSCE papers and results. A convenience sampling method was used to select the NHTI faculty and students who participated in the study. Self-administered questionnaires were used to collect data. SPSS was used to analyse the quantitative data, while the qualitative data were analysed manually.
Results established that all eleven NHTI tutors and the majority of the third year (71%) and second year (59%) students had a clear understanding of the OSCE. All tutors and students who participated in this study had practical experience of the OSCE. Both the faculty and students gave a high rating to the organization and implementation of the previous OSCEs. Fifty-five percent of the tutors preferred the OSCE alone, while 36% of them preferred a combination of the OSCE and long-case/short-case formats for assessing clinical competence. Twenty-eight percent of the students preferred the OSCE alone, while 64% of them preferred a combination of the OSCE and long-case/short-case formats. Findings also demonstrated that all of the previous OSCEs had good content, face and predictive validity as well as reliability. Overall, results showed that the OSCE has had a positive impact on both teaching and learning in the CO programme, with better performance in clinical assessment since its introduction in 2008.
More practice for students on the OSCE, more focused training for the faculty on the OSCE, and more research to evaluate the validity and reliability of the OSCE in the NHTI should be organised in the future to support continuing improvement of the OSCE.
CHAPTER ONE: INTRODUCTION
Background Information
The National Health Training Institute (NHTI)-Maridi, in South Sudan, adopted the Objective Structured Clinical Examination (OSCE) in 2008 for assessing clinical competence at various stages of the Clinical Officer training programme. The OSCE has now replaced the traditional long-case and short-case type of clinical assessment, especially in the mock and final qualifying Clinical Officer clinical examinations.
Statement of the Problem
No formal evaluation of the OSCE for the NHTI Clinical Officers programme has been conducted to assess its strengths and its impact on the teaching and learning process.
Study Objectives
The objectives of this study were to:
a) Assess the knowledge of the NHTI tutors and students about the OSCE.
b) Evaluate the practice/implementation of the OSCE in the Clinical Officer examination in the NHTI.
c) Measure the attitudes of tutors and students towards the OSCE in the NHTI.
d) Determine the validity of previous OSCE stations.
e) Determine the reliability of previous OSCEs.
f) Determine the impact of the OSCE on teaching and learning in the Clinical Officer programme.
CHAPTER TWO: LITERATURE REVIEW
Introduction
This chapter reviews the literature on the OSCE, focusing on its history, definition and uses; knowledge of, attitudes towards and practice of the OSCE; its validity, reliability, impact and limitations; and, finally, ways of improving it.
History of the OSCE
The OSCE was first introduced at the University of Dundee by Ronald Harden in 1972, as the routine end-of-term and final clinical examination (Friedman 2003). By then, the need for a new approach to the assessment of clinical competence had become obvious due to the weaknesses associated with the traditional methods. The OSCE was therefore introduced as a way of measuring clinical competence that allowed for control of many of the biases of conventional methods (Carraccio & Englander 2000a).
The OSCE started as a pilot project in the undergraduate context but later became part of the formal clinical examination up to postgraduate medical education level in the University of Dundee (Harden 2011a; Davis 2003a). With time, the use of OSCEs has spread internationally and into other disciplines.
In the last decade, the use of the OSCE has become very popular in nursing and allied professions (Caballero et al. 2012). It has now become a standard method of assessment in both undergraduate and postgraduate studies in the United States, Canada and indeed most reputable colleges of medicine (Marliyya 2011). For instance, Patil et al. (2003) reported that the OSCE was introduced in the Department of Surgery, Faculty of Medicine, University of Hong Kong for the specialty clerkship final year in 1995. Since then, the OSCE has been introduced in the first and second years of the course as part of their formative assessment. Troncon (2004a) also reported that the OSCE had become established as one of the most valid, reliable and effective tests for the assessment of clinical skills, having been introduced at the Faculty of Medicine of Ribeirao Preto, University of Sao Paulo, in 1995. More recently, the incorporation of the OSCE into the Canadian Licensing Examination, the certification process of the Educational Commission for Foreign Medical Graduates, and the US Medical Licensing Examination is clear evidence of the direction in which medical education is moving (Carraccio & Englander 2000b).
What is an OSCE?
According to Shumway and Harden (2003a), the OSCE is a multi-station assessment approach primarily used to measure clinical competence. Students are assessed at a number of stations on focused activities that simulate aspects of clinical competence. The OSCE has also been defined as a form of performance-based testing used to measure candidates’ clinical competence (OSCE Home n.d., a).
According to Praveen and Suman (2008a), Sajjad (2008) and Anshu (2008), the OSCE is a method of evaluating students’ clinical competence in a planned and structured way, with well-defined objectives, each of which may target a small component of a larger clinical competence, e.g. history taking, physical examination or the interpretation of a laboratory investigation.
Uses of the OSCE
As already mentioned, the OSCE was first introduced in the context of undergraduate medicine to assess clinical competence (Harden 2011b; Davis 2003b). According to Gupta et al. (2010), the OSCE has been used as a tool for both formative and summative evaluation of medical undergraduate and postgraduate students across the globe for years (Marwaha 2011a). The OSCE was used for the same purpose at the United Arab Emirates University (Townsend et al. 2001). In the UK, the Clinical Assessment of Skills and Competencies (CASC) is a type of OSCE used as a clinical examination gateway to becoming a senior psychiatrist (Marwaha 2011b).
According to Varkey et al. (2008a), the OSCE provided an opportunity to systematically sample the different sub-domains of quality improvement and to demonstrate skills rather than testing knowledge alone. This has made the OSCE a potentially powerful assessment tool for Systems-Based Practice (SBP) and Practice-Based Learning (PBL). In addition, Hodder and Papp (2001) explained that OSCEs provide an opportunity to assess a student’s ability to integrate knowledge, clinical skills and communication with patients, as well as to evaluate an educational programme. Based on these observations, appropriate changes can be made in the curriculum to better address the learning needs of students (Sachdeva 2001a).
At Ohio University, the OSCE was used to provide feedback on clinical performance, evaluate basic clinical skills, determine the minimal desired competencies and provide practice for the final examinations (Ohio University 2012). The OSCE has also been used in educational research as an outcome measure of specific teaching interventions, to enhance acquisition of medical knowledge, to facilitate acquisition of core clinical skills, as an early feedback tool when teaching core clinical examination skills, and as part of the plan and activities for continuing medical education (Moattari et al. 2009a).
Advantages of using the OSCE
According to Hodges et al. (1998), the OSCE is widely used because of its feasibility as well as its good reliability and validity. In addition, it has become the standard tool for performance-based assessment, particularly in undergraduate clinical examinations. Harden (2003b) pointed out that the OSCE has made the simulation of real situations possible. The OSCE also facilitates the integration of teaching and evaluation, provides variety that maintains students’ interest and promotes faculty-student interaction. It is also adaptable to local needs and allows a large number of students to be tested within a relatively short time (Ananthakrishnan 1993a; Turner and Dankoski 2008a).
According to Chan (2009a), the OSCE provides a uniform marking scheme for examiners and consistent examination scenarios for all students. It offers an authentic way to assess medical students, generates formative feedback for both the learners and the teaching programme, and provides more insight into students’ clinical and interactive competencies. It can objectively assess important aspects of clinical expertise, such as physical examination skills, interpersonal skills, technical skills, problem-solving abilities, decision-making abilities and patient treatment skills, as well as communication with patients (Praveen and Suman 2008). Overall, the OSCE is a reliable, valid and reproducible method for both formative and summative assessment.
NB. More literature on the reliability and validity of the OSCE is covered in later sections of this chapter.
The impact of the OSCE on teaching and learning
According to Shumway and Harden (2003b), student participation in an OSCE has a positive impact on learning because it focuses students’ attention on the acquisition of essential clinical skills. In their study to investigate the effect of an OSCE on dental students’ learning strategies and competence to manage periodontal diseases in patients, Schoonheim-Klein et al. (2006) found that its implementation did not have any effect on students’ study time and study strategies. However, they observed that the OSCE appeared to stimulate learning, resulting in greater achievement of specific clinical competences and a greater level of realistic self-assessment. Zahid et al. (2011) also studied the effect of the OSCE on students’ performance a year after its introduction in 2007 and established that the 2008 group had significantly higher scores on the OSCE compared to their predecessors, who had an oral examination in 2007. It was concluded that the higher scores on the OSCE in 2008 were due to the students’ exposure to OSCE experiences.
The OSCE can also be used as a teaching method or tool. For instance, Brazeau et al. (2002a) found that students who were involved in the OSCE appreciated the constructive discussion of their strengths and weaknesses in clinical encounters and the practice for future OSCE-type examinations. Faculty members also enjoyed this active teaching format and found the process of students giving feedback to their peers educationally useful (Turner and Dankoski 2008b).
Ratzmann et al. (2012) studied the effect of integrating the OSCE into the dental preliminary examinations and found that the intervention group performed significantly better than the control group. They concluded that early integration of the OSCE in dental training improved student performance in the OSCE, and recommended that such integrated training be offered to students as preparation for it. Rush et al. (2014) conducted a similar evaluation study of the objective structured clinical assessment (OSCA) with immediate feedback and found that it was perceived positively by the students and had a positive impact on student learning and confidence. This was in agreement with Rushforth’s (2006) review of the key issues affecting the utilisation of the OSCE within the assessment of nursing students. She concluded that, whilst caution must be applied in relying on the OSCE as a sole means of practitioner assessment, when used carefully it can make a helpful and meaningful contribution to health professional education. Prislin et al. (1998a) demonstrated that an OSCE experience facilitates the identification of learning needs on the part of both faculty and students. It is therefore clear that, over the years, increasing experience with the procedure has led to the use of the OSCE not merely as an assessment tool but also as a teaching method (Ananthakrishnan 1993b).
Competences assessed in the OSCE
According to Harden (1990a and 2006), the OSCE should be based on the objectives of the curriculum, including clinical methods, mastery of practical procedures, problem solving and clinical reasoning, as well as laboratory data interpretation. Hollis and James (2001a) recommended that the OSCE should objectively measure competencies in specific areas of behaviour, technique, attitudes, decision-making skills and strategies, based on the objectives of the course or licensing body. In addition, Sanjay (2008) suggested six general competencies to be assessed: Medical Knowledge, Patient Care, Professionalism, Communication Skills, Practice-Based Learning (PBL) and Systems-Based Learning (SBL).
Perception of the OSCE by students
Pierre et al. (2004a) found an overwhelming acceptance of the OSCE among students at the University of the West Indies, Jamaica, with respect to its comprehensiveness (90%), transparency (87%), fairness (70%) and the authenticity of the required tasks (58–78%). However, some students felt that it was a strongly anxiety-producing experience, that some questions were ambiguous, and that the time allowed for the expected tasks was inadequate. It was concluded that student feedback was invaluable in influencing faculty teaching and curriculum implementation. These findings correlated well with those of students from New Jersey Medical School, who appreciated the opportunity of having constructive discussions of their strengths and weaknesses in clinical encounters, observing a variety of doctor–patient interaction styles, and practicing for future OSCE-type examinations (Brazeau et al. 2002b). In a study conducted at the School of Nursing and Public Health at the University of Namibia, student perceptions and experiences indicated that the OSCE was an accepted method of assessment; nevertheless, there was a need for careful preparation and organization (Small 2013a).
In their study to evaluate students’ perception of the validity, comprehensiveness and acceptability of the OSCE at Jimma University, Belay and Tsinuel (2008a) reported that 60.7% of respondents thought the OSCE was a good learning opportunity. Moreover, 54.1% agreed that the examination covered common and relevant topics consistent with the stated teaching objectives. However, 43.4% of them reported that examiners at manned stations were intimidating, and 24.5% of them claimed that individual feedback was offered only to a minority. Fifty-four percent of them felt that the test was fair in assessing their knowledge and skills, and 71.3% of them stated that the personality, gender and other attributes of the candidates did not affect the test scores. They concluded that, overall, students’ evaluation of the OSCE was remarkably encouraging. In a similar study, Tayyaba et al. (2014) found that the vast majority of the diverse groups of students at Cardiff University School of Medicine had a high overall satisfaction with the process, fairness and quality of the OSCEs and felt that the OSCEs accurately assessed their skills and competencies. Over 60% of the students reported that OSCE examiners were objective, and 62% of them reported that the examiners were not intimidating. According to 70% of them, the personality, gender and ethnicity of the patient did not affect their performance, and 78% of them reported that they were well informed about the content to be covered by the OSCE.
In a study to assess the feasibility and acceptability of using the OSCE for a large number of medical students at the College of Medicine in Riyadh, Khairy (2004) reported that all students agreed that the organizational aspect of the examination was smooth and the time for each station was adequate. Sixty percent of them felt that the station topics were covered in the course; 82% of them agreed that the examination was fair and objective; and 93% of them wanted this method to be used in the assessment of third year medical students. Similar responses were received from the examiners who were involved in the OSCE. It was concluded that the OSCE was a practical and acceptable method for assessing medical students' basic surgical skills, even for a large number of candidates.
In their evaluation of an OSCE in surgery and orthopaedics, Kalbitz et al. (n.d.) reported that the OSCE was chosen as the preferred future examination method, followed by multiple-choice testing and clinical practical examination. They concluded that the OSCE showed a high acceptance rate among the medical students at the University of Ulm.
In comparing the effectiveness of the OSCE with the conventional examination as formative assessment tools in Pediatrics, Mondal et al. (2004) reported that 73.8% of students felt that the OSCE was a better formative assessment tool. However, 9.5% of the students preferred the conventional examination.
Concerning OSCE organisation, Belay and Tsinuel (2008c) reported that 54.1% of the students indicated that the OSCE environment was free of noise; 51.6% of them agreed that the venue had adequate illumination; 45.1% of them felt the sequence of stations was logical; and 45.1% of them felt it was appropriate. In contrast, Troncon (2004b) found that students were comfortable with the cases and tasks, but nearly half (48%) of them criticized organizational aspects of the OSCE, and many (70%) reported difficulties with both time management and stress control.
Perception of the OSCE by Tutors
Troncon (2004c) found that faculty members at the University of São Paulo Medical School thought the OSCE was highly relevant and more accurate than the previous examination methods. It was useful for imparting clinical skills to students, was effective for assessing what is taught by different tutors and how it is learned, and had a positive influence on their personal way of teaching. It helped to achieve more uniform teaching, had a positive effect on students' drive to actually study and practice, and was perceived as highly effective in revealing a detailed picture of both student performance and course efficacy (Walters et al. 2005a). However, the same faculty criticized the OSCE, reporting that it was unable to provide an assessment of an integrated approach to patients and could actually distort the students’ view of patients. In addition, they noted that OSCE results may be inconsistent because examinees are excessively stressed during the process, and that the OSCE was very complex and time-consuming to organise, particularly in selecting and preparing patients for the examination (Troncon 2004d).
Kalyan (2008) reported that at the Jawaharlal Institute of Postgraduate Medical Education and Research (JIPMER), the OSCE ended with instant feedback from the examiners, particularly at formative evaluation, and was rated as superior to the conventional practical examinations because of its perceived objectivity. In comparison, Johnson and Reynard (1994) and Walters et al. (2005b) reported that the OSCE was an acceptable examination to both medical students and examiners. Their study revealed a significant correlation between marks obtained on the OSCE and subjective ratings of the students’ ability. Similar results were obtained by Varkey et al. (2008a), who reported that all faculty members agreed that the OSCE was realistic and capable of providing accurate assessments.
The OSCE process
According to Praveen and Suman (2008b), the OSCE process involves the following steps:
a) Defining the purpose and domain of what is to be assessed based on the curriculum.
b) Identifying the skills which the student must learn and thus need to be assessed.
c) Converting the identified skills into specific questions or tasks that can be answered in 4 to 5 minutes. Each question or task assigned is identified as an OSCE/OSPE station, which can be a procedure, question, couplet or rest station.
d) Preparing a checklist consisting of the vital steps that students have to perform at the procedure station. Each step in the checklist can be given a differential mark depending on its significance relative to the other steps.
e) Reviewing and validating the test with a mock examination or practice.
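The differential marking of checklist steps in step (d) can be illustrated with a minimal sketch. The station, step names and mark weights below are hypothetical and not drawn from an actual NHTI checklist:

```python
# Hypothetical checklist for a blood-pressure measurement station.
# Step names and differential marks are illustrative only.
checklist = {
    "washes hands": 1,
    "positions patient's arm correctly": 1,
    "selects appropriate cuff size": 2,
    "palpates brachial artery": 2,
    "records reading accurately": 2,
}

def score_station(observed_steps, checklist):
    """Sum the differential marks for the steps the examiner ticked."""
    return sum(mark for step, mark in checklist.items() if step in observed_steps)

total = sum(checklist.values())  # maximum mark for the station
achieved = score_station(
    {"washes hands", "selects appropriate cuff size", "records reading accurately"},
    checklist,
)
print(f"{achieved}/{total}")  # 5/8
```

Weighting each step differently lets a vital action (e.g. recording the reading accurately) contribute more to the station score than a routine one.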
When conducting an OSCE, Sachdeva (2001b) recommended that, about one hour prior to commencement, each station should be properly set up with the appropriate equipment, examination materials and instructions. Examiners, examinees and patients should be briefed half an hour prior to the examination.
Cohen et al. (1989) suggested that the number of support staff required to run the OSCE depends upon the number of stations in the examination. These include: an official timer; one support person for every three stations to collect completed answer sheets; a clerical person at a central desk responsible for sorting the completed checklists and answer sheets as they arrive; a chaperone for each of the rest stations, whose responsibility is to attend to candidates’ needs; and at least two reserve standardized patients.
Walter and David (2001) pointed out that the key to successful OSCE implementation is careful planning. In addition, there must be a single leader with the required knowledge and experience to coordinate all aspects of the development and implementation of the OSCE, including quality assurance throughout the OSCE cycle (Kamran 2013a).
How to set up an OSCE
To avoid problems on the day of the examination, Harden (1990b) recommended a set of advance tasks, including: appropriate briefing of examiners and students to gain their cooperation; the assembly of the resources and patients required for the OSCE, along with their reserves; and completion of appropriate arrangements with regard to the site where the examination will be conducted. There should also be a coordinator, who takes overall charge of the advance planning of the examination and its implementation on the set day. Where the examination is run simultaneously at multiple sites, there should be an additional coordinator in charge of each site. The required resources for the OSCE depend on the nature and design of the OSCE stations (Harden 1990c). However, all the necessary resources should be identified and listed prior to the examination, namely:
a) Examiners who observe procedures at the manned stations and those who mark written answers
b) Patients (real and or simulated)
c) Clinical equipment e.g. sphygmomanometer, thermometers, specimen samples, etc.
d) Furniture e.g. tables, chairs, examination couches/beds
e) Paperwork including checklists and instructions to examiners and candidates as well as briefs for simulated patients and
f) An audible change signal.
The number of stations used in the OSCE and the duration per station
Harden (1990c) explained that the total number of stations, together with the time allocated per station, determines the total time required to complete the whole examination. For instance, 20 stations of 5 minutes each can be completed in 1 hour 40 minutes. OSCE Home (n.d., b) suggested that the number of stations should be 18–20, each lasting 5 minutes, while Ananthakrishnan (1993b) and Sajjad (2008) suggested about 15–20 stations, each requiring 4–5 minutes. At the University of the West Indies, the OSCE comprised a circuit of 13 stations of 7 minutes' duration each, with the exception of the 14-minute history-taking station. One minute was given between stations to facilitate changeover and the reading of instructions, and some rest stations were strategically included to reduce candidates’ and patients’ stress (Pierre et al. 2004a). Hollis and James (2001a) reported that the total number of OSCE stations varied from 10 to 25 and recommended that the transit time be built in per station. They cautioned, however, that fewer than ten stations were probably inadequate and more than 25 stations could become unwieldy.
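The timing arithmetic above can be sketched in a few lines; the `transit_minutes` parameter is an assumption covering the one-minute changeover that some circuits allow between stations:

```python
def osce_duration(stations, minutes_per_station, transit_minutes=0):
    """Minutes one candidate needs to complete a full OSCE circuit,
    with an optional changeover interval between consecutive stations."""
    return stations * minutes_per_station + max(stations - 1, 0) * transit_minutes

# Harden's example: 20 stations of 5 minutes each
print(osce_duration(20, 5))     # 100 minutes, i.e. 1 hour 40 minutes

# A 13-station circuit of 7 minutes each with 1-minute changeovers
print(osce_duration(13, 7, 1))  # 103 minutes
```

Stations of unequal length (such as a doubled history-taking station) simply add their extra minutes to the total.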
Patients used in the OSCE
The OSCE can use real, simulated or standardized patients. James (2001) defined a standardized patient as an individual with a health problem that is in a chronic but stable condition. Such patients may be used when properly trained for history taking and physical assessments. Simulated patients on the other hand may come from the ranks of volunteers or paid actors. He concluded that all simulated patients require training but more training was required for patients used in history taking than for those used for physical examinations. This training enables patients to present their problems consistently (Cohen et al. 2001; Praveen 2008). In comparison, Epstein (2007) defined standardized patients as actors who are trained to portray a patient’s role consistently on repeated occasions and are often incorporated into the OSCE. In addition, interactions with standardized patients can be tailored to meet specific educational goals.
Standard setting in the OSCE
Bouriscot and Roberts classified standard setting methods into two main categories i.e.
a) Relative standards/norm referenced methods:
These are based on a comparison among the performances of examinees. A set proportion of candidates fail regardless of how well they perform e.g. the top 84% pass and the rest fail the examination.
b) Absolute standards/criterion referenced methods:
These are based on how much the examinees know. Candidates pass or fail depending on whether or not they meet specified criteria, e.g. examinees must correctly answer 70% of the questions in order to pass. Examples of absolute standard setting methods include Angoff's method and Ebel's method.
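The contrast between the two categories can be illustrated with a short sketch. The function names, the score lists and the tie-breaking behaviour are illustrative assumptions, not part of any cited method.

```python
def criterion_referenced(scores, cutoff=70.0):
    """Absolute standard: anyone meeting the fixed criterion passes."""
    return [s >= cutoff for s in scores]

def norm_referenced(scores, pass_fraction=0.84):
    """Relative standard: only the top fraction of candidates pass,
    however well or badly the cohort performs overall."""
    n_pass = max(1, round(len(scores) * pass_fraction))
    cutoff = sorted(scores, reverse=True)[n_pass - 1]
    return [s >= cutoff for s in scores]

scores = [92, 81, 74, 66, 55]
print(criterion_referenced(scores))  # [True, True, True, False, False]
print(norm_referenced(scores))       # [True, True, True, True, False]
```

Note that under the relative standard the candidate scoring 66 passes, while under the 70% criterion the same candidate fails; the two approaches answer different questions.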
Kaufman et al. (2000) compared four standard-setting procedures for an OSCE, i.e. the Angoff, borderline, relative and holistic methods, to establish a cut-off score for a pass/fail decision. They concluded that the Angoff and borderline procedures provided more reasonable and defensible approaches to standard setting applicable in medical schools. However, from a cost perspective, they suggested that the borderline method is preferable. They also recommended further investigation of the other standard-setting procedures. Two years later, Kilminster and Roberts (2002) established that the borderline approach to standard setting was feasible and ensured a pass standard that is both justifiable and credible. Wilkinson et al. (2002), in their evaluation of a simple 'contrasting groups' method applied to an OSCE conducted simultaneously in three separate schools, found that there was good agreement between examiners in identifying borderline candidates. However, there were significant differences between schools in the borderline score for some stations, although these disappeared when more than three stations were aggregated. They concluded that combining marks from all the OSCE stations before determining the pass mark was more reliable than making decisions based on individual stations.
According to Mash (2002), the borderline regression method is used at Stellenbosch University in the Division of Family Medicine, South Africa, and has contributed towards a higher standard of clinical assessment. It is performed by the examiners during the examination and is calculated automatically once the scores are entered into an Excel spreadsheet. Mash explained that this method has been shown to provide a more credible and reliable standard than the Angoff method. In comparison, Wood et al. (2005a) established that the advantage of the borderline regression method was that it uses all of the examinee data for setting the pass mark, not just the scores from examinees rated as borderline, and is therefore less susceptible to variation due to unequal weighting of examinees in the borderline groups. In addition, it has all the advantages of the Modified Borderline-Group Method, i.e. using actual examinee performances, having better face validity, and making efficient use of the clinician examiners' time.
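A minimal sketch of the borderline regression calculation Mash describes (normally automated in an Excel spreadsheet) might look as follows. The numeric coding of the global grades and the sample data are assumptions for illustration, not figures from any cited study.

```python
def borderline_regression_pass_mark(global_grades, checklist_scores, borderline_grade=2):
    """Fit a least-squares line of checklist score on examiner global grade
    (e.g. 1 = fail, 2 = borderline, 3 = pass, 4 = good) and return the
    predicted checklist score at the borderline grade as the station pass mark."""
    n = len(global_grades)
    mx = sum(global_grades) / n
    my = sum(checklist_scores) / n
    sxx = sum((x - mx) ** 2 for x in global_grades)
    sxy = sum((x - mx) * (y - my) for x, y in zip(global_grades, checklist_scores))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept + slope * borderline_grade

# Hypothetical station data: global grades and checklist scores for eight candidates
grades = [1, 2, 2, 3, 3, 3, 4, 4]
scores = [38, 47, 51, 58, 61, 63, 72, 74]
print(round(borderline_regression_pass_mark(grades, scores), 1))  # 49.2
```

Because the line is fitted on every candidate's data, the pass mark is not hostage to the handful of candidates who happen to be graded borderline, which is the advantage Wood et al. note over the Modified Borderline-Group Method.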
Shulruf et al. (2013), on the other hand, evaluated the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. They compared three methods, i.e. the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). They established that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods. They concluded that, based on theoretical and empirical analysis, the OBM had advantages over existing methods in that it combined objectivity, realism and a robust empirical basis, and was simple to use.
The Validity of the OSCE
In his review article, Barmen (2005a) emphasized the four criteria that must be considered in developing any method of assessment, i.e. validity, reliability, objectivity and feasibility or practicability. He defined validity as the extent to which a test measures what it is intended to measure. He explained that for a high level of validity, a test must contain a representative sample of what students are expected to have achieved in terms of both the subject matter and the educational objectives. He also defined face validity, or credibility, as the extent to which the OSCE includes material that is considered important to measure. He further defined content validity as the degree to which the OSCE covers the area of competence in question. He concluded by stating that every OSCE has to pass the criteria of validity.
Commenting on the face and content validity of the OSCE at the University of the West Indies, Pierre et al. (2004b) reported that the face and content validity of each checklist was established by review and consensus by a core group of senior paediatricians. The OSCE stations were selected to represent the curricular goals and objectives and to reflect authentic clinical situations (Hamdy 2006a; Walters et al. 2005c).
Varkey et al. (2008b) found that the content of the OSCE stations in their study matched the curriculum blueprint, emphasized significant curriculum concepts, and allowed demonstration of skills in the appropriate context. They concluded that their pilot study provided promising evidence for the validity, feasibility and acceptability of an OSCE for the assessment of competency in System Based Practice (SBP) and Practice Based Learning (PBL). In comparison, Prislin et al. (1998b), in their study assessing 8 years' experience with a family medicine clerkship OSCE, found that both students and faculty agreed that the OSCE experience reflected skills that students should possess. However, there was less agreement that the OSCE reflected clerkship-related learning and actual student performance. It was concluded that although OSCE content validity was high, further assessment was needed to assure construct validity.
According to Hamdy (2006b), blueprinting an assessment helps to ensure that it is conducted according to a replicable plan and that the test content is mapped carefully against learning objectives to produce a 'valid examination'. Blueprinting helps to reduce two major validity threats. The first is 'construct under-representation', which means under-sampling or biased sampling of the curriculum or course content. The second is 'construct-irrelevant variance', which can occur as a result of flawed item formats, i.e. items that are too easy or too hard, or the choice of inappropriate test modalities. In addition, blueprinting helps to identify test instruments that are appropriate to the constructs and contents of the assessment. Finally, blueprinting helps to determine what content is to be assessed, how to assess it and the level of competence to be assessed, based on Miller's Pyramid of learning outcomes: knows, knows how, shows how and does. Blueprinting is therefore an efficient method for guiding the test construction process (Walters et al. 2005c).
Belay and Tsinuel (2008b) reported that 74.6% of the students at Jimma University felt cases at manned OSCE stations were relevant to real-life situations. However, 62.3% of the students reported that their performance was affected by patients' uncooperativeness during the examination. Relatedly, Mossey et al. (2001), in their study on the scope of the OSCE in the assessment of clinical skills in dentistry, found that students perceived OSCE scenarios involving phantom heads as lacking clinical authenticity and as inappropriate for the assessment of certain clinical operative skills.
Reliability of the OSCE
Barmen (2005b) defined reliability as the reproducibility of a set of measurements, or the consistency or stability of measures over time. He explained that short time periods that require speedy responses may greatly affect reliability. He added that reliability is also affected when OSCE stations are few. He reported that other factors that negatively affect reliability in the OSCE were unreliability of standardized patients, personalised ways of scoring, disorganized examiners and noisy examination rooms. He suggested that for a high level of reliability, OSCEs must have an appropriate number of stations and be combined with other methods of assessment. For instance, Hollis and James (2001b) suggested a minimum of 10 and a maximum of 25 stations as appropriate. In addition, Brannick et al. (2011) established that better-than-average reliability was associated with a greater number of OSCE stations and a high number of examiners per station.
The Internal and Inter-rater reliability in the OSCE
DaRosa and Kaiser (2001a) explained that inter-rater reliability refers to the correlation between different raters (examiners) individually observing, judging or rating the same performance, or reviewing the same essay-type answers or performance using a checklist. According to them, a correlation of 0.8 or higher is desirable, although 0.7 is considered acceptable.
Walters et al. (2005c) evaluated the validity and reliability of four 15-18-station OSCEs for 128 students over one year. Inter-rater reliability and internal consistency (inter-station reliability) were assessed using generalizability theory. They found that the OSCE was feasible and had a high level of perceived face and content validity. In addition, the overall reliability was moderate to good, with G-coefficients in the range 0.55-0.68 for the four OSCEs. They concluded that integrating a range of modalities into an OSCE in psychiatry was feasible, generally valid and reliable.
When studying the effect of examiners on the evaluation of students' communication skills performance, Harasym et al. (2008) reported the variance and reliability of scores as follows: communication scores (0.20 and 0.87), examiner stringency/leniency (0.86 and 0.91), case (0.03 and 0.96), and item (0.86 and 0.99), respectively. All facet scores were reliable (0.87-0.99). Examiner variance (0.86) was more than four times the examinee variance (0.20). There was large variability in examinee scores due to variation in examiner stringency/leniency behaviours that could affect pass/fail decisions. To minimize this variance, examiner training was recommended (Tejinder 2008a). In their study, Brannick et al. (2011b) analysed the available literature on the OSCE and sources of systematic variance in estimates. They concluded that it was generally helpful to use two examiners and large numbers of stations.
Measuring the internal consistency in OSCE
According to Tavakol and Dennick (2011a), calculating Cronbach's alpha has become common practice in medical education research when multiple-item measures of a concept or construct are employed, because it is easier to use than other estimates of reliability. Cronbach's alpha is a measure of internal consistency, that is, how closely related a set of items are as a group (Explorable.com 2010a). It was developed by Lee Cronbach in 1951 to provide a measure of the internal consistency of a test or scale. It is expressed as a number between 0 and 1. There are different reports about the acceptable values of alpha, ranging from 0.70 to 0.95. A low value of alpha could be due to a low number of questions, poor interrelatedness between items or heterogeneous constructs. For example, if a low alpha is due to poor correlation between items, then some items should be revised or discarded. A high value of alpha (>0.90) may suggest redundancies and indicate that the test length should be shortened (Tavakol and Dennick 2011c).
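The alpha statistic described here follows directly from its standard formula: alpha = k/(k-1) × (1 − Σ item variances / variance of total scores), where k is the number of items. The sketch below is a generic illustration with hypothetical data, not a calculation from this study.

```python
def cronbach_alpha(items):
    """items: one list of examinee scores per test item (same candidate order).
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each candidate's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Hypothetical checklist items scored for three candidates
print(round(cronbach_alpha([[4, 3, 2], [5, 3, 1], [4, 4, 2]]), 2))  # 0.92
```

When the items rise and fall together across candidates, the variance of the totals dominates the item variances and alpha approaches 1; uncorrelated items drive it towards 0.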
Other considerations in the OSCE
DaRosa and Kaiser (2001b) stressed feasibility as another property that should be considered whenever an OSCE is planned. All of the required costs, staff, time and logistics should be taken into account if the OSCE has to be implemented successfully. Patrício et al. (2013) conducted a study to review and synthesize evidence on the technical and economic feasibility of an OSCE in undergraduate medical studies and concluded that despite being an expensive test format, evidence suggests that the use of OSCE produces reliable results.
Constraints/limitations of the OSCE
According to Ananthakrishnan (1993c), possible constraints to OSCE implementation include the difficulty of ensuring that all stations invariably require equal time, the breaking of clinical skills into individual competencies, which makes the OSCE appear artificial and not meaningful, and the risk of observer fatigue when many candidates are assessed using lengthy checklists (Walters et al. 2005c).
Hejaz and Downing (2008) added the following constraints and limitations of OSCE implementation: the need for more time, physical space and personnel; the fact that a trainee's ethics and behaviour cannot be reliably assessed using the OSCE; that not all clinical situations can be simulated by standardized patients; and its apparent limitations in the examination of invasive operative procedures.
According to Chan (2009b) and Barmen (2005d), the limitations of the OSCE include the following: it may discourage students from looking at the patient as a whole, because the students' knowledge and skills are put into compartments; the assessment examines a narrow range of knowledge and skills and does not test history-taking competency properly; and students examine a number of different patients in isolation at each station instead of comprehensively examining a single patient (Shumway and Harden 2003c).
In conclusion, Turner and Dankoski (2008b) recommended the following measures concerning future OSCEs: more research into the best uses of the OSCE method and how to maximize its reliability and validity, and production and dissemination of evidence of the far-reaching impact seen by students and educators. All institutions should avoid over-reliance on any single evaluation method. Finally, each institution must judge the relative value of comprehensive testing in light of local resources, as well as the need to prepare students for the OSCE component.
CHAPTER THREE: BACKGROUND TO THE STUDY AREA
General background about the NHTI –Maridi
The National Health Training Institute (NHTI), Maridi, is located in the Western Equatorial State of South Sudan, about 350 km west of Juba, the capital city (see Map, annex VI). It was established in 1998 to train Clinical Officers (COs). Clinical Officers are medical professionals who undergo a three-year training programme in Clinical Medicine and Public Health. Upon successful completion of the course, they are awarded a Diploma in Clinical Medicine and Public Health by the Ministry of Health, Republic of South Sudan, and the African Medical and Research Foundation (AMREF). The programme is implemented by the Ministry of Health, Republic of South Sudan, with support from AMREF, an international non-governmental organisation with its headquarters in Nairobi, Kenya.
Clinical officers are equipped with relevant competencies to enable them to offer Clinical and Community health services in health facilities and communities.
In 2007 the 18-month Community Midwifery programme was introduced in the NHTI, and in 2009 a new training programme for Public Health Officers was introduced in the institute. At the time of this study (2011), the institute had 12 full-time tutors assisted by administrative staff and hospital staff from Maridi Civil Hospital, situated about 3 km away. The institute was offering three training programmes and had a total student population of 194 in the following groups;
a) 127 Clinical Officer students
b) 15 Public Health Officer students and
c) 52 Community Midwifery students
Though all of the above students went through objective clinical/practical examinations at various stages of their programmes, this study targeted only the Clinical Officer students who were in their second and third years of training.
Student assessment methods in the NHTI
Students’ performance in various course units is assessed through Continuous Assessment Tests (CATs), end of practical/clinical attachment assessment and end of semester examinations which account for 40% of the promotional and final examination mark.
Promotional/End of year examinations consist of written and practical/clinical assessments. Final Mock and Qualifying Examinations consist of written and practical/clinical assessments.
Written papers consist of a combination of the following types of questions.
- Multiple choice questions (MCQs)
- Short answer questions
- Matching items
- True or false items
- Modified Essay questions (MEQs).
Each paper is marked out of 100% and the pass mark is 50%. A score of 75% and above is a Distinction while between 65% and 74% is a Credit.
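The grading bands above translate into a simple classification. This is an illustrative sketch of the stated bands; the function name is an assumption, and marks between 50% and 64% are treated as a plain pass since the text assigns them no named grade.

```python
def written_paper_grade(mark):
    """NHTI written-paper bands: pass mark 50%; 65-74% Credit; 75%+ Distinction."""
    if mark >= 75:
        return "Distinction"
    if mark >= 65:
        return "Credit"
    if mark >= 50:
        return "Pass"
    return "Fail"

print(written_paper_grade(80), written_paper_grade(68), written_paper_grade(50))
# Distinction Credit Pass
```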
The Practical and Clinical examinations consist of a variety of methods depending on the available resources and include;
- The traditional long case and short case type of examination in the ward.
- Objective structured clinical examination (OSCE) or Objective structured practical examination (OSPE) type of examination.
The pass mark in every clinical or practical discipline is 50% and is considered independently of the written paper. This pass mark is criterion-referenced and is based on the Clinical Officer curriculum. As standard-setting practice in the NHTI, every test or examination has to be moderated by at least two tutors before it is administered, to ensure it meets the set criteria for the programme. All CATs and internal examinations are set and moderated by the respective subject experts at the institutional level, while external examinations are set and moderated by subject experts at the national level by the MoH Interim National Examination Board.
Rationale for choosing the NHTI – Maridi for the Study
Though this study could have been conducted in any other health training institute in the region, the researcher chose to conduct it in the NHTI, Maridi for convenience, being a member of staff of the institute at the time, and to minimize the costs that would have been involved had it been conducted elsewhere.
CHAPTER FOUR: METHODOLOGY
Study Design
This was a cross-sectional educational research study that employed a triangulation of quantitative and qualitative research approaches, as well as documentary analysis of past OSCE question papers and OSCE results. Available past OSCE question papers were analysed for validity and reliability through a blueprinting process (see Table 19 for the results of the blueprinting).
Sampling methods
This study targeted all 12 NHTI tutors as the primary implementers of the OSCE in the institute. Secondly, it targeted all 85 second and third year Clinical Officer students. A convenience sampling method was used to select as many respondents as possible from both the faculty and the students. All available and willing NHTI tutors and students in the second and third year classes were requested to participate in this study of their own free will. Tutors who were not present in the institute at the time were sent questionnaires by email.
Data Collection Instruments
Data was collected from participants using self-administered (postal) questionnaires consisting of both closed and open-ended questions. Two types of questionnaires were used: one for tutors, consisting of 26 open-ended and closed questions, and one for students, consisting of 20 open-ended and closed questions. Both questionnaires covered the main study objectives (see annexes I and II). In addition, a checklist was used in the documentary analysis of the validity and reliability of the previous OSCEs.
Data Analysis
Quantitative data was analysed using SPSS while the qualitative data was analysed manually and summarized into relevant themes that were presented in tabular form.
Pre-testing methodology
Before data collection, the researcher piloted the questionnaires within the NHTI. Identified errors were corrected before actual data collection commenced.
Ethical Considerations
Permission to conduct this study was sought from and granted by the Principal of the NHTI on 31st March 2011 (see annex V). Though this was educational research (with low risks for participants), appropriate ethical consideration in the form of informed consent was sought from all participants. The faculty and students were assured that their information was confidential and would only be used for the purposes of the study. Students in particular were reassured that non-participation in the study carried no risk on their part.
Expected Study Outcomes
The following outcomes were anticipated from this study:
a) Dissemination of the report to;
- The Directorate of Training and Professional Development-MOH, Republic of South Sudan- Juba as the Health Personnel training policy makers.
- The Directorate of Capacity Building at AMREF Headquarters and the Country Director- AMREF in South Sudan as chief financial supporters of NHTI programmes.
- The Interim National Examination Board and Administration of NHTI as the key implementers of the OSCE.
- The NHTI students as stakeholders in the OSCE processes.
b) Review of the OSCE practice in NHTI where necessary.
Study limitations
This study had two main limitations. The first was office work pressure that kept the researcher too busy to complete the report in time. The second was a lack of capacity to carry out statistical calculations such as Cronbach's alpha for reliability.
CHAPTER FIVE: RESULTS
Introduction
This section consists of three main parts: results from questionnaires completed by NHTI tutors (n=11) and by NHTI students (n=83) in two cohorts, namely Clinical Officer (CO) Intake 15 students, who were in their third year, and CO Intake 16 students, who were in their second year of training in 2011. The last part consists of findings from documentary analysis of some of the past OSCE papers and results. In all, eleven out of twelve (91.6%) of the NHTI tutors and 83 of the 85 (97.6%) targeted NHTI students participated in this study.
NHTI Tutors’ evaluation of the OSCEs
Summary of the demographic data about the NHTI Tutors
Seven of the eleven NHTI tutors who participated in this study were male. Nine of them were full-time staff, while the rest were part-time staff based at Maridi Hospital. Six of the eleven tutors were South Sudanese, three were Ugandans and the rest were Kenyans. By profession, five of the eleven tutors were Clinical Officers, four were Midwives and the rest had a Public Health background. One of the tutors had a Master's degree in Epidemiology. In addition, only four of the tutors had completed the Diploma in Medical Education course, which is essential for all tutors in the NHTI. The rest were in the process of joining the Medical Education programme, which was only offered outside South Sudan. Each of the tutors taught and examined at least two to three course units in the OSCE in the NHTI. The following is a summary of the evaluation of their knowledge, attitudes and experience of the OSCE in the NHTI.
NHTI Tutors’ Knowledge of the OSCE
Table 1: NHTI Tutors’ knowledge of the OSCE by Definition
Illustration not included in this reading sample
Table 1 shows that all NHTI tutors had a clear understanding of the meaning of the OSCE.
Illustration not included in this reading sample
Fig.1: The Number of stations an ideal OSCE should have
Figure 1 indicates that NHTI tutors had differing views about the number of stations an ideal OSCE should have, ranging from no idea to as many as 50 stations. The majority (60%) of them felt that an ideal OSCE should have fewer than 15 stations, while 36.3% suggested 20 or more stations. One tutor had no idea how many stations an ideal OSCE should have.
NHTI Tutors’ Experience in OSCE
Illustration not included in this reading sample
Fig. 2: Tutor’s role in the Previous OSCEs
The above results show that 6 (57%) of NHTI tutors had participated in previous OSCEs as examiners. Other roles played by tutors included coordinating and time keeping.
Table 2: Type of OSCE Tutor’s Participated in previously
Illustration not included in this reading sample
Table 2 above shows that the final qualifying OSCE was the type in which the majority (6 of the 11) of tutors had participated, followed by the mock OSCE.
Illustration not included in this reading sample
Fig. 3: The Number of Stations used in previous OSCE
Figure 3 shows that the most commonly used number of OSCE stations in the NHTI was 19-20. The smallest number was four stations, last used in the Community Midwifery course that was phased out in 2011. Thirty stations were used in an Ophthalmology end-of-course-unit examination for Clinical Officers.
Illustration not included in this reading sample
Fig. 4: Duration used per OSCE Stations
Figure 4 reveals that the most frequently used duration per OSCE station in the NHTI was 10 minutes, followed by 5 minutes. In addition, all the tutors reported that they felt the time allocated per OSCE station was adequate.
Illustration not included in this reading sample
Fig 5: Tutors’ rating of the level of the previous OSCE difficulty
Nine of the 11 tutors felt that the level of OSCE difficulty was appropriate (Fig. 5).
In addition all Tutors reported that all OSCE questions and procedures were based on the expected learning outcomes for the Clinical officer curriculum.
Illustration not included in this reading sample
Fig. 6: Tutors’ rating of the previous OSCE organization
Seven (64%) of the NHTI tutors rated the previous OSCEs as well organized (Fig. 6).
Illustration not included in this reading sample
Fig. 7: Tutor’s rating of Previous OSCE implementation
Ten (91%) of the tutors rated the previous OSCE implementation as well conducted (Fig. 7).
Tutors’ Attitudes towards OSCE Practice in NHTI
Table 3: NHTI Tutors’ attitudes towards OSCE practice in NHTI
Illustration not included in this reading sample
The above table illustrates a generally positive attitude towards the OSCE in assessing clinical competence in the clinical officer programme compared to the long case/short case format.
Table 4: Tutors’ preferred format for assessing Clinical Competence in NHTI
Illustration not included in this reading sample
Six out of the eleven tutors preferred the OSCE to other approaches of assessing clinical competence in the NHTI. Only one tutor was undecided about the preferred format (Table 4).
Identified Areas of weakness/limitations in OSCE practice by NHTI Tutors
NHTI Tutors were asked to list whatever they did not like about OSCE practice in NHTI and the following table summarises their responses.
Illustration not included in this reading sample
The following were the tutors’ recommendations for improvement of OSCE implementation in NHTI.
Illustration not included in this reading sample
Impact of OSCE on teaching/learning in NHTI
NHTI Tutors were asked to suggest ways in which OSCE had influenced the teaching/learning in NHTI and table 7 summarises their responses.
Illustration not included in this reading sample
Student’s Evaluation of the OSCE
Introduction
Eighty-three NHTI students participated in this study: thirty-eight 3rd year Clinical Officer students (37 males and 1 female) and forty-five 2nd year Clinical Officer students (42 males and 3 females). They completed the questionnaire about knowledge, attitudes and practice relating to the OSCE in the NHTI.
Summary of the Demographic data about both cohorts of students
All of the 83 students were from South Sudan. The largest number came from Jongolei state, with about 18% representation in both classes. Western and Northern Bar-el-Gazal had the least representation (7%) of the students in both classes. The majority of the students (63%) had attained their secondary school education in East Africa. About 53% of the students had attended other courses before joining the Clinical Officer programme, including Nursing, Mental Health, Laboratory Technician, Community Health Worker and Maternal and Child Health Worker (MCHW) courses, as well as short courses on HIV/AIDS and computer applications.
NHTI Students’ Knowledge of the OSCE
Students were asked to choose the most correct definition of an OSCE. Table 8 shows their responses;
Table 8: Third Year CO Students’ knowledge of the OSCE by definition
Illustration not included in this reading sample
Table 9: Second Year Students’ knowledge of the OSCE by definition
Illustration not included in this reading sample
According to tables 8 and 9, the majority of 3rd year students (71%) and of 2nd year CO students (57.8%) knew the correct meaning of the OSCE. It is clear from both tables that, as expected, the 3rd year students had more correct responses than their juniors.
NHTI Students’ experience in the OSCE
All students had been involved in OSCEs in NHTI. Most of them had participated in the OSCE as candidates, a few of them as timekeepers and simulated patients as shown in figure 8.
Illustration not included in this reading sample
Fig. 8: Role played by Second year Students in the previous OSCEs in NHTI
Figure 8 illustrates that most of the students had been involved in past OSCEs as candidates. The fewest had acted as timekeepers.
Illustration not included in this reading sample
Fig 9: Adequacy of Time -Third year CO Students
Illustration not included in this reading sample
Fig. 10: Adequacy of time -Second year CO students
The majority (71%) of the 3rd year students and most (84.4%) of the 2nd year class reported that the time allocated per station was adequate. (Fig. 9 & 10).
Illustration not included in this reading sample
Fig. 11: Level of OSCE difficulty -Third year CO Students
Illustration not included in this reading sample
Fig. 12: Level of OSCE difficulty - Second year CO students
Most of the students in both classes rated their previous OSCEs as `just fair’ (Figs 11 and 12).
Table 10: Third year CO Students’ comment on the previous OSCE content
Illustration not included in this reading sample
Table 11: Second year CO Students’ comment on Previous OSCE content
Illustration not included in this reading sample
Tables 10 and 11 reveal that the majority of the respondents in both classes felt that the previous OSCEs content was within the expected course learning outcomes - evidence of a valid assessment.
Illustration not included in this reading sample
Fig. 13: Third year CO students’ comment about previous OSCE Organisation
Illustration not included in this reading sample
Fig. 14: Second Year CO Students Comments about previous OSCE Organization
Figures 13 and 14 reveal that the majority (68%) of the 3rd year and most (91%) of 2nd year students felt the previous OSCEs were well organised.
Illustration not included in this reading sample
Fig. 15: Third year CO Students’ rating of previous OSCE implementation
Illustration not included in this reading sample
Fig. 16: Second year CO Students Comments on previous OSCE implementation
The majority (55%) of the 3rd year class and most (80%) of the 2nd year students rated the previous OSCEs implementation as well conducted. (Fig. 15 and 16).
Students’ Attitude towards OSCE practice in NHTI
Table 12: Third year CO Students’ Attitude towards OSCE practice in NHTI
[Figure not included in this excerpt]
Key: SA = Strongly agree; A = Agree; U = Uncertain; D = Disagree; SD = Strongly disagree
Table 13: Second Year CO Students’ Attitudes towards OSCE practice in NHTI
[Figure not included in this excerpt]
Key: SA = Strongly agree; A = Agree; U = Uncertain; D = Disagree; SD = Strongly disagree
According to the results in Tables 12 and 13, the majority of students in both classes demonstrated a very positive attitude towards OSCE practice in the NHTI Clinical Officer programme. They felt the OSCE was a better assessment of what they were able to do clinically; was less frightening and less intimidating than the long case/short case format of clinical assessment; was a more comprehensive assessment; allowed all students to be assessed on the same patients; took a shorter time to assess all students; used checklists for assessing students' performance, thereby promoting objectivity; and produced results that could be used to provide valid feedback on students' performance.
Table 14: The Preferred Format for assessing Clinical Competence in NHTI by 3rd year students
[Figure not included in this excerpt]
Table 15: The Preferred Format for assessing Clinical Competence in NHTI by 2nd year students
[Figure not included in this excerpt]
Table 14 shows that 44.7% of the 3rd year students preferred the OSCE alone. The same percentage preferred a combination of the OSCE and long case/short case formats. In comparison, 80% of the second year class preferred a combination of OSCE and long case/short case formats for assessing clinical competence.
Areas of weakness/limitations in OSCE practice in NHTI
According to both cohorts of students, the following were areas of weakness in the implementation of the OSCE in NHTI:
[Figure not included in this excerpt]
Students’ suggestions for improvement of OSCE practice in NHTI
The following is a summary of their suggestions;
[Figure not included in this excerpt]
The Impact of the OSCE on student learning in NHTI
According to both classes of students, the OSCE had influenced their learning as summarized in the table below:
[Figure not included in this excerpt]
Findings from analysis of the Past OSCEs and Results
Introduction
Analysis was carried out on the available past OSCE papers and results i.e.
a) Five past CO OSCE papers i.e.;
- CO Intake 17 Mock OSCE October 2013
- CO Intake 17 Final OSCE November 2013
- CO Intake 17 Supplementary Final April 2014
- CO Intake 18 Mock OSCE October 2014
- CO Intake 18 Final OSCE October 2014
b) Four previous CO OSCE results i.e.;
- Results of CO Intake 17 Mock OSCE October 2013
- Results of CO Intake 17 Supplementary Final April 2014
- Results of CO Intake 18 Mock OSCE and
- Results of CO Intake 18 Final OSCE
The above papers were reviewed for content and face validity as well as reliability and the following were the findings.
a) In terms of content validity
Blueprinting was done (see Table 19 below) and areas of strengths included the following;
- All five previous OSCEs were based on the expected learning outcomes for the Clinical Officer programme.
- All five disciplines of Medicine, Paediatrics, Surgery, Obstetrics and Public Health were covered in the mock and final OSCEs.
Areas of weakness included;
- Not all clinical problems could be simulated especially in paediatrics.
- The use of manikins for assessing newborn examination and CPR performance was possible; however, it was somewhat removed from reality.
Table 19: Blueprint for the November 2014 CO Final OSCE at NHTI
[Figure not included in this excerpt]
The above blueprint illustrated that the CO final OSCE of November 2014 had the content validity required by the CO curriculum.
b) In terms of face validity
Areas of strength included;
- Clear layout of the OSCE stations
- Clear instructions given per station
- Proper font size and style used
- Straightforward language used
c) In terms of reliability of scores
Areas of strengths included;
- Clear marking schemes for all unmanned stations were available for use.
- Clear checklists were available for use for manned stations
- Knowledgeable and experienced examiners were deployed accordingly
Areas of weakness included;
- There was a big difference in the marks allocated per station; for instance, some OSCE stations carried as few as 5 marks while others carried as many as 16 marks.
- Not all the correct answers were included in the marking schemes. This required the examiners who were marking to use their professional knowledge and judgment in marking and scoring some of the answers.
CHAPTER SIX: DISCUSSION
NHTI Tutors’ evaluation of the OSCE Practice in NHTI
All NHTI tutors had a clear understanding of the OSCE as a form of assessment for clinical competence. Such understanding is the basis for the proper implementation of this approach to examination in any institution. However, the same tutors had varying views concerning the number of stations an ideal OSCE should have: the majority suggested fewer than 15 stations as optimal, while others suggested twenty or more stations per OSCE. Their widely varying views on the number of OSCE stations were most likely because the tutors came from different professional backgrounds. Nevertheless, the variation in the number of stations per OSCE is justifiable, because it depends on the type and number of clinical skills to be tested (Hollis and James 2001). For instance, NHTI had three different training programmes which used OSCEs/OSPEs. However, 15-20 OSCE stations have been suggested as ideal (Ananthakrishnan 1993). In addition, it is important to note that using fewer than 10 OSCE stations reduces the OSCE's reliability, while more than 25 stations makes the examination unwieldy (Hollis and James 2001; Brannick et al. 2011).
All tutors in NHTI had practical experience of OSCEs as examiners, coordinators and timekeepers (Cohen et al. 1989; DaRosa and Kaiser 2001a). Even tutors who are not directly involved in the OSCE organisation often pop in to inspect the set up and observe the progress of the OSCE. This level of tutor participation has enhanced the face validity of the NHTI OSCEs (DaRosa and Kaiser 2001a).
The most commonly used duration per OSCE station in NHTI was 10 minutes, followed by 5 minutes, and the shortest was 3 minutes. The variation in duration per OSCE station in NHTI is justified (Harden 1990b; Hollis and James 2001a) because it depends on the particular competency being assessed at each station. However, it is important to ensure that all tasks can be performed within the allocated time (Harden 1990c). In addition, OSCE reliability is negatively affected when the duration per station is too short (Barmen 2005b).
The OSCE was most commonly used in NHTI for final qualifying and mock examinations and was least used in the promotional examinations. These are some of the recommended formative and summative uses of the OSCE (Townsend et al 2001; Gupta et al. 2010; Harden 2011b). Other uses that were not yet considered in NHTI are assessing the different sub-domains of quality improvement (Varkey et al. 2008a) and assessing a student’s ability to integrate knowledge, clinical skills, and communication with the patients as well as evaluation of an educational programme (Hodder and Papp 2001).
The organization and implementation of previous OSCEs in NHTI was well rated by the majority of tutors. However, good OSCE organization is not automatic. It is a product of careful advance planning in order to avoid last minute problems (Harden 1990d).
The setting of the previous OSCE was rated as 'just standard' by most of the tutors because it was similar to the modified Angoff and borderline standard-setting approaches (Kaufman et al. 2000; Kilminster and Roberts 2002). A criterion-referenced method of grading the results, with 50% as the pass mark, was followed for both written examinations and the OSCE (Boursicot and Roberts n.d.). This means that whoever scores the set pass mark as a minimum level of competence qualifies. For reliability purposes, the final OSCE score in NHTI consists of the total marks from all the OSCE stations before the pass mark is determined (Wilkinson et al. 2002). This means a candidate does not fail just because of one or two poorly performed stations. However, the borderline regression method, which combines the advantages of both the modified Angoff and the modified borderline-group methods (Mash 2002; Wood et al. 2005a), is what should be recommended for setting the pass mark for all OSCEs in NHTI for better standards.
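For illustration, the borderline regression method works as follows: at each station the examiner records both a checklist score and a global rating, the checklist scores are regressed on the global ratings, and the pass mark is the checklist score predicted at the 'borderline' rating. A minimal sketch in Python, using made-up scores rather than actual NHTI data:

```python
import numpy as np

def borderline_pass_mark(checklist_scores, global_ratings, borderline_rating=2):
    """Borderline regression: fit checklist score as a linear function of the
    examiner's global rating, then predict the score at the borderline rating."""
    slope, intercept = np.polyfit(global_ratings, checklist_scores, 1)
    return slope * borderline_rating + intercept

# Illustrative data: global ratings from 1 (clear fail) to 5 (excellent)
ratings = [1, 2, 2, 3, 4, 5]
scores = [35, 44, 46, 55, 65, 75]
pass_mark = borderline_pass_mark(scores, ratings)
```

Because the pass mark is derived from all candidates' data rather than a fixed 50%, it moves with the difficulty of each station, which is the method's main appeal over a fixed criterion.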
The fact that most of the tutors felt the OSCE was a better assessment of what a student was able to do clinically, and was more comprehensive, probably explains why the OSCE has become a popular type of clinical assessment in NHTI. This correlated well with Troncon (2004c), Kalyan (2008) and Varkey et al. (2008a) in those aspects.
Compared to the long case and short case clinical assessment, which used to take 3-5 days running simultaneously in 5 different departments in Maridi Hospital and NHTI for about eight hours each day, the OSCE had the advantage of allowing all students to be assessed in a shorter time (Ananthakrishnan 1993a; Turner and Dankoski 2008a). The fact that the same examiners see all candidates in the OSCE promotes its objectivity. This objectivity is further enhanced by the use of standard checklists and marking schemes (Chan 2009a; Ananthakrishnan 1993). This enhanced level of objectivity could partly explain why there have been fewer complaints about bias in examination results in NHTI since the OSCE was introduced. In addition, assessing all students on the same type of patients promotes reliability, as patient variability is uniform across all candidates (Shumway and Harden 2003b). The content of the NHTI OSCE was based on the expected learning outcomes stipulated in the CO curriculum (Hodges et al. 1998; Hodder and Papp 2001; Praveen and Suman 2008; Chan 2009a). Therefore, the OSCE was rated as a more valid way of assessing a student's clinical competencies in NHTI compared to the long case/short case format.
Unlike other formats, the OSCE produces results that can be used to provide more valid feedback on a student's performance. The information generated through the individual checklists from manned stations and the marked scripts from unmanned stations is used to provide feedback on clinical performance after the examination (Ananthakrishnan 1993; Shumway and Harden 2003; Chan 2009a; Ohio University 2012). The feedback provided is instrumental in designing remedial measures for learners as well as improving the training programme design and implementation in general.
In summary, the NHTI faculty had a generally positive attitude towards the OSCE implementation, which in turn has promoted its use in all the three training programmes in the institute.
Areas of weakness/limitations in the OSCE practice as identified by NHTI Tutors and their suggestions for improvement.
Despite the above areas of strength in the OSCE practice at NHTI, it has had several challenges similar to those highlighted in other studies (Troncon 2004c; Hejaz and Downing 2008). The NHTI tutors therefore recommended the use of real patients where possible and better training of simulated patients to enable them to present their problems consistently during future OSCEs (Cohen et al. 2001; Praveen 2008). Construction of a more spacious hall for the OSCE in the institute would solve the problem of shortage of space in the OSCE hall. Alternatively, modifying the big tables used for the OSCE, especially for unmanned stations, by partitioning them could enable up to four students to use one table. This could further improve the utilisation of the available space in the hall. Acquisition of more equipment and specimens for both the OSCE and teaching purposes could enhance relevant skill acquisition, instead of relying on what is borrowed from the hospital and other institutions from time to time. Organizing more clinical and practical experiences for students before the OSCE is another measure that could boost clinical skills acquisition among students. Advance OSCE preparations are key to minimizing last-minute fixing of things as well as avoiding delays on the examination day (Pierre et al. 2004b; Belay and Tsinuel 2008b; Chan 2009c; Small 2013b; Turner and Dankoski 2008b).
NHTI Students’ evaluation of the OSCE Practice in NHTI
The fact that the majority of the students had clear knowledge of what the OSCE meant probably explains why most of them were eager to participate in the OSCE as candidates, simulated patients or timekeepers. Their participation in OSCEs from time to time has inspired them to become more focused on the acquisition of essential clinical skills (Shumway and Harden 2003b). Student participation in previous OSCEs was also a requirement for participation in this evaluation. The use of OSCEs at various levels of training in NHTI has helped students to integrate knowledge and skills acquisition during training (Hodder and Papp 2001). In addition, OSCE results have been used to identify students' areas of weakness and to provide credible feedback (Brazeau et al. 2002b; Ohio University 2012). Because of the feedback provided, it has been observed that final OSCE performance is often better than that of the mock OSCE in NHTI.
Though the duration allocated per OSCE station was adequate according to the majority of the students, a few of them complained of its inadequacy. This calls for the NHTI faculty and the National Interim Examination Board to review the duration allocated per station to ensure its adequacy for completion of the type of skill or competence to be assessed (Hollis and James 2001a; Harden 1990a and 2006).
The content of the previous OSCEs was rated as fair and within the expected CO programme learning outcomes by most of the students. This positive rating signifies content validity of the OSCE from the students' point of view (Belay et al. 2008a; Khairy 2004). It also means NHTI students understood their CO curriculum objectives, which determine the content to be assessed (Harden 1990a and 2006). It also partly explains why there have been fewer student complaints about clinical examination results since the OSCE was introduced in NHTI (Pierre et al. 2004a; Belay and Tsinuel 2008a; Tayyaba et al. 2014). In addition, the 'fair' rating revealed the level of confidence students had in the OSCE practice in NHTI.
While the majority of the students rated the previous OSCEs in NHTI as well organized and implemented, some of them however, gave a fair to poor rating about the same but with no explanation. While it may be difficult to please all students, this type of rating calls for more efforts on the part of the OSCE organizers in NHTI to ensure better planning and implementation of future OSCEs (Small 2013a).
Students’ attitude towards OSCE practice in NHTI
Most of the NHTI students rated the OSCE as a better assessment of what they were able to do clinically compared to the long case/short case format. This rating was similar to what was reported in other studies on the same (Kalbitz et al. n.d.; Mondal et al. 2004). It was also similar to the NHTI tutors' rating of the same aspect and reveals the level of confidence both the faculty and students had in the practice of the OSCE in the institute.
The majority of the students in NHTI were happy with the OSCE because it was a less frightening examination than the long case/short case format of clinical assessment, just like the students of Jimma and Cardiff universities (Belay and Tsinuel 2008c; Tayyaba et al. 2014). However, some of the students reported that some of the examiners were unkind to them during the OSCE, particularly at manned stations. Similar complaints have been reported in other studies (Tayyaba et al. 2014; Pierre et al. 2004a). It is therefore necessary that the NHTI faculty ensure that intimidation and all forms of unfriendly attitudes towards students during OSCEs are always avoided, because such attitudes negatively affect the very objectivity that the OSCE is supposed to promote.
Students also rated the OSCE as a more comprehensive assessment of what they had learnt clinically, covering the most relevant aspects of the programme. This rating revealed the content validity of the OSCE practice in NHTI from the students' point of view (Varkey et al. 2008b; Hamdy 2006; Pierre et al. 2004a; Belay and Tsinuel 2008a; Khairy 2004).
The NHTI OSCEs allowed all candidates to be assessed by the same examiners on the same patients and content, thus promoting objectivity of the test (Barmen 2005b; DaRosa and Kaiser 2001a). In reality, objectivity was the main reason behind the design of the OSCE in the beginning (Friedman 2003; Carraccio and Englander 2000a). In addition, the OSCE in NHTI has eliminated patient and examiner variability, hence ensuring test reliability (Ananthakrishnan 1993; DaRosa and Kaiser 2001b; Barmen 2005). The use of standard checklists and marking schemes in the OSCE in NHTI has also promoted objectivity and reliability of the examination (Tejinder 2008). It has also improved both internal and inter-rater reliability among the examiners in the OSCE (DaRosa and Kaiser 2001b; Walters et al. 2005c).
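Inter-rater reliability of the kind discussed here is commonly quantified with an agreement statistic such as Cohen's kappa, which corrects raw agreement for chance. A minimal sketch (the function and the pass/fail judgements are illustrative, not NHTI records):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    # Proportion of items where the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal proportions
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Illustrative checklist judgements ("done"/"not done") from two examiners
examiner_1 = ["done", "done", "not done", "done", "not done", "done"]
examiner_2 = ["done", "done", "not done", "not done", "not done", "done"]
kappa = cohens_kappa(examiner_1, examiner_2)
```

A kappa of 1 indicates perfect agreement, 0 indicates agreement no better than chance; comparing two examiners' checklists for the same candidates in this way would give NHTI a concrete inter-rater reliability figure.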
Most of the NHTI students had benefited from the valid feedback provided after the promotional and mock examinations (Klara 2001; Pierre et al. 2004; Chan 2009a; Gupta et al. 2010). Consequently, it has been observed that final OSCE performance in NHTI is better than or similar to mock OSCE performance. From the above discussion it is clear that the NHTI students had a very positive attitude towards, and hence a high level of acceptance of, OSCE practice at NHTI.
Areas of weakness/limitations in the OSCE implementation identified by NHTI Students and their suggestions for improvement
Despite the generally positive attitude towards the OSCE shown in the above discussion, the students pointed out some areas of weakness in the OSCE practice. Their recommendations for improvement included: an increase in the time allocated per OSCE station; a friendlier examiner attitude towards candidates during the OSCE; better arrangement of stations to ease movement; an increase in the number of examiners during the OSCE; availing real patients instead of simulated ones or models where possible; ensuring that the instructions are clear at all stations; provision of seats for candidates during the OSCE; and provision of rest stations with refreshments during the OSCE. Other suggestions for OSCE improvement included a gentler change signal during the OSCE and improved teaching of practical/clinical skills in NHTI to prepare students for clinical assessment and for performance after qualification. Most of the above recommendations are in line with what was suggested by Pierre et al. (2004b); Belay and Tsinuel (2008b); Chan (2009c) and Small (2013b).
The Validity of the OSCEs in NHTI
Previous OSCE past question papers and the OSCE results were reviewed for validity and reliability and the following were the findings;
The OSCEs had content validity because all five previous final OSCEs were based on the expected learning outcomes for the Clinical Officer programme. The OSCEs consisted of 20 stations, with four stations from each of the five course units in the third year, i.e. Medicine, Paediatrics, Surgery, Obstetrics and Public Health, based on the CO curriculum (Barmen 2005a). The identified areas of strength in content validity matched the curriculum blueprint for the NHTI CO programme (Table 19) and correlated well with what was established by Varkey et al. (2008b); Prislin et al. (1998); Hamdy (2006a) and Walters et al. (2005c).
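The blueprint structure just described (20 stations, four from each of five course units) can be verified mechanically. A small sketch under assumed data; the station list below is hypothetical, standing in for an actual examination's station roster:

```python
from collections import Counter

def check_blueprint(stations, disciplines, per_discipline=4):
    """Return True if every discipline contributes exactly the expected
    number of stations, as the curriculum blueprint requires."""
    counts = Counter(stations)
    return all(counts.get(d, 0) == per_discipline for d in disciplines)

DISCIPLINES = ["Medicine", "Paediatrics", "Surgery", "Obstetrics", "Public health"]

# Hypothetical 20-station roster mirroring the described blueprint
stations = [d for d in DISCIPLINES for _ in range(4)]
blueprint_ok = check_blueprint(stations, DISCIPLINES)
```

Running such a check when assembling each examination would catch a discipline that was over- or under-sampled before the OSCE is administered.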
The NHTI OSCEs had face validity because only key procedures and relevant content were assessed, there was a clear layout of the OSCE stations, and clear instructions were given per station (DaRosa and Kaiser 2001a; Pierre et al. 2004b). In addition, the NHTI OSCEs had predictive validity; for instance, the 100% pass rate in the CO Intake 18 mock OSCE of October 2014 was repeated in the final OSCE in November 2014 (see Annex IV a & b).
However, areas of weakness in OSCE validity included the facts that not all clinical problems could be simulated, especially in paediatrics, and that the use of manikins for assessing newborn examination and CPR performance, although possible, was somewhat removed from reality (Hejaz and Downing 2008). Therefore, the NHTI faculty has to take all possible measures to minimize the negative effects of the identified weaknesses on the validity of the OSCE in the institute.
The Reliability of the OSCEs in NHTI
Reliability in the previous OSCEs in NHTI was demonstrated by the use of clear and standard marking schemes for all unmanned stations and clear checklists for assessing performance at procedural stations. In addition, knowledgeable and experienced examiners were deployed accordingly. Each examination consisted of 20 OSCE stations. These positive aspects of OSCE implementation have promoted consistency and objectivity in both the scoring at procedure stations and the marking of scripts from unmanned stations (Barmen 2005b; DaRosa and Kaiser 2001a).
However, the NHTI OSCE still had areas of weakness in reliability: there was a big difference in the marks allocated per station (from 5 to 16 marks), and not all the correct answers were included in the marking schemes, which required the examiners to use their professional judgment in marking and scoring some of the answers and can negatively affect inter-rater reliability (DaRosa and Kaiser 2001a). The NHTI faculty therefore has to put more effort into ensuring a more rational approach to the allocation of marks per OSCE station. Secondly, there is a need to make the marking schemes as comprehensive as possible. Thirdly, there is a need for continuous training of all tutors for better standards in setting both the OSCE and other tests in NHTI.
Study Limitations
This discussion on the reliability of the NHTI OSCE is not complete. As mentioned in the study methodology, Cronbach's alpha was ideally supposed to be calculated from the available OSCE results (Annex IV) to measure their internal consistency. However, the researcher lacked the capacity to accomplish this statistical calculation.
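For future evaluations, the calculation is straightforward once the results are laid out as a candidates-by-stations score matrix. A minimal sketch with illustrative scores (not the actual Annex IV data):

```python
import numpy as np

def cronbach_alpha(score_matrix):
    """Cronbach's alpha for an (n_candidates x n_stations) matrix of OSCE scores."""
    scores = np.asarray(score_matrix, dtype=float)
    k = scores.shape[1]                          # number of stations
    station_vars = scores.var(axis=0, ddof=1)    # variance of each station's scores
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of candidates' totals
    return (k / (k - 1)) * (1 - station_vars.sum() / total_var)

# Illustrative matrix: 4 candidates x 3 stations
scores = [[8, 7, 9],
          [6, 5, 6],
          [9, 8, 9],
          [5, 4, 5]]
alpha = cronbach_alpha(scores)
```

Values of alpha above roughly 0.7-0.8 are conventionally taken as acceptable internal consistency for an examination, so this single figure would let the institute track OSCE reliability from sitting to sitting.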
The Impact of the OSCE on Teaching and Learning in NHTI
According to the NHTI tutors, the OSCE had influenced teaching and learning in NHTI in the following ways: increased motivation among students to learn clinical skills; a reduction in the number of required examiners by more than 50%; a higher number of students passing clinical assessment than before the OSCE was introduced; increased student confidence in their own clinical skills; more accurate and timely feedback on students' performance; and more teamwork among tutors during clinical examinations. Other areas impacted by the OSCE included integrated assessment of all clinical disciplines in the same hall at the same time, elimination of students' complaints about bias in clinical examination results, and more learner-centred learning initiatives whereby students identify their own weaknesses in clinical skills and seek guidance from tutors. In addition, tutors have become more focused in the teaching of relevant clinical skills. The above positive effects of the OSCE on teaching and learning are similar to what was reported by Shumway and Harden (2003b); Schoonheim-Klein et al. (2006) and Zahid et al. (2011).
According to the NHTI students, the OSCE had impacted their learning as follows: they had more motivation to practice what was taught, especially in the clinical aspects of the course; it had improved their time consciousness during clinical procedures; it had increased their confidence in their own clinical skills; they were able to identify their own weaknesses in clinical performance; it had increased their chances of passing clinical examinations, unlike in the past; and it reinforced learnt clinical skills through their application. These views indicated that the students, too, appreciated the positive influence that the OSCE has had on their learning in NHTI (Brazeau et al. 2002a; Ratzmann et al. 2012; Rushforth 2006; Rush et al. 2014).
CHAPTER SEVEN: CONCLUSION & RECOMMENDATIONS
Conclusion
This educational evaluation study of the OSCE established that all eleven NHTI tutors and most of the third year and second year students had correct knowledge about the OSCE. It revealed that all the tutors and most of the students had participated in previous OSCEs and therefore had practical experience of OSCE implementation. It demonstrated majority acceptance of the OSCE by tutors and students as the preferred method of assessing clinical competence in the CO programme. There was a good rating of the previous OSCE organisation and implementation by both the faculty and students. It demonstrated that previous OSCEs had good content, face and predictive validity as well as reliability. A number of areas of concern and limitations were identified by both tutors and students, and recommendations for improvement were suggested. Overall, the OSCE has had a positive impact on the teaching and learning activities in the CO programme since it was introduced in NHTI in 2008, the most outstanding impact being that it increased students' chances of passing clinical examinations.
Recommendations
Based on the above findings the following measures were the suggested way forward for the improvement of the OSCE practice in NHTI.
a) Better OSCE organisation, which should include: improved OSCE hall organisation with better spacing between stations to ease movement; better screening for all manned stations; provision of seats at the OSCE stations for candidates who may need them; a gentler change signal during the OSCE; inclusion of rest stations to assist candidates who may need a break during the OSCE; ensuring that the OSCE procedures can be performed and/or questions can be answered within the set time per station; more time for OSCE preparation and resource mobilization to minimize last-minute fixing of things on the examination day; and availing real patients instead of simulated ones or models where possible.
b) More practice/orientation for students, which should involve introduction of the OSCE in the first year of the Clinical Officer programme and improved clinical/practical learning experiences for students using the skills lab and clinical placements at Maridi Hospital and other health facilities.
c) More examiner training, with emphasis on the following: standard-setting procedures; the development and use of standard checklists and rating scales; the design and use of standard marking schemes for the OSCE; control of personal emotions to avoid upsetting or intimidating candidates during the OSCE; and the provision of feedback to students where and when it is required.
d) Improving the validity of the OSCE by: ensuring that all OSCE questions and procedures are based on the curriculum as the blueprint; focusing on the most important procedures and relevant content when setting the OSCE; providing clear instructions for examiners and candidates per OSCE station; and providing all of the required equipment and materials during the OSCE.
e) Improving the reliability of the OSCE by: comprehensive sampling of OSCE content from the curriculum; ensuring at least 10 stations with adequate duration per OSCE station; developing and using concise marking schemes that clearly distinguish between good and poor performance; using standard checklists or rating scales for all manned stations; and availing real patients instead of simulated patients or models where possible.
f) More research on the OSCE practice in the NHTI with emphasis on the evaluation of its validity and reliability should be carried out in the future.
Final conclusion
This evaluation study of the OSCE used in the NHTI Clinical Officer Programme, Maridi, South Sudan established that all of the NHTI faculty and most of the students had correct knowledge about the OSCE. Secondly, they had a generally positive attitude towards the OSCE practice in the institute. The previous OSCEs had good content, face and predictive validity as well as reliability. Both teaching and learning had been positively impacted by the OSCE practice, resulting in better performance in overall clinical assessment than before its introduction in the institute in 2008.
REFERENCES
ANANTHAKRISHNAN, N., 1993a. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med; 39:82 http://www.jpgmonline.com/text.asp? 1993/39/2/82/628 Accessed on 5/5/2012
ANANTHAKRISHNAN, N., 1993b. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med; 39:82 http://www.jpgmonline.com/text.asp? 1993/39/2/82/628 Accessed on 5/5/2012
ANANTHAKRISHNAN, N., 1993c. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med; 39:82 http://www.jpgmonline.com/text.asp? 1993/39/2/82/628 Accessed on 5/5/2012
ANSHU, 2008. In OSCE & OSPE Listsery Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/basicsofospeosce Accessed on 26/4/2012
BARMEN, A., 2005a. Critiques on OSCE, Annals Academy of Medicine, Singapore, 34, 478-482.
BARMEN, A., 2005b. Critiques on OSCE, Annals Academy of Medicine, Singapore, 34, 478-482.
BELAY, S., and TSINUEL, G., 2008a. Examinee Perception, Department of Pediatrics and Child Health, Jimma University, in OSCE pp 47-51
BELAY, S., and TSINUEL, G., 2008b. Examinee Perception, Department of Pediatrics and Child Health, Jimma University, in OSCE pp 47-51
BOURSICOT, K. and ROBERTS, T., n.d., Principles of Standard Setting www.medev.ac.uk/static/.../51_Prinicples_of_Standard_Setting.ppt Accessed on 19/5/2012
BRANNICK, M.T., EROL-KORKMAZ H.T., PREWETT M., 2011a. A systematic review of the reliability of OSCE scores, Medical Education, 45:1181-1189
BRANNICK, M.T., EROL-KORKMAZ H.T., PREWETT M., 2011b. A systematic review of the reliability of OSCE scores, Medical Education, 45:1181-1189
BRAZEAU, C., BOYD, L.D., CROSSON, J., 2002a. Changing an Existing OSCE to a Teaching Tool: The Making of a Teaching OSCE, Association of American Medical Colleges http://journals.lww.com/academicmedicine/Fulltext/2002/09000/Changing_an_Existing_OSCE_to_a_Teaching_Tool__The.36.aspx Accessed on 21/7/2014
BRAZEAU, C., BOYD, L.D., CROSSON, J., 2002b. Changing an Existing OSCE to a Teaching Tool: The Making of a Teaching OSCE, Association of American Medical Colleges http://journals.lww.com/academicmedicine/Fulltext/2002/09000/Changing_an_Existing_OSCE_to_a_Teaching_Tool__The.36.aspx Accessed on 21/7/2014
BRAZEAU, C., BOYD, L.D., CROSSON, J., 2002c. Changing an Existing OSCE to a Teaching Tool: The Making of a Teaching OSCE, Association of American Medical Colleges http://journals.lww.com/academicmedicine/Fulltext/2002/09000/Changing_an_Existing_OSCE_to_a_Teaching_Tool__The.36.aspx Accessed on 21/7/2014
BUCKLEY, S. and ZAMORA, J., 2007. Effects of participation in a cross year peer tutoring programme in clinical examination skills on volunteer tutors' skills and attitudes towards teachers and teaching http://www.biomedcentral.com/1472-6920/7/20 Accessed on 5/5/2012
CABALLERO, C., CREED, F., GOCHMANSKI, C., LOVEGROVE, J., 2012. An introduction to OSCE assessments in Nursing OSCEs: A Complete Guide to Exam Success www.books.google.com/books?isbn=0191631116 Accessed on 12/6/2014
CARRACCIO, C. and ENGLANDER, R., 2000. The Objective Structured Clinical Examination: A Step in the Direction of Competency-Based Evaluation, Arch Pediatr Adolesc Med, 154(7):736-741. doi:10.1001/archpedi.154.7.736-741 http://archpedi.jamanetwork.com/issue.aspx?journalid=75&issued=8545 Accessed on 14/6/2014
CHAN, C., 2009. Assessment: Objective Structured Clinical Examination, Assessment Resources@HKU, University of Hong Kong http://ar.cetl.hku.hk Accessed on 25/2/2013
COHEN, R., MACINNES, A., FINKELSTEIN, J., ROTHMAN, A.I., AND ROBB, A. 1989. How to prepare and implement an Objective Structured Clinical Examination, In OSCE Manual, ASE Educational pp 50-51 www.surgicaleducation.com Accessed on 6/6/11
DAROSA, D.A. and KAISER, K., 2001. Measurement Considerations in Planning and Evaluating an OSCE, ASE: The Objective Structured Clinical Examination, pp 95-103 www.surgicaleducation.com Accessed on 6/6/2011
DAVIS, M.H., 2003. OSCE - the Dundee experience, Medical Teacher, Vol. 25, No. 3, pp 255-261 (doi:10.1080/0142159031000100292) http://informahealthcare.com/action/doSearch Accessed on 1/3/2014
EPSTEIN, R.M., 2007. Assessment in Medical Education, The New England Journal of Medicine, 356:387-396 http://www.nejm.org/doi/full/10.1056/nejmra054784 Accessed on 2/6/2012
Explorable.com, 2010. Cronbach’s Alpha. https://explorable.com/cronbachs-alpha Accessed on 20/2/2015
FRIEDMAN, B.M., 2003. Life beyond OSCE, Medical Teacher , Vol.25 No. 3, May 2003 pp 239
GUPTA, P., DEWAN, P., SINGH, T., 2010. Objective Structured Clinical Examination (OSCE) Revisited, Department of Pediatrics, University College of Medical Sciences, Delhi, India. 2010 Nov 7; 47(11):911-20
HAMDY, H., 2006. Blueprinting for the assessment of health care professionals, The Clinical Teacher, 3:175-179
HARASYM, P.H., WOLOSCHUK, W., CUNNING, L., 2008. Undesired Variance Due to Examiner Stringency/Leniency Effect in Communication Skill Scores Assessed in OSCEs,Advances in Health Sciences Education, v13 n5 p617-632 http://eric.ed.gov/?q=Reliability+in+OSCE&ff1=subReliability&id=EJ821766 Accessed on 28/7/2014
HARDEN, R.M., and GLEESON, F.A., 1979. Assessment of clinical competence using an objective structured clinical examination (OSCE), 29. DOI: 10.1111/j.1365-2923.1979.tb00918.x http://onlinelibrary.wiley.com/doi/101111/medu.1979.13issue1/issuetoc Accessed on 8 Jan 2011
HARDEN, R.M., 1990. Twelve tips for organizing an OSCE, Medical Teacher, Volume 12, No. 3/4, pp 259
HARDEN, R.M., 2006. Trends and the future of postgraduate medical education, Emergency Medicine Journal 2006;23: 798-802 doi:10.1136/emj.2005.033738 http://emj.bmj.com/content/23/10/798 Accessed on 1/5/2012
HARDEN, R.M., 2011. Looking Back to the Future: A message for a new generation of Medical Educators. Medical Education 2011; 45: 779-780
HIJAZI, M. and DOWNING, S.M., 2008. Objective structured clinical examinations as an assessment method in residency training: Practical considerations. Ann Saudi Med [serial online] 28:192-9. http://www.saudiannals.net/text.asp?2008/28/3/192/51734 Accessed on 8 Jan 2011
HODDER, S.L. and PAPP, K.K., 2001. The role of OSCE in the Assessment of Student Performance, In ASE: The Objective Structured Clinical Examination, pp 20-22 www.surgicaleducation.com Accessed on 6/6/2011
HODGES, B., REGEHR, G., HANSON, M., 1998. Validation of an objective structured clinical examination in psychiatry. Academic Medicine, 73, 910–912. http://apt.rcpsych.org/external-ref?access_num=9736854 Accessed on 22/2/2014
HOLLIS, W.M. and JAMES, A.M., 2001. The OSCE: Present and Future, In ASE: The Objective Structured Clinical Examination, pp 13-14 www.surgicaleducation.com Accessed on 6/6/2011
JAMES, A.M., 2001. Components of the OSCE, In ASE: The Objective Structured Clinical Examination pp14-165 www.surgicaleducation.com Accessed on 6/6/2011
JOHNSON, G. and REYNARD, K., 1994. Assessment of an objective structured clinical examination (OSCE) for undergraduate students in accident and emergency medicine, Journal of Accident and Emergency Medicine, 1994, 11, 223-226
KALBITZ, M., LIENER, U., KORNMANN, M., GEBHARD, F., HUBER-LANG, M., n.d. Evaluation of an objective structured clinical examination (OSCE) in surgery and orthopedics by medical students. http://www.ncbi.nlm.nih.gov/pubmed Accessed on 19/2/2013
KALYAN, 2008. Practical Experiences of OSCE, in Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/practicalexperiencesofospeosce Accessed on 7/1/2012
KAMRAN, Z.K. , KATHRYN, G., SANKARANARAYANAN, R., PIYUSH, P., 2013. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: Organisation & Administration, Vol. 35, No. 9 , pp e1447-e1463 http://informahealthcare.com/action/doSearch Accessed on 1/3/2014
KAUFMAN, D.M., MANN, K.V., MUIJTJENS, A.M.M., VAN DER VLEUTEN, C.P.M., 2000. A Comparison of Standard-setting Procedures for an OSCE in Undergraduate Medical Education, Advances in Health Sciences Education, Volume 9, Issue 3, pp 201-209 http://journals.lww.com/academicmedicine/Fulltext/2000/03000/A_Comparison_of_Standard_setting_Procedures_for_an.18.aspx Accessed on 27/7/2014
KHAIRY, G.A., 2004. Feasibility and acceptability of OSCE for a large number of candidates experience at a university hospital, Journal of family & community medicine www.ncbi.nlm.nih.gov/medlineplus/badbreath.html Accessed on 26/9/2013
KILMINSTER, S. and ROBERTS, T., 2002. Standard Setting for OSCEs: Trial of Borderline Approach http://link.springer.com/article/10.1023%2FB%3AAHSE.0000038208.06099.9a Accessed on 27/7/2014
MARLIYYA, Z., 2011. Objective Structured Clinical Examination: The Assessment of Choice, Oman Med J. Jul 2011; 26(4): 219–222. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3191703/ Accessed on 12/6/2014
MARWAHA, S., 2011. Objective Structured Clinical Examinations (OSCEs), psychiatry and the Clinical Assessment of Skills and Competencies (CASC): Same Evidence, Different Judgment, BMC Psychiatry, 11:85 http://www.biomedcentral.com/1471-244X/11/85 Accessed on 16/2/2015
MASH, B., 2002. Assessing Clinical Skills: Standard Setting in the Objective Structured Clinical Exam (OSCE), South African Family Practice http://www.hrhresourcecenter.org/node/1681 Accessed on 27/7/2014
MOATTARI, M., ABDOLLAH-ZARGAR, S., MOUSAVINASAB, M., ZARW, N., BEYGI, M.P., 2009. Reliability and validity of OSCE in Evaluating Skills of Nursing Students. Journal of Medical Education, Summer 2009, Vol. 13, No. 3, 79-85 http://www.SID.iv Accessed on 23/2/2013
MODAL, R., SARKAR S., NANDI, M., HAZRA, A., 2004. Comparative analysis between OSCE and Conventional examination as a formative evaluation tool in Paediatrics in semester examination for final MBBS students, Kathmandu University medical journal (KUMJ) www.ncbi.nlm.nih.gov/pubmed/22971865 Accessed on 26/9/2013
MOSSEY, P.A., NEWTON J.P., STIRRUPS, D.R., 2001. OSCE: Scope of the OSCE in the assessment of clinical skills in dentistry, British Dental Journal 190, 323 - 326 http://www.nature.com/bdj/archive/bdj_s21_current_archive.html Accessed on 1/May/2012
NORCINI, J.J., 2003. Setting standards on educational tests, in The metric of medical education, Medical Education 2003; 37:464–469
OHIO UNIVERSITY COLLEGE OF OSTEOPATHIC MEDICINE, 2012. OSCE Questions, Heritage College of Osteopathic Medicine. http://www.oucom.ohiou.edu/academicaffairs/predoc/osce/index.html Accessed on 26/10/2012
OSCEHOME, n.d., What is OSCE? http://www.oscehome.com/what_is_objective_structured_exmaination_OSCE.html Accessed on 18/5/2014
PATIL, N.G., SAING, H., WONG, J., 2003. Role of OSCE in evaluation of practical skills, Medical Teacher, Vol. 25, No. 3 , Pages 271-272
PATRÍCIO, M.F., JULIÃO, M., FARELEIRA, F., CARNEIRO A.V., 2013. Is the OSCE a feasible tool to assess competencies in undergraduate medical education?, Med Teach. 35(6):503-14.
PIERRE, R.B., WIERENGA, A., BARTON, M., BRANDAY, J.M., CHRISTIE, C.D., 2004. Student evaluation of an OSCE in Paediatrics at the University of the West Indies, Jamaica, BMC Medical Education 2004, 4:22 http://www.biomedcentral.com/1472-6920/4/22/prepub Accessed on 21/2/2013
POFAHL, W.E., and SLOAN, D., 2001. Implementation of OSCE In ASE: The Objective Structured Clinical Examination, pp 28 www.surgicaleducation.com Accessed on 6/6/2011
PRAVEEN, R.S., 2008. In OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/howtocollecttheanswersheetsinosceospe Accessed on 5/5/2012
PRAVEEN, R.S., 2008. In OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/simulatedpatients Accessed on 5/5/2012
PRAVEEN, R.S. and SUMAN, P.S., 2008. In What is OSCE OSPE, Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/discussionsummarywhatisosceospe Accessed on 26/4/2012
PRAVEEN, R.S. and SUMAN, P.S., 2008. In Process - OSCE and OSPE, Listserv Discussion, CMCL-FAIMER http://sites.google.com/sites/faimertraining2008/processosceandospe Accessed on 5/5/2012
PRAVEEN, R.S. and SUMAN, P.S., 2008. In Is OSPE glorified spotting, Listserv Discussion, CMCL-FAIMER http://sites.google.com/sites/faimertraining2008/02test Accessed on 5/5/2012
PRISLIN, M.D., FITZPATRICK, C.F., LIE, D., GIGLIO, M., RADECKI, S., LEWIS, E., 1998. Use of an Objective Structured Clinical Examination in Evaluating Student Performance, Fam Med, 1998; 30(5):338-44 http://www.dhpescu.org/media/elip/use%20of%20osce%20in%20student%20evaluation.pdf Accessed on 27/7/2014
RAJ, N., BADCOCK, L.J., BROWN, G.A., DEIGHTON, C.M., O’REILLY, S.C., 2007. Design and Validation of 2 Objective Structured Clinical Examination Stations to Assess Core Undergraduate Examination Skills of the Hand and Knee, The Journal of Rheumatology Publishing Company Limited, London http://www.jrheum.com/arhives/febo7.html Accessed on 7/6/2012
RATZMANN, A., WIESMANN, U., KORDAB, B., 2012. Integration of an Objective Structured Clinical Examination (OSCE) into the dental preliminary exams, GMS Z Med Ausbild, 29(1):Doc09 http://www.ncbi.nlm.nih.gov/pubmed/22403594 Accessed on 27/7/2014
RUSHFORTH, H.R., 2006. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education, Nurse Education Today http://www.nurseeducationtoday.com/article/S0260-6917%2806%2900138-9/abstract Accessed on 29/7/2014
RUSH, S., OOMS A., MARKS-MARAN, D., FIRTH, T., 2014. Students' perceptions of practice assessment in the skills laboratory: An evaluation study of OSCAs with immediate feedback. Nurse Educ Pract., 2. pii: S1471-5953(14)00088-2. http://www.ncbi.nlm.nih.gov/pubmed/25042546 Accessed on 29/7/2014
SACHDEVA, A.K., 2001. Implementation and Uses of Objective Structured Clinical Examinations (OSCE). In OSCE Manual: ASE Educational pp 41 www.surgicaleducation.com Accessed on 6/6/2011
SAJJAD, 2008. In OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/basicsofospeosce Accessed on 26/4/2012
SAJJAD, 2008. In How to Set up OSCE/OSPE stations, OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/sites/faimertraining2008/03howtosetuposceandospestations Accessed on 5/5/2012
SANJAY, B., 2008. In OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/levelsofcompetenciesinevaluation Accessed on 5/5/2012
SCHOONHEIM-KLEIN, M.E., HABETS, L.L., AARTMAN, I.H., VAN DER VLEUTEN. C.P., HOOGSTRATEN, J., VAN DER VELDEN, U., 2006. Implementing an Objective Structured Clinical Examination (OSCE) in dental education: effects on students' learning strategies, Eur J Dent Educ., 10(4):226-35. http://www.ncbi.nlm.nih.gov/pubmed/17038015 Accessed on 29/7/2014
SHULRUF, B., TURNER, R., POOLE, P., WILKINSON, T., 2013. The Objective Borderline Method (OBM): A Probability-Based Model for Setting up an Objective Pass/Fail Cut-Off Score in Medical Programme Assessments,Advances in Health Sciences Education, v18 n2 p231-244 May 2013 http://eric.ed.gov/?q=standard+setting+approach+in+OSCE&id=EJ999019 Accessed on 28/7/2014
SHUMWAY, J.M. and HARDEN, R.M., 2003. The assessment of learning outcomes for the competent and reflective physician www.fmhs.uaeu.ac.ae/researchgroups/merg/assessment/amee.pdf Accessed on 23/2/2013
SMALL, L.F., 2013. Students’ perceptions regarding the objective structured clinical evaluation as an assessment approach www.hsag.co.za/index.php/HSAG/issue/view/62 Accessed on 29/1/2014
TAVAKOL, M. and DENNICK, R., 2011. Making sense of Cronbach’s alpha, International Journal of Medical Education, 2:53-55
TAYYABA, S., HAWTHORNE, K., SMITH, P., 2014. Students’ evaluation of the OSCE experience in Cardiff Medical School Institute of Medical Education, School of Medicine, Cardiff University, Cardiff University School of Medicine C21 Curriculum Showcase http://www.scribd.com/doc/214626464/Students-evaluation-of-the-OSCE-experience-in-Cardiff-Medical-School Accessed on 26/7/2014
TEJINDER, S., 2008. In OSCE & OSPE Listserv Discussion, CMCL-FAIMER http://sites.google.com/site/faimertraining2008/standardsettinginosce2 Accessed on 19/5/2012
TOWNSEND, A.H., MCLLVENNY, S., MILLER, C.J., DUNN, E.V., 2001. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance, Med Educ. 35(9):841
TRONCON, L.E. DE ALMEIDA, 2004. Clinical skills assessment: Limitations to the Introduction of an "OSCE" (Objective Structured Clinical Examination) in a traditional Brazilian medical school. Sao Paulo Medical Journal, 122(1), 12-17 http://dx.doi.org/10.1590/S1516-31802004000100004 Accessed on 5/5/2012
TURNER, J.L. and DANKOSKI, M.E., 2008. Objective Structured Clinical Exams: A Critical Review, Fam Med, 40(8):574-8
VARKEY, P., NATT, N., LESNICK, T., DOWNING, S., YUDKOWSKY, R., 2008. Validity Evidence for an OSCE to Assess Competency in Systems-Based Practice and Practice-Based Learning and Improvement: A Preliminary Investigation, Academic Medicine, Vol. 83, No. 8, August 2008, pp 775–780 www.ncbi.nlm.nih.gov/pubmed/18667895 Accessed on 1/3/2014
WALTERS, K., OSBORN, D., RAVEN, P., 2005. The development, validity and reliability of a multimodality OSCE in psychiatry, Medical Education, 39:292-298
WILKINSON, T.J., NEWBLE, D.I., FRAMPTON, C.M., 2002. Standard setting in an objective structured clinical examination: use of global ratings of borderline performance to determine the passing score, Med Educ. http://www.ncbi.nlm.nih.gov/pubmed/11703640 Accessed on 27/7/2014
WOOD, T.J., HUMPHREY-MURTO, S.M., NORMAN, G.R., 2005. Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method, Advances in Health Sciences Education, 11:115–122 http://link.springer.com/article/10.1007%2Fs10459-005-7853-1 Accessed on 21/2/2015
ZAHID, M.A., AL-ZAYED, A., OHAERI, J., VARGHESE, R., 2011. Introducing the Objective Structured Clinical Examination (OSCE) in the Undergraduate Psychiatric Curriculum: Evaluation after One Year http://eric.ed.gov/?q=validity+in+OSCE&id=EJ950864 Accessed on 28/7/2014
ANNEXES
ANNEX I
QUESTIONNAIRE A: ON THE OSCE EVALUATION IN NHTI-MARIDI, SOUTH SUDAN
(FOR TUTORS)
Questionnaire Number .[Leave this space blank please]
Dear Participant
You are kindly requested to participate in this study by answering the following questions. Your information will be treated with utmost confidentiality and used for the purpose of this study only. You are free not to indicate your names at all. In addition, your participation in this study is both voluntary and optional. It may take you about 5-10 minutes to fill this form.
1. What is your date of birth ….. /…../..
2. What is your gender? [Tick one]
[Illustration not included in this excerpt]
3. What is your nationality? [Tick one]
[Illustration not included in this excerpt]
4. What is your highest level of education? [Tick one]
[Illustration not included in this excerpt]
5. What professional qualifications do you hold? [Tick the one(s) that apply]
[Illustration not included in this excerpt]
Others, specify.
6. Please list the subjects you teach and/or examine in NHTI in the following spaces.
[Illustration not included in this excerpt]
7. Which of the following statements best explains what is meant by Objective Structured Clinical Examination [OSCE]? [Circle the most correct statement]
a) OSCE is any type of examination that is used to measure clinical performance of candidates.
b) OSCE is any form of performance based testing used to measure candidates’ clinical competence.
c) OSCE is a multi-station assessment approach primarily used to measure clinical competence of candidates as they go through each station objectively.
d) It is an examination consisting of objective questions only.
8. How many stations should an ideal OSCE have?
9. Have you ever participated in an OSCE before? [Tick one]
[Illustration not included in this excerpt]
10. If yes to the above, what was your role in the OSCE? [Tick the appropriate one(s)]
[Illustration not included in this excerpt]
Others, specify…...
11. If yes to the above, in what type of examination was the OSCE? [Tick one]
[Illustration not included in this excerpt]
12. In what subject was the OSCE?
13. How many stations did the whole OSCE have? .
14. How much time was allowed at each OSCE station? .
15. According to you, was the time allocated per station adequate? [Tick one]
[Illustration not included in this excerpt]
16. If No to the above question, how much time would be adequate for you per station?
17. According to you, how tough was the test/examination? [Tick one]
[Illustration not included in this excerpt]
18. According to you, the content of the OSCE questions and procedures were [Tick one]
[Illustration not included in this excerpt]
19. If the last OSCE was administered again to another class, it is likely to [Tick one]
[Illustration not included in this excerpt]
Briefly explain your choice of the above options
.….
20. How were the OSCE stations organized? [Tick one]
[Illustration not included in this excerpt]
21. How well was the whole OSCE conducted? [Tick one]
[Illustration not included in this excerpt]
22. Compared with the long-case/short-case type of clinical examination, what do you like about the OSCE type of clinical examination? Tick whether you agree or disagree with each of the statements in the following table:
SA = Strongly agree, A = Agree, U = Uncertain, D = Disagree, SD = Strongly disagree
[Illustration not included in this excerpt]
23. Please list anything you do not like about OSCE practice in NHTI.
[Illustration not included in this excerpt]
24. Please give suggestions about what should be done to improve OSCE practice in NHTI.
[Illustration not included in this excerpt]
25. Suggest ways in which OSCE has influenced the teaching/learning in NHTI.
[Illustration not included in this excerpt]
26. Which of the following approaches of assessing clinical competence would you like to promote in NHTI? [Tick one]
a) Long case/short case format
b) OSCE type
c) Both a and b
After filling it in, please hand this questionnaire back to the person who gave it to you. Many thanks indeed for your participation.
ANNEX II
QUESTIONNAIRE B: ON THE OSCE EVALUATION IN NHTI-MARIDI, SOUTH SUDAN
(FOR STUDENTS)
Questionnaire Number . [Leave this space blank please]
Dear Participant
You are kindly requested to participate in this study by answering the following questions. Your information will be treated with utmost confidentiality and used for the purpose of this study only. You are free not to indicate your names at all. In addition, your participation in this study is voluntary and optional. Non-participation carries no risk on your part. It may take you about 5-10 minutes to fill in this form.
1. In which year of the course are you? [Tick one]
[Illustration not included in this excerpt]
2. What is your date of birth? …./….../.
3. Which state of South Sudan do you come from? [Tick one]
[Illustration not included in this excerpt]
4. What is your highest level of education? [Tick one]
[Illustration not included in this excerpt]
Others, specify….
5. What other course(s) had you done before joining NHTI? [Tick the one(s) that apply]
[Illustration not included in this excerpt]
Others, specify
6. Which of the following statements best explains what is meant by Objective Structured Clinical Examination [OSCE]? [Circle the most correct statement]
a) OSCE is any type of examination that is used to measure clinical performance of candidates
b) OSCE is any form of performance based testing used to measure candidates’ clinical competence.
c) OSCE is a multi-station examination approach primarily used to measure clinical competence of candidates as they go through each station objectively.
d) OSCE is an examination consisting of objective questions only.
7. Have you ever participated in an OSCE before? [Tick one]
[Illustration not included in this excerpt]
8. If yes, what was your role in the OSCE? [Tick one]
[Illustration not included in this excerpt]
9. If yes to the above, in what type of examination was it? [Tick one]
[Illustration not included in this excerpt]
10. How many stations did the whole examination have?
11. How much time was allowed at each station? ...
12. According to you, was the time allocated per station adequate? [Tick one]
[Illustration not included in this excerpt]
13. If No to the above question, how much time would be adequate for you per station?..
14. According to you, the test/examination was [Tick one]
[Illustration not included in this excerpt]
15. According to you, the content of the test/examination questions was [Tick one]
[Illustration not included in this excerpt]
16. How were the OSCE stations organized? [Tick one]
[Illustration not included in this excerpt]
17. How was the whole OSCE conducted? [Tick one]
[Illustration not included in this excerpt]
18. Compared with the long-case/short-case type of clinical examination, what do you like about the OSCE type of clinical examination? Tick whether you agree or disagree with each of the statements in the following table:
SA = Strongly agree, A = Agree, U = Uncertain, D = Disagree, SD = Strongly disagree
[Illustration not included in this excerpt]
19. Please list anything you do not like about OSCE practice in NHTI.
[Illustration not included in this excerpt]
20. Please give suggestions about what should be done to improve OSCE practice in NHTI.
[Illustration not included in this excerpt]
21. How has the introduction of the OSCE type of assessment in NHTI influenced your learning?
[Illustration not included in this excerpt]
22. Which of the following approaches of assessing clinical competence would you like to promote in NHTI? [Tick one]
a) Long case/short case format
b) OSCE type
c) Both a and b
After filling it in, please hand this questionnaire back to the person who gave it to you. Many thanks indeed for your participation.
ANNEX III
CHECKLIST FOR ASSESSING VALIDITY AND RELIABILITY OF PREVIOUS OSCEs IN NHTI
[Illustration not included in this excerpt]
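The checklist above concerns the reliability of the previous OSCEs, which this study judged using internal-consistency statistics such as Cronbach's alpha (see Tavakol & Dennick 2011 in the reference list). The following is a minimal sketch of how alpha can be computed from a mark sheet of per-station scores; the data and the helper name `cronbach_alpha` are hypothetical illustrations, not taken from the NHTI records.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a mark sheet: rows = candidates, columns = stations."""
    k = len(scores[0])  # number of stations
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    # Variance of each station's scores across candidates
    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    # Variance of each candidate's total score
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores for five candidates at four OSCE stations
data = [
    [7, 8, 6, 7],
    [5, 6, 5, 6],
    [9, 9, 8, 9],
    [4, 5, 4, 5],
    [6, 7, 6, 6],
]
print(round(cronbach_alpha(data), 2))
```

Values of alpha above roughly 0.7-0.8 are conventionally read as acceptable reliability for an examination of this kind.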
General comments
- All candidates passed the final OSCE
- 26% passed with distinctions (75% & above)
- 28% passed with credits (65-74%)
Designed by the Academic Master, NHTI
b) RESULTS OF CO INTAKE 18 MOCK OSCE OCT 2014
[Illustration not included in this excerpt]
General comments
- All candidates passed the final OSCE
- No candidate passed with distinctions (75% & above)
- 24% passed with credits (65-74%)
Designed by the Academic Master, NHTI
c) RESULTS OF INTAKE 17 CO MOCK OSCE OCT 2013
[Illustration not included in this excerpt]
General comments
- The overall pass rate was 69%
- No Candidate passed with distinctions (75% & above)
- 10% of them passed with credits (65-74%)
Designed by the Academic Master, NHTI
d) RESULTS OF FINAL NHTI SUPPLEMENTARY APRIL 2014
[Illustration not included in this excerpt]
General comments
- All the referred Candidates passed.
- One candidate (Elijah) did not return to repeat OSCE due to insecurity
Prepared by the Academic Master, NHTI
ANNEX V
PERMISSION TO CONDUCT A STUDY IN NHTI
[Illustration not included in this excerpt]
ANNEX VI
Maps of South Sudan showing political boundaries and main towns
[Illustration not included in this excerpt]
ANNEX VII: AERIAL MAP & PHOTO OF NHTI-MARIDI
This image has been removed due to copyright reasons.
[Illustration not included in this excerpt]
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X. -
Upload your own papers! Earn money and win an iPhone X.