Assessing Midwifery and Medical Students' Interprofessional Learning: The Use of Portfolio

Bulan Kakanita Hermasari, Ari Natalia Probandari, Zulaika Nur Afifah

Abstract


This study aimed to evaluate the use of a portfolio in assessing interprofessional learning between medical and midwifery students. Student portfolios were assessed using a rubric consisting of four assessment criteria. A total of 32 student portfolios were tested for reliability coefficients and inter-rater agreement. We conducted in-depth interviews with mentors and a focus group discussion (FGD) with students to explore their perceptions of the portfolio's ability to assess interprofessional learning. Interview and FGD data were transcribed verbatim and then analyzed by two coders using open coding techniques. The reliability coefficient was 0.808, and inter-rater agreement for each assessment criterion ranged from moderate to high. Mentors and students expressed positive views of the assessment system. This study supports the use of portfolios as an interprofessional education assessment tool.
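The abstract reports a reliability coefficient (0.808) and per-criterion inter-rater agreement, but does not state here which statistics were computed. Analyses of this kind are commonly performed with Cronbach's alpha (internal consistency across rubric criteria) and Cohen's kappa (chance-corrected agreement between two raters). The sketch below is illustrative only, using hypothetical rubric scores rather than the study's data, and assumes these two standard statistics:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_portfolios x n_criteria) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # variance of each criterion
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa between two raters' categorical judgements."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    po = np.mean(a == b)                             # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)       # agreement expected by chance
             for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical rubric scores: 5 portfolios x 4 criteria (1-4 scale).
scores = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 4, 4, 3], [1, 2, 2, 1], [3, 3, 4, 4]]
print(f"alpha = {cronbach_alpha(scores):.3f}")

# Hypothetical ratings of one criterion by two raters for eight portfolios.
rater_a = [1, 2, 3, 1, 2, 3, 1, 2]
rater_b = [1, 2, 3, 1, 2, 1, 1, 2]
print(f"kappa = {cohen_kappa(rater_a, rater_b):.3f}")
```

Kappa values are conventionally read in bands (e.g. moderate, substantial, near-perfect agreement), which matches the abstract's "moderate to high" description of the criterion-level results.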

Keywords


Assessment, Interprofessional learning, Medical, Midwifery, Portfolio



DOI: http://dx.doi.org/10.11591/edulearn.v12i4.8713




Copyright (c) 2018 Universitas Ahmad Dahlan

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


Journal of Education and Learning (EduLearn)
ISSN: 2089-9823, e-ISSN: 2302-9277
Published by: Universitas Ahmad Dahlan (UAD) in collaboration with Institute of Advanced Engineering and Science (IAES)
