Background: A substantial body of research has examined feedback practices, yet the assessment and feedback landscape in higher education is described as ‘stubbornly resistant to change’. This paper presents a case study demonstrating how an entire programme’s assessment and feedback practices were re-engineered and evaluated in line with evidence from the literature in the interACT (Interaction and Collaboration via Technology) project.

Methods: Informed by action research, the project conducted two cycles of planning, action, evaluation and reflection. Four key pedagogical principles informed the redesign of the assessment and feedback practices. Evaluation activities included document analysis, interviews with staff (n = 10) and students (n = 7), and student questionnaires (n = 54). Descriptive statistics were used to analyse the questionnaire data, and framework thematic analysis was used to develop themes across the interview data.

Results: Students and staff reported that interACT promoted self-evaluation, engagement with feedback and feedback dialogue. Streamlining the process after the first cycle of action research was crucial for improving the engagement of both students and staff. The interACT process of promoting self-evaluation, reflection on feedback, feedback dialogue and longitudinal perspectives on feedback has clear benefits.

Conclusions: InterACT involved comprehensive re-engineering of the assessment and feedback processes, using educational principles to guide the design while taking stakeholder perspectives into account. These principles, and the strategies used to enact them, should be transferable to other contexts.
Barton, K. L., Schofield, S. J., McAleer, S., & Ajjawi, R. (2016). Translating evidence-based guidelines to improve feedback practices: the interACT case study. BMC Medical Education, 16, Article 53. https://doi.org/10.1186/s12909-016-0562-z