This blog post was written in partnership with my classmate and friend Todd Pezer. This blog post was originally written as part of a course requirement for the Royal Roads Master of Arts in Learning and Technology program.
Design thinking is a human-centred, iterative process used to build empathy for others' perspectives on a complex problem. It is employed before the search for solutions begins, improving the odds of an effective outcome. Organizations have applied design thinking successfully to improve products, services, processes, and education. Our team used the learning tools and design thinking process from the Stanford University d.school (Stanford, 2016) to investigate common challenges within our respective organizations and to develop a prototype solution that would be useful and meaningful for each team member.
Our team consisted of a pilot educator from within the airline industry and a paramedic educator from a community college. Despite varied competency-based objectives and standards within each organization, both educators identified common challenges during the early stages of the design thinking process.
Two challenges resonated with both educators: is the learning material delivered to students sufficiently relevant and engaging to maintain their interest, and to what degree are students engaging with the artefacts and educational resources? After addressing these questions, our team hopes to better understand students' level of engagement and will then repeat the design thinking process to develop solutions for improvement.
For each delivery model in the respective organizations (online, blended, and in-class), our team speculates that a disconnect exists between students' actual engagement and the educator's perception of that engagement.
To measure the level of student engagement, the team will develop a multi-item Likert-scale assessment (Gliem & Gliem, 2003), used both formatively and summatively, and incorporate it within, and at the conclusion of, each learning module. The assessment will be embedded in the learning materials themselves, for example a Captivate, Blackboard, Moodle, or other learning management system module. Students must complete the assessment before proceeding through larger modules and again at the conclusion of each learning module. Educator perspectives on student engagement within the module will be measured with a similar assessment tool.
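The Gliem & Gliem (2003) paper we drew on focuses on checking the internal consistency of multi-item Likert scales with Cronbach's alpha. As a minimal sketch of that check, here is a small Python function; the ratings are invented for illustration, not real assessment data.

```python
# Sketch: Cronbach's alpha for a multi-item Likert assessment, following
# the approach described in Gliem & Gliem (2003). The ratings below are
# hypothetical example data, not results from our prototype.

def cronbach_alpha(responses):
    """responses: list of per-student lists, one 1-5 rating per Likert item."""
    n_items = len(responses[0])

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    # Variance of each item across students
    item_vars = [variance([r[i] for r in responses]) for i in range(n_items)]
    # Variance of each student's total score across all items
    total_var = variance([sum(r) for r in responses])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Five students, four Likert items (1 = strongly disagree ... 5 = strongly agree)
ratings = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(ratings), 2))  # prints 0.94
```

Gliem & Gliem suggest alpha values of roughly 0.8 or higher indicate a scale reliable enough to report a summed score, which is the threshold we would apply before trusting the engagement totals.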
A combination report, consisting of a negotiated score derived from both the educator and student assessments, will be generated for each learning module. We anticipate that these reports will provide valuable insight into students' actual level of engagement compared with the educator's perception of it. We also expect that engagement patterns will emerge from the in-situ assessments; in particular, patterns showing which types and qualities of material produced increased student engagement during the learning module.
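Since the negotiation mechanism for the combined score is still being designed, one simple possibility is to average the student-side mean with the educator's own rating and report the gap between the two perspectives. The sketch below assumes 1-5 Likert scores on both sides; the function name and all numbers are hypothetical.

```python
# Hypothetical sketch of the per-module combination report. The negotiated
# score here is simply the mean of the student-side average and the
# educator's rating; the real negotiation process is still to be defined.

def combination_report(student_scores, educator_score):
    """student_scores: students' 1-5 Likert ratings for a module;
    educator_score: the educator's own 1-5 rating of engagement."""
    student_mean = sum(student_scores) / len(student_scores)
    return {
        "student_mean": round(student_mean, 2),
        "educator_score": educator_score,
        "negotiated_score": round((student_mean + educator_score) / 2, 2),
        # Positive gap: educator perceives more engagement than students report
        "perception_gap": round(educator_score - student_mean, 2),
    }

report = combination_report([4, 3, 5, 2, 4], educator_score=4)
print(report)
```

Tracking the perception gap per module is what would let us test the speculated disconnect between student engagement and the educator's perspective on it.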
Our team identified measuring the level of engagement as the first step in a larger initiative. Once the initial findings are assessed, we plan to expand the prototype to track the engagement of individual students and compare it with other education metrics, including attendance, test scores, and competency attainment. Additional one-on-one interviews about students' engagement with the learning modules are expected to be beneficial but are outside the scope of this prototype.
Our team identified that timely collection and distribution of the combination report will allow educators to self-reflect and adapt subsequent learning modules to improve student engagement. Training educators in strategies for improving student engagement would benefit both organizations. Comprehensive training on the use of the formative and summative assessments would also need to be provided to each educator and student to ensure compliant and accurate completion.
Thank you for reviewing our design thinking process summary. Part two of our process will be posted soon. I look forward to your feedback and insight.
Gliem, J. A., & Gliem, R. R. (2003). Calculating, interpreting, and reporting Cronbach's alpha reliability coefficient for Likert-type scales. In Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education (pp. 1–88). Columbus, OH. Retrieved from https://scholarworks.iupui.edu/
Mattelmäki, T., Vaajakallio, K., & Koskinen, I. (2014). What happened to empathic design? Design Issues, 30(1), 67–77. https://doi.org/10.1162/DESI_a_00249
Stanford University Institute of Design. (2016). A virtual crash course in design thinking. Retrieved from http://dschool.stanford.edu/dgift/
Stanford University Institute of Design. (2016). The Virtual Crash Course Playbook. Retrieved from http://dschool.stanford.edu/dgift/
Tran, N. (2016). Design Thinking Playbook. Retrieved from https://dschool.stanford.edu/resources/design-thinking-playbook-from-design-tech-high-school
Image by Karolina Grabowska is licensed under CC BY 4.0 (CC0 license)