By: Don Fraser
At Education Design Lab, we believe in the importance of intentionally training learners in 21st-century skills. We help colleges and universities design programs, micro-credentials, and micro-pathways that are tied to employers’ needs and that can stack to a full degree. We rely on micro-credentialing to make that learning visible to educators and employers, but micro-credentialing is still in its infancy, and scaling it remains a tough nut for institutions to crack.
In recent years, fueled by the pandemic, higher education has been coming around to the idea of offering more discrete credentials: unpacking a degree, boiling it down to its component parts, and offering digital badges for those parts as they’re mastered. This allows students to earn value along the way, rather than only at the completion of a course or a degree. The approach is particularly useful in validating a student’s mastery of 21st-century skills, which tend not to be taught in a class of their own, but implicitly within courses and majors.
The problem in the world of micro-credentialing is that assessment, when done correctly, is a complicated process. Professors are already doing a lot of assessment, or they’re bringing in a teaching assistant because they need additional capacity to handle the grading. In many cases, that additional assessing and grading could be a non-starter for institutions interested in offering rigorous micro-credentials.
To solve that challenge, we partnered with Hope College and Muzzy Lane, a company that provides a platform and other support to help educators create simulations for students to practice and assess skills, to pilot the use of simulations for assessing critical thinking skills. Here’s how it worked.
From our years of research, one thing we’ve learned from employers is that performance-based assessments are the most effective way to evaluate and demonstrate competency in 21st-century skills, so we worked with employers to develop a set of assessments they’d find useful. We offer micro-credentials in eight of these skills, including:
- Creative Problem-Solving
- Intercultural Fluency
- Oral Communication
- Critical Thinking
Those assessments were good, but not great, mainly because they were time-intensive to administer and to grade.
A friend had introduced me to Muzzy Lane and the learning and assessment simulations they helped educators build, and we began to think that simulations might solve that challenge for us. So we started creating workplace scenarios based on the assessments we’d already built. We chose to start with critical thinking because, according to our work, it is the skill most in demand among higher education institutions and employers, regardless of industry.
One of the benefits of simulations is that they allow teachers who are not experts in the skill being assessed to assess students. We often hear from faculty, “I teach history. I’m not an expert in critical thinking as a skill. How do I teach that?” With simulations, the teacher simply puts the experience in front of students; the simulation draws on their skills and assesses them automatically. The teacher doesn’t need to know what to look for in answers or what level of mastery the student must demonstrate to pass. That’s all handled on the back end.
Simulations also provide another level of engagement. As a lifelong educator, I know that we learn best by applying the skills that we’re learning. Reading and listening to lectures is important, but it’s not a direct application of skills relevant to the real world. A simulation will ask students to do exactly that: to put the knowledge they are learning to use in a way that they haven’t done in the classroom, an internship, or likely anywhere else at all.
Creating the Simulation
Creating the simulation turned out to be a pretty intensive process, but we learned a lot from the experience.
Generally, faculty who are using the Muzzy Lane platform author their own simulations, tailored specifically to their class and content. In our case, we were looking to create a more universally applicable simulation that would fit into any class with minimal tweaking. We also had our rubric from the previous assessment, which had been battle-tested and proven, and which we did not want to change.
We mapped our assessment rubrics to the four sub-competencies we’ve identified within critical thinking. Since we wanted a simulation that cut all the way across critical thinking, it needed to include all four sub-competencies. With advice and software expertise from the Muzzy Lane team, we created a scenario that provides a dynamic setting and experience for measuring all four. In the end, we had not only an interactive simulation, but a single assessment where we’d previously needed four.
Piloting the Simulation
We chose to partner with Hope College because they were one of the institutions that helped us design the critical-thinking framework, assessments, and rubric. We asked students who had already completed the original assessments to go through the simulation. To our delight, we received a lot of great feedback from the students. They said the simulation was more engaging and more fun than traditional assessments; UX matters to us, so this was music to our ears. But what we were really interested in was the response from educators. While the rubric was the same as in the previous assessment, the scenario in the simulation was different, so we wanted to hear whether they thought it assessed the skill as well, and to what extent it improved their experience as facilitators.
The feedback was overwhelmingly positive: educators described the simulation as a game-changer that freed them up to do other things besides grading multiple assessments. Since this initial test, other institutions and organizations have tried the critical thinking simulation and have given us similar feedback. One 21st-century skill down, seven more to go! The Lab is well on its way to ushering in the next, improved phase of its 21st-century skill micro-credentialing.
Don Fraser is the chief program officer at Education Design Lab. Prior to his work at the Lab, Don founded CollegeSnapps, a Washington, D.C.-based education technology startup. He also served as the director of education for the National Association for College Admission Counseling (NACAC), where he created educational opportunities for high school counselors and college admission professionals. Don began his career as a school counselor, and he brings his roots in psychology and his history of transforming student perspectives and needs into action to the Lab’s design thinking-driven process. Don received his B.A. in Psychology from Boston College and his Master of Education in School Psychology from the University of Massachusetts. He can be reached at .