Unprecedented global shock
COVID-19 has changed our modern lives in ways we are only beginning to understand.
We are in an environment we have not experienced before, learning day by day, assimilating the data that quickly emerges in this new world, and connecting the patterns that arise from reflecting on the relevant questions.
We are now living in a world that, in many ways, seems to be more aligned with life from another era, with the difference that now we are globally connected in a 24x7 flow of information.
The health emergency caused by the COVID-19 pandemic has affected every aspect of our way of life, including education.
This worldwide period of emergency remote teaching in education is an anomalous and (hopefully) temporary situation, but it is nonetheless forcing education providers to re-think most of their expectations, as it is hard to imagine going back to where we were before this happened.
There are signs that some educational institutions are recognizing the challenges ahead, and at EDT Partners we have been gathering a broad and diverse perspective on the patterns and questions that affect the Future of Assessment.
As an example, data shared by Times Higher Education prompted us to approach this topic from a new perspective.
Stop the damage; build a better future
The immediate response has been to adapt classroom activities to a virtual environment, with different levels of synchronous or asynchronous interaction with teachers. The first reaction of some teachers was to transfer all the content and face-to-face experiences to the virtual environment without further adaptation, as if it were a copy-and-paste exercise. But it is evident that this does not work, and that a change of environment is also a change in the rules.
Up until now, the appropriation of digital culture in education has mostly consisted of replicating the pedagogical experience of sharing knowledge that is packaged and non-interactive, designed for a uniform set of students, to be used at the same time, in the same space.
In digital culture, less is more. Does this also apply to assessing student learning?
In recent years, we have witnessed a progressive evolution of assessment processes that has changed the focus of attention towards students’ strategic and lifelong learning.
Start by asking the tough questions
Based on that, here are some questions that we believe will drive the transformation of assessment going forward:
Do we still need summative and high stakes assessment during this period?
How can we use assessment to identify the external conditions in which students have learned during the pandemic?
Will the influence of COVID-19 help educators establish new and better-aligned mechanisms of evaluation and assessment?
How shall we integrate and ensure equity and safety conditions in our assessments?
Is proctoring really needed, or do we give trust another chance?
Are existing assessment technology tools ready for massive levels of concurrent use?
At an institutional level, how will we measure learning outcomes and impact?
Should we modify our previous processes to better assess the knowledge, skills, and attitudes arising from the combination of experiences our students have had during the pandemic?
Is it right to have parents assess their own child’s work, and can they be unbiased?
What approaches can we devise for post-COVID assessment that generate evidence relevant parties will find credible, suggestive, and applicable to the decisions that need to be made?
We are convinced that there is no single right answer for every level, subject, country, institution, or student. With that in mind, our proposal is to promote a broad and diverse discussion through which we can guide a transformation in assessment.
It is all about what we value most
Assessment threatens to be an exercise in measuring what's easy, rather than a process of improving what we really care about. We must resist the temptation to simply reach for the most common or convenient assessment format available.
The rapid pace of change will transform the nature of the skills demanded by labor markets, favoring adaptability and flexibility. In turn, academic institutions will need assessments that are more flexible and adaptable to a changing environment.
Assessment practices currently tend to focus on what students know. Students are typically assessed, above all, on their understanding of some domain of specific knowledge within the subject area they studied. Progressively, the emphasis has been refocused on what students can do and the value of transferable, generic or essential skills, that is, the skills and competencies that all students should develop.
At stake are the future lives of many millions of young people, and the competitiveness of entire economies. The crucial thing to focus on now is how to learn. Future skills will encompass more than those meant to help students find gainful employment.
It is now common for a group of classmates to be pursuing wildly different activities, shaped by their own home contexts, where everything is interdisciplinary. Judgement of assessed skills during the pandemic needs to be fair and equitable; it is vital that we rethink our competence in the practice of assessment.
Which assessment approaches best demonstrate our aim for skill development?
How do we develop new tools for identifying or discerning standards (and what is the new standard)?
Can we apply these standards to a given student?
Which new techniques for calibrating judgement are valid?
The future of assessment
The assessment strategy should now emphasize less tangible creative and analytical skills, such as leadership and entrepreneurship. Trial and error and iteration are hallmarks of the innovation era and are not easily taught through traditional methods. Context, including cultural context, strongly influences how these skills are developed and assessed.
There is an opportunity to optimize learner performance with technology that offers flexibility to craft a more immersive assessment experience for learners. Relying on technology that lets authors quickly create interactive assessments for a more engaging, personalized learning experience would accelerate this transformation.
We must keep students engaged with dynamic, adaptive assessments that adjust question difficulty according to individual learner performance and estimated ability. It is important to get a deeper understanding of learner progress across individuals or groups of all sizes with reports, live tracking, and specific item analysis. Institutions must also control how assessments look, feel, and perform.
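The adaptive mechanism described above can be illustrated with a minimal sketch. This is not any particular platform's algorithm (production systems typically use item response theory, e.g. a Rasch model); the item pool, the fixed step size, and all function names here are assumptions for illustration only:

```python
# Minimal sketch of an adaptive assessment loop (illustrative only;
# real systems typically estimate ability with item response theory).

def pick_item(difficulties, ability, asked):
    """Choose the unasked item whose difficulty is closest to the ability estimate."""
    candidates = [i for i in range(len(difficulties)) if i not in asked]
    return min(candidates, key=lambda i: abs(difficulties[i] - ability))

def update_ability(ability, correct, step=0.5):
    """Nudge the ability estimate up after a correct answer, down after a miss."""
    return ability + step if correct else ability - step

def run_quiz(difficulties, answer_fn, start_ability=0.0):
    """Administer all items adaptively; return the final ability estimate."""
    ability, asked = start_ability, set()
    while len(asked) < len(difficulties):
        i = pick_item(difficulties, ability, asked)
        asked.add(i)
        ability = update_ability(ability, answer_fn(i))
    return ability
```

The design point is the feedback loop: each response updates the ability estimate, which in turn selects the next item, so the test converges on the learner's level instead of presenting a fixed sequence.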
Formative feedback is facilitated by technologies such as connected classrooms, videography, online formative quizzes, and multi-draft manuscript editing. Technology-assisted formative assessment is a powerful way to improve the classroom communication that supports formative practice in twenty-first-century teaching.
Change in teaching and/or learning strategies for either individual students or for the whole class completes the formative assessment cycle.
The assessment of student learning begins with educational values. There is a disconnect between what institutions value and what they measure.
Assessment is not an end in itself but a vehicle for educational improvement. Educational values should drive not only what we choose to assess but also how we do so. We recommend that, from now on, assessments be constantly questioned and permanently integrated into a valuable student experience.
Practice does not make perfect. Perfect practice makes perfect.