PERFORMANCE ASSESSMENT IN SCIENCE NATIONAL LEVEL DIAGNOSTIC TESTS
DOI: https://doi.org/10.17770/sie2018vol1.3215

Keywords: explain natural phenomena scientifically, performance assessment, national level diagnostic testing

Abstract
In 2016 the National Centre of Education of the Republic of Latvia launched a national curriculum renewal project funded by the European Social Fund. One of the key priorities for successful implementation of the new curriculum is transforming the national level assessment system. An assessment system that measures student performance must meet different requirements than a traditional measurement instrument that assesses content knowledge.
Analysis of the 2017 national level science diagnostic tests for 15-16-year-old students shows that the tests cannot discriminate between different levels of student performance. Students who demonstrate isolated pieces of knowledge, integrate pieces of knowledge into a coherent system, derive a general principle, or transfer understanding to new situations all receive the same score.
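To make the distinction concrete, the sketch below shows one hypothetical way an open response could be credited by SOLO level rather than dichotomously. The level names follow Biggs and Collis (1982); the scoring function and identifiers are illustrative assumptions, not the actual test rubric.

# Illustrative only: a hypothetical partial-credit key for one open item,
# crediting the structural level of the answer instead of right/wrong.
from enum import IntEnum

class Solo(IntEnum):
    PRESTRUCTURAL = 0      # no relevant knowledge shown
    UNISTRUCTURAL = 1      # one isolated piece of knowledge
    MULTISTRUCTURAL = 2    # several pieces, not yet connected
    RELATIONAL = 3         # pieces integrated into a coherent system
    EXTENDED_ABSTRACT = 4  # general principle derived or transferred

def item_score(level: Solo) -> int:
    """Score one response with its SOLO level as partial credit."""
    return int(level)

# A relational answer now earns more than a unistructural one,
# which a single right/wrong mark cannot express.
print(item_score(Solo.RELATIONAL), item_score(Solo.UNISTRUCTURAL))  # 3 1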
The research goal is to analyse 9th grade students' performance in national level science diagnostic testing, which measures conceptual understanding through the ability to explain natural phenomena scientifically. This paper describes the development and calibration of a measuring instrument that assesses student performance according to the cycles of cognitive growth and the Structure of the Observed Learning Outcome (SOLO) taxonomy.
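As a sketch of what such calibration could involve, the Python fragment below fits a simple dichotomous Rasch model to simulated responses by joint maximum likelihood, in the spirit of Wilson (2005) and Wu, Tam and Jen (2016). The data, item counts and estimation routine are illustrative assumptions, not the national test data or the authors' actual procedure.

# Illustrative only: simulated data, not the Latvian diagnostic test.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: 500 students answering 12 dichotomous items.
true_theta = rng.normal(0.0, 1.0, size=500)          # person abilities
true_b = np.linspace(-1.5, 1.5, 12)                  # item difficulties
prob = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random(prob.shape) < prob).astype(float)    # 0/1 response matrix

# Perfect and zero scorers have no finite maximum-likelihood ability.
keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < X.shape[1])
X = X[keep]

theta = np.zeros(X.shape[0])                         # ability estimates
b = np.zeros(X.shape[1])                             # difficulty estimates
for _ in range(50):                                  # alternating Newton steps
    for params, axis, sign in ((theta, 1, 1.0), (b, 0, -1.0)):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        grad = sign * (X - p).sum(axis=axis)         # d logL / d params
        hess = -(p * (1.0 - p)).sum(axis=axis)       # second derivative
        params -= grad / hess                        # Newton-Raphson update
    b -= b.mean()                                    # anchor the scale at 0

print(np.round(b, 2))  # recovered difficulties, approximately true_b

SOLO-scored items are polytomous, so in practice a model such as the partial credit model would be needed; the dichotomous case is shown only to keep the sketch short.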
References
Biggs, J. B., & Collis, K. F. (1982). Evaluating the quality of learning: The SOLO taxonomy (Structure of the Observed Learning Outcome). New York: Academic Press.
Kools, M., & Organisation for Economic Co-operation and Development (Ed.). (2016). Education in Latvia. Paris: OECD.
McTighe, J., & Wiggins, G. (2004). Understanding by design: Professional development workbook. Alexandria, VA: ASCD.
Panizzon, D. (2003). Using a cognitive structural model to provide new insights into students’ understandings of diffusion. International Journal of Science Education, 25(12), 1427–1450. https://doi.org/10.1080/0950069032000052108
Pegg, J. (1992). Students' understanding of geometry: Theoretical perspectives. In O. Keeley (Ed.), Proceedings of the 15th Annual Conference of the Mathematics Education Research Group of Australasia (pp. 18–35). Richmond: MERGA.
Pegg, J. (2002). Fundamental cycles of cognitive growth. In Proceedings of the Annual Meeting of the International Group for the Psychology of Mathematics Education (p. 9). Norwich, England.
Pegg, J. (2003). Assessment in mathematics: A developmental approach. In J. M. Royer (Ed.), Advances in Cognition and Instruction (pp. 227–259). New York: Information Age Publishing Inc.
Pegg, J., & Tall, D. (2005). The fundamental cycle of concept construction underlying various theoretical frameworks. ZDM, 37(6), 468–475.
Pestovs, P., & Namsone, D. (2017). National level test in science in Latvia for assessing how students explain phenomena scientifically. 2nd International Baltic Symposium on Science and Technology Education.
Quinn, F., Pegg, J., & Panizzon, D. (2009). First‐year Biology Students’ Understandings of Meiosis: An investigation using a structural theoretical framework. International Journal of Science Education, 31(10), 1279–1305. https://doi.org/10.1080/09500690801914965
Wertheim, J., Holthuis, N. C., & Schultz, S. E. (2016). Evaluating Item Quality in Large-Scale Assessments, Phase I Report of the Study of State Assessment Systems. Stanford, California: Understanding Language/Stanford Center for Assessment, Learning, & Equity.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates.
Wu, M., Tam, H. P., & Jen, T.-H. (2016). Educational Measurement for Applied Researchers. Singapore: Springer Singapore. https://doi.org/10.1007/978-981-10-3302-5