INTERPRETING LARGE SCALE NATIONAL LEVEL ASSESSMENT DATA IN MATHEMATICS BY USING RASCH ANALYSIS
DOI: https://doi.org/10.17770/sie2020vol3.5118

Keywords: assessment data, data-driven decisions, large scale national level assessment

Abstract
Latvia is undergoing a nation-wide curriculum reform in general education, with the aim of helping students develop 21st century skills. To implement the reform successfully, not only is teacher performance in the classroom important, but the transformation of school culture is also a high priority. One of the key dimensions characterising a school as a learning organisation is whether it has a data-driven culture and uses data on a continuous basis to improve student achievement.
Large-scale national-level assessment data are used for many different purposes; however, these data are only rarely recognised as a useful source for planning actions to improve student achievement at the school level. The authors argue that the average performance of students in different grades cannot be compared in a meaningful way to develop an action plan and evaluate the impact of initiatives at the school level, because the tests vary in difficulty and assess different skills. The study design is based on an in-depth analysis of the items of the large-scale national-level assessment in mathematics, defining a minimum level of competency in mathematics and calculating the percentage of students in a school cohort who reach that minimum level. This analysis is carried out for students of the 3rd, 6th and 9th grades using the Rasch model, thus allowing student performance to be monitored effectively throughout general education and the data to be used for informed decisions.
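The workflow described above — fitting a Rasch model to dichotomous item responses, anchoring item difficulties on a common logit scale, and reporting the share of a cohort at or above a minimum-competency cut score — can be illustrated with a small, self-contained sketch. Everything here is hypothetical: the cohort size, the simulated item difficulties, and the cut score of -1.0 logits are chosen only for illustration, and the simplified joint maximum-likelihood routine is a stand-in for the estimation procedures used in operational Rasch software, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small cohort: 200 students answering 20 dichotomous items.
n_students, n_items = 200, 20
true_theta = rng.normal(0.0, 1.0, n_students)          # student abilities (logits)
true_b = np.linspace(-2.0, 2.0, n_items)               # item difficulties (logits)
p_true = 1 / (1 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random((n_students, n_items)) < p_true).astype(float)

def rasch_jmle(X, n_iter=50):
    """Simplified joint maximum-likelihood estimation of the Rasch model.

    Alternates Newton steps for person abilities (theta) and item
    difficulties (b); clips estimates so students with perfect or zero
    scores do not diverge to infinity.
    """
    theta = np.zeros(X.shape[0])
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
        theta = theta + (X - p).sum(axis=1) / (p * (1 - p)).sum(axis=1)
        theta = np.clip(theta, -6.0, 6.0)
        p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
        b = b - (X - p).sum(axis=0) / (p * (1 - p)).sum(axis=0)
        b -= b.mean()          # anchor the scale: mean item difficulty = 0
        b = np.clip(b, -6.0, 6.0)
    return theta, b

theta_hat, b_hat = rasch_jmle(X)

# Hypothetical minimum-competency cut score on the estimated logit scale.
cut = -1.0
share = 100 * np.mean(theta_hat >= cut)
print(f"{share:.1f}% of the simulated cohort reach the minimum competency level")
```

Because both abilities and difficulties land on the same logit scale, the same cut score can in principle be applied across cohorts once the test forms are linked, which is what makes the percentage-above-cut indicator comparable across grades.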