• Laila Niedrite University of Latvia, Faculty of Computing
  • Darja Solodovnikova University of Latvia, Faculty of Computing



research evaluation, research metrics, data integration, information system, data quality


Measuring research results serves several purposes, e.g. awarding research grants and subsequently evaluating project outcomes, or recruiting and promoting the staff of research institutions. Because such measurements are used so widely, the selection of appropriate measures is important. At the same time, there is no common view on which metrics should be used in this field; moreover, many widely used metrics are often misleading for various reasons: they may be computed from incomplete or faulty data, the metric's computation formula may be invalid, or the computed results may be interpreted wrongly. To build a sound framework for research evaluation, these problems must be addressed by integrating data from different sources, both to obtain a comprehensive view of an academic institution's research activities and to resolve data quality issues. We present a data integration system that combines the university information system with the library information system and with data gathered through APIs from the Scopus and Web of Science databases. We describe the data integration and data quality problems we have faced and present possible solutions. We also discuss the metrics defined and computed over the integrated data, together with the possibilities for their analysis.
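A central step in such an integration is deciding whether a publication record from one source (e.g. the university information system) and a record fetched from an external API (e.g. Scopus or Web of Science) describe the same publication. The sketch below illustrates one common approach, matching on DOI where available and falling back to fuzzy title similarity; the field names and the similarity threshold are illustrative assumptions, not the authors' actual implementation.

```python
from difflib import SequenceMatcher


def normalize(title: str) -> str:
    """Lowercase the title and drop punctuation, so superficial
    formatting differences between sources do not block a match."""
    kept = (ch for ch in title.lower() if ch.isalnum() or ch.isspace())
    return " ".join("".join(kept).split())


def same_publication(local_rec: dict, external_rec: dict,
                     threshold: float = 0.9) -> bool:
    """Decide whether two publication records refer to the same work.

    Strategy (illustrative): exact DOI comparison when both records
    carry a DOI; otherwise fuzzy comparison of normalized titles.
    """
    if local_rec.get("doi") and external_rec.get("doi"):
        return local_rec["doi"].lower() == external_rec["doi"].lower()
    ratio = SequenceMatcher(None,
                            normalize(local_rec["title"]),
                            normalize(external_rec["title"])).ratio()
    return ratio >= threshold
```

For example, a local record titled "Data Integration for Research Evaluation" would match an external record titled "Data integration for research evaluation." despite the differences in case and punctuation, while a record with an unrelated title would not. In practice, such a rule would be refined with further attributes (authors, year, venue), in the spirit of the record linkage literature.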








How to Cite

L. Niedrite and D. Solodovnikova, “UNIVERSITY IS ARCHITECTURE FOR THE RESEARCH EVALUATION SUPPORT”, ETR, vol. 2, pp. 112–117, Jun. 2017, doi: 10.17770/etr2017vol2.2528.