Publications and Presentations

Publications since 2011

2020

Kassab, O., Bornmann, L., & Haunschild, R. (2020). Can Altmetrics Reflect Societal Impact Considerations? Exploring the Potential of Altmetrics in the Context of a Sustainability Science Research Center. Quantitative Science Studies, 1(2), 792-809. https://doi.org/10.1162/qss_a_00032

Kassab, O., Mutz, R., & Daniel, H.-D. (2020). Introducing and testing an advanced quantitative methodological approach for the evaluation of research centers: A case study on sustainability science. Research Evaluation, 29(2), 135-149. https://doi.org/10.1093/reseval/rvz029

Hug, S. E., Hołowiecki, M., Ma, L., Aeschbach, M., & Ochsner, M. (2020). Practices of peer review in the SSH I: a systematic review of peer review criteria. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 61–66). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Kancewicz-Hoffman, N., Ochsner, M., Hołowiecki, M., & Holm, J. (2020). Introduction: aims and scope of the report. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 6–9). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Lendák-Kabók, K., & Ochsner, M. (2020). A gender and geopolitical perspective on peer review. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 78–86). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Ochsner, M. (2020). Evaluation criteria and methodology. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 15–22). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Ochsner, M. (2020). Place, role, form and significance of peer review in National Research Evaluation Systems. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 55–60). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Ochsner, M. (in press). Messung von Forschungsleistungen? Was gemessen wird und was gemessen werden will. In I. M. Welpe, J. Stumpf-Wollersheim, L. Ritzenhöfer, & M. Prenzel (eds.), Leistungsbewertung in wissenschaftlichen Institutionen und Universitäten. Berlin: De Gruyter. https://doi.org/10.1515/9783110689884-016

Ochsner, M., Kancewicz-Hoffman, N., Hołowiecki, M., & Holm, J. (2020). Conclusions. In M. Ochsner, N. Kancewicz-Hoffman, M. Hołowiecki, J. Holm (eds.), Overview of peer review practices in the SSH. ENRESSH Report (pp. 99–100). European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Ochsner, M., Kancewicz-Hoffman, N., Hołowiecki, M., & Holm, J. (eds.). (2020). Overview of peer review practices in the SSH. ENRESSH Report. European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12032589

Ochsner, M., Ma, L., Kancewicz-Hoffman, N., Holm, J., Gedutis, A., Šima, K., Hug, S. E., Dewaele, A., & de Jong, S. (2020). Better adapted procedures for research evaluation in the SSH. ENRESSH Brief: Research Evaluation. European Network of Research Evaluation in the Social Sciences and Humanities. https://doi.org/10.6084/m9.figshare.12049314

2019

Bornmann, L., Haunschild, R., & Mutz, R. (2019). MHq indicators for zero-inflated count data - A response to the comment by Smolinsky. Journal of Informetrics, 13(1), 464-465. https://doi.org/10.1016/j.joi.2019.02.004

Daniel, H.-D. (2019). Lutz Bornmann: Recipient of the 2019 Derek John de Solla Price Medal. Scientometrics, 121(3), 1235-1238. https://doi.org/10.1007/s11192-019-03251-4

Daniel, H.-D., Krempkow, R., & Schmidt, U. (2019). Einführung der geschäftsführenden Herausgeber. Qualität in der Wissenschaft, 13(3+4), II-III.

Giménez-Toledo, E., Mañana-Rodríguez, J., Engels, T. C. E., Guns, R., Kulczycki, E., Ochsner, M., Pölönen, J., Sivertsen, G., & Zuccala, A. A. (2019). Taking scholarly books into account, part II: a comparison of 19 European countries in evaluation and funding. Scientometrics, 118(1), 233-251. https://doi.org/10.1007/s11192-018-2956-7

Kassab, O. (2019). Does public outreach impede research performance? Exploring the 'researcher's dilemma' in a sustainability research center. Science and Public Policy, 46(5), 710-720. https://doi.org/10.1093/scipol/scz024

Mutz, R., & Daniel, H.-D. (2019). How should we measure individual researcher's performance capacity within and between universities? A multilevel extension of the Bibliometric Quotient (BQ). In G. Catalano, C. Daraio, M. Gregori, H. F. Moed, & G. Ruocco (Eds.), Proceedings of the 17th International Conference on Scientometrics & Informetrics ISSI, September 2-5, 2019, Sapienza University, Rome, Italy, Volume 1 (pp. 1098-1109).

Mutz, R., & Daniel, H.-D. (2019). How to consider fractional counting and field normalization in the statistical modeling of bibliometric data: A multilevel Poisson regression approach. Journal of Informetrics, 13(2), 643-657. https://doi.org/10.1016/j.joi.2019.03.007

2018

Mutz, R., & Daniel, H.-D. (2018). The Bibliometric Quotient (BQ), or how to measure a researcher's performance capacity: A Bayesian Poisson Rasch model. Journal of Informetrics, 12(4), 1282-1295. https://doi.org/10.1016/j.joi.2018.10.006

Thor, A., Bornmann, L., Marx, W., & Mutz, R. (2018). Identifying single influential publications in a research field: new analysis opportunities of the CRExplorer. Scientometrics, 116, 591-608. https://doi.org/10.1007/s11192-018-2733-7

Bornmann, L., Haunschild, R., & Mutz, R. (2018). MHq indicators for zero-inflated count data - A response to Smolinsky and Marx (2018). Journal of Informetrics, 12(4), 1012-1014. https://doi.org/10.1016/j.joi.2018.08.001

Kassab, O., Schwarzenbach, R. E., & Gotsch, N. (2018). Assessing Ten Years of Inter- and Transdisciplinary Research, Education, and Outreach. The Competence Center Environment and Sustainability (CCES) of the ETH Domain. GAIA, 27(2), 226-234. https://doi.org/10.14512/gaia.27.2.10

Ochsner, M., Kulczycki, E., & Gedutis, A. (2018). The Diversity of European Research Evaluation Systems. In STI Conference Proceedings, 12.–14. September 2018, Leiden (pp. 1235–1241). Leiden, NL: Centre for Science and Technology Studies (CWTS). http://hdl.handle.net/1887/65217

Daniel, H.-D. (2018). Internationalization in German Higher Education. In R. M. Helms, L. E. Rumbley, & L. Brajkovic (Eds.), Mapping Internationalization Globally: National Profiles and Perspectives. International Briefs for Higher Education Leaders, No. 7, pp. 18-21. Washington, DC/Boston, MA: The American Council on Education Center for Internationalization and Global Engagement / The Boston College Center for International Higher Education. ISSN: 1084-0613. Weblink: http://www.acenet.edu/news-room/Documents/Mapping-Internationalization-Brief-2018.pd

2017

Ochsner, M., Hug, S. E. & Daniel, H.-D. (2017). Assessment Criteria for Early Career Researcher’s Proposals in the Humanities. In Proceedings of the 16th International Conference on Scientometrics and Informetrics, 16.–20. October 2017, Wuhan (pp. 105–111). Wuhan: Wuhan University.

Galleron, I., Ochsner, M., Spaapen, J., & Williams, G. (2017). Valorizing SSH research: towards a new approach to evaluate SSH research's value for society. Journal of Research and Technology Policy Evaluation, 44, 35-41. https://doi.org/10.22163/fteval.2017.274

Galleron, I., Ochsner, M., Spaapen, J., & Williams, G. (2017). Evaluating to Valorise: The Societal Value of SSH Research and the ENRESSH COST Action. Journal of Research and Technology Policy Evaluation, 43, 175-177.

Ochsner, M., Hug, S. E., & Galleron, I. (2017). The future of research assessment in the humanities: Bottom-up assessment procedures. Palgrave Communications, 3, 17020. https://doi.org/10.1057/palcomms.2017.20

Mutz, R., Wolbring, T., & Daniel, H.-D. (2017). The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis. Journal of the Association for Information Science and Technology, 69(9), 2139-2153. https://doi.org/10.1002/asi.23701

Mutz, R., Bornmann, L., & Daniel, H.-D. (2017). Are there any frontiers of research performance? Efficiency measurement of funded research projects with the Bayesian stochastic frontier analysis for count data. Journal of Informetrics, 11(3), 613-628. https://doi.org/10.1016/j.joi.2017.04.009

2016

Mutz, R. (2016). Some further aspects of sampling: Comment on Williams and Bornmann. Journal of Informetrics, 10(4), 1241–1242.

Mutz, R. (2016). Do we really need BIBLIO-metrics to evaluate individual researchers? Infozine, Special Issue 1, 21-22. https://doi.org/10.3929/ethz-a-010748893

Mutz, R., Bornmann, L., & Daniel, H.-D. (2016). Funding decision-making systems: An empirical comparison of continuous and dichotomous approaches based on psychometric theory. Research Evaluation. https://doi.org/10.1093/reseval/rvw002

Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2016). Excellence networks in science: A Web-based application based on Bayesian multilevel logistic regression (BMLR) for the identification of institutions collaborating successfully. Journal of Informetrics, 10(1), 312-327. https://doi.org/10.1016/j.joi.2016.01.005

2015

Bornmann, L., & Daniel, H.-D. (2015). Count regression models in Informetrics. Journal of Informetrics. https://doi.org/10.1016/j.joi.2015.10.003

Bornmann, L., & Mutz, R. (2015). Growth rates of modern science: A bibliometric analysis based on the number of publications and cited references. Journal of the Association for Information Science and Technology, 66(11), 2215-2222. https://doi.org/10.1002/asi.23329

Daniel, H.-D., & Heger, M. (2015). Einführung der geschäftsführenden Herausgeber in das Themenheft "Studienabbruch, Fach- und Hochschulwechsel". Qualität in der Wissenschaft, 9(3+4), 65-66.

Daniel, H.-D., & Mutz, R. (2015). Methodenkritische Anmerkungen zum Leiden-Ranking. Forschung, 8(1+2), 21-25.

Mutz, R., Bornmann, L., & Daniel, H.-D. (2015). Cross-disciplinary research: What configurations of fields of science are found in grant proposals today? Research Evaluation, 24(1), 30-36. https://doi.org/10.1093/reseval/rvu023

Mutz, R., Bornmann, L., & Daniel, H.-D. (2015). Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austrian Science Fund. Journal of the Association for Information Science and Technology, 66(11), 2321-2339. https://doi.org/10.1002/asi.23315

Mutz, R., & Daniel, H.-D. (2015). What is behind the curtain of the Leiden Ranking? Journal of the Association for Information Science and Technology, 66(9), 1950-1953. https://doi.org/10.1002/asi.23360

Ochsner, M., Wolbring, T., & Hug, S. E. (2015). Quality criteria for sociology: What sociologists can learn from the project «developing and testing research quality criteria in the humanities». Sociologia e Politiche Sociali, 18(2), 90-110. https://doi.org/10.3280/SP2015-002005

Bornmann, L., & Mutz, R. (2015). How well does a university perform in comparison with its peers? The use of odds, and odds ratios, for the comparison of institutional citation impact using the Leiden Rankings. Journal of the Association for Information Science and Technology, 66(11), 2711-2713. https://doi.org/10.1002/asi.23451

2014

Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014a). The scientific excellence mapping tool. European Science Editing, 40(1), 28.

Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014b). What is the effect of country-specific characteristics on the research performance of scientific institutions? Using multi-level statistical models to rank and map universities and research-focused institutions worldwide. Journal of Informetrics, 8(3), 581-593. https://doi.org/10.1016/j.joi.2014.04.008

Wolbring, T. (2014a). Kausalanalyse und Wirkungsevaluation: Potential Outcomes, Graphenmethodologie und ihre Anwendung am Beispiel der Bologna-Reform. Zeitschrift für Evaluation, 13(2), 243-270.

Wolbring, T. (2014b). Wie valide sind studentische Lehrveranstaltungsbewertungen? Sachfremde Einflüsse, studentische Urteilsstandards, Selektionseffekte. Qualität in der Wissenschaft, 2014(2-3), 56-60.

Wolbring, T. (2014c). Wissenschaftssoziologie: Erfolg und Erfolgsungleichheit in der Wissenschaft aus soziologischer Perspektive. Bulletin / Vereinigung der Schweizerischen Hochschuldozierenden, 40(1), 48-53.

Barth, A., Marx, W., Bornmann, L., & Mutz, R. (2014). On the origins and the historical roots of the Higgs boson research from a bibliometric perspective. European Physical Journal Plus, 129(6), 111. https://doi.org/10.1140/epjp/i2014-14111-6

Bornmann, L., & Mutz, R. (2014). From P100 to P100': A new citation-rank approach. Journal of the Association for Information Science and Technology, 65(9), 1939-1943. https://doi.org/10.1002/asi.23152

Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualization of results from multi-level models. Online Information Review, 38(1), 43-58. https://doi.org/10.1108/OIR-12-2012-0214

Hug, S. E., Ochsner, M., & Daniel, H.-D. (2014). A framework to explore and develop criteria for assessing research quality in the humanities. International Journal for Education Law and Policy, 10(1), 55-64.

Mutz, R., & Daniel, H.-D. (2014). Studentische Evaluation von Lehre: Eine themenorientierte Bestandsaufnahme der wissenschaftlichen Literatur. Qualität in der Wissenschaft, 2, 34-39.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2014). Setting the stage for the assessment of research quality in the humanities: Consolidating the results of four empirical studies. Zeitschrift für Erziehungswissenschaft, 17(6 Supplement), 111-132. https://doi.org/10.1007/s11618-014-0576-4

2013

Wolbring, T. (2013a). Kausalanalytische Anforderungen an die Theoriebildung. Zeitschrift für theoretische Soziologie, 2013(2), 195-217.

Wolbring, T. (2013b). Sinn und Unsinn der Evaluation universitärer Lehre durch Studierende: Ein Plädoyer für einen sachgemäßen Umgang mit Lehrveranstaltungsbewertungen. Forschung & Lehre, 12, 1012-1013.

Bornmann, L., de Moya Anegón, F., & Mutz, R. (2013). Do Universities or Research Institutions With a Specific Subject Profile Have an Advantage or a Disadvantage in Institutional Rankings? A Latent Class Analysis With Data From the SCImago Ranking. Journal of the American Society for Information Science and Technology, 64(11), 2310-2316. https://doi.org/10.1002/asi.22923

Bornmann, L., Leydesdorff, L., & Mutz, R. (2013). The use of percentiles and percentile rank classes in the analysis of bibliometric data: opportunities and limits. Journal of Informetrics, 7(1), 158-165. https://doi.org/10.1016/j.joi.2012.10.001

Bornmann, L., & Mutz, R. (2013). The advantage of the use of samples in evaluative bibliometric studies. Journal of Informetrics, 7(1), 89-90. https://doi.org/10.1016/j.joi.2012.08.002

Bornmann, L., Mutz, R., & Daniel, H.-D. (2013). Multilevel-statistical reformulation of citation-based university rankings: The Leiden ranking 2011/2012. Journal of the American Society for Information Science and Technology, 64(8), 1649-1658. https://doi.org/10.1002/asi.22857

Hug, S. E., Ochsner, M., & Daniel, H.-D. (2013). Criteria for assessing research quality in the humanities: A Delphi study among scholars of English literature, German literature and art history. Research Evaluation, 22(5), 369-383. https://doi.org/10.1093/reseval/rvt008

Mutz, R., Bornmann, L., & Daniel, H.-D. (2013). Types of research output profiles: A multilevel latent class analysis of the Austrian Science Fund's final project report data. Research Evaluation, 22(2), 118-133. https://doi.org/10.1093/reseval/rvs038

Mutz, R., & Daniel, H.-D. (2013). University and student segmentation: Multilevel latent-class analysis of students' attitudes towards research methods and statistics. British Journal of Educational Psychology, 83(2), 280-304. https://doi.org/10.1111/j.2044-8279.2011.02062.x

Mutz, R., Heidemann, L., & Lewark, S. (2013). Der Berufserfolg von Absolventinnen und Absolventen forstwissenschaftlicher Studiengänge: Eine Re-Analyse forstlicher Absolventenstudien aus den Jahren 2005 und 2006. Allgemeine Forst- und Jagdzeitung, 184(9/10), 214-224.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2013). Four types of research in the humanities: Setting the stage for research quality criteria in the humanities. Research Evaluation, 22(2), 79-92. https://doi.org/10.1093/reseval/rvs039

Patel, V. M., Ashrafian, H., Bornmann, L., Mutz, R., Makanjuola, J., Skapinakis, P., et al. (2013). Enhancing the h index for the objective assessment of healthcare researcher performance and impact. Journal of the Royal Society of Medicine, 106(1), 19-29. https://doi.org/10.1258/jrsm.2012.120253

Peric, B., Ochsner, M., Hug, S. E., & Daniel, H.-D. (2013). AHRABi: Arts and Humanities Research Assessment Bibliography (2nd rev. ed.). Zürich: ETH Zürich.

2012

Mutz, R., Bornmann, L., & Daniel, H.-D. (2012a). Does Gender Matter in Grant Peer Review? An Empirical Investigation Using the Example of the Austrian Science Fund. Zeitschrift für Psychologie / Journal of Psychology, 220(2), 121-129. https://doi.org/10.1027/2151-2604/a000103

Mutz, R., Bornmann, L., & Daniel, H.-D. (2012b). Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach. PLoS ONE, 7(10), e48509. https://doi.org/10.1371/journal.pone.0048509

Bornmann, L., Herich, H., Joos, H., & Daniel, H.-D. (2012). In public peer review of submitted manuscripts, how do reviewer comments differ from comments written by interested members of the scientific community? A content analysis of comments written for Atmospheric Chemistry and Physics. Scientometrics, 93(3), 915-929. https://doi.org/10.1007/s11192-012-0731-8

Bornmann, L., Schier, H., Marx, W., & Daniel, H.-D. (2012). What factors determine citation counts of publications in chemistry besides their quality? Journal of Informetrics, 6(1), 11-18. https://doi.org/10.1016/j.joi.2011.08.004

Mittag, S., Mutz, R., & Daniel, H.-D. (2012). Anforderungen an Qualitätssicherungsinstrumente für Lehre und Studium an Hochschulen: Ergebnisse einer Meta-Evaluation an der ETH Zürich. Beiträge zur Hochschulforschung, 34(3), 8-31.

Mutz, R., & Daniel, H.-D. (2012a). The generalized propensity score methodology for estimating unbiased journal impact factors. Scientometrics, 92(2), 377-390. https://doi.org/10.1007/s11192-012-0670-4

Mutz, R., & Daniel, H.-D. (2012b). Skewed citation distributions and bias factors: Solutions to two core problems with the journal impact factor. Journal of Informetrics, 6(2), 169-176. https://doi.org/10.1016/j.joi.2011.12.006

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2012). Indicators for Research Quality in the Humanities: Opportunities and Limitations. Bibliometrie - Praxis und Forschung, 1, 4.

Tinsner, K., & Daniel, H.-D. (2012). Analyse von Studienverlaufsdaten: Ein differenzierter Blick auf einen naturwissenschaftlichen Bachelor-Studiengang. Qualität in der Wissenschaft, 6(3), 64-71.

2011

Bornmann, L. (2011a). Mimicry in science? Scientometrics, 86(1), 173-177. https://doi.org/10.1007/s11192-010-0222-8

Bornmann, L. (2011b). Scientific Peer Review. Annual Review of Information Science and Technology, 45, 199-245.

Bornmann, L., Schier, H., Marx, W., & Daniel, H.-D. (2011a). Does the h index for assessing single publications really work? A case study on papers published in chemistry. Scientometrics, 89(3), 835-843. https://doi.org/10.1007/s11192-011-0472-0

Bornmann, L., Schier, H., Marx, W., & Daniel, H.-D. (2011b). Is interactive open access publishing able to identify high-impact submissions? A study on the predictive validity of Atmospheric Chemistry and Physics by using percentile rank classes. Journal of the American Society for Information Science and Technology, 62(1), 61-71. https://doi.org/10.1002/asi.21418

Bornmann, L., & Daniel, H.-D. (2011). Seasonal bias in editorial decisions? A study using data from chemistry. Learned Publishing, 24(4), 325-328. https://doi.org/10.1087/20110410

Bornmann, L., & Mutz, R. (2011). Further steps towards an ideal method of measuring citation performance: The avoidance of citation (ratio) averages in field-normalization. Journal of Informetrics, 5(1), 228-230. https://doi.org/10.1016/j.joi.2010.10.009

Bornmann, L., Mutz, R., Hug, S. E., & Daniel, H.-D. (2011). A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants. Journal of Informetrics, 5(3), 346-359. https://doi.org/10.1016/j.joi.2011.01.006

Bornmann, L., Mutz, R., Marx, W., Schier, H., & Daniel, H.-D. (2011). A multilevel modelling approach to investigating the predictive validity of editorial decisions: do the editors of a high profile journal select manuscripts that are highly cited after publication? Journal of the Royal Statistical Society: Series A (Statistics in Society), 174(4), 857-879. https://doi.org/10.1111/j.1467-985X.2011.00689.x

Bornmann, L., Neuhaus, C., & Daniel, H.-D. (2011). The effect of a two-stage publication process on the Journal Impact Factor: A case study on the interactive open access journal Atmospheric Chemistry and Physics. Scientometrics, 86(1), 93-97. https://doi.org/10.1007/s11192-010-0250-4

Bornmann, L., Wolf, M., & Daniel, H.-D. (2011). Closed versus open reviewing of journal manuscripts: how far do comments differ in language use? Scientometrics, 91(3), 843-856. https://doi.org/10.1007/s11192-011-0569-5

Daniel, H.-D. (2011). Welche Qualitätskriterien für die Geisteswissenschaften? Bulletin, 4, 15-16.

Hug, S. E., & Ochsner, M. (2011). Qualitätskriterien für die Forschung in den Geisteswissenschaften: Eine Explorationsstudie. Bulletin, 2, 42-43.

Leydesdorff, L., Bornmann, L., Mutz, R., & Opthof, T. (2011). Turning the Tables on Citation Analysis One More Time: Principles for Comparing Sets of Documents. Journal of the American Society for Information Science and Technology, 62(7), 1370-1381. https://doi.org/10.1002/asi.21534

Thor, A., & Bornmann, L. (2011). The calculation of the single publication h index and related performance measures: A web application based on Google Scholar data. Online Information Review, 35(2), 291-300. https://doi.org/10.1108/14684521111128050



Presentations since 2011

2020

Daniel, H.-D. (2020). Lunchtime Lecture. Bonn: Alexander von Humboldt-Stiftung. 28.10.2020.

Daniel, H.-D. (2020). Seminar with staff of the VolkswagenStiftung on the study of internationally mobile postdoctoral researchers. Hannover. 24.06.2020.

Daniel, H.-D. (2020). Effektmessung wissenschaftlicher Wettbewerbe. II. Villa Vigoni-Symposium zu Wissenschaft und Politik 2020: "Wozu Wettbewerbe? Wissenschaft zwischen Konkurrenz und Kollegialität". Loveno di Menaggio. 23.-25.04.2020.

Daniel, H.-D. (2020). Methoden und Instrumente der Forschungsevaluation. Course in the module "Qualitätsmanagement in der Forschung" of the part-time study programme "Certificate of Advanced Studies in Research Management". Zentrum für universitäre Weiterbildung, Universität Bern. 24.01.2020.

Ochsner, M. (2020). Should the extra-scientific impact/value of research in the SSH be evaluated? Input Presentation in the Workshop “Societal Impact in the SSH” at Swiss National Science Foundation (SNSF), Division Humanities and Social Sciences, Berne, Switzerland, 05.05.2020.

Ochsner, M. (2020). Aligning research evaluation with clear policy goals: risks and opportunities. Presentation at the Final Stakeholder Conference of the European Network for Research Evaluation in the Social Sciences and Humanities (ENRESSH), Université Sorbonne Nouvelle, Paris, France, 18.02.2020.

Ochsner, M. (2020). Identifying research quality. Presentation at the Final Stakeholder Conference of the European Network for Research Evaluation in the Social Sciences and Humanities (ENRESSH), Université Sorbonne Nouvelle, Paris, France, 18.02.2020.

2019

Daniel, H.-D. (2019). Lutz Bornmann – Recipient of the 2019 Derek John de Solla Price Medal (Laudatio). 17th International Conference on Scientometrics & Informetrics, September 5th, 2019. Sapienza University, Rome, Italy.

Mutz, R., & Daniel, H.-D. (2019). How Should We Measure Individual Researcher's Performance Capacity Within and Between Universities? A Multilevel Extension of the Bibliometric Quotient (BQ). Paper at the 17th International Conference on Scientometrics & Informetrics ISSI, September 2-5, 2019. Sapienza University, Rome, Italy.

Ochsner, M. (2019). „Societal Impact als Gegenstand der Forschungsevaluation“. Presentation at the public workshop of the general assembly of the Swiss Academy of Humanities and Social Sciences (SAGW), University of Berne, Berne. 24.05.2019.

Daniel, H.-D. (2019). Regionale Besonderheiten in der Hochschule. Presentation at the event "Region und Bildung. Mythos Stadt - Land". hbw Haus der Bayerischen Wirtschaft, München. 21.05.2019.

Ochsner, M. (2019). "National Research Evaluation Systems, Research Quality and the SSH". Keynote for the KNOWSCIENCE Workshop 2019, Lund University, Lund, Sweden. 21.03.2019.

Daniel, H.-D. (2019). Methoden und Instrumente der Forschungsevaluation. Course in the module "Qualitätsmanagement in der Forschung" of the part-time study programme "Certificate of Advanced Studies in Research Management". Zentrum für universitäre Weiterbildung, Universität Bern. 25.01.2019.

Ochsner, M., Gedutis, A., & Kulczycki, E. (2019). „The diversity of research evaluation systems“. Course at the Training School “Research Evaluation and Careers in the SSH”, Vilnius University, Vilnius, Lithuania. 08.01.2019.

Ochsner, M. (2019). „Quality Criteria for Research and Management“. Course at the Training School “Research Evaluation and Careers in the SSH”, Vilnius University, Vilnius, Lithuania. 08.01.2019.

Ochsner, M. (2019). „Validity Issues of Peer Review“. Course at the Training School “Research Evaluation and Careers in the SSH”, Vilnius University, Vilnius, Lithuania. 08.01.2019.

Ochsner, M., Šinkūnienė, J., & Girkontaitė, A. (2019). “Research Evaluation and Careers in the SSH”. Training School at Vilnius University, Vilnius, Lithuania. 07.-11.01.2019.

Ochsner, M. (2019). Representativeness of Surveys and its Analysis. A Critical Examination of a Central Concept in Survey Research. Presentation at the FORS Lunch Seminar, University of Lausanne, Lausanne, Switzerland, 19.12.2019.

Ochsner, M., Peruginelli, G., Giménez-Toledo, E., Holm, J., Ramos, A., & Simon, D. (2019). Towards a clear Research Evaluation Strategy? Overview of SSH Research Evaluation Practices across Europe. Round Table at the 3rd International Conference on Research Evaluation in the Social Sciences and Humanities (RESSH), Polytechnical University of Valencia, Valencia, Spain, 20.09.2019.

Girkontaitė, A., & Ochsner, M. (2019). How reporting requirements can shape research activities. Presentation at the 3rd International Conference on Research Evaluation in the Social Sciences and Humanities (RESSH), Polytechnical University of Valencia, Valencia, Spain, 19.09.2019.

Lanamäki, A., Ahmad, M. U., & Ochsner, M. (2019). Any Publicity Good Publicity? The Effect of Satirical Bias on Twitter and the Altmetrics Attention Score. Presentation at the 3rd International Conference on Research Evaluation in the Social Sciences and Humanities (RESSH), Polytechnical University of Valencia, Valencia, Spain, 19.09.2019.

Ochsner, M., Lendák-Kabók, K., & Šinkūnienė, J. (2019). Early Career Investigators’ Views on Evaluation. Presentation at the 3rd International Conference on Research Evaluation in the Social Sciences and Humanities (RESSH), Polytechnical University of Valencia, Valencia, Spain, 19.09.2019.

2018

Ochsner, M. (2018). "Conceptual frameworks for evaluation and the role of impact". Humanities in Practice Workshop “Studying the humanities through policy concepts: quality, excellence and impact”, University of Bergen, Bergen, Norway, 6 December 2018.

Ochsner, M. (2018). Bottom-up approaches to research assessment. Presentation at the Conference “Impact factor, h-Index and university rankings: sense and no(n)sense of quantifying science”, Swiss Academy of Sciences, Bern, 21 November 2018.

Daniel, H.-D. (2018). Sustainability and impact of funding initiatives for international postdocs - impact on individuals, institutions and society. Paper presented at the annual meeting of the International Advisory Board of the Alexander von Humboldt Foundation, Berlin, November 13th, 2018.

Ochsner, M., Dokmanović, M., Kulczycki, E., Gedutis, A., & Hug, S. E. (2018). The Usefulness of Quality Criteria for Research Policy. Presentation at the 23rd Nordic Workshop on Bibliometrics and Research Policy at the University of Borås, 7.-9. November 2018. Borås, Sweden.

Kassab, O. (2018). Does engaging in public outreach activities impede research performance? Assessing the researcher dilemma in the context of a sustainability science research center. EU-SPRI Early Career Conference Public R&D funding and evaluation: Methods, Trends and Changes. Rome, 27 September 2018.

Mutz, R., & Daniel, H.-D. (2018). Wie wissenschaftlich leistungsfähig sind Forscherinnen und Forscher in der Schweiz? Ein Bayesscher Mehrebenen-Poisson-Rasch-Ansatz. Paper presented at the congress of the Deutsche Gesellschaft für Psychologie (DGPS), Frankfurt am Main, 17.-20.09.2018.

Thor, A., Bornmann, L., Marx, W., & Mutz, R. (2018). New analysis features of the CRExplorer for identifying influential publications. 23rd International Conference on Science and Technology Indicators (STI 2018), Leiden University, 12.-14.9.2018.

Ochsner, M., Kulczycki, E., & Gedutis, A. (2018). The Diversity of European Research Evaluation Systems. Presentation at the STI Conference, 12.–14. September 2018, Leiden.

Daniel, H.-D. (2018). Welchen Nutzen stiften Postdoc-Stipendien für international mobile Wissenschaftlerinnen und Wissenschaftler? 12. Hochschulforum Sylt, 30. August 2018.

Mutz, R. (2018). How to Model Production in Psychology? A Bayesian Stochastic Frontier Structural Equation Model. Paper at the VIII conference of the European Association of Methodology (EAM), University of Jena (Germany), 25.7.-27.7.2018.

Ochsner, M. (2018). Moderator of the World Café session "Research Evaluation and Research Assessment" at the workshop "Nurturing a Culture of Responsible Research in The Era of Open Science", University of Geneva, 25 May 2018.

Ochsner, M. (2018). Was ist Forschungsqualität und kann man sie messen? Nutzen und Gefahren von Bibliometrie, Szientometrie und Altmetrics in Bezug auf wissenschaftliche Karrieren. Workshop within the doctoral programme of the Kunsthistorisches Institut, Universität Zürich, 18 May 2018.

Daniel, H.-D. (2018). Peer review and beyond: Randomisation at the Margin in the Selection of Research Grant Proposals. Invited speaker at the PEERE International Conference on Peer Review at the National Research Council, Rome, 7-9 March 2018. Weblink: http://www.peere.org/conference/conference-​invited-speakers/

Daniel, H.-D. (2018). Methoden und Instrumente der Forschungsevaluation. Kurs im Modul "Qualitätsmanagement in der Forschung" im Rahmen des Teilzeit-​Studienprogramms "Certificate of Advanced Studies in Research Management". Zentrum für universitäre Weiterbildung der Universität Bern. 26.01.2018.

Ochsner, M. (2018). Wie lässt sich Forschungsqualität sichtbar machen? Ein bottom-​up Ansatz zur Entwicklung adäquater Kriterien für die Beurteilung von Forschungsleistungen. Präsentation und anschliessende Paneldiskussion an der Disskussionsveranstaltung „‚Und wie möchten Sie bewertet werden?‘ Diskussion zur Bewertung von Leistungen in den Geistes-​, Sozial-​ und Kulturwissenschaften“ and der Universität Graz, 15. Januar 2018.

2017

Ochsner, M. (2017). The misconception of societal impact in research evaluation. Consequences for its measurement and its impact on science policy and research. Presentation at the 22nd Nordic Workshop on Bibliometrics and Research Policy, University of Helsinki, Finland. 10.11.2017.

Ochsner, M., Kulczycki, E., & Gedutis, A. (2017). Diversity of Research Evaluation Systems in Europe. Presentation at the 22nd Nordic Workshop on Bibliometrics and Research Policy, University of Helsinki, Finland. 09.11.2017.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2017). Assessment Criteria for Early Career Researcher’s Proposals in the Humanities. Presentation at the 16th International Conference on Scientometrics & Informetrics at Wuhan University, Wuhan, China. 18.10.2017.

Ochsner, M. (2017). How to improve research quality in the social sciences? Course for Young Scholars at the Ss. Cyril and Methodius University Skopje, Skopje, Macedonia. 03.10.2017.

Ochsner, M., & Dokmanovic, M. (2017). Quality Criteria for Research in the Social Sciences in Macedonia. Presentation at the National Roundtable "Research Quality in Social Sciences and Brain Drain Prevention in the Republic of Macedonia: Lessons Learned and Challenges Ahead" at the Ss. Cyril and Methodius University Skopje, Skopje, Macedonia. 03.10.2017.

Mutz, R. (2017). Vorhersage oder Produktion? – Ein Bayessches Strukturgleichungsmodell für Stochastic Frontier Analysen. Vortrag an der 13. Tagung der Fachgruppe Methoden & Evaluation der Deutschen Gesellschaft für Psychologie, 17.-20.9.2017, Universität Tübingen.

Ochsner, M. (2017). Responsible Individual Metrics? What we can measure and what we want to measure. Presentation at the Workshop Governance of Science organized by the VolkswagenStiftung and Leopoldina at Schloss Herrenhausen in Hannover, Germany. 25.07.2017.

Mutz, R. (2017). Prediction or Production? A Bayesian Stochastic Frontier Structural Equation Approach. Paper presented at the International Meeting of the Psychometric Society (IMPS 2017). July 18-21, University of Zurich.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2017). Quality Criteria for the Ex-Ante Evaluation of Research Proposals from Young Humanities Scholars. Presentation at the conference „Research Evaluation in the Social Sciences and Humanities“ at the University of Antwerp, Antwerp, Belgium. 07.07.2017.

Ochsner, M., & Dokmanovic, M. (2017). Quality criteria and research obstacles in the SSH in Macedonia. Presentation at the conference „Research Evaluation in the Social Sciences and Humanities“ at the University of Antwerp, Antwerp, Belgium. 07.07.2017.

Ochsner, M., Kulczycki, E., & Gedutis, A. (2017). SSH research evaluation in Europe: Towards a classification. Presentation at the conference „Research Evaluation in the Social Sciences and Humanities“ at the University of Antwerp, Antwerp, Belgium. 07.07.2017.

Daniel, H.-D. (2017). Zusammenfassung und Ausblick der Fachtagung „Studienerfolg und Studienabbruch“ des Bundesministeriums für Bildung und Forschung am 8./9. Juni in Berlin.

Ochsner, M. (2017). European Network for Research Evaluation in the Social Sciences and Humanities (ENRESSH). Presentation at the Network Meeting of the Program P-3 at swissuniversities, Bern, Switzerland. 12.05.2017.

Daniel, H.-D. (2017). Digitalisierung aller Lebensbereiche. Veranstaltung „Bildung 2030 – Fragen an die Bildungspolitik“. Haus der Bayerischen Wirtschaft, München. 10.05.2017.

Daniel, H.-D. (2017). Quantitative Comparison of Outcomes. Paper presented at the Workshop „Randomization – A New Element in the Selection of Research Grants?“. March 11, 2017, Volkswagen Foundation and EMBO, Herrenhausen Palace, Hanover/Germany.

Hug, S. E. (2017). ERIH PLUS: a European journal list for the SSH. Paper presented at the Workshop “Performances de la recherche en sciences humaines et sociales”. March 10, 2017, swissuniversities, Bern.

Hug, S. E. (2017). Microsoft Academic: a new bibliometric superpower? Paper presented at the Workshop “Performances de la recherche en sciences humaines et sociales”. March 10, 2017, swissuniversities, Bern.

Hug, S. E. (2017). Microsoft Academic: an opportunity for the Social Sciences and Humanities? Paper presented at the Work Group Meeting of the European Network for Research Evaluation in the Social Sciences and the Humanities, March 7-8, 2017, Sofia University, Sofia.

Daniel, H.-D. (2017). Studienrelevante Vielfalt – Internationalisierung, Digitalisierung, Diversität und didaktische Herausforderungen der ‚Massenuniversität’: Studierenden- und Lernendentypen als Herausforderung an die Hochschule. Forum zum Start der 2. Phase des Qualitätspakts Lehre an der Universität Kassel. 14.02.2017.

Daniel, H.-D. (2017). Methoden und Instrumente der Forschungsevaluation. Kurs im Modul "Qualitätsmanagement in der Forschung" im Rahmen des Teilzeit-​Studienprogramms "Certificate of Advanced Studies in Research Management". Zentrum für universitäre Weiterbildung der Universität Bern. 27.01.2017.

Ochsner, M. (2017). State of the art - representations of research quality in the SSH. Presentation at the conference „SSH evaluation: Reconciling needs and methods“ at the Institute of Sociology of the Academy of Sciences of the Czech Republic, Prague, Czech Republic. 19.01.2017.

2016

Ochsner, M. (2016). Forschungsevaluation in den Geisteswissenschaften in der Schweiz und Europa: Schweizer Beteiligung am European Network for Research Evaluation in the SSH. Presentation at the CHESS book presentation „Research Assessment in the Humanities: Towards Criteria and Procedures“ at the University of Zurich, 15.12.2016.

Galleron, I., Ochsner, M., Spaapen, J., & Williams, G. (2016). Evaluating to valorise: the societal value of SSH research and the ENRESSH COST action. Presentation at the conference „OpenEvaluation 2016“ in Vienna, 24.-25.11.2016.

Daniel, H.-D. (2016). Studentische Lehrevaluation. Gastvortrag an der Universität Konstanz, 08.11.2016.

Hug, S. E., Ochsner, M., & Daniel, H.-D. (2016). Criteria and Indicators for Evaluating Research in the Humanities. Poster presentation at the Conference "The Swiss Way to Research Quality", 3.-4.11.2016, University of Bern.

Hug, S. E. (2016). Forschungsinformation in den Geistes- und Sozialwissenschaften. Welche Daten für welche Zwecke? Panel discussion at the Conference "The Swiss Way to Research Quality", 3.-4.11.2016, University of Bern.

Martens, S., Byland, K., Langner, R., Ochsner, M., Jeanneret, D. S., & Iseli, M. (2016). Die Entwicklung von Qualitätskriterien bottom-up. Session at the „Abschlusskonferenz des Programmes P3 Performances de la recherche en sciences humaines et sociales“ an der Universität Bern, 03.11.2016.

Daniel, H.-D. (2016). Peer Review Under Fire. Korreferat zum Vortrag „Open Post-​Publication Review als Alternative zur Peer Review“ von Prof. (em.) Dr. Dr. h.c. mult. Alfred Kieser. CHESS lecture am 6.10.2016, Universität Zürich.

Ochsner, M. (2016). Scientific careers and bibliometrics, altmetrics and scientometrics: opportunities and pitfalls for young scholars. Presentation at the DOPE-Meeting at the University of Lausanne, 06.10.2016.

Hug, S. E. (2016). Qualitative criteria in SSH research. Invited presentation at the International Workshop for Young Researchers in the Social Sciences and Humanities, "How do you survive evaluation?", 14-17 September 2016, University of Geneva.

Spaapen, J., Sivertsen, G., Ochsner, M., Melkers, J., & Engels, T. (2016). Developing appropriate methods and indicators for evaluation of research in the social sciences and humanities. Presentation of a new COST Action. Session at the „21st International Conference on Science and Technology Indicators“ in València, 14.-17.09.2016.

Ochsner, M., & Hug, S. E. (2016). Indicators for Research Performance in the Humanities? The Scholars’ View on Research Quality and Indicators. Presentation at the „21st International Conference on Science and Technology Indicators“ in València, 14.-17.09.2016.

Kaltenbrunner, W., & Ochsner, M. (2016). Think Pair in the session „Multiplying methods in the field of research evaluation“ at the „21st International Conference on Science and Technology Indicators“ in València, 14.-17.09.2016.

Mutz, R. (2016). Measurement of Research Performance Capacity using Bibliometric Data – A Bayesian Rasch Model for Count Data. Paper presented at the VII European Congress of Methodology, Palma de Mallorca.

Mutz, R. (2016). Messung wissenschaftlicher Leistungsfähigkeit ‐ Ein Bayesianisches Rasch‐Modell für Zähldaten als Alternative zum h‐index. Paper presented at the Congress of the German Psychological Society (DGPs), Leipzig.

Ochsner, M., & Hug, S. E. (2016). When does adding a survey language make sense? Representation bias, response rates, and strategic considerations. Presentation at the „2nd International Conference on Survey Methods in Multinational, Multiregional and Multicultural Contexts (3MC)“ in Chicago, 28.07.2016.

Hug, S. E. (2016). The current state of publication databases in Switzerland. Presentation at the Working Group Meeting of the European Network for Research Evaluation in the Social Sciences and the Humanities, July 11-12, 2016, Adam Mickiewicz University, Poznań.

Mutz, R. (2016). Growth rates of science - Is there an accelerated growth of science? Eingeladener Vortrag am Frühlingsseminar des Schweizer Klubs für Wissenschaftsjournalismus am 11.5.2016, Bern.

Daniel, H.-D. (2016). Methoden und Instrumente der Forschungsevaluation. Kurs im Modul "Qualitätsmanagement in der Forschung" im Rahmen des Teilzeit-Studienprogramms "Certificate of Advanced Studies in Research Management". Zentrum für universitäre Weiterbildung der Universität Bern.

Daniel, H.-D. (2016). Peer Review im Fadenkreuz der Kritik. Impulsvortrag anlässlich des Kaminabends der Koordinationsstelle an der TU München für die Förderlinie „Leistungsbewertung in der Wissenschaft“ des Bundesministeriums für Bildung und Forschung am 6. April 2016 in der Katholischen Akademie in Bayern, München.

2015

Daniel, H.-D. (2015a). Forschungsevaluation heute: ubiquitär und vielgestaltig. Paper presented at the Workshop "Forschungsevaluation: Grenzen überwinden/ Thinking out of the box" des Arbeitskreises für Bildung, Forschung, Innovation der SwissFoundations, Kulturhaus Helferei, Zürich.

Daniel, H.-D. (2015b). Peer review. Paper presented at the European summer school for scientometrics (Eingeladener Vortrag), Katholieke Universiteit Leuven.

Daniel, H.-D. (2015c). Peer review: (Mis-)Judgment of scientific quality? Paper presented at the Governance of Science: The (Mis-)Measurement of Scientific Quality, Herrenhausen Castle.

Daniel, H.-D., & Mutz, R. (2015). Very important papers - eine Analyse. Paper presented at the Kuratoriumssitzung der Zeitschrift Angewandte Chemie bei der Gesellschaft Deutscher Chemiker, Frankfurt a. M.

Hug, S. E. (2015). Bibliometrie in der Hochschulevaluation. Paper presented at the Workshop "Bibliometrie" des Vereins AG Informationskompetenz an Schweizer Hochschulen, Universität Zürich.

Hug, S. E., & Ochsner, M. (2015a). Interdisziplinäre Forschungsprojekte angemessen beurteilen: Ein Bottom-up-Ansatz für die Entwicklung valider Qualitätskriterien. Paper presented at the Exzellenzcluster "Kulturelle Grundlagen von Integration", Universität Konstanz.

Hug, S. E., & Ochsner, M. (2015b). Validation strategies in the field of research evaluation: Traditions and new directions. Paper presented at the 20th Nordic Workshop on Bibliometrics and Research Policy, Oslo.

Hug, S. E., & Ochsner, M. (2015c). Validierung von Begutachtungskriterien für geisteswissenschaftliche Forschungsgesuche. Paper presented at the 10. Workshop "Performances de la recherche en sciences humaines et sociales", Bern.

Mutz, R. (2015). How to use bibliometric data to rank universities according to their research performance? Paper presented at the COST Conference "Bibliometrics" of ETH Zürich, Villa Hatt, Zürich.

Mutz, R., Hug, S. E., & Bopp, S. (2015). Are students' evaluations of teaching really reliable? A Bayesian meta-analysis. Paper presented at the 13th European Conference on Psychological Assessment, Zürich.

Ochsner, M., & Hug, S. E. (2015a). Collaborations and research quality: The notions of quality of humanities scholars. Paper presented at the Research Evaluation for the Social Sciences and the Humanities (Input presentation), Rennes.

Ochsner, M., & Hug, S. E. (2015b). Evaluation criteria in the humanities: preferences for traditional and modern conceptions of research as a matter of scholars' characteristics. Paper presented at the Research Evaluation for the Social Sciences and the Humanities, Rennes.

Ochsner, M., & Hug, S. E. (2015c). Research assessment and bibliometrics: Bringing quality back in. Paper presented at the 15th International Conference on Scientometrics and Informetrics, Bogazici University, Istanbul.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2015). Notion of quality in the humanities: Possible inspiration for law studies? Paper presented at the Legal Research Assessment Seminar im Rahmen der Interaction Between Legal Systems Conference "Room for Reflection", Universität Leiden.

2014

Bornmann, L., Stefaner, M., & De Moya Anegón, F. (2014). Multilevel statistical models to rank and map universities and research-focused institutions worldwide in an excellence mapping tool. Paper presented at the 10th International Conference on Webometrics, Informetrics and Scientometrics & 15th COLLNET Meeting, Ilmenau.

Bozoyan, C., & Ochsner, M. (2014). A mechanism approach to income discrimination with an application for the case of overweight males and females. Paper presented at the Parallel Session "Labor Market Inequalities II" at the SOEP 2014 11th International Socio-Economic Panel User Conference, DIW Berlin.

Daniel, H.-D. (2014a). Möglichkeiten und Grenzen der Wirkungsmessung in Programmevaluationen. Paper presented at the "Wie wirken Stiftungen? Eine Veranstaltung zur Professionalisierung des Stiftungshandelns" (Keynote), Tagungszentrum Schloss Herrenhausen, Hannover.

Daniel, H.-D. (2014b). Neue Modelle der Forschungsevaluation. Paper presented at the Konferenz "Braucht es eine neue Wissenschaftskultur?" der Akademien der Wissenschaften Schweiz, der Deutschen Akademie der Naturforscher Leopoldina und der Österreichischen Akademie der Wissenschaften, Universität Zürich.

Daniel, H.-D. (2014c). Qualität der Verwaltung. Paper presented at the Projektmesse zum Zukunftskonzept Universitätsverwaltung der Universität Hamburg (Impulsreferat), Universität Hamburg.

Daniel, H.-D., Ochsner, M., & Hug, S. E. (2014). Notions of quality: Humanities scholars' perception of "good research". Paper presented at the Workshop "What Is Intellectual Quality in the Humanities?" der VolkswagenStiftung in Zusammenarbeit mit dem Max-Planck-Institut für Wissenschaftsgeschichte, Schloss Herrenhausen, Hannover.

Hug, S. E., & Ochsner, M. (2014). Quality criteria and indicators in the light of humanities scholars' notions of quality: a bottom-up approach. Paper presented at the Workshop "La valutazione della ricerca nelle Humanities and Social Sciences" der National Agency for the Evaluation of Universities and Research Institutes (ANVUR), Auditorium Antonianum, Rom.

Hug, S. E., Ochsner, M., & Daniel, H.-D. (2014). Taking scholars seriously - a non-mainstream approach to research assessment. Paper presented at the 13th Congress of the International Association of Legal Methodology, Universität Genf, Genf.

Mutz, R. (2014a). Ist die Psychologie vom Forschungsergebnisprofil eine Naturwissenschaft? - Eine Mehrebenen-Latente-Klassenanalyse von Forschungsergebnisberichten des FWF. Paper presented at the 49. Kongress der Deutschen Gesellschaft für Psychologie (DGPs), Bochum.

Mutz, R. (2014b). Modeling clustered receiver operating characteristic curves by a multilevel location-scale binormal approach - Concept and application. Paper presented at the VI European Congress of Methodology (EAM), Utrecht, Niederlande.

Ochsner, M. (2014). Multiple imputation of missing values: Why, how and the do's and don'ts. Paper presented at the Research Seminar "Methods and Research Meetings", Universität Lausanne.

Ochsner, M., & Hug, S. E. (2014a). Addressing validity issues of research indicators: Quality criteria for humanities research. Paper presented at the Konferenz der portugiesischen Forschungsförderungsagentur (Fundação para a Ciência e a Tecnologia, FCT) und des Bildungsministeriums (Ministério da Educação e Ciência) "Indicadores de Desempenho para a Ciência e o Ensino Superior [Performance Indicators for Science and Higher Education]", Teatro Thalia, Lissabon.

Ochsner, M., & Hug, S. E. (2014b). How to recognize quality? Paths towards discipline-specific quality criteria for research. Paper presented at the Workshop "Assessing Quality and Performance in Area Studies", Universität Fribourg.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2014a). La qualité de la recherche? Notions, critères et indicateurs. Paper presented at the Journée de la Faculté des Lettres an der Universität Genf, Genf.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2014b). The Project 'Developing and Testing Research Quality Criteria in the Humanities, with an emphasis on Literature Studies and Art History'. Paper presented at the session "Qu'est-ce que la recherche de qualité ?" an der Tagung "Performance de la recherche en sciences humaines et sociales" (Inputreferat), Universität Fribourg.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2014c). Validity issues of bibliometric and performance indicators: What we can learn from the case of the humanities. Paper presented at the Nordic Workshop on "Bibliometrics and Research Policy", Reykjavik.

Sivertsen, G., & Ochsner, M. (2014). Forschungsevaluation in den Geisteswissenschaften. Paper presented at the Workshop zum Thema Forschungsevaluation in den Geisteswissenschaften, Lissabon.

Wolbring, T. (2014a). Field experiments: Potentials and problems for the case of the broken windows theory. Paper presented at the Kolloquium von Prof. Dr. Stefanie Eifler, Universität Eichstätt-Ingolstadt.

Wolbring, T. (2014b). How beauty works. Kausale Mechanismen am Beispiel der studentischen Lehrveranstaltungsbewertung. Paper presented at the Kolloquium von Prof. Dr. Eldad Davidov, Prof. Dr. Andreas Diekmann, Prof. Dr. Jörg Rössel, Prof. Dr. Katja Rost, ETH Zürich/Universität Zürich.

Wolbring, T. (2014c). Methodische Fallstricke der Lehrveranstaltungsevaluation. Paper presented at the Kolloquium von Prof. Dr. Martin Abraham und Prof. Dr. Monika Jungbauer-Gans an der Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen.

Wolbring, T. (2014d). Physische Merkmale und soziale Ungleichheit. Vortrag an der Universität Bielefeld.

Wolbring, T. (2014e). Relatives Einkommen und Lebenszufriedenheit. Paper presented at the Auswahlverfahren zur Juniorprofessur für Soziologie, Universität Mannheim.

Wolbring, T., & Keuschnigg, M. (2014). Broken windows. Paper presented at the Konferenz "Devianz und Delinquenz: Theorien, Modelle und empirische Analysen" (DGS-section Model Building and Simulation), Erfurt.

Wolbring, T., & Kroher, M. (2014). Social control and cheating. Evidence from lab and online experiments on dishonesty. Paper presented at the Konferenz "Experimentelle Techniken in den Sozialwissenschaften" (DGS-section Methods of Social Research), Eichstätt.

Wolbring, T., & Riordan, P. (2014). How beauty works: Theoretical mechanisms and two empirical applications on students' evaluation of teaching. Paper presented at the Section on Social Psychology Paper Session at the American Sociological Association (ASA) Annual Meeting, San Francisco.

2013

Daniel, H.-D. (2013a). Anmerkungen zum Deutschen Akkreditierungswesen. Paper presented at the 7. Hochschulforum "Strategien der Qualitätssicherung und -steigerung in Hochschule und Forschung - sind sie überhaupt zielführend?", Sylt.

Daniel, H.-D. (2013b). Bibliometric agora (panel). Paper presented at the European Summer School of Scientometrics (Panelist), Berlin.

Daniel, H.-D. (2013c). Die Nutzung von Absolventenstudien für die Qualitätssicherung von Hochschulen. Paper presented at the BAS-Tagung "Absolventenstudien und Qualitätssicherung" am IHF Bayerisches Staatsinstitut für Hochschulforschung und Hochschulplanung, München.

Daniel, H.-D. (2013d). Hochschulrankings: gestern-heute-morgen. Paper presented at the 11. Fachtagung der Fachgruppe Methoden und Evaluation der Deutschen Gesellschaft für Psychologie an der Alpen-Adria-Universität Klagenfurt (Keynote), Klagenfurt.

Daniel, H.-D. (2013e). Ideenwerkstatt zu Forschungsorientierter Lehre. Paper presented at the Veranstaltung am Philosophischen Institut der Freien Universität Berlin, Berlin.

Daniel, H.-D. (2013f). Qualitätssicherung an Hochschulen: von der Akkreditierung zur Auditierung. Paper presented at the Veranstaltung "Qualitätsmanagement an Hochschulen: von der Akkreditierung zur Auditierung", München.

Daniel, H.-D. (2013g). Research evaluation at the University of Zurich. Paper presented at the European Summer School of Scientometrics, Berlin.

Daniel, H.-D. (2013h). Research evaluation: Improvisation or science? Paper presented at the "Symposium Bibliometrics: Use and Abuse in the Review of Research Performance" der Academia Europaea und der Wenner-Gren Foundation, Stockholm.

Daniel, H.-D. (2013i). Science and funding policy. Paper presented at the 14. International Society of Scientometrics and Informetrics Conference (Sitzungsleiter), Wien.

Daniel, H.-D. (2013j). Studienqualität fördern statt kontrollieren! Von der Akkreditierung von Studiengängen zur Auditierung des Qualitätsmanagements von Hochschulen. Paper presented at the Bildungskongress "Beste Bildung ist ein Bürgerrecht" der FDP-Fraktionsvorsitzendenkonferenz, München.

Daniel, H.-D. (2013k). University assessment #1. Paper presented at the 18. International Conference on Science and Technology Indicators (Sitzungsleiter), Berlin.

Hinsch, W., Hornbostel, S., Ochsner, M., & Raggautz, A. (2013). Darstellung, Messung und Bewertung Geisteswissenschaftlicher Forschung. Paper presented at the Sommerschule an der Universität Graz, Graz.

Hug, S. E. (2013). Humanities scholars’ criteria for assessing research quality. Paper presented at the EAIR 35th Annual Forum "The Impact of Higher Education: Addressing the challenges of the 21st century", Rotterdam.

Mutz, R. (2013a). Metaevaluation von Lehre und Studium am Beispiel der ETH Zürich. Paper presented at the Veranstaltung "Lehren und Lernen an der ZHAW – Evaluation der Lehre", ZHAW Winterthur.

Mutz, R. (2013b). Receiver operating characteristic analysis with nonlinear mixed effects models – A model architecture. Paper presented at the 78th International Meeting of the Psychometric Society, Niederlande.

Mutz, R. (2013c). Statistische Modelle für Hochschul-Rankings am Beispiel der Psychologie. Paper presented at the 11. Fachtagung der Fachgruppe Methoden & Evaluation der Deutschen Gesellschaft für Psychologie an der Alpen-Adria-Universität Klagenfurt, Klagenfurt.

Ochsner, M. (2013). Criteria for Research Quality: International Perspectives. Paper presented at the EAIR 35th Annual Forum "The Impact of Higher Education: Addressing the challenges of the 21st century", Rotterdam.

Ochsner, M., & Hug, S. E. (2013). Beurteilung von Forschungsleistungen: Die Perspektive von Geisteswissenschaftlern. Paper presented at the Sommerschule "Darstellung, Messung und Bewertung Geisteswissenschaftlicher Forschung" an der Universität Graz, Graz.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2013a). Bibliometrie in den Geisteswissenschaften - Perspektiven, Möglichkeiten und Grenzen. Paper presented at the Veranstaltung "Bibliometrie und Bewertung wissenschaftlicher Leistungen" der Universitätsbibliothek Salzburg und des BdR Qualitätsentwicklung der Universität Salzburg an der Universitätsbibliothek Salzburg, Salzburg.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2013b). Kommentar zum Positionspapier der SAGW aus Sicht des Projektes "Qualitätskriterien für die Geisteswissenschaften". Paper presented at the Vorstellung des Positionspapiers der SAGW "Für eine Erneuerung der Geisteswissenschaften", Bern.

Tinsner, K. (2013). Evaluation von Lehre und Studium am Beispiel einer Studiengangsevaluation an der Universität Zürich. Paper presented at the Veranstaltung "Lehren und Lernen an der ZHAW – Evaluation der Lehre", ZHAW Winterthur.

Wolbring, T. (2013a). Grenzenlos Messen? Was messen Lehrveranstaltungsbefragungen, und was nicht? Paper presented at the Zentrum für Qualitätsentwicklung in Lehre und Studium, Universität Potsdam.

Wolbring, T. (2013b). How beauty works. Kausale Mechanismen am Beispiel der studentischen Lehrveranstaltungsbewertung. Paper presented at the Forschungskolloquium "Empirie" von Prof. Dr. Rolf Becker, Prof. Dr. Axel Franzen und Prof. Dr. Ben Jann, Universität Bern.

Wolbring, T. (2013c). Kausalanalyse und Wirkungsevaluation. Potential Outcomes, Graphenmethodologie und ihre Anwendung am Beispiel der Bologna-Reform. Paper presented at the Kolloquium des Instituts für Psychologie, Chemnitz.

Wolbring, T. (2013d). Status-Effekte in der Wissenschaft. Ist (Zitations-)Erfolg ansteckend? Paper presented at the Meeting der DGS-Sektion Wirtschaftssoziologie „The Winner Takes All“, München.

2012

Daniel, H.-D. (2012a). Auf dem Weg zu Kriterien und Verfahren zur Qualitätsprüfung für die geisteswissenschaftliche Forschung. Paper presented at the ZfE-Forum "Qualität und Qualitätsmessung für das Bildungswesen", Universität Hamburg.

Daniel, H.-D. (2012b). Book citation index, regional journal expansion & criteria for journal submission, and research solutions and future offers from Thomson Reuters (Panel). Paper presented at the Austrian Central Library for Physics, Universität Wien.

Daniel, H.-D. (2012c). Internationalisierung der Hochschulen - Eine institutionelle Gesamtstrategie. Paper presented at the Veranstaltung "Internationalisierung der Hochschulen" des Aktionsrats Bildung im Literaturhaus München, München.

Daniel, H.-D. (2012d). Qualitätsmanagement in der Forschung. Paper presented at the Modul des CAS-Studiengangs "Forschungsmanagement", Universität Bern.

Daniel, H.-D. (2012e). Wie kann Internationalisierung gelingen? Paper presented at the Konferenz "Erfolgreich internationalisieren! Internationalität von Hochschulen erheben, bewerten und weiterentwickeln", Bonn.

Hug, S. E. (2012). Einführung in die Bibliometrie. Paper presented at the Weiterbildungsveranstaltung der Arbeitsgruppe „Vermittlung und Förderung von Informationskompetenz“ der BibliothekarInnen der Institutsbibliotheken der Universität Zürich, der ETH Zürich und der Zentralbibliothek Zürich, Zürich.

Hug, S. E., Ochsner, M., Wolf, J., & Daniel, H.-D. (2012). Entwicklung und Erprobung von Qualitätskriterien für die Forschung in den Geisteswissenschaften am Beispiel der Literaturwissenschaften und der Kunstgeschichte. Paper presented at the Gemeinsamer Transferworkshop des Projekts "Forschungsevaluation der Germanistik" an der Johannes Gutenberg-Universität Mainz und des Kooperationsprojekts der Universitäten Zürich und Basel "Entwicklung und Erprobung von Qualitätskriterien für die Forschung in den Geisteswissenschaften am Beispiel der Literaturwissenschaften und der Kunstgeschichte", Johannes Gutenberg-Universität, Mainz.

Mutz, R. (2012a). Heterogene Intra-Klassen-Korrelationen und ihre Determinanten – Ein generalisierter Schätzgleichungen-Ansatz zur Interrater-Reliabilität von Gutachterurteilen des FWF Österreich. Paper presented at the 48. Kongress der Deutschen Gesellschaft für Psychologie (DGPs), Bielefeld.

Mutz, R. (2012b). Modeling heterogeneous intra-class correlations with generalized estimating equations (GEE) - A simulation study. Paper presented at the V European Congress of Methodology / SMABS, Santiago de Compostela, Spain.

Ochsner, M. (2012). Grundlagen zur Messbarkeit von Forschungsqualität. Paper presented at the Weiterbildungsveranstaltung der Arbeitsgruppe „Vermittlung und Förderung von Informationskompetenz“ der BibliothekarInnen der Institutsbibliotheken der Universität Zürich, der ETH Zürich und der Zentralbibliothek Zürich, Zürich.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2012a). Forschungsindikatoren für die Evaluation in den Geisteswissenschaften: Möglichkeiten und Grenzen. Paper presented at the Fachtagung "Bibliometrische Standards in den Natur-, Sozial- und Geisteswissenschaften: Aktueller Stand und zukünftige Trends", Regensburg.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2012b). Qualitätskriterien und -indikatoren für die Forschung in den Geisteswissenschaften. Paper presented at the Transferworkshop "B-05 mesurer les performances de la recherche" der Rektorenkonferenz der Schweizer Universitäten (CRUS), Bern.

Ochsner, M., Hug, S. E., & Wolf, J. (2012). Qualitätskriterien und -indikatoren für das Institut für Romanistik. Leitung eines Workshops an der Universität Wien, Wien.

Tinsner, K. (2012a). Analyse von Studienverlaufsdaten. Ein differenzierter Blick auf einen naturwissenschaftlichen BA-Studiengang. Paper presented at the Konferenz "Bologna-Reform – Eine Zwischenbilanz der empirischen Bildungs- und Hochschulforschung und Entwicklungsperspektiven", Berlin.

Tinsner, K. (2012b). Quality assurance and course evaluation. Paper presented at the Workshop an der Summer School des SCOPES-Project - Swiss National Science Foundation (SNSF): Case study teaching in economics and management education, Kolobrzeg, Poland.

2011

Daniel, H.-D. (2011a). Internationale Forschungsrankings - Pros und Contras. Paper presented at the Werkstattgespräch Hochschulrankings, Universität Innsbruck.

Daniel, H.-D. (2011b). Methoden zur Qualitätssicherung in der Lehre. Paper presented at the Konferenz "Villa Vigoni - 6. Tage des Wissenschaftsmanagements 2011", Loveno di Menaggio.

Daniel, H.-D. (2011c). Wie wollen und sollen die Geisteswissenschaften Qualität und Leistung messen und steuern? Paper presented at the Tagung "Für eine neue Kultur der Geisteswissenschaften?", Bern.

Hug, S. E., Ochsner, M., & Daniel, H.-D. (2011). Developing research quality criteria in the humanities. Paper presented at the ECOOM-Colloquium "Assessing research performance in the social sciences and humanities" (Keynote), Antwerpen.

Mutz, R. (2011). Die räumliche Dimension in der Psychometrie. Ein latentes ‘Geopress-State’-Modell. Paper presented at the 10. Fachtagung der Fachgruppe Methoden und Evaluation der DGPs, Bamberg.

Ochsner, M., & Hug, S. E. (2011). An inside out approach to identify criteria for assessing research quality in the humanities. Paper presented at the Veranstaltung "Scientometric Indicators for Arts & Humanities and Social Sciences", Universität Wien.

Ochsner, M., Hug, S. E., & Daniel, H.-D. (2011). Definition von Qualität der Forschung. Die Sicht der Geisteswissenschaften. Paper presented at the Tagung "Messung der Forschungsleistungen: Herausforderungen und Perspektiven", Lausanne.