Enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility
- Time: 2023-2025
- Funding: Horizon Europe (EC)
- Role: Lead
Open Science Impact Pathways
- Time: 2022-2024
- Funding: Horizon Europe (EC)
- Role: Partner
PathOS aims to collect concrete evidence of the effects of Open Science (OS) practices. The project’s main focus is the identification and quantification of pathways of OS from input to output, outcome and impact. This investigation of impact pathways will also consider enabling factors and key barriers. PathOS seeks to improve understanding of the implications of OS, provide recommendations, and develop new tools and methods to study causal effects.
To study impacts and causal mechanisms, an approach including six iterative steps will be taken:
- Scope current status in OS research and impact assessment
- Conceptualise model of OS impact pathway
- Quantify impacts by exploring a range of methods
- Operationalise and employ methods for measuring impacts in case studies
- Analyse costs and benefits of specific OS practices
- Validate model of OS impact pathways
Journal Observatory
- Time: 2022-2023
- Funding: Dutch Research Council (NWO)
- Role: Partner
As new models of publishing such as Publish-Review-Curate, publication as you go, preprint review and others emerge, distinct publishing functions like dissemination and evaluation are increasingly decoupled. This creates the need for different platforms to interact, or at least to be aware of each other’s policies and requirements. At present, there are few standards to enable the systematic interoperability of these platforms. At the research output level, standards like Docmaps and COAR Notify are under development. However, to enable further innovation in scholarly communication, a shared way to describe these different platforms and their possibilities for interaction is required. To address the above challenges, the Journal Observatory project aims:
1. To define an extensible, machine-readable and traceable way to describe the policies and practices of the various platforms involved in disseminating and evaluating scholarly works: the Scholarly Communication Platform Framework.
2. To demonstrate the value of this new framework by building a demonstration prototype called the Journal Observatory, a resource which combines data on journals and other publication platforms from various sources to clarify policy information for authors, reviewers and others.
Open Practices, Transparency and Integrity for Modern Academia
- Time: 2021-2024
- Funding: Erasmus+ (EC)
- Role: Partner
- Requirements for the Open Peer Review Platform
- Requirements for Updated Courses with New Subjects on Open Science
OPTIMA aims to improve the quality of higher education in Ukraine by raising the level of academic integrity: bringing open practices and transparency to relevant content and services, and modernising and internationalising Ukrainian HEIs. OPTIMA will introduce open practices as a quality assurance (QA) process, with IT solutions and an international virtual community serving as the QA mechanism.
The system of higher education (HE) in Ukraine is characterised by serious deficiencies, such as inefficient quality assurance (QA) and low levels of internationalisation, which negatively affect educational attainment and reduce the country’s overall potential. At the same time, the military conflict in eastern Ukraine is acutely affecting the HE system. Since the beginning of hostilities in the Donbass region, 18 higher education institutions (HEIs) have been relocated from the temporarily occupied territories and continue to educate over 40,000 students and employ about 3,500 academic staff. However, problems of quality and integrity remain in the Ukrainian education system, harming society and the economy. Misconduct in HE, when students misrepresent their acquired knowledge through cheating, plagiarism and ghostwriting, is one of the main problems of the Ukrainian education system. There is, then, a need for the development and implementation of innovative QA mechanisms built on a culture of academic integrity.
Introducing Open Peer Review (OPR) has great potential in Ukraine, as it brings transparency to the already familiar practice of academic evaluation and provides hands-on learning opportunities for early career researchers, helping them build new skills under the collective mentorship of international experts. OPTIMA will develop and implement an online OPR platform for academic conferences, along with an international virtual community of peer reviewers and researchers.
The Open and Reproducible Research Group provides expertise and consultation in the field of Open Science (OS) to help the Ukrainian partners adopt best practices. All training courses within the project are created under the supervision of ORRG/TU Graz. Additionally, experts from TU Graz contribute to the development and support of a national platform for OPR, including a virtual community, providing expertise in OPR workflows and OS training through close involvement in the conceptualisation and implementation of OPR services. TU Graz leads activities in work package 1 (“Learning EU best practices”), work package 2 (“Academic courses on Open Science”), and work package 3 (“Web platform for Open Peer Review”), drawing on its participation in writing “The Open Science Training Handbook” and on the development of similar programmes at TU Graz to support best-practice implementation of OS training, teaching and certification. TU Graz hosts three OS training workshops for project partners (WP1), contributes OS content to academic courses (WP2), and provides consultation on OPR platform development (WP3).
Evaluation study on the implementation of cross-cutting issues in Horizon 2020
This study evaluated cross-cutting issues in Horizon 2020, such as interdisciplinarity, sustainability and international cooperation, among others. A further aim was to develop suggestions on how to define, implement and monitor such overarching priorities in future programmes. Each of twelve cross-cutting issues was investigated in a small case study using a diverse set of methods. ORRG led the case studies on widening participation across Europe and on responsible research and innovation (RRI).
Assessing the Reproducibility of Research Results in EU Framework Programmes for Research
Observing and Negating Matthew Effects in Responsible Research
- Time: 2019-2022
- Funding: Horizon 2020 (EC)
- Role: Lead
ON-MERRIT aimed to investigate inequalities in the uptake of Open Science. Open Science promises to make scientific research more inclusive, understandable to the public, and accessible to and reusable by large audiences. However, opening science to the general public risks being undermined by a dynamic of cumulative advantage: those who already have resources stand to gain even more through Open Science. ON-MERRIT (Observing and Negating Matthew Effects in Responsible Research and Innovation Transition) recognised this threat as urgent. Using a mix of sociological, bibliometric and computational approaches, the project investigated how existing inequalities along dimensions such as gender, geographical location or institutional standing drive outcomes in the uptake of Open Science and Responsible Research and Innovation across academia, industry and policy-making. Funded by the EU Horizon 2020 programme, the ON-MERRIT consortium gathered experts in open science, data analytics, interaction with data, policy research, and stakeholder engagement.
ORRG served as the coordinating partner of this interdisciplinary expert consortium, which included Know-Center (AT), Open University (UK), University of Goettingen (DE), University of Minho (PT) and Graz University of Technology (AT). Moreover, ORRG contributed to reviews and surveys of the uptake of open science resources in industry and policy-making, and to analyses of the relationships between Open Science practices and academic performance, research training and institutional promotion criteria. Based on these results, ORRG used agent-based modelling to test the effects of various policies and incentives, and to compile a set of policy recommendations.