Assessing the reproducibility of research results in EU Framework Programmes for Research
The core issue the study will address is that research results published today are often impossible to reproduce. The lack of reproducibility has serious negative effects on the performance of the research and innovation system. It also affects citizens’ trust in science.
The study will assist the Contracting Authority (Directorate-General for Research and Innovation (DG RTD), EC) in understanding, testing and monitoring the progress of reproducibility over time and across the programmes, as a direct and/or indirect response to a range of policy interventions that aim to increase the wider availability of results (reproducibility as strictly defined above, and open science more generally). Furthermore, the study will directly assist with the gradual introduction of the principle and practices of reproducibility in EU-funded research and innovation. The study will be completed through the implementation of five tasks:
- Task 1: Assess the overall quality and reproducibility of projects and programmes
- Task 2: Assess the impact of measures to increase reproducibility (including predicting reproducibility from applications)
- Task 3: Assess the actual reuse of existing data in projects funded under H2020 and Horizon Europe
- Task 4: Assess the effects of interventions on reproducibility on trust in science
- Task 5: Determine the overall implications of the study for policy action on reproducibility and provide actionable recommendations
The overall design of the study will follow a ‘blinded’ approach in which two separate teams collect data independently and then compare their results. In the first track, experts from Athena will collect quantitative data on 1,000 projects, their outputs and the individuals involved. In the second track, researchers from PPMI and the ORRG will scrutinise 50 projects in depth qualitatively, assessing each project through an expert review.
OPTIMA is a 3-year EC Erasmus+-funded project that aims to improve the quality of higher education in Ukraine by raising the level of academic integrity: bringing open practices and transparency to relevant content and services, and modernising and internationalising Ukrainian HEIs. OPTIMA will introduce open practices as a quality assurance (QA) process, supported by IT solutions and an international virtual community as quality assurance mechanisms.
The system of higher education (HE) in Ukraine is characterised by serious deficiencies, such as inefficient quality assurance (QA) and low levels of internationalisation, which negatively affect educational attainment and reduce the country’s overall potential. At the same time, the country’s HE system faces another acute problem caused by the military conflict in eastern Ukraine. In 2014 the concept of “displaced higher education institutions” emerged: since the beginning of hostilities in the Donbass region, 18 higher education institutions (HEIs) have been relocated from the temporarily occupied territories and continue to educate over 40,000 students and employ about 3,500 academic staff.
However, problems of quality and integrity persist in the Ukrainian education system, harming society and the economy. Misconduct in HE, where students misrepresent their acquired knowledge through cheating, plagiarism and ghostwriting, is one of the main problems of the Ukrainian education system. There is thus a need to develop and implement innovative QA mechanisms built on a culture of academic integrity.
Introducing Open Peer Review (OPR) holds particular promise in Ukraine, as it brings transparency to the already familiar practice of academic evaluation and provides hands-on learning opportunities for early career researchers, helping them build new skills under the collective mentorship of international experts. OPTIMA will develop and implement an online OPR platform for academic conferences and build an international virtual community of peer reviewers and researchers around it.
ORRG is one of several European partners and will act as a consultant, delivering train-the-trainers workshops on various aspects of Open Science (including Open Access, FAIR Data and Open Peer Review), and will help design and implement an Open Peer Review platform for Ukrainian HEIs. Moreover, ORRG will contribute to general and subject-specific OS-related course materials for Ukrainian HEIs.
ON-MERRIT (Observing and Negating Matthew Effects in Responsible Research and Innovation Transition) is a 30-month, EC H2020-funded project investigating inequalities in the uptake of Open Science. Open Science promises to make scientific research more inclusive, understandable to the public, and accessible to and reusable by large audiences. However, making science open to the general public risks being undermined by a dynamic of cumulative advantage, the Matthew effect: those who already have stand to gain even more through Open Science. ON-MERRIT recognises this threat as urgent. Using a mix of sociological, bibliometric and computational approaches, the project investigates how existing inequalities along dimensions such as gender, geographical location or institutional standing drive outcomes in the uptake of Open Science and Responsible Research and Innovation across academia, industry and policy-making. The consortium gathers a range of skills, including expertise in open science, data analytics, interaction with data, policy research and stakeholder engagement.
ORRG is the coordinating partner of this interdisciplinary expert consortium, which includes Know-Center (AT), Open University (UK), University of Goettingen (DE), University of Minho (PT) and Graz University of Technology (AT). Moreover, ORRG will contribute to reviews and surveys of the uptake of open science resources in industry and policy-making, and to analyses of the relationships between Open Science practices and academic performance, research training and institutional promotion criteria. Based on these results, ORRG will use agent-based modelling to test the effects of various policies and incentives, and compile a set of policy recommendations.