Projects
European Lighthouse to Manifest Trustworthy and Green AI
- Time: 2023-2026
- Funding: Horizon Europe (EC)
- Role: Partner
ENFIELD aims to elevate European AI research across the pillars of Adaptive, Green, Human-Centric and Trustworthy AI and to advance research in areas of great societal relevance (energy, healthcare, manufacturing, space). The project also aims to contribute to collaboration, education and training, network building, and the innovation ecosystem.
Nicki Lisa Cole is co-leading the Green AI pillar at Know-Center and contributing to the development of Green AI monitoring metrics.
Enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility
- Time: 2023-2025
- Funding: Horizon Europe (EC)
- Role: Lead
Reproducibility is often regarded as a central principle of the scientific method. It refers to the possibility for the scientific community to obtain the same results as the originators of a specific finding. Recently, concerns about a “reproducibility crisis” have grown in a variety of disciplines, especially in the behavioural and medical sciences. These concerns have been exacerbated by key problems such as a lack of transparency in reporting, data, and analysis; a lack of replication studies; publication bias towards positive results; and a growing awareness of questionable research practices.
Poor levels of reproducibility are seen as a serious threat to scientific self-correction, efficiency of research processes, and societal trust in research results. There is a need to address issues in reproducibility in order to reduce inefficiencies, avoid repetition, maximise return on investment, prevent mistakes, and speed innovation to bring trust, integrity and efficiency to the European Research Area (ERA) and the global Research and Innovation (R&I) system in general.
In response to these challenges, TIER2 will centre epistemic diversity by selecting three broad research areas (social, life, and computer sciences) and two cross-disciplinary stakeholder groups (research publishers and funders) to systematically investigate reproducibility across contexts. Through coordinated co-creation with these communities, TIER2 will:
- examine the epistemological, social, and technical factors that shape reproducibility across contexts
- build a state-of-the-art evidence base on the extent and efficacy of existing reproducibility interventions and practices
- apply co-creation techniques (scenario planning, backcasting, and user-centred design) to select, prioritise, adapt, and implement new tools to enhance reproducibility across contexts.
TIER2 will contribute to increasing the re-use and overall quality of research results and consequently boost trust, integrity and efficiency in research.
Open Science Impact Pathways
- Time: 2022-2024
- Funding: Horizon Europe (EC)
- Role: Partner
PathOS aims to collect concrete evidence of the effects of Open Science (OS) practices. The project’s main focus is the identification and quantification of OS pathways from input to output, outcome and impact. In this investigation of impact pathways, enabling factors and key barriers will also be considered. PathOS aims to improve understanding of the implications of OS, provide recommendations, and develop new tools and methods to study causal effects.
To study impacts and causal mechanisms, an approach including six iterative steps will be taken:
- Scope current status in OS research and impact assessment
- Conceptualise a model of OS impact pathways
- Quantify impacts, exploring a range of methods
- Operationalise and employ methods for measuring impacts in case studies
- Analyse costs and benefits of specific OS practices
- Validate model of OS impact pathways
Open Practices, Transparency and Integrity for Modern Academia
- Time: 2021-2025
- Funding: Erasmus+ (EC)
- Role: Partner
OPTIMA aimed to improve the quality of higher education in Ukraine by increasing the level of academic integrity, bringing open practices and transparency to relevant content and services, and supporting the modernisation and internationalisation of Ukrainian HEIs. OPTIMA introduced open practices as a quality assurance (QA) process, with IT solutions and an international virtual community serving as the QA mechanism.
The system of higher education (HE) in Ukraine is characterised by serious deficiencies, such as inefficient quality assurance and low levels of internationalisation, which negatively affect educational attainment and reduce the country’s general potential. At the same time, the military conflict in Ukraine is acutely affecting the HE system, with displaced universities continuing to educate thousands of students. However, problems of quality and integrity remain in the Ukrainian education system, harming society and the economy. Misconduct in HE, whereby students misrepresent their acquired knowledge through cheating, plagiarism and ghostwriting, is one of the main problems of the Ukrainian education system. There is therefore a need for the development and implementation of innovative QA mechanisms built on a culture of academic integrity.
Introducing Open Peer Review (OPR) holds particular promise in Ukraine, as it brings transparency to the already familiar practice of academic evaluation and provides hands-on learning opportunities for early career researchers, helping them build new skills under the collective mentorship of international experts. OPTIMA developed and implemented an online OPR platform for academic conferences along with an international virtual community of peer reviewers and researchers.
The Open and Reproducible Research Group (ORRG) provided expertise and consultation in the field of Open Science (OS) to help the Ukrainian partners absorb best practices. All training courses within the project were created under the supervision of ORRG/TU Graz. Additionally, experts from TU Graz contributed to the development and support of a national platform for OPR, including a virtual community, providing expertise in OPR workflows and OS training through close involvement in the conceptualisation and implementation of OPR services. TU Graz led activities in work package 1 (“Learning EU best practices”), work package 2 (“Academic courses on Open Science”), and work package 3 (“Web platform for Open Peer Review”), providing expertise from its participation in writing “The Open Science Training Handbook” and from the development of similar programmes at TU Graz to support best-practice implementation of OS training, teaching, and certification programmes. TU Graz hosted three OS training workshops for project partners (WP1), contributed OS content for academic courses (WP2), and provided consultation on OPR platform development (WP3).
Journal Observatory
- Time: 2022-2023
- Funding: Dutch Research Council (NWO)
- Role: Partner
As new models of publishing such as Publish-Review-Curate, publication-as-you-go, preprint review and others emerge, distinct publishing functions like dissemination and evaluation are increasingly decoupled. This creates a need for different platforms to interact, or at least to be aware of each other’s policies and requirements. At present, there are minimal standards to enable the systematic interoperability of these platforms. At the research output level, standards like Docmaps and COAR Notify are under development. However, to empower further innovation in scholarly communication, a shared way to describe these different platforms and their possibilities for interaction is required. To address these challenges, the Journal Observatory project aims:
1. To define an extensible, machine-readable and traceable way to describe the policies and practices of the various platforms involved in disseminating and evaluating scholarly works: the Scholarly Communication Platform Framework.
2. To demonstrate the value of this new framework by building a demonstration prototype called the Journal Observatory, a resource which combines data on journals and other publication platforms from various sources to clarify policy information for authors, reviewers and others.
Evaluation study on the implementation of cross-cutting issues in Horizon 2020
- Time: 2022
- Client: Directorate General for Research and Innovation (DGRTD, EC)
- Role: Partner
This study evaluated cross-cutting issues in Horizon 2020, such as interdisciplinarity, sustainability, and international cooperation, among others. Another aim of the study was to develop suggestions on how to define, implement and monitor such overarching priorities in future programmes. Each of the twelve cross-cutting issues was investigated in a small case study using a diverse set of methods. The ORRG led the case studies on widening participation across Europe and on responsible research and innovation (RRI).
Assessing the Reproducibility of Research Results in EU Framework Programmes for Research
- Time: 2020-2022
- Client: Directorate General for Research and Innovation (DGRTD, EC)
- Role: Partner
The core issue the study addressed is that research results published today are often impossible to reproduce. This lack of reproducibility has serious negative effects on the performance of the research and innovation system, and it undermines citizens’ trust in science.
The study assisted the Contracting Authority in understanding, testing and monitoring the progress of reproducibility over time and across the programmes, as a direct and/or indirect response to a range of policy interventions to increase the wider availability of results (reproducibility in the strict sense, and open science more generally). Furthermore, the study directly assisted with the gradual introduction of the principle and practices of reproducibility in EU-funded research and innovation. The study was completed through the implementation of five tasks:
- Task 1: Assess the overall quality and reproducibility of projects and programmes
- Task 2: Assess the impact of measures to increase reproducibility (including predicting reproducibility from applications)
- Task 3: Assess the actual reuse of existing data in funded projects under H2020 and HE
- Task 4: Assess the effects of interventions on reproducibility on trust in science
- Task 5: Determine the overall implications of the study for policy action on reproducibility and provide actionable recommendations
The overall design of the study followed a ‘blinded’ approach in which two separate teams collected data and compared their results. Under the first track, our team of experts collected quantitative data on all 1000 projects, their outputs, and the individuals involved. Under the second track, researchers qualitatively scrutinised 50 of these projects in great detail through expert review.
ORRG led the evidence synthesis and the development of recommendations to the EC, and contributed heavily to the qualitative assessment of project outputs and researcher/editor/funder attitudes.
Observing and Negating Matthew Effects in Responsible Research
- Time: 2019-2022
- Funding: Horizon 2020 (EC)
- Role: Lead
ON-MERRIT aimed to investigate inequalities in the uptake of Open Science. Open Science promises to make scientific research more inclusive, understandable to the public, and accessible to and reusable by large audiences. However, opening science to the general public runs the risk of being undermined by a dynamic of cumulative advantage: those who already have stand to gain even more through Open Science. ON-MERRIT (Observing and Negating Matthew Effects in Responsible Research and Innovation Transition) recognised the urgency of this threat. Using a mix of sociological, bibliometric and computational approaches, the project investigated how existing inequalities along dimensions such as gender, geographical location and institutional standing drive outcomes in the uptake of Open Science and Responsible Research and Innovation across academia, industry and policy-making. ON-MERRIT gathered a range of skills in a consortium funded by the EU Horizon 2020 programme, including experts in open science, data analytics, interaction with data, policy research, and stakeholder engagement.
ORRG served as the coordinating partner of this interdisciplinary expert consortium, which included Know-Center (AT), Open University (UK), University of Goettingen (DE), University of Minho (PT) and Graz University of Technology (AT). Moreover, ORRG contributed to reviews and surveys of the uptake of open science resources in industry and policy-making, and to analyses of the relationships between Open Science practices and academic performance, research training, and institutional promotion criteria. Based on these results, ORRG used agent-based modelling to test the effects of various policies and incentives and to compile a set of policy recommendations.