Building Trust Through Verifiable Evidence: How OEP Supports TIER2's Reproducibility Work
Evidence, not certification. The Olexian Evidence Platform (OEP) produces verifiable evidence about what ran, with what inputs, and what outputs were produced. It does not certify compliance, validate scientific conclusions, or guarantee outcomes.
Independence notice: OEP is an independent project and is not affiliated with, endorsed by, or funded by TIER2, the European Union, or the Embassy of Good Science.
The Reproducibility Problem Is Not Theoretical:
Across disciplines, replication studies reveal a persistent failure mode: independent teams can't reliably reproduce published results, even with high motivation and good intent. The Reproducibility Project: Psychology found that only a minority of replications reproduced statistically significant results. Similar patterns emerge in cancer biology, the social sciences, and computational research.
This isn't just an "academic" problem. When results can't be reproduced, everyone downstream—other labs, reviewers, funders, policy-makers, regulators—ends up doing forensic archaeology. They hunt for missing files, mismatched environments, undocumented preprocessing steps, or silent version drift.
TIER2 addresses structural barriers to reproducibility: workflow design, incentives, tools, and evidence infrastructure. TIER2 explicitly targets improved trust, integrity, and efficiency in research through "next-level reproducibility," spanning multiple contexts and stakeholder groups.
What TIER2 Is Doing (and Where Gaps Remain):
TIER2 (funded under Horizon Europe) operated from 1 January 2023 to 31 December 2025, focusing on reproducibility across diverse contexts—social sciences, life sciences, computer science—plus stakeholder groups like publishers and funders.
A core theme in TIER2's outputs is practical adoption: reproducibility measures only work if they fit real workflows. This is reflected in TIER2's work on tools and practices for different communities and in its Reproducibility Hub (ReproHub), which aggregates outputs and connects related initiatives.
TIER2's tooling deliverables include:
D5.1: Tools and practices for researchers
D5.2: Tools and practices for publishers
D5.3: Tools and practices for funders
The project concluded December 2025, with a final symposium on 11 February 2026 titled "TIER2 Final event: The Future of Reproducibility Research and Policy."
But here's the practical gap: many "reproducibility interventions" still fail at the mechanical layer. Teams can have good policies, templates, and checklists—and still ship bundles that cannot be independently verified later because the evidence is incomplete, altered, or ambiguous.
That's the seam OEP targets.
What OEP Is (In One Sentence):
OEP is offline-first verification infrastructure that checks whether a research bundle matches an explicit, versioned contract, and emits machine-readable pass/fail evidence (or refusal reasons) that other systems can gate on.
Think of it as a "bundle verifier" for computational research artifacts—designed to be boring in exactly the way audits and reproducibility demand.
What OEP Does:
OEP verifies artifact integrity and contract conformance for research bundles (data + metadata + code + declared outputs). Concretely, it supports checks like:
Structural integrity: Required files present; declared schemas valid; no unknown or surprise inputs.
Provenance binding: Cryptographic hashes[^1] bind inputs → processing artifacts → outputs into a tamper-evident chain.
Protocol conformance: The bundle matches a declared protocol/version, so expectations are explicit and stable.
Deterministic verification output: The verifier emits stable, machine-readable results for the same bundle under the same verifier version/toolchain.
When verification fails, OEP is built to fail closed and emit specific, actionable refusal reasons—not a vague "something changed." This is the difference between "a red light" and "a red light with the broken subsystem labeled."
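The hash-binding and fail-closed behavior described above can be sketched roughly as follows. This is an illustrative sketch, not OEP's implementation: it uses SHA-256 from Python's standard library as a stand-in for BLAKE3 (which OEP uses, per the footnote), and the function names and refusal codes are invented for illustration.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file's bytes. Stand-in for OEP's BLAKE3 digests."""
    h = hashlib.sha256()
    h.update(path.read_bytes())
    return h.hexdigest()

def verify_bundle(bundle_dir: Path, manifest: dict) -> dict:
    """Check declared hashes against actual bytes, failing closed with
    specific refusal reasons rather than a vague 'something changed'."""
    refusals = []
    for rel_path, declared in manifest.items():
        target = bundle_dir / rel_path
        if not target.is_file():
            refusals.append(f"MISSING_INPUT: {rel_path} declared but absent")
            continue
        actual = file_digest(target)
        if actual != declared:
            refusals.append(f"HASH_MISMATCH: {rel_path} "
                            f"declared {declared[:12]}, found {actual[:12]}")
    # Fail closed: any refusal means the bundle is not verified.
    return {"verified": not refusals, "refusals": refusals}
```

Note that the result is the same for the same bundle and manifest every time, which is the "deterministic verification output" property: a third party can re-run the check later, offline, and get byte-identical evidence.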
What OEP Does Not Do:
OEP is verification middleware, not a scientific authority:
Does not determine whether a scientific claim is correct
Does not certify regulatory compliance or clinical safety
Does not replace peer review, domain review, or statistical validation
Does not require sending data to a third party (offline-first posture)
OEP proves: "This bundle is intact and matches its declared contract." Interpretation remains with the research team and their stakeholders.
How OEP Aligns with TIER2's Direction
TIER2 is building ecosystems: tools, practices, hubs, and policy recommendations that make reproducibility more achievable across contexts. OEP is designed to be a drop-in verification primitive that supports that ecosystem—not to replace it.
1) It Complements TIER2's "Tools and Practices" Outputs
TIER2's WP5 materials explicitly discuss reproducibility tooling across the research lifecycle, including workflow packaging specs (e.g., RO-Crate) and containerization approaches.
OEP does not replace those tools. It verifies the resulting bundles as-built:
Did the RO-Crate (or equivalent) include what it claims?
Do declared manifests match the actual bytes?
Are the referenced outputs present and hash-bound to the inputs?
Are versioned expectations being met?
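The "as-built" questions above reduce to comparing what a bundle declares against what is actually on disk, in both directions. A minimal sketch, assuming a flat declared-file list (real packaging specs like RO-Crate carry much richer metadata; the function name is hypothetical):

```python
from pathlib import Path

def check_as_built(bundle_dir: Path, declared_files: set[str]) -> dict:
    """Compare a bundle's declared contents with its actual contents.
    Catches both missing files and undeclared 'surprise' inputs."""
    actual = {str(p.relative_to(bundle_dir))
              for p in bundle_dir.rglob("*") if p.is_file()}
    return {
        "missing": sorted(declared_files - actual),     # claimed but absent
        "undeclared": sorted(actual - declared_files),  # present but unclaimed
        "conforms": actual == declared_files,
    }
```

The two-way comparison matters: a bundle that silently contains extra files is just as ambiguous for later re-verification as one that is missing a declared input.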
2) It Supports ReproHub-Style Discoverability with Verifiable Evidence
TIER2's ReproHub is a "hub of hubs"—helping communities find reproducibility resources and outputs.
OEP fits by enabling a simple rule: share bundles with verification evidence attached, so reuse starts from something checkable, not something merely described.
3) It Maps Cleanly to Funder and Publisher Needs
TIER2 explicitly produced tool/practice deliverables for publishers and funders (D5.2, D5.3). Those stakeholders often want automation-friendly signals:
Is the artifact package complete?
Is it tamper-evident?
Can it be re-checked later, offline?
Can we classify failure modes consistently across submissions?
OEP's core output is machine-readable pass/fail evidence plus failure taxonomy. That makes it compatible with automated checks without forcing a single "one true workflow."
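An evidence record of this kind might look like the following sketch. The field names and taxonomy codes are assumptions for illustration, not OEP's actual schema; the point is that the output is structured, classifiable, and stable enough for automated gating.

```python
import json

# Hypothetical failure taxonomy; OEP's actual codes may differ.
TAXONOMY = {"MISSING_INPUT", "HASH_MISMATCH", "SCHEMA_INVALID",
            "UNDECLARED_FILE", "PROTOCOL_VERSION_MISMATCH"}

def make_evidence(bundle_id: str, protocol: str,
                  failures: list[dict]) -> str:
    """Serialize a pass/fail evidence record that downstream systems
    (publisher or funder pipelines) can gate on."""
    for f in failures:
        assert f["code"] in TAXONOMY, f"unknown failure code: {f['code']}"
    record = {
        "bundle": bundle_id,
        "protocol": protocol,
        "result": "pass" if not failures else "fail",
        "failures": failures,
    }
    # sort_keys gives byte-stable output for later third-party re-checking.
    return json.dumps(record, sort_keys=True)
```

Because each failure carries a taxonomy code rather than free text, a submission pipeline can count, route, or reject on failure classes consistently across many bundles.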
Real-World Scenario: From Protocol to Proof
Without OEP:
A researcher collects data, runs analysis, generates figures. Months later, a reviewer asks: "Can you reproduce Figure 3?" The lab discovers dependency drift, missing calibration files, and undocumented preprocessing. Reproduction becomes archaeology.
With OEP:
The protocol is expressed as a versioned contract. Data and analysis are packaged as a bundle. OEP verifies the bundle and emits evidence:
✅ Required inputs present
✅ Schema and manifest integrity checks pass
✅ Calibration artifacts match declared hashes
✅ Outputs match declared structure and are hash-bound to inputs
✅ Verifier output is machine-readable for third-party re-checking
Reviewers (or future lab members) can independently re-verify the bundle's integrity without trusting the original machine, the original person, or the original narrative.
Why This Matters for TIER2's Broader Goals
TIER2's aim is to increase trust, integrity, efficiency, reuse, and overall quality of research results by improving reproducibility in ways that respect different contexts.
OEP doesn't solve incentive structures or policy. It solves one specific technical substrate problem:
Making integrity and provenance mechanically checkable, offline, and automatable.
That shifts reproducibility from "best effort documentation" toward "evidence-backed artifacts."
The Path Forward (and the Boundaries)
TIER2 concluded December 2025 (with a final event in February 2026), leaving behind tools, hubs, and policy recommendations meant to outlive the grant.
OEP is positioned as infrastructure that can help operationalize that work in real labs—without demanding that every lab adopt the same stack.
OEP is not a stamp of truth. It is a stamp of integrity.
Learn More
TIER2 Resources:
Tool/practice deliverables: D5.1 (Researchers), D5.2 (Publishers), D5.3 (Funders)
Disclaimer: OEP is an independent verification tool that supports reproducibility and auditing workflows. It is not affiliated with, endorsed by, or funded by TIER2, the European Union, or the Embassy of Good Science. OEP does not certify regulatory compliance, guarantee scientific correctness, or replace expert review.
[^1]: OEP uses BLAKE3, a cryptographic hash function that produces tamper-evident fingerprints of data. Hash-binding means outputs are cryptographically linked to their inputs, making alterations detectable.

