Saturday, February 7, 2026

Fragmented Infrastructures and Institutional Strategies: Pablo de Castro’s Contribution to Evaluating Research Tools and Persistent Identifiers


The corpus of work by Pablo de Castro offers a distinctive critical lens on the evaluation and governance of research infrastructure tools, with a marked emphasis on persistent identifiers (PIDs) and the mechanisms by which they are embedded in institutional workflows and scholarly ecosystems. His contributions trace a cohesive arc from the assessment of transformative Open Access agreements to the integration of Current Research Information Systems (CRIS) with institutional repositories, exposing the technical and administrative fragmentation of today's digital research landscape and raising questions about interoperability standards, quality assurance and policy-driven tool adoption.

In presentations such as “The (Currently) Fragmented PID Landscape” and reports such as “Building the Plane as We Fly It”, de Castro dissects the tension between technical identifiers and their community-oriented implementations, offering a typology of fragmentation scenarios, from redundant PID infrastructures to diverging metadata standards, that compromise the utility of these tools across regions and institutions; the first sketch below makes one such scenario concrete. A core analytical concern throughout is that tool quality should not be measured solely through performance or compliance, but through trustworthiness, stakeholder engagement and the capacity to serve governance at multiple scales, from the individual institution to national and supranational programmes.

One illustrative case is the European Commission’s FP7 Post-Grant Open Access Pilot, in which de Castro co-analyses the outcomes of supranational APC funding for equitable access to publication infrastructure, showing how financial transparency and the pilot’s exclusion of hybrid journals can serve as measures of system robustness. Similarly, his work with OpenAIRE and Knowledge Exchange underscores the necessity of cross-institutional alignment, highlighting how inconsistent PID workflows impede both research visibility and data reuse; the second sketch below shows where such inconsistencies surface in a routine repository-to-CRIS harvest.

Ultimately, de Castro’s output crystallises into a guiding vision: the evaluation of research tools must transcend metrics of functionality to incorporate policy coherence, community validation and infrastructural resilience, forging an agenda in which the quality of digital tools is inherently tied to their capacity to sustain open, inclusive scholarly communication systems.
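To make the fragmentation argument concrete, here is a minimal sketch (not de Castro's own tooling) of fetching one article's metadata through DOI content negotiation, a mechanism supported by both Crossref and DataCite. The DOI below is hypothetical; the point sits in the comments: registration agencies populate different subsets of the returned CSL JSON, so downstream tools cannot treat "a PID" as a uniform interface.

```python
"""Sketch: resolving a single article through the DOI layer.

Assumes the `requests` library; the DOI is hypothetical and used
for illustration only.
"""
import requests

DOI = "10.1000/example123"  # hypothetical DOI

# Ask the doi.org resolver for machine-readable metadata (CSL JSON),
# a content type served by both Crossref and DataCite.
resp = requests.get(
    f"https://doi.org/{DOI}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=10,
)
resp.raise_for_status()
record = resp.json()

# Different registration agencies fill different subsets of this schema,
# so the same lookup against two DOIs can yield structurally different
# records: one concrete face of the fragmentation discussed above.
for field in ("title", "author", "container-title", "publisher"):
    print(field, "->", record.get(field, "<absent>"))
```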
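A companion sketch covers the repository-to-CRIS side: a minimal OAI-PMH ListRecords harvest. OAI-PMH is the standard harvesting protocol in such integrations, but the endpoint below is hypothetical, and because Dublin Core's dc:identifier field is unqualified free text, the harvest may surface a DOI, a handle, a local ID or nothing at all, which is precisely the workflow inconsistency flagged above.

```python
"""Sketch: harvesting repository records for a CRIS over OAI-PMH.

Assumes the `requests` library and a hypothetical repository endpoint.
"""
import xml.etree.ElementTree as ET
import requests

REPO_BASE = "https://repository.example.ac.uk/oai"  # hypothetical endpoint

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

resp = requests.get(
    REPO_BASE,
    params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
    timeout=30,
)
resp.raise_for_status()
root = ET.fromstring(resp.content)

for rec in root.iter(f"{OAI}record"):
    title = rec.find(f".//{DC}title")
    identifiers = [e.text for e in rec.findall(f".//{DC}identifier")]
    # dc:identifier is free text: a DOI, a handle URL or a local ID
    # may appear here, with no guarantee of consistency across records.
    print(title.text if title is not None else "<untitled>", identifiers)
```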