BIBFRAME Dilemmas for Libraries: Challenges and Opportunities

I recently attended the 2024 BIBFRAME Workshop in Europe (BFWE), hosted by the National Library of Finland in Helsinki. It was an excellent conference in a great city! Having attended several BFWEs over the years, I find it gratifying to witness the continued progress toward making BIBFRAME the de facto standard for linked data in bibliographic metadata. BIBFRAME was developed and is maintained by the Library of Congress to eventually replace MARC, the flat record-based metadata format utilised by the vast majority of libraries and a standard in use since 1968. This year, Sally McCallum from the Library of Congress shared significant

From MARC to BIBFRAME and Schema.org in a Knowledge Graph

The MARC ingestion pipeline is one of four pipelines that keep the Knowledge Graph underpinning the LDMS synchronised with additions, updates, and deletions from the many source systems that NLB curates and hosts.
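A pipeline of this kind boils down to applying a stream of change events to named graphs in a triple store. The sketch below is purely illustrative – the names (`GraphStore`, `apply_event`, the event shape) are my own assumptions, not NLB's actual implementation:

```python
# Hypothetical sketch of an ingestion pipeline step: applying
# add/update/delete events from a source system to a graph store.
# All names and the event structure are illustrative assumptions.

class GraphStore:
    """Minimal in-memory stand-in for the Knowledge Graph."""

    def __init__(self):
        # One named graph per source record: graph URI -> set of triples
        self.graphs = {}

    def replace(self, graph_uri, triples):
        self.graphs[graph_uri] = set(triples)

    def delete(self, graph_uri):
        self.graphs.pop(graph_uri, None)


def apply_event(store, event):
    """Apply one change event from a source system to the store."""
    if event["action"] in ("add", "update"):
        # Adds and updates both replace the record's named graph wholesale,
        # which keeps the graph in step without diffing individual triples.
        store.replace(event["record_uri"], event["triples"])
    elif event["action"] == "delete":
        store.delete(event["record_uri"])
```

Replacing a record's whole named graph on update is a common simplification: it avoids computing triple-level diffs against the source record.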

Visualising Schema.org

One of the hardest challenges in my evangelism of the benefits of using Schema.org for sharing data about resources via the web is that it is difficult to ‘show’ what is going on. The scenario goes something like this… “Using the Schema.org vocabulary, you embed data about your resources in the HTML that makes up the page, using either microdata or RDFa…” At about this point you usually display a slide showing HTML code with embedded RDFa. It may look pretty, but the chances of more than a few of the audience being able to pick out the schema:Book
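For the curious, a fragment of that kind of slide might look like the following. The book, author, and ISBN here are placeholders, not real data – the point is only how RDFa attributes (`vocab`, `typeof`, `property`) mark a `schema:Book` up inside ordinary HTML:

```html
<!-- Illustrative only: RDFa marking up a (fictional) book
     description with the Schema.org vocabulary. -->
<div vocab="https://schema.org/" typeof="Book">
  <h1 property="name">An Example Book</h1>
  <span property="author" typeof="Person">
    by <span property="name">Jane Author</span>
  </span>
  ISBN: <span property="isbn">978-0-00-000000-0</span>
</div>
```

To a browser this renders as plain text; to an RDFa-aware consumer it yields triples describing a `schema:Book` – which is exactly why it is hard to ‘show’ on a slide.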

OCLC Preview 194 Million Open Bibliographic Work Descriptions

demonstrating ongoing progress towards implementing the strategy, I had the pleasure of previewing two significant upcoming announcements on the WorldCat data front: 1. The release of 194 million Linked Data Bibliographic Work descriptions. 2. The WorldCat Linked Data Explorer interface

Content-Negotiation for WorldCat

I am pleased to share with you a small but significant step on the Linked Data journey for WorldCat and the exposure of data from OCLC. Content-negotiation has been implemented for the publication of Linked Data for WorldCat resources. For those immersed in the publication and consumption of Linked Data, there is little more to say. However, I suspect a significant number of folks reading this are wondering what the heck I am going on about. It is a little bit techie, but I will try to keep it as simple as possible. Back last year, a
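In essence, content-negotiation means the same resource URI can answer with HTML for a browser and RDF for a Linked Data consumer, chosen by the HTTP `Accept` header. The following is a minimal, self-contained sketch of the server-side choice (the media types and function names are mine, not OCLC's implementation):

```python
# Minimal sketch of server-side content-negotiation: pick a
# representation for a URI based on the client's Accept header.
# Media types and fallback behaviour are illustrative assumptions.

def parse_accept(accept_header):
    """Return media types from an Accept header, highest q-value first."""
    ranges = []
    for part in accept_header.split(","):
        bits = part.strip().split(";")
        media_type = bits[0].strip()
        q = 1.0  # per HTTP, a missing q parameter means q=1
        for param in bits[1:]:
            name, _, value = param.strip().partition("=")
            if name.strip() == "q":
                q = float(value)
        ranges.append((q, media_type))
    # Sort by descending preference
    return [m for q, m in sorted(ranges, key=lambda pair: -pair[0])]


def negotiate(accept_header, available):
    """Pick the first acceptable representation, falling back to HTML."""
    for media_type in parse_accept(accept_header):
        if media_type in available:
            return media_type
        if media_type == "*/*":
            return "text/html"
    return "text/html"
```

So a request with `Accept: text/turtle, text/html;q=0.8` would receive Turtle RDF, while an ordinary browser (which prefers `text/html`) gets the human-readable page – same URI, two representations.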

Putting Linked Data on the Map

“Show me an example of the effective publishing of Linked Data.” That, or a variation of it, must be the request I receive most often when talking to those considering making their own resources available as Linked Data, either in their enterprise or on the wider web. Ordnance Survey have built such an example.

From Records to a Web of Library Data – Pt2 Hubs of Authority

As is often the way, you start a post without realising that it is part of a series – as with the first in this series. That one, Entification, and the next in the series, Beacons of Availability, together map out a journey that I believe the library community is undertaking as it evolves from a record-based system of cataloguing items towards embracing distributed open linked data principles to connect users with the resources they seek. Although grounded in much of the theory and practice I promote and engage with, in my role as Technology

Putting WorldCat Data Into A Triple Store

I cannot really get away with making a statement like “Better still, download and install a triplestore [such as 4Store], load up the approximately 80 million triples and practice some SPARQL on them” and then not follow it up. I made it in my previous post, Get Yourself a Linked Data Piece of WorldCat to Play With, in which I highlighted the release of a download file containing RDF descriptions of the 1.2 million most highly held resources in WorldCat.org – to make the cut, a resource had to be held by more than 250 libraries. So here
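Once the triples are loaded, “practicing some SPARQL” might start with something like the query below. The property names here assume Schema.org-style markup and may not match the dump's exact vocabulary, so treat it as a pattern rather than a copy-and-paste query:

```sparql
# Illustrative only: list ten books and their titles.
# Property names assume Schema.org-style data.
PREFIX schema: <http://schema.org/>

SELECT ?book ?name
WHERE {
  ?book a schema:Book ;
        schema:name ?name .
}
LIMIT 10
```

Swapping the `?book a schema:Book` pattern for other types, or adding filters, is a good way to get a feel for what is actually in the data.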