Major Upgrade for TPEN

We are humbled to announce that our TPEN project has been awarded a Digital Humanities Advancement Grant through the National Endowment for the Humanities (press release). Work has already begun and you can expect a more active thread of posts about it over the next couple years.

Read More

Dynamic Collections Through Annotation

Combining remote resources into useful, curated Collections can be difficult. IIIF describes how these collections are structured and presented, but they are usually hosted by the same repository as the included items. The Ong CDH has implemented two different methods for creating changeable collections that can be used by researchers and programmers alike. Demonstrations will include real resources and public services available to everyone.
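As a minimal sketch of the idea, assuming the IIIF Presentation API 3.0 Collection shape: a curated Collection can simply list Manifests hosted by other repositories, so the Collection itself may live, and change, anywhere. Every URI and label below is a hypothetical placeholder, not one of the services being demonstrated.

```typescript
// A curated Collection whose items are Manifests hosted by other repositories.
// All identifiers here are made-up placeholders.
const curatedCollection = {
  "@context": "http://iiif.io/api/presentation/3/context.json",
  id: "https://example.org/collections/my-curated-set",
  type: "Collection",
  label: { en: ["A changeable, researcher-curated Collection"] },
  items: [
    {
      id: "https://repository-a.example.org/iiif/manuscript-1/manifest",
      type: "Manifest",
      label: { en: ["Manuscript 1 (hosted by Repository A)"] }
    },
    {
      id: "https://repository-b.example.org/iiif/letter-7/manifest",
      type: "Manifest",
      label: { en: ["Letter 7 (hosted by Repository B)"] }
    }
  ]
};
```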

Read More

Building a IIIF-Aware Ecosystem

OngCDH would not be able to operate as leanly as it does without a convenient foundation of tools to access, manipulate, and create IIIF- and Web Annotation-compliant objects. The existence of the standard has allowed for useful tools for application logic, such as Manifest in the IIIF-Commons, but we have extended this usefulness to create interfaces, utilities, public data APIs, and templating frameworks for rapid deployment of prototypes, classroom projects, and specialized web portals. This session will not only offer a tour of the principles and parts of our ecosystem, but also showcase the pieces that are already available to the public and the projects that have benefited from it, including: Lived Religion in a Digital Age, a data collection and visualization project around sacred spaces; Glossing Matthew, a paleography project combining transcription, lemmatization, and annotation to visualize biblical glosses; Newberry Library Renaissance Paleography, a self-directed educational module; and the Dunbar Archive, a digital repository combining a dispersed and varied collection of manuscripts, artifacts, locations, and people related to the poet laureate. Completed, active, and forthcoming projects will be demonstrated. The content is designed for developers, but is accessible enough for anyone with an interest in building a bespoke digital humanities project.

Read More

Geolocating with IIIF Presentation API 3.0: Part 2

Annotation proved successful during the first phase of using IIIF Presentation API 3.0 to geolocate an entire Web entity. The next piece is ensuring that we can provide a geolocated assertion on a fragment of an entity, since situations like inset maps are common and need specific geospatial information that does not apply to the entire entity.
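As a sketch of what such a fragment-scoped assertion might look like, assuming a Web Annotation with a GeoJSON body and a media-fragment selector; every URI, coordinate, and label here is a made-up placeholder, not the project's actual data.

```typescript
// An annotation asserting coordinates for only a region of a Canvas (an inset map),
// not the whole entity. The xywh media fragment scopes the target; the body is GeoJSON.
const insetMapAnnotation = {
  "@context": "http://www.w3.org/ns/anno.jsonld",
  id: "https://example.org/annotations/inset-geo-1",
  type: "Annotation",
  motivation: "tagging",
  body: {
    type: "Feature",
    properties: { label: "Inset map of the harbor" },
    geometry: {
      type: "Point",
      coordinates: [-90.1994, 38.6270] // longitude, latitude
    }
  },
  target: {
    // Only the region occupied by the inset map, not the entire Canvas
    source: "https://example.org/iiif/atlas/canvas/12",
    selector: {
      type: "FragmentSelector",
      conformsTo: "http://www.w3.org/TR/media-frags/",
      value: "xywh=2200,1800,600,450"
    }
  }
};
```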

Read More

IIIF Cookbook: Data Recipes For Humans

As with any standards committee or best practices group, there is a subgroup tasked with providing examples for the internal community as well as external implementers. For IIIF, this group is the Cookbook group. Examples exist in the public Cookbook that properly format data assertions and combine them with resources in order to enhance those resources. The group is governed by careful guidance for, and strict adherence to, the best practices and standards promoted within. This is the most helpful and appropriate place to incubate technical application of the standards in the framework.

Read More

IIIF and Maps Conference: Coordinates in Data

The OCDH has been involved with the International Image Interoperability Framework™ (IIIF) since its beginning. We are adept at using Annotation to supply supplementary data to resources owned by others, and we maintain our own public open annotation store and Web API, known as RERUM, to support this ecosystem of assertions. The scope of IIIF encompasses media like Image, 3D, and A/V files within the Internet of Things. Our challenge of People + Places + Things + Events extends beyond this scope but is still driven by the standards and best practices employed by such a framework. We focus on Web Annotation, which IIIF uses to supplement the data resources under its scope. We realized that using IIIF resources as subjects for geographic Web Annotations could lay a foundation for standardized, interoperable, usable, and geospatially described linked open data in the broader Internet of Things.

Read More

Maps and Humanities

It should not be a surprise that data describing humanity depends heavily on the dimensions of time and space. How else can data representing human interaction, error, rectification, discovery, or adventure exist without these dimensions? The human construct of reality stands on concepts that give bounds to infinity. Time and space afford humans the concepts of change that describe their existence in this construct. The human ability to communicate is born of these concepts. Without time and space, or without the concepts of change and difference, humanity, and perhaps all biological life, is not well described. It is significant that humans live on planet Earth in the Milky Way galaxy, a distinction one might take for granted, and one that may change with time.

Read More

Structural REform

The world of annotations is one of unstructured targeting. There is an existing piece of data somewhere in the world and an annotation targets it. At a higher level we work with data objects called Manifests. The Manifests we work with follow the construct created by the IIIF framework (https://iiif.io/api/presentation/2.1/#manifest). An important attribute of a Manifest is the ability to organize a structure around the data in scope of the Manifest (https://iiif.io/api/presentation/2.1/#range) using the “structures” property. In this way, multiple structures can be built to offer different representational hierarchies of the data in scope.
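A minimal sketch of that property, following the Presentation 2.1 shapes linked above; the URIs and labels are hypothetical, and a real Manifest would also carry its sequences and fuller Range metadata.

```typescript
// Two Ranges in "structures" offering different hierarchies over the same Canvases.
const manifestWithStructures = {
  "@context": "http://iiif.io/api/presentation/2/context.json",
  "@id": "https://example.org/iiif/book-1/manifest",
  "@type": "sc:Manifest",
  label: "Book 1",
  structures: [
    {
      "@id": "https://example.org/iiif/book-1/range/chapter-1",
      "@type": "sc:Range",
      label: "Chapter 1",
      canvases: [
        "https://example.org/iiif/book-1/canvas/p1",
        "https://example.org/iiif/book-1/canvas/p2"
      ]
    },
    {
      "@id": "https://example.org/iiif/book-1/range/illustrations",
      "@type": "sc:Range",
      label: "Illustrations",
      canvases: ["https://example.org/iiif/book-1/canvas/p2"]
    }
  ]
};
```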

Read More

A IIIF Concordance

As IIIF Manifests increasingly contain transcription annotations, the tools that are able to handle these documents must also treat text content as a first-class citizen. In our TPEN software, users regularly generate new annotations with text content that is included in the manifest documents made available for every project. In addition, TPEN allows for the easy inclusion of split-screen tools in the transcription interface, so I often spend some time experimenting with possible tools that assist in transcription, connect resources, or generate useful visualizations.
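As a rough illustration of treating that text as a first-class citizen, here is a sketch of a concordance builder over simplified transcription annotations; the shapes are assumed (text carried in a resource.chars property, the targeted page in on), not necessarily TPEN's exact output.

```typescript
// Build a word -> set of canvases concordance from transcription annotations.
interface TextAnnotation {
  on: string;                  // the Canvas (page) the text targets, possibly with a fragment
  resource: { chars: string }; // the transcribed text content
}

function buildConcordance(annotations: TextAnnotation[]): Map<string, Set<string>> {
  const concordance = new Map<string, Set<string>>();
  for (const anno of annotations) {
    const canvas = anno.on.split("#")[0]; // drop any fragment selector
    for (const raw of anno.resource.chars.split(/\s+/)) {
      const word = raw.toLowerCase().replace(/[^\p{L}\p{N}]/gu, "");
      if (!word) continue;
      if (!concordance.has(word)) concordance.set(word, new Set());
      concordance.get(word)!.add(canvas);
    }
  }
  return concordance;
}
```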

Read More

The Standards Approach

As developers in the field, we want to follow the standards emerging for the web and for data. For the challenges our field faces, we often combine RESTful API practices, CORS, Web Annotation, Web Components, IIIF, JSON, JSON-LD, and Linked Open Data standards so that the APIs and applications we create are automatically interoperable with other APIs and applications built under the same guidelines. The Walter J. Ong, S.J. Center for Digital Humanities at Saint Louis University is facing these challenges in particular as we develop RERUM.

Read More

Auth + Attribution of Open Data

Open Data is supposed to be accessible without any constraints on availability. The idea of authentication around Open Data is an oxymoron, but in practice we have found great benefit in keeping track of who can claim ownership of an object, and in using ownership to put natural restrictions on the openness of data so it becomes a more comfortable realm for people.

Read More

Authentication and Attribution in RERUM

Any new web service or application must take a considered look at authorization, authentication, and attribution: authorization, to make changes to data; authentication, to ensure those making changes are known; and attribution, to apply proper credit for contributions. The prevailing practice is to authenticate users within applications and to use the appropriate context to make attributions. Popular transcription software like TPEN and FromThePage relies on user accounts and a private database to authenticate, attaching attribution based on the user’s account information in the interface and whenever the data is exported, for example as a IIIF manifest document. Our goal to make RERUM a potent supplement to the heavier data APIs these types of interfaces rely on forced us to reevaluate the “obvious” choice to create and authenticate users.
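One way attribution can travel with the data itself is the W3C Web Annotation model's creator property. The sketch below is a generic illustration of that idea, with placeholder URIs, not necessarily how TPEN, FromThePage, or RERUM record attribution internally.

```typescript
// A Web Annotation carrying its own attribution via the standard "creator" property.
const attributedAnnotation = {
  "@context": "http://www.w3.org/ns/anno.jsonld",
  id: "https://example.org/annotations/line-42",
  type: "Annotation",
  creator: {
    id: "https://example.org/users/jdoe",
    type: "Person",
    name: "J. Doe"
  },
  body: { type: "TextualBody", value: "a transcribed line of text" },
  target: "https://example.org/iiif/ms-1/canvas/4#xywh=10,200,900,60"
};
```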

Read More

Forgetting Deleted Objects in RERUM

At the Walter J. Ong, S.J. Center for Digital Humanities, we have been working hard on RERUM, the public object repository for IIIF, Web Annotation, and other JSON documents. The latest feature we’ve been diving into for the 1.0 release is DELETE. As covered in the documentation on GitHub, there are a few guiding principles that are relevant here:

Read More

Deleted Objects in RERUM

In the last post, we explored how the tree of the version history is healed around a deleted object. In this post, we look more directly at the transformations to the deleted object itself. Let’s take the same abbreviated object to begin:

Read More

Versioning in RERUM

Versioning as it is known in software is simply the process of preserving previous iterations of a document when changes are made. There are many systems available to the developer which differ in centralization, cloning behaviors, delta encoding, etc., but for our purposes, the philosophy and utility should suffice.
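As a toy illustration of the philosophy (not RERUM's actual schema), imagine each saved version keeping a pointer to the version it replaced, so any version can be walked back to the original.

```typescript
// Each version points to the version it superseded; walk the chain to recover history.
interface VersionedDoc {
  id: string;
  previous?: string;            // id of the version this one replaced
  body: Record<string, unknown>;
}

function historyOf(id: string, store: Map<string, VersionedDoc>): VersionedDoc[] {
  const chain: VersionedDoc[] = [];
  let current = store.get(id);
  while (current) {
    chain.push(current);
    current = current.previous ? store.get(current.previous) : undefined;
  }
  return chain; // newest first, original last
}
```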

Read More

TPEN Updating the transcription interface. Part 2.

The last blog covered a little bit about the challenge we laid out for ourselves in reworking the T-PEN transcription interface. We set out to see if we could rearrange and reorder the interface to be cleaner and easier to use, improve access to the hidden tools, privilege the most-used tools, and make tool behavior more consistent, all without abandoning any tool. In the last blog we talked about what we did to support transcription directly. In this blog we will talk a little about how we arranged our tools around transcription, setting the various tools at different distances from the transcription itself, both as a matter of physical layout and through different modes of interaction.

Read More

Page tools and getting more out of your images

Page tools - CSS3 to the rescue. The page tools split screen was developed to take advantage of CSS3 image manipulation tools. The reason was that while we allow access to 4000+ manuscripts, the vast majority of our traffic is private uploads. And those uploads are invariably poor-quality microfilm scans, photocopies of such scans, or even photos taken in poor lighting with handheld cameras and phones. While it is wonderful to be able to use the high-quality images of a repository, we all know that such images are not an option because of availability, cost, or the resources of the institution.
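A minimal sketch of the kind of in-browser adjustment such page tools can rely on, using the CSS3 filter property; the selector, default values, and function name here are illustrative only.

```typescript
// Adjust contrast, brightness, and inversion on the page image entirely in the
// browser, without touching the original file.
function applyImageAdjustments(
  img: HTMLImageElement,
  opts: { contrast?: number; brightness?: number; invert?: boolean } = {}
): void {
  const { contrast = 1.4, brightness = 1.1, invert = false } = opts;
  img.style.filter = [
    `contrast(${contrast})`,
    `brightness(${brightness})`,
    invert ? "invert(1)" : ""
  ].join(" ").trim();
}

// Example: sharpen up a washed-out microfilm scan
const pageImage = document.querySelector<HTMLImageElement>("#page-image");
if (pageImage) applyImageAdjustments(pageImage, { contrast: 1.6 });
```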

Read More

Identity through Evidence

Let’s take a real case and find out what it indicates and how it ought to be annotated. This image is the first page of a six page (3 folio) letter from Private Allen Gooch to his family, but let’s not get ahead of ourselves.

Read More

TPEN Updating the transcription interface. Part 1.

The Center had the good fortune last year to work on a custom embedded version of TPEN for the new French Paleography website from the Newberry Library. The University of Toronto did a great job of building out the site while we turned the backend of TPEN into web services, allowing for more flexible versions of the front-end transcription interface to accommodate Newberry’s needs and to better suit early modern French paleography. This year we are able to bring those changes into t-pen.org.

Read More

Tradamus

Tradamus is a free web application for creating digital Critical Editions. Whether you have plain transcriptions of your text or fully TEI-encoded documents, you can bring them together in Tradamus to build a Critical Edition using the methodology that you want. From the Apparatus Criticus to the final publication, you decide!

Read More