Digital Content Review: Process and Results

Version 1.1 - date last updated: 2 March 2016


The digital content review process assists repositories with achieving effective preservation planning, which relies upon current, comprehensive, and cumulative information about the digital content an organization currently manages and/or anticipates managing. To complete a digital content review, the digital preservation team gathers information and accumulates it iteratively as part of a structured process. The results of ongoing digital content reviews produce a digital content review dataset that enables near-term and long-term planning by organizations.


A digital content review:

  • is a structured process to complete a gap analysis for the majority of your digital content types
  • uses a template with questions pertaining to decisions and current practice for life cycle stages including: selecting, acquiring, processing, storing, preserving, disseminating, and managing rights
  • documents the current status of your workflow and practice for each digital content type
  • identifies implications for your organization in managing digital content types across generations of technology
  • recommends possible solutions and priorities for addressing implications of taking in or acquiring more digital content of a particular type


Each review includes these components:

1. Individual digital content overviews:

Note: content that is not digital and is not a candidate to be digitized is not reflected in digital content overviews; physical or analog content that is replaced by digital formats will be reflected in versions of the digital content overview results when the digital content is acquired

  • Provide a concise scope for each digital content type that reflects key aspects of life cycle management rather than subject-based categories and descriptive metadata, which are extensively captured and tracked in other processes and systems
  • Develop a diagram for each content type showing categories of content, known relationships between categories, the relative size of categories within the content type, and a rough indication of the amount of content that is currently managed or anticipated, to support planning for growth
  • Identify categories within each content type that have common life cycle-based characteristics (e.g., use a common workflow to receive content from producers, are processed by a central unit or using a common workflow once received, are discovered or made available in a common way)
  • Represent categories within each content type as:
    • circles (nicknamed "buckets") for content that is currently managed
    • triangles for known content that is managed elsewhere on campus, and
    • squares for content that is provided as a service but not managed or preserved by the DP program
  • Category definitions within each digital content type are based on factors including: the type of content (e.g., source), how content is received, how it is processed, how content is discovered and used, and how rights may affect use
  • Iterations of the overviews show the accumulation of content over time and progress towards addressing objectives for managing and providing content more effectively
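The diagram conventions above (buckets, triangles, squares) can be captured in a simple data structure. The sketch below is purely illustrative and not part of the documented DCR process; all class names, field names, and the example content type are hypothetical assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum

class Shape(Enum):
    """Diagram conventions from the DCR overview process."""
    BUCKET = "circle"      # content currently managed
    TRIANGLE = "triangle"  # known content managed elsewhere on campus
    SQUARE = "square"      # provided as a service, not managed or preserved by the DP program

@dataclass
class Category:
    name: str
    shape: Shape
    relative_size: float                 # relative size within the content type
    related_to: list = field(default_factory=list)  # known relationships between categories

@dataclass
class ContentTypeOverview:
    content_type: str
    categories: list

    def managed(self):
        """Categories currently managed (the 'buckets')."""
        return [c for c in self.categories if c.shape is Shape.BUCKET]

# Hypothetical example: a "research data" content type
overview = ContentTypeOverview(
    content_type="research data",
    categories=[
        Category("deposited datasets", Shape.BUCKET, 0.6),
        Category("lab-managed data", Shape.TRIANGLE, 0.3),
        Category("hosted-only data", Shape.SQUARE, 0.1),
    ],
)
print([c.name for c in overview.managed()])  # ['deposited datasets']
```

Capturing successive iterations of such overviews is one way to show the accumulation of content over time.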

2. Individual digital content reports:

  • Produces one or more concise reports per content type, depending on the results of the overview (can the range of content be captured in one report?)
  • Uses the DCR template to complete the report
  • Highlights areas and possible priorities for improved practice

3. Landscape view(s) of digital content

  • Provides a single-page view of the whole of known or documented digital content (example)
  • Represents each digital content type in relation to the other digital content types using common conventions
  • Documents changes and progress over time as results accumulate and as versions of the overviews are captured

4. Digital Content Review Dataset

  • Accumulates information gathered to complete the digital content overviews and the digital content reports
  • Enables ongoing life cycle management of current and anticipated digital content that supports good practice for digital curation and preservation
  • Utilizes available and current technologies to build, manage, and provide access to the data
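One way to picture the dataset component: each completed review contributes a dated record, and records accumulate across content types and over time. This is a minimal hypothetical sketch, not the production implementation mentioned below; the class names and fields are assumptions.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ReviewRecord:
    """One dated result from a digital content review."""
    content_type: str
    captured: date
    overview: dict   # summary of the overview diagram for this iteration
    report: dict     # summary of the DCR report for this iteration

@dataclass
class DCRDataset:
    """Cumulative dataset built from successive reviews."""
    records: list = field(default_factory=list)

    def add(self, record: ReviewRecord) -> None:
        self.records.append(record)

    def latest(self, content_type: str):
        """Most recent record for a content type, or None if never reviewed."""
        matches = [r for r in self.records if r.content_type == content_type]
        return max(matches, key=lambda r: r.captured, default=None)

# Hypothetical accumulation across two review cycles
dataset = DCRDataset()
dataset.add(ReviewRecord("web archives", date(2014, 5, 1), {}, {}))
dataset.add(ReviewRecord("web archives", date(2016, 3, 2), {}, {}))
print(dataset.latest("web archives").captured)  # 2016-03-02
```

Comparing a content type's latest record against earlier ones is what supports the near-term and long-term planning the dataset enables.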

Cumulatively, the results of digital content reviews help an organization to:

  • Address content-specific and general requirements
  • Prioritize digital content management improvements
  • Plan for growth and change in managing digital content

Five Stages Context for a Digital Content Review

  1. Acknowledge: be aware that gathering and managing information about your digital content is essential
  2. Act: initiate a digital content review project and complete a review of each of your digital content types
  3. Consolidate: capture cumulative results of digital content reviews to produce a landscape view of your digital content
  4. Institutionalize: complete periodic updates of your digital content review results
  5. Externalize: share your results with the community and encourage feedback


About the Digital Content Review process:

  • the need for organizations to explicitly develop a high-level inventory has been called out by the DPM workshops since the start of the program in 2003; the workshops identify the absence of a content inventory as a showstopper for the Technology Leg (essential for preservation planning)
  • informed by the curriculum we developed for the DPM workshops, Nancy McGovern developed a very basic version for use at Cornell University Library (2005); elaborated on that basic digital content review process while at ICPSR (2006-2011) with the addition of a report template, a structured process for completing these life cycle reports, and a set of examples; and further extended the process at MIT Libraries (2012-on) with digital content overviews and the definition of a DCR dataset, a curation dataset for continually managing digital content - the cumulative results of those examples are reflected in this overview of the DCR process
  • as her research project during her Digital Curation and Preservation Library Fellowship, Helen Bailey at MIT Libraries completed a linked data project to demonstrate how the DCR dataset might be managed; she described the project in her blog, and we are continuing to work towards a production version
  • Contributors at ICPSR: Anne Thompson, Courtney Egan; contributors at MIT Libraries: Helen Bailey, Liz Francis, Lorrie McAllister, Kari Smith, Ann Marie Willer