Volume 10 (2021-22)

Each volume of Journal of Digital Media Management consists of four 100-page issues, published both in print and online.

The articles published in Volume 10 include:

Volume 10 Number 4

  • Editorial
    Simon Beckett, Publisher
  • Papers
    The language metadata table: Providing a single, unified language reference source for media and entertainment
    Yonah Levenson, Co-Director, Digital Asset Management Certificate Program, Rutgers University, Bruce Devlin, Founder, Mr MXF, Eric Emeric, Senior Director of Metadata Management & Data Governance, Showtime Networks and Sarah Nix, Senior Director of Archives and Data Governance, Paramount Global

    The Language Metadata Table (LMT) offers a single source of reference for language codes for use throughout the media and entertainment ecosystem. This paper explains how it came to be, and why it has been embraced so enthusiastically. The paper also discusses the role of the Society of Motion Picture and Television Engineers (SMPTE) as the standards body responsible for the provision of language codes adopted by the LMT. This is followed by a case study of how Paramount Global has leveraged the LMT to resolve its language needs following a series of mergers and acquisitions.
    Keywords: language, metadata, media and entertainment, content distribution, terminology, localisation

  • Is your digital asset management system fit for purpose? Common challenges that signal outgrowth and key questions to ask when upgrading
    Josh Van Dyk, Vice President of North American Sales, censhare US and Chris Leaman, Vice President Solution Architecture, Emmsphere Plus

    Organisations looking to procure their first digital asset management (DAM) platform, or renew their existing solution, need something that not only meets their needs today, but that can also scale as the organisation grows, and use cases change and expand. This paper discusses the questions that companies must ask both of themselves and of vendors to ensure they select a mature platform and avoid the costly mistake of having to procure a new platform two years from now.
    Keywords: procurement, enterprise, Request for Proposal (RFP) questions, vendor questions, DAM maturity, digital asset management, upgrade

  • Metadata for conservators: Overcoming obstacles to digitising conservation documentation
    Ryan Lieu, Operations Manager and Technology Specialist, Conservation Services, Stanford University Libraries

    Conservation documentation plays an essential role in the long-term preservation of cultural property, and conservators are ethically responsible for keeping permanent and accessible records. While the conservation profession has been slow to digitise legacy documentation, Stanford Libraries Conservation Services began a process of regularly accessioning its conservation records to the university’s digital repository in 2018. To determine strategies for the effective cataloguing of accessioned conservation records, internal search needs were assessed to develop a metadata profile based on the MODS standard. This paper describes mappings from Stanford Libraries conservation data to MODS and Dublin Core elements, enumerates useful controlled vocabularies for describing conservation documentation in repository metadata and examines the potential benefits of descriptive metadata profiles to the conservation profession.
    Keywords: cataloguing, conservation, documentation, metadata, vocabularies

  • Finding one source of truth between multiple digital asset management platforms at Autodesk
    Sarah Britt, Digital Asset Management Librarian, Kristen Ederer, Digital Asset Management Librarian, Jacqueline Karpov, Product Manager, Content Management and Rebeka Martin, Digital Asset Management Librarian, Autodesk

    The Autodesk digital asset management (DAM) system uses an application programming interface (API) to connect assets between two digital asset management platforms. In preparing, designing and implementing the API, the DAM librarian team reduced taxonomic noise, consolidated file naming into a single schema and backed up the asset life cycle to a single archive. These decisions and implementations have helped the Autodesk DAM team to work with a single source of truth for marketing and sales assets. This paper covers how the DAM API functions, the benefits the API has brought the team and asset stakeholders, and the single source of truth model reinforced by the API call (see the illustrative sketch below).
    Keywords: API, storage solutions, content life-cycle management
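
    As a purely illustrative aid, the sketch below shows the general fetch-normalise-push pattern behind this kind of cross-platform sync. It is not Autodesk’s implementation: the endpoint URLs, field names and authentication handling are hypothetical assumptions, and Python with the requests library is used only for brevity.

      import requests

      SOURCE_API = "https://source-dam.example.com/api/v1"   # hypothetical endpoint
      TARGET_API = "https://target-dam.example.com/api/v1"   # hypothetical endpoint

      def sync_asset(asset_id: str, token: str) -> None:
          """Copy one asset record from the source DAM to the target DAM."""
          headers = {"Authorization": f"Bearer {token}"}

          # Fetch the canonical record from the source platform.
          record = requests.get(f"{SOURCE_API}/assets/{asset_id}",
                                headers=headers, timeout=30).json()

          # Normalise to one file-naming schema and a trimmed tag set before
          # pushing, so both platforms share a single source of truth.
          payload = {
              "file_name": record["file_name"].lower().replace(" ", "_"),
              "tags": sorted(set(record.get("tags", []))),
              "lifecycle_state": record.get("lifecycle_state", "active"),
          }

          response = requests.post(f"{TARGET_API}/assets", json=payload,
                                   headers=headers, timeout=30)
          response.raise_for_status()

    In practice such a sync would typically run from a webhook or scheduled job, but the fetch, normalise and push steps are the core of a single-source-of-truth model.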

  • Mote Marine Laboratory’s organisation-wide approach to digital asset management: A case study
    Alexis Crabtree, Manager of Visual Engagement and Jaime Fogel, Library & Archives Director, Mote Marine Laboratory

    Mote Marine Laboratory & Aquarium is an independent, non-profit marine science research and education institution, founded by Dr Eugenie Clark in 1955. What began as a small, one-room laboratory has since grown into a multi-campus enterprise with research and conservation programmes that span the spectrum of marine science, and a robust education division that provides programming to over 30,000 students each year, as well as an accredited public aquarium that welcomes around 300,000 visitors annually. This paper describes Mote’s search for a digital asset management (DAM) system that would work for the whole organisation — and keep pace with its planned growth. The paper discusses each stage of the procurement process, from research all the way to pitching the idea to executive leadership. The lessons learned during each phase of this process should prove insightful for any organisation looking to start its own DAM journey.
    Keywords: aquarium, laboratory, use case, pitch to management, stakeholder buy-in, organisation-wide approach

  • Benchmarking digital preservation with the Digital Preservation Coalition’s Rapid Assessment Model: A case study
    Robin Dean, Accessibility Technologist, Colorado School of Mines and Lisa Lorenzo, Metadata Librarian, Michigan State University Libraries

    This paper describes how Michigan State University Libraries used the Digital Preservation Coalition’s Rapid Assessment Model (DPC RAM) to evaluate its digital preservation practices. Using the model, a small group of librarians established a baseline of digital preservation activities at the Libraries and made recommendations for incremental improvements. By keeping the working group small and consulting stakeholders as needed, the benchmarking process was completed within a short timeline and resulted in a concise report that provided a digestible and accurate picture of digital preservation initiatives at the time of the evaluation, as well as realistic goals for the next 18 months. Sharing the report with library administration led to a charge to establish working groups that would test two preservation systems and evaluate their usefulness as tools for expanding digital preservation practices at the Libraries. The format of the model ensured that evaluation and goal-setting were straightforward, even in the absence of any unit or position dedicated to digital preservation. Speaking with peers at other institutions has revealed a need for a simple tool like the DPC RAM that can be completed more quickly than other preservation evaluation frameworks, especially given the lack of administrative support for digital preservation at many institutions.
    Keywords: digital preservation, DPC RAM, evaluation, assessment, digital collections

  • Safety in numbers: Criteria for the mass digitisation of obsolete videotape recordings
    Charles Fairall, Videotape & Engineering Adviser, BFI National Archive

    A key aim of preservation is to retain the essence and goodness of the original creation. In the case of videotape, however, this covers a vast and constantly evolving technological landscape. This paper discusses the substantial challenges faced by audiovisual archivists, who must effectively reinvent their collections of videotape recordings in data form to save them from impending format and carrier obsolescence. It describes challenges such as the shortage of both skills and spare parts, and stresses the paramount importance of quality assurance.
    Keywords: video, archive, engineering, technology, obsolescence, quality, digitisation, collections

  • Opening up: The National Gallery of Art’s Wikimedia project
    Benjamin Zweig, Digital Projects Coordinator, National Gallery of Art

    In 2018, the National Gallery of Art, Washington, began a project with the Wikimedia Foundation with the goal of donating its 53,000 open access images of public domain artworks to the Wikimedia Commons media repository. The project grew to include a contribution of 120,000 art object collection records to the linked data platform Wikidata and was completed in February 2020. This paper details the process of the contribution. It recounts the various stages and workflow of the project, from image preparation and metadata management to uploading and organising the images and data on Wikimedia Commons and Wikidata. It outlines some of the major challenges faced during the contribution process and discusses some of the various tools used to complete the project. Finally, it describes some of the project’s results, such as audience reach and user engagement, and outlines possible future steps.
    Keywords: museums, open access, Wikimedia Commons, Wikidata, art, digital humanities

  • Making collections more accessible and inclusive digitally during COVID-19
    Emma Hetrick, Graduate Research Assistant, Harry Ransom Center, The University of Texas at Austin

    This paper considers the possibilities of searching and visualising collection materials at the Harry Ransom Center (HRC). In doing so, it argues that digital archival engagement is not a replacement for in-person reading room experiences, but rather enables an altogether different type of engagement with such materials. Following an analysis of the accuracy of searches in the HRC’s digital collections, the article identifies areas for improvement in increasing digital access to the HRC’s collections. These analyses also reveal the misrepresentation or underrepresentation of historically oppressed groups (particularly in archives), including women, Black Americans and non-English speakers. The article concludes with recommendations for the more accurate and ethical representation of these groups in online archives.
    Keywords: digital collections, search, digitisation, data visualisation, bias

Volume 10 Number 3

  • Editorial
    Simon Beckett, Publisher
  • Papers
    DAM without governance is asking for trouble
    Michelle Roell, Taxonomist, Editorial Creative, Netflix and John Horodyski, Managing Director, Insights & Analytics, Salt Flats

    This paper presents the fireside chat session on data governance that was held as part of the 2021 Henry Stewart Festival of DAM. The session, titled ‘Governing your data in a world of constant change’, was conducted by John Horodyski, Managing Director, Insights & Analytics at Salt Flats, and Michelle Roell, Taxonomist, Editorial Creative at Netflix. The paper not only documents their discussion, but also expands on the importance of data governance, how to set up and run a governance committee, and where to look for further guidance and recommendations. The discussion also covers the importance of diversifying the information science field so that new perspectives and voices can participate on an equal basis.
    Keywords: data governance, metadata, digital asset management, DAM, collaboration, diversity and inclusion

  • Cooperative models for digital archives
    Emily Zinger, Southeast Asia Digital Librarian, Cornell University, Shaneé Yvette Murrain, Director of Community Engagement, Digital Public Library of America and Emma Thomson, Project Manager, Digital Scriptorium, Schoenberg Institute for Manuscript Studies

    Cooperative models for digital archives management offer innovative means of utilising resources, skills and knowledge from multiple institutions to support shared online collections. To outline the benefits and challenges of working within a consortial model, this paper provides three case studies, focusing on distributed governance, metadata quality assurance, resource sharing, financial security, and diversity and equity. While the increased number of stakeholders can complicate the management of a shared digital media repository, strategies such as hierarchical funding structures, means of ensuring transparent and equitable decision making, and a reliance on distributed skill sets can ensure a digital library project is successful for both partners and users.
    Keywords: digital archives, cooperative models, metadata, funding structures, governance structures, resource sharing

  • Risk management for digital assets: Questions companies should be asking about their systems and workflow for asset management
    Nancy Mandowa, Global Digital Assets, AE, Royal Caribbean International

    Too often, companies are unaware of the risks inherent in their digital asset management systems until they are faced with possible legal action. This paper discusses the questions that companies need to ask themselves about how they are managing asset distribution, both before and after purchasing any kind of digital asset management system.
    Keywords: risk management, asset protection, liability, rights management, asset usage tracking, workflow

  • The ecommerce explosion: Rising to the challenge with digital asset management
    Matt Haws, VP, Global Marketing, Esko

    To remain competitive in today’s dynamic market, brands are under increasing pressure to provide their customers with the ultimate personalised experience. This means e-commerce strategies must be aligned across all platforms, and digital assets must be of the highest quality and consistency. A strong digital asset management (DAM) architecture enables brands to centralise assets and product information, ensuring content is unified and delivered to the right people, systems and channels at the right time. When carefully planned and managed, DAM provides a powerful engine for e-commerce. In this paper, readers will learn about the efficacy of digital assets for e-commerce, how to manage asset data effectively, and how brands can differentiate themselves through personalisation.
    Keywords: e-commerce strategy, omni-channel consistency, DAM architecture, data management, data flow, personalisation

  • Navigating from metadata disparities toward best practices: Analysis through crosswalking
    Jamie Patrick-Burns, Digital Archivist, State Archives of North Carolina, Robin Haley, Postgraduate Student and Owen King, Postgraduate Student, School of Information and Library Science, University of North Carolina at Chapel Hill

    As organisations responsible for the management of digital assets grow, their collections eventually attain a critical mass where the need to shift from ad hoc procedures to industry standards becomes apparent. This paper provides a case study from the State Archives of North Carolina to illustrate one organisation’s approach to analysing, reconciling and remediating disparate sources of metadata. The methodology combines both bottom-up and top-down approaches to the assessment of digital asset metadata — a method that could be useful for other institutions when neither orientation by itself seems sufficient. While the example is taken from a cultural heritage institution in the government sector, the method is applicable to digital asset management practitioners in other settings as well (see the illustrative crosswalk sketch below).
    Keywords: metadata, crosswalk, databases, internships, electronic records, digital archives, digital repository
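
    As a purely illustrative aid (the local field names below are invented and are not the State Archives’ actual schema), a metadata crosswalk of the kind analysed in the paper can be expressed as a simple mapping applied to each record, with unmapped fields flagged for remediation:

      from typing import Dict

      # Hypothetical local field names mapped to Dublin Core elements.
      CROSSWALK: Dict[str, str] = {
          "RecordTitle":   "dc:title",
          "CreatorAgency": "dc:creator",
          "DateCreated":   "dc:date",
          "MediaFormat":   "dc:format",
          "AccessNote":    "dc:rights",
      }

      def crosswalk_record(local_record: Dict[str, str]) -> Dict[str, str]:
          """Map a local record onto Dublin Core; report unmapped fields."""
          mapped = {CROSSWALK[k]: v for k, v in local_record.items() if k in CROSSWALK}
          unmapped = [k for k in local_record if k not in CROSSWALK]
          if unmapped:
              print(f"Fields needing remediation: {unmapped}")
          return mapped

      # Example usage with an invented record
      print(crosswalk_record({"RecordTitle": "Flood relief footage, 1999",
                              "CreatorAgency": "Dept. of Public Safety",
                              "InternalNote": "not yet mapped"}))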

  • Digital collection self-migration at Loyola Marymount University: Accessing platforms and managing implementation
    Jessea Young, Scholarly Communications Librarian and Steph Gritz, Systems Librarian, William H. Hannon Library, Loyola Marymount University

    In 2017, the William H. Hannon Library at Loyola Marymount University (LMU) was using a locally hosted licence of CONTENTdm to manage its digital collections when it was informed that locally hosted instances of the platform would no longer be supported. The Systems & Digital Initiatives (S&DI) department took this change as an opportunity to assess a range of digital asset management systems. After an initial assessment period and a review of several product options, S&DI decided to self-migrate the library’s content to Adam Matthew’s new platform, Quartex. This paper describes LMU’s assessment of digital asset management systems and the process of self-migration, and highlights the challenges associated with self-migration for early adopters of a platform.
    Keywords: self-migration, digital asset management system assessment, digital asset management system migration

  • The museum of obsolete media: Tech evolution and the need for continuous data migration
    Joao Paulo Querette, Chief Executive Officer, Alfred.video

    With the digital universe growing exponentially and the recognition that, with the advancement of technology, all physical media will eventually become obsolete, it is essential for archivists and the developers of archive technologies to acknowledge the fundamental necessity of data migration. Unfortunately, the migration of archives is a costly and burdensome process, and as such is frequently postponed until it becomes urgent. This paper proposes a simpler model of ongoing and semi-automated migration, where the process becomes part of the daily activity and workflow of an archive, cutting costs and guaranteeing its continuity and longevity. To this end, the paper argues that digital archives should take advantage of data centre technologies, especially object-based storage, with the goal of creating object archives that are highly dynamic and abstracted from the physical media (see the illustrative sketch below).
    Keywords: migration, obsolescence, evolution, storage formats, storage technology, data centre, object storage, object archive
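
    A minimal sketch of the object-storage pattern the paper advocates, assuming an S3-compatible store reachable via boto3; the bucket name, key layout and metadata fields are illustrative assumptions rather than the author’s design:

      import hashlib
      import boto3

      s3 = boto3.client("s3")  # endpoint and credentials configured externally

      def archive_object(path: str, bucket: str = "av-archive") -> str:
          """Store one digitised file as an object with a fixity checksum,
          keeping the archive abstracted from any particular physical medium."""
          with open(path, "rb") as fh:
              data = fh.read()
          checksum = hashlib.sha256(data).hexdigest()
          key = f"masters/{checksum}"  # content-addressed key (illustrative)
          s3.put_object(
              Bucket=bucket,
              Key=key,
              Body=data,
              Metadata={"sha256": checksum, "source-path": path},
          )
          return key

    Ongoing, semi-automated migration then reduces to copying objects between buckets, storage classes or providers, which can be scripted and folded into an archive’s routine workflow.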

  • Photographic regenerative interfaces
    Ram Shergill, PhD Candidate, The Bartlett School of Architecture

    In the 21st century, photographic images are everywhere and are being advanced further by progress in image-making through photographic technologies. The progressive and contemporary camera creates images that produce a ‘fantasy through product’. These images, once used in branding and communication, tell consumers a story, illustrating concepts that cannot be seen in ‘real’ images from the natural world. The constructed digital image is thus used as a ‘projection model’, furthering communication paradigms and design technologies. The digital photographic ‘regenerated’ image becomes an imaginative and technical communicative tool relaying a vision. The unaltered original raw image itself becomes an interface and is further modified to create new narratives — the itinerant image becomes an inextricable link between a brand and the consumer. Novel image production methods and methodologies are necessary for scientific and technological advancement and are a useful tool for development as industry approaches Industry 4.0. This paper analyses how photographic regenerative interfaces and image regeneration methods can expand visibility and affect the world through the interlinking and symbioses of images. It also analyses how the pre-existing raw image will soon be used as a design tool, leading to potential uses of photographic regenerative interfaces for novel applications.
    Keywords: photography, imaging, design, technology, regenerative

Volume 10 Number 2

  • Editorial
    Simon Beckett, Publisher
  • Papers
    Lessons learned: Collaborating to digitise Yiddish-language collections at Cornell
    Barbara Morley, Digital Archivist (retired), Steven Calco, Research Archivist and Elizabeth Parker, Technical Services Archivist, Kheel Center for Labor-Management Documentation & Archives, Cornell University

    This paper describes a grant-funded collaboration between Cornell University Library, the Kheel Center for Labor-Management Documentation & Archives, Digital Consulting and Production Services, and faculty members from Cornell’s Jewish Studies Program to select, digitise, describe and disseminate English and Yiddish-language records created between 1930 and 1953 by the Jewish People’s Fraternal Order division of the International Workers Order. To serve researchers in the Yiddish-language community as well as English readers, Yiddish-language documents were partially translated and transliterated to reduce the language barrier and facilitate discovery and analysis. The project required a wide variety of specialists to communicate effectively and negotiate goals, workflows and standards. As this paper discusses, responsibilities such as file and document management, metadata creation and review, and quality control are especially demanding when working with documents written in a language and alphabet that most of the project participants do not understand. Challenges and opportunities encouraged collaborators to re-evaluate and redesign certain existing workflows, and reaffirmed the benefit of others. The online dissemination of the records has opened access to a global audience, promoting scholarly work based on the digitised records along with greater interest in related analogue collections.
    Keywords: Yiddish, digitisation, project management, metadata, quality control, labour archives, collaboration

  • Establishing an archive asset management system at the Sharjah Broadcasting Authority
    Jean-Christophe Kummer, Managing Director and Silvester Stöger, Project and Account Manager, Archive Asset Management, NOA GmbH

    This paper describes the establishment of a videotape digitisation factory and an archive asset management system at the Sharjah Broadcasting Authority, in the United Arab Emirates. It explains the context for the decision to adopt in-house digitisation, along with the arguments for adopting the proposed solution. Special focus is put on the preservation process, cataloguing needs, integration with production asset management systems, multi-site setup and disaster recovery solutions.
    Keywords: case study, archive asset management, digitisation, requirements, implementation, disaster recovery

  • Building international partnerships for digitisation and preservation
    Rachel Deblinger, Director of the Modern Endangered Archives Program, Jennifer Osorio, Head of International and Area Studies, Sharon E. Farb, Associate University Librarian for Distinctive Collections and Todd Grappone, Associate University Librarian for Digital Initiatives, Research and Development, UCLA Library

    For the past decade, the UCLA Library’s International Digital Ephemera Project and Modern Endangered Archives Program have worked to create open access collections of international content through post-custodial partnerships. This paper details the work of those initiatives to support communities, archivists, librarians and researchers around the world in setting priorities, understanding complex legal issues and providing training and technology for cultural heritage preservation where content is most at risk. It describes how the UCLA Library works alongside a growing network of partners to preserve and present diverse voices and perspectives on global events and to ensure that community narratives about individual and collective experiences are accessible. It also details both the ethic of care that informs the library’s work and the infrastructure that enables access to and discovery of these community collections as it improves and expands the scope of these programmes.
    Keywords: post-custodial, community-building, digital archives, international, open access

  • My TV lies over the ocean: The digitisation of France’s overseas heritage
    Geneviève Abadie, Project Manager and Brice Amouroux, Deputy Head, Technical Department, INA, Gérard Letienne, Founder and Honorary President, Vectracom and Sandrine Boulon, Linguist, Reverso-Softissimo

    This paper describes the successes and setbacks of a 12-year project to digitise over 150,000 hours of audiovisual footage from France’s overseas territories. After a brief overview of the major milestones of the project, the paper focuses on technical and metadata processing issues, and the various unexpected problems that were faced along the way.
    Keywords: mass digitisation, metadata alignment, itinerant technical facility, damaged audiovisual carriers

  • The digitisation and commercial licensing of local television news as a model for preservation and access: A case study
    Anna Esparza, Digitization and Metadata Specialist, University of North Texas and Morgan Davis Gieringer, Head of Special Collections, University of North Texas Libraries

    Since 2014, the Special Collections Department of the University of North Texas Libraries (UNTL) has worked to digitise over 60 years’ worth of historical film and video and provide online digital access to news programming that was originally broadcast under the name WBAP-TV, and later NBC 5/KXAS-TV. Revenue from commercial licensing is expected to contribute an increasingly significant percentage of the funding needed to make the collection fully digitally accessible. This paper looks briefly at the history of the NBC5 news collection and examines the workflows required to prepare film and video for digitisation, perform quality control on digital files, and apply metadata keywords and content description to make footage easier for filmmakers to discover online. The paper will also discuss the licensing workflows in place at UNTL and how footage has been licensed for use in a variety of projects.
    Keywords: news film, videotape, digitisation, digital workflows, commercial licensing, special collections

  • Artificial intelligence for a role change in television archives: The Atresmedia–Etiqmedia experience
    Eugenio López de Quintana, Head of Archive, Atresmedia Corporación de Medios de Comunicación and Antonio León Carpio, Managing Director and Founder, Etiqmedia

    Given the increasing volume of audiovisual content being produced, combined with the growing demand for granularity in image searches, it is becoming increasingly impractical for television archives to catalogue their collections without automatic processing and artificial intelligence technologies. This paper describes a pioneering project to implement such technologies.
    Keywords: automation, information processing, artificial intelligence, audiovisual content, television archives, media content, speech recognition, facial recognition

  • Overcoming barriers to audiovisual preservation in cultural heritage institutions
    Biz Maher Gallo, Statewide Digitization Initiatives Coordinator, Library of Michigan and Sarah Mainville, Media Preservation Librarian, Michigan State University Libraries

    This paper provides an overview of the issues that organisations face with regard to the preservation of the analogue audiovisual material in their archival collections. It discusses what stewards can do right now to take better care of their collections and points to resources for training and funding opportunities. The paper gives particular attention to inventories, preservation planning, storage and funding.
    Keywords: audiovisual, preservation, digitisation, media, under-funded

Volume 10 Number 1

  • Editorial
    Simon Beckett, Publisher
  • Papers
    Automation and integration are the keys to successful digital asset management
    Paul Murphy, Digital Image Expert, UEFA

    This paper examines how digital asset management at UEFA has grown from a traditional library and archive system to an integral part of the organisation’s ecosystem. It describes the various challenges faced and discusses the lessons learned.
    Keywords: facial recognition, data model, data mapping, sport

  • Eighty years of literary audio archives at the Library of Congress: Preserving collections from the physical to digital
    Kristy Darby, Digital Collections Specialist, Library of Congress, et al.

    This paper reviews the 80-year history of two Library of Congress literary audio archives — the Archive of Recorded Poetry and Literature and the PALABRA Archive — and details the challenges and opportunities that the dawn of the digital era posed for such collections. Curators and archival professionals who had been accustomed to analogue collection frameworks and workflows began to develop strategies for digitisation and digital access, paving the way for the establishment of the Library’s Digital Content Management Section. This new section’s Digital Collections Management Compendium outlines the institution’s policy and guidance for its digital content managers. New complexities with handling digital files highlighted the need to develop innovative digital processing workflows, as well as the importance of documenting these workflows and techniques for future processing efforts. Continuous documentation and efforts to process digital files have led to increased confidence in utilising scripting for batch processing as well as an improved understanding of the requirements for making this content accessible. The collaboration between literary audio archives curators and digital content managers laid the foundation for similar digital preservation practices that the institution continues to build upon for other projects, and continues to ensure the successful transition of these historic literary collections into the digital era.
    Keywords: audio archives, literature, digital collections, workflows, digital preservation, collaboration, analogue to digital conversion

  • The Olympic World Feed Project: Searching, acquiring and preserving the international television signal of the Olympic Games from 1956 to 1988
    Yasmin Meichtry, Associate Director and Isabel Sánchez, Project Manager, Heritage — Images & Sounds at Olympic Foundation for Culture and Heritage

    This paper describes the project carried out by the Olympic Foundation for Culture and Heritage (OFCH) to locate, acquire and preserve all official televised coverage (ie the international television signal) of the Olympic Games broadcast between 1956 and 1988. After providing a general overview of the audiovisual archives of the International Olympic Committee (IOC), which are managed and overseen by the OFCH, and outlining the OFCH’s mission and challenges in this respect, the paper discusses how the television broadcasting of the Olympic Games has evolved and the impact of these developments on the race to repatriate and preserve this content. The paper goes on to discuss the Olympic World Feed Project, describing the planning that went into recovering more than 3,000 hours of content missing from the IOC’s archives, as well as the various steps involved and the numerous challenges faced along the way. In particular, the paper provides a case study of the acquisition of the international television signal produced during the XIV Olympic Winter Games — Sarajevo 1984.
    Keywords: international signal, Olympic Games, Olympic movement, digitisation, preservation, heritage, archives, Olympics

  • The Glastonbury New Bands Competition Collection at the British Library: Technical challenges of migrating an optical disc collection
    Tom Ruane, Preservation Audio Engineer & Trainer, British Library

    The Glastonbury Festival kindly donated the 2004–2009 entries for its Emerging Talent Competition to the British Library for long-term preservation. The collection comprises more than 4,000 optical discs, primarily Compact Disc-Recordable (CD-R), and was migrated to a stable file-based format in 2019, as part of the Library’s Unlocking Our Sound Heritage project. Migration presented many challenges — while large parts of the workflow could be automated, and the audio data extracted verifiably bit-perfect at faster than real time, the collection is populated by poor-quality CD-R stock, with many examples written using low-quality domestic burners, making the process of data extraction significantly more complex. The Library not only had to develop a multi-stage workflow with sufficient capacity to process the large quantity of discs efficiently, but the workflow also had to be easily adjustable to accommodate the 1:1 transfer of severely damaged or degraded discs. This paper outlines the step-by-step technical workflow and the hardware and software employed to extract this unique content for long-term preservation.
    Keywords: cultural heritage, audio-visual, archive, preservation, compact disc, workflow

  • Validating media quality in a largescale outsourced media migration project: The second layer
    Christoph Bauer, Senior Expert in Research and Development, Österreichischer Rundfunk and Jörg Houpert, Head of Technology, Cube-Tec International

    This paper describes the quality control (QC) processes for a large-scale digital migration project conducted by Österreichischer Rundfunk, the Austrian Broadcasting Corporation. It provides an overview of the initial requirements and the motivation for introducing an additional, independent QC layer, and describes the implementation process as well as the actual service itself. It draws on first-hand experience to provide a detailed introduction to the technology used for the task. The examples presented and the conclusions drawn provide the reader with the information they need for their own preservation and migration projects.
    Keywords: quality control (QC), video cassette migration, auto QC, MXF file inspection, trusted playback

  • Badge drops and points galore: Crowdsourcing metadata at a public library archive
    Heidi Morse, Library Technician — Archives, Ann Arbor District Library

    This paper discusses two strategies for crowdsourcing metadata at a public library archive: (1) leveraging social media networks to identify people and places in photographs; and (2) gamification techniques for large-scale metadata collection. A comparative approach leads to conclusions about best practices for different contexts and goals. Crowdsourcing initiatives encourage participants to build a relationship with an institution or repository. Smaller-scale crowdsourcing campaigns often succeed not just in collecting metadata, but in drawing in new users. Crowdsourcing via social media can boost community engagement, strengthen partnerships and raise awareness about collection highlights. Larger-scale metadata collection via gamification is more oriented towards individuals and existing users. Designed with user engagement in mind, crowdsourcing via online games can succeed in both digital outreach and metadata collection.
    Keywords: community engagement, crowdsourcing, digital collections, games, metadata, social media

  • The audible and the inaudible in a post-digitised world: Preserving both sound and object
    Lauren Walker, Head of Digital Projects, Harry Ransom Center, The University of Texas at Austin

    Analogue audio materials often exhibit indexical traces of recording, editing and playback that offer insight into society’s relationship with recorded sound as it has changed over time. The obsolescence of analogue audio media has rapidly ushered the experience of sound recordings into the digital environment. As archives race to digitise analogue media before the media degrade too far to allow playback, it is important to capture a holistic representation of sound recordings. This paper addresses the challenges of preserving and making accessible the physical and often inaudible content of analogue audio media. The paper outlines approaches to enriching the preserved content of audio recordings, especially as data must stand in for a physical encounter with the media, and illustrates ways to enhance the access experience that encourage discovery and research into the creative practices of audio recording.
    Keywords: preservation, digitisation, sound archives, special collections, cultural heritage, sound studies

  • Discovering, describing and digitising CCTV: Challenges and attempts of making New York Chinatown’s community television archive accessible
    Klavier Jie Ying Wang, Moving Image Archiving and Preservation, New York University

    Archives created by and of ethnic minority groups play an important role in the preservation of cultural heritage. However, making such archival materials effectively and openly accessible is not straightforward, not least due to language barriers and the limitations of digitisation. Using the case of a Chinese American audiovisual collection housed at New York University, this article demonstrates the challenges associated with making a historically important archive more visible and accessible. As the article shows, digitisation does not necessarily lead to immediate accessibility. The article discusses ways to elevate archive visibility and accessibility, not simply through technical workflows but also through outreach campaigns. Thoughts on granting more open access to ethnic minority archives are also included.
    Keywords: Chinese American, community media, Asian CineVision, CCTV, archival description, video digitisation, archival access