This is the recording of an AVP webinar given on the topic of the Fixity application by Amy Rudersdorf on February 8th, 2017.
Webinar description: How do you know if your digital files are corrupt, missing, moved, or renamed? We invite you to join us to learn how Fixity will allow you to monitor and report on file integrity and attendance. How does it work? Fixity scans a folder or directory, creating a manifest of the files including their file paths and their checksums, against which regular comparative analyses are performed. Fixity then emails a report to the user documenting flagged items along with the reason for the flag, such as a file having been moved to a new location in the directory, having been edited, or failing a checksum comparison for other reasons. When run regularly, Fixity becomes a powerful tool for monitoring digital files in almost any storage location.
This is the recording of an AVP webinar given on the topic of the AVCC application by Rebecca Chandler on January 27th, 2017.
Webinar description: For organizations planning preservation work with their audiovisual materials but confronted with little or no quantitative data about their collections, we invite you to join us for a look at the inventory tool, AVCC. An open source web application and guideline, AVCC was developed to enable collaborative, efficient item-level cataloging of audiovisual collections. The application incorporates built-in reporting on collection statistics, digital storage calculations, shipping manifests, and other data critical to prioritizing and planning preservation work with audiovisual materials. AVCC establishes a minimal set of required and recommended fields that provides basic intellectual control and enables collection holders to quantify and plan a reformatting project.
This is the recording of an AVP webinar given on the topic of our Exactly tool by Chris Lacinak on January 19th, 2017.
Webinar description: For organizations challenged by remotely and safely transferring born-digital material from a sender to a recipient, we invite you to join our workshop on Exactly, a simple and easy-to-use application that utilizes the BagIt File Packaging Format standard. Exactly supports FTP and SFTP transfer, as well as standard network transfers, and integrates into desktop-based file sharing workflows such as Dropbox or Google Drive. Additionally, Exactly allows the recipient to create customized metadata templates for the sender to fill out before submission.
AVP has developed or contributed to the development of several tools for the inventory, assessment, and preservation prioritization of physical audiovisual materials, ranging in approach from collection-level or format-level analysis down to item-level cataloging and selection. Primary among these are MediaScore and MediaRivers from Indiana University, our Catalyst inventory tool, and the AVCC inventory and planning tool.
The approach one takes in such preservation efforts and the tools one might use depends on the scope of collections, the budgets and staffing available, and the end goals of the project. This spreadsheet presents a comparative analysis of the tools available on our site to help you determine which one might be right for your collection. Visit our tools page to access the applications themselves.
This high-level checklist was created by Bertram Lyons for a program coordinated by the Conservation Center for Art & Historic Artifacts on collecting and preserving oral history materials in libraries, museums, and archives. This checklist does not consider the act of recording oral histories as a collector or interviewer; it documents basic principles for managing oral history collections within a collecting repository.
Whether outsourcing or digitizing in-house, collection managers need to be able to define the parameters and specifications for preservation reformatting in order to properly care for their assets and to control and understand the outcomes of the digitization process. In association with the ARSC Guide to Audio Preservation, AVP is releasing this Guide to RFPs for the Digitization of Audio, along with recommendations for technical and preservation metadata to collect during the process and a sample spreadsheet to obtain estimated pricing from digitization vendors. Every digitization project and every organization's requirements are different; this guide is a starting point for creating an RFP specific to those needs.
In this tutorial, we explore how to understand and apply features of the OpenRefine (formerly Google Refine) tool in an archival context. OpenRefine can enable organizations to clean up, merge, and manipulate their metadata so that the information can be better integrated into workflows and across systems. OpenRefine is “a free, open source power tool for working with messy data” that libraries, museums, archives, and other organizations can employ to analyze, normalize, and clean up datasets through its simple yet powerful features.
In this tutorial, spreadsheets are positioned as a mechanism to help practitioners manage metadata more accurately, efficiently, and effectively. There are many opportunities to leverage sophisticated features in spreadsheet applications that allow you to work faster, smarter, and with greater accuracy towards a more robust metadata management system. The tutorial features instruction using Microsoft Excel. Self-directed exercises and practice worksheets are linked here:
With the increasing ingest of born digital and digitized collections, we are at the point (perhaps well past the point) of admitting that almost all archives are digital archives, and as a profession we must identify and gain training on the tools that will help us describe, store, and manage file-based collections in the same ways we do with physical collections.
The Command Line Interface (CLI) is a critical tool here, whether for managing files directly from ingest to storage or for accessing applications that offer only a CLI. These introductions to the CLI (for both Mac and Windows) provide a basic understanding of managing directories and files with the command line, skills which can be expanded from managing individual files to ingesting and caring for large sets of files in batches in order to save time and address the realities of file-based acquisition.
This four-part series of video tutorials, created by Kathryn Gronsbell, focuses on ExifTool, a command-line application that can read, write, and edit embedded metadata in files. The tutorial series provides detailed support to users looking for an approachable and practical introduction to ExifTool.
Featured exercises have wide-ranging applications but trend towards improving digital preservation workflows through a step-by-step exploration of Exiftool’s basic features and functions.
Clear articulation and understanding of goals and specifications is essential to ensuring the success of any project. Whether performing digitization work in-house or using a vendor, a statement of work or request for proposal serves as the foundation of the project.
This resource is intended to guide organizations in thinking critically about and discussing – internally and with vendors – the salient aspects of a request for proposal and the details within. Although this guide uses video as a focus point, it is relevant and applicable to all media types.
The human desire to classify and name is a highly personal and greatly prized act. Naming the files we create is no different, though the number of files and the tools used to manage them make consistent structure and application of file naming guidelines essential. What to do is very simple: be consistent. More to the point is what not to do in order to avoid pitfalls.
Becoming an effective advocate for your collections means becoming a proactive participant in the management and planning of their preservation and long term maintenance. The amount of work to do and the costs can feel overwhelming, but things will never change until you take charge, make a plan, and actively seek the resources you need. Here are 5 tips on how you can start to manage your collections rather than letting your collections manage you.
A simple Excel spreadsheet that shows the total capacity for 1/4 inch open reel audio, using variables for Track Configuration, Sound Field Configuration, Tape Thickness, and Reel Size. Assumes full tape reels and full use of capacity. Look for an online app version coming soon.
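The arithmetic behind such a capacity calculator is straightforward: reel size and tape thickness determine footage, and footage divided by tape speed gives duration per pass. As a hedged sketch (the footage table below uses common nominal values for 1/4-inch stock; actual tape lengths vary by manufacturer, and the function name is ours):

```python
# Nominal feet of 1/4" tape on a reel, keyed by (reel diameter in inches, thickness in mil)
FOOTAGE = {
    (5, 1.5): 600,     (5, 1.0): 900,     (5, 0.5): 1200,
    (7, 1.5): 1200,    (7, 1.0): 1800,    (7, 0.5): 2400,
    (10.5, 1.5): 2400, (10.5, 1.0): 3600, (10.5, 0.5): 4800,
}

def capacity_minutes(reel_in, mil, speed_ips, passes=1):
    """Total recording time in minutes for a full reel.

    `passes` reflects track configuration: e.g. 2 for half-track mono
    recorded in both directions, 1 for a single-direction recording.
    """
    feet = FOOTAGE[(reel_in, mil)]
    seconds_per_pass = feet * 12 / speed_ips   # 12 inches per foot
    return passes * seconds_per_pass / 60
```

For example, a full 7-inch reel of 1.5 mil tape at 7.5 ips runs 32 minutes per pass, which is why that configuration is often described as a "half-hour" reel.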
Papers & Publications
Part of our Feet on The Ground: A Practical Approach to The Cloud series, these profiles break down the offerings of third party cloud storage providers from a preservation point of view. Assessment points include Data Management, Reporting/Metadata, Redundancy, Accessibility, Security, End of Service, and adherence to the NDSA’s Levels of Preservation.
The ARSC Guide to Audio Preservation is a practical introduction to caring for and preserving audio collections. It is aimed at individuals and institutions that have recorded sound collections but lack the expertise in one or more areas to preserve them. Among the many expert authors of the Guide, AVP President Chris Lacinak contributed Chapter 7, “What to do after digitization”, and Senior Consultant Kara Van Malssen contributed Chapter 9, “Disaster prevention, preparedness, and response”.
The ARSC Guide to Audio Preservation was commissioned for and sponsored by the National Recording Preservation Board of the Library of Congress, and was co-published by the Association for Recorded Sound Collections (ARSC), the Council on Library and Information Resources (CLIR), and The Library of Congress. More information can be found on the CLIR website.
In 2014, AVP and the Northeast Document Conservation Center (NEDCC), with funding from The Andrew W. Mellon Foundation, undertook an in-depth, multi-faceted assessment to quantify the existing audio items held in institutional collections throughout the United States. This was performed in response to The Library of Congress National Recording Preservation Plan and its call for the appraisal of collections, as well as to establish a foundation for articulating the current preservation need of sound recordings in collections nationwide.
Our goal was to acquire enough trustworthy data to be able to answer questions such as “How many sound recordings exist in broadcast organizations across the US?” or “How many sound recordings exist in archives throughout the US?” Moreover, we wanted to answer more complex questions such as “How many of such items are preservation-worthy?” or “How many have already been digitized?” Prioritization for digitization is as critical as both funding and timeliness. The foundation for action on all three of these fronts is trustworthy quantitative data. This paper aims to provide such data along with supporting information about the methodologies used in its generation.
A guest opinion piece by Chris Lacinak featured in Post Magazine on the importance of using preservation-oriented workflows in a production environment. Establishing reliable preservation and archival practice makes sound business sense, promoting efficient and cost-effective workflows and providing findability and the wherewithal to support premium repurposing projects.
The initial release of five of our technical papers translated into Spanish. We hope to translate more resources soon and share them with our colleagues.
- NUEVE FACTORES A CONSIDERAR AL EVALUAR EL ALMACENAMIENTO EN LA NUBE (PDF)
- INTRODUCCIÓN A LA PRESERVACIÓN DE MEDIOS ÓPTICOS (PDF)
- INTRODUCCIÓN A LOS CÓDECS DE ARCHIVOS SONOROS Y AUDIOVISUALES (PDF)
- METODOLOGÍA MÁS PRODUCTO, MENOS PROCESO PARA EL PROCESAMIENTO DE COLECCIONES AUDIOVISUALES (PDF)
- LA RECUPERACIÓN DE LA COLECCIÓN MULTIMEDIA DESPUÉS DE LA SUPERTORMENTA SANDY (PDF)
In 2011, The Netherlands Institute for Sound and Vision, in collaboration with the audiovisual heritage network AVA_net, published a collection of essays on the topic of digital preservation entitled Making Invisible Assets: The Preservation of Digital AV Collections. The book is available for only the cost of shipping from Sound and Vision. AVP Senior Consultant Kara Van Malssen was one of the international professionals commissioned to write an essay for the collection. Her article, “Planning Beyond Digitization”, is available here in PDF:
Archiving and preservation consist of technology, people, and policies. For technology in particular, digital AV archives are largely indebted and beholden to a few sizable industries: cinema, broadcast, and information technology.
Commercial interests catering to the aforementioned industries have produced a seemingly attractive tool set that has the potential to provide archives with the ability to apply their policies in service of preservation-oriented workflows. Yet, even in the hands of larger well-resourced organizations, employing these tools can be challenging and resource intensive. How can smaller, resource-constrained AV archives efficiently apply cost effective tools and technologies to their workflows?
There are a variety of algorithms that can be used for generating checksums, with two in particular – MD5 and SHA-256 – being the most common. The comparative benefits and drawbacks of both are well-understood: while MD5 is weaker against random and deliberate collisions, it is faster to generate than SHA-256. However, there are no published empirical estimates for the difference in time-to-generate between MD5 and SHA-256 in archival and repository environments, leading to difficulty in making an informed decision as to which algorithm to implement for preservation monitoring.
This white paper documents a comparative checksum test of the same files under the same conditions, leading to some surprising findings about the actual processing speeds of the two algorithms.
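A minimal version of such a comparison is easy to run yourself with Python's hashlib module. This is a sketch of the method only; the white paper's figures come from its own test files and environment, and the helper name here is ours:

```python
import hashlib
import time

def time_digest(algorithm, data, rounds=5):
    """Return the best-of-n wall-clock time to digest `data` with `algorithm`."""
    best = float("inf")
    for _ in range(rounds):
        start = time.perf_counter()
        hashlib.new(algorithm, data).hexdigest()
        best = min(best, time.perf_counter() - start)
    return best

# Compare MD5 and SHA-256 over the same payload under the same conditions
payload = b"\x00" * (16 * 1024 * 1024)   # 16 MiB of test data
for algo in ("md5", "sha256"):
    print(f"{algo}: {time_digest(algo, payload):.4f}s")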
The basis of Bertram Lyons’ panel presentation at Digital Preservation 2014. To date, the difficulty and high bar of performing an internal assessment as a Trusted Digital Repository have made it hard for organizations to track or rank their progress towards digital preservation standards. AVP has been working on ways of adapting TDR risk assessment by improving reporting options and analyses. Two assessment tools currently in use for digital preservation risk assessment are the NDSA’s Levels of Digital Preservation Matrix (Version 1) and ISO 16363:2012 Audit and Certification of Trustworthy Digital Repositories.
The two tools offer overlapping yet distinct methods of analysis; both are useful, but they result in differing reporting classifications and outcomes that are not easy to reconcile. In order to encourage the use of the two tools under one roof, and especially to increase the outputs of a standard ISO 16363 assessment, AVP staff have mapped the Levels of Digital Preservation categories to the ISO 16363 requirements. A full paper on the topic will be available here soon.
Linked below is our work that documents the mapping of NDSA Levels of Digital Preservation categories to ISO 16363 criteria and the DigPres14 slidedeck. We offer these as an opportunity for community discourse and involvement. Please evaluate our mappings and let us know what you think to help us work towards a shared mapping that others can employ in a standardized way.
Physical audiovisual media collections are at risk for extreme levels of loss if action is not taken to preserve them in the next 10-15 years. Most archives are well aware of this critical issue, but are unable to move forward with preservation projects because it is difficult to quantify the intellectual impact and cost impact of action or inaction in order to advocate and secure budgets.
Our new Cost of Inaction Calculator provides graphics and metrics that compare resource expenditures, digitization and storage costs, and the rate of loss of physical media to help provide an approach to planning and advocating for preservation. This paper presents a sample case study showing how the COI model and Calculator can be used to support preservation efforts. This is a PDF version of an article that originally appeared in the International Association of Sound & Audiovisual Archives Journal No. 43, July 2014.
The latest technical brief from Digital & Metadata Preservation Specialist Alex Duryee explores the use of inodes in the functionality of Fixity, our free digital preservation file monitoring tool. Fixity offers the unique capability of tracking file attendance as well as file integrity.
In this free download, learn how we used filesystem structure to achieve that and how tracking files through their inode makes for a more powerful, more flexible monitoring approach.
In today’s world of digital information, previously disparate archival practices are converging around the need to manage collections at the item level. Media collections require a curatorial approach that demands archivists know certain information about every single object in their care for purposes of provenance, quality control, and appraisal. This is a daunting task for archives, as it asks that they retool or redesign migration and accession workflows. It is exactly in gaps such as these that practical technologies become ever useful.
This article offers case studies regarding two freely-available, open-source digital asset metadata tools—BWF MetaEdit and MDQC. The case studies offer on-the-ground examples of how four institutions recognized a need for metadata creation and validation, and how they employed these new tools in their production and accessioning workflows. By Alex Duryee and Bertram Lyons. This article originally appeared in the Practical Technology for Archives Journal, Issue 2, June 2014.
As the archival horizon moves forward, optical media will become increasingly significant and prevalent in collections. This paper sets out to provide a broad overview of optical media in the context of archival migration.
Author Alex Duryee begins by introducing the logical structure of compact discs, providing the context and language necessary to discuss the medium. The article then explores the most common data formats for optical media: Compact Disc Digital Audio, ISO 9660, the Joliet and HFS extensions, and the Universal Disk Format (with an eye towards DVD-Video). Each format is viewed in the context of preservation needs and what archivists need to be aware of when handling said formats.
Following is a discussion of preservation workflows and concerns for successfully migrating data away from optical media, as well as directions for future research. This is a PDF version of an article that originally appeared in the online Code4Lib Journal, Issue 24, 2014-04-16, ISSN 1940-5758.
Embedded metadata is a key component of managing digital files, providing information on the correct presentation, file source, rights, and other information which supports findability, access, authentication, preservation, and more.
This paper discusses the concept and uses of embedded metadata in general, and then looks more specifically at its use in WAVE audio files, focusing on the efforts of the Federal Agencies Digitization Guidelines Initiative (FADGI) to develop recommendations on embedding metadata in audio files created by government agencies. This project resulted in the development of BWF MetaEdit, a tool which allows users to view, edit, and create embedded metadata in WAVE files.
What happens to a collection when its sole caretaker suddenly goes away? This case study examines such a situation and how AVP’s Catalyst inventory solution was used to document an audio collection in support of preservation planning. Download the first in a series of case studies about practical, outcomes-based approaches to audiovisual collection appraisal and processing.
When evaluating cloud storage providers, it is dangerous to assume such services are only storage and therefore uncomplicated or that requirements for storage are obvious and therefore inherently met by the service provider. Experience with any technology selection will prove the opposite.
No two services are the same and the variance between services often represents the difference between successful implementation and a failed initiative. Never purchase a service without proper vetting; uninformed decisions risk loss of time, money, and even assets. These nine assessment criteria will help you get started in asking the right questions and making a practical, informed decision on using cloud storage for archival or preservation needs.
When “Superstorm” Sandy swept through the New York City region it left unforeseen levels of flooding and damage in its wake in areas such as Red Hook, The Rockaways, and the Chelsea Gallery District. Though prepared for anticipated levels of flooding, Eyebeam Art+Technology Center ended up with three feet of water on the ground floor of its space. Amongst the damage was the majority of Eyebeam’s media archive: 15 years of videotape and computer disks containing artworks, documentation of events, and even server backups—essentially, Eyebeam’s entire legacy.
This case study shares Eyebeam’s experience responding to the disaster in the hope that it will be of benefit as organizations consider preparing for future events. It is a reminder to archives, caretakers, curators, stewards, and others responsible for the preservation of content that our work on disaster preparedness is not, and never will be, done.
An online introduction to the concepts and application of Dolby Noise Reduction. Misapplication of noise reduction can have a highly deleterious effect on the quality and integrity of audio recordings, thus an understanding of the system and use of the correct Dolby settings during playback and reformatting is extremely important to preservation. Includes audio examples illustrating the differences.
Creating item level records for archival media collections is seen as a high-cost investment, but it may help save costs and efforts in the long run, especially in the event of a major loss due to disaster.
What’s Your Product? Assessing The Suitability Of A More Product, Less Process Methodology For Processing Audiovisual Collections
The widely referenced and adopted More Product, Less Process methodology (MPLP) represents a much-needed evolution in the manner of processing archival collections in order to overcome backlogs and resource shortfalls that institutions face. In the case of audiovisual-based collections, however, the ability to plan budgets, timelines, equipment needs, and other preservation plans that unequivocally impact access is directly tied to the documentation of some degree of item-level knowledge about one’s collection.
This paper proposes an extension of the MPLP model necessary to properly address the particular needs of audiovisual and other complex media in a way that meets archival standards and assists the archivist in generating their true product: the provision of the three basic services of Findability, Access, and Sustainability regardless of the format, the content, or the tools used.
The Federal Agencies Digitization Guidelines Initiative is a governmental interagency activity that draws participants from the Library of Congress, the National Archives and Records Administration, the Smithsonian Institution, the National Libraries of Medicine and Agriculture, Voice of America, and several other interested agencies.
The initiative is divided into two parts: the Still Image Working Group and the Audio-Visual Working Group. Chris Lacinak has drafted the initial report on the Audio-Visual Working Group’s efforts to evaluate audio digitization systems and develop performance metrics in order to set guidelines and evaluative measurements for conducting and monitoring digitization activities.
The ease of using cassette-based media (pop it in and press play) and the development of compact, no-frills consumer electronics helped make audiovisual materials accessible to a wider population, but they also had the side effect of distancing users from the recording and playback processes that were more apparent with open reel media and higher-end decks. This is less of an issue with commercially recorded tape, where standards are more regulated, but when dealing with field recordings, oral histories, and other original material, the configurations and settings of the recording and playback devices can have a major impact on audio or visual quality if unaccounted for.
In the first in a series exploring all of those knobs, switches, and buttons you see on decks, Audrey Young and our own Peter Oleksik have written a brief primer on azimuth and why it matters for archivists, researchers, and other people who listen to or work with magnetic audio recordings.
Barcode Scanners, MiniDV Decks, And The Migration Of Digital Information From Analog Surfaces
Dave Rice and Stefan Elnabli – October 28, 2010
Due to the susceptibility and challenges of both digital and analog carriers, data must be periodically moved from one carrier to another within a preservation process. When analog data is migrated from its original carrier to a new digital carrier, the analog data is ultimately transformed through the process of sampling. Challenges are then posed to authenticating the accuracy of such a migration. Despite the perceptual exactness of an analog source to its digital copy, the analog data and the digital data are never exactly the same. However, in the realm of file-based digital-to-digital migration, exactness can be achieved and evaluated. Within the entirely file-based environment, checksums and data comparison tools can verify that two copies are exact matches or reveal their deviation in a way that is not feasible between analog and digital environments.
10 Recommendations for Codec Selection and Management
The increasing number of digital objects under our guardianship as archivists will require a greater convergence between IT and archival knowledge sets in order to develop effective preservation strategies. One area of great concern for the integrity and persistence of digital audio and video files is the selection of file formats and codecs, an area where there is also a great lack of certainty and clarity.
This paper by Chris Lacinak lays out a clear explanation of what codecs are, how they are used, and what their selection and application means to archives. Also provided are 10 recommendations that will help you in the selection and management of codecs in an archival setting.
AVPS is involved in leading parallel projects within the Federal Agencies Digitization Guidelines Initiative and the Audio Engineering Society on the development of new standards and tools for performance testing of digital audio systems. As part of this work AVPS is proposing a Comparative Analysis tool which departs from existing error detection tools and is particularly well suited to identifying a particular type of error, labeled here as interstitial errors. This paper by Chris Lacinak uncovers one type of error that can occur and discusses the theory behind the comparative analysis methodology and approach to the development of new tools for test and measurement.
This paper examines preservation philosophies and strategies applied to large scale video collections that are both born-digital and tape-based. Technically and philosophically different approaches may be applied to migrating born-digital, tape-based content with decisions ranging from deck selection and choice of output to specifications of the resulting file. At the core of this is the distinction between migrating digital video as an audiovisual signal versus migrating it as data.
Written by Chris Lacinak on behalf of the Audio Engineering Society and the Association of Moving Image Archivists.
Preservation: The Shift from Format to Strategy
Nestled at the base of a green rolling hill, thirty minutes north of Accra, Ghana in the small village of Medie, is the African Heritage Library, the home of Odomankoma Kyrema, the “Divine Drummer” Kofi Ghanaba. Formerly known as Guy Warren, Ghanaba is one of the more elusive musicians of the 20th century… Click below to read the full article.
This report presents the findings of an ARSC Technical Committee study, coordinated and authored by AVPS, which evaluates support for embedded metadata within and across a variety of audio recording software applications. This work addresses two primary questions: (1) How well does embedded metadata persist, and is its integrity maintained, as it is handled by various applications, and (2) How well is embedded metadata handled during the process of creating a derivative? The report concludes that persistence and integrity issues are prevalent across the audio software applications studied. In addition to the report, test methods and reference files are provided for download, enabling the reader to perform metadata integrity testing.
Indiana University, UT Austin, and AVP were awarded a Mellon planning grant in 2017 —called “Audiovisual Metadata Platform” (AMP)—to continue investigating large scale metadata generation of audiovisual materials. This presentation given at the Digital Library Federation conference in 2017 reports on the outcomes of an in-person meeting of archivists and technical experts who gathered as part of the grant for a three day workshop focused on AMP’s technical architecture and design.
From Mass Digitization To Mass Description: Indiana University’s Strategy To Overcome The Next Great Challenge
Over the past decade, much focus has been placed on mass digitization of legacy audiovisual collections. With progress on this front, today there is a new focus emerging: mass description. In 2014 Indiana University (IU) began an effort to digitize hundreds of thousands of hours of audiovisual materials from across campus, leading to the challenge of describing this extraordinarily diverse set of materials both at scale and at a sufficient level of granularity to enable meaningful and effective discovery.
In 2015, with the support of AVP, IU began a strategic planning project to research, analyze and report on technologies, workflows, staffing, timeline and budgets to address this challenge. This presentation, given by Jon Dunn and Chris Lacinak at the Association of Moving Image Archivists (AMIA) conference in 2016, delves into the background, goals, approach and next steps for this work.
This presentation, given by Amy Rudersdorf at the 2016 American Library Association’s Preservation Administrator’s Interest Group meeting, provides a higher-level discussion of the use of standards for digital preservation and repository management and assessment. Particular focus is given to ISO 16363: Audit and Certification of Trusted Digital Repositories and its usefulness beyond auditing, as a tool for performing assessments that identify both gaps and strengths in digital repository practice.
In this presentation, Bertram Lyons demonstrates a methodology for employing the ISO 16363 standard for Audit and Certification of Trustworthy Repositories as a tool that can be used to help an organization plan for continued improvement of digital preservation services.
On Thursday, April 16, 2015, Kathryn Gronsbell spoke at the New York Foundation for the Arts (NYFA) in Brooklyn. The event, “PIXELS, LINES, AND BITS: An A/V Preservation Primer for Artists”, was a discussion around personal archiving and preservation approaches for artists interested in stabilizing their work so that it can be available in the short- and long-term.
The presentation and Q+A session were an introduction to concepts such as preserving and managing media, how you can leverage your time and money to make more sustainable decisions, and what the benefits might be now and in the future. Thank you to NYFA and Independent Media Arts Preservation for helping organize this event. See NYFA’s Highlight Reel from the discussion.
Kara Van Malssen’s presentation from the Take Control of Your Records! conference at the National Audiovisual Institute in Warsaw, Poland offers 10 steps an organization can take to help ensure successful implementation of a media/digital asset management system.
This presentation from Seth Anderson covers recent efforts at AVP to reframe the often information-heavy results of ISO 16363 audits into straightforward data points based on scoring criteria, with actionable recommendations for achieving compliance. The presentation includes examples of different applications of the standard, both as a means of assessing digital preservation infrastructure under development and of planning entirely new policies and systems.
Additionally, extensive work with the standard has revealed inconsistencies and repetitive elements that cause confusion and difficulty in interpreting and applying the requirements of a trustworthy digital repository. Seth posits an altered hierarchy to address these issues in future versions of the standard, an approach that looks to such documents not as a static, inflexible set of guidelines, but pragmatically as a framework to apply and continually refine as results and technologies change, much like digital preservation itself!
AVP President Chris Lacinak was invited to give a keynote presentation at the 2014 Fédération Internationale des Archives de Télévision / International Federation of Television Archives (FIAT/IFTA) World Conference on the topic of our Cost of Inaction Calculator. The COI Calculator is a planning tool that provides estimated budgets and schedules over the long term, so that organizations can develop a preservation plan that extends beyond the immediate near term. By looking at the costs of physical storage and management, digitization, and digital storage, an institution can think about distributing costs over time while also considering the critical need to sustain preservation activities (and their associated costs) beyond short-term fundraising or grants.
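To make the idea of distributing costs over time concrete, here is a toy arithmetic sketch. It is purely illustrative and not the COI Calculator's actual model; all parameter names and figures are hypothetical. It compares total spend over a planning horizon when digitization is delayed versus done immediately.

```python
def total_cost(years: int, delay: int, physical_per_year: float,
               digitize_once: float, digital_per_year: float) -> float:
    """Toy model (not the COI Calculator): pay physical storage until
    digitization in year `delay`, then a one-time digitization cost
    plus digital storage for each remaining year."""
    cost = 0.0
    for year in range(years):
        if year < delay:
            cost += physical_per_year
        else:
            if year == delay:
                cost += digitize_once
            cost += digital_per_year
    return cost

# Digitizing now vs. waiting five years, over a ten-year horizon,
# with hypothetical per-year costs:
print(total_cost(10, 0, 100, 500, 50))  # 1000.0
print(total_cost(10, 5, 100, 500, 50))  # 1250.0
```

When annual physical storage costs exceed digital storage costs, every year of delay adds to the total, which is the basic "cost of inaction" intuition the calculator is built around.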
AVP was involved in a number of panels and events at the 2013 Association of Moving Image Archivists Conference in Richmond, Virginia, including organizing the first ever AMIA HackDay and presentations related to the imminent decay of magnetic media, the importance of metadata development for digital preservation, and the intricacies of vendor selection for digital asset management systems.
- CHRIS LACINAK “THE END OF ANALOG MEDIA: THE COST OF INACTION AND WHAT YOU CAN DO ABOUT IT” (PDF)
- MIKE CASEY, INDIANA UNIVERSITY, “WHY MEDIA PRESERVATION CAN’T WAIT: THE WEATHERING STORM” (PDF)
- AMIA & DLF HACKDAY 2013 PROJECTS (Wiki)
- SETH ANDERSON, “NAVIGATING THE DIGITAL ARCHIVE: FIRST, KNOW THYSELF” (PDF)
- SETH ANDERSON, “MASTERING YOUR DATA: TOOLS FOR METADATA MANAGEMENT IN AV ARCHIVES” (PDF)
- KARA VAN MALSSEN “FROM ZERO TO DAM” (PDF)
The Mid-Atlantic Region Archives Conference epitomizes the importance and reach of regional professional organizations, opening educational and networking opportunities to broader audiences that cannot regularly attend national conventions. AVP was a proud first-time participant on two panels this year, on the topics of preserving complex digital artworks and refining archival data with tools such as OpenRefine. We look forward to presenting at future meetings.
Protecting The Personal Narrative: An Assessment Of Archival Practice’s Place In Personal Digital Archiving
The archival community struggles to find its place in the private process of personal digital archiving. A common recommendation is to begin preservation far upstream, introducing archival practices early in the act of personal collection. But what might the archive’s best intentions introduce into that act? Entering the process too early may place undue influence on the decisions of the collector: what gets kept, and why?
Active preservation of digital personal archives is necessary to ensure the longevity of materials, but the archives community must be aware that this may alter the personal narratives those archives represent. Seth Anderson gave this presentation at the Personal Digital Archiving 2013 Conference.
This presentation addresses the typical questions that arise from embedded metadata implementers regarding the role, technicalities and value of the TimeReference field in the bext chunk of BWF files. This mostly visual presentation is a practical primer for everyone from engineers to archivists and librarians.
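As a rough illustration of where TimeReference lives on disk, the sketch below walks a WAVE file's RIFF chunks and reads the field from the bext chunk: a 64-bit sample count since midnight, stored as two little-endian 32-bit words at byte offset 338 of the chunk body. This parser was written for this page as an illustration; it is not a tool from the presentation.

```python
import struct

def read_time_reference(data: bytes) -> int:
    """Return TimeReference from a BWF's bext chunk. The field sits at
    byte offset 338 of the chunk body, after Description[256],
    Originator[32], OriginatorReference[32], OriginationDate[10],
    and OriginationTime[8], as TimeReferenceLow then TimeReferenceHigh."""
    if data[:4] != b"RIFF" or data[8:12] != b"WAVE":
        raise ValueError("not a WAVE file")
    pos = 12
    while pos + 8 <= len(data):
        chunk_id = data[pos:pos + 4]
        size = struct.unpack("<I", data[pos + 4:pos + 8])[0]
        if chunk_id == b"bext":
            body = data[pos + 8:pos + 8 + size]
            low, high = struct.unpack("<II", body[338:346])
            return (high << 32) | low
        pos += 8 + size + (size & 1)  # RIFF chunks are word-aligned
    raise ValueError("no bext chunk found")

# Hypothetical demo: a minimal in-memory BWF whose TimeReference is
# 48000, i.e. one second into the day at a 48 kHz sample rate.
bext = bytearray(602)  # minimum bext body size for version 0
bext[338:346] = struct.pack("<II", 48000, 0)
wav = (b"RIFF" + struct.pack("<I", 4 + 8 + len(bext)) + b"WAVE"
       + b"bext" + struct.pack("<I", len(bext)) + bytes(bext))
print(read_time_reference(wav))  # 48000
```

Because TimeReference counts samples rather than seconds, dividing it by the file's sample rate yields the time-of-day offset the presentation discusses.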
Metadata is an integral component of digital preservation and an essential part of a digital object. Files without appropriate metadata lack the basic means required for computing systems and humans to understand, interpret, or manage them. Effectively, there is no preservation or meaningful access without metadata.
This presentation by Chris Lacinak covers the why, what and how of embedded metadata, focusing on WAVE audio files. It also reviews initial findings from an ARSC Technical Committee study, spearheaded by Chris, analyzing the interchange and persistence of embedded metadata across audio software applications that are regularly used in the creation of audio files in production and archival settings. Finally, Chris walks through BWF MetaEdit, a groundbreaking free and open-source tool commissioned by the Federal Agencies Digitization Guidelines Initiative and developed by AudioVisual Preservation Solutions in 2010.
AVPS moderated or presented on a number of panels at the 2010 International Association of Sound & Audiovisual Archives / Association of Moving Image Archivists conference in Philadelphia, PA. The topics covered a wide breadth, including embedded metadata, new tools and strategies for digital media preservation, and new approaches to funding and advocacy for collection management.
Born-digital, file-based video recording is pervasive; tape is not even an option on many new cameras sold today. This shift has made the accessioning and management of file-based content, and its associated challenges, a new reality for archives. This presentation offers insights into the challenges that born-digital, file-based video brings to your archive and strategies for managing it.
A presentation focusing on obsolescence monitoring and normalization as strategies for managing born digital audio.
David Rice and Mike Castleman represented Democracy Now! at the 2008 AMIA Digital Asset Symposium presenting on the integration of open source technology and Free Software in efforts to record, disseminate, and archive moving image media.
The presentation included references to:
- Tools for Recording: dvgrab, cron, vidi
- Tools for Transcoding and Wrapping: ffmpeg, mplayer, MP4Box, ffmpegX, x264 for QuickTime
- Tools for Online Media Accessibility: The Internet Archive, blip.tv, Miro
- Tools for Migrating AudioVisual Data from Tape-Based Digital Media: DATXtract and Live Capture Plus
- Tools for Backup and LTO Management: Bacula
- Metadata Extraction Tools: MediaInfo, getid3, qt_tools
- Metadata Standard: PBCore
A Survey of Current Audiovisual Assessment and Prioritization Projects. Chris Lacinak coordinated this six-presentation session for JTS 2007. The speech below introduces the session, offering perspective and context for the topic and presentations.
Managing the Intangible: Quality Assessment of the Digital Surrogate
This report documents the outcomes of a workshop funded by the Andrew W. Mellon Foundation and hosted by Indiana University as part of a planning project for design and development of an audiovisual metadata platform (AMP). The platform will perform mass description of audiovisual content utilizing automated mechanisms linked together with human labor in a recursive and reflexive workflow to generate and manage metadata at scale for libraries and archives. The partners leading this planning project were the Indiana University (IU) Libraries, University of Texas at Austin (UT) School of Information, and AVP.
In July 2009, Smithsonian Secretary G. Wayne Clough spoke of the digital future of museums, libraries, and archives. “We have the capacity to tell the story of America and all its hopes, struggles, triumphs, creativity, contradictions, and courage,” said Clough to a group gathered at the National Press Club in Washington, D.C. “Ultimately, we want to put all of our … objects … online so you can access them wherever you live. We want to offer the Smithsonian experience to everyone.”
To ensure the longevity of these born-digital and digitized collections, research, and resources, in 2014 Secretary Clough chartered a Digital Preservation Working Group to assess current preservation practices and develop lifecycle management recommendations for the future.
In response to DPN members’ concerns around the issues of workflow for digital preservation, DPN engaged AVP to develop a digital preservation workflow curriculum to share with DPN members and others in the digital preservation community. The curriculum was released under a Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA) license.
AVP worked with Yale University Libraries to assist them in the selection of a campus-wide digital preservation system. The project identified and described functions, use cases and diagrams for ingest, metadata and data management, migration, emulation, reporting, access, security and administration.
Harvard Library collections include a variety of computer media, image sequence, and video formats that will be preserved in the Library’s preservation and access repository, the Digital Repository Service (DRS). As a first step toward supporting this material in the DRS, the Library contracted AVP in late 2015 to assist with the analysis, which was conducted in three areas: formats, metadata, and tools.
Led by Kara Van Malssen, AVP completed this report, commissioned by the BIBFRAME team within the Network Development and Standards Office at the Library of Congress, to evaluate the content description needs of the moving image and recorded sound communities and to specify how those requirements can be met within a semantic bibliographic data model designed generically to support all content types found in libraries.
This report presents the findings of a study conducted by Bertram Lyons and Kara Van Malssen of AVP, on behalf of the Library of Congress, to evaluate the existing state of technical, structural, and preservation metadata for audiovisual resources in the bibliographic environment in light of existing standards for audiovisual metadata, and to make recommendations about how BIBFRAME can support the expression of such information. This study follows on our May 2014 report titled, “BIBFRAME AV Modeling Study: Defining a Flexible Model for Description of Audiovisual Resources,” also commissioned by the Library of Congress, which explored and provided high-level recommendations on a flexible data model for audiovisual resources.
AVP worked with the Library of Congress American Folklife Collection on processing a series of digital archival collections to: generate access and preservation metadata; map metadata to LOC schemas; batch restructure, rename and bag files to conform to LOC standards; extract metadata from digital files, and generate derivatives.
The resulting processing tools are available on our GitHub account.
A 10-episode podcast series produced by AVP and METRO (www.metro.org) and funded in part by the New York State Archives Documentary Heritage Program.
More Podcast Less Process features interviews with archivists, librarians, preservationists, technologists, and information professionals about interesting work and projects within and involving archives, special collections, and cultural heritage. Topics include appraisal and acquisition, arrangement and description, reference, outreach and education, collection management, physical and digital preservation, and infrastructure and technology.
AVP President Chris Lacinak is a featured speaker in this Chesapeake Systems podcast discussing video metadata workflows and two powerful new search applications, Nexidia and NerVve. Nexidia is a phonetic search application that can scan a library of thousands of hours of footage in seconds. NerVve is an application that does the same for video imagery search. Both applications present easy-to-use interfaces, and the results typically blow away first-time users.
Joined by representatives from each company, the panelists address what role these applications play in advanced video workflows and whether they threaten or complement metadata-driven media asset management systems.
On Monday, April 22, 2013, the Library of Congress National Digital Information Infrastructure and Preservation Program co-hosted the Rosenzweig Forum on Technology and the Humanities: Preserving and Interpreting Born-Digital Collections, as a special event in celebration of ALA’s Preservation Week 2013. The forum hosted four speakers who discussed how their institutions are addressing the acquisition and preservation of born-digital collections, as well as scholarly and research use of these unique collections.