It is abundantly clear that digitization is a primary component of legacy audio preservation and access. Recognition of this fact has prompted mass digitization of legacy media, and recent years have been productive in establishing best practices and standards for audio preservation and digitization. However, as usual, the devil is in the details, and some issues remain to be resolved. One such area of concern is data integrity within the digitization process.
To back up a bit: in the analog audio domain, data integrity measures are firmly established through standardized testing of playback, processing, and recording equipment. Coupled with standards and best practices for media handling and playback, these allow a faithful reproduction of the original recording.
On the digital side, standards for testing analog-to-digital converter quality have been established, as have best practices for file format and resolution selection. With correct utilization of these, the aforementioned faithful reproduction of the original recording can be digitized, yielding a preservation master file. Once a file is produced, convergence with general IT and data management provides access to widely used data integrity measures such as checksum generation and validation.
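The checksum step mentioned above is straightforward in practice. As a minimal sketch (function names and the choice of MD5 are illustrative, not a recommendation from this article), generation and validation might look like this in Python:

```python
import hashlib


def file_checksum(path, algorithm="md5", chunk_size=1 << 20):
    """Compute a checksum of a file by streaming it in chunks,
    so large preservation master files never load fully into memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


def validate(path, expected, algorithm="md5"):
    """Return True if the file's current checksum matches the recorded value."""
    return file_checksum(path, algorithm) == expected
```

Note that a checksum only proves the file has not changed *since the checksum was generated*; it says nothing about whether the audio was captured correctly in the first place, which is exactly the gap discussed below.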
Sounds like we are all set. Let’s see:
- Good equipment
- Expert operator
- Regular equipment calibration and alignment
- Best practices and standards for media care, handling, treatment, and reproduction
- High-quality analog-to-digital converter
- High-quality digital audio workstation
- Appropriate resolution and file format selection
- Checksum generation and validation procedure in place
That should be enough to feel good about the integrity of our digitization setup and process, right? Well, not exactly. You could still end up with something like this.
The section highlighted in grey, with a straight vertical line, shows the result of an error in writing the digital audio stream to disk: a number of samples were not recorded. They were "lost" between the output of the analog-to-digital converter and the write to disk. Had those samples not been lost, the line would move in an orderly way down a slope instead of making the abrupt drop that you see. We have termed this type of error a Digital Interstitial Error. But how does something like this happen?
Chris Lacinak's detailed white paper on this topic, "Digital Audio Interstitial Errors: Raising Awareness and Developing New Methodologies for Detection", provides images of the correctly recorded version and digs deeper into this issue. The reality is that every system is vulnerable to this type of error, regardless of its cost. And no matter how small the error, it is not an acceptable occurrence in a preservation transfer. The nature of digital interstitial errors makes them very difficult to identify using currently available methods and tools, and the truth is that they are often overlooked. In short, the community needs better tools to identify and respond to errors such as these.
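To see why detection is hard, consider the most obvious single-file heuristic: flag any abrupt sample-to-sample jump. A minimal sketch (the threshold and function name are illustrative assumptions, not part of any cited methodology):

```python
def flag_discontinuities(samples, threshold=0.5):
    """Flag indices where consecutive samples jump by more than `threshold`
    (samples assumed normalized to -1.0..1.0). This is a naive heuristic:
    legitimate program material, such as a percussive transient, can jump
    just as far, so a single-file test produces false positives and can
    miss small interstitial gaps entirely."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]
```

On a smooth ramp this flags nothing; splice out a run of samples and the resulting jump is flagged. But because real audio also contains large jumps, this approach alone cannot reliably separate interstitial errors from music, which is one reason these errors slip through.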
AVPS is leading parallel projects within the Federal Agencies Digitization Guidelines Initiative and the Audio Engineering Society to develop new standards and tools for performance testing of digital audio systems. As part of this work and tool-set, AVPS is proposing a Comparative Analysis tool that departs from existing error detection tools and is particularly well suited to identifying errors such as these.
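The intuition behind comparing two records of the same signal is worth spelling out. If a second, independent capture (or a reference stream) exists, dropped samples in one stream shift everything after the loss point, so the streams diverge there. This sketch is only an illustration of that underlying idea, not the proposed AVPS tool:

```python
def first_divergence(reference, capture, tolerance=0.0):
    """Return the index at which two captures of the same signal first
    differ by more than `tolerance`, or None if they agree over the
    shared length. If `capture` silently dropped samples, every sample
    after the loss is time-shifted relative to `reference`, so the
    divergence point localizes the error even when the gap is tiny."""
    for i, (a, b) in enumerate(zip(reference, capture)):
        if abs(a - b) > tolerance:
            return i
    return None
```

Unlike the single-file heuristic, this comparison does not depend on the error being audible or visually abrupt: any desynchronization between the two streams, however small, is detectable.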
The first stages of research and testing have been completed, but one of the steps in moving ahead with actual tool development is gathering feedback from prospective user groups. For this we need your help. Please read the full report in our Papers & Presentations section for further information, and if you would be interested in using a digital audio error analysis tool, write to us at [email protected] so we have the community feedback necessary to create the most useful tool possible.