Article

Community-centered design for the development of effective cultural heritage training programs

27 January 2023

by Pamela Vizner and Kara Van Malssen

Continuing education and career-long professional development are critical in any field. As professionals dedicated to the stewardship and impactful accessibility of content—archiving, digital preservation, and digital asset management—we know well how important it is to keep growing and learning as technologies change, as user and stakeholder expectations evolve, and as we individually advance in our careers.

Both of us have also been privileged to share our knowledge with others around the world. Over the years, we have participated in dozens of local and international training programs as curriculum designers, trainers, organizers, mentors, and supporters, both as individuals and as part of AVP’s ongoing efforts to support continuing education in formal and informal settings. These programs have been organized by a variety of professional associations, higher education institutions, and international training organizations. In the past, these were primarily in person—a few days or a week of learning with a handful of practitioners, either from the local area where the training was being conducted or from mixed groups from very different regions of the world. More recently, COVID-related restrictions have forced organizations to restructure training programs to accommodate fully remote or hybrid options, and we have participated in these efforts as well.

Throughout the years we have seen, applied, and analyzed multiple educational and training methodologies. We have seen many successes and have also identified areas for improvement, both in other programs and in our own training practices.

Recently, we have reflected on shortcomings and opportunities to deliver more value to the participants of these trainings. This was in part motivated by our 2021 engagement with the International Centre for the Study of the Preservation and Restoration of Cultural Property (ICCROM) to help conduct a study on professional development needs for Sustaining Digital Heritage, with the aim of developing a new programme on this topic. Project researchers interviewed over thirty heritage professionals, primarily mid-career, from around the world, and analyzed existing professional development offerings on the topic. The results, and a proposed model for the programme, were published in a report available here. Through this research, we observed that:

  • Interviewees were frustrated by the lack of training opportunities for mid- and advanced-career professionals. Most trainings focus on introductory content.
  • Perhaps because they are introductory, many trainings largely focus on sharing the expertise of the trainers, which doesn’t always connect or translate to the participants’ contexts, backgrounds, resources, needs, and goals.
  • Remote trainings largely repurpose the onsite model—bringing people together for a few consecutive days of shared learning, with experts brought in to present live—but without the critical networking component.
  • In many training settings, there is a lack of opportunity for trainees to practice applying what they are learning to their own context and push solutions forward.

As the professional development landscape adapts to the new realities and opportunities introduced during the pandemic, now seems like an opportune time for the professional communities we work with to find ways to innovate and improve the value of professional development offerings, particularly for the international community. Significant investment goes into the development and delivery of all professional development—designing and optimizing for impact is a goal shared by all stakeholders.

In this reflection, we offer some thoughts and ideas that we think could maximize the impact of these training offerings while taking advantage of the new technologies and sharing opportunities available to us.

Our recommendations are rooted in human-centered design, a problem-solving method that focuses on understanding and empathizing with people in order to develop beneficial solutions. It is worth noting that our recommendations here focus only on professional training programs that are not part of formal educational opportunities offered through universities or similar institutions (e.g., undergraduate programs, master’s programs, or certificate programs).

The key question we want to explore is: How can we ensure that the resources invested in professional development offerings—on the part of organizers, trainers, participants, and funders—deliver the most impact? Our recommendations can be summarized as:

  1. Discover – Take the time to understand participants’ contexts, backgrounds, resources, needs, and goals first.
  2. Design – Design training programs that provide ample opportunity to blend theory and practice.
  3. Deliver – Take advantage of the different opportunities afforded by asynchronous and synchronous learning to maximize shared time. When possible, reuse existing content.
  4. Measure – Take the time to evaluate the results of the program, and build this practice into every iteration so that feedback becomes a tool for improvement.

DISCOVER

In our role as consultants at AVP, our approach with clients focuses on identifying the problem before proposing any solutions: we ask, we listen, we analyze, we engage in dialogue. Then, we find answers together. In other words, we diagnose before we prescribe. We have found this is the only way to arrive at realistic solutions, as each context is unique.

One area of concern that we have identified is the lack of understanding of local contexts in the design and implementation of professional development programs. A common assumption seems to be that one single solution can be applied in multiple contexts with success.

We have seen many programs lack a clear “discovery” process. In the context of international education, this means that curriculum designers do not take the time to get to know their audience. Moreover, trainers often come from regions where the availability of resources is different and where the problems they are trying to solve are of a different nature. As a result, programs can not only be ineffective but also run the risk of being perceived as colonizing: solutions imported wholesale from one context will very likely not translate well to another. Furthermore, many training programs in our field emphasize technical skills, when in some cases the most pressing needs of a community might not be technical at all. Very few programs include topics such as project management or fundraising, for example, which are fundamental to archives management.

Taking the time to understand trainees’ needs and pain points is key for the design of an impactful program. This can be done through surveys, interviews, or site visits, if possible.

DESIGN

When there isn’t an understanding of the participants’ needs, the design of a program is destined to be built without a foundation. The selection of content, topics, tools, and training methods is left to the assumptions of the organizers, who might have a limited understanding of the audience they are trying to reach and no clear set of articulated goals they want to achieve.

The first problem we see is that this often leads to programs that are mostly expository and do not include opportunities for facilitated dialogue, which in many cases aids absorption and retention. We are not saying that lectures and presentations have no value, but omitting facilitated dialogue can leave participants without a firm understanding of how to translate what they are learning to their own context.

In addition, trainers are often subject matter experts—in some cases renowned professionals with many years of experience—but they are not trained as facilitators who can engage in active problem-solving with participants. Often, there is no space for collective problem-solving. This one-sided modality has become even more common with remote options, as it can be very difficult to moderate online discussions with large groups. Again, that is not to say there are no benefits to remote learning, but we are sure at this point we have all experienced the challenges of open communication on remote platforms in educational contexts!

Moreover, because no discovery has been conducted, when hands-on training is included the tools presented are very specific and cannot necessarily be applied in every situation. By the end of the program, attendees are left with a large set of tools without really knowing how to use them or whether they even apply to their own context. For example, open source tools are often presented as the right solution for under-resourced organizations; however, selecting a tool involves many more considerations than the availability of financial resources (and open source does not equal free, a point that is often undercommunicated).

We believe that for current programs to be effective and have a long-term impact, a major shift in perspective is required: developing curricula with a problem-solving approach and trainee-centered learning. Learning theory alone does not equip archivists and technicians to find the right solutions for the problems they face, which can vary greatly from organization to organization. Each organization is a different world, and its problems and possible solutions are unique. After a good discovery, the design of a trainee-centered program not only makes sense, it also comes together more clearly.

A trainee-centered approach to teaching and learning may inspire different program designs than the typical default approaches.

DELIVER

We believe in taking advantage of the technologies and resources we have at our disposal. Program designers can be creative as long as they keep in mind the goals and needs identified during their discovery and while maintaining a trainee-centered approach. A combination of synchronous and asynchronous learning could be a good solution to maximize time and resources while keeping the program effective and successful. Asynchronous sessions could be followed by group working sessions—in-person or online—where participants have the opportunity to ask questions, apply the newly-acquired knowledge in practical exercises, and discuss the topics presented in the asynchronous sessions. They can learn from each other, and come up with creative solutions together. Allowing participants to review concepts ahead of time would not only provide space for reflection and reinforcement, but also maximize the time participants and trainers spend together. 

In this context, the role of the trainer switches from lecturer to facilitator: someone who guides participants through their own process of discovery as they try to marry theory with their own challenges. This implies a different skill set: this person should be a subject matter expert, but also someone capable of guiding a conversation, challenging participants, and facilitating teamwork. It also means the trainer needs to spend time learning about the participants’ needs so they can offer effective guidance and help them find the right solutions. They need to do their own “discovery” process: know how to listen, be open, and come in without preconceived notions of the participants’ contexts, what they do and don’t know, their roles within their organizations, and what their expectations might be. Participants, in turn, should come prepared to share examples of current challenges for discussion and to engage in collaborative problem-solving. Clear guidelines on what is expected of participants and how to come prepared would help align all parties for productive time together.

Another advantage of this two-phased approach is that evergreen content can be created ahead of time and then reused, shared, or translated. If discovery has identified needs in broader areas, such as project management, there is an opportunity to reuse content from other sources. Many programs and materials already exist that can be used in tandem with domain-specific content, and partnerships with other educational organizations could cover the basics of general topics, facilitating access to existing content.

MEASURE

Determining the impact of the program must go beyond simply a feedback form delivered immediately after a training session. Instead, during the design stage, articulate a goal (or goals) of what you want the training to achieve, then determine what key performance indicators would be useful to measure to understand if you have achieved those goals. Next, determine how this information would be gathered.

For example, maybe one of your goals is to increase collaboration among participants in a regional training event. To determine whether this goal has been met, you may want to track engagement on certain platforms, or whether participants have organized follow-up events. If one of your goals is for participants to be able to test a certain tool in their workflow, you may need to send them home with clear instructions for testing. Later, follow up with questions asking whether they completed the test, how easy it was to complete, and how likely they are to adopt the tool.

We believe the end of the training cycle is not when the last session is over. Learning from past experiences is invaluable and will help build a better, more sustainable, more effective program. Asking participants, trainers, and other parties involved what they think, at the right moments, will give you information to continue to improve and better allocate resources in the future. Besides feedback forms, finding other ways to measure impact can be beneficial in planning future sessions. You can also design evaluation tools that are reusable so you can use this historical information to measure progress over time.
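Once goals, KPIs, and follow-up questions are defined, tallying the results can be lightweight and repeatable. A minimal sketch in Python, using hypothetical survey field names (completed_test, ease, adopt_likelihood are illustrative, not tied to any specific survey platform):

```python
# Hypothetical follow-up survey responses, one dict per participant.
# "ease" and "adopt_likelihood" are ratings on a 1-5 scale.
responses = [
    {"completed_test": True, "ease": 4, "adopt_likelihood": 5},
    {"completed_test": True, "ease": 3, "adopt_likelihood": 4},
    {"completed_test": False, "ease": None, "adopt_likelihood": 2},
]

def kpis(responses):
    """Aggregate survey answers into simple key performance indicators."""
    total = len(responses)
    completed = [r for r in responses if r["completed_test"]]
    return {
        # Share of participants who actually tested the tool.
        "completion_rate": len(completed) / total,
        # Average ease rating among those who completed the test.
        "avg_ease": sum(r["ease"] for r in completed) / len(completed),
        # Average likelihood of adoption across all participants.
        "avg_adopt_likelihood": sum(r["adopt_likelihood"] for r in responses) / total,
    }

print(kpis(responses))
```

Keeping the same fields across iterations of the program makes the indicators comparable over time, which supports the reusable-evaluation-tools idea above.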

APEX: A CASE STUDY

The Audiovisual Preservation Exchange (APEX) is an international program that encourages dialogue and non-hierarchical exchange between practitioners, students, and the general public around the management and use of audiovisual collections (physical and digital). Organized by the Moving Image Archiving and Preservation Program (MIAP) at New York University (NYU) and created by Mona Jimenez, APEX has collaborated for the past 14 years with different types of organizations—including national archives, libraries, documentation centers, and community archives—to foster dialogue and mutual learning through direct work with collections. APEX is organized once a year in a different location, and previous editions have been held in Ghana, Argentina, Colombia, Uruguay, Chile, Spain, Brazil, Puerto Rico, and Mexico. AVP has been a long-time collaborator of APEX since its creation in 2008, through the participation of staff members in a variety of activities—including training—and as a current program sponsor.

APEX is not a traditional training program—everyone is a contributor and participants all learn from each other’s experiences navigating and responding to diverse administrative infrastructures and availability of resources. Its open-ended methodology embodies the approach we have described here, in a flexible and collaborative way.  

APEX starts with dialogue (akin to the discovery stage described above), followed by a collaborative design process between partners. The content delivered is then based on identified needs and the resources available. Outcomes are measured and the process is iterated on each year.

Gramophone Records Museum and Research Centre in Cape Coast, Ghana, 2008

Different from many international archival programs, APEX has a community-centered approach: each year, APEX organizers and local partners define the specific activities based on the communities’ needs. In some cases, most of the effort is focused on working on specific collections to find solutions to specific problems. In some other cases, the program includes hands-on workshops with the broader community to raise awareness around care and management of their own collections. To make this possible, APEX organizers work hand-in-hand with local partners to design the most impactful program possible.

Over the years, we have learned that the discovery process is key to a successful program, and we have incorporated it as a fundamental part of the initial planning. This process starts with meetings with partners, followed by surveys, and in some cases site visits, which are incredibly helpful for understanding contextual information that might not surface in conversations or surveys. Having the opportunity to see the available spaces, collections, and equipment; to understand any cultural differences or barriers; and to engage in person with partners helps us prepare a program that is achievable, makes sense for everyone involved, and focuses on the right topics.

The design of the program is a long, collaborative process that can take several months. Based on the information gathered, activities are proposed to partners. These are discussed, refined, then discussed again. Every attempt is made to maximize the resources available (equipment, lodging, transportation, working spaces, technologies, existing documentation, etc.) and use collective networks to find additional resources and required skills to align with the needs identified (e.g. equipment donations, reaching out to a colleague who can collaborate with information on a certain topic, etc.).

Pamela Vizner and Caroline Gil, APEX Puerto Rico (Vieques) 2019

Delivery always varies. APEX is an in-person program, although in 2021 we were forced to organize a virtual version. In some cases we can organize online sessions with a colleague over video call to explain or discuss a given topic. There is only one thing that remains the same for each version: we ALWAYS engage in dialogue and learn from each other. The hands-on work with collections is the catalyst to discuss broader topics: as we work together to inventory video tapes we discuss approaches to digitization, and inevitably questions about digital preservation come up. Local organizations learn from each other as they uncover ways to collaborate or to learn about local resources they didn’t know existed. Every single version of APEX is not just a learning experience, it is a human experience that strengthens our networks at every level, from professional to personal, from local to international. This impact enables the creation of long-lasting connections and collaborations that live well beyond the duration of the program.

Finally, measurement of success is often done internally. Every single version of the program has resulted in learnings that are incorporated into the planning of the next version. However, there is still some work to be done in this area. APEX has recently created an advisory board that will hopefully help formalize processes even further and open the model up to broader communities who can take advantage of it in other locations.

Video Digitization Workshop, APEX México (Chiapas) 2022

Aligning Our Purpose, Messaging, and Branding

22 August 2022

Over the past nine months or so, AVP has been working with the superstar team over at Parisleaf on an effort to refine our messaging and branding. If you had asked me before we began what I thought the process would be like, I might have said building from the ground up, or perhaps just figuring out how to communicate more clearly. However, for a 15-year-old company, the process turned out to be more akin to chiseling at a large stone to reveal the underlying figure. It was a painstaking process of shedding some things, finely shaping others, rounding off rough edges, making tough decisions, and making commitments. It was a difficult, albeit rewarding, process.

We went through this process rather than just building a new website because at 15 years old we knew we needed more than just a new coat of paint. We needed to do some more serious reflection, renovation, and updating. In order to do our most impactful work and deliver the most value to our clients, we needed to understand, articulate, and deliver on what we do best – and do more of it.

Our aim is to take the outcomes of this introspective process and create a flywheel:

  • Be clear within ourselves about what we do best and where our passion lies
  • Clearly articulate verbally and visually what we do best and where our passion lies
  • Attract an audience to whom we can deliver greater value and impact than anyone else out there 
  • Build and innovate on what we do best and where our passion lies, maintaining our advantage and competitive edge

And so, with this intent, you will see that we have refined/new messaging, logo, website, and of course, some really good swag.

So, what did we come up with? You can see the visual changes throughout the site, and we will explain more about the logo below. Our new colors have been selected to represent our organization’s attributes. These are:

  • Professional & Accomplished
  • Future-Forward & Imaginative
  • Dynamic & Energetic

We can also now better articulate why we exist:

We help clients to maximize the value of their digital assets.

If you don’t know what they are, 

if they can’t be found, 

if they can’t be used effectively, 

if they’re damaged or lost, 

if they’re disconnected from other systems, 

then they aren’t creating value. 

And, if they’re badly managed, 

they’re an expensive overhead and a liability. 

Because data isn’t valuable until you can do something with it.

And share our purpose:

Your digital assets have extraordinary potential. 

Our purpose is to maximize their value through the innovation of information ecosystems.

And describe how we fulfill our purpose:

We connect humans and data. In collaboration with our clients, we create complete ecosystems for managing data that are designed around how their teams actually work and think.

Our value comes from our diverse perspectives. To see value and opportunities in data, you have to see things from different angles. We’re a forward-thinking team of cross-disciplinary experts working across a wide range of industries, so we know how to work with data in unique ways for different clients. 

Since 2006, we’ve been helping clients pinpoint their true vision and reach their goals. Instead of generic solutions, we actively listen to your needs and focus on opportunities that bring about beneficial change. We’re experts at challenging organizations to see the bigger picture, to understand where they are on their digital journey, and to navigate their next steps.

Our new logo represents this.

There are multiple meaningful elements within this logo:

We meet our customers where they are.

We look at the big picture.

We bring a clarifying spark.

We guide.

We know that there will be a lot of questions about our updates and we look forward to talking with our peeps about them. Meanwhile, we have anticipated some specific questions about what our rebranding means, and have created the FAQ below.

FAQ

Your new website seems to focus on digital asset management. Does this mean that you don’t offer services focused on digital preservation or collection management anymore?

No. We believe that digital asset management is a concept that encapsulates everything we do.  Sometimes when we use the term we are literally referring to digital asset management systems (i.e., DAMS), but as a concept, it also encompasses digital preservation, collection management, data management, metadata management, and more. These data are digital assets to your organization—we help you realize their value.

Do you still offer software development? I no longer see it under the services offered.

Through our reflection we had a couple of insights into how we talk about the services we offer.

First, we are not a consulting and software company. We are an information innovation firm. What does that mean? It means we have a cross-disciplinary team of experts that maximizes the value of digital assets through the innovation of information ecosystems. These subject matter experts consult, advise, develop, engineer, and more. The titles many of our peeps hold are some version of Consultant or Software Engineer. We all focus on, are experienced in, and are experts in the domain of digital asset management.

Second, our continued software engineering contributions will be in support of digital asset management projects and prototypes. For instance, we will use software engineering when performing data migration, system integration, metadata cleanup, workflow automation, AI evaluation, and more. We will also use software engineering to build prototypes and proof of concept applications focused on digital asset management practice that will either be handed off to another entity to turn into a production system or will have otherwise served its purpose and be shut down.

What we won’t do moving forward is build production systems that require ongoing maintenance, support, and an entirely different infrastructure and operations to sustain. They are very different animals and operations. This approach and focus maximizes the value and impact that AVP can deliver and leaves the rest to others who can deliver maximum value and impact in those areas.

Does your focus on digital asset management mean that you are a DAM provider now?

When most people use the term DAM they are thinking of a software product/platform. We intentionally use the phrase digital asset management instead of DAM because we are 1) not a product/platform, and 2) we are referring to the broader practice of digital asset management, encompassing purpose, people, governance, process, technology, and measurement. We offer services focused on this holistic perspective of digital asset management practice.

Why did you remove products from your website? What has happened to your products?

We strongly believe in our products and know that they have been significant contributions to the communities we serve. We found that having both services and products on the website created confusion. People weren’t sure if we offered services or products, and wondered what the relationship between our services and products was. Therefore, we decided that wearavp.com will be focused on the services we offer. Paid AVP products like Aviary and Fixity Pro would best be represented by having their own independent websites. Products that have been developed by AVP for customers, like embARC and ADCTest, are best represented by those customers and the associated GitHub accounts. And finally, some products like MDQC, Catalyst, and Exactly will either remain available without support on GitHub or will be sunsetted.

Why did you keep the same name?

We actually set out to create a new name for AVP as part of this endeavor, and we went through a process that required a great deal of time, energy, and thought. We ultimately decided that, despite the cons of our name (not memorable, bad for SEO, etc.), redefining the name rather than changing it offered more pros, and it just felt right.

So, what does it stand for? Well, it stands for multiple things in different contexts. To name a few: Ambitious Vibrant People, Abundant Vantage Points, and Ample Value Proposition. You will see these sprinkled throughout our new website.

To Build a Successful DAM Program, Adopt a Service Mindset

25 August 2021

Kara Van Malssen is Partner and Managing Director for Services at AVP. Kara works with clients to bridge the technical, human, and business aspects of projects. She has supported numerous organizations with DAM selection and implementation, metadata modeling and schema development, taxonomy development, and user experience design efforts.

[Read more]

Audiovisual Metadata Platform Pilot Development (AMPPD) Final Project Report

21 March 2022

This report documents the experience and findings of the Audiovisual Metadata Platform Pilot Development (AMPPD) project, which has worked to enable more efficient generation of metadata to support discovery and use of digitized and born-digital audio and moving image collections. The AMPPD project was carried out by partners Indiana University Libraries, AVP, University of Texas at Austin, and New York Public Library between 2018 and 2021.

Report Authors: Jon W. Dunn, Ying Feng, Juliet L. Hardesty, Brian Wheeler, Maria Whitaker, and Thomas Whittaker, Indiana University Libraries; Shawn Averkamp, Bertram Lyons, and Amy Rudersdorf, AVP; Tanya Clement and Liz Fischer, University of Texas at Austin Department of English. The authors wish to thank Rachael Kosinski and Patrick Sovereign for formatting and editing assistance.

Funding Acknowledgement: The work described in this report was made possible by a grant from the Andrew W. Mellon Foundation.

Read the entire report here.

PROBLEM STATEMENT

Libraries and archives hold massive collections of audiovisual recordings from a diverse range of timeframes, cultures, and contexts that are of great interest across many disciplines and communities.

In recent years, increased concern over the longevity of physical audiovisual formats due to issues of media degradation and obsolescence, combined with the decreasing cost of digital storage, has led institutions to embark on projects to digitize recordings for purposes of long-term preservation and improved access. Simultaneously, the volume of born-digital audiovisual content, which struggles with its own issues of stability and imminent obsolescence, has skyrocketed and continues to grow exponentially.

In 2010, the Council on Libraries and Information Resources (CLIR) and the Library of Congress reported in “The State of Recorded Sound Preservation in the United States: A National Legacy at Risk in the Digital Age” that the complexity of preserving and accessing physical audiovisual collections goes far beyond digital reformatting. This complexity, which includes factors such as the cost to digitize the originals and manage the digital surrogates, is evidenced by the fact that large audiovisual collections are not well represented in our national and international digital platforms. The relative paucity of audiovisual content in Europeana and the Digital Public Library of America is a testament to the difficulties that the GLAM (Galleries, Libraries, Archives, and Museums) community faces in creating access to their audiovisual collections. There has always been a desire for more audiovisual content in DPLA, even as staff members recognize the challenges and complexities this kind of content poses (massive storage requirements, lack of description, etc.). And, even though Europeana has made the collection of audiovisual content a focus of their work in recent years, as of February 2021, Europeana comprises 59% images and 38% text objects, but only 1% sound objects and 2% video objects. DPLA is composed of 25% images and 54% text, with only 0.3% sound objects, and 0.6% video objects.

Another reason, beyond cost, that audiovisual recordings are not widely accessible is the lack of sufficiently granular metadata to support identification, discovery, and use, or to support informed rights determination and access control and permissions decisions on the part of collections staff and users. Unlike textual materials—for which some degree of discovery may be provided through full-text indexing—without metadata detailing the content of the dynamic files, audiovisual materials cannot be located, used, and ultimately, understood.

Traditional approaches to metadata generation for audiovisual recordings rely almost entirely on manual description performed by experts—either by writing identifying information on a piece of physical media such as a tape cassette, typing bibliographic information into a database or spreadsheet, or creating collection- or series-level finding aids. The resource requirements, and the lack of scalability involved in transferring even this limited information to a useful digital format that supports discovery, present an intractable problem. Lack of robust description stands in the way of access, ultimately resulting in the inability to derive full value from digitized and born-digital collections of audiovisual content, which in turn can lead to lack of interest and use, and potentially to the loss of a collection entirely to obsolescence and media degradation.

Read the entire report here

Designing a User-driven DAM Experience, Part 1

9 April 2021

To the user, a digital asset management (DAM) or similar system is only as good as the search and discovery experience.

If users are greeted with a homepage that they can’t relate to, if searches don’t return expected results, and if they can’t figure out how to use the navigational tools to browse, they get frustrated and leave. Many will never return.

Search

DAM and similar systems exist to help people find assets they are looking for and use them effectively. Getting the search and discovery experience right is the key to adoption.

To design a system for findability, you have to start with the building blocks: metadata, taxonomy, and information architecture. To translate these into a good search and discovery experience, you have to learn how your users see the world.

[Read more]

Designing a User-driven DAM Experience, Part 2

15 April 2021

Kara Van Malssen

[Read more]

Designing a User-driven DAM Experience, Part 3

15 April 2021

Kara Van Malssen

[Read more]

Manage Your DAM Expectations

8 April 2020

Or, how getting a DAMS is like buying and owning a home


[Read more]

Digital Preservation Go!

6 May 2021

Take your first, next step to long-term digital preservation with AVP.

[Read more]

Scenario Planning For A Successful DAM Journey

10 January 2020

Getting to Success: A Scenario-driven Approach for Selecting, Implementing, and Deploying Digital Asset Management Systems

Usage scenarios are simple narrative descriptions of current or future state use of a system. For DAMS initiatives, scenarios are an important tool that can be used throughout all stages: selection, implementation, launch, and beyond. Scenarios are a lightweight, simple, clear, and effective method for defining the goals and intended use of a system. They help facilitate communication between stakeholders and vendors, providing a starting point for ongoing conversation that ensures all parties have equal footing in the discussion. This paper provides a definition of scenarios, describes their key features and structure, identifies their benefits, and offers recommended practice for using scenarios throughout the lifecycle of a DAM deployment process. 

This paper was originally published in the Journal of Digital Media Management, Volume 7 (2018-19). 

INTRODUCTION

Organizations embark on digital asset management selection and implementation efforts for a number of reasons: to create a centralized library of assets, to enable efficient collaboration between departments, to improve review and approval processes, to streamline multi- or omni-channel distribution, and more. In all cases, the end goal is undoubtedly the same: to successfully transform some aspect of how the organization works, and to affect meaningful and productive change that will ultimately allow the organization to better serve its mission and stakeholders.

When the need for change and the opportunity for improvement is first identified, agreed upon by the relevant stakeholders, and given the green light by senior leadership, the possibilities are exciting. But it is well known that organizational change efforts can be long and difficult. Statistics and stories abound on the high failure rate of technology projects.1 Categorically speaking, digital asset management is no exception. Selecting the right technology is a daunting task. Implementation is yet a further hurdle. Things can get even more difficult at the launch and roll-out stages. Reaching the end goal can take years of sustained effort. This is not to say that embarking on technological change is not a good idea, or that it shouldn’t be undertaken. Rather, acknowledging the inevitable challenges, and identifying ways of mitigating them, should be an important aspect of planning.

Undoubtedly, one of the key challenges throughout the digital asset management system (DAMS) selection and implementation process is clearly defining the system goals, getting agreement on these goals from all internal stakeholders, and ensuring that those goals are well understood by the system developer or vendor. This challenge persists throughout all phases of the project, as goals evolve through different stages. Author Mike Cohn describes software development (or, as is being discussed in this paper, procurement and implementation) as a communication challenge between the technologists who build the software and the business or customers that will use it. Cohn notes that communication between these groups is fraught with potential for error, stating: “If either side dominates these conversations, the project loses.”2 Because these groups approach the work from such different backgrounds and perceptions, it can seem as if they are speaking different languages.

However, there are tools and methods that can be used to help manage the change process, level-set the conversation to a shared understanding and common set of terminology, and ensure strong communication between all involved parties. For technology efforts like digital asset management implementations, usage scenarios (hereafter, “scenarios”) are one of those tools. Usage scenarios put the user front and center, a reminder that the technology is being implemented to serve people and help them achieve specific goals. Scenarios are created by the business, and used as a starting point for conversation around stakeholder needs and goals with technologists. Scenarios describe a situation in which one or more users would execute a task or set of tasks using a system. They ground the conversation in a consistent and easily understood format, providing a starting point for ongoing communication and action, helping ensure that no one side will dominate. And, importantly, scenarios keep the why of the project at the front and center, a question which will be continually revisited throughout.

This paper explores a few ways that scenarios can be used throughout the DAMS selection, implementation, and deployment process. It argues that a set of scenarios that are agreed upon by all key stakeholders provide a meaningful testbed of information that can serve as a tool for ensuring consistency, transparency, and measurability, connecting all phases of the technology implementation lifecycle. While this paper is written with the perspective of the DAMS manager/owner in mind, system vendors may also find these recommendations useful to incorporate into the client onboarding process.

DEFINING SCENARIOS

Usage scenarios are narrative descriptions of interactions between one or more users and the system. Most importantly, scenarios are stories. Advocates Mary Beth Rosson and John M. Carroll note that scenarios, “consist of a setting, or situation state, one or more actors with personal motivations, knowledge, and capabilities, and various tools and objects that the actors encounter and manipulate. The scenario describes a sequence of actions and events that lead to an outcome.”3

The idea of using scenarios in system design gained strong support in the early 1990s. Finding that requirements alone defined a limited view of the system and, most importantly, lacked the human component, engineers began to explore the use of scenarios as a technique to complement the requirements development effort. As human-computer interaction emerged as a critical area of research and development, numerous papers, books, and tutorials were contributed to the growing volume of literature on this topic. Scenario-based design and development is also closely aligned to the human-centered design process, or design thinking, which is an approach to creative problem solving that focuses on understanding human needs first.

Scenarios are attractive and widely used in technology development and deployment projects for a number of reasons, perhaps most importantly because of their simplicity. Requirements engineering author and thought leader Ian Alexander notes that, “Scenarios are a powerful antidote to the complexity of systems and analysis. Telling stories about systems helps ensure that people—stakeholders—share a sufficiently wide view to avoid missing vital aspects of problems.” He adds, “Scenarios are applicable to systems of all types, and may be used at any stage of the development life cycle for different purposes.”4

Scenarios describe expected everyday use of the system, and can be created from different viewpoints to serve different functions. Two of these views that will be explored in this paper are defined by researcher Alistair Sutcliffe:

  • “a story or example of events as a grounded narrative taken from real world experience,” and,
  • “a future vision of a designed system with sequences of behaviour and possible contextual description.”5

The future-facing scenario perspective will be most useful during selection, implementation, and launch phases of a DAMS initiative, with a shift to current state scenarios during and following launch.

CREATING SCENARIOS

Scenarios are fairly simple and quick to create. They don’t require specialized knowledge or expertise to develop, although following a few best practices will result in more effective scenarios. Their greatest strength is that they are easy to understand and thus it is easy for the stakeholders they represent to provide feedback on them.

When drafting scenarios for DAMS projects, authors should consider including, at minimum:

  • Unique Identifier for each scenario
  • Title/simple description
  • List of participating actors (archetypal personas based on the organization’s users and roles)
  • Narrative description (1-4 paragraphs)
  • Expected outcome or success criteria (the things that must be true for the scenario to be accepted by stakeholders once implemented)

Below is a simple example:

01 Asset Reuse for Marketing Campaign
Actors
Marketing Associate; Intellectual Property Associate
Scenario
A Marketing Associate (MA) needs photos for an upcoming campaign. The MA searches in the DAMS, first by keyword, then using facets to narrow results to images only. She identifies a selection of potential images, and puts 20 images into a lightbox for review. MA shares the lightbox with an Intellectual Property Associate (IPA) directly via the DAMS. The IPA receives an email with a link to the lightbox, asking her to review and approve. The IPA approves 12 of the images and denies use of 8. The IPA indicates in the comments that the branding needs to be updated on 2 of the approved images. The system alerts the MA which of the images have been approved. She downloads the images as a batch in the format and size she needs. The system tracks, at an asset level, all interactions and approvals.
Success Criteria
  • Users can search by keyword and refine using facets
  • Users receive email notifications when assets are shared via lightboxes (or similar)
  • Assets can be routed to system users for approval
  • Users can clearly approve or deny assets for use
  • System allows users to leave comments visible to other users with appropriate permission
  • Users can specify asset download format and resolution
  • Assets can be downloaded as a batch
  • The system tracks approval and usage for later reporting
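These components amount to a small, structured record. As a purely illustrative sketch (not part of the original paper), a scenario such as the example above could be captured as data, which makes it easy to track, share, and validate later:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A usage scenario with the minimum components suggested above."""
    id: str                      # unique identifier
    title: str                   # title/simple description
    actors: list[str]            # archetypal personas
    narrative: str               # 1-4 paragraph story
    success_criteria: list[str]  # testable statements for acceptance

# The marketing example above, abbreviated:
reuse = Scenario(
    id="01",
    title="Asset Reuse for Marketing Campaign",
    actors=["Marketing Associate", "Intellectual Property Associate"],
    narrative="A Marketing Associate searches the DAMS by keyword, "
              "narrows results to images, and routes a lightbox of "
              "selections to an Intellectual Property Associate for approval.",
    success_criteria=[
        "Users can search by keyword and refine using facets",
        "Assets can be routed to system users for approval",
        "Assets can be downloaded as a batch",
    ],
)

# A scenario with no success criteria cannot be demonstrated or tested.
assert reuse.success_criteria
```

A document template or spreadsheet works just as well; the point is that each scenario carries its success criteria with it wherever it travels.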

Effective scenarios should:

  • Follow the 80/20 rule: At minimum, be sure to capture cases that represent 80% of the institution’s anticipated use of the system. However, don’t entirely neglect the edge cases—some of these may prove critical later on. 
  • Be constrained to a specific situational goal and outcome: If a scenario is becoming too long and includes multiple end-state goals, consider breaking it into multiple scenarios. However, if you need to describe alternatives, these can be added as an additional component to the original scenario. For example, if a scenario describes a checksum validation process, include the successful outcome in the main narrative description (e.g., all files pass), and create an alternative path for the case when there is a failure (e.g., one file does not validate).
  • Not be prescriptive: In their book, The Right Way to Select Technology, authors Tony Byrne and Jarrod Gingras note that when writing scenarios for the purpose of technology selection, “You want to leave it open-ended enough so that the vendor can do the prescribing of how their solution best meets your needs. So, in your stories, talk about what employees and customers do, but don’t go into too much detail about how they do it.”6
  • Not include subjective or personal preferences: Statements such as, “the user finds the interface intuitive” are not easily measurable as each user will have a different interpretation of “intuitive.”
  • Be demonstrable and testable: A vendor should be able to show you how their tool solves the scenario and/or the users should be able to complete the described tasks themselves during testing (see Implementation, below).
  • Strive to remove assumptions and internal biases: It may help to enlist an objective third party to help create scenarios, or to provide feedback on existing drafts. Internal stakeholders are more likely to make assumptions—such as using terminology that holds an agreed-upon meaning within the institution but may be interpreted differently by others—that an external party would not.

SELECTION

In his paper “Scenario-based Requirements Engineering,” Sutcliffe states: “Scenarios are arguably the starting point for all modelling and design, and contribute to several parts of the design process.”7 System selection and/or development is a typical starting point for scenario creation. At this stage, scenarios serve several primary functions. 

First, scenarios create agreement and buy-in amongst stakeholders about how the future system should work. Before a DAMS RFX is issued, a team should spend considerable effort defining and documenting what they would like the system to enable at their organization, and goals for how they would like the system to work. This documentation will take several forms, including business requirements (why the system is desired), functional requirements (what the system should do, at a granular level), and non-functional requirements or constraints (including technical and hosting requirements, performance requirements, format and other data requirements, etc.). Scenarios accompany these to round out a set of requirements.

At this stage, it is especially important that this process is inclusive of all key stakeholders—in the experience of this author, some DAM failures can be traced to the requirements process not involving the right people, and important needs not being considered during selection. By participating in the scenario development process, stakeholders are given the unique opportunity to think creatively about how they want to be able to work in the future, and what those improvements will look like compared with their current situation. This also helps people feel invested in the system selection process, and will help them see themselves as users of candidate systems that are demonstrated, which will result in more meaningful input during the final decision process.

For an RFX, typically around 6-12 scenarios will be created. Each scenario should describe a typical use of the system. When sketching out ideas for scenarios, it is not uncommon to find multiple ideas that ultimately illustrate the same set of functionality, with a few slightly different parameters (e.g., organizing assets from an event, organizing assets for a project). In these cases, only one scenario that captures this set of goals needs to be created. For the benefit of stakeholders who may become disappointed or frustrated if their needs are not explicitly reflected, the scenario may include a section listing “also applicable to” workflows.

Scenarios created for the purpose of DAMS selection should reflect future state, and paint a picture of what the organization expects the system will enable once it is launched. This author has helped numerous organizations draft scenarios for DAMS, and finds that often stakeholders struggle with two aspects of this process: 1) shifting their thinking to the future, rather than current, state, and 2) committing to scenarios when the workflows haven’t been finalized yet. Stakeholders will often need coaching and multiple rounds of feedback to help them overcome the first hurdle. For the second, it is important to remember that at this stage, scenarios don’t need to align perfectly to future workflows. These will not have been defined yet, and if they have, they will undoubtedly be tweaked following the implementation of new technology.

The second function of scenarios at this stage is that they illustrate for software vendors/developers how the system will be used. Scenarios allow vendors to better understand the institution and the goals of the RFX, and to craft more tailored responses. Scenarios give a sense of the different actors, and their roles and motivations. They bring to life specific individual requirements and tie them to real-world usage. When reviewing an RFP with a list of business, functional, and non-functional requirements, vendors can only gain a certain degree of understanding of how an organization will use the system. By adding a set of scenarios, those requirements come to life.

Scenarios can also become an important part of the vendor’s proposal. By including a scenario worksheet, and asking vendors to describe how their system would fulfill each scenario, an organization can not only learn more detail about how each system works, but can also get a sense of how well the vendor understands their needs and goals through their response. Vendors can be asked to include additional information, including preconditions (configuration, customization, or other setup that must be in place before the scenario can be fulfilled) and an estimated timeline for implementation so that the scenario can be tested by users. This information helps reveal which systems are more readily able to support the organization’s needs out of the box, and which will require (potentially costly) customization. These responses can become an important criterion, among others (budget, requirements alignment), used to narrow down the field of candidates to the 2-3 that can be invited for demonstration.

The previous scenario example can be transformed into a vendor response table as follows (blue shaded cells to be completed by respondents):

01 Asset Reuse for Marketing Campaign
Actors
Marketing Associate; Intellectual Property Associate
Scenario
A Marketing Associate (MA) needs photos for an upcoming campaign. The MA searches in the DAMS, first by keyword, then using facets to narrow results to images only. She identifies a selection of potential images, and puts 20 images into a lightbox for review. MA shares the lightbox with an Intellectual Property Associate (IPA) directly via the DAMS. The IPA receives an email with a link to the lightbox, asking her to review and approve. The IPA approves 12 of the images and denies use of 8. The IPA indicates in the comments that the branding needs to be updated on 2 of the approved images. The system alerts the MA which of the images have been approved. She downloads the images as a batch in the format and size she needs. The system tracks, at an asset level, all interactions and approvals.
Success Criteria
  • Users can search by keyword and refine using facets
  • Users receive email notifications when assets are shared via lightboxes (or similar)
  • Assets can be routed to system users for approval
  • Users can clearly approve or deny assets for use
  • System allows users to leave comments visible to other users with appropriate permission
  • Users can specify asset download format and resolution
  • Assets can be downloaded as a batch
  • The system tracks approval and usage for later reporting
Solution Description
Response:
System Preconditions
Response:
Estimated Implementation Timeline
Response:
Additional Documentation
 

Finally, scenarios provide a testbed of material for system demonstrations, enabling an apples-to-apples comparison of how different systems would fulfill each situation. Once the candidate pool is reduced to a handful of top systems, asking all of these vendors to demonstrate the same subset of scenarios (typically 3-5) and using assets and metadata provided by the organization enables stakeholders to get a real sense of how the system will work. In contrast to a standard demo, which the vendors have rehearsed time and again and are designed to show the best of the system, scenario demonstrations can reveal flaws and aspects of the site that are less streamlined. In other words, they help demonstrate how it really works for the specific users.

IMPLEMENTATION

Scenarios can play a vital role during the implementation process by providing a foundation for further refinement of requirements and definition of completeness for launch phases. One way to approach DAMS implementation is to use an Agile framework. As noted by the Agile Alliance, Agile is simply, “the ability to create and respond to change.”8 Agile emerged as an approach to building software so that inevitable changes in focus and priority could be easily managed. Frameworks such as Scrum, which emphasizes cross-functional project teams and consistent development cycles known as “sprints”, have been applied to other contexts outside of software development. Agile is an ideal fit for DAMS implementation, given the number of stakeholders and unknowns, and the need for clear structure and communication. 

Scenarios provide an excellent starting point for the development of user stories, a human-centered communication tool from the Agile community, which describes individual system features from the perspective of a user. Once a DAMS has been selected, the original scenarios should be revisited. During the time between system selection and the engagement with a DAMS vendor or developer, additional needs may arise. Scenarios provided in the original RFX likely need to be updated, refined, or expanded. Additional scenarios may need to be added. 

Following the revision process, a set of agreed upon implementation scenarios will be available. These can also be deconstructed into discrete individual user stories, which take the format:

As a _[actor]_ I want _[goal/desire]_ so that _[benefit]_. 

User stories can be written for each functional aspect of the system that is described in a given scenario, ensuring that all success criteria are met. User stories can be accompanied by acceptance criteria, which are conditions that must be satisfied in order for the product to work as intended by stakeholders.

For example, given the previous scenario example, a number of user stories and acceptance criteria can be derived. Two examples might be:

Story: As a Marketing Associate, I want to be able to share a collection or lightbox of selected assets with another user so that I can request approval for use in a campaign.
Acceptance Criteria:
  • User 1 can create a lightbox and share it with another user
  • Both users can access the same collection/lightbox

Story: As a Marketing Associate, I want to be notified once assets I have selected are approved, so that I know when they are ready to download.
Acceptance Criteria:
  • User 1 routes a selection of images to User 2 for approval
  • User 2 indicates approval of the selection within the system
  • System sends an email notification to User 1
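To make the link between a story and its acceptance criteria concrete, here is a minimal sketch (the names and structure are my own illustration, not the paper’s) of a user story whose completion is defined by validating every criterion:

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """One user story in the 'As a / I want / so that' format."""
    actor: str
    goal: str
    benefit: str
    acceptance_criteria: list[str]
    validated: set[str] = field(default_factory=set)  # criteria confirmed in testing

    def text(self) -> str:
        return f"As a {self.actor}, I want {self.goal} so that {self.benefit}."

    def is_done(self) -> bool:
        # A story is complete only when every criterion has been validated.
        return set(self.acceptance_criteria) <= self.validated

share = UserStory(
    actor="Marketing Associate",
    goal="to share a lightbox of selected assets with another user",
    benefit="I can request approval for use in a campaign",
    acceptance_criteria=[
        "User 1 can create a lightbox and share it with another user",
        "Both users can access the same lightbox",
    ],
)

# Only one of two criteria has been validated, so the story is not done.
share.validated.add("User 1 can create a lightbox and share it with another user")
assert not share.is_done()
```

Tracking validation per criterion, rather than marking a whole story “done,” keeps the back-and-forth with the vendor precise.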

These can be categorized according to type (e.g., search, metadata, integration), and can be prioritized according to their importance for the relevant stakeholders. The user stories can then be delivered to the system vendor, along with the scenarios, for initial setup and configuration of the system, and/or to internal stakeholders responsible for additional configuration. 

Once a set of initial user stories is created, two testbeds of material will be available:

  1. User stories, which can be tested and validated individually, according to provided acceptance criteria
  2. Usage scenarios, which can be tested and validated by the identified actors to ensure that the entire workflow can be completed, according to the provided success criteria

The user stories derived from implementation scenarios will not reflect all the requirements that users have for the system, but they will provide an initial set of stories rooted in the original scenarios, which will become useful as the launch phase approaches. As implementation proceeds and new needs arise, new requirements can continue to be created in user story form.

GETTING TO LAUNCH

A scenario-driven approach to DAMS selection and implementation enables what can be termed test driven deployment. This concept is borrowed from test driven development, an approach to iterative software development that relies on writing tests first, then writing code to reach a minimum level of functionality required to pass those tests. In test driven deployment, scenarios and user stories are tests in and of themselves, and thus become an ongoing component of implementation and launch phases of DAMS deployment. Throughout implementation (and beyond), the scenarios and user stories will become important tools for testing, providing feedback, and measuring progress toward launch goals. Again, this process will naturally fit into an Agile framework. 
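Under test driven deployment, the launch gate is mechanical: every prioritized story must pass. A minimal sketch of that readiness check (story identifiers are hypothetical):

```python
def launch_ready(stories: dict[str, bool]) -> tuple[bool, list[str]]:
    """Map of story id -> validated?; returns readiness plus any blockers."""
    blockers = [sid for sid, ok in stories.items() if not ok]
    return (len(blockers) == 0, blockers)

# Hypothetical backlog state during a launch-candidate review:
backlog = {
    "US-01 share lightbox": True,
    "US-02 approval notification": False,  # still failing in user testing
}

ready, blockers = launch_ready(backlog)
assert not ready and blockers == ["US-02 approval notification"]
```

The same check re-runs after every vendor fix, so “ready to launch” is always a statement about the current story backlog rather than a judgment call.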

It is important to establish a clear scope for an initial launch candidate. This scope should include definition of the user stories that must be completed and validated. Bear in mind that users’ needs will continue to evolve and grow, so keeping a backlog of stories to be prioritized and addressed over time is critical. However, communicating the goal of incremental deployment and maintaining a transparent user story backlog will help manage user expectations and avoid scope creep.

Acceptance of the initial launch candidate will require completion and validation of all user stories prioritized for this milestone. Prior to engagement of end user groups, all user stories should be tested and validated by the DAMS owner/manager, then by key stakeholders. This exercise will likely result in discovery of bugs, incomplete stories, as well as some additional user stories that need to be implemented prior to full acceptance, and should be communicated back to the vendor. Maintaining use of the user story format will ensure consistency in communication throughout this back and forth process.

As launch approaches, teams will begin testing and training within the system. One useful test is to ask these groups to walk through the relevant narrative scenarios and success criteria, working through the outlined steps within the system, and determine if they are sufficient for launch. While additional user acceptance testing may be performed as well, scenario tests are important for tying deployment back to original requirements that stakeholders are already familiar with.

This entire cycle can continue as additional teams are brought on to the system. New teams who weren’t engaged in the initial selection and implementation stages should go through the same process, starting with the development of scenarios, followed by the creation of prioritized user stories. Scenarios can also be used to craft meaningful and easy-to-digest training materials, and used as part of training sessions. Providing new users with example situations helps make the system more relatable.

ONGOING IMPROVEMENT

Scenarios can also play an important role over time in driving system improvements. For system enhancements, scenarios can be used to communicate the future state goal, and the changes that might be required to reach that state. For bugs or other issues, scenarios can be used to describe the user’s experience when they perform a given task, or what happens when things are not working as expected. In this case, the framing of the scenario should shift from future state (as has been suggested throughout this paper) to current state, which is the other type of scenario defined by Sutcliffe: “A story or example of events as a grounded narrative taken from real world experience.”9 Sutcliffe refers to the scenarios created at this stage as “problem statement scenarios.”

Furthermore, ongoing user testing should also be an important part of a DAM program, so that the product owners can understand how users are working in the system, and what improvements could make their experience better. Scenarios provide an excellent starting point for the creation of task-oriented user tests, which can be delivered to participants in moderated or unmoderated sessions.

CONCLUSION

Scenarios are not a panacea and certainly not the only tool in your toolbox. However, consistent use of this format can facilitate clear communication between all parties involved in the DAMS selection, implementation, and management process. All technology projects carry the risk of failure at worst, and the promise of transforming how the organization works at best. Communication is one of several critical factors that will contribute to the outcome.

Scenarios help communicate the project vision. They can support stakeholder engagement and buy-in. They enable traceability of system features to original requirements. They support testing. The simplicity and clarity of scenarios and user stories make them excellent documentation and training resources.

One important advantage of scenarios is that they are extremely lightweight. DAMS initiatives inevitably result in a great deal of documentation, and readers may feel a reluctance to add more. However, the experience of this author is that a set of key scenarios—just a few short paragraphs each—has the potential to make the selection, implementation, and launch processes streamlined and understandable to all stakeholders. As Byrne and Gingras emphasize in The Right Way to Select Technology, “After defining the business case, [scenario creation] is the most important foundational work you will do, so spend time to get it right.”10

References

1. Marr, Bernard. ‘Are these the 7 reasons why tech projects fail?’ Forbes; September 13, 2016.

2. Cohn, Mike. User Stories Applied. Addison-Wesley Professional; 2004. Page 3.

3. Carroll, John M., Rosson, Mary Beth. Scenario-Based Design. In Jacko, J., Sears A., editors. The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. Lawrence Erlbaum Associates; 2002.

4. Alexander, Ian, Maiden, Neil. Scenarios, Stories, Use Cases: Through the Systems Development Life-Cycle. 1st Edition. Wiley; 2004. Page 3.

5. Sutcliffe, A. (2004) ‘Scenario-based requirements engineering’, in Proceedings of the 11th IEEE International Requirements Engineering Conference, Monterey Bay, CA, 12th September, pp. 320–329.

6. Byrne, Tony and Gingras, Jarrod. The Right Way to Select Technology. New York: Rosenfeld Media; 2017. Page 43.

7. Sutcliffe, ref 5 above.

8. Agile Alliance. (n.d.) ‘Agile 101’ (accessed 9th April, 2019).

9. Sutcliffe, ref 5 above.

10. Byrne and Gingras, ref 6 above.
