Overcome common challenges with a DAM Maturity assessment
Picture this: Your marketing team is about to launch a high-profile social media campaign for a new product. They need to pull together creative copy, images, and videos that were recently delivered to the DAM, but suddenly realize they can’t find them in the system. They try every keyword they can think of, but no luck. The producer is out, and no one else knows where the assets are stored. Eventually, they discover the assets were there all along, just not properly tagged. As a result, the campaign goes live hours late. Sound familiar? This kind of mix-up is all too common when a DAM isn’t running at the maturity level your organization needs. By assessing your DAM’s maturity, you can ensure assets are well-organized, easily discoverable, and ready whenever you are—keeping you on schedule and ahead of the competition.
You need to know the level of maturity of your DAM program. Why? Because this snapshot will tell you exactly how to address the challenges that are slowing you down.
If you’re in media and entertainment, you’re likely juggling hundreds of thousands of images and videos from multiple productions. In consumer packaged goods, tight integrations with product information management systems and strict data governance are critical. And for enterprise technology firms, the priority is managing diverse collateral for different buyers and industries—making content reuse essential. Ultimately, it’s the interplay of users, data, platforms, and workflows in and around your DAM that makes it all work seamlessly.
What’s the DAM Maturity Model exactly?
When I partner with clients, one of the first things I do to understand their current DAM landscape is to perform an audit using the DAM Maturity Model. Created by the DAM Foundation, this model is an industry-standard assessment tool that breaks down an organization’s DAM capabilities across four categories (People, Information, Systems, and Processes), with each dimension scored against five maturity levels.
To do this, I survey and speak one-on-one with as many key stakeholders and contributors as I can within my first few days, getting firsthand input and feedback on how things are going from their perspective. Through my listening tour, I assess many aspects of the organization’s current state, from how their systems are running to how effective their processes, documentation, and other DAM-related operating rhythms are.
I also identify and dig into pain points and challenges my respondents share with me, and double-check those assumptions in follow-up conversations, working to piece together an accurate view of the state of DAM based on how the people who use it demonstrate and speak about it.
From all of this analysis, I work to assess the organization with the DAM Maturity Model.
This assessment matters because an organization’s DAM maturity directly impacts organizational efficiency, brand consistency, and its ability to effectively utilize digital assets across key departments: legal, sales, marketing, brand, design, and more.
Immature DAM programs often spend excessive time and resources recreating content instead of finding it, struggle to respond to shifting market demands, or have departments that simply don’t collaborate, so it is imperative to address these challenges.
DAM Maturity categories and dimensions
1. People
Technical Expertise: The organization’s ability to manage repositories, ingestion, cataloging, distribution and other aspects of DAM
Business Expertise: Stakeholders’ understanding of DAM, its value, and best practices
Alignment: The level of collaboration between internal teams to use DAM to meet business objectives
2. Information
Asset: How well digital assets are managed through their lifecycle: creation, ingestion, transformation, versioning, distribution, and retirement
Findability: How effective search and discoverability are for assets in the DAM
Metadata: The quality of the information used to describe assets, powered by controlled vocabularies, terminology and taxonomy structures
Reuse: The organization’s ability to repurpose assets for multiple channels and purposes
Use Cases: Clarity of the organization’s intent with DAM, and how those business requirements are met with DAM systems, capabilities and workflows
3. Systems
Prevalence: A measure of how broadly DAM efforts are seen and experienced throughout the organization
Security: The extent to which the DAM system and its contents reflect organizational entitlements and security measures (e.g., single sign-on (SSO), users, roles, access controls, rights management)
Usability: The ease of use and accessibility of the DAM’s user experience and interface
Infrastructure: Business units and IT teams’ ability to support the structure of the DAM system
4. Processes
Governance: The organizational infrastructure to ensure DAM strategy, policies, and change are implemented, followed, and managed in a structured way
Integration: The DAM system’s ability to connect with the larger technology stack through complementary services and tools
Workflow: The maturity of the collaboration and processes around DAM utilization, such as new asset requests, versioning, and other business processes
Maturity Scoring
Once I go through this assessment, I assign a rating to each dimension:
Ad-Hoc: Little to no defined process or experience in this area
Incipient: Casual understanding, or partial systems in place; efforts are still uncoordinated
Formative: Demonstrated experience, but limited to certain groups
Operational: Substantial experience and sophistication
Optimal: Established and effective experience that is proactive and continually refining
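Because these five levels form an ordered scale, a scored assessment can be checked programmatically. Here is a minimal Python sketch of that idea; the level names come from the model above, but the dimension names and scores in the example are hypothetical, purely for illustration.

```python
# The five maturity levels from the DAM Maturity Model, lowest to highest.
MATURITY_LEVELS = ["Ad-Hoc", "Incipient", "Formative", "Operational", "Optimal"]

def dimensions_below(scores: dict, threshold: str = "Formative") -> list:
    """Return the dimensions whose score falls below the given maturity level."""
    cutoff = MATURITY_LEVELS.index(threshold)
    return [dim for dim, level in scores.items()
            if MATURITY_LEVELS.index(level) < cutoff]

# Hypothetical assessment results for a handful of dimensions.
example_scores = {
    "Findability": "Incipient",
    "Metadata": "Ad-Hoc",
    "Governance": "Formative",
    "Workflow": "Operational",
}

print(dimensions_below(example_scores))  # ['Findability', 'Metadata']
```

A check like this simply surfaces which dimensions need attention first; the recommendations themselves still come from the qualitative analysis.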
After scoring each dimension, I typically add a recommendation for how it can be improved. For example, if an organization’s Findability score is below Formative, it is very likely that the people uploading content to the DAM are not applying metadata effectively or consistently, making DAM content difficult for users to find, much like our example above. Highlighting this correlation between metadata and content discoverability is often an effective way to encourage better uploading practices, such as standardized metadata templates or more focused DAM uploader training.
I’ve worked with organizations eager to improve their scoring through redefining their internal processes, brokering more effective governance structures amongst partner teams, and creating actionable roadmaps for the work to continue each quarter. Often, sharing the results of a maturity assessment can bring teams together and foster better collaboration, improving scoring naturally over time.
Why audits matter
DAM program audits like this are important because they help you measure what matters. Addressing the dimensions of the DAM Maturity Model will improve system adoption, justify spending, build more resiliency for all of the teams who support the DAM, and address feedback from users who rely on it to get their work done. It is also a valuable tool to justify the important maintenance and care every DAM requires to stay effective.
If you’re ready to start addressing the issues holding back your DAM system with a comprehensive maturity model assessment, now is the perfect time to begin. Reach out to us today!