
PMM Maturity Audit

The PMM Maturity Audit is a diagnostic tool that scores your product marketing function across five key dimensions: Positioning & Messaging, Competitive Intelligence, Sales Enablement, Launch Excellence, and Customer Insight. Each dimension is rated on a 1-5 maturity scale, from ad-hoc (1) to optimised (5). The audit identifies where the function is strong, where it's weak, and where to invest to level up.

When to use this framework

  • Assessing a PMM function you've just inherited or joined
  • Building a business case for PMM investment with leadership
  • Setting quarterly or annual PMM team goals
  • Benchmarking your function against best practice
  • Identifying which PMM capabilities to hire for next


Worked Example

Atlassian (PMM function audit)

1. Positioning & Messaging

1 = No documented positioning.
2 = Basic positioning exists but not widely used.
3 = Positioning documented and used by marketing.
4 = Positioning used across marketing, sales, and product.
5 = Positioning regularly tested, refined, and embedded in all go-to-market.

Score: 4

What evidence supports this score? Cite specific artefacts (e.g., 'positioning doc last updated March 2024'), usage data (e.g., 'sales team references messaging in 60% of calls'), and gaps observed.

Strong positioning documentation exists for each product (Jira, Confluence, Trello, etc.) with clear persona-specific messaging. Positioning is actively used by marketing, sales, and partner teams. Website copy and campaign messaging are consistently on-brand. Gap: positioning is refreshed annually but not systematically tested with customers. Some tension between product-level positioning and the 'Atlassian platform' narrative.

What 1-2 actions would move this score up by one level?

1. Implement quarterly 'message testing' with customer panels to validate positioning resonance. 2. Create a unified 'Atlassian platform' positioning framework that harmonises individual product messaging.

2. Competitive Intelligence

1 = No systematic competitive tracking.
2 = Ad-hoc monitoring of 1-2 competitors.
3 = Regular competitive updates with battlecards.
4 = Structured win/loss programme feeding into competitive strategy.
5 = Real-time competitive intelligence with automated monitoring and proactive strategy adjustments.

Score: 3

What evidence supports this score? Cite specific artefacts (e.g., 'battlecards for 5 competitors, updated quarterly'), processes (e.g., '10 win/loss interviews per quarter'), and gaps.

Battlecards exist for top 5 competitors (Monday.com, Asana, Notion, ServiceNow, Microsoft). Updated quarterly. Win/loss data is collected from CRM but interviews are sporadic (5-8/quarter vs. target of 15-20). Competitive monitoring is mostly manual — team members track competitor blogs and product updates. No automated alerting.

What 1-2 actions would move this score up by one level?

1. Launch a structured win/loss interview programme targeting 15+ interviews per quarter using a third-party service. 2. Implement automated competitive monitoring tools (Klue or Crayon) to replace manual tracking.

3. Sales Enablement

1 = Sales creates their own materials with no PMM input.
2 = PMM provides some content but no structured enablement.
3 = Regular sales enablement sessions with standard materials.
4 = Structured enablement programme with certification and feedback loops.
5 = Data-driven enablement with content performance tracking and continuous optimisation.

Score: 3

What evidence supports this score? Cite specific materials (e.g., 'pitch deck, 3 battlecards, ROI calculator'), usage metrics (e.g., 'training completion rate'), and feedback from sales.

Standard sales enablement materials exist: battlecards, pitch decks, case studies, ROI calculators. Monthly enablement sessions are well-attended. Gap: no formal certification programme — new reps learn on the job. Content usage is not tracked (no data on which assets sales actually uses). Sales frequently requests custom one-off materials, suggesting standard materials don't meet needs.

What 1-2 actions would move this score up by one level?

1. Implement content management system (Highspot or Seismic) to track which assets are used and their impact on win rates. 2. Create a 'PMM certification' programme for new AEs with quizzes and practical exercises.

4. Launch Excellence

1 = Launches happen ad-hoc with no standard process.
2 = Basic launch checklists exist.
3 = Tiered launch framework with defined activities per tier.
4 = Cross-functional launch process with clear RACI and post-launch reviews.
5 = Optimised launch engine with playbooks, metrics, and continuous improvement.

Score: 4

What evidence supports this score? Cite specific processes (e.g., 'tiered launch framework with RACI'), metrics (e.g., 'post-launch reviews for T1/T2'), and gaps.

Well-defined tiered launch framework (Tier 1-3) with clear RACI and activity checklists for each tier. Cross-functional launch team meets weekly for T1 launches. Post-launch reviews are conducted for T1 and T2 launches. Gap: launch metrics are mostly activity-based (content produced, emails sent) rather than outcome-based (pipeline generated, adoption rate). T3 launches sometimes fall through the cracks.

What 1-2 actions would move this score up by one level?

1. Define outcome-based launch KPIs for each tier (e.g., T1: pipeline influenced, T2: feature adoption rate, T3: support ticket reduction). 2. Create a lightweight T3 launch template that product teams can self-serve.

5. Customer Insight

1 = No structured customer research.
2 = Occasional customer interviews, mostly reactive.
3 = Regular persona research and customer feedback collection.
4 = Systematic insight programme with win/loss, advisory boards, and usage data.
5 = Insight-driven organisation where customer data shapes all GTM decisions.

Score: 2

What evidence supports this score? Cite specific programmes (e.g., 'customer advisory board, quarterly'), research cadence (e.g., 'annual persona refresh'), and data access (e.g., 'PMM has product analytics dashboard').

Customer research is mostly reactive — conducted when a specific question arises. Personas exist but were last updated 18 months ago. No customer advisory board. Product analytics (usage data) is rich but PMM has limited access and no regular cadence of analysis. Customer-facing teams (support, CS) have valuable insights but no structured channel to feed them to PMM.

What 1-2 actions would move this score up by one level?

1. Launch a Customer Advisory Board (10-15 customers, quarterly meetings) for ongoing qualitative insight. 2. Establish a monthly 'insight review' with product analytics to identify usage trends and inform messaging.

6. Overall Maturity Assessment

Calculate the average of all five dimension scores.

Overall score: (4 + 3 + 3 + 4 + 2) / 5 = 3.2
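If you run this audit regularly, the scoring step is easy to automate. A minimal sketch, using the example scores above (the dimension names and the unweighted mean are taken from this framework; everything else is illustrative):

```python
# Dimension scores from the worked example above (1-5 maturity scale).
scores = {
    "Positioning & Messaging": 4,
    "Competitive Intelligence": 3,
    "Sales Enablement": 3,
    "Launch Excellence": 4,
    "Customer Insight": 2,
}

# Overall maturity is the unweighted mean of the five dimension scores.
overall = sum(scores.values()) / len(scores)
print(f"Overall maturity: {overall:.1f}")  # Overall maturity: 3.2

# The lowest-scoring dimension is the first candidate for investment.
weakest = min(scores, key=scores.get)
print(f"Weakest dimension: {weakest}")  # Weakest dimension: Customer Insight
```

A spreadsheet does the same job; the point is that the overall score is a simple mean, and the priority list starts from the lowest-scoring dimensions.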

Where is the function strongest? Weakest? What are the top 3 priorities for the next quarter?

Atlassian's PMM function is solidly in the 'Structured' stage (Level 3) with pockets of excellence in positioning (4) and launch management (4). The biggest gap is Customer Insight (2) — the function is strategy-rich but insight-poor, which risks making positioning and messaging decisions based on assumptions rather than data. Top 3 priorities for next quarter: 1. Stand up a Customer Advisory Board and structured insight programme (biggest maturity gap). 2. Implement competitive monitoring automation and increase win/loss interview volume. 3. Deploy sales content management to move enablement from activity-based to data-driven.