Maturity Assessment Guide

Why assess maturity

A maturity assessment provides an "evidence-based snapshot" of your organization's current ITSM capabilities. Without this baseline, improvement efforts are driven by assumption rather than data, and progress cannot be measured objectively.

ITIL v5 identifies three types of assessment (each serves a different purpose):

| Assessment Type | Purpose | Approach |
|---|---|---|
| Self-assessment | Internal awareness and quick baseline | Teams score themselves against defined criteria |
| Peer assessment | Cross-team learning and calibration | Teams assess each other, then compare results |
| Formal assessment | External validation and benchmarking | Independent assessors evaluate against a standard |

Maturity model structure

Five maturity levels

The ITIL maturity model uses five progressive levels. Each level builds on the previous one:

| Level | Name | Characteristics |
|---|---|---|
| 1 | Initial | Ad hoc, inconsistent, hero-dependent. Success depends on individual effort rather than repeatable processes. |
| 2 | Managed | Repeatable at team level. Basic processes exist but are not standardized across the organization. Some documentation and measurement. |
| 3 | Defined | Standardized processes across the organization. Roles and responsibilities are documented. Consistent measurement and reporting. |
| 4 | Quantitatively managed | Data-driven decision-making. Process performance is measured, analysed, and controlled using statistical and quantitative techniques. Predictable outcomes. |
| 5 | Optimizing | Continual improvement embedded in culture. Processes are proactively optimized. Innovation is systematic. The organization leads industry practices. |

Realistic expectations: Most organizations operate at Level 2-3 for the majority of their practices. Achieving Level 4-5 across all 34 practices is neither necessary nor realistic for most organizations. Focus on achieving higher maturity in practices that are strategically critical for your business.

Assessment dimensions

For each practice, assess maturity across the four dimensions:

| Dimension | What to assess |
|---|---|
| Organizations and People | Are roles defined? Are people trained? Is accountability clear? |
| Information and Technology | Are tools adequate? Is data accurate? Are systems integrated? |
| Partners and Suppliers | Are external relationships managed? Are SLAs in place? |
| Value Streams and Processes | Are processes documented? Are they followed? Are they measured? |

How to conduct a self-assessment

Step 1: Select practices to assess

Assess all 34 practices or select a subset based on strategic priority. For a first assessment, consider focusing on the core operational practices.

Recommended initial scope (12 practices):

  • Incident Management
  • Problem Management
  • Change Enablement
  • Service Desk
  • Service Level Management
  • Service Configuration Management
  • Monitoring and Event Management
  • Knowledge Management
  • Continual Improvement
  • Risk Management
  • Information Security Management
  • Service Request Management

Step 2: Define assessment criteria

For each practice and maturity level, define specific criteria. Example for Incident Management:

| Level | Criteria for Incident Management |
|---|---|
| 1 | Incidents are handled ad hoc. No standard process. No categorization or prioritization scheme. Resolution depends on who receives the ticket. |
| 2 | A basic incident process exists. Categorization and prioritization are defined. First-line resolution is tracked. Escalation paths are documented. |
| 3 | Incident process is standardized across all teams. Major incident process is defined and tested. Metrics are reported: MTTR, first-call resolution rate, backlog age. Regular incident reviews occur. |
| 4 | Incident trends are analysed quantitatively. Prediction models identify recurring patterns. Automated triage and routing are in place. SLA compliance is measured and controlled. Integration with Problem Management is systematic. |
| 5 | Incident management is proactively improved using data. AI-assisted detection and diagnosis are deployed. Near-zero preventable incidents. Knowledge base is continuously enriched from incident data. |

Step 3: Gather evidence

For each practice, collect:

  • Documentation: Process documents, procedures, work instructions
  • Tool data: Ticket volumes, resolution times, SLA performance
  • Interviews: Conversations with practitioners and managers
  • Observations: Shadow process execution to see reality vs documentation

Step 4: Score and analyse

Use a consistent scoring template:

| Practice | Organizations & People | Information & Technology | Partners & Suppliers | Value Streams & Processes | Average |
|---|---|---|---|---|---|
| Incident Management | 3 | 2 | 2 | 3 | 2.5 |
| Change Enablement | 2 | 3 | 1 | 2 | 2.0 |
| Service Desk | 3 | 3 | 2 | 3 | 2.75 |
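The scoring template above can be sketched in a few lines of code. This is a minimal illustration using the example scores from the table; the dimension order and practice names are taken from this guide, and the averaging rule (a simple mean of the four dimension scores) is an assumption you may weight differently.

```python
# Order matches the four assessment dimensions from Step 4's template.
DIMENSIONS = [
    "Organizations & People",
    "Information & Technology",
    "Partners & Suppliers",
    "Value Streams & Processes",
]

# Example scores copied from the template table above.
scores = {
    "Incident Management": [3, 2, 2, 3],
    "Change Enablement":   [2, 3, 1, 2],
    "Service Desk":        [3, 3, 2, 3],
}

def average_maturity(dimension_scores):
    """Unweighted mean of the four dimension scores for one practice."""
    return sum(dimension_scores) / len(dimension_scores)

for practice, dims in scores.items():
    print(f"{practice}: {average_maturity(dims):.2f}")
```

Running this reproduces the Average column of the template (2.50, 2.00, 2.75).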

Step 5: Identify improvement priorities

Map your scores to a priority matrix:

| Quadrant | Criteria | Action |
|---|---|---|
| Quick wins | Low maturity + low complexity to improve | Address first for early momentum |
| Strategic investments | Low maturity + high complexity to improve | Plan and resource carefully |
| Maintain | High maturity + core to operations | Continue current approach; protect investment |
| Monitor | High maturity + non-critical | Reduce investment if needed; maintain through automation |
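The priority matrix can be expressed as a simple classification rule. The function below is a hypothetical helper, not part of any ITIL tooling: the maturity threshold of 3.0, the "low"/"high" complexity labels, and the `critical` flag are illustrative assumptions.

```python
def priority_quadrant(maturity: float, complexity: str, critical: bool = True) -> str:
    """Map a practice's average maturity score and an estimated
    improvement complexity onto the Step 5 priority quadrants.

    maturity:   average score from the Step 4 template (1.0 to 5.0)
    complexity: "low" or "high" (assumed labels, judged per practice)
    critical:   whether the practice is core to operations
    """
    low_maturity = maturity < 3.0  # assumed threshold, tune to your scale
    if low_maturity and complexity == "low":
        return "Quick wins"
    if low_maturity and complexity == "high":
        return "Strategic investments"
    return "Maintain" if critical else "Monitor"
```

For example, `priority_quadrant(2.0, "low")` classifies a practice as a quick win, while `priority_quadrant(3.5, "low", critical=False)` lands in the monitor quadrant.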

Benchmarking against industry

While individual benchmarks vary, research across IT service management organizations provides general guidance:

| Practice Area | Typical Level (Industry Average) | Leading Organizations |
|---|---|---|
| Incident Management | 2.5 | 4.0 |
| Change Enablement | 2.0 | 3.5 |
| Service Desk | 2.5 | 4.0 |
| Problem Management | 1.5 | 3.0 |
| Knowledge Management | 1.5 | 3.5 |
| Continual Improvement | 2.0 | 4.0 |
| AI Governance (new in v5) | 1.0 | 2.5 |
| Value Stream Management | 1.0 | 3.0 |
💡 AI governance and value stream management are new in v5. Most organizations will score Level 1 initially. This is expected and is precisely why v5 introduces these capabilities.
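Sizing your gap against the industry figures above is straightforward arithmetic. In this sketch, `benchmarks` holds the illustrative industry averages from the table, and `own` stands in for your self-assessment averages (the values shown are made up for the example).

```python
# Industry averages from the benchmarking table above.
benchmarks = {
    "Incident Management": 2.5,
    "Change Enablement": 2.0,
    "Problem Management": 1.5,
}

# Placeholder self-assessment averages; substitute your Step 4 results.
own = {
    "Incident Management": 2.5,
    "Change Enablement": 2.0,
    "Problem Management": 1.0,
}

# Negative values flag practices below the industry average.
gaps = {practice: round(own[practice] - benchmarks[practice], 2)
        for practice in benchmarks}
print(gaps)
```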

Assessment cadence

| Assessment Type | Recommended Frequency | Purpose |
|---|---|---|
| Initial baseline | Once (Phase 1 of implementation) | Establish starting point |
| Progress check | Every 6 months during implementation | Track improvement |
| Annual maturity review | Annually (ongoing) | Strategic planning input |
| Formal assessment | Every 2-3 years (or before ISO audit) | External validation |

From assessment to action

The assessment is not the goal; it is the starting point. Each maturity gap should result in a specific improvement initiative logged in the continual improvement register:

  1. Identify the gap: Practice X is at Level 2; target is Level 3
  2. Define the improvement: What specific changes are needed (process, people, technology, partners)
  3. Assign ownership: Who is accountable for the improvement
  4. Set a timeline: When will the improvement be complete
  5. Define success criteria: How will you know the improvement has been achieved
  6. Review and adjust: Reassess after implementation to confirm the improvement
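The six steps above map naturally onto a structured register entry. The sketch below is one possible shape for such a record; the field names and the `ImprovementInitiative` class are assumptions for illustration, not a prescribed ITIL schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ImprovementInitiative:
    """One entry in the continual improvement register (illustrative schema)."""
    practice: str             # which practice the gap concerns
    current_level: int        # 1. the gap: where the practice is today...
    target_level: int         #    ...and where it should be
    improvement: str          # 2. the specific change needed
    owner: str                # 3. who is accountable
    due: date                 # 4. timeline for completion
    success_criteria: str     # 5. how achievement will be verified
    reassessed: bool = False  # 6. set True once the follow-up review confirms it

entry = ImprovementInitiative(
    practice="Change Enablement",
    current_level=2,
    target_level=3,
    improvement="Standardize the change process across all teams",
    owner="Change Manager",
    due=date(2026, 12, 31),
    success_criteria="All teams follow the documented change process",
)
```

Keeping entries in a machine-readable form like this makes it easy to report on open gaps and overdue initiatives during the semi-annual progress checks.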

Last updated on April 2, 2026

ITIL® is a registered trademark of PeopleCert. © 2026 ITIL v5 Compass