Maturity Assessment Guide
Why assess maturity
A maturity assessment provides an evidence-based snapshot of your organization's current ITSM capabilities. Without this baseline, improvement efforts are driven by assumption rather than data, and progress cannot be measured objectively.
ITIL v5 identifies three types of assessment, each serving a different purpose:
| Assessment Type | Purpose | Approach |
|---|---|---|
| Self-assessment | Internal awareness and quick baseline | Teams score themselves against defined criteria |
| Peer assessment | Cross-team learning and calibration | Teams assess each other, then compare results |
| Formal assessment | External validation and benchmarking | Independent assessors evaluate against a standard |
Maturity model structure
Five maturity levels
The ITIL maturity model uses five progressive levels. Each level builds on the previous one:
| Level | Name | Characteristics |
|---|---|---|
| 1 | Initial | Ad hoc, inconsistent, hero-dependent. Success depends on individual effort rather than repeatable processes. |
| 2 | Managed | Repeatable at team level. Basic processes exist but are not standardized across the organization. Some documentation and measurement. |
| 3 | Defined | Standardized processes across the organization. Roles and responsibilities are documented. Consistent measurement and reporting. |
| 4 | Quantitatively managed | Data-driven decision-making. Process performance is measured, analysed, and controlled using statistical and quantitative techniques. Predictable outcomes. |
| 5 | Optimizing | Continual improvement embedded in culture. Processes are proactively optimized. Innovation is systematic. The organization leads industry practices. |
Realistic expectations: Most organizations operate at Level 2-3 for the majority of their practices. Achieving Level 4-5 across all 34 practices is neither necessary nor realistic for most organizations. Focus on achieving higher maturity in practices that are strategically critical for your business.
Assessment dimensions
For each practice, assess maturity across the four dimensions:
| Dimension | What to assess |
|---|---|
| Organizations and People | Are roles defined? Are people trained? Is accountability clear? |
| Information and Technology | Are tools adequate? Is data accurate? Are systems integrated? |
| Partners and Suppliers | Are external relationships managed? Are SLAs in place? |
| Value Streams and Processes | Are processes documented? Are they followed? Are they measured? |
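The four-dimension checklist can be encoded as data so an assessment tool can prompt assessors with the probe questions for each dimension. This is an illustrative sketch; the structure and names are assumptions, not part of the ITIL model.

```python
# Sketch: the four dimensions and their probe questions as a lookup table,
# so a self-assessment tool can walk assessors through each dimension.
DIMENSION_QUESTIONS = {
    "Organizations and People": [
        "Are roles defined?", "Are people trained?", "Is accountability clear?",
    ],
    "Information and Technology": [
        "Are tools adequate?", "Is data accurate?", "Are systems integrated?",
    ],
    "Partners and Suppliers": [
        "Are external relationships managed?", "Are SLAs in place?",
    ],
    "Value Streams and Processes": [
        "Are processes documented?", "Are they followed?", "Are they measured?",
    ],
}

for dimension, questions in DIMENSION_QUESTIONS.items():
    print(dimension)
    for question in questions:
        print(f"  - {question}")
```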
How to conduct a self-assessment
Step 1: Select practices to assess
Assess all 34 practices or select a subset based on strategic priority. For a first assessment, consider focusing on the core operational practices.
Recommended initial scope (12 practices):
- Incident Management
- Problem Management
- Change Enablement
- Service Desk
- Service Level Management
- Service Configuration Management
- Monitoring and Event Management
- Knowledge Management
- Continual Improvement
- Risk Management
- Information Security Management
- Service Request Management
Step 2: Define assessment criteria
For each practice and maturity level, define specific criteria. Example for Incident Management:
| Level | Criteria for Incident Management |
|---|---|
| 1 | Incidents are handled ad hoc. No standard process. No categorization or prioritization scheme. Resolution depends on who receives the ticket. |
| 2 | A basic incident process exists. Categorization and prioritization are defined. First-line resolution is tracked. Escalation paths are documented. |
| 3 | Incident process is standardized across all teams. Major incident process is defined and tested. Metrics are reported: MTTR, first-call resolution rate, backlog age. Regular incident reviews occur. |
| 4 | Incident trends are analysed quantitatively. Prediction models identify recurring patterns. Automated triage and routing are in place. SLA compliance is measured and controlled. Integration with Problem Management is systematic. |
| 5 | Incident management is proactively improved using data. AI-assisted detection and diagnosis are deployed. Near-zero preventable incidents. Knowledge base is continuously enriched from incident data. |
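Level criteria like the Incident Management example above can also be kept as data, so assessors see the same definitions every time and scoring stays calibrated. The abbreviated criteria strings and the function name below are illustrative assumptions.

```python
# Hypothetical sketch: maturity criteria for one practice as a level->text map.
# Criteria are abbreviated from the Incident Management table above.
INCIDENT_MGMT_CRITERIA = {
    1: "Ad hoc handling; no standard process, categorization, or prioritization.",
    2: "Basic process; categorization, prioritization, and escalation paths defined.",
    3: "Standardized process; major incident process tested; MTTR and backlog metrics reported.",
    4: "Quantitative trend analysis; automated triage; SLA compliance measured and controlled.",
    5: "Proactive data-driven improvement; AI-assisted detection; near-zero preventable incidents.",
}

def criteria_for(level: int) -> str:
    """Return the assessment criteria text for a maturity level (1-5)."""
    if level not in INCIDENT_MGMT_CRITERIA:
        raise ValueError(f"Maturity level must be 1-5, got {level}")
    return INCIDENT_MGMT_CRITERIA[level]

print(criteria_for(3))
```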
Step 3: Gather evidence
For each practice, collect:
- Documentation: Process documents, procedures, work instructions
- Tool data: Ticket volumes, resolution times, SLA performance
- Interviews: Conversations with practitioners and managers
- Observations: Shadow process execution to compare documented procedures with actual practice
Step 4: Score and analyse
Use a consistent scoring template:
| Practice | Organizations & People | Information & Technology | Partners & Suppliers | Value Streams & Processes | Average |
|---|---|---|---|---|---|
| Incident Management | 3 | 2 | 2 | 3 | 2.5 |
| Change Enablement | 2 | 3 | 1 | 2 | 2.0 |
| Service Desk | 3 | 3 | 2 | 3 | 2.75 |
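The average column in the template above is best derived rather than entered by hand. A minimal sketch, assuming each practice is scored 1-5 on all four dimensions:

```python
from statistics import mean

# The four assessment dimensions, matching the scoring template columns.
DIMENSIONS = (
    "Organizations & People",
    "Information & Technology",
    "Partners & Suppliers",
    "Value Streams & Processes",
)

def practice_average(scores: dict) -> float:
    """Average the four dimension scores for one practice."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Missing dimension scores: {missing}")
    return mean(scores[d] for d in DIMENSIONS)

# Example row from the template: Incident Management.
incident = {
    "Organizations & People": 3,
    "Information & Technology": 2,
    "Partners & Suppliers": 2,
    "Value Streams & Processes": 3,
}
print(practice_average(incident))  # 2.5
```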
Step 5: Identify improvement priorities
Map your scores to a priority matrix:
| Quadrant | Criteria | Action |
|---|---|---|
| Quick wins | Low maturity + Low complexity to improve | Address first for early momentum |
| Strategic investments | Low maturity + High complexity to improve | Plan and resource carefully |
| Maintain | High maturity + Core to operations | Continue current approach; protect investment |
| Monitor | High maturity + Non-critical | Reduce investment if needed; maintain through automation |
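The quadrant logic above can be sketched as a small classifier. The 2.5 cut-off between "low" and "high" maturity and the `critical` flag are illustrative assumptions; the guide does not prescribe thresholds.

```python
# Hedged sketch of the priority matrix: classify a practice by its maturity
# score, the estimated complexity of improving it, and whether it is core
# to operations. Threshold and flag names are assumptions for illustration.
def priority_quadrant(maturity: float, high_complexity: bool, critical: bool) -> str:
    low_maturity = maturity < 2.5  # assumed cut-off, tune to your context
    if low_maturity:
        return "Strategic investment" if high_complexity else "Quick win"
    return "Maintain" if critical else "Monitor"

print(priority_quadrant(2.0, high_complexity=False, critical=True))   # Quick win
print(priority_quadrant(3.5, high_complexity=False, critical=False))  # Monitor
```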
Benchmarking against industry
While individual benchmarks vary, research across IT service management organizations provides general guidance:
| Practice Area | Typical Level (Industry Average) | Leading Organizations |
|---|---|---|
| Incident Management | 2.5 | 4.0 |
| Change Enablement | 2.0 | 3.5 |
| Service Desk | 2.5 | 4.0 |
| Problem Management | 1.5 | 3.0 |
| Knowledge Management | 1.5 | 3.5 |
| Continual Improvement | 2.0 | 4.0 |
| AI Governance (new in v5) | 1.0 | 2.5 |
| Value Stream Management | 1.0 | 3.0 |
AI governance and value stream management are new in v5. Most organizations will score Level 1 initially. This is expected and is precisely why v5 introduces these capabilities.
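Comparing assessed scores against the industry averages in the table surfaces the largest gaps. In this sketch, `your_scores` is invented example data; only the averages come from the table above.

```python
# Sketch: compute the gap between your assessed scores and the industry
# averages from the benchmarking table. Negative gaps mean you trail the
# industry average for that practice.
INDUSTRY_AVERAGE = {
    "Incident Management": 2.5,
    "Change Enablement": 2.0,
    "Problem Management": 1.5,
    "Knowledge Management": 1.5,
}

# Hypothetical self-assessment results for illustration.
your_scores = {
    "Incident Management": 2.5,
    "Change Enablement": 2.0,
    "Problem Management": 1.0,
    "Knowledge Management": 2.0,
}

gaps = {
    practice: round(your_scores[practice] - avg, 2)
    for practice, avg in INDUSTRY_AVERAGE.items()
}

# List the worst gaps first.
for practice, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{practice}: {gap:+.2f}")
```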
Assessment cadence
| Assessment Type | Recommended Frequency | Purpose |
|---|---|---|
| Initial baseline | Once (Phase 1 of implementation) | Establish starting point |
| Progress check | Every 6 months during implementation | Track improvement |
| Annual maturity review | Annually (ongoing) | Strategic planning input |
| Formal assessment | Every 2-3 years (or before ISO audit) | External validation |
From assessment to action
The assessment is not the goal; it is the starting point. Each maturity gap should result in a specific improvement initiative logged in the continual improvement register:
- Identify the gap: Practice X is at Level 2; target is Level 3
- Define the improvement: What specific changes are needed (process, people, technology, partners)
- Assign ownership: Who is accountable for the improvement
- Set a timeline: When will the improvement be complete
- Define success criteria: How will you know the improvement has been achieved
- Review and adjust: Reassess after implementation to confirm the improvement
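The six steps above map naturally onto a structured register entry. A minimal sketch of one continual improvement register record; the field names and example values are assumptions for illustration:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical continual improvement register entry capturing the six
# elements above: gap, improvement, ownership, timeline, success criteria,
# and the post-implementation review outcome.
@dataclass
class ImprovementInitiative:
    practice: str
    current_level: int        # the gap: where the practice is today
    target_level: int         # and where it should be
    improvement: str          # what specific changes are needed
    owner: str                # who is accountable
    due: date                 # when the improvement will be complete
    success_criteria: str     # how achievement will be verified
    achieved: bool = False    # set after the reassessment confirms it

entry = ImprovementInitiative(
    practice="Incident Management",
    current_level=2,
    target_level=3,
    improvement="Standardize the incident process across all teams",
    owner="Service Desk Manager",
    due=date(2026, 12, 31),
    success_criteria="Reassessed at Level 3 across all four dimensions",
)
print(f"{entry.practice}: Level {entry.current_level} -> {entry.target_level}")
```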
Related pages
- Implementation Roadmap (where maturity assessment fits in the adoption journey)
- Measuring Success (KPIs and metrics frameworks)
- ITIL Maturity Model (Foundation-level maturity model reference)
- Continual Improvement (the improvement model)
Last updated on April 2, 2026
ITIL® is a registered trademark of PeopleCert. © 2026 ITIL v5 Compass