Experience Level Agreements in ITIL 4 White Paper
August 19, 2019
Today, service management is more consumer-centric than ever. Service consumer experience and satisfaction drive all of its aspects, including strategic planning, design, delivery, operations, and the assessment of services.
One method to account for service experience is to include it in the service level agreement (SLA) and ensure that it is continuously monitored, analysed, and improved. This paper explores the concept of experience level agreements (XLAs), explains how this concept is addressed in ITIL® 4, and provides practical recommendations on the subject. It is aimed at service management practitioners who are familiar with ITSM and, specifically, with the concept of SLAs.
This paper aligns with concepts and recommendations in ITIL 4 publications that were published and under development at the time of writing. Some ITIL 4 content and definitions may change slightly before the Managing Professional books are published, but the key messages and solutions will not.
Context: service quality and SLAs
Today, everything is a service. Every organization is involved in a service relationship as a consumer. Many are providers. Quality of service is the foundation of our business and social interactions. So many things depend on whether the services we use meet our expectations of quality.
However, defining, agreeing, and measuring service quality is complicated. Traditionally, formal or semi-formal agreements on service quality are vague, incomplete, a means to limit provider liability, or all of these. This is a common issue in all types of service relationships.
The accepted form of agreement on service quality is an SLA.
Service level agreement
A documented agreement between a service provider and a customer that identifies both services required and the expected level of service.
The basic structure of an SLA is dictated by its name:
- ‘service’ defines the scope of the agreement
- ‘level’ defines the characteristics of the services and agreed metrics and targets for each characteristic
- ‘agreement’ covers the terms and conditions of the service provision and consumption.
This basic structure may become more complicated if the agreement covers multiple services or reflects a consumer organization’s complex structure. For example, the ‘level’ and ‘agreement’ sections may include paragraphs that apply to every service or customer within the scope, and paragraphs that are service- or customer-specific.
The ‘agreement’ section usually becomes more complicated in agreements with external organizations (where SLAs are either a form of legal contract or a part of it). In subscription agreements, it may include specific clauses, such as period of subscription, rules and costs of cancellation, and means of recurring payments.
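To make the ‘service’/‘level’/‘agreement’ structure concrete, here is a minimal, purely illustrative sketch of an SLA as a data structure. The class and field names (SLA, ServiceLevelTarget, agreement_terms, and so on) are assumptions for the example, not ITIL 4 terminology or any tool’s API.

```python
# A hypothetical, simplified model of the SLA structure described above.
from dataclasses import dataclass, field


@dataclass
class ServiceLevelTarget:
    characteristic: str   # e.g. "availability"
    metric: str           # e.g. "monthly uptime, %"
    target: str           # e.g. ">= 99.5"


@dataclass
class SLA:
    services: list[str]                       # 'service': the scope of the agreement
    levels: dict[str, list[ServiceLevelTarget]] = field(default_factory=dict)  # 'level'
    agreement_terms: list[str] = field(default_factory=list)                   # 'agreement'

    def add_target(self, service: str, target: ServiceLevelTarget) -> None:
        # Targets may be shared or service-specific; here they are stored per service.
        self.levels.setdefault(service, []).append(target)


# A subscription-style agreement with one service-specific target.
sla = SLA(
    services=["Home internet"],
    agreement_terms=["12-month subscription period", "30-day cancellation notice"],
)
sla.add_target("Home internet", ServiceLevelTarget("availability", "monthly uptime, %", ">= 99.5"))
```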
Unfortunately, even the most comprehensive SLAs demonstrate gaps between consumers’ expectations and the final agreement:
- only some of the needs and expectations are communicated by consumers to the customer
- only some of the requirements are discussed between the customer and the service provider
- only some of the discussed quality characteristics get documented in the agreement.
There are multiple reasons for these gaps, from miscommunication to intentional limiting. In mass provision services, especially for individual consumers (such as home internet or private banking), negotiations are limited to the options in the service catalogue; service consumers have little to no influence on the SLAs beyond selecting from those options.
As a result, the agreed service level always differs from the expectations and needs the service aims to meet and fulfil: often narrower in scope and lower in level.
2.1 WATERMELON SLAs
The ‘watermelon effect’ illustrates a situation where all targets are hit, and service level reports are ‘green’, while users and customers demonstrate ‘red’ levels of satisfaction. Several factors cause this effect:
- SLAs differing from consumer expectations.
- Targets being met by cheating. For example, closing incident records before the target resolution time, then opening new records with the same information and continuing the work in a new timeframe.
- Not accounting for culture, geography, season, age, or other personal factors when investigating users’ experience.
- Not including user experience in SLAs and therefore not monitoring, measuring, or analysing it.

The obvious solution is to include user satisfaction targets in the SLA (a minimal sketch of such a check appears at the end of this section). One article recommends:
“You can do a lot by just adding a simple star rating system. Or send out customer satisfaction surveys at regular intervals for some more deep-digging kind of stats. An example experience level agreement KPI could be keeping your call closure satisfaction rate above 4.5 stars.”1
This is likely to be a useful amendment to the SLAs. However, it is important to remember that service providers may adjust satisfaction rates artificially.
More importantly:
- measuring satisfaction does not equate to understanding users’ experiences
- measuring satisfaction does not replace measuring service quality
- adding satisfaction targets does not turn an SLA into an XLA.
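As an illustration of the watermelon effect, the following sketch contrasts a fully ‘green’ SLA report with ‘red’ measured satisfaction. All numbers, thresholds, and function names are invented for the example.

```python
# A hypothetical 'watermelon' check: every SLA target is reported as met
# ('green'), yet the measured satisfaction is 'red'. All numbers are invented.
def sla_report_is_green(targets_met: int, targets_total: int) -> bool:
    return targets_met == targets_total


def satisfaction_is_red(ratings: list[float], minimum_average: float = 4.0) -> bool:
    return sum(ratings) / len(ratings) < minimum_average


targets_met, targets_total = 12, 12          # all agreed targets reported as achieved
ratings = [2.0, 3.0, 2.5, 3.5, 2.0]          # user ratings on a five-point scale

if sla_report_is_green(targets_met, targets_total) and satisfaction_is_red(ratings):
    print("Watermelon SLA: the report is green, the experience is red")
```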
What is an XLA?
The concept of an SLA is closely attached to the concept of quality of service (QoS), particularly among managed service providers (MSPs) and network communication service providers. Definitions of QoS (and hence the structures of SLAs) developed in these industries are often technical, focused on network and other IT services’ performance and similar characteristics.
“For a long time, the question of how to define, provide, and measure service quality for end users has been of utmost interest for network operators, application, and service providers, as well as their customers. With the advent of packet-based communication, this has led, already in the early nineties, to several attempts at thoroughly defining Quality of Service (QoS). In the two decades to follow, the primary research directions have followed a rather technology-driven understanding of QoS, leading, among other things, to the definition of clearly specified network parameters as a prerequisite for arranging service related binding contracts between providers and users, i.e., Service Level Agreements.”2
The technical focus of the QoS and related SLAs between IT service providers and corporate service users, rather than individual and business users, has led to SLAs commonly being perceived as purely technical, lacking customer and user orientation. This created a demand for an alternative approach, quality of experience (QoE). The shift from QoS to QoE led to the introduction of an experience-focused version of an SLA: the XLA.
Experience level agreement
A type of SLA designed to establish a common understanding of the quality levels that a customer will experience through the use of the service in terms that are clear to the customer and to which he or she can relate.
XLAs aim to provide service consumers with a clear understanding of the experience they can expect from using the service.
XLAs are applicable to end-user services. For services addressed to individual consumers, agreements can be described via the experience characteristics. Examples include internet and mobile network services. For services addressed to corporate consumers, the agreement will likely include experience characteristics and technical characteristics for users and the IT team respectively. Examples include network services, printing services, and service applications.
For services providing no touchpoints with end users, technical characteristics are usually sufficient.
Examples include service platforms and facilities services like data centre conditioning.
Experience level characteristics should be mapped to the technical characteristics of the services in order to understand any constraints and to establish experience-focused monitoring of the service quality. Direct experience monitoring is often impossible, so service providers have to assume experience levels based on observable service performance data. For example, an ‘uninterrupted HD-quality movie demonstration’ (experience characteristic) is assumed to be possible only if the ‘download speed of the network connection’ (technical characteristic) is above a certain value in Mbps. The service provider cannot monitor the movie experience directly, but they can assume its level based on the available connectivity data.
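The following sketch illustrates this kind of mapping. The 25 Mbps threshold and the names used are assumptions chosen for the example; real thresholds depend on the streaming platform and encoding.

```python
# A minimal sketch: assuming HD playback is possible only above a certain
# download speed (25 Mbps here is an invented threshold), the provider infers
# the experience level from connectivity measurements it can observe.
HD_THRESHOLD_MBPS = 25.0


def assumed_hd_experience(speed_samples_mbps: list[float]) -> float:
    """Share of measurements at which uninterrupted HD playback is assumed possible."""
    ok = sum(1 for speed in speed_samples_mbps if speed >= HD_THRESHOLD_MBPS)
    return ok / len(speed_samples_mbps)


samples = [34.2, 28.7, 19.5, 41.0, 26.3]     # measured download speed, Mbps
print(f"Assumed HD experience level: {assumed_hd_experience(samples):.0%}")
```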
Service experience depends on the quality of the services provided by all members of the service network: the service provider, its partners and suppliers, and the service consumer’s resources and partners. XLAs should consider these dependencies. Where possible, they should be supported by a cooperation framework among the parties involved. If this is impossible, agreements are likely to be amended with fine print that limits the provider’s liability and eventually makes the agreement pointless.
XLAs are becoming increasingly popular among service providers. However, in most cases, the ‘experience’ section is limited to the measurement of user satisfaction with the service, or further limited to the user satisfaction with the service support. This data, though important, is insufficient to define a new type of service agreement.
To summarize, an XLA is a form of SLA that addresses measurable characteristics of the user experience. It can be a useful means of negotiating and managing the service level for end-user-facing services. It requires an understanding of the dependencies between technical service level characteristics and user experience. It may include user satisfaction with the overall service, or with specific aspects of the service, as an experience characteristic.
SLAs in ITIL 4
ITIL 4 significantly expands the concept of SLAs. It recommends including both utility and warranty characteristics in the agreements. It also suggests addressing user experience and offers several ways to do so.
4.1 USER-SERVICE INTERACTION
According to ITIL 4, there are three forms of interaction between a service provider and a service consumer:
- the service provider transferring goods to the service consumer
- the service consumer accessing the service provider’s resources
- service actions performed by the service provider, service users, or jointly.
Most service relationships combine more than one form. For example, technology-based services usually include access to the service provider’s resources and, sometimes, service actions, but the transfer of goods is relatively rare.
4.2 UTILITY, WARRANTY, AND EXPERIENCE
The ‘level’ section of an SLA usually includes the agreed service level targets for service utility and warranty.
Utility
The functionality offered by a product or service to meet a particular need. Utility can be summarized as ‘what the service does’ and can be used to determine whether a service is ‘fit for purpose’. To have utility, a service must either support the performance of the consumer or remove constraints from the consumer. Many services do both.
Warranty
Assurance that a product or service will meet agreed requirements. Warranty can be summarized as ‘how the service performs’ and can be used to determine whether a service is ‘fit for use’. Warranty often relates to service levels aligned with the needs of service consumers. This may be based on a formal agreement, or it may be a marketing message or brand image. Warranty typically addresses such areas as the availability of the service, its capacity, levels of security, and continuity. A service may be said to provide acceptable assurance, or ‘warranty’, if all defined and agreed conditions are met.
It is possible to assume from these definitions that service quality and service level refer to warranty and warranty requirements only. This is not the case. The management of service quality and service level should be holistic and focused on value. To achieve this, all relevant characteristics of a service should be managed, including associated metrics, areas of perception, and feedback.
The habit of separating the management of functional and non-functional characteristics of services, from the definition of requirements to the evaluation of the quality that has been achieved, comes from the separation of the development and operations teams. The separation of these characteristics and teams typically leads to a fragmented, formal understanding of service quality.
To summarize, service quality includes the functional and non-functional characteristics of services, as should the service level.
Service quality and service level can also be described in terms of user experience; ITIL 4 recommends doing so where relevant.
4.2.1 Utility
Utility characteristics of services are usually described as functions performed by people and other resources of the service provider, or service actions (performed by the service provider, available to the users, or performed jointly).
Service | Examples of utility | Example metrics and targets |
---|---|---|
Mobile internet | | |
Business trip air ticket search and booking | | |

Table 4.1 Examples of service utility descriptions and metrics
Table 4.1 shows that utility characteristics are often binary (it works or it does not work). Associated metrics are often based on incidents where important functions were unavailable or performance was unacceptably low. However, an overall assessment of a service utility can be based on an integration of multiple characteristics and related metrics. For example, if some functions of a system are unavailable or performed with a high number of errors, they can be assessed as a percentage of agreed utility rather than as a binary indicator.
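A minimal sketch of such an integrated assessment is shown below, expressing utility as the percentage of agreed functions that are currently usable. The function names and statuses are hypothetical.

```python
# Illustrative only: utility assessed as the percentage of agreed functions
# that are currently usable, instead of a single binary 'works / does not work'.
def utility_percentage(function_status: dict[str, bool]) -> float:
    """function_status maps each agreed function to whether it is usable right now."""
    usable = sum(1 for available in function_status.values() if available)
    return 100.0 * usable / len(function_status)


status = {
    "search flights": True,
    "book ticket": True,
    "issue invoice": False,   # unavailable, or failing with a high number of errors
    "cancel booking": True,
}
print(f"Assessed utility: {utility_percentage(status):.0f}% of agreed functions")
```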
4.2.2 Warranty
Service warranty describes the level of assurance that the agreed utility will be provided in the agreed conditions. Conditions may include:
- territory and period of service provision
- demand/workload
- threats and associated risks
- users’ readiness
- applicable legislation.
‘Level of assurance’ means that within the agreed conditions, certain levels of availability, performance, capacity, continuity, security, usability, compliance, and other service quality characteristics are provided. Some examples are provided in Table 4.2.
Service | Warranty requirements | Metrics and targets | Conditions |
---|---|---|---|
Mobile internet | Availability | | Within agreed service provision time and area, including home region and roaming destinations. |
| Performance | | Maximum number of simultaneous download processes is three. |
| Capacity | | Approved apps from official sources. Within home network and selected roaming destinations. |
| Information security | Number of security incidents caused by malware from the internet per month (0). | As long as security settings and anti-virus software settings are not altered by users and all users follow information security guidelines. |
| Compliance | Service provision and the service comply with the state regulations for personal data protection, copyright protection, and content control. | As long as content control settings are not altered by users and users follow information handling guidelines. |
| Continuity | | |
| Accessibility | All interfaces are clearly visible and easy to follow. | As long as users are not visually impaired and can read and speak English, German, French, or Spanish. |

Table 4.2 Examples of warranty requirements and associated metrics
4.2.3 Experience
As discussed earlier, organizations are increasingly including user experience targets in their agreements. Many experience metrics relate to the performance of the service interface; others indicate user satisfaction with the interface or with the service in general. Measurement of these can be integrated into digital services. Examples of experience metrics, illustrated in the sketch after this list, include the number and frequency of:
- user errors
- returns to the previous stage (back-button usage)
- help (F1) calls
- dropped (unfinished) service actions
- users who switched to a different channel during an advertising break
- users who cancel a subscription after a trial period
- users who confirm agreement with the terms and conditions without reading them.
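The sketch below shows how some of these metrics might be derived from an interaction event log. The event names and log format are assumptions made for the example.

```python
# A hypothetical interaction event log and a few experience metrics derived
# from it; the event names and log format are assumptions for the example.
from collections import Counter

events = [
    "action_started", "back_button", "help_opened", "action_completed",
    "action_started", "user_error", "action_abandoned",
    "action_started", "action_completed",
]

counts = Counter(events)
started = counts["action_started"]

experience_metrics = {
    "user errors": counts["user_error"],
    "back-button returns": counts["back_button"],
    "help (F1) calls": counts["help_opened"],
    "dropped service actions, %": round(100.0 * counts["action_abandoned"] / started, 1),
}
for name, value in experience_metrics.items():
    print(f"{name}: {value}")
```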
Table 4.3 provides some service-specific examples of experience characteristics and metrics.
Service | Experience characteristics | Metrics and targets |
---|---|---|
Business trip air ticket search and booking | Uninterrupted completion of user actions | Number and percentage of bookings which were dropped unfinished per month (<50, <5%). |
| User satisfaction with the service | Average and minimum rating given by users to the service over a period (>4 of 5; >2.9). |
| Clear and convenient interface | Number and percentage of transactions where users used the interface help over a month (<10; <5%). Average and minimum rating given by users to the service interface over a period (>4 of 5; >3.5). |

Table 4.3 Examples of experience characteristics and metrics
There are many other examples. The idea is to measure the user experience directly, not only to ask users about it. These experience-related metrics and targets are usually part of a wider agreement rather than an alternative to SLAs. The experience-based approach to service definition and measurement is applicable to services where service actions are an important part of the service (and may be the way to describe the service utility). In other words, it applies to services where users work with service interfaces. Note that there are many services with no or few user interactions, including ‘infrastructure as a service’ offerings.
From the experience management perspective, a service action is the most important form of interaction.
Service action
Any action required to deliver a service output to a user. Service actions may be performed by a service provider resource, by service users, or jointly.
It is relatively easy for a service provider to identify, agree, and measure characteristics of the resources provided to the service consumer. In many cases, these are measurable, unambiguous technical characteristics. It is more difficult to define service actions that are important for the service consumer and ensure their performance. However, attempting to do so is a valuable exercise.
4.3 THE SERVICE ACTION METHOD
The service action method helps to describe and evaluate services based on the performance of key service actions performed by the users and the service provider during service consumption. The method has the following steps:
- identification of the service actions
- matching identified service actions with the service provider’s service catalogue
- agreeing with customers on service action target performance
- agreeing with customers and service provider teams on metrics and measurements for the IT services.
The best way to identify service actions is by mapping the service consumer organization’s value streams. It is helpful if the organization has current maps of its value streams and processes: the service provider and the service consumer can derive lists of the operations supported by the IT services, related service actions, and performance requirements from this information. However, current maps of value streams and processes are uncommon. In these cases, it is possible to identify service actions using the organization’s documentation and industry standards. The resulting lists of business operations and related service actions will be more generic but may be sufficient for the SLA.
Customers’ requirements for service action performance can be derived from the organization’s internal procedures and standards. For example, if a hotel has a five-minute standard duration for the check-in procedure, this can help to set performance targets for the related service actions of the reception staff: guest check-in should be limited to three minutes; processing the payment should be limited to one minute. These requirements can be transformed into performance metrics for the respective IT services and used by the service provider to manage and measure the service level.
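A minimal sketch of this transformation is shown below, assuming a three-minute check-in target derived from the hotel’s standard; the measured durations are hypothetical.

```python
# A minimal sketch, assuming a three-minute check-in target derived from the
# hotel's five-minute standard; durations are invented for the example.
CHECK_IN_TARGET_SECONDS = 180

check_in_durations_seconds = [150, 170, 200, 165, 240, 175]   # measured per guest

within_target = sum(1 for d in check_in_durations_seconds if d <= CHECK_IN_TARGET_SECONDS)
share = 100.0 * within_target / len(check_in_durations_seconds)
print(f"Check-ins completed within the three-minute target: {share:.0f}%")
```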
4.3.1 Service actions example
A bank has a value stream for the provision of personal loans. This value stream includes several activities, most of which are enabled with IT services. After mapping and analysing this value stream, the following service actions were identified:
- Processing of a loan application
  - Current bad credit check
  - Affordability calculation
  - Credit rating assessment
  - Application scoring
  - Confirmation or update of the offer
- Loan agreement signing and transfer of the funds
  - Automated registration of a loan agreement and associated agreements
  - Opening an account and creating a payment schedule
  - Direct debit instructions setup
- Loan agreement ongoing management
  - Interest calculation and account information update
  - Payments processing
  - Calculation of the payments due if a client requests to pay the loan in full
Based on this list, the service provider and the customer may agree on the following service level requirements and metrics for some of the service actions listed.
Service level characteristics | Customer requirement | Service level metrics |
---|---|---|
Automated loan application processing time | Less than 30 minutes | Percentage of the loan applications processed on time |
Maximum duration of a service outage for all loan-related systems used in a bank branch | Less than 30 minutes | Single outage duration, maximum |
Total unavailability during a business day | Less than 60 minutes | Number and percentage of days over a period when the requirement has not been met |

Table 4.4 Examples of requirements and metrics
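The following sketch illustrates how these three metrics might be calculated from raw records. The processing times and outage data are invented for the example.

```python
# Illustrative calculation of the Table 4.4 metrics from hypothetical records.
application_processing_minutes = [12, 25, 31, 18, 45, 22]   # per loan application

outages_by_business_day = {        # outage durations in minutes, per business day
    "day 1": [10, 5],
    "day 2": [40],
    "day 3": [20, 50],
}

on_time = sum(1 for m in application_processing_minutes if m < 30)
print(f"Loan applications processed on time: "
      f"{100.0 * on_time / len(application_processing_minutes):.0f}%")

longest_single_outage = max(max(outages) for outages in outages_by_business_day.values())
print(f"Longest single outage: {longest_single_outage} minutes (requirement: less than 30)")

days_breaching_total = sum(1 for outages in outages_by_business_day.values() if sum(outages) >= 60)
print(f"Business days with 60 minutes or more of total unavailability: {days_breaching_total}")
```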
SLAs based on service action performance describe agreed characteristics of the user experience in terms of the users’ work context and business operations. However, ITIL 4 does not introduce a new name for this type of agreement; it simply expands the options available for the design of an SLA.
Practical recommendations
When a service provider defines a format for SLAs, the following actions may be beneficial:
- include user experience characteristics in the agreement if the services are end-user facing
- agree on and measure the following user experience characteristics:
  - performance of users’ service actions
  - performance of user interfaces
  - user satisfaction with overall services and specific service components or actions
- correlate the performance of users’ service actions and user interfaces with measurable technical characteristics (see the sketch after this list)
- ensure data reliability (because user satisfaction metrics are easy to manipulate)
- consider environmental and personal factors.
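As a simple illustration of correlating an experience characteristic with a technical one, the sketch below computes a Pearson correlation over paired monthly measurements of page load time and user satisfaction; both series are invented for the example. A strong correlation would support using the technical metric as a proxy for that aspect of experience, while a weak one would suggest the mapping needs revisiting.

```python
# A hypothetical validation that an experience characteristic tracks a technical
# one: Pearson correlation over paired monthly measurements (numbers invented).
page_load_seconds = [1.2, 1.8, 2.5, 3.1, 4.0, 4.4]       # technical characteristic
satisfaction_scores = [4.6, 4.4, 3.9, 3.5, 3.0, 2.8]     # experience characteristic


def pearson(xs: list[float], ys: list[float]) -> float:
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    norm_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    norm_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (norm_x * norm_y)


print(f"Correlation between page load time and satisfaction: "
      f"{pearson(page_load_seconds, satisfaction_scores):.2f}")
```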
5.1 CULTURE MATTERS
Human experience and satisfaction are subjective. The way a service is delivered can be objectively captured and measured, but the way it is perceived is influenced by the environment, personal biases, and cultural and professional backgrounds. Service providers should take these factors into account when they discuss, agree, and monitor user experience characteristics.
Some examples of the cultural and personal factors that may affect experience level measurements and assessment are:
- National culture, which may affect expectations and the way users express their satisfaction.
- Geographical and seasonal variations, which may affect expectations, experience, and satisfaction.
- Professional backgrounds, which may affect users’ understanding of the language used in the service interfaces (including special terms, abbreviations, and acronyms).
- Age, which may affect users’ ability to work with interfaces and the performance of service actions.
- Availability of help, FAQs, and knowledge articles in only a few languages, which leads to differing metrics related to their use for native and non-native speakers.
- Public image and past experience, which drive expectations of services and service providers.
Naturally, these and similar factors should be considered when designing the services, but it is also important to consider them when designing and assessing the services’ experience level characteristics.
Conclusion
Modern ITSM includes understanding and managing the service consumers’ experience. One method for this is including experience characteristics in SLAs. This can be done in three main forms: performance of users’ service actions, performance of user interfaces, and user satisfaction. ITIL 4 recommends including experience in SLAs, along with utility and warranty characteristics.
About the author
Antonina is an experienced ITSM consultant in an international environment, ITIL expert, coach, and trainer. She is a pioneer of ITSM and one of the co-originators of the ITSM community in Belarus.
Antonina is passionate about connecting the dots and applying methods and best practices across diverse industries, for example, combining art, ITSM, and coaching.