How to Incorporate XLAs into Outsourcing Contracts

This employee-experience-focused white paper written by Bright Horse and HappySignals gives you guidance on how to create XLA contracts, from XLA 1.0 to XLA 2.0.

 

Introduction


This employee-experience-focused white paper provides practical guidance on how to add Experience Level Agreements (XLAs) into the contracts between IT suppliers and customers.

The key questions answered in this paper include:

  • What is the best method for introducing XLAs?
  • What should customers ask for, and what should suppliers commit to when contracting with XLAs?

The paper also presents the various XLA introduction options that need to be considered; these involve committing to:

  • An XLA result.
  • A digital experience result.
  • Improving experience.
  • Delivering the experience management service.

Each of these options has positive and negative implications which need to be considered carefully. In addition to this, there are other XLA-related questions that need agreed answers, including:

  • What are the implications of failure?
  • Should contracts have penalties associated with the failure to deliver against XLAs or rewards associated with good performance?

Finally, the paper offers suggestions and practical guidance that your organization can use in the agreements created between a customer and a service desk provider.

Authors


Neil Keating

CEO & Co-Founder, Bright Horse
neil.keating@brighthorse.co.uk

 

Neil is the co-founder and chief experience officer at Bright Horse and the co-founder and chief executive officer at Experience Collab. He works with a diverse range of organisations, including well-known multinational brands, managed service providers, and many companies across a range of sectors, helping them change the way they currently report the success of their IT. Through education and consulting, he helps them develop the right experience metrics (XLAs) and the right Experience Management Office (XMO) for their business.


Sami Kallio

CEO & Co-Founder, HappySignals
sami.kallio@happysignals.com

 

Sami is a forerunner in human-centric experience management who inspires IT leaders to transform their IT culture for the better. He believes happiness leads to increased productivity. He is a popular writer and speaker on IT Experience Management and the cultural changes needed to make the change happen.

 

 

XLA Definition

The short definition of XLAs is that they measure the value created - the outcome of IT. Service level agreements (SLAs), by contrast, measure the output of IT.

The XLAs discussed in this paper are multi-dimensional, with multiple data streams:

  • X-Data – Experience Data (how people feel)
  • O-Data – Operational Data (what IT did)
  • T-Data – Technical Data (how the technology worked)

To help understanding, the structure of multi-dimensional XLAs, including the possible data sources, is shown in the diagram below.

[Diagram: the structure of multi-dimensional XLAs and the possible data sources]

 

What is Driving XLAs

In the last few years, Gartner Research has started to follow and research the IT industry’s adoption and use of XLAs, making the following comments:

  • “By 2025, XLAs will be mainstream in IT services and sourcing relationships.” – Mark Ray, Research Director, Gartner.
  • Gartner also states that, by 2025, two out of every three tenders will ask for XLAs.

Forrester Research has also focused on the area, stating that:

  • “While SLAs are important, XLAs are a much better way to measure the impact of technology on employee experience.” Andrew Hewitt, Senior Analyst, Forrester.

XLAs are becoming the new norm for measuring the success of IT. Therefore there is now a need for XLAs in the contractual agreements between outsourcers/internal IT service organizations and their customers.

Why are customers asking for XLAs?

The experience economy is also influencing IT services. Companies first started to talk about and focus on customer experience in the 2010s, but in the 2020s, employee experience has grown in importance. This has meant that, for internal IT and managed service providers, it’s no longer enough to focus on the output of IT (SLAs). They must now also show the value they are creating, which is the outcome of IT measured by XLAs.

While many customer organizations outsource to realize cost savings, they also prioritize employee experience as the main delivery target to mitigate the risks of outsourcing. In this area, XLAs work really well because they force the internal IT and partner organizations to focus on the same target.

Why are some suppliers driving with XLAs?

For service desk providers, XLAs have been, and still are, a way to differentiate from the competition.  Providers, of course, care about their customer satisfaction and delivering good employee experiences, and know that creating value is the best way to retain customers. Plus, when customers are satisfied, they’re more likely to buy additional services from a well-performing partner.

Experience measurement also offers additional value to providers that is not always 100% evident from the beginning - this is improved efficiency. When the customer and provider start to understand employee experience, they also start to understand which improvement opportunities are meaningful and which areas are already performing well enough. Experience management and XLAs give clear focus, and that makes working efficiently easier. 

How XLAs Differ from Traditional Metrics

To find the right level of XLA contractual commitment, it’s important to understand why XLAs are different from your organization’s existing commitment to SLAs. By understanding this, it’s easier to explain the differences between an SLA contractual commitment and an XLA contractual commitment.

Experience is cumulative

Experience happens continuously. When people are asked experience-based questions via an XLA structure, consciously or subconsciously, they’re balancing all the positive and negative interactions that they may have had with IT over time. Recent events might carry more currency in their minds (i.e., they have a higher weighting), but all recalled interactions will be taken into consideration and balanced against each other.

This means that it’s hard to accurately measure true XLAs in a single snapshot (e.g., monthly like SLAs). If you measure just one “moment in time,” then you’re likely going to get false negatives and/or false positives. This means we must view experience over time and not just in monthly snapshots – we may measure it monthly, but any issues need to be viewed in quarterly or six-monthly time periods. This will show the movement in experience and reflect that experience improvement can have a time lag.

XLAs are dynamic

XLAs can and will change more regularly than SLAs. XLAs are a measure of outcome, whereas SLAs measure how IT works. SLAs very rarely change in a contract because the service delivery/service management capabilities rarely change. However, targeted outcomes do and will change according to the customer’s business priorities.

For example, you may find you have achieved a certain outcome and now want to create a new outcome, or the business changes direction, and additional or new outcomes are desired. Experience management is always looking for new experience insights, which may well result in new or changed experience outcomes and their associated experience measures.

In addition to this, new experience indicators (XIs) can provide better insights into experience. For example, we may find or implement additional SLAs that could better indicate experience, or we may purchase a new tool that can provide additional T-Data measures.

Therefore, because of this dynamic nature of XLAs, we must view them slightly differently from existing SLAs that don’t change at all.

Joint commitment/horizontal nature of XLAs

XLAs are often a joint commitment between the IT supplier and the customer. It may well be that the IT supplier is not in complete control of experience. SLAs are typically managed in vertical silos; they measure one service (service desk SLAs, desk-side support SLAs, network SLAs, etc.). XLAs can be horizontal, and they can cut across many silos – for example, an XLA for onboarding could involve many departments in IT (IT supply, IT administration, IT support, desk-side support) and even other departments such as Human Resources (HR),  Facilities, etc.

Through XLAs, an IT supplier can determine experience failures, but upon investigation, the cause of the failure may well be something out of their control. An XLA may determine that employees are unhappy with their devices or a particular business application, but it may well be the responsibility of the customer to fix these (e.g., they may need to release the budget, buy a new application, or update an existing one).

This means that sometimes committing to an experience number or improvements can be challenging for an IT supplier unless they’re delivering the complete solution.

Experience is a sum of smaller experiences/feelings

When people rate their experience for IT incidents, they’re taking into consideration all their touch points with IT and also other things that happened from the time they realized they had an IT issue to the moment they knew it was solved. For different people, and depending on the situation, different things in this process will have a different weight in the total experience.

 


 

Benchmark and Baseline Considerations

The term baseline is used here to describe an organization’s starting point. This is typically the measured level of experience during the first two months.

There can be two benchmarks:

  • The service provider’s internal benchmark.
  • HappySignals Global IT Benchmark (the data collected from all HappySignals customers).

HappySignals is often asked, “Should the HappySignals benchmark be used as a target in agreements?” Even though it’s good to understand one’s own quality compared to others, we don’t recommend this because people from different regions and cultures rate services differently. Western Europe, for example, is much more critical than the US. So if your employees are mainly from Western Europe, 70 would be a great score for Happiness, but if your employees are mainly in Asia, 70 would be average or even a bad score.

Both HappySignals and the service provider’s benchmarks should give the customer organization an understanding of what can be done better and what is realistic.

The IT provider’s internal benchmark

As IT supplier organizations mature with XLAs and deploy them to more customers, they learn what “bad,” “average,” and “good” look like. This means they can have a benchmark of what a typical customer experience score looks like after 1-2 years of delivering experience management and XLAs. This enables them to have a target score that they can commit to over time.

Baseline – the starting point for improvements

As an example, an IT supplier could commit to making a 10% (or 10 Happiness Score points) experience improvement during the next six or twelve months – this commitment could even be put into the contract.

However, this target can only be set after the organization’s own baseline has been measured.

Example one | Happiness | Lost time / ticket | XLA score
Baseline score | 50 | 4h | 5
Target (after 6 or 12 months) | 60 | 3h 15min | 5.5

 

Example two | Happiness | Lost time / ticket | XLA score
Baseline score | 75 | 3h | 7
Target (after 6 or 12 months) | 80 | 2h 45min | 7.3

 

These targets shouldn’t be set in agreements for several years because targets must be aligned with business priorities more than once a year. Sometimes focus means prioritizing smaller employee groups (the sales organization, merged company employees, new employees, etc.) or certain services because there is more change happening. In these cases, focusing on the average score would not drive the right outcome for the business.

If these improvement targets are used for penalties or rewards, we recommend that these scores are calculated from a minimum 2-month time period. This helps take into account any seasonal differences.

 

We don’t recommend using penalties for XLAs. Also, we recommend introducing rewards only when the customer and IT supplier are used to XLAs and experience management.

But if you decide penalties and rewards are a must, then please read the Option 3 – Committing to Improving Experience section.

The HappySignals Approach to XLAs

HappySignals has previously authored an extensive document titled the “Practical Guide to XLAs”. If you’re unfamiliar with what XLAs encompass, we strongly recommend reading this to understand the basics of XLAs.

In brief, SLAs and XLAs serve distinct purposes. SLAs function as a management tool, ensuring that we adhere to predefined operational standards. In contrast, XLAs serve as a leadership tool, aiding leadership in communicating what holds significance for the business. Essentially, by giving teams a target to reach, the team can then make their own decisions to drive the XLA outcome.

Within this paper, our primary focus is to define the aspects we measure as XLAs and explain how to incorporate them into outsourcing agreements.

HappySignals focuses on two distinct experience metrics, both of which are based on employee feedback related to various services (e.g., service desk experience, application experience, overall IT experience, etc.):

  • Happiness – employee’s experience rating
  • Lost Time – employee’s productivity loss

Although these two metrics are interconnected, they nevertheless unveil different insights into  the IT experience.

Happiness

An employee’s experience rating is expressed as a numerical value between 0 and 10. This scale is then transformed into a Happiness metric akin to the computation of Net Promoter Score (NPS). Scores of 9 and 10 are regarded as positive, scores of 7 and 8 are deemed neutral, and any rating below 7 indicates a negative experience. The average Happiness metric is computed by determining the percentage share of positive experiences minus the percentage share of negative experiences. Consequently, the Happiness metric score then falls within the range of -100 to +100.
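To make this concrete, here is a minimal sketch (in Python, using invented ratings) of how a Happiness score could be computed from 0-10 survey responses following the rules described above:

```python
def happiness_score(ratings):
    """Compute an NPS-style Happiness score from 0-10 experience ratings.

    Ratings of 9-10 count as positive, 7-8 as neutral, and anything
    below 7 as negative. The score is the percentage of positive
    responses minus the percentage of negative responses, so it
    falls in the range -100 to +100.
    """
    if not ratings:
        return None
    positive = sum(1 for r in ratings if r >= 9)
    negative = sum(1 for r in ratings if r < 7)
    return round(100 * (positive - negative) / len(ratings))

# Example: 6 positive, 2 neutral, and 2 negative responses give a score of +40
print(happiness_score([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))  # 40
```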

The global average Happiness for IT incidents as of September 2023 was +78. In the HappySignals Global IT Experience Benchmark™ Report, you can see more granular insights, such as the typical experience when using a certain channel or in a certain country.

This method of score calculation is used because it places a greater emphasis on distinguishing between positive and negative experiences. It underscores the importance of individuals expressing favorable opinions about IT, as well as those reporting concerns. This method more effectively highlights the distinctions between different services or service areas compared to the conventional average score approach.

Lost time

When individuals evaluate their experience, it’s heavily influenced by their interactions with fellow humans. Consequently, it’s possible for an employee to rate their experience positively even if a slowly handled support ticket was critical to their work, provided they felt appreciated by the service desk. In such cases, an individual may give a high rating despite having lost several hours of their work time.

Lost Time, on the other hand, gives the service desk an indication of relative priority. For example, if employees say that they’re losing twice as much time with service A as with service B, the service desk should prioritize service A.

The HappySignals solution always asks employees for a Lost Time estimate because how long a ticket is open doesn’t always reflect how much time people are losing - they might carry on with other work while the ticket is being worked on. Sometimes the Lost Time is even longer than the time the ticket was open because the employee wasted time trying to solve the issue themselves or with colleagues before contacting the service desk. Lost Time tells service providers how much actual employee productivity was lost because of the issue, not how long the service desk took to fix it.

As of September 2023, the average Lost Time per ticket was 3 hours and 5 minutes. This result is based on The HappySignals Global IT Experience Benchmark™ Report data. It’s also good to understand that 80% of the total Lost Time is caused by just 13% of the tickets. For developing efficiency, it’s crucial that the service provider and customer together know which tickets cause inefficiency and should therefore be prioritized by the service desk.
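To illustrate that skew, the sketch below (Python, with purely hypothetical ticket data) shows one way to identify the smallest set of tickets that accounts for roughly 80% of the total Lost Time, so that those tickets can be prioritized:

```python
# Hypothetical tickets: (ticket id, Lost Time in minutes as reported by the employee)
tickets = [("T1", 15), ("T2", 600), ("T3", 30), ("T4", 45),
           ("T5", 1200), ("T6", 20), ("T7", 90), ("T8", 10)]

# Sort by reported Lost Time, largest first
tickets.sort(key=lambda t: t[1], reverse=True)

total = sum(minutes for _, minutes in tickets)
cumulative, heavy_hitters = 0, []

# Collect the smallest set of tickets that covers ~80% of all Lost Time
for ticket_id, minutes in tickets:
    heavy_hitters.append(ticket_id)
    cumulative += minutes
    if cumulative >= 0.8 * total:
        break

print(f"{len(heavy_hitters)} of {len(tickets)} tickets "
      f"({100 * len(heavy_hitters) / len(tickets):.0f}%) cause "
      f"{100 * cumulative / total:.0f}% of the Lost Time: {heavy_hitters}")
```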

Conversely, Lost Time (productivity) places a premium on the service desk’s efficiency. For example, if support tickets related to a specific service consistently result in the doubling of Lost Time compared to other services, these tickets should be accorded higher priority by the service desk.


Screenshot from the HappySignals Built-in XLA Management features

 

 

Connecting X-data to O-data

HappySignals’ tool facilitates the analysis of these two metrics by customer organizations from various operational and process data perspectives. This multifaceted approach enables results to be comprehensively assessed from different angles, such as

  • Tickets handled by vendor X.
  • Tickets pertaining to service X.
  • Utilization of channel X.
  • Submission of tickets by new employees.
  • Tickets originating from specific countries/locations.
  • Tickets that remained open for more than 1 day.
  • Tickets that were reassigned multiple times.

Within the HappySignals solution, XLA targets can be established based on this operational data. Consequently, individuals within the IT domain can each possess their unique XLA targets to better understand their role in enhancing the overall experience.

The synergy between experience, lost time, and operational data empowers organizations to understand what is most important at any given moment, allowing them to focus on the most critical issues and opportunities.
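As a rough sketch of how this kind of segmentation could work (Python, with hypothetical survey responses that each carry a Happiness rating plus O-data attributes from the related ticket), the same Happiness calculation can be broken down by any operational dimension:

```python
from collections import defaultdict

# Hypothetical survey responses joined with O-data from the ITSM tool
responses = [
    {"rating": 10, "channel": "portal", "country": "FI", "reassignments": 0},
    {"rating": 4,  "channel": "phone",  "country": "DE", "reassignments": 3},
    {"rating": 9,  "channel": "portal", "country": "DE", "reassignments": 1},
    {"rating": 6,  "channel": "phone",  "country": "FI", "reassignments": 2},
]

def happiness(ratings):
    # NPS-style score: % of ratings 9-10 minus % of ratings below 7
    positive = sum(r >= 9 for r in ratings)
    negative = sum(r < 7 for r in ratings)
    return round(100 * (positive - negative) / len(ratings))

def happiness_by(dimension):
    # Group ratings by one O-data attribute and score each segment
    groups = defaultdict(list)
    for response in responses:
        groups[response[dimension]].append(response["rating"])
    return {segment: happiness(ratings) for segment, ratings in groups.items()}

print(happiness_by("channel"))  # {'portal': 100, 'phone': -100}
print(happiness_by("country"))  # {'FI': 0, 'DE': 0}
```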

HappySignals proposes that organizations hold a prioritization meeting every six months. During this session, prioritization is determined from three distinct perspectives:

  • Areas where employees express dissatisfaction.
  • Instances where employees are experiencing significant time loss.
  • Alignment with business or IT objectives (conforming to both company and IT strategy).

The latter perspective encompasses considerations such as:

  • The company is investing €20 million in SAP development, necessitating a heightened focus on the SAP experience.
  • A recent company merger, necessitating a focus on the experience of the affected employees.
  • The anticipated hiring of 500 new employees, mandating an intensified focus on the new employee experience.
  • A strategic imperative for expanding into new markets, prompting an emphasis on the experience of the sales organization in those markets.
  • An impending renewal of the service portal, warranting a focus on its pre- and post-change experience.
  • Contemplation of changing the managed service provider (MSP), giving the current provider an opportunity to enhance their service based on results.

In summary, HappySignals’ approach, centered around X-data, is straightforward and easily implementable. It’s the foundational step toward embracing XLAs or, for certain organizations, the sole requisite step in this direction.

XLA Contractual Options

There are several options that an organization can choose from in how they commit XLAs into a contract, and each option has strengths and weaknesses. The best option to choose is often a reflection of both the customer and service provider experience management maturity - the longer they’ve been delivering XLAs, the more confidence both parties will have in predicting the results.

This is especially true of service provider organizations, as in outsourcing agreements it’s often the supplier who will be delivering a lot of the experience measures - the more XLAs the supplier has delivered, the more confident they can be in what they commit to their customers.

If an organization (either customer or service provider) has been running XLAs for a while, it will have built up a good benchmark of typical XLA results. This will give them a better understanding of how people feel about IT - they will know the current baseline well and, using this insight, will have more confidence in committing to actual numbers. This will help them as they build the XLA contract.

The four most common XLA contractual options to consider are:

1. Committing to an XLA result

This option is committing to an actual XLA result score or scores. The commitment would be to always be above a certain level or, as with SLAs, stating that it will be above a certain level for a certain percentage of time (e.g. the score will be above 7 for 90% of the time).

2. Committing to a digital experience result

This option involves committing to a particular digital experience score. Typically, these have been out of 10 but are moving to a % score. This means that the service provider would commit to a level of digital experience and not go below this.

3. Committing to improving experience

Unlike committing to a score, this option commits to experience improvements. The IT service provider commits to delivering continual experience improvements or never decreasing experience (i.e. the experience will never get worse).

4. Committing to delivering the experience management service

This last option is the most common; rather than committing to scores or continual experience, the IT service provider commits to delivering the XLA or ITXM process on a continuous basis.


Option 1 – Committing to an XLA Result

In this model, the IT service provider commits to delivering a certain level of XLA performance. Due to how we're accustomed to setting SLAs, this is often the initial option that organizations consider when adding XLAs to contractual agreements.

Our experience with customers is that this approach does not yield the best possible outcome in the long term. If the XLA target is specified in the agreement, it can only establish the “what is enough” level. This gets more and more problematic if the contractual agreement period is long.

Our recommendation is that organizations never set experience targets for longer than one year and preferably only for six months. This is because IT experience expectations, IT tools, and even customers’ priorities will change.

All XLA scores should be measured and reported continuously but contractually reviewed every six months or annually. However, this brings another issue - when is the contractual review of the score undertaken?

If the score is taken too early, then this will not be fair on the IT service provider, as they won’t have had a chance to improve. If the score is taken in just one month (e.g., the last month of the period), then there is a danger that this might be a “dip” month, which would be unfair as well. Therefore, the recommended approach is to:

1. Always calculate scores for penalties and rewards from the last two months of the target period.

2. Follow and report scores continuously or at least on a monthly basis.

3. Require each month’s score to be above 80% of the target. This allows the IT service provider to manage seasonal “dips.”
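A minimal sketch of how such a review could be evaluated (Python, assuming monthly Happiness scores over the target period, with the official score taken as the average of the last two months and every month required to reach at least 80% of the target):

```python
def review_xla(monthly_scores, target):
    """Evaluate an XLA score commitment at the end of a target period.

    monthly_scores: the Happiness score for each month of the period,
    oldest first. The review score is the average of the last two
    months; in addition, every individual month must reach at least
    80% of the target to allow for seasonal "dips".
    """
    review_score = sum(monthly_scores[-2:]) / 2
    every_month_ok = all(score >= 0.8 * target for score in monthly_scores)
    return review_score >= target and every_month_ok

# Hypothetical six-month periods with a Happiness target of 70
print(review_xla([60, 66, 68, 72, 71, 74], target=70))  # True: average 72.5, no month below 56
print(review_xla([40, 66, 68, 72, 75, 76], target=70))  # False: one month falls below 80% of target
```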

The issue here is that experience is cumulative and, by its very nature, will sometimes go up and sometimes down. This means that it is the trajectory of the XLA score that matters more than the actual score at that moment in time. Is the experience going up or down?

Option 2 – Committing to a Digital Experience Result

Most large IT service providers use DEX tools to provide them with a DEX score. This is typically a score out of 10, but more are moving toward percentages as the target. Either way, this score can be a contractually committed target.

A target DEX score would be stated in the contract, and using the baseline and provider benchmark approach mentioned above, the IT supplier could contractually commit to improvements (e.g., a 10% uplift p.a.). If the baseline score was already very good, then they might commit to not degrading the experience (i.e., not lowering the score). If required, penalties/rewards can be implemented based on a similar structure to that mentioned in the Benchmark and Baseline Considerations section above.

This option is simple to measure - a DEX tool will do all the work - and therefore easy to put into a contract. However, there are problems with this:

1. This is not a true XLA. XLAs are much wider than DEX. A DEX score can be used as part of an XLA, but they’re not XLAs in their own right.

2. DEX scores can be easily configured. The danger here is that most of the time, the IT service provider is managing the DEX tool, so it’s easy for them to adjust the metrics to suit their needs.

3. Not all DEX tools include sentiment or experience data. Even when they do, the DEX score is sometimes configured without sentiment – this means it becomes just a technical device score (T-Data), not an experience score.

Option 3 – Committing to Improving Experience

This option is where the IT service provider commits to improving experience over time (or at the very least not degrading experience over time). In this option, the actual number at a moment in time is less relevant as it is the direction of travel (the trajectory) that is being measured.

In this approach, XLAs are measured continuously, with each month’s scores examined and the direction of travel reviewed to see the experience trajectory. Has experience improved, or has it degraded?

The contractual mechanism will depend on the IT service provider’s maturity/confidence and customer needs. For example, if both are mature with XLAs and the customer baseline is low, then they can use the Benchmark and Baseline approach discussed above and commit to a 5% or 10% improvement each year, depending on where the baseline is.

If both the customer and IT service provider have experience in doing experience management, the contract(s) could have a reward associated with reaching targeted improvements.

If target scores have been reached, then a bonus of 5% of the fees for that target period could be given. Also, in this model, the achieved score should be calculated from the last two months of the target period. Parties should also check that, within the measured period, every month separately had a score that reached at least 80% of the target.

A key caveat here is that we need to remember that experience can be a joint commitment, so we need to consider both the customer and IT service provider’s responsibilities.

If the IT service provider is new to XLAs or the customer baseline is high, then perhaps the agreement is to not degrade the experience. The agreement could be to improve the experience but not a commitment to a certain score or % improvement. This would then be close to option 4 below.

Option 4 – Committing to Delivering the Experience Management Service

This option is the best method for organizations that are new to XLAs. Within this option, no actual results or improvement levels are committed to, but commitment is given to ways of working.

Like with a statement of work (SOW), the IT service provider commits to delivering the experience management service and not the result of the experience management service. For example, the commitment would be:

1. To run experience surveys continuously to see how people feel (X-Data).

2. To agree on how the focus for experience management is set (priorities and selected improvement actions). For example, CIO meetings every six months.

3. To review the XLA data continuously in both organizations.

4. For the provider to give 2-6 improvement proposals every six months.

5. To use O-data and T-data to help provide context and insights and to fix experience issues by understanding the root causes.

6. To implement any agreed experience improvements and celebrate success together.

By doing this, the IT service provider is not committing to a score or a % improvement but just committing to delivering the experience management service and doing something about the results.

This is easier to manage and drives results better than focusing on just average top-level numbers, as can easily happen with options 1-3.

This option also gives the needed flexibility to learn and adjust targets continuously.

In this model, it’s also important that the customer commits to this way of working. So in the contractual agreement, there should be responsibilities for both parties.

Penalties and Rewards

The chart below shows whether HappySignals customers have penalties and/or rewards in their SLA contracts. Although there is a slight increase in rewards for good achievement, you can see a clear movement to having neither penalties nor rewards, and the number of customers having sanctions has dropped significantly. It’s likely that this shows the recognition that the traditional SLA “carrot and stick” approach is not working in today’s world.

[Chart: penalties and rewards in HappySignals customers’ SLA contracts]

 

We’re not surprised by this as, in our experience, penalties linked to SLAs harm experience targets. It’s a fact that if you have penalties set for SLAs, your service provider has to prioritize those. So be careful.

Also, XLA penalty structures should be avoided as much as possible, at least in the early deployments of XLAs - as committing to targets can be challenging in unknown environments. The market needs to mature, and service providers need to gather benchmarking data to know what targets to commit to.

Even though we don’t recommend this, we believe some customers will want to continue using penalties with SLAs and also add them to XLAs. So if your organization has to implement them, do it according to this structure:

1. If SLAs are not met, but XLAs are improving, then all penalties are suspended.

2. If SLAs are not met, and XLAs are degrading, then SLA penalties are paid.

3. If SLAs are met, and XLAs are improving, then there is a reward.

4. If SLAs are met, and XLAs are static – there is no penalty or reward.

5. If SLAs are met, but XLAs are degrading, then penalties are paid.

So, SLA penalties are suspended while experience is improving but put back if experience degrades.
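As a sketch of that decision logic, the snippet below (Python) encodes one possible reading of the five rules above; the case where SLAs are missed and XLAs are static is not covered by the list, so its handling here is an assumption:

```python
def penalty_or_reward(slas_met, xla_trend):
    """Map SLA achievement and XLA trajectory to a contractual outcome.

    slas_met: True if the SLA targets were met in the period.
    xla_trend: "improving", "static", or "degrading".
    """
    if not slas_met:
        if xla_trend == "improving":
            return "all penalties suspended"
        # Assumption: "not met + static" is treated the same as degrading
        return "SLA penalties paid"
    if xla_trend == "improving":
        return "reward"
    if xla_trend == "static":
        return "no penalty or reward"
    return "penalties paid"  # SLAs met but experience degrading

print(penalty_or_reward(slas_met=False, xla_trend="improving"))  # all penalties suspended
print(penalty_or_reward(slas_met=True, xla_trend="degrading"))   # penalties paid
```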

In line with the above SLA penalty/reward systems, there should also be consideration of experience target achievement - rewarding service providers for experience improvements (although Finance departments may have challenges providing experience rewards if there are no experience penalties). Experience improvements and XLAs should link directly to improved business value (such as productivity, efficiency, engagement, and commercial outcomes); making experience improvements therefore has positive consequences for the organization, such that rewards can be considered.

If implemented, experience improvement rewards should focus on a combination of immediate financial rewards, longer-term rewards, and other non-financial rewards. Immediate financial rewards could include a quarterly/six-monthly reward payment for making experience improvements (for example, a % of the fee is paid to the IT service provider as a reward, with higher rewards given for higher target achievement). Longer-term rewards could include contract renewals or contract extensions (such as additional projects or services). Other non-financial rewards could include references, marketing material, less reporting, or fewer meetings.

XLA Recommendations

As stated at the beginning of this paper, customers and service providers have their own reasons for and benefits from XLAs and experience management. For this reason, we believe XLAs will continue to replace SLAs in outsourcing agreements.   

XLAs and experience management are still new. They’re also a change to the IT organization. For these reasons, we recommend taking a step-by-step approach to both. Don’t try to jump from zero to a perfect world. Instead, a step-by-step approach will help guarantee that your organization gets quick results and brings the wider team behind the change to a more employee-centric way of working.

XLAs and experience management should be seen as an IT leadership tool - helping the IT organization to make better decisions and deliver more value to the business.

It’s recommended that you start with the XLA 1.0 model and then, when mature enough, move to the XLA 2.0 model.

The main recommendations related to using XLAs in contractual agreements can be summarized as:

  • When driving XLAs and experience management, focus on changing the way of working rather than on reaching a certain target level.
  • When agreeing on the way of working, include both the responsibilities of the customer and the responsibilities of the IT service provider.
  • Agree on how priorities and XLA targets are set continuously (every six months). Targets for improvements should be set, but not with penalties.
  • If possible, don’t use penalties for SLAs or even for XLAs. They just force the IT service provider to optimize the average numbers, and that won’t drive the preferred outcome.
  • Use XLA rewards only when both the customer and IT service provider are experienced in using XLAs.
  • When calculating your official XLA scores, we recommend using the last two months of the target period.

XLA 1.0 - Focused on X-data

Many organizations start their XLA journey by focusing only on X-data. This is also our recommendation because we know this allows organizations to get results fast. Simple targets also make it easier to roll out the new way of working to the wider IT team.

To get maximum outcomes from your services, it’s more important for the customer and IT service provider to focus on ways of working than on a single XLA target number. For this first XLA implementation, we therefore recommend that the customer and IT service provider approach XLA introduction with a commitment to delivering the experience management service, with no penalties.

Recommendations for the XLA agreement

  • Measure Employee Happiness to understand the overall experience (reputation) of the service and to find prioritized areas for improvement.
  • Measure Lost Time to understand business criticality and to be able to prioritize services based on business priority.
  • Link the experience metrics with operational data to understand different experiences and to prioritize improvements based on the current situation and business priorities.
  • Agree on how priorities and XLA targets are set continuously (every six months). The focus should not be on the overall scores but on selected improvement areas that are agreed together.
  • Agree on the methodology used. The HappySignals ITXM Framework™ can be used for this purpose.
  • Don’t include sanctions related to SLAs, DEX scores, or even XLAs. Don’t add rewards before understanding the baseline.

Commit to ways of working on both sides:

IT service provider

  • Employee feedback about their own service quality is continuously shared with service desk agents. Also, agents are trained in Employee profiles to adjust their way of working for different kinds of employees.
  • Every six months, the service provider gives a minimum of two improvement proposals based on the current situation.
  • Continues its internal and customer-facing meetings about the experience results and ITXM progression.
  • Provides monthly reporting on scores and selected improvement actions.
  • Its delivery team will have ITXM certificates.
  • Uses Lost Time to prioritize ticket handling (to be implemented after one year).

Customer

  • Holds an Experience Steering Meeting with the CIO every six months, setting Experience Objectives and Targets for the next six months. The main service provider partners are included.
  • Is partly responsible for solving possible collaboration issues with Level 2-3 support teams.
  • Analyzes the Lost Time data a minimum of twice per year to understand, together with the service provider, the areas where the root cause of problems should be fixed. (This is the customer’s responsibility because the tools and other partners have been selected by the customer.)
  • Holds a Monthly Experience Review Meeting (ITXM Best Practice) - reporting scores and the improvement actions done inside the customer organization.
  • The customer’s team will have ITXM Certificates and ITXM Training.

Example XLA 1.0 Contract

Please see Appendix A for an example of what an XLA 1.0 contract could look like.

XLA 2.0 - Including X, O and T-data

For those customers who want to bring X-data, O-data, and T-data together to gain insights into both what people feel and why they feel that way, then XLA 2.0 models can be designed using the Experience Collab models (Experience Management Framework™  and XLA Stack™).

The X-data will be designed and understood in the same way as in XLA 1.0, but this is then bridged to O-data from ITSM tools and T-data from Technical (DEX) tools to provide a balanced scorecard of experience. If you’re using this model, then out of all the options described above, our recommendation is option 4 – commit to delivering the experience process and measuring experience, but don’t commit to continually improving it or achieving a set target.

Some organizations are mature with DEX and commit to achieving a certain DEX score, but they’re not mature with XLAs. Some IT service providers discuss XLAs when they’re really referring to DEX, which is a danger to customers who may get confused and believe that this is a commitment to an XLA score, not just a DEX score.

In this XLA 2.0 model, we recommend that the customer and service providers have a phased approach to committing to and contracting XLAs:

  • Commit to only delivering the experience management service (as described above) and not improvements or XLA targets.
  • During the early term of the contract, both parties will build up a baseline score that enables them to gain an understanding of where the customer is in XLA score terms.
  • Both parties can then have a joint commitment to increase the XLA score as per the business needs, making short-term improvements that build up to longer-term experience goals (as discussed above).
  • As this further develops throughout the contract and both parties show improved experience scores, these XLAs can be matured into XLA 3.0, where, following the same structure as XLA 2.0, both parties can agree and apply XLAs to particular persona groups within the customer organization to tailor improvements to key personas/departments.

Appendix A: Example XLA 1.0 Contract

This appendix gives ideas and recommendations for incorporating an XLA 1.0 type of agreement into your organization’s outsourcing contracts.

Suggestions for Contract Inclusion XLA 1.0

This part of the white paper can be freely copied and edited for any purpose, including commercial use.

Definitions

XLA

The experience level metric or agreement which will be made up from a combination of X-data and O-data.

Experience Metrics

Happiness and Lost Time measured from the employee perspective.

Experience Ambition/Objective

The objective for the next six months is based on company strategy, IT action plans, and higher strategic goals.

Experience Target

The experience level to achieve in the agreed time interval.

Experience Indicator

An experience factor that influences a good or bad employee experience  (skills of agent, speed of service, etc.).

X-data

Experience data derived from Experience Indicators about how employees feel about whether the Experience Objectives are being delivered and Experience Targets met.

O-data

Operational data that helps people to analyze the experience data (related to the country, service, channel, provider, etc.).

Experience-based Services

Service desk services and desk-side support services. These are the services that are measured with XLAs.

Current Experience

The experience being delivered now or the current baseline.

Starting Baseline

The average results from the first two months of the service delivery are seen as the starting baseline.

Old Baseline

The old experience level the customer had before this contract.

Experience Interval

The length of time between reviewing and setting targets. In this contractual agreement, this is six months.

The Global IT Experience Benchmark

The Global IT Experience Benchmark data collected from HappySignals Customers’ employees’ experiences. This is used to understand your organization’s service quality compared to others.

HappySignals Platform

The HappySignals SaaS product is used to collect and analyze experience data.

HappySignals Surveys

Ready-made experience surveys with Happiness, Lost Time measures, and Experience Indicators

HappySignals ITXM Framework™

The HappySignals ITXM Framework™ practically describes the way of working, best practices, and progression model for IT experience management.

ITXM Certificate

The certificate is given on passing the HappySignals ITXM Framework™ exams, such as the ITXM Foundation Course.

ITXM Training

The training courses for understanding the basics of experience management (e.g., HappySignals ITXM or Experience Collab Foundation).

Experience Steering Meeting

The steering meeting is held every six months between the customer and service provider.

 

Experience management service description

Measurement

In line with our commitment to enhancing employee experience, the service provider will utilize HappySignals to understand end-user perceptions by measuring employee experience after each support ticket resolution.

We aim to continuously measure and improve the employee experience using two key metrics:

  • Happiness - This metric helps us gauge the reputation of our services and pinpoint areas that require prioritized improvements.
  • Lost Time - Measuring employees’ perception of Lost Time with tickets or other IT services assists us in assessing the criticality of different services, applications, or IT assets and enables us to prioritize our continual Improvement actions based on business needs. The Lost Time metric can be used to report on productivity gains.

Reporting, transparency, and collaboration

All X-data is continuously and transparently shared with customers using the HappySignals platform.

Our collaborative approach is based on the HappySignals ITXM Framework™, where both the customer and the service provider work together to track the progression of experience management. Our shared objective is to continually enhance the ITXM progression score. While our primary focus is always on improving employee experience and reducing Lost Time, we use the ITXM progression model to support the operationalization of this overarching goal.

Monthly XLA report

The service provider is committed to delivering a monthly XLA report. This report will include:

  • Current XLA scores for the overall service, including incidents and requests, as well as scores for selected focus areas.
  • The total productivity impact to end-users in the past six months, as reported by the HappySignals platform.
  • Highlights of 1-4 significant changes in experience during the previous month/months.

Experience steering meeting

Every six months, the customer and service provider will collaborate to establish new Experience Objectives and Targets in a structured Experience Steering Meeting. Ahead of this meeting, the customer will provide the service provider with prioritization based on business needs. These priorities may be influenced by factors such as employee location, specific services, business units, roles (e.g., sales, factory workers), service channels, onboarding of new employees, and more.

The service provider will propose 2-6 improvement areas based on the current experience scores, Lost Time data, and the business priorities received from the customer. Specific targets (Happiness and Lost Time) will be set for the selected improvements.

For each improvement task, the service provider and the customer will agree on the type of improvement and which party is responsible for its execution - whether it’s the customer, the service provider, a joint effort, or another involved party.

Both parties will collaboratively set experience targets for the next six months, encompassing overall targets for IT incidents, IT requests, and specific improvement areas. These results will be measured as the average score of the last two months of the target period.

Service transition (change of service provider)

Both parties acknowledge that there may be a temporary decrease in employee experience following a change of service provider. Typically, based on HappySignals data, this decrease is around 20-30 points in Happiness.

To address this, the customer will measure the experience provided by the previous service desk provider during the last two months of their service. This measurement will serve as the minimum experience target for the new service. The new service must achieve this target within the first six months of service implementation, measured as the average Happiness and Lost Time scores over the last two months of this six-month period. In cases where the quality of service with the previous service provider cannot be assessed, targets will be set after establishing a baseline measurement (after two months).

 

Responsibilities

The service provider will

  • Continuously analyze experience results to enhance their own performance.
  • Share all employee feedback with agents whenever feedback is received.
  • Provide training to agents regarding HappySignals employee profiles.
  • Co-run Experience Steering Meetings.
  • Take responsibility for the O-data quality related to handled tickets (ticket categorization etc.)
  • Deliver the experience management service in alignment with this document.
  • Provide the XLA report as outlined in this document.
  • Ensure that service desk team members hold ITXM Certificates.

The customer will

  • Continuously analyze experience results to improve their own operations.
  • Share all employee feedback with IT leadership and business stakeholders.
  • Assist or initiate changes if experience issues are related to the work of the customer’s employees or third parties.
  • Communicate to employees how their feedback contributes to improving service quality to ensure employees understand how their feedback is being used.
  • Organize and co-run Experience Steering Meetings that prioritize experience objectives and targets in six-month intervals.
  • Take responsibility for overall O-data quality in the ITSM tool (data quality of the integrations). 
  • Ensure that the team responsible for collaboration with the service provider holds ITXM Certificates and receives training.

Appendix B: Example XLA 2.0 Contract

This appendix gives ideas and recommendations for incorporating an XLA 2.0 type of agreement into your organization’s outsourcing contracts.

Suggestions for Contract Inclusion XLA 2.0

Based on the recommendations within this paper, we suggest that customers and IT service providers approach XLA introduction with a commitment to delivering the required experience service with no penalties.

If this is the case, then we would expect to see elements like the following in a customer contract:

Definitions


XLA

The experience level metric or agreement which will be made up from a combination of X-data, O-data, and T-data.

Experience Metrics

Happiness and Lost Time measured from the employee perspective.

Experience Ambition/Objective

The objective for the next six months is based on company strategy, IT action plans, and higher strategic goals.

Experience Target

The experience level to achieve in the agreed time interval.

Experience Indicator

An experience factor that influences a good or bad employee experience (skills of agent, speed of service, etc.).

X-data

Experience data derived from Experience Indicators about how employees feel about whether the Experience Objectives are being delivered and Experience Targets met.

O-data

Operational data that helps people to analyze the experience data (related to the country, service, channel, provider, etc.).

Experience-based Services

Service desk services and desk-side support services. These are the services  that are measured with XLAs.

Current Experience

The experience being delivered now or the current baseline.

Starting Baseline

The average results from the first two months of the service delivery are seen as the starting baseline.

Old Baseline

The old experience level the customer had before this contract.

Experience Interval

The length of time between reviewing and setting targets. In this contractual agreement, this is six months.

The Global IT Experience Benchmark

The Global IT Experience Benchmark data collected from HappySignals customers’ employees’ experiences. This is used to understand your organization’s service quality compared to others.

HappySignals Platform

The HappySignals SaaS product is used to collect and analyze experience data.

HappySignals Surveys

Ready-made experience surveys with Happiness, Lost Time measures, and Experience Indicators

HappySignals ITXM Framework™

The HappySignals ITXM Framework™ practically describes the way of working, best practices, and progression model for IT experience management.

ITXM Certificate

The certificate is given on passing the HappySignals ITXM Framework™ exams, such as the ITXM Foundation Course.

ITXM Training

The training courses for understanding the basics of experience management (e.g., HappySignals ITXM or Experience Collab Foundation).

Experience Steering Meeting

The steering meeting is held every six months between the customer and service provider.

 

Experience management service description

Measurement

The service provider commits to measuring the experience of the employees consuming the experience-based services. This is achieved by implementing XLAs that measure the experience ambition(s)/objective(s).

The XLAs are implemented to measure the gap between the current experience and the experience ambition/objective.

Each XLA is made up of the following three different measurements that are scored and weighted separately:

  • X-data. This is collected via a regular or preferably continuous experience survey from HappySignals. This will be scored out of 10 or 100.
  • O-data. This is taken from the relevant ITSM tool that measures how well those consuming the services are supported with their issues and requests. The relevant SLAs/KPIs/measures will be scored out of 10 or 100 (e.g. first-time resolution, reassignment count, etc.).
  • T-data. This is taken from DEX tools that measure how well the technology the customer uses is performing. This will be scored out of 10 or 100.

Each of the XLA measures will be separately weighted and then aggregated into an overall XLA score out of 10 or 100. This score will be indicative of how well the experience ambitions/objectives are being delivered.
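As an illustration only (Python, with made-up weights and scores that the parties would in practice agree between themselves), the separately weighted X-, O-, and T-data measures could be rolled up into one overall XLA score like this:

```python
def overall_xla_score(scores, weights):
    """Aggregate weighted X-, O-, and T-data measures into one XLA score.

    scores: each measure scored out of 100 (or 10, as agreed).
    weights: the agreed relative weight of each measure; the weights
    are normalized here so they do not have to sum to exactly 1.
    """
    total_weight = sum(weights.values())
    return round(sum(scores[k] * weights[k] for k in scores) / total_weight, 1)

# Illustrative numbers only - real scores and weights are agreed by the parties
scores = {"x_data": 78, "o_data": 85, "t_data": 62}    # survey, ITSM, and DEX scores
weights = {"x_data": 0.5, "o_data": 0.3, "t_data": 0.2}

print(overall_xla_score(scores, weights))  # 76.9
```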

The service provider will send experience surveys via HappySignals to each employee consuming the experience-based services. For ticket-based services, surveying should be done after each interaction; for other services, a minimum of once per year.

The service provider will analyze and score each survey result in accordance with the Experience measures. For the same experience interval, the service provider shall also measure and score the relevant O-data and T-data for each employee who has used the experience-based services.

Each of these measures will be recorded and shown for the same time period. Each measure can then be documented in an XLA report highlighting the gap between the current experience and the experience ambition/objective.

Reporting

The service provider will provide an XLA report at the end of each experience interval. This report will cover:

  • An overall XLA score (weighted average of each score) for each XLA ambition/objective statement.
  • A breakdown of each score (X-data, O-data, and T-data) for each XLA ambition/objective statement.
  • Details of potential experience issues identified.
  • Recommendations on how the experience and XLA score can be improved.

Experience is a joint commitment between the service provider and the customer, so some of the following recommendations will be for:

  • The customer to deliver.
  • The service provider to deliver.
  • Joint delivery.
  • Other third parties.

The service provider will report and review the XLA scores at each experience Interval. It will also review the XLA scoring and measures with the customer on a six-monthly basis at the appropriate governance forum. In these meetings, the customer sets priorities for XLA improvements based on their strategy.

Where improvements are identified by the service provider, they’ll provide details of the experience improvement project and any possible impact on current services (including any additional charges). The customer acknowledges that such improvements will only be completed by the service provider in accordance with the formal change control procedures.

Service transition

During the service transition, the service provider will perform a baseline measure of the XLA performance based on the service provision prior to the service commencement date.

This baseline measure will provide the basis of the existing customer experience to enable the service provider to begin to produce an experience improvement roadmap.

Changes in the measures

The details of each of the measures (X-data, T-data, and O-data) will likely change as the process continues. XLAs constantly look for new insights, so experience survey questions might change, new experience-impacting data points might be found, or new weightings might be needed.

 

It’s the service provider’s responsibility to provide the measures and how they will be calculated – this includes:

  • The experience survey questions.
  • The O-data (experience-impacting SLAs/KPIs).
  • The T-data (experience-impacting technical data).
  • Scoring and weighting.

Changes in the details of the measures are to be expected. It’s the responsibility of the service provider to make these changes as required once agreed by both parties.

If changes are made to the measures, the service provider shall document these changes at each experience interval so that all parties are clear on what detailed measures are included and how they are scored and weighted.

Definitions


HappySignals

An experience management tool used for gathering X-data and managing and understanding IT Experiences. It is also possible to set and follow X-data based XLAs continuously.

Nexthink, Lakeside, 1E, etc (DEX products)

A device analytics tool used for gathering T-data

ServiceNow

A service management tool used for gathering O-data

PowerBI

Used for combining all X-data points and reporting

Service levels

The service provider will:

  • Run the XLA service as per the service description.
  • Run the XLA service for only the agreed experience-based services.
  • Run the XLA service at each agreed experience interval.
  • Provide a detailed XLA score and report at each experience interval.

Service credits

It’s acknowledged by all parties that XLAs will not be subject to any service credits or other penalties unless otherwise agreed in accordance with the change control procedures.

The customer will:

  • Provide a list of all employees and their contact details using the experience service to the service provider (for experience survey purposes).
  • Support the service provider in running proactive experience surveys for each employee based on agreed sample groups, as well as surveys for resolved tickets.

Responsibilities

The service provider will

  • Decide upon the appropriate experience measures, weightings, and scoring.
  • Report any changes to these to the customer.
  • Deliver the experience service in accordance with this document.
  • Provide an XLA report in accordance with this document.
  • Continuously analyze experience results to enhance their performance.
  • Review the employee experience with the customer at quarterly intervals.
  • Share employee feedback with agents whenever feedback is received.
  • Provide training to agents regarding HappySignals employee profiles.
  • Propose 2-6 improvement initiatives every 6 months.
  • Ensure that the service desk team members hold relevant experience certifications (e.g. HappySignals ITXM and/or Experience Collab Foundation).

 

The customer will

  • Provide details to enable the service provider to send out experience surveys via HappySignals.
  • Allow the use of the DEX tool on each device.
  • Review and analyze the employee experience with the service provider to improve their own operations.
  • Share experience results with IT leadership and business stakeholders.
  • Assist and/or initiate changes if experience issues are related to the work of the customer’s employees or third parties.
  • Communicate to employees how their feedback contributes to improving experience.
  • Prioritize experience targets in six-month intervals.
  • Ensure that the team responsible for collaboration with the service provider holds relevant experience certifications (HappySignals ITXM and/or Experience Collab Foundation).
