Discover the latest findings from the Global IT Experience Benchmark H1/2023
This report presents the findings from 860,339 end-user responses in over 130 countries. Discover our latest ITSM benchmarks as well as insights into response rates and the relationship between Time to Resolve and end-user perceived lost time with IT incidents.
If you're a CIO, IT Leader, Experience Owner, or Service Owner, you need to understand what the word "experience" refers to in the contexts of DEX, XLAs, and other three-letter acronyms.
End-user perceived lost work time per reassignment has increased, making it more important for IT to understand the impact of reassignments.
Response rates vary significantly across countries, providing valuable insights into where end-users may not believe their feedback matters.
The faster tickets are solved, the higher the response rate. Our data shows a linear relationship between how long a ticket is open and the response rate.
For the first time, remote work satisfaction surpasses IT service satisfaction, indicating strong end-user appreciation for remote work capabilities and benefits.
To navigate this landscape, IT organizations must first define and measure the user experience. At HappySignals, we prioritize a human-centric approach, where the end-user's perspective takes precedence. This perspective is similar to the experience of taking a car to the garage for a warning light and returning to find it fixed — for the end-user, the experience is the end-to-end view, while the diagnostic and resource planning steps are for internal purposes.
In the world of internal IT, users often lack choice in which technologies and services they can use. This means certain telemetry-based metrics, like usage time, don't always indicate more than how long the application was used. Just because someone uses an application extensively doesn't guarantee a positive experience; perhaps the end-user didn't know how to use the application correctly, and that was the reason for the long usage time. We believe a human-centric IT experience approach helps drive better decision-making and highlights improvements that matter most to end-users. While process and technology analysis play a role, they can't replace direct end-user feedback.
Choosing to prioritize understanding end-users means asking them how they feel, preferably on a continuous basis, to get a steady stream of experience data. Because this relies on end-users choosing to respond, some can and sometimes will choose not to provide feedback. This is why response rates are often a topic of discussion when new customers are onboarded.
Some feel that the reliance on end-users to provide feedback can be replaced with process and technology data, omitting end-user feedback for XLAs. We believe this can and has already started to lead to "XLA Watermelons," which are similar to the symptoms that created the need for the "experience movement" in the first place.
This was recently highlighted by Gartner in a study of how XLAs currently work in engagements with IT service providers. Gartner stated:
"... (IT) providers often disguise legacy SLAs as XLAs for clients, driving the wrong perception about the effectiveness of XLAs."
Clients often lack a proper understanding of what an XLA is and, therefore, are unable to articulate their requirements properly, leading to uncertainties and dissatisfactions related to implementation.
It can sound cliché, but humans are still the best sensor when evaluating the actual quality of IT experience. While technology metrics and data analysis can provide valuable insights, they can never fully capture the nuances and emotions that users experience. The subjective feedback from end-users is essential to understand how IT services are truly perceived and utilized.
By prioritizing the voice of the end-user, organizations can gain a deeper understanding of their needs, frustrations, and expectations. This enables IT decision-makers to make informed choices that align with the actual user experience, leading to improvements that matter most to the people using the technology.
Choosing a human-centric IT experience acknowledges that technology alone cannot fully capture the user's perspective.
We hope that this benchmark report can help you, as representatives of IT organizations, better understand how you can improve your IT experience in alignment with end-user needs and expectations.
The Global IT Experience Benchmark H1/2023 Report presents and analyzes data from 860,339 end-user responses collected through the HappySignals IT Experience Management Platform between January and June 2023.
On this webpage, we primarily included the 2023 data along with a few selected trends over a longer period of time.
Our benchmark data is collected from all HappySignals customers. These include large enterprises, as well as Managed Service Providers (MSPs) who use the HappySignals Platform with their customers – enterprises and public sector organizations.
Around 60% of HappySignals customers are using Outsourced Service Desk providers.
All responses are from IT end-users – employees using internal IT services – and reflect their feelings and perceptions about IT.
HappySignals IT Experience Management Platform connects operational data (e.g. from customers’ IT service management (ITSM) platforms) with continuous survey data from end-users about Ticket-based IT and Proactive IT areas.
Ticket-based IT (Incidents and Requests):
End-user responses are immediately collected when tickets are resolved. Surveys are sent after each ticket, asking end-users to accept the resolution by giving feedback about their experience. The average response rate for HappySignals customers in 2023 is around 22%, with variations between different companies and geographies.
Proactive IT:
Surveys are sent proactively to end-users about Proactive IT areas (e.g. Overall IT Experience, Enterprise Applications, Laptops and Computers, Remote Work, Office Environment) rather than in connection with tickets. These surveys can be scheduled to target relevant end-users at optimal frequencies, enabling continuous measurement of non-ticket-based IT areas.
Happiness:
End-users rate how happy they are with the IT area being measured (e.g. recent ticket-based service experience, Enterprise Applications, Mobile Devices, etc) on a scale from 0-10.
HappySignals then calculates Overall Happiness as the % of 9-10 scores minus the % of 0-6 scores, giving a number between -100 and 100.
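As a minimal sketch (not the platform's actual implementation), the Happiness calculation looks like this:

```python
def overall_happiness(scores):
    # Overall Happiness = (% of 9-10 scores) - (% of 0-6 scores), from -100 to 100.
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Example: of five ratings, three are 9-10 (60%) and one is 0-6 (20%) -> +40
ratings = [10, 9, 9, 7, 3]
```

Note that 7-8 ratings count toward neither group, which is why the score can sit far below 100 even when most responses are mildly positive.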
Productivity:
End-users estimate how much work time they lost due to the IT touchpoint being measured.
Factors:
End-users select from a list of suggested reasons – which we call Factors – that influenced their Happiness rating. Multiple factors can be selected. These factors could be seen as experience indicators, a term also used in the IT Experience Management area.
The surveys automatically tailor the factors shown to each end-user depending on what IT area is being measured and whether the Happiness rating given in the first question was positive, negative, or neutral. Examples of factors include “It was difficult to know where to start” (Ticket-based Services) and “Applications are too slow” (Enterprise Applications).
IT Happiness across all measurement areas
Measurement Areas | H2/2022 Happiness | H1/2023 Happiness |
Overall IT Experience | +39 | +39 |
Services | +79 | +80 |
Collaboration with IT | +85 | +83 |
Remote Work | +80 | +82 |
Service Portal | +43 | +46 |
Office Environment | +42 | +43 |
Mobile Devices | +9 | +5 |
Laptops and Computers | +20 | +15 |
Enterprise Applications | +15 | +9 |
NB! Numbers may vary from the last report due to data quality improvements, which have been applied to historical data as well for better comparability. Scores are calculated with the same mathematical model as NPS. (Read about What is the difference between NPS and HappySignals?)
What is the business impact of ITXM on the Overall IT Experience?
One of the common mistakes is assuming which IT touchpoints make IT end-users happy. New customers are often surprised by the touchpoints highly rated by their end-users. Contrary to popular belief, IT services are frequently among the most highly rated IT areas. If the goal is to enhance the overall employee experience with digital technologies, it's crucial to be aware of which areas are liked the most and the least. Real-time experience data across different IT touchpoints provides valuable insights that facilitate conversations between IT, HR, and business functions. Having a comprehensive understanding of the IT experience enables the company to allocate resources based on employee feedback data rather than the gut feelings of leadership team members. This, in turn, leads to a higher success rate in digital transformation projects.
The Overall IT Experience survey is an alternative to annual IT surveys. It provides continuous insights into how people generally feel about IT and the amount of time end-users perceive as lost each month due to IT-related issues. After end-users provide their scores on a scale of 0 to 10, they're asked to identify which specific aspects of IT influenced their ratings.
Below, we present the findings of the areas that end-users selected as contributing factors for their given scores. This data offers a quick and digestible view of the factors shaping end-user perceptions of IT services.
Based on the percentages of factors selected by end-users giving a neutral score (7-8) for IT service management, we see some differences compared to the positive scores.
Based on the percentages of factors selected by end-users giving negative scores (0-6) for IT service management, here are some conclusions you might draw:
End-user happiness with IT services is an outcome influenced by a combination of factors, including both people and processes. Positive experiences are largely driven by the quality of IT support services, the attitude of IT personnel, and effective communication and training, highlighting the paramount role of the human element in delivering satisfactory IT services. However, the analysis also reveals that hardware- and software-related concerns, such as Computers & Equipment and Work Applications, are equally vital in shaping both neutral and positive experiences.
In neutral experiences, the focus shifts to maintaining a balance between addressing hardware and software issues while ensuring the quality of IT support services. This suggests that neutral satisfaction is a result of addressing both people-centric and process-centric factors, emphasizing the significance of both the human touch and effective operational procedures.
In contrast, negative experiences are primarily process-related, with IT support services, hardware problems, and software applications playing central roles in shaping dissatisfaction. This highlights the critical need for process improvement, but it doesn't negate the importance of the human element in delivering IT services.
Ultimately, IT service management needs to consider a holistic approach to ensure a positive IT experience for end-users in a large enterprise setting.
Regional differences for incidents and requests
We observe interesting regional differences in how Happiness and Lost Time vary between incidents and requests.
These numbers appear to suggest that Western European end-users are more demanding in handling incidents, but North American end-users have higher expectations for requests. We’ll continue to track these numbers and report more in-depth in future reports.
Findings for regional differences in experience with resolved requests
Different cultures perceive and evaluate IT services in different ways. A specific score in one region is not directly comparable to the same score in another region. Having comparable benchmark data helps set expectations and provides an external angle for a better understanding of end-user experience.
How to use this information in practice
IT service desk leaders can compare the scores to the country benchmark data to choose which countries to focus on. Using the comparison to benchmark data (in addition to internal averages) can help avoid pushing agents toward unachievable goals or, conversely, avoid getting too comfortable in regions where higher scores are culturally more common.
Understanding the variation in IT survey response rates across different countries provides a perspective that the overall average fails to capture. These variations can be attributed to cultural differences, local work dynamics, and user expectations. Taking these considerations into account allows you to work specifically on communication methods and support approaches to encourage end-users to provide feedback. Making sure end-users know their feedback actually matters is the best way to drive higher response rates.
In simple terms, if survey recipients don't believe their responses will make a difference, their motivation to fill out the survey will be low.
McKinsey studied this in more detail in this article, concluding:
"A common belief is that survey fatigue is driven by the number and length of surveys deployed. That turns out to be a myth. We reviewed results across more than 20 academic articles and found that, consistently, the number one driver of survey fatigue was the perception that the organization wouldn’t act on the results."
Therefore, when looking at response rates across different countries, we encourage you to consider if the end-users in low response rate locations really feel that their voice matters as much as those in high response rate countries.
In our benchmark report, we looked at the initial response rate data and scratched our heads. It seemed as though noise from different sources was impacting the data. After consideration, we opted to use the "average response rate per user" metric when assessing post-incident IT surveys, driven by our desire to understand the response rate a typical end-user provides.
The standard way of calculating response rates is [surveys completed / surveys sent = response rate]. However, this method is less reliable in a ticket-based IT survey scenario, where there is no limit to how many surveys a single user can receive.
Could you limit the number of surveys sent to a single end-user? Sure! But that could mean that a ticket that was business-critical for the user offers no opportunity to give feedback, while the three previous tickets for simple things did.
To illustrate this approach, consider the following scenario:
Calculating a straightforward overall response rate by summing all responses and surveys could be misleading due to potential outliers, such as super-users or automated monitoring inboxes. Why not just check the data quality and keep it perfect? That takes time, and the data can shift with any number of environmental changes.
Computing the average of averages drastically reduces the impact of these outliers, giving us a clearer representation of the typical response rate per end-user and a more comparable reflection of response rates. This method effectively minimizes the influence of specific users with high ticket volumes, system-generated responses, or super-users who raise tickets on behalf of end-users.
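The difference between the two calculations can be shown with a small sketch (hypothetical numbers: each end-user maps to a pair of surveys received and responses given):

```python
# Hypothetical data: one automated monitoring inbox that never responds,
# plus ten ordinary users who each answer 1 of their 4 surveys.
users = {"monitoring-inbox": (1000, 0)}
users.update({f"user{i}": (4, 1) for i in range(10)})

def overall_rate(surveys_per_user):
    # Naive calculation: total responses / total surveys sent.
    # A single high-volume outlier dominates the result.
    total_sent = sum(sent for sent, _ in surveys_per_user.values())
    total_responses = sum(resp for _, resp in surveys_per_user.values())
    return total_responses / total_sent

def rate_per_user(surveys_per_user):
    # Average of per-user response rates: every end-user weighs equally.
    rates = [resp / sent for sent, resp in surveys_per_user.values() if sent]
    return sum(rates) / len(rates)
```

Here `overall_rate(users)` comes out under 1%, while `rate_per_user(users)` is about 23%: the monitoring inbox no longer drowns out the ten ordinary users.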
The typical response rate overall hides a more nuanced reality across different business units and locations. When we analyzed customer data with response rates across various offices in different countries, we found that the variations can be very high.
We compared the country-specific response rates within customer environments and found variations of response rates between 5% and 45% within contexts that, in theory, have a standardized IT service delivery regardless of location.
Just like with IT happiness, there seem to be very human influencing factors when it comes to response rates as well.
Response rates alone do not tell the whole story. The number of responses needed to reach a 95% confidence level also depends on how large the surveyed end-user base is.
Below is a table from "The Good Research Guide" by Martyn Denscombe that outlines what kind of response volumes are needed to reach statistical significance levels.
The numbers in each column for x% margin of error indicate the required number of responses
Number in the population | 5% margin of error | 3% margin of error | 1% margin of error |
50 | 44 | 48 | 50 |
100 | 80 | 92 | 99 |
250 | 152 | 203 | 244 |
500 | 217 | 341 | 475 |
1,000 | 278 | 516 | 906 |
5,000 | 357 | 879 | 3,288 |
10,000 | 370 | 964 | 4,899 |
100,000 | 383 | 1,056 | 8,763 |
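The table values are consistent with Cochran's sample-size formula combined with a finite population correction, so you can approximate the required responses for your own population size (a sketch under that assumption; z = 1.96 for 95% confidence, p = 0.5 for maximum variance):

```python
def required_responses(population, margin_of_error, z=1.96, p=0.5):
    # Cochran's sample size for an unlimited population at the given
    # confidence level (z) and maximum variance (p = 0.5).
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction scales the requirement down
    # for small populations.
    return round(n0 / (1 + (n0 - 1) / population))

# required_responses(1000, 0.05) -> 278, matching the table above
```

Note how the requirement saturates: going from 10,000 to 100,000 people barely changes the responses needed at a 5% margin of error, which is why even modest response rates can be statistically meaningful in large organizations.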
Our initial hypothesis was that new employees would be more likely to respond to surveys from IT than employees who had been longer in the company. This proved to be false.
According to our data, employees who have been in the company for more than four years are both the largest group of employees and the most likely to respond to IT surveys. The numbers below reflect the change in response rate compared to the average.
The data below can give you an indication of what you might find in your own data. If your average response rate is 20%, according to our benchmark data, the response rate for more experienced employees with more than four years in the company would be 25.57% and new employees with less than six months in the company would have a response rate of 18.98%.
Employment age (tenure) | Response rate compared to average | Sample size |
Employed 0-6 months | -1.02 % | 53,304 |
Employed 6-12 months | -0.33 % | 49,221 |
Employed 1-2 years | -0.22 % | 74,878 |
Employed 2-3 years | +0.47 % | 52,321 |
Employed 3-4 years | -0.12 % | 41,306 |
Employed 4+ years | +5.57 % | 231,700 |
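The tenure deltas above are percentage points relative to your own average, so applying them is simple addition (a trivial sketch using the report's own 20% example):

```python
def expected_response_rate(own_average, benchmark_delta):
    # The benchmark deltas are percentage points added to (or subtracted
    # from) your own average response rate, not relative multipliers.
    return own_average + benchmark_delta

# With a 20% average: 4+ years tenure -> 25.57%, 0-6 months -> 18.98%
```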
Key finding: The longer a ticket is open, the lower the response rate is.
As tickets remain open for a longer duration, it's only logical that the response rate on post-incident IT surveys decreases. This can be attributed to various factors, such as the fading urgency of the issue, a decrease in the user's motivation to provide feedback, or simply the passage of time leading to forgetfulness. It's essential for IT teams to understand this correlation between ticket duration and response rate, as it allows them to gauge the effectiveness and relevance of their surveys. By recognizing the impact of ticket duration on survey participation, IT departments can adapt their strategies and prioritize prompt resolution to ensure higher engagement and response rates from end-users.
For this data, we looked at a longer time period of 12 months, from October 2022 to October 2023.
Time to Resolve | Response rate compared to average | Sample size |
1 day | +5.63 % | 435,428 |
2 days | +3.24 % | 160,473 |
3 days | +2.49 % | 106,537 |
4 days | +2.17 % | 91,790 |
5 days | +2.16 % | 78,917 |
6 days | +1.81 % | 75,198 |
1 week | +1.36% | 75,227 |
2 weeks | +1.04% | 158,316 |
3 weeks | +0.27% | 81,245 |
1 month | -0.37% | 58,391 |
2 months | -1.28% | 64,501 |
3 months | -2.27% | 26,750 |
4 months | -2.75% | 15,443 |
5 months | -3.42% | 10,007 |
More than 5 months | -5.65% | 29,107 |
How MTTR changes by score and perceived lost time
Some interesting patterns emerged when looking at how long a ticket is open in calendar time and comparing it to the human perspective of end-users, independent of SLA-clock pauses.
When end-users estimate a loss of five working days (40 hours), the median resolution time can stretch to 221 calendar hours, roughly 80 working hours. This connection between user-reported time loss and ticket duration offers valuable insights for IT service desk teams in resource allocation, particularly when users perceive significant productivity impacts due to extended ticket lifespans.
Estimated work time lost | Median time to resolve |
15 min | 40 minutes |
1 hour | 6 hours |
6 hours | 26 hours |
40 hours | 221 hours |
First, let's look at different scores and how the median time to resolve and average perceived time lost change. All times have been rounded up to the closest full hour.
We observe an interesting disconnect between perceived lost time and time to resolve. While the end-user perceived experience (scores 0-10) and perceived lost time correlate linearly, there is an interestingly short time to resolve on tickets where the score is 0 or 1.
What would cause this disconnect?
Score given | Average perceived time lost | Median time to resolve |
0 | 16 hours | 55 hours |
1 | 14 hours | 94 hours |
2 | 12 hours | 140 hours |
3 | 10 hours | 141 hours |
4 | 10 hours | 120 hours |
5 | 9 hours | 99 hours |
6 | 8 hours | 104 hours |
7 | 7 hours | 89 hours |
8 | 4 hours | 49 hours |
9 | 2 hours | 24 hours |
10 | 2 hours | 4 hours |
The end-user perception of lost work time can highlight areas within the business where IT issues impact employee productivity.
The table below shows when and where the end-user perception of time lost best correlates with how long a ticket is open.
Reading the table below can be a bit tricky. Think of the ratio of lost time to Time to Resolve (TTR) as an indicator of business criticality: when the ratio is high, every hour the ticket remains open significantly hinders the end-user's ability to work productively, suggesting the service is crucial for the business. To understand what this would mean for your IT organization, having access to experience data allows diving into the operational data and other details to shed more light on the causes.
Our initial analysis is that these tickets could be worth prioritizing, as they're likely to have the highest impact on end-user productivity and potentially the business as a whole.
Score given | Average perceived time lost | Ratio lost time / TTR | Median time to resolve |
0 | 16 hours | 29% | 55 hours |
1 | 14 hours | 14% | 94 hours |
2 | 12 hours | 8% | 140 hours |
3 | 10 hours | 7% | 141 hours |
4 | 10 hours | 8% | 120 hours |
5 | 9 hours | 9% | 99 hours |
6 | 8 hours | 8% | 104 hours |
7 | 7 hours | 7% | 89 hours |
8 | 4 hours | 9% | 49 hours |
9 | 2 hours | 10% | 24 hours |
10 | 2 hours | 39% | 4 hours |
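For clarity, the ratio column is simply perceived lost time divided by median time to resolve. (The published percentages appear to be computed from unrounded values, so recomputing from the rounded hours shown will not match every row exactly.) A minimal sketch:

```python
def criticality_ratio(lost_hours, ttr_hours):
    # Share of the ticket's open time that the end-user
    # perceived as lost work time.
    return round(100 * lost_hours / ttr_hours)

# Score 0 row: 16 hours lost over a 55-hour median TTR -> 29%
```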
Incidents and requests
The multi-year upward trend in happiness with IT incident resolutions has stabilized. Average lost time with IT incidents moves up or down by a few minutes between 6-month periods, but, all in all, the changes in H1/2023 were minor.
The stable average happiness in our benchmark report might suggest that experience management at some point stagnates and improvements become increasingly hard to find, but the overall average hides the fluctuations over time in customer data.
As you can see in the graph below, different companies start from very different starting points. As you look at it, remember that neither the top performers nor the bottom ones are the same companies throughout. Things happen in IT; as a result, even the best IT organizations can see dips in IT experience as perceived by their end-users.
Notable in the graph below is how, after 18 months of IT Experience Management, not a single customer has an average score of below +40, which is a remarkable improvement over hovering around +10-20.
The data analyzed here contains most of our customers, but we have excluded certain outliers that started from non-comparable situations.
This graph, together with the following customer example graph, aims to highlight how experience is dynamic. The experience management journey looks different for each of our customers.
The graph below is from one of our customers who started their IT Experience Management journey in September 2022. They have managed to improve their end-user experience in the first year, but as you can see, experience is dynamic. It changes daily, and understanding the dips and peaks allows IT to improve, step-by-step, improvement by improvement.
The definition of lost time: End-users estimate how much work time they lost due to the service experience.
The data below was first reported in the H1/2022 Benchmark report but is still very relevant. The section below is identical to the previous report.
Looking more closely at lost time across all incidents, we can see that the distribution of perceived lost time is unevenly represented on both sides of the spectrum. The historical trend shows a polarization in the speed of service: tickets causing under eight hours of lost time are solved even faster year after year, while tickets causing more than eight hours of lost time take ever longer.
This explains the paradox of rising average lost times in the last two reports. Most end-users receive faster IT support with IT incidents, but those who don't have to wait increasingly long. The ever-longer waiting times for tickets causing more than eight hours of perceived lost time push the average lost time up.
When IT identifies where end-users are losing time, they will find improvement opportunities that greatly impact every issue that gets solved. Understanding where people lose only small amounts of productive work time allows IT to identify automation candidates that would free up agents' time to solve more complicated issues.
Understanding the reasons behind end-user's dissatisfaction with ticket-based IT services is crucial for improving them, and the HappySignals IT Experience Management Platform excels in identifying these factors.
Using a standardized list of Factors developed from research with IT end-users, we ask end-users to select the factors that best reflect their satisfaction or dissatisfaction with the service in a survey sent to them after a ticket resolution. The timing of the survey delivery is optimized to assess end-user feelings about the service at the moment of their experience, making our Factors data more reliable.
Different factors are presented to end-users depending on their happiness rating on a 10-point scale, and they can select as many factors as they wish from the list. Factors related to service agents are included in all three scenarios of negative (0-6), neutral (7-8), and positive (9-10) experiences. For example, for IT incidents, 75% of responders who gave a positive happiness rating and selected at least one factor were happy about the speed of service.
End-users can select multiple factors, thus making the percentages add up to more than 100%. By monitoring and analyzing these factors, IT organizations can gain valuable insights into end-user satisfaction and identify areas for improvement in their ticket-based services.
The factors that create positive, neutral, and negative experiences with IT Incidents for end-users remain very stable.
IT Incidents - Positive Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Speed of service | 75% | 74% | 74% | 75% | 75% |
Service personnel's attitude | 52% | 55% | 56% | 55% | 55% |
Service personnel's skills | 48% | 49% | 50% | 49% | 50% |
Service was provided proactively | 28% | 34% | 36% | 37% | 38% |
I was informed about the progress | 29% | 33% | 35% | 35% | 36% |
I learned something | 21% | 25% | 26% | 26% | 26% |
IT Incidents - Neutral Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Speed of service | 58% | 57% | 55% | 55% | 55% |
I had to explain my case several times | 20% | 21% | 21% | 21% | 21% |
It was difficult to know where to start | 11% | 11% | 12% | 12% | 12% |
I wasn't informed about the progress | 11% | 11% | 10% | 10% | 10% |
Service personnel's skills | 10% | 8% | 8% | 8% | 8% |
Instructions were hard to understand | 7% | 7% | 8% | 8% | 8% |
Service personnel's attitude | 7% | 6% | 7% | 7% | 6% |
IT Incidents - Negative Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
My ticket was not solved | 40% | 46% | 47% | 49% | 51% |
Service was slow | 47% | 44% | 44% | 43% | 41% |
I had to explain my case several times | 29% | 30% | 29% | 30% | 29% |
I wasn't informed about the progress | 16% | 16% | 16% | 16% | 17% |
Service personnel's skills | 12% | 12% | 12% | 12% | 11% |
Instructions were hard to understand | 7% | 8% | 8% | 8% | 8% |
It was difficult to know where to start | 7% | 7% | 7% | 7% | 7% |
Service personnel's attitude | 6% | 7% | 7% | 7% | 7% |
In service delivery, IT teams must also develop channels to enhance end-user satisfaction. To create channels that improve employee happiness, it's essential to obtain reliable and detailed experience data about how end-users utilize and perceive different channels.
Without acquiring and utilizing this data, IT teams may mistakenly allocate resources to add new channels unnecessarily, encourage end-users to use them, or focus on improving channels already performing well instead of those requiring attention.
Our channel usage data reflects the recent trend in the ITSM industry of developing channels with automation and predefined user flows to reduce the workload on service desk agents. This trend is expected to continue as IT organizations strive to improve efficiency while enhancing the overall customer experience. Investments in service portals, smart AI-powered chats, and proactive monitoring of services with self-healing capabilities all aim to optimize the use of technology across different teams.
However, we advise against losing sight of end-user needs by continuously monitoring how their experience changes when support channel recommendations and usage are modified. If possible, establish a baseline for experience data before the change, track changes during the transition, and draw conclusions by assessing the experience a few months after implementation.
Note that the total percentages don't add up to 100% because we exclude channel categories that cannot be accurately categorized into the existing five categories.
Based on the data from all our customers, there are only slight differences in overall happiness with the digital channels – Chat, Email, Phone, and Portal (all range from +74 to +81). The only channel with significantly higher happiness is Walk-in (+94). The perception of lost time is also by far the lowest for Walk-in IT support, with just 1h 24min on average per incident, 1h less than the second least time-consuming channel, Phone.
Numbers rounded up to the closest full percent.
Channel usage for IT Incidents | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Chat | 8% | 9% | 9% | 9% | 8% |
19% | 17% | 15% | 15% | 12% | |
Phone | 30% | 30% | 27% | 25% | 25% |
Portal | 33% | 33% | 38% | 39% | 39% |
Walk in | 6% | 3% | 3% | 4% | 5% |
Other | 4% | 8% | 8% | 8% | 11% |
Each time a ticket is reassigned, end-user happiness decreases by nearly eight points, and end-users lose an average of 1 hour and 42 minutes of additional work time (measured across 0 to 4 reassignments). A ticket reassigned four times can result in a total loss of 8 hours and 22 minutes!
Our data, collected over the past four years, has shown consistent trends in the frequency of ticket reassignments and the corresponding impact on end-user happiness and lost time. Over the years, the amount of time end-users lose with each reassignment has increased, while the number of reassignments has decreased for most customers.
This is one of the areas where IT Experience data has the most potential to help IT teams get quick wins in increasing end-user productivity by ensuring incidents are directed to the right teams as soon as possible.
The percentages for tickets reassigned three or four times do vary, but rounding to the closest percent makes their proportion of all tickets appear unchanged.
Reassignments | H1/2020 | H2/2020 | H1/2021 | H2/2021 | H1/2022 | H2/2022 | H1/2023 | |
0 | 56% | 55% | 52% | 51% | 53% | 54% | 55% | |
1 | 27% | 27% | 30% | 31% | 30% | 30% | 29% | |
2 | 7% | 8% | 9% | 9% | 9% | 8% | 8% | |
3 | 3% | 3% | 3% | 3% | 3% | 3% | 3% | |
4 | 1% | 1% | 1% | 2% | 1% | 1% | 1% |
Happiness by reassignments | H1/2020 | H2/2020 | H1/2021 | H2/2021 | H1/2022 | H2/2022 | H1/2023 |
0 | +76 | +79 | +81 | +82 | +81 | +81 | +82 | |
1 | +67 | +70 | +75 | +77 | +76 | +77 | +78 | |
2 | +58 | +62 | +65 | +68 | +68 | +68 | +69 | |
3 | +48 | +51 | +54 | +61 | +60 | +63 | +64 | |
4 | +43 | +45 | +46 | +50 | +51 | +51 | +51 |
Lost time by reassignments | H1/2020 | H2/2020 | H1/2021 | H2/2021 | H1/2022 | H2/2022 | H1/2023 |
0 | 2h 9min | 1h 53min | 1h 45min | 1h 54min | 1h 55min | 2h 6min | 2h 2min | |
1 | 4h 1min | 3h 41min | 3h 23min | 3h 28min | 3h 37min | 3h 38min | 3h 37min | |
2 | 6h 6min | 5h 23min | 5h 0min | 5h 10min | 5h 9min | 5h 45min | 5h 44min | |
3 | 7h 56min | 7h 9min | 6h 27min | 6h 49min | 7h 7min | 7h 5min | 7h 5min | |
4 | 9h 51min | 8h 2min | 8h 29min | 8h 16min | 8h 15min | 9h 23min | 8h 22min |
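The cost of each additional reassignment can be read directly off the lost-time table. As a minimal sketch, the snippet below takes the H1/2023 column from the table above and computes the marginal lost time added by each reassignment; the averaging method here is our own illustration, not necessarily how the report's headline figure is derived.

```python
# Marginal lost work time per ticket reassignment,
# using the H1/2023 column of the lost-time table above.

def to_minutes(hours, minutes):
    """Convert an 'Xh Ymin' value to total minutes."""
    return hours * 60 + minutes

# reassignments -> average end-user lost time (H1/2023)
lost_time = {
    0: to_minutes(2, 2),    # 2h 2min
    1: to_minutes(3, 37),   # 3h 37min
    2: to_minutes(5, 44),   # 5h 44min
    3: to_minutes(7, 5),    # 7h 5min
    4: to_minutes(8, 22),   # 8h 22min
}

# Extra minutes added by the 1st, 2nd, 3rd, and 4th reassignment
deltas = [lost_time[n] - lost_time[n - 1] for n in range(1, 5)]
avg_delta = sum(deltas) / len(deltas)

print(deltas)     # minutes added per additional reassignment
print(avg_delta)  # average extra minutes per reassignment
```

For H1/2023 this yields an average of 95 extra minutes per reassignment, in the same range as the roughly 1h 42min per reassignment cited across periods.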
Different Support Profiles have different service expectations
While we've covered the experiences of IT end-users in previous sections, it's important to note that there are also differences in behavior and motivation among them. Knowing these differences can help tailor IT services for different types of end-users.
This is where HappySignals IT Support Profiles can be useful.
We conducted interviews with over 500 end-users and found that two main behavioral drivers, Competence and Attitude, have the greatest impact on end-user behavior and experience. Competence refers to the end-user's capability to fix IT issues independently, while Attitude pertains to their willingness to solve the problem independently.
By mapping these behavioral drivers, we defined four Support Profiles: Doer, Prioritizer, Trier, and Supported. For more information on using these profiles in the IT Service Desk, refer to our Guide.
Consistent with previous years, Doers again have the lowest Happiness of +75, while Supported are still the happiest with Happiness of +86.
One interesting change in the data is the diminishing share of Doers and Supported across the benchmark data. Prioritizers, on the other hand, have increased. This suggests more people today are competent IT end-users, yet they still prefer IT to fix their issues for them.
Observing how different support profiles utilize various channels confirms the behavioral drivers identified in our original research about IT Support Profiles. The data on IT incident channel usage by different profiles reveals the following patterns:
Self-service portals are the preferred channel for Doers, who enjoy solving issues themselves and are least likely to use the Phone.
Prioritizers, who prefer minimal participation in issue resolution, use the Phone more frequently than other support profiles.
Supported and Triers use Walk-in IT support 30-35% more often than Doers and Prioritizers, reflecting their preference for personal assistance and learning from patient service personnel.
For further information on how to customize services to serve different end-users in the organization better, we suggest downloading our Definitive Guide on IT Support Profiles.
The image below is from our previous report, but the profile characteristics and preferences haven't changed.
What is the business impact of understanding end-user IT Support Profiles?
Although you can't change your end-users, you can customize your IT services to suit various support profiles. One way to do this is by adjusting how service agents communicate with each profile when they reach out to the service desk. For instance, Doers and Prioritizers may prefer technical jargon, while Supported and Triers may benefit from simple language and step-by-step instructions. Another approach is to analyze the data by profile to identify which channels work best for each group, then develop and promote those channels to the relevant end-user profiles. Check out our comprehensive guide to learn more about using support profiles to enhance ticket-based services!
We traditionally compare consecutive periods to track trends over time, but to offer a perspective not presented before, the graph below shows only the first six calendar months of each of the last four years. While the data does not show anything dramatically different from previous reports, it reveals overall happiness improving very slowly alongside increasing lost time per request. This is not unexpected, as IT organizations are actively promoting self-service to reduce the IT time spent on request management. While self-service reduces IT resource needs, it also shifts some of the time spent on requests onto the IT end-user's work time.
Channel usage for IT Requests | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Chat | 1% | 2% | 3% | 2% | 2% |
Email | 8% | 8% | 9% | 6% | 3% |
Phone | 20% | 18% | 12% | 13% | 11% |
Portal | 63% | 54% | 57% | 57% | 60% |
Walk in | 1% | 2% | 1% | 1% | 1% |
Other | 7% | 16% | 18% | 21% | 23% |
The saying “Technology changes, People stay the same” rings true in our data. The Factors that create positive, neutral, and negative experiences with IT Requests for end-users have remained stable over the last four years.
The only slight change is a decrease in how often service personnel's attitude and skills are selected as factors, which may well be explained by the growing share of requests that don't require service personnel to intervene; instead, requests are increasingly handled in self-service portals.
IT Requests - Positive Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Speed of service | 79% | 79% | 80% | 80% | 80% |
Service personnel's attitude | 49% | 48% | 47% | 45% | 45% |
Service personnel's skills | 46% | 46% | 45% | 44% | 44% |
I was informed about the progress | 31% | 34% | 36% | 34% | 36% |
It was easy to describe what I wanted | 31% | 32% | 33% | 33% | 34% |
Instructions were easy to understand | 29% | 31% | 32% | 32% | 32% |
IT Requests - Neutral Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Speed of service | 58% | 57% | 58% | 56% | 55% |
I had to explain my case several times | 15% | 15% | 15% | 15% | 15% |
It was difficult to know where to start | 12% | 11% | 12% | 11% | 13% |
I wasn't informed about the progress | 11% | 12% | 11% | 11% | 11% |
It was difficult to describe what I needed | 8% | 8% | 8% | 9% | 8% |
Instructions were hard to understand | 8% | 8% | 7% | 8% | 8% |
Service personnel's skills | 7% | 6% | 6% | 7% | 7% |
Service personnel's attitude | 5% | 4% | 6% | 6% | 5% |
IT Requests - Negative Factors | 2019 | 2020 | 2021 | 2022 | H1/2023 |
Service was slow | 55% | 57% | 56% | 55% | 52% |
I had to explain my case several times | 33% | 31% | 31% | 31% | 33% |
I wasn't informed about the progress | 23% | 29% | 27% | 26% | 26% |
Service personnel's skills | 17% | 16% | 13% | 13% | 14% |
Instructions were hard to understand | 12% | 12% | 11% | 12% | 13% |
It was difficult to know where to start | 10% | 10% | 10% | 10% | 11% |
Service personnel's attitude | 8% | 9% | 8% | 8% | 9% |
It was difficult to describe what I needed | 7% | 6% | 7% | 7% | 7% |
Based on our 2023 research data on ITXM (IT Experience Management), a human-centric approach to experience management can have significant benefits for organizations. Our analysis of this data from different angles presents a holistic snapshot of end-user experiences in enterprise IT. It's worth noting that the data used in this report is solely from HappySignals customer organizations that have embraced a human-centric approach to experience management.
One of the main benefits of ITXM is its ability to empower enterprise IT leaders to drive data-driven change. By prioritizing the needs and experiences of employees, IT leaders can make informed decisions that improve productivity and overall business outcomes.
Additionally, a focus on human-centric experience management can help organizations transform their IT culture, making it more empathetic and responsive to employee needs.
Another key benefit of ITXM is its impact on employee happiness. By prioritizing a positive experience for employees, organizations can improve employee retention rates and reduce turnover costs.
Happy employees are more engaged and productive, which can lead to improved business outcomes. By prioritizing employee experiences and focusing on data-driven decision-making, organizations can achieve better outcomes for both employees and the business as a whole.
Intrigued? Discover experience management by reading the IT Experience Management Framework (ITXM™) Guide. This downloadable 10-page read introduces ITXM™ and how to lead human-centric IT operations with experience as a key outcome.
Do you prefer learning through short video courses? Check out our ITXM Foundation Training & Certification Course, where in about an hour you can learn the foundations of IT Experience Management and get certified for free.
If you enjoyed this report, you may also want to visit our Learning Center for bite-sized videos and blog posts about topics from XLAs to optimizing your ServiceNow solution.
Read our previous Global IT Experience Benchmark Reports.
2022:
2021: