By Brandon Carter, Service Desk Manager
Let’s be honest — finding reliable, meaningful data about how well an MSP (Managed Service Provider) performs is tough. Even for those of us in the industry, it’s not always clear which stats actually matter, or how to compare one provider to another.
That’s why we’ve decided to start publishing our Service Desk performance stats publicly. Not just the good ones — all of them. Because we believe transparency is the first step toward trust.
The problem with MSP performance stats
There’s no universal standard for what MSPs should report. Some publish response times, others focus on resolution rates, and many don’t publish anything at all. Even when they do, definitions vary wildly.
Take “Average Time to Respond.” It sounds simple, but what counts as a response? Your automated ticket confirmation? A ticket status update? A message asking for more info?
At One2Call, we define “response” as the moment a customer receives a meaningful reply — advice, action, or a clear step toward resolution. It’s a tougher standard, and yes, it can make our numbers look slower. But it tells us much more about our customers' actual experience.
And even then, this stat isn’t perfect. It’s a straight average across all tickets, regardless of priority. So while high-priority tickets are dealt with much faster, a flood of low-priority tickets in a busy month can skew the average.
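To make that concrete, here's a toy sketch in Python. The numbers are made up for illustration (they are not our real ticket data), but they show how a handful of slow, low-priority tickets can drag a straight average well above what high-priority customers actually experienced:

```python
# Illustrative response times in hours -- hypothetical, not real One2Call data.
high = [0.25, 0.5]          # high-priority tickets, answered fast
low = [4.0, 5.0, 6.0, 7.0]  # a busy month's flood of low-priority tickets

# A straight average lumps everything together...
overall = sum(high + low) / len(high + low)

# ...while a per-priority breakdown tells a much clearer story.
per_priority = {
    "high": sum(high) / len(high),
    "low": sum(low) / len(low),
}

print(f"Straight average: {overall:.2f} hrs")
print(f"Per priority: {per_priority}")
```

Here the straight average comes out around 3.79 hours, even though every high-priority ticket was answered in half an hour or less, which is exactly why a single headline number can mislead.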
So what does matter? Let’s talk CSAT
Customer Satisfaction (CSAT) is one of the few metrics that’s widely understood and benchmarked. It’s simple: after a ticket is resolved, we ask the customer to rate their experience as good, bad or average (represented by face emojis). CSAT is the percentage of people who answer "good".
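The calculation itself is as simple as it sounds. A minimal sketch, using hypothetical ratings rather than our actual survey responses:

```python
# Hypothetical post-ticket ratings -- not real survey data.
ratings = ["good", "good", "average", "good", "bad", "good"]

# CSAT: the share of respondents who answered "good", as a percentage.
csat = 100 * ratings.count("good") / len(ratings)

print(f"CSAT: {csat:.1f}%")  # 4 "good" out of 6 responses -> 66.7%
```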
CSAT is a strong stat for our industry, not only because it's in widespread use, but also because the SDI (Service Desk Institute) publishes benchmark figures for it. The SDI puts the industry average at 84.3% and the median at 86.9%, so we're very happy to consistently score well above both with our customers. That tells us our customers feel supported, listened to, and satisfied with the help they receive — which, ultimately, is what matters most.
Why we’re publishing our service desk stats
We’re doing this because so few MSPs do. And that’s a problem.
If you’re a customer, how do you know if your MSP is doing a good job? If you’re comparing providers, how do you know who’s better? Without data, it’s guesswork.
By publishing our stats — even the ones that aren’t perfect — we’re committing to transparency. We’re holding ourselves accountable. And we’re inviting other MSPs to do the same. Now let's take a look at how we did in September 2025.
One2Call key stats for September 2025
| CSAT | Ave. Response Time | Ave. Fix time | NPS Score | Google Review Score |
|---|---|---|---|---|
| 95.2% | 1hr 58 mins | 9.26 hrs | 89 | 5.0⭐ |
Previous 3 months
| | June 2025 | July 2025 | August 2025 |
|---|---|---|---|
| CSAT | 96.6% | 100% | 92.7% |
| Ave. Response Time | 1hr 40 mins | 38 mins | 54 mins |
| Ave. Fix time | 6 hrs | 4.5 hrs | 6.7 hrs |
| NPS Score | 89 | 89 | 89 |
| Google Review Score | 5.0⭐ | 5.0⭐ | 5.0⭐ |
Our reflections on September's stats
There's no doubt we had some issues in September, which I'll talk about more below, but that's also why I'm so pleased to see our CSAT getting back to a more usual number for us at 95.2%. This suggests that even when our response times aren't ideal, we're still delivering quality support to our customers.
The increase in our average response time was driven by fewer inbound phone calls being answered within 15 seconds, another of our key stats. This also had a knock-on effect on our average fix time, which rose to 9.26 hours, very high for us, driven by an increase in more complex tickets that required deeper investigation.
We've already made changes to our call routing and are looking at how we can streamline workflows and use automation so our agents can make more meaningful responses earlier. We're also looking at staffing and providing additional training to our teams.
Which stats mean most to you?
So that's a quick rundown of where we're at right now and what we're intending to do. We'll discuss it more in future, so be sure to follow us here for updates.
How our industry judges itself is one thing, but what stats make sense to you? Which stats would you really like to see in order to properly assess MSP performance?
We'd love to hear from you, so why not get in touch and let us know?