Is it better to feel fast, or to actually be fast? Sometimes the performance improvements you make won’t make a difference to the people who count – your customers. This post describes some of the interesting effects we observed during a performance assurance exercise with a client, and explains why user perception is a factor that needs to be both ignored and considered.
Nisa Retail had a complicated order placement system that was progressively getting slower under increasing load. Let’s quantify “complicated” in this context: the overall system was made up of a number of legacy systems that had been coupled together. Each of these had custom development layered on progressively over time, and with that an element of undocumented knowledge lost as people moved on. A rewrite was out of the question and the client needed results quickly. We installed an APM solution that allowed us to measure, quantify, evidence and introduce performance improvements.
There were many small and medium-sized improvements that resulted in a collectively large improvement for the user. The following graph neatly sums up the results achieved for Nisa Retail.
Orange Line: Over a 12-week period the site experienced a 20% increase in page views.
Green Line: Over a 12-week period the overall trend in page response time decreased by 46%.
It should be noted that the system was a closed B2B system. This meant that the number of users remained fixed over the 12-week period.
- The average response time of a page had decreased by 46% over a 12-week period.
- The number of page views had increased by 20%.
So a rather obvious benefit is that by decreasing the page load times, we increased the rate at which the users interacted with the site.
There also appears to be a general macro relationship between average page load times and page views. For every 2.3% decrease in page load time, there is a corresponding 1% increase in page views, and with that an increase in sales. We can now start to quantify to the business the ROI of improving page response times. Don’t forget – this is an average across all the pages on the site, but this general correlation can be used as a powerful tool.
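To make the arithmetic behind that ratio concrete, here is a minimal sketch. The linear model and the 2.3-to-1 ratio are the rough macro relationship described above, derived from the 46% response-time reduction and 20% page-view growth over the 12 weeks; treat it as an illustrative rule of thumb, not a measured law.

```python
# Illustrative model: ~2.3% reduction in average page response time
# corresponds to ~1% growth in page views (ratio taken from the post).
RATIO = 2.3


def projected_view_growth(response_time_reduction_pct: float) -> float:
    """Estimate % growth in page views for a given % reduction in
    average page response time, assuming the linear macro relationship."""
    return response_time_reduction_pct / RATIO


# The 46% reduction observed over 12 weeks maps back to ~20% more views:
print(round(projected_view_growth(46.0)))  # -> 20
```

A business case can then be built by multiplying the projected view growth by the average revenue per page view – bearing in mind, as noted above, that this is a site-wide average and the relationship will not hold page by page.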
Something was wrong.
We had numerous benefits fall out of the exercise – fewer phone complaints, better utilization of developer time, increased system stability. But what was most interesting (to me) was this: while the metrics were telling us that we were improving the site, end users said that what they experienced didn’t stack up with the improvements reported by our statistics. Why?
A Lesson Learnt
A key soft lesson learnt is the psychology of the user. Users perceive response times very subjectively, with what appeared to be a strong bias towards application entry points. Intechnica had introduced some major performance improvements, but the majority of users didn’t perceive these until the performance of the common application entry points was improved. For example, as soon as the response time of the home page was improved (from 20 to 5 seconds), the users noticed general improvements that had been introduced further downstream.
This has a number of implications:
- Objective measures should always be in place as a reality check; do not rely on user feedback alone to tell you where you are.
- Slow initial entry points to a system taint a user’s perception of the subsequent journey.
- Perceived user feedback is important – this slightly contradicts the first point, but if users do not feel that the system is improving then you need to investigate why. You may be targeting and improving parts of the system that nobody really notices, cares about or can visibly see.
- To improve perceived performance quickly, first target the entry points users perceive as the most unresponsive. These are potentially cheap initial wins.
- Always confirm your statistics with users where possible. Understand the statistics reported; don’t take them at face value – e.g. what exactly does “page load time” mean?
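The last point deserves a concrete illustration: “page load time” changes meaning depending on which two timestamps a tool subtracts. The sketch below uses hypothetical millisecond marks for a single page view, with names loosely modelled on browser navigation timing – the figures and event names are illustrative, not taken from any particular APM product.

```python
# Hypothetical timing marks (ms since the user started navigating).
marks = {
    "navigation_start":   0,
    "first_byte":         400,   # server began responding
    "dom_content_loaded": 1800,  # DOM parsed; page feels usable
    "load_event_end":     5200,  # every image and script finished
}

# Three common – and very different – definitions of "page load time".
definitions = {
    "time to first byte": marks["first_byte"] - marks["navigation_start"],
    "time to usable":     marks["dom_content_loaded"] - marks["navigation_start"],
    "full page load":     marks["load_event_end"] - marks["navigation_start"],
}

for name, ms in definitions.items():
    print(f"{name}: {ms} ms")
```

The same page view yields 400 ms, 1,800 ms or 5,200 ms depending on the definition – so before comparing a reported statistic against what users say they feel, establish which of these your tool is actually measuring.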
So, by analysing, prioritizing and solving the performance points around end-user perception, you can gain a higher degree of value for invested effort – a win for everyone involved.
It doesn’t just have to be about page speed
We often get a new headline figure in the performance industry – page load times, how the latest millisecond slowdown will by implication lose you a per cent of your target audience, hit your conversion rate and impact your bottom line. However, in most cases these figures are based on high-volume sites, and I would be wary of such stats when they are being wielded to sell a solution into your business. Take a step back and put them into your own business context (more on this in a different blog post). If your aim is to increase usage and browsing on your site, there may be alternatives – e.g. making suggestions based on previous search behaviour, time of year, BI, or generally increasing usability.
Hearing what we don’t want to hear
What is often forgotten in the never-ending quest for page speed is end-user perception. This is natural, as it is more difficult to measure. In some cases altering the perceived page load time may be a cheaper, more cost-effective improvement than reducing the actual page load time. I’d like to see more research and reporting around this. It would also be useful for the industry to be a little more honest: we shout loudly when we improve and see benefit, but I suspect there have been many cases where sites improved their load times and did not see the expected benefits – what doesn’t conform to our understanding can tell us much about the world and the web we occupy. Faster is better, but our customers need to know when faster won’t give them the anticipated benefits.