retail

Are celebrity endorsements doing more harm than good to your retail sales?

Online retailers exploring the recent trend of signing up celebrities to endorse them to their millions of followers on social media run the risk of site-melting bursts of traffic, according to Intechnica co-founder Jeremy Gidlow.

In a talk at GP Bullhound’s event “Online Fashion: Where is the smart money going?” last night, Jeremy pointed to a recent example where Kylie Jenner drove an incredible amount of demand to the website selling her range of lipstick. However, because the demand was so great and was directed at a specific time, the systems behind the website couldn’t cope and the whole website was brought down.

This highlighted a common problem retailers are facing today: Sales reduce margins whilst increasing demand, and the technology needed to cope with the increased demand can be very costly – further cutting into profits.

TrafficDefender is a SaaS solution developed by Intechnica in response to this problem. TrafficDefender allows retail websites to remain online throughout large spikes in web traffic without needing to over-invest in additional IT infrastructure.

Watch Jeremy’s talk below or find out more about TrafficDefender.

How much damage does Third Party Content really do to ecommerce sites? [REPORT]

Intechnica recently did some in-depth research into how the performance of ecommerce (specifically fashion retail) websites is being affected by the over-proliferation of third party content, the results of which have been published in our brand new report, “The Impact of Third Party Content on Retail Web Performance”.

For the uninitiated, third party content refers to basically anything on your website served externally – usually adverts, tracking tags, analytics codes, social buttons and so forth. Check here and here for some good background on the dangers of third party content.

If you are already initiated, though, you will already know that this content places an overhead on a website’s performance to varying degrees. In a best case scenario you’re looking at some extra page weight in the form of objects, which are being served from added hosts via extra connections to your users, the result being a few extra milliseconds on your page load time. On the other end of the spectrum, a misbehaving piece of third party content can slow pages to a crawl or even knock your site offline. Remember when everyone blamed Facebook for taking sites like the Huffington Post, Soundcloud and CNN down? Yep. Third party content at its most disruptive.

So what did we want to find out?

We wanted to see just how much of an impact third party content was having on real websites. We went about this by running external performance monitoring of the normal websites in parallel with versions that had the third party content filtered out (identified based on experience – if you want to see an open source, ever-growing list of third party content, I recommend ThirdPartyContent.org). It’s also important to note that we measured a “land and search” transactional journey, not just home page response, to get a more realistic measurement. From this information, we were able to distinguish which kinds of content are most prevalent and even which element of this content was most to blame for any performance overhead. It also gave us some interesting insights into whether sites that use third party content more conservatively also happen to follow other web performance best practices.
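The classification step behind this kind of filtering can be sketched in a few lines. This is a minimal illustration, assuming a hand-maintained set of first-party hostnames; all of the hostnames below are made up for the example, not taken from the report:

```python
from urllib.parse import urlparse

# Illustrative first-party hosts; in practice this allowlist would come
# from the retailer's own host inventory.
FIRST_PARTY = {"www.example-retailer.com", "static.example-retailer.com"}

def is_third_party(resource_url, first_party_hosts=FIRST_PARTY):
    """Classify a resource as third party if its host is not one of the
    retailer's own hosts."""
    host = urlparse(resource_url).netloc.lower()
    return host not in first_party_hosts

# A tiny sample of resources a monitored page might request:
resources = [
    "https://www.example-retailer.com/css/main.css",
    "https://cdn.adnetwork.example/ad.js",
    "https://analytics.example/tag.js",
]
third_party = [r for r in resources if is_third_party(r)]
print(len(third_party))  # 2 of the 3 sample resources are third party
```

Real monitoring tools match on more than the hostname (path patterns, known tag signatures), but the host-based split is the core of the approach.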

Some fast facts

There is more detailed data in the report itself, which is available to download in full here. However, let’s look at some key findings. Keep in mind that some of these might seem predictable but it’s nice to have some concrete data, isn’t it?

  1. In general, sites that use more third party content are heavier, use more hosts and connections, and are slower than those that use less third party content.
  2. When third party content is completely filtered out, websites with a lot to begin with see a greater immediate speed boost than those with less to begin with. In fact, many retailers swapped places in the rankings once third party content was taken out of the equation.
  3. Falling in line with the famous “golden rule of web performance”, it’s the extra hosts, connections and objects brought on by third party content that result in slower response times – more so, in fact, than page weight.

Some pretty graphs and tables

I recently saw someone on Twitter express the rule “nt;dr” (no table; didn’t read), so let’s get some data in this post, shall we?

[Chart: average number of each type of third party content tag per website]

The chart above shows the average number of each type of third party content present on each of our target websites. As you can see, there is far more variation in adverts than in any other form of third party content, although it’s somewhat surprising to see an average of 1.8 analytics tags on these websites. Serving adverts and the like from multiple hosts impacts performance, as we see in the table below…

[Table: response time, hosts, connections, objects and page size added by third party content]

Our second table tells us a little more about why sites with more third party content tend to be slower. The median, maximum and minimum response, hosts, connections, objects and size represent only what is added to the sites by third party content, not the overall metrics. We can see that third party content alone is adding up to 18.7 seconds to the response time (a median of 9.4 seconds). What’s also interesting is that there is a stronger correlation between average response time and the number of objects, connections and hosts added than there is between response time and added page weight (0 is no correlation, 1 is absolute correlation).
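For readers curious how a correlation figure like this is computed, here is a quick sketch of the Pearson coefficient. The input numbers below are illustrative only, invented for the example; they are not the figures from the report:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient: 0 = no correlation,
    1 = perfect positive correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up sample data for five hypothetical sites:
response_times = [3.1, 4.8, 6.2, 7.9, 9.4]   # seconds added
added_objects  = [12, 25, 33, 41, 55]        # objects added
added_weight   = [150, 600, 280, 900, 520]   # KB added

print(round(pearson(response_times, added_objects), 2))  # close to 1.0
print(round(pearson(response_times, added_weight), 2))   # noticeably weaker
```

In data shaped like this, response time tracks the object count far more closely than it tracks page weight, which is the pattern the report describes.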

[Chart: response times for Retailer A and Retailer B, with and without third party content]

The last chart I’ll show you here is very telling in terms of the real impact of third party content. Red and blue are the full, normally served websites of Retailer A and Retailer B respectively. We can see that red is much slower than blue during this time period. Yellow and green are the same websites (A and B respectively), except with all third party content removed. Not only did Retailer A gain a greater speed boost than Retailer B by filtering this content out, but it actually went from being the slower website to being the faster of the two. What’s even more interesting is that Retailer A reported disappointing revenues over this period, while Retailer B posted a strong revenue report.

Read the full report

This is just a sample of some of the data we collected and analysed, so make the jump to our website to read the full report for yourself.


When tech glitches become business problems

Technology and specifically IT are essential to business growth, but IT can become a double-edged sword. When things go wrong, tech glitches become real business problems.

As people begin to expect more out of technology advancements, and these advancements have the potential to improve our day-to-day lives, more and more businesses are looking to innovations and the next progression to support their growth. As we’ve seen in recent years with the demise of traditional brick-and-mortar high street businesses unable to adapt in time to new digital trends (in the past year alone we’ve seen HMV, Blockbuster and Jessops hit hard, and nearly 2,500 stores affected), embracing technology is more important than ever to thrive, especially in the retail space.

However, as important as implementing new technologies is, it’s just as important to get the technology right first time. The investment in IT is now so high that tech glitches are business problems in the most real sense. IT glitches are recognised as a mainstream issue, reported on the front pages of newspapers (and, perhaps more significantly, on virally spread digital news sources and in social media discussions).

Chaos at Argos

An IT glitch caused chaos at one of the new Argos digital stores.

One recent high-profile example in the news concerned Argos’s groundbreaking new “digital-only” stores. Argos has been a true trailblazer in the “click and collect” genre of retail, and the six new flagship stores are designed to fully embrace the sleek “touch screen”-y experience of the future. However, a technical glitch meant that orders were placed for collection from these new stores before they had actually opened, leading to a frustrating experience as customers turned up to collect goods only to find closed or unfinished stores.

But it’s not just innovative new initiatives that can cause problems. Even fairly routine progressions and changes can damage business if not carefully implemented. Take, for example, BrandAlley, which drew the ire of customers after delays in orders being processed. The cause was a switch to a new IT platform, instigated to prepare for international expansion. IT advancement was necessary to grow, but ended up causing a real business issue. BrandAlley has since given out vouchers worth £25 each to affected customers to save face.

I’ve written on this blog before about a number of performance-specific tech glitches that cost businesses a lot of money and a lot of customer trust – from the Facebook IPO crashing NASDAQ and leading to legal action, to the BT Sport app being unprepared for demand on the first day of the Premier League. Since performance issues are typically much more difficult to put right than functional issues (just look at the ongoing Healthcare.gov fiasco in the US), it makes a great deal of sense to pay close attention to performance. Just look at how much poor performance can damage brand and revenues.

So with the pace of IT advancement ever quickening, and our need to see the next advancements growing at least as quickly, just as important as keeping up with the “wave of the future” is making sure these changes don’t do more harm than good. After all, tech glitches ARE business problems.

Mobile Web design & HTML5: Setting the standard

Tablet computers and mobile devices are becoming ingrained in society. Image credit: Mike Licht, NotionsCapital.com

As the tablet and mobile market continues to grow, fuelled by hotly anticipated releases such as the (new) iPad (3) and Android 4.0 (Ice Cream Sandwich), businesses are realising that the trend towards consumers buying online through an increasing number of devices (m-commerce) is only going to grow with it. Recent studies have shown that conversion rates are much higher from mobile devices and tablets than on traditional platforms; a customer who visits a website or e-commerce application through such a device is already showing a certain level of increased engagement with the retailer. Other studies have shown that customers using tablets spend more (up to 10 or 20 percent in some cases) than those using desktop computers.

Retailers and other businesses dependent on their online systems for revenue must carefully consider these users, and make sure they are being properly catered for. A recent study published by Gomez, a division of Intechnica partner Compuware, showed that the top complaints of tablet users when viewing websites were slow load times, site crashes & errors, and problems with the format of the site. These bad experiences drive users away to competitor sites, and increase the risk of them never returning to your site.

The traditional building blocks for websites were simply not made to cope with screens that switch between portrait and landscape, relatively tiny or irregular screen resolutions, or touch screen actions. Performing tasks or even simply navigating websites under these conditions can be a frustrating experience, when usability should instead be turned into an advantage. After all, shopping via mobile devices is often triggered by impulse, making for easy sales as long as the process is as simple as possible. In the past, shopping cart abandonment has been relatively high for these platforms, and part of this has to be down to the fact that many websites were not designed with these platforms in mind; nobody likes having to scroll across both axes and zoom in and out just to be able to touch the right button on a site. The solution lies in intelligently developing websites and applications to be compatible with these varying devices, platforms and screen sizes.

While old internet architecture was not built with such devices in mind, the latest set of standards, HTML5, is designed precisely with them in mind. Rather than rebuilding a whole website, or building a separate site for each device (after all, each device will have different resolutions to contend with, and a “mobile” version does not ensure compatibility on all devices), designers and developers can rewrite portions of a site in HTML5, CSS3 and Javascript. While no browser currently supports every feature of the still-developing HTML5 standard, it allows one single website to adapt across devices automatically via responsive design, making it much more “mobile friendly”. For example, sidebar content can automatically shift to the bottom of the page, allowing the main article to span the screen – a clever solution where “screen real estate” is reduced on smaller devices. HTML5 is also designed to be “touchscreen friendly”.

m-commerce is growing as more people take up the technology. Image credit: Per Olof Forsberg

HTML advancements aside, there are still many considerations to make when “mobifying” or “tabletising” a website or application. Even with a responsive design, where content shifts itself around to compensate for the screen resolution without compromising readability, in many cases there is simply not enough room on the screen to feasibly show everything. Careful consideration needs to be given to what is necessary to show on the site, and in some cases it may actually be more appropriate to shrink the content down and encourage zooming and scrolling. You could also try to think outside the box; a wide table of data might need to be scrolled in portrait mode, but users might be able to view it more easily by flipping into landscape mode. It is even possible to show data in completely different ways depending on the size of the screen (see some clever examples here and here).

The considerations are different for each individual website or application, but as the new web standards develop, more and more opportunities to innovate are opening up. In the end, it’s all about ensuring quality and high performance for the end user, regardless of where they are or what they are using.

Performance in the spotlight: Why performance is so important to the bottom line

Recently, major websites like Mashable have drawn attention to public perception of website performance. I thought it would be interesting to look into the impact this has on the businesses running these websites. For anyone who has ever struggled to navigate a website hampered by performance issues, especially at peak times, the experience is like wading through treacle; it’s slow, unpleasant and you probably want to be somewhere else as soon as possible. So it will probably come as little surprise when I say that various reports show people are becoming less and less tolerant of poor website performance, to the point of quite quickly abandoning the offending site for a competitor. Here are some stats about what the people of the world (wide web) think of badly performing websites…

Loading... still loading... wait, where are you going?!

  • A 1 second page load delay causes, on average, a 16% decrease in customer satisfaction (1).
  • 1 in 4 people abandon pages that take more than 3 seconds to load, with more than half citing quick page loads as being an important factor in their loyalty to a site (2).
  • At peak times, more than 75% of customers will leave to go to a competitor’s site rather than suffer delays, and 88% of online consumers are less likely to return to a site after a bad experience (3).

While visitors and customers increasingly can’t bear sluggish or unresponsive websites and applications, the people who own the websites themselves should loathe them even more. Here’s a quick overview of some facts and figures to show just how concerned they ought to be…

Performance affects discovery and reputation

  • Google’s search ranking algorithms measure site speed, and use this to rank faster websites higher in search results (4).
  • More than a third of online consumers will tell others about their disappointing website experience (3).
  • Visitors perceive load times to be 15% longer than they actually are, and when recalling the experience to others later, this rises to 35% (5).

This is bad news for slow, low-performance websites, but the picture becomes much clearer – and even more attention-grabbing – when you look at the impact on costs and sales…

Performance affects the bottom line

  • Every 100 milliseconds of page load delay costs Amazon.com 1% of sales (6). Taking into account Amazon’s $67 million in sales each day (around $24.5 billion a year), a 1 second page delay (ten times that 100 milliseconds) could potentially lose $2.4 billion of sales each year.
  • By speeding up page load times by 5 seconds, Shopzilla.com increased its conversion rate by 7-12%, doubled the number of referrals from search engines and halved the number of servers required (7).

On top of that, statistics collected by AOL have shown that faster page loads lead to more page views per visitor, and an improvement of less than half a second in page load speed at Yahoo.com increased traffic by 9%. And as the above statistic from Amazon shows, optimising performance has a significant impact on business. Don’t you think $2.4 billion every year is significant? While your business probably isn’t the size of Amazon, better performance still correlates with happier customers, more visits, higher conversion rates and increased revenue.
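The back-of-envelope arithmetic behind that Amazon figure is worth making explicit, using only the numbers quoted above:

```python
daily_sales = 67_000_000      # Amazon's daily sales in USD, as quoted above
loss_per_100ms = 0.01         # 1% of sales lost per 100 ms of delay
delay_ms = 1000               # a 1 second page load delay

annual_sales = daily_sales * 365                   # ~$24.5 billion a year
fraction_lost = loss_per_100ms * (delay_ms / 100)  # 10 x 1% = 10% of sales
annual_loss = annual_sales * fraction_lost

print(annual_loss)  # roughly $2.45 billion a year
```

The key assumption, of course, is that the 1%-per-100ms relationship extrapolates linearly all the way to a full second of delay.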

Next time you’re trying to get on a website that just doesn’t want to load properly, think about how much of an effect this is having on what you think of said website, and it will be easy to see for yourself why performance really matters.

Sources

  1. Aberdeen Group
  2. Forrester
  3. Gomez
  4. Google
  5. Stoyan Stefanov, Psychology of Performance
  6. Amazon
  7. Shopzilla