The year in data: 2018 in ecommerce statistics

How did ecommerce change in 2018? Let's take a look at the data. Littledata benchmarks online retail performance in Google Analytics, and with over 12,000 sites categorised across 500 industry sectors we have a unique insight into ecommerce trends in 2018. The pattern we're seeing is that web sessions are becoming ever shorter as users split their attention across many ads, sites and devices. Marketers need to get visibility across a range of platforms, and accept that a customer purchase journey will involve an ever greater number of online touchpoints. In the following analysis, we look at how performance changed across 149 ecommerce sites in 2018, and how these trends might continue in 2019.

Ecommerce conversion rate is down

Ecommerce conversion rate dropped by an average of 6 basis points - not because of a drop in online sales, but because the number of sessions spent considering and browsing (i.e. not converting) has risen. This is partly an increase in low-quality sessions (e.g. Snapchat ads preloading pages without ever showing them to users), and partly an increase in users from platforms like Facebook (see below) which bring less engagement with landing pages. See our mission to Increase Ecommerce Conversion Rate for more details.

Revenue per customer is up

Revenue per customer is total sales divided by the total number of users who purchased online. The increase of $16 USD per customer per month shows that many stores are doing better with segmentation - ignoring all those sessions which don't convert, and retargeting and reselling to those who buy lots. The growth in subscription business models is also fuelling this trend: getting a customer to commit to a regular payment plan is the most effective way of increasing revenue per user. See our mission to Increase Average Order Value for more details.
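Both headline metrics reduce to simple ratios. Here is a minimal sketch in JavaScript; the session, order and revenue figures are invented purely for illustration:

```javascript
// Ecommerce conversion rate: orders placed per session.
function conversionRate(orders, sessions) {
  return orders / sessions;
}

// Revenue per customer: total sales divided by distinct purchasing users.
function revenuePerCustomer(totalRevenue, purchasingUsers) {
  return totalRevenue / purchasingUsers;
}

// Hypothetical month: 50,000 sessions, 900 orders from 750 distinct
// customers, $90,000 of revenue.
const cr = conversionRate(900, 50000);      // 0.018, i.e. 1.8%
const rpc = revenuePerCustomer(90000, 750); // $120 per customer
```

Note how a flood of extra non-converting sessions lowers the first ratio while leaving the second untouched - exactly the pattern described above.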
Reliance on the homepage is down

Content marketing became mainstream in 2018, and no self-respecting brand would now rely on the homepage alone to drive interest in the brand. The percentage of traffic coming 'through the front door' will continue to fall. In building out a range of keyword-specific landing pages, stores are harnessing a wider range of Google search queries, and providing more engaging landing pages for Google Ads and Facebook Ads clicks.

Usage of internal site search is up

Along with fewer visitors coming through the homepage, we are seeing more browsers choose internal search over traditional category navigation. We think this is partly down to younger consumers' preference for search, but it probably also reflects the increasing sophistication and relevance of the internal search tools ecommerce sites use.

Referrals from Facebook are up

Even after Facebook's data security and privacy embarrassments in 2018, it continues to grow as the second major global marketing platform. Although few sites in our benchmark rely on Facebook for more than 10% of their traffic, it is a significant driver of revenue. As merchants continue to come to Littledata to find out the real ROI on their Facebook Ads, check back next year for a new round of analysis!

How did your site perform?

If you're interested in benchmarking your ecommerce site, Littledata offers a free trial to connect with Google Analytics and audit your tracking. You can see ecommerce benchmarks directly in the app, including 'ecommerce conversion rate', 'referrals from Facebook' and 'reliance on the homepage', so you know exactly how your site's performing. Sign up today to benchmark your site and import Facebook Ads data directly into Google Analytics.

For this article we looked at Littledata's private, anonymized benchmark data set, selecting ecommerce sites that had a majority of their traffic from the US and more than 20,000 sessions per month.
We measured the change from the month of December 2017 (1st to 31st) to the same month in 2018.

2019-01-28

Introducing Shopify Flow connectors for Google Analytics

Littledata has launched the first Shopify Flow connector for Google Analytics, enabling Shopify Plus stores to analyse the customer journey using custom events in Google Analytics. In addition to Littledata's native connections with Shopify, Shopify Plus, Facebook Ads, ReCharge, etc., we have now launched a beta version of a Flow connector for Google Analytics.

What is Shopify Flow?

Flow is an app included with Shopify Plus which enables stores to define automation pathways for marketing and merchandising. Think of it as an 'If This Then That' generator just for Shopify. For example, after an order is marked as fulfilled in Shopify's admin you might want to trigger an email asking for a review of the product. This would involve setting a 'trigger' for when an order is fulfilled and an 'action' to send an email to that customer.

How do you use Littledata Flow actions?

1. You install Littledata's Shopify app along with Shopify Flow
2. Every time an order is created in your store we send it to Google Analytics, along with information about which customer ID made the order (nothing personally identifiable)
3. You add Littledata's actions to your Flow
4. Every time the order or customer event is triggered, even for offline events, the event is linked back to Google Analytics

In Google Analytics you can then:

- Segment the customer base to see if these actions influence purchasing behaviour
- Visualise when these events occurred
- Analyse the customers making these actions: which geography, which browser, which marketing channel (in GA 360)
- Export the audience to retarget in Google Ads (in GA 360)
- Export the audience to run a website personalisation using Google Optimize

How do you set the actions up in Flow?

- Google Analytics customer event - can be used with any customer triggers, such as Customer Created
- Google Analytics order event - can be used with any order triggers, such as Order Fulfilled or Order Paid

How else could I use the events?
You can now link any of your favourite Shopify apps with Flow connectors into Google Analytics. Some examples would be:

- Analyse whether adding a product review leads to higher lifetime value
- Retarget in Google Ads after a customer's order is fulfilled
- Set up a landing-page personalisation for loyal customers (using the Loyalty Lion connector)

How much does this cost?

The Flow connectors are included as part of Littledata's standard subscription plans. You'll need Littledata's app to be installed and connected to link the events back to a customer - and to get reliable data for pre-order customer behaviour.

Can Littledata set up a flow for a specific app?

Our Enterprise plans offer account management to help you configure the Littledata Shopify connection, including the Shopify Flow connectors. Get in touch if you have a specific app you'd like to make this work with.
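Littledata doesn't publish the exact mechanics of the connector, but conceptually a server-side Flow action like this boils down to sending a hit to Google Analytics' (real) Measurement Protocol v1. The sketch below builds such a hit; the property ID, event category and labels are invented for illustration:

```javascript
// Build a Measurement Protocol (v1) hit for a Flow-triggered order event.
// No network call is made here; the returned URL would be sent via HTTP POST.
function buildOrderEventHit({ trackingId, clientId, trigger, orderName }) {
  const params = new URLSearchParams({
    v: '1',              // Measurement Protocol version
    tid: trackingId,     // GA property, e.g. 'UA-12345-1' (illustrative)
    cid: clientId,       // anonymous client ID -- nothing personally identifiable
    t: 'event',
    ec: 'Shopify Flow',  // event category (assumed naming)
    ea: trigger,         // event action, e.g. 'Order Fulfilled'
    el: orderName,       // event label, e.g. '#1001'
  });
  return 'https://www.google-analytics.com/collect?' + params.toString();
}
```

Because the hit carries the same client ID as the shopper's earlier pageviews, GA can stitch the offline trigger into the rest of the customer journey - which is what makes the segmentation and retargeting use cases above possible.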

2018-12-17

How to stop Google Tag Manager being hacked

In two high-profile data breaches this year - at Ticketmaster and British Airways - over half a million credit cards were stolen via a compromised script inserted on the payment pages. Update 8/7/19: British Airways was fined a record £183m over this data breach, under new GDPR regulation. They are contesting the fine.

Google Tag Manager is a powerful tool which enables you to insert any script you want onto pages of your website, but that power can be used against you by hackers if you're not careful - so below we'll look at how to stop GTM being a security risk on your payment pages.

Firstly, how did the hackers get the card details from these sites? And how is it relevant to GTM on your site? Security firm RiskIQ has traced the breach to a compromised JavaScript file which skimmed the card details from the payment form. So when a user entered their credit card number and security code on BritishAirways.com (or their mobile app) those details were posted to a third-party server, unknown to British Airways or the customer. This is a high-scale equivalent of placing a skimming device on an ATM, which reads one card at a time. In Ticketmaster's hack the script was one loaded from a chatbot vendor on their site, Inbenta. Inbenta claims not even to have been aware the script was used on payment pages. The changes to the script were subtle: not breaking any functionality, and in BA's case using a domain, 'baway.com', which looked somewhat authentic.

To protect your site against a similar attack you obviously need to lock down the accounts your developers use to change scripts in the page source code, but you also need to secure GTM - which can be used to deploy such scripts. We have a few rules at Littledata to help reduce the risks of using tag management on payment pages:

1. Use pixels over custom JavaScript tags on payment pages

You probably need a few standard tags, such as Google Analytics, on payment pages, but try to avoid any custom scripts which could possibly skim card details. Many non-standard tags use JavaScript only to create the URL of a tracking pixel - and it is much safer (and faster) to call the tracking pixel directly. Contact the vendor to find out how. (Littledata's Shopify app even removes the need to have any script on the payment pages, by hooking into the order as it's registered on Shopify's servers.)

2. Avoid loading external JavaScript files in GTM

Many vendors want you to load a file from their server (e.g. myvendor.com/tracking.js) from GTM, so they can update the tracking code whenever they want. This is flexible for them, but risky for you: if the vendor gets hacked (as with Inbenta above) then you get compromised. It's less risky to embed the script directly in GTM, and control version changes from there (although a fraction slower to load the page). Of particular risk is embedding a tag manager within a tag manager - where you are giving the third party rights to publish any other scripts within the one tag. Don't do that!

3. Lock down Edit and Publish rights on GTM

Your organisation probably has a high turnover of contract web developers and agencies, so have you checked that only current staff or agencies have permission to edit and publish? It's OK to have external editors use 'workspaces' for version control in GTM, but ideally someone with direct accountability to your company should check and publish.

4. Blacklist custom JavaScript tags on the payment pages

You can set a blacklist from the on-page data layer to prevent certain types of tags being deployed on the payment pages. If you have a GTM container with many users, this may be more practical than step 3.

5. Remove tags from old vendors

There are many thousands of marketing tools out there, and your company has probably tried a few. Do you remove all the tags from vendors when you stop working with them? These are the tags most at risk of being hacked. At Littledata we run a quarterly process for marketing stakeholders to opt in to the tags they still need for tracking or optimisation.

6. Ensure all custom JavaScript tags are reviewed by a developer before publishing

It can be hard to review minified JavaScript libraries, but it's worth it for payment pages if you can't follow rules 1 and 2.

If you're still worried, you can audit the actual network requests sent from payment pages. For example, in Chrome developer tools, in the 'Network' tab, you can inspect which requests the browser sends out, and to which servers. It's easy for malicious code to hide in the patchwork of JavaScript that powers most modern web experiences, but what is harder to hide is the network requests made from the browser to external servers (i.e. to post the stolen card information out). A request to Google Analytics is fine, but if the domain of a request is dubious, look it up or ask around the team.

Good luck, and keep safe with GTM!
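The data layer blacklist from rule 4 is a documented GTM feature: declare the data layer above the container snippet with a 'gtm.blacklist' key, and GTM will refuse to fire the listed tag classes on that page. A minimal sketch for a payment page (the exact set of restriction IDs you block is your choice):

```javascript
// Declare the data layer *before* the GTM container snippet on payment
// pages only. 'customScripts' covers Custom HTML tags ('html') and Custom
// JavaScript variables ('jsm') -- the tag classes a card-skimming script
// would need to run.
var dataLayer = [{
  'gtm.blacklist': ['customScripts', 'html', 'jsm']
}];
```

With this in place, even an editor with Publish rights cannot deploy a custom script to the checkout via GTM, while standard pixel-based tags keep working.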

2018-11-24

Are you looking at the wrong Black Friday metrics?

Paying attention to the right ecommerce metrics can help you establish the best customer base and shopping experience for long-term growth. But many retailers still focus only on the most popular metrics -- especially during the online shopping craze of Black Friday and Cyber Monday (#BFCM). Over the next few weeks ecommerce managers will be obsessing over data, but which stats are the most important? Two popular metrics -- ecommerce conversion rate and average time on site -- may be misleading, so I recommend looking instead at longer-term benchmarks. Here's how it all breaks down.

Littledata's ecommerce benchmark data now contains indicators from over 12,000 sites, making it an ideal place to get a realistic view of Black Friday stats. Last year we found that the impact of Black Friday and Cyber Monday was larger in 2017 than in 2016. Using that same data set of 440 high-traffic sites, I dove into the numbers to see how this affected other metrics.

Metrics to avoid

I think that overall ecommerce conversion rate is a bad metric to track. From the leading ecommerce websites we surveyed, the median increase was 30% during the BFCM event last year... but nearly a third of the stores saw their conversion rate dip as the extra traffic didn't purchase, with this group seeing a median 26% drop. Some stores do extremely well with deals: four sites from our survey had more than a 15-fold increase in ecommerce conversion rate during BFCM, and nearly a quarter saw more than double the conversion rate over the period. But the real question is: will tracking conversion rate hour by hour help you improve it? What could you possibly change within a day?

Another misleading metric is average time on site. You may be looking for signs that the extra traffic on the holiday weekend is engaging with the site, but this is not the one to watch. The time on site for visitors who only see one page will be zero, which will mask any real increase from engaged visitors.
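That zero-duration quirk is worth seeing in numbers. Google Analytics derives session duration from the gap between hits, so a single-page visit contributes zero seconds; a flood of bouncing deal-hunters drags the average down even if engaged visitors stay as long as ever. The durations below are invented to illustrate the effect:

```javascript
// Each session is a list of seconds-since-session-start per pageview.
// Single-page sessions have no second hit, so they count as 0 seconds --
// mirroring how GA computes average session duration.
function avgTimeOnSite(sessions) {
  const durations = sessions.map(hits =>
    hits.length > 1 ? hits[hits.length - 1] - hits[0] : 0
  );
  return durations.reduce((a, b) => a + b, 0) / durations.length;
}

const engagedOnly = [[0, 60, 180], [0, 120]];        // average: 150 seconds
const withBounces = [...engagedOnly, [0], [0], [0]]; // same engaged visitors
// avgTimeOnSite(withBounces) is 60 seconds -- the bounces mask the engagement
```

The engaged visitors behaved identically in both data sets, yet the headline average dropped by more than half - which is why this metric misleads during a traffic spike.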
Where to focus instead

Now, do you know what good performance on funnel conversion metrics would look like for your sector? If not, have a look at Littledata's industry benchmarks, which now cover over 500 global sectors. Littledata's benchmarks also include historic charts to show you how metrics such as add-to-cart rate vary for the average retailer in your sector month by month. Next, try the 'speed' performance page to see how fast a user would expect a site in your sector to be. If you see site speed (as measured in Google Analytics) drop below average during Black Friday trading, it's time to pick up the phone to your web host or web operations team.

Then, are you tracking return on advertising spend for the extra Facebook Ads you're running during the quarter? Ad costs will spike during the peak trading period, and you may not be getting the same volume of traffic converting into sales. Here are some quick pointers.

- Facebook Ads. Littledata's Facebook Ads connection will ensure accurate data, with a dedicated Facebook report pack for automated insights.
- Shopify. If you're running your site on the Shopify platform, read up on which metrics are most important for Shopify stores and check out Shopify's BFCM Toolbox for seasonal online marketing.
- Missions. Use Missions in the Littledata app to make permanent improvements to your user experience. BFCM may be over before you can make the changes, but customers will keep buying the rest of the year. For example, can you increase add-to-cart rate with tips such as highlighting faster-selling items or recommending an alternative to out-of-stock products?

So focus on some clearer metrics, and I hope Black Friday brings you every success!

2018-11-19

For every retail loser there's a retail winner

Today PwC's retail survey found the British high street is being reshaped as shoppers shift online - especially in fashion, where a net 100 high street stores closed. This misses the positive side of the story: all those shoppers are buying from independent UK brands online instead, which is one of the fastest growing areas of the UK economy. We looked at 30 mid-sized online fashion retailers (with average sales of £1m per month) who get a majority of their traffic from the UK. This collection grew their sales by an aggregate 21% from October 2017 to October 2018 (year on year). Fashion shoppers love to browse unique designs on Instagram and Pinterest, compare prices and get easy home deliveries. Independent ecommerce brands are bringing original designs to the British wardrobe, and we should celebrate their success.

Behind the research

Littledata gathers benchmark data from Google Analytics on over 12,000 websites, including many types of ecommerce businesses. Our customers get insights into their performance and recommendations on how to improve online conversion.

2018-11-09

Categorising websites by industry sector: how we solved a technical challenge

When Littledata first started working with benchmark data we found the biggest barrier to accuracy was self-reporting on industry sectors. Here's how we built a better feature to categorise customer websites.

Google Analytics has offered benchmarks for many years, but with limited usefulness, since the industry sector field for the website is often inaccurate. The problem is that GA is typically set up by a developer or agency without knowledge of (or care about) the company's line of business - or an understanding of what that industry sector field is used for. To fix this problem Littledata needed a way to categorise websites which didn't rely on our users selecting from a drop-down list.

The first iteration: IBM Watson NLP and a basic taxonomy

Our first iteration of this feature used a pre-trained model from IBM Watson's set of natural language APIs. It was simple: we sent the URL, and back came a category according to the Internet Advertising Bureau (IAB) taxonomy. After running this across thousands of websites we quickly realised the limitations:

- It failed with non-English websites
- It failed when the website homepage was heavy with images rather than text
- It failed when the website was rendered via JavaScript

Since our customer base is growing most strongly outside the UK, with graphical product lists on their homepages, and using the latest JavaScript frameworks (such as React), the failure rate was above 50% and rising. So we prioritised a second iteration.

The second iteration: extraction, translation and public APIs

The success criterion was that the second iteration could categorise 8 sites which the first iteration failed with, and should go on to be 80% accurate. We also wanted to use mainly public APIs, to avoid maintaining code libraries, so we broke the detection process into 3 steps:

1. Extracting meaningful text from the website
2. Translating that text into English
3. Categorising the English text to an IAB category and subcategory

The Watson API seemed to perform well when given sufficient formatted text, at minimal cost per use, so we kept it for step 3. For step 2, the obvious choice was the Google Translate API. The magic of this API is that it can detect the language of origin (with a minimum of ~4 words) and then provide the English translation.

That left us focussing the development time on step 1 - extracting meaningful text. Initially we looked for a public API, and found the Aylien article extraction API. However, after testing it on our sample sites, it suffered from the same flaws as the IBM Watson processing: unable to handle highly graphical sites, or those with JavaScript rendering. To give us more control of the text extraction, we then opted to use a PhantomJS browser on our server. Phantom provides a standard function to extract the HTML and text from the rendered page, but at the expense of being somewhat memory intensive.

Putting the first few thousand characters of the website text into translation and then categorisation produced better results, but still suffered from false positives - for example, if the text contained legalese about data privacy it got categorised as technical or legal. We then looked at categorising the page title and meta description, which any SEO-savvy site would stuff with industry language. The problem here is that the text can be short, and mainly filled with brand names. After struggling for a day we hit upon the magic formula: categorising both the page title and the page body text, and looking for consistent categorisation across the two. By using two text sources from the same page we more than doubled the accuracy, and it worked for all but one of our 'difficult' websites. This hold-out site - joone.fr - has almost no mention of its main product (diapers, or nappies), which makes it uniquely hard to categorise.

So, to put all the new steps together, here's how it works for our long-term enterprise client MADE.com's French-language site:

Step 1: Render the page in PhantomJS and extract the page title and description
Step 2: Extract the page body text, remove any cookie policy and format it
Step 3: Translate both text strings with Google Translate
Step 4: Compare the categorisations of the title vs the page body text
Step 5: If the two sources match, store the category

I'm pleased that a few weeks after launching the new website classifier we have found it to be 95% accurate. Benchmarking is a core part of our feature set, informing everything that we do here at Littledata. From Shopify store benchmarks to general web performance data, the improved accuracy and deeper industry sector data is helping our customers get actionable insights to improve their ecommerce performance.

If you're interested in using our categorisation API, please contact us for a pilot. And note that Littledata is also recruiting developers, so if you like solving these kinds of challenges, think about coming to join us!
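The consistency check at the heart of steps 4 and 5 can be sketched in a few lines. This is a conceptual sketch only: `classify` stands in for the translated Watson taxonomy call, and the toy keyword classifier below is invented for illustration.

```javascript
// Categorise the page title and body text independently, and only keep a
// category when the two sources agree; a mismatch flags the site for review.
function categoriseSite(title, bodyText, classify) {
  const fromTitle = classify(title);
  const fromBody = classify(bodyText);
  return fromTitle === fromBody ? fromTitle : null; // null -> needs human input
}

// Toy stand-in for the real taxonomy classifier.
const classify = text =>
  /sofa|furniture/i.test(text) ? 'Home & Garden' : 'Unknown';

categoriseSite('MADE.com - Designer furniture', 'Shop our sofa collection', classify);
```

Requiring agreement between two noisy signals is what filters out the false positives: a privacy-policy paragraph might mislead the body classifier, but it is unlikely to mislead the title classifier in exactly the same way.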

2018-10-16

Are you benchmarking your ecommerce site in the right sector?

Littledata launched benchmarks for websites two years ago. They quickly became a key feature of our app, and as customers became more engaged, so did ideas for how to improve our benchmarking and the algorithms that learn from those benchmarks. In response to customer feedback and deeper research into industry sectors, we've made some really exciting improvements over the last few months to make the comparisons even more useful -- and even more dynamic. The changes are five-fold.

1. Detailed sectors and sub-sectors. Almost every customer we talked to said the benchmark comparison was most useful if it was for very similar sites. Previously we only had 50 high-level sectors to compare with; now we have hundreds of low-level sectors. You can visualise the full range.
2. Smarter auto-categorisation of your website. Our machine learning process now has a 95% chance of finding the best sector for your website, meaning you can compare against the most useful benchmark without filling in a single form!
3. Ability to manually change industry sector. And of course, if you're in that 5% that needs human input, then you (or your Enterprise account manager) can pick a better sector in the general app settings page. You might also want to change sectors just to see how you compare. No problem.
4. Benchmarks for technology stacks. Want to see if you are making the most of a technology such as Shopify or Yieldify? Now you can compare with other sites using the same technology, making our ecommerce benchmarking even more powerful for agencies and web developers.
5. Benchmarks for starter websites. Previously we only generated benchmarks for sites with at least 500 monthly visits. We dropped that to 200 monthly visits, so starter websites can see a comparison - and see more detail as they grow.

We've launched a live visualisation of how our AI-based website categorizer is mapping a range of industry sectors. It offers a full overview of website categories and segments.
And you can drill down to see more details. For example, we've seen a big rise in wine, coffee and health shake retailers this year, many of whom are using our ReCharge integration to get insight into their subscription business models. As our algorithms learn about ecommerce businesses selling beverages of many varieties and automatically categorise sites accordingly, you can now look closely at a particular segment to see how your site compares. Littledata is an agile company: we're constantly iterating, and continuously improving the benchmarks to make them more actionable, so please give us feedback if you'd like to see more. Happy benchmarking!

2018-09-25

What's the real ROI on your Facebook Ads?

For the past decade Facebook's revenue growth has been relentless, driven by a switch from TV advertising and online banners to a platform seen as more targetable and measurable. When it comes to Facebook Ads, marketers are drawn to messaging about a strong return on investment. But are you measuring that return correctly? Facebook has spent heavily on its own analytics over the last three years, with the aim of making you -- the marketer -- fully immersed in the Facebook platform... and perhaps also to gloss over one important fact about Facebook's reporting on its own ads: most companies spend money with Facebook 'acquiring' users who would have bought from them anyway. Could that be you? Here are a few ways to think about tracking Facebook Ads beyond the simple clicks and impressions reported by Facebook itself.

The scenario

Imagine a shopper named Fiona, a customer of your online fashion retail store. Fiona has browsed through the newsfeed on her Facebook mobile app, and clicks on your ad. Let's also imagine that your site -- like most -- spends only a portion of its budget with Facebook, and is using a mix of email, paid search, affiliates and social to promote the brand. The likelihood that Fiona has interacted with more than one campaign before she buys is high. Now Fiona buys a $100 shirt from your store, and in Facebook (assuming you have ecommerce tracking with Pixel set up) the sale is linked to the original ad spend.

Facebook's view of ROI

The return on investment in the above scenario, as calculated by Facebook, is deceptively simple: the full $100 sale is credited to the ad Fiona clicked. Right, brilliant! So clear and simple. Actually, not that brilliant. You see, Fiona had previously clicked on a Google Shopping ad (which is itself powered by two platforms, Google AdWords and the Google Merchant Center) -- which is how she found your brand -- and after Facebook, she was influenced by a friend who mentioned the product on Twitter, then finally converted by an abandoned cart email.
So in reality Fiona's full list of interactions with your ecommerce site looks like this:

1. Google Shopping ad > browsed products
2. Facebook Ad > viewed product
3. Twitter post > viewed same product
4. Link in abandoned cart email > purchase

So from a multi-channel perspective, how should we attribute the benefit from the Facebook Ad? How do we track the full customer journey and attribute it to sales in your store? With enough data you might look at the probability that a similar customer would have purchased without seeing that Facebook Ad in the mix. In fact, that's what the data-driven model in Google Marketing Platform 360 does. But without that level of data crunching we can still agree that Facebook shouldn't be credited with 100% of the sale. It wasn't the way the customer found your brand, or the campaign which finally convinced them to buy. Under the most generous attribution model we would attribute a quarter of the sale to Facebook. So now the calculation looks like this: it cost us $2 of ad spend to bring in $1 of revenue -- we should kill the campaign.

But there's a catch

Hang on, says Facebook. You forgot about Mark. Mark also bought the same shirt at your store, and he viewed the same ad on his phone before going on to buy it on his work computer. You marked the source of that purchase as Direct -- but it was due to the same Facebook campaign. Well yes, Facebook does have an advantage there in using its wide net of signed-in customers to link ad engagement across multiple devices for the same user. But take a step back. Mark, like Fiona, might have interacted with other marketing channels on his phone. If we can't track cross-device for these other channels (and with Google Marketing Platform we cannot), then we should not give Facebook an unfair advantage in the attribution. So, back to multi-channel attribution from a single device.
This is the best you have to work with right now, so how do you get a simple view of return on advertising spend, the real ROI on your ads?

Our solution

At Littledata we believe that Google Analytics is the best multi-channel attribution tool out there. All it misses is an integration with Facebook Ads to pull the ad spend by campaign, and some help to set up the campaign tagging (UTM parameters) to see which campaign in Facebook brought the user to your site. And we believe in smart automation. Littledata's Facebook Ads connection audits your Facebook campaign tagging and pulls ad cost daily into Google Analytics. This automated Facebook-Ads-to-Google-Analytics integration is a seamless way to pull Facebook Ads data into your overall ecommerce tracking -- something that would otherwise be a headache for marketers and developers.

The new integration is included with all paid plans. You can activate the connection from the Connections tab in your Littledata dashboard. It's that easy! (Not a subscriber yet? Sign up for a free trial on any plan today.)

We believe in a world of equal marketing attribution. Facebook may be big, but they're not the only platform in town, and any traffic they're sending your way should be analysed in context. Connecting your Facebook Ads account takes just a few minutes, and once the data has been collected you'll be able to activate reports showing the same kind of ROI calculation we did above. Will you join us on the journey to better data?
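The arithmetic behind Fiona's example can be written out explicitly. This is a sketch of linear (equal-share) attribution, the "most generous" model described above; the $50 of Facebook ad spend is an assumed figure chosen to match the $2-spend-per-$1-revenue verdict:

```javascript
// Linear attribution: split the order value equally across every channel
// that touched the customer before purchase.
function linearCredit(orderValue, touchpoints) {
  return orderValue / touchpoints.length;
}

const touchpoints = ['google_shopping', 'facebook_ad', 'twitter', 'email'];
const facebookRevenue = linearCredit(100, touchpoints); // $25 credited to Facebook

const adSpend = 50;                      // assumed Facebook spend behind the click
const roas = facebookRevenue / adSpend;  // 0.5 -> $1 back for every $2 spent
```

Facebook's own last-touch-style view would instead credit the full $100 to the ad, reporting a return four times higher from exactly the same purchase - which is the gap this post is about.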

2018-09-20
