Comparing 3 time ranges in Google Analytics

Selecting time ranges for comparison in Google Analytics can trip you up. We find comparing 28-day or 7-day (one-week) periods the most reliable method.

Gotcha 1: Last 4 days compared with the previous 4 days

These periods are the same length (4 days), so shouldn't they be comparable? No! Most websites show a strong weekly cycle of visits (either stronger or weaker at the weekend), so the previous four days may cover a very different part of the week.

Gotcha 2: Last month compared with the previous month

Easy - we can see traffic has gone up by 5% in March. No! March has 11% more viewing time (3 extra days) than February, so the average traffic per day in March has actually dropped by around 5%.

Gotcha 3: Last week compared with the previous week

You can see what's coming this time... Certain weeks of the year are always abnormal, and the Christmas period is one of them: for most business and educational sites it is a very quiet period. The best comparison is with the same week last year.

Have any questions? Let us know by commenting below or get in touch with our lovely experts!
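As a footnote to gotcha 2, here is a minimal sketch in plain JavaScript, using made-up session totals, of normalising each month to sessions per day before comparing:

  // Hypothetical session totals taken from your Google Analytics reports
  var feb = { sessions: 28000, days: 28 };
  var mar = { sessions: 29400, days: 31 }; // 5% more sessions in total

  var febPerDay = feb.sessions / feb.days; // 1000 sessions per day
  var marPerDay = mar.sessions / mar.days; // roughly 948 sessions per day

  var change = (marPerDay / febPerDay - 1) * 100;
  console.log(change.toFixed(1) + '% change in sessions per day'); // about -5.2%

The monthly total is up, but the daily average is down - and the daily average is the comparison that matters.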

2016-12-01

Top 5 Google Analytics metrics Shopify stores can use to improve conversion

Stop using vanity metrics to measure your website's performance! The pros use 5 detailed metrics along the customer conversion journey to measure and improve. Pageviews or time-on-site are bad ways to measure visitor engagement: your visitors could view a lot of pages yet be unable to find the right product, or appear to spend a long time on site but be confused about the shipping rates. Here are the 5 better metrics, and how they help you improve your Shopify store.

1. Product list click-through rate

Of the products viewed in a list or category page, how many click through to see the product details? Products need good images, naming and pricing to even get considered by your visitors. If a product has a low click-through rate relative to other products in the list, then you know the image, title or price is wrong. Likewise, products with a very high list click-through rate but low purchases may be hidden gems that you could promote on your homepage and recommended lists to increase revenue. If traffic from a particular campaign or keyword has a low click-through rate overall, then the marketing message may be a bad match for the products offered - similar to having a high bounce rate.

2. Add-to-cart rate

Of the product details viewed, how many products were added to the cart? If visitors to your store normally land straight on the product details page, or you have a low number of SKUs, then the add-to-cart rate is the more useful metric. A low add-to-cart rate could be caused by uncompetitive pricing, a weak product description, or issues with the detailed features of the product. Obviously, it will also drop if you have limited variants (sizes or colours) in stock. Again, it's worth looking at whether particular marketing campaigns have lower add-to-cart rates, as it means that particular audience just isn't interested in your product.

3. Cart-to-checkout rate

The number of checkout processes started, divided by the number of sessions where a product is added to the cart. A low rate may indicate that customers are shopping around for products - they add to cart, but then go to check a similar product on another site. It could also mean customers are unclear about shipping or return options before they decide to pay. Is the rate especially low for customers from a particular country, or for products with unusual shipping costs?

4. Checkout conversion rate

The number of visitors paying for their cart, divided by those that start the checkout process. Shopify provides a standard checkout process, optimised for ease of transaction, but the conversion rate can still vary between sites, depending on payment options and desire. Put simply: if your product is a must-have, customers will jump through any hoops to complete the checkout. Yet for impulse purchases, or luxury items, any tiny flaw in the checkout experience will reduce conversion. Is the checkout conversion worse for particular geographies? It could be that shipping or payment options are worrying users. Does using an order coupon or voucher at checkout increase the conversion rate? With Littledata's app you can split out the checkout steps to decide whether the issue is shipping or payment.

5. Refund rate

The percentage of transactions refunded. Refunds are a growing issue for all ecommerce, but especially fashion retail. You legally have to honour refunds, but are you taking them into account in your marketing analysis? If your refund rate is high, and you base your return on advertising spend on gross sales (before refunds), then you risk burning cash on promoting to customers who just return the product. The refund rate is also essential for merchandising: aside from quality issues, was an often-refunded product badly described or promoted on the site, leading to false expectations?

Conclusion

If you're not finding it easy to get a clear picture of these 5 steps, we're in the process of developing Littledata's new Shopify app. You can join the list to be the first to get a free trial! We ensure all of the above metrics are accurate in Google Analytics, and the outliers can then be analysed in our Pro reports. You can also benchmark your store performance against stores in similar sectors, to decide whether there are tweaks to the store template or promotions you need to make.

Have more questions? Comment below or get in touch with our lovely team of Google Analytics experts!
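To show how the five ratios fit together, here is a minimal sketch in plain JavaScript with hypothetical funnel counts (in practice these would come from your Enhanced Ecommerce reports):

  // Hypothetical monthly funnel counts
  var funnel = {
    productListViews: 50000,
    productDetailViews: 12000,
    addsToCart: 3200,
    sessionsWithAddToCart: 3000,
    checkoutsStarted: 1200,
    transactions: 600,
    refunds: 45
  };

  function rate(numerator, denominator) {
    return (100 * numerator / denominator).toFixed(1) + '%';
  }

  console.log('1. Product list click-through rate: ' + rate(funnel.productDetailViews, funnel.productListViews));
  console.log('2. Add-to-cart rate: ' + rate(funnel.addsToCart, funnel.productDetailViews));
  console.log('3. Cart-to-checkout rate: ' + rate(funnel.checkoutsStarted, funnel.sessionsWithAddToCart));
  console.log('4. Checkout conversion rate: ' + rate(funnel.transactions, funnel.checkoutsStarted));
  console.log('5. Refund rate: ' + rate(funnel.refunds, funnel.transactions));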

2016-11-30

The referral exclusion list: what it is and how to update it

The referral exclusion list is only available for properties using Universal Analytics... so please make the jump and take advantage of the benefits! Let's find out how excluding referral traffic affects your data and how you can correct some of the wrong attributions of sales.

By default, a referral automatically triggers a new session. When you exclude a referral source, traffic that arrives at your site from the excluded domain doesn't trigger a new session. Because each referral triggers a new session, excluding referrals (or not excluding them) affects how sessions are calculated in your account: the same interaction can be counted as either one or two sessions, depending on how you treat referrals.

For example, a user on my-site.com goes to your-site.com and then returns to my-site.com. If you do not exclude your-site.com as a referring domain, two sessions are counted, one for each arrival at my-site.com. If, however, you exclude referrals from your-site.com, the second arrival at my-site.com does not trigger a new session, and only one session is counted.

Common uses for the referral exclusion list in Google Analytics:

Third-party payment processors
Cross-subdomain tracking

If you add example.com to the list of referral exclusions, traffic from the domain example.com and the subdomain another.example.com is excluded; traffic from another-example.com is not. Only traffic from the domain entered in the referral exclusion list, and any of its subdomains, is excluded - traffic from domains that only have substring matches is not.

How to add domains to the referral exclusion list:

1. Sign in to your Google Analytics account.
2. Click Admin in the menu bar at the top of any page.
3. In the Account column, use the drop-down to select the Google Analytics account that contains the property you want to work with.
4. In the Property column, use the drop-down to select a property.
5. Click Tracking Info.
6. Click Referral Exclusion List.
7. To add a domain, click +Add Referral Exclusion.
8. Enter the domain name and click Create to save.

The referral exclusion list uses 'contains' matching: if you enter example.com, then traffic from sales.example.com is also excluded (because that domain name contains example.com).

Need help with these steps? Get in touch with one of our experts and we'd be happy to assist you!
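As an illustration only (not Google's actual implementation), here is a rough JavaScript sketch of the domain-and-subdomain rule described above, where a domain and its subdomains are excluded but substring-only matches are not:

  // Rough illustration of the exclusion rule described above
  function isExcluded(referrerDomain, excludedDomain) {
    return referrerDomain === excludedDomain ||
      referrerDomain.endsWith('.' + excludedDomain);
  }

  console.log(isExcluded('example.com', 'example.com'));         // true
  console.log(isExcluded('another.example.com', 'example.com')); // true - subdomain
  console.log(isExcluded('another-example.com', 'example.com')); // false - substring only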

2016-11-29

4 common pitfalls of running conversion rate experiments from Microsoft

At a previous Measurefest conference, one of the speakers, Craig Sullivan, recommended a classic research paper from Microsoft on common pitfalls in running conversion rate experiments. It details five surprising results which took 'multiple-person weeks to properly analyse' at Microsoft, and which were published for the benefit of all. As the authors point out, this stuff is worth spending a few weeks getting right, as 'multi-million-pound business decisions' rest on the outcomes. Above all, the research points to the importance of A/A testing. Here follows an executive overview, cutting out some of the technical analysis.

1. Beware of conflicting short-term metrics

Bing's management had two high-level goals: query share and revenue per search. The problem is that it is possible to increase both of those and yet create a bad long-term company outcome, by making the search algorithm worse. If you force users to make more searches (increasing Bing's share of queries) because they can't find an answer, they will click on more adverts as well.

"If the goal of a search engine is to allow users to find their answer or complete their task quickly, then reducing the distinct queries per task is a clear goal, which conflicts with the business objective of increasing share."

The authors suggest that a better metric in most cases is lifetime customer value, and that executives should try to understand where shorter-term metrics might conflict with that long-term goal.

2. Beware of technical reasons for experiment results

The Hotmail link on the MSN home page was changed to open Hotmail in a separate tab/window. The naive experiment results showed that users clicked more on the Hotmail link when it opened in a new window, but the majority of the observed effect was artificial. Many browsers kill the previous page's tracking JavaScript when a new page loads - with Safari blocking the tracking script on 50% of pages opening in the same window. The "success" of getting users to click more was not real, but rather an instrumentation difference: it wasn't that more people were clicking on the link, just that more of the clicks were being tracked in the 'open in new tab' variant.

3. Beware of peeking at results too early

When we release a new feature as an experiment, it is really tempting to peek at the results after a couple of days and see if the test confirms our expectation of success (confirmation bias). With the initial small sample there will be a big percentage change, and humans have an innate tendency to see trends where there aren't any. The authors give the example of a chart of early results: even though the results are negative, most experimenters would extrapolate the trend to a positive result by day four. Wrong. What actually happens is regression to the mean. The chart is in fact from an A/A test (i.e. the two versions being tested are exactly the same): the random differences are biggest at the start and then tail off, so the long-term result will be a 0% difference as the sample size increases. The simple advice is to wait until there are enough results to draw a statistically significant conclusion. That generally means more than a week and hundreds of individual test results.

4. Beware of the carryover effect from previous experiments

Many A/B testing systems use a bucketing system to assign users to one experiment or another, and at the end of one test the same buckets of users may be reused for the next test. The problem is that if users return to your product regularly (multiple times daily in the case of Bing), then a highly positive or negative experience in one of the tests will affect everyone in that bucket for many weeks. In one Bing experiment, which accidentally introduced a nasty bug, users who saw the buggy version were still making fewer searches 6 months after the experiment ended. Ideally, your testing system would re-randomise users at the start of every new test, so that carryover effects are spread as widely as possible.

Summary

For me the biggest theme coming out of this research is the importance of A/A tests - seeing what kind of variation and results you get if you don't change anything - which makes you more aware of the random fluctuations inherent in statistical tests. In conclusion, you need to think about the possible sources of bias before acting on your tests. Even the most experienced analysts make mistakes!

Have any comments? Let us know what you think below!
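To see the regression to the mean from pitfall 3 for yourself, here is a minimal A/A simulation sketch in plain JavaScript (the 5% conversion rate and sample sizes are made up): both variants are identical, yet small samples show large apparent differences that shrink as the sample grows.

  // A/A test simulation: both variants share the same true conversion rate
  var trueRate = 0.05;

  function apparentLift(visitorsPerVariant) {
    var a = 0, b = 0;
    for (var i = 0; i < visitorsPerVariant; i++) {
      if (Math.random() < trueRate) a++;
      if (Math.random() < trueRate) b++;
    }
    return 100 * (b - a) / (a || 1); // % difference is purely random noise
  }

  [100, 1000, 10000, 100000].forEach(function (n) {
    console.log(n + ' visitors per variant: ' + apparentLift(n).toFixed(1) + '% apparent lift');
  });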

2016-11-27

5 tips to avoid a metrics meltdown when upgrading to Universal Analytics

Universal Analytics promises some juicy benefits over the previous standard analytics, but having upgraded 6 different high-traffic sites we know there are some pitfalls to be aware of.

Firstly, why would you want to upgrade your tracking script?

More reliable tracking of page visitors - i.e. fewer visits untracked
More customisation to exclude certain referrers or search terms
Better tools for tracking across multiple domains and tracking users across different devices
Tracking of usage across your apps for the same web property
Ability to send up to 20 custom dimensions, instead of the previous limit of only 5 custom variables

If you want to avoid any interruption of service when you upgrade, why not book a quick consultation with us to check whether Universal Analytics will work in your case. But before you start, take note of the following.

1. Different tracking = overall visits change

If your boss is used to seeing dependable weekly or monthly numbers, they may query why the number of visits has changed. Universal Analytics is likely to track around 2% more visits than previously (partly due to different referral tracking - see below), but it could be higher depending on your mix of traffic.

PRO TIP: Set up a new web property (a different tracking code) for Universal Analytics and run the old and new trackers alongside each other for a month. Then you can see how the reports differ before sharing them with managers. Once this testing period is over you'll need to upgrade the original tracking code to Universal Analytics so you keep all your historic data.

2. Different tracking of referrals

Previously, if Bob clicked on a link in Twitter to your site, read the page, went back to Twitter, and within 30 minutes clicked on a different link to your site, that was counted as one visit and the second referral source was ignored. In Universal Analytics, when Bob clicks on the second link he is tracked as a second visit, and the second referral source is stored. This may be more accurate for marketing tracking, but if Bob then buys a product from you, going via a secure payment gateway hosted on another domain (e.g. paypal.com), then the return from the payment gateway will be counted as a new visit. All your payment goals or ecommerce tracking will be attributed to a referral from 'paypal.com'. This will ruin your attribution of a sale to the correct marketing channel or campaign!

PRO TIP: You need to add all of the payment gateways (or other third-party sites a user may visit during the payment process) to the 'Referral Exclusion List'. You can find this under the Admin > Property > Tracking Info menu.

3. Tracking across domains

If you use the same tracking code across different domains (e.g. mysite.co.uk and mysite.com or mysite.de) then you will need to change the standard tracking script slightly. By default the tracking script you copy from Google Analytics contains a line like ga('create', 'UA-XXXXXXX-1', 'mysite.com'); - this will only track pages that strictly end with 'mysite.com'.

PRO TIP: It's much safer to change the tracker to set the cookie domain automatically. The equivalent for the site above would be ga('create', 'UA-XXXXXXX-1', 'auto'); - the third argument of the function is replaced with 'auto'.

4. Incompatibility with custom variables

(Only relevant if you already send custom data.) Custom variables are only supported historically in Universal Analytics. That means you will need to change any scripts that send custom data to the new custom dimension format to keep data flowing.
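A minimal sketch of the kind of change involved (the slot number, dimension index and values here are hypothetical):

  // Classic analytics (ga.js) - custom variable in slot 1, session scope
  _gaq.push(['_setCustomVar', 1, 'memberType', 'premium', 2]);

  // Universal Analytics (analytics.js) - the equivalent custom dimension
  ga('set', 'dimension1', 'premium');
  ga('send', 'pageview'); // the dimension is sent along with the next hit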
Read the developer documentation for more.

PRO TIP: You'll need to set the custom dimension names in the admin panel before the custom data can be sent from the pages. You can only check that the custom dimensions are being sent correctly by creating a new custom report for each dimension.

5. User tracking limitations

We wouldn't recommend implementing the new User ID feature just now, as it has some major limitations compared with storing the GA client ID:

You need to create a separate view to see the logged-in-user data, which makes reporting pageviews a whole lot more complex.
Visits a user made to your site BEFORE signing up are not tracked against that user, which means you can't track the marketing sources by user.

PRO TIP: See our user tracking alternative.

Got more tips on setting up Universal Analytics? Please share them with us in the comments, or get in touch if you want more advice on how to upgrade!

2016-11-26

Widget Tracking with Google Analytics

I was asked recently about the best way to track a widget, loaded in an iframe on a third-party site, with Google Analytics. The difficulty is that many browsers now block third-party cookies (those set by a different domain to the one in the browser address bar) - and this applies to a Google Analytics cookie for widgets as much as to adverts.

The best solution seems to be to use local storage in the browser (also called HTML5 storage) to hold a persistent identifier for Analytics and bypass the need to set a cookie - but then you have to manually create a clientId to send to Google Analytics. See the approach used by ShootItLive. However, as their comment on line 41 says, this is not a complete solution, because there are plenty of browsers beyond Safari which block third-party cookies. I would take the opposite approach: check whether the browser supports local storage, and only revert to trying to set a cookie if it does not. Local storage is now available in 90% of browsers in use, and the browsers with the worst third-party cookie support (Firefox and Safari) luckily have the longest-standing support for local storage.

As a final note, I would set up the tracking on a different Google Analytics property to your main site, so that pageviews of widgets are not confused with pageviews of your main site.

To-do list (see the sketch below):

1. Build a script to create a valid clientId for each new visitor.
2. Call the ga('create') function, setting 'storage': 'none' and passing the clientId from local storage (or the newly created one).
3. Send a pageview (or event) every time the widget is loaded. Since the widget page is likely to be the same every time it is embedded, you might want to store the document referrer (the parent page URL) instead.

Need help with the details? Get in touch with our team of experts and we'd be happy to help!
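Here is a minimal sketch of that to-do list in plain JavaScript, assuming the standard analytics.js snippet has already loaded (the storage key and property ID are placeholders):

  // Get a persistent clientId from local storage, or create one if missing
  function getClientId() {
    var id = null;
    try {
      id = window.localStorage.getItem('ga_client_id');
      if (!id) {
        // Any reasonably unique string is accepted as a clientId
        id = String(Date.now()) + '.' + Math.random().toString(36).slice(2);
        window.localStorage.setItem('ga_client_id', id);
      }
    } catch (e) {
      // Local storage unavailable - fall back to letting GA try to set a cookie
    }
    return id;
  }

  var clientId = getClientId();
  var fields = clientId ? { storage: 'none', clientId: clientId } : {};
  ga('create', 'UA-XXXXXXX-1', fields); // use a separate property for the widget

  // Report the parent page rather than the (identical) widget URL
  ga('set', 'page', document.referrer || '/widget');
  ga('send', 'pageview');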

2016-11-25

How to link AdWords and Google Analytics

If you are running an AdWords campaign, you should also have a Google Analytics account. We will show you how to link these two accounts so you can unleash the full reporting potential of both platforms.

1. Why should you link Analytics and AdWords?

When you link Google Analytics and AdWords, you can:

See ad and site performance data in the AdWords reports in Google Analytics.
Import Google Analytics goals and ecommerce transactions directly into your AdWords account.
Import valuable Analytics metrics, such as bounce rate, avg. session duration, and pages/session, into your AdWords account.
Take advantage of enhanced remarketing capabilities.
Get richer data in the Google Analytics multi-channel funnels reports.
Use your Google Analytics data to enhance your AdWords experience.

2. How to link Google Analytics and AdWords

The linking wizard makes it easy to link your AdWords account(s) to multiple views of your Google Analytics property. If you have multiple Google Analytics properties and want to link each of them to your AdWords account(s), just complete the linking wizard for each property.

1. Sign in to your Google Analytics account at www.google.com/analytics. Note: you can also quickly open Google Analytics from within your AdWords account - click the Tools tab, select Analytics, and then follow the rest of these instructions.
2. Click the Admin tab at the top of the page.
3. In the Account column, select the Analytics account that contains the property you want to link to one or more of your AdWords accounts.
4. In the Property column, select the Analytics property you want to link, and click AdWords Linking.
5. Select the AdWords accounts you want to link with your Analytics property: tick the checkbox next to any AdWords accounts you want to link. If you have an AdWords manager (MCC) account, tick the checkbox next to the manager account to link it (and all of its child accounts) with your Analytics property. If you want to link only a few managed accounts, expand the manager account by clicking the arrow next to it, then tick the checkbox next to each managed AdWords account you want to link. Or click All linkable to select all managed AdWords accounts under that MCC; you can then deselect individual accounts and the others will stay selected.
6. Click the Continue button.
7. In the Link configuration section, enter a link group title to identify your group of linked AdWords accounts. Note: most users will only need one link group. We recommend creating multiple link groups only if you have multiple AdWords accounts and want data to flow in different ways between those accounts and your Analytics property - for example, if you need to link different AdWords accounts to different views of the same Google Analytics property, or to enable auto-tagging for only some of your AdWords accounts.
8. Select the Google Analytics views in which you want the AdWords data to be available.
9. If you've already enabled auto-tagging in your AdWords account, skip to the next step. Otherwise, the account linking process will enable auto-tagging for all of your linked AdWords accounts. Click Advanced settings only if you need to manually tag your AdWords links.
10. Click the Link accounts button.

Congratulations! Your accounts are now linked. If you opted to keep auto-tagging turned on (recommended), Google Analytics will automatically start associating your AdWords data with customer clicks.
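If you do choose manual tagging instead of auto-tagging (the Advanced settings option in step 9), each AdWords destination URL needs campaign parameters appended by hand. Here is a minimal sketch in plain JavaScript, with hypothetical landing page and campaign values, of what a manually tagged URL looks like:

  // Hypothetical landing page and campaign values
  var landingPage = 'https://www.mysite.com/shoes';
  var params = {
    utm_source: 'google',
    utm_medium: 'cpc',
    utm_campaign: 'winter_sale',
    utm_term: 'running shoes'
  };

  var query = Object.keys(params).map(function (key) {
    return key + '=' + encodeURIComponent(params[key]);
  }).join('&');

  console.log(landingPage + '?' + query);
  // https://www.mysite.com/shoes?utm_source=google&utm_medium=cpc&utm_campaign=winter_sale&utm_term=running%20shoes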
For a deeper dive, and for debugging, you should also read the Google Analytics guide. Have any questions on setting this up? Get in touch and we'd be happy to help!

2016-11-24

It’s Black Sunday – not Black Friday

The biggest day for online retail sales among Littledata's clients is the Sunday after Black Friday, followed closely by the last Sunday before Christmas.

Which is more important - Black Friday or Cyber Monday? Cyber Monday saw the biggest year-on-year increase in daily sales, across 84 surveyed retailers from the UK and US. In fact, Cyber Monday is blurring into the Black Friday weekend phenomenon, as shoppers get used to discounts being available for longer. We predict that this trend will continue for 2016, with the number of sales days extending before and after Black Friday.

Interested in what 2016 will bring? Stay tuned for our upcoming blog post! Want to see how you did against the benchmark? Sign up for a free trial or get in touch if you have any questions!

2016-11-23
