Is Google Analytics accurate? 6 common issues and how to resolve them
Our customers come from a range of industries, but when they first come to the Littledata app for help with fixing their analytics, they share a lot of common questions. First of all, is Google Analytics accurate? How do you know if your Google Analytics setup is giving you reliable data? In this blog post we look at common problems and explain what can be done to make your tracking more accurate.

Google Analytics is used by tens of millions of websites and apps around the world to measure web visitor engagement. It won’t measure 100% of visitors – due to some users opting out of being tracked, or blocking cookies – but set up correctly, it should be measuring over 95% of genuine visitors (as opposed to web scrapers and bots). So what are the common things that go wrong?

The six most common issues with Google Analytics – and how to resolve them

1. Your tracking script is wrongly implemented

There are two common issues with the tracking script setup: it is implemented twice on some pages, or it is missing completely from other pages. The effect of duplicating the script is an artificially low bounce rate (usually below 5%), since every page view is sent twice to Google Analytics. The effect of the script missing from pages is that you’ll see self-referrals from your own website. Our recommendation is to use Google Tag Manager across the whole site, to ensure the tracking script is loaded with the right web property identifier, at the right time during the page load.

2. Your account has lots of spam

When it comes to web traffic and analytics setup, spam is a serious issue. Spammers send 'ghost' referrals to get your attention as a website owner. This means that the traffic you see in Google Analytics may not come from real people, even if you have selected to exclude bots. Littledata’s app filters out all future spammers, and Pro Reporting users benefit from having those filters updated weekly.

3. Your own company traffic is not excluded

Your web developers, content writers and marketers will be heavy users of your own site, and you need to filter this traffic from your Google Analytics to get a view of genuine customers or prospects. You can do this based on location (e.g. IP address) or the pages they visit (e.g. admin pages).

4. One person shows up as two or more users

Fight Club aside (spoiler alert), when the same person re-visits our site we expect them to look the same each time. Web analytics is more complicated. What Google Analytics is tracking when it talks of ‘users’ is a visit from a particular device or browser instance. So if I have a smartphone and a laptop and visit your site from both devices (without cross-device linking), I’ll appear as two users. Even more confusingly, if I visit your site from the Facebook app on my phone and then from the Twitter app, I’ll appear as two users – because those two apps use two different browser instances. There's not a lot that can be done to fix this right now, although Google is looking at ways to use its accounts system (Gmail, Chrome etc) to track users across many devices.

5. Marketing campaigns are not attributed to revenue or conversions

If the journey of visitors on your site proceeds via another payment processor or gateway, you could be losing the link between the sale (or goal conversion) and the original marketing campaign. You will see sales attributed to Direct or Referral traffic when they actually came from somewhere else. This is a remarkably common issue with Shopify stores, and that’s why we built a popular Shopify reporting app that solves the issue automatically. For other kinds of sites, the issue can often be resolved by setting up cross-domain tracking.

6. You aren't capturing key events (like purchases or button clicks)

Google Analytics only tracks views of a page by default, which may not be meaningful if you have a highly interactive website or app. Sending custom events is the key to ensuring that your tracking is both accurate and relevant. Google Tag Manager makes this easier than it would be otherwise, but you may need to speak to a qualified analytics consultant to decide what to track.

If you want more certainty that your analytics is fully accurate, try Littledata's free Google Analytics audit or get in touch for a quick consultation. We <3 analytics and we're always here to help.
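To make the custom-event point concrete, here is a minimal sketch using the analytics.js `ga()` command queue. The category, action and label names are illustrative – which interactions are worth tracking depends on your site:

```javascript
// Send a custom event hit via the analytics.js command queue.
// The guard means the call is simply skipped if GA hasn't loaded yet.
function trackClick(category, action, label) {
  if (typeof ga === 'function') {
    ga('send', 'event', category, action, label);
  }
}

// Example wiring (browser only) -- fire when a signup button is clicked:
//   document.querySelector('#signup').addEventListener('click', function () {
//     trackClick('Button', 'click', 'signup header');
//   });
```

If you deploy through Google Tag Manager instead, the equivalent is pushing an event into the `dataLayer` and firing a GA event tag on it.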
Why are all my transactions coming from Direct or Referral in Google Analytics, with no marketing attribution?
Connecting marketing data with sales data is an age-old problem, and the crowded digital landscape has made this even more complicated. Google Analytics is supposed to give you the power to attribute sales (or purchase transactions) back to marketing campaigns, but this doesn't happen automatically. The good news is that it's entirely possible to get the right marketing channel attribution for sales activities.

Accurate marketing attribution starts with the right Google Analytics (GA) setup. Start by asking yourself the following troubleshooting questions. These steps will help you figure out whether your GA setup is correct, and how to use GA to get a complete view of user behaviour. Trustworthy GA setup takes a bit of work, but with a smart analytics dashboard like Littledata, much of that work can be automated. In fact, steps 1 through 4 can be checked automatically with our free Google Analytics audit tool.

First of all, are you checking the right report? The best way to see the attribution is in the 'Channels' report in Google Analytics, under the 'Acquisition' section.

1. Have you got a large enough sample to compare?

Firstly, can you be sure the sales are representative? If you only have two sales, and both are ‘Direct’, that could be a fluke. We recommend selecting a long enough time period to look at more than 50 transactions before judging.

2. Is the tracking script on your purchase confirmation page set up correctly?

If you are getting some transactions recorded, but not 100%, then it may be possible to optimise the tracking script setup. See our technical guide to ecommerce tracking. This can be a particular problem if many of your sales are on mobile, since slower page load speeds on mobile may be blocking the tracking script more often.

3. Have you got a cross-domain problem?

If you see many of your sales under Referral, and when you click through the list of referrers it includes payment gateways (e.g. mybank.com or shopify.com), that is a tell-tale sign you have a cross-domain problem. This means that when the buyer is referred back from the payment domain (e.g. paypal.com), their payment is not linked with the original session. This is almost always a problem for Shopify stores, which is why our Shopify app is essential for accurate tracking.

4. Is your marketing campaign tagging complete?

For many types of campaign (Facebook, email etc), unless you tag the link with the correct ‘UTM’ parameters, the source of the purchaser will not be tracked. So if a user clicks on an untagged Facebook Ad link in the Facebook mobile app (which is where 80 – 90% of Facebook users engage), the source of their visit will be ‘Direct’ (not Social). Untagged email campaigns are a particular issue if you run abandoned cart / basket emails, as these untagged links will be 'stealing' the sales which should be attributed to whatever got the buyer to add to cart. Tagging is a real problem for Instagram, since currently the profile link is shown in full – and looks really messy if you include all the UTM parameters. We recommend using a service like Bitly to redirect to your homepage (or an Instagram landing page), i.e. the link redirects to yoursite.com?utm_medium=social&utm_source=instagram&utm_campaign=profile_link. Read Caitlin Brehm's guide to Instagram links.

5. (only for subscription businesses using Littledata) Are you looking only at the first-time payments?

Tracking the source of recurring payments is impossible if the tracking setup was incorrect at the time of the first payment. You can’t change Google Analytics retrospectively, I’m afraid. So if you are using our ReCharge integration, and you want to track lifetime value, you will have to be patient for a few months as data from the correct tracking builds up.

6. Is a lot of your marketing via offline campaigns, word of mouth or mobile apps?

It could be that your sales really are ‘direct’:

If a buyer types in the URL from a business card or flyer, that is ‘Direct’.
If a user pastes a link to your product in WhatsApp, that is ‘Direct’.
If a user sees your product on Instagram and clicks on the profile link, that is ‘Direct’.

The only way to change this is to use a link shortener to redirect to a tagged-up link (see point 4 above).

Please let us know if there are any further issues you've seen which cause the marketing attribution to be incorrect.
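Hand-typing UTM parameters is error-prone, so it can help to generate the tagged links. A minimal sketch using the standard `URL` API – the landing page and parameter values are illustrative:

```javascript
// Append the standard GA campaign parameters to a landing page URL.
function tagUrl(base, { medium, source, campaign }) {
  const url = new URL(base);
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}

console.log(tagUrl('https://yoursite.com/', {
  medium: 'social',
  source: 'instagram',
  campaign: 'profile_link',
}));
// https://yoursite.com/?utm_medium=social&utm_source=instagram&utm_campaign=profile_link
```

A Bitly short link can then point at the tagged URL, keeping the visible Instagram profile link tidy.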
What to test with Google Optimize
So you’ve got a brand new tool in your web performance kit – Google Optimize – and now you want to put it to good use. What can you test with Optimize, and how does it work? Firstly, what are the different options for setting up an experiment?

AB Test

Using the in-page editor you can create an altered version of the page you wish to test. This could be a change of text copy, different styling, or swapping in a different image. You can also add new scripts or HTML if you’re familiar with coding. The way this works is that Optimize adds a script after the page loads to manipulate the page text, images or styles. I recommend not switching header elements or large images using this method as, depending on your website setup, there may be a noticeable flicker – try a redirection test (below) instead. You can create many versions with subtly different changes (C, D and E versions if you want) – but remember you’ll need a large volume of traffic to spot significant differences between lots of variations. You can also limit the test to a certain segment of users – maybe only first-time visitors, or those on mobile devices.

Multivariate Test

Similar to an AB test, a multivariate test is used when you have a few different aspects of the page to change (e.g. image and headline text) and you want to see which combination is most engaging. To get a significant result, you'll need a large volume of traffic – even more than when testing many options in AB tests.

Redirection Test

This is where you have two different versions of a page – or a different flow you want to start users on. Optimize will split your visitors, so some see the original page and some are redirected to the B version. A redirection test is best when the page content or functionality is very different – perhaps using a whole different layout. The disadvantage is you’ll need a developer to build the B version of the page, which may limit the speed of cycling tests.

Personalisation

Personalisation is not officially supported by Optimize right now, but we’ve found it to be a useful tool. You can assign 99.9% of the visitors who match certain criteria to see the alternative version of the page. An example is where you have a special offer or local store in a particular city – see our step-by-step local personalisation example. You can ensure that all the visitors from that city see a different version of the page. Unfortunately, on the free version of Google Optimize you are limited to 3 concurrent ‘experiments’ – so it won’t be a good solution if you want to run similar personalisation across lots of cities or groups of users.

Next, the question is where to start with tests...

Start with the landing pages

Landing pages get the greatest volume of traffic, and are where small visual changes (as opposed to new product features) make the biggest difference to user engagement. This greater volume allows you to get a significant result quicker, meaning you can move on to the next test quicker. And keep on improving!

So what exactly could you test using Google Optimize? Here are six ideas to get you going.

1. Could calls-to-action (CTAs) be clearer?

Changing the colour or contrast of a key button or link on the page (within your brand guidelines) usually results in more visitors clicking it. This might involve changing the style of the CTA itself, or removing elements close by on the page – to give the CTA more space to stand out.

2. Are you giving the user too many choices?

In Steve Krug’s classic Don’t Make Me Think he explains how any small confusion in the user’s mind can stop them making any choice. Every choice the user has to make is an opportunity for them to give up. Try hiding one of the options and seeing if more users overall choose any of the remaining options.

3. Is the mobile page too long?

As many sites move to responsive designs that switch layout on smaller screens, mobile pages have become very long. Users may get ‘scroll fatigue’ before they get to critical elements on the page. Try cutting out non-essential sections for mobile users, or editing copy or images to make the page shorter. You could also try switching sections so that the call-to-action is higher up the page on mobile – although this is harder to achieve without a redirection test.

4. Is localisation important to your users?

You may have discussed providing local language content for your users, and been unsure if it is worth the costs of translation and maintenance. Why not test the benefits for a single location? As with the personalisation tests, you can show a different local language (or local currency) version of the page to half the users in a single location (e.g. Spanish for visitors from Mexico) and see if they convert better.

5. Does the user need more reassurance before starting to buy?

It is easier to build experiments which remove elements from the page, but you should also consider adding extra explanation messages. A common problem on ecommerce stores is that visitors are unsure what the shipping charges or timing will be before adding to cart. Could you add a short sentence at the start of the journey (maybe on a product page) to give an outline of your shipping policy? Or maybe some logos of the payment methods you accept?

6. Changing header navigation

If your site has a complex mix of products that has evolved over time, it may be time to try a radical new categorisation – maybe splitting products by gender or price point rather than by type. For this test, you’ll want to target only new visitors – so you don’t confuse regular visitors until you’re sure the change is permanent. You will also need to make the navigation changes on all pages across the site.

Good luck! Littledata also offers consulting and AB testing support, so please contact us for any further advice.
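For intuition on how any split test divides traffic, here is a toy illustration of deterministic bucketing: hash a stable visitor ID into a variant, so the same visitor always sees the same version. This is an illustration only, not Google Optimize's actual assignment algorithm:

```javascript
// Deterministically assign a visitor to variant 'A' or 'B' from a stable
// client ID, so repeat visits always get the same experience.
function variantFor(clientId, weightB = 0.5) {
  // Simple 32-bit string hash mapped onto [0, 1).
  let hash = 0;
  for (const ch of clientId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash / 0x100000000 < weightB ? 'B' : 'A';
}

// The same ID always lands in the same bucket:
const clientId = 'GA1.2.123456789.1500000000';
console.log(variantFor(clientId) === variantFor(clientId)); // true
```

Over many visitors the hash spreads roughly evenly, so `weightB` controls the share of traffic seeing the B version.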
How to add account edit permissions for Google Analytics
Being able to edit a Google Analytics account is the second-highest permission level. You need this if you want to create a new web property in Google Analytics. To grant permissions to another user you will need the highest permission level yourself: being able to manage users on the account.

Step 1: Go to the account user settings page

First click the admin cog in any view under the account in GA you want to change, and then in the left-hand list go to User Settings.

Step 2: Grant the edit permission

EITHER select an existing user from the list and check the 'edit' checkbox, OR add a new user's email (it must be a Google account) and check the 'edit' checkbox.

Step 3: Check it's working

Your colleague should now be able to see 'Create new property' under the list of properties in the middle of the Admin page.
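If you need to grant the same access across many accounts, it can also be done programmatically via the Google Analytics Management API (v3) rather than the UI. A sketch of the `accountUserLinks.insert` request body – the email address is a placeholder, and you should check the current API reference before relying on the exact field names:

```javascript
// Body for POST /analytics/v3/management/accounts/{accountId}/entityUserLinks
// 'EDIT' is the second-highest permission level; 'MANAGE_USERS' is the highest.
const userLink = {
  userRef: { email: 'colleague@example.com' }, // must be a Google account
  permissions: { local: ['EDIT'] },
};

console.log(JSON.stringify(userLink));
```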
Shopify Marketing Events vs Google Analytics
At the Shopify Unite conference today I heard plenty of great ideas, such as Shopify Pay, but the most interesting for me as a data specialist was the Marketing Events API. Since we launched our Fix Google Analytics Shopify app earlier this year we’ve known that reporting was a weak spot in Shopify’s platform offering, and they admit that ‘understanding marketing campaign performance’ is one of the biggest challenges for Shopify merchants right now. The ability for other Shopify apps to plug their campaign cost and attribution data into Shopify (via the Marketing Events API) is a logical step towards building Shopify’s own analytics capability, but I don’t believe it will be a substitute for Google Analytics (GA) anytime soon. Here’s why:

1. Google Analytics is the industry standard

Every online marketer has used Google Analytics, and many have favourite reports they’ve learned to interpret. Moving them to a whole new analysis platform will take time – it’s taken GA 10 years to achieve that dominance.

2. GA provides platform-agnostic data collection

For a store using Shopify as their only source of insights, moving away from Shopify would mean losing all the historic marketing performance data – so it would be very hard to make like-for-like comparisons between the old platform and the new. Many of our customers have used GA during and after a platform shift to get continuous historical data. Which ties into my first point: over 85% of businesses have a history of data in GA.

3. Incomplete marketing tagging will still cause issues

Valid analysis of multi-channel marketing performance relies on having ALL the campaigns captured – which is why our GA audit tool checks for completeness of campaign tagging. Shopify’s tracking relies on the same ‘utm_campaign’ parameters as GA, and campaigns that are not properly tagged at the time cannot be altered retrospectively.

4. Google is rapidly developing Google Analytics

I’d like to see the Shopify marketing event collection evolve from its launch yesterday, but Google already has a team of hundreds working on Google Analytics, and it seems unlikely that Shopify will be able to dedicate the resources to keep up with the functionality that power users need.

5. More integrations are needed for full campaign coverage

Shopify’s marketing analysis will only be available for apps that upgrade to the new API. Marketing Events has launched with integrations for Mailchimp and Facebook (via Kit), but it won’t cover many of the major channels (other email providers, AdWords, DoubleClick for Publishers) that stores use. Those integrations will get built in time, but until then any attribution will be skewed.

6. GA has many third-party integrations

Our experience is that any store interested in their campaign attribution quickly wants more custom analysis or cuts of the data. Being able to export the data into Littledata’s custom reports (or Google Sheets or Excel) is a popular feature – and right now Shopify lacks a reporting API to provide the same customisation; you can only pull raw event data back out.

That said, there are flaws with how GA attribution works. Importing campaign cost data is difficult and time-consuming in GA – apart from the seamless integration with AdWords – and as a result hardly any of the stores we monitor do so. If Shopify can encourage those costs to be imported along with the campaign dates, then the return-on-investment calculations will be much easier for merchants. I also think Shopify has taken the right pragmatic approach to attribution windows: it counts a campaign as ‘assisting’ the sale if the sale happens within 30 days of the campaign, and also records whether the campaign was the ‘last click’ or ‘first click’. I’ve never seen a good reason to get more complicated than that with multi-channel reports in GA, and it’s unlikely that many customers remember a campaign from longer than 30 days ago.

In conclusion, we love that Shopify is starting to take marketing attribution seriously, and we look forward to helping improve the Marketing Events feature, but we recommend that anyone with a serious interest in their marketing performance sticks with Google Analytics in the meantime (and uses our Shopify app to do so).
Important update to Remarketing with Google Analytics
How does page load speed affect bounce rate?
I’ve read many articles claiming a link between faster page loading and better user engagement, but with limited evidence. So I looked at hard data from 1,840 websites and found that there’s really no correlation between page load speed and bounce rate in Google Analytics. Read on to find out why.

The oft-quoted statistic on page load speed is from Amazon, where each 100ms of extra loading delay was supposed to cost Amazon $160m. Except that the research is from 2006, when Amazon’s pages were very static and users had different expectations from pages – plus the conclusions may not apply to different kinds of site. More recently, in 2013, Intuit presented results at the Velocity conference of how reducing page load time from 15 seconds to 2 seconds had increased customer conversion by:

+3% conversions for every second reduced, from 15 seconds down to 7 seconds
+2% conversions for every second reduced, from 7 seconds down to 4 seconds
+1% conversions for every second reduced, from 4 seconds down to 2 seconds

So reducing load time from 15 seconds to 7 seconds was worth an extra 24% conversion, but only another 8% to bring 7 seconds down to 2 seconds.

Does page speed affect bounce rate?

We collected data from 1,840 Google Analytics web properties where both the full page load time (the delay between the first request and all the items on the page being loaded) and the bounce rate were within the normal range. We then applied a Spearman’s rank correlation test, to see whether a site ranked higher for speed (lower page load time) was likely to be ranked higher for bounce rate (lower bounce rate). What we found is almost no correlation (0.18) between page load speed and bounce rate. The same result was found if we looked at the correlation (0.22) between bounce rate and the delay before page content starts appearing (time to DOM ready).

So what explains the lack of a link? I have three theories.

1. Users care more about content than speed

Many of the smaller websites we sampled for this research operate in niche industries or locations, where they may be the only source of information on a given topic. As a user, if I already know the target site is my best source for a topic, then I’ll be very patient while the content loads. One situation where users are not patient is when arriving from Google Search, when they know they can go and find a similar source of information in two clicks (one back to Google, and then out to another site). So we see a very high correlation between bounce rate and the volume of traffic from Google Search. This also means that what should concern you is speed relative to your search competitors, so you could benchmark your site speed against a group of similar websites to measure whether you are above or below average.

2. Bounce rate is most affected by first impressions of the page

As a user landing on your site, I am going to make some critical decisions within the first 3 seconds: do I trust this site, is this the product or content I was expecting, and is it going to be easy to find what I need? If your page can address these questions quickly – through good design and fast loading of the title, main image etc – then you buy some more time before my attention wanders to other content. In 2009, Google ran an experiment showing 30 search results to users instead of 10, and found that clicks on the results dropped by 20%. They attributed this to the extra half a second it took to load the pages. But the precise issue was likely that it took half a second longer to load the first search result. Since users of Google mainly click on the first 3 results, the important metric is how long it took to load those – not the full page load.

3. Full page load speed is increasingly hard to measure

Many websites already use lazy loading of images and other non-blocking loading techniques to make sure the bare bones of a page load fast, especially on a mobile device, before the chunkier content (like images and videos) is loaded. This means the time when a page is ready for the user to interact with is not a hard line. SpeedCurve, a tool focussed entirely on web page speed performance, has a more accurate way of tracking when the page is ‘visually complete’, based on actual filmstrips of the page loading. But in their demo of The Guardian's page speed, the page is not visually complete until a video advert has rendered in the bottom right of the screen – and personally I’d be happy to use the page before then. What you can do with Google Analytics is send custom timing events – maybe after the key product image on a page has loaded – so you can measure speed as relevant to your own site.

But doesn’t speed still affect my Google rankings? A little bit, yes – but when Google incorporated speed as a ranking signal in 2010, their head of SEO explained it was likely to penalise only the 1% of websites which were really slow. And my guess is that in 7 years Google has increased the sophistication with which it measures ‘speed’. So overall you shouldn’t worry about page load times on their own. A big increase may still signal a problem, but you should focus on conversion rates or page engagement as a safer metric. If you do want to measure speed, try to define a custom speed measurement for the content of your site – Littledata’s experts can work with you to set that custom reporting up.
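The Spearman's rank correlation used in this research is simple to reproduce: rank each variable, then take the Pearson correlation of the ranks. A self-contained sketch with illustrative numbers (not our actual dataset):

```javascript
// Rank values, averaging the ranks of tied entries.
function ranks(values) {
  const sorted = values.map((v, i) => [v, i]).sort((a, b) => a[0] - b[0]);
  const r = new Array(values.length);
  let i = 0;
  while (i < sorted.length) {
    let j = i;
    while (j + 1 < sorted.length && sorted[j + 1][0] === sorted[i][0]) j++;
    const avg = (i + j + 2) / 2; // average of the 1-based ranks i+1 .. j+1
    for (let k = i; k <= j; k++) r[sorted[k][1]] = avg;
    i = j + 1;
  }
  return r;
}

// Pearson correlation coefficient of two equal-length arrays.
function pearson(x, y) {
  const n = x.length;
  const mx = x.reduce((a, b) => a + b, 0) / n;
  const my = y.reduce((a, b) => a + b, 0) / n;
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Spearman's rank correlation = Pearson correlation of the ranked data.
function spearman(x, y) {
  return pearson(ranks(x), ranks(y));
}

// Illustrative data: full page load time (s) vs bounce rate (%) per site.
const loadTime = [2.1, 8.4, 3.0, 5.5, 1.2, 9.9];
const bounce = [48, 51, 62, 40, 55, 47];
console.log(spearman(loadTime, bounce).toFixed(2)); // prints -0.49
```

A value near 0 (like the 0.18 we measured across 1,840 sites) means a site's speed rank tells you almost nothing about its bounce-rate rank.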
The Freemium business model revisited
After I concluded that freemium is not the best business model for all, the continued rise of ‘free’ software has led me to revisit the same question. In a fascinating piece of research by Price Intelligently, over 10,000 technology executives were surveyed over 5 years. Their willingness to pay for core features of B2B software has declined from 100% in 2013 to just over 50% today – as a whole wave of VC-funded SaaS companies has flooded the market with free product. For add-ons like analytics, this drops to less than 30% willing to pay.

“The relative value of features is declining. All software is going to $0” – Patrick Campbell, Price Intelligently

Patrick sees this as an extension of the trend in physical products, where offshoring, global scale and cheaper routes to market online have led to relentless price depreciation (in real terms). I’m not so sure. Software is not free to manufacture, although the marginal cost is close to zero – since cloud hosting costs are so cheap. The fixed cost is the people-time to design and build the components, and the opportunities for lowering that cost – through offshoring the work or more productive software frameworks – have already been exploited by most SaaS companies. To pile on the pain, a survey of software executives also found that the average number of competitors in any given niche has increased from 10 to 15 over those 3 years. Even if software build costs are falling, those costs are being spread over a smaller number of customers – making the chance of breaking even lower. And the other big cost – customer acquisition cost (CAC) – is actually rising with the volume of competition. To sum up the depressing news so far:

1. Buyers have been conditioned to expect free software, which means you’ll have to give major features away for free
2. But you’ll have to pay more to acquire these non-paying users
3. And next year another competitor will be offering even more for free

What is the route out of this economic hole?
Focussing on monetising a few existing customers, for one. Most SaaS executives are focussed on acquiring new customers (more logos), probably because with a free product they expected to sweep up the market and worry about monetization later. But this turns out to be the least effective route to building revenue. For each growth lever, Price Intelligently calculated how much a 1% improvement would increase revenue: for acquisition, if I signed up 101 users over the year rather than 100, that would increase revenue by 2.3%. Monetization – increasing the Average Revenue Per User (ARPU) – has by far the larger impact, mainly because many customers currently pay nothing at all. In contrast, the impact of customer acquisition has fallen over the 3 years, since the average customer is less likely to pay. Monetization is not about increasing prices for everyone – or charging for previously free features – but rather finding the small number who are willing to pay, and charging them appropriately.

My company, Littledata, has many parallels to Profit Well (launched by Price Intelligently). We both offer analytics and insights on top of existing customer data – Littledata for Google Analytics behavioural data, and Profit Well for recurring revenue data from billing systems. And we have both had similar customer feedback: the perceived value of the reporting is low, but the perceived value of the changes which the reporting spurs (better customer acquisition, increased retention etc) is high. So the value of our software is that it creates a requirement – which can then be filled by consulting work or ‘actionable’ modules. For myself, I can say that while focusing on new customer acquisition has been depressing, we have grown revenues once a trusted relationship is in place – and the customer really believes in Littledata’s reporting. For Littledata, as with many B2B software companies, we are increasingly content that 80% of our revenue comes from a tiny handful of loyal and satisfied users.
In conclusion, while the cover price of software subscriptions is going to zero, it is still possible to generate profits as a niche SaaS business – if you understand the necessity of charging more to a few customers when the many are unwilling to pay. Freemium may be here to stay, but if customers want the software companies they rely on to stay around, they need to pay for the benefits. Would you like to discuss further? Comment below or get in touch!