Shopify Marketing Events vs Google Analytics
At the Shopify Unite conference today I heard plenty of great ideas, such as Shopify Pay, but the most interesting for me as a data specialist was the Marketing Events API. Since we launched our Fix Google Analytics Shopify app earlier this year we’ve known that reporting was a weak spot in Shopify’s platform offering, and they admit that ‘understanding marketing campaign performance’ is one of the biggest challenges for Shopify merchants right now. The ability for other Shopify apps to plug their campaign cost and attribution data into Shopify (via the Marketing Events API) is a logical step towards building Shopify’s own analytics capability, but I don’t believe it will be a substitute for Google Analytics (GA) anytime soon. Here’s why:

1. Google Analytics is the industry standard

Every online marketer has used Google Analytics, and many have favourite reports they’ve learned to interpret. Moving them to a whole new analysis platform will take time – and it has taken GA 10 years to achieve that dominance.

2. GA provides platform-agnostic data collection

For a store using Shopify as its only source of insights, moving away from Shopify would mean losing all the historic marketing performance data – so it would be very hard to make like-for-like comparisons between the old platform and the new. Many of our customers have used GA during and after a platform shift to get continuous historical data. Which ties into my first point: over 85% of businesses have a history of data in GA.

3. Incomplete marketing tagging will still cause issues

Valid analysis of multi-channel marketing performance relies on having ALL the campaigns captured – which is why our GA audit tool checks for completeness of campaign tagging. Shopify’s tracking relies on the same ‘utm_campaign’ parameters as GA, and campaigns that are not properly tagged at the time cannot be fixed retrospectively.

4. Google is rapidly developing Google Analytics

I’d like to see the Shopify marketing event collection evolve from its launch yesterday, but Google already has a team of hundreds working on Google Analytics, and it seems unlikely that Shopify will be able to dedicate the resources to keep up with the functionality that power users need.

5. More integrations are needed for full campaign coverage

Shopify’s marketing analysis will only be available for apps that upgrade to the new API. Marketing Events has launched with integrations for Mailchimp and Facebook (via Kit), but it won’t cover many of the major channels (other email providers, AdWords, DoubleClick for Publishers) that stores use. Those integrations will get built in time, but until then any attribution will be skewed.

6. GA has many third-party integrations

Our experience is that any store interested in campaign attribution quickly wants more custom analysis or cuts of the data. Being able to export the data into Littledata’s custom reports (or Google Sheets or Excel) is a popular feature – and right now Shopify lacks a reporting API to provide the same customisations; you can only pull raw event data back out.

That said, there are flaws in how GA attribution works. Importing campaign cost data into GA is difficult and time-consuming – apart from the seamless integration with AdWords – and as a result hardly any of the stores we monitor do so. If Shopify can encourage those costs to be imported along with the campaign dates, then return-on-investment calculations will be much easier for merchants.

I also think Shopify has taken the right pragmatic approach to attribution windows. It counts a campaign as ‘assisting’ the sale if the sale happens within 30 days of the campaign, and also records whether the campaign was the ‘last click’ or ‘first click’. I’ve never seen a good reason to get more complicated than that with multi-channel reports in GA, and it’s unlikely that many customers remember a campaign from longer than 30 days ago.
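For developers curious what plugging campaign data into Shopify looks like, here is a minimal sketch of building the JSON body an app would POST to Shopify’s marketing events endpoint (`/admin/marketing_events.json` in the REST Admin API). The field names here follow the API as I understand it, but treat the exact endpoint path, required fields and accepted values as assumptions to verify against Shopify’s current documentation:

```python
import json

def build_marketing_event(utm_campaign, utm_source, utm_medium,
                          started_at, budget=None, currency=None):
    """Build the JSON body for Shopify's marketing events endpoint.

    Field names follow Shopify's REST Admin API at the time of writing
    (POST /admin/marketing_events.json) -- check the current docs, as
    versions and required fields may have changed.
    """
    event = {
        "event_type": "ad",             # e.g. "ad", "post", "message"
        "marketing_channel": "social",  # e.g. "social", "email", "search"
        "paid": budget is not None,
        "started_at": started_at,       # ISO 8601 timestamp
        "utm_campaign": utm_campaign,
        "utm_source": utm_source,
        "utm_medium": utm_medium,
    }
    if budget is not None:
        event["budget"] = budget
        event["currency"] = currency or "USD"
    return {"marketing_event": event}

payload = build_marketing_event(
    "spring_sale", "facebook.com", "cpc",
    "2017-04-20T10:00:00Z", budget=250.0, currency="USD")
print(json.dumps(payload, indent=2))
```

Note that the `utm_*` fields are the same parameters GA uses for campaign tagging, which is why point 3 above applies equally to Shopify’s attribution.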
In conclusion, we love that Shopify is starting to take marketing attribution seriously, and we look forward to helping improve the marketing events feature from its launch yesterday, but we recommend that anyone with a serious interest in their marketing performance stick with Google Analytics in the meantime (and use our Shopify app to do so).
How does page load speed affect bounce rate?
I’ve read many articles stating a link between faster page loading and better user engagement, but with limited evidence. So I looked at hard data from 1,840 websites and found that there’s really no correlation between page load speed and bounce rate in Google Analytics. Read on to find out why.

The oft-quoted statistic on page load speed is from Amazon, where each 100ms of extra loading delay was supposed to cost Amazon $160m. Except that the research is from 2006, when Amazon’s pages were very static and users had different expectations from pages – plus the conclusions may not apply to different kinds of site. More recently, in 2013, Intuit presented results at the Velocity conference of how reducing page load time from 15 seconds to 2 seconds had increased customer conversion by:

- +3% conversions for every second reduced, from 15 seconds to 7 seconds
- +2% conversions for every second reduced, from 7 seconds to 5 seconds
- +1% conversions for every second reduced, from 4 seconds to 2 seconds

So reducing load time from 15 seconds to 7 seconds was worth an extra 24% conversion, but bringing 7 seconds down to 2 seconds only another 8%.

Does page speed affect bounce rate?

We collected data from 1,840 Google Analytics web properties, where both the full page load time (the delay between the first request and all the items on the page being loaded) and the bounce rate were within normal range. We then applied a Spearman’s rank correlation test, to see whether a site ranked higher for speed (lower page load time) was likely to be ranked higher for bounce rate (lower bounce rate). What we found is almost no correlation (0.18) between page load speed and bounce rate. The same result held when we looked at the correlation (0.22) between bounce rate and the delay before page content starts appearing (time to DOM ready).

So what explains the lack of a link? I have three theories.

1. Users care more about content than speed

Many of the smaller websites we sampled for this research operate in niche industries or locations, where they may be the only source of information on a given topic. As a user, if I already know the target site is my best source for a topic, then I’ll be very patient while the content loads. One situation where users are not patient is when arriving from Google Search, because they know they can find a similar source of information in two clicks (one back to Google, and then out to another site). So we see a very high correlation between bounce rate and the volume of traffic from Google Search. This also means that what should concern you is speed relative to your search competitors, so you could benchmark your site speed against a group of similar websites to measure whether you are above or below average.

2. Bounce rate is most affected by first impressions of the page

As a user landing on your site, I am going to make some critical decisions within the first 3 seconds: do I trust this site, is this the product or content I was expecting, and is it going to be easy to find what I need? If your page can address these questions quickly – by good design and fast loading of the title, main image etc. – then you buy some more time before my attention wanders to other content. In 2009, Google ran an experiment showing users 30 search results instead of 10, but found that clicks on the results dropped by 20%. They attributed this to the extra half a second it took to load the pages. But the precise issue was likely that it took half a second longer to load the first search result. Since users of Google mainly click on the first 3 results, the important metric is how long it took to load those – not the full page load.

3. Full page load speed is increasingly hard to measure

Many websites already use lazy loading of images and other non-blocking loading techniques to make sure the bare bones of a page are fast to load, especially on a mobile device, before the chunkier content (like images and videos) is loaded. This means there is no hard line marking the moment when a page is ready for the user to interact with. SpeedCurve, a tool focussed entirely on web page speed performance, has a more accurate way of tracking when the page is ‘visually complete’, based on filmstrips of the page loading. But in their demo of The Guardian’s page speed, the page is not visually complete until a video advert has rendered in the bottom right of the screen – and personally I’d be happy to use the page before then. What you can do with Google Analytics is send custom timing events, maybe after the key product image on a page has loaded, so you can measure speed as relevant to your own site.

But doesn’t speed still affect my Google rankings? A little bit, yes, but when Google incorporated speed as a ranking signal in 2010, their head of SEO explained it was likely to penalise only the 1% of websites that were really slow. And my guess is that in 7 years Google has increased the sophistication with which it measures ‘speed’.

So overall you shouldn’t worry about page load times on their own. A big increase may still signal a problem, but you should be focussing on conversion rates or page engagement as safer metrics. If you do want to measure speed, try to define a custom speed measurement for the content of your site – and Littledata’s experts can work with you to set that custom reporting up.
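For the statistically minded: Spearman’s rank correlation is just a Pearson correlation computed on the ranks of the values rather than the values themselves. Below is a self-contained sketch of the test we ran, using invented random data in place of the real 1,840-site dataset – so it demonstrates only the mechanics, not our 0.18 result:

```python
import random

def ranks(values):
    """Rank values from 1 (smallest) upward; ties get averaged ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(xs, ys):
    """Spearman's rho = Pearson correlation of the two rank sequences."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

random.seed(1)
# Hypothetical stand-in for the real dataset: 1,840 sites whose load
# times and bounce rates are, by construction, unrelated.
load_times = [random.uniform(1, 15) for _ in range(1840)]
bounce_rates = [random.uniform(20, 80) for _ in range(1840)]
print(round(spearman_rho(load_times, bounce_rates), 2))
```

With genuinely unrelated data the printed rho sits near 0; a value of 1 would mean the slowest sites always had the highest bounce rates, and −1 the reverse. In production you would use `scipy.stats.spearmanr`, which also returns a p-value.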
The Freemium business model revisited
After I concluded that freemium is not the best business model for all, the continued rise of ‘free’ software has led me to revisit the question.

In a fascinating piece of research by Price Intelligently, over 10,000 technology executives were surveyed over 5 years. Their willingness to pay for core features of B2B software has declined from 100% in 2013 to just over 50% today, as a whole wave of VC-funded SaaS companies has flooded the market with free product. For add-ons like analytics, this drops to less than 30% willing to pay.

“The relative value of features is declining. All software is going to $0” – Patrick Campbell, Price Intelligently

Patrick sees this as an extension of the trend in physical products, where offshoring, global scale and cheaper routes to market online have led to relentless price depreciation (in real terms). I’m not so sure. Software is not free to manufacture, although the marginal cost is close to zero, since cloud hosting is so cheap. The fixed cost is the people-time to design and build the components, and the opportunities for lowering that cost – through offshoring the work or more productive software frameworks – have already been exploited by most SaaS companies.

To pile on the pain, a survey of software executives also found that the average number of competitors in any given niche has increased from 10 to 15 over those 3 years. Even if software build costs are falling, those costs are being spread over a smaller number of customers – lowering the chance of breaking even. And the other big cost – customer acquisition (CAC) – is actually rising with the volume of competition.

To sum up the depressing news so far:

1. Buyers have been conditioned to expect free software, which means you’ll have to give major features away for free
2. But you’ll have to pay more to acquire these non-paying users
3. And next year another competitor will be offering even more for free

What is the route out of this economic hole?
Focussing on monetizing a few existing customers, for one. Most SaaS executives were focussed on acquiring new customers (more logos), probably because with a free product they expected to sweep up the market and worry about monetization later. But this turns out to be the least effective route to building revenue. Price Intelligently calculated how much a 1% improvement in each growth lever would increase revenue: e.g. if I signed up 101 users over the year rather than 100, revenue would increase by 2.3%. Monetization – increasing the Average Revenue Per User (ARPU) – has by far the larger impact, mainly because many customers currently don’t pay anything. In contrast, the impact of customer acquisition has fallen over those 3 years, since the average customer is less likely to pay.

Monetization is not about increasing prices for everyone – or charging for previously free features – but rather finding the small number who are willing to pay, and charging them appropriately.

My company, Littledata, has many parallels with ProfitWell (launched by Price Intelligently). We both offer analytics and insights on top of existing customer data – Littledata for Google Analytics behavioural data, and ProfitWell for recurring revenue data from billing systems. And we have both had similar customer feedback: the perceived value of the reporting is low, but the perceived value of the changes the reporting spurs (better customer acquisition, increased retention etc.) is high. So the value of our software is that it creates a requirement – which can then be filled by consulting work or ‘actionable’ modules.

For myself, I can say that while focusing on new customer acquisition has been depressing, we have grown revenues once a trusted relationship is in place – and the customer really believes in Littledata’s reporting. For Littledata, as with many B2B software companies, we are increasingly content that 80% of our revenue comes from a tiny handful of loyal and satisfied users.
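As a back-of-the-envelope illustration of why the same 1% goes further on monetization than on acquisition: acquiring 1% more users also incurs 1% more acquisition cost, while a 1% ARPU increase is pure margin. Every number in this toy model is invented for illustration – it is not Price Intelligently’s actual model:

```python
# A toy model of a freemium SaaS business -- all numbers invented.
USERS = 10_000          # signups per year
PAYING_SHARE = 0.25     # share who pay anything (the survey suggests <30%)
ARPU_PAYING = 48.0      # monthly revenue per paying user
CAC = 100.0             # cost to acquire each signup, paying or not

def annual_profit(users, paying_share, arpu_paying, cac):
    revenue = users * paying_share * arpu_paying * 12
    return revenue - users * cac

base = annual_profit(USERS, PAYING_SHARE, ARPU_PAYING, CAC)
# 1% more signups: more revenue, but also 1% more acquisition spend.
more_users = annual_profit(USERS * 1.01, PAYING_SHARE, ARPU_PAYING, CAC)
# 1% higher ARPU: extra revenue with no extra acquisition spend.
more_arpu = annual_profit(USERS, PAYING_SHARE, ARPU_PAYING * 1.01, CAC)

print(f"+1% acquisition: {more_users / base - 1:+.2%} profit")
print(f"+1% monetization: {more_arpu / base - 1:+.2%} profit")
```

In this sketch the acquisition uplift is always exactly 1% (revenue and CAC scale together), while the monetization uplift is levered above 1% by the free users and the acquisition spend it doesn’t touch – the more of your base that pays nothing, the bigger the gap.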
In conclusion, while the cover price of software subscriptions is going to zero, it is still possible to generate profits as a niche SaaS business – if you accept the necessity of charging a few customers more when the many are unwilling to pay. Freemium may be here to stay, but if customers want the software companies they rely on to stay around, they need to pay for the benefits. Would you like to discuss further? Comment below or get in touch!
Shine a light on ‘dark’ Facebook traffic
If Facebook is a major channel for your marketing, whether for sponsored posts or organic ones, then you’re underestimating the visits and sales it brings. The problem is that Facebook doesn’t play nicely with Google Analytics, so some of the traffic from the Facebook mobile app arrives as a DIRECT visit. That’s right: if a Facebook user clicks on your post in the native mobile app, they won’t always appear as a Facebook social referral. This is ‘dark Facebook’ traffic: it is from Facebook, but you just can’t see it. Since around 40% of Facebook activity happens in a mobile app, the Facebook traffic you see could be up to 40% less than the total. Facebook hasn’t shown much interest in fixing the issue (Twitter fixed it, so it is possible), so you need to fix it in your own Google Analytics account. Here are three approaches:

1. Basic: use campaign tagging

The simplest fix, for your own posts or sponsored links on Facebook, is to attach UTM campaign tags to every link. Google provides a simple URL builder to help. The essential tags to add are “utm_source=facebook.com” and “utm_medium=referral”. This will override the ‘direct’ channel and put all clicks on those links into the Facebook referral bucket. Beyond that, you can add useful tags like “utm_campaign=events_page” so you can see how many users click through from your Facebook events specifically.

2. Moderate: use a custom segment to see the traffic

What if much of your traffic is from enthusiastic brand advocates, sharing your pages or articles with their friends? You can’t expect them all to use a URL builder. But you can make a simple assumption: most users on a mobile device are not going to type a long URL into their browser address bar. So if a user comes from a mobile device, and isn’t visiting your homepage (or a short URL you deliberately post), then they are probably coming from a mobile app.
If your website is consumer-facing, then in all probability that mobile app is Facebook. So we can create a custom segment in GA for traffic which (a) comes from a mobile device, (b) does not have a referrer or campaign (i.e. direct), and (c) does not land on the homepage. To start, create a segment where the source contains ‘facebook’. Then add the ‘Direct mobile, not to homepage’ segment. Next, create a custom report to show sessions by hour. You should see a strong correlation; on the two web properties I tested, this resulted in doubling the traffic I had attributed to Facebook.

3. Advanced: attribute micro-spikes to Facebook

Caveat: you’ll need a large volume of traffic – in excess of 100 visits from Facebook a day – to try this at home.

The final trick has been proven to work at The Guardian newspaper for Facebook traffic to news articles. Most Facebook activity is very transitory: active users click on a trending newsfeed item, but it quickly fades in interest. So, using the Google Analytics API, you could look for the ‘micro-spikes’ in referrals from Facebook on a minute-by-minute basis, then look at the direct mobile visits which came at the same time, and add these direct spikes to the total Facebook traffic. I’ve played around with this and it’s difficult to get right, due to the sampling Google applies, but I did manage to spot spikes of around 5 minutes that had a strong correlation with the underlying direct mobile traffic.

Could these approaches work for your site? I’m interested to hear.
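To make the micro-spike idea concrete, here is a toy sketch with invented minute-by-minute session counts standing in for data pulled from the Google Analytics API. It flags minutes where Facebook referrals spike above a simple baseline, and counts the simultaneous excess of direct mobile sessions as ‘dark’ Facebook traffic:

```python
# Toy minute-by-minute session counts -- in practice these would be
# pulled from the Google Analytics API. All numbers are invented.
from statistics import mean, stdev

facebook = [4, 5, 3, 4, 30, 28, 6, 4, 5, 3, 4, 5]        # referral sessions/min
direct_mobile = [10, 11, 9, 10, 42, 39, 12, 10, 11, 9, 10, 11]

baseline_fb = mean(facebook)
baseline_dm = mean(direct_mobile)
# Call any minute more than two standard deviations above the
# Facebook baseline a 'micro spike'.
threshold = baseline_fb + 2 * stdev(facebook)

# During spike minutes, count the direct-mobile sessions above their
# own baseline as dark Facebook traffic.
dark = sum(direct_mobile[i] - baseline_dm
           for i, fb in enumerate(facebook)
           if fb > threshold)
print(f"estimated dark Facebook sessions: {dark:.0f}")
```

A real implementation would need to handle GA sampling, diurnal traffic patterns (a rolling baseline rather than a flat mean) and overlapping spikes from other apps – which is exactly why this trick is hard to get right.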
WWII Codebreaking and Interpretation
Reading Max Hastings’s excellent book on The Secret War, 1939-1945, I was struck by the parallel between the rise of radio communications in the 1930s and the more recent rise of internet data.

The transmission of military and diplomatic messages by radio in the 1930s and 1940s provided intelligence agencies with a new gold mine. Never before had so much potential intelligence been floating in the ether, and yet it threatened to flood their limited manpower with a tide of trivia. The bottleneck was rarely in the interception (trivial with a radio set) or even the decryption (made routine by Bletchley Park with the Enigma codes), but rather in filtering down to the tiny number of messages that contained important facts – and getting that information in real time to the commanders in the field. The Ultra programme (Britain’s decryption of German radio intercepts) was perennially understaffed, because other civil servants couldn’t be told how important it was. At Ultra’s peak in 1943, only around 50% of the 1,500 Luftwaffe messages a day were being processed – and it is unknown how many of those were processed in time to avert bombing raids.

“The new age of technology provided an almost infinitely wide field for exploration, as well as the means of addressing this: the trick was to focus attention where it mattered.” – The Secret War, page 203

The ‘new age of technology’ in the last two decades poses much the same problem. Data on internet behaviour is abundant: there are countless signals to listen to about your website’s performance, and the technology to monitor users is commonplace. And the bottleneck is still the same: filtering out the useful signals, and getting those insights to the ‘commanders’ who need them in real time. I started Littledata to solve this modern problem of interpreting website analytics for managers of online businesses.
There is no decryption involved, but there is a lot of statistics and data-visualisation know-how in making billions of data points comprehensible to a company manager. Perhaps the most important aspect of our service is providing insights in answer to a specific question:

“Group-Captain Peter Stewart, who ran the Royal Air Force’s photo-reconnaissance operations, was exasperated by a senior officer who asked for ‘all available information’ on one European country. Stewart responded that he could only provide useful information if he knew roughly what intelligence the suppliant wanted – ‘naval, military, air or ecclesiastical’.” – The Secret War, page 203

In the world of online commerce, the question is something like whether the client needs insights into the checkout conversion rate of all customers (to improve site design) or for a specific marketing campaign (to improve campaign targeting). By focusing on insights which are relevant to the scale, stage or sector of the client company, and making these accessible in a real-time dashboard, Littledata can feed into decision-making in a way that raw data never can.

Want to discuss this further? Get in touch or comment below!