Shopify Marketing Events vs Google Analytics

At the Shopify Unite conference today I heard plenty of great ideas, such as Shopify Pay, but the most interesting for me as a data specialist was the Marketing Events API. Since we launched our Fix Google Analytics Shopify app earlier this year we’ve known that reporting was a weak spot in Shopify’s platform offering, and Shopify admits that ‘understanding marketing campaign performance’ is one of the biggest challenges for Shopify merchants right now. The ability for other Shopify apps to plug their campaign cost and attribution data into Shopify (via the Marketing Events API) is a logical step towards building Shopify’s own analytics capability, but I don’t believe it will be a substitute for Google Analytics (GA) anytime soon. Here’s why:

1. Google Analytics is the industry standard

Every online marketer has used Google Analytics, and many have favourite reports they’ve learned to interpret. Moving them to a whole new analysis platform will take time – it’s taken GA 10 years to achieve that dominance.

2. GA provides platform-agnostic data collection

For a store using Shopify as their only source of insights, moving away from Shopify would mean losing all their historic marketing performance data – so it would be very hard to make like-for-like comparisons between the old platform and the new. Many of our customers have used GA during and after a platform shift to get continuous historical data. This ties back to my first point: over 85% of businesses already have a history of data in GA.

3. Incomplete marketing tagging will still cause issues

Valid analysis of multi-channel marketing performance relies on having ALL the campaigns captured – which is why our GA audit tool checks for completeness of campaign tagging. Shopify’s tracking relies on the same ‘utm_campaign’ parameters as GA, and campaigns that were not properly tagged at the time cannot be altered retrospectively.

4. Google is rapidly developing Google Analytics

I’d like to see Shopify’s marketing event collection evolve from this initial launch, but Google already has a team of hundreds working on Google Analytics, and it seems unlikely that Shopify will be able to dedicate the resources to keep up with the functionality that power users need.

5. More integrations are needed for full campaign coverage

Shopify’s marketing analysis will only be available for apps that upgrade to the new API. Marketing Events has launched with integrations for Mailchimp and Facebook (via Kit), but it won’t cover many of the major channels (other email platforms, AdWords, DoubleClick for Publishers) that stores use. Those integrations will get built in time, but until then any attribution will be skewed.

6. GA has many third-party integrations

Our experience is that any store interested in campaign attribution quickly wants more custom analysis or cuts of the data. Being able to export the data into Littledata’s custom reports (or Google Sheets or Excel) is a popular feature – and right now Shopify lacks a reporting API to provide the same customisations; you can only pull raw event data back out.

That said, there are flaws with how GA attribution works. Importing campaign cost data into GA is difficult and time consuming – apart from the seamless integration with AdWords – and as a result hardly any of the stores we monitor do so. If Shopify can encourage those costs to be imported along with the campaign dates, then return on investment calculations will be much easier for merchants.
I also think Shopify has taken the right pragmatic approach to attribution windows. It counts a campaign as ‘assisting’ a sale if the sale happens within 30 days of the campaign, and also records whether that campaign was the ‘last click’ or ‘first click’ (there is a minimal sketch of this logic at the end of this post). I’ve never seen a good reason to get more complicated than that with multi-channel reports in GA, and it’s unlikely that many customers remember a campaign from more than 30 days ago.

In conclusion, we love that Shopify is starting to take marketing attribution seriously, and we look forward to helping improve the marketing events feature from this launch, but we recommend that anyone with a serious interest in their marketing performance sticks to Google Analytics in the meantime (and uses our Shopify app to do so).
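To make that attribution rule concrete, here is a minimal sketch of the logic as I understand it from the announcement. The function name and data shapes are my own illustration, not Shopify’s API:

[code language="javascript"]
// Hypothetical sketch: mark which campaign clicks 'assisted' an order, using a
// 30-day window, and flag the first and last click. None of these names come
// from Shopify's Marketing Events API - they are placeholders for the idea.
function attributeOrder(order, campaignClicks) {
  var WINDOW_MS = 30 * 24 * 60 * 60 * 1000; // 30 days in milliseconds

  // Keep only clicks that happened before the purchase, within the window,
  // sorted from earliest to latest
  var assists = campaignClicks.filter(function (click) {
    var gap = order.purchasedAt - click.clickedAt;
    return gap >= 0 && gap <= WINDOW_MS;
  }).sort(function (a, b) { return a.clickedAt - b.clickedAt; });

  return assists.map(function (click, i) {
    return {
      campaign: click.campaign,
      assisted: true,
      firstClick: i === 0,
      lastClick: i === assists.length - 1
    };
  });
}

// Example with made-up data: both campaigns assisted; the Facebook click is
// the first click and the Mailchimp click is the last click
var result = attributeOrder(
  { purchasedAt: Date.parse('2017-04-20') },
  [
    { campaign: 'facebook-spring-sale', clickedAt: Date.parse('2017-04-01') },
    { campaign: 'mailchimp-newsletter', clickedAt: Date.parse('2017-04-18') }
  ]
);
[/code]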

2017-04-21

Important update to Remarketing with Google Analytics

If you got this email from Google recently, or have seen the blue notification bar at the top of Google Analytics, here's what is changing and how it affects your website.

The big problem in modern online marketing is that most users have multiple devices, and the device they interact with the advert on is not the same as the one they convert on: [Google’s] research shows that six in ten internet users start shopping on one device but continue or finish on a different one. Facebook has been helping advertisers track conversion across devices for a few years – because most Facebook ads are served in its mobile app, while most conversion happens on larger screens – so Google has been forced to play catch-up.

Here’s the message from the Google Analytics header:

Starting May 15, 2017, all properties using Remarketing with Google Analytics will be enhanced to take advantage of new cross-device functionality. This is an important update to your remarketing settings, which may relate to your privacy policy.

The change was announced last September but has only just rolled out. So you can remarket to users on a different device to the one on which they visited your site when:

- You build a retargeting audience in Google Analytics
- You have opted in to remarketing tracking in Google Analytics (see the sketch below)
- Users are logged into Google on more than one device
- Users have allowed Google to link their web and app browsing history with their Google account
- Users have allowed their Google account to personalise the ads they see across the web

This may seem like a hard-to-reach audience, but Google has two secret weapons: Gmail (used by over 1 billion people, 75% of them on mobile) and Chrome (now the dominant web browser on desktop, and growing on mobile). So there are many cases where Google knows which devices are linked to a user.

What is not changing is how Google counts users in Google Analytics. Unless you are tracking registered users, a ‘user’ in Google Analytics will still refer to one device (tablet, mobile or laptop / desktop computer).

Could Google use their account information to make Google Analytics cross-device user tracking better? Yes, they could; but Google has always been careful to keep their own data about users (the actions users take on Google.com) separate from the data individual websites capture in Google Analytics (the actions users take on mywebsite.com). The former is owned by Google, and protected by a privacy agreement between Google and the user; the latter is owned by the website adding the tracking code, but stored and processed by Google Analytics. Blurring those two would create a legal minefield for Google, which is why they stress the word ‘temporarily’ in their explanation of cross-device audiences:

In order to support this feature, Google Analytics will collect these users’ Google-authenticated identifiers, which are Google’s personal data, and temporarily join them to your Google Analytics data in order to populate your audiences.

How can I make use of the new cross-device retargeting? The first step is to create a remarketing audience from a segment of your website visitors who are already engaged. This could be users who have viewed a product, users who have viewed the pricing page, or users who have viewed more than a certain number of pages. For more help on setting up the right goals to power the remarketing audience, please contact us.
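For reference, the remarketing opt-in mentioned above can be enabled either in the GA property settings (under Tracking Info > Data Collection) or in the tracking code itself. If your site uses the classic analytics.js snippet, it is the ‘displayfeatures’ plugin – a minimal sketch, with a placeholder tracking ID:

[code language="javascript"]
<script>
// Standard analytics.js loader, then enable Advertising Features before the
// pageview is sent. 'UA-XXXXXXX-1' is a placeholder for your own tracking ID.
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-1', 'auto');
ga('require', 'displayfeatures'); // opts this property in to remarketing / advertising features
ga('send', 'pageview');
</script>
[/code]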

2017-04-10

How does page load speed affect bounce rate?

I’ve read many articles stating a link between faster page loading and better user engagement, but with limited evidence. So I looked at hard data from 1,840 websites and found that there’s really no correlation between page load speed and bounce rate in Google Analytics. Read on to find out why.

The oft-quoted statistic on page load speed is from Amazon, where each 100ms of extra loading delay was supposed to cost Amazon $160m. Except that the research is from 2006, when Amazon’s pages were very static and users had different expectations of pages – plus the conclusions may not apply to different kinds of site. More recently, in 2013, Intuit presented results at the Velocity conference on how reducing page load time from 15 seconds to 2 seconds had increased customer conversion by:

+3% conversions for every second reduced, from 15 seconds to 7 seconds
+2% conversions for every second reduced, from 7 seconds to 5 seconds
+1% conversions for every second reduced, from 4 seconds to 2 seconds

So reducing load time from 15 seconds to 7 seconds was worth an extra 24% conversion, but bringing 7 seconds down to 2 seconds was only worth another 8%.

Does page speed affect bounce rate?

We collected data from 1,840 Google Analytics web properties where both the full page load time (the delay from the first request until all the items on the page are loaded) and the bounce rate were within a normal range. We then applied a Spearman’s rank correlation test, to see whether a site ranked higher for speed (lower page load time) was also likely to rank higher for bounce rate (lower bounce rate). What we found is almost no correlation (0.18) between page load speed and bounce rate. The same result was found when we looked at the correlation (0.22) between bounce rate and the delay before page content starts appearing (time to DOM ready).

So what explains the lack of a link? I have three theories.

1. Users care more about content than speed

Many of the smaller websites we sampled for this research operate in niche industries or locations, where they may be the only source of information on a given topic. As a user, if I already know the target site is my best source for a topic, then I’ll be very patient while the content loads. One situation where users are not patient is when arriving from Google Search, where they know they can find a similar source of information in two clicks (one back to Google, and then out to another site). So we see a very high correlation between bounce rate and the volume of traffic from Google Search. This also means that what should concern you is speed relative to your search competitors, so you could benchmark your site speed against a group of similar websites, to measure whether you are above or below average.

2. Bounce rate is most affected by first impressions of the page

As a user landing on your site I am going to make some critical decisions within the first 3 seconds: can I trust this site, is this the product or content I was expecting, and is it going to be easy to find what I need? If your page can address these questions quickly – through good design and fast loading of the title, main image etc. – then you buy some more time before my attention wanders to other content. In 2009, Google tried an experiment showing users 30 search results instead of 10, and found that clicks on the results dropped by 20%. They attributed this to the extra half a second it took to load the pages.
But the precise issue was likely that it took an extra half a second to load the first search result. Since users of Google mainly click on the first 3 results, the important metric is how long it took to load those – not the full page load.

3. Full page load speed is increasingly hard to measure

Many websites already use lazy loading of images and other non-blocking loading techniques to make sure the bare bones of a page are fast to load, especially on a mobile device, before the chunkier content (like images and videos) is loaded. This means the point at which a page is ready for the user to interact with is not a hard line. SpeedCurve, a tool focussed entirely on web page speed performance, has a more accurate way of tracking when the page is ‘visually complete’, based on filmstrips of the page loading. But in their demo of The Guardian’s page speed, the page is not visually complete until a video advert has rendered in the bottom right of the screen – and personally I’d be happy to use the page before then. What you can do with Google Analytics is send custom timing events, maybe after the key product image on a page has loaded, so you can measure speed in a way that is relevant to your own site (see the sketch below).

But doesn’t speed still affect my Google rankings?

A little bit, yes. But when Google incorporated speed as a ranking signal in 2010, their head of SEO explained it was likely to penalise only the 1% of websites which were really slow. And my guess is that in 7 years Google has increased the sophistication with which it measures ‘speed’.

So overall you shouldn’t worry about page load times on their own. A big increase may still signal a problem, but you should be focussing on conversion rates or page engagement as a safer metric. If you do want to measure speed, try to define a custom speed measurement for the content of your site – and Littledata’s experts can work with you to set that custom reporting up.
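As an illustration, here is a minimal sketch of sending such a custom timing event with analytics.js once a key image has finished loading. The element ID and category names are placeholders for your own site:

[code language="javascript"]
<script>
// Minimal sketch, assuming the standard analytics.js tracker ('ga') is already
// loaded on the page, and 'hero-image' is the ID of the image you care about.
(function () {
  var heroImage = document.getElementById('hero-image');
  if (!heroImage || !window.ga) return;

  function sendTiming() {
    // Milliseconds from navigation start until the hero image was ready
    var loadTime = Math.round(performance.now());
    ga('send', 'timing', 'Page content', 'Hero image loaded', loadTime);
  }

  if (heroImage.complete) {
    sendTiming(); // the image was already loaded (e.g. from cache)
  } else {
    heroImage.addEventListener('load', sendTiming);
  }
})();
</script>
[/code]

The timings then appear under Behaviour > Site Speed > User Timings in Google Analytics.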

2017-04-07

The Freemium business model revisited

After I concluded that freemium is not the best business model for all, the continued rise of ‘free’ software has led me to revisit the same question.

In a fascinating piece of research by Price Intelligently, over 10,000 technology executives were surveyed over 5 years. Their willingness to pay for core features of B2B software has declined from 100% in 2013 to just over 50% today, as a whole wave of VC-funded SaaS companies has flooded the market with free product. For add-ons like analytics, this drops to less than 30% willing to pay.

“The relative value of features is declining. All software is going to $0” – Patrick Campbell, Price Intelligently

Patrick sees this as an extension of the trend in physical products, where offshoring, global scale and cheaper routes to market online have led to relentless price depreciation (in real terms). I’m not so sure. Software is not free to manufacture, although the marginal cost is close to zero – since cloud hosting costs are so cheap. The fixed cost is the people-time to design and build the components, and the opportunities for lowering that cost – through offshoring the work or more productive software frameworks – have already been exploited by most SaaS companies.

To pile on the pain, a survey of software executives also found that the average number of competitors in any given niche has increased from 10 to 15 over those 3 years. Even if software build costs are falling, those costs are being spread over a smaller number of customers – making the chance of breaking even lower. And the other big cost – Customer Acquisition (CAC) – is actually rising with the volume of competition. To sum up the depressing news so far:

1. Buyers have been conditioned to expect free software, which means you’ll have to give major features away for free
2. But you’ll have to pay more to acquire these non-paying users
3. And next year another competitor will be offering even more for free

What is the route out of this economic hole? Focussing on monetising a few existing customers, for one. Most SaaS executives were focussed on acquiring new customers (more logos), probably because with a free product they expected to sweep up the market and worry about monetization later. But this turns out to be the least effective route to building revenue. Price Intelligently calculated how much a 1% improvement in each growth lever would increase revenue – i.e. if I signed up 101 users over the year rather than 100, that would increase revenue by 2.3%. Monetization – increasing the Average Revenue Per User (ARPU) – has by far the larger impact, mainly because many customers currently don’t pay anything. In contrast, the impact of customer acquisition has fallen over 3 years, since the average customer is less likely to pay.

Monetization is not about increasing prices for everyone – or charging for previously free features – but rather finding the small number who are willing to pay, and charging them appropriately.

My company, Littledata, has many parallels to Profit Well (launched by Price Intelligently). We both offer analytics and insights on top of existing customer data – Littledata for Google Analytics behavioural data, and Profit Well for recurring revenue data from billing systems. And we have both had similar customer feedback: that the perceived value of the reporting is low, but the perceived value of the changes which the reporting spurs (better customer acquisition, increased retention etc.) is high.
So the value of our software is that it creates a requirement – which can then be filled by consulting work or ‘actionable’ modules. For myself, I can say that while focusing on new customer acquisition has been depressing, we have grown revenues once a trusted relationship is in place – and the customer really believes in Littledata’s reporting. For Littledata, as with many B2B software companies, we are increasingly content that 80% of our revenue comes from a tiny handful of loyal and satisfied users.

In conclusion, while the cover price of software subscriptions is going to zero, it is still possible to generate profits as a niche SaaS business – if you understand the necessity of charging more to a few customers when the many are unwilling to pay. Freemium may be here to stay, but if customers want the software companies they rely on to stay around, they need to pay for the benefits.

Would you like to discuss this further? Comment below or get in touch!

2017-03-10

Shine a light on ‘dark’ Facebook traffic

If Facebook is a major channel for your marketing, whether through sponsored posts or organic ones, then you’re underestimating the visits and sales it brings. The problem is that Facebook doesn’t play nicely with Google Analytics, so some of the traffic from the Facebook mobile app arrives as a DIRECT visit. That’s right – if a Facebook user clicks on your post in the native mobile app, they won’t always appear as a Facebook social referral. This is ‘dark Facebook’ traffic: it is from Facebook, but you just can’t see it.

Since around 40% of Facebook activity is on a mobile app, that means the Facebook traffic you see could be up to 40% less than the total. Facebook hasn’t shown much interest in fixing the issue (Twitter fixed it, so it is possible), so you need to fix this in your own Google Analytics account. Here are three approaches:

1. Basic: use campaign tagging

The simplest way to fix this, for your own posts or sponsored links on Facebook, is to attach UTM campaign tags to every link. Google provides a simple URL builder to help. The essential tags to add are "utm_source=facebook.com" and "utm_medium=referral" – e.g. www.mysite.com/offer?utm_source=facebook.com&utm_medium=referral. This will override the ‘direct’ channel and put all clicks on that link into the Facebook referral bucket. Beyond that, you can add useful tags like "utm_campaign=events_page" so you can see how many users click through from your Facebook events specifically.

2. Moderate: use a custom segment to see traffic

What if much of your traffic is from enthusiastic brand advocates, sharing your pages or articles with their friends? You can’t expect them all to use a URL builder. But you can make a simple assumption: most users on a mobile device are not going to type a long URL into their browser address bar. So if a user comes from a mobile device, and isn’t visiting your homepage (or a short URL you deliberately post), then they are probably coming from a mobile app. If your website is consumer facing, the high probability is that that mobile app is Facebook. So we can create a custom segment in GA for traffic which (a) comes from a mobile device, (b) does not have a referrer or campaign (i.e. direct), and (c) does not land on the homepage.

To start, create a segment where the source contains 'facebook'. Then add the 'Direct mobile, not to homepage' segment described above. Next, you can create a custom report to show sessions by hour for both segments. You should see a strong correlation – which on the two web properties I tested meant doubling the traffic I had attributed to Facebook.

3. Advanced: attribute micro spikes to Facebook

Caveat: you’ll need a large volume of traffic – in excess of 100 visits from Facebook a day – to try this at home.

The final trick has been proven to work at The Guardian newspaper for Facebook traffic to news articles. Most Facebook activity is very transitory – active users click on a trending newsfeed item, but interest quickly fades. So what you could do, using the Google Analytics API, is look for the ‘micro spikes’ in referrals that come from Facebook on a minute-by-minute basis, then look at the direct mobile visits which came at the same time, and add these direct spikes to the total Facebook traffic (see the sketch below). I've played around with this and it's difficult to get right, due to the sampling Google applies, but I did manage to spot spikes over around 5 minutes that had a strong correlation with the underlying direct mobile traffic.
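Here is a rough sketch of that micro-spike idea, working on two minute-by-minute session counts you have already pulled from the Google Analytics API (the data fetching itself is omitted). The 2x-baseline threshold is an arbitrary assumption you would need to tune for your own traffic:

[code language="javascript"]
// Hypothetical sketch: given minute-by-minute session counts for Facebook
// referrals and for direct mobile traffic (already fetched from the GA API),
// credit the direct-mobile excess in Facebook spike minutes back to Facebook.
function darkFacebookEstimate(facebookByMinute, directMobileByMinute) {
  var average = function (xs) {
    return xs.reduce(function (a, b) { return a + b; }, 0) / xs.length;
  };
  var fbBaseline = average(facebookByMinute);
  var directBaseline = average(directMobileByMinute);
  var extraAttributed = 0;

  for (var i = 0; i < facebookByMinute.length; i++) {
    var isFacebookSpike = facebookByMinute[i] > 2 * fbBaseline; // arbitrary threshold
    var directExcess = directMobileByMinute[i] - directBaseline;
    if (isFacebookSpike && directExcess > 0) {
      // Credit the above-baseline direct mobile sessions in this minute to Facebook
      extraAttributed += directExcess;
    }
  }
  return extraAttributed;
}

// Example with made-up numbers: the third and fourth minutes contain a spike
var extra = darkFacebookEstimate(
  [4, 4, 30, 28, 5, 4],     // Facebook referral sessions per minute
  [30, 28, 70, 65, 31, 29]  // direct mobile sessions per minute
);
// extra is the direct-mobile traffic above baseline during those spike minutes,
// to be added to the Facebook total
[/code]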
Could these approaches work for your site? I'm interested to hear.

(Chart: Dark Social Dominates Online Sharing | Statista)

Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2017-02-09

7 reasons Facebook ads don’t match the data you see in Google Analytics

If you run Facebook Ads and want to see how they perform in Google Analytics, you may have noticed some big discrepancies between the data available in Facebook Ad Manager and in GA. The two systems use different ways of tracking clicks and visitors, so let’s unpick where the differences come from. There are two kinds of metrics you’ll be interested in: ‘website clicks’ – the number of Facebook users who clicked through an advert to your own site – and (if you do ecommerce) the transaction value which was attributed to that advert.

Website Clicks vs Sessions from Facebook

1. GA isn’t picking up Facebook as the referrer

If users click on a link in Facebook’s mobile app and your website opens in an in-app browser, the browser may not log that ‘facebook.com’ was the referrer. You can override this (for this and any other link) by setting the medium, source, campaign and content attributes in the link directly, e.g. www.mysite.com?utm_medium=social&utm_source=facebook.com&utm_campaign=ad

Pro Tip: you can use GA’s URL builder to set the UTM tags on every Facebook campaign link. In GA, under the Admin tab and then ‘Property settings’, you should also tick the box saying ‘Allow manual tagging (UTM values) to override auto-tagging (GCLID values)’ to make this work more reliably.

2. The user leaves the page before the GA tag fires

There’s a time delay between a user clicking on the advert in Facebook and being directed to your site. On a mobile, this delay may be several seconds long, and during the delay the user will think about going back to safety (Facebook’s app) or just closing the app entirely. This happens more often when the visitor is not familiar with your brand, and when the page contents are slow to load. By Facebook’s estimation, the GA tracking won’t fire for anywhere between 10% and 80% of clicks on mobile, but for fewer than 5% of clicks on desktop. It depends on what stage in the page load the GA pixel is requested. If you use a tag manager, you can control this firing order – so try firing the tag as a top priority, when the tag container is first loaded.

Pro Tip: you can also use Google's mobile site speed suggestions to improve mobile load speed, and so reduce this post-click drop-off.

3. A JavaScript bug is preventing GA receiving data from in-app browsers

It’s possible your page has a specific problem that prevents the GA tag firing only in mobile Safari (or the Android equivalent). You’ll need to get your developers to test the landing pages specifically from Facebook’s app. Luckily Facebook Ad Manager has a good way to preview the adverts on your mobile.

Facebook Revenue vs GA Ecommerce Revenue

4. Attribution: post-click vs last non-direct click

Currently, Facebook has two types of attribution: post-view and post-click. This means any sale the user makes after viewing or clicking on the advert, within the attribution window (typically 28 days after clicking and 1 day after viewing), is attributed to that advert. GA, by contrast, can use a variety of attribution models, the default being last non-direct click. This means that if the user clicks on an advert and buys something on the same device within the attribution window (typically 30 days), the sale will be attributed to Facebook. GA doesn't know about views of the advert, and if another campaign brings the same user to your site between the Facebook ad engagement and the purchase, this other campaign takes the credit as the ‘last non-direct click’.
So to match as closely as possible, we recommend setting the attribution window in Facebook to '28 days after clicking the ad' with no 'after view' attribution, and then creating a custom attribution model in GA with the lookback window at 28 days and the attribution 'linear'.

The differences typically come when:

- A user engages with more than one Facebook campaign (e.g. a brand campaign and a re-targeting one), where the revenue will only be counted against the last campaign (with priority for ads clicked over ads viewed)
- A user clicks on a Facebook ad, but then clicks on another advert (maybe AdWords) before buying. Facebook doesn’t know about this second advert, so it will attribute all the revenue to the Facebook ad. GA knows better, and will attribute all (or part) of it to AdWords.

5. Facebook cross-device tracking

The main advantage Facebook has over GA is that users log in to its platform across all of their devices, so it can stitch together the view of a mobile advert on day 1 with a purchase made from the user’s desktop computer on day 2. Here’s a fuller explanation. By contrast, unless that user logs into your website on both devices, and you have cross-device tracking set up, GA won’t attribute the sale to Facebook.

6. Date of click vs date of purchase

In Facebook, revenue is attributed to the date the user saw or clicked the advert; in GA it is attributed to the date of purchase. So if a user clicks on the advert on 1st September and then buys on 3rd September, this will appear on the 1st in Facebook – and on the 3rd in GA.

7. The sampling problem

Finally, did you check whether the GA report is sampled? In the top right of the screen, in the grey bar, you'll see whether the report is based on a sample. If that sample is less than 100%, it means the numbers you see are estimates, and the smaller the sample used, the larger the possibility of error. For example, a 45% sample of 270,000 sessions could skew the results by plus or minus 0.2% in the best case. As a rule of thumb, Google applies sampling when looking over more than 500,000 sessions (even if you select the 'greater precision' option from the drop-down menu). You can check your own sample using this confidence interval calculator, or with the rough calculation sketched at the end of this post.

Conclusion

Altogether, there’s a formidable list of reasons why the data will never be an exact match, but I hope this gives you a way to optimise the tracking. Please let us know if you’ve seen other tracking issues aside from these.
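To illustrate the sampling point, here is a rough sketch of the confidence-interval arithmetic, using a simple normal approximation for a proportion such as a conversion rate. The session counts and the 5% conversion rate below are made-up numbers for illustration, not figures from a real report:

[code language="javascript"]
// Margin of error on a proportion measured from a sample, using the normal
// approximation: MoE = z * sqrt(p * (1 - p) / n) at 95% confidence.
function marginOfError(sampledSessions, proportion) {
  var z = 1.96; // 95% confidence
  return z * Math.sqrt(proportion * (1 - proportion) / sampledSessions);
}

var totalSessions = 270000;
var sampleRate = 0.45;                            // GA reports a 45% sample
var sampledSessions = totalSessions * sampleRate; // 121,500 sessions actually measured
var conversionRate = 0.05;                        // assume a 5% ecommerce conversion rate

var moe = marginOfError(sampledSessions, conversionRate);
// moe is roughly 0.0012, i.e. the true conversion rate is likely within about
// 0.12 percentage points of the sampled figure. The error grows quickly as the
// sample rate - or the number of sessions in the segment you care about - shrinks.
[/code]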

2017-02-08

Cross Domain tracking for Eventbrite using Google Tag Manager (GTM)

Are you using Eventbrite for event registrations? And would you like to see the marketing campaign which drove each event registration correctly attributed in Google Analytics? Then you've come to the right place! Here is a simple guide to adding a Google Tag Manager tag to ensure the correct data is sent to Eventbrite, enabling cross-domain tracking with your own website. Many thanks to the Lunametrics blog for their detailed solution, which we have adapted here for GTM.

Before this will work you need to have:

- links from your site to Eventbrite (including mysite.eventbrite.com or www.eventbrite.co.uk)
- the Universal Analytics tracking code on both your site and your Eventbrite pages
- only one GA tracking code on your own site – otherwise see the Lunametrics article for how to cope with multiple trackers

1. Create a new tag in GTM

Create a new custom HTML tag in GTM and paste this script:

[code language="javascript"]
<script>
(function(document, window) {
  // Uses the first GA tracker registered, which is fine for 99.9% of users.
  // Won't work for browsers older than IE8.
  if (!document.querySelector) return;

  var gaName = window.GoogleAnalyticsObject || "ga";

  // Safely instantiate our GA queue.
  window[gaName] = window[gaName] || function() {
    (window[gaName].q = window[gaName].q || []).push(arguments);
  };
  window[gaName].l = +new Date;

  window[gaName](function() {
    // Defer to the back of the queue if no tracker is ready yet
    if (!window[gaName].getAll().length) {
      window[gaName](bindUrls);
    } else {
      bindUrls();
    }
  });

  function bindUrls() {
    var urls = document.querySelectorAll("a");
    var eventbrite = /eventbrite\./;
    var url, i;

    for (i = 0; i < urls.length; i++) {
      url = urls[i];
      if (eventbrite.test(url.hostname) === true) {
        // Only fetches the client ID if this page has Eventbrite links
        var clientId = getClientId();
        if (!clientId) {
          window.console && window.console.error("GTM Eventbrite Cross Domain: Unable to detect Client ID. Verify you are using Universal Analytics.");
          break;
        }
        // Append the client ID to the link's query string for Eventbrite to pick up
        var parameter = "_eboga=" + clientId;
        url.search = url.search ? url.search + "&" + parameter : "?" + parameter;
      }
    }
  }

  function getClientId() {
    var trackers = window[gaName].getAll();
    return trackers[0].get("clientId");
  }
})(document, window);
</script>
[/code]

2. Set the tag to fire on 'DOM Ready'

Create a new trigger (if you don't have a suitable one) to fire the tag on every page at the DOM Ready stage. We need to make sure the Google Analytics tracker has loaded first.

3. Test the marketing attribution

With the script working, you should see pageviews of the Eventbrite pages as a continuation of the same session. You can test this by:

- Opening the 'real time' reporting tab in Google Analytics, on an unfiltered view
- Searching for your own site in Google
- Navigating to the page with the Eventbrite link and clicking on it
- Looking under the Traffic Sources report and checking you are still listed as organic search after viewing the Eventbrite page

Need more help? Comment below or get in touch!

2017-02-07

WWII Codebreaking and Interpretation

Reading Max Hastings’ excellent book The Secret War, 1939-1945, I was struck by the parallel between the rise of radio communications in the 1930s and the more recent rise in internet data. The transmission of military and diplomatic messages by radio in the 1930s and 1940s provided intelligence agencies with a new gold mine. Never before had so much potential intelligence been floating in the ether, and yet it threatened to flood their limited manpower with a tide of trivia.

The bottleneck was rarely in the interception (trivial with a radio set) or even the decryption (made routine by Bletchley Park with the Enigma codes), but rather in filtering down to the tiny number of messages that contained important facts – and getting that information in real time to the commanders in the field. The Ultra programme (Britain’s decryption of German radio intercepts) was perennially understaffed, because other civil servants couldn’t be told how important it was. At Ultra’s peak in 1943, only around 50% of the 1,500 Luftwaffe messages a day were being processed – and it is unknown how many of those were processed in time to avert bombing raids.

The new age of technology provided an almost infinitely wide field for exploration, as well as the means of addressing this: the trick was to focus attention where it mattered. – The Secret War, page 203

The ‘new age of technology’ in the last two decades poses much the same problem. Data on internet behaviour is abundant: there are countless signals to listen to about your website performance, and the technology to monitor users is commonplace. And the bottleneck is still the same: filtering out the useful signals, and getting those insights to the ‘commanders’ who need them in real time.

I started Littledata to solve this modern problem of interpreting website analytics for managers of online businesses. There is no decryption involved, but there is a lot of statistics and data-visualisation know-how in making billions of data points comprehensible to a company manager. Perhaps the most important aspect of our service is to provide insights in answer to a specific question:

Group-Captain Peter Stewart, who ran the Royal Air Force’s photo-reconnaissance operations, was exasperated by a senior officer who asked for ‘all available information’ on one European country. Stewart responded that he could only provide useful information if he knew roughly what intelligence the suppliant wanted – ‘naval, military, air or ecclesiastical’. – The Secret War, page 203

In the world of online commerce, the question is something like whether the client needs insights into the checkout conversion rate of all customers (to improve site design) or for a specific marketing campaign (to improve campaign targeting). So by focusing on insights which are relevant to the scale, stage or sector of the client company, and making these accessible in a real-time dashboard, Littledata can feed into decision making in a way that raw data never can.

Want to discuss this further? Get in touch or comment below!

2017-02-01

Get the Littledata analytics app

Complete picture of your ecommerce business. Free Google Analytics connection, audit and benchmarks.

Sign up