How to stop Google Tag Manager being hacked

In two high-profile data breaches this year – at Ticketmaster and British Airways – over half a million credit cards were stolen via a compromised script inserted on the payment pages. Google Tag Manager is a powerful tool which enables you to insert any script you want onto pages of your website, but that power can be used against you by hackers if you're not careful – so below we'll look at how to stop GTM being a security risk on your payment pages.

Firstly, how did the hackers get the card details from these sites? And how is it relevant to GTM on your site? Security firm RiskIQ has traced the breach to a compromised JavaScript file which skimmed the card details from the payment form. So when a user entered their credit card number and security code on BritishAirways.com (or their mobile app), those details were posted to a third-party server, unknown to British Airways or the customer. This is a high-scale equivalent of placing a skimming device on an ATM, which reads one card at a time. In Ticketmaster's hack the script was one loaded from a chatbot vendor on their site, Inbenta. Inbenta claims not even to have been aware the script was used on payment pages. The changes to the script were subtle: not breaking any functionality, and in BA's case using a domain 'baway.com' which looked somewhat authentic.

To protect your site against a similar attack you obviously need to lock down the accounts your developers use to change scripts in the page source code, but you also need to secure GTM – which can be used to deploy such scripts. We have a few rules at Littledata to help reduce the risks of using tag management on payment pages:

1. Use pixels over custom JavaScript tags on payment pages
You probably need a few standard tags, such as Google Analytics, on payment pages, but try to avoid any custom scripts which could possibly skim card details. Many non-standard tags use JavaScript only to create the URL of a tracking pixel – and it is much safer (and faster) to call the tracking pixel directly. Contact the vendor to find out how. (Littledata's Shopify app even removes the need to have any script on the payment pages, by hooking into the order as it's registered on Shopify's servers.)

2. Avoid loading external JavaScript files in GTM
Many vendors want you to load a file from their server (e.g. myvendor.com/tracking.js) from GTM, so they can update the tracking code whenever they want. This is flexible for them, but risky for you. If the vendor gets hacked (as with Inbenta above) then you get compromised. It's less risky to embed that script directly in GTM, and control version changes from there (although a fraction slower to load the page). Of particular risk is embedding a tag manager within a tag manager – where you are giving the third party rights to publish any other scripts within the one tag. Don't do that!

3. Lock down Edit and Publish rights in GTM
Your organisation probably has a high turnover of contract web developers and agencies, so have you checked that only current staff or agencies have permission to edit and publish? It's OK to have external editors use 'workspaces' for version control in GTM, but ideally someone with direct accountability to your company should check and publish.

4. Blacklist custom JavaScript tags on the payment pages
You can set a blacklist from the on-page data layer to prevent certain types of tags being deployed on the payment pages (see the sketch at the end of this post). If you have a GTM container with many users, this may be more practical than step 3.

5. Remove tags from old vendors
There are many thousands of marketing tools out there, and your company has probably tried a few. Do you remove all the tags from vendors when you stop working with them? These are the tags most at risk of being hacked. At Littledata we run a quarterly process for marketing stakeholders to opt in to the tags they still need for tracking or optimisation.

6. Ensure all custom JavaScript tags are reviewed by a developer before publishing
It can be hard to review minified JavaScript libraries, but it's worth it for payment pages if you can't follow rules 1 and 2.

If you're still worried, you can audit the actual network requests sent from payment pages. For example, in Chrome developer tools, in the 'Network' tab, you can inspect what requests are sent out by the browser and to which servers. It's easy for malicious code to hide in the patchwork of JavaScript that powers most modern web experiences, but what is harder to hide is the network requests made from the browser to external servers (i.e. to post the stolen card information out). A request to Google Analytics is fine, but if the domain of a request is dubious, look it up or ask around the team.

Good luck, and keep safe with GTM!
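As a minimal sketch of rule 4: GTM lets you blacklist whole classes of tags via the data layer, set before the container snippet loads. The example below assumes you want to block Custom HTML and custom image tags on checkout pages; the '/checkout' path check is purely illustrative.

    // Place this before the Google Tag Manager container snippet so it takes effect when the container loads.
    // 'customScripts' and 'customPixels' are GTM's built-in classes covering Custom HTML and custom image tags.
    window.dataLayer = window.dataLayer || [];
    if (window.location.pathname.indexOf('/checkout') === 0) { // illustrative payment-page check
      window.dataLayer.push({ 'gtm.blacklist': ['customScripts', 'customPixels'] });
    }

Set this way, even a user with publish rights cannot deploy a custom script tag on those pages.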

2018-11-24

Are you looking at the wrong Black Friday metrics?

Paying attention to the right ecommerce metrics can help you establish the best customer base and shopping experience for long-term growth. But many retailers still focus only on the most popular metrics -- especially during the online shopping craze of Black Friday and Cyber Monday (#BFCM). Over the next few weeks ecommerce managers will be obsessing over data, but which stats are the most important? Two popular metrics -- ecommerce conversion rate and average time on site -- may be misleading, so I recommend looking instead at longer-term benchmarks. Here's how it all breaks down.

Littledata's ecommerce benchmark data now contains indicators from over 12,000 sites, making it an ideal place to get a realistic view of Black Friday stats. Last year we found that the impact of Black Friday and Cyber Monday was larger in 2017 than in 2016. Using that same data set of 440 high-traffic sites, I dove into the numbers to see how this affected other metrics.

Metrics to avoid

I think that overall ecommerce conversion rate is a bad metric to track. From the leading ecommerce websites we surveyed, the median increase was 30% during the BFCM event last year...but nearly a third of the stores saw their conversion rate dip as the extra traffic didn't purchase, with this group seeing a median 26% drop. Some stores do extremely well with deals: four sites from our survey had more than a 15-fold increase in ecommerce conversion rate during BFCM, and nearly a quarter saw more than double the conversion rate over the period. But the real question is: will tracking conversion rate hour-by-hour help you improve it? What could you possibly change within a day?

Another misleading metric is average time on site. You may be looking for signs that the extra traffic on the holiday weekend is engaged, but this is not the one to watch. The time on site for visitors who only see one page will be zero, which will mask any real increase from engaged visitors.

Where to focus instead

Now, do you know what good performance on funnel conversion metrics would look like for your sector? If not, have a look at Littledata's industry benchmarks, which now cover over 500 global sectors. Littledata's benchmarks also include historic charts to show you how metrics such as add-to-cart rate vary for the average retailer in your sector month by month. Next, try the 'speed' performance page to see how fast a user would expect a site in your sector to be. If you see site speed (as measured in Google Analytics) drop below average during Black Friday trading, it's time to pick up the phone to your web host or web operations team. Then, are you tracking return on advertising spend for the extra Facebook Ads you're running during the quarter? Ad costs will spike during the peak trading period, and you may not be getting the same volume of traffic converting into sales.

Here are some quick pointers.

Facebook Ads. Littledata's Facebook Ads connection will ensure accurate data, with a dedicated Facebook report pack for automated insights.

Shopify. If you're running your site on the Shopify platform, read up on which metrics are most important for Shopify stores and check out Shopify's BFCM Toolbox for seasonal online marketing.

Missions. Use Missions in the Littledata app to make permanent improvements to your user experience. BFCM may be over before you can make the changes, but customers will keep buying the rest of the year.
For example, can you increase add-to-cart rate with tips such as highlighting faster-selling items or recommending an alternative to out-of-stock products? So focus on some clearer metrics, and I hope Black Friday brings you every success!

2018-11-19

For every retail loser there's a retail winner

Today PwC's retail survey found the British high street is being reshaped as shoppers shift online - especially in fashion, where a net 100 high street stores closed. This misses the positive side of the story: all those shoppers are buying from independent UK brands online instead, which is one of the fastest growing areas of the UK economy. We looked at 30 mid-sized online fashion retailers (with average sales of £1m per month) who get a majority of their traffic from the UK. This group had grown its sales by an aggregate 21% from October 2017 to October 2018 (year on year). Fashion shoppers love to browse unique designs on Instagram and Pinterest, compare prices and get easy home deliveries. Independent ecommerce brands are bringing original designs to the British wardrobe, and we should celebrate their success.

Behind the research

Littledata gathers benchmark data from Google Analytics on over 12,000 websites, including many types of ecommerce businesses. Our customers get insights into their performance and recommendations on how to improve online conversion.

2018-11-09

Categorising websites by industry sector: how we solved a technical challenge

When Littledata first started working with benchmark data, we found the biggest barrier to accuracy was self-reporting on industry sectors. Here's how we built a better feature to categorise customer websites.

Google Analytics has offered benchmarks for many years, but with limited usefulness, since the industry sector field for the website is often inaccurate. The problem is that GA is typically set up by a developer or agency without knowledge of (or care about) the company's line of business - or an understanding of what that industry sector field is used for. To fix this problem Littledata needed a way to categorise websites which didn't rely on our users selecting from a drop-down list.

The first iteration: IBM Watson NLP and a basic taxonomy

Our first iteration of this feature used a pre-trained model from IBM Watson's set of natural language APIs. It was simple: we sent the URL, and back came a category according to the Internet Advertising Bureau (IAB) taxonomy. After running this across thousands of websites we quickly realised the limitations:

- It failed with non-English websites
- It failed when the website homepage was heavy with images rather than text
- It failed when the website was rendered via JavaScript

Since our customer base is growing most strongly outside the UK, with graphical product lists on their homepages, and using the latest JavaScript frameworks (such as React), the failure rate was above 50% and rising. So we prioritised a second iteration.

The second iteration: extraction, translation and public APIs

The success criterion was that the second iteration could categorise the 8 sites which the first iteration failed on, and should go on to be 80% accurate. We also wanted to use mainly public APIs, to avoid maintaining code libraries, so we broke the detection process into 3 steps:

1. Extracting meaningful text from the website
2. Translating that text into English
3. Categorising the English text to an IAB category and subcategory

The Watson API seemed to perform well when given sufficient formatted text, at minimal cost per use, so we kept it for step 3. For step 2, the obvious choice was the Google Translate API. The magic of this API is that it can detect the language of origin (with a minimum of ~4 words) and then provide the English translation. That left us focussing the development time on step 1 - extracting meaningful text.

Initially we looked for a public API, and found the Aylien article extraction API. However, after testing it out on our sample sites, it suffered from the same flaws as the IBM Watson processing: unable to handle highly graphical sites, or those with JavaScript rendering. To give us more control of the text extraction, we then opted to use a PhantomJS browser on our server. Phantom provides a standard function to extract the HTML and text from the rendered page, but at the expense of being somewhat memory intensive.

Putting the first few thousand characters of the website text into translation and then categorisation produced better results, but still suffered from false positives - for example, if the text contained legalese about data privacy it got categorised as technical or legal. We then looked at categorising the page title and meta description, which any SEO-savvy site would stuff with industry language. The problem here is that the text can be short, and mainly filled with brand names.
After struggling for a day we hit upon the magic formula: categorising both the page title and the page body text, and looking for consistent categorisation across the two. By using two text sources from the same page we more than doubled the accuracy, and it worked for all but one of our 'difficult' websites. This hold-out site - joone.fr - has almost no mention of its main product (diapers, or nappies), which makes it uniquely hard to categorise.

So to put all the new steps together, here's how it works for our long-term enterprise client MADE.com's French-language site:

Step 1: Render the page in PhantomJS and extract the page title and description
Step 2: Extract the page body text, remove any cookie policy and format it
Step 3: Translate both text strings with Google Translate
Step 4: Compare the categorisations of the title vs the page body text
Step 5: If the two sources match, store the category

I'm pleased that a few weeks after launching the new website classifier we have found it to be 95% accurate. Benchmarking is a core part of our feature set, informing everything that we do here at Littledata. From Shopify store benchmarks to general web performance data, the improved accuracy and deeper industry sector data is helping our customers get actionable insights to improve their ecommerce performance.

If you're interested in using our categorisation API, please contact us for a pilot. And note that Littledata is also recruiting developers, so if you like solving these kinds of challenges, think about coming to join us!
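In code, the matching logic in steps 4 and 5 might look something like the sketch below. This is a simplified illustration, not our production code; extractPageText, translateToEnglish and categoriseText are hypothetical helpers wrapping the headless-browser render, the Google Translate API and the Watson NLP call.

    // Categorise a site only when the page title and body text agree on a category.
    async function categoriseWebsite(url) {
      const { title, bodyText } = await extractPageText(url);          // steps 1 & 2: render and extract
      const titleEn = await translateToEnglish(title);                  // step 3: translate
      const bodyEn = await translateToEnglish(bodyText.slice(0, 3000));
      const titleCategory = await categoriseText(titleEn);              // step 4: categorise both sources
      const bodyCategory = await categoriseText(bodyEn);
      if (titleCategory && titleCategory === bodyCategory) {
        return titleCategory;                                           // step 5: store only on agreement
      }
      return null;                                                      // fall back to manual review
    }

Requiring agreement between two independent text sources is what cuts out most of the false positives from boilerplate text like cookie policies.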

2018-10-16

Are you benchmarking your ecommerce site in the right sector?

Littledata launched benchmarks for websites two years ago. They quickly became a key feature of our app, and as customers became more engaged, so did ideas for how to improve our benchmarking and the algorithms that learn from those benchmarks. In response to customer feedback and deeper research into industry sectors, we've made some really exciting improvements over the last few months to make the comparisons even more useful -- and even more dynamic. The changes are five-fold.

Detailed sectors and sub-sectors. Almost every customer we talked to said the benchmark comparison was most useful if it was for very similar sites. Previously we only had 50 high-level sectors to compare with; now we have hundreds of low-level sectors. You can visualise the full range.

Smarter auto-categorisation of your website. Our machine learning process now has a 95% chance of finding the best sector for your website, meaning you can compare against the most useful benchmark without filling in a single form!

Ability to manually change industry sector. And of course, if you're in that 5% that needs human input, then you (or your Enterprise account manager) can pick a better sector in the general app settings page. You might also want to change sectors just to see how you compare. No problem.

Benchmarks for technology stacks. Want to see if you are making the most of a technology such as Shopify or Yieldify? Now you can compare with other sites using the same technology, making our ecommerce benchmarking even more powerful for agencies and web developers.

Benchmarks for starter websites. Previously we only generated benchmarks for sites with at least 500 monthly visits. We dropped that to 200 monthly visits, so starter websites can see a comparison - and see more detail as they grow.

We've launched a live visualisation of how our AI-based website categoriser is mapping a range of industry sectors. It offers a full overview of website categories and segments, and you can drill down to see more details. For example, we've seen a big rise in wine, coffee and health shake retailers this year, many of whom are using our ReCharge integration to get insight into their subscription business models. As our algorithms learn about ecommerce businesses selling beverages of many varieties and automatically categorise sites accordingly, you can now look closely at a particular segment to see how your site compares.

Littledata is an Agile company. We're constantly iterating, and continuously improving the benchmarks to make them more actionable, so please give us feedback if you'd like to see more. Happy benchmarking!

2018-09-25

What's the real ROI on your Facebook Ads?

For the past decade Facebook’s revenue growth has been relentless, driven by a switch from TV advertising and online banners to a platform seen as more targetable and measurable. When it comes to Facebook Ads, marketers are drawn to messaging about a strong return on investment. But are you measuring that return correctly? Facebook has spent heavily on its own analytics over the last three years, with the aim of making you -- the marketer -- fully immersed in the Facebook platform…and perhaps also to gloss over one important fact about Facebook’s reporting on its own Ads: most companies spend money with Facebook 'acquiring' users who would have bought from them anyway. Could that be you? Here are a few ways to think about tracking Facebook Ads beyond simple clicks and impressions as reported by FB themselves.

The scenario

Imagine a shopper named Fiona, a customer for your online fashion retail store. Fiona has browsed through the newsfeed on her Facebook mobile app, and clicks on your ad. Let’s also imagine that your site -- like most -- spends only a portion of its budget with Facebook, and is using a mix of email, paid search, affiliates and social to promote the brand. The likelihood that Fiona has interacted with more than one campaign before she buys is high. Now Fiona buys a $100 shirt from your store, and in Facebook (assuming you have ecommerce tracking with Pixel set up) the sale is linked to the original ad spend.

Facebook's view of ROI

The return on investment in the above scenario, as calculated by Facebook, is deceptively simple: the full $100 sale is credited to the ad Fiona clicked. Right, brilliant! So clear and simple. Actually, not that brilliant. You see, Fiona had previously clicked on a Google Shopping ad (which is itself powered by two platforms, Google AdWords and the Google Merchant Center) -- which is how she found your brand -- and after Facebook, she was influenced by a friend who mentioned the product on Twitter, then finally converted by an abandoned cart email. So in reality Fiona’s full list of interactions with your ecommerce site looks like this:

Google Shopping ad > browsed products
Facebook Ad > viewed product
Twitter post > viewed same product
Link in abandoned cart email > purchase

So from a multi-channel perspective, how should we attribute the benefit from the Facebook Ad? How do we track the full customer journey and attribute it to sales in your store? With enough data you might look at the probability that a similar customer would have purchased without seeing that Facebook Ad in the mix. In fact, that’s what the data-driven model in Google Marketing Platform 360 does. But without that level of data crunching we can still agree that Facebook shouldn’t be credited with 100% of the sale. It wasn’t the way the customer found your brand, or the campaign which finally convinced them to buy. Even under a generous attribution model we would attribute no more than a quarter of the sale to Facebook. So now the calculation flips: it cost us $2 of ad spend to bring in $1 of attributed revenue -- we should kill the campaign (the arithmetic is summarised at the end of this post).

But there's a catch

Hang on, says Facebook. You forgot about Mark. Mark also bought the same shirt at your store, and he viewed the same ad on his phone before going on to buy it on his work computer. You marked the source of that purchase as Direct -- but it was due to the same Facebook campaign. Well yes, Facebook does have an advantage there in using its wide net of signed-in customers to link ad engagement across multiple devices for the same user. But take a step back.
Mark, like Fiona, might have interacted with other marketing channels on his phone. If we can’t track cross-device for these other channels (and with Google Marketing Platform we cannot), then we should not give Facebook an unfair advantage in the attribution. So, back to multi-channel attribution from a single device. This is the best you have to work with right now, so how do you get a simple view of the return on advertising spend, the real ROI on your ads?

Our solution

At Littledata we believe that Google Analytics is the best multi-channel attribution tool out there. All it misses is an integration with Facebook Ads to pull the ad spend by campaign, and some help to set up the campaign tagging (UTM parameters) to see which campaign in Facebook brought the user to your site. And we believe in smart automation. Shhhh...in the past few weeks we've quietly released a Facebook Ads connection, which audits your Facebook campaign tagging and pulls ad cost daily into Google Analytics. It's a seamless way to pull Facebook Ads data into your overall ecommerce tracking, something that would otherwise be a headache for marketers and developers. The integration checks Facebook Ads for accurate tagging and automatically pulls ad cost data into GA.

The new integration will normally only be available in higher-tier plans, but we're currently offering it as an open beta for ALL USERS, including Basic plans! For early access, just activate the Facebook Ads connection from your Littledata dashboard. It's that easy! (Not a subscriber yet? Sign up for a free trial on any plan today.)

We believe in a world of equal marketing attribution. Facebook may be big, but they’re not the only platform in town, and any traffic they're sending your way should be analysed in context. Connecting your Facebook Ads account takes just a few minutes, and once the data has been collected you’ll be able to activate reports to show the same kind of ROI calculation we did above. Will you join us on the journey to better data?
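Pulling the numbers from Fiona's example together (the $50 ad spend is implied by the "$2 of ad spend for $1 of revenue" conclusion above, so treat the exact figures as illustrative):

    Facebook's view:     $100 attributed sale / $50 ad spend = 2.0x return on ad spend
    Multi-channel view:  25% x $100 = $25 attributed / $50 ad spend = 0.5x return
                         (i.e. $2 of ad spend for every $1 of attributed revenue)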

2018-09-20

The World Cup guide to marketing attribution

It’s World Cup fever here at Littledata. Although two of the nationalities in our global team didn’t get through the qualifiers (US & Romania), we still have England and Russia to support in the next round. And I think the World Cup is a perfect time to explain how marketing attribution works, through the medium of football.

In football (or what our NYC office calls 'soccer'), scoring a goal is a team effort. Strikers put the ball in the net, but you need an incisive midfield pass to cut through the opposition, and a good move starts from the back row. ‘Route one’ goals scored from a direct punt up the pitch are rare; usually teams hit the goal from a string of passes to open up the opportunity. So imagine each touch of the ball is a marketing campaign on your site, and the goal is a visitor purchasing. You have to string a series of marketing ‘touches’ together to get the visitor in the back of the net. For most ecommerce sites it is 3 to 6 touches, but it may be more for high value items.

Now imagine that each player is a different channel. The move may start with a good distribution from the Display Ads defender, then a little cut back from nimble Instagram in the middle. Facebook Ads does the running up the wing, but passes it back to Instagram for another pass out to the other wing for Email. Email takes a couple of touches and then crosses the ball inside for AdWords to score a goal – which spins in off the opposing defender (Direct). GOAL!!! In this neat marketing-football move all the players contribute, but who gets credit for the goal? Well, that depends on the attribution model you are using.

Marketing attribution as a series of football passes

Last interaction
This is the simplest model, but less informative for the marketing team. In this model the opposing defender Direct gets all the credit – even though he knew nothing about the end goal!

Last non-direct click
This is the attribution model used by Google Analytics (and other tools) by default. In this model, we attribute all of the goal to the last campaign which wasn’t Direct (or a session with unknown source). In the move above this is AdWords, who was the last marketing player to touch the ball. But AdWords is a greedy little striker, so do we want him to take all the credit for this team goal?

First interaction
You may be most interested in the campaign that first brought visitors to your website. In this model, Display Ads would take all the credit as the first touch. Display often performs best when measured as first interaction (or first click), but then as a ‘defender’ it is unlikely to put the ball in the net on its own – you need striker campaigns as well.

Time decay
This model shares the goal between the different marketing players. It may seem weird that a player can have a fraction of a goal, but it makes it easy to sum up performance across lots of goals. The player who was closest to the goal gets the highest share, and then it decays as we go back in time from the goal. So AdWords would get 0.4, Email 0.5 (summed over the 2 touches before) and Instagram 0.1.

Data-driven attribution
This is a model available to Google Analytics 360 customers only. What the data-driven model does is run through thousands of different goals scored and look at the contribution of each player to the move. So if the team was equally likely to score a goal without Facebook Ads’ run down the wing, it will give Facebook less credit for the goal.
By contrast, if very few goals get scored without that pass from Instagram in the midfield, then Instagram gets more credit for the goal. This should be the fairest way to attribute campaigns, but the limitation is that it only considers the last 4 touches before the goal. You may have marketing moves which are longer than 4 touches.

Position based
Finally, you can define your own attribution weighting in the Position Based model, based on which position the campaign was in before the goal. For example, you may want to give some weight to the first interaction and some to the last, but little to the campaigns in between.

Still confused? Maybe you need a Littledata analytics expert to help build a suitable model for you. Or the advice of our automated coach, known as the analytics audit. After all, every strategy could use a good audit to make sure it's complete and up-to-date. So go enjoy the football, and every time someone talks of that ‘great assist’ from the winger, think of how you can better track all the uncredited marketing campaigns helping convert customers on your site.
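For readers who prefer code to football, here is a minimal sketch of how a simple time-decay model could split credit for one conversion across an ordered list of touches. This is an illustration only – Google Analytics' own time-decay model decays by elapsed time rather than by touch position – and the 0.5 decay factor per step is an assumption.

    // Illustrative time-decay attribution: each earlier touch gets half the weight of the next one.
    // 'touches' is the ordered list of channels that preceded the conversion.
    function timeDecayCredit(touches, decay = 0.5) {
      const weights = touches.map((_, i) => Math.pow(decay, touches.length - 1 - i));
      const total = weights.reduce((sum, w) => sum + w, 0);
      const credit = {};
      touches.forEach((channel, i) => {
        credit[channel] = (credit[channel] || 0) + weights[i] / total;
      });
      return credit; // fractions of the 'goal' per channel, summing to 1
    }

    // The move described above:
    timeDecayCredit(['Display', 'Instagram', 'Facebook Ads', 'Instagram', 'Email', 'Email', 'AdWords']);
    // => AdWords gets the largest share, Display the smallest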

2018-07-02

Google Analytics Data Retention policy - which reports does it limit?

From 25th May 2018 Google has allowed you to automatically wipe user-level data from reporting before a cut-off date, to better comply with GDPR. We made the change for Littledata's account to wipe user-level data after 26 months, and this is what we found when reporting on dates before February 2016.

Reports you can still view before the user data removal:

Audience metrics
Pageviews ✓
Sessions ✓
Users X
Bounce rate ✓

Audience dimensions
Demographics X
OS / Browser X
Location X
User Type X

Behaviour
Pageviews ✓
Custom events X

2018-06-04

Six challenges in developing a Shopify integration

At the start of 2017 Littledata released its first Shopify app. A year on, here are my observations on the technical challenges we’ve overcome. This week we're at Shopify Unite in Toronto, and it's no surprise that their app ecosystem continues to grow. We chose Shopify as our first platform partner due to their open APIs, quality documentation and enthusiasm from other developers. Much of that has been as expected, but to help all of you looking to build your own Shopify app, I’ll share some of our learnings on the hidden challenges.

Littledata's Shopify app makes it a one-click process for stores to set up Enhanced Ecommerce tracking in Google Analytics, and then get actionable insights based on the Google Analytics data. It has to hook into Shopify orders and products, as well as modify the store's theme and process ongoing transactions.

1. Handling re-installs gracefully
The great advantage of Shopify’s app store over, say, Magento Marketplace, is that any store admin can install and pay for an app with a couple of clicks. The disadvantage is that stores can be as quick to uninstall as install. Store admins may start, realise they don’t have the permissions, time or energy to continue, and roll back to try again later in the day. Since our app inserts a snippet into the store’s theme layout (see point two below), uninstalling removes the webhooks we set up but does not remove the inserted snippet. When a store re-installs, our app has to work out what state they were in when they uninstalled (audit, test mode or live), whether the script snippet is still there and what settings have been changed in the meantime. It took us a few months to get a handle on all the possible user flows, and we’ve found automated end-to-end tests really speed up running through the different scenarios. In our Meteor framework we use Chimp to run tests through Selenium on localhost and on our staging server. We've also found it essential to track our own stats of 'installs + activations' (including the date of first install and time to finally uninstall) rather than relying on the Shopify Partner stats of uninstalls and installs, which can hide the detail in between.

2. Working with script tags
The other side-effect of making apps easy to install is that you can assume the early-adopter stores who will try your app already have lots of other installs. Shopify recommends using the Script Tag API to handle scripts linked to the app, so that when a store uninstalls your app it also removes any client-side scripts from the store. Unfortunately, in early tests we found the load latency to be unacceptably high: on some stores, only 50% of the page load events were getting fired before the user moved off the page. So plan B was to add a snippet to the store theme, and then load this snippet at the top of the <body> element on all the layout templates. This has worked much more predictably, except when theme editors remove the snippet reference without enquiring what the Littledata app does (see our fifth challenge).

3. Charge activation vs authorisation
Now a very simple gotcha. In our first month we had around 60 installs at a flat price of $20/month, but apparently no revenue. After investigation we found we had not activated the recurring charges after the store admin had authorised them. Doh!
We're still not sure why an app would want to have authorised charges which are not activated -- it seems like over-engineering on Shopify's side -- but luckily it was easy to correct without asking for more user permissions.

4. Tracking adds-to-cart
The first version of our app tried to run the script when customers got to the ‘/cart’ page of a store. The problem here is that many stores have AJAX or ‘mini’ carts, where customers can check out without ever visiting the cart page. We looked at triggering the script before the user got to the cart page, but this appeared to run too many risks of interfering with the customer actually adding the item. Our final solution has been to poll the Shopify store for the current cart, and see if products have been added (or removed) since we last polled (and stored the previous cart contents in memory). This is somewhat inefficient, as it requires continuous network activity to grab the cart JSON from Shopify, but we’ve reduced the network requests to one every 4 seconds – judging that customers are very unlikely to add a product and check out in less than 4 seconds. This cart polling has proved more reliable across different store templates (see the sketch at the end of this post).

5. Integrating with other Shopify apps
I mentioned that early-adopter stores tend to have lots of other apps: and those apps have loyal customers who push to make Littledata's app work with their chosen app (not just vanilla Shopify). The challenge is that most of these app development companies run a very Agile process, constantly changing how their app works (hopefully to improve the experience for store owners). An integration that worked a few months ago may no longer work. We've found the best solution to be open developer-to-developer communication, via a Slack guest channel. Having the developers implementing the features on each side talk to each other really cuts down the delays caused by a well-meaning project manager slightly misinterpreting the requirement.

6. Handling ongoing updates
As we tested improved client-side tracking scripts, we needed to update the script in the store theme (see point 2 above). This creates a small risk for the store, as there is no UAT or test environment for most stores to check before going live with the new script. The store theme may also get edited, creating new layout templates where the Littledata snippet is not loaded. In the first version of our app we tried to update and re-insert the latest Littledata snippet automatically on a weekly cycle. However, once we reached hundreds of active installs this became unmanageable and also opaque for the store admins. In the latest version we now allow store admins to UPGRADE to the latest script, and then we check all the correct Google Analytics events are being fired afterwards. Giving the end user control of updates seems a better way of maintaining trust in our brand and also removing risk: if the update goes wrong, it’s quicker for us to alert the store owner on how to fix it.

Conclusion
I’m still sure we made the right choice with Shopify as a platform, as their APIs, partner support and commercial traction are all number one in the ecommerce world. But I hope that by sharing some of the hidden challenges in developing Shopify integrations, we can all build better apps for the community. Have you built something for the Shopify app store? Are there development problems you’ve encountered which I haven’t shared here?

PS. Are you a developer interested in joining an innovative analytics company? We're hiring in multiple locations!
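As a rough sketch of the cart-polling approach in challenge 4, using Shopify's standard /cart.js endpoint (the diffing logic and 4-second interval are illustrative, not Littledata's production code):

    // Poll the Shopify cart every 4 seconds and detect items added since the last poll.
    let previousCart = {};

    function checkCart() {
      fetch('/cart.js')
        .then(response => response.json())
        .then(cart => {
          const current = {};
          cart.items.forEach(item => { current[item.variant_id] = item.quantity; });
          Object.keys(current).forEach(variantId => {
            const added = current[variantId] - (previousCart[variantId] || 0);
            if (added > 0) {
              // send an add-to-cart event to Google Analytics here
              console.log('Added to cart:', variantId, 'quantity', added);
            }
          });
          previousCart = current;
        });
    }

    setInterval(checkCart, 4000);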

2018-05-07

How Littledata helps Shopify stores comply with GDPR

When the GDPR regulation comes into effect later this month, it will impact all websites trading with EU citizens. That means any ecommerce site with customers in Europe! Is your Shopify store ready to comply? We recently updated our Shopify app (since release 7.8) to help Shopify stores which use Google Analytics comply with GDPR. In addition to automatic fixes to help your store comply, we include recommendations for how to update your site content (such as Terms and Conditions), and how to deal with the new 'two year rule'. If you're running a Shopify store, the time to act is now.

Automatic fixes with our Shopify app

The first two steps are done automatically when you install our GDPR-ready Shopify app. If you're already using Littledata's Shopify app, these two fixes can be applied when you upgrade to our latest tracking script (version 3.2). Here's what they address.

1. Anonymise customer IP addresses
The IP address of your website visitor is considered personal information under GDPR, and to remove any risk that this is sent to Google’s servers in the USA, our script scrambles the last few digits of the IP address. Google already promises not to store the IP address, so this step is an extra level of safety. This slightly reduces the accuracy of tracking which city your visitor came from -- but we believe that this is a small price to pay for ensuring anonymity.

2. Filter personal emails and ZIP/postcodes from pageviews
Many sites accidentally send personal data in the page URLs or titles tracked by Google Analytics. For example, apps with their own checkout often send the user email as a URL parameter like ‘/url?email=myname@gmail.com’. Our script now filters that personal data out at source, so the page path you’ll see in Google Analytics is ‘/url?email=REMOVED’.

Additional manual steps

There are two additional manual steps to ensure that Google Analytics for your Shopify store is GDPR-compliant.

3. Update your terms and conditions
You need to update your website T&Cs to ensure users are aware of the Google Analytics Advertising Features that our Shopify app activates and Google uses to identify user demographics, such as gender and interests. We are not lawyers, but we suggest using something similar to these sentences to describe what data is collected, how you (and we) use the data, and how users can opt out: "Our site uses Google Analytics Advertising Features to deduce your gender, age group and interests based on other types of websites you have visited. We use this in aggregate to understand which demographics engage with areas of our website. You can opt out with Google's browser add-on."

4. Remove user-specific information after 2 years
You should also change the data retention period for your Google Analytics web property, so that Google removes all user-specific information from their database after 2 years. To make this change, log in to your GA account, go to the Settings cog, and then Property > Tracking Info > Data Retention. Use the 'data retention' drop-down menu to keep user data for 26 months, and set 'reset on new activity' to ON. This means that after 26 months, if the user has not come back to your website, any user cookie will be deleted. We think this is a sensible way to comply with the Right to Erasure without placing any practical limits on your analysis.

Right to Erasure feature coming soon!

We're also working on a feature to help websites comply with the Right to Erasure, or Right to be Forgotten.
Here's a summary of that aspect of the regulation, from the summary of key changes at EUGDPR.org:

Right to be Forgotten
Also known as Data Erasure, the right to be forgotten entitles the data subject to have the data controller erase his/her personal data, cease further dissemination of the data, and potentially have third parties halt processing of the data. The conditions for erasure, as outlined in Article 17, include the data no longer being relevant to original purposes for processing, or a data subject withdrawing consent. It should also be noted that this right requires controllers to compare the subjects' rights to "the public interest in the availability of the data" when considering such requests.

Littledata's Right to Erasure feature will ensure that when you delete a customer from your Shopify admin interface, any references to that customer are deleted from Google Analytics. This won’t affect aggregate reporting, such as the number of web sessions or transactions.

When do GDPR regulations take effect?

The official enforcement date for the General Data Protection Regulation (GDPR) is 25 May 2018. At that time any organisations in non-compliance may face heavy fines. In short, we recommend implementing the fixes above ASAP for your Shopify store. All you need is a Google Analytics account and our Shopify app.

And do check our blog regularly for updates. This is the best place to hear about new Littledata features relating to GDPR, as well as news and analysis about how the regulations affect different types of online businesses, including ecommerce websites, subscription businesses, and membership-based sites such as large charities and nonprofits. Looking for additional support? Contact us about GDPR consulting for analytics setup.
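As a rough illustration of the two automatic fixes above (Littledata's actual tracking script differs, but the analytics.js calls shown are standard Google Analytics features):

    // 1. Anonymise the visitor's IP address before it is stored on Google's servers
    ga('set', 'anonymizeIp', true);

    // 2. Strip email addresses from the page path before sending the pageview
    var cleanPath = window.location.pathname +
      window.location.search.replace(/([?&]email=)[^&]+/i, '$1REMOVED');
    ga('set', 'page', cleanPath);
    ga('send', 'pageview');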

2018-05-02

Tracking the online customer journey for luxury ecommerce

Today I'm excited to be participating in the Innovation Meets Fashion event in Lugano, Switzerland. As an increasing amount of luxury and fashion retail moves online, high-end brands are finding it complicated to track the complete customer journey. In many cases, difficulties in tracking customers through to eventual purchase are holding back investment in the digital experience and online marketing. But it doesn't have to be this way.

We've found a straightforward correlation in ecommerce between the average ticket price of the item being purchased and the number of web pages or sessions before that purchase is made. Simply put, customers spend longer considering big ticket items than they do with smaller ticket items and impulse purchases. Luxury retail involves many touch points with the brand across your websites, social sites and physical stores. The problem is that the longer the online customer journey, the harder it is to get consistent data on which top-of-funnel experiences are leading to purchases. So first the bad news: since many potential customers browse anonymously, perfect ecommerce tracking across a long online and offline journey is not possible. Tracking browsers based on first-party cookies (such as Google Analytics) will fail when customers use multiple devices, clear their cookies or browse in-app (such as from Facebook). Yet there are three ways we have seen retailers selling high value items increase the reliability of their online behavioural data.

1. Track online shopping behaviour in detail
Understanding whether customers browse certain products, view the detail of product variants and even add to cart is a good proxy for seeing which campaigns eventually convert. Does your brand have a good understanding of how each marketing channel influences browsing behaviour, after the landing page but before the checkout?

2. Offer a good reason to get customers to log in before buying
VIP offers, registering for events and discounts all offer a good way of getting customers to log in from different devices. With the correct analytics setup, this login information can be used (without infringing the users’ privacy) to link together the different interactions they make across multiple devices (see the sketch at the end of this post).

3. Make the most of your email list
Even without a login before purchase, customers clicking through links in a marketing email allow the same stitching together of sessions. This means that if a customer visits a link from their mobile device, and in another week from their home laptop, these two devices can be linked as belonging to the same email – and therefore the same person.

Luxury online retail involves a complex journey. Littledata is here to make your tracking and reporting both easy and accurate. Sign up today to get started with our complete analytics suite, and feel free to reach out to our Google Analytics consultants with questions about best practices for luxury ecommerce. Your success is our success!
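One common way to implement the login-based stitching in point 2 is Google Analytics' User ID feature. A minimal sketch is below; it assumes a User ID view is enabled in GA, and 'customer.id' stands in for whatever internal, non-personal identifier your platform exposes at login.

    // After the visitor logs in, attach your internal customer ID to the GA tracker
    // so sessions from different devices can be stitched into one journey.
    ga('create', 'UA-XXXXXXX-1', 'auto');
    ga('set', 'userId', customer.id);  // an internal ID, never an email address or name
    ga('send', 'pageview');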

2018-03-26

GDPR compliance for ecommerce businesses

Ecommerce companies typically store lots of personally identifiable information (PII), so how can you make compliance easier without compromising analysis? With the deadline for GDPR compliance looming, I wanted to expand on my previous article on GDPR and Google Analytics to focus on ecommerce.

Firstly, who does this apply to? GDPR is European Union legislation that applies to any company trading in Europe: so if you sell online and deliver to European Union member countries, the regulations apply to you. It's essential that you understand how your online business is collecting and storing PII.

Splitting PII from anonymous data points

Your goal should be to maintain two separate data stores: one that contains customer details, from where you can look up what a specific customer bought, and one that contains anonymous data points, from where you can see performance and trends. The data store for the customer details will typically be your ecommerce back-end and/or CRM (see below). This will include name, email, address, purchase history, etc. It will link those with a customer number and order numbers. If a customer exercises their right of access, all the relevant details should be in this store.

We use Google Analytics as the anonymous data store (although you may have a different ecommerce analytics platform). There you can store data which only refers to the customer record. These are called pseudo-anonymous data points under GDPR: they are only identifiable to a customer if you can link the customer number or order number back to your ecommerce back-end. Pseudo-anonymous data points you can safely send to Google Analytics include (see the example at the end of this post):

- Order number / transaction ID
- Order value / transaction amount
- Tax & shipping
- Product names and quantities
- Customer number
- Hashed email address (possibly a more flexible way to link back to the customer record)

If a customer exercises their right to removal, removing them from the ecommerce back-end will be sufficient. You do not also have to remove them from your Google Analytics, since the order number and customer number now have nothing to refer to. You do still need due process to ensure access to Google Analytics is limited, as in extreme circumstances a combination of dimensions such as products, country / city and browser could identify the customer.

Isn’t it simpler to just have one store?

Every extra data store you maintain increases the risk of data breaches and the complexity of compliance – so why not just analyse a single customer data store? I can think of three reasons not to do so:

1. Marketing agencies (and other third parties) need access to the ecommerce conversion data, but not the underlying customer data
2. Removing a customer’s order history on request would impact your historic revenue and purchase volumes – not desirable
3. Your CRM / ecommerce platform is not built for large-scale analysis: it may lack the tools, speed and integrations needed to get meaningful insights

Beware of accidental transfers

There are a few danger areas where you may inadvertently be sending PII to Google Analytics:

- Customer emails captured in a signup event
- A customised product name – e.g. ‘engraving for Edward Upton’
- An address or name captured in a custom dimension

Our PII audit check is a quick, free way to make sure that’s not happening.

Multiple stores of customer details

GDPR compliance becomes difficult when your customer record is fragmented across multiple data stores.
For example, you may have product and order information in your ecommerce database, with further customer contact details in a CRM. The simplest advice is to set up automatic two-way integrations between the data stores, so updating the CRM updates the ecommerce platform and vice versa. Removing customer records from one system should remove them from the other. If that’s not possible, then you need clear processes to update both systems when customer details change, so you can comply with the right to rectification.

Conclusion

GDPR compliance need not require changing analytics tools or databases, just a clear process for separating out personally identifiable information – and training for the staff involved in handling that data. I hope this brief overview has been helpful. For further advice on how your ecommerce systems comply, please contact us for a free consultation. Littledata has experience with every major analytics platform and a wide range of custom setups. However, as a number of global companies are concurrently prepping for compliance, we highly recommend that you get in touch sooner rather than later!
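To illustrate the split described above, here is what a purchase hit containing only pseudo-anonymous identifiers might look like, using the standard Google Analytics Enhanced Ecommerce (analytics.js) calls; the SKUs, order and customer numbers are made up.

    // Send the transaction with pseudo-anonymous identifiers only: order number, values
    // and product names - never the customer's name, email or address.
    ga('require', 'ec');
    ga('ec:addProduct', { id: 'SKU-1234', name: 'Linen shirt', price: '100.00', quantity: 1 });
    ga('ec:setAction', 'purchase', {
      id: 'ORDER-10234',       // order number, only resolvable via your ecommerce back-end
      revenue: '100.00',
      tax: '16.67',
      shipping: '5.00'
    });
    ga('set', 'dimension1', 'CUST-889');  // customer number in a custom dimension, if configured
    ga('send', 'pageview');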

2018-02-13

The 5 worst arguments for boosting Bitcoin

I’m exasperated reading dodgy logic justifying the heady ascent of Bitcoin. What are the 5 worst arguments I’ve heard? Full disclosure: I don’t own any Bitcoin, or have any bets on its rise or otherwise.

1. Bitcoin is an insurance against the collapse of capitalism
The booster: The rise of artificial intelligence and mass joblessness will sweep away much of the old order of nation states and their currencies. Bitcoin is independent of government and will survive the coming storm.
A grain of truth: I believe a big change in the relative value of labour and capital, and how they contribute to the tax base, is coming faster than politicians expect. And the reactionary backlash in affected countries, such as those voting for Donald Trump, won’t stop this trend.
The sceptic: Bitcoin relies on a chain of other technologies which may well get disrupted with the collapse of capitalism: cheap power supply, a global internet and secure online vaults to hold the private keys and transact the Bitcoin. If you’re betting on the end of the world as we know it, hunting and farming skills are going to be more useful!

2. Bitcoin’s limited supply makes it deflationary by default
The booster: Unlike fiat money (e.g. the US dollar), which can be printed at will by central banks, the total number of Bitcoin is mathematically limited to 21 million. That means, as other currencies inflate, Bitcoin will hold its value – i.e. it’s digital gold.
A grain of truth: As developed countries around the world are forced to borrow themselves out of the hole of shrinking tax bases and increasing healthcare costs, they may try to inflate their currencies to erode the debt.
The sceptic: Central banks have a positive inflation target for a reason: in a deflationary currency, no one wants to spend the currency and so there’s no circulation of wealth. If one Bitcoin could have bought me a coffee in 2016, but at the time of writing could have bought a car, why would I ever spend it? And if no one spends the currency then it has no tangible value.

3. Bitcoin is the leader of the blockchain revolution
The booster: Blockchain is one of the few game-changing technologies to be invented in the last two decades. It will revolutionise the world of finance, and you need to own Bitcoin to be part of that.
A grain of truth: The blockchain ledger, keeping a public record of all transactions and reducing the possibility for fraud or interception, will certainly change many aspects of finance. There are many projects underway in financial trading and government.
The sceptic: Just because Bitcoin was the first use case of the technology does not make it essential to newer blockchains. Equally, its first-mover advantage may not even make it the winning cryptocurrency. That said, I wouldn’t go out buying a basket of other cryptocurrencies just yet – they are all overinflated by Bitcoin’s rise.

4. The increasing mining cost of Bitcoin underpins its value
The booster: New bitcoin gets exponentially harder to mine, so since the cost of electricity for the miners’ servers won’t fall, the cost per bitcoin mined is rising all the time. And if you can’t mine them, you’ll have to buy them.
The sceptic: Yes... but what if no one needs Bitcoin at all? Mining gold is subject to the same economic forces, but when gold went out of fashion as a value store (as it did at the turn of the millennium) it still had industrial value for conducting electricity and aesthetic value for jewellery. Bitcoin has neither of those.
5. The rise of bitcoin in 2017 shows it has won out as the cryptocurrency of choice
The booster: Bitcoin is now the established alternative store of value, which is why it has risen so fast in 2017. And what if all the pension funds and institutional investors now buy up a slice to ensure an allocation of this new asset class?
A grain of truth: There’s no rational way to value Bitcoin: it does not pay dividends or have intrinsic worth (see point 4). So it could be worth anything... or nothing.
The sceptic: Every decade a new mania comes along for investors to follow. The vast chatter on LinkedIn, Facebook and other forums only heightens the mania by allowing unchecked falsehoods to flourish. You only have to look at the South Sea Bubble and Tulip mania to see there is nothing new under the sun. Enjoy the roller-coaster ride up... because everything that goes up must come down.

2017-12-19

Retailers traded 2.4 times normal volumes during Black Friday week 2017

The results are in, and this year's Black Friday sales prove that things are continuing to look up for ecommerce. Across 570 online stores, the average store did 2.4 times its normal sales in Black Friday week 2017, compared with only 2.2 times in 2016 – and a greater proportion of stores participated in the sales. Following our post on pre-Black Friday trends, Littledata looked again at what happened from Thanksgiving Thursday 2017 through to the following Wednesday (the week including Black Friday and Cyber Monday) – versus a control period of November & December 2016. Compared with 2016, we found a greater number of stores participating in Black Friday sales this year: 53% of stores were trading at more than 1.5 times their normal volumes, compared with only 49% in the equivalent week in 2016. For those stores which promoted heavily in 2016, the median boost was 2.5 times normal. And those in the bottom quartile of sales in 2016 still traded at 108% of their normal volumes. How did Black Friday promotions work for your store? Use our industry benchmarks to find out how your online store is performing against the competition.

2017-11-30

Black Friday discounting increases next season’s purchasing

I knew Black Friday had reached the ‘late adopter’ stage this week when a company I’d bought fencing panels from – fencing panels! – emailed me their holiday season promotions. But the real question is whether all these promotions serve to drive customer loyalty or just attract bargain hunters?

At Littledata we looked at aggregate data from 143 retailers who participated most in Black Friday 2016, versus 143 retailers who did not. For the first 23 days of November 2017 – before Black Friday – the median year-on-year increase in sales was 13% for those pushing discounts the previous year, versus only 1% growth for those avoiding Black Friday discounting.* Our conclusion is that retailers who discounted most heavily on Black Friday 2016 saw a lasting benefit in extra sales a year after the sales period. However, we don’t know whether these extra sales were profitable enough to pay for the seasonal promotions. Another possible explanation is that higher-growth retailers are more active in marketing Black Friday, but in either event the discount season has done them no harm over the following year. In a follow-up post next week we’ll compare the peak discount trading – and see if on average these same stores increased their participation this year or reined it back.

Looking at 2016, it seems Black Friday was bigger than the year before for our cohort of 270 UK retailers – but at the expense of sales later in the season. Yet in the UK we are not close to US levels of hysteria yet, where a much greater proportion of the last quarter’s sales are done on that weekend. The other interesting question is which sectors Black Friday affects. Reflecting back on my 2016 post, it may be a surprise that the biggest boost – an average increase in sales of over 100% – came for Health & Beauty stores, whereas technology and computer stores on average saw a boost of 40% for the week. (The graph shows the difference from the average sales volumes in November & December, by sector, for 3 selected weeks.) And perhaps I shouldn’t have been surprised by those fencing panels: business and industrial sites saw a big boost too!

Interested in tracking online sales activity for your own site this holiday shopping season? Littledata's ecommerce analytics software provides accurate data and automated reporting to help you track promotions and drive conversions and customer loyalty.

* The statistical detail
I took a group of 573 retailers we have tracked for at least 2 years, and looked at the ratio of Black Friday weekend sales (Friday, Saturday, Sunday, Monday) to the 2-month average for November and December. Those in the top quartile (trading at 2.6 times above average during the Black Friday season) were deemed to have participated; those in the bottom quartile, showing a dip in trading over that weekend, were deemed not to have participated. I then looked at the year-on-year growth in revenue between November 2016 (first 23 days) and the same period in November 2017, for the discount versus non-discount group. A t-test between the groups gave a p-value of 18%, so we cannot formally reject the null hypothesis that the two groups have the same mean.

2017-11-24

6 essential benchmarks for Shopify stores

Understanding how your website performs versus similar sites is the best way to prioritise what to improve. In this post we take a look at 6 top benchmarks for optimising Shopify store performance. Accurate benchmark data is especially useful to the increasing number of ecommerce companies using web performance benchmarks, such as bounce rates and home page reliance, as core elements of their sales and marketing KPIs. Understanding benchmarks is key to success. To put together this new benchmarking report, we analysed current data from 470 Shopify retailers. If you're wondering how you compare, check out our Shopify analytics app. Average order value Average order value (AOV) or Average revenue per paying user (ARPU) is the total monthly revenue divided by the number of users who transacted that month. It is a measure of how well you are up-selling and cross-selling your products, depending on your product mix. What is a good average order value for Shopify stores? The benchmark is $69. The average is slightly lower ($63.50) if you are a smaller Shopify store. More than $120 AOV would put you in the top quartile, and one of our top-performing stores in the luxury ecommerce sector is averaging $2,080 per order! If your Shopify store has a lower AOV than the benchmark, you might try increasing your average checkout value by cross-selling other products, offering free shipping above a minimum threshold or increasing pricing on selected products. Ecommerce conversion rate Ecommerce conversion is the number of purchases divided by the total number of sessions. Most visitors will take more than one session to decide to purchase, but this is the standard measure of conversion rate. It is a measure of how good a fit your traffic is for your products, and how well your site converts this traffic into customers. What is a good ecommerce conversion rate for Shopify stores? The benchmark is 1.75%. Larger stores have pushed this to 1.85%, and if you are converting at more than 2.8% you are in the top quartile. The highest conversion rate we've seen on Shopify is 8%. Can you increase the conversion rate with more attractive product displays, or by improving the checkout process? Enhanced ecommerce tracking will help you identify exactly where the blockers lie. Bounce rate from mobile search Since more than 60% of Google searches are now done on mobile, ensuring your site design works on a small screen is important for branding and sales. Bounce rate is the percent of visits of only one page – and will be high if your landing pages do not engage. Google will even adjust your mobile ranking for a given keyword depending on what proportion of visitors stick on your page - a good indication that your link was useful. What is a good bounce rate from mobile search for Shopify stores? The benchmark is 47.5%. The biggest Shopify stores have got this below 40%, and overall large retailers have a 38% mobile bounce rate. So it's not a problem with the Shopify platform, so much as a problem with the store theme – or how the options and products are displayed on a smaller screen. Can you improve the first impressions of the landing pages, put key content higher up the page, or decrease the page load speed to reduce that bounce rate? Delay before page content appears The delay between a page request by the user and them being able to read or click on that page. This is more important than full page load speed for AJAX / lazy loading sites (also called the 'DOM Interactive Time'). What is a good delay time before page content appears?
The benchmark for Shopify stores is 2.75 seconds. Even larger retailers only have this down to 2.8 seconds, so Shopify sites do well on this score. Anything less than 3 seconds is generally acceptable. Internet users are increasingly intolerant of slow sites. Your developers could look at Google PageSpeed Insights for more details. Often the delay will be down to extra scripts which could be delayed or removed. Server response time This is the part of the page load speed which is entirely outside of your control – and due to the speed of the servers your site runs on. What is a good server response time for Shopify stores? The benchmark is 322ms. The average for larger ecommerce is 542ms – so Shopify's server infrastructure is serving you well here. Reliance on the homepage This is the percent of visitors who land on your homepage. If this is above 40% you rely heavily on your homepage to capture brand or paid search traffic. Google increasingly rewards sites with a greater volume of landing pages targeting more specific keyword phrases. What is a good reliance on homepage percentage for Shopify stores? The benchmark is 32%. Larger Shopify stores, with many more landing pages, have reduced this to 7.3% of traffic landing on the homepage on average. Can you build out product landing pages and inbound links to copy their advantage? Ready to benchmark your own website, stop playing guessing games and start scaling your ecommerce business? Our Shopify reporting app is the easiest way to get accurate benchmarking. Install Littledata today and you'll get instant access to up to 20 relevant industry benchmarks for ecommerce sites, plus the tools you need to fix your analytics for accurate tracking, so you'll always know for sure where your website stands. It's all about smart data that helps you focus on making changes that drive revenue and increase conversions. We're here to help you grow!
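As a quick illustration of how the first two benchmarks above are calculated, here is a minimal sketch with hypothetical monthly totals (not real store data):

```javascript
// Hypothetical monthly totals pulled from your analytics or store admin
const monthlyRevenue = 42000;  // total revenue for the month, in dollars
const payingUsers = 600;       // users who transacted that month
const purchases = 630;         // total transactions
const sessions = 36000;        // total sessions

// Average order value (ARPU): revenue divided by users who transacted
const averageOrderValue = monthlyRevenue / payingUsers;   // $70, just above the $69 benchmark

// Ecommerce conversion rate: purchases divided by sessions
const conversionRate = (purchases / sessions) * 100;      // 1.75%, right on the benchmark

console.log(`AOV: $${averageOrderValue.toFixed(2)}`);
console.log(`Conversion rate: ${conversionRate.toFixed(2)}%`);
```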

2017-11-14

Is Google Analytics compliant with GDPR?

From May 2018 the new General Data Protection Regulation (GDPR) will come into force in the European Union, causing all marketers and data engineers to reconsider how they store, transmit and manage data – including Google Analytics. If your company uses Google Analytics, and you have customers in Europe, then this guide will help you check compliance. The rights enshrined by GDPR relate to any data your company holds which is personally identifiable: that is, can be tied back to a customer who contacts you. The simplest form of compliance, and what Google requires in the GA Terms of Use, is that you do not store any personally identifiable information. Imagine a customer calls your company and, using the right of access, asks what web analytics you hold on them. If it is impossible for anyone at your company (or from your agencies) to identify that customer in GA, then the other rights of rectification and erasure cannot apply. Since it is not possible to selectively delete data in GA (without deleting the entire web property history) this is also the only practical way to comply. The tasks needed to comply depend on your definition of 'impossible to identify'! Basic Compliance Any customer data sent 'in the clear' to GA is a clear breach of their terms, and can result in Google deleting all your analytics for that period. This would include: User names sent in page URLs Phone numbers captured during form completion events Email addresses used as customer identifiers in custom dimensions If you're not sure, our analytics audit tool includes a check for all these types of personally identifiable information. You need to filter out the names and emails on the affected pages, in the browser; applying a filter within GA itself is not sufficient. But I prefer a belt-and-braces approach to compliance, so you should also look at who has access to the Google Analytics account, and ensure that all those with access are aware of the need not to capture personal data and GDPR more generally. You should check your company actually owns the Google Analytics account (not an agency), and if not transfer it back. At the web property level, you should check only a limited number of admins have permission to add and remove users, and that all the users only have permission to the websites they are directly involved in. Or you could talk to us about integrations with your internal systems to automatically add and remove users to GA based on roles in the company. Full Compliance Other areas which could possibly be personally identifiable, and which you may need to discuss, are: IP addresses Postcodes/ZIP codes Long URLs with lots of user-specific attributes The customer's IP address is not stored by Google in a database, or accessible to any client company, but it could potentially be accessed by a Google employee. If you're concerned, there is an option to anonymise the last part of the IP address, which still allows Google to detect the user's rough location. ZIP codes are unlikely to be linked to a user, but in the UK some postcodes could be linked to an individual household – and to a person, in combination with the web pages they visited. As with IPs, the best solution is to only send the first part of the postcode (the 'outcode') to GA, which still allows segmenting by location. Long URLs are problematic in reporting (since GA does not allow more than 50,000 different URL variants in a report) but also because, as with postcodes, a combination of lots of marginally personal information could lead back to a person.
For example, if the URL was mysite.com/form?gender=female&birthdate=31-12-1980&companyName=Facebook&homeCity=Winchester, then this could allow anyone viewing those page paths in GA to identify the person. The solution is to replace long URLs with a shortened version like mysite.com/form. And for bonus points... All European websites are required to get visitors to opt in to a cookie policy, which covers the use of the GA tracker cookie. But does your site log whether that cookie policy was accepted, by using a custom event? Doing so would protect you from a web-savvy user in the future who wanted to know what information has been stored against the client ID used in their Google cookie. I feel this client ID is outside the scope of GDPR, but being able to link the user in GA to their opt-in consent for the cookie will help protect against any future data litigation. The final area of contention is hashing emails. This is the process used to convert a plain email like 'me@gmail.com' into a unique string like 'uDpWb89gxRkWmZLgD'. The theory is that hashing is a one-way process, so I can't regenerate the original personal email from the hash, rendering it not personal. The problem is that some common hashing algorithms can be cracked, so actually the original email can be deduced from a seemingly-random string. The result is that under GDPR, such email hashes are considered 'pseudonymized' - the resulting data can be more widely shared for analysis, but still needs to be handled with care. For extra security, you could add a 'salt' to the hashing, but this might negate the whole reason why you want to store a user email in the first place – to link together different actions or campaigns from the same user, without actually naming the user. There are ways around this that strike a compromise. Contact Littledata for a free initial consultation or a GDPR compliance audit.
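To make the 'filter in the browser' advice above more concrete, here is a minimal analytics.js sketch that switches on IP anonymisation and strips query-string parameters (the usual place personal details leak into page URLs) before the pageview is sent. The property ID is a placeholder and the parameter names are only examples; adapt the list to whatever your own forms actually use.

```javascript
// Assumes the standard analytics.js snippet has already defined the ga() queue
ga('create', 'UA-XXXXXX-Y', 'auto');

// Anonymise the last octet of the visitor's IP address before Google stores it
ga('set', 'anonymizeIp', true);

// Remove query parameters that could identify a person from the reported page path
function scrubPath(path) {
  var url = new URL(path, window.location.origin);
  ['email', 'name', 'phone', 'birthdate', 'companyName', 'homeCity'].forEach(function (param) {
    url.searchParams.delete(param);
  });
  return url.pathname + url.search;
}

// Report the scrubbed path instead of the raw URL
ga('set', 'page', scrubPath(window.location.pathname + window.location.search));
ga('send', 'pageview');
```

The same principle applies if you deploy the tracker through Google Tag Manager: clean the page path in a variable before any tag is allowed to use it.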

2017-10-19

Littledata at Codess

I was proud to be invited by Microsoft to speak at their Codess event in Bucharest last week to encourage women in software. We talked about how Littledata uses Meteor, Node and MongoDB to run scalable web applications; slightly controversial because none of these are Microsoft technologies! The event was well run and well attended, so I hope it inspires some of the attendees to start their own projects...or to join Littledata (we're hiring).

2017-10-17

The end of the ecommerce 'thank you' page

For two decades the ecommerce customer journey has stayed roughly the same. Customers browse, add to cart, checkout, and then see a page confirming their purchase: the 'thank you' page. That last step is changing, and this is no small change as it threatens to break how many sites measure purchases. Ecommerce stores that stop using a final 'thank you' page without adjusting their analytics setup accordingly are in danger of getting inaccurate purchase data, or even losing track of shoppers altogether. In order to help our customers get ahead of the curve, we've gone through a number of test cases to find short- and long-term fixes to this issue. But first, a little history. In the old days... In the early days of ecommerce the biggest barrier during checkout was trust. Retailers paid to be certified as 'hack-proof' and customers wanted to make quite sure when and how their money was taken. Fast forward twenty years to today, and in the developed world most consumers have transacted online hundreds of times. They are familiar with the process, expect a seamless user experience, and are confident that when they click 'buy' their payment will be taken and the products delivered. Online shoppers are so confident, in fact, that an increasing number we observe don't even bother waiting for that 'thank you for your order' page. That page is becoming redundant for three reasons: Almost every checkout process captures an email address to send an order receipt to, and the email acts as a better type of confirmation: one that can be searched and referenced. Seriously, when was the last time you opted to 'print the confirmation page' for your records? Many retailers are forced to compete with the superb customer support offered by Amazon. This includes refunds for products that were ordered in error, and quick handling of failed payments. So from a customer's perspective there's little point in waiting for the confirmation page when any issues will be flagged up later. Which leads to the third reason: as retailers improve the speed of checkout, the payment confirmation step is often the slowest, and so the one where customers are most likely to drop out on a slow mobile connection. This is no small issue, as mobile revenues are expected to overtake desktop revenues for ecommerce businesses globally this year. What does this mean for ecommerce sites? The issue is that for many sites the linking of sales to marketing campaigns is measured by views of that 'thank you' page. In the marketing analysis, a 'purchase' is really a view of that 'thank you' page - or an event recorded in the customer's browser alongside the sale. If customers don't view the page, then no sale is recorded. If you have ever been frustrated by the lack of consistency between Google Analytics and your own payment/back-end records, this is the most likely issue. A dependency on viewing the 'thank you' page brings other problems too: a buggy script, perhaps from another marketing tag, can block the recording of sales. This is another source of the type of analytics inaccuracy which the Littledata app combats automatically. How to adjust your ecommerce tracking The short-term fix is to tweak the firing order of marketing tags on the 'thank you' page, so that even customers who see the page for fractions of a second will be recorded. Sites with a large number of marketing tags will have the greatest room for improvement.
But in the long term, as this trend continues, the analytics solution is to link the marketing campaigns to the actual payments taken. This removes the need for the customer to see any type of 'thank you' or confirmation page, and also removes discrepancies between what your marketing platform tells you was purchased and what actually got bought. This is known as server-side tracking. The good news for those of you on the Shopify platform is that our Shopify reporting app does this already - and solves a lot of other analytics problems in one install. For those on other platforms, please do contact us for advice. The Littledata team has worked with ecommerce businesses to set up integrations with Magento, DemandWare and numerous custom platforms. Not only can we help fix your analytics setup for accurate tracking, but our app then automates the audit and reporting process for all of your sites going forward.
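For those wondering what server-side tracking looks like in practice, here is a minimal Node.js sketch using Google Analytics' Measurement Protocol to record the transaction at the moment the payment is confirmed on your server, with no reliance on a 'thank you' page being viewed. The property ID, client ID and order values are placeholders, and it assumes Node 18+ for the built-in fetch (or swap in a library such as node-fetch).

```javascript
// Send a transaction hit to Google Analytics via the Measurement Protocol
// as soon as the order is confirmed on the server.
async function recordPurchase(order) {
  const params = new URLSearchParams({
    v: '1',                        // Measurement Protocol version
    tid: 'UA-XXXXXX-Y',            // your GA web property ID (placeholder)
    cid: order.gaClientId,         // GA client ID captured during checkout (from the _ga cookie)
    t: 'transaction',              // hit type
    ti: order.id,                  // transaction ID
    tr: order.revenue.toFixed(2),  // transaction revenue
    cu: order.currency,            // currency code, e.g. 'USD'
  });

  await fetch('https://www.google-analytics.com/collect', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: params.toString(),
  });
}

// Example usage with hypothetical order data
recordPurchase({ gaClientId: '555.1234567890', id: 'ORD-1001', revenue: 49.99, currency: 'USD' });
```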

2017-08-30

Custom reporting for marketing agencies

Are you a digital marketing agency looking for new reporting solutions? As our agency partnerships continue to grow, we thought it would be useful to outline how Littledata's custom reporting helps forward-thinking agencies cut down on reporting time, visualise data and improve performance for their clients. The marketing landscape is complex, but your reporting doesn't have to be overly complicated. With such a wide range of channels and sites to track, many agencies struggle to find the best analytics tools. To you we say: Welcome, you've finally found a solution that both simplifies and enhances the reporting process. Smarter reporting and accurate analytics Do you produce regular campaign performance reports in Excel or Google Sheets for your clients? Have you rejected other reporting solutions as being too rigid or complex for your needs? Then Littledata’s custom reports could work well for you and your clients. We automate the data fetching and calculations you currently run manually, and display the results to clients in a streamlined web app. We'll even show you the most important metrics, and report on key changes - automatically. One key advantage over tools such as Tableau, Data Studio or Chartio is that you can define a template report and then roll it out for many different web properties (or segments of websites) with the click of a button. Compared with other solutions you may have considered we also offer: Full support in data setup, report design and client onboarding Branded report packs for your clients and customers Complete life cycle data on your clients' customers, from marketing attribution to repeat purchases (including for subscription-based businesses) 1st line support to end users Flexibility to calculate any metrics (using Google Sheets in our processing pipeline) Comparison to industry benchmarks for sales, marketing and web performance - or create private benchmarks amongst your own client base Actionable insights for any online business to improve marketing ROI and increase conversions, whether one large ecommerce site or a series of micro-sites Integration of Google Analytics with Google Search Console data for powerful SEO reports We’re also open to discussions about white-labelling the Littledata app. This type of partnership works best for agencies with at least 20 clients ready to take advantage of our intelligent analytics tools. Please contact us if you’d like a demo, to see how this has worked for existing customers, or to discuss a particular client’s needs. Get ready to love your analytics :)

2017-08-09

How to add tracking for multiple websites or apps (VIDEO)

If you're tracking multiple sites or apps in Google Analytics, you can connect all of these views to your Littledata account and easily switch between them. Watch this quick video to learn how to add or remove a Google Analytics data source in the Littledata app. FAQs - Working with multiple Google Analytics views How do Littledata reports link to Google Analytics views? When you click to set up another site you will see a list of all the Google Analytics properties and views linked to your Google account. Typically you will only be interested in one of the views, which contains data for the site or app you are working on. When you select a view, Littledata fetches the data it needs to enable core features such as our intelligent Google Analytics audit and industry benchmarking. Note that this doesn't commit you to purchase anything. The underlying data in your Google Analytics account is not affected unless you opt in to our automated fixes, which let you automatically fix particular aspects of your Google Analytics setup. How many websites or apps can I track? You can set up standard reporting for as many websites as you like. However, if you're using Littledata's Pro services for advanced custom reporting, this is priced per view or data source. You can switch between these sites using the drop-down menu in the top bar. Does your reporting work with mobile app properties? Right now, some of the features will work - such as dashboards, alerts and buyer personas - but audit and benchmarking are specifically for websites. How do I add or remove a site? Once you've connected multiple web properties to your Littledata account, you can manage them using the My Sites page under the profile photo drop-down menu in the upper right. Can Littledata handle micro-sites? Yes. If each micro-site has its own Google Analytics view, then go ahead and connect them all to your Littledata account. If the micro-sites are all under one web view, then ask the Littledata team about custom solutions to create a multi-site dashboard that lets you visualise Google Analytics data from many micro-sites and benchmark against each other. We have done this for a range of customers and are happy to discuss the details of what is involved in reporting on multiple micro-sites, whether just a few or several hundred!

2017-08-02

TechHub London demo roundup

Last night we gave a live demo of the Littledata app at TechHub London's Tuesday demo night. It's always exciting to share Littledata with other entrepreneurs and business owners, and to get their feedback about Google Analytics issues (everybody has some!). But in this post I'm putting our app aside for a moment in order to share some thoughts on the other company demos from the event. After all, isn't sharing feedback and ideas what the TechHub community is all about? My Film Buzz MyFilmBuzz is an early-stage mobile app – launched eight weeks ago with 150 users. The user interface is really intuitive, making use of great visuals from movies and Tinder-style swiping to rate movies. The commercial problem is competing with established players like Rotten Tomatoes, which have big established audiences. Can a better interface tempt film viewers away? HeathClub TV HeathClub TV offers personalised training videos and exercises, selling via personal trainers who create their own profile and packages. A bit like Udemy for personal training courses, the trainers take a cut of the course fees. Again personal fitness is a very competitive market – the founder said one competitor spent £1.5m on the first version of their mobile app. I've personally enjoyed the 8-fit mobile app, with a similar mix of video exercises but without the marketplace for trainers to produce content. It will be interesting to see if the user-generated content model wins out in this market. Trevor.io Trevor helps companies visualise data sources from their own business, such as SQL databases. The user interface does a good job of simplifying a complex task, switching between table and graph views. As a data geek, I love it! We thought about a similar product in the early stages of Littledata, so my big question is: how many users have the analytical knowledge to create the data integrations, but aren't comfortable using SQL or similar? At Littledata, most of our analysts progress to coding, because it makes them quicker to do the analysis – but then we are an unusually techy company. Grocemania Grocemania allows customers to place orders from local retailers, charging a small delivery fee (£2.50) and small minimum order (£10) subsidised by 15% commission from the retailers. They have launched a pilot in Surrey with nine retailers. The strategy seems to be to undercut other delivery companies, with lower delivery costs from freelancers and passing stock control on to the retailers. The presenters got a groan for highlighting how they reduce employment costs, but my real concern is how they can profitably undercut companies like Amazon who are ruthless pros at retail and delivery. Worksheet Systems Similar to Trevor, Worksheet Systems aims to solve the problem of storing lots of data in interconnected spreadsheets. Their idea is to split the user interface and database inherent in a complex spreadsheet, and present it as a kind of Google Sheet – rather than the customer building an actual database. It looks really powerful, but I wasn't clear what it can do that Google Sheets doesn't; we use Sheets for lots of smaller 'databases' in Littledata, and it's both simple and powerful. Crowd.Science Crowdfunding for scientific projects, helping scientists raise money from individual donations, business sponsorship and charitable trusts. They take 5 – 10% commission of the money raised. It seems like a great model: crowdfunding is well proven in other areas, and some scientific projects have real public benefit.
As the trustee of a grant-giving trust, I know the way we find projects is fairly inefficient, so this platform would be a great benefit as it takes off. Realisable Realisable is an Extract, Transform and Load (ETL) tool, with a visual business rules editor to transform a data source. Their live demo used a job to transform unshipped orders from Shopify into a format that can be exported to an accounting package, adding a customer ID to the transactions. I investigated this market in 2016, and there are some very big companies in the ETL market. Many of their products suck - a great opportunity - but there are ones with better user interfaces like Stitch Data. Talking to the founders afterwards, their strategy is to dominate a channel (in their case, Sage consultants); I know this has really worked for another ETL tool, Matillion for Amazon RedShift. Conclusion What's my favourite idea (outside of Littledata)? Crowd.Science has the biggest potential commercially, I think, but I do love Trevor's product.

2017-07-05

Introducing Buyer Personas

This week we're excited to introduce Buyer Personas, a game-changing new feature for marketers and ecommerce teams that are serious about hacking growth at scale. Do you know which types of customers are most likely to convert? Gathering customer data is one thing, but turning it into actionable insights is another. We've found that Littledata users are often struggling to find the exact differences between web visitors that buy and those that don't buy, especially when it comes to particular marketing channels. Littledata's new Buyer Personas feature automatically generates user personas based on your particular Google Analytics ecommerce setup or conversion goals, making it easier than ever to target your marketing and on-site content at those shoppers most likely to engage, convert, and grow with your online business in the long term. For example, if you know that users who arrive on your site at the weekend, in the afternoon, are more likely to buy, then you should allocate more of your budget to those times. Or if users on tablets are most likely to convert, then target campaigns and ad formats most relevant for that screen size. Accurate Data If you have a decent Google Analytics setup it is possible to look at how different attributes of the user (age, browsing device, time of visit, etc.) affect their likelihood of converting. The better the data setup for your 'people analytics', the more detailed the report can be – when's the last time you audited your website's Google Analytics setup? Buyers or Users? We're calling the new feature Buyer Personas since this is often requested by retail customers, but it is equally relevant if you have another conversion goal (e.g. registrations, event bookings). In all of these cases, your customers are essentially 'buying in' to your product or service. You can switch the conversion metric at the bottom of the Buyer Personas page in the app. Marketing Channels Buyer personas give you actionable insights on particular channels, such as paid search, while also improving your overall understanding of your ideal customer base. The feedback is split out by channel so you can action it more easily: how you would re-organise your paid search marketing is very different to how you re-target your email marketing, but both are needed. The reality is that most smaller websites won't see many visitors who exactly match the ideal persona. We are not saying that only that exact profile will convert but that, by targeting the marketing on those who convert most easily, you can improve your return on investment. Pick the category with the biggest potential audience first. The first iteration of the new feature is live in the app this week. We look forward to hearing your feedback! Note that to generate Buyer Personas, you will need an active conversion goal or ecommerce tracking setup, and a minimum of 50 conversions in the previous month. Don't have a Littledata account yet? Sign up today to fix your Google Analytics setup for free and start generating buyer personas.

2017-07-04

How to install our Shopify reporting app (VIDEO)

https://www.youtube.com/watch?v=I3c8OuqDj_8 Watch this quick video to learn how to install our Google Analytics Shopify app. The popular reporting app makes it easy to get better Google Analytics data about your Shopify store. To install Littledata's Shopify app and start trusting your data, follow the easy steps in the video: Get the app Authorise Google Analytics (GA) access Pick the existing GA data for your site Our app runs the migration process on your store Swap in Littledata's tracker (in your Shopify store admin) Confirm and go live! This video covers the basic setup process for fixing your data collection and setting up accurate tracking. But that's only the beginning of what the app can do for you. Once you've successfully installed the app and fixed your analytics setup, we recommend making daily use of your new analytics dashboard, setting up custom reports and alerts, and checking out relevant ecommerce benchmarks. Shopify stores love how the app automatically shows you the most important metrics for your sales and marketing. With a clear view of the complete user lifecycle -- from marketing channel engagement, to shopping cart activity, to repeat buying -- the sky's the limit!

2017-06-30

Is Google Analytics accurate? 6 common issues and how to resolve them

Our customers come from a range of industries, but when they first come to the Littledata app for help with fixing their analytics, they share a lot of common questions. First of all, is Google Analytics accurate? How do you know if your Google Analytics setup is giving you reliable data? In this blog post we look at common problems and explain what can be done to make your tracking more accurate. Google Analytics is used by tens of millions of websites and apps around the world to measure web visitor engagement. It won't measure 100% of visitors – due to some users opting out of being tracked, or blocking cookies – but set up correctly, it should be measuring over 95% of genuine visitors (as opposed to web scrapers and bots). What are the common things that go wrong? The six most common issues with Google Analytics -- and how to resolve them 1. Your tracking script is wrongly implemented There are two common issues with the actual tracking script setup: 1) when it is implemented twice on some pages, and 2) when it is missing completely from some pages. The effect of duplicating the script is that you'll see an artificially low bounce rate (usually below 5%), since every page view is sent twice to Google Analytics. The effect of the tracking script missing from pages is that you'll see self-referrals from your own website. Our recommendation is to use Google Tag Manager across the whole site to ensure the tracking script is loaded with the right web property identifier, at the right time during the page load. 2. Your account has lots of spam When it comes to web traffic and analytics setup, spam is a serious issue. Spammers send 'ghost' referrals to get your attention as a website owner. This means that the traffic you see in Google Analytics may not come from real people, even if you have selected to exclude bots. Littledata's app filters out all future spammers, and Pro Reporting users benefit from having those filters updated weekly. 3. Your own company traffic is not excluded Your web developers, content writers and marketers will be heavy users of your own site, and you need to filter this traffic from your Google Analytics to get a view of genuine customers or prospects. You can do this based on location (e.g. IP address) or pages they visit (e.g. admin pages). 4. One person shows up as two or more users Fight Club aside (spoiler alert), when the same person re-visits our site we expect them to look the same each time. Web analytics is more complicated. What Google Analytics is tracking when it talks of 'users' is a visit from a particular device or browser instance. So if I have a smartphone and a laptop computer and visit your site from both devices (without cross-device linking) I'll appear as two users. Even more confusingly, if I visit your site from the Facebook app on my phone and then from the Twitter app, I'll appear as two users – because those two apps use two different internet browser instances. There's not a lot which can be done to fix that right now, although Google is looking at ways to use its accounts system (Gmail, Chrome etc) to track across many devices. 5. Marketing campaigns are not attributed to revenue or conversions If the journey of visitors on your site proceeds via another payment processor or gateway, you could be losing the link between the sale (or goal conversion) and the original marketing campaigns. You will see sales attributed to Direct or Referral traffic, when they actually came from somewhere else.
This is a remarkably common issue with Shopify stores, and that's why we built a popular Shopify reporting app that solves the issue automatically. For other kinds of sites, the issue can often be resolved by setting up cross-domain tracking. 6. You aren't capturing key events (like purchases or button clicks) Google Analytics only tracks views of a page by default, which may not be meaningful if you have a highly interactive website or app. Sending custom events is the key to ensuring that your tracking is both accurate and relevant. Google Tag Manager makes this easier than it would be otherwise, but you may need to speak to a qualified analytics consultant to decide what to track. If you want more certainty that your analytics is fully accurate, try Littledata's free Google Analytics audit or get in touch for a quick consultation. We <3 analytics and we're always here to help.
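To illustrate point 6 above, here is a minimal analytics.js sketch that records a key interaction – in this hypothetical case a click on an 'add to cart' button – as a custom event. With Google Tag Manager the same thing is usually done with a click trigger and an event tag rather than hand-written code.

```javascript
// Assumes the standard analytics.js snippet is already loaded on the page
var addToCartButton = document.querySelector('#add-to-cart'); // hypothetical button ID

if (addToCartButton) {
  addToCartButton.addEventListener('click', function () {
    // Event category / action / label are free-form; keep them consistent across the site
    ga('send', 'event', 'Ecommerce', 'Add to cart', addToCartButton.dataset.productName);
  });
}
```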

2017-06-27

Why are all my transactions coming from Direct or Referral in Google Analytics, with no marketing attribution?

Connecting marketing data with sales data is an age-old problem, and the crowded digital landscape has made this even more complicated. Google Analytics is supposed to give you the power to attribute sales (or purchase transactions) back to marketing campaigns, but this doesn't happen automatically. The good news is that it's entirely possible to get the right marketing channel attribution for sales activities. Accurate marketing attribution starts with the right Google Analytics (GA) setup. Start by asking yourself the following troubleshooting questions. These steps will help you figure out if your GA setup is correct, and how to use GA to get a complete view of user behaviour. Trustworthy GA setup takes a bit of work, but with a smart analytics dashboard like Littledata, much of that work can be automated. In fact, steps 1 through 4 can be checked automatically with our free Google Analytics audit tool. First of all, are you checking the right report? The best way to see the attribution is in the 'Channels' report in Google Analytics, under the 'Acquisition' section: 1. Have you got a large enough sample to compare? Firstly, can you be sure the sales are representative? If you only have two sales, and both are 'Direct', that could be a fluke. We recommend selecting a long enough time period to look at more than 50 transactions before judging, as with this example: 2. Is the tracking script on your purchase confirmation page set up correctly? If you are getting some transactions recorded, but not 100%, then it may be possible to optimise the actual tracking script setup. See our technical guide to ecommerce tracking. This can be a particular problem if many of your sales are on mobile, since slower page load speeds on mobile may be blocking the tracking script more often. 3. Have you got a cross-domain problem? If you see many of your sales under Referral, and when you click through the list of referrers it includes payment gateways (e.g. mybank.com or shopify.com), that is a tell-tale sign you have a cross-domain problem. This means that when the buyer is referred back from the payment domain (e.g. paypal.com), their payment is not linked with the original session. This is almost always a problem for Shopify stores, which is why our Shopify app is essential for accurate tracking. 4. Is your marketing campaign tagging complete? For many types of campaign (Facebook, email etc), unless you tag the link with correct 'UTM' parameters, the source of the purchaser will not be tracked. So if a user clicks on an untagged Facebook Ad link on their Facebook mobile app (which is where 80 – 90% of Facebook users engage) then the source of their visit will be 'Direct' (not Social). Untagged email campaigns are a particular issue if you run abandoned cart / basket emails, as these untagged links will be 'stealing' the sales which should be attributed to whatever got the buyer to add to cart. Tagging is a real problem for Instagram, since currently the profile link is shown in full - and looks really messy if you include all the UTM parameters. We recommend using a service like Bitly to redirect to your homepage (or an Instagram landing page). i.e. The link redirects to yoursite.com?utm_medium=social&utm_source=instagram&utm_campaign=profile_link. Read Caitlin Brehm's guide to Instagram links. 5. (only for subscription businesses using Littledata) Are you looking at only the first-time payments?
Tracking the source of recurring payments is impossible if the tracking setup was incorrect at the time of the first payment. You can't change Google Analytics retrospectively, I'm afraid. So if you are using our ReCharge integration, and you want to track lifetime value, you will have to be patient for a few months as data from the correct tracking builds up. 6. Is a lot of your marketing via offline campaigns, word of mouth or mobile apps? It could be that your sales really are 'direct': If a buyer types in the URL from a business card or flyer, that is 'Direct'. The only way to change this is to use a link shortener to redirect to a tagged-up link (see point 4 above). If a user pastes a link to your product in WhatsApp, that is 'Direct'. If a user sees your product on Instagram and clicks on the profile link, that is 'Direct'. Please let us know if there are any further issues you've seen which cause the marketing attribution to be incorrect.
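If you are tagging campaign links by hand (point 4 above), a small helper like this keeps the UTM parameters consistent – the values shown simply recreate the Instagram profile-link example:

```javascript
// Build a consistently tagged campaign URL (values match the Instagram example above)
function tagCampaignLink(baseUrl, medium, source, campaign) {
  const url = new URL(baseUrl);
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}

const profileLink = tagCampaignLink('https://yoursite.com/', 'social', 'instagram', 'profile_link');
// => https://yoursite.com/?utm_medium=social&utm_source=instagram&utm_campaign=profile_link
// Shorten this with a service like Bitly before pasting it into an Instagram profile.
```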

2017-06-13

What to test with Google Optimize

So you've got a brand new tool in your web performance kit – Google Optimize – and now you want to put it to good use. What can you test with Optimize and how does it work? Firstly, what are the different options for setting up an experiment? AB Test Using the in-page editor you can create an altered version of the page you wish to test. This could be a change of text copy, different styling, or swapping in a different image. You can also add new scripts or HTML if you're familiar with coding. The way this works is Optimize adds a script after the page loads to manipulate the page text, images or styles. I recommend not switching header elements or large images using this method as, depending on your website setup, there may be a noticeable flicker – try a redirection test below. You can create many versions with subtly different changes (C, D and E versions if you want) – but remember you'll need a large volume of traffic to spot significant differences between lots of variations. You can also limit the test to a certain segment of users – maybe only first-time visitors, or those on mobile devices. Multivariate Test Similar to an AB test, a multivariate test is used when you have a few different aspects of the page to change (e.g. image and headline text) and you want to see which combination is most engaging. To get a significant result, you'll need a large volume of traffic - even more than testing many options in AB tests. Redirection Test This is where you have two different versions of a page – or a different flow you want to start users on. Optimize will split your visitors, so some see the original page and some are redirected to the B version. A redirection test is best when the page content or functionality is very different – perhaps using a whole different layout. The disadvantage is you'll need a developer to build the B version of the page, which may limit the speed of cycling tests. Personalisation Personalisation is not officially supported by Optimize right now, but we've found it to be a useful tool. You can assign 99.9% of the visitors who match certain criteria to see the alternative version of the page. An example is where you have a special offer or local store in a particular city - see our step-by-step local personalisation example. You can ensure that all the visitors from that city see a different version of the page. Unfortunately on the free version of Google Optimize you are limited to 3 concurrent 'experiments' – so it won't be a good solution if you want to run similar personalisation across lots of cities or groups of users. Next the question is where to start with tests... Start with the landing pages Landing pages get the greatest volume of traffic, and are where small visual changes (as opposed to new product features) make the biggest difference to user engagement. This greater volume allows you to get a significant result sooner, meaning you can move on to the next test quicker. And keep on improving! So what exactly could you test using Google Optimize? Here are six ideas to get you going. 1. Could calls-to-action (CTAs) be clearer? Changing the colour or contrast of a key button or link on the page (within your brand guidelines) usually results in more visitors clicking it. This might involve changing the style of the CTA itself, or removing elements close by on the page – to give the CTA more space to stand out. 2. Are you giving the user too many choices?
In Steve Krug's classic Don't Make Me Think he explains how any small confusion in the user's mind can stop them making any choice. Every choice the user has to make is an opportunity for them to give up. Try hiding one of the options and seeing if more users overall choose any of the remaining options. 3. Is the mobile page too long? As many sites have moved to responsive designs that switch layout on smaller screens, mobile pages have become very long. Users may get 'scroll fatigue' before they get to critical elements on the page. Try cutting out non-essential sections for mobile users, or editing copy or images to make the page shorter. You could also try switching sections so that the call-to-action is higher up the page on mobile – although this is harder to achieve without a redirection test. 4. Is localisation important to your users? You may have discussed providing local language content for your users, and been unsure if it is worth the costs of translation and maintenance. Why not test the benefits for a single location? As with the personalisation tests, you can show a different local language (or local currency) version of the page to half the users in the single location (e.g. Spanish for visitors from Mexico) and see if they convert better. 5. Does the user need more reassurance before starting to buy? It is easier to build experiments which remove elements from the page, but you should also consider adding extra explanatory messages. A common problem on ecommerce stores is that visitors are unsure what the shipping charges or timing will be before adding to cart. Could you add a short sentence at the start of the journey (maybe on a product page) to give an outline of your shipping policy? Or maybe some logos of payment methods you accept? 6. Changing header navigation If your site has a complex mix of products that has evolved over time it may be time to try a radical new categorisation – maybe splitting products by gender or price point rather than by type. For this test, you'll want to target only new visitors – so you don't confuse regular visitors until you're sure it's permanent. You will also need to make the navigation changes on all pages across the site. Good luck! Littledata also offers consulting and AB testing support, so please contact us for any further advice.

2017-05-30

How to add account edit permissions for Google Analytics

Being able to edit the Google Analytics account is the 2nd highest permission level. You need this if you want to create a new web property in Google Analytics. To grant permissions to another user you will need the highest permission level yourself: being able to manage users on the account. Step 1: Go to the account user settings page First click the admin cog in any view under the account in GA you want to change, and then in the left-hand list go to User Settings. Step 2: Add or edit the user EITHER select an existing user from the list and check the 'edit' checkbox OR add a new user's email (must be a Google account) and check the 'edit' checkbox. Step 3: Check it's working Your colleague should now be able to see 'Create new property' under the list of properties in the middle of the Admin page.

2017-05-16

Shopify Marketing Events vs Google Analytics

At the Shopify Unite conference today I heard plenty of great ideas such as ShopifyPay, but the most interesting for me as a data specialist was the marketing events API. Since we launched our Fix Google Analytics Shopify app earlier this year we've known that reporting was a weak spot in Shopify's platform offering, and they admit that 'understanding marketing campaign performance' is one of the biggest challenges for Shopify merchants right now. The ability for other Shopify apps to plug their campaign cost and attribution data into Shopify (via the marketing events API) is a logical step to building Shopify's own analytics capability, but I don't believe it will be a substitute for Google Analytics (GA) anytime soon. Here's why: 1. Google Analytics is the industry standard Every online marketer has used Google Analytics, and many have favourite reports they've learned to interpret. Moving them to use a whole new analysis platform will take time – and it's taken GA 10 years to achieve that dominance. 2. GA provides platform-agnostic data collection For a store using Shopify as their only source of insights, moving away from Shopify would mean losing all the historic marketing performance data – so it would be very hard to make like-for-like comparisons between the old platform and the new. Many of our customers have used GA during and after a platform shift to get continuous historical data. Which ties into my first point that over 85% of businesses have a history of data in GA. 3. Incomplete marketing tagging will still cause issues Making a valid analysis of multi-channel marketing performance relies on having ALL the campaigns captured - which is why our GA audit tool checks for completeness of campaign tagging. Shopify's tracking relies on the same 'utm_campaign' parameters as GA, and campaigns that are not properly tagged at the time cannot be altered retrospectively. 4. Google is rapidly developing Google Analytics I'd like to see the Shopify marketing event collection evolve from its launch yesterday, but Google already has a team of hundreds working on Google Analytics, and it seems unlikely that Shopify will be able to dedicate resources to keep up with the functionality that power users need. 5. More integrations are needed for full campaign coverage Shopify's marketing analysis will only be available for apps that upgrade to using the new API. Marketing Events has launched with integrations for Mailchimp and Facebook (via Kit) but it won't cover many of the major channels (other emails, AdWords, DoubleClick for Publishers) that stores use. Those integrations will get built in time, but until then any attribution will be skewed. 6. GA has many third-party integrations Our experience is that any store interested in their campaign attribution quickly wants more custom analysis or cuts of the data. Being able to export the data into Littledata's custom reports (or Google Sheets or Excel) is a popular feature – and right now Shopify lacks a reporting API to provide the same customisations. You can only pull raw event data back out. That said, there are flaws with how GA attribution works. Importing campaign cost data is difficult and time-consuming in GA – apart from the seamless integration with AdWords – and as a result hardly any of the stores we monitor do so. If Shopify can encourage those costs to be imported along with the campaign dates, then the return on investment calculations will be much easier for merchants.
I also think Shopify has taken the right pragmatic approach to attribution windows. It counts a campaign as 'assisting' the sale if the sale happens within 30 days of the campaign, and also records whether it was the 'last click' or 'first click'. I've never seen a good reason to get more complicated than that with multi-channel reports in GA, and it's unlikely that many customers remember a campaign from more than 30 days ago. In conclusion, we love that Shopify is starting to take marketing attribution seriously, and we look forward to helping improve the marketing events feature from its launch yesterday, but we recommend anyone with a serious interest in their marketing performance sticks to Google Analytics in the meantime (and uses our Shopify app to do so).
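To make that attribution logic concrete, here is a rough sketch of a 30-day window with first-click and last-click labels – hypothetical data structures, and not Shopify's actual implementation:

```javascript
// Rough sketch of a 30-day attribution window with first-click and last-click labels.
// `campaignTouches` is a hypothetical array of { campaign, timestamp } for one buyer.
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function attributeSale(saleTimestamp, campaignTouches) {
  // Only campaigns touched in the 30 days before the sale count as 'assisting'
  const assisting = campaignTouches
    .filter(t => t.timestamp <= saleTimestamp && saleTimestamp - t.timestamp <= THIRTY_DAYS_MS)
    .sort((a, b) => a.timestamp - b.timestamp);

  if (assisting.length === 0) {
    return { assisting: [], firstClick: null, lastClick: null };
  }

  return {
    assisting: assisting.map(t => t.campaign),
    firstClick: assisting[0].campaign,                     // earliest touch inside the window
    lastClick: assisting[assisting.length - 1].campaign,   // most recent touch before the sale
  };
}
```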

2017-04-21

Important update to Remarketing with Google Analytics

If you got this email from Google recently, or have seen the blue notification bar at the top of Google Analytics, here's what is changing and how it affects your website. The big problem in modern online marketing is that most users have multiple devices, and the device they interact with the advert on is not the same as the one they convert on: [Google's] research shows that six in ten internet users start shopping on one device but continue or finish on a different one. Facebook has been helping advertisers track conversion across devices for a few years – because most Facebook ads are served on their mobile app, while most conversions happen on larger screens. So Google has been forced to play catch-up. Here's the message from the Google Analytics header: Starting May 15, 2017, all properties using Remarketing with Google Analytics will be enhanced to take advantage of new cross-device functionality. This is an important update to your remarketing settings, which may relate to your privacy policy. The change was announced last September but has only just rolled out. So you can remarket to users on a different device to the one on which they visited your site when: You build a retargeting audience in Google Analytics You have opted in to remarketing tracking in Google Analytics Users are logged into Google on more than one device Users have allowed Google to link their web and app browsing history with their Google account Users have allowed their Google account to personalise ads they see across the web This may seem like a hard-to-reach audience, but Google has two secret weapons: Gmail (used by over 1 billion people and 75% of those on mobile) and Chrome (now the default web browser for desktop, and growing in mobile). So there are many cases where Google knows which devices are linked to a user. What is not changing is how Google counts users in Google Analytics. Unless you are tracking registered users, a 'user' in Google Analytics will still refer to one device (tablet, mobile or laptop / desktop computer). Could Google use their account information to make Google Analytics cross-device user tracking better? Yes, they could; but Google has always been careful to keep their own data about users (the actions users take on Google.com) separate from the data individual websites capture in Google Analytics (the actions users take on mywebsite.com). The former is owned by Google, and protected by a privacy agreement that exists between Google and the user, and the latter is owned by the website adding the tracking code but stored and processed by Google Analytics. Blurring those two would create a legal minefield for Google, which is why they stress the word 'temporary' in their explanation of cross-device audiences: In order to support this feature, Google Analytics will collect these users' Google-authenticated identifiers, which are Google's personal data, and temporarily join them to your Google Analytics data in order to populate your audiences. How can I make use of the new cross-device retargeting? The first step is to create a remarketing audience from a segment of your website visitors that are already engaged. This could be users who have viewed a product, users who have viewed the pricing page or users who have viewed more than a certain number of pages. For more help on setting up the right goals to power the remarketing audience, please contact us.

2017-04-10

How does page load speed affect bounce rate?

I've read many articles stating a link between faster page loading and better user engagement, but with limited evidence. So I looked at hard data from 1,840 websites and found that there's really no correlation between page load speed and bounce rate in Google Analytics. Read on to find out why. The oft-quoted statistic on page load speed is from Amazon, where each 100ms of extra loading delay is supposed to cost Amazon $160m. Except that the research is from 2006, when Amazon's pages were very static, and users had different expectations from pages – plus the conclusions may not apply to different kinds of site. More recently in 2013, Intuit presented results at the Velocity conference of how reducing page load speed from 15 seconds to 2 seconds had increased customer conversion by: +3% conversions for every second reduced from 15 seconds to 7 seconds +2% conversions for every second reduced from 7 seconds to 5 seconds +1% conversions for every second reduced from 4 seconds to 2 seconds So reducing load speed from 15 seconds to 7 seconds was worth an extra 24% conversion, but only another 8% to bring 7 seconds down to 2 seconds. Does page speed affect bounce rate? We collected data from 1,840 Google Analytics web properties, where both the full page load time (the delay between the first request and all the items on the page being loaded) and the bounce rate were within normal range. We then applied a Spearman's rank correlation test, to see whether a site ranked higher for speed (lower page load time) was likely to be ranked higher for bounce rate (lower bounce rate). What we found is almost no correlation (0.18) between page load speed and bounce rate. This same result was found if we looked at the correlation (0.22) between bounce rate and the delay before page content starts appearing (time to DOM ready). So what explains the lack of a link? I have three theories: 1. Users care more about content than speed Many of the smaller websites we sampled for this research operate in niche industries or locations, where they may be the only source of information on a given topic. As a user, if I already know the target site is my best source for a topic, then I'll be very patient while the content loads. One situation where users are not patient is when arriving from Google Search, and they know they can go and find a similar source of information in two clicks (one back to Google, and then out to another site). So we see a very high correlation between bounce rate and the volume of traffic from Google Search. This also means that what should concern you is speed relative to your search competitors, so you could be benchmarking your site speed against a group of similar websites, to measure whether you are above or below average. 2. Bounce rate is most affected by first impressions of the page As a user landing on your site I am going to make some critical decisions within the first 3 seconds: would I trust this site, is this the product or content I was expecting, and is it going to be easy to find what I need. If your page can address these questions quickly – by good design and fast loading of the title, main image etc – then you buy some more time before my attention wanders to the other content. In 2009, Google tried an experiment to show 30 search results to users instead of 10, but found the users clicking on the results dropped by 20%. They attributed this to the half a second extra it took to load the pages.
But the precise issue was likely that it took half a second longer to load the first search result. Since users of Google mainly click on the first 3 results, the important metric is how long it took to load those – not the full page load. 3. Full page load speed is increasingly hard to measure Many websites already use lazy loading of images and other non-blocking loading techniques to make sure the bare bones of a page are fast to load, especially on a mobile device, before the chunkier content (like images and videos) is loaded. This means the time when a page is ready for the user to interact with is not a hard line. SpeedCurve, a tool focussed entirely on web page speed performance, has a more accurate way of tracking when the page is ‘visually complete’, based on actual filmstrips of the page loading. But in their demo of The Guardian’s page speed, the page is not visually complete until a video advert has rendered in the bottom right of the screen – and personally I’d be happy to use the page before then. What you can do with Google Analytics is send custom timing events, maybe after the key product image on a page has loaded, so you can measure speed as relevant to your own site (see the sketch at the end of this post). But doesn’t speed still affect my Google rankings? A little bit, yes, but when Google incorporated speed as a ranking signal in 2010, their head of SEO explained it was likely to penalise only the 1% of websites which were really slow. And my guess is that in 7 years Google has increased the sophistication with which it measures ‘speed’. So overall you shouldn’t worry about page load times on their own. A big increase may still signal a problem, but you should be focussing on conversion rates or page engagement as a safer metric. If you do want to measure speed, try to define a custom speed measurement for the content of your site – and Littledata’s experts can work with you to set that custom reporting up.
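As a footnote, here is a minimal sketch of the kind of custom timing hit described above, assuming the standard analytics.js tracker (window.ga) is already on the page; the element id 'main-product-image' is just a placeholder for whatever key content matters on your site.
[code language="javascript"]
<script>
// Minimal sketch: report how long it took for a key image to finish loading,
// assuming analytics.js (window.ga) is already installed on the page.
// The id 'main-product-image' is a placeholder - use your own element.
(function () {
  var img = document.getElementById('main-product-image');
  if (!img || !window.ga || !(window.performance && window.performance.now)) return;

  var report = function () {
    // Milliseconds from navigation start until this point
    var elapsed = Math.round(window.performance.now());
    ga('send', 'timing', 'Page load', 'Key image loaded', elapsed);
  };

  // The image may already be complete (e.g. served from cache) before this runs
  if (img.complete) {
    report();
  } else {
    img.addEventListener('load', report);
  }
})();
</script>
[/code]
The timings should then appear under Behaviour > Site Speed > User Timings, where you can trend them alongside bounce rate for the pages you care about.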

2017-04-07

The Freemium business model revisited

After I concluded that freemium is not the best business model for all, the continued rise of ‘free’ software has led me to revisit the same question. In a fascinating piece of research by Price Intelligently, over 10,000 technology executives were surveyed over 5 years. Their willingness to pay for core features of B2B software has declined from 100% in 2013 to just over 50% today – as a whole wave of VC-funded SaaS companies has flooded the market with free product. For add-ons like analytics, this drops to less than 30% willing to pay. “The relative value of features is declining. All software is going to $0” – Patrick Campbell, Price Intelligently Patrick sees this as an extension of the trend in physical products, where offshoring, global scale and cheaper routes to market online have led to relentless price depreciation (in real terms). I’m not so sure. Software is not free to manufacture, although the marginal cost is close to zero – since cloud hosting costs are so cheap. The fixed cost is the people-time to design and build the components, and the opportunities for lowering that cost – through offshoring the work or more productive software frameworks – have already been exploited by most SaaS companies. To pile on the pain, a survey of software executives also found that the average number of competitors in any given niche has increased from 10 to 15 over those 3 years. Even if software build costs are falling, those costs are being spread over a smaller number of customers – making the chance of breaking even lower. And the other big cost – Customer Acquisition Cost (CAC) – is actually rising with the volume of competition. To sum up the depressing news so far: 1. Buyers have been conditioned to expect free software, which means you’ll have to give major features away for free 2. But you’ll have to pay more to acquire these non-paying users 3. And next year another competitor will be offering even more for free What is the route out of this economic hole? Focussing on monetising a few existing customers, for one. Most SaaS executives were focussed on acquiring new customers (more logos), probably because with a free product they expected to sweep up the market and worry about monetization later. But this turns out to be the least effective route to building revenue. For every 1% increment in customer acquisition, Price Intelligently calculated how much this would increase revenue: i.e. if I signed up 101 users over the year, rather than 100, that would increase revenue by 2.3%. Monetization – increasing the Average Revenue Per User (ARPU) – has by far the larger impact, mainly because many customers don’t pay anything currently. In contrast, the impact of customer acquisition has fallen over 3 years, since the average customer is less likely to pay. Monetization is not about increasing prices for everyone – or charging for previously free features – but rather finding the small number who are willing to pay, and charging them appropriately. My company, Littledata, has many parallels to Profit Well (launched by Price Intelligently). We both offer analytics and insights on top of existing customer data – Littledata for Google Analytics behavioural data, and Profit Well for recurring revenue data from billing systems. And we have both had similar customer feedback: that the perceived value of the reporting is low, but the perceived value of the changes which the reporting spurs (better customer acquisition, increased retention etc) is high.
So the value of our software is that it creates a requirement – which can then be filled by consulting work or ‘actionable’ modules. For myself, I can say that while focusing on new customer acquisition has been depressing, we have grown revenues once a trusted relationship is in place – and the customer really believes in Littledata’s reporting. For Littledata, as with many B2B software companies, we are increasingly content that 80% of our revenue comes from a tiny handful of loyal and satisfied users. In conclusion, while the cover price of software subscriptions is going to zero, it is still possible to generate profits as a niche SaaS business – if you understand the necessity of charging more to a few customers when the many are unwilling to pay. Freemium may be here to stay, but if customers want the software companies they rely on to stay, they need to pay for the benefits. Would you like to discuss this further? Comment below or get in touch!

2017-03-10

Shine a light on ‘dark’ Facebook traffic

If Facebook is a major channel for your marketing, whether sponsored posts or organic, then you’re underestimating the visits and sales it brings. The problem is that Facebook doesn’t play nicely with Google Analytics, so some of the traffic from the Facebook mobile app comes through as a DIRECT visit. That’s right – if a Facebook user clicks on your post in their native mobile app they won’t always appear as a Facebook social referral. This traffic is ‘dark Facebook’ traffic: it is from Facebook, but you just can’t see it. Since around 40% of Facebook activity is on a mobile app, that means the Facebook traffic you see could be up to 40% less than the total. Facebook hasn’t shown much interest in fixing the issue (Twitter fixed it, so it is possible), so you need to fix this in your own Google Analytics account. Here are three approaches: 1. Basic: use campaign tagging The simplest way to fix this, for your own posts or sponsored links on Facebook, is to attach UTM campaign tags to every link. Google provides a simple URL builder to help, and there is a small sketch at the end of this post. The essential tags to add are “utm_source=facebook.com” and “utm_medium=referral”. This will override the ‘direct’ channel and put all clicks on those links into the Facebook referral bucket. Beyond that, you can add useful tags like “utm_campaign=events_page” so you can see how many click through from your Facebook events specifically. 2. Moderate: use a custom segment to see traffic What if much of your traffic is from enthusiastic brand advocates, sharing your pages or articles with their friends? You can’t expect them all to use a URL builder. But you can make a simple assumption that most users on a mobile device are not going to type a long URL into their browser address bar. So if the user comes from a mobile device, and isn’t visiting your homepage (or a short URL you deliberately post), then they are probably coming from a mobile app. If your website is consumer facing, then the high probability is that that mobile app is Facebook. So we can create a custom segment in GA for traffic which (a) comes from a mobile device, (b) does not have a referrer or campaign (i.e. direct), and (c) does not land on the homepage. To start you need to create a segment where source contains 'facebook'. Then add the 'Direct mobile, not to homepage' segment. Next, you can create a custom report to show sessions by hour. You should see a strong correlation, which on the two web properties I tested resulted in doubling the traffic I had attributed to Facebook. 3. Advanced: attribute micro spikes to Facebook Caveat: you’ll need a large volume of traffic – in excess of 100 visits from Facebook a day – to try this at home. The final trick has been proved to work at The Guardian newspaper for Facebook traffic to news articles. Most Facebook activity is very transitory – active users click on a trending newsfeed item, but it quickly fades in interest. So what you could do, using the Google Analytics API, is look for the ‘micro spikes’ in referrals that come from Facebook on a minute-by-minute basis, then look at the direct mobile visits which came at the same time, and add these direct spikes to the total Facebook traffic. I've played around with this and it's difficult to get right, due to the sampling Google applies, but I did manage to spot spikes of around 5 minutes that had a strong correlation with the underlying direct mobile traffic. Could these approaches work for your site? I'm interested to hear. (Chart: Dark Social Dominates Online Sharing | Statista)
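Here is a minimal sketch of approach 1: a small helper that appends the suggested UTM parameters to a link before you post it on Facebook. The URL and campaign name are placeholders.
[code language="javascript"]
// Minimal sketch of approach 1: append the suggested UTM parameters to a link
// before posting it on Facebook. The URL and campaign name are placeholders.
function tagForFacebook(url, campaign) {
  var params = [
    'utm_source=facebook.com',
    'utm_medium=referral',
    'utm_campaign=' + encodeURIComponent(campaign)
  ].join('&');
  return url + (url.indexOf('?') === -1 ? '?' : '&') + params;
}

// e.g. https://www.mysite.com/events?utm_source=facebook.com&utm_medium=referral&utm_campaign=events_page
console.log(tagForFacebook('https://www.mysite.com/events', 'events_page'));
[/code]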
Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2017-02-09

6 reasons Facebook ads don’t match the data you see in Google Analytics

If you run Facebook Ads and want to see how they perform in Google Analytics, you may have noticed some big discrepancies between the data available in Facebook Ad Manager and GA. The two systems use different ways to track clicks and visitors, so let’s unpick where the differences are. There are two kinds of metrics you’ll be interested in: ‘website clicks’ – the number of Facebook users who clicked through from an advert to your own site – and (if you do ecommerce) the transaction value which was attributed to that advert. Website Clicks vs Sessions from Facebook 1. GA isn’t picking up Facebook as the referrer If users click on a link in Facebook’s mobile app and your website opens in an in-app browser, the browser may not log that ‘facebook.com’ was the referrer. You can override this (and any other link) by setting the medium, source, campaign and content attributes in the link directly. e.g. www.mysite.com?utm_medium=social&utm_source=facebook.com&utm_campaign=ad Pro Tip: you can use GA’s URL builder to set the UTM tags on every Facebook campaign link for GA. In GA, under the Admin tab and then ‘Property settings’, you should also tick the box saying ‘Allow manual tagging (UTM values) to override auto-tagging (GCLID values)’ to make this work more reliably. 2. The user leaves the page before the GA tag fires There’s a time delay between a user clicking on the advert in Facebook and being directed to your site. On a mobile, this delay may be several seconds long, and during the delay the user will think about going back to safety (Facebook’s app) or just closing the app entirely. This will happen more often if the visitor is not familiar with your brand, and also when the page contents are slow to load. By Facebook’s estimation the GA tracking won’t fire for anywhere between 10% and 80% of clicks on a mobile, but for fewer than 5% of clicks on a desktop. It depends at what stage in the page load the GA pixel is requested. If you use a tag manager, you can control this firing order – so try firing the tag as a top priority and when the tag container is first loaded. Pro Tip: you can also use Google's mobile site speed suggestions to improve mobile load speed, and reduce this post-click drop-off. 3. A Javascript bug is preventing GA receiving data from in-app browsers It’s possible your page has a specific problem that prevents the GA tag firing only in mobile Safari (or the Android equivalent). You’ll need to get your developers to test out the landing pages specifically from Facebook’s app. Luckily Facebook Ad Manager has a good way to preview the adverts on your mobile. Facebook Revenue vs GA Ecommerce revenue 4. Attribution: post-click vs last non-direct click Currently, Facebook has two types of attribution: post-view and post-click. This means any sale the user makes after viewing the advert or clicking on the advert, within the attribution window (typically 28 days after clicking and 1 day after viewing), is attributed to that advert. GA, by contrast, can use a variety of attribution models, the default being last non-direct click. This means that if the user clicks on an advert and, on the same device, buys something within the attribution window (typically 30 days), it will be attributed to Facebook. GA doesn't know about views of the advert. If another campaign brings the same user to your site between the Facebook ad engagement and the purchase, this other campaign takes the credit as the ‘last non-direct click’.
So to match as closely as possible we recommend setting the attribution window to be '28 days after clicking the ad' and no 'after view' attribution in Facebook (see screenshot above), and then creating a custom attribution model in GA, with the lookback window at 28 days and the attribution set to 'linear'. The differences typically come when: a user engages with more than one Facebook campaign (e.g. a brand campaign and a re-targeting one), where the revenue will only be counted against the last campaign (with a priority for ads clicked vs viewed); or a user clicks on a Facebook ad, but then clicks on another advert (maybe AdWords) before buying. Facebook doesn’t know about this 2nd advert, so will attribute all the revenue to the Facebook ad. GA knows better, and will attribute all (or part) of it to AdWords. 5. Facebook cross-device tracking The main advantage Facebook has over GA is that users log in to its platform across all of their devices, so it can stitch together the view of a mobile advert on day 1 with a purchase made from the user’s desktop computer on day 2. Here’s a fuller explanation. By contrast, unless that user logs into your website on both devices, and you have cross-device tracking set up, GA won’t attribute the sale to Facebook. 6. Date of click vs date of purchase In Facebook, revenue is attributed to the date the user saw the advert; in GA it is attributed to the date of purchase. So if a user clicks on the advert on 1st September, and then buys on the 3rd September, this will appear on the 1st in Facebook – and on the 3rd in GA. 7. The sampling problem Finally, did you check if the GA report is sampled? In the top right of the screen, in the grey bar, you'll see whether the report is based on a sample. If that sample is less than 100% it means the numbers you see are estimates. The smaller the sample size used, the larger the possibility of error. So in this example, a 45% sample of 270,000 sessions could skew our results plus or minus 0.2% in the best case. As a rule of thumb, Google applies sampling when looking over more than 500,000 sessions (even if you select the 'greater precision' option from the drop-down menu). You can check your own sample using this confidence interval calculator, or with the rough calculation sketched below. Conclusion Altogether, there’s a formidable list of reasons why the data will never be an exact match, but I hope this gives you a way to optimise the tracking. Please let us know if you’ve seen other tracking issues aside from these.   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
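As a footnote to point 7, here is a rough sketch of how to gauge the sampling error yourself, using the normal approximation for a proportion. The conversion rate and figures are illustrative only – this is not Google's exact calculation.
[code language="javascript"]
// Rough 95% margin of error for a conversion rate measured on a sampled report,
// using the normal approximation for a proportion. Illustrative only.
function marginOfError(conversionRate, totalSessions, sampleFraction) {
  var n = totalSessions * sampleFraction; // sessions actually sampled
  var standardError = Math.sqrt(conversionRate * (1 - conversionRate) / n);
  return 1.96 * standardError; // +/- this much, as a proportion
}

// e.g. a 2% conversion rate, 270,000 sessions, 45% sample
var moe = marginOfError(0.02, 270000, 0.45);
console.log('+/- ' + (moe * 100).toFixed(2) + ' percentage points'); // roughly +/- 0.08
[/code]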

2017-02-08

Cross Domain tracking for Eventbrite using Google Tag Manager (GTM)

Are you using Eventbrite for event registrations? And would you like to see the marketing campaign which drove that event registration correctly attributed in Google Analytics? Then you've come to the right place! Here is a simple guide to adding a Google Tag Manager tag to ensure the correct data is sent to Eventbrite to enable cross-domain tracking with your own website. Many thanks to the Lunametrics blog for their detailed solution, which we have adapted here for GTM. Before this will work you need to have:
links from your site to Eventbrite (including mysite.eventbrite.com or www.eventbrite.co.uk)
the Universal Analytics tracking code on both your site and your Eventbrite pages
only one GA tracking code on your own site – or else see the Lunametrics article to cope with this
1. Create a new tag in GTM Create a new custom HTML tag in GTM and paste this script: [code language="javascript"]
<script>
(function(document, window) {
  // Uses the first GA tracker registered, which is fine for 99.9% of users.
  // Won't work for browsers older than IE8
  if (!document.querySelector) return;
  var gaName = window.GoogleAnalyticsObject || "ga";

  // Safely instantiate our GA queue.
  window[gaName] = window[gaName] || function() {
    (window[gaName].q = window[gaName].q || []).push(arguments);
  };
  window[gaName].l = +new Date;

  window[gaName](function() {
    // Defer to the back of the queue if no tracker is ready
    if (!ga.getAll().length) {
      window[gaName](bindUrls);
    } else {
      bindUrls();
    }
  });

  function bindUrls() {
    var urls = document.querySelectorAll("a");
    var eventbrite = /eventbrite\./;
    var url, i;
    for (i = 0; i < urls.length; i++) {
      url = urls[i];
      if (eventbrite.test(url.hostname) === true) {
        // Only fetches the client ID if this page has Eventbrite links
        var clientId = getClientId();
        var parameter = "_eboga=" + clientId;
        // If we can't find a client ID, log an error and stop
        if (!clientId) {
          window.console && window.console.error("GTM Eventbrite Cross Domain: Unable to detect Client ID. Verify you are using Universal Analytics.");
          break;
        }
        url.search = url.search ? url.search + "&" + parameter : "?" + parameter;
      }
    }
  }

  function getClientId() {
    var trackers = window[gaName].getAll();
    return trackers[0].get("clientId");
  }
})(document, window);
</script>
[/code] 2. Set the tag to fire on 'DOM Ready' Create a new trigger (if you don't have a suitable one) to fire the tag on every page at the DOM Ready stage. We need to make sure the Google Analytics tracker has loaded first. 3. Test the marketing attribution With the script working you should see pageviews of the Eventbrite pages as a continuation of the same session. You can test this by:
Opening the 'real time' reporting tab in Google Analytics, on an unfiltered view
Searching for your own site in Google
Navigating to the page with the Eventbrite link and clicking on it
Looking under the Traffic Sources report and checking you are still listed as organic search after viewing the Eventbrite page
Need more help? Comment below or get in touch!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2017-02-07

WWII Codebreaking and Interpretation

Reading Max Hastings’ excellent book The Secret War, 1939-1945, I was struck by the parallel between the rise of radio communications in the 1930s and the more recent rise in internet data. The transmission of military and diplomatic messages by radio in the 1930s and 1940s provided intelligence agencies with a new gold mine. Never before had so much potential intelligence been floating in the ether, and yet it threatened to flood their limited manpower with a tide of trivia. The bottleneck was rarely in the interception (trivial with a radio set) or even decryption (made routine by Bletchley Park with the Enigma codes), but rather in filtering down to the tiny number of messages that contained important facts – and getting that information in real time to the commanders in the field. The Ultra programme (Britain’s decryption of German radio intercepts) was perennially understaffed due to the fact that other civil servants couldn’t be told how important it was. At Ultra’s peak in 1943, only around 50% of the 1,500 Luftwaffe messages a day were being processed – and it is unknown how many of those were in time to avert bombing raids. The new age of technology provided an almost infinitely wide field for exploration, as well as the means of addressing this: the trick was to focus attention where it mattered. The Secret War, page 203 The ‘new age of technology’ in the last two decades poses much the same problem. Data on internet behaviour is abundant: there are countless signals to listen to about your website performance, and the technology to monitor users is commonplace. And the bottleneck is still the same: the filtering of useful signals, and getting those insights to the ‘commanders’ who need them in real time. I started Littledata to solve this modern problem of interpreting website analytics for managers of online businesses. There is no decryption involved, but there is a lot of statistics and data visualisation know-how in making billions of data points appreciable by a company manager. Perhaps the most important aspect of our service is to provide insights in answer to a specific question: Group-Captain Peter Stewart, who ran the Royal Air Force’s photo-reconnaissance operations, was exasperated by a senior officer who asked for ‘all available information’ on one European country. Stewart responded that he could only provide useful information if he knew roughly what intelligence the suppliant wanted – ‘naval, military, air or ecclesiastical’. The Secret War, page 203 In the world of online commerce, the question is something like whether the client needs insights into the checkout conversion rate of all customers (to improve site design) or for a specific marketing campaign (to improve campaign targeting). So by focusing on insights which are relevant to the scale, stage or sector of the client company, and making these accessible in a real-time dashboard, Littledata can feed into decision making in a way that raw data never can. Want to discuss this further? Get in touch or comment below!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2017-02-01

Don’t obsess over your homepage – its importance will decrease over time

Many businesses spend a disproportionate amount of time tweaking copy, design and interactive content for their homepage. Yet they miss the fact that the action is increasingly elsewhere. Homepage traffic has traditionally been seen as a proxy for ‘brand’ searches – especially when the actual search terms driving traffic are ‘not provided’. Now, brand search traffic may be finding other landing pages directly. Our hypothesis was that over the last 2 years the proportion of visits which start at the homepage, on the average website, has been decreasing. To test this, we looked at two categories of websites in Littledata’s website benchmarks:
Websites with more than 20,000 monthly visits and more than 60% organic traffic (227 websites)
Large websites with more than 500,000 monthly visits (165 websites)
In both categories, we found that the proportion of visits which landed on the homepage was decreasing: by 8% annually for the smaller sites (from 16% of total visits to 13% over two years), and 7% annually for the larger sites (from 13% to 11%). If we ignore the slight rise in homepage traffic over the November/December period (presumably caused by more brand searches in the Christmas buying season), the annual decline is more than 10%. Of the larger websites, only 20% showed any proportionate increase in homepage traffic over the 2 years – and those were mainly websites that were growing rapidly, with an increasingly well-known brand. I think there are three different effects going on here:
Increased sophistication of Google search usage is leading to more long-tail keywords, where users want a very specific answer to a question – usually not given on your homepage.
The increase in mobile browsing, combined with the frustrations of mobile navigation, is leading more users to use search over navigation – and bypass your homepage.
Google’s search-engine results page (SERP) changes have made it less likely that brand searches (searching for your company or product names) will lead to your homepage – users may instead browse social profiles, news, videos or even local listings for your company.
In conclusion, it seems that for many businesses the homepage is an increasing irrelevance to the online marketing effort. Spend some time on your other content-rich, keyword-laden landing pages instead! And would you like to see if you are overly reliant on your homepage traffic, compared with similar websites? Try Littledata’s reporting suite.   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2017-01-26

Online reporting: turning information into knowledge

Websites and apps typically gather a huge flow of user behaviour data, from tools such as Google Analytics and Adobe Analytics, with which to better target their marketing and product development. The company assumes that either:
Having a smart web analyst or online marketer skim through the reports daily will enable management to keep tabs on what is going well and what aspects are not; or
Recruiting a ‘data science’ team, and giving them access to the raw user event data, will surface one-off insights into what types of customers can be targeted with which promotions.
Having worked in a dozen such companies, I think both assumptions are flawed. Humans are not good at spotting interesting trends, yet for all but the highest scale web businesses, the problem is not really a ‘big data’ challenge. For a mid-sized business, the problem is best framed as: how do you extract regular, easy-to-absorb knowledge from an incomplete online behavioural data set, and how do you present / visualise the insight in such a way that digital managers can act on it? Littledata is meeting the challenge by building software to allow digital managers to step up the DIKW pyramid. The DIKW theory holds that there are 4 levels of content the human mind can comprehend:
Data: the raw inputs; e.g. the individual signals that user A clicked on button B at a certain time when visiting from a certain IP address
Information: provides answers to "who", "what", "where", and "when" questions
Knowledge: the selection and synthesis of information to answer “how” questions
Wisdom: the extrapolation or interpretation of this knowledge to answer “why” questions
Information is what Google Analytics excels at: it provides an endless variety of charts and tables to query the individual events en masse. Yet in the traditional company process, it needs a human analyst to sift through those reports to spot problems or trends and yield genuine knowledge. And this role requires huge tolerance for processing boring, insignificant data – and massive analytical rigour to spot the few, often tiny, changes. Guess what? Computers are much better at the information processing part when given the right questions to ask – questions which are pretty standard in the web analytics domain. So Littledata is extending the machine capability up the pyramid, allowing human analysts to focus on wisdom and creativity – which artificial intelligence is still far from replicating. In the case of some simpler insights, such as bounce rates for email traffic, our existing software is already capable of reporting back a plain-English fact. Here’s the ‘information’ as presented by Google Analytics (GA). And here is the one statistically significant result you might draw from that information. Yet for more subtle or diverse changes, we need to generate new ways to visualise the information to make it actionable. Here are two examples of charts in GA which are notoriously difficult to interpret. Both are trying to answer interesting questions: 1. How do users typically flow through my website? 2. How does my marketing channel mix contribute to purchasing? Neither yields an answer to the “how” question easily! Beyond that, we think there is huge scope to link business strategy more closely to web analytics. A visualisation which could combine a business’ sales targets with the current web conversion data, and with benchmarks of how users on similar sites behave, would give managers real-time feedback on how likely they were to outperform.
That all adds up to a greater value than even the best data scientist in the world could bring. Have any questions? Comment below or get in touch with our team of experts! Want easier-to-understand reports? Sign up!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-12-12

Why do I need Google Analytics with Shopify?

If the lack of consistency between Shopify’s dashboards and the audience numbers in Google Analytics is confusing, you might conclude that it’s safer to trust Shopify. There is a problem with the reliability of transaction volumes in Google Analytics (something which can be fixed with Littledata’s app) – but using Shopify’s reports alone to guide your marketing is ignoring the power that has led Google Analytics to be used by over 80% of large retailers. Last-click attribution Let’s imagine your shoe store runs a Google AdWords campaign for ‘blue suede shoes’. Shopify allows you to see how many visits or sales were attributed to that particular campaign, by looking at the UTM campaign ‘blue suede shoes’. However, this is only capturing those visitors who clicked on the advert and, in the same web session, purchased the product. So if the visitor, in fact, went off to check prices elsewhere, or was just researching the product options, and came back a few hours later to buy, they won’t be attributed to that campaign. The campaign reports in Shopify are all-or-nothing – the campaign or channel sending the ‘last-click’ is credited with 100% of the sale, and any other previous campaigns the same customer saw are given nothing. Multi-channel attribution Google Analytics, by contrast, has the ability to do multi-channel attribution. You can choose an ‘attribution model’ (such as giving all campaigns before a purchase equal credit) and see how much one campaign contributed to overall sales. Most online marketing can now be divided into ‘prospecting’ and ‘retargeting’; the former is to introduce the brand to a new audience, and the latter is to deliberately retarget ads at an engaged audience. Prospecting ads – and Google AdWords or Facebook Ads are often used that way – will usually not be the last click, and so will be under-rated in the standard Shopify reports. So why not just use the analytics reports directly in Google AdWords, Facebook Business, Twitter Ads etc.? Consistent comparison The problem is that all these different tools (and especially Facebook) have different ways of attributing sales to their platform – usually being as generous as possible to their own advertising platform. You need a single view, where you can compare the contribution of each traffic source – including organic search, marketing emails and referrals from other sites – in a consistent way. Unfortunately, Google Analytics needs some special setup to do that for Shopify. For example, if the customer is redirected via a payment gateway or a 3D Secure page before completing the transaction then the sale will be attributed to a ‘referral’ from the bank – not the original campaign. Return on Advertising Spend (ROAS) Once you iron out the marketing attribution glitches using our app, you can make meaningful decisions about whether a particular form of marketing is driving more revenue than it is costing you – whether there is a positive Return on Advertising Spend. The advertising cost is automatically imported when you link AdWords to Google Analytics, but for other sources you will need to upload cost data manually or use a tool like funnel.io. Then Google Analytics uniquely allows you to decide if a particular campaign is bringing more revenue than it is costing and, on a relative basis, which channels are the best places to deploy your budget.
Conclusion Shopify’s dashboards give you a simple daily overview of sales and products sold, but if you are spending more than a few hundred dollars a month on online advertising – or investing in SEO tactics – you need a more sophisticated way to measure success. Want more information on how we will help improve your Shopify analytics? Get in touch with our experts! Interested in joining the list to start a free trial? Sign up! Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-12-07

Tracking customers in Google Analytics

If your business relies on customers or subscribers returning to your site, possibly from different devices (laptop, smartphone, etc.), then it’s critical you start tracking unique customers rather than just unique visitors in Google Analytics. By default, Google Analytics tracks your customers by browser cookies. So ‘Bob’ is only counted as the same visitor if he comes to your site from the same browser, but not if he comes from a different computer or device. Worse, if Bob clears his cookies or accesses your site via another mobile app (which won't share cookies with the default browser) then he'll also be counted as a new user. You can fix this by sending a unique customer identifier every time your customer signs in. Then if you send further custom data about the user (what plan he / she is on, or what profile fields they have completed) you can segment any of the visits or goals by these customer attributes. There are 2 possible ways to track registered users:
Using Google Analytics’ user ID tracker
Storing the clientId from the Google cookie when a new user registers, and writing this back into the tracker every time the same user signs in
In both cases, we also recommend sending the user ID as a custom dimension. This allows you to segment the reports by logged in / not logged in visitors. Let's look at the pros and cons. Session stitching Tracking customers involves stitching together visits from different devices into one view of the customer. Option 1, the standard User ID feature, does session stitching out of the box. You can optionally turn ‘session unification’ on, which means all the pageviews before the user logged in are linked to that user. With option 2 you can stitch the sessions, but you can't unify sessions before the user logs in – because they will be assigned a different clientId. So a slight advantage to option 1 here. Reporting simplicity The big difference here is that with option 1 all of the user-linked data is sent to a separate 'registered users' view, whereas with option 2 it is all on the same view as before. Suppose I want a report of the average number of transactions a month for registered vs non-registered visitors. With both options, I can only do this if I also send the user ID as a custom dimension – so I can segment based on that custom dimension. Additionally, with option 1 I can see cross-device reports – which is a big win for option 1. Reporting consistency Once you start changing the way users are tracked with option 2 you will reduce the overall number of sessions counted. If you have management reports based on unique visitors, this may change. But it will be a one-time shift – and afterwards your reports should be stable, but with a lower visit count. So option 1 is better for consistency. Conclusion Option 1 – using the official User ID tracking – offers a better route to upgrade your reports (there is a minimal sketch below). For more technical details on how this tracking is going to work, read Shay Sharon’s excellent customer tracking post. Also, you can watch more about customer tracking versus session tracking in this video. Have any questions? Comment below or get in touch with our team of experts!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
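Here is a minimal sketch of option 1 with analytics.js, run when a customer signs in. The property ID, customer ID and dimension index are placeholders; the custom dimension must be defined in the GA admin first, and the value you send should not be personally identifiable.
[code language="javascript"]
<script>
// Minimal sketch of option 1: set the User ID on the tracker when a customer signs in,
// and send the same value as a custom dimension so reports can be segmented by it.
// 'UA-XXXXXXX-1', 'CUSTOMER-1234' and dimension1 are placeholders for your own setup,
// and this assumes the standard analytics.js loader snippet is already on the page.
ga('create', 'UA-XXXXXXX-1', 'auto');
ga('set', 'userId', 'CUSTOMER-1234');      // your internal, non-identifiable customer ID
ga('set', 'dimension1', 'CUSTOMER-1234');  // same value as a custom dimension
ga('send', 'pageview');
</script>
[/code]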

2016-12-06

Comparing 3 time ranges in Google Analytics

Selecting time ranges for comparison in Google Analytics can trip you up. We find comparing 28-day or 7-day (one week) periods the most reliable method. Gotcha 1: Last 4 days with previous 4 days This is comparing time periods of the same length (4 days), so shouldn't they be comparable? No! Most websites show a strong weekly cycle of visits (either stronger or weaker on the weekend), so the previous four days may be a very different stage of the week. Gotcha 2: Last month compared with the previous month Easy – we can see traffic has gone up by 5% in March. No! March has 11% more viewing time (3 extra days) than February. So the average traffic per day in March has actually dropped by 5.5% (see the worked example below). Gotcha 3: Last week compared with the previous week You can see what's coming this time... Certain weeks of the year are always abnormal, and the Christmas period is one of them. But for most business / educational sites it is a very quiet period. The best comparison would be with the same week last year. Have any questions? Let us know by commenting below or get in touch with our lovely experts!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
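To make gotcha 2 concrete, here is a tiny sketch that normalises a month-on-month comparison by the number of days in each month. The session counts are made up for illustration.
[code language="javascript"]
// Gotcha 2 in numbers: compare average sessions per day, not monthly totals.
// The session counts below are made up for illustration.
var feb = { sessions: 100000, days: 28 };
var mar = { sessions: 105000, days: 31 }; // "up 5%" on the raw monthly total

var febPerDay = feb.sessions / feb.days; // ~3,571 sessions per day
var marPerDay = mar.sessions / mar.days; // ~3,387 sessions per day

var change = (marPerDay / febPerDay - 1) * 100;
console.log(change.toFixed(1) + '% change per day'); // about -5.2%, despite the 5% rise in totals
[/code]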

2016-12-01

Top 5 Google Analytics metrics Shopify stores can use to improve conversion

Stop using vanity metrics to measure your website's performance! The pros are using 5 detailed metrics in the customer conversion journey to measure and improve. Pageviews or time-on-site are bad ways to measure visitor engagement. Your visitors could view a lot of pages, yet be unable to find the right product, or seem to spend a long time on site, but be confused about the shipping rates. Here are the 5 better metrics, and how they help you improve your Shopify store: 1. Product list click-through rate Of the products viewed in a list or category page, how many click through to see the product details? Products need good images, naming and pricing to even get considered by your visitors. If a product has a low click-through rate, relative to other products in the list, then you know either the image, title or price is wrong. Likewise, products with very high list click-through, but low purchases, may be hidden gems that you could promote on your homepage and recommended lists to increase revenue. If traffic from a particular campaign or keyword has a low click-through rate overall, then the marketing message may be a bad match with the products offered – similar to having a high bounce rate. 2. Add-to-cart rate Of the product details viewed, how many products were added to the cart? If visitors to your store normally land straight on the product details page, or you have a low number of SKUs, then the add-to-cart rate is more useful. A low add-to-cart rate could be caused by uncompetitive pricing, a weak product description, or issues with the detailed features of the product. Obviously, it will also drop if you have limited variants (sizes or colours) in stock. Again, it’s worth looking at whether particular marketing campaigns have lower add-to-cart rates, as it means that particular audience just isn’t interested in your product. 3. Cart-to-checkout rate The number of checkout processes started, divided by the number of sessions where a product is added to the cart. A low rate may indicate that customers are shopping around for products – they add to cart, but then go to check a similar product on another site. It could also mean customers are unclear about shipping or return options before they decide to pay. Is the rate especially low for customers from a particular country, or products with unusual shipping costs? 4. Checkout conversion rate The number of visitors paying for their cart, divided by those that start the process. Shopify provides a standard checkout process, optimised for ease of transaction, but the conversion rate can still vary between sites, depending on payment options and desire. Put simply: if your product is a must-have, customers will jump through any hoops to complete the checkout. Yet for impulse purchases, or luxury items, any tiny flaws in the checkout experience will reduce conversion. Is the checkout conversion worse for particular geographies? It could be that shipping or payment options are worrying users. Does using an order coupon or voucher at checkout increase the conversion rate? With Littledata’s app you can split out the checkout steps to decide if the issue is shipping or payment. 5. Refund rate The percentage of transactions refunded. Refunds are a growing issue for all ecommerce but especially fashion retail. You legally have to honour refunds, but are you taking them into account in your marketing analysis?
If your refund rate is high, and you base your return on advertising spend on gross sales (before refunds), then you risk burning cash on promoting to customers who just return the product. The refund rate is also essential for merchandising: aside from quality issues, was an often-refunded product badly described or promoted on the site, leading to false expectations? Conclusion If you’re not finding it easy to get a clear picture of these 5 metrics, we're in the process of developing Littledata’s new Shopify app to help. You can join the list to be the first to get a free trial! We ensure all of the above metrics are accurate in Google Analytics, and the outliers can then be analysed in our Pro reports. You can also benchmark your store performance against stores in similar sectors, to decide if there are tweaks to the store template or promotions you need to make. Have more questions? Comment below or get in touch with our lovely team of Google Analytics experts!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-11-30

4 common pitfalls of running conversion rate experiments from Microsoft

At a previous Measurefest conference, one of the speakers, Craig Sullivan, recommended a classic research paper from Microsoft on common pitfalls in running conversion rate experiments. It details five surprising results which took 'multiple-person weeks to properly analyse’ at Microsoft and were published for the benefit of all. As the authors point out, this stuff is worth spending a few weeks getting right as ‘multi-million-pound business decisions’ rest on the outcomes. This research ultimately points out the importance of doing A/A testing. Here follows an executive overview, cutting out some of the technical analysis: 1. Beware of conflicting short-term metrics Bing’s management had two high-level goals: query share and revenue per search. The problem is that it is possible to increase both of those and yet create a bad long-term company outcome, by making the search algorithm worse. If you force users to make more searches (increasing Bing’s share of queries), because they can’t find an answer, they will click on more adverts as well. “If the goal of a search engine is to allow users to find their answer or complete their task quickly, then reducing the distinct queries per task is a clear goal, which conflicts with the business objective of increasing share.” The authors suggest a better metric in most cases is lifetime customer value, and that executives should try to understand where shorter-term metrics might conflict with that long-term goal. 2. Beware of technical reasons for experiment results The Hotmail link on the MSN home page was changed to open Hotmail in a separate tab/window. The naïve experiment results showed that users clicked more on the Hotmail link when it opened in a new window, but the majority of the observed effect was artificial. Many browsers kill the previous page’s tracking Javascript when a new page loads – with Safari blocking the tracking script in 50% of pages opening in the same window. The “success” of getting users to click more was not real, but rather an instrumentation difference. So it wasn’t that more people were clicking on the link – but simply that more of the clicks were being tracked in the ‘open in new tab’ experiment. 3. Beware of peeking at results too early When we release a new feature as an experiment, it is really tempting to peek at the results after a couple of days and see if the test confirms our expectation of success (confirmation bias). With the initial small sample, there will be a big percentage change. Humans then have an innate tendency to see trends where there aren’t any. So the authors give the example of this chart: most experimenters would see the results, and even though they are negative, extrapolate the graph along the green line to a positive result after four days. Wrong. What actually happens is regression to the mean. This chart is actually from an A/A test (i.e. the two versions being tested are exactly the same). The random differences are biggest at the start, and then tail off – so the long-term result will be 0% difference as the sample size increases (the sketch at the end of this post simulates exactly this). The simple advice is to wait until there are enough test results to draw a statistically significant conclusion. That generally means more than a week and hundreds of individual tests. 4. Beware of the carryover effect from previous experiments Many A/B test systems use a bucketing system to assign users into one experiment or another. At the end of one test the same buckets of users may be reused for the second test.
The problem is that if users return to your product regularly (multiple times daily in the case of Bing), then a highly positive or negative experience in one of the tests will affect all of that bucket for many weeks. In one Bing experiment, which accidentally introduced a nasty bug, users who saw the buggy version were still making fewer searches 6 months after the experiment ended. Ideally, your test system would re-randomise users for the start of every new test, so those carryover effects are spread as wide as possible. Summary For me the biggest theme coming out of their research is the importance of A/A tests – seeing what kind of variation and results you get if you don’t change anything – which makes you more aware of the random fluctuations inherent in statistical tests. In conclusion, you need to think about the possible sources of bias before acting on your tests. Even the most experienced analysts make mistakes! Have any comments? Let us know what you think, below!
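To illustrate pitfall 3, here is a small sketch of an A/A test simulation: both variants have exactly the same true conversion rate, yet the observed difference looks dramatic with small samples and fades as they grow. The 3% conversion rate and the checkpoints are arbitrary.
[code language="javascript"]
// A/A test simulation: identical true conversion rates, but the observed gap
// between "variants" is large early on and regresses towards zero as samples grow.
function simulateVariant(visitors, rate) {
  var conversions = 0;
  for (var i = 0; i < visitors; i++) {
    if (Math.random() < rate) conversions++;
  }
  return conversions / visitors;
}

var checkpoints = [100, 1000, 10000, 100000];
checkpoints.forEach(function (n) {
  var a = simulateVariant(n, 0.03); // 3% true conversion rate
  var b = simulateVariant(n, 0.03); // same rate - any difference is pure noise
  var gap = (b - a) * 100;
  console.log(n + ' visitors per variant: B minus A = ' + gap.toFixed(2) + ' percentage points');
});
[/code]
Run it a few times and the early 'winners' flip direction at random – exactly the behaviour the authors warn against extrapolating.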

2016-11-27

5 tips to avoid a metrics meltdown when upgrading to Universal Analytics

Universal Analytics promises some juicy benefits over the previous standard analytics. But having upgraded 6 different high-traffic sites, we can tell you there are some pitfalls to be aware of. Firstly, why would you want to upgrade your tracking script?
More reliable tracking of page visitors – i.e. fewer visits untracked
More customisation to exclude certain referrers or search terms
Better tools for tracking across multiple domains and tracking users across different devices
Track usage across your apps for the same web property
Ability to send up to 20 custom dimensions instead of the previous limit of only 5 custom variables
If you want to avoid any interruption of service when you upgrade, why not book a quick consultation with us to check if Universal Analytics will work in your case. But before you start you should take note of the following. 1. Different tracking = overall visits change If your boss is used to seeing dependable weekly / monthly numbers, they may query why the number of visits has changed. Universal Analytics is likely to track c. 2% more visits than previously (partly due to different referral tracking – see below), but it could be higher depending on your mix of traffic. PRO TIP: Set up a new web property (a different tracking code) for Universal Analytics and run the old and new trackers alongside each other for a month. Then you can see how the reports differ before sharing with managers. Once this testing period is over you'll need to upgrade the original tracking code to Universal Analytics so you keep all your historic data. 2. Different tracking of referrals Previously, if Bob clicked on a link in Twitter to your site, read the page, went back to Twitter, and within 30 minutes clicked on a different link to your site, that would be counted as one visit and the 2nd referral source would be ignored. In Universal Analytics, when Bob clicks on the 2nd link he is tracked as a second visit, and the 2nd referral source is stored. This may be more accurate for marketing tracking, but if Bob then buys a product from you, going via a secure payment gateway hosted on another domain (e.g. paypal.com), then the return from the payment gateway will be counted as a new visit. All your payment goals or ecommerce tracking will be attributed to a referral from 'paypal.com'. This will ruin your attribution of a sale to the correct marketing channel or campaign! PRO TIP: You need to add all of the payment gateways (or other third party sites a user may visit during the payment process) to the 'Referral Exclusion List'. You can find this under the Admin > Property > Tracking codes menu: 3. Tracking across domains If you use the same tracking code across different domains (e.g. mysite.co.uk and mysite.com or mysite.de) then you will need to change the standard tracking script slightly. By default the tracking script you copy from Google Analytics contains a line like: ga('create', 'UA-XXXXXXX-1', 'mysite.com');. This will only track pages on a domain that strictly ends with 'mysite.com'. PRO TIP: It's much safer to change the tracker to set that cookie domain automatically. The equivalent for the site above would be ga('create', 'UA-XXXXXXX-1', 'auto');. The 3rd argument of the function is replaced with 'auto'. 4. Incompatibility with custom variables Only relevant if you send custom data already. Custom variables are only supported historically in Universal Analytics. That means you will need to change any scripts that send custom data to the new custom dimension format to keep data flowing (see the sketch at the end of this post).
Read the developer documentation for more. PRO TIP: You'll need to set the custom dimension names in the admin panel before the custom data can be sent from the pages. You can only check that the custom dimensions are being sent correctly by creating a new custom report for each dimension. 5. User tracking limitations We wouldn't recommend implementing the new user ID feature just now, as it has some major limitations compared with storing the GA client ID. You need to create a separate view to see the logged-in-user data, which makes reporting pageviews a whole lot more complex. Visits a user made to your site BEFORE signing up are not tracked with that user – which means you can't track the marketing sources by user. PRO TIP: See our user tracking alternative. Got more tips on setting up Universal Analytics? Please share them with us in the comments, or get in touch if you want more advice on how to upgrade!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
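For pitfall 4, here is a minimal sketch of what the migration looks like: the old ga.js custom variable call and a Universal Analytics custom dimension equivalent. The slot, dimension index and values are examples only, and the dimension must be defined under the property's Custom Definitions before any data appears.
[code language="javascript"]
// Old ga.js (classic analytics) custom variable - no longer collected once you
// move the property to Universal Analytics:
_gaq.push(['_setCustomVar', 1, 'Plan', 'premium', 1]); // slot 1, visitor-level scope

// Universal Analytics (analytics.js) equivalent - 'dimension1' must first be
// defined in Admin > Property > Custom Definitions > Custom Dimensions:
ga('set', 'dimension1', 'premium');
ga('send', 'pageview'); // the dimension is sent along with this hit
[/code]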

2016-11-26

Widget Tracking with Google Analytics

I was asked recently about the best way to track a widget, loaded in an iframe, on a third-party site with Google Analytics. The difficulty is that many browsers now block 3rd party cookies (those set by a different domain to the one in the browser address bar) – and this applies to a Google Analytics cookie for widgets as much as to adverts. The best solution seems to be to use local storage on the browser (also called HTML5 storage) to store a persistent identifier for Analytics and bypass the need to set a cookie – but then you have to manually create a clientID to send to Google Analytics. See the approach used by ShootItLive. However, as their comment on line 41 says, this is not a complete solution – because there are lots of browsers beyond Safari which block third party cookies. I would take the opposite approach and check if the browser supports local storage, and only revert to trying to set a cookie if it does not. Local storage is now possible on 90% of browsers in use, and the browsers with the worst 3rd party cookie support (Firefox and Safari) luckily have the longest-standing support for local storage. As a final note, I would set up the tracking on a different Google Analytics property to your main site, so that pageviews of widgets are not confused with pageviews of your main site. To do list (a rough sketch follows below):
Build a script to create a valid clientID for each new visitor
Call the ga('create') function, setting 'storage' : 'none', and getting the 'clientID' from local storage (or newly created)
Send a pageview (or event) every time the widget is loaded. Since the widget page is likely to be the same every time it is embedded, you might want to store the document referrer (the parent page URL) instead.
Need help with the details? Get in touch with our team of experts and we'd be happy to help!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
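Here is a rough sketch of that to-do list, assuming the analytics.js loader is included inside the widget's iframe. The property ID and storage key are placeholders, and the client ID format is a simple random one rather than Google's own.
[code language="javascript"]
<script>
// Rough sketch: persist a client ID in local storage, create a cookieless tracker
// with it, and send a pageview for the widget. 'UA-XXXXXXX-2' (a separate property)
// and the storage key are placeholders.
(function () {
  var KEY = 'ga_widget_client_id';
  var clientId;

  try {
    clientId = window.localStorage.getItem(KEY);
    if (!clientId) {
      // Simple random ID - not Google's exact format, but accepted as a clientId
      clientId = Date.now() + '.' + Math.floor(Math.random() * 1e9);
      window.localStorage.setItem(KEY, clientId);
    }
  } catch (e) {
    // Local storage unavailable - fall back to letting GA try to set its own cookie
  }

  ga('create', 'UA-XXXXXXX-2', clientId ? { storage: 'none', clientId: clientId } : 'auto');
  // The widget URL is the same wherever it is embedded, so record the parent page too
  ga('send', 'pageview', { referrer: document.referrer });
})();
</script>
[/code]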

2016-11-25

It’s Black Sunday – not Black Friday

The biggest day for online retail sales among Littledata’s clients is the Sunday after Black Friday, followed closely by the last Sunday before Christmas. Which is more important - Black Friday or Cyber Monday? Cyber Monday saw the biggest year-on-year increase in daily sales, across 84 surveyed retailers from the UK and US. In fact, Cyber Monday is blurring into the Black Friday weekend phenomenon – as shoppers get used to discounts being available for longer. We predict that this trend will continue for 2016, with the number of sales days extending before and after Black Friday. Interested in what 2016 will bring? Stay tuned for our upcoming blog post! Want to see how you did against the benchmark? Sign up for a free trial or get in touch if you have any questions!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-11-23

The Black Friday Weekend of 2015

Shoppers on Black Friday are becoming more selective – with a decrease in the number of retailers seeing an uplift in Black Friday sales, but an increase in the purchase volumes seen at those selected stores. Littledata looked at the traffic and online sales of 84 ecommerce websites* over the Black Friday weekend (four days from Friday to the following Monday), compared with the rest of the Christmas season (1st November to 31st December). 63% of the surveyed retailers saw a relative increase in traffic on Black Friday weekend 2015 versus the remainder of the season, compared with 75% of the same retailers seeing traffic rise on Black Friday 2014. This implies some decided to opt out of Black Friday discounting in 2015 or got less attention for their discounts as other retailers spent more on promotion. The same proportion of retailers (60% of those surveyed) also saw a doubling (on average) in ecommerce conversion rate** during Black Friday 2015. In 2014, over 75% of retailers saw an improved conversion rate during Black Friday, but the median improvement over the rest of the season was just 50%. 61% of websites also saw an increase in average order value of 16% during Black Friday 2015, compared with only 53% seeing order values increase the previous Black Friday. We predict that this trend will continue in 2016, with a smaller number of websites benefiting from Black Friday sales, but a greater increase in ecommerce conversion rate for a select few. Be sure to check back for what the actual trends will be for 2016! Let us know what you think below or get in touch! * The surveyed websites were a random sample from a group which got a majority of their traffic from the UK or the US. The data was collected from Google Analytics, and so represents real traffic and payments. ** The number of purchases divided by the total number of user sessions   Image credit: HotUKDeals   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-11-18

What are custom dimensions in Google Analytics?

By default, Google Analytics allows you to segment traffic by standard dimensions such as visitor location, screen size, or traffic source. You can view smarter reports by adding custom dimensions specific for your business. Give me an example Let's say when your members register they add a job title. Would you like to see reports on the site activity for a particular job title, or compare conversion for one job title versus another? In which case you would set a custom dimension of 'Job Title' and then be able to filter by just the 'Researchers' for any Google Analytics report. Or if you run a blog / content site, you could have a dimension of 'author' and see all the traffic and referrals that a particular author on your site gets. How do I set this up? First, you need to be on Universal Analytics, and then you need to tag each page with one or more custom dimensions for Google Analytics. This is more easily done with Google Tag Manager and a data layer. It may be that the information is already on the web page (like the author of this post), but in many cases, your developer will need to include it in the background in a way that can be posted to Google Analytics. Then you will need to set up a custom report to split a certain metric (like page views) by the custom dimension (e.g. author). Please contact our specialists if you want more advice on how to set up custom dimensions!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
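As a rough sketch of the setup described above, this is the kind of data layer snippet your developer might print into the page so that a GTM Data Layer Variable can pass the value to Google Analytics as a custom dimension. The key names and values are examples only.
[code language="javascript"]
<script>
// Example data layer values for custom dimensions - key names and values are placeholders.
// A GTM Data Layer Variable (e.g. for 'author') can then be mapped to a custom dimension
// index on your Google Analytics tag.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  'author': 'Jane Doe',        // e.g. for a blog or content site
  'jobTitle': 'Researcher'     // e.g. for a membership site
});
</script>
[/code]
You would then define the matching custom dimension under Admin > Property > Custom Definitions in Google Analytics, and reference it in a custom report.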

2016-11-16

Exclude fake 'bot' traffic from your site with Google Analytics

Ever wondered why so few visitors convert on your site? One answer is that a big chunk of your traffic is from search engine spiders and other web 'bots' which have no interest in actually engaging with you. Google Analytics has a great new feature to exclude this bot traffic from your site. All you need to do is check a box under Admin > View > View Settings. The new option is at the bottom, underneath the currency selection. It uses the IAB/ABC Bots and Spiders list, which is standard for large publishers, and updated monthly. Warning: you will see a dip in traffic from the date you apply the setting. If you're looking for a more comprehensive method to exclude spam and ghost referrals, check out our how-to guide! Have some questions about this? Get in touch with our Google Analytics experts!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-11-15

What are Enhanced Ecommerce reports?

In May 2014 Google Analytics introduced a new feature: Enhanced Ecommerce tracking. If you run an ecommerce operation, this gets you much more detailed feedback on your checkout process. What will I see? Shopping behaviour: how are people converting from browsers to purchasers? Checkout behaviour: at what stage of your checkout do buyers abandon the process? Product performance: which products are driving your sales, and which have a high return rate? Real campaign returns: see your real return on marketing investment, including promotional discounts and returns. How do I set this up? The bad news is it definitely requires an experienced software developer for the setup. The reports require lots of extra product and customer information to be sent to Google Analytics. You can read the full developer information on what you can track, or our own simpler guide for tracking ecommerce via Tag Manager. However, if you already have standard ecommerce tracking and Google Tag Manager, we can set Enhanced reports up in a couple of days with no code changes on your live site – so no business disruption or risk of lost sales. Is it worth implementing? Imagine you could identify a drop-off stage in your checkout process where you could get a 10% improvement in sales conversion, or a group of customers who were unable to buy (maybe due to language or browser difficulties) – what would that be worth? Many businesses have that kind of barrier just waiting to be discovered…   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-11-14

How to track time on page with Google Tag Manager

Our script for accurate tracking of time on page beats Google's default measurement to give you an accurate picture of how long users are spending with your page open and in focus. This post translates the approach into Google Tag Manager. The setup consists of two tags (one custom), one firing rule and two variables. Step by step: 1. Add the timer script as a custom HTML tag
<script>
/*
Logs the time on the page to dataLayer every 10 seconds
(c) LittleData consulting limited 2014
*/
(function () {
  var inFocus = true;
  var intervalSeconds = 10; // 10 seconds
  var interval = intervalSeconds * 1000;
  var eventCount = 0;
  var maxEvents = 60; // stops after 10 minutes in focus
  var fnBlur = function () { inFocus = false; };
  var fnFocus = function () { inFocus = true; };
  if (window.addEventListener) {
    window.addEventListener('blur', fnBlur, true);
    window.addEventListener('focus', fnFocus, true);
  } else if (window.attachEvent) {
    window.attachEvent('onblur', fnBlur);
    window.attachEvent('onfocus', fnFocus);
  }
  var formatMS = function (t) {
    return Math.floor(t / 60) + ':' + (t % 60 == 0 ? '00' : t % 60);
  };
  var timeLog = window.setInterval(function () {
    if (inFocus) {
      eventCount++;
      var secondsInFocus = Math.round(eventCount * intervalSeconds);
      dataLayer.push({ "event": "LittleDataTimer", "interval": interval, "intervalSeconds": intervalSeconds, "timeInFocus": formatMS(secondsInFocus) });
    }
    if (eventCount >= maxEvents) clearInterval(timeLog);
  }, interval);
})();
</script>
2. Add two variables to access the data layer values: one for the formatted time, which will feed through as the event label, and one for the number of seconds in focus since the last event, which will feed through as the event value. 3. Add the firing rule for the event. 4. Add the tag that reports the timer event to Google Analytics. Options and further information You can change the timer interval in the custom HTML tag – the reporting will adjust accordingly. Choosing the interval is a trade-off between the resolution of the reporting and the load on the client in sending events, as well as Google's quota of 500 hits per session. We've chosen ten seconds because we think the users who are in the 'wrong place' and don't engage at all will leave in under ten seconds; anything more is some measure of success. If you'd like assistance implementing this or something else to get an accurate picture of how users interact with your site, get in touch!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
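To make step 4 concrete, the Universal Analytics event tag effectively sends the hit below. This is a sketch only – the 'Time' category and 'Log' action are assumptions borrowed from our companion time-on-site script, and timeInFocus / intervalSeconds stand in for the two data layer variables created in step 2:
// Equivalent analytics.js call to the GTM event tag, fired on each 'LittleDataTimer' event
ga('send', 'event', 'Time', 'Log', timeInFocus, intervalSeconds, { nonInteraction: true });
// nonInteraction: true stops the timer hits from skewing bounce rate and session metrics
Setting the hit as non-interactive is what keeps your bounce rate and time-on-site figures comparable with historical data.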

2016-11-14

Accurate tracking of time on site

There’s a flaw in the way Google Analytics measures ‘time on site’: the counter only starts from the second page visited, so all one-page visits are counted as zero time on site. If a visitor comes to your page, stays for 10 minutes reading – and then closes the window… that’s counted as ZERO time. With landing pages that have lots of interaction, or where the call to action is a phone call rather than a click, this can be a real problem. Pasting the Javascript below onto all the pages of your site will fix the problem. The script logs an event to Google Analytics for every 10 seconds the visitor stays on the page, regardless of whether they bounced or not. But it won't affect your bounce rate or time on site for historical comparison*. We suggest you look closely at how visitors drop off after 10, 20 and 30 seconds to see which of your web content could be improved. Paste this into the source of all your pages, after the Google Analytics script <!-- Time on Site tracking (c) LittleData.co.uk 2014 --><script>(function(e){var t=true;var n=0;var r=true;var i=function(){t=false};var s=function(){t=true};if(window.addEventListener){window.addEventListener("blur",i,true);window.addEventListener("focus",s,true)}else if(window.attachEvent){window.attachEvent("onblur",i);window.attachEvent("onfocus",s)}var o=function(e){return Math.floor(e/60)+":"+(e%60==0?"00":e%60)};var u=window.setInterval(function(){e=e+10;if(t){n=n+10;if(typeof _gaq==="object"){_gaq.push(["_trackEvent","Time","Log",o(n),n,r])}else if(typeof ga==="function"){ga("send",{hitType:"event",eventCategory:"Time",eventAction:"Log",eventLabel:o(n),eventValue:10,nonInteraction:"true"})}}},1e4);window.setTimeout(function(){clearInterval(u)},601e3)})(0)</script> What you'll see In Google Analytics go to Behaviour > Events > Top Events and click on the event category 'Time'. Searching for a particular time will find all the people who have stayed at least that length of time, e.g. 0:30 finds people who have stayed more than 30 seconds. FAQs Does this affect the way I compare bounce rate or time-on-site historically? No. The script sends the timer events as 'non-interactive', meaning they won't be counted in your other metrics. Without this, you would see a sharp drop in bounce rate and an increase in time on site, as every visitor was counted as 'non-bounce' after 10 seconds. If you prefer this, see below about adapting the script. Will this work for all browsers? Yes, the functions have been tested on all major, modern browsers: IE 9+, Chrome, Safari and Firefox. What if I upgrade to Universal Analytics? Don’t worry – our script already checks which of the two tracking scripts you have (ga.js or analytics.js) and sends the appropriate log. Will this max out my Google Analytics limits? The script cuts off reporting after 10 minutes, so as not to violate Google’s quota of 500 hits that can be sent in one session. Can I adapt this myself? Sure. The full source file is here. Need more help? Get in touch with our experts!

2016-11-13

How to set up ecommerce tracking with Google Tag Manager

Enhanced ecommerce tracking requires your developers to send lots of extra product and checkout information in a way that Google Analytics can understand. If you already use GTM to track pageviews, you should send that ecommerce data via Google Tag Manager too. Step 1: Enable enhanced ecommerce reporting in the Google Analytics view admin settings, under 'Ecommerce Settings'. Step 2: Select names for your checkout steps (see point 4 below). Step 3: Get your developers to push the product data behind the scenes to the page 'dataLayer'. Here is the developer guide, and there's an example purchase push at the end of this post. Step 4: Make sure the following steps are tracked as a pageview or event, and for each step set up a Universal Analytics tracking tag: product impressions (typically a category or listing page); product detail view (the product page); add to basket (more usually an event than a page); checkout step 1 (views the checkout page); checkout step 2 etc – whatever registration, shipping or tax steps you have; and purchase confirmation. Step 5: Edit each tag, and under the 'More Settings' section, select the 'Enable enhanced ecommerce features' and then 'Use data layer' options. Of course, there's often a bit of fiddling to get the data layer in the right format, and the ecommerce events firing at the right time, so please contact us if you need more help setting up the reports! Step 6: Check it is working. There is no 'real time' ecommerce reporting yet, so you'll need to wait a day for events to process and then view the shopping behaviour and checkout behaviour reports. If you want to check the checkout options you'll need to set up a custom report: use 'checkout options' as the dimension and 'sessions' and 'transactions' as the metrics. Need some more help? Get in touch with our lovely team of experts and we'd be happy to answer any questions!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
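As mentioned in step 3 above, here's a rough sketch of the kind of purchase push your developers would add to the order confirmation page. The structure follows Google's Enhanced Ecommerce data layer format, but the field values and the event name are placeholders to adapt to your own store:
dataLayer.push({
  'event': 'purchaseComplete',   // whatever event name your GTM trigger listens for
  'ecommerce': {
    'purchase': {
      'actionField': { 'id': 'T12345', 'revenue': '35.43', 'tax': '4.90', 'shipping': '5.99' },
      'products': [{
        'name': 'Example product',
        'id': 'SKU-123',
        'price': '24.54',
        'category': 'Examples',
        'quantity': 1
      }]
    }
  }
});
The equivalent pushes for impressions, detail views, add-to-basket and checkout steps follow the same pattern, using the 'impressions', 'detail', 'add' and 'checkout' objects respectively.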

2016-11-10

Android users buy 4x more than Apple users. Why?

Looking at a sample of 400 ecommerce websites using Littledata, we found mobile ecommerce conversion rates vary hugely between operating systems. For Apple devices, it is only 1% (and 0.6% for the iPhone 6), whereas for Android devices the conversion rate is nearly 4% (better than desktop). It’s become accepted wisdom that a great ‘mobile experience’ is essential for serious online retailers. As 60% of all Google searches now happen on mobile, and over 80% of Facebook ad clicks come from mobile, it’s highly likely the first experience new customers have of your store is on their phone. So is it because most websites look worse on an iPhone, or are iPhone users pickier?! There’s something else going on: conversion rate on mobile was essentially flat for these same sites from July to October (1.25% to 1.26%) this year, even as the share of mobile traffic increased. Whereas on desktop, from July (low season) to October (mid-season for most retailers), the average ecommerce conversion rate jumped from 2% to 2.5%. It seems that during holiday time, consumers are more willing to use their phones to purchase (perhaps because they are away from their desks). So the difference between Android and iOS is likely to do with cross-device attribution. The enduring problem of ecommerce attribution is that it’s less likely that customers complete the purchase journey on their phone. And on an ecommerce store you usually can’t attribute the purchase to the initial visit on their phone, meaning you are seriously underestimating the value of your mobile traffic. I think iPhone users are more likely to own a second device (and a third if you count the iPad), and so can more easily switch from small-screen browsing to purchase on a large screen. Whereas Android users are less likely to own a second device, and so purchase on one device. That means iPhone users do purchase – but you just can’t track them as well. What’s the solution? The only way to link the visits on a phone with the subsequent purchases on another device is to have some login functionality. You can do that by getting users to subscribe to an email list, and then linking that email to their Google Analytics sessions. Or offering special discounts for users that create an account. But next time your data tells you it’s not worth marketing to iPhone users, think again. Need help with your Google Analytics setup? Comment below or get in touch!   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
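A footnote on the login suggestion above: once a visitor is identified, Google Analytics' User ID feature is one way to stitch their sessions across devices. This is only a sketch – it assumes you have enabled a User ID view and that the identifier is a non-personal value (an internal account ID or a salted hash, never a raw email address):
// After the visitor signs up or logs in
ga('set', 'userId', 'USER-48192');   // 'USER-48192' is a made-up internal identifier
ga('send', 'pageview');
The same User ID then needs to be set on every device where the customer logs in, which is what lets GA join the phone visit to the desktop purchase.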

2016-11-02

Personalising your site for a local event with Google Optimize

Google Optimize (standard edition) will be released publically at the end of October, allowing free access to powerful AB testing and personalisation features. Here’s a guide to launching your first test, assuming you have the Google Optimize 360 snippet installed on your page. Step 1: Create the experiment I want to trigger a personalisation on Littledata’s homepage, shown only to visitors from London, which promotes a local workshop we have running later this month. It’s not a real AB test, as we won’t have enough traffic to judge whether the banner is a success, but we can use the ‘experiment’ to launch this personalisation for a local audience. First, I need a new test (click the big blue plus sign) and select an AB test. I’ll name my test, and set the editor page as our homepage – which is pre-filled from Google Analytics anyway… Since I have Google Analytics linked, I can select a goal from GA as the objective. In this case, the banner will promote the event (which isn’t tracked on our site) so the only sensible goal is promoting more pageviews – but it’s possible it will also increase signups for our app, so I’ll include that as a secondary objective. Next, I need to add a variant, which is going to load my event banner. I’ve named it ‘add yellow bar’. Clicking on the variant row will take me to the editor. Step 2: Edit the ‘B’ version Note: Optimize’s editor works as a Chrome Plugin, so you’ll need to install that in Google Chrome first. It’s easy to select an element on the page to edit or hide, but my variant will load a new snippet of HTML code which is not already on the page. So I’ll select the element at the top of the page (with ID ‘content’) and then go to the select elements icon in the top left. Now I’ve got the right element to use as a building block, I’m going to add an ‘HTML’ change. And set it to insert the HTML ‘before’ the current element. I’ve pasted in the HTML I’ve recycled from another page. Once I click apply we can see the new element previewing at the top of the page. Next, let’s check it looks OK on mobile – there’s a standard list of devices I can select from. Yes, that is looking good – but if it wasn’t I could click the ‘1 change’ text in the header to edit the code. Lastly, in the editor, you may have noticed a warning notification icon in the top right of the Optimize editor. This is warning me that, since Littledata is a single-page Javascript site, the variant may not load as expected. I’m confident Optimize is still going to work fine in this case. Step 3: Launching the experiment After clicking ‘Done’ on the editor, I go back to the experiment setup. Usually, we’d split the traffic 50:50 between the original and the variant, but in this case, I want to make sure all visitors from London see the message. I’ll click on the weighting number, and then set ‘add yellow bar’ to show 99.9% of the time (I can’t make it 100%). Then, we want to set the geotargeting. The experiment is already limited to the homepage, and now I click ‘and’ to add a 2nd rule and then select ‘geo’ from the list of rules. I want the yellow bar to show only for visitors from London. The city is a standard category, and it recognised London in the autocomplete. As the final step, I need to click ‘Start Experiment’. I can’t edit the rules of any running experiments (as this would mess up the reporting), but I can stop and then copy an experiment which is incorrect. 
Conclusion Google Optimize makes it really simple to set up tests and personalisations, although it is missing a few features such as scheduling. The premium edition (Optimize 360) will allow more analysis of tests using Google Analytics, and also allow the import of custom audiences from other Google 360 products. This is powerful if you want to launch a customised landing pages experience based on, say, a DoubleClick display ad campaign. So try it out, and if you have any questions, contact one of our experts! Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-10-18

Google Optimize versus Optimizely

I’ve been an Optimizely certified expert for a couple of years and have now trialled Google Optimize 360 for a few months, so it seems a good time to compare how they stack up. Optimizely is the current market leader in AB testing (or content experimentation), due to its ease of use and powerful reporting tools. It gives companies an easy way to run many concurrent tests and manage their setup and roll-out without the involvement of developers. That was a big step up from Google Content Experiments, where the only way to set up an experiment is to write some Javascript code. The Guardian had some success with Optimizely, where they increased subscriptions by 46%. Google Optimize is an equivalent testing tool, and has copied much of the user interface that made Optimizely popular: you can click on elements within the page to experiment, and change their style, hide them or move them. My only complaint is that the interface is so simple it can take a while to unbury powerful features, such as transforming the page via a custom script. There have been many success stories of companies implementing Google 360. Technically, Optimize’s editor is a bit smoother; using a Chrome plugin avoids some of the browser security issues that bugged Optimizely (since internet browsers confused the Optimizely in-page editor with some kind of script hacking). For example, to load Littledata’s homepage in their editor I have to enable ‘insecure scripts’ in Chrome and then navigate to a different page and back to force the editor to re-render. For reporting, Google Optimize 360 gives the ability to see results either in Optimize or as a custom dimension in Google Analytics – so equivalent to Optimizely. Right now Optimize lacks some features for advanced scheduling and user permissions, but I expect those to evolve as the product gathers momentum. The critical difference is with the targeting options. Optimizely allows you to target experiments based on the device accessing the page (mobile vs desktop, browser, operating system) and – for enterprise plans only – to target based on geolocation. The limitation is that every time Optimizely needs to decide whether to run the test, the check for the user’s location may take a few seconds – and the landing page may flicker as a test rule is triggered or not. Google Optimize can target any audience that you can build in Google Analytics (GA). This means any information you capture in Google Analytics – the number of previous visits, the pages they have previously seen or the ecommerce transactions – can be used in a test or personalisation. For example, in Google Optimize you could serve a special message to users who have previously spent more than $100 in your store. Optimizely has no knowledge of the users’ actions before that landing page, so the only way you could run an equivalent personalisation is to expose this previous purchase value as a custom script on the landing page (or in a cookie). The beauty of Google Optimize is that you are targeting based on information already captured in Google Analytics. There is no technical setup beyond what you were already doing for Google Analytics, and it doesn’t take a developer to build targeting for a very specific audience. Pricing Optimizely starts from under $100/month, but to get access to enterprise features (e.g. for geo-targeting) you will need to spend $2000 plus per month.
Google Optimize is currently being sold at a flat rate of $5000 / month for the basic tier of Google 360 customers (which have between 1M and 50M sessions per month), but in future it could be offered at a lower price to smaller companies. Conclusion Where you’ll appreciate the benefits of Google Optimize is in running personalisations based on complex rules about previous user behaviour, or the campaigns they have seen. The more tests you are running, the more time and complexity you will save by building the audience in Google Analytics rather than in custom scripts. Google Optimize 360 is currently in beta, but you can add your email to the invite list. For smaller customers, or those with less complex needs, Optimizely still offers better value – but that might change if Google were to offer a limited version of Optimize at a lower price.   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.   Further reading: Create and customise dashboards and widgets in Google Analytics New in Littledata: an improved navigation, trend detection algorithm, and more How to set up internal searches in Google Analytics Image credit: Blastam

2016-10-05

Why your business shouldn't go freemium

I took part in an interesting discussion about when the popular ‘freemium’ model for web apps is viable. Startup dogma says that scalable software/web products must be (at least partly) free, so it's worth reiterating the downsides of this approach. Here’s how I see the economic realities: 1. Free brings you the wrong sort of users UK supermarkets are currently locked in a price war to try to grab market share, so let’s imagine Tesco were crazy and desperate enough to offer free cases of beer to all shoppers. Do you think their overall sales would suddenly rise that day? Or would freeloaders come from around town to collect free beer… and then buy their groceries from Asda or Sainsbury’s, as usual? Giving away free products, including free web apps, discourages users from thinking about whether they value the product – and would ever pay for it. Often free attracts the kind of time-rich, cash-poor user who will go to extreme lengths to avoid paying – so frustrating the ‘pay for extra convenience’ premium package of many freemium models. Conversely, offering a free trial and taking payment options up front confirms that customers can and will be able to pay. GoToMeeting’s CFO found their conversion rate of ACTIVE trial users to paying customers was 45% when credit card details were taken up front – as opposed to 3% when they were not. 2. Low payment conversion rates need massive active user bases Most freemium products report less than 5% of active users paying for the service, with the conversion rate generally closer to 1% if you look at all users (including inactive). So with a standard consumer subscription fee of $20/month you would have an Average Revenue Per User (ARPU) of $20 x 12 x 1% = $2.40. To get to a respectable turnover of $10m per year (I'd suggest the minimum aim for an angel or seed-funded startup), you would need over 4m users. And to get to VC-exit/IPO scale you would need 50m users. That is achievable for mass-adoption products like Evernote and Dropbox, but really isn’t if you’re in a niche market like schools. There are fewer than 4m teachers in the whole of the USA, and expecting to get more than 10% of them adopting your product is cuckoo-land talk. For example, Busuu (a language learning marketplace) boasted 45 million users last year but only $12m in revenue – under 30 cents per user. Busuu is probably just safe with a huge market, but for an app with impressive global traction that is a very small reward: Facebook makes $6 per user per year just from advertising. 3. Low customer lifetime values limit your marketing options If your lifetime value per user is $3 – since the average subscription length is likely to be around a year – there are very few marketing options beyond SEO or word-of-mouth. You can’t justify affiliate deals, pay-per-click or display advertising if you can’t afford to buy customers. For example, if you buy a keyword for even $0.30 you would need a bullish 10% signup rate just to break even on the marketing spend. Freemium businesses have genuinely got to sell themselves, and so must focus on the product first. The exact opposite was ScreenSelect (which merged into LoveFilm and then Amazon film rental) – customer growth could be bought with the predictable recurring revenue. All LoveFilm did was promise a marketing agency the first few months of a customer’s subscription payments, and the agency worked out which tactics would be profitable at that level. See William Reeve’s great explanation of the economics.
But that only worked when the customer lifetime value was $100 or more. 4. Freemium generates huge buzz but little sustainable profit A big trap in startup business model innovation is following pundits saying ‘freemium is really working for X company’ when ‘working for’ means they are getting loads of users and publicity. In itself publicity is a good thing – ‘product as the message’ – but it will only translate into profit if there is a massive market to capture. Here's the dirty secret: many venture-capital funded businesses are more focused on user growth than revenue, because hyper-growth of users increases their valuation faster than moderate revenue growth. So don’t copy them if you plan to grow your company without massive VC funding! How many truly big businesses make the majority of their revenue from premium subscriptions backed by free software? From an article in 2011, here is a decent list of those still going strong: LogMeIn (switched to free trial) Dropbox Skype Spiceworks Lookout Eventbrite Zendesk Evernote SurveyMonkey GitHub What they have in common is mass-market, simple tools with massive international business audiences, and masses of venture funding behind them. But there are plenty of other former tech darlings who flamed out before reaching that critical mass of users – e.g. Helpstream in 2010. Conclusion: freemium is not for bootstrapped companies I’m not saying freemium can’t work for you, but be very wary of following the freemium orthodoxy unless you have: A potential user base of more than 10m (and preferably 100m users) A product so simple it doesn’t need training or support to get set up Substantial funding – it takes a long time to build enough users to generate cost-covering income For many businesses, the fabled network effects of all those free users will be outweighed by the opportunity cost of not charging some of them from the start. To profitability and beyond! Have any questions? Comment below or get in touch! (Chart: How Much Is the Average Facebook User Worth? | Statista)   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.

2016-09-27

Making the detection of significant trends in your traffic easier to see

Our core belief at Littledata is that machines are better at spotting significant changes in your website’s performance than a human analyst. We’ve now made it easier for you to get specific alerts, reducing the time spent wading through data. This is the story of how we produced the new trend detection algorithm. Enjoy! Back in 2014, we developed the first version of an algorithm to detect if today or this week’s traffic was significantly different from previous periods. This allows managers to focus in on the aspects of the traffic or particular marketing campaigns which are really worthy of their attention. Although the first version was very sensitive, it also picked up too many changes for a single person to investigate. In technical language, it was not specific enough. In June and July, Littledata collaborated with a working group of mathematicians from around Europe to find a better algorithm. The European Study Group with Industry (ESGI) originated in the University of Limerick’s mathematics department in Ireland and has helped hundreds of businesses link up with prominent mathematicians in the field to solve real-world problems. Littledata joined the latest study group in University College Dublin in July, and was selected by a dozen mathematicians as the focus for their investigation. Andrew Parnell from the statistics department at University College Dublin helped judge the output from the four teams that we split the group into. The approach was to use an algorithm to test the algorithms; in other words, we pitted a group of statistical strategies against each other, from clustering techniques to linear regression, through to Twitter’s own trend detection package, and compared their total performance across a range of training data sets. Initially, the Twitter package looked to be doing well, but in fact it had been developed specifically to analyse huge volumes of tweets, and performed badly when given low volumes of web traffic. In between our hosts’ generous hospitality – with Guinness, Irish folk music, and quite a lot of scribbling of formulas on beer mats – our engineer (Gabriel) and I worked with the statisticians to tweak the algorithms. Eventually, a winner emerged: sensitive enough to pick up small changes on low-traffic websites, but also specific enough to ignore the random noise of daily traffic. The new trend detection algorithm has been live since the start of August and we hope you enjoy the benefits. Our web app allows for fewer distractions and more significant alerts tailored to your company’s goals, which takes you back to our core belief that machines are able to spot major changes in website performance better than a human analyst. If you’re interested in finding out how our web app can help you streamline your Google Analytics data, please get in touch! Further reading: 7 quick wins to speed up your site analysis techniques Online reporting turning information into knowledge Will a computer put you out of a job?

2016-09-08

Setting up common email software for Google Analytics

Many of the popular email providers make it easy to automatically tag up links in your emails to allow Google Analytics to track them under the 'Email' channel. Without this, the traffic from email links will be dispersed under 'Direct' and 'Referral' channels, and you won't be able to see which emails really drive engagement or sales. Here are the links to set up some common email services: MailChimp Campaign Monitor ActiveCampaign Benchmark Email ConstantContact iContact Emma MadMimi GetResponse Mail Jet If your email provider is not in the list, or you send emails from your own platform, you'll need to manually paste in tagged up email links. Still need some help? Contact us and we'll be happy to answer any questions!
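If you do need to tag links by hand, a typical campaign-tagged link looks something like this (the parameter values are placeholders to adapt):
https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=january_sale
Setting utm_medium to 'email' is what places the resulting sessions in the Email channel in Google Analytics' default channel grouping.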

2016-08-24

Personally Identifiable Information (PII), hashing and Google Analytics

Google has a strict policy prohibiting sending Personally Identifiable Information (PII) to Google Analytics. This is necessary to provide GA reports around the world, yet comply with country regulations about storing personal information. Even if you send personal information accidentally, Google may be forced to delete all of your analytics data for the time range affected. This policy has recently tightened to state: You may not upload any data that allows Google to personally identify an individual (such as names and email addresses), even in hashed form. A number of our clients are using a hashed email as the unique identifier for logged-in users, or those coming from email campaigns. If so, this needs to be a minimum of SHA256 hashing (not MD5 hashing), with a 'salt' to improve the security – check your implementation meets the required standard (there's a rough hashing sketch at the end of this post). If you want to check if personal information affects your analytics, we now include checking for PII in our complete Google Analytics audit. Google's best practice for avoiding this issue is to remove the PII at the source – on the page, before it is sent to Google Analytics. But it may be hard to hunt down all the situations where you accidentally send personal data; for example, a form which sends the user's email in the postback URL, or a marketing campaign which adds the postcode as a campaign tag. We have developed a tag manager variable that does this removal for you, to avoid having to change any forms or marketing campaigns which are currently breaking the rules. Steps to setup 1. Copy the script below into a new custom Javascript variable in GTM
function() {
  // Modify the object below to add additional regular expressions
  var piiRegex = {
    // matches emails, postcodes and phone numbers where they start or end with a space
    // or a comma, ampersand, backslash or equals
    "email": /[\s&\/,=]([a-zA-Z0-9_.+-]+\@[a-zA-Z0-9-]+\.[a-zA-Z0-9-.]+)($|[\s&\/,])/,
    "postcode": /[\s&\/,=]([A-Z]{1,2}[0-9][0-9A-Z]?(\s|%20)[0-9][A-Z]{2})($|[\s&\/,])/,
    "phone number": /[\s&\/,=](0[0-9]{3,5}(\s|%20)?[0-9]{5,8}|[0-9]{3}-[0-9]{4}-[0-9]{4})($|[\s&\/,])/
  };
  // Ensure that {{Page URL}} is updated to match the Variable in your
  // GTM container to retrieve the full URL
  var dl = {{Page URL}};
  var dlRemoved = dl;
  for (var key in piiRegex) {
    dlRemoved = dlRemoved.replace(piiRegex[key], 'REMOVED');
  }
  return dlRemoved;
}
2. Check {{Page URL}} is set up in your GTM container This is a built-in variable, but you'll need to check it under the variables tab. 3. Change the pageview tag to override the standard document location, and use the variable with PII removed By default, Google Analytics takes the location to be whatever is in the URL bar (document.location in Javascript). You will override that with the PII-safe variable.
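If you do keep a hashed email as an identifier, here's a rough sketch of salted SHA-256 hashing in the browser using the Web Crypto API. The salt value and the normalisation (trim and lowercase) are assumptions to adapt, and you should still confirm the approach against Google's current policy:
function hashEmail(email, salt) {
  // Normalise, salt and hash the address; returns a Promise resolving to a hex string
  var data = new TextEncoder().encode(salt + email.trim().toLowerCase());
  return crypto.subtle.digest('SHA-256', data).then(function (buffer) {
    return Array.prototype.map.call(new Uint8Array(buffer), function (b) {
      return ('0' + b.toString(16)).slice(-2);
    }).join('');
  });
}
// Usage: hashEmail('user@example.com', 'your-long-random-salt').then(function (id) { /* pass id to GA */ });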

2016-08-03

How to use Enhanced Ecommerce in Google Analytics to optimise product listings

Ecommerce reporting in Google Analytics is typically used to measure checkout performance or product revenue. However, by analysing events at the top of the funnel, we can see which products need better images, descriptions or pricing to improve conversion. Space on product listing pages is a valuable commodity, and products which get users to click on them – but don’t then result in conversion – need to be removed or amended. Equally, products that never get clicked within the list may need tweaking. Littledata ran this analysis for a UK retailer with Google Analytics Enhanced Ecommerce installed. The result was a scatter plot of product list click-through rate (CTR) – in this case, based on the ratio of product detail views to product listing views – versus product add-to-cart rate. For this retailer, it was only possible to buy a product from the detail page. We identified three problem categories of product, away from the main cluster: Quick sellers: these had an excellent add-to-cart rate, but did not get enough list clicks. Many of them were upsell items, and should be promoted as ‘you may also like this’. Poor converters: these had high click-through rates, but did not get added to cart. Either the product imaging, description or features need adjusting. Non-starters: never get clicked on within the list. Either they are incorrectly categorised, or the thumbnail/title doesn’t appeal to the audience. They need to be amended or removed. How we did it Step 1 - Build a custom report in GA We need three metrics for each product name (or SKU) – product list views, product detail views and product adds to cart – and then add 'product' as a dimension. Step 2 - Export the data into Excel Google Analytics can't do the statistical functions we need, so Excel is our favoured tool. Pick a decent time series (we chose the last three months) and export. Step 3 - Calculate List > Detail click-through This website is not capturing Product List CTR as a separate metric in GA, so we need to calculate it as Product Detail Views divided by Product List Views. However, our function will ignore products with fewer than 300 list views, where the rate is too subject to chance. Step 4 - Calculate Detail > Add to Cart rate Here we need to calculate Product Adds to Cart divided by Product Detail Views. Again, our function will ignore products with fewer than 200 detail views. Step 5 - Exclude outliers We will use an upper and lower bound of the median +/- three standard deviations to remove improbable outliers (most likely from tracking glitches). First we calculate the median ( =MEDIAN(range) ) and the standard deviation for the population ( =STDEV.P(range) ). Then we can write a formula to filter out all those outside of the range. Step 6 - Plot the data Using the scatter plot type, we specify List > Detail rate as the X axis and Detail > Add to Cart as the Y axis. The next step would be to weight this performance by margin contribution: some poor converters may be worth keeping because the few sales they generate are high margin. If you are interested in setting up Enhanced Ecommerce to get this kind of data or need help with marketing analytics then please get in contact.   Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
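To make step 5 concrete: assuming (hypothetically) that the rate you are filtering sits in column E, with the median in cell $H$1 and the standard deviation in $H$2, a filter formula along the lines of =IF(ABS(E2-$H$1)<=3*$H$2, E2, NA()) replaces anything more than three standard deviations from the median with #N/A, which the scatter chart then skips over.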

2016-03-31

SEIS support covers over 100% of your startup investment risk

Until recently I hadn't understood how generous the Seed Enterprise Investment Scheme is for investors in early-stage companies. Investors can put up to £100k into qualifying companies, as long as they don't control more than 30% of the SEIS company. There are three overlapping benefits which mean you can recoup over 100% of your investment in tax offset if the company goes bust, and get a 5x boost to the value of your initial investment if all goes well. It sounds too good to be true, so use your allowance while it is still open! Let's assume that you are an additional-rate (45%) taxpayer, and want to invest £10,000 of capital gains into an SEIS company. What happens if that company eventually goes bust? A. Reinvestment relief Firstly, you get a 50% reduction in the capital gains tax bill from the gain reinvested. If you realised a gain of at least £10k over and above the capital gains tax allowance from selling shares or property, then you can reclaim the tax on the amount you reinvest in the SEIS company. At the 2014/2015 higher rate of 28% that is: + £2,800 B. Income tax relief Next, you can write 50% of your £10k investment off against your income tax bill from this year or last – even if you didn't directly use that income to invest in the SEIS company. +£5,000 C. Loss relief If the company goes bust, then you can write off a further 45% (your marginal tax rate) against your income tax bill in the year you claim: 45% of the £5,000 of the investment not already covered by the income tax relief. +£2,250   So of that £10k you have already recouped £2,800 + £5,000 + £2,250 = £10,050 from HMRC. Leaving you with a small gain to cover the inconvenience. But look on the bright side! What if the company sells for double the value in a few years' time? This time you still get benefits A & B, but also keep the proceeds free of capital gains tax. So you put in £10k, but take £7,800 back off your tax bill, leaving you with £2,200 net exposure. When you sell the shares for £20k, you have multiplied your capital at risk 9 times. An investment in an equivalent non-SEIS company would have yielded £20k, less capital gains tax of £2,800 = £17,200 (1.7x your investment). So you get more than five times the net gain from the SEIS investment.

2016-02-25

3 steps to great email customer support

As a consumer brand, is there a better way of getting customers to refer you business than offering excellent customer support? My inbox this afternoon showed two polar opposites of handling support by email and illustrated what great support looks like. I can sum up the differences: Ditch the "you're in a queue" email Really listen to the customer Offer further advice Ditch the "you're in a queue" email My depressing email exchange with Swiss Airlines starts when I tried to complain about the £4.50 credit card charge. I would normally never pay it, but their debit card payment route was broken, so to book the flight I had no choice. Dear customer, thank you for your message. We will get back to you as soon as possible. The response time may vary depending on the amount of research required. Please do not reply to this E-Mail. Use for your feedback our page: www.swiss.com/contacts We thank you for your understanding. Yours sincerely, Swiss International Air Lines Ltd. Let's unpack the sheer hostility of this: "thank you for your message" = we care so little we couldn't be bothered to add a capital letter "as soon as possible" = nor do we have enough staff to answer today "Please do not reply to this E-Mail" = we can't even be bothered to install a smart ticketing system Really it would be better not to send me an auto-response at all - just get back to me when a human is ready. Let's compare that with an email I get from TransferWise, which was my good experience of the day. At first glance, this looks like an automated response, but then I realise it's signed by a real person - and they actually want me to reply to the email. TransferWise are having to deal with genuinely onerous FCA anti-money laundering rules - and offering a helpful way to get around it. Really listen to the customer The Swiss conversation goes downhill from there. OK, I'm a bit smart Alec about the transaction fee - but it's a well known scam. On 24 Feb 2016, at 05:51, contactus@swiss.com wrote: Dear Mr. Upton, Thank you for writing to us with regards to your query and we apologizes for the inconvenience caused. We would like to inform you that GBP4.50 is the fee charged directly from the bank/bank fee. Therefore, we cannot grant a refund with regards to the above mentioned fees. We trust the above information will be of assistance and are available to assist you with any further questions at any time. Thank you for choosing SWISS and we wish you a pleasant day further. Kind regards, Miriama Consultant Customer Travel Services / R1S ----- From: Edward Upton [mailto:edward@edwardupton.com] Dear Miriama, That is absolutely untrue. MasterCard charges you 0.3% for the transaction, which in this case is 51p https://www.mastercard.us/en-us/about-mastercard/what-we-do/interchange.html So please can you refund me GBP 4? regards, Edward Upton ----- From: contactus@swiss.com Dear Mr. Upton, Thank you for writing to us. We have reviewed your request regarding your reservation. Please note that in regards to your request we will not be able ot refund the OPC. Please note this (GBP4.50) is a charge placed by the credit card company and it applies as per the point of commencement of your ticket. We hope this information is useful. Please do let us know if you need additional information. Thank you for choosing SWISS. Kind Regards, Alexander Consultant Customer Travel Services / R1S This feels like someone has cut and pasted from a standard response list. It's robotic. 
And given that the original issue was actually about their website being broken, there is a total lack of empathy for the issue - just some 'apologizes' (sic). Offer further advice Often companies have to say no to refunds and extra requests, but at least be gracious. And sometimes the company can offer you something that benefits both parties: a guide to how to avoid needing to email in the future. Here is the exemplary reply from Transferwise Hi Edward, I hope you’re doing well! Thank you for getting back to us, and confirming that we can change the name on the payment ###### to your personal. I shall quickly pass this on to my colleagues, who are able to make the change and proceed with the transfer. As soon as the payment is sent out from our end, we shall send you a confirmation e-mail, like always. All you need to do is check your inbox every now and then.:) Just in case, I will explain how you can choose to use both your personal and business profiles on TransferWise. Once you log in to your TransferWise account, on the upper right corner you should see a logo (like a man in a circle). When you click on the logo, you should see: Use as Edward Upton Use as Littledata Consulting Ltd Therefore, if you want to set up a personal payment, and you’re planning to send money from your personal bank account, please make sure that “Use as Edward Upton” is ticked. And if you’re planning to make a business payment and send money from your business bank account, please make sure to choose the second option. If anything was left unclear or you would need help with something else, please don’t hesitate to get back to us. We are always happy if we can help! I hope you have a lovely day, Eliisa, TransferWise Support Which company do you think I'll recommend in the future? Comment below!

2016-02-25

5 myths of Google Analytics Spam

Google Analytics referral spam is a growing problem, and since Littledata has launched a feature to set up spam filters for you with one click, we’d like to correct a few myths circulating. 1. Google has got spam all under control Our research shows the problem exploded in May – and is likely to get worse as the tactics get copied. From January to April this year, there were only a handful of spammers, generally sending one or two hits to each web property, just to get on their reports. In May, this stepped up over one thousand-fold, and over a sample of 700 websites, we counted 430,000 spam referrals – an average of 620 sessions per web property, and enough to skew even a higher-traffic website. The number of spammers using this tactic has also multiplied, with sites such as ‘4webmasters.org’ and ‘best-seo-offer.com’ especially prolific. Unfortunately, due to the inherently open nature of Google Analytics, where anyone can start sending tracking events without authentication, this is really hard for Google to fix. 2. Blocking the spam domains from your server will remove them from your reports A few articles have suggested that changing your server settings to exclude certain referral sources or IP addresses will help clear up the problem. But this misunderstands how many of these ‘ghost referrals’ work: they are not actual hits on your website, but rather tracking events sent directly to Google’s servers via the Measurement Protocol. In this case, blocking the referrer from your own servers won’t do a thing – since the spammers can just go directly to Google Analytics. It's also dangerous to amend the htaccess file (or equivalent on other servers), as it could prevent a whole lot of genuine visitors from seeing your site. 3. Adding a filter will remove all historic spam Filters in Google Analytics are applied at the point that the data is first received, so they only apply to hits received AFTER the filter is added. They are the right solution for preventing future spam, but won’t clean up your historic reports. To do that you also need to set up a custom segment, with the same source exclusions as the filter. You can set up an exclusion segment by clicking 'Add Segment' and then the red 'New Segment' button on the reporting pages, and setting up a list of filters similar to this screenshot. 4. Adding the spammers to the referral exclusion list will remove them from reports This is especially dangerous, as it will hide the problem without actually removing the spam from your reports. The referral exclusion list was set up to prevent visitors who went to a different domain as part of a normal journey on your website being counted as a new session when they returned. e.g. If the visitor is directed to PayPal to pay, and then returns to your site for confirmation, then adding 'paypal.com' to the referral exclusion list would be correct. However, if you add a spam domain to that list then the visit will disappear from your referral reports... but will still be included under Direct traffic. 5. Selecting the exclude known bots and spiders option in the view settings will fix it Google released a feature in 2014 to exclude known bots and spiders from reports. Unfortunately, this is mainly based on IP address – and the spammers, in this case, are not using consistent IP addresses, because they don't want to be excluded. So we do recommend opting into the bot exclusion, but you shouldn't rely on it to fix your issue. Need more help? Comment below or get in touch!
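As an illustration of the filter and segment in myth 3, the exclusion is usually a regular expression on campaign source listing the offending domains – for example 4webmasters\.org|best-seo-offer\.com – extended with whichever referrers appear in your own reports. Google Analytics filter fields are limited to 255 characters, so a long list may need to be split across several filters.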

2015-05-28

Will a computer put you out of a job?

I see a two tier economy opening up in England, and it’s not as simple as the haves and have-nots. It’s between those that build machines, and those that will be replaced by them: between those that can code, and those that can’t. We’ve seen the massive social effects that declining heavy manufacturing jobs since 1970s have had on much of the North of England and Scotland, and I believe we’re at the start of a similar long-term decimation of service industry jobs – not due to outsourcing to China, but due to automation by computers.  Lots of my professional friends in London would feel they’re beyond the reach of this automation: their job involves being smart and creative, not doing production-line tasks. But it is these jobs, which currently involve staring at numbers on a screen, which are most at risk from computer substitution. If your job involves processing a load of data into a more presentable format (analysts, accountants, consultants and some types of traders) then a computer will eventually - within the next 20 years - be able to do your job better than you. In fact, within 20 years computers will be much better than humans at almost every kind of data processing, as the relentless extension of Moore’s law means pound-for-pound computer processing will be 1 million times cheaper than it is now. As Marc Andreessen put it, ‘Software is eating the world’, and we’re only just beginning to work through the implications.  This worries me. With the greater and greater levels of automation of the working world, what happens to employment? Last year we saw an incredible event in the sale of WhatsApp to Facebook: massive wealth creation ($17bn) accompanied by almost no job creation (33 employees at the time of sale). If a tiny number of highly skilled people can create a service with 300m paying customers, why do companies need to hire lots of people? In the utopian view of future work we give up all boring admin tasks to the machines, and focus on face-to-face interaction and making strategic decisions based on selected knowledge fed to us by our personal digital agents (like Google search on steroids). Lots more thinking space leads us to be more productive, and more leisure time makes us happier. But 30 years ago they thought computers would evolve into very capable personal assistants, when in fact office workers are chained to the screen for longer hours by the tyranny of email and real-time information flow. Look at Apple’s forecast from 1987 of what computing might look like in 2006: the professor is freed from the tedium of typing or travelling to the library. Yet they didn’t consider whether the professor himself might be needed in a world where students could get their lectures as pre-recorded videos. So the cynical view is that more volume of data will require more humans to interpret, and the technology will always need fixing. As companies become more automated there will be more and more jobs shifting into analysis and IT support; analogous to how, as postal mail has been replaced by email, jobs in the company post room have shifted into IT support. The problem is that there really are a limited number of humans that can set up and maintain the computers. I’d love to see society grappling with that limitation (see grass-roots initiative like CoderDojo) but there are some big barriers to retraining adults to code: limited maths skills, limited tolerance for the boredom of wading through code, and limited opportunities for people to test their skills (i.e. 
companies don’t trust this most critical of job roles to new apprentices). So those that have commercial experience in programming can command escalating day rates for their skills – and this is most apparent in London and San Francisco, while pay in other skilled areas is not even keeping up with core inflation. That leads us to the dystopian view: that the generation starting their working lives now (those 10 years younger than me) will see their prospects hugely diverge, based on which side of the ‘replace’ or ‘be replaced’ divide they are on. If companies akin to Google and Facebook become the mainstay of the global economy, then there’ll be a tiny number of silicon sultans whose every whim is catered for – and a vast mass of technology consumers with little viable contribution to the workplace. Let’s hope our politicians start grasping the implications before they too are replaced by ‘democracy producing’ software!

2015-03-27

How to audit your Web Analytics Ecommerce tracking

Most companies will see a discrepancy between the transaction volumes recorded via web analytics and those recorded via an internal sales or financial database. This article focuses on how to find and reduce that discrepancy, to give greater credibility to your web analytics data. Following on from our article on common Google Analytics setup problems, we are often asked why Google Analytics ecommerce tracking is not a 100% match with other records, and what is an acceptable level of difference. Inspired by a talk from Richard Pickett at Ensighten, here is a checklist to run through to reduce the sources of mismatch. The focus here is Google Analytics Ecommerce tracking, but it could apply to other systems. In summary, you wouldn’t ever expect there to be a 1:1 match, due to the different paths the two events take over the internet. The general consensus is that anything less than a 4% difference in transaction volumes is good, though a gap of up to 10% can sometimes persist. Factors that affect this target rate include how many users have got ad blockers or disable Google Analytics (popular in Germany, for example), what proportion are on mobile devices (which suffer from more network interruptions) and how the purchase thank you / confirmation page is built. So on to the list. 1. Are other Javascript errors on the page blocking the ecommerce event in certain situations? The most common reason for the tracking script not executing in the browser is that another bug on your page has blocked it (see GDS research). The bug may only be affecting certain older browsers (like Internet Explorer 7), and may have missed your own QA process, so the best approach is to use Google Tag Manager to listen for any Javascript error events on the confirmation page and send these to Google Analytics as custom events. That way your users do the testing for you, and you can drill into exactly which browsers and versions the bugs are affecting. 2. Is the tracking code as far up the page as it could be? If the user drops their internet connection before the whole page loads then the ecommerce event data won’t get a chance to fire. The best approach is to load the script at the bottom of the <head> element or top of the <body>. The Google Analytics script itself won't block the page load, and arguably on this one purchase confirmation page, the tracking is more important than the user experience. 3. Is the tracking code firing before all the page data has loaded? The inverse of the previous problem: you may need to delay firing the tracking code until the data is ready. This is particularly an issue if your ecommerce transaction data is ‘scraped’ from the HTML elements via Google Tag Manager. If the page elements in question have not loaded before the ecommerce tracking script runs, then the product names, SKUs and prices will be empty – or returning an error. 4. Is the problem only your ecommerce tracking script, or page tracking in general? It could be that the way you are sending the transaction data (e.g. product name, price, quantity) is the problem, or that the page tracking overall is failing in some cases. You can pinpoint where the problem lies by comparing the pageviews of the confirmation page with the number of ecommerce events tracked. Caveat: on many sites, there’s another route to seeing the purchase confirmation page, which doesn’t involve purchasing (for example as a receipt of a historic purchase).
In that case, you may need to capture a unique purchase event which fires only when a new purchase is confirmed – even if it carries no information on the transaction or products – and use that as the comparison baseline. 5. Are events from your test site excluded? Most companies will have a development, staging or user acceptance testing server where the website is tested and test users can make purchases. Are you blocking the tracking from these test sites? Some possible ways to block the test site(s): set up subdomain-specific blocking rules in Google Tag Manager; (better) divert the tracking from your test subdomains to a test Google Analytics account, using a lookup macro/variable; or set up filters in the Google Analytics view to exclude the test traffic. 6. Is your tag set with a high priority? (Tag Manager only.) If you use Google Tag Manager and have multiple tags firing on the confirmation page, it’s possible that other tags are blocking your ecommerce data tag from firing. Under ‘Advanced settings’ in the tag editor you can set a higher priority number for tag firing; we suggest the ecommerce data tag to Google Analytics is always given the highest priority. 7. Are strings in the product name properly escaped? A common problem is apostrophes: if your product name contains a quote mark character, it will break the following Javascript. Take a product named ‘Pete’s bunnies’ – the stray apostrophe ends the string early, and everything after it will be misinterpreted. The solution is to run a script across any text field to either strip out the quotation marks or replace any quotes with their HTML equivalent (e.g. &quot;). 8. Are your quantities all integers? One of our clients was selling time slots, and so had the ‘quantity’ of the ecommerce tracking data equivalent to a number of hours. Timeslots sold in half-hours (e.g. 1.5 hours) were not tracking… because Google Analytics only recognises a quantity which is a whole number, so sending ‘1.5’ will not be counted. 9. Are any possible ‘undefined’ values handled? It may be that the data on your products is incomplete, and some products that people buy do not have a name, price or SKU. The safest approach is to have some fall-back values in your Javascript tracking code to look for undefined or non-text variables and post a default value to Google Analytics. E.g. if ‘product name’ is undefined then post ‘No product name’, or for price the default should be ‘0.00’. These will then clearly show up in your Ecommerce Product performance reports and the data can be cleaned up. 10. Are users reloading the page and firing duplicate tracking events? Check whether this is a problem for your site by using our duplicate transactions custom report to see multiple events with the same transaction ID. A solution is to set a ‘has tracked’ cookie after the ecommerce tracking has been sent the first time, and then check whether the cookie is set before sending again. 11. Are users going back to the page and firing the tracking at a later date? The sessions column in the transaction ID report from step 10 should give you an idea of whether the problem is repeat page loads in one session, or users revisiting the page in another session. If you see duplicate transaction IDs appearing in other sessions there are a couple of possibilities to investigate: Could users be seeing the page again by clicking on a link in an email, or from a list of historic orders? Are there any back-end admin pages that might link to the confirmation page as a receipt? 
In both cases, the solution is to have a different URL for the receipt than the one where the ecommerce tracking is fired. If there are any other troubleshooting steps you have found helpful, please let us know in the comments or get in touch!
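As an illustration of point 1 above, here is a minimal sketch of a Custom HTML tag you could add in Google Tag Manager to forward uncaught JavaScript errors to Google Analytics as events. It assumes Universal Analytics (analytics.js) is already loaded on the page and that the global ga function is available; the event category and label format are just examples.

    <script>
      // Listen for uncaught JavaScript errors on the confirmation page
      window.addEventListener('error', function (e) {
        if (window.ga) {
          // Non-interaction event, so it does not distort bounce rate
          ga('send', 'event', 'JS Error', e.message || 'Unknown error',
             (e.filename || '') + ':' + (e.lineno || ''), { nonInteraction: true });
        }
      });
    </script>

Fire a tag like this only on the confirmation page, and the resulting JS Error events in Google Analytics will show which browsers and page bugs are blocking your ecommerce tracking.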

2015-03-17

5 common Google Analytics setup problems

Can you rely on the data you are seeing in Google Analytics? If you use it daily in your business you should really give some time to auditing how the data is captured, and what glitches could be lurking unseen. The notifications feature in Google Analytics now alerts you to some common setup problems, but there are simpler ones you can check today. Here are 5 aspects of your Google Analytics account to check now. Are you running the latest Universal Analytics tracking code? Is your overall bounce rate below 10%? Are you getting referrals from your own website? Are you getting ‘referrals’ from your payment gateway? Have you got the correct website default URL set in GA? Are you getting the full referring URL in reports? 1. Are you running the latest Universal Analytics tracking code? You may have clicked upgrade in the Google Analytics admin console, but have your developers successfully transferred over to the new tracker code? Use our handy tool to test for Universal Analytics (make sure you copy your URL as it appears in the browser bar). 2. Is your overall bounce rate below 10%? The 'bounce rate' is defined as the proportion of sessions of only one page. It’s highly unlikely to be in single digits unless you have a uniquely engaged source of traffic. However, it is possible that the tracking code is firing twice on a single page. This double counting would mean Google Analytics sees every single page view as two pages – i.e. never a bounce. This is more common on template-driven sites like WordPress or Joomla, where you may have one tracking script loaded by a plugin – and another pasted onto the main template page. You can check if you have multiple pageviews firing by using the Google Tag Assistant plugin for Chrome (see also the console snippet at the end of this article). 3. Are you getting referrals from your own website? A self-referral is traffic coming from your own domain – so if you are www.acme.com, then a self-referral would appear as ‘acme.com’. Have a look at the (recently moved) referrals list and see if that is happening for you. This is usually caused by having pages on your website which are missing the GA tracking code, or have it misconfigured. You can see exactly which pages are causing the problem by clicking on your domain name in the list and viewing the referring path. If you are on Universal Analytics (please use our tool to check) you can exclude these referrals in one step with the Referral Exclusion list. For a fuller explanation, see the self-referral guide provided by Google. 4. Are you getting ‘referrals’ from your payment gateway? Similar to point 3: if you have a 3rd party payment service where customers enter their payment details, then when they are redirected back to your site – if you are on Universal Analytics – they will show up as a new visit… but originating from ‘paypal.com’ or ‘worldpay.com’. You need to add any payment gateway or similar 3rd party services to that Referral Exclusion list. Just add the domain name – so PayPal would be 'paypal.com'. 5. Have you got the correct website default URL set in GA? When Google Analytics was first set up for your website you may have set a different domain name from the one you now use. Or maybe you have switched to running your site on https:// rather than http://. So you need to change the default URL set up in the admin page: go to Admin > Property > Property Settings. Once that is set up correctly, the ‘All Pages’ report becomes a lot more useful – because you can click through to view the actual page using the open link icon. 
Advanced: Are you getting the full referring URL in reports? If you run your website across different subdomains (e.g. blog.littledata.co.uk and www.littledata.co.uk) then it can be difficult to tell which subdomain a page was on. The solution is to add the hostname to the URL using a custom filter – see the guide on how to view full page URLs in reports. What other setup issues are you experiencing? Let us know in the comments or by tweeting @LittledataUK.
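To dig into the double counting described in point 2 (assuming you are on Universal Analytics / analytics.js), you can also open the browser console on an affected page and count the tracker objects that have been created:

    // Run in the browser console on any page of your site
    ga(function () {
      var trackers = ga.getAll();
      console.log('Trackers found: ' + trackers.length);
      trackers.forEach(function (t) {
        console.log(t.get('name') + ' -> ' + t.get('trackingId'));
      });
    });

Two trackers with different tracking IDs can be intentional (a roll-up property, for example), but two trackers sending to the same ID usually means the snippet has been included twice and every pageview is being double counted.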

2015-02-18

Best enhanced ecommerce plugins for Magento

With the release of Google Analytics' Enhanced Ecommerce tracking, Magento shop owners now also have the option to track more powerful shopping and checkout behaviour events. Using a Magento plugin to add the tagging to your store could save a lot of development expense. But choosing a third party library has risks for reliability and future maintenance, so we’ve installed the plugins we could find to review how they work. The options available right now are: Tatvic’s Google Analytics Enhanced Ecommerce plugin (there is also a paid version with extra features); BlueAcorn’s ‘official’ Google Enhanced Ecommerce for Magento plugin; Scommerce Mage's Google Enhanced Ecommerce Tracking plugin; Anowave – they have a GTM and non-GTM plugin available for €150, but declined to let us test them for this review; and DIY – send the data directly from Google Tag Manager (see the sketch at the end of this article).

Advanced features:
Plugin     | Checkout options? | Promotions? | Social interactions? | Refunds?
Tatvic     | -                 | -           | -                    | -
BlueAcorn  | Y                 | -           | -                    | -
Scommerce  | Y                 | Y           | -                    | Y
Anowave    | Y                 | Y           | Y                    | Y
DIY setup  | Y                 | Y           | -                    | -

Our overall scoring:
Plugin     | Ease of install | Flexibility | Privacy | Cost
Tatvic     | 4               | 2           | 2       | Free
BlueAcorn  | 3               | 1           | 5       | Free
Scommerce  | 3               | 3           | 5       | £65 / US$98
None (DIY) | 1               | 5           | 5       | Your time!

There is no clear winner, so choose the plugin that suits your needs best. If you are concerned about data privacy then go for either BlueAcorn or Scommerce, but pick Tatvic's plugin if you prefer the easiest installation process. If you want to spend more time capturing further data – like promotions and refunds – you might want to consider implementing the tracking yourself with Google Tag Manager. Tatvic’s plugin Advantages: Fast and easy to install (it took less than an hour to configure everything). Good support by email after installation. Basic shopping behaviour and checkout behaviour steps captured. Disadvantages: It injects a Google Tag Manager container into your site that only Tatvic can control. Some reviewers on Magento Connect raised privacy concerns here, so Tatvic should clarify how and why they use this data. At the very least it is a security risk, as any Javascript could be injected via that container.* Product impressions are only segmented by product categories – there is no separation for cross-sell, upsell or related products widgets. No support for coupon codes or refunds. (* Tatvic can help you configure your own GTM container if their standard setup is an issue for you.) Scommerce plugin Advantages: It doesn’t need Google Tag Manager, so you can be sure that no one can add scripts to your site. You can install from Magento Connect. Update on 24 Aug 2015: supports one page checkout. BlueAcorn plugin Advantages: Easy to install. It doesn't add Google Tag Manager to your site. Disadvantages: You have to set your shop currency to US dollars. Support is slow to respond. Enable Enhanced Ecommerce reporting To use the plugins listed above, you will first need to enable Enhanced Ecommerce tracking in Google Analytics. If you already have it enabled, you can skip this section. Go to Google Analytics > Admin > View > Ecommerce Settings. Enable Enhanced Ecommerce and set up the checkout funnel steps to match your store’s checkout. Remove your existing Google Analytics tracking code from the website. Installing Tatvic’s plugin Go to the Magento Connect centre, open the “Settings” tab and enable beta extensions. Go back to the “Extensions” tab, paste the extension key and click 'Install'. You should see a successful completion message.
Go back to the configuration page. Don't worry if you see a 404 error. Log out and back in again and you shouldn't see the error any more. Now add the missing details in the configuration settings, e.g. Google Analytics account and checkout URL. You should see all the checkout steps working. Installing the BlueAcorn plugin BlueAcorn's plugin supports only stores that have their currency set to US dollars. If your online shop is in any other currency, you won't be able to see most of the data on your products' sales performance. Installing BlueAcorn's plugin is similar to Tatvic's, but there are two extra steps. Go to the cache storage management, select all items, select 'Disable' from the Actions dropdown list and click 'Submit'. Go to System > Tools > Compilation and click the 'Disable' button. Install the plugin. Log out and log back in. Re-enable the cache by going back to the cache storage management, selecting all items and enabling them. Go to the Google API tab (System > Configuration > Google API), enable the plugin and insert your Google Analytics account number. Installing the Scommerce plugin Disable compilation mode by going to System > Tools > Compilation and clicking the 'Disable' button. Disable the Google Analytics API. Upload the module to the root folder (PDF). Now flush the cache. Configure the plugin. If you have any further queries regarding the plugins we reviewed, don't hesitate to let us know in the comments.
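If you go down the DIY route mentioned at the top of this article, the core of the work is pushing the Enhanced Ecommerce data into the data layer on the order confirmation page and letting a Google Analytics tag in Google Tag Manager read it. Below is a minimal sketch of the purchase step with made-up order and product values – your Magento template would need to output the real order data, and the GA tag would need 'Enable Enhanced Ecommerce Features' and 'Use Data Layer' ticked:

    <script>
      window.dataLayer = window.dataLayer || [];
      // Push the completed transaction in Enhanced Ecommerce format
      dataLayer.push({
        'event': 'purchase',           // custom event used to trigger the GA tag
        'ecommerce': {
          'purchase': {
            'actionField': {
              'id': 'ORD-10001',       // transaction ID (example value)
              'revenue': '35.43',
              'tax': '4.90',
              'shipping': '5.99'
            },
            'products': [{
              'name': 'Blue T-shirt',  // example product
              'id': 'SKU-123',
              'price': '24.54',
              'category': 'Apparel',
              'quantity': 1
            }]
          }
        }
      });
    </script>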

2015-02-18

Agriculture in Uganda: Measure and Improve

I had a truly inspiring day visiting a Send a Cow project near Masaka in Uganda. A group of 30 farmers underwent 4 years of training, supported by weekly visits from a social worker and agricultural trainer. From a group living in absolute under-a-dollar-a-day poverty, there are now farmers owning thousands of dollars’ worth of livestock and selling export crops like coffee. This education and support, plus the capital grant of one animal per household, has transformed their community. Although the success relied on a solid base of family and group cohesion, organised labour and animal husbandry, I want to focus on three aspects which have ongoing potential for the community. 1. Record keeping Yep, data to you and me. Writing daily details of milk yields, crop inputs, market sale prices and even visitor numbers enabled the farmers to measure and improve. Data also allows farmers to forecast and be inspired. Selling a regular surplus of milk from two cows (after family consumption – yes, they have great teeth!) gave the farmer a regular income of US$3.50 per day at the farm gate. That is more than a teacher’s salary in Uganda. With tender care and back-breaking forage harvesting, they now have a calf being reared – and can count just how much that will mean in further milk and profits. Maybe in 10 years they will be entering yields into a smartphone app, and have market prices forecast automatically. 2. Organic agriculture Oil derivatives (like diesel and fertiliser) are nearly as expensive in Uganda as in the UK – in ridiculous contrast to the local market prices for vegetables. Efficient farming therefore has to rely on minimal imported inputs, and maximise the local bounty of sun, rain… and manure. Every precious drop of animal urine is captured – to mix with ash and chilli as an insect repellent for plants – or used neat as a fertiliser. In the dry season, every rainfall is maximised, with lots of mulching of vegetables to prevent evaporation, and with a permaculture approach of shading coffee bushes with banana plants, and vegetables under the coffee. I am a fan of organic farming for health and environmental reasons, but out here I just do not see an alternative, cost-effective way to increase crop yields. 3. Peer-to-peer lending Developed-to-developing country lending networks, like Kiva.org, have grown rapidly – but with inevitable problems in vetting funding applications at a distance. What farmers need are equivalents of 19th century Europe’s co-operative societies – where savers and lenders from the same area are brought together. These farmer groups operate a very effective local system. All members pledge to save every month: from just 1 cent a week. Then any member can ask for a short term (maximum 3 month) loan from the fund – which is now $2000. The default rate is low – around 2% – as members know the debtors’ ability to repay, and can monitor progress in person. Plus every debtor has savings in the scheme – so wants to preserve their share of the capital. Three-month loans (and a flat 10% interest rate) make repayments easy to predict – and work in a country where planting to harvest is only 3 months. Uganda’s government abolished co-operatives in the 1990s when they started sponsoring political campaigns. But if these lending clubs can grow they could go some way to unlocking the capital that Africa needs to grow. This post was written by Edward Upton, Founder of Littledata, @eUpton

2015-02-17

Under the hood of Littledata

The Littledata tool gives you insight into your customers' behaviour online. We look through hundreds of Google Analytics metrics and trends to give you summarised reports, alerts on significant changes, customised tips and benchmarks against competitor sites. This guide explains how we generate your reports and provide actionable analytics. 1. You authorise our app to access your Google Analytics data As a Google Analytics user you will already be sending data to Google every time someone interacts with your website or app. Google Analytics provides an API where our app can query this underlying data and provide summary reports in our own style. But you are only granting us READ access, so there is no possibility that any data or settings in your Google Analytics will change. 2. You pick which view to report on Once you've authorised the access, you pick which Google Analytics view you want reports on. Some people will have multiple views (previously called ‘profiles’) set up for a particular website. They might have subtly different data – for example, one excludes traffic from company offices – so pick the most appropriate one for management reports. We will then ask for your email so we know where to send future alerts. 3. Every day we look for significant changes and trending pages There are over 100 Google Analytics reports and our clever algorithms scan through all of them to find the most interesting changes to highlight. For all but the largest businesses, day-by-day comparisons are the most appropriate way of spotting changing behaviour on your website. Every morning (around 4am local time) our app fetches your traffic data from the previous day – broken down into relevant segments, like mobile traffic from organic search – and compares it against a pattern from the previous week. This isn’t just signalling whether a metric has changed – web traffic is unpredictable and changes every day (scientists call this ‘noise’). We are looking at how likely it is that yesterday’s value was out of line with the recent pattern (a simplified sketch of this kind of check is included at the end of this article). We express this as signal bars in the app: one bar means there is a 90% chance this result is significant (not chance), two bars means a 99% chance and three bars means 99.9% certainty (less than a 1 in 1,000 chance it is a fluke). Separately, we look for which individual pages are trending – based on the same probabilistic approach. Mostly this is a change in overall views of the page, but sometimes in entrances or bounce rate. If you are not seeing screenshots for particular pages there are a few reasons why: the website URL you entered in Google Analytics may be out of date; your tracking code may run across a number of URLs – e.g. company.com and blog.company.com – and you don’t specify which in Google Analytics; or the page may be inaccessible to our app – typically because a person needs to log in to see it. 4. We look for common setup issues The tracking code that you (or your developers) copied and pasted from Google Analytics into your website is only the very basic setup. Tracking custom events and fixing issues like cross-domain tracking and spam referrals can give you more accurate data – and more useful reports from us. Littledata offers setup and consultancy to improve your data collection, or to do a further manual audit. This is especially relevant if you are upgrading to Universal Analytics or planning a major site redesign. 5. 
We email the most significant changes to you Every day – but only if you have significant changes – we generate a summary email, with the highest priority reports you should look at. You can click through on any of these to see a mobile-friendly summary. An example change might be that 'Bounce rate from natural search traffic is down by 8% yesterday'. If you usually get a consistent bounce rate for natural / organic search traffic, and one day that changes, then it should be interesting to investigate why. If you want your colleagues to stay on top of these changes you can add them to the distribution list, or change the frequency of the emails in My Subscriptions. 6. Every Sunday we look for changes over the previous week Every week we look for longer-term trends – which are only visible when comparing the last week with the previous week – so you should get more alerts on a Sunday. If you have a site with under 10,000 visits a month, you are likely to see more changes week-by-week than day-by-day. To check the setup of your reports, log in to the Littledata tool. For any further questions, please feel free to leave a comment below, contact us via phone or email, or send us a tweet @LittledataUK.
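As an illustration only – this is not Littledata's actual algorithm – the kind of check described above can be sketched as comparing yesterday's value against the mean and spread of the previous week, and mapping the result onto the three confidence levels:

    // Toy sketch: is yesterday's value out of line with the previous week?
    function signalBars(previousWeek, yesterday) {
      var n = previousWeek.length;
      var mean = previousWeek.reduce(function (a, b) { return a + b; }, 0) / n;
      var variance = previousWeek.reduce(function (a, b) {
        return a + Math.pow(b - mean, 2);
      }, 0) / (n - 1);
      var stdDev = Math.sqrt(variance);
      if (stdDev === 0) return 0;
      var z = Math.abs(yesterday - mean) / stdDev;
      // Approximate two-sided thresholds for 90%, 99% and 99.9% confidence
      if (z > 3.29) return 3;
      if (z > 2.58) return 2;
      if (z > 1.64) return 1;
      return 0;
    }

    // Example: sessions for the previous 7 days, then yesterday
    signalBars([120, 130, 115, 125, 140, 118, 122], 190); // returns 3

A real implementation would also have to deal with weekly seasonality, trends and very low traffic counts – hence the comparison against a pattern from the previous week, broken down by segment, rather than a simple average.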

2015-02-05

What's new in Google Analytics 2014

Google has really upped the pace of feature releases on Analytics and Tag Manager in 2014, and we’re betting you may have missed some of the extra functionality that’s been added. In the last 3 months alone we’ve counted 11 major new features. How many have you tried out? Official iPhone app. Monitor your Google Analytics on the go. Set up brand keywords. Separate out branded from non-brand search in reports. Enhanced Ecommerce reporting. Show ecommerce conversion funnels when you tag product and checkout pages. Page Analytics Chrome plugin. Get analytics for a particular page, to replace the old in-page analytics. However, it doesn’t work if you are signed into multiple Google Accounts. Notifications about property setup. Troubleshoot common problems like domain mismatches. Embed API for reports. So you can build custom dashboards outside of GA quickly. Share tools across GA accounts. Now you can share filters, channel groupings, annotations etc. easily between views and properties. Tag Assistant Chrome plugin. Easily spot common setup problems on your pages using the Tag Assistant. Built-in user tracking. See our customer tracking guide for the pros and cons. Import historic campaign cost and CRM data (Premium only). Previously, imported data would only show up for events added after the data import. Now you can enter a ‘Query Time’ to apply it to past events, but only for Premium users. Get unsampled API data (Premium only – developers). Export all your historic data without restrictions. Better Management API (for developers). Set up filters, Adwords links and user access programmatically across many accounts. Useful for large companies or agencies with hundreds of web properties.

2014-07-21

Pulling Google Analytics into Google Docs - automated template driven reporting

The Google Docs library for the Analytics API provides a great tool for managing complex or repetitive reporting requirements, but it can be tricky to use. It would be great if it were as simple as dropping a spreadsheet formula on a page, but Google’s library stops a few steps short of that – it needs some script around it. This sheet closes that gap, providing a framework for template-driven analytics reports in Google Docs. With it you can set up a report template, and click a menu to populate it with your analytics results and run your calculations. The code is there if you want to build on it, but you can get useful reports without writing a line of script. Prerequisites While you don't have to write code to use this, there are some technical requirements. To get the most out of it you'll need: your Google Analytics tagging and views set up; familiarity with Google’s reporting API; and familiarity with Google Docs spreadsheets – some knowledge of Google Apps Script is an advantage. If you are looking for something more user-friendly or tailored to your needs, contact us and book a consultation to discuss – we can help with your analytics setup and bespoke reporting solutions. Getting started Setting this up takes a few steps, but you only need to do it once: open the shared Google spreadsheet; make a copy; enter a view ID in the settings sheet (get this from the Google Analytics admin page); authorise the script; then authorise the API in the API console – this is the only time you need to go into the script view using Tools | Script editor. Once in the script editor, select Resources | Advanced Google services; at the bottom of the Advanced Google services dialogue is a link to the Google Developers Console – follow this and ensure that the Google Analytics API is set to On. You're done. You can go back to the spreadsheet and run the report (on the Analytics menu). From now on all you need to do is tweak any settings on the template and run the report. Setting up your own report template You can explore how the template works using the example. Anywhere you want to retrieve value(s) from Google Analytics, place this spreadsheet function on the template: = templateShowMetric(profile, metric, startdate, enddate, dimensions, segment, filters, sort, maxresults) This works as a custom spreadsheet function, for example =templateShowMetric(Settings!$B$2,$B7,Settings!$B$3,Settings!$B$4,$C7,$D7,$E7,$F7,$G7) Note that in the example several of the references are to the settings sheet, but they don't have to be: you can use any cell or literal value in the formula – it's just a spreadsheet function. To get the values for the API query, I'd suggest using Google’s Query Explorer. To set this up for a weekly report, say, you would have all the queries reference a single pair of cells with start and end dates. Each week you would change the date cells and run the report again – all queries will be run exactly as before, but for the new dates. Using spreadsheet references for query parameters is key. This opens up the use of relative and absolute references – for example, if you need to run the same query against 50 segments, you list your segments down a column, set up the segment as a relative reference, and copy the formula down, spreadsheet style. You can also use this to do calculations on the sheet and use the results in the Analytics API – for example, you might calculate start and end dates relative to the current date. Future posts will cover setting up templates in more detail. 
Under the hood The templateShowMetric function generates a JSON string. When you trigger the script, the report generator copies everything on the template to the report sheet and then: runs any analytics queries specified by a templateShowMetric function; and removes any formulas that reference the settings sheet (so you can use the settings sheet to pass values to the template, but your reports are not dependent on the settings staying the same).
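As a rough illustration of that pattern – not the actual code in the shared spreadsheet – the custom function can only package the query as JSON (custom functions cannot call services that need authorisation), while a separate, menu-driven function is free to call the Google Analytics advanced service and fetch the data. The runQuery name below is just an illustrative helper, and the profile is expected in the 'ga:VIEWID' form:

    // Custom function: packages the query definition as a JSON string
    function templateShowMetric(profile, metric, startdate, enddate,
                                dimensions, segment, filters, sort, maxresults) {
      return JSON.stringify({
        profile: profile, metric: metric, start: startdate, end: enddate,
        dimensions: dimensions, segment: segment, filters: filters,
        sort: sort, max: maxresults
      });
    }

    // Menu-driven function: runs one packaged query via the Analytics
    // advanced service (enabled under Resources | Advanced Google services)
    function runQuery(queryJson) {
      var q = JSON.parse(queryJson);
      var options = {};
      if (q.dimensions) options.dimensions = q.dimensions;
      if (q.segment) options.segment = q.segment;
      if (q.filters) options.filters = q.filters;
      if (q.sort) options.sort = q.sort;
      if (q.max) options['max-results'] = q.max;
      var result = Analytics.Data.Ga.get(q.profile, q.start, q.end, q.metric, options);
      return result.rows || [];  // each row is [dimension..., metric] values
    }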

2014-04-21

Analytics showing wrong numbers for yesterday's visits

We've noticed a few issues with clients using Universal Analytics this last month, where visits for the last day have been double the normal trend. It then corrects itself a few hours later – so it seems to be just a blip with the data processing at Google. Others have noticed the same problem. The temporary fix is to only generate reports with time series ending the day before yesterday – i.e. ignore yesterday's data. Now Google have officially acknowledged the problem. Looking forward to seeing that one fixed!

2014-04-15

Measuring screen resolution versus viewport size

There’s a difference between the ‘screen size’ measured as standard in Google Analytics and the ‘browser size’ or ‘browser viewport’, and there are pitfalls in comparing the two, especially on mobile devices. The browser viewport is the actual visible area of the HTML, after the width of scroll bars and the height of button, address, plugin and status bars has been allowed for. Desktop computer screens have got much bigger over the last decade, but browser viewports (the visible area within the browser window) have not kept pace. The CSS-Tricks site found that only 1% of users have their browser window at full screen size. And while only 9% of visitors to that site had a monitor less than 1200px wide in 2011, around 21% of users had a browser viewport of less than that width. Simply put, on a huge monitor you don’t browse the web using your full screen. Therefore ‘screen resolution’ may be much larger than ‘viewport size’. The best solution is to post the browser viewport size to GA as a custom dimension (see the sketch below). P.S. Google Analytics does have a feature within In-Page Analytics (under the Behaviour section) to overlay Browser Size, but it doesn’t work for any of the sites I look at.
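A minimal sketch of that approach, assuming Universal Analytics (analytics.js) and that a custom dimension has already been created in the GA admin – the dimension1 index and the UA property ID are placeholders:

    // Measure the viewport (not the screen) and attach it to the pageview
    var viewportWidth = window.innerWidth || document.documentElement.clientWidth;
    var viewportHeight = window.innerHeight || document.documentElement.clientHeight;

    ga('create', 'UA-XXXXXXX-1', 'auto');
    ga('set', 'dimension1', viewportWidth + 'x' + viewportHeight);
    ga('send', 'pageview');

You can then report on this dimension alongside the standard screen resolution to see how far the two differ across devices.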

2014-04-14

How many websites use Google Analytics?

Google Analytics is clearly the number one web analytics tool globally. From a meta-analysis of different surveys, we estimate it is currently installed on over 50% of all websites, or 80% of the operational websites that use any kind of analytics tracking. We looked at the following sources for this estimate: Datanyze survey of Alexa top 1m sites (04/2014); BuiltWith survey of all websites (04/2014); MetricMail survey of Alexa top 1m sites; Pingdom survey of Alexa top 10k sites (07/2012); W3Techs survey of their own sites (04/2014); LeadLedger survey of Fortune 500 sites (04/2014).

2014-04-10

What's included in Analytics traffic sources?

The Channel report in Google Analytics (under the 'Acquisition' section) splits visits out into 6 or more channels. Direct: where a visitor has typed the URL into the address bar, clicked on a link which is NOT in another web page (e.g. in a mobile app), or visited a bookmarked link. Organic Search: all visits from search engines (i.e. Google, Bing, Yahoo) which were not an advertisement. You used to be able to filter out people searching for your brand (which are more like Direct visits), but now the search terms are not provided. Paid Search: visits from search engines where the visitor clicked on an advert. Referral: where a visitor has clicked on a link in another website (not your own domain), but not including search engines or social networks. Social Networks: specifically links from known social network websites (including Facebook, Twitter, LinkedIn etc). Email: from links tagged with medium = 'email' – your email software needs to be configured correctly to add this tag (see the example link at the end of this article). Display: links tagged as 'display' or 'cpm'. FAQs Can I change the channel groupings? Yes, you can change this under Admin > (Selected View) > Channel Grouping. But we recommend you don't do this for your default view, as you won't be able to compare the historical data.
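For example, a link in a newsletter might be tagged like this (the URL and campaign name are placeholders), so that Google Analytics can attribute the visit to the Email channel:

    <a href="https://www.example.com/offers?utm_source=newsletter&utm_medium=email&utm_campaign=march-sale">
      See this month's offers
    </a>

It is the utm_medium=email parameter that the default channel grouping uses to classify the visit as Email.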

2014-03-30

Complete picture of your ecommerce business

From marketing channels to buying behaviour, Littledata is the ultimate Google Analytics toolbox.
