The Freemium business model revisited
After I concluded that freemium is not the best business model for all, the continued rise of ‘free’ software has led me to revisit the question. In a fascinating piece of research by Price Intelligently, over 10,000 technology executives were surveyed over five years. Their willingness to pay for core features of B2B software has declined from 100% in 2013 to just over 50% today, as a whole wave of VC-funded SaaS companies has flooded the market with free product. For add-ons like analytics, fewer than 30% are willing to pay.

“The relative value of features is declining. All software is going to $0” – Patrick Campbell, Price Intelligently

Patrick sees this as an extension of the trend in physical products, where offshoring, global scale and cheaper online routes to market have led to relentless price depreciation (in real terms). I’m not so sure. Software is not free to manufacture, although the marginal cost is close to zero, since cloud hosting is so cheap. The fixed cost is the people-time to design and build the components, and the opportunities for lowering that cost – through offshoring the work or more productive software frameworks – have already been exploited by most SaaS companies.

To pile on the pain, a survey of software executives also found that the average number of competitors in any given niche has increased from 10 to 15 over three years. Even if software build costs are falling, those costs are being spread over a smaller number of customers per competitor – making the chance of breaking even lower. And the other big cost – Customer Acquisition Cost (CAC) – is actually rising with the volume of competition.

To sum up the depressing news so far:
1. Buyers have been conditioned to expect free software, which means you’ll have to give major features away for free
2. But you’ll have to pay more to acquire these non-paying users
3. And next year another competitor will be offering even more for free

What is the route out of this economic hole?
Focusing on monetizing a few existing customers, for one. Most SaaS executives were focused on acquiring new customers (more logos), probably because with a free product they expected to sweep up the market and worry about monetization later. But this turns out to be the least effective route to building revenue. Price Intelligently calculated how much each 1% increment would increase revenue – i.e. if I signed up 101 users over the year, rather than 100, that would increase revenue by 2.3%.

Monetization – increasing the Average Revenue Per User (ARPU) – has by far the larger impact, mainly because many customers currently pay nothing. In contrast, the impact of customer acquisition has fallen over three years, since the average customer is less likely to pay. Monetization is not about increasing prices for everyone – or charging for previously free features – but rather finding the small number who are willing to pay, and charging them appropriately.

My company, Littledata, has many parallels with Profit Well (launched by Price Intelligently). We both offer analytics and insights on top of existing customer data – Littledata for Google Analytics behavioural data, and Profit Well for recurring revenue data from billing systems. And we have both had similar customer feedback: the perceived value of the reporting is low, but the perceived value of the changes which the reporting spurs (better customer acquisition, increased retention etc.) is high. So the value of our software is that it creates a requirement – which can then be filled by consulting work or ‘actionable’ modules.

For myself, I can say that while focusing on new customer acquisition has been depressing, we have grown revenues once a trusted relationship is in place – and the customer really believes in Littledata’s reporting. For Littledata, as with many B2B software companies, we are increasingly content that 80% of our revenue comes from a tiny handful of loyal and satisfied users.
In conclusion, while the cover price of software subscriptions is heading to zero, it is still possible to generate profits as a niche SaaS business – if you understand the necessity of charging more to a few customers when the many are unwilling to pay. Freemium may be here to stay, but if customers want the software companies they rely on to stay, they need to pay for the benefits. Would you like to discuss further? Comment below or get in touch!
Shine a light on ‘dark’ Facebook traffic
If Facebook is a major channel for your marketing, whether via sponsored posts or organic ones, then you’re underestimating the visits and sales it brings. The problem is that Facebook doesn’t play nicely with Google Analytics, so some of the traffic from the Facebook mobile app appears as a DIRECT visit. That’s right – if a Facebook user clicks on your post in the native mobile app, they won’t always appear as a Facebook social referral. This is ‘dark Facebook’ traffic: it is from Facebook, but you just can’t see it.

Since around 40% of Facebook activity is on a mobile app, the Facebook traffic you see could be up to 40% less than the total. Facebook hasn’t shown much interest in fixing the issue (Twitter fixed it, so it is possible), so you need to fix it in your own Google Analytics account. Here are three approaches:

1. Basic: use campaign tagging

The simplest fix, for your own posts or sponsored links on Facebook, is to attach UTM campaign tags to every link. Google provides a simple URL builder to help. The essential tags to add are “utm_source=facebook.com” and “utm_medium=referral”. This will override the ‘direct’ channel and put all clicks on that link into the Facebook referral bucket. Beyond that, you can add useful tags like “utm_campaign=events_page” so you can see how many users click through from your Facebook events specifically.

2. Moderate: use a custom segment to see the traffic

What if much of your traffic is from enthusiastic brand advocates, sharing your pages or articles with their friends? You can’t expect them all to use a URL builder. But you can make a simple assumption: most users on a mobile device are not going to type a long URL into their browser address bar. So if a user comes from a mobile device, and isn’t visiting your homepage (or a short URL you deliberately post), then they are probably coming from a mobile app.
If your website is consumer-facing, then the high probability is that the mobile app is Facebook. So we can create a custom segment in GA for traffic which (a) comes from a mobile device, (b) does not have a referrer or campaign (i.e. direct) and (c) does not land on the homepage.

To start, you need to create a segment where the source contains ‘facebook’. Then add the ‘Direct mobile, not to homepage’ segment. Next, you can create a custom report to show sessions by hour. You should see a strong correlation; on the two web properties I tested, this doubled the traffic I had attributed to Facebook.

3. Advanced: attribute micro spikes to Facebook

Caveat: you’ll need a large volume of traffic – in excess of 100 visits from Facebook a day – to try this at home.

The final trick has been proven to work at The Guardian newspaper for Facebook traffic to news articles. Most Facebook activity is very transitory – active users click on a trending newsfeed item, but interest quickly fades. So, using the Google Analytics API, you could look for the ‘micro spikes’ in referrals from Facebook on a minute-by-minute basis, then look at the direct mobile visits which came at the same time, and add those direct spikes to the total Facebook traffic. I’ve played around with this and it’s difficult to get right, due to the sampling Google applies, but I did manage to spot spikes over around 5 minutes that had a strong correlation with the underlying direct mobile traffic.

Could these approaches work for your site? I’m interested to hear.

(Chart: Dark Social Dominates Online Sharing | Statista)

Get Social! Follow us on LinkedIn, Twitter, and Facebook and keep up-to-date with our Google Analytics insights.
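To sketch the micro-spike idea: find minutes where Facebook referrals jump well above their normal level, and credit the simultaneous excess in direct mobile visits to Facebook. This is a minimal illustration with made-up minute-by-minute counts – the median baseline and the ×2 spike threshold are my assumptions, not The Guardian’s actual method:

```python
from statistics import median

def attribute_spikes(fb_referrals, direct_mobile, threshold=2.0):
    """Estimate 'dark Facebook' visits: in minutes where Facebook referrals
    spike above threshold * their median, credit the excess direct-mobile
    visits (above their own median) to Facebook."""
    fb_baseline = median(fb_referrals)
    dm_baseline = median(direct_mobile)
    dark_visits = 0
    for fb, dm in zip(fb_referrals, direct_mobile):
        if fb > threshold * fb_baseline and dm > dm_baseline:
            dark_visits += dm - dm_baseline
    return dark_visits

# Invented minute-by-minute counts, with a spike in minutes 4-5
fb = [5, 4, 6, 20, 18, 5, 4]
dm = [10, 11, 9, 30, 28, 10, 11]
print(attribute_spikes(fb, dm))  # 36 direct visits re-attributed to Facebook
```

In practice you would feed in real minute-level series from the Google Analytics API, and sampling noise makes the threshold choice much harder than this toy suggests.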
WWII Codebreaking and Interpretation
Reading Max Hastings’ excellent book The Secret War, 1939-1945, I was struck by the parallel between the rise of radio communications in the 1930s and the more recent rise in internet data. The transmission of military and diplomatic messages by radio in the 1930s and 1940s provided intelligence agencies with a new gold mine. Never before had so much potential intelligence been floating in the ether, and yet it threatened to flood their limited manpower with a tide of trivia. The bottleneck was rarely in the interception (trivial with a radio set) or even the decryption (made routine by Bletchley Park with the Enigma codes), but rather in filtering down to the tiny number of messages that contained important facts – and getting that information in real time to the commanders in the field.

The Ultra programme (Britain’s decryption of German radio intercepts) was perennially understaffed, because other civil servants couldn’t be told how important it was. At Ultra’s peak in 1943, only around 50% of the 1,500 Luftwaffe messages a day were being processed – and it is unknown how many of those were in time to avert bombing raids.

“The new age of technology provided an almost infinitely wide field for exploration, as well as the means of addressing this: the trick was to focus attention where it mattered.” – The Secret War, page 203

The ‘new age of technology’ in the last two decades poses much the same problem. Data on internet behaviour is abundant: there are countless signals to listen to about your website performance, and the technology to monitor users is commonplace. And the bottleneck is still the same: filtering the useful signals, and getting those insights to the ‘commanders’ who need them in real time. I started Littledata to solve this modern problem of interpreting website analytics for managers of online businesses.
There is no decryption involved, but there is a lot of statistics and data visualisation know-how in making billions of data points comprehensible to a company manager. Perhaps the most important aspect of our service is providing insights in answer to a specific question:

“Group-Captain Peter Stewart, who ran the Royal Air Force’s photo-reconnaissance operations, was exasperated by a senior officer who asked for ‘all available information’ on one European country. Stewart responded that he could only provide useful information if he knew roughly what intelligence the suppliant wanted – ‘naval, military, air or ecclesiastical’.” – The Secret War, page 203

In the world of online commerce, the question is something like whether the client needs insights into the checkout conversion rate of all customers (to improve site design) or for a specific marketing campaign (to improve campaign targeting). By focusing on insights which are relevant to the scale, stage or sector of the client company, and making these accessible in a real-time dashboard, Littledata can feed into decision making in a way that raw data never can.

Want to discuss this further? Get in touch or comment below!
Don’t obsess over your homepage – its importance will decrease over time
Many businesses spend a disproportionate amount of time tweaking the copy, design and interactive content of their homepage. Yet they miss the fact that the action is increasingly elsewhere.

Homepage traffic has traditionally been seen as a proxy for ‘brand’ searches – especially when the actual search terms driving traffic are ‘not provided’. Now, brand search traffic may be finding other landing pages directly. Our hypothesis was that over the last two years, the proportion of visits which start at the homepage of the average website has been decreasing. To test this, we looked at two categories of websites in Littledata’s website benchmarks:

1. Websites with more than 20,000 monthly visits and more than 60% organic traffic (227 websites)
2. Large websites with more than 500,000 monthly visits (165 websites)

In both categories, we found that the proportion of visits which landed on the homepage was decreasing: by 8% annually for the smaller sites (from 16% of total visits to 13% over two years), and 7% annually for the larger sites (from 13% to 11%). If we ignore the slight rise in homepage traffic over the November/December period (presumably caused by more brand searches in the Christmas buying season), the annual decline is more than 10%. Of the larger websites, only 20% showed any proportionate increase in homepage traffic over the two years – and those were mainly websites that were growing rapidly, with an increasing brand.

I think there are three different effects going on here:

1. The increased sophistication of Google search usage is leading to more long-tail keywords, where users want a very specific answer to a question – usually not given on your homepage.
2. The increase in mobile browsing, combined with the frustrations of mobile navigation, is leading more users to use search over navigation – and bypass your homepage.
3. Google’s search-engine results page (SERP) changes have made it less likely that brand searches (searching for your company or product names) will navigate to your homepage – users instead browse social profiles, news, videos or even local listings for your company.

In conclusion, it seems that for many businesses the homepage is increasingly irrelevant to the online marketing effort. Spend some time on your other content-rich, keyword-laden landing pages instead!

Would you like to see if you are overly reliant on your homepage traffic, compared with similar websites? Try Littledata’s reporting suite.
Online reporting: turning information into knowledge
Websites and apps typically gather a huge flow of user behaviour data, from tools such as Google Analytics and Adobe Analytics, with which to better target their marketing and product development. The company assumes that either:

1. Having a smart web analyst or online marketer skim through the reports daily will enable management to keep tabs on what is going well and what is not, or
2. Recruiting a ‘data science’ team, and giving them access to the raw user event data, will surface one-off insights into what types of customers can be targeted with which promotions.

Having worked in a dozen such companies, I think both assumptions are flawed. Humans are not good at spotting interesting trends, yet for all but the highest-scale web businesses, the problem is not really a ‘big data’ challenge. For a mid-sized business, the problem is best framed as: how do you extract regular, easy-to-absorb knowledge from an incomplete online behavioural data set, and how do you present and visualise the insight in such a way that digital managers can act on it?

Littledata is meeting the challenge by building software to allow digital managers to step up the DIKW pyramid. The DIKW theory holds that there are four levels of content the human mind can comprehend:

1. Data: the raw inputs; e.g. the individual signals that user A clicked on button B at a certain time when visiting from a certain IP address
2. Information: provides answers to “who”, “what”, “where” and “when” questions
3. Knowledge: the selection and synthesis of information to answer “how” questions
4. Wisdom: the extrapolation or interpretation of this knowledge to answer “why” questions

Information is what Google Analytics excels at: providing an endless variety of charts and tables to query the individual events en masse. Yet in the traditional company process, it takes a human analyst sifting through those reports to spot problems or trends and yield genuine knowledge.
And this role requires a huge tolerance for processing boring, insignificant data – and massive analytical rigour to spot the few, often tiny, changes. Guess what? Computers are much better at the information-processing part when given the right questions to ask – questions which are pretty standard in the web analytics domain. So Littledata is extending the machine capability up the pyramid, allowing human analysts to focus on wisdom and creativity – which artificial intelligence is still far from replicating.

In the case of some simpler insights, such as bounce rates for email traffic, our existing software is already capable of reporting back a plain-English fact. Here’s the ‘information’ as presented by Google Analytics (GA), and here is the one statistically significant result you might draw from that information.

Yet for more subtle or diverse changes, we need to generate new ways to visualise the information to make it actionable. Here are two examples of charts in GA which are notoriously difficult to interpret. Both are trying to answer interesting questions:

1. How do users typically flow through my website?
2. How does my marketing channel mix contribute to purchasing?

Neither yields an answer to the “how” question easily! Beyond that, we think there is huge scope to link business strategy more closely to web analytics. A visualisation which could combine a business’s sales targets with the current web conversion data, and with benchmarks of how users on similar sites behave, would give managers real-time feedback on how likely they are to outperform. That all adds up to greater value than even the best data scientist in the world could bring.

Have any questions? Comment below or get in touch with our team of experts! Want easier-to-understand reports? Sign up!
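As an illustration of the kind of “plain-English fact” a machine can report, a shift in bounce rate between two periods can be tested with a standard two-proportion z-test. This is a minimal sketch, with made-up session counts (not Littledata’s actual algorithm); the 1.96 cut-off corresponds to 95% confidence:

```python
from math import sqrt

def bounce_rate_shift(bounces_a, sessions_a, bounces_b, sessions_b):
    """Two-proportion z-test: is the bounce rate in period B significantly
    different from period A? Returns (z-score, significant at 95%)."""
    p_a = bounces_a / sessions_a
    p_b = bounces_b / sessions_b
    pooled = (bounces_a + bounces_b) / (sessions_a + sessions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sessions_a + 1 / sessions_b))
    z = (p_b - p_a) / se
    return z, abs(z) > 1.96

# Email traffic: 400 bounces from 1,000 sessions last month,
# versus 480 bounces from 1,000 sessions this month
z, significant = bounce_rate_shift(400, 1000, 480, 1000)
print(f"z = {z:.2f}, significant: {significant}")  # z = 3.60, significant: True
```

A tool running this check across every channel and metric never gets bored – which is exactly the point about machines handling the information layer.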
Why do I need Google Analytics with Shopify?
If the lack of consistency between Shopify’s dashboards and the audience numbers in Google Analytics is confusing, you might conclude that it’s safer to trust Shopify. There is a problem with the reliability of transaction volumes in Google Analytics (something which can be fixed with Littledata’s app) – but using Shopify’s reports alone to guide your marketing ignores the power that has led Google Analytics to be used by over 80% of large retailers.

Last-click attribution

Let’s imagine your shoe store runs a Google AdWords campaign for ‘blue suede shoes’. Shopify lets you see how many visits or sales were attributed to that particular campaign, by looking at the ‘blue suede shoes’ UTM tag. However, this only captures visitors who clicked on the advert and purchased the product in the same web session. So if a visitor went off to check prices elsewhere, or was just researching the product options, and came back a few hours later to buy, they won’t be attributed to that campaign. The campaign reports in Shopify are all-or-nothing: the campaign or channel sending the ‘last click’ is credited with 100% of the sale, and any previous campaigns the same customer saw are given nothing.

Multi-channel attribution

Google Analytics, by contrast, offers multi-channel attribution. You can choose an ‘attribution model’ (such as giving all campaigns before a purchase equal credit) and see how much one campaign contributed to overall sales. Most online marketing can now be divided into ‘prospecting’ and ‘retargeting’: the former introduces the brand to a new audience, and the latter deliberately retargets ads at an engaged audience. Prospecting ads – and Google AdWords or Facebook Ads are often used that way – will usually not be the last click, and so will be under-rated in the standard Shopify reports.

So why not just use the analytics reports directly in Google AdWords, Facebook Business, Twitter Ads etc.?
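To make the last-click versus multi-channel contrast concrete, here is a toy comparison of last-click and linear (equal-credit) attribution, with an invented purchase path – not how either Shopify or Google Analytics computes its reports internally:

```python
from collections import defaultdict

def attribute(paths, model="last_click"):
    """Credit each channel with revenue under a chosen attribution model.
    Each path is (list of channels touched in order, revenue)."""
    credit = defaultdict(float)
    for channels, revenue in paths:
        if model == "last_click":
            # All credit to the final touchpoint (the Shopify-style view)
            credit[channels[-1]] += revenue
        elif model == "linear":
            # Equal credit to every touchpoint before the purchase
            for channel in channels:
                credit[channel] += revenue / len(channels)
    return dict(credit)

# One customer saw an AdWords ad, then a retargeting ad, then bought direct
paths = [(["adwords", "facebook_retargeting", "direct"], 90.0)]
print(attribute(paths, "last_click"))  # {'direct': 90.0}
print(attribute(paths, "linear"))      # each channel credited 30.0
```

Under last-click, the prospecting AdWords campaign that started the journey gets nothing – exactly the under-rating described above.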
Consistent comparison

The problem is that all these different tools (and especially Facebook) have different ways of attributing sales to their platform – usually being as generous as possible to their own advertising platform. You need a single view, where you can compare the contribution of each traffic source – including organic search, marketing emails and referrals from other sites – in a consistent way. Unfortunately, Google Analytics needs some special setup to do that for Shopify. For example, if the customer is redirected via a payment gateway or a 3D Secure page before completing the transaction, then the sale will be attributed to a ‘referral’ from the bank – not the original campaign.

Return on Advertising Spend (ROAS)

Once you iron out the marketing attribution glitches using our app, you can make meaningful decisions about whether a particular form of marketing is driving more revenue than it is costing you – whether there is a positive Return on Advertising Spend. The advertising cost is automatically imported when you link AdWords to Google Analytics, but for other sources you will need to upload cost data manually or use a tool like funnel.io. Then Google Analytics uniquely allows you to decide whether a particular campaign is bringing in more revenue than it is costing and, on a relative basis, which are the best channels for your budget.

Conclusion

Shopify’s dashboards give you a simple daily overview of sales and products sold, but if you are spending more than a few hundred dollars a month on online advertising – or investing in SEO tactics – you need a more sophisticated way to measure success.

Want more information on how we can help improve your Shopify analytics? Get in touch with our experts! Interested in starting a free trial? Sign up!
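ROAS itself is simply attributed revenue divided by advertising spend. A quick sketch with invented channel figures:

```python
def roas(revenue, cost):
    """Return on Advertising Spend: revenue attributed to a channel per
    unit of spend. Above 1.0 the channel pays for itself."""
    return revenue / cost

# Invented monthly figures per channel: (attributed revenue, ad spend)
channels = {"adwords": (5200.0, 1300.0), "facebook": (1800.0, 2000.0)}
for name, (revenue, cost) in channels.items():
    print(f"{name}: ROAS = {roas(revenue, cost):.2f}")
# adwords: ROAS = 4.00   (profitable)
# facebook: ROAS = 0.90  (losing money on this channel)
```

The hard part is not the division but the numerator: it only means something once the attribution glitches above are fixed, so the same revenue isn’t claimed by three platforms at once.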