Do you need to process customer data in-house to be truly data secure?
Many brands with large customer bases are facing a similar question when it comes to storing data—is it time to bring all data processing in-house? Whether this is prompted by a data security audit, a data breach, or a desire to be more agile with data analysis, it's an important question that thankfully doesn't have a complicated answer. In this article, I’ll explore whether you should outsource or insource customer data processing for your brand.

Quick side note—for Littledata’s direct-to-consumer (DTC) brands, customer data is usually first-party data captured as part of the ecommerce checkout process, including post-purchase interactions with the customer and web browsing information such as IP addresses.

Why you need first-party data to be secure

First-party customer data is data the customer shares with you directly through the server connecting them to your website. By its very nature, first-party data is created by a contract—and more importantly, a bond of trust—between your brand and the end customer.

Accidentally leaking that data is brand-damaging: 46% of organizations surveyed by Forbes suffered reputational damage after a data breach. In addition, GDPR and similar regulations impose large fines (up to 4% of global revenue) for data breaches—specifically, for lax processes leading to a data breach.

You might also be concerned about commercial espionage—how valuable could your customer purchase history be in the hands of a competitor or a fraudster? Or maybe your company has been burned by third-party data processors in the past whose security standards did not meet your own.

Taking these concerns together, you may be thinking the only way to be truly data secure is to process and store first-party customer data on your own infrastructure. But there are downsides to this.

Do you want to own your own data infrastructure?

By data infrastructure, I don’t mean owning bare-metal servers that sit in the broom cupboard behind your office.
I’ll assume you are comfortable with the concept of hosting data in a public or private cloud environment. However, even maintaining that cloud computing infrastructure brings costs and risks. Your company will be responsible for software patches, updates to use the latest API versions, monitoring for suspicious activity, and handling outages. Data engineering is complex, and great data engineers are in short supply. So, I suggest you are better off licensing a secure data pipeline than building it all yourself.

Does your company control the data end-to-end?

Frankly, processing company data in-house may be missing the point if you do not control the data processing end-to-end. Many of Littledata’s customers have made a deliberate choice by working with Shopify or BigCommerce to leave purchase and transaction processing to a cloud provider—signing data processing agreements (see DPAs for Shopify and BigCommerce) to store customer data on US cloud servers. Many brands also choose to share customer data with Google (pseudo-anonymized) or with Facebook (not anonymized) to improve their customer acquisition and Return on Advertising Spend (ROAS).

In effect, these brands are outsourcing the data processing that happens between the ecommerce cloud and the marketing cloud to Littledata. Trying to do this processing in-house makes little sense when the start and end of the data processing chain are third parties.

Does EU customer data need to stay in the EU to be secure?

You may have read about regional courts in France and Austria ruling against sending EU customer data to Google Analytics—or indeed sending data to any US server. I think these rulings are extreme and will eventually be struck down. There is no practical or legal reason why data processing on servers within the EU is somehow more GDPR compliant than hosting on the cloud in the US. That said, data nationalism as a trend is here to stay, so there may be a future need to keep EU data siloed.
All cloud computing networks have EU servers, and tools like Segment make it possible to split EU customer data processing onto EU servers. The limitation is that right now, none of our other partners (especially Shopify, Google, and Facebook) have the same ability to process in the EU. This makes regionalizing only one part of the data processing chain pointless.

Is outsourced data GDPR compliant?

Yes, you can subcontract data processing to a third party. But to be GDPR compliant, your data processors need to enable the right to rectification, the right to erasure, and the right to restrict processing. All the main partners that Littledata works with (Shopify, Google Analytics, Facebook Ads, etc.) have API endpoints by which your customer can request their data to be updated or erased, and this request can be passed on to the downstream processors. If a customer requests to restrict processing (e.g. opting out of advertising retargeting via a cookie consent banner), your company also needs to pass that choice along to the downstream processors. Littledata’s tracking script makes that easy to do via integration with Shopify’s consent management, plus plugins for OneTrust and TrustArc.

Can you control outsourced data processing?

Yes. Doing so is just a matter of working with a processing partner that a) is transparent about how they process the data, b) follows good practices in data security, and c) provides Service Level Agreements (SLAs) for the processing. At Littledata, we are clear about how we process customer data (and exactly which data points are stored where), have a public data security policy, and provide tight processing SLAs for Plus customers.

[tip]Learn more about how Littledata protects your data while giving you 100% accurate analytics by booking a demo with one of our experts.[/tip]

Conclusion

I believe you can outsource data processing and still be truly data secure.
In fact, I believe trying to bring data fully in-house is costly and pointless for most cloud ecommerce brands. Pick trusted partners to ensure your customer data processing is both super reliable and super secure, and get on with scaling your business!
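A footnote for developers: the erasure hand-off described in the GDPR section above can be sketched in code. Shopify, for instance, notifies apps of GDPR requests via signed webhooks. The sketch below is a minimal illustration, not Littledata's actual implementation: the handler and the returned action strings are hypothetical, while the webhook topics and HMAC verification scheme follow Shopify's documented webhook format.

```python
import base64
import hashlib
import hmac

def verify_shopify_webhook(body: bytes, hmac_header: str, secret: str) -> bool:
    """Shopify signs each webhook with a base64-encoded HMAC-SHA256 of the raw body."""
    digest = hmac.new(secret.encode(), body, hashlib.sha256).digest()
    return hmac.compare_digest(base64.b64encode(digest).decode(), hmac_header)

def handle_gdpr_topic(topic: str, payload: dict) -> list[str]:
    """Map Shopify's mandatory GDPR webhook topics to downstream actions.

    The returned action strings are illustrative placeholders; in practice
    each would be a call to the relevant downstream processor's own
    deletion or export endpoint."""
    customer_id = payload.get("customer", {}).get("id")
    if topic == "customers/redact":        # right to erasure
        return [f"erase-customer:{customer_id}"]
    if topic == "customers/data_request":  # right of access
        return [f"export-customer-data:{customer_id}"]
    if topic == "shop/redact":             # store uninstalled the app
        return ["erase-shop-data"]
    return []
```

The key design point is that the webhook handler only fans the request out; each downstream processor (Google, Facebook, etc.) still has to honour it through its own API.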
Lunch with Littledata: Jumping into GA4 with Google Analytics Expert Krista Seiden
The rise of Google Analytics 4, the newest version of the world’s most popular analytics service, is predictably a very big deal in the world of data. As we move full steam ahead toward a cookie-less future and leave third-party data behind, Google has revamped its Analytics service to give users both a new look and new tools to check on the health of their businesses. Changes as big as this, though, always come with a learning curve. That’s when it helps to have an expert who can smooth the transition.

In this edition of Lunch with Littledata, I spoke with KS Digital founder and former Google Analytics Evangelist at Google, Krista Seiden, about what GA users can expect from GA4, which reports come out of the box and which require more effort to build, and what to do to set yourself up for success starting today.

[tip]Not sure what GA4 has in store? See our top 10 reasons to make the switch.[/tip]

Edward from Littledata: You're obviously a well-known evangelist for Google Analytics (GA). Could you tell me a bit about how you got into the world of analytics?

Krista Seiden: I like to consider my journey into analytics a bit of an accident (laughs). I was working at Adobe Systems way back in 2009 when my happy accident started. One of my responsibilities as a business analyst was to put together a monthly dashboard for the CEO, which included about 30 different metrics from around the business unit. I had to email probably 30 different people every month to get these metrics and put them together. It was very old school.

And I realized that probably about half of the metrics I was collecting every month actually came from Omniture, which was their analytics solution at the time. I thought, “Well, this is silly. Why am I emailing all these people?” So I went to the web analytics team and said, “Hey, just teach me how to do this.” I spent some time with them and I learned how to pull the data myself. That was really helpful.
Then Adobe bought Omniture, and all of a sudden all of that training that they had—which is generally really expensive—was available for free. So I thought, “Sweet. I'm going to learn all of this in more detail so I can be more useful in my current job.” And then as time went on, my job evolved and they asked me to just take on web analytics full time. So it was, I like to say, a happy accident, because it kind of evolved into that new position, but it also just sparked from my interest in trying to make things more efficient and not have to bother everyone.

So I spent some time at Adobe doing analytics, then I went to the Apollo Group and did analytics there using the free version of GA on a site that had millions and millions of users. This was predating even GA Premium, so it was awful sampling. It was a horrible experience. I had to figure out all sorts of hacks and ways to try to make the data more usable. Just as I was onboarding Omniture there, I was tapped by Google to come run analytics and optimization for what's now the Google Cloud group, what was then the Google Apps group.

Edward: Speaking of GA, I wanted to talk specifically about GA4, which is just launching. Now that its arrival has been officially announced and it's out of beta, do you think it’s ready for a high-scale brand to use as their primary analytics tool?

Krista: That's a good question. I think the answer is going to depend on who you ask. If you ask Google, they're going to say yes, it’s fully ready. If you ask somebody outside of Google, depending on their love or hate relationship with it, you will get a varying degree of answers. From where I stand, I think the answer is yes—but I think the answer is yes because Universal Analytics has a deprecation date (July 1, 2023). You don't really have a choice at this point; you need to start migrating to GA4.
For big companies especially, if you're going to need year-over-year data, you need to have GA4 set up and collecting data properly before the end of June 2022. That being said, there are features that are still missing, especially when it comes to ecommerce. We don't have item-scoped custom dimensions yet, which is definitely a big problem for a lot of big ecommerce clients. There are ways that you can use other available dimensions to kind of fill that gap for now. It’s not the best, certainly. There are other features that are missing. But there's also a long roadmap, and I'm pretty comfortable with where that roadmap is going in terms of the end product of what GA4 will eventually look like when a lot of that has rolled out. I think it's made a lot of progress in the last six months in particular, and it's a lot more ready now than it was not that long ago.

[tip]See 10 benefits you can get from making the move to GA4 now[/tip]

Edward: I do understand Google's dilemma: they want to sunset UA, but they simply can't launch everything now—there are a lot of features to build out. How are you advising brands to go about making the transition to GA4? Is it about double-tracking using UA and GA4 for now?

Krista: I think the narrative for the past year and a half has really been: let's dual-tag, get GA4 set up, start collecting historical data, and start getting used to it. My business, KS Digital, has stopped doing any sort of UA work. We actually stopped at the beginning of 2022, so we haven't taken on any new UA-specific clients since late last year. Our offerings now, when people come to us, are around getting them set up with a solution design and implementation for GA4. We’ll look at their UA data and maybe do a lightweight audit so that we at least have an understanding of what they're collecting, how they're doing it, and what we may be able to carry forward. But we're not really advising on UA anymore.
That being said, I do still think dual tagging is a good idea if you are a current UA user, and I will continue to recommend it all the way through the sunset. I think it's important to have that side by side, although I do also think it leads to a bit too much reliance on UA when people do need to start transitioning to GA4. So it's a little bit of a battle there, but I think it's important for data continuity.

Edward: Yes, because the data collection migration has got to happen first, but then people have got to move over the reporting dependence.

Krista: Yep.

Edward: What are the biggest unexpected challenges you've seen with established brands who are transitioning to GA4?

Krista: There are obviously some feature gaps, and those have been challenges. But I think the biggest challenge is really just the mindset—getting people used to a brand-new tool. GA4 looks and feels very different. You might log in and look at any of the reports that are out of the box and you see this very ugly scatterplot and you're like, “What am I supposed to do with this?”

I think a lot of people don't fully realize what they can do with GA4. So, for example, you can completely customize the UI. You can change the visuals. You can add or remove reports that are important to you. You can organize them any way you want. You could never do that in UA—so you can really make GA4 your own. I think that's going to be really important to help people get more comfortable and want to move over. But I think the biggest hindrance is really just a lack of training, a lack of knowing what to do with the product, and just a bit of fear over that unknown.

Edward: It’s deceptive because the UI looks very similar at first glance. But then when you start digging, you realize there’s a lot of stuff that’s very different.

Krista: (laughs) Yeah.

Edward: As you said earlier, obviously there are some feature gaps, particularly around ecommerce. A lot of the out-of-the-box ecommerce reports are missing.
For us, the most obvious gaps are around the shopping behavior funnel and checkout completion funnel. But they also exist around product-level analysis which, as you say, is blocked by the lack of item-scoped dimensions. Are you seeing brands able to replicate some of those using the Explorations module?

Krista: Yes. I have several large ecommerce clients that are working on GA4, and we have replicated a lot of those reports within Explorations. The nice thing about that is you actually get to be a lot more specific about what you want in those funnel reports. You can break them down, you can add multiple segments side by side. You can do things like showing the elapsed time between steps or making it an open or closed funnel. So I do think there are actually a lot of benefits to doing it that way, but it’s more work to set it up out of the box.

“The nice thing about (GA4’s Explorations feature) is you actually get to be a lot more specific about what you want in those funnel reports. You can break them down, you can add multiple segments side by side. There are actually a lot of benefits to (creating reports) that way, but it’s more work to set it up out of the box.”

And because of the way that the Explorations permissions work right now, it's very frustrating. You can't actually share access to a report. You can share the report, but then somebody has to make a copy of it and edit it to make it their own. You can't have a shared report that anybody can, for example, change the date on or add a segment to. I think that that's limiting, so I'm hopeful that those permissions will change and become more friendly over time.

Edward: Yeah, because the other thing that’s obviously lacking is any ability to share report templates. As ecommerce specialists, we have to build ecommerce template reports. Can you see Google opening up the template galleries to third parties? Or was their Custom Report Gallery not seen as a success?
Krista: I don't know that they didn't see the Custom Report Gallery as a success. I don't think it was really top of mind for them. I hope that there will be some sort of a template gallery for Explorations. I think that as more and more people move to GA4 and see that they have to do a lot more in Explorations, that demand will bubble up. I guess we'll see, but I'm hopeful that we will see something like that.

Edward: I think it would be a solution because, as you say, the problem is not that you can't build analysis reports. The problem is that it just takes some analytics knowledge to build the report.

Krista: And you can't do the same type of funnel visuals within Google Data Studio, for example, where you could ship that template, because it doesn't have the same processing due to how Data Studio gets that data from the API. So it's not easily replicable in a shareable fashion.

Edward: What about GA4’s connection with Google Ads? How do you think getting accurate data in GA4 helps brands make the most of Google Ads?

Krista: I think it's similar to how brands are utilizing Universal Analytics with Google Ads, right? It's that conversion data—so goals in UA or conversions in GA4. Then with Google Ads, you can link those conversions and optimize your campaigns that way. I think one of the hidden benefits that maybe isn’t very well known within GA4 is that conversion data is now essentially calculated based on data-driven attribution for everyone. So you can actually change that model and choose what you want if you don't want data-driven. But if you think data-driven is a good model for you, then your ads are now bidding to conversions that are based on that. So that's a difference, but it depends on how impactful that really is for your business. Other than that, I think GA4 operates pretty similarly to UA.
[tip]Move your ad strategy to first-party data solutions all around by running dynamic Facebook ads with the new Conversions API.[/tip]

Edward: That’s interesting. I see data-driven attribution as one of the big perks of GA4, because it was previously a GA 360-limited feature that is now available for all. So what you're saying is that not only can you run data-driven attribution within GA4, but you can basically do that within Google Ads as well?

Krista: Using your conversions right from GA4, if those conversions are being calculated using data-driven attribution, then that will flow through to Google Ads.

Edward: Cynically, one of the problems we come up against is brands whose agencies want to see the conversions directly in Google Ads, because the attribution model is greedier and, obviously from the agency's point of view, it makes their campaigns look better (laughs).

Krista: Yeah, I've always wanted to say absolutely not. My conversions will be based on GA—but to each their own.

Edward: The other big advantage for GA4—which gets our bigger customers excited—is the BigQuery sync, or the “ensemble data export.” The question there is, do you think that will be a “free forever” feature? Because that was previously a big plus of upgrading to GA 360.

Krista: I do think it'll be a “free forever” feature. However, in the past couple of months, Google has started to enforce the data limits of the free export. I think that limit is a million events per day. So if you go over, then that's probably an upgrade path for you. Honestly, if you have that much data, there are probably other reasons why you might want to upgrade as well. But I do think it'll be free forever. That's one of the big value props of GA4: everybody now has access to this raw event data.

Edward: Yeah. As you say, it's really just that they're enforcing what was already consistent with regards to volume.

Krista: Mm-hmm.
But they have actually released the ability to filter the data that you export into BigQuery. So even if you are going over that limit, you can choose what data you want to export to stay under that limit. I think that's actually a really nice additional feature there that helps to make that BigQuery export continually usable, even for businesses with high volume.

Edward: Are there any other big features we haven't talked about that you think would be beneficial to an ecommerce brand that made the switch?

Krista: Yeah, one feature that I love that's actually beneficial to all types of businesses is enhanced measurement. I love enhanced measurement because out of the box, it's six additional events (well, five if you don’t count page views) that are just collected on your behalf if you allow Google to do it, and you can toggle them on or off. In my opinion, it really helps to democratize data, because a lot of businesses were not going to have the resources or the time or effort to be able to go add those types of events. And now they're just going to get them out of the box, which gives them a lot more insight into what's going on on their sites.

“I love enhanced measurement because out of the box, it's six additional events that are just collected on your behalf… it really helps to democratize data because a lot of businesses were not going to have the resources or the time or effort to be able to go add those types of events.”

Edward: Back in the day when we were doing Google Tag Manager setups, these metrics used to be on the standard list of stuff that you’d say the brand could invest in for enhanced tracking, but it was all manual steps to do so. So it's nice that it's out of the box.

Krista: Totally.

Edward: Is there anything else you think might be interesting for our audience to know?

Krista: Just one word of wisdom, really a warning to people—you're going to need to figure out how to save your historical data.
Google said that at least six months after the deprecation date, views will still be available to look at. But after that point, access to them is going to go away. So you won't have access to that historical data after potentially January 1st, 2024. That means brands need to think about how they're going to export their data from Universal Analytics and keep that historical data. It is possible, and there are a lot of ways to do it. I think there's a big business opportunity; we're going to see a lot of new businesses going into this space. We may see some helpful tools from Google as well. Who knows? But I think that's something to just keep in mind as we get closer to that deprecation date.

Edward: I totally agree. I was recently chatting with a customer about the ways to do it. Ultimately it boils down to what analysis you’ll want to do with that data, because using the Reporting API, you can’t export every historical event. You need to decide ahead of time what you want to compare. I told the customer that ultimately you're going to want to do some kind of historical analysis, maybe year-on-year type comparisons. What’s tricky is, as you say, either businesses have got to take a greedy approach and export as much as possible before the close-off, or really decide what they're going to want to compare after it.

Krista: Yeah. I think for most of my customers, I'll probably recommend a simpler route where we narrow down what their key reporting metrics are (or have been) and focus on exporting those. The sooner you get GA4 set up, the more historical data you'll have there. I've been running GA4 for about three years now. But obviously, not everybody is. If you get it set up before June of this year, though, you'll have your historical data.

Edward: Which for most brands is good enough.

Krista: Right. There are some brands that want more. But realistically, how often are you actually looking back at that data from five years ago? Not very often.
If you are, it's looking at very high-level metrics, like how many users or sessions or page views you had and where you’re at now.

Edward: Most brands I know have changed their tracking implementation multiple times within those five years. So it's not really valid to look back that far.

Krista: Yeah. I think it's more of a shock factor that you're losing access to the data, rather than something that people actually need and use all that often.

Edward: To wrap up on something we discussed at the very beginning—as you say, the investment brands need to make in GA4 is more in learning how to use this new tool. Are there other particularly good resources you’d recommend for people to learn how to build their reports?

Krista: Yeah, I think there are a lot of great blog posts out there from so many different people in the analytics community. Selfishly, I'll say I have some great GA4 courses from KS Digital. My students have been very happy. You get to learn directly from me, and the courses include live Slack access and office hours. So it's not just video learning but direct interaction where I'll answer all your tough GA4 questions. Google has some resources, and there are other great courses out there as well. I've always learned so much just from following blogs and social media on the topic.

Quick links:

Get ready for the rise of Google Analytics 4 and sunsetting of Universal Analytics
Learn why data is critical to your DTC growth strategy
See We Make Websites’ ideal headless tech stack, featuring Littledata’s Google Analytics connector
Read 10 benefits you can enjoy when you make the move to GA4
How to run dynamic Facebook ads with Facebook Conversions API
It is rare that we speak to a customer at Littledata who isn't spending a majority of their PPC budget on Google and Facebook, and this hasn't changed with browser and OS privacy changes and the renewed focus on first-party data. It has just become more complex.

Facebook Ads are used by top DTC brands to find new customers and retarget shoppers, and dynamic ads are one of the secrets to success on that paid channel. But without the right tools, dynamic ads can be difficult to run correctly — let alone optimize for higher ROAS. In this post, I look at what dynamic product ads are, the main use cases for DTC marketers, and how to use tools such as the Facebook Conversions API (CAPI) to improve your ecommerce advertising.

What are dynamic product ads on Facebook?

Dynamic ads, or specifically dynamic product ads, allow you to show a Facebook user an advert for the same product they browsed on your store (or a similar one). This dynamic product ad is typically displayed to an audience that has added a product to their cart but has not yet purchased. Facebook has shown that this strategy increases click-through rate (CTR) on ads and reduces your cost per acquisition (CPA). In one example, Inch Blue, a children’s shoe manufacturer, more than doubled their return on Facebook Ads spend by using dynamic product ads and lookalike audiences (another technique described here).

What do I need to set up dynamic product ads?
To enable dynamic product ads, you need to feed four things from your store to Facebook:

- An event telling Facebook which users have viewed a product
- An event telling Facebook which users added which products to their cart
- An event telling Facebook which users have completed purchases (so they can be excluded from the audience)
- A product feed matching the product ID viewed, with the image and description you want to be displayed in the dynamic ad

It's important to send all of this data to Facebook Ads, because otherwise you could waste a lot of money targeting shoppers with the wrong products, or retargeting customers who already bought the item!

Using Facebook Conversions API to run high-performing ads

The four necessary events above are increasingly being blocked by anti-tracking technology in web browsers when tracked via Facebook Pixel. Ad accounts missing the “add to cart” event will have a smaller audience to target. Those missing the “purchase” event will be advertising to users who already purchased. Both of these problems increase CPA.

Meta's Facebook Conversions API (CAPI) solves this problem by sending the events from server to server, where they cannot be blocked, and so they can contain more complete customer data to match to a Facebook user. This increases the relevance of the audience on Facebook and reduces CPA. This is a big uplift — many brands report a 20% to 30% increase in purchase tracking after using Facebook CAPI, meaning they can save up to 30% of their ad budget since they avoid retargeting customers who already purchased.

Shopify stores report a 20% to 30% increase in purchase tracking after using Facebook Conversions API, meaning they can save up to 30% of their ad budget.

Some Facebook CAPI solutions only send the purchase event server-side — which only solves part of the problem. Littledata’s connector for Shopify to Facebook CAPI also sends the add-to-cart event server-side.
This means that it is captured every time and can be sent to Facebook automatically.

[note]Littledata’s Facebook Conversions API connection is available now for Shopify stores. Get started today.[/note]

What’s the best way to retarget product ads?

Facebook recommends targeting an “add to cart” event with a broad product group. So, you can either retarget based on a product ID or a product category. Retargeting using the SKU or product variant is less successful, because the user is likely looking for something like (but not exactly the same as) the product they abandoned in the cart. In addition to using add-to-cart events, you could also target a broader audience who viewed the product details page (see the ‘Viewed or added to cart but not purchased’ option below). It depends on the type of product you are selling.

How do I set up dynamic ads with Littledata?

Littledata’s Facebook CAPI integration sends all the required events you need for product targeting via Facebook CAPI. The only limitation is that the Product Viewed event (from the product details page) is still sent client-side (via Facebook Pixel). This means some product views might be missing due to browser cookie blocking, although in a future iteration this will also be moved to Facebook CAPI. You will also need a product catalog feed for Facebook: Facebook Feed by Littledata reliably performs this role.

How do I configure the dynamic ads within Facebook?

To configure dynamic ads, you will first need to create a new campaign in Facebook Ads Manager. Start with a Catalog Sales campaign. Next, link the campaign with the product catalog you set up. Then name the campaign and edit how it interacts with your product catalog. Then link the event data source you configured Facebook CAPI to send to. Finally, you are ready to configure the audience rules. In this case, I have chosen to retarget users who viewed or added to cart but did not purchase.
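To make the server-to-server mechanism described above concrete, here is a minimal sketch of what a single Conversions API event looks like. The pixel ID, token, and product values are placeholders, and the helper function is hypothetical, but the payload shape (a SHA-256-hashed email in `user_data`, `content_ids` matching your catalog feed) follows Meta's documented CAPI event format.

```python
import hashlib
import time

PIXEL_ID = "YOUR_PIXEL_ID"  # placeholder
GRAPH_URL = f"https://graph.facebook.com/v18.0/{PIXEL_ID}/events"

def _hash(value: str) -> str:
    """CAPI expects user identifiers to be normalized, then SHA-256 hashed."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def build_capi_event(event_name: str, email: str, client_ip: str,
                     product_ids: list[str], value: float,
                     currency: str = "USD") -> dict:
    """Assemble one server-side event for the Conversions API."""
    return {
        "event_name": event_name,          # e.g. "AddToCart" or "Purchase"
        "event_time": int(time.time()),
        "action_source": "website",
        "user_data": {
            "em": [_hash(email)],          # hashed email, used for matching
            "client_ip_address": client_ip,
        },
        "custom_data": {
            "content_ids": product_ids,    # must match IDs in the catalog feed
            "content_type": "product",
            "value": value,
            "currency": currency,
        },
    }

# The event batch would then be POSTed server-to-server, e.g. with requests:
#   requests.post(GRAPH_URL, json={"data": [event]},
#                 params={"access_token": ACCESS_TOKEN})
```

Because the hashed email travels with the event, Facebook can match it to a user even when the browser blocks the Pixel, which is exactly why server-side add-to-cart and purchase events survive where client-side ones are lost.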
What are the other options for setting up Facebook dynamic ads for a Shopify store?

As we wrote last year, there are a couple of other options for connecting Shopify to Facebook CAPI:

- Shopify’s inbuilt Facebook channel
- Server-side Google Tag Manager (sGTM)

But both have their limitations. Shopify’s Facebook channel has problems with order duplication — so revenue and order volumes are double-counted, making ROAS hard to calculate. It also doesn’t send add-to-cart events server-side, resulting in lower retargeting rates. Server-side Google Tag Manager is more reliable but puts all the onus on you to maintain the integration. Facebook’s marketing technology changes every month, and we believe paying for a constantly maintained connection is a better long-term solution.

Of course, if you just want to prospect new customers with selected products from your catalog, you could do this without sending events to Facebook — but then you’d be blind to whether the campaigns were really working or not.

What to do next to set up dynamic ads

Are you ready to boost the effectiveness of your Facebook Ad spend? What would a double-digit uplift in your ad spend effectiveness mean for your brand? You can get started today with just three things:

- A Facebook Ads account
- A product catalog feed from Shopify to Facebook Ads
- An event data feed (also known as Facebook CAPI)

Happy retargeting!

Note: Littledata’s Facebook Conversions API connection is now available for Shopify stores. Let's get started!
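One last practical detail on the product catalog feed required above: at its simplest it is a CSV with a fixed set of columns. The sketch below uses invented product values and a hypothetical helper name; the column set reflects the fields Facebook's catalog feeds commonly require, and the crucial point is that `id` must match the `content_ids` sent in your CAPI events, or dynamic ads cannot pair events with catalog items.

```python
import csv
import io

# Columns Facebook's catalog feed expects at minimum (values are illustrative).
FEED_COLUMNS = ["id", "title", "description", "availability",
                "condition", "price", "link", "image_link", "brand"]

def write_catalog_feed(products: list[dict]) -> str:
    """Render a list of product dicts as the CSV feed Facebook ingests."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FEED_COLUMNS)
    writer.writeheader()
    for product in products:
        writer.writerow(product)
    return buf.getvalue()

# An invented example product; note the "9.99 USD" price format and that
# "id" matches the content_ids your events would send.
sample = [{
    "id": "sku-1",
    "title": "Soft leather baby shoe",
    "description": "Handmade soft-sole shoe",
    "availability": "in stock",
    "condition": "new",
    "price": "24.99 USD",
    "link": "https://example.com/products/sku-1",
    "image_link": "https://example.com/images/sku-1.jpg",
    "brand": "Example Brand",
}]
```

In practice an app like Facebook Feed by Littledata generates and refreshes this file for you, but seeing the shape makes it clear why stale or mismatched product IDs break retargeting.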
An open letter to Mark Zuckerberg from Littledata Founder, Edward Upton
Dear Zuck,

You’re a developer. I’m a developer. And I thought Facebook was a developer-friendly company to work with — after all, you’re trying to recruit tens of thousands of engineers to work at Meta. But our experience trying to integrate with Facebook Ads makes me really doubt that. It’s been frustrating. At times, eyeball-gougingly frustrating.

Littledata runs a popular data integration, allowing hundreds of ecommerce brands spending a LOT of money on Facebook Ads to export their cost and click data to better calculate return on advertising spend. Until October this Facebook app was running just fine, and our mutual customers were happy social marketers.

The trouble started when Facebook needed to verify our business manager account earlier this year. We’ve heard that Facebook needs to know their business customers better — some of those Russians spending big on election Ads were not quite who they said they were. We understand. Littledata is trusted by thousands of Shopify stores around the world, so we’d be happy to show Facebook our company paperwork.

The problem is the app in question is linked to a legacy business manager account with no admin user. Hands up, that was my fault — as someone who’s led a hyper-growth startup I hope you’ll see why sorting out a duplicate Facebook account never got prioritised. So, we never got the memo back in February 2021 that the business manager account was unverified and suspended. No Facebook message, email, push notification or carrier pigeon. Nada.

This time bomb carried on ticking until 5th October, when we needed to add back app permissions after an update to Facebook’s marketing API. But we were not able to do so without — you guessed it — a verified business manager account.
On 5th October you probably had bigger fish to fry with Facebook’s network meltdown, but I hope a coder like you couldn’t fail to spot the classic infinite loop:

- Littledata can’t verify the Facebook business manager account, because there is no admin user with access to that business
- Facebook can’t add an admin to an unverified business when it's been inactive for more than 60 days
- We can’t move the app to another Facebook business, as there is no admin user with access

Since October, I’ve been in contact with Facebook business support nearly every day, because our Facebook advertiser customers are complaining every day. And over 8 weeks — EIGHT WEEKS — I have felt like I’m head-butting a concrete wall. Since this scenario isn’t one that was imagined by the business verifications team, it apparently just can’t be fixed.

Maybe this is how those data centre managers felt on 5th October, locked out of their own building because Facebook’s authentication systems were offline?

So now our app can’t be used. So advertisers spending tens of millions with Facebook Ads are upset too. I’m just a developer wanting to work with Facebook. Can you or anyone else get us out of this verification Meta-hole?

Best regards,
Edward
Founder, Littledata

P.S. If you can take a short break from the metaverse, it's support ticket 622162645450139
Segment Q2 Updates
Shopify to Segment is one of our most popular connections, so we're always making improvements that give users the capabilities they need to optimize revenue. This update adds key tracking tools that give stores greater insight into customer checkout behavior, Facebook marketing attribution, recurring billing, and more.

Supporting subscriptions in the checkout

Littledata’s Shopify source is now fully compatible with the most common subscription billing apps using Shopify’s checkout. Our app captures all recurring orders — linking them back to the user who first purchased if possible — and tags the events to differentiate between one-time purchases, first-time subscription orders and recurring orders. You can now use Littledata to send event data from subscription apps in the Shopify checkout, including:

- ReCharge
- Bold
- Ordergroove
- Smartrr

If you are using ReCharge, you can take advantage of the subscription lifecycle event tracking as well. Learn more about the subscription lifecycle events we push to Segment for churn analysis, including Subscription Created, Subscription Updated, Subscription Cancelled and Payment Method Updated.

Facebook Conversions API destination

Segment’s cloud-mode Facebook destination is now out of beta, and it is becoming increasingly popular with marketers looking to target their Facebook Ads more accurately in the face of increasing browser limitations. Next month Littledata will be adding all the extra event parameters needed for Facebook CAPI, so please contact us if you’d like to join the private beta.

Opting out of client-side events

We understand some of our customers want to instrument their own event tracking (maybe using Littledata’s Google Tag Manager data layer) but retain the server-side events from Shopify. In this case, Littledata’s tracking script is still needed on the Shopify storefront to initialise the Segment AnalyticsJS library and capture the anonymous ID for server-side events.
To opt out, you can set disableClientSideEvents: true or disablePageviews: true in a manual settings update.

GDPR cookie compliance

If your store is using a Shopify-compatible cookie banner (or a consent management platform like OneTrust or TrustArc), Littledata’s tracker can respect your users’ choices by switching just one setting. For OneTrust we also push the user’s consent choices as a user trait, so you can control which personas are shared with other platforms.

Simpler accepts_marketing flag

User traits for all events where the user is known now contain a simple true/false accepts_marketing field, which is useful in CRM destinations for email marketing. This is in addition to the marketing_opt_in_level field, which gives more detail on whether this was a single or double opt-in for marketing.

How to get Littledata's Shopify source for Segment

If you aren't yet a Littledata user, you can start a free trial directly from the Shopify app store. If you already have a Littledata account, you can activate the Shopify-to-Segment connection directly in the Littledata app. On Shopify Plus? Learn more about Littledata Plus.
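For reference, the client-side opt-out described above might look like this as a settings object. Only the two flag names come from this article; the surrounding shape and how the settings are delivered to Littledata are illustrative:

```javascript
// Illustrative sketch only: the two flag names are from the article above;
// the object shape and delivery mechanism are assumptions.
const littledataSettings = {
  disableClientSideEvents: true, // suppress all client-side events,
                                 // keeping server-side Shopify events
  // disablePageviews: true,     // alternative: suppress only pageviews
};
```

With either flag set, the tracking script still loads on the storefront so the Segment AnalyticsJS library is initialised and the anonymous ID remains available to server-side events.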