How Littledata’s product sprints fuel innovation

Littledata thrives on innovation. As a top data connector with a complex backend and seamless frontend, we’re always looking for ways to innovate faster and smarter. To fuel that innovation, we use focused development sprints to ship high-quality features and updates.

Over the years, we’ve learned that: 1) having clear objectives, 2) removing unknowns, and 3) delivering value in smaller chunks is key to an impactful product development process.

That’s why we start projects by first clarifying our goals, then discussing the scope of features and their impact. That way, we can break them down into meaningful chunks and prioritize them for implementation.

Delivering value in small chunks is key to impactful product development

We arrived at this process after much trial and error over many arduous months. Our north star metric all along has been sprint velocity. We measure it using total story points, which reflect the value a task delivers to customers rather than the time spent working on it. We believe teams that deliver value to their customers most often have a higher chance of success in the long run.

All that said, we recognize every team dynamic is different. Each team should test what works best for them. Littledata’s process — laid out below — helped us double our velocity per developer, per sprint. We highly recommend it to any product team that wants to try it out.

To show you why our process works, let’s dive deeper into it.

Choosing Goals and Objectives

How to set Annual and Quarterly OKRs (Objectives and Key Results)

We begin each year by stating our overall goals and objectives using the OKR framework. These annual OKRs are then broken down into quarterly OKRs and translated to fit each team.

Using Team Initiatives / Epics

Once each team clearly understands their OKRs for the quarter, they break them down into epics (or initiatives). Epics are bodies of work that, when completed, push the team closer to achieving their goals for the quarter.

For example, in one quarter, our product team identified trial conversion rate as a key metric that, if improved, could help Littledata move closer to its business goals. To boost the trial conversion rate, the team used first principles thinking, user research and feedback, and user experimentation to help identify root issues that prevented users from completing the trial.

Littledata breaks goals down into epics, which help us work together on clear initiatives

Using that data, the team came up with several epics like creating a “getting started” campaign, improving the onboarding experience for users, and launching a feature to educate users about the product. Each epic contained clearly defined user stories (specific tasks) to help resolve the root issues identified.

Breaking down epics

We want to start delivering value to our users as soon as possible. So, once we have a clear understanding of our target epics (or initiatives) for a quarter, we break them down further into valuable, independently deployable iterations:

  1. 1HOUR iteration
  2. 1DAY iteration
  3. 1SPRINT iteration
  4. FINAL iteration

Each of these iterations is deployable on its own and adds value to our customers. Work starts with the smallest version of the epic we can build and deploy while still adding value to users. We then keep building until we reach the final iteration: the fully featured version with all the bells and whistles we’d initially planned for.
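
If it helps to picture it, here’s a rough sketch in Python of how an epic might be modeled as a stack of independently deployable iterations. The class names and example epic are illustrative only; this isn’t our actual tooling.

```python
from dataclasses import dataclass, field
from enum import Enum


class IterationSize(Enum):
    """The four iteration sizes we aim for when slicing an epic."""
    ONE_HOUR = "1HOUR"
    ONE_DAY = "1DAY"
    ONE_SPRINT = "1SPRINT"
    FINAL = "FINAL"


@dataclass
class Iteration:
    size: IterationSize
    description: str
    deployed: bool = False  # each iteration ships on its own


@dataclass
class Epic:
    name: str
    iterations: list[Iteration] = field(default_factory=list)

    def next_deployable(self) -> Iteration | None:
        """Return the smallest iteration that hasn't shipped yet."""
        for iteration in self.iterations:
            if not iteration.deployed:
                return iteration
        return None


# Hypothetical example: an onboarding epic sliced from a one-hour change up to the full spec
onboarding = Epic(
    name="Improve onboarding experience",
    iterations=[
        Iteration(IterationSize.ONE_HOUR, "Add a 'getting started' banner"),
        Iteration(IterationSize.ONE_DAY, "Add an onboarding checklist"),
        Iteration(IterationSize.ONE_SPRINT, "Guided setup flow"),
        Iteration(IterationSize.FINAL, "Full onboarding experience with in-app education"),
    ],
)
print(onboarding.next_deployable().size.value)  # -> 1HOUR
```

The point of the structure is simply that work always has a next shippable slice, so nothing waits on the final spec before reaching users.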

Breaking an epic down into these iterations means that:

  1. We start adding value to our users sooner rather than later. Instead of waiting a couple of sprints, we start delivering value in hours (literally).
  2. We can measure impact a lot earlier. This helps keep us agile, letting us shift strategies if our proposed solution or the identified problem is not aligned with our users’ needs.
  3. We increase perceived velocity. This helps keep team spirits and momentum high.

We try to stack a mix of epics in every sprint to continue delivering value to customers across multiple fronts.

The full Littledata sprint process

Our sprint development cycle begins well before an actual sprint starts — ideally about two sprints in advance. We hold a few planning and estimation sessions beforehand to make sure we’ve clarified all the unknowns and aligned the entire team on the deliverables for the sprint. Then, it’s on to the epic planning.

Planning epics

We plan epics for a couple of sprints at a time. Each Littledata sprint lasts two weeks, which we’ve found to be short enough to accurately forecast the roadmap, yet long enough to enable us to take on larger features. For each epic planning discussion, we involve the Product Manager (PM), Engineering Manager (EM), and Technical Program Manager (TPM).

Writing specs

Once the PM, EM, and TPM have aligned on the desired outcomes for each epic and prioritized it into a sprint, the TPM works with the engineering team to break the epic down into smaller tickets that make sense from an implementation perspective.

Estimating tasks

Our EM works with the engineering team daily to discuss the tickets specified by the TPM and estimate their complexity using story points, following industry best practices.

Although complexity estimation is subjective and differs from team to team, as long as a team remains consistent in its estimates, we believe it makes forecasting sprint velocity far more reliable. This further helps us plan each sprint, understand team capacity per developer per sprint, and make better hiring decisions.
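
To illustrate the arithmetic, here’s a quick sketch of how sprint velocity and per-developer capacity fall out of completed story points. The ticket data and developer names below are made up for illustration; they aren’t Littledata’s actual numbers.

```python
# A minimal sketch of the velocity maths tracked each sprint.
# The tickets and developers here are hypothetical.

completed_tickets = [
    {"developer": "dev_a", "story_points": 3},
    {"developer": "dev_a", "story_points": 5},
    {"developer": "dev_b", "story_points": 2},
    {"developer": "dev_b", "story_points": 8},
    {"developer": "dev_c", "story_points": 5},
]

# Sprint velocity: total story points completed in the sprint
sprint_velocity = sum(t["story_points"] for t in completed_tickets)

# Velocity per developer per sprint: the capacity figure used for planning
developers = {t["developer"] for t in completed_tickets}
velocity_per_developer = sprint_velocity / len(developers)

print(sprint_velocity)         # 23
print(velocity_per_developer)  # ~7.7 points per developer per sprint
```

Tracking both numbers over several sprints is what lets a team forecast how much work fits into the next one.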

Pre-sprint planning

The PM, EM, and TPM meet again prior to the sprint’s start to discuss the now estimated epics. They negotiate and prioritize work based on the team’s capacity, as well as the value added to our customers and the business. This is where we lock in the work for an upcoming sprint.

The entire product team connects at the start of each sprint to align on the epics and their desired outcomes. Ideally, this is more of an alignment meeting. By this point, everyone on the team will have gone through specs and will be quite familiar with the expectations. There should be no unknowns at this stage; the entire focus should be on execution.

The sprint

At Littledata, we follow an agile, two-week sprint model. We use Jira tickets to track progress, with each ticket flowing through the following stages:

  1. TODO: Prioritized ticket, assigned to a particular developer
  2. IN PROGRESS: The developer has picked up the ticket and is working on it. Ideally, there shouldn’t be more than one ticket per developer in this column at any given point in time.
  3. CODE REVIEW: The developer has submitted the ticket for peer review to make sure there aren’t any code quality issues.
  4. QA: After a ticket passes code review, our QA analyst makes sure the implementation matches the acceptance criteria specified on the ticket.
  5. DEPLOYMENT: If there are no dependencies, the ticket gets deployed to production after it passes QA. We try to deploy to production several times in a given sprint.
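
Under the hood, that ticket lifecycle is essentially a small state machine. Here’s an illustrative sketch in Python: the stage names mirror our Jira columns, but the code itself, including the assumption that a failed review or QA check sends a ticket back to IN PROGRESS, is just a sketch rather than our actual board automation.

```python
from enum import Enum


class Stage(Enum):
    TODO = "TODO"
    IN_PROGRESS = "IN PROGRESS"
    CODE_REVIEW = "CODE REVIEW"
    QA = "QA"
    DEPLOYMENT = "DEPLOYMENT"


# Mostly forward-only transitions. Assumption: a failed review or QA check
# sends the ticket back to IN PROGRESS.
ALLOWED_TRANSITIONS = {
    Stage.TODO: {Stage.IN_PROGRESS},
    Stage.IN_PROGRESS: {Stage.CODE_REVIEW},
    Stage.CODE_REVIEW: {Stage.QA, Stage.IN_PROGRESS},
    Stage.QA: {Stage.DEPLOYMENT, Stage.IN_PROGRESS},
    Stage.DEPLOYMENT: set(),
}


def move_ticket(current: Stage, target: Stage) -> Stage:
    """Advance a ticket, rejecting moves the board doesn't allow."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Cannot move ticket from {current.value} to {target.value}")
    return target


stage = Stage.TODO
stage = move_ticket(stage, Stage.IN_PROGRESS)
stage = move_ticket(stage, Stage.CODE_REVIEW)
print(stage.value)  # CODE REVIEW
```

Keeping the transitions explicit is what makes the work-in-progress limit in the IN PROGRESS column easy to enforce and easy to reason about.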

Sprint review

When we reach the end of each sprint, we wrap up with a review meeting. We talk through the sprint velocity, discuss what the blockers were, and brainstorm how we can improve in the next sprint.

Many key Littledata features and product innovations have come from this sprint process, with sprint reviews feeding directly back into sprint planning for the next cycle.

And so the cycle continues, from one sprint to the next…

Try our process for yourself

Has our product development process piqued your interest? Could you see yourself thriving in a collaborative work environment as part of a growing team dedicated to making a difference in customers’ lives?

At Littledata we’re building the top ecommerce data platform on the planet, with customers — and teammates — around the world. Take a look at our open positions, and don’t forget to follow us on Instagram and Twitter.

Plus, if you’re using development sprints in an innovative way, let us know and you might even get featured on the blog!
