đź‘“ The PLGeek Guide to Engagement (Part 1)
Why it's so important and how to measure it
In this week’s Product-Led Geek newsletter, I’m highlighting a favourite topic of mine that I feel doesn’t get enough attention - engagement. This is the first in a two-part series. Read on to hear what’s coming in part 2.
But first, a little context…
Dollar-based retention and the importance of efficiency
In the current economic climate, with efficiency top of mind, more than ever SaaS companies are focusing efforts on initiatives to improve gross margins, lower CAC, increase LTV and raise net dollar retention (reduce churn and drive expansion revenue).
It’s almost always cheaper to generate revenue from your existing customer base than it is to generate net-new revenue.
Re-acquiring churned customers is always expensive, and most often not even possible. The ship has sailed.
Usage-based retention is the biggest leading indicator of revenue retention
If your customers are not actively and habitually getting value from your product, they won’t continue to pay for it - at least not past the next renewal milestone - so usage-based retention must be a strategic priority.
The most effective and the most efficient way to drive increased usage-based retention is for the product to do that work for you.
From here on in, I’ll use retention to refer to usage-based retention.
Human-centric mechanisms via customer success and support teams certainly have an important supporting role to play but are too high cost to rely on as the primary means to retain, and no matter how good they are, they’ll struggle if the product isn’t sticky (both intrinsically and through manufactured means).
And this is why every software company must become product-led in at least the growth lever of retention or risk eventual disruption from competitors who are.
Every software company must become product-led in retention or risk eventual disruption
Retention is an output - don’t focus on it
Don’t focus on retention. There, I said it.
Wait….what?! You just told us how important it is!?
I did indeed, but here’s the thing.
Retention is lagging. It’s the output. You improve retention by improving your ability to activate new users, keep current users engaged, and resurrect users who have gone dormant.
Always keep it visible, but trying to work on it directly is fruitless. You need to understand the levers at your disposal.
There’s a lot written about activation and for very good reason. It’s often the right place to start if you have a retention problem. If you’re not getting users and their teams to experience the core value of your product and build habits around that core value, then you won’t have an active user base to keep engaged. You can’t put the cart before the horse.
But once you have a good handle on activation, or you have the resources to be able to concurrently focus on multiple areas, then it’s time to invest in improving engagement - the focus of this post.
(I’ll save the topic of resurrection for another day).
The engagement opportunity
Your acquisition loops are firing on all cylinders. Priority is being given to the acquisition of users from your ICP. You’ve defined activation in such a way that it’s predictive of mid-term retention. You’re successfully activating a subset of new users coming through the door. But you can’t stop there.
Focusing efforts on engagement creates opportunity to influence growth in multiple ways
First of all, by definition, at the very moment of engagement, a user is retained. And so keeping users engaged means they retain for longer, which has a downstream impact in other areas of your growth model by providing ongoing opportunities for them to monetise, and ongoing opportunities for them to fuel acquisition through word of mouth and viral loops. The more active users you have in the product, the greater the likelihood of more of them recommending you or inviting others in.
But beyond retention, engagement directly drives other growth levers.
When any individual user or team is more engaged, they can accelerate acquisition loops
Take Miro for example. All other things being equal, a user who creates 5 boards per week provides a significantly greater opportunity for those boards to be shared with potential new users than if they were just to create 1 board per week. This is why Miro invests a lot in discoverability, promotion and education around new or adjacent use cases with templates.
They know that deepening engagement through use case expansion is an effective way to drive new user acquisition (and to bring back existing users to further compound the loops).
When any individual user or team is more engaged, they can drive increased monetisation
Many B2B PLG SaaS products with usage-based pricing leverage this to their benefit and motions across their businesses should be designed to help users and teams be successful with the product in order to nurture their users to cross monetisation thresholds - via free to paid as well as expansion revenue within existing customers through higher plan tiers and add-ons.
For example, Wistia plans scale along with the volume of videos created/hosted.
Wistia provides (and encourages the adoption of) a number of tools around video creation, editing, collaboration and management to drive engagement higher.
In each of these examples, notice my use of the phrase "more engaged", suggesting nuance in the breadth and depth of engagement.
Engagement isn’t binary
Activation is binary. A user or team has either reached the habit moment and activated, or they haven’t.
The above examples should help demonstrate that it’s insufficient to consider engagement in the same way. Engagement needs to be modelled as a spectrum. Yes, in the aggregate users and teams are either engaged or not - but reducing engagement to binary states is a common mistake that hides nuance and prevents actionability.
Just knowing where users or teams are on the engagement spectrum gives you incredible insight, and unlocks the ability to customise the tactics you use to drive increased engagement.
And when you track engagement at the aggregate level across your user base you can capture the distribution of teams using the product along the spectrum. And if you track the change in that distribution over time you can better evaluate if your efforts to increase engagement are being successful.
Measuring engagement
So if engagement is a spectrum, how should we measure it?
A spectrum of engagement gives you much more fidelity to work with than just a binary assessment. But a continuous spectrum is too much fidelity. It becomes equally unactionable.
My go-to approach for actionable measurement of the engagement spectrum is to use a few carefully defined buckets or states, as advocated for by Reforge. But that’s not where I like to start.
Despite what I’ve said about a binary measurement of engagement not being actionable, it’s actually where I recommend you start. But there are some critical rules to follow here to ensure the data is meaningful.
I see lots of companies get this wrong by defining overall engagement as daily/weekly/monthly active users, which invites misguided application of the metric. There are three simple principles to follow to create a meaningful data point for overall engagement.
Define activity around the core value of your product
The most common problem I see is an arbitrary definition of active. Typically that's defaulting to the lowest common denominator of any activity in the product (for example, just logging in). This isn't going to tell you anything meaningful. Instead, define and orient around a definition that's based on users/teams experiencing the core product value.
Align with the problem frequency
Make sure the period is reflective of the frequency of the problem you solve for. If it's a weekly frequency, don't use a daily measure.
Use a team-based metric for B2B SaaS
In B2B PLG, the core value prop is most commonly centred around a team. Unless the team realises value, the product won't see sufficient adoption and monetisation will be unlikely. In these contexts, it should be expected that the definition of both activation and engagement is team (or workspace) centric.
To bring this all to life, here's an example from Snyk:
Core value: Fixing vulnerabilities (not scanning, not detecting vulns, but fixing them)
Frequency: Weekly (we expected teams to be paying back security debt on a weekly basis)
Team-based: Security is a team sport. The Snyk 'Org' concept most closely encapsulates a team.
So the overall engagement metric was:
Weekly Fixing Orgs (WFO)
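To make the mechanics concrete, here's a minimal sketch of how a metric like WFO could be computed. The event log shape (`org_id`, fix date) and the data are hypothetical, not Snyk's actual pipeline:

```python
from datetime import date

# Hypothetical fix-event log: (org_id, date a fix was made).
fix_events = [
    ("org-a", date(2023, 5, 1)),
    ("org-a", date(2023, 5, 3)),
    ("org-b", date(2023, 5, 2)),
    ("org-c", date(2023, 4, 20)),  # outside the week measured below
]

def weekly_fixing_orgs(events, week_start):
    """Count orgs with at least one fix event in the 7 days from week_start."""
    week_end = date.fromordinal(week_start.toordinal() + 7)
    return len({org for org, d in events if week_start <= d < week_end})

print(weekly_fixing_orgs(fix_events, date(2023, 5, 1)))  # 2 (org-a and org-b)
```

The key detail is counting distinct orgs, not events: one org fixing ten times in the week still counts once toward WFO.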
Overall engagement metrics in this form often make for fantastic north-star metrics for product-led companies.
Bonus: Take your overall engagement metric for any given period, and divide it by the volume of core value experienced (e.g. number of fixes made for Snyk) in that period to get a measure of the depth of engagement per active user/team. Track that over time to monitor trends in overall depth of engagement.
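Continuing the hypothetical event log from before, the depth-of-engagement bonus metric is just total core-value volume divided by the count of active teams in the period:

```python
from datetime import date

# Hypothetical fix-event log; duplicate rows represent multiple fixes.
fix_events = [
    ("org-a", date(2023, 5, 1)),
    ("org-a", date(2023, 5, 1)),  # two fixes on the same day
    ("org-a", date(2023, 5, 3)),
    ("org-b", date(2023, 5, 2)),
]

def depth_of_engagement(events, period_start, period_end):
    """Average core-value volume (fixes) per active org in the period."""
    in_period = [(org, d) for org, d in events if period_start <= d < period_end]
    active_orgs = {org for org, _ in in_period}
    if not active_orgs:
        return 0.0
    return len(in_period) / len(active_orgs)

print(depth_of_engagement(fix_events, date(2023, 5, 1), date(2023, 5, 8)))  # 2.0
```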
Once you have gone through this exercise, you will also have a good understanding of some of the key ingredients to help in defining the actionable states on the engagement spectrum.
Engagement states
Engagement states help answer the question
“How engaged is a user/team with our product?”
They enumerate the degree of engagement.
The principle is that you take your engagement spectrum and divide it into a small number of finite buckets (or states), and group your users/teams in those buckets according to their depth of engagement in any given period.
You might define buckets around the intensity of users/teams experiencing core value, or around the frequency of them experiencing core value. An additional option suggested by Reforge is around feature usage, but I’ve not yet found a scenario where a feature usage based definition trumps a definition based on frequency or intensity of value realisation. I’d always suggest starting any analysis there and only moving to feature-based in specific situations, such as when your product is supporting several different use-cases through these high level features, and the value of each is different. Even then, you might still benefit from tracking a more granular value-centric metric for each of those high level capability areas.
At Snyk for example we defined 3 frequency-based states as well as a fourth dormant state, but you shouldn’t be beholden to that - let the data inform this.
We defined the Core engagement state as between 4 and 7 unique days of fixing in the last 30. Why was this the Core state? Not because that’s where the largest volume of teams were, but because the natural problem frequency was weekly - in other words, we expected teams to be fixing vulnerabilities on a weekly basis give or take.
Below the Core state, we had a Casual state which was defined as between 1 and 3 unique days of fixing in the last 30.
And above the Core state, we had a Progressive state defined as 8 or more unique days fixing in the last 30.
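The state boundaries above can be sketched as a simple bucketing function over unique fixing days in a trailing 30-day window. The thresholds mirror the Snyk example; your own boundaries should come from your data:

```python
from datetime import date, timedelta

def engagement_state(fix_dates, as_of, window_days=30):
    """Bucket a team by unique days with a fix in the trailing window.
    Thresholds mirror the Snyk example: Dormant = 0, Casual = 1-3,
    Core = 4-7, Progressive = 8+ unique fixing days in the last 30."""
    window_start = as_of - timedelta(days=window_days)
    unique_days = {d for d in fix_dates if window_start < d <= as_of}
    n = len(unique_days)
    if n == 0:
        return "Dormant"
    if n <= 3:
        return "Casual"
    if n <= 7:
        return "Core"
    return "Progressive"

today = date(2023, 5, 31)
print(engagement_state([date(2023, 5, 2), date(2023, 5, 9)], today))      # Casual
print(engagement_state([date(2023, 5, d) for d in range(2, 10)], today))  # Progressive
```

Note the de-duplication to unique days: a frequency-based definition deliberately ignores how many fixes happened on each day.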
We defined the boundaries between the states at significant inflection points in correlation with long-term retention.
So for example we’d see around 40% likelihood of teams still fixing using Snyk 12 months later if they were in the Casual engagement state, but around 80% likelihood if they were in the Progressive state.
If a team was dormant, there was less than 5% likelihood of them still using Snyk to help them fix vulnerabilities 12 months later!
Dormancy is a huge red flag!
This underscores the importance of taking a really measured approach to engagement. The opportunity is huge!
Be careful…
Avoid choosing and optimising an engagement metric that might be counter to the behaviours that you want to encourage.
To give an example, it might initially seem intuitive for an education product to measure engagement around the number of lessons taken, but learners who are cramming right before a test would appear as highly engaged during that time period, while those who have been learning with a steady pace over a longer period of time would appear as less engaged. Yet it’s the latter cohort who are likely to have built more sustainable habits around the product and are more likely to be retained in the long term.
Ultimately assuming you have sufficient data to hand you should be testing every metric candidate against factors including long-term retention and monetisation propensity.
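One lightweight way to run that test is to correlate each candidate metric with a 12-month retention flag across a historical cohort. The sketch below uses a hand-rolled Pearson correlation and entirely made-up cohort data; in practice you'd pull real cohorts and likely use a stats library:

```python
import math

# Hypothetical historical cohort: candidate metrics measured early in each
# team's lifecycle, plus whether the team was still active 12 months later.
teams = [
    {"fix_volume": 40, "fixing_days": 2, "retained_12m": 0},
    {"fix_volume": 5,  "fixing_days": 8, "retained_12m": 1},
    {"fix_volume": 12, "fixing_days": 6, "retained_12m": 1},
    {"fix_volume": 30, "fixing_days": 1, "retained_12m": 0},
    {"fix_volume": 8,  "fixing_days": 9, "retained_12m": 1},
]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

retained = [t["retained_12m"] for t in teams]
for metric in ("fix_volume", "fixing_days"):
    r = pearson([t[metric] for t in teams], retained)
    print(f"{metric}: r = {r:.2f}")
```

In this toy cohort the frequency-based candidate (`fixing_days`) correlates with retention while raw volume does not, illustrating the kind of comparison that led Snyk away from its initial volume hypothesis.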
While I was at Snyk, we investigated many candidate metrics to build our engagement states around, including an initial favourite hypothesis based on the volume of fixes an Org made in a given period. We found it had a weaker correlation with long-term retention than our eventual frequency-based definition. The lesson here is to evaluate several different perspectives.
But also beware that perfection is the enemy of good. It’s much better to have something with reasonable confidence that you’re able to align, unite and drive action around than it is to spend months on end in analysis paralysis trying to find the perfect definition.
Utilising engagement data
Engagement state data (including movements over time) is invaluable. There are many ways it can be utilised across the organisation in support of growth. Here are 9 suggestions to get you started.
Churn prediction
Expansion prediction
Cohorting for experimentation targeting
Contextualising the product experience for users/teams in different engagement states
Customising notification volume and content
Tailoring product and marketing automation
Input to PQL and PQA scoring models
Rich signals for revenue teams to reference in conversations with prospects and customers
Overall period-to-period product health reporting with trends of movements between states
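For the last of those suggestions, a simple way to report period-to-period movement is a transition count between states. This is a minimal sketch with hypothetical team data:

```python
from collections import Counter

# Hypothetical engagement states per team in two consecutive periods.
last_period = {"t1": "Core", "t2": "Casual", "t3": "Progressive", "t4": "Core"}
this_period = {"t1": "Progressive", "t2": "Dormant", "t3": "Progressive", "t4": "Casual"}

# Count (from_state, to_state) transitions for teams present in both periods.
transitions = Counter(
    (last_period[t], this_period[t]) for t in last_period if t in this_period
)

for (src, dst), n in sorted(transitions.items()):
    print(f"{src} -> {dst}: {n}")
```

Tracked over time, rising counts of downward transitions (e.g. Core to Casual, Casual to Dormant) are an early warning well before churn shows up in revenue.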
Closing thoughts
Apologies to regular readers for the longer-than-usual post - I did mention this was a favourite topic of mine. Believe it or not, this is an abbreviated version that I had to split in two.
I hope that I’ve helped convince you that engagement is likely an area of critical importance to your business and something that you should be considering holistically in your growth model, even if it’s not something you’re actively aiming to influence today.
Here’s part 2 of this post where I dig into improving engagement.