At Intercom, shipping is just the beginning.
We iterate, fight for adoption, and keep pushing for maximum impact for our customers and our business. We strive to deliver outcomes, not just outputs, to drive real results for our customers.
In an episode of our Intercom on Product podcast, our co-founder Des summed up the output vs outcomes relationship in this way: “it’s what you ship versus what happens because of the thing you ship.” We work to ensure that what happens because of things we ship delivers tangible value to our customers.
“Driving great customer outcomes while ignoring business outcomes doesn’t make a successful company”
But in the end, driving great customer outcomes while ignoring business outcomes doesn’t make a successful company. We strive to balance both, delivering high-impact outcomes for both our customers and our business. Here’s how we do it:
We frame the outcomes we’re aiming to drive upfront
We think about what kind of customer and business outcomes we’re striving for from the outset, particularly when putting together our problem statement. Throughout the process, we ask ourselves: “What measurable change in customer behavior will result from successfully solving the problem?”, and typically measure this behavior as product activity or usage.
We use a specific R&D outcomes metrics template to help us think through and articulate the results we’re aiming for. In it, we frame the customer benefits we want to see along with any metrics, targets, and supporting rationale we have. We do the same thing with the business benefits we want to drive.
“We ship a feature to solve a customer problem; to drive certain customer behaviors and, in turn, impact business results”
We ship a feature to solve a customer problem; to drive certain customer behaviors and, in turn, impact business results. Maybe what we’re shipping will save customers time, drive efficiencies, or reduce their costs. We then think about business outcomes or business results. If we solve that customer problem, and drive that customer behavior, what impact might we expect to see within our business? We frame business impact using the following categories:
- Acquisition: Will it help us to acquire new customers?
- Expansion: Will it help us deepen or broaden our existing customers’ usage?
- Retention: Will it play a role in retaining customers or preventing churn?
- Revenue: Can we directly link it to commercial impact? For example, something like an add-on can easily be measured in revenue.
“It’s not always possible to measure the commercial impact of every product feature”
Sometimes the lines between shipping a product and driving business results seem oceans apart – and it’s not always possible to measure the commercial impact of every product feature. We don’t agonize over it and we certainly don’t want to shoehorn every feature to fit a revenue goal – but we try to map it back as much as we can.
We instrument our product so that we can measure our outcome
We need to ensure we have the right data in place to measure the outcomes we’re striving for. We instrument our product by tracking specific events of interest along with additional context about those events. We use an in-house analytics framework for instrumentation where we track the action, object, place, and metadata for each event. Each user of Intercom can perform an action on a certain object in a certain place, where:
- Action: Describes the action that the user took, e.g. opened, clicked.
- Object: Describes an object that is acted on or affected by the action, e.g. conversation details, message.
- Place: Describes where the action is triggered. This usually represents the page of the app the user was on, e.g. the inbox.
- Metadata: Provides additional information about each specific occurrence of the event, e.g. a URL, an ID, a state.
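As a rough illustration, an event in this action/object/place/metadata shape might be modeled like the sketch below. The field names come from the framework described above; the class, helper function, and example values are hypothetical, not Intercom’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class AnalyticsEvent:
    """One occurrence of a user action, in action/object/place/metadata form."""
    action: str            # what the user did, e.g. "opened", "clicked"
    object: str            # what was acted on, e.g. "conversation details", "message"
    place: str             # where it happened, e.g. "inbox"
    metadata: dict = field(default_factory=dict)  # extra context: a URL, an ID, a state

def track(action: str, obj: str, place: str, **metadata) -> AnalyticsEvent:
    """Record a single event. A real system would forward this to an analytics pipeline."""
    return AnalyticsEvent(action, obj, place, metadata)

# Example: a user clicks a message while in the inbox.
event = track("clicked", "message", "inbox", conversation_id="abc123")
```

Structuring every event the same way makes it straightforward to query later, e.g. counting all "clicked" actions on "message" objects regardless of place.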
This is a critical step towards measuring and driving outcomes. If the feature hasn’t been instrumented, it’s not ready to be shipped.
We always assume we need to iterate on our products
We know that we won’t always get it exactly right for our customers or our business, and so we plan and leave time to iterate on our products until we do.
This isn’t easy – once a feature is released, it’s natural to feel a pull towards the next thing on the roadmap. We’re all guilty of the “ship, move on, ship, move on” mentality, but at Intercom we know our job isn’t done when we ship. So we plan to iterate.
After we ship, we fight for adoption and usage
We know it’s not simply a case of “build it and they will come” – we have to fight for adoption and usage. That means defining, measuring, and understanding core metrics like awareness, intent, activation, adoption, and engagement. We track and review these metrics post-launch, interpret the results, and take action to improve things when needed.
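Metrics like these form a funnel, where each stage narrows from the one before it. A minimal sketch of reviewing that funnel post-launch might look like this; the stage names come from the text, while the counts and helper function are hypothetical.

```python
# Hypothetical post-launch stage counts for a newly released feature.
stages = {
    "awareness": 10_000,   # users who saw or heard about the feature
    "intent": 4_000,       # users who showed interest, e.g. clicked through
    "activation": 2_500,   # users who tried it for the first time
    "adoption": 1_200,     # users who incorporated it into their workflow
    "engagement": 800,     # users who keep coming back to it
}

def stage_conversion(stages: dict) -> dict:
    """Return the conversion rate from each funnel stage to the next."""
    names = list(stages)
    return {
        f"{a} -> {b}": stages[b] / stages[a]
        for a, b in zip(names, names[1:])
    }

rates = stage_conversion(stages)
```

Reviewing where conversion drops most sharply points at where to act: low awareness suggests a messaging problem, while low activation despite high intent suggests friction in the product itself.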
We review outcomes and share our learnings
To codify our learnings, we have a rigorous process for reviewing outcomes and sharing our conclusions broadly across the company. This isn’t just on the PM, data scientist, or researcher; it’s on the entire team. One of the main tools we use for this is an outcome report.
Around a month after we release a feature, we complete an outcome report to reflect on whether or not we’re seeing the outcomes we expect. We look at both quantitative and qualitative feedback to assess how we’re doing, and discuss the actions or decisions we need to make as a result.
For bigger projects or large launches, we might identify multiple checkpoints, assessing outcomes at the three-, six-, or twelve-month mark. Teams share their outcome reports to a dedicated Slack channel to reach as many people as possible, both within their team and across the wider company.
“We always assume there will be some post-launch iteration needed, and generally consider this an extension of the original project”
We always assume there will be some post-launch iteration needed, and generally consider this an extension of the original project. That said, sometimes we need to reprioritize the roadmap to make room for bigger changes. A good guiding principle is to ask ourselves, “Have we sufficiently solved the identified problem for most target customers? Have we achieved the customer and business outcomes we were aiming for?”
Watch out for anti-patterns
Anti-patterns are practices that can creep in over time without anyone noticing. They could look like:
- Failing to decide what outcome we expect at the beginning of a project.
- Failing to define suitable metrics and the right instrumentation for the product, feature, or outcome.
- Shipping and moving on.
- Taking credit for an outcome we didn’t plan to cause.
- Rushing through the outcome report process just to check a box.
Fighting against these anti-patterns takes a lot of work from teams who genuinely understand the “deliver outcomes” principle and how important it is to our customers and our business. We need to continue to apply this principle to everything we build to ensure it’s embedded in our culture and the way we solve customer problems. Doing this helps us to deliver tangible value at scale to our customers and company.
This is the second post in our Intercom on Product series, where experts across Intercom’s R&D team talk about the principles that guide their work every day. Read other blog posts in the series here.