Built for You: Making of Series

In this week’s special episode, we deep dive into the process behind Series, Intercom’s new visual campaign builder.

On the last Built for You episode, we talked about how customer feedback informs the features we build at Intercom. This week, we examine the conversations that led to Series, one of our biggest launches this year.

Series wasn’t your typical project at Intercom. Instead of building on previous iterations with small, incremental changes that could be quickly shipped, this was a two-year undertaking designed almost entirely from scratch.

For this virtual panel, we’ve gathered the team that made it happen.

They’ll walk you through the process of building Series, from cross-team collaboration to beta-testing our way towards operational excellence. If you’ve ever wondered how all the pieces tie together in a major launch, this is the episode for you.

If you enjoy the conversation, check out more episodes of our podcast. You can subscribe on iTunes, stream on Spotify or grab the RSS feed in your player of choice. What follows is a lightly edited transcript of the episode.


Thomas: Thank you for joining us today on this special episode of Inside Intercom. This is the second of the Built for You podcasts. In the first podcast, I chatted with some of our teams from research, engineering, and product management, who translated your feedback into the features and products we built. We learned that customer feedback is not distilled and passed down through a production line. Instead, each team looks at the unvarnished feedback from the outset and works together to solve the problem.

In this episode, I’m diving deeper into how that collaboration between R&D and other teams both shapes the products that we build and allows us to measure whether we have indeed solved real customer problems.

Now, my panel today are no strangers to each other, as they’ve been working closely together on one of our biggest releases this year, which is Series. So, let’s hear from our panelists about who they are, and the work that they do at Intercom. Lucas, let’s start with you.

Lucas: Hey Thomas, thanks for having me. It’s such a pleasure to be here talking about a great project like Series with great folks like you. I’ve been an engineering manager at Intercom for about 2.5 years. And for 1.5 years, I have been managing the team responsible for building and maintaining Series.

Alex: I’m Alex. I’m a designer on the same team as Lucas. I’ve been working on Series since the beginning, and through all of the nitty-gritty details of it.

Nicole: I’m Nicole. I’m a product marketer here at Intercom, and I was responsible for our go-to-market strategy and bringing Series to market during the year.

Wal: And I’m Wal. I’m on the research and data science team here at Intercom. Our group works with folks across Intercom, trying to help them understand the opportunities and risks in the adoption, activity, and ultimately the commercial performance of their products and the Intercom product suite. We are principally responsible for the messaging and outbound products, so that’s how I’ve been involved with Series.

Building Series from the ground up

Thomas: To start, does anyone want to try and summarize all of the customer feedback or the problems that Series was trying to solve, and maybe tell us a little bit about what Series is?

Lucas: There were two main reasons for building Series. Firstly, we had this product called Campaigns that we used to orchestrate outbound messages. It didn’t provide any visual cues for customers to check their outbound messages. So, they had no insight into the performance of their outbound messages, the sequencing, how they were connected, or how they were organized.

“It was a big rewrite in many ways. It was more than just an iteration of what we had before. It was a very new way of thinking about the problem”

As a result of this, customers ended up using third-party visual tools to draw their campaigns before turning them into a real Intercom campaign, which, as you can imagine, created a lot of friction for our customers.

Secondly, we decided to build Series because, over time, more types of messages were released. Examples of that are Carousels, Tours, and most recently, Banners. And supporting these brand new messages on Campaigns would cost us a lot of engineering effort.

So, we decided to kill two birds with one stone by providing a visual view for customers that removed the need for any third-party tools while increasing our support for existing and future outbound messages.

Thomas: Fantastic. So, if I were to summarize that into two key problems that you were trying to solve, it was to create a visual tool that allowed you to orchestrate your messaging, while also allowing you to use all of the outbound channels that we have in Intercom.

Lucas: Yeah, exactly.

Alex: I just wanted to add one more thing on top of what Lucas was saying. Back when we were creating our first iteration of the solution to this outbound orchestration problem, which was the old Campaigns, we were seriously thinking about what the solution should be and how it should work. And we decided that the future of orchestration would be something very automated, very hands-off, where you would just put a bunch of messages into this single tool, and it would then decide, through algorithms and smart suggestions, which message to send to which person at what time.

“That’s perhaps different to the way we build a lot of stuff at Intercom. We iteratively improve, and get out in the wild, and see how customers use it. This was the big, uncomfortable rewrite, and the rebuild, and the relaunch of a big feature”

But the problem was that, for customers, it was something they didn’t feel ownership and control over. The whole approach led to a lot of customer confusion and feedback that they didn’t understand the black-box nature of Campaigns. And that was also part of the reason why we wanted to change and improve the solution.

Wal: Yeah. I think that’s worth calling out. The sophistication of usage, and what customers were trying to achieve with orchestrated messaging kind of outgrew the old solution we had. And so, the new implementation of orchestration, which is what Series is, was a technical architectural refactor. It was a big rewrite in many ways. It was more than just an iteration of what we had before. It was a very new way of thinking about the problem.

And I think that’s perhaps different to the way that we build a lot of stuff at Intercom. We iteratively improve, and get out in the wild, and see how customers use it. This was the uncomfortable rewriting, rebuilding, and relaunching of a big feature. I think that’s what made it a bit different as well, from our perspective.

A visual approach

Thomas: Series is very much unlike anything else in Intercom. Alex, maybe I can come to you from a design point of view. Can you talk to us a little bit about the specific problem that led you to think about how a designer might package Series?

Alex: Sure. The final solution was definitely not the only one it could have been. There were very valid, viable options that we considered for how Series could look and work that had nothing to do with the visual element of it. As Wal just mentioned, back with Campaigns, we were actually considering or even playing with ideas of just iterating on it, as in, just doing a slight change of some of the functionality and repackaging some things to make it more understandable and clear to customers.

But what we found was that it wasn’t just about the lack of comprehension that people had with Campaigns. It was also the type of use cases that we wanted to prioritize and support, and the workflows that customers actually had when they were creating their outbound orchestrations. It was just so different from the linear, non-visual flow which Campaigns presented that we basically had to rethink it and go with a more visual approach.

“It was just so different from the linear, non-visual flow which Campaigns presented that we basically had to rethink it and go with a more visual approach”

And then within that visual approach, there were multiple different layers and options for how flexible and free the whole canvas would actually be. We prototyped a lot of different options and tried to unearth what interactions customers would have on that canvas that would actually allow us to support their workflows in a way that made sense for them.

If we come back to a point that Lucas was making, when we looked at how customers were creating their outbound orchestrations, we saw that they actually went to tools like Miro or Whimsical, or even a physical whiteboard to create a flow of their messages, visualize it, quickly iterate on it, and move things around.

From there, we were able to understand that if people were actually using those visual tools to collaborate with other folks and create their orchestrations before moving them into Intercom’s linear, non-visual builder, then that was probably a really good indication that we should look at providing an experience much more similar to those types of tools, which remind you of actually playing with things on a whiteboard rather than being constrained in any way. And so, that’s where the whole visual direction and visual nature of Series came from.

Getting everyone on the same page

Thomas: I’m sure there are plenty of people listening to this podcast who are thinking, Alex is a designer and Lucas is on the engineering team. It’s probably an oversimplification to say that the discussions between your teams are about style versus function; it sounds like the design team is very much involved in creating something that is equally as functional.

“The collaboration that we have within the team is on so many levels, and it ensures that designers, and engineers, and product managers are fully involved in the whole process on every single decision point”

Alex: Yeah. I think that’s a really interesting thing about Intercom. We collaborate within the team on many levels, and it ensures that designers, and engineers, and product managers are fully involved in the whole process on every single decision point. That meant that during each design exploration, we always had an engineer in the room playing the role of a person who would be able to point out, “Hey, this type of thing is actually very hard for us to do, this is something that we can do, and this is something that we should discuss and try to come up with a better or clearer solution.”

And similarly, from a design perspective, I was very hands-on in the engineering part of things, ensuring that all the little visual features looked the way we wanted them to look. And so, it was this very tight collaboration between two functions throughout the whole process, which ensured that at each point, at each important milestone, we were aligned around the type of thing that we believed, as a team, was the right thing for us to do and the right solution for our customers.

Thomas: And inevitably, I’m sure there were some concessions that needed to be made during that collaboration. Lucas, can you tell me a little bit about what concessions were made in launching Series? Was there anything de-scoped from the launch that you might expect to come out as a fast follow?

Lucas: That’s a great question. There were a lot of concessions that we needed to make, especially when it came to the canvas. I think there’s something important to say here that makes my life very easy as a manager, which is our culture, and how our culture plays an important role in the concessions that we needed to make.

“It really boils down to a culture that allows us to make those concessions without any hurt feelings”

I think that engineers at Intercom have a great understanding of their role in releasing value to customers. They understand this super well. They understand that sometimes they need to do something a bit more complex than they were expecting, but it’s important because it’s going to unleash a lot of value for our customers. And Alex understood very well that some of the things he was suggesting we do were very, very complex and would take us a few more weeks than what we had initially planned. This could potentially derail the whole project and prevent us from releasing it on time.

The canvas is a great example. A lot of concessions were made on both sides. Alex and the engineers, as Alex said, were in constant conversation with each other to ensure that decisions were made to optimize for a solution that, in my opinion, would be a combination of three things:

  • Great UX that would allow us to increase adoption and allow customers to use Series in the way that they were expecting;
  • Great code that would not create technical debt that we’d have to pay in the future;
  • And something which I believe is the combination of the first two factors: the velocity that would allow us to collect feedback more quickly.

So, let’s say Alex was suggesting something that seemed pretty crazy to us. We would figure out the simplest solution that we could put in front of customers, and then we would see if that was what Alex was expecting. We did that multiple times, with prototypes, with real code, et cetera. So I think this is part of our culture, as I said. It really boils down to having a culture that allows us to make those concessions without any hurt feelings.

Nailing the key value proposition

Thomas: Nicole, can I bring you in here? One of the things that we learned from our first podcast was that when you have a lot of customers sharing their unique problems, or at least their own unique ways of describing a problem, it’s not always easy to collect that feedback into a clearly defined problem to be solved. Does that present challenges when you’re bringing the solution or product back to those customers?

Nicole: Like everyone has mentioned, Series was a big, standalone product that we had to build from scratch. A lot of the work we do at Intercom is small, incremental changes that we can deliver really quickly to drive value for our customers, and this was a very different scenario where we were developing something from the bottom up.

And that posed a challenge because we were solving so many different problems for our customers. In product marketing, I’m focused on communicating that value, really thinking about that value proposition, and why our customers should use something like Series for their outbound messaging. And in communicating that, we really needed to think about the key, high-level challenges that this tool solves for, filter out the noise, and isolate the key value propositions that Series would deliver for our customers.

Part of that was a lot of back and forth internally and prioritizing the messages we wanted to communicate. But another part was speaking with our beta customers and understanding the value they were getting from using this tool. And then, thinking about how we could use that in our messaging, and communicating that out to the market.

There was a lot of back and forth, followed by a lot of hypothesizing and then narrowing that down. But ultimately, we landed on a few key value propositions that Series brings, which is A, the visual nature of it, so the ability to really see the messaging journey that you’re building out, and then B, the fact that you have all of those different message types across all of Intercom’s channels. So, email, push notifications, mobile carousels, chats, posts, product tours, and the ability to really orchestrate those, and create a consistent and meaningful journey for your customers.

“I had regular calls with our beta customers and really looked to understand what the benefits were”

Those were the two messages that we landed on as our really high-level benefits. And then, of course, there were lots of smaller, more nuanced benefits that we would then throw in our messaging at different phases of that journey. Yes, it was challenging to figure out what those two high-level benefits were, but we had a process in place to really nail those down.

Thomas: You mentioned the beta program. It sounds like you were almost testing your messaging to see what resonated with that beta audience.

Nicole: Throughout the beta process, we’re always looking to get feedback to understand how exactly our customers are using it and liking it, and if they have any feedback. As a product marketer, I’m checking in to understand the benefits they’re seeing, looking to talk to them to get testimonials or case studies when it comes to the launch, but also to understand how the benefits resonate with them.

And so, I had regular calls with our beta customers and really looked to understand what the benefits were. If they could summarize the key benefits they were getting from this tool, they would communicate those back to me, and I would then validate my assumptions, or my hypothesis, about what I thought the high-level value propositions would be. That was a really great way to get that feedback and make sure that our messaging was really tight and in line with what our target audience and our customers believed.

Down the adoption funnel

Thomas: Wal, how does analytics fit into all of this?

Wal: We wear a few hats in the whole process, and that can be challenging because while we’re evolving our processes over time, our customers are changing their behavior over time, too. They’re becoming more sophisticated, they’re becoming more advanced in their usage of Intercom, and the kinds of things they’re trying to achieve with Intercom. So, it’s a shifting platform in many ways.

“Firstly, what does a customer who’s getting great, valuable, and useful usage of this feature look like?”

The way we tend to work with something like this, with new products that we’re launching, part of it is defining success metrics with the product group. Firstly, what does a customer who’s getting great, valuable, and useful usage of this feature look like? What are they achieving? So, we have some success metrics around that. And then, of course, we have our own success metrics internally. We want adoption of the product, and we want customers to be able to adopt it in good ways, because customers will tell us if they don’t like a product: they’ll stop using it. So, we’re trying to understand that from an adoption perspective.

We do a couple of things around launches. The first thing we do is define an adoption funnel. These are the steps in the adoption of the product that a customer goes through on their way to advanced usage. We have an aware phase, an interest phase, an intent phase, and an active phase. We’ve got these stages that we want customers to walk through as they get to be advanced users of the feature.

Then, we have this other thing called Target Customer Tiers. We’ve got our Tier One, the perfect customers, where the feature’s ideal for them. We expect really high usage and really high adoption by this group. We’ve got one adoption funnel for those. Our Tier Two customers are those who we think will find this useful. We think that maybe they’ve used auto messaging or something in the past and they’ll be good candidates for this feature, but they’re not definites, they’re not Tier One. We’ve got different expectations for that group. And then, Tier Three is everyone else: we’re not really sure whether they’re going to adopt at all.

What we do is we’ve got separate adoption funnels for each of these tiers, and we keep an eye on those adoption funnels. And there are different levels per tier. The point of all that stuff is to try and encode our expectations as to what will happen down the road. As product people, as builders, we’re kinda predicting the future. We’re saying we think this is what the product will look like in three months or six months as people are using it.

So, we encode all these expectations and what we think will happen in these tiers and adoption funnels. And then, after the launch, we look at these expectations and see if the data coming in changes our beliefs. So, that’s what we try and do. We generally try and put all the stuff in place so that we can describe our expectations. And then, post-launch, we review the data as it arrives to see if our beliefs change, if something unexpected is happening, or if it’s exactly what we expected. So, that’s our role in all this.

We try and take all that stuff, we take it around to the other teams and we say, “Look, if this is how we expect our customers to be, can we, based on your plans, build a model around all of this that we can then go and try and set very specific targets and goals on?” But essentially, it’s all about encoding expectations and then seeing if the things play out the way we expect them to or not.
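To make the idea of "encoding expectations" a little more concrete, here is a minimal, hypothetical sketch of how per-tier adoption funnels and expected stage-by-stage conversion might be written down and compared against observed post-launch data. The stage names come from Wal's description above; the tier targets, numbers, and the `review` helper are illustrative assumptions, not Intercom's actual tooling.

```python
# Hypothetical sketch: encode expected per-tier adoption funnels,
# then flag stages where observed adoption deviates from expectation.

from dataclasses import dataclass

STAGES = ["aware", "interest", "intent", "active"]

@dataclass
class TierExpectation:
    name: str                  # e.g. "Tier One"
    expected_conversion: dict  # expected fraction of the cohort reaching each stage

# Illustrative numbers only; real targets would come from the data science team.
EXPECTATIONS = [
    TierExpectation("Tier One",   {"aware": 0.9, "interest": 0.6,  "intent": 0.4,  "active": 0.25}),
    TierExpectation("Tier Two",   {"aware": 0.7, "interest": 0.35, "intent": 0.15, "active": 0.08}),
    TierExpectation("Tier Three", {"aware": 0.4, "interest": 0.1,  "intent": 0.03, "active": 0.01}),
]

def review(tier: TierExpectation, cohort_size: int, observed: dict, tolerance: float = 0.2) -> None:
    """Print any stage where the observed rate deviates from expectation by more than `tolerance` (relative)."""
    for stage in STAGES:
        expected_rate = tier.expected_conversion[stage]
        observed_rate = observed.get(stage, 0) / cohort_size
        if abs(observed_rate - expected_rate) > tolerance * expected_rate:
            print(f"{tier.name} / {stage}: expected {expected_rate:.0%}, observed {observed_rate:.0%} -> investigate")

# Example: post-launch counts for a hypothetical Tier One cohort of 200 customers.
review(EXPECTATIONS[0], 200, {"aware": 180, "interest": 95, "intent": 60, "active": 30})
```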

“We go into betas with very opinionated views on what we expect to happen”

Thomas: So, you’re really defining who the product would best suit. Would it be fair to say that as part of those funnels that you’re creating, you’re really trying to test if the problems that you have heard back are actually being solved?

Wal: That’s exactly it. Through many channels, through usage, through your team, and through the various people telling us what the problems are, we basically encode the group we believe has those problems into a tier, into one adoption funnel. And we expect that adoption funnel to perform. If we’ve built the right product, that adoption funnel will perform really strongly, and lots of those customers will progress through to advanced usage. If they do, then that’s what we expect to happen. If that doesn’t happen, then we’re blind to something, and we need to go and investigate or understand why that’s not the case.

Thomas: Nicole mentioned the beta group earlier on, and we’re testing messages to see if they resonate. Are you doing the same thing with the funnels? Are you testing to see if the funnel does work and if it’s doing what we intended?

Wal: That’s a great question. And this is something we’ve got better at with time. Early in the beta, there’s no point doing that, because early beta activity data is just people clicking around, playing, and testing things out, which is exactly what we want. Early in the beta phase, we’re just trying to see if people are using the product even roughly in the correct way; we’re just trying to monitor. And sometimes we find things. “Everyone who wanders in here suddenly clicks on this part, why is that?” Those are the kinds of questions we’re asking early in the beta rounds.

But later on in the beta, towards the end of it, what we really do is form hypotheses for what should happen. If somebody is using this feature, we have a hypothesis for what they should achieve through using it. Later on in the beta, we’re basically trying to see if there is any big evidence that would challenge that hypothesis. Is the beta telling us that we’re wrong?

We go into betas with very opinionated views on what we expect to happen, and we only really react if we see something very different, or if something unexpected happens in the beta. We tend to approach a beta in a very hypothesis-driven way.

Running the beta

Thomas: Coming back to Lucas and Alex, are you looking out for the same things in a beta release? What are the key things that you get out of a beta group trying out new features?

Lucas: I will try to answer these questions through an engineering lens, as I expect Alex is going to approach the discussion more through a product lens. Series was a big project that was in the making for about two years. Most people don’t know this. We wanted to ensure that customers would not just have a great UX, we also wanted them to have a great product experience overall, which these days boils down to operational excellence. In other words, we wanted customers to have confidence that their users would get their messages at the right time, in the right place. There should be no doubts about the behind-the-scenes excellence of Series.

So we did two things. We ran some load tests simulating real customers’ usage, mimicking campaigns and so on, so we could ensure we had the same load for Series that we had for Campaigns and that our systems were working as we expected.

However, we also wanted to use the beta to stress-test some crazy use cases. For example, questions like how customers would use such a feature, how many users they would target, how many nodes they would set up. Those were questions that we didn’t have a great answer to. Because Series is a brand new product, customers might use Series in a different way than they used Campaigns. The way that we were testing this out behind the scenes wasn’t enough, and we wanted to release this to beta customers to understand how they were using it.

“I think we spent three months in beta trying to understand each of those little bits and pieces”

The beta allowed us to learn more about how our systems work under pressure, how our systems react to those crazy use cases that were completely unexpected. Instead of making assumptions, putting some technical limits in place, and frustrating customers, we decided to be more open-minded: let’s release this to beta customers. Let’s understand and learn how they are going to use this. Let’s improve our systems, and let’s make sure that we are releasing the best product, not only from a product point of view but also from an operational point of view.

Alex: The beta was absolutely instrumental for us and for pretty much all of the decision-making. We tried to de-risk a lot of our decisions even before we went to the beta. And as Lucas mentioned, Campaigns is a different product, but we still tried to look at some of the Campaigns usage to make projections for how people would be using Series. But those are just projections, and it’s a completely different story when people actually try to use it in the wild.

We created a number of prototypes for the actual visual builder and did some user tests, which were extremely helpful, but those user tests were not the real thing; they were not actual people setting up messages to send to their customers. That’s a lot of anxiety, and it really stress-tests design decisions and product approaches.

And so, knowing that Series is such a complex and multi-layered product, we knew that we needed to have a bunch of time scheduled for the beta. I think we spent three months in beta trying to understand each of those little bits and pieces and trying to stress-test all of the decisions we had made. And then, we just used the beta as a way to get ongoing customer input into all of the things we were doing.

Wal: One tiny, little extra point on the beta that I think is valuable: we spent a lot of time figuring out who would be in the beta, the customers that would participate. Shwaytaj Raste, the product manager, spent a lot of time trying to get a good balance of existing customers from the old product — some who were happy, some who were frustrated — and new customers who knew nothing about it, just to try and get a mix of all the different kinds of people who might be exposed to Series. Because it’s a complex product, there’s a lot to it.

I think this is probably the best release we’ve done in terms of the thoughtfulness about who was actually in the beta program. Some customers contacted us and gave us feedback, and some never gave us any feedback at all. I think that was a smart thing with this product: there was a thoughtful notion of who would actually be in the beta program.

Thomas: It’s really interesting to bring the beta in. I think when we started this conversation around customer feedback and how it shapes, or helps to shape, the products that we build, we were really thinking of customer feedback in terms of an intake system and then an output process. I think what the beta allows you to do is introduce feedback throughout that entire process.

Coming up soon

Thomas: Before we finish up, I wanted to come back to Nicole. Lucas expertly escaped telling me whether there were any fast follows we could expect from Series. Are there any particular feature releases on the cards for Series?

Nicole: Yeah, we have some really exciting features coming out over the next couple of weeks. One of the biggest is A/B testing, the ability to test multiple Series against each other to see what is performing best. This is great for performance in general, but also for really understanding your specific customer base or your user base and what will drive impact. So, that’s our first.

“Sometimes what you get are tools that completely redefine the job you were doing in the first place. And it all starts with a conversation”

We also have annotations, which is the ability to take notes and collaborate with teammates within the Series canvas in the builder. You can think of this as “I need my manager to review everything I just built and make sure it’s okay,” or “I want to get feedback from a colleague and collaborate within the tool itself rather than having to move everything into docs and collaborate offline.” This gives our customers the ability to collaborate within Series itself. Those are some of our biggest ones coming down the line, so keep an eye out for those.

Thomas: Nicole, Lucas, Alex, and Wal, thank you again for joining us on this podcast. Before I sign off, here’s my quick reflection on what I’ve learned over these past two episodes.

As I said, it would be easy to imagine customer feedback as a simple intake and output process. But the approach at Intercom is different: each team looks at that unvarnished feedback and tries to deeply understand the problem to be solved, and feedback keeps coming in throughout the entire process as all of the teams work together on a solution. As a result, what is built most often goes beyond what customers initially asked for. Sometimes those features will simplify your workflows, making it easier for you to do the things you need to do, and sometimes what you get are tools that completely redefine the job you were doing in the first place. And it all starts with a conversation.
