Intercom makes it easy to measure the effectiveness of your messages with our goals, A/B testing and control group features. First, let’s look at goals.
Every message you send has a goal. But open and click rates will only tell you so much. Goals are the true measure of how a message is performing. When creating an ongoing message, you can set a goal to measure its effectiveness. You can see each user who matched the goal after receiving the message.
To add a goal to an ongoing message, choose it from the dropdown under 'Set your goal'. Next, choose the result you want to measure. In the example below we set our goal as 'Web sessions is increased'. Each user who gets this message and then comes back to the app will count towards this goal's total.
You should also set a limit on how long your customers' actions should be counted towards the goal. For this example, we'd like all customers who have additional web sessions within 3 days of receiving the message to be counted.
Pro tip: With a longer time limit you'll see a higher percentage of recipients hit the goal, but it's also more likely their actions have been impacted by other factors, including your other messages.
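As a rough sketch of how a time-limited goal like this could be evaluated (illustrative Python only, not how Intercom computes it internally; the data shapes here are hypothetical):

```python
from datetime import datetime, timedelta

def hit_goal_within(received_at, goal_events, window_days=3):
    """Return True if any goal event (e.g. a new web session)
    happened within `window_days` of receiving the message."""
    deadline = received_at + timedelta(days=window_days)
    return any(received_at < event <= deadline for event in goal_events)

# Hypothetical example: message received Jan 1, new session on Jan 3
# counts towards the goal; a session on Jan 6 would fall outside the window.
received = datetime(2024, 1, 1)
sessions = [datetime(2024, 1, 3)]
print(hit_goal_within(received, sessions))  # True
```

A longer `window_days` would make more sessions count, which is exactly why the pro tip above recommends keeping the limit in mind.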
You can set a goal using any piece of data in Intercom, whether it's an event or a user attribute; it can be any desired outcome for the message. For example, your goal could be to see users upgrade to your pro plan, to increase their iOS sessions, to send more messages, or to upload an avatar.
Once the message is live, you can track your goal's performance on the message stats bar.
How are goals calculated?
Goals are calculated based on the number of messages sent.
You can easily filter your goal by the number of messages sent, opened or clicked:
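For illustration (hypothetical numbers, not Intercom's internals), the same goal count gives a different percentage depending on which denominator you filter by:

```python
def goal_rate(goal_hits, denominator):
    """Goal percentage relative to a chosen base (sent, opened or clicked)."""
    return round(100 * goal_hits / denominator, 1)

# Hypothetical message stats:
sent, opened, clicked, goal_hits = 1000, 400, 120, 60

print(goal_rate(goal_hits, sent))     # 6.0  -> goal % of messages sent
print(goal_rate(goal_hits, opened))   # 15.0 -> goal % of messages opened
print(goal_rate(goal_hits, clicked))  # 50.0 -> goal % of messages clicked
```

The filter you pick changes the story the number tells, so compare like with like when reviewing goals across messages.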
Pro tip: You can of course add goals to one-off messages too. 👌
Intercom makes it super-easy for you to A/B test your messages so that you can fine-tune them to be as effective as possible. All A/B tests should start with a question: 'will A or B be better at achieving X?'.
Here are some examples:
Which content layout will result in more replies?
Which content will result in more users upgrading?
Which content layout will help me capture more visitor emails?
Which subject line will get more users to open my email?
Will a button generate more click-throughs than a text link?
There are plenty more reasons why you should A/B test. We wrote a blog post all about it here.
Test, test, and test again
You should always be testing and learning. For example, if you test a button against a link, and the button wins, next you should test the button colour. Then when you have a winner from that, test the button copy. The key point is that the more you test, the more effective your messages will become. Always be testing.
How to set up your A/B test
First, create a new message. Go to Outbound and click on 'New':
Before putting it live, make sure that you have at least:
Given your message a name.
Selected your audience.
Selected your channel.
Added the content of your message. This message will be message "A" in your test.
Note: You can start an A/B test against a message that's already live. It doesn't have to be when you create your message for the first time.
Now, start your A/B test
Once you have your message A set up, click 'Run a new test' and choose the option for A/B test.
This will create a draft of your B message. Initially, this is just a duplicate of your existing message. You can now edit the duplicate with whatever it is you're going to test (e.g., a different subject line, different text, a different button colour, or different images).
We recommend that you only compare one difference at a time. This way, if there is a clear winner, you'll know the reason, and you won't be left guessing about which variant caused the difference.
Finally, put it live
Once you're happy with the two messages, put them live.
You can check the performance on the auto message list page, or on the message page. On the message page just toggle between version A and version B to see the stats for each. As time goes by, you'll be able to see which variant is performing better in the metrics you care about most. For example, if you were A/B testing subject lines, you would likely be most concerned with open rates.
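The comparison you're making between the two tabs amounts to something like this (invented numbers, a simple sketch rather than Intercom's reporting code):

```python
def open_rate(opened, sent):
    """Open rate as a percentage of messages sent."""
    return round(100 * opened / sent, 1)

# Hypothetical A/B stats for two subject lines:
stats = {
    "A": {"sent": 500, "opened": 110},
    "B": {"sent": 500, "opened": 165},
}

for variant, s in stats.items():
    print(variant, open_rate(s["opened"], s["sent"]))
# B's 33.0% open rate beats A's 22.0%, so subject line B is winning so far.
```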
Choose a winner once you have a result
Once you're happy that one message is outperforming the other (for example, when one message's goal % is higher than the other's), select the 'Pause' button on the message you'd like to stop. That message is paused and people will no longer receive it (unless you decide to put it live again), so 100% of recipients will now get the winning message.
Prove the impact of your messages with control groups
Testing your messages against control groups lets you know which messages are driving real action. We’ll send your message to half of your intended audience, and hold back the message for the other half.
Then, you can easily see which group performed better. This will help you make informed decisions about which messages to keep, and which to refine.
Note: Control groups are only available on certain Intercom plans. See our plans and pricing here.
Set up your control group test
Once you’re finished writing your message, click ‘Run a new test’. Then, select ‘Control group’:
You can only set a control group when you’re drafting a message. Once a message is live, you can’t set a control group for it.
Now, once you’re happy with your message set it live.
Control groups are available for all ongoing messages, including those in a campaign. They’re not available for one-off messages.
How control groups work
When your message goes live, we’ll split its audience into two groups, automatically assigning 50% of your audience to the message and 50% to the control group. That means 50% of your audience won’t receive any message.
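One common way to do a stable 50/50 split (a sketch of the general technique, not Intercom's actual assignment logic) is to hash each user ID, so the same user always lands in the same group:

```python
import hashlib

def assign_group(user_id: str) -> str:
    """Deterministically assign a user to 'message' or 'control' (50/50)."""
    digest = hashlib.sha256(user_id.encode()).digest()
    return "message" if digest[0] % 2 == 0 else "control"

# Hypothetical audience of 1000 users splits roughly in half:
groups = [assign_group(f"user-{i}") for i in range(1000)]
print(groups.count("message"), groups.count("control"))
```

Hashing (rather than flipping a coin on each send) guarantees a user never drifts between the message and control groups mid-experiment.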
See the results of your control group experiment
Once your message is live, it’s easy to see the results of your control group experiment.
First, click on the ‘Message’ tab - here you’ll see the number of customers who received the message, and the percentage of those customers who completed your goal.
Then, click on the ‘Control group’ tab - here you’ll see the number of customers who didn’t receive the message (because they were in the control group), and the percentage of those customers who completed the goal.
If a higher percentage of users who received your message completed your goal than users in your control group, you can likely attribute the lift in performance to your message.
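Reading the two tabs side by side boils down to a simple comparison like this (hypothetical figures, illustrative only):

```python
def goal_pct(completed, total):
    """Percentage of a group that completed the goal."""
    return round(100 * completed / total, 1)

message_pct = goal_pct(90, 600)   # 15.0% of message recipients hit the goal
control_pct = goal_pct(48, 600)   # 8.0% of the control group hit it anyway
lift = round(message_pct - control_pct, 1)

print(f"Lift attributable to the message: {lift} percentage points")
```

If the lift is clearly positive, the message is likely driving the behaviour; if the two percentages are close, the goal completions were probably going to happen anyway.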