Intercom makes it easy to measure the effectiveness of your messages with our goals, A/B testing and control group features. First, let's look at goals.

Goals

Every message you send has a goal. But open and click rates will only tell you so much. Goals are the true measure of how a message is performing. When creating an auto message, you can set a goal to measure its effectiveness. You can see each user who matched the goal after receiving the message.

To add a goal to an automatic message, choose it from the dropdown in the 'Set a goal' tab. Next, choose the result you want to measure. In the example below we set our goal to 'Plan is pro'. Each user who gets this message and then upgrades to our pro plan will contribute to this goal's total.

You can set a goal using any piece of data in Intercom, whether it's an event or a user attribute; it can be any desired outcome for the message. For example, your goal could be to see users upgrade to your pro plan, to increase their iOS sessions, to send more messages, or to upload an avatar.
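
If it helps to think of this in code, here's a minimal sketch of a goal as a simple condition evaluated against each recipient's data. The user records and the 'Plan is pro' check below are hypothetical examples, not Intercom's actual data model:

```python
# A minimal sketch of a goal as a condition checked against user data.
# The user records and the 'Plan is pro' check are hypothetical examples,
# not Intercom's actual data model.

def goal_met(user: dict) -> bool:
    """Goal: 'Plan is pro' - true once the user has upgraded."""
    return user.get("plan") == "pro"

recipients = [
    {"id": 1, "plan": "free"},
    {"id": 2, "plan": "pro"},   # upgraded after receiving the message
    {"id": 3, "plan": "pro"},
]

hits = [u for u in recipients if goal_met(u)]
print(f"{len(hits)} of {len(recipients)} recipients hit the goal")
```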

Once the message is live, you can track your goal's performance on the message stats bar.

How are goals calculated?

Goals are calculated based on the number of messages sent. In other words, a message's goal percentage is the share of all recipients who went on to hit the goal, not just the share of those who opened the message.

Note: Goals used to be calculated based on the number of opened messages. We updated this statistic to make it consistent with all other message statistics.

You can easily filter your goal stats by the number of messages sent, opened or clicked:
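
To see why this filter matters, here's a quick illustration (with made-up numbers) of how the same number of goal hits yields a very different percentage depending on whether you divide by messages sent, opened or clicked:

```python
# Illustrative only: the same 60 goal hits give a different percentage
# depending on which denominator you filter by. Numbers are made up.
sent, opened, clicked, goal_hits = 1000, 400, 120, 60

print(f"Goal % of sent:    {goal_hits / sent:.1%}")     # 6.0%
print(f"Goal % of opened:  {goal_hits / opened:.1%}")   # 15.0%
print(f"Goal % of clicked: {goal_hits / clicked:.1%}")  # 50.0%
```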

You can of course add goals to one-off manual messages too. In the example below, the goal of our message is to get our customers to invite their first friend, so we have set the goal to 'invite friend count is 1.' If any customer who gets this message subsequently invites their first friend, then they will contribute to the message's goal total.

This video explains more about how goals work.

A/B Testing

Intercom makes it super easy for you to A/B test your messages so that you can fine-tune them to be as effective as possible. All A/B tests should start with a question, i.e. 'will A or B be better at achieving X?'.

Here are some examples:

  • Which content layout will result in more replies?
  • Which content will result in more users upgrading?
  • Which content layout will help me capture more visitor emails?
  • Which subject line will get more users to open my email?
  • Will a button generate more click-throughs than a text link?

There's a lot more to say about why you should A/B test. We wrote a blog post all about it here.

Test, test, and test again

You should always be testing and learning. For example, if you test a button against a link and the button wins, next test the button colour. Then when you have a winner from that, test the button copy. The key point is that the more you test, the more effective your messages will become. Always be testing.

How to set up your A/B test

First, create a new user or visitor auto message. Go to Messages and click on 'New Auto Message' in 'Visitor Auto Messages':

Before putting it live, make sure that you have at least:

  1. Given your message a name.
  2. Selected your audience.
  3. Selected your channel.
  4. Added the content of your message. This message will be message "A" in your test.

Note: You can start an A/B test against a message that's already live. It doesn't have to be when you create your message for the first time.

Now, start your A/B test

Once you have your message A set up, click 'Run a new test' and choose the A/B test option. This will create a draft of your B message. Initially, this is just a duplicate of your existing message. You can now edit the duplicate with whatever you're going to test (e.g., a different subject line, different text, different button colours, or different images).

Best practice

We recommend that you only test one difference at a time. This way, if there is a clear winner, you'll know exactly which change caused the difference, and you won't be left guessing.

Finally, put it live

Once you're happy with the two messages, put them live. 

Monitor progress

You can check performance on the auto message list page, or on the message page. On the message page, just toggle between version A and version B to see the stats for each. As time goes by, you'll be able to see which variant is performing better on the metrics you care about most. For example, if you were A/B testing subject lines, you would likely be most concerned with open rates.
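
Intercom displays each variant's stats for you, but if you export the raw numbers and want to sanity-check that a gap is more than noise, a standard two-proportion z-test is one option. This is just a sketch with made-up figures, not an Intercom feature:

```python
from math import sqrt

# Hypothetical exported stats for an A/B test on subject lines.
sent_a, opens_a = 500, 110   # variant A: 22% open rate
sent_b, opens_b = 500, 140   # variant B: 28% open rate

p_a, p_b = opens_a / sent_a, opens_b / sent_b
p_pool = (opens_a + opens_b) / (sent_a + sent_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
z = (p_b - p_a) / se

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# |z| > 1.96 suggests the gap is significant at roughly the 95% level.
```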

Choose a winner once you have a result

Once you're happy that one message is outperforming the other (for example, when one message's goal % is higher than the other's), select the 'Pause' button on the message you'd like to stop. That message will be paused and people will no longer receive it (unless you decide to put it live again), and 100% of recipients will now get the winning message.

Note: A/B testing is only available for auto messages. You cannot A/B test a manual message.

Prove the impact of your messages with control groups 

With Messages Premium, you can test your messages against control groups to know which messages are driving real action. We’ll send your message to half of your intended audience, and hold back the message for the other half. 

Then, you can easily see which group performed better. This will help you make informed decisions about which messages to keep, and which to refine. 

Set up your control group test

Once you’re finished writing your message, click ‘Run a new test.’ 

Then, select ‘Control group.’ 

Note: 

  • You can only set a control group when you’re drafting a message. Once a message is live, you can’t set a control group for it. 
  • Control groups are available for all message channels (in-app, email and push). 

Now, once you’re happy with your message, set it live. 


How control groups work 

We’ll split the message’s audience into two groups, automatically assigning 50% of your audience to the message and 50% to the control group. That means 50% of your audience won’t receive any message. 
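
Intercom handles this split for you, but conceptually it works like any random 50/50 assignment. As an illustration only (the function and experiment name below are hypothetical), hashing a user ID gives a stable, roughly even split:

```python
import hashlib

def assign_group(user_id: str, experiment: str) -> str:
    """Deterministically assign a user to 'message' or 'control'.

    Illustrative only - Intercom does this for you. Hashing the user id
    together with an experiment name gives a stable, roughly 50/50 split.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return "message" if digest[0] % 2 == 0 else "control"

print(assign_group("user_42", "onboarding_message"))
```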

See the results of your control group experiment 

Once your message is live, it’s easy to see the results of your control group experiment. 

First, click on the ‘Message’ tab - here you’ll see the number of customers who received the message, and the percentage of those customers who completed your goal. 

Then, click on the ‘Control group’ tab - here you’ll see the number of customers who didn’t receive the message (because they were in the control group), and the percentage of those customers who completed the goal. 

If a higher percentage of users who received your message completed your goal than users in your control group, you can likely attribute the lift in performance to your message. 
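
For example, here’s how you might work out the relative lift from the numbers shown on the two tabs (the figures below are made up):

```python
# Hypothetical numbers read off the 'Message' and 'Control group' tabs.
message_group = {"users": 2000, "goal_hits": 240}   # received the message
control_group = {"users": 2000, "goal_hits": 150}   # held back

rate_msg = message_group["goal_hits"] / message_group["users"]   # 12.0%
rate_ctl = control_group["goal_hits"] / control_group["users"]   # 7.5%
lift = (rate_msg - rate_ctl) / rate_ctl

print(f"Message: {rate_msg:.1%}  Control: {rate_ctl:.1%}  Lift: {lift:.0%}")
# A 60% relative lift you can likely attribute to the message itself.
```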

What’s next? 

You can find more information and best practices on control groups here.
