It’s very beneficial to run A/B tests (aka 'split tests') of your messages when you’re starting out with cold mailing or trying out new ideas. Testing different variants and analyzing the results will tell you which strategy works best for your business. It’s easier to be confident in your decisions when you have data, not just your gut, to back them up.
Our app has a feature that lets you test different messages at each step of a sequence, but it's also possible to test entirely different sequences and targeting criteria! Here is a guide to A/B testing and what to keep in mind:
How to perform an A/B test of a step in a sequence?
In Growbots, you can easily create and test different Variants in each step of a sequence. When you assign a sequence with different variants to a batch of prospects, the system will randomly split them into even groups, each receiving one of the variations. Once you create the Variants, you can easily turn them on and off with the switch on the left!
As every Variant has its own statistics (opened, clicked, replied, and warm rates), you’ll be able to conveniently compare their performance and make decisions based on the metric that matters most to you. Once you have enough data to draw meaningful conclusions and see that one Variant significantly underperforms the other, you can turn off the one that isn’t performing.
Keep in mind that the variants are distributed randomly among the prospects who receive them. Moreover, they are distributed at each step independently, so the variant sent to a given prospect in Step 1 has no impact on which variant the same prospect receives in the subsequent steps! The same prospect can receive variant A in Step 1, variant B in Step 2, and variant C in Step 3 (see the sketch below). If you'd like to A/B test entirely different sequences, take a look at the last section of this article!
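To make that independence between steps concrete, here is a simplified, hypothetical Python sketch (not Growbots code) of what per-step assignment looks like: at every step the prospects are shuffled again and split across that step's variants, so the draw at one step says nothing about the next. All names and addresses below are made up for illustration.

```python
import random

# Hypothetical data for illustration only
prospects = ["alice@example.com", "bob@example.com", "carol@example.com", "dan@example.com"]
steps = {"Step 1": ["A", "B"], "Step 2": ["A", "B"], "Step 3": ["A", "B", "C"]}

assignments = {p: {} for p in prospects}
for step, variants in steps.items():
    shuffled = prospects[:]
    random.shuffle(shuffled)  # a fresh, independent shuffle for every step
    for i, prospect in enumerate(shuffled):
        assignments[prospect][step] = variants[i % len(variants)]

for prospect, per_step in assignments.items():
    print(prospect, per_step)  # e.g. {'Step 1': 'B', 'Step 2': 'A', 'Step 3': 'C'}
```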
How to do A/B testing correctly?
It's good to keep a few things in mind before you start testing your Templates:
Start by deciding what you want to test. Test just one thing at a time so you can tell what’s really impacting your results. For example, if you’re comparing different testimonials, leave the remaining elements of the message (subject line, CTA, etc.) the same. Otherwise, if one email outperforms the other, it will be hard to tell what made the difference.
Remember that the larger the test sample and the longer the time frame, the more statistically significant the results will be. The effectiveness of certain elements can also change over time, so it’s best to test your ideas regularly.
You can compare more than two Variants at a time; just remember not to split the traffic too thinly, so each sample stays solid (we recommend drawing conclusions about the differences between Variants only once there are at least 100 sent messages per Variant). A rough way to check whether a difference is more than noise is sketched below.
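If you want a quick sanity check on whether the gap between two Variants' reply rates is more than random noise, a standard two-proportion z-test is one way to do it. This is not a Growbots feature; the snippet below is just a minimal, self-contained Python sketch, and the function name and example counts are purely illustrative.

```python
import math

def reply_rate_significance(sent_a, replies_a, sent_b, replies_b):
    """Two-proportion z-test on the reply rates of two Variants."""
    rate_a = replies_a / sent_a
    rate_b = replies_b / sent_b
    # Pooled reply rate under the assumption that both Variants perform the same
    pooled = (replies_a + replies_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the normal distribution (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rate_a, rate_b, z, p_value

# Made-up example: Variant A got 18 replies out of 120 sends, Variant B got 9 out of 115
rate_a, rate_b, z, p = reply_rate_significance(120, 18, 115, 9)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}  p = {p:.3f}")
# A p-value below roughly 0.05 suggests the gap is unlikely to be pure chance;
# with only ~100 sends per Variant, many comparisons will stay inconclusive.
```

The same check works for open or click rates; just swap in the relevant counts.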
A/B testing of sequences and targeting criteria
Although the in-app A/B testing feature is available only in the sequence editor, which lets you run A/B tests on specific steps of a sequence, we strongly encourage you to use a similar strategy to A/B test entire sequences or different targeting criteria!
A/B testing of sequences
To test entirely different sequences against the same target, all you need to do is create two campaigns with the same target (you can do that by creating one campaign and then cloning it; this way the new campaign uses the exact same targeting criteria). Then use slightly different sequences in those campaigns and observe which one performs better!
AI Variant Generation
You can now also generate a new variant for any email step in the sequence and get email content generated with GPT, ready for testing. Using multiple types of messaging content also reduces deliverability risks.
A/B testing of targeting criteria
Similarly, you can create two campaigns and try out different targeting criteria to see, for example, whether your campaigns perform better with a narrower target or with a wider one. As you can see the results for every campaign separately, you’ll be able to easily verify which targeting criteria worked best!
A/B testing is not an overnight project; it requires some time and effort. However, the work you put into this process will surely pay off in the long run. Happy testing!