AB testing
AB testing lets you compare two versions of one element in your campaign (subject line, sender, content, or call-to-action image) and automatically sends the better-performing version to the rest of your list. It removes the guesswork from your campaign decisions and gives you data you can act on across future sends.
AB testing is available on Pro and Premium subscriptions.
Prerequisites
- You have a Pro or Premium Flexmail subscription.
- You have a campaign ready in the campaign creation flow, at step 3 (the overview page).
- For sender tests: both sender addresses must be validated in your account.
- For content tests: you have two different messages prepared.
How it works
You create two variants of one element. Flexmail takes a portion of your recipient list (the test group) and splits it in half, sending variant A to one half and variant B to the other. After a waiting period you set, Flexmail evaluates the results and automatically sends the winning variant to the remainder of your list.
The winner is determined by open rate for subject line and sender tests, and by click rate for content tests. Call-to-action tests work differently; see below.
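The split-and-evaluate flow described above can be sketched in code. This is an illustrative simulation only, not Flexmail's actual implementation: the recipient list, test-group fraction, and open rates below are all made up.

```python
import random

def run_ab_test(recipients, test_fraction, metric):
    """Split a test group in half, measure each variant, return the winner.

    Illustrative sketch only -- not Flexmail's internal logic.
    `metric` maps a variant label and its recipients to a rate
    (e.g. open rate for subject line tests).
    """
    random.shuffle(recipients)
    test_size = int(len(recipients) * test_fraction)
    half = test_size // 2
    group_a = recipients[:half]
    group_b = recipients[half:test_size]
    remainder = recipients[test_size:]

    rate_a = metric("A", group_a)
    rate_b = metric("B", group_b)
    winner = "A" if rate_a >= rate_b else "B"
    return winner, remainder  # the winning variant goes to the remainder

# Hypothetical example: variant B opens slightly better.
fake_rates = {"A": 0.20, "B": 0.25}
recipients = [f"contact{i}@example.com" for i in range(1000)]
winner, remainder = run_ab_test(
    recipients,
    test_fraction=0.2,
    metric=lambda variant, group: fake_rates[variant],
)
print(winner, len(remainder))  # -> B 800
```

With a 20% test group on 1,000 contacts, 200 contacts form the test group (100 per variant) and the remaining 800 receive the winner.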
Attention: You can only run one AB test type per campaign, and you can't change the type after the test has started.
The four AB test types
Subject line test
Test two different subject lines. The variant with the higher open rate after the test period wins, and the winning subject line goes to the rest of your list. This is one of the easiest tests to set up and can have a meaningful impact on your overall open rates.
Sender test
Test two different sender names or email addresses. Both addresses must be validated in your Flexmail account. The winner is determined by open rate.
Content test
Test two entirely different messages. This is the most comprehensive test type, useful when you want to compare different approaches, layouts, or offers. The winner is determined by click rate.
Call-to-action test
Test different images for a specific call-to-action in your message. Unlike the other test types, this sends to your entire list, not a test group. Flexmail alternates between the variants and declares a winner once a set number of click actions is reached. After that point, all readers see the winning version.
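The alternating behaviour of a call-to-action test can be illustrated with a short sketch. The click threshold, number of opens, and click probabilities below are hypothetical, and Flexmail's actual threshold and selection logic are not documented here; this only shows the general pattern of alternating until enough clicks arrive, then locking in a winner.

```python
import random

def serve_cta(opens, click_threshold, click_prob):
    """Alternate CTA image variants until a click threshold is reached,
    then lock in the variant with more clicks. Illustrative sketch only."""
    clicks = {"A": 0, "B": 0}
    winner = None
    served = []
    for i in range(opens):
        if winner is None:
            variant = "A" if i % 2 == 0 else "B"  # alternate while testing
            if random.random() < click_prob[variant]:
                clicks[variant] += 1
            if clicks["A"] + clicks["B"] >= click_threshold:
                winner = "A" if clicks["A"] >= clicks["B"] else "B"
        else:
            variant = winner  # all later readers see the winning image
        served.append(variant)
    return winner, served

random.seed(1)
winner, served = serve_cta(
    opens=5000,
    click_threshold=100,   # hypothetical threshold
    click_prob={"A": 0.03, "B": 0.06},
)
print(winner)
```

Once the threshold is hit, every subsequent reader is served the winning variant, which is why this test type sends to your whole list rather than a separate test group.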
Set up a subject line, sender, or content test
From the campaign overview page (step 3 of the campaign creation flow), click the AB testing button before proceeding to the sending options.

- Select the test type.
- Enter the second variant. The first variant is whatever you entered in step 1 of the campaign.
- Set the size of your test group as a number or percentage of your contact list.
- Set the waiting period. After this time, Flexmail evaluates the results and sends the winner.
- Click Save to confirm your test setup.

Viewing your results
Once the test is complete, you'll find the results in the AB testing section of your campaign report. The winning variant is clearly marked, along with the open or click rate for each variant.

Pro tips
- Keep a running log of your AB test results across campaigns. Patterns emerge over time: your audience may consistently prefer shorter subject lines, action-oriented language, or name personalisation. That knowledge is more valuable than any general best-practice guide.
- Test one element at a time. If you change both the subject line and the sender in the same test, you won't know which change drove the result.
- Set your test group large enough to get meaningful results: at least a few hundred contacts per variant. Very small test groups produce unreliable data.
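The "how large is large enough" question can be made concrete with the standard sample-size formula for comparing two proportions. This is general statistics, not a Flexmail feature, and the 95% confidence / 80% power defaults and example rates below are assumptions for illustration.

```python
from math import ceil

def min_group_size(base_rate, lift, z_alpha=1.96, z_beta=0.84):
    """Approximate contacts needed per variant to detect an absolute
    `lift` over `base_rate` (defaults: 95% confidence, 80% power).

    Standard two-proportion sample-size formula; illustrative only.
    """
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    n = ((z_alpha + z_beta) ** 2) * 2 * p_bar * (1 - p_bar) / (lift ** 2)
    return ceil(n)

# To reliably detect a 5-point lift over a 20% open rate:
print(min_group_size(0.20, 0.05))  # -> 1094 contacts per variant
```

Small differences need surprisingly large groups: detecting a 5-point lift takes roughly a thousand contacts per variant, while a 10-point lift needs only a few hundred.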
Common mistakes to avoid
- Setting the waiting period too short. A two-hour window may not capture contacts who check email later in the day. Set the waiting period to at least 4 hours, or let the test run overnight for B2B audiences.
- Declaring a test result meaningful when the margin is very small. If both variants performed within 1-2 percentage points of each other, the difference may not be statistically significant. Look for consistent patterns across multiple tests rather than acting on a single close result.
- Testing too infrequently. One test per campaign is fine, but if you only run a test a few times a year, you're not building useful knowledge about your audience.
- Using a call-to-action test for a very low-engagement list. This test type needs a minimum number of clicks to declare a winner. If your list is small or your click rate is low, the test may run for a very long time.
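Whether a small margin is meaningful can be checked with a standard two-sided two-proportion z-test. This is general statistics rather than anything Flexmail reports, and the open counts below are made-up examples of a close result.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test: returns (z, p_value).

    A small p-value (e.g. below 0.05) suggests the difference between
    the two rates is real rather than random noise.
    """
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test groups of 250 contacts each:
# 22% vs 24% open rate -- a 2-point margin.
z, p = two_proportion_z(55, 250, 60, 250)
print(round(p, 2))  # p is well above 0.05: this margin could easily be noise
```

With 250 contacts per variant, a 2-point margin is nowhere near significant, which is exactly why a single close result should not drive your decisions.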
Next steps
- Run your first subject line test on your next campaign. Compare a functional subject line against a curiosity-driven one and see what your audience responds to.
- Once you've run several tests, look for patterns in your results. See "What makes a good subject line?" for subject line guidance.
- Combine AB testing with segmentation: test different approaches for different audience segments to build segment-specific insights over time.