In the rapidly changing world of online fundraising, you should never underestimate the power of an email. According to the M+R Benchmarks 2018 Study, email accounted for 28% of online revenue in 2017, with nonprofits raising an average of $42 for every 1,000 fundraising messages sent.
Email still matters—but it’s getting more and more important to craft those messages strategically. The same study reported that response rate to fundraising emails declined by 6% in 2017. Following the same formula and crossing your fingers for better results won’t cut it.
So how can you start refining your fundraising emails? Through A/B testing.
WHAT IS A/B TESTING?
First, let’s make sure we’re all on the same page. A/B testing compares two versions of the same variable to determine which is more effective at achieving a stated aim. All other variables are held constant. If you’re not familiar with this type of experiment, an example should help clarify it:
ABC Nonprofit is planning to send out a fundraising email, and their annual giving coordinator wants to know if changing the color of the call to action (CTA) will impact the click-to-open rate (CTOR; the number of people who click on the call to action divided by the number of people who opened the email).
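The CTOR formula just described is simple division, and a small sketch makes it concrete. The counts below are hypothetical, not from the study:

```python
def click_to_open_rate(clicks, opens):
    """CTOR: clicks on the CTA divided by emails opened."""
    if opens == 0:
        return 0.0
    return clicks / opens

# Hypothetical counts: 4,000 opens, 80 CTA clicks
print(f"{click_to_open_rate(80, 4000):.0%}")  # prints 2%
```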
She creates the fundraising email with their standard blue CTA, then creates an identical version of the same email with a red CTA. Everything else about the messages remains the same: subject line, images, copy, etc. The annual giving coordinator randomly splits the ABC subscriber list down the middle and sends half Version A and half Version B.
After the email goes out, the annual giving coordinator analyzes the results. Version A with the typical blue had a 2% CTOR, while Version B with the red call to action had a 5% CTOR. The annual giving coordinator decides to perform the same test in future fundraising emails. If she sees similar results, she’ll start using red CTAs more frequently.
It’s as simple as that! A/B testing is nothing new, and it’s not unique to email fundraising or even to the online sphere. Organizations can use A/B tests to refine elements of their websites, social media ads—even their direct mail. In this post, we’re going to focus on email, but take note of these other possibilities.
WHY SHOULD I A/B TEST?
Best practices matter, and they can get you far. I can tell you that, in general, you should personalize your emails as much as you can, use eye-catching colors to make your CTAs pop, and emphasize the solution rather than focusing on the problem in your fundraising email copy.
But I can’t tell you how YOUR audience will respond to a specific email. You’ve carefully built up your email subscriber list. It’s your direct connection to people who have opted in to engage with your organization. A/B testing gives you the chance to test and re-test small changes, see how that curated list of supporters responds to your emails, and pivot accordingly.
WHAT VARIABLES SHOULD I TEST?
Wait! Before you jump to variables, think about your goal and the problem you want to solve. Does your open rate seem low? Are you hoping to raise your CTOR? Maybe you want to focus on the number of gifts coming in or the average gift size per fundraising email.
Of course, you probably have multiple goals, but pick one to focus on. Once you decide this, it will be easier to choose the variable to test. The options are nearly endless, but some common choices include:
Improving open rate:
- Subject line (copy, length, personalized vs. generic)
- Sender name
- Day of week
- Time of day
Increasing CTOR, number of gifts, or average gift size:
- CTA color
- CTA copy
- Location of CTA in email
- Messaging (in body of email)
Note: If you find that your average CTOR is increasing, but the number of gifts per email isn’t budging, it may be time to make changes to your donation webpage.
HOW DO I RUN A TEST?
First, steer clear of A/B testing during the biggest fundraising pushes of the year. It’s not a good idea to experiment with your tried-and-true formula during these crucial time periods. Most nonprofits send fundraising emails throughout the year, so you likely will still have ample opportunity to test at other times.
Many email marketing services offer a built-in A/B testing feature, so investigate whether your platform has this functionality. If not, you can still run an A/B test manually, following these steps:
- Create the Version A email
- Clone Version A and change one variable to create Version B
- Randomly select a segment of your email list—this segment should be at least 1,000 subscribers
- Split this list down the middle and send half Version A and half Version B
- After you’ve analyzed the results (give it at least 4 hours), send the “winning” version to the rest of your email list
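The random-split steps above can be sketched in a few lines of Python. This is a minimal sketch, assuming subscribers are stored as a plain list of email addresses; the names and sample data are hypothetical:

```python
import random

def split_for_ab_test(subscribers, seed=None):
    """Randomly shuffle the segment and split it into two equal halves."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)
    midpoint = len(pool) // 2
    # First half receives Version A, second half receives Version B
    return pool[:midpoint], pool[midpoint:]

# Hypothetical segment of 1,000 subscribers
segment = [f"subscriber{i}@example.org" for i in range(1000)]
group_a, group_b = split_for_ab_test(segment, seed=42)
print(len(group_a), len(group_b))  # prints 500 500
```

Passing a seed keeps the split reproducible, which is handy if you need to re-check later which subscribers received which version.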
If the email list in question is smaller than 1,000 subscribers, you can still perform the A/B test without segmentation. Just split it 50/50 and remember to take the results with a grain of salt, given your small sample size.
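One way to quantify that grain of salt is a quick two-proportion z-test, a standard statistical check (not something prescribed by the post itself). Roughly speaking, if the z statistic is smaller in magnitude than about 1.96, the difference between the two versions could easily be chance. The click and open counts below are hypothetical:

```python
import math

def two_proportion_z(clicks_a, opens_a, clicks_b, opens_b):
    """z statistic comparing two CTORs; |z| >= 1.96 is roughly 95% confidence."""
    p_a = clicks_a / opens_a
    p_b = clicks_b / opens_b
    pooled = (clicks_a + clicks_b) / (opens_a + opens_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / opens_a + 1 / opens_b))
    return (p_a - p_b) / se

# The same 2% vs. 5% CTOR gap at two hypothetical sample sizes
print(two_proportion_z(50, 2500, 125, 2500))  # large sample: |z| well above 1.96
print(two_proportion_z(4, 200, 10, 200))      # small sample: |z| below 1.96
```

The same percentage gap that is convincing on a large list can be statistical noise on a small one, which is exactly why small-list results deserve extra skepticism.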
Keep testing: If you think you’ve found something that makes an impact, try it out a few more times. If you achieve similar results, incorporate that change into your future fundraising emails. A/B testing can and should be an ongoing process, helping you stay nimble, pivot quickly—and ultimately raise more online.
If you'd like to learn more about A/B testing or other tactics to enhance your email fundraising strategy, contact me at email@example.com.