What is split testing and how do we use it?

Split testing, or A/B testing, is the process of creating different variations of a marketing asset, such as an ad, a landing page (a mini web page), or an email, and then presenting them to your audience to observe which variation performs best.

Say a prospective parent sees one ad with a student reading and another with a student playing. They click on the first but not the second.

Or a prospective parent sees a landing page with a red “Inquire Now” button and another with a green “Inquire Now” button, clicking on the first but not the second.

This is a profound and very interesting concept with a lot of implications. Let’s peel back the layers one at a time.

The first thing we should mention is that split testing shifts our mindset. Instead of holding a strong opinion about whether something looks nice or whether it is what parents want to see or hear, we suspend judgment and let the parents decide for themselves.

Certainly, one or two clicks don’t mean anything, but when you start getting hundreds of clicks, patterns emerge. If you do your testing right, you get statistically significant results.
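As a rough illustration of what “statistically significant” means here, the sketch below applies a standard two-proportion z-test to two ad variations. The numbers are hypothetical and this is not part of our client-facing workflow; it simply shows the kind of check that separates a real pattern from noise.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis that the two ads perform the same
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical test: ad A got 120 clicks from 4,000 impressions,
# ad B got 80 clicks from its own 4,000 impressions.
z, p = two_proportion_z_test(120, 4000, 80, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # here p < 0.05, so the gap is unlikely to be noise
```

With hundreds of clicks per variation, even a one-percentage-point gap can clear the conventional p < 0.05 bar; with only a handful of clicks, it almost never will.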

This way, parents “vote” for what they prefer, instead of us deciding.

[Image: how parents would “vote” for the second of two ads. Source: blog.twn.ee]

It is also important to note that parents often do not know exactly what they want or what appeals to them, because these desires and preferences operate on a subconscious level. When you click on an ad, you often don’t really know what appealed to you; you just went ahead and clicked.

[Image: an example where changing a single word increased an ad’s performance.]

Split testing is easier in the digital world because it costs very little to create a variation of something. Although the same is not true in the physical world, the concept can still be applied to any experience you provide to prospective families. For example, you could have two student ambassadors deliver different messages during your tours and observe which message resonates more with parents.

Split testing goes hand in hand with measurement. You need to measure the result of your test objectively, in order to draw a satisfactory conclusion.

We use split testing on all of your campaigns. A typical testing regimen might look something like this: we choose 3 different images, 3 different titles, and 5 different targeting options, giving us 3 × 3 × 5 = 45 variations.
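To make that multiplication concrete, here is a minimal sketch of how such a variation grid can be enumerated. The file names, titles, and audience labels are hypothetical placeholders, not our actual campaign assets:

```python
from itertools import product

# Hypothetical assets; a real campaign draws these from the client's materials.
images = ["student_reading.jpg", "student_playing.jpg", "campus_tour.jpg"]
titles = [
    "Now Enrolling for Fall",
    "Schedule Your Tour Today",
    "Small Classes, Big Futures",
]
audiences = ["zip_30301", "zip_30305", "parents_lookalike", "interest_montessori", "ages_25_45"]

# Every (image, title, audience) combination becomes one ad variation.
variations = list(product(images, titles, audiences))
print(len(variations))  # 3 * 3 * 5 = 45
```

Each combination becomes one ad, and each ad accumulates its own impressions and clicks, which is what makes the “voting” measurable.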


We will do this again and again, optimizing the campaigns and indirectly asking your audience to vote for what they like best.

Here are some common campaign elements that we always test:

  1. Zip codes.
  2. Interests and lookalike audiences.
  3. Android vs. iPhone.
  4. Age ranges.
  5. Ad designs.
  6. Ad image colors.
  7. A photo vs. an illustration.
  8. Different types of photos.
  9. Informational vs. storytelling ad text.
  10. Short copy vs. long copy.
  11. Titles.
  12. Landing page colors.
  13. Landing page form fields.
  14. Short vs. long landing pages.
  15. Landing page calls to action and buttons.
  16. Landing page support elements (testimonials, arrows, etc.).

We might create 40 different ads for each campaign until we get a sense of what works best for your school.

At Enrollhand, we have a weekly cycle in which we review all of our testing.

We track ad click-through rate (CTR), landing page conversion rate (CR), and cost per inquiry (CPI) in an Excel spreadsheet. Every Friday, we sit down to evaluate the results and decide whether we will KEEP the tested changes or RETURN to the previous setup. We also determine whether to roll out the changes to additional clients.
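For reference, the three metrics are simple ratios. The sketch below shows how they are computed; the weekly numbers are hypothetical, and it treats ad clicks as landing page visits, which is a simplifying assumption. In practice we maintain these figures in the spreadsheet rather than in code.

```python
def weekly_metrics(impressions, clicks, inquiries, spend):
    """The three ratios we track for every campaign, every week."""
    ctr = clicks / impressions   # ad click-through rate
    cr = inquiries / clicks      # landing page conversion rate (clicks ~ page visits)
    cpi = spend / inquiries      # cost per inquiry, in dollars
    return ctr, cr, cpi

# Hypothetical week: 50,000 impressions, 600 clicks, 30 inquiries, $450 spent.
ctr, cr, cpi = weekly_metrics(50_000, 600, 30, 450.0)
print(f"CTR = {ctr:.2%}, CR = {cr:.2%}, CPI = ${cpi:.2f}")
# CTR = 1.20%, CR = 5.00%, CPI = $15.00
```

Here are a few of our past campaign tests: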

1. Redesigned Landing Page Enhanced with Children's Image Gallery and Lightbox CTA

We did not want to include additional information about the school on the landing page. In a past experiment (July 2018), we had confirmed that adding information beyond a certain point significantly reduced conversions, for two reasons:

a. It increased distraction and confusion for visitors to our landing pages.

b. Parents got enough information without needing to start a conversation with the school.

Results: Only one out of the 13 schools in the experiment saw a limited improvement.

Insights: Parents were already getting a lot of visual content at the ad level, so additional photos or footage on the landing page did not provide any added value. We change up campaigns every couple of weeks (depending on how frequently parents have already seen the running ads), so parents get a healthy dose of the school's facilities, staff, vibe, etc.

2. Switching Our Ads from Short to Long Form

When creating Facebook ads, most elements have to fit within set limitations. The image has to be a certain ratio and size, and the headline and link description have to be a certain length or they get cut off. That leaves the ad's body text as the primary variable: it can be a single word or the equivalent of a short story. Since the ad text has so many possibilities, we at Enrollhand get asked all the time which length works best, so we invested $7,642 of ad spend to get a scientific answer.

Results: This experiment was a 'keeper' for 8 out of the 20 schools in the sample; those 8 saw a significant change, while the other 12 did not see any improvement.

Using these experiments, for example, we were able to completely turn around ATLAS, a school that had been underperforming for three consecutive weeks. ATLAS’s cost per inquiry has since stabilized at 74% below the previous period.

Insights: For clients hovering around a 5% conversion rate, increasing the body text in our ads added an average of about 4% to their performance. Clients who had already broken the 5% threshold saw little or no improvement.

3. Softer Opt-in with Reduced 'Ask'

In this experiment, we were testing whether requiring a smaller commitment would boost conversions. We switched the copy in our ads' Call-to-Action and the inquiry form from 'Get In Touch' to 'Receive more Info' (location, grade levels, and tuition) or 'Subscribe to our Updates.' We tested this approach on a total of 35 clients.

Results: We saw significant improvement in 20 out of the 35 schools in the experiment. The exciting part was that our follow-up via text messages and phone calls revealed these parents to be equally, if not more, responsive, leading to campus tours and enrollments.

Insights: When a parent opts into a softer ask, they may very well be interested in being contacted. At the campaign level, we should be focused on initiating the conversation. After that, a light text message follow-up is ideal for gauging whether the parent is simply browsing or actually shopping around for schools.

The approach described above shows why it makes little sense to spend a lot of time in advance on the minute details of a particular ad. The testing process will gradually morph the campaign into what parents actually want (even if they don’t know it).

It can feel a bit frustrating to give up control to the crowd like this, but it soon becomes liberating when you realize that you can essentially listen to what your prospective families want and give it to them.

Testing is a continuous process, and our aim is to make small, gradual improvements. You will first see reach and impressions expand bit by bit. Then engagement and clicks will increase, and inquiries will follow. Scheduled calls will start and multiply. Finally, you’ll get more visits and enrollments.