
UX writing A/B tests that actually boost clicks (with examples)

  • Writer: Sebastien Smith
  • Aug 29, 2025
  • 5 min read

Updated: Jan 5

Two glass jars labeled A and B on gray fabric hold yellow and black mustard seeds. A small pile of yellow seeds lies in front of jar A.
Photo by me

It's 2026, and let's be honest: writing clean microcopy alone no longer cuts the mustard. UX writers are expected to do more — to think like strategists, and back up our choices with data.


A/B testing is one of the most powerful ways to prove your words have an impact. The premise is simple: show users two versions of your copy and see which one drives more clicks, sign-ups, or sales.


Armed with A/B testing, you go beyond intuition and make data-driven decisions in your UX writing that support the business as a whole.


Why A/B test UX writing?

UX writing needs regular optimisation to keep up with evolving user needs. A/B testing helps you to:

  • Optimise based on real data. Instead of relying on gut feeling or subjective opinions, A/B testing tells you what works for your audience. For example, if your marketing team wants you to inject more brand into your microcopy, A/B testing answers whether it resonates with users or not.

  • Boost clicks and conversions. When you compare two versions of microcopy, A/B testing identifies which one performs better on engagement and click-through rates. Over time, it also shows you which patterns deliver the most impact from your UX writing.

  • Understand your audience better. Before you write anything as a UX writer, you always ask, "Who am I writing for?" A/B testing provides valuable insights into your audience, including how they respond to specific wording or tone of voice.


Before you start A/B testing

Set clear research goals and hypotheses. What are you trying to achieve through testing? What do you think will happen? Here are some typical goals for A/B testing in UX writing:

  • Increase click-through rates

  • Decrease drop-off rates

  • Reduce support tickets


Make your hypothesis specific and measurable. This will help sell it to developers and persuade them to invest the necessary resources. Some examples:

  • "On-brand language will increase email open rates by 10%"

  • "Giving hints throughout account creation forms will decrease drop-offs by 5%"
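A specific, measurable hypothesis also lets you estimate how many users the test will need, which helps when making the case to developers. Here's a rough sketch in Python using the standard normal-approximation formula for a two-proportion test (the 20% baseline open rate and the 10% relative lift are hypothetical numbers, not from any real test):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.8):
    """Approximate users needed per variant to detect a change from
    p_base to p_target with a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p_base - p_target) ** 2)

# Example: detect a lift in email open rate from 20% to 22% (a 10% relative lift)
print(sample_size_per_variant(0.20, 0.22))  # → 6507 users per variant
```

Note how quickly the numbers grow: the smaller the lift you want to detect, the more users you need, and halving the lift roughly quadruples the sample size.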


When you should A/B test UX writing

A/B testing is most effective at high-impact touch points, such as CTA buttons and email subject lines: essentially, anywhere that's directly tied to conversion, so you can measure the impact of your words.


If you have a user flow with a high drop-off rate (e.g., payment form abandonment or low email open rates), A/B testing can help you pinpoint the friction and reduce it.


When you should NOT A/B test (and what to do instead)

You need a large user sample for an A/B test to yield conclusive results and justify the time and resources. This is where I often fell short as a UX writer in the past: not everyone was as keen as mustard about testing as I was. I had ideas for experiments, but I wasn't persuasive in explaining why developers should invest the time.


What’s more, comparing two different sets of microcopy can evaluate isolated changes in wording, but it’s not a stand-in for your broader content strategy. That's because while A/B testing can show that one variant outperforms another, it cannot explain why.

To go beyond quantitative insights and understand the why behind user actions, try usability testing instead.


UX writing A/B test ideas (with examples)


  1. CTA buttons

This is the bread-and-butter of A/B testing in UX. CTA buttons are tied directly to clicks and conversions, making them fertile ground for testing.


How to measure: Click-through rates


Ideas & examples:

  • Short vs. long button labels: "Subscribe" vs. "Subscribe for weekly offers"

  • With and without value proposition: "Start your free trial" vs. "Get started"

  • On-brand vs. direct: "Send me a magic code" vs. "Send verification code"

  • Reassuring words vs. strong verbs: "Checkout securely" vs. "Pay now"

  • Curiosity vs. clarity: "Check availability" vs. "Book a room"


Example hypothesis: "We believe using language that inspires curiosity rather than direct words will increase click-through rates by 10%"


Two product boxes labeled Mustard A and B, each with a hotdog image and price $4.99. Buttons read "View cart," "Checkout securely," and "Pay now."
I made this mock-up using Whimsical.

  2. Email subjects and push notifications

Yes, emails and push notifications are within the scope of UX writing and content design, especially communications that are part of the user journey (think order confirmations, customer satisfaction surveys, and so on).


How to measure: Open rates


Ideas:

  • Short vs. long: "Ticket confirmation" vs. "Your ticket is inside this email — Keep this handy"

  • Emoji vs. no emoji: "👋 We're glad you're here" vs. "We're glad you're here"

  • On-brand vs. direct: "You're all set for your trip!" vs. "Flight booking confirmed"

  • Name vs. no name: "Exclusive deals for Samantha" vs. "Deals just for you"

  • Action required vs. without: "[Action required] We need your tax details" vs. "Tax details required"


Example hypothesis: "Longer, more informative email subjects in our ticket confirmation emails will reduce customer inquiries for missing tickets by 5%"


  3. Sign-up flows and form fields

High drop-off on your sign-up flows and forms? Try A/B testing to uncover the friction that's preventing successful sign-ups.


How to measure: Form abandonment rate


Ideas:

  • Placeholder text only vs. reassuring text: "Enter your phone number" vs. "We'll only use your number for order confirmation"

  • Placeholder text vs. hints: "Create a password" vs. "Passwords must be 8–12 characters, with at least 1 number"

  • Placeholder text vs. examples: "Enter your reference number" vs. "ABC123456789"


Example hypothesis: "Form fields with reassuring text will decrease drop-offs by 10%"


Two text input fields labeled "Phone number: Enter your number" with "Cancel" and "Continue" buttons, yellow and orange accents.
I made this mock-up using Whimsical.

  4. Hide or show the copy

While this leans more towards product design, consider hiding versus showing longer copy to see how users engage with your long-form content. For the control group, show the content as is; in the variant, hide it behind a "Read more" or expandable accordion design.


If users are expanding your content, that means they are engaging with it. If not, they are likely skipping over it, and you can consider shortening or removing it altogether.


How to measure: Click rate on "Read more" or the expand button.


Ideas:

  • Instructional text

  • Product text

  • T&Cs


Example hypothesis: "An expandable accordion design will reveal whether users are engaging with the content"


Two text boxes on a light blue background. Left box: detailed mustard tips. Right box: teaser text with "Read more >" in yellow.
I made this mock-up using Whimsical.

Best practices for A/B testing UX writing

  • Test both versions simultaneously for accurate results.

  • Run your test with a large pool of users over a meaningful period (e.g., 10,000 users over two weeks).

  • Test one piece of copy per screen at a time (e.g., if you're testing conversions for a page, don't change the CTA and the title together).

  • Where possible, run tests with new users for more objective results.

  • Test early and often: you'll save money by uncovering weak copy sooner.
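One more practice worth adopting: before declaring a winner, check that the difference between variants is statistically significant rather than noise. Here's a minimal sketch of a two-sided two-proportion z-test in Python (the click counts are hypothetical):

```python
import math
from statistics import NormalDist

def ab_test_p_value(clicks_a, users_a, clicks_b, users_b):
    """Two-sided two-proportion z-test: how likely is a click-rate
    difference this large if A and B actually perform the same?"""
    p_a, p_b = clicks_a / users_a, clicks_b / users_b
    p_pool = (clicks_a + clicks_b) / (users_a + users_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Example: 5,000 users per variant, 200 clicks on A vs. 250 on B
print(ab_test_p_value(200, 5000, 250, 5000))  # ≈ 0.016, below the usual 0.05 bar
```

A p-value below 0.05 is the conventional bar for significance; anything above it means the difference could easily be chance, and the test is inconclusive.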


Jar of mustard seeds labeled "A" with an open lid and spoon inside, placed on a gray cloth. Some seeds are spilled in front.
Photo by me

Finishing up

Here's the bad news I've saved for last: not every A/B test yields a winner. Sometimes the difference between two versions is too small to be conclusive.


But even "failed" tests can teach you what doesn't matter, so that you can focus your energy on what does. Each test, whether you win or lose, provides insight into how your users think and respond to your words.


The trick is to treat A/B testing as an ongoing practice. Test early, test often, and combine it with qualitative research. That's how you'll uncover the copy that not only boosts clicks but builds a connection with your audience. You'll find the words that cut the mustard.
