For a man who spends his life talking about digital, I have a pretty deep love for analogue. My house (and garage) is full of long-since outmoded formats – I reverted to hard copy books a while back, and whilst I consume a massive amount of music via streaming, I have a large and ever-growing record collection. Not because of some longing for hipster cool – just because I never really lost my love for the format, and have bought and listened to music on vinyl all my teenage and adult life.

The perfect expression of the vinyl format is the 7-inch single. Your favorite song in a tangible, portable product that you can treasure and play over and over again. It’s the medium that launched rock and roll, sold billions of units via pop, and even kick-started punk rock. And it comes with two sides – A and B.

The world of 45s may seem some way from that of data-driven marketing, but I think singles present some interesting case studies for testing. There have been some notable instances over the years where tracks that rested on the B-side became the star. Take the Beach Boys’ “God Only Knows”, cited by many as one of the greatest pop songs of all time – the B-side of the US single release of “Wouldn’t It Be Nice”. “Into The Groove” by Madonna, B-side of the original US single release of “Angel”. And perhaps my favorite B-side, northern soul classic “Tainted Love” by Gloria Jones. All instances where, via a range of circumstances, the consumer has exercised their choice, making the flipside the star, proving that record labels don’t always know what their audience wants.

In the above cases, A/B testing was far from planned, but in 2016 bringing your audience into the process of testing has never been easier. In my experience, however, many marketers still regularly disregard testing. Often this is based on the same misplaced belief that record labels had in the past – they don’t need to test because they know their customers. At best this may be because they’re going off retrospective customer insights, but all too often it’s simply because they’re still operating on gut feel. There can also be a misplaced belief that testing wastes valuable activation budget, or that it is too complex and difficult.

Whilst vinyl may have made a comeback of late, the era of marketers making decisions based on gut feeling alone has not. Marketers’ creativity and acumen should be matched by their willingness to experiment and learn from their audiences (see our video Where Do Design and Creativity Fit in Data-Driven Marketing? for more details). In the era of data-driven marketing, there can be no excuses for not embracing testing. Simple A/B testing is accessible with even a modest toolset and rudimentary analytics capabilities. And you can quickly and easily step up to more complex multivariate testing, or indeed multi-armed bandit tests.

Let’s take a second to break down what each of these means:

From A/B to MAB

  • A/B testing – you offer two variants, for example two treatments of a buy button. The variant that works best, in this instance the button that generates the most conversions, is the winner.
  • Multivariate testing (MVT) – using the same example, this time you offer multiple variants of the buy button alongside variants of other elements on the page, such as the placement of product images and the ratings and reviews text. The object of the test is to identify the combination of variants that drives the best performance.
  • Multi-armed bandit tests (MAB) – like MVT, this approach offers multiple variants, for example combinations of copy, images and buttons. Traffic to each variant is weighted based on the probability that it will generate better results. Where A/B testing and MVT use a fixed randomized distribution, MAB updates the distribution as the test progresses, allowing you to exploit variants that have worked well so far whilst also exploring variants that might offer even better performance (see the sketch after this list).
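To make the “weighted traffic” idea concrete, here is a minimal sketch of a multi-armed bandit in Python using Thompson sampling, one common way of implementing MAB. The variant names and conversion rates are invented purely for illustration, and the loop simply simulates visitor traffic – it’s a sketch of the principle, not a production testing tool.

```python
import random

# Three hypothetical buy-button treatments (names are illustrative only).
variants = ["button_A", "button_B", "button_C"]
stats = {v: {"impressions": 0, "conversions": 0} for v in variants}

# Assumed "true" conversion rates, used here only to simulate visitor behaviour.
true_rates = {"button_A": 0.04, "button_B": 0.05, "button_C": 0.07}

def choose_variant():
    """Thompson sampling: draw a plausible conversion rate for each variant
    from a Beta posterior and serve the variant with the highest draw.
    Strong performers get exploited more often, but weaker variants still
    get explored occasionally."""
    draws = {
        v: random.betavariate(s["conversions"] + 1,
                              s["impressions"] - s["conversions"] + 1)
        for v, s in stats.items()
    }
    return max(draws, key=draws.get)

def record_result(variant, converted):
    stats[variant]["impressions"] += 1
    if converted:
        stats[variant]["conversions"] += 1

# Simulate 10,000 visitors. Unlike a fixed 50/50 A/B split, the traffic
# allocation adapts as evidence accumulates.
for _ in range(10_000):
    v = choose_variant()
    record_result(v, random.random() < true_rates[v])

for v, s in stats.items():
    rate = s["conversions"] / s["impressions"] if s["impressions"] else 0.0
    print(f"{v}: {s['impressions']} impressions, observed rate {rate:.3f}")
```

Run it a few times and you’ll see the better-converting button end up with the lion’s share of impressions without anyone having to call the test and re-route traffic manually – that is the exploit/explore trade-off described above.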

Testing Builds Data-Driven Organizations

As marketing leaders we should be building data-driven objectives, like testing and optimization, into how we measure and reward our teams and our agencies. Post-campaign analysis is too late to critique your ad placements, subject lines, copy or creative. This is at the very heart of creating a data-driven marketing organization – one where accountability for performance runs through the DNA of the whole marketing team, and where everyone, not just your marketing analysts, is on the hook for making sure every element of every campaign delivers.

Just ask Mick

Regardless of your taste in music, or your chosen format, you should always be testing. To quote the Rolling Stones’ majestic 1969 B-side, “You Can’t Always Get What You Want” – but by testing you can ensure that your audience always gets what they want, and indeed what they need. And when they get what they want, you do too.
