Product review - Wynter (B2B message testing platform)
What is Wynter?
Wynter is a B2B message testing platform, founded by Peep Laja. The platform allows users to upload landing pages and email copy and get them in front of their ideal customer personas (ICPs). ICPs then provide feedback on the relevance, clarity, and overall effectiveness of your messaging, or choose which version of a landing page they prefer if you have uploaded several. Wynter also allows users to run surveys and user testing with their ICP.
Choosing an ICP
ICPs are grouped by level of seniority. You can choose from specialist, mid-level, and senior leadership audiences. Each level has a range of categories and job titles to choose from.
Questions and doubts I had when considering Wynter
I’ve followed Peep for a while, and his landing page breakdowns on social media using real feedback from Wynter panellists are some of my favourite things to stumble upon. Nonetheless, there are always doubts and questions when considering any platform. Here are a few that went through my mind and what I’ve discovered since.
Will they be able to source my target ICP?
Not to be Captain Obvious here, but this will very much depend on who your ICP is. For my use case I needed to match DevOps specialists, DevOps managers, and senior decision makers such as CTOs or CIOs. Happy to report there were no issues matching those ICPs.
If you have a more niche audience in mind it may be a little trickier to find an immediate match. Which leads me onto my next question.
What if I wish to target a more granular audience?
The ICPs I mentioned in the question above are a good match for generic landing pages, such as a homepage. But what if we wanted to go a little deeper? Could we find DevOps specialists who use Kubernetes, for example?
On a recent call with Peep I posed these questions. Wynter’s audience team will screen and attempt to reach new or granular audiences if it matches your use case. There are some common-sense caveats. For example, if you’re planning to run a one-time test with a very granular audience, it’s an unreasonable ask of Wynter to run a recruitment project for that audience. However, if you plan to use that audience several times through the year and you’re signing up for an annual subscription, then a quick chat with Wynter will help scope things out.
How long will it take to get feedback?
Wynter suggests that feedback will be with you within 1 to 3 days. This was the one claim that really grabbed my attention and also raised some scepticism.
I’ve been involved in message testing and user testing in the past where the recruitment process, interviewing, and feedback gathering took weeks (at best) and sometimes months. As marketers, we’ve all had that conversation with other teams about wanting to speak to customers, only for the barriers to go up - their account has a support ticket, they’ve churned, or they’re in the early stages of purchasing a new product or renewing. To my mind, some of these scenarios are the perfect time to speak with customers, but I get the reasons why it sometimes can’t happen.
After a frustrating few weeks getting a shortlist together, the recruitment begins. One out of 50 of your emails gets a reply. More frustration. Timezone conflicts force researchers to run interviews at odd hours of the morning or evening. Getting incentives agreed and signed off by legal and finance - isn’t this bribery? In a nutshell, it can be a lot of work to get things lined up and to get customers to agree to talk to you when running tests in-house.
For my pilot campaign I ran a test with 15 panellists and within 24 hours all feedback was completed. Another big hurdle cleared and a significant pain point from previous experience resolved.
What will the quality of feedback be like?
This is one of those questions that you won’t really get a feel for until you’ve run a test. Yes, there are lots of reviews you can read through, but until you put your messaging in front of your ICP (or run one of Wynter’s other tests) it’s a natural instinct to think that you may be the ‘special case’ it won’t work for.
Having run a recent pilot campaign, I would rate the quality of feedback as good to excellent. Some panellists will go into more detail than others, which is what you would expect, but you want to find those patterns that come through across multiple participants.
There was one moment when reviewing the feedback that stood out for me and cemented in my mind that we had reached a relevant audience. It was a straightforward comment, something along the lines of, ‘Oh, we were talking about our current setup in the office today. I hadn’t heard of your tool but will definitely be taking a look later this week.’ That’s a darn good ICP match in anyone’s book. While it isn’t the objective to reach a solution-aware audience (problem-aware will be the sweet spot for most), it’s definitely a benefit.