May’s BIG forum was a fascinating session, examining the changing world of B2B research with a lively discussion on the evolving toolkit of data collection & analysis.
Moderated by Alex Wheatley of Kantar, the panel comprised Graeme Cade (Savanta), Stuart Gullock (Lucid) and Trevor Wilkinson (Purple Market Research). Representing providers of technological or ‘non-traditional’ as well as traditional methods in B2B research, they were a perfect blend of experts for this particular session.
Over the long life of the BIG Forum we have returned to the subject of B2B panels a number of times. On each occasion, the debate on the value of B2B panels has been hard fought, with the conclusion generally being that whilst B2B panels had their place, they weren’t yet the go-to sample source in the way panels had become for much consumer research.
Can you do B2B research online via a panel? – Graeme Cade (Savanta)
As the first speaker of the evening, Graeme addressed what he revealed to be the question he is now most frequently asked… Can you do B2B research via online panels? Whether it is possible, he suggested, depends both on what you mean by B2B and what you mean by online panels.
On the one hand, the UK has over 27 million workers. With each having at least some B2B responsibility, surely all of those are suitable respondents for a B2B project? In truth, many clients have no interest in micro-business decision makers; they are often most interested in the managers and senior leaders of the UK’s large businesses, of which there are only 8,000. Suddenly the pool of suitable contacts has shrunk considerably, and targeting B2B contacts is certainly more challenging.
Next Graeme went on to address people’s past (and current) concerns about the identity of B2B panellists. Were participants really who they claimed to be? We’ve always wondered, haven’t we? Graeme told us about an experiment he had run some years ago targeting IT decision makers using a number of online panels. As part of the survey they had included one killer question with which they felt they could accurately identify frauds and fakes: a list of industry suppliers containing made-up companies that genuine IT specialists should have recognised as fakes.
Their analysis suggested that an astonishing 80% of participants for that study weren’t genuine. Whilst clearly very disappointing, all was not lost. Through the study, Graeme identified a number of panels that performed very well.
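The trap-question idea Graeme described can be sketched in a few lines. This is purely illustrative and not Savanta’s actual method; the fake supplier names below are invented for the example, and the real ones are simply well-known IT brands used as stand-ins:

```python
# A respondent claiming IT expertise is shown a supplier list that
# mixes real brands with planted fakes. Anyone who claims familiarity
# with a fake is flagged as a suspect identity.

REAL_SUPPLIERS = {"Cisco", "Dell", "VMware"}          # genuine brands
FAKE_SUPPLIERS = {"Nortex Datagrid", "Quvion Systems"}  # invented traps

def is_suspect(claimed_familiar):
    """True if the respondent recognised any planted fake company."""
    return bool(set(claimed_familiar) & FAKE_SUPPLIERS)

print(is_suspect(["Cisco", "Quvion Systems"]))  # -> True
print(is_suspect(["Dell", "VMware"]))           # -> False
```

A single well-chosen trap like this is cheap to include and, as Graeme’s experiment showed, can reveal just how much of a sample is not who it claims to be.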
At the time, the “better” panels fell into a number of categories:
- Specialist panels – recruited for a particular purpose
- Premium – paying high incentives
- Verified – with extra opt-ins and manual checks using the likes of LinkedIn etc.
And what are Savanta doing to help provide ever better panel provision? Graeme was kind enough to outline their future strategy. In the short term they are focusing on aggregation of sources as well as high-end membership models where participants are offered more than just surveys. They are given the opportunity to join a community where they can also interact with other like-minded people. Longer-term improvements to B2B panels are expected to come from a combination of programmatic targeting, which allows a widening of the search, and intelligent products which are likely to be niche and expensive.
The evolution of B2B Panels – Stuart Gullock (Lucid)
The session continued with Stuart taking us on a journey which demonstrated the importance of technological change in the ever-changing nature of panel provision. The earliest panels were built around the delivery of survey invites via email. The approach was very successful, but as it became more popular the panel providers of the day came under increasing pressure to deliver ever greater numbers of respondents.
One of the most significant technological advances was the invention of the router. This allowed panel providers to keep sending people who had been screened out of one survey on to another until they eventually reached one for which they were suitable.
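The routing logic Stuart described can be sketched very simply. This is a minimal illustration of the concept, not any provider’s actual system; the survey names and screener rules are invented for the example:

```python
# A "router" tries each available survey's screener in turn and
# places the respondent in the first survey they qualify for.

def route(respondent, surveys):
    """Return the name of the first survey whose screener the
    respondent passes, or None if nothing currently fits."""
    for survey in surveys:
        if survey["screener"](respondent):
            return survey["name"]
    return None

surveys = [
    {"name": "IT decision makers", "screener": lambda r: r["role"] == "IT"},
    {"name": "Small business owners", "screener": lambda r: r["employees"] < 10},
    {"name": "General consumer", "screener": lambda r: True},  # catch-all
]

print(route({"role": "HR", "employees": 250}, surveys))  # -> General consumer
```

The key property is that a screened-out respondent is never simply discarded: the router keeps trying surveys, which is exactly what let providers squeeze far more completes out of the same panel.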
Their next major change was creating ties with existing loyalty programmes. Not only did this result in significant growth in overall panel size, but it was also hugely successful due to the use of meaningful incentives. As an example, the CEO of a FTSE company would not have been tempted by the offer of a few pounds, but a meaningful number of air miles provided by a loyalty-programme partner might well have been. This gave panels access not only to a greater number of B2B contacts but also to contacts of a more senior level.
But despite these improvements, the panel providers still struggled to keep up with demand. Panel burnout led to trials of river / dynamic sampling as companies tried to “turn on the tap”. For consumer studies, participants were invited to surveys from strategically placed pop-ups.
So where are we now? According to Stuart we find ourselves in the API era, where sourcing is programmatic. Machine-to-machine communications allow integration with websites and apps, giving very efficient access to potential survey participants.
Lucid don’t manage panels – they are a software marketplace giving value to buyers and sellers. Like Uber, Facebook, Alibaba and Airbnb they are the biggest in their respective fields but don’t own anything. These companies are all successful due to a combination of the same factors.
Like the aforementioned companies, Lucid see technology as the key to their future success and consequently they are investing in it “in a serious way”. They don’t just want to tick along and are planning something big. They have 100 tech engineers working on products that they believe will revolutionise sample provision.
Stuart finished with a very interesting point. Despite clearly believing passionately in the importance of technology in the future of panel provision, he emphasised the belief that any research will only ever be as good as its surveys. For panels to thrive we must provide respondents with surveys that they want to complete.
B2B CATI vs B2B Panel – Trevor Wilkinson (Purple Market Research)
Trevor has worked as a B2B researcher for 25 years. He is a user of panels but also the owner of a CATI unit. Trevor’s first experience of online panels was around 15 years ago. A panel provider, who we won’t name, promised to get any B2B audience Trevor required, but it soon became clear that far from having a panel of pre-recruited B2B specialists, they were in fact doing “backdoor B2B” – the approach of piggybacking B2B surveys off the back of a consumer panel.
Trevor talked about a number of good B2B panel experiences with the Research Now E-Rewards panel but admitted to being generally sceptical about data quality from B2B panel surveys. He argued that for consumer work the default is now a form of online research, but for B2B the default should still be CATI. Even for B2B projects being done with customer lists he still encourages his clients to use a telephone approach, due to past experience of a different audience from the one intended completing surveys online.
Whilst CATI is undoubtedly more expensive and takes longer, with a telephone approach you can encourage greater participation which means that ultimately the audience should be more representative. Trevor also expressed a belief that you are more likely to be confident in the identity of a telephone recruit. When using online panels for B2B research, it is extremely important that any early screening is presented in a way that best disguises the type of person who will qualify – helping reduce the opportunities for panellists to masquerade as the required persona.
With the excellent Alex Wheatley convening, the session then moved on to a very honest discussion of the role of online panels in B2B research. This started with further probing on the question of whether we should have any confidence in who is actually completing our B2B online surveys.
For Stuart, whether suspicion around respondent identity is an issue is to some extent dependent on the audience being targeted. The more common the required participant, the easier a project is to conduct. When faced with especially unusual requests it is sometimes necessary to tell a client that CATI is the more appropriate approach. Lucid do not influence suppliers and buyers – they simply act as an intermediary. Whilst buyers will not like this, in Lucid’s view it is not their responsibility to check the veracity of survey participants. Whilst there is some risk to buyers, Stuart pointed to the fact that he has been involved with thousands of successful B2B panel surveys, with an average of only 5% to 10% of completes being rejected due to concerns over quality.
Graeme pointed out that the higher incentives available to B2B panellists are both a blessing and a curse. Whilst necessary to attract respondents of the seniority desired for many B2B surveys, the greater the incentive, the greater the reason for people to try to trick the system and play the role of a B2B figure when in fact they are not. As already suggested, the use of simple trap questions is a very important method of identifying and excluding “frauds”. Having long worked for companies offering a range of fieldwork methodologies, Graeme understands the need for selecting the most appropriate for each individual study. B2B panel surveys can work when examining themes and general prevalence. For example, is ‘A’ better than ‘B’, or which is the preferred option from ideas 1, 2 and 3? In contrast, panels simply aren’t appropriate when looking to conduct any form of B2B market sizing.
For Trevor, we sometimes must accept that a study done with a B2B panel is the only option – something is better than nothing. If research is needed in time for a board meeting in a couple of days’ time, then CATI is most likely not a realistic approach. B2B panels offer a level of quick turnaround where time is of the essence.
A member of the audience pointed out that clients often don’t tell researchers why they want data. Having that information might affect approach. He still does a lot of intercepts with paper surveys but tops up with customer lists and surveys accessed from links publicised in social media.
Stuart explained that he thinks account managers have an important role to play. You need people who know when to say “No” and channel to CATI or other methods when appropriate. Companies must train their staff to be honest rather than allow their sales staff to always say “Yes”.
Another forum visitor revealed that her original concerns regarding B2B panels were linked to the speed with which people are known to change jobs. When reliant on email addresses that was more of a concern, but with the current approach of intelligent targeting and screening it is less of an issue. It was pointed out that Chuck Miller’s quarterly quality scores have shown that river sampling with APIs can deliver better data than some traditional panels, despite those panels’ use of double or even treble opt-in.
Customer lists are often held up as the answer to everything, but despite what GDPR should have done to improve them, these lists remain dirty, with out-of-date contacts, wrong telephone numbers and incorrectly spelt email addresses. For Trevor, quality is more important than quantity in B2B research. Technology can be extremely useful, but there are times when a more traditional approach may be more appropriate. Sourcing a small targeted CATI list using website scouring can be extremely effective.
A different audience member discussed her preference for using qual for B2B research, as you can see who is participating, thus giving a better understanding of them as a person. This was followed up by a mention of “Qualt” – someone’s idea of larger-scale qual. They have seen many occasions when the results of a follow-up quant exercise have been identical to those of the initial qual phase, and have begun to question the need for a quant phase in B2B.
Looking to continue the discussion of suitable sample sources for B2B research, Alex asked the audience if they had any examples of making better use of internal resources. An example was given of a client making use of sales staff, hospital directors and gym managers from across their group.
This was a thoroughly engaging session, which gave a large group the chance to share their views on a topic on which people are known to have very strong opinions. Whilst some people will inevitably remain sceptical about their value, the representatives of the B2B panels gave a strong defence of their products. Given the right parameters I have little doubt that providers such as Lucid and Savanta can very successfully facilitate online B2B research. Sample choice must be governed by what you are trying to do. When testing a change of service, you may be able to simply use an existing customer base. When exploring a possible new product, you are likely to need a wider audience. Whether an online panel is appropriate will be dependent on the specific requirements of any B2B project. For responsible researchers, ever better technology will provide us with additional ways to access respondents, which should be considered when planning any B2B research.
I look forward to seeing how things have progressed when we return to the subject of B2B online panels in a few years’ time.