AMEC Measurement Month – Roundtable at Duck & Waffle
Not a bad way to start the morning, eh?
Gorkana held its first Measurement Roundtable yesterday morning at the Duck & Waffle, focused on best practice measurement in the agency world.
Over the past decade, measurement has increasingly become an integral component of successful PR. We have felt this acutely: our measurement programmes have grown by leaps and bounds in response. This means that we’re constantly developing and refining the ways in which we derive insight and present meaningful data.
Really exciting for us, but we’re also aware that it presents a challenge to the industry – to you: How in the world do you start incorporating measurement as an agency when you didn’t even find time for breakfast this morning?
So we killed two birds with one stone and hosted a breakfast focused on measurement. Amidst the mountain of pastries, hash browns and sausages, we shared insight around case studies of award-winning measurement and our agency guests challenged us with obstacles they experience in measurement.
As we suspected, our guests saw the path to effective and standardised measurement as a bit muddy and murky, requiring some serious wellies. And they weren’t alone. A few months back, we conducted surveys across the PR industry in advance of the 2015 AMEC Summit and found that 80% of PRs wanted to incorporate more sophisticated measurement into their work, but felt that there was not enough practical guidance out there to begin.
So we started the morning by looking at a few case studies for which we’d won awards at AMEC to illustrate what we see as crucial to meaningful and insightful measurement:
- The work that we did with Nesta looked at the impact of PR in relation to other marketing activities and is summarised neatly here.
- With UNICEF, our measurement spanned multiple markets and informed the campaign planning process. The case study for our work on their Child Protection campaign can be found here.
- Our work with The Stroke Association incorporated survey data, allowing correlations to be drawn between PR activity and changes in public awareness. Find this here.
- Finally, our work with Samsung was referenced frequently as well, especially with regard to competitor benchmarking. Find that here.
Three lessons stood out from these projects:
Evidence of learned and changed behaviours
We pat ourselves on the back when we’re reviewing an evaluation report with a client and their reaction is “wow – looks like we need to do more of that” or “hmm – that obviously didn’t work but I can see what we need to change”. From our perspective, sharing your excellent results is only half the value of evaluation – we strive to build measurement frameworks that clearly address the ‘so what?’ factor.
Measurement also makes it easier to learn from your competitors. It’s hugely useful when you can drill down and see which journalists and which publications your competitors are focused on – or even how they’re managing their engagement on social networks. Learning from evaluation doesn’t need to sit solely with PR teams either – in a recent example, our measurement of spokesperson effectiveness helped to make the case for increased spokesperson training. Compelling stuff!
Alignment with wider business objectives
The survey we ran earlier this year unveiled another telling statistic – only 25% of PR teams reported having clear communication objectives.
At Gorkana, we use something called a Communications Cascade to help clients identify these objectives by starting with the wider business objectives and working our way down to measurable PR results. This is an example of what we used with the Stroke Association:
Not only does this ensure your efforts are as efficient and effective as possible, but it also means your results are directly related (read: relevant) to the goals and objectives of the board from whom you’d like a bit more budget.
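The cascade idea can be pictured as a simple top-down chain. Here’s a minimal sketch in Python – all of the names and values are hypothetical stand-ins, not Gorkana’s actual framework:

```python
# Illustrative Communications Cascade: each level narrows a broad
# business objective down to a measurable PR result.
# Every objective and metric below is invented for illustration only.
cascade = {
    "business_objective": "Increase public awareness of stroke symptoms",
    "communications_objective": "Secure national coverage of the awareness campaign",
    "pr_activity": "Media outreach to health correspondents",
    "measurable_result": "Uplift in prompted awareness (survey data)",
}

def trace(cascade: dict) -> str:
    """Render the cascade as a top-down chain for a report."""
    return " -> ".join(cascade.values())

print(trace(cascade))
```

The point of the structure is that every metric at the bottom can be traced back to a board-level objective at the top.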
Incorporation of external data sources
All of our best measurement aims to incorporate external data, providing a quantitative alternative to AVEs. This example from our work with Nesta illustrates that concept perfectly, incorporating several other types of marketing activity into a statistical modelling piece to demonstrate the value of PR. However, results became clearer when viewed in conjunction with a poll run by the BBC.
The poll asked people where they had heard of the prize – and most referenced PR activity. Thus, the conclusion to be drawn was that the initial PR push was indeed crucial to the success of the campaign. And it doesn’t need to stop there…the list of external data sources patiently waiting to be integrated into measurement reporting is long! Google analytics, Twitter analytics, social media engagement, sales results, marketing activities, SEO stats…and the list goes on.
So now you’re set. Right? Mmm…probably not. The response we got at our breakfast was “well…easier said than done”. Consequently, the second part of our discussion focused on obstacles to client buy-in.
Standardisation of approach
How are you meant to start measuring when everyone is measuring in a different way? We have a few answers for that. First, the Barcelona Principles (2.0!) are a great place to start. They include things like the importance of qualitative metrics, necessary alignment with business objectives and the banishment of AVEs.
The onus is on the industry to start requiring that any kind of measurement aligns with these tenets. We’ll keep talking about them if you keep talking about them. And we’ll even talk to you (and your clients) about them first if it helps.
Next, we recommend a framework of measurement – essentially a suite of options, set methods and a set pricing structure that allows your measurement offerings to be scalable and flexible. This also allows for the intellectual capital of the measurement programme to sit squarely with your agency, even if it’s being outsourced to *cough* Gorkana *cough*. We can help you set that up.
Now a standardised methodology is all well and good, but it does leave one crucial question unanswered. How do you know the results are good results? I’ll share our approach: At Gorkana, we measure and evaluate with hundreds of organisations spanning all sectors, from finance to charities to utilities to the UK Government.
This means we’ve been able to develop aggregated data sets, spanning our entire client base over the last several years, which allow us to provide industry-specific benchmarking. In effect, you’re able to set targets and compare your results with those of your peers.
We also encourage fully-benchmarked evaluation, where the same metrics are tracked for a set of competitors so you’re able to measure competitively over time and – more importantly – evaluate and learn from your competitors’ results as well as your own.
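The mechanics of sector benchmarking are straightforward: pool scores across many evaluations, average them per sector, and compare an individual result against that average. A minimal sketch, using entirely invented sectors and scores:

```python
from collections import defaultdict

# Hypothetical aggregated results: (sector, favourability score)
# pairs pooled from many evaluation reports. Values are invented.
records = [
    ("charity", 72), ("charity", 65), ("charity", 80),
    ("finance", 55), ("finance", 61),
    ("utilities", 48), ("utilities", 52),
]

def sector_benchmarks(records):
    """Average each sector's scores to produce a comparison benchmark."""
    by_sector = defaultdict(list)
    for sector, score in records:
        by_sector[sector].append(score)
    return {sector: sum(s) / len(s) for sector, s in by_sector.items()}

benchmarks = sector_benchmarks(records)

# Compare one (hypothetical) client's score against its sector.
client_score, sector = 70, "charity"
delta = client_score - benchmarks[sector]
print(f"{sector} benchmark: {benchmarks[sector]:.1f}, client delta: {delta:+.1f}")
```

The same grouping logic extends to competitor sets: track identical metrics for each competitor and the benchmark becomes a moving target you can measure against over time.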
From my experience, little is more compelling to a business than that.
Finally, one of the most obvious obstacles was a lack of education. It’s impossible to train a team without resources, and really difficult to convince a client without case studies. Enter: AMEC.
“AMEC acts as the voice for public relations research and measurement, providing a forum and International network for sharing knowledge and best practice about communications research, media intelligence and insights.”
As the educational sponsor of AMEC, Gorkana is keen to hear more from you – our clients and our industry – about where we can focus our efforts to address this gap. What works for your teams? Workshops? Webinars? Case studies? For the time being, we’ll continue with all three. But let us know if you think of something better.
Also, it’s worth noting that as a result of our recent merger with several US-based measurement agencies, we’ll be looking at a whole suite of new offerings over the coming years. We want to make sure we’re addressing your needs, so we’ll be holding focus groups and are keen to get you involved. Let me know if you’d like to be a part of those discussions at [email protected].