I think it's fair to say that internal communications professionals are more likely to count themselves as words people than numbers people. However, accessing the right data and using the right diagnostic tools to get to the root causes of the issues you're seeing in your organisation or team is a key part of the comms role. It's something I talk about in my book, and I interviewed Benjamin Ellis, data expert and founder of SocialOptic, for the chapter on this complex topic. Benjamin is part of the Redefining Communications collective, and we often work together when implementing The Field Model during the vital diagnosis phase.
In this blog, I take a whistle-stop tour of how to diagnose organisational issues, which tools to use depending on your employee numbers, and how to look out for bias. I also share some of my top questions for getting under the skin of what's going on.
What are the best tools for diagnosing what’s wrong in your organisation?
If we’re to truly diagnose what’s causing chaos in an organisation, we need to look at both communication and its impact. Communications audits are a useful basis for diagnosis, but they won’t tell you everything. There must be a conversation.
Surveys have become the go-to solution for research – we all love a stat! But they can be a bit of a tick-box exercise. Some methods will be right for your organisation and some won't: geography and time will undoubtedly play a role, but it's also important to consider employee numbers and map them against the reason for the research and the culture of the company. Choosing the wrong method could have negative consequences:
“If people are going off on stress/sick leave, then an online survey won’t cut it. It will feel faceless and like a tick-box exercise. You need the social connection – and you need to remember the need to understand people as the foundation.”
Key tools include listening interviews, focus groups, surveys and polls. You can also combine tools, such as a survey with listening interviews. I recommend starting with employee numbers and the reason for the research. As a guideline (sketched in code just after this list):
- 500+ employees: listening interviews, online surveys and focus groups
- 100 to 500 employees: listening interviews and an online survey
- Fewer than 100 employees: listening interviews
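For anyone who likes a rule of thumb written down precisely, here's a minimal sketch of that guideline in Python. The function name and exact thresholds are my illustration, not a fixed formula – treat the boundaries as soft:

```python
# A minimal sketch of the size-based guideline above.
# The function name and thresholds are illustrative assumptions.
def recommended_tools(employee_count: int) -> list[str]:
    """Suggest diagnostic tools based on organisation size."""
    if employee_count > 500:
        return ["listening interviews", "online surveys", "focus groups"]
    if employee_count >= 100:
        return ["listening interviews", "online survey"]
    return ["listening interviews"]

print(recommended_tools(250))
# ['listening interviews', 'online survey']
```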
“How bacon is this organisation?”
It might sound obvious, but it’s important to collect data that matters – it’s all about context:
“You could ask people “How bacon is this organisation?” And people say, “Well it’s really bacony”, or “It’s not very bacony”. Now you have a measure that says, “We are 10 per cent more bacony than last month,” but it doesn’t correlate to anything in the real world.”
We can easily get the data to tell the story we want to tell, but that won't help us understand, diagnose or fix any issues for the long term – and understand, diagnose and fix are the three stages of The Field Model.
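To make the "does it correlate to anything real?" test concrete, here's a minimal sketch – the monthly scores and turnover figures are invented purely for illustration:

```python
# A minimal sketch of sanity-checking a survey measure against a
# real-world outcome. All numbers are invented for illustration.
from statistics import correlation  # requires Python 3.10+

# Hypothetical monthly "how bacony are we?" scores (0-10 scale)...
bacony_score = [6.1, 6.4, 5.9, 6.8, 7.0, 6.5]
# ...and staff turnover (%) for the same six months.
turnover_pct = [4.1, 4.4, 3.9, 4.3, 3.8, 4.0]

r = correlation(bacony_score, turnover_pct)
print(f"Correlation with turnover: {r:.2f}")  # close to 0.00 here
# A value near zero suggests the measure tracks nothing real;
# it only earns its place if it relates to an outcome you care about.
```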
It’s all about asking your employees the right questions, such as:
- What is the purpose of [team or company]?
- Do you know the strategy for the business and the focus for the next 12-24 months?
- What makes [team or company] different from other providers?
- What are the challenges of working at [company]?
- How do you find out about how the business is doing?
- What do you love about working here?
- Do you trust the leadership team?
I list more examples of great questions to ask in Chapter 6, Data and Diagnostics, of Influential Internal Communication.
How to avoid bias when you collect data
Bias and data go hand in hand, whether we're analysing simple figures or creating complex algorithms. In my book I share a case study about data bias and algorithms, drawing on key insights from Cathy O'Neil, mathematician and author of Weapons of Math Destruction, who discusses the darker side of data. People view mathematics as neutral and correct, but the problem with algorithms is that they're based on models that are "opinions embedded in mathematics". O'Neil warns that algorithms significantly affect people's lives because there is often no accountability or appeals procedure for the models used: "By trusting the data that's actually picking up on past practices and by choosing the definition of success, how can we expect the algorithms to emerge unscathed?"
As we increasingly use data for decision-making, we must audit our data for neutrality to limit bias. Bias can slip in at the sampling, collecting and interpreting stages. Here are a few tips to avoid bias creeping in (with a short sampling sketch after the list):
- Don't lead the witness – think about the language you're using.
- Avoid incentivising participation – it skews the data towards those who want the incentive.
- Ask an independent person to review the results.
- Give people a realistic timeframe to respond and make the research accessible to everyone.
- Be aware that interpretation can be influenced by context, individual bias, group-think bias, or the reasons why the data is being collected.
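On the sampling point in particular, relying on volunteers self-selects for the people most motivated to respond. Here's a minimal sketch of drawing a random, reproducible interview sample instead – the employee list and sample size are hypothetical:

```python
# A minimal sketch of random sampling to reduce self-selection bias.
# The employee list and sample size are hypothetical.
import random

employees = [f"employee_{n}" for n in range(1, 301)]  # e.g. 300 staff

random.seed(42)  # a fixed seed makes the draw reproducible and auditable
interview_sample = random.sample(employees, k=20)  # 20 random invitees
print(interview_sample[:5])
```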
If you'd like to find out more about The Field Model and diagnosing what's wrong in your organisation, please get in touch. You can also listen to my podcast episode on The Field Model.