Is AI removing our intelligence from measurement?
Author:
Rebecca Couper-Leonard, Insight and Content Manager
In our bid to streamline and speed up the measurement process, are we in danger of losing our edge as employee experts? Rebecca Couper-Leonard from our Insight team investigates.
Internal communication measurement and evaluation is our Insight team’s bread and butter. For over 15 years, we’ve been assessing and advising on the state of organisational communication through one-off audits and regular measurement series.
But like many professionals, we’re starting to see changes in how we work thanks to the rise of AI. Already we’ve experienced how using AI tools can help us to organise, analyse and summarise data. It’s been particularly useful to speed up quantitative research analysis, which previously might have required hours spent poring over spreadsheets or grappling with formulae.
Indeed, we’re seeing how we might go further in the future in the wider world of research. It ranges from the intriguing, such as YouGov’s BrandVoices Index using AI-moderated interviewers (AIMIs), to the concerning, with some market research agencies reporting that participants are submitting AI-generated responses, undermining their findings.
But could relying on AI for measurement and evaluation change the way we learn about our organisations and employees? Can we take any steps to prevent over-reliance on AI insights rather than our own expertise and experience?
Threatening organisational knowledge
We’ve spoken to several IC professionals who worry that relying on AI to filter data and flag what’s important will leave them less aware of what’s going on with their employees, and with a poorer understanding of what those employees want and need.
Research from leading universities such as MIT and Cornell backs up these fears, finding that reliance on AI can reduce knowledge retention and weaken critical thinking.
We too have noticed that using AI can leave you with a more superficial understanding of research than reviewing it yourself.
In a recent audit, we trialled analysing focus group transcripts using both human brains and AI engines. We found that AI got about 60-70% of the way there, but tended to privilege frequency over meaning, and smooth out contradictory data to fit its own narrative.
We also got very different results when giving the same data and prompt to different AI platforms, as they prioritised different elements.
This means you can be left with neat themes that may appear right on the surface, but don’t contain the detailed insights that support interpretation and understanding.
It’s also limited to online data – what about the knowledge you gain from ‘walking the floor’ and being immersed in company culture? Where’s that in the AI mix?
And yes, you can use AI-driven sentiment analysis to review the content of your enterprise social network, but what about the things that employees don’t want to post on a public platform, and would only say in a confidential conversation?
We need to be careful that organisational insight doesn’t come to be seen as something you extract, rather than something you develop.
Creating echo chambers
Of course, you could argue that as we use AI tools more, they will learn what we need to know and adapt their analysis to the style and level of thinking we need. But this comes with its own dangers.
AI tools learn from our prompts, our patterns of use, and what we carry forward into our work. This can mean that findings are framed in familiar ways or pushed forward to support existing narratives.
All of this contributes to echo chambers, which are like digital walls that echo our voices and opinions back to us, and block out any dissenting or alternative views.
We’ve seen this in our own trials, when we noticed that an AI analysis had wholly missed out a unique theme that we’d picked up through human analysis. When we asked the AI tool “What about this missing theme?”, lo and behold it looked at the data from a new angle and found the insight to support it.
We’d be remiss if we didn’t note here that telling AI what themes to look for also isn’t great. Organisations could be tempted to use this to further their own agenda, asking AI to validate pre-determined themes rather than independently arriving at its own conclusions. For example: “Hey Copilot – take this focus group transcript and give me evidence to support scrapping the intranet.”
So we must make sure that those carrying out research implement proper governance and understand research ethics. Without them, we risk automated outputs that are riddled with bias and inauthenticity.
Balancing AI with authenticity
So after all that, are we recommending that organisations avoid AI for measurement and evaluation? Absolutely not.
AI can do a brilliant job at organising data, but finding the meaning still requires human input. Here are some of the techniques we recommend using to maintain that balance between automated efficiency and authentic insights.
Add context to your prompts
Don’t just give a sheet of results to your AI tool of choice and tell it to go to town. In your prompt, explain the rationale behind the research and the type of analysis you need. For example, say whether you want a quick summary of key themes, or if you want a detailed run-down of every piece of feedback given, including major and minor themes.
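One way to make this habit stick is to assemble the prompt from its parts, so the rationale and the type of analysis are never skipped. The sketch below is illustrative only: the `build_prompt` helper and its field names are our own invention, not any particular AI tool’s API.

```python
# Sketch of a context-rich prompt for research analysis.
# build_prompt and its parameters are illustrative, not a real tool's API.

def build_prompt(rationale: str, analysis_type: str, data: str) -> str:
    """Assemble a prompt that states the research rationale and the
    depth of analysis required before presenting the raw data."""
    return (
        f"Research rationale: {rationale}\n"
        f"Analysis required: {analysis_type}\n"
        "Report both major and minor themes, and flag any "
        "contradictory feedback rather than smoothing it over.\n\n"
        f"Data:\n{data}"
    )

prompt = build_prompt(
    rationale="Annual IC audit: how well have channel changes landed with staff?",
    analysis_type="Detailed run-down of every piece of feedback",
    data="(free-text survey responses go here)",
)
print(prompt)
```

However you phrase it, the point is that the rationale and the required depth of analysis appear in the prompt before the data does.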
Check your working
AI isn’t always right. In fact, a lot of the time, it’s wrong. So make sure you check the outputs it gives you for accuracy, relevancy and reliability. For your first few times using AI for analysis, we recommend doubling up with human analysis alongside so you can spot any recurring issues that you need to watch out for next time.
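That side-by-side check can be as simple as comparing the theme lists the two analyses produced. A minimal sketch, with made-up themes standing in for real audit data:

```python
# Illustrative check: compare themes an AI tool surfaced against themes
# found by human analysis, to see what each picked up that the other missed.
# The theme lists below are invented examples, not real audit findings.

def compare_themes(human: set[str], ai: set[str]) -> dict[str, set[str]]:
    return {
        "agreed": human & ai,
        "missed_by_ai": human - ai,  # worth raising with the AI explicitly
        "ai_only": ai - human,       # verify these against the raw data
    }

human_themes = {"channel overload", "manager cascade gaps", "intranet search"}
ai_themes = {"channel overload", "intranet search", "email volume"}

result = compare_themes(human_themes, ai_themes)
print(result["missed_by_ai"])
```

Anything in the "missed by AI" column is a candidate for a follow-up prompt; anything the AI found alone should be checked back against the source data before it goes into a report.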
Call in the experts
It’s incredibly easy for bias to creep into research projects, especially when you have a people-pleasing AI tool ready to give you whatever answer you want. Our Insight team can help you to identify where AI can best serve your measurement and evaluation needs, avoid bias, and implement governance to keep you on track.
