🎓 Teaching at TMU We’re now deep into Toronto Metropolitan University’s People Analytics Program, which I developed, and the students are incredible. This week we’re diving into data collection, and that class inspired this newsletter.
🏢 Got space? The Toronto People Analytics Group is still looking for a space sponsor for our upcoming meetups. If your org has a cool office and wants to host some of the city’s best data nerds, reply to this or DM me.
📈 My People Analytics Course The course is growing—fast. As promised, I’ve increased the price to $350. It’s still miles more affordable than other programs (that charge 3x more for half the content). If you're on the fence, don't wait too long—we’re capping spots to keep the experience hands-on and personal. But because you are a newsletter subscriber, here is a secret discount code valid for 1 week only: SECRET.
🧠 Today’s Topic: Why Surveys Might Be Lying to You
Surveys are the Swiss Army knife of HR.
Need to know how people feel? Survey. Want to show you’re listening? Survey. Trying to build a case for an L&D program? Survey.
But here's the hard truth: surveys are flawed.
Everyone in people analytics uses them anyway. And if you use them without questioning them, you’re probably basing decisions on incomplete or distorted data.
Let’s unpack the most common ways surveys can go wrong—with real-world examples and what to do instead.
Let's go!
😇 1. Social Desirability Bias: "I'll Say What You Want Me to Say"
We’ve all done it. You get a survey at work that asks:
“I feel safe to speak up when I disagree with my manager.”
You might click “Agree” even if the truth is… not so clear. Why? We want to appear competent and positive, to present ourselves in a good light.
This is social desirability bias: people give socially or professionally acceptable answers, not necessarily truthful ones.
Real Example:
A client once ran a psychological safety survey after announcing several leadership changes. Despite clear anecdotal concerns from team leads, the results showed unusually positive scores on the “trust in leadership” item.
Perhaps people really did agree. Or...
Perhaps they did not believe the survey was anonymous, because the invitation included each employee's name.
What to Do Instead:
Include indirect questions: Instead of asking, “Do you trust your manager?” ask about observed behaviours—e.g., “Do other people on the team trust your manager?”
Add anonymous channels: Use anonymous surveys and reinforce anonymity by explaining how the data are handled.
Communicate clearly: Make sure people know why you're asking specific questions, and how their responses will (or won’t) be used. And always follow up with action; otherwise you risk losing employee trust.
🧠 2. Memory: “Last Month? I Barely Remember Yesterday.”
Human memory is not a hard drive—it’s a fuzzy highlight reel.
Many surveys ask people to reflect on past experiences, assuming *ahem* perfect recall:
“Over the past 6 months, how often have you felt supported at work?”
But ask someone what they had for lunch last Thursday, and watch them struggle. Recency bias, mood at the moment, and personal interpretation all cloud the data.
Real Example:
In a burnout survey, respondents were asked to rate their stress levels over the past quarter.
One manager scored themselves low because “this week was better.” When we dug deeper, their workload hadn't changed at all; it was just a nice, sunny day.
What to Do Instead:
Shorten the recall window: Ask about “the past week” rather than “last quarter.”
Use diary studies or pulse surveys: Track how feelings change in real-time, rather than relying on a big quarterly check-in.
📝 3. Language and Comprehension Gaps: “What Does That Question Even Mean?”
Clarity is everything.
If people interpret the question differently, your data becomes noise.
Some survey items look great to the HR team but fall apart in the wild. Words like “empowered,” “inclusive,” or “engaged” mean different things to different people.
Add in literacy levels, neurodiversity, second-language readers, and you’ve got a recipe for misinterpretation.
Real Example:
Consider a simple item: “I am recognized for good work.” A French translation might render “recognized” as something closer to “I am publicly praised.” Francophone respondents may then consistently score the item lower, not because recognition isn't happening, but because public praise may be inappropriate in their workplace culture.
What to Do Instead:
Test your questions: Run a pilot and ask participants how they interpreted each item.
Simplify language: Favor clarity over cleverness. Avoid jargon or ambiguous terms.
Use translation and localization: Don’t just translate the words; adapt the intent to local cultural norms. And use back-translation (translating the localized version back into the source language) to verify that the meaning survived.
📉 4. Low Response Rates: “You’re Only Hearing from the Choir”
This is probably the most underappreciated problem.
If only half your company responds to a survey, you're not just getting less data—you’re getting biased data.
Engaged employees are more likely to respond. So are those who feel strongly (positively or negatively).
The silent middle? Missing.
Worse, you don’t know how different non-respondents are—so you can't generalize results with confidence.
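To make the distortion concrete, here is a minimal simulation with made-up numbers. The key assumption (mine, not a measured fact) is that the more engaged someone is, the more likely they are to respond:

```python
import random

random.seed(42)

# Hypothetical workforce: true engagement scores on a 1-5 scale.
population = [min(5, max(1, random.gauss(3.2, 0.8))) for _ in range(1000)]

# Assumption: response probability rises with engagement,
# so a highly engaged employee responds far more often than a disengaged one.
def responds(score):
    return random.random() < (score / 5) * 0.7

respondents = [s for s in population if responds(s)]

true_mean = sum(population) / len(population)
observed_mean = sum(respondents) / len(respondents)
response_rate = len(respondents) / len(population)

print(f"Response rate: {response_rate:.0%}")
print(f"True mean:     {true_mean:.2f}")
print(f"Observed mean: {observed_mean:.2f}")  # skews higher than the true mean
```

Run it and the survey "result" comes back rosier than reality, purely because of who chose to answer.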
Real Example:
Suppose you launch a survey on equity and inclusion.
Only 38% of the team responds. The analytics team finds strong positive scores on “belonging,” but in a follow-up focus group, marginalized employees who hadn’t filled out the survey come forward with very different stories.
What to Do Instead:
Always report the response rate: “82% of employees responded” is meaningful. So is “Only 43% did.”
Weight responses where possible: Adjust data based on known demographics to better reflect the whole.
Follow up with the silent group: Ask why they didn’t participate. Was it survey fatigue? Lack of trust? That’s data too.
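The weighting idea above can be sketched as simple post-stratification. Everything here is hypothetical: invented departments, invented scores, and the assumption that you know each group's true headcount:

```python
# Post-stratification sketch: re-weight so each department counts
# in proportion to its true share of headcount (all numbers hypothetical).

headcount = {"Sales": 200, "Engineering": 300, "Operations": 500}

# Average "belonging" score and respondent count per department.
responses = {
    "Sales":       {"mean_score": 4.1, "n": 80},   # over-represented
    "Engineering": {"mean_score": 3.9, "n": 90},
    "Operations":  {"mean_score": 3.1, "n": 50},   # under-represented
}

total_headcount = sum(headcount.values())
total_responses = sum(r["n"] for r in responses.values())

# Unweighted: every respondent counts equally, so loud groups dominate.
unweighted = sum(
    r["mean_score"] * r["n"] for r in responses.values()
) / total_responses

# Weighted: each department's mean counts by its true share of headcount.
weighted = sum(
    responses[d]["mean_score"] * (headcount[d] / total_headcount)
    for d in headcount
)

print(f"Unweighted mean: {unweighted:.2f}")
print(f"Weighted mean:   {weighted:.2f}")  # lower: Operations was under-heard
```

Here the weighted mean drops noticeably, because the largest department (with the lowest scores) was the quietest. Weighting is a patch, not a cure: it corrects for known demographics, not for the unknown ways non-respondents differ.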
If you’re still here, thanks for sticking with this deep dive. Surveys are still valuable—but only when we recognize where they fall short.
Let me know if you’ve run into any of these traps before—and what you did to fix them.
K
Whenever you’re ready, there are 2 ways I can help you:
#1
If you’re still looking to get started in People Analytics, I recommend starting with my affordable course:
Practical People Analytics: Build data-driven HR programs to 10x your professional effectiveness, business impact, and career. This comprehensive course will teach you everything from building an HR dashboard for business results to driving growth through more advanced analytics (e.g., regression). Join your peers today!
#2
If you are looking for support in your human capital programs, such as engagement, retention, and compensation & benefits, and want to take a more data-driven approach, contact me at Tskhay & Associates for consulting services. Or simply reply to this email!