
Tackling Response Biases (The What, Why, Motivation & How)

3 years ago


When floating surveys, one can't help but encounter all those response biases. And once one comes to terms with them over time, the lack of a fitting solution becomes a real worry. How does one tackle that? Read on to find out... #productmanagement #survey #qualitative #response 1/
In my recent article I dug into the issues surrounding the framing of survey questions, which seem to be a pretty common problem, & also touched upon how they can largely be resolved. Here's the link: mgmtinc.substack.com/p/strawman-and-steelman 2/
Of course, non-responsiveness is a huge energy drainer for teams that float surveys, yet in one sense it could feel like a blessing compared to the bias some of those responses carry. And needless to mention, the drier the topic, the vaguer the responses. 3/
Let's now drill down into each of those biases by building an understanding along four dimensions: 🔹What (is it)? 🔹Why (does it occur)? 🔹The motivation (the primary driver behind such thinking / actions)? 🔹How (can it be avoided)? 4/
1. MORAL BIAS 🔹What: what's considered a moral boundary typical of a community can become a barrier to answering questions 🔹Why: though largely uncommon, the occurrence of moral bias can't be ignored in some communities / pockets of the world 5/
🔹Motivation: the sense of oppression takes over emotions; for ex: a "leather" accessory product is perceived as sensitive by "animal rights activists" & could border on immorality 🔹How: choose sample spaces selectively so the survey doesn't reach places where it'd be labeled offensive 6/
2. NON-CONFORMANCE BIAS 🔹What: choices could be very passive & indicative of non-participation, & thus considered non-conformant / indecisive; for ex: choosing the "neither / nor" response on every question 🔹Why: disconnection with the survey leads to them not feeling... 7/
...comfortable taking any side 🔹Motivation: a possible case of ignorance 🔹How: use simple language, filter & choose an audience relevant to the sample space, have a mix of question types & a varied choice of responses 8/
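As a hypothetical sketch (the scale coding and response data are assumptions, not from the thread), "straight-lined" neutral responses like the ones described above can be flagged programmatically:

```python
# Hypothetical sketch: flag respondents who pick the neutral option on
# every question, a possible signal of the non-conformance bias above.
# Assumes answers are coded on a 5-point Likert scale where 3 = "neither / nor".

NEUTRAL = 3

def is_straight_liner(answers, neutral=NEUTRAL):
    """Return True if every answer is the neutral choice."""
    return len(answers) > 0 and all(a == neutral for a in answers)

# Sample (made-up) responses keyed by respondent id
responses = {
    "resp_01": [3, 3, 3, 3, 3],   # "neither / nor" throughout
    "resp_02": [4, 2, 3, 5, 1],   # varied, engaged answers
}

flagged = [rid for rid, ans in responses.items() if is_straight_liner(ans)]
print(flagged)  # → ['resp_01']
```

In practice you'd likely soften the rule (e.g., flag when, say, 90% of answers are neutral) rather than requiring a perfect straight line.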
3. NEUTRALITY 🔹What: the respondent is totally confused by the survey & now has just one goal: to click their way out of it 🔹Why: indecisiveness when the options are very close 🔹Motivation: heavy language, a dry or irrelevant topic, not enough knowledge to provide answers 9/
🔹How: use simple language; even a person without any knowledge of the subject should be able to understand what's being asked. Choose the sample space properly; people with experience in a domain can still be unaware of a topic (for ex: a CA / CFA may not understand crypto) 10/
4. FAMILIARITY 🔹What: people with prior knowledge of a product / feature may choose options so as to get their vote counted heavily in its favor 🔹Why: they've been sold on the idea of the product / feature 11/
🔹Motivation: they have a clear goal, which is to heavily influence the motion to be carried, & everything else takes a backseat 🔹How: avoid people with knowledge of what's to come; for ex: internal teams / people who have been part of product idea discussions 12/
5. PSYCHOLOGICAL 🔹What: a personal experience with some (x) that correlates deeply to what's being asked 🔹Why: a deep psychological dent, either negative or positive 🔹Motivation: very strong past experiences setting the mood for the whole survey 13/
🔹How: largely unavoidable; if permissible, choose a sample space broad enough to accommodate responses that vet almost all applicable use cases / user personae 6. EXTREMITY 🔹What: aggrandized responses presenting an embellished picture 14/
🔹Why: a person's emotional past, or attention seekers by habit or nature 🔹Motivation: an attempt to stand out gets them to make their responses more interesting 🔹How: set an acceptable & allowed standard deviation, measure each of the responses against the set value... 15/
...if the STDDEV seems off by a huge scale for all the responses from one or a few odd accounts, then maybe it's in your best interest to ignore them 7. CONFORMANCE 🔹What: it rarely happens, but internal teams could extend the time the whole survey runs for... 16/
...so as to gun for a majority of responses 🔹Why: a personal / emotional connection with it 🔹Motivation: may be totally convinced of a certain direction to take for a given feature / product 🔹How: proper brainstorming b/w teams; recheck the survey's fitment to purpose 17/
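The EXTREMITY check, measuring each account's spread against an allowed standard deviation, could look roughly like this minimal sketch (the threshold, data shape, and respondent ids are assumptions for illustration):

```python
import statistics

# Hypothetical sketch of the EXTREMITY filter: compute each respondent's
# standard deviation across their 1-5 ratings and flag accounts whose
# spread far exceeds an agreed-upon ceiling.

MAX_ALLOWED_STDDEV = 1.5  # assumed threshold; tune per survey

def flag_extreme(responses, ceiling=MAX_ALLOWED_STDDEV):
    """Return respondent ids whose rating spread exceeds the ceiling."""
    return [
        rid for rid, ratings in responses.items()
        if len(ratings) > 1 and statistics.stdev(ratings) > ceiling
    ]

# Sample (made-up) responses keyed by respondent id
responses = {
    "resp_01": [1, 5, 1, 5, 1],   # alternating extremes
    "resp_02": [3, 4, 3, 4, 3],   # moderate, consistent
}

print(flag_extreme(responses))  # → ['resp_01']
```

Rather than dropping flagged accounts outright, it may be safer to review them manually first, since some genuinely polarized opinions are legitimate.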
While it's not unusual to feel that some of these biases are easy to identify, discern & get rid of, they could happen right under your supervision without a clue, & that's why each step of the process ought to be reflected upon before calling it final & acceptable. 18/
That's it folks! If you enjoyed reading this, then please do this: 1. Retweet the first tweet at the top 🔝 marked "1/n" 2. Drop me a follow @bgpinv 3. Subscribe, like & share my newsletter - mgmtinc.substack.com/ That'd inspire me to do more. Thanks for reading 🙏
SUMMARY: Anyone who has dabbled with surveys & measuring responses knows that most biases are real. To tackle them, one could analyze these: 🔹What (is it)? 🔹Why (does it occur)? 🔹The motivation (the primary driver behind such thinking)? 🔹How (can it be avoided)?

Guru Prasad “TPW - The Product Web 🕸”

@BgpInv

Product Management (Fintech, B2B B2C SaaS PaaS); Leadership; Coined Solution State Model (SSM)-2014, Elevated Trapdoor-2014; Mentor & Advisor @ TPW;