Category Archives: quantitative research

The essence of qualitative research: “verstehen”

“But how many people did you talk to?” If you’ve ever done qualitative research, you’ve heard that question at least once. And the first time? You were flummoxed. Three short minutes from now, you can be assured that will never happen again.

Folks, qualitative research does not worry about numbers of people; it worries about deep understanding. Weber called this “verstehen.” (Come to think of it, most German people call it that too. Coincidence?) Geertz called it “thick description.” It’s about knowing — really knowing — the phenomenon you’re researching. You’ve lived, breathed, and slept this thing, this social occurrence, this…this…part of everyday life. You know it inside and out.

Courtesy of daniel_blue on Flickr

You know when it’s typical, when it’s unusual, what kinds of people do this thing, and how. You know why someone would never do this thing, and when they would but just lie about it. In short, you’ve transcended merely noticing this phenomenon. Now, you’re ready to give a 1-hour lecture on it, complete with illustrative examples.

Now if that thing is, say, kitchen use, then stand back! You’re not an Iron Chef, you are a Platinum Chef! You have spent hours inside kitchens of all shapes and sizes. You know how people love them, how they hate them, when they’re ashamed of them and when (very rarely) they destroy them. You can tell casual observers it is “simplistic” to think of how many people have gas stoves. No, you tell them, it’s not about how many people, it’s about WHY they have gas stoves! It’s about what happens when you finally buy a gas stove! It’s about….so much more than how many.

Welcome to the world of verstehen. When you have verstehen, you can perhaps count how many people have gas stoves. Sure, you could determine that more men than women have them. Maybe you could find out that more of them were built between 1970 and 1980 than between 1990 and 2000. But what good is that number? What does it even mean?

When you’re designing, you must know what the gas stove means. You must know what it means to transform your kitchen into one that can and should host a gas stove. You must know why a person would be “ashamed” to have a gas stove (are they ashamed of their new wealth? do they come from a long line of safety-conscious firefighters?). You must know more than “how many.”

So the next time someone asks you, “how many people did you talk to?”, you can answer them with an hour-long treatise about why that doesn’t matter. You can tell them you are going to blow them away with the thick description of what this thing means to people. You are going to tell them you know more about this thing than anyone who ever lived, and then, dammit, you’re gonna design something so fantastic, so amazing that they too will be screaming in German. You have verstehen!

See my discussion about sampling methods in qual and quant research for more insight into the reasons why “how many” is irrelevant in qualitative research.


Designers are from Venus, Six Sigmas are from Mars

DT has a great post over at Design Sojourn that discusses Six Sigma methodology and how it relates to design. He cites Tim Brown at IDEO, who argues that Six Sigma is essentially Newtonian, while design thinking is quantum. In his own design work, DT expressed doubts about using Six Sigma:

After studying the Six Sigma process, I point blank said: “There was no way any of my designers are going to be judged on the quality and success of a design based on how many sketches or iterations we did before we deliver it.”

Both Brown and DT cite Sara Beckman, who recently discussed the topic in the New York Times. Beckman reviews how Six Sigma focuses on incremental improvements, while design and design thinking focus on big changes. For those of you who aren’t familiar with Six Sigma, it’s a method pioneered by Motorola that aims to reduce defects to roughly 3.4 per million opportunities. The “six sigma” refers to six standard deviations (sigma being the Greek letter statisticians use for standard deviation) between the process mean and the nearest specification limit; in other words, defects should sit at the far extreme of the normal curve.
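If you’re curious where that 3.4-per-million figure comes from, here is a quick back-of-the-envelope sketch (a minimal illustration, assuming Python with scipy available; the 1.5-sigma “long-term shift” is the conventional Six Sigma allowance for process drift):

```python
# Rough check of the Six Sigma defect target.
from scipy.stats import norm

sigma_level = 6.0        # six standard deviations to the nearest specification limit
long_term_shift = 1.5    # conventional allowance for long-term process drift

# Without the shift, a six-sigma process would produce ~0.001 defects per million.
print(norm.sf(sigma_level) * 1e6)

# With the conventional 1.5-sigma shift, we recover the familiar ~3.4 per million.
print(round(norm.sf(sigma_level - long_term_shift) * 1e6, 1))
```

Either way, the point stands: Six Sigma is about driving defects into the far tail of the normal curve, not about rethinking what the process should be in the first place.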

I argue that design is more complementary to the “interpretivist” paradigm of qualitative research while Six Sigma is positivist. Interpretivists don’t believe the world is a static place. They see reality as being continuously created by you, me and other social actors. There is no such thing as “The Truth” in interpretivist approaches, just different versions of the truth. Typical methods of interpretivists are ethnography, in-depth interviewing and discourse analysis. Positivist research, on the other hand, assumes that reality is static. Positivists believe that “The Truth,” is out there to be discovered. Typical methods would include quantitative surveys.

Designers, therefore, should focus on interpretivist methods. They should uncover different versions of the truth through observation and interviewing, as well as deep reflection on symbols and their meanings. Surveys and other quantitative methods are more Six Sigma in spirit, in that they can measure improvement over time. Designers ought to consider measuring improvement, but it is best to start with qualitative approaches.

The Importance of Symbols: doctors and their (dirty) lab coats

The New York Times reports that the American Medical Association is considering doing away with the venerable symbol of the physician: the lab coat. There’s a very good reason to get rid of lab coats: they’re dirty. But the symbol of the lab coat is far more important. The New York Times reports on the empirical problem with lab coats:

The group’s Council on Science and Public Health is looking at the role clothing plays in transmitting bacteria and other microbes and is expected to announce its findings next year.

This empirical finding shouldn’t be surprising. We also know, for example, that male physicians’ ties are wearable petri dishes. The verdict, therefore, ought to be clear: we should get rid of lab coats. Not so fast, say physicians.

Getting rid of the lab coat is getting rid of one of the most important symbols of a physician’s identity. Dr. Richard Cohen told the New York Times how important that lab coat is:

“When a patient shares intimacies with you and you examine them in a manner that no one else does, you’d better look like a physician — not a guy who works at Starbuck’s.”

Here is the lesson for designers: empirical “fact” is not the whole story. What role any particular symbol plays in social life is just as critical. What’s fascinating about this story is that physicians are now trained in “evidence-based medicine,” meaning they are trained to diagnose and treat based on more “rigourous” science (I have my doubts about that rigour, but that’s another blog post).

Yet here is a clearly “scientific” reality about the danger of treating patients while wearing a bacteria-infested lab coat and/or tie, and physicians continue to wear them. For all their protestations of “evidence,” physicians too are social beings, embedded in a social world. They too must convey an identity, even if the symbols used for doing so compromise their ability to complete their stated vocational mission.

The symbol is powerful. Designers who base their decisions on so-called “evidence” ought to pay attention to other kinds of evidence as well, such as enduring patterns of social interaction, and *especially* to those patterns that fly in the face of supposed “logic.”

Customers more satisfied when served by white males

In an interesting study, researchers at UBC have found that customers express higher satisfaction when they’re served by white men than by women or people of colour — even when their behaviour is exactly the same. Marketing professor Karl Aquino expressed surprise at the findings, as he told The Globe and Mail:

“We had thought there would be some bias going on in the sense of people who were males or whites would be rated more positively,” Mr. Aquino said.

“But we didn’t anticipate that for performing the same behaviours, the women and minorities would actually be rated lower,” he said of the study to be published in the Academy of Management Journal.

This study should not be surprising at all.

What this study demonstrates is what Raymond Breton calls the “symbolic order”; we unconsciously place white men at the top of our social hierarchy. We do this in multiple ways, including placing art, culture and ideas at the top of an invisible ladder. Public Enemy sums it up nicely in “Fight the Power”:

Most of my heroes don’t appear on no stamps

We know that people often react with sexism and racism without even realizing it. It is likely that these unconscious biases bleed easily into marketing research, especially when such studies are quantitative in nature and therefore lack the thick description or deep probing offered by qualitative approaches.

This finding has wide-reaching implications. First, when companies use customer satisfaction surveys, they must be aware of the inherent inaccuracy of these surveys. You may believe you’re accurately measuring actual satisfaction, but this study shows that frequently we are measuring no such thing. Second, such surveys are often used to award bonuses or even job security. As we know in academia, student evaluations are frequently what stands between a scholar and a full-time position. If we know that customer satisfaction is driven by factors other than actual performance, then we may simply be rewarding membership in a dominant group without realizing it.

Read the entire story on The Globe. It’s worth a think.

Data-driven social interaction: The difference between analogue and digital part III

Data-driven social experience is an entirely new manner of social interaction, one that obscures our emotional connections to people. Data makes social relationships visible, knowable, and countable in unprecedented ways. But it does not — and cannot — convey the emotional experience of social interaction. I’ve already discussed how digital technologies transform text and time. Now I want to explore how “data” transforms social experience.

Take the notion of the “social network.” Most people (especially those who read blogs!) think these are synonymous with Web sites like Facebook. Truth be told, social network analysis has existed for almost a century. We’ve all heard the term “six degrees of separation,” but most of us don’t know that the idea traces back to none other than Stanley Milgram, of “shock experiments” fame, who tracked chains of letters forwarded between strangers in his small-world experiments.

Social networks are exceedingly difficult to know from a quantitative perspective. We all live inside social networks, but we have a very hard time knowing how these networks are constructed. We may know, for example, that our friend Jeff is friends with another friend of ours, Sarah, but we don’t know whether Sarah knows Jeff’s partner Sam. Social network analysis is a set of methods designed to learn exactly that.

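To make that concrete, here is a toy sketch of the kind of question social network analysis answers, using the hypothetical names from the example above (plus an invented mutual friend, “Alex”) and the networkx library. It is purely illustrative:

```python
# Toy social network: who is connected to whom?
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("me", "Jeff"), ("me", "Sarah"), ("me", "Alex"),
    ("Jeff", "Sam"),      # Jeff's partner
    ("Sarah", "Alex"),
])

print(g.has_edge("Sarah", "Sam"))                     # False: no direct tie
print(nx.shortest_path(g, "Sarah", "Sam"))            # ['Sarah', 'me', 'Jeff', 'Sam']
print(sorted(nx.common_neighbors(g, "me", "Sarah")))  # ['Alex']: the friend we share
```

The hard part, of course, is not the computation; it is getting the relational data in the first place, which is exactly what sites like Facebook now hand over for free.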
Now imagine your social network, as it is represented on Facebook (what, you’re not on Facebook?). Below is an image from Visual Complexity that renders a social network visible, quite easily, simply by mining the data inherent in Facebook’s structure:

from Visual Complexity

Note how we instantly and easily know how institutions are connected, and through which people. Previously, researchers would have to conduct extensive and expensive surveys to get these data. Now these data are easily calculated and visualized by anyone with access to a social network online.

Some people are talking about this visualization as a piece of intellectual property. Alex Iskold on Mashable, for example, asks “Who owns the social map?” I go further and ask, “What does it mean that our social world is mappable?”

Our social world is now infiltrated by masses of data. These data inform us about the structure of our interactions with others in ways that we could not recall correctly if asked. Suddenly we can see our social world reflected back to us, punctuated by institutions and social structures. When we see our social network through the eyes of data, we see the names of organizations, or the institutional affiliations of the people. We do not “see” the emotional experience that created our connections in the first place.

Suddenly, I may think I’m really not that close with Jeff, because his partner Sam isn’t friends with anyone else I know. I can also see that Sarah and I have very few friends in common, which may lead me to think I don’t have much of a future friendship with her.

Those data crowd out the qualitative, embodied experience of the laughs I shared with Jeff and Sam at their cottage last summer. Those data obscure the fact that Sarah and I shared 3 long months as call centre employees together, a time that bonded us forever. A data-filled social world is one that masks the visceral, emotional experiences of face-to-face interaction.

Digital social life is revealed to us in fragmented, mashed-up ways. Such views were impossible before social network data became freely available, data now so ubiquitous that we don’t even see them.

Improving participation rates: research recruitment best practices

Those of you out there who’ve tried it know: recruiting research participants is HARD. Here are a few insights from the research to help you with better recruitment.

  1. Personalized contact with respondents, combined with pre-contact and aggressive follow-up phone calls*: Don’t count on a form letter, email or random tweet to do the job. Capitalize on your personal relationship with that person. If you don’t have a personal relationship, ensure that you use the person’s name, and for God’s sake, spell it correctly!

    Once you’ve made initial contact, you are not done. Not by a long shot. Make sure you speak to the person (you can do this through IM or email if you’d like) to give them more information. They’re now interested. Don’t stop! One more step!

    Follow up 1 week after initial contact. Assuage any fears they may have. Answer any questions honestly. And above all, be available for more information. (A small follow-up-tracking sketch appears after this list.)

  2. External researchers with social capital are best**: University-based researchers have been shown to have the best participation rates, but you don’t have to be a professor. Researcher Sister Marie Augusta Neal of Emmanuel College achieved a near-perfect response rate because of her close ties to the respondents and their communities. The lesson here is, if you hire a consultant, make sure they’re trusted. Even better if they personally know the people to be recruited.
  3. Monetary incentives have no effect, unless money is offered “no strings attached”***: Little known fact: the best way to use a monetary incentive is to offer it, up front, with absolutely no strings attached. The “free” money makes people feel more socially indebted. Evidence of this effect can be found in the book Freakonomics. Researchers found that daycare centres that levied late penalties on tardy parents actually had more of a late-pickup problem than those that levied no fine. Why? Because the fine reduced the parents’ relationship with the daycare to a mere transaction. Use the “gift economy” approach and ensure a feeling of indebtedness. My personal favourite is a coupon for a single iTunes song at $0.99. It is cheap but appears to have great value. Offer it, up front, and then ask for participation.
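As promised in item 1, here is a tiny sketch of a follow-up tracker: flag anyone contacted at least a week ago who has not yet responded. The names and dates are invented, and it assumes nothing beyond the Python standard library:

```python
# Minimal follow-up tracker for research recruitment.
from datetime import date, timedelta

contacts = [
    {"name": "Jordan Lee", "contacted": date(2024, 5, 1), "responded": False},
    {"name": "Priya Shah", "contacted": date(2024, 5, 6), "responded": True},
]

def due_for_follow_up(contacts, today=None):
    """Return the names of people contacted a week or more ago with no response."""
    today = today or date.today()
    return [c["name"] for c in contacts
            if not c["responded"] and today - c["contacted"] >= timedelta(weeks=1)]

print(due_for_follow_up(contacts, today=date(2024, 5, 9)))  # ['Jordan Lee']
```

Nothing fancy, but a list like this keeps the one-week follow-up from slipping through the cracks.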

*  Cook, C., F. Heath, and R. Thompson. 2000. “A Meta-analysis of Response Rates in Web or Internet-based Surveys.” Educational and Psychological Measurement 60:821-836.

** Rogelberg, S., A. Luong, M. Sederburg, and D. Cristol. 2000. “Employee Attitude Surveys: Examining the Attitudes of Noncompliant Employees.” Journal of Applied Psychology 85:284-293.

*** Hager, M., S. Wilson, T. Pollak, and P. Rooney. 2003. “Response Rates for Mail Surveys of Nonprofit Organizations: A Review and Empirical Test.” Nonprofit and Voluntary Sector Quarterly 32:252-267.

*** Singer, E. 2006. “Introduction: Nonresponse Bias in Household Surveys.” Public Opinion Quarterly 70:637-645.

Sampling methods in qualitative and quantitative research

Why does sample size not matter in qualitative research? Because of the assumptions that qualitative researchers make, namely, that the social world is not predictable. Qualitative researchers believe that people are not like molecules or other objects; people’s actions are not predictable.

But quantitative researchers DO believe that social activity IS predictable. So when they compare their observations of social activity to what pure chance would produce, the difference tells them something. Let’s say you were to research people’s preferences for a particular interactive feature. Say you’re wondering whether young people will like a radio button more than older people do. First, you model the results you’d expect if preference were purely random, as if you’d just flipped a coin. Then you draw a probability (random) sample and compare the observed results to those chance results. Is there a difference?

If there is a difference between them, you can infer that indeed, something other than chance (in this case, age) affects people’s preferences.
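Here is a minimal sketch of that logic in code (the counts are invented purely for illustration, and it assumes Python with scipy available):

```python
# Compare observed preferences against what chance alone would produce.
from scipy.stats import chi2_contingency

#            prefers radio button   prefers something else
observed = [[60, 40],    # younger respondents
            [45, 55]]    # older respondents

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"p = {p_value:.3f}")  # a small p suggests the age difference is unlikely to be chance
```

The smaller the p-value, the less plausible it is that coin-flip randomness alone produced the pattern you observed.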

Qualitative researchers don’t agree that such things can be reliably predicted. That’s why they don’t bother with expensive and involved random sampling. See my research design course for all these details.