By Kritika Gupta
*Featured Image by ThisisEngineering RAEng on Unsplash
Human experience is absorbed, modified, and communicated to other humans or artificial intelligence in both formal and informal settings. With the exponential increase in recent decades in the use and misuse of data—and in the ways data are obtained from humans—human beings are often reduced to mere data sets. Researchers and evaluators sometimes work with quantitative data that fail to provide needed insight when not accompanied by human experience. Qualitative inquiry, then, has an important place in evaluation research. While quantitative data points provide us with a zoomed-out picture of a policy or a program, they do not give us the depth that an evaluation team craves.
The heart of evaluation is human experience, the sum of seen and unseen perceptions and emotions about programs, products, or situations. Data points can tell us the what and the how much, but they do not tell us the why and the how. Consider a typical community health education project aimed at improving the general STEM knowledge of middle-school students in a small town. An external evaluator with a “data-points-only” approach will look at the number of students present in the session and at assessments of their knowledge, taken before and after the session on a standard 5- or 7-point scale. An evaluator taking a mixed-methods approach will collect the same data points but add the whys and hows. Consider these questions for the same case: “Why did x students attend the session? What did they learn from the session? What do they plan to do with this information? Will they attend similar sessions in the future? Why or why not?”
No data point can explain a statistical figure the way a human insight can. Qualtrics, one of the most popular survey research platforms among academic audiences, uses the “Human Experience Cycle” in its Experience Management framework. The major constructs of this cycle are expectations, experiences, perceptions, attitudes, and behaviors.
> “In addition to asking yourself, ‘How many people experienced this?’, consider the impact presented by a discovered issue.”
>
> *Janelle Estes and Andy MacMillan, User Tested*
*Photo by Alexander Sinn on Unsplash
In formal qualitative inquiry, the mindset and training of the qualitative data analyst can shape the interpretation of the information collected. We have come very far in working with data through modern-day software platforms that yield ever-fancier statistical results and visual representations. However, we still need to keep improving when it comes to capturing and analyzing human experiences. After hours of analysis and digging into qualitative data, we still often report the percentages of participants associated with the most frequently occurring themes and sub-themes. We also still tend to report “the most emotional” or “the most moving” quote, but what is the “significance” of the conversations that never make it into the final report? Guess we will never know… or is it time we asked?