Using Mixed Methods Research for Planning Programs

Whether you know it or not, you use mixed methods research to make decisions in your day-to-day life. We all employ a mix of quantitative and qualitative methods in everyday choices. Just think about planning a trip: if you are looking up hotels to see where you’d like to stay, you might rely on a combination of star ratings (quantitative data) and customer reviews (qualitative data) to identify the ideal spot. Like most methodological concepts, however, designing mixed methods research involves a number of important considerations. Combining qualitative and quantitative research can often yield richer results than either method on its own, but how each method is used in relation to the other can significantly affect the effectiveness (and accuracy) of the results.

Take the hotel search example: say you are most interested in being close to the beach, and you don’t really care whether your room has a nice view. You start your hotel search by analyzing the quantitative data – filtering the list to show only hotels that have average user reviews of four stars or higher. You then read the reviews for this smaller list of hotels, scanning for anything mentioning proximity to the beach. You end up booking a room a short drive from the beach. Your decision to limit your qualitative analysis of reviews to the small set of reviews for hotels with high star ratings could mean that you filtered out hotels that are right next to the beach but only received mediocre reviews because of their boring views.

While the end result in this example was not ideal, the hotel chosen through this method would probably be fine. The stakes for making decisions about utility energy efficiency programs, however, are considerably higher, and choosing the wrong mixed methods research approach can produce data that tells an incomplete, incoherent, or just flat-out incorrect story about the program. If you are an energy efficiency program manager for an electric utility, there are many more variables to consider (from customer demographics to variations in climate), and you have to balance the needs of multiple stakeholders rather than just your own preferences. Your job is trickier still if you’re in charge of a new program where participants’ experiences with the program are largely unknown.

With some preparation and a guiding framework, however, mixed-methods designs can be extremely powerful, providing value greater than the sum of their qualitative and quantitative parts, while avoiding costly miscalculations.

This post will describe three commonly used mixed methods research designs, which can be adapted to answer the variety of research questions energy efficiency professionals face: (1) Convergent parallel design, (2) Explanatory sequential design, and (3) Exploratory sequential design (Creswell & Clark, 2011). Accompanying the description of each design type, I have provided a typical scenario in which a program manager would want to use that design.

Convergent Parallel Design

[Figure: Convergent parallel design flow chart]

This design is characterized by siloing quantitative and qualitative research efforts at the outset. Quantitative and qualitative data are collected at the same time but analyzed independently of each other. Only after these separate analyses are complete are the quantitative and qualitative results considered together, allowing the researchers to develop an overall interpretation of the phenomena of interest.

A convergent parallel design is appropriate when you want to maximize confidence in your findings, as you can meaningfully compare qualitative and quantitative findings for similarities and differences. It is best suited to situations where the underlying phenomena of interest are well-defined or have a standardized method of measurement (e.g., examining customer satisfaction with program services using a standard set of 1-to-10 scale questions and open-ended questions applied across all programs). This design is also generally a good choice for evaluating mature programs or programs that adhere closely to commonly used program designs. Mature programs have likely been evaluated several times, so researchers can be more confident that they have identified the commonly cited reasons for being satisfied or dissatisfied with the program.

Explanatory Sequential Design

[Figure: Explanatory sequential design flow chart]

This design is characterized by collecting and analyzing quantitative data first and following up with qualitative data collection and analysis, with the qualitative findings used to explain the quantitative results in greater detail.

An explanatory sequential design is most appropriate when you want to collect quantitative data from a large population—like a survey measuring program satisfaction among all commercial and industrial utility customers in a county—but you also want to understand why some members of that population answered the way they did.

Exploratory Sequential Design

[Figure: Exploratory sequential design flow chart]

This design is characterized by collecting and analyzing qualitative data first and following up with quantitative data collection and analysis. This design is primarily useful for generalizing qualitative results. Researchers can use this approach to identify possible relationships using a qualitative method with a relatively small sample and then follow up with a quantitative method to test the extent to which those relationships are generalizable to a larger population.

An exploratory sequential design is most appropriate when you want to collect quantitative data from a large population, such as a survey measuring the frequency with which residential customers engage in energy-efficient behaviors, but you do not know which topics are the most important to include as question prompts in this quantitative data collection effort.

The Importance of Getting Mixed Methods Right

When employed correctly, mixed methods research can produce game-changing insights. Poorly designed or executed mixed methods research, however, can produce findings that are incomplete, incomprehensible, or inaccurate. Consider an exploratory sequential study aimed at identifying how customers make decisions about purchasing water heaters: researchers would first interview a small group of customers to learn which features matter to them, then field a survey asking a larger population to rank those features. If, instead, the researchers used an explanatory sequential design (quantitative methods followed by qualitative methods), the results might be extremely difficult to interpret. Imagine if the researchers first came up with their own list of water heater features and sent out a survey asking customers to rank those features. If the researchers didn’t get their list just right, they would likely get back a large number of surveys with highly ranked “other/none of the above” responses, which are not very useful on their own and definitely not very useful for informing the design of a follow-up qualitative study.

Whether you are planning a vacation or the evolution of a portfolio of energy programs, understanding this framework can help you choose the best mixed methods design for your research objectives. And selecting the right experts to design and conduct your research will prevent wasting resources on irrelevant data collection and increase the likelihood that research findings will be used and useful (Patton, 2008). Our goal is to leverage the strengths of both qualitative and quantitative data so that our research enables meaningful change—and so that program managers can enjoy their own vacations, knowing that their programs are an integral component of the shift to a cleaner, smarter energy future.

Learn about engagements where we employed mixed methods research!