Spring 2026 SOWK 460w Week 04 - Using Qualitative and Mixed Methods Designs

Slide 1
Text on a presentation slide indicates 'USING QUALITATIVE & MIXED METHODS DESIGNS.' A stylized eyeglasses graphic contains 'Program Evaluation Design.' Below, details read 'Jacob Campbell, Ph.D. LICSW at Heritage University, SOWK 460w Spring 2026.'

title: Spring 2026 SOWK 460w Week 04 - Using Qualitative and Mixed Methods Designs
date: 2026-02-09 11:56:37
location: Heritage University
tags:

  • Heritage University
  • BASW Program
  • SOWK 460w

presentation_video: > “”
description: >

In week four of SOWK 460w, we will start to explore program evaluation methodologies and the practicality of your program evaluation. Most program evaluations use a variant of mixed methods. The chapter readings from Royse (2022) focus on qualitative and mixed-methods evaluations and needs assessments. This week, students will submit their group work plans and review how to complete the form.

The agenda will be as follows:

  • Basics of program design methodologies
  • Examples of qualitative research
  • Planning the design and task completion of your evaluation

Learning Objectives this week include:

  • Differentiate between qualitative, quantitative, and mixed methods program evaluation designs.
  • Describe the value of qualitative methods.
  • Identify and compare three core qualitative methods.
  • Construct a group work plan that outlines key components, indicators, sources, and measures of success for their program evaluation.
Slide 2
The slide titled 'Plan for Week Four' lists the agenda and learning objectives for a program design class. Topics include methodologies, qualitative research, and evaluation. Instructor: Jacob Campbell, Heritage University.

Week Four Plan

Agenda

  • Basics of program design methodologies
  • Examples of qualitative research
  • Planning the design and task completion of your evaluation

Learning Objectives

  • Differentiate between qualitative, quantitative, and mixed methods program evaluation designs.
  • Describe the value of qualitative methods.
  • Identify and compare three core qualitative methods.
  • Construct a group work plan that outlines key components, indicators, sources, and measures of success for their program evaluation.
Slide 3
Text slide with 'WHY QUALITATIVE DESIGN METHODS' and five points: participant descriptions, in-depth topic exploration, program specifics, research expertise, foundational ideas. Jacob Campbell, Ph.D., LICSW at Heritage University. (Royse, 2022) SOWK 460w Spring 2026.

Why Qualitative Design Methods

There are a number of reasons why we might want to gather qualitative data for our study.

  • Framing the topic in participants’ descriptions
  • Exploring a topic in greater depth and with additional details
  • Gaining specifics about what works and what doesn’t work in a program
  • Drawing on research expertise and sensitivity to elicit responses (the text discusses exploring illegal, stigmatized, or socially unacceptable behavior, e.g., opioid use, poor parenting, domestic violence)
  • Uncovering foundational ideas on which further literature can build (when little or no literature exists, the study can serve as an exploratory study)
Slide 4
A tangled string visually represents complexities of qualitative data. Text states qualitative data relates to clinical practices, offers emergent ideas, and includes intersectionality, attributed to various sources. Presentation slide by Jacob Campbell, Ph.D.

Qualities of Qualitative Design Methods

There are some unique aspects of qualitative design methods. They include:

  • Qualitative data includes aspects of intersectionality and is messy and complicated to sort and understand (Fine et al., 2021)
  • Qualitative data makes it possible to identify emergent ideas (Kapp & Anderson, 2010)
  • Qualitative methods are closely related to clinical practices (Kapp & Anderson, 2010)

Reference

Fine, M., Torre, M. E., Oswald, A. G., & Avory, S. (2021). Critical participatory action research: Methods and praxis for intersectional knowledge production. Journal of Counseling Psychology, 68(3), 344-356. https://doi.org/10.1037/cou0000445

Kapp, S. A., & Anderson, G. R. (2010). Agency-based program evaluation: Lessons from practice. Sage Publications. https://doi.org/10.4135/9781544364896

Slide 5
Three illustrations depict qualitative research methods: participant observation, in-depth interview, and focus group. Descriptions explain researchers observe, question, and explore participant experiences. Includes text: 'Three Key Qualitative Methods' and references Jacob Campbell.

Three Key Qualitative Methods

Royse (2022) describes three specific methods for qualitative research:

  • Participant Observation: The researcher observes participants in their natural environment, often as a participant herself
  • In-depth Interview: The researcher asks several open-ended questions to explore participants’ personal histories, experiences, and perspectives
  • Focus Group: The researcher asks in-depth questions of small groups of participants to explore their experiences, perspectives, and cultural norms
Slide 6
A person stands speaking on stage, holding a clicker. The slide reads: 'THE POWER OF VULNERABILITY: An Example of Interviews and Grounded Theory.' Link: 'ted.com/talks/brene_brown_the_power_of_vulnerability.'

The Power of Vulnerability

I wanted to have us watch Brené Brown’s TED Talk, The Power of Vulnerability. How many of you have watched this before?

  • Her description of grounded theory (a qualitative method)
  • The process of understanding and connecting with the research
  • Why I call qualitative methods messy
Slide 7
Overview image of study phases with three main sections: Orientation (discussing PAR principles), Entry Interviews (developing agendas), and Six Co-Designed Sessions (trauma understanding, self-care, and implementing changes). Includes co-researcher roles (social workers, teachers). Methods include group book study, self-care activities, and exploration. Text: 'OVERVIEW OF STUDY PHASES'; '1. Orientation: discussing PAR, its values, tenets, principles, practices, and processes'; '2. Entry interviews: Co-researcher voice in developing agendas and describing needs'; '3. Six co-designed sessions: Understand how trauma impacts students... Develop a tool or recommendation for how other school staff could create similar growth in other schools'; '6 Co-Researchers: 3 School Social Workers, 2 Behavior Interventionist Teachers, 1 Para Educator'; 'Embed in dialogues' with icons for group book study, self-care activity, exploration, reflection, and action.

Example of a Focus Group: Overview of the Study Phases and Introduction to the Study

This slide shows all of the parts of this study, the Trauma-Informed PLC. Sometimes you will also hear me refer to it as my PLC. We will be going through each of these parts in turn to explain what I did before we discuss the results.

Three parts

  1. Orientation
  2. Entry interviews
  3. Six focus group sessions:
  • Understand how trauma impacts students
  • Limiting re-traumatization within the classroom
  • Methods for increasing resiliency factors for students
  • Engaging in self-care and burnout prevention to reduce the impact of secondary trauma
  • Evaluate and implement ideas for promoting systematic changes within a classroom and school-wide
  • Develop a tool or recommendation for how other school staff could create similar growth in other schools

Embedded in Each Dialogue Were:

  • Group Book Study
  • Self-Care Activity
  • Exploration, Reflection, and Action
Slide 8
Diagram illustrates strategies and themes for building a trauma-informed PLC, highlighting learning approaches and thematic goals. Notable elements include self-care, resilience, understanding trauma, and promoting systematic changes. (Campbell, 2023)

Components of Building a Trauma-Informed PLC

The following graphic describes all of the components I have gone through and reviewed. They include the foundations of:

  • Following a Mutual Aid Model
  • Incorporating an Interdisciplinary Framework

The themes of

  • Understand How Trauma Impacts Students
  • Limiting Re-Traumatization Within the Classroom
  • Methods for Increasing Resiliency Factors for Students
  • Engaging in Self-Care and Burnout Prevention to Reduce the Impact of Secondary Trauma
  • Evaluate and Implement Ideas for Promoting Systematic Changes Within a Classroom and School-Wide
  • Develop a Tool or Recommendation for How Other School Staff Could Create Similar Growth in Other Schools

And the learning strategies of

  • Engage in the Process of Reviewing Practice Together for Development
  • Use Idea Generation to Develop New and Novel Ideas
  • Integrate Self-Care Practices Into Groups and Encourage Use to Reduce Compassion Fatigue
  • Use Storytelling to Make Meaning and Develop Cohesion
  • Include Scholarly Sources and Develop Connections to Evidence-Based Practice
  • Define Concepts as a Group to Enhance Understanding
  • Review Protocols for Professional Socialization

Implementing a group focused on trauma-informed care through this format might be an unexpected idea to some. A school administrator joined the orientation meeting and later reached out to share that she could not participate in the PLC at this time due to the time commitment. During the orientation, she commented that she had not realized it was an orientation for an ongoing PLC-style group; she had believed the training would follow a more traditional sit-and-get format, with the commitment being a single hour-and-a-half presentation.

I hope this project will provide an avenue for new ways of learning about trauma-informed care practices in schools that can also come from the PLC and the classroom.

Reference

Campbell, J. (2023). A professional learning community for developing trauma-informed practices using participatory action methods: Transforming school culture for students with emotional and behavioral disabilities (Publication No. 30424801) [Doctoral dissertation, California Institute of Integral Studies]. ProQuest Dissertations and Theses. https://www.proquest.com/dissertations-theses/professional-learning-community-developing-trauma/docview/2813493629/se-2

Slide 9
A table lists observations of a student's behavior during different classes, including activity, behavior, and consequences. It highlights language arts, small group math, PE, and math on specific dates. Text includes: 'QUALITATIVE DESIGNS,' 'OBSERVATIONS,' 'ABC Notes,' 'Student: Harold, Grade: 9th,' 'Jacob Campbell, Ph.D. LICSW at Heritage University,' 'Example of ABC Data,' 'SOWK 460w Spring 2026.'

Example of Participant Observation: ABC Data

The textbook talked about structured observations. In a school-based setting, one method of doing this is collecting ABC (antecedent-behavior-consequence) data; a minimal sketch of an ABC log follows the examples below.

Explain ABC data collection

Other examples

  • Observation while consulting
  • Counts (e.g., cars you can see with drivers wearing seatbelts, or STAR reporting)
  • Structured data collection
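
To make this concrete, below is a minimal sketch of what an ABC log could look like as structured data. It is written in Python purely for illustration; the entries are hypothetical (loosely echoing the Harold example on the slide) and are not taken from an actual observation form:

```python
from collections import Counter

# Each ABC observation records the Antecedent (what happened before),
# the Behavior observed, and the Consequence (what happened after).
# These entries are hypothetical examples.
abc_log = [
    {"date": "2026-02-02", "class": "Language Arts",
     "antecedent": "Given an independent writing task",
     "behavior": "Left seat and wandered the room",
     "consequence": "Redirected by teacher"},
    {"date": "2026-02-03", "class": "Small Group Math",
     "antecedent": "Asked to show work on the board",
     "behavior": "Refused and put head down",
     "consequence": "Allowed to skip turn"},
    {"date": "2026-02-04", "class": "PE",
     "antecedent": "Transition from warm-up to game",
     "behavior": "Argued with a peer",
     "consequence": "Sat out for five minutes"},
]

# Structured observations support simple counts, the same idea as
# tallying drivers wearing seatbelts: how often does each consequence
# follow the behavior?
consequence_counts = Counter(entry["consequence"] for entry in abc_log)
for consequence, count in consequence_counts.items():
    print(f"{consequence}: {count} ({count / len(abc_log):.0%})")
```

Because every observation is recorded the same way, simple tallies like these can sit alongside the qualitative notes in an evaluation.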
Slide 10
Two overlapping speech bubbles, one brown asking 'WHAT IS YOUR PROJECT,' the other gray. Background is white. Text: 'Jacob Campbell, Ph.D., LICSW at Heritage University' and 'SOWK 460w Spring 2026.'

What is your project

[Whole Group Activity] Have groups sit with each other. Give 10 minutes to talk about their projects.

Share out what they see as their project currently.

Slide 11
Presentation slide from a program evaluation lecture titled 'PROGRAM EVALUATION ELEVATOR PITCH.' It lists five elements: Project Title (title of your study); Program Being Evaluated (name of agency and program being evaluated); Research Question(s) (what questions you are trying to answer, which should be clear, focused, and feasible within the scope of a semester-long program evaluation); Types of Data to Be Collected (e.g., archival records, observations, interviews, surveys); and Plan for Data Collection (who will participate, how data will be gathered, and access considerations such as staff permission and informed consent). Footer: Jacob Campbell, Ph.D., LICSW at Heritage University, SOWK 460w Spring 2026.

Program Evaluation Elevator Pitch

Your assignment this week includes a pitch. The following are the parts of that pitch:

  • Project Title: Title of Your Study
  • Program Being Evaluated: Name of Agency and Program Being Evaluated
  • Research Question(s): What question(s) are you trying to answer with your project? Questions should be clear, focused, and feasible within the scope of a semester-long program evaluation.
  • Types of Data to Be Collected: What data do you plan to collect (e.g., archival records, observations, interviews, surveys)? Consider how each data type contributes to answering your research question(s).
  • Plan for Data Collection: How do you plan to collect your data, and from whom? Describe who will participate, how data will be gathered, and any relevant access considerations (e.g., staff permission, informed consent, etc.).

Your responses do not need to be finalized at this stage, and your evaluation may evolve over the course of the semester. As we move through the semester, you may make revisions to your evaluation plan; however, if you propose changes to your data sources or entirely new components, please follow up with the instructor for approval.

Slide 12
Table outlines a 'Program Evaluation Work Plan' with categories for Evaluation Design and Data Collection, including Component, Indicator, Source, Success, Task, Person Responsible, Deadline. Contains text indicating Jacob Campbell, Ph.D., Heritage University, SOWK 460w Spring 2026.

Program Evaluation Work Plan

This is due Monday morning. It is OK if it ends up changing some or not being followed exactly. The idea is for you to have a good understanding of what you will do in your project. The pitch is included in it.

Evaluation Design: what is being looked for in each section

  • Component: What are we doing and producing?
  • Indicator: What information are we looking at?
  • Source: Where will it come from?
  • Success: How will we know what it means?

Definitions on the form

Component: The specific evaluation task, including what data you will collect and the work products you will create. Components should be included for evaluation activities (e.g., surveys, archival data reviews, interviews) and the final products you will produce (e.g., an executive summary or presentation).

Indicators: There may be multiple indicators for any given component. Indicators are the specific measures, variables, or descriptive criteria you will use to assess the component. For evaluation activities, indicators may be qualitative or quantitative. Describe the type of information you will examine (e.g., counts, percentages, averages, ratings, or thematic patterns). If the component is a work product, the indicator may be a brief description of what will be produced (e.g., “a clear, well-organized executive summary”).

Sources: Describe where the data will come from. Your task rows can describe how you will collect the information and the steps involved. Sources may include existing program or agency records (e.g., case files, attendance logs, administrative data) or newly collected data (e.g., surveys, interviews, focus groups). Evaluations that include multiple sources of information are often more meaningful and allow reviewers to triangulate the significance of the collected data. For your work product components, you might simply state that the source is the program evaluation.

Success: Provide a simple description of how you will interpret the information collected. You are seeking to document how your group will know what the component means. Success does not require positive findings, only clear interpretation.

Task, Person Responsible, and Deadline: Divide each component into actionable steps necessary to complete it. Consider the development, implementation, and review of these steps. Add these as tasks, assign responsibility (individual or group), and include anticipated completion dates.
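
For groups who find it helpful to see how one row of the form hangs together, here is a hedged sketch in Python. The survey component, indicator wording, member names, and dates are all hypothetical examples, not part of the assigned form:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    description: str
    person_responsible: str
    deadline: str  # anticipated completion date, e.g., "2026-03-02"

@dataclass
class Component:
    name: str              # Component: what are we doing and producing?
    indicators: list[str]  # Indicator: what information are we looking at?
    source: str            # Source: where will it come from?
    success: str           # Success: how will we know what it means?
    tasks: list[Task] = field(default_factory=list)

# One hypothetical evaluation-activity row
survey = Component(
    name="Client satisfaction survey",
    indicators=["Average rating per question",
                "Thematic patterns in open-ended comments"],
    source="Newly collected data from current program participants",
    success="Ratings and themes can be interpreted against our research question",
    tasks=[
        Task("Draft and pilot survey questions", "Whole group", "2026-02-16"),
        Task("Administer survey on site", "Member A", "2026-03-02"),
        Task("Summarize ratings and code themes", "Member B", "2026-03-16"),
    ],
)
```

The point of the structure is the pairing: every component carries its own indicators, source, and definition of success, and each is broken into tasks with a responsible person and a deadline.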

Show the form and how to change rows.

Slide 13
Slide titled 'Probable Components' lists evaluation activities (surveys, reviews) and final products (logic model, executive summary, presentation). Bottom text credits Jacob Campbell at Heritage University, SOWK 460w Spring 2026.

Probable Components

The following are the components you will include:

  • 1-3 entries for evaluation activities (e.g., surveys, archival data reviews, interviews)
  • 3 entries for final products:
    • Logic model development
    • Executive summary components
    • Final presentation
Slide 14
Slide titled 'Indicators in Program Evaluation' shows three types: Input Indicators measuring contributions, Process Indicators measuring activities and outputs, and Outcome Indicators assessing program effects. Text provides description and attribution.

Indicators in Program Evaluation

The CDC (2021) describes indicators in program evaluations:

Definition: The measurable information used to determine whether a program is being implemented as expected and achieving its outcomes

  • Input indicators measure the contributions necessary to enable the program to be implemented (e.g., funding, staff, key partners, infrastructure)
  • Process indicators measure the program’s activities and outputs (direct products/deliverables of the activities). Together, measures of activities and outputs indicate whether the program is being implemented as planned.
  • Outcome indicators measure whether the program is achieving the expected effects/changes in the short, intermediate, and long term.
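
As a purely hypothetical illustration (the after-school tutoring program and its indicators are invented for this sketch, not drawn from the CDC page), a group might sort candidate indicators by type before adding them to their work plan:

```python
# Hypothetical indicators for an after-school tutoring program,
# grouped by the three CDC indicator types.
indicators = {
    "input": [
        "Grant funding received per semester",
        "Number of trained volunteer tutors",
    ],
    "process": [
        "Tutoring sessions delivered per week",     # activity
        "Students attending at least one session",  # output
    ],
    "outcome": [
        "Change in reading scores over the semester",  # short term
        "Course pass rates at year end",               # intermediate
    ],
}

for indicator_type, examples in indicators.items():
    print(f"{indicator_type.title()} indicators:")
    for example in examples:
        print(f"  - {example}")
```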

Reference

Centers for Disease Control and Prevention. (2021, April 9). Indicators: CDC approach to evaluation. https://www.cdc.gov/evaluation/indicators/index.htm

Slide 15
Slide displaying a QR code and title 'Criteria for Selection of High-Performing Indicators.' Subheading reads 'A Checklist to Inform Monitoring and Evaluation.' Includes link: 'https://wmich.edu/sites/default/files/attachments/u350/2014/Indicator_checklist.pdf.'

Criteria for Selection of High-Performing Indicators

[Small Group Activity] In your groups, review A Checklist to Inform Monitoring and Evaluation

[Whole Group Activity] Debrief

  • What do you think you are going to take and adapt for your evaluation?
Slide 16
Slide from a presentation titled 'Rubric for Program Evaluation Work Plan' presents a table of evaluation criteria with columns for 'Description' and 'Highly Developed,' and criteria such as Completeness, Clarity, Fairness, and Feasibility. Additional text: Jacob Campbell, Ph.D. LICSW at Heritage University, SOWK 460w Spring 2026.

Group Work Plan Rubric

Review the rubric:

Completeness

  • Initial: Very few aspects of the overall research project are included in the plan.
  • Emerging: The plan includes several components, but a few significant processes are not included.
  • Developed: The plan generally outlines most of the research project.
  • Highly Developed: The plan is thorough and covers the entire research project.

Clarity

  • Initial: The plan of what needs to happen in either the evaluation design or the data collection is unclear.
  • Emerging: The plan provides some general idea of tasks that need to be completed but does not include a delineation between evaluation design and data collection.
  • Developed: The plan is understandable and includes information about the design and data collection.
  • Highly Developed: The plan clearly articulates both the evaluation design and data collection that will take place within the research. The evaluation design includes components, indicators, sources, and what success looks like. The data collection identifies specific tasks, the person responsible, and deadlines for completing those tasks.

Fairness

  • Initial: The distribution of tasks is not fair for group members.
  • Emerging: The distribution of tasks is somewhat fair for group members, but some significant tasks or components are unfairly assigned.
  • Developed: The group members are assigned tasks, but a few seem to have more or less work than others.
  • Highly Developed: Tasks related to the assignment are fairly distributed among group members.

Feasibility

  • Initial: None of the components or tasks are feasible.
  • Emerging: The program evaluation does not appear feasible, with significant components not likely to be completed.
  • Developed: The program evaluation appears feasible, but some aspects might seem out of scope or beyond the students’ ability to complete.
  • Highly Developed: The program evaluation plan appears feasible and something the group can accomplish within the semester.
Slide 17
Presentation slide titled 'Post Your Group Work Plan in the Forum.' Includes points on program evaluation, group member names, and work plan description. Instructor: Jacob Campbell, Ph.D., LICSW, Heritage University, SOWK 460w Spring 2026.

Posting Your Group Work Plan in the Forums

Your forum post should include:

  • Where you will do your program evaluation
  • A general idea of what you will look at
  • The names of group members
  • A brief description of your group work plan
Slide 18
Table labels sections for evaluation design and data collection with rows for planning. A clock icon is shown. Text reads: 'TIME TO WORK ON PROGRAM EVALUATION ELEVATOR PITCH AND GROUP WORK PLAN.' Footer: 'Jacob Campbell, Ph.D. LICSW at Heritage University,' 'SOWK 460w Spring 2026.'

Time to Work on Program Evaluation Elevator Pitch and Group Work Plan

[Small Group Activity] Time to work on completing the form