Spring 2025 SOWK 460w Week 04 - Using Qualitative and Mixed Methods Designs

Slide 1
Title text states 'USING QUALITATIVE & MIXED METHODS DESIGNS.' A brown goggles graphic labeled 'Program Evaluation Design' features centrally. Below, text reads: 'Jacob Campbell, Ph.D. LICSW at Heritage University SOWK 460w Spring 2025.'


title: Spring 2025 SOWK 460w Week 04 - Using Qualitative and Mixed Methods Designs
date: 2025-02-10 09:52:01
location: Heritage University
tags:

  • Heritage University
  • BASW Program
  • SOWK 460w

presentation_video: > “”
description: >

In week four of SOWK 460, we will start to explore program evaluation methodologies and the practicality of your program evaluation. Most program evaluations use a variant of mixed methods. The chapter readings from Royse (2022) focus on qualitative and mixed methods evaluations and needs assessments. This week, students will submit their group work plans and review how to complete the form.

The agenda will be as follows:

  • Basics of program design methodologies
  • Examples of qualitative research
  • Planning the design and task completion of your evaluation
Slide 2
Slide presents an agenda. Bullet points cover 'Basics of program design methodologies,' 'Examples of qualitative research,' and 'Planning the design and task completion of your evaluation.' Footer includes the name Jacob Campbell and SOWK 460w Spring 2025.

Agenda

  • Basics of program design methodologies
  • Examples of qualitative research
  • Planning the design and task completion of your evaluation
Slide 3
Text slide titled 'Why Qualitative Design Methods' lists five benefits: framing topics, exploring depth, gaining specifics, needing research expertise, and uncovering foundational ideas. Authored by Jacob Campbell at Heritage University.

Why Qualitative Design Methods

Royse (2022) describes a number of reasons why we might want to gather qualitative data for our study.

  • Framing the topic in participants’ descriptions
  • Exploring a topic in greater depth and with additional details
  • Gaining specifics about what does and doesn’t work in a program
  • Needing research expertise and sensitivity to draw out responses (text talked about exploring illegal, stigmatized, or socially unacceptable behavior; e.g., opioid use, poor parenting, domestic violence)
  • Uncovering foundational ideas to build further literature on (consider a topic with no existing literature and using this as an exploratory study)
Slide 4
The slide features a tangled string, symbolizing complexity, with text discussing qualitative design methods. Key points mention clinical practice connection, intersectionality, and emergent ideas. It includes references and course information.

Qualities of Qualitative Design Methods

There are some unique aspects of qualitative design methods. They include:

  • Qualitative data includes aspects of intersectionality and is messy and complicated to sort and understand (Fine et al., 2021)
  • With qualitative data, there is an ability to find emergent ideas (Kapp & Anderson, 2010) (very exploratory; finding connections between things; the example of Brené Brown)
  • Closely related to clinical practices (Kapp & Anderson, 2010)

References

Fine, M., Torre, M. E., Oswald, A. G., & Avory, S. (2021). Critical participatory action research: Methods and praxis for intersectional knowledge production. Journal of Counseling Psychology, 68(3), 344-356. https://doi.org/10.1037/cou0000445

Kapp, S. A., & Anderson, G. R. (2010). Agency-based program evaluation: Lessons from practice. Sage Publications. https://doi.org/10.4135/9781544364896

Slide 5
Diagram showing three qualitative research methods: participant observation, in-depth interview, and focus group. Each method is illustrated with icons and brief descriptions. Title reads 'Three Key Qualitative Methods.'

Three Key Qualitative Methods

Royse (2022) describes three specific methods for qualitative research:

  • Participant Observation: The researcher observes participants in their natural environment, often as a participant herself
  • In-depth Interview: The researcher asks several open-ended questions to explore participants’ personal histories, experiences, and perspectives
  • Focus Group: The researcher asks in-depth questions of small groups of participants to explore their experiences, perspectives, and cultural norms
Slide 6
A person stands on stage speaking, holding a clicker. The slide reads: 'The Power of Vulnerability: An Example of Interviews and Grounded Theory.' Additional text includes 'Brené Brown' and a TED talk link.

An Example of Interviews and Grounded Theory: The Power of Vulnerability

The textbook describes grounded theory, which is the methodology that Brown (2010) uses and discusses in her talk. The talk also demonstrates the emergent way that qualitative data can be presented.

How many of you have watched this before?

  • Her description of grounded theory (a qualitative method)
  • The process of understanding and connecting with the research
  • Why I call qualitative methods messy

[Whole Group Activity] Watch the video The Power of Vulnerability

[Whole Group Activity] Debrief

  • What did you hear and see?
  • What stands out to you about this as a method for gathering data?

Reference

Brown, B. (2010). The power of vulnerability [Video]. TED. https://www.ted.com/talks/brene_brown_the_power_of_vulnerability

Slide 7
The image is a slide titled 'Overview of Study Phases,' outlining three main phases: Orientation, Entry Interviews, and Six Co-Designed Sessions. It lists roles like '6 Co-Researchers' and activities such as 'Group Book Study' and 'Self-Care Activity.'

Example of Focus Group: Overview of the Study Phases - Introduction to Study

This slide shows all of the parts of this study, the Trauma-Informed PLC. Sometimes you will also hear me refer to it as my PLC. We will go through each of these parts in turn to explain what I did before we discuss the results.

Three parts

  1. Orientation
  2. Entry interviews
  3. Six focus group sessions:
  • Understand how trauma impacts students
  • Limiting re-traumatization within the classroom
  • Methods for increasing resiliency factors for students
  • Engaging in self-care and burnout prevention to reduce the impact of secondary trauma
  • Evaluate and implement ideas for promoting systematic changes within a classroom and school-wide
  • Develop a tool or recommendation for how other school staff could create similar growth in other schools

Embedded in each dialogue were:

  • Group Book Study
  • Self-Care Activity
  • Exploration, Reflection, and Action
Slide 8
A slide displays two sections: 'Learning Strategies' and 'Themes,' listing concepts for developing trauma-informed professional learning communities. Notable text includes 'COMPONENTS IN BUILDING A TRAUMA-INFORMED PLC' and 'Campbell, 2023.'

Components of building a trauma-informed PLC

The following graphic describes all of these components that I have gone through and reviewed. They include the foundations of:

  • Following a mutual aid model
  • Incorporating an Interdisciplinary Framework

The themes of

  • Understand How Trauma Impacts Students
  • Limiting Re-Traumatization Within the Classroom
  • Methods for Increasing Resiliency Factors for Students
  • Engaging in Self-Care and Burnout Prevention to Reduce the Impact of Secondary Trauma
  • Evaluate and Implement Ideas for Promoting Systematic Changes Within a Classroom and School-Wide
  • Develop a Tool or Recommendation for How Other School Staff Could Create Similar Growth in Other Schools

And the learning strategies of

  • Engage in the Process of Reviewing Practice Together for Development
  • Use Idea Generation to Develop New and Novel Ideas
  • Integrated Self-Care Practices Into Groups and Encourage Use to Reduce Compassion Fatigue
  • Use Storytelling to Make Meaning and Develop Cohesion
  • Include Scholarly Sources and Develop Connections to Evidence-Based Practice
  • Define Concepts as a Group to Enhance Understanding
  • Review Protocols for Professional Socialization

Implementing a group focused on trauma-informed care through this format might be an unexpected idea to some. A school administrator joined the orientation meeting and later reached out to me to share that she could not participate in the PLC at this time due to the time commitment. During the orientation, she commented that she had not realized it was an orientation for an ongoing PLC-style group; she had believed the training would follow a more traditional sit-and-get format and that the commitment was a single hour-and-a-half presentation.

I hope this project will provide an avenue for new ways of learning about trauma-informed care practices in schools that can also come from the PLC and the classroom.

Reference

Campbell, J. (2023). A professional learning community for developing trauma-informed practices using participatory action methods: Transforming school culture for students with emotional and behavioral disabilities (Publication No. 30424801) [Doctoral dissertation, California Institute of Integral Studies]. ProQuest Dissertations and Theses. https://www.proquest.com/dissertations-theses/professional-learning-community-developing-trauma/docview/2813493629/se-2

Slide 9
A chart titled 'ABC Notes' lists observations of a 9th-grade student named Harold, detailing activities, behaviors, and consequences during specific dates/times. Categories include Language Arts, PE, and Math. Printed text includes 'Example of ABC Data,' and 'SOWK 460w Spring 2025.'

Example of Participant Observation: ABC Data

The textbook talked about structured observations. In a school-based setting, one method of doing this is collecting ABC (antecedent-behavior-consequence) data.

Explain ABC data collection

Other examples

  • Observation while consulting
  • Counts (like counting cars where you can see seatbelts being worn, or STAR reporting)
  • Structured data collection
Slide 10
Speech bubbles interact, with 'WHAT IS YOUR PROJECT' text in one. Context shows presentation slide details: Jacob Campbell, Ph.D., LICSW at Heritage University, SOWK 460w Spring 2025.

What is your project?

[Whole Group Activity] Have groups sit with each other. Give 10 minutes to talk about their projects

Share out what they see as their project currently.

Slide 11
Table titled 'Program Evaluation Work Plan' organizes a work plan for evaluation design and data collection, with columns for component, indicator, source, success, task, responsible person, and deadline.

Program Evaluation Work Plan

This is due this Sunday. It is OK if it ends up changing somewhat or not being followed exactly. The idea is for you to have a good understanding of what you will do in your project.

Evaluation Design

  • Component: A part of your evaluation
  • Indicator: Measurable information about program implementation
  • Source: Data to be collected
  • Success: What you will know by collecting this information

Talk about using the form and being able to change or add rows.

Slide 12
Slide with text lists four components: 'Evaluative aspects,' 'Logic model development,' 'Executive summary components,' and 'Final presentation.' Bottom text mentions Jacob Campbell, Ph.D., Heritage University, and 'SOWK 460w Spring 2025.'

Probable Components

The following are the components you will likely include:

  • Evaluative aspects (likely two or three)
  • Logic model development
  • Executive summary components
  • Final presentation
Slide 13
Slide outlines program evaluation indicators. Features three sections: Input Indicators (measure contributions), Process Indicators (measure activities/outputs), Outcome Indicators (measure program success). Provides detailed explanations for each. Footer credits Jacob Campbell, Ph.D., LICSW at Heritage University.

Indicators in Program Evaluation

The CDC (2021) describes indicators in program evaluations:

Definition: The measurable information used to determine whether a program is being implemented as expected and achieving its outcomes

  • Input indicators measure the contributions necessary to enable the program to be implemented (e.g., funding, staff, key partners, infrastructure)
  • Process indicators measure the program’s activities and outputs (direct products/deliverables of the activities). Together, measures of activities and outputs indicate whether the program is being implemented as planned.
  • Outcome indicators measure whether the program is achieving the expected effects/changes in the short, intermediate, and long term.

Reference

Centers for Disease Control and Prevention. (2021, April 9). Indicators: CDC approach to evaluation. https://www.cdc.gov/evaluation/indicators/index.htm

Slide 14
A presentation slide features a QR code on a brown background. Text reads, 'CRITERIA FOR SELECTION OF HIGH-PERFORMING INDICATORS' and provides a link and academic attribution to Jacob Campbell, Ph.D.

Criteria for Selection of High-Performing Indicators

[Small Group Activity] In your groups, review A Checklist to Inform Monitoring and Evaluation

[Whole Group Activity] Debrief

  • What do you think you are going to take and adapt to your evaluation?
Slide 15
The image is a slide titled 'PROGRAM EVALUATION WORK PLAN.' It features a table for planning, divided into columns: Component, Indicator, Source, Success, Task, Person Responsible, and Deadline. Text describes data and evaluation.

Work Time: Program Evaluation Work Plan

[Small Group Activity] Give time to work on plan, answer questions.

Slide 16
Text slide instructing to post a group work plan in the forum; includes requirements: evaluation location, overview, group member names, and description. Attributed to Jacob Campbell, Heritage University, for SOWK 460w Spring 2025.

Posting your Group Work Plan in the Forums

Your forum post should include:

  • Where you will do your program evaluation
  • A general idea of what you will look at
  • The names of group members
  • A brief description of your group work plan
Slide 17
Rubric table evaluates program work plans on completeness, clarity, fairness, and feasibility. Each criterion includes descriptions under 'Highly Developed.' Title: 'RUBRIC FOR PROGRAM EVALUATION WORK PLAN.' Footer notes author and course details.

Group Work Plan Rubric

Review the rubric:

The rubric rates each criterion from Initial to Highly Developed:

Completeness
  • Initial: Very few aspects of the overall research project are included in the plan.
  • Emerging: The plan includes several components, but a few significant processes are not included.
  • Developed: The plan generally outlines most of the research project.
  • Highly Developed: The plan is thorough and covers the entire research project.

Clarity
  • Initial: The plan of what needs to happen in either the evaluation design or the data collection is unclear.
  • Emerging: The plan provides some general idea of tasks that need to be completed but does not include a delineation between evaluation design and data collection.
  • Developed: The plan is understandable and includes information about the design and data collection.
  • Highly Developed: The plan clearly articulates both the evaluation design and data collection that will take place within the research. The evaluation design includes components, indicators, sources, and what success looks like. The data collection identifies specific tasks, the person responsible, and deadlines for completing those tasks.

Fairness
  • Initial: The distribution of tasks is not fair for group members.
  • Emerging: The distribution of tasks is somewhat fair for group members, but some significant tasks or components are unfairly assigned.
  • Developed: The group members are assigned tasks, but a few seem to have more or less work than others.
  • Highly Developed: Tasks related to the assignment are fairly distributed among group members.

Feasibility
  • Initial: None of the components or tasks are feasible.
  • Emerging: The program evaluation does not appear feasible, with significant components not likely to be completed.
  • Developed: The program evaluation appears feasible, but some aspects might seem out of scope or out of the student’s ability to complete.
  • Highly Developed: The program evaluation plan appears feasible and something the group can accomplish within the semester.