Strategies in Addressing Program Evaluation Challenges for a District-Wide Comprehensive Behavioral Health Model
Amy Kaye, Ph.D. & Jill Snyder, Ph.D.
Agenda
• Introduction to Boston’s Comprehensive Behavioral Health Model (CBHM)
• Background and Need for Evaluation Focus
• Development of CBHM’s Evaluation Plan
– Research Team & Tasks
– Logic Model
– Data Sources
– Indicators
– Evaluation vs. Research Questions
• District Level Data-based Decision Making
• Conclusions
• Discussion
INTRODUCTION TO CBHM IMPLEMENTATION
Boston Public Schools (BPS) Context
• First public school system in the US (1647)
• >120 schools
– 80 Elementary
– 9 Middle
– 31 Secondary
Boston Public Schools Context
• Diverse neighborhoods
• 54,312 students
– 30% English Language Learners
– >100 languages spoken
(MDESE, 2015)
Boston Public Schools Context
• Need for services
– 1 in 5 students with disabilities (MDESE, 2015)
– 1 in 4 Boston children have experienced at least 1 adverse childhood event (BPHC, 2013)
• Limited resources
– 57 school psychologists for >120 schools
– 35% of schools have limited or no behavioral health partners (<.5 FTE)
Comprehensive Behavioral Health Model
CBHM Implementation
• Launch: 2012-2013 school year
• 10 schools per year
• Current: 40 schools
• Executive Work Group Formation
CBHM EVALUATION
School-Based Behavioral Health Evaluation
• Historically, limited evaluation efforts
• Consistently identified as an area in need of critical attention
School-Based Behavioral Health Evaluation Challenges
• Low measure completion and return rates
• Access to data
• Combining and managing large data sources
• Selection of key indicators
• Multiple distinct stakeholders
School-Based Behavioral Health Evaluation Challenges: District Models
• Low measure completion and return rates
• Access to data
• Combining and managing large data sources
• Increasingly larger data sets
• Multiple distinct stakeholders
• Selection of key indicators
• Controlling complex differences across schools
• Establishing one comprehensive plan
Evaluation Goals
• Accountability
• Quality assurance and improvement
• Data-based decision making
[Diagram: CBHM rollout by cohort: Cohort 1, Cohort 2, Cohort 3, Cohort 4, Future Cohorts]
[Diagram: MTSS components (Systems, Data, Practices) and CBHM components (Systems, Practices, Data, Partnerships)]
More info: Geier, Smith, & Tornow (2012)
CBHM Executive Work Group
[Diagram: work group areas: Partners, Family Engagement, Research, Communications, Implementation]
CBHM Research Committee: Organization
WHO? The CBHM Research Committee consists of representatives from multiple agencies and various disciplines: Boston Public Schools, Hospital Partners, University Partners
CBHM Research Committee: Organization
WHAT?
CBHM Research Committee: Organization
WHEN?
Logic Models
Logic Models
• Provide a map for a program or initiative
• Clarify:
– A program’s destination
– The pathways to that destination
– Markers along the pathways
• Explicitly present:
– Assumed theories of change/action
– Assumptions about resources at a program’s disposal
(Shakman & Rodriguez, 2015)
Logic Models
• Benefits:
– Accountability for what matters
– Common language
• Integrate:
– Planning
– Implementation
– Evaluation
– Reporting
Logic Models
INPUTS → OUTPUTS → OUTCOMES
Logic Models
INPUTS: program investments (what we invest)
OUTPUTS: activities and participants (what we do, who we reach)
OUTCOMES: short, medium, and long term (what are the results)
(University of Wisconsin - Extension, 2008)
Logic Models
[Logic model example diagram]
(University of Wisconsin - Extension, 2008)
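To make the inputs, outputs, and outcomes structure concrete for planning purposes, here is a minimal data-structure sketch in Python; the class and field names are illustrative assumptions, not part of the CBHM materials or the Wisconsin-Extension guide.

from dataclasses import dataclass, field

@dataclass
class Output:
    what: str        # activity: what we do
    who: list[str]   # participants: who we reach

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)               # what we invest
    outputs: list[Output] = field(default_factory=list)           # activities and participants
    outcomes: dict[str, list[str]] = field(default_factory=dict)  # short/medium/long term results

# Illustrative entries drawn from the CBHM slides
cbhm = LogicModel(
    inputs=["School Staff", "BHS Staff", "CBHM Coaches"],
    outputs=[Output("School Based PD", ["CBHM School Staff"]),
             Output("Screening", ["Teachers", "Students"])],
    outcomes={"short": ["Change in CBHM School Staff Knowledge"],
              "medium": ["Change in CBHM School Staff Behavior"],
              "long": ["Safe & Supportive Learning Environments"]},
)
print(cbhm.outcomes["long"])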
CBHM Logic Model: Theory
OUTPUTS → LONG TERM OUTCOMES
• BHS: CBHM implementation began by building capacity and expertise among BHS Staff
• Schools: BHS Staff with expertise in MTSS and Behavioral Health supported the adoption and implementation of MTSS frameworks within schools
• Students: BPS Students, Families and Communities benefit from a broad range of academic and behavioral supports, provided within a MTSS framework
CBHM: District Evaluation
[Theory diagram repeated, with the INTERVENTION label highlighted]
CBHM: School Evaluation
INPUTS: School Staff; Partners; Families and Communities; District Support; BHS Staff; CBHM Coaches
OUTPUTS (What? / Who?):
• School Based PD: CBHM School Staff
• Screening: Teachers/Students
• Progress Monitoring: Teachers/Students
• Social Emotional Learning: Teachers
• Data Based Problem Solving Teams: CBHM School Staff, Partners, Families
• Coaching: CBHM School Staff, Support Staff
OUTCOMES:
• SHORT Term: Change in CBHM School Staff Knowledge
• MEDIUM Term: Change in CBHM School Staff Behavior
• LONG Term: Safe & Supportive Learning Environments; High Quality, Equitable Behavioral Health Services; Academic and Social Competence
Evaluation Questions
PROCESS QUESTIONS
• How many students, parents, teachers are being reached?
FIDELITY QUESTIONS
• Are CBHM activities being implemented as outlined in CBHM schools?
• Which portions of CBHM are being implemented with least & greatest fidelity?
OUTCOMES QUESTIONS
• Are BPS behavioral health staff demonstrating increased knowledge and changes in their behaviors at their schools?
• Are staff in CBHM schools demonstrating increased knowledge and changes in behaviors consistent with CBHM?
• Are students in CBHM schools demonstrating improvements in academic and social competence?
Evaluation Plan
Evaluation Plan Considerations
• What are our indicators for each of our outputs and outcomes?
• What data sources inform each of these outputs and outcomes?
Evaluation Plan: Data Sources
• BPS Databases: Data Warehouse; SEIMS; BIMAS (Meier, McDougal, & Bardos, 2011); ATI; LIZA; SNAP; Aspen/SIS; MCLASS (DIBELS & TRC)
Navigating Access to District Databases
Evaluation Plan: Data Sources
• BHS Data Sources: Staff Monthly Activity Reports; Professional Development Workshop Evaluations; Time Sampling; Tiered Fidelity Inventory (Algozzine et al., 2014); Annual BCH Community Partnership Report
Where do we even begin?
Data Source Best Practices
• Reliable & Valid
– Standardized protocols
– Trained data gatherers
• Actionable & Relevant
– Available across all schools
– Integrated into daily routines
• Frequent & Timely
– Collected, analyzed, and reported in a frequent & timely manner
From Ward (2015)
Identifying Indicators
[School logic model repeated; PROCESS and FIDELITY questions are mapped onto the OUTPUTS (What? / Who?) columns]
Organizing Data Sources into an Evaluation Plan: School Level Outputs
Output: Universal Screening
• Indicator: % of students screened
• Source: BIMAS
• Aim for CBHM Fidelity: >80% of students
• Timeline for Data Entry/Update: UA 1: 10/1 - 12/15; UA 2: 3/1 - 5/15
• Do I Have Access? Yes
• Where is Data Located? (File Name & Person w/Access): “BIMAS Data Inventory” in Research folder (All Research Team)
• Action Steps: □ Enter data into database (Amy); □ Amy access to BIMAS data (Jill); □ Update BIMAS Data Inventory document monthly (Amy)
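As a rough sketch of how the “% of students screened” indicator could be computed from a screening export, assuming a hypothetical CSV with one row per enrolled student and a boolean screened flag (the file and column names are illustrative, not the actual BIMAS or BPS layout):

import pandas as pd

FIDELITY_AIM = 0.80  # CBHM aim: >80% of students screened

# Hypothetical export: one row per enrolled student for the UA window,
# with columns student_id, school, screened (True/False).
roster = pd.read_csv("bimas_ua1_roster.csv")

by_school = (
    roster.groupby("school")["screened"]
          .mean()                      # proportion of enrolled students screened
          .rename("pct_screened")
          .reset_index()
)
by_school["meets_aim"] = by_school["pct_screened"] > FIDELITY_AIM
print(by_school)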
Organizing Data Sources into an Evaluation Plan: District Level Outputs
Output: Data Management and Accountability
• Indicator: % of schools using BIMAS universal screening (>80% screened)
• Source: BIMAS
• Aim for CBHM Fidelity: 100% of CBHM schools
• Timeline for Data Entry/Update: UA 1: 10/1 - 12/15; UA 2: 3/1 - 5/15
• Do I Have Access? Yes
• Where is Data Located? (File Name & Person w/Access): “BIMAS Data Inventory” in Research folder (All Research Team)
• Action Steps: □ Enter data into database (Amy); □ Amy access to BIMAS data (Jill); □ Update BIMAS Data Inventory document monthly (Amy)
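The district-level indicator then rolls those school-level results up to the percentage of CBHM schools meeting the >80% aim (target: 100% of CBHM schools). A self-contained sketch with placeholder school names and rates, not actual CBHM figures:

import pandas as pd

# Placeholder per-school screening rates; in practice these would come from
# the school-level computation sketched above.
by_school = pd.DataFrame({
    "school": ["School A", "School B", "School C"],
    "pct_screened": [0.92, 0.75, 0.88],
})
by_school["meets_aim"] = by_school["pct_screened"] > 0.80

n_at_fidelity = int(by_school["meets_aim"].sum())
n_schools = len(by_school)
print(f"{n_at_fidelity}/{n_schools} CBHM schools ({n_at_fidelity / n_schools:.0%}) met the >80% screening aim")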
Identifying Indicators
[School logic model repeated; OUTCOMES questions are mapped onto the short, medium, and long term outcomes]
Identifying Indicators
OUTCOMES (SCHOOL LEVEL)
SHORT TERM:
• Increased staff knowledge of social emotional development & behavioral health
• Increased staff knowledge of best practices in addressing student behavioral health needs
• Increased confidence in addressing student behavioral health needs
MEDIUM and LONG TERM:
• Improved student academic performance
• Integrated academic and social-emotional learning
• Increased positive behaviors
• Improved school climate
• Data-based decision-making
• Targeted supports and services
• Community partnerships
• Improved student academic engagement
• Increased school capacity to provide services
• Improved access to services
Identifying Indicators
OUTCOMES with data sources:
SHORT TERM:
• Increased staff knowledge of social emotional development & behavioral health; increased staff knowledge of best practices in addressing student behavioral health needs; increased confidence in addressing student behavioral health needs [PD Evaluations]
MEDIUM and LONG TERM:
• Improved student academic performance; integrated academic and social-emotional learning [MCAS]
• Increased positive behaviors [BIMAS]
• Improved school climate; data-based decision-making; targeted supports and services [School Climate Survey]
• Improved student academic engagement [Attendance]
• Community partnerships; increased school capacity to provide services [Tiered Fidelity Inventory; Monthly Clinician Reports; Partnership Report]
• Improved access to services [FTEs; Monthlies]
Organizing Data Sources into an Evaluation Plan: School Level Outcomes
Outcome: Increased staff knowledge
• Indicator: % of school staff agreeing with satisfaction survey items
• Source: PD Satisfaction Surveys
• Aim for CBHM Fidelity: >80%
• Timeline for Data Entry/Update: After workshops
• Do I Have Access? No
• Where is Data Located? (File Name & Person w/Access): BHS Files (BHS Staff)
• Action Steps: □ Obtain data for quarterly reports
Regular Review of Data with Research Committee
[Diagram: review teams (SST, Grade Level Team, ILT, EWG) aligned to levels of data: Student, Classroom, Grade, School, District]
District Level Data-Based Decision Making
(Cohort 1, Cohort 2, Cohort 3, Cohort 4, Future Cohorts)
Data Review
• MONTHLY: Thermometer of clinical activities
• QUARTERLY: Report of key outputs and outcomes to EWG
• ANNUALLY: Report of overall outputs and outcomes to EWG & stakeholders
Data Review: Monthly Thermometer
[Chart: Behavioral Health Services Monthly Report, September 2015, Boston Public Schools; data based on school psychologists only (FTE = 54). Bar charts of monthly counts for: Assessments; IEP Counseling & Consultations; Crises; FBA/BIP; Number of Students Served in Tier II Groups; Consultations; Student Support; IEP/504 Meetings; School-Based Crises; Suicide Risk Assessments; Threat Assessments; Parents; School-Wide Teams; Prevention; Number of PD Facilitated; Community Partners; Administration. Support for research and data provided by Boston Children’s Hospital.]
Data Review: Annual Report
• Reporting at 3 levels: student, school, and district
CBHM STUDENT Outcomes
Cohort 1: Decrease in Problem Behaviors
[Chart: BIMAS average T-scores for the Conduct, Negative Affect, and Cognitive/Attention scales, 2012-2014, plotted against the 50th percentile (T-score of 50)]
CBHM STUDENT Outcomes
Cohort 1: Increase in Positive Behaviors
[Chart: BIMAS average T-scores for the Social and Academic Functioning scales, 2012-2014, plotted against the 50th percentile (T-score of 50)]
CBHM STUDENT Outcomes
Cohort 1: Increase in Academic Outcomes
[Chart: MCAS average scaled scores for ELA and Math, 2012-2014, plotted against the Proficient threshold (scaled score of 240)]
CBHM STUDENT Outcomes
• Compared Fall 2013 with Fall 2014
• All CBHM students with data available for both screenings (n = 738)
Findings:
• Statistically significant increase in average BIMAS Social Scale T-Score (p < .01)
• Nearly significant decrease in average BIMAS Conduct Scale T-Score (p = .063)
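Because the same 738 students are compared across the two screenings, this is a paired design. The slides do not state which test was run; one plausible approach is a paired-samples t-test on the BIMAS Social Scale T-scores, sketched below with hypothetical file and column names:

import pandas as pd
from scipy import stats

# Hypothetical wide file: one row per student with both screenings (n = 738),
# columns social_f13 and social_f14 holding BIMAS Social Scale T-scores.
scores = pd.read_csv("bimas_fall13_fall14.csv").dropna(subset=["social_f13", "social_f14"])

t_stat, p_value = stats.ttest_rel(scores["social_f14"], scores["social_f13"])
mean_change = (scores["social_f14"] - scores["social_f13"]).mean()
print(f"Mean change in Social T-score: {mean_change:+.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")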
CBHM SCHOOL Outcomes
Cohort I: Attendance Rates at CBHM Schools Compared to District (Source: DESE, SY 13-14)
[Chart: attendance rates from 86% to 100% for Sumner, New Mission, Mattahunt, Mason, Joseph Lee, Jackson Mann, Conley, Boston Arts Academy, and Boston Latin School, compared to the BPS district rate]
CBHM SCHOOL Outcomes
CBHM Implementation Fidelity: SY 2013-14 (Fidelity For Now)
CBHM DISTRICT Outcomes
SCHOLARLY PURSUITS:
• Book Chapter
• Articles
• Professional Presentations
• Journal Articles
• Grant Submissions
• TV Interviews
Examples: District Level Data-Based Decision Making
• Data Retreats
– August & October 2014
– Review of annual report data
– Lack of clarity surrounding what might be prompting improved outcomes
• Need for more consistent fidelity data
– End of 2014-15: Use of SWPBIS Tiered Fidelity Inventory (Algozzine et al., 2014) introduced
Examples: District Level Data-Based Decision Making
• April 2015 Quarterly Report Review
– Screening data
– Remains low despite significant increases in #
– Pattern of drop off in the spring
• How can we help improve screening completion rates?
• Research question:
– What is getting in the way of screening completion?
• Further exploration into this question to come
BIMAS Completion Rates
[Chart: district enrollment vs. number of students screened for each universal assessment window: UA 1 in 2012-13; UA 1 and UA 2 in 2013-14; UA 1 and UA 2 in 2014-15]
Conclusions: Successful Strategies and Tools
• Network, network, network: Community Partnerships; Data Accountability Office
• Organize and delegate: Data; Evaluation; Research
• Write it down: Evaluation Plan; Evaluation Timeline; Report Templates
• Share your data: Frequency of Sharing Data; Research Procedures; Sustainability (McIntosh et al., 2015)
• CBHM Research Committee
• CBHM Executive Work Group
• UMass Boston Practicum Students
• Boston Children’s Hospital Evaluation Team
• Behavioral Health Services Staff at Boston Public Schools
• CBHM School staff, students, and families
Questions? Comments?
Questions? Comments? Contact…
• www.cbhmboston.com
• Amy Kaye
[email protected]
• Jill Snyder
[email protected]
References
• Algozzine, B., Barrett, S., Eber, L., George, H., Horner, R., Lewis, T., Putnam, B., Swain-Bradway, J., McIntosh, K., & Sugai, G. (2014). School-wide PBIS Tiered Fidelity Inventory. OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports. www.pbis.org.
• Boston Public Health Commission, Research and Evaluation Office (2013). Health of Boston’s Children: Parent and Caregiver Perspectives. Retrieved from http://www.bphc.org/healthdata/
• Geier, R., Smith, S., & Tornow, M. (2012, January). District data teams: A leadership structure for improving student achievement [White paper]. Retrieved from http://www.publicconsultinggroup.com/education2/library/white_papers/.
• Massachusetts Department of Elementary and Secondary Education (2015). Boston Public Schools District Profile. Retrieved from http://profiles.doe.mass.edu/profiles
• McIntosh, K., Kim, J., Mercer, S. H., Strickland-Cohen, M. K., & Horner, R. H. (2015). Variables associated with enhanced sustainability of school-wide positive behavioral interventions and supports. Assessment for Effective Intervention, 40(3), 184-191.
• Shakman, K., & Rodriguez, S. M. (2015, April). Logic models for program design, implementation, and evaluation: Workshop toolkit. Institute of Education Sciences.
• University of Wisconsin - Extension (2008). Developing a logic model: Teaching and training guide [PowerPoint slides]. Retrieved from http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html.
• Ward, C. S. (2014, August). Conducting implementation informed evaluations: Practical applications and lessons from implementation science [PowerPoint slides]. Retrieved from www.relmidatlantic.org/pulic_event/conducting-implementation-informed-evaluations-practicalapplications-and-lessons.