Transcript
Welcome NIPN Operations Training Washington, DC
Welcome — NIPN Operations Training, Washington, DC
[Map: year each Improvement Partnership (IP) joined NIPN, by state; legend: IP, Pre-IP, Inactive IP (not currently participating), Year IP joined NIPN]

NIPN and Collaborative Improvement and Innovation Network — Wendy Davis, MD
National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training, October 2-4, 2013

What is a CoIN?
A CoIN, or Collaborative Innovation Network, has been described as a cyber-team of self-motivated people with a collective vision, enabled by the Web to collaborate in achieving a common goal by sharing ideas, information, and work.1
Key elements of a CoIN are:
• Being a "cyberteam" (i.e. most CoIN work will be distance-based)
• Innovation comes through rapid and on-going communication across all levels.

Collaborative Improvement & Innovation Network to Reduce Infant Mortality (CoIIN)
• Describes how individuals will work (and learn) collaboratively to develop, implement, and evaluate strategies to reduce infant mortality.
• Adapted to reflect focus on both innovation and improvement: Collaborative Improvement & Innovation Network to Reduce Infant Mortality (CoIIN).
1 Gloor PA. Swarm Creativity: Competitive Advantage through Collaborative Innovation Networks. New York: Oxford University Press, 2006.
Wendy Davis, MD, FAAP — October 3, 2013

CoIIN: History
January 2012: Infant Mortality Summit, New Orleans (Regions IV and VI – AL, FL, GA, KY, MS, NC, SC, TN; AR, LA, NM, OK, TX)
Designed to meet stated needs related to:
• Common evidence-based strategies to reduce infant mortality
• Shared, collaborative learning and action across states
March 2012: initiative launched to support the adoption of collaborative learning and quality improvement principles and practices to reduce infant mortality and improve birth outcomes
Developed in partnership with ASTHO, AMCHP, March of Dimes, CityMatCH, CMS, and CDC

CoIIN Structure: Regions IV and VI Priorities & Teams
5 state-identified priorities:
• Reducing early elective deliveries <39 weeks (EED)
• Expanding interconception care in Medicaid (ICC)
• Reducing SIDS/SUID (SS)
• Increasing smoking cessation among pregnant women (SC)
• Enhancing perinatal regionalization (PR)
Strategy Teams:
• 2-3 Leads (topical experts)
• Data and/or methods experts (as needed)
• 2 staff from MCHB and partner organizations
• Self-selected members from each of the 13 states in Regions IV and VI
Teams average 25-30 members; state delegations range from 7-13 members.

CoIIN Structure: Region V Priorities / CoIIN: Structure (cont.)
4 region-identified priorities: Lifespan: originally 12-18 months; Regions IV & VI recently extended through August, 2014 Support provided by contract through MCHB Foci, activities, and outcomes are “team driven” Early elective deliveries (EED) <39 weeks Preconception Health Social Determinants Reducing SIDS/SUID (SS) States: IL, IN, MI, MN, OH, WI 7 8 CoIIN: Launch (Regions IV & VI) Face-to-face meeting July 2012 Learning sessions on: CoIIN: Launch (Region V) Existing efforts to reduce infant mortality and improve birth outcomes Collaborative Learning Quality Improvement Infant Mortality Summit March, 2013 MCHB + AMCHP + ASTHO + Abt Associates + VCHIP Face-to-face meeting October 1-2, 2013 Regions IV and VI update (report successes), “peer to peer learning,” plan for next 11 months Region V – QI learning, priority selection, team development Keynote: focus on Social Determinants (overarching priority selected by Region V) Meeting Goals: Promote team building (5 Strategy & 13 State Teams) Provide training in methods for QI and collaborative learning Offer a structured environment to plan, implement, and test innovation Provide an opportunity for State Teams to Report-Out on activities and successes since January 2012 Infant Mortality Summit. 9 10 CoIIN: Challenges CoIIN: Summary Logistics Regional, inter- and intra-State differences Strategy-specific challenges Adoption of collaborative practices under challenging logistical circumstances 11 Launched in response to stated needs among the 13 States in Regions IV and VI; now extended to Region V Designed to help states use the science of quality improvement and collaborative learning to improve birth outcomes over a period of 12-18 months Part of a portfolio of efforts to improve birth outcomes and works in partnership with these initiatives 12 2 CoIIN + NIPN: Opportunities? Encouraged current participants to consider outreach to IPs in states in Regions IV, V & VI Questions/Discussion Sustainability? Potential for future projects of mutual interest Infant Mortality reduction as a population health objective of interest to IP members and partners Infuse health disparities/health equity into IP projects (e.g., expand approach to data collection to include geographic mapping) Thoughts? 13 14 IPs with expertise on each topic will begin these facilitated discussions—please pick TWO sessions to attend. There will be an announcement when you should switch. From Design to Documentation and Promotion: Practical Approaches to Getting the Most out of Each QI Project Topics include: •Project Design and Measure Selection •Successful Practice Recruitment and Engagement Strategies •Managing Data and Conducting Analyses (IN ROOM NEXT DOOR) •Results Promotion and Dissemination National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013 National Improvement Partnership Network (NIPN) 3rd Annual Meeting March 8-9, 2010 Paul E. Jarris, MD Executive Director, Assoc. of State & Territorial Health Officials (ASTHO) Public Health and Improvement Partnerships: Using QI to Strengthen Performance and Improve Population Health Outcomes Paul Jarris, MD National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013 Veteran, VT 50 18 3 Paul E. Jarris, MD, MBA ASTHO Executive Director Public Health Accreditation Board (PHAB) National Public Health Improvement Initiative (NPHII) Tool Development ◦ ROI ◦ Health Equity Index ASTHO lead Collaboratives Policy vs. 
microsystems “The goal of the national accreditation program is to improve and protect the health of the public by advancing the quality and performance of tribal, state, local, and territorial public health departments” Accreditation is the measurement of health department performance against a set of nationally recognized, practice-focused and evidence-based standards There are three prerequisites that a health department must have to apply for accreditation: community health assessment, health improvement plan, and department strategic plan PHAB Standards and Measures contain 12 domains of competency that look at health agency: Health Assessment, Investigation, Information and Education, Community Engagement, Policies and Plans, Public Health Law, Access to Care, Workforce, QI, Evidence Based Practice, Administration and Management, Governance http://www.phaboard.org/ $32.5 million in CDC grants to 73 governmental entities (states, territories, a few large cities and one tribe) Focus is on QI and Performance Management All grantees are required to address their readiness for PHAB accreditation. Each NPHII grant funds a Performance Improvement Manager (PIM). Many grantees also fund an Accreditation Coordinator: ◦ CDC provides networking and TA support for all PIMs ◦ ASTHO provides networking and TA support for Accreditation Coordinators in states. 4 Provide technical assistance and to assist state and territory health agencies to ◦ Organize and prepare for accreditation ◦ Integrate performance management systems into the public health infrastructure ◦ implement quality improvement in public health programs and processes Provide policy guidance to state and territory health officials and executive leaders on accreditation and performance management legislation Educate the public health community on PHAB accreditation and its impact of population health and the quality of public health services. Represent the state and territory health agency perspective during the development and continuous improvement of the PHAB Accreditation Program. Aggregate public health benefit Public health programs ROI applied to QI in public health (ASTHO tool) Savings in Medicaid Expenditures ASTHO created tool for applying ROI to QI….. ◦ Reductions in standard operating costs Greater efficiencies realized ◦ Revenue enhancements Increased cost reimbursement ◦ Increased productivity of agency functions Increased service encounters ◦ Decreased time to produce outputs Reduced cycle time process The Objective of Health is Twofold: ◦ Goodness – The best attainable average level. ◦ Fairness - The smallest feasible differences among individuals and groups. World Health Report 2000 Collaborate with United Health Foundation Index will allow state health agencies to: ◦ Examine health status data (obesity, infant mortality, tobacco use, etc.) by race/ethnicity, age, income, and educational level ◦ Create goals based on state-level (State Health Improvement Plan) and national-level (Healthy People 2020) measures 5 Maternal and child health Public health-primary care integration Million Hearts Radiation readiness Prescription Drug Misuse Improving health rankings National Prevention Strategy Tobacco Control Climate Change Health in All Policies Reduce early elective deliveries <39 weeks Access to pre- and interconception care Smoking cessation Safe Sleep Improve perinatal regionalization Social Determinants of Health D.C. 
Puerto Rico — Taken Pledge. Virginia Apgar Award (8% reduction); Franklin Delano Roosevelt Award (9.6% prematurity rate).
• Eliminating payment for EED: Georgia (7/2013), New Mexico (4/2013), South Carolina (1/2013), Texas (10/2011)
• Bonus to hospitals meeting quality targets: Washington State (7/2013)
• Requirement for hospitals to use evidence-based guidelines: Michigan (1/2013)
• Non-medically necessary C-sections paid at vaginal delivery rate: Illinois (5/2012)
• Current legislation on EED: Hawaii

ACTION: State and local health agencies worked with local hospitals, obstetrical providers, and families to increase awareness of delivery to full term. Partnered with Medicaid. Legislation to eliminate Medicaid payment for elective inductions/C-sections <39 weeks. Health department received $4.1M for statewide Healthy Babies.
RESULT: Preterm birth rate has dropped 6% in 1 year, saving Medicaid an estimated $5.4M.

Began recruiting hospitals for voluntary "hard stop" effort in Jan. 2011. Currently have 55 of 59 birthing hospitals enrolled, affecting 95% of births. 70% reduction in rate of induction <39 weeks without medical indication.

As of Jan. 1, 2013, NAS is a reportable condition.
◦ Source of maternal opiate use is captured through a web-based portal available on the Dept. of Health website.
Examining Medicaid claims data to provide insight into opportunities for intervention to prevent and treat opioid dependency in pregnancy.
◦ Primary prevention strategies include: prevent addiction from occurring, prevent pregnancy in women taking narcotics.

Cost of NAS in TN — Impact of NAS on Infant Health Care Expenditures, CY 2011
Metric: TennCare Paid Live Births1 / TennCare non-LBWT Births / TennCare Live LBWT Births2 / NAS Infants
• Number of births: 45,205 / 40,437 / 4,768 / 528
• Cost for infant in first year of life: $350,936,293 / $171,336,964 / $179,599,329 / $33,249,612
• Average cost per child: $7,763 / $4,237 / $37,668 / $62,973
• Average length of stay (days): 4.8 / 3.2 / 18.3 / 32.5
1 This sample contains only children that were directly matched to TennCare's records based on Social Security Number. 2 Any infant weighing under 2,500g at the time of birth was considered low birth weight (LBWT).

Percentage of Newborns in DCS Custody within One Year of Birth, CY 2011
• Infants born in CY 2011: 55,578 total; 767 in DCS custody (1.4%)
• NAS infants: 528 total; 120 in DCS custody (22.7%)

17P (17 alpha-hydroxyprogesterone caproate) is a synthetic form of progesterone. While 17P is cost effective, the calculations of ROI have varied.*
~4,219 women qualify for 17P – 2,023 low income and 1,622 Medicaid. Projected cost savings: $1,752,000 for Medicaid; $4,558,400 for all women. 313 babies born full term.
* Based on compounded 17P cost; full course may be up to 20 shots.
Served ~223 uninsured women, providing over 4,021 doses of medication. 95 clinics across the state requested free 17P for uninsured patients. Most only had a few uninsured patients each year.
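The per-child averages in the "Cost of NAS in TN" table above are simply each column's total first-year cost divided by its number of births; a quick check (figures taken from the table, variable names added here):

```python
# Reproduce the per-child averages in the "Cost of NAS in TN" table (CY 2011).
# Figures come straight from the slide; only the variable names are added here.
groups = {
    "TennCare paid live births": (45_205, 350_936_293),
    "TennCare non-LBWT births":  (40_437, 171_336_964),
    "TennCare live LBWT births": (4_768,  179_599_329),
    "NAS infants":               (528,    33_249_612),
}

for name, (births, first_year_cost) in groups.items():
    avg = first_year_cost / births
    print(f"{name}: {births:,} births, ${avg:,.0f} average cost per child")
# NAS infants work out to roughly $62,973 per child versus $4,237 for a
# non-low-birth-weight TennCare birth -- the gap the slide is highlighting.
```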
[Bar charts: infant mortality rate (deaths per 1,000) for All Other Regions, OH, MI, IN, IL, WI, MN*, and Total Region V — one chart with rates of roughly 11-15 per 1,000 and one with rates of roughly 7-9.5 per 1,000; * U.S.-born mothers]

Rate Ratio and Population Attributable Fraction (*US-born Black mothers):
• All Other Regions: rate ratio 2.32, PAF 16%
• OH: 2.31, 18%
• MI: 2.51, 22%
• IN: 2.24, 13%
• IL: 2.48, 20%
• WI: 2.63, 14%
• MN*: 3.05, 11%
• Total Region V: 2.43, 18%

[Stacked bar chart: excess infant deaths per 1,000 in OH, MI, IN, IL, WI, MN^, and Total, compared to All Other Regions, with each bar's distribution split into the share due to gestational age, the share due to higher mortality among preterm infants, and the share due to higher mortality among term infants; ^ US-born Black mothers]

Macro system view: Policy focus
◦ Scope, scale, sustainability
◦ Statute, rule, regulation, payment, licensure
◦ ROI at system or Medicaid level
Multi-stakeholder:
◦ Cross Cabinet
◦ Administrative and legislative branches
◦ Public
◦ Private
◦ Providers
◦ Insurers
◦ Community Groups
◦ Consumer (patient) groups

ASTHO President's Challenge on Healthy Babies and Library of Best and Promising Practices: www.astho.org/healthybabies/
ASTHO Making the Case for MCH Programs: www.astho.org/Making_the_Case_for_MCH_Programs/

Working Lunch — Food is located in the foyer
National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training, October 2-4, 2013

Take Home Messages
• Building health and education readiness for the next generation of children requires a focus on the one science of early brain development addressing toxic stress from pre-conception to early childhood
• Integration of medical home, home visiting and early childhood system allows breakthrough strategies to address toxic stress and the social determinants of health
• Evidence-based practices, population-based approaches and continuous quality improvement for primary care practices and early childhood community services defines MCH public health for the future

The Future of Pediatric Public Health and Primary Care
National Improvement Partnership Network Operations Training: October 3, 2013 — Arlington, Virginia
David W. Willis, M.D., FAAP, Director of the Division of Home Visiting and Early Childhood Systems (DHVECS), Maternal and Child Health Bureau, Health Resources and Services Administration, Department of Health and Human Services

A League Table of Child Well-Being (Source: UNICEF, 2013)

LIFE COURSE: Drivers of Developmental Trajectories
• Neurodevelopmental — Genetic, Prenatal and Neurodevelopmental Factors
• Social-economic — Socioeconomic environment
• Relational — Attachment and Relational Patterns (ACE Scores); Relational Health

Adverse Childhood Experiences (ACES) in Early Childhood (age 0-5 years)

POPULATION ATTRIBUTABLE RISK
• A large portion of many health, safety and prosperity conditions is attributable to Adverse Childhood Experience.
• ACE reduction reliably predicts a decrease in all of these conditions simultaneously.
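The rate-ratio and population attributable fraction (PAF) figures above, and the attributable-risk framing of ACEs, are conventionally computed with Levin's formula; the slides do not state the exact method, so the following is the standard reading rather than the presenters' own calculation. Here $p$ is the proportion of births in the higher-risk group and $RR$ is the rate ratio shown in the table:

```latex
\mathrm{PAF} \;=\; \frac{p\,(RR - 1)}{1 + p\,(RR - 1)}
% Example: with RR = 2.43 (Total Region V), a PAF of 18% corresponds to
% p \approx 0.15, i.e. roughly 15% of births falling in the higher-risk group.
```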
Adverse Childhood Experiences — National Prevalence for All Children (0-17 years) [State Range], and National Prevalence for Young Children (0-5 years):
• Child had one or more Adverse Child or Family Experiences: 47.9% [40.6% (CT) - 57.5% (AZ)]; young children 36.6%
• Child had two or more Adverse Child or Family Experiences: 22.6% [16.3% (NJ) - 32.9% (OK)]; young children 12.5%
• Socioeconomic hardship: 25.7% [20.1% (MD) – 34.3% (AZ)]; young children 25.3%
• Divorce/separation of parent: 20.1% [15.2% (DC) – 29.5% (OK)]; young children 9.9%
• Death of parent: 3.1% [1.4% (CT) – 7.1% (DC)]; young children 0.9%
• Parent served time in jail: 6.9% [3.2% (NJ) – 13.2% (KY)]; young children 4.5%
• Witness to domestic violence: 7.3% [5.0% (CT) – 11.1% (OK)]; young children 4.1%
• Victim or witness of neighborhood violence: 8.6% [5.2% (NJ) – 16.6% (DC)]; young children 2.8%
• Lived with someone who was mentally ill or suicidal: 8.6% [5.4% (CA) – 14.1% (MT)]; young children 5.5%
• Lived with someone with alcohol/drug problem: 10.7% [6.4% (NY) – 18.5% (MT)]; young children 5.5%
• Treated or judged unfairly due to race/ethnicity: 4.1% [1.8% (VT) – 6.5% (AZ)]; young children 0.9%
IMPORTANT NOTE: Questions about child abuse and neglect were not directly asked about in the survey—though this is unlikely to lead to substantially different overall rates since ACES are so commonly co-occurring.
Christina Bethell, PhD, MBA, MPH.

Adverse Childhood Experiences and Health: Prevalence Among All US Children
[Chart: prevalence of one and of two or more adverse family experiences, by age group; prevalence of 2+ ACEs by age shown in parentheses: 0-1 (6.1%), 2-3 (13.3%), 4-5 (18.0%), 6-7 (19.3%), 8-9 (25.5%), 10-11 (28.6%), 12-13 (28.4%), 14-15 (31.2%), 16-17 (31.9%)]

Prevalence of CSHCN with Emotional, Behavioral or Developmental (EBD) Problems, by Number of Adverse Childhood Experiences and Age
[Chart: percent who are Non-CSHCN, CSHCN, and CSHCN with EBD problems among children with no, one, and two or more adverse child experiences]
Christina Bethell, PhD, MBA, MPH, CAHMI.

How Early Experience Gets Into the Body: A Biodevelopmental Framework
[Diagram elements: Foundations of Healthy Development and Sources of Early Adversity — Environment of Relationships; Physical, Chemical & Built Environments; Nutrition — Physiological Adaptations & Disruptions; Gene-Environment Interaction; Cumulative Effects Over Time; Lifelong Outcomes]

Increasing Early Brain and Child Development Focus

Children With Chronic Conditions Are More Likely to Experience ACEs.
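The paired claims that close this slide and open the next ("children with chronic conditions are more likely to experience ACEs" and the converse) rest on prevalence-ratio comparisons like the EBD chart above; a minimal sketch of that comparison, with hypothetical counts since the slides report only percentages:

```python
# Prevalence ratio of EBD problems among CSHCN with 2+ ACEs vs. no ACEs.
# Counts below are hypothetical; the slide reports only the resulting percentages.
import math

def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Point estimate and 95% CI for a prevalence (risk) ratio."""
    p_exp, p_unexp = cases_exp / n_exp, cases_unexp / n_unexp
    pr = p_exp / p_unexp
    # Standard log-scale confidence interval for a ratio of proportions.
    se_log = math.sqrt(1/cases_exp - 1/n_exp + 1/cases_unexp - 1/n_unexp)
    lo, hi = pr * math.exp(-1.96 * se_log), pr * math.exp(1.96 * se_log)
    return pr, (lo, hi)

pr, ci = prevalence_ratio(cases_exp=260, n_exp=500, cases_unexp=190, n_unexp=1000)
print(f"Prevalence ratio: {pr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```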
Children With ACEs Are More Likely to Have Chronic Conditions HealthRelated Behaviors Educational Achievement & Economic Productivity Physical & Mental Health • BUILDING HEALTH • Mitigating toxic stress effects on health and developmental trajectories • Promoting the healthy early childhood foundations of life-span health • Promoting preventative mental health • Promoting kindergarten readiness • Strengthening the systems to address the social determinants of health Biological Embedding During Sensitive Periods 57 We’re in the “building health and developmental assurance” business… 58 Relational Health Physical health Developmental health Relational health 60 59 60 10 Building an Enhanced Theory of Change that Balances Enrichment and Protection Maternal, Infant and Early Childhood Home Visiting (MIECHV) New Protective Interventions Significant Adversity Healthy Developmental Trajectory • Section 2951 of the Affordable Care Act of 2010 (P.L. 111-148) amends Title V of the Social Security Act to add Section 511: • $1.5 billion over 5 years • $100M FY2010 • $250M FY2011 • $350M FY2012 • $400M FY2013 / FY2014 • Grants to States and Territories • 3% Set-aside for Grants to Tribes, Tribal Organizations, or Urban Indian Organizations • 3% Set-aside for Research, Evaluation, and Technical Assistance • Requirement for Collaborative Implementation by HRSA and ACF Supportive Relationships, Stimulating Experiences, and Health-Promoting Environments Source: Harvard Center on Developing Child 61 62 Priority Populations Evidence-Based Home Visiting Works Improves parental capacity and efficacy Strengthens positive parenting behaviors & reduces negative ones Improves birth outcomes Promotes healthy child development & links children to better, more consistent healthcare Identifies early developmental delays and links children to appropriate services Reduces maternal depression Improves school readiness Puts parents & children on a trajectory toward long-term health & productivity • • • • • • • • • • • • • • • • • Families in at-risk communities Low-income families Pregnant women under age 21 Families with a history of child abuse or neglect Families with a history of substance abuse Families that have users of tobacco in the home Families with children with low student achievement Families with children with developmental delays or disabilities Families with individuals who are serving or have served in the Armed Forces, including those with multiple deployments 63 64 Evidence-based Home Visiting Models Age Ranges Served by Evidence-Based Models Nurse Family Partnership (NFP) Early Head Start- Home Based Option (EHS-HV) Healthy Families America (HFA) Parents as Teachers (PAT) HIPPY (Home Instruction Program for Preschool Youngsters) Healthy Steps Family Check-up Child First Early Intervention Programs for Preschool Youngsters Play and Learning Strategies (PALS) The Early Start (New Zealand) Oklahoma Community-based Family Resource and Support Program • SafeCare Augmented • MECSH – Maternal EC Sustained HV Program • • • • • • • • • • • • HIPPY Model Name Nurse-Family Partnership Healthy Steps Early Head Start Healthy Families America* Parents As Teachers** Family Check-up Prenatal 0 1 2 3 4 5 6 … 17 Age of Eligible Population 65 66 11 State Selection of Home Visiting Model (April 2013) Evidence Based Model Number of States Implementing Healthy Family America 43 Nurse-Family Partnership 42 Parents as Teachers (PAT) 30 Early Head Start 26 Home Instruction for Parents of Preschool Youngsters (HIPPY) 8 Healthy 
Steps 3 Child First 1 Family Check-Up 1 67 68 MIECHV Grantees FY10-14 FY10 FY11 FY12 FY13 FY14 Total Grants 69 95 118 119 113 Formula Awards (states , DOC, and territories) 56 54 53 53 53 9 10 12 9 19 New Competitive Expansion Awards Continuing Expansion Awards New Development Awards 13 Continuing Development Awards Non-Profit Awards 13 19 • Benchmark Areas • Maternal and newborn health (8) • Child injuries; child abuse, neglect, or maltreatment; emergency department visits (7) • School readiness and achievement (9) • Crime (2) or domestic violence (3) • Family economic self-sufficiency (3) • Coordination/referrals for other community resources (5) 31 6 13 6 1 (ND) 2(FL,WY) 1 3 25 25 25 Continuing Non-Profits Tribal Awards (new and continuing) Data Collection on Six Goals (Benchmark Areas) • Between three and nine constructs (X) within each area • 34 or 35 constructs overall per grantee 22 69 Three Types of Measurement Aspect Quality Improvement Performance & Results Accountability Evaluation & Research Purpose Improvement of services Assurance, trends, accountability New knowledge, evaluation Bias Accept consistent, known bias Set measures to reduce bias Design to eliminate bias Sample size “Just enough” data, small sequential samples Aim for 100% available, relevant data Adequate for design, plus “just in case” data Hypothesis Hypothesis flexible, changes No tests, no hypotheses as learning occurs Fixed, generally predetermined hypotheses Testing Sequential tests, small tests of change No tests One or two larger overall tests of hypotheses Measuring improvement Run charts, Shewhart control charts, etc. General trends, no tests Statistical tests (e.g., t-test, pvalues, chi square) Confidentiality of data Data used by those involved with QI project Data available for public consumption and review Research confidentiality protections as needed Adapted from IHI: Solberg, L I; Mosser, G; McDonald, S "The three faces of performance measurement: improvement, accountability, and research." The Joint Commission Journal on Quality Improvement. 23, No. 3 1997, pp. 135-47. What is a CoIIN? • A focus on both innovation and improvement yielding a Collaborative Improvement & Innovation • A CoIIN, or Collaborative Innovation and Improvement Network, has been described as a team of self-motivated people with a collective vision, enabled by the Web to collaborate in achieving a common goal by sharing ideas, information, and work. • Key Elements of a CoIIN • Being a “cyber-team” (i.e. most CoIIN work will be distance-based); • Innovation comes through rapid and on-going communication at all levels; • Work in patterns characterized by meritocracy, transparency, and openness to contributions from everyone. Gloor PA. Swarm Creativity: Competitive Advantage through Collaborative Innovation Networks. New York: Oxford University Press, 2006. 23 24 12 CoIIN: Design to Action -- Vision Know-Do Gap Use the Model for Improvement to guide Strategy Team work, including: 1. Identify a Quality Improvement Aim. (Plan) MIECHV: HV CoIIN • • • • 2. Use Team members (Expert Leads and State Representatives) to identify strategies that work. (Plan) 3. Identify process and outcome measures based on chosen Strategies. (Plan) Breastfeeding Developmental screenings Maternal Depression Engagement/transition What we know 4. Implement strategies to “test” change. (Do) What we do 5. Measure progress (outcomes and processes). (Study) Yesterday 6. Adjust strategies as needed. 
(Act) 25 • • • • • • • Recruit Participants (10-100 Teams) Select Topic Areas (Constructs) Expert Meeting Pre-work P LS 1 S S LS 2 D A D A S Planning Group P P D A Holding the Gains LS 3 AP1 ADVANCING CQI IN HOME VISITING / 3 Tomorrow Source: Adapted from Institute for Healthcare Improvement (IHI) 26 Breakthrough Early Childhood Systems Activities Home Visiting Learning Collaborative (CoIIN) Develop “Change Package” Today AP2 Project LAUNCH (SAMHSA) Help Me Grow ECCS (Early Childhood Comprehensive Systems) Building Bridges Race to the Top - ELC States TECCS (Transforming Early Childhood Community Systems) Place- Based Initiatives • California: First 5 Alameda County, Magnolia Place • Children’s Service’s Council Palm Beach County • Promise Neighborhoods (US Department of Education) Expert Support Source: Adapted from IHI, BTS Collaborative. LS – Learning Session AP – Action Period ADVANCING CQI IN HOME VISITING / 10 76 27 TECCS (Transforming Early Childhood Community Systems) Help Me Grow (P. Dworkin, 2013) Community Outreach Child Health Provider Training/Screening/Surveillance to promote the use of HMG and provide networking opportunities among families and service providers “Do you have questions about how your child is learning, behaving or developing?” Centralized Telephone Access Point for connecting children and their families to services and care coordination. Provide feedback and follow-up Child mental health Early Interven -tion Support groups for grandpar -ents Faithbased programs Head Start Home Visiting Parenting education ADHD support groups Hispanic resource center Domestic violence shelter Advocacy groups 29 Source: Halfon, 2013 78 13 The Five Conditions of Collective Impact Success Take Home Messages • Building health and education readiness for the next generation of children requires a focus on the one science of early brain development addressing toxic stress from pre-conception to early childhood • Integration of medical home, home visiting and early childhood system allows breakthrough strategies to address toxic stress and the social determinants of health • Evidence-based practices, population-based approaches and continuous quality improvement for primary care practices and early childhood community services defines MCH public health for the future • Common agenda – shared vision • Shared Measurement – collecting data and measuring results consistently • Mutually Reinforcing Activities – differentiating while still coordinated • Continuous Communication – consistent and open communication • Backbone Organization – for the entire initiative and coordinate participating organizations and entities Source: J. Kania and M. Kramer, 2011 79 80 Contact Information David W. Willis, MD, FAAP Director, Division of Home Visiting and Early Childhood Systems Maternal and Child Health Bureau, HRSA 301-443-8590 [email protected] Meaningful Measurement and Evaluation Efforts in the Improvement Partnerships: Spotlight on Innovation and Opportunities for Discussion and Input Colleen Reuland, MS Oregon Pediatric Improvement Partnership October 3rd, 2013 - Washington DC National Improvement Partnership Network Meeting 81 Key Areas We Will Cover 1. Measure specific strategies for gathering and displaying information 2. Leveraging Partnerships • • Utilizing national and state-level data Partnership with Medicaid/CHIP 3. 
Staffing measurement and evaluation in your IP • • Internal models Contract out models Format: Examples of Innovative Strategies & Group-Level Discussion 2013 NIPN Operational Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Measure Specific Strategies Measures related to screening o Developmental o Autism o Newborn o Mental health Asthma Medical Home Office Reported Tools Patient Experience of Care Surveys Provider/Office Staff Experience Surveys 2013 NIPN Operational Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) 14 Screening Measures: Examples from IPs Example from Arizona: Developmental Driver Diagram of Developmental Screening • Arizona • Alabama • Indiana • BOPS 86 Example from Alabama: Screening for Early Childhood Issues Example from Indiana: MCHAT Screening • Four Numbers Gathered: 1. 2. 3. 4. • • 87 Screened Passed Failed Referred Proportions then shown (describe numerator and denominator) Numbers shown by AGE and by Practice 88 15 Example from BOPS:Newborn Screening Newborn Results Recorded in EMR and Discussed with Parents by 2 Months of Age BOPS, 2013 100% 80% 60% 40% 20% 0% Example from BOPS: Recorded Jan-13 94% Feb-13 95% Mar-13 91% Apr-13 93% May-13 93% Jun-13 98% Jul-13 94% Discussed 84% 87% 87% 86% 88% 93% 94% Jan-13 Feb-13 Mar-13 Apr-13 May-13 Jun-13 Jul-13 Recorded Volume 243 253 233 263 239 210 246 Discussed Volume 218 226 201 238 219 192 222 Example from BOPS: Routine Mental Health Screening Newborn Results Communicated to Parent Routine Mental Health Screening at Annual Well Child Visits BOPS, 2012-13 Newborn Results Recorded in EMR and Discussed with Parents by 2 Months of Age BOPS, 2013 100% 100% 90% 80% 80% 70% 60% 60% 50% 40% 40% Jul 2013 All Sites Now Implementing 30% 20% 20% 1st Full Month of Implementation Nov 2012 10% 0% 0% Recorded Jan-13 94% Feb-13 95% Mar-13 91% Apr-13 93% May-13 93% Jun-13 98% Jul-13 94% Discussed 84% 87% 87% 86% 88% 93% 94% Jan-13 Feb-13 Mar-13 Apr-13 May-13 Jun-13 Jul-13 Recorded Volume 243 253 233 263 239 210 246 Discussed Volume 218 226 201 238 219 192 222 Measures Related to Asthma Oct Nov Dec Jan Feb Mar Apr May Jun Jul Aug PSC-17 0% 17% 34% 37% 48% 52% 63% 68% 74% 44% 72% Y-PSC-17 0% 19% 41% 43% 55% 64% 69% 77% 77% 42% 61% Apr 667 468 May 675 473 4-11 Visits 12-18 Visits Oct 0 0 Nov 811 547 Dec 629 432 Jan 814 521 Feb 602 392 Mar 633 424 Jun 626 400 Jul 1765 1334 Aug 2027 1634 Example from Arizona • Arizona • Indiana 96 16 Example from Arizona Example from Indiana Asthma Quality Improvement Dashboard Asthma Action Plan: Percentage with plan in the chart 100 90 80 70 60 50 40 30 20 10 0 Jan Feb Mar Apr May Jun Jul Aug Education: Percentage of patients with asthma who have education re: proper use of spacer device 100 80 60 40 20 0 97 Jan Example from Indiana Feb Mar Apr May Jun Jul Aug Example from Indiana Education: Baseline Chart Review Data 100 90 80 Practice 1 70 Practice 2 60 Practice 3 50 Practice 4 Practice 5 40 COMBINED 30 20 10 0 Risks of smoking Flu shot Spacer device education/reminder Triggers and irritants Office Reported Tools Assessing Medical Home • Example of tools: – NCQA PCMH 2011 – Medical Home Index/ MHI– Revised Short Form • Tool required CHIPRA National Evaluator – Home grown tools • Important factor for IPs to consider – Require facilitation and engagement • Particularly true for the MHI-RSF given it is anchored to the MCHB definition of CYSHCN – Not uncommon for practices to report they are in a different place than they may be – Not uncommon to see 
scores go DOWN over course of learning collaborative 2013 NIPN Operational Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Office Reported Tools Assessing Medical Home: Tips from IPs Around Facilitation and Helpful Strategies • Arizona: – GAPS tool for NCQA PCMH recognition. – QI meeting – Take notes and then go over processes • Indiana: – Found significant issues with NCQA for pediatrics. • Helpful to choose a chronic condition • Work within processes for the quality improvement of chronic conditions with careful labeling and reflection as to how it generalizes to other conditions. – Sharing challenges of other practices also helps. • • • • Ask what is going well? Offer time to reflect on the things going well; Suggest that processes can always be improved; Say if you were doing this process extremely well a year from now, what would it look like? • How do we get there between now and then? • Ask them to reflect on what they are doing well, how can they do more of that elsewhere? 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) 17 Office Reported Tools Assessing Medical Home: Tips from IPs Around Facilitation and Helpful Strategies • New Mexico – Leverage the art of coaching • Consider the team and team personality and what the site needs to be successful – Focus site’s attention on an area of success and then circle back around after they have improved ‘change’ morale and momentum – Read“Crucial Conversations” and use the book’s strategies. • Washington DC – Depends on measure/big picture. – When data integrity is critical to reporting/deliverable- be explicit up from the start with coaching/expectations • Oregon – Collecting three office reported tools (NCQA PCMH, MHI-RSF, and Oregon Patient Centered Primary Care Homes) – For MHI created talking points for each item for the QI staff to use • OR team held individual calls with non-OR sites – Create an inventory of strategies and processes practices are using that are attesting to “yes” – Read “Crucial Conversations” and use the book’s strategies – Leverage separate staff that are for evaluation and measurement – Emphasize the importance of comparative data in a learning collaborative model 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Improvements in MHI-RSF Quality Domains: State-Level Average Scores and Changes Observed Reporting Measures of Medical Home: Example from Oregon Across T-CHIC: Average Change in MHI-RSF Overall and Domain Scores by State 100% 90% Percent of a Perfect Score 80% 70% 60% 50% 40% 30% 20% 10% 0% MHI-RSF Overall Percent Score Domain 1: Organizational Capacity Domain 2: Chronic Condition Management Alaska Scores Tri-State Children’s Health Improvement Consortium (TCHIC) Domain 3: Care Coordination Oregon Scores Domain 4: Community Outreach Domain 5: Data Management Domain 6: Quality Improvement West Virginia Scores Summary Created by Oregon Pediatric Improvement Partnership (OPIP) Reporting Measures of Medical Home: Example from Oregon Goal of the Health Reform Efforts: Achieve the Triple Aim Patient Experience of Care Surveys 108 18 Patient Experience of Care Surveys Patient Experience of Care Surveys 1. 2. 3. 4. 5. 6. 7. 8. 
• Arizona Idaho Indiana Oregon Vermont New Jersey South Carolina Washington DC • • • • • • • 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Patient Experience of Care Surveys: Example of Indiana 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Question 2 - Does your doctor/nurse offer your family information about health and wellness appropriate to your child’s developmental stage? (This includes information about child developmental, mental health, healthy weight, and nutrition, physical activity, sexual developmental and sexuality, safety/injury prevention, and oral health) • Indiana – Short survey for the practices in the MHLC – Adapted from 1) 2) Arizona – Survey on care coordination Idaho – Primary Health Pediatrics Family Survey – St. Luke’s Developmental Pediatrics Indiana – Short survey for the practices in the MHLC which was adapted from Family Voices, “FamilyCentered Care Self-Assessment Tool – Family Tool” and Center for Medical Home Improvement, “the Medical Home Family Index: Measuring the organization and Delivery of Primary Care for Children with Special Health Care Needs”. Oregon – CAHPS HP-CC – CAHPS CG PCMH – MHI- Family Index Vermont o CAHPS survey o Chronic Illness Project – Internally designed survey used for patients who have transitioned from pediatric to adult care. New Jersey – Medical Home Family Feedback Survey South Carolina – CAHPS Washington DC – CAHPS – Adolescent Surveys 3.00 Family Voices, “Family-Centered Care Self-Assessment Tool – Family Tool” Center for Medical Home Improvement, “the Medical Home Family Index: Measuring the organization and Delivery of Primary Care for Children with Special Health Care Needs”. – Provided the adapted survey and asked the practices to survey everyone who came into the office over a two week period. – Provided English and Spanish versions of the survey • Practices sent the surveys after the two week period. – Provided a summary report of the survey data back to the practices. • This demonstrated to the practices a process to collect patient experience of care data over a short period of time and to use the data to inform the practice, identify areas for change or improvement, etc. 2.50 2.00 3 - Always 2- Often 1.50 1- Sometimes 0- Never 1.00 0.50 Key To Success: We provided the tool and summarized the data. 0.00 Practice 1 Question 14 - There is a staff person(s) or a “care coordinator “ who will help you with difficult referrals, payment issues, & follow-up activities. Practice 3 Practice 4 Practice 5 Practice 6 Practice 7 Practice 8 Overall Average Question 15 - Office staff help you to connect with family support organizations & informational resources in your community & state. 3.00 3.00 2.50 2.50 2.00 2.00 3 - Always 2- Often 1.50 1- Sometimes 0- Never 3 - Always 2- Often 1.50 1- Sometimes 0- Never 1.00 1.00 0.50 Practice 2 0.50 0.00 Practice 1 Practice 2 Practice 3 Practice 4 Practice 5 Practice 6 Practice 7 Practice 8 Overall Average 0.00 Practice 1 Practice 2 Practice 3 Practice 4 Practice 5 Practice 6 Practice 7 Practice 8 Overall Average 19 Washington DC: Adolescent Health Care Survey Washington DC: Adolescent Survey 2013 National Improvement Partnership Meeting CAHPS, CAHPS Everywhere! But is the data used to improve care? 
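Summary reports like the Indiana example above (mean response per question on the 0 = Never to 3 = Always scale, by practice and overall) are straightforward to produce once the surveys are entered; a minimal pandas sketch, with hypothetical column names and responses:

```python
# Summarize family-experience surveys the way the Indiana feedback reports are
# described: mean score per question for each practice, plus an overall average.
# Column names ("practice", "q2_wellness_info", ...) are hypothetical.
import pandas as pd

SCORE = {"Never": 0, "Sometimes": 1, "Often": 2, "Always": 3}

surveys = pd.DataFrame({
    "practice": ["Practice 1", "Practice 1", "Practice 2", "Practice 2"],
    "q2_wellness_info": ["Always", "Often", "Sometimes", "Always"],
    "q14_care_coordinator": ["Often", "Always", "Never", "Sometimes"],
})

scored = surveys.assign(
    **{c: surveys[c].map(SCORE) for c in surveys.columns if c != "practice"}
)
by_practice = scored.groupby("practice").mean(numeric_only=True)
by_practice.loc["Overall Average"] = scored.drop(columns="practice").mean()
print(by_practice.round(2))
```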
OPIP Activities Related to Patient Experience of Care Surveys ECHO: Enhancing Child Health in Oregon • Part of T-CHIC • Medical Home Learning Collaborative (8 sites) CAHPS CG PMCH Child (5 pediatric practices) CAHPS CG PCMH Child AND Adult (3 Family Medicine) • Patient-Centered Primary Care Institute • Medical home learning collaborative anchored to Oregon PCPCH standards • Standards emphasize practice-level use of the CAHPS • Nearly all practices attesting to OR PCPCH using in-office, convenience sample • Within our PCPCI Collaborative focused facilitation efforts on enhanced engagement with families and improved methods for convenience, in-office administration 11 7 FEDERAL CMS: CHIPRA CORE MEASURE SET Overview of targeted efforts • T-CHIC: Tri- State Children’s Health Improvement Consortium o Leverage Medicaid/CHIP measurement work related to CAHPS o Trying to utilize state EQR and Core Measure funds for practicelevel data gathering and feedback reports o CAHPS Clinician and Group-Level, Patient Centered Medical Home for participating practices o Standardized administration via DataStat • T-CHIC ECHO PCPCH USE of the data to guide and inform improvement efforts (Analysis and reporting) X T-CHIC, State (3), Practice (21) X (Project, Practice (7)( X (General, Bay Clinic) Project-level use of the data to guide curriculum X X Practice level administration & collection X X X (Bay Clinic, In office & Generally) Practice-level engagement to guide improvement X X X Federal CMS Core Measure, NCQA CAHPS CG PCMH PCPCH, CCO Incentive Metric PCPCH, CCO Incentive Metric 11 9 CAHPSHealth Plan (HP) CAHPS- HP with the Children Chronic Conditions State Sponsored: CCO Incentive Metric; Some CCOs Also Collecting Through CPCI & Other Efforts CAHPS Clinician and Group (CG) Emphasized in PCPCH, But Unstandardized Data Collection, Many practices collecting data and doing nothing more CAHPS CGPatientCentered Medical Home Supported by T-CHIC (Includes ECHO), Including CYSHCN items, Q-CORP piggybacked off effort 11 8 Efforts OPIP is Involved with Related to Patient Experience of Care Policy briefs to inform improvements Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) OPIP Analysis of the CAHPS CG PMCH Data Analysis conducted overall, and then by state and by practice o Includes Adults CAHPS CG findings for the Family Medicine Practices CAHPS CG PCMH findings related to the following factors: o Child o Respondent o Practice o MHORT Quality Domains Variations in quality of care observed by these factors Implications of these variations 12 0 20 Variations in CAHPS CG PCMH Findings by Age of Child OPIP Analysis of the CAHPS CG PMCH Data Children & Youth with Special Health Care Needs (CYSCHN) Child CAHPS CG PMCH Only Respondent Characteristics CAHPS CG PCMH Quality Domain Achievement Scores For Which There Were Significant Differences by Age of Child 100% Ethnicity CYSHCN Age of Respondent Race Number of consequences Education Age of Patient Type of Consequences Language survey completed Achievement Score Characteristics of Persons Care Being Reported About in Survey 80% 60% 40% 20% General Health Status 0% Mental Health Status Child Development 0 up to 5 years 12 1 Child Prevention 5 years up to 12 years Self-Management 12 years and up Tri-State Children’s Health Improvement Consortium (TCHIC) Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Variations in CAHPS CG PCMH Findings by Child’s Health Status OPIP Analysis of the CAHPS CG PMCH Data Practice Characteristics Variation 
in the CAHPS CG PCMH Quality Domain Achievement Scores by Child Characteristics Reported in the Survey CAHPS CG PCMH Quality Domains Access Variable Specification n % Child Communication Development Child Prevention SelfOffice Staff Management NCQA PCMH Quality Domains Specialty of Docs CYSHCN Age of Respondent Type of Practice (FQHC, Private, part of Large System) Number of consequences Education Type of Consequences Excellent/Very Good 1487 86% 80.49 94.13 63.20 56.52 32.68 91.64 Geography Good/Fair/Poor 243 14% 72.34 92.23 56.13 50.12 36.16 86.82 Percent of Patient Population Pediatric Excellent/Very Good 1434 83% 80.35 94.12 62.40 56.22 32.53 91.61 Number of Pediatric Patients Good/Fair/Poor 17% 74.63 92.74 61.30 52.58 36.27 87.84 Percent of Visits Publicly Insured Child's General Health Status Child’s Mental Health Status 296 Bold, blue text is used to indicate a p value of <0.05 for difference between average score within each group. For these domains for which there is a significant difference in quality, the cells are shaded to indicate which group has the highest score (green).Where applicable, the group with the lowest score is in red. Tri-State Children’s Health Improvement Consortium (TCHIC) Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Characteristics related to Access 12 4 Variations in CAHPS CG PCMH Findings by CYSHCN Who Experience Significant Consequences Kinds and Types of Special Health Care Needs Identified % of Survey Respondents with that Consequence Alaska Oregon West Virginia Sites Sites Sites Variation in CAHPS CG PCMH Quality Domain Achievement Scores by Level of CYSHCN Consequences T-CHIC (of N=1739 (of N=207 (of N=723 (of N=809 Surveys with CAHMI Surveys with Surveys with Surveys with Responses) CAHMI Responses) CAHMI Responses) CAHMI Responses) 1) Prescription medication need/use 22.1% 13.0% 18.0% 28.1% 2) Limited or prevented in ability to function 7.1% 10.1% 5.8% 7.4% 12.9% 12.6% 11.6% 14.1% 6.0% 10.1% 4.7% 6.2% 10.4% 11.1% 9.4% 11.0% 3) Above routine use of medical care, mental health or other health services 4) Specialized therapies (OT, PT, Speech) 5) Counseling or treatment emotional, behavioral or developmental problems Tri-State Children’s Health Improvement Consortium (TCHIC) Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Achievement Score CAHMI CYSHCN Screener: Specific Consequences and Needs MHI-RSF Quality Domains 100% 90% 80% 70% 60% 50% 40% 30% 20% 10% 0% * * Access Communication Child Development Less than 3 Consequences on the CAHMI Screener Child Prevention Self-Managenent Office Staff 3 or More Consequences on the CAHMI Screener * Variation is statistically significant Tri-State Children’s Health Improvement Consortium (TCHIC) Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) 21 Domains within Medical Home Measures Across T-CHIC/ECHO OPIP Analysis of the CAHPS CG PMCH Data Practice Characteristics Practice Quality Indicators Specialty of Docs NCQA PCMH Previous Patient Experience of Care Survey Type of Practice (FQHC, Private, part of Large System) QI or Other Medical Home Efforts as of Baseline Geography NCQA PCMH Certified at the Time of the Survey Percent of Patient Population Pediatric EMR Maturity Levels Enhance Access and Continuity Identify and Manage Patient Populations Plan and Manage Care Provide Self-Care Support and Community Resources Number of Pediatric Patients Percent of Visits Publicly Insured Characteristics related to Access 12 7 Track and Coordinate Care Measure 
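The p < 0.05 flags described above compare average domain achievement scores between two respondent groups; the slides do not name the specific test, but a two-sample (Welch) t-test is one standard way to produce such flags. A minimal sketch with simulated scores (group sizes taken from the table, everything else illustrative):

```python
# Compare CAHPS CG PCMH domain achievement scores between two groups
# (e.g., child health status "Excellent/Very Good" vs. "Good/Fair/Poor").
# The scores below are simulated; the real analysis used the T-CHIC survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
excellent = rng.normal(loc=80, scale=20, size=1487).clip(0, 100)      # n from the slide
good_fair_poor = rng.normal(loc=72, scale=22, size=243).clip(0, 100)  # n from the slide

t, p = stats.ttest_ind(excellent, good_fair_poor, equal_var=False)  # Welch's t-test
print(f"Access domain: {excellent.mean():.1f} vs {good_fair_poor.mean():.1f}, p = {p:.4f}")
```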
and Improve Performance CAHPS CG PCMH MHI-RSF (specific CYSHCN) Organizational Capacity Chronic Condition Management Care Coordination Community Outreach Data Management Quality Improvement/ Change Access Communication Self-Management Support Office Staff ComprehensivenessChild Development ComprehensivenessChild Prevention 128 PCPCI Learning Collaborative Call: Comparison of NCQA PCMH and CAHPS CG PCMH Related to Access CAHPS CG PCMH After Hours Access Patient Experience of Care Surveys: How They Are Part of Your Medical Home Transformation Efforts NCQA PCMH Element 1B After-Hours Access Q19. Usually or always able to get care needed from provider's office during evenings, weekends, or holidays •Objective of the Calls: T-CHIC Review PCPCH standards (Current and Proposed Revisions) that relate to Patient Experience of Care Survey T-CHIC Oregon Oregon Childhood Health Practice #1 Associates of Salem Childhood Health Practice #1 Associates of Salem The Children's Clinic PortlandPractice Pediatric #2 Clinic The Children's PracticeClinic #2 ↑ The Children's Clinic TualatinPractice Pediatric #3 Clinic ↑ Practice #3 ↑ Review key issues for practices to consider in implementing patient experience of care surveys Family Medical Group#4 NE Practice Family Medical Group Practice #4NE HillsboroPractice Pediatric #5 Clinic St. Luke's Eastern Oregon Practice #6 Medical Associates Hillsboro Pediatric PracticeClinic #5 ↓ St. Luke's Eastern Oregon Practice #6 Medical Associates ↓ WindingPractice Waters Clinic #7 Winding Waters #7 Clinic Practice ↑ 0% Worse 20% 40% 60% 80% 100% Review key issues for practices to consider in USING information from patient experience of care surveys Questions and Answer Woodburn Pediatric Practice #8 Clinic Woodburn Pediatric Practice Clinic #8 Review the different versions of the CAHPS, and how the CAHPS Clinician and Group – Patient Centered Medical Home complements the PCPCH standards 0% 20% 40% Worse Score Better ↑↓Statistically significantly higher/lower than State score. 
60% 80% Score 12 9 100% Better 13 0 Clinic Spotlight: Bay Clinic Pediatrics • Patient experience of care OPIP Support of Bay Clinic Pediatrics • Patient-centered administration – CAHPS CG PCMH (with CYSHN Screener) – Fielded for Two Months (July-September) – Mapped out process for administering in the office – Created survey administration materials • • • • • Methods – Part of department strategy, clinic-wide/cultural commitment • Healthy competition between providers – Promotion (front desk, waiting rooms, exam rooms) – Administration • Web-based or paper versions (depending on patient preference) disseminated by PROVIDERS at all visits for a pre-determined interval (2 months) • In addition, surveys are being sent by mail to a subpopulation- CYSHN identified by gestalt for each provider Posters for the parent Scripts for office staff Letters Survey monkey version • QI Coaching – Process for who how data will be reviewed – Process for how improvement opportunities – Levers to use with CCO • Analysis and reporting – Feedback reports of data 13 1 13 2 22 Planning Your Administration: Outline the Steps and Assign Roles 13 3 13 4 Provider and Office Staff Experience Surveys • New Mexico – ENM uses surveys at the end of each initiative (Survey Monkey ) • Washington DC – Survey practices in our LC’s- pre- mid-LC and post about experience/satisfaction with LC – Share results with participants as part of project calls/summary • Oregon – Learning Sessions • Pre-Meeting • Post- Evaluation – Collaborative Calls • Evaluation – Experience Based Design: Video Vignettes 13 5 2013 National Improvement Partnership Meeting Items in Provider/Office Staff Survey: Example from Oregon Leveraging Other Groups Data for QI • • • • • • 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) Vermont – Birth certificate data – National and state registries (state immunization, Cystic Fibrosis Foundation, etc) – Insurers Arizona – Working with 3 other states and are able to share data . – Compare AZ data to national registry. • Examine the amount of providers in other states vs. the amount of providers currently working on registry in Arizona. Kentucky – MCO data on metrics (well-child, immunization, Chlamydia, asthma) New Jersey – Immunization data from the State. – Data from the CDC, Advocates for Children of NJ (Kids Count) Oregon – Early Intervention data, reporting back to PCPs BOPS – National data on adolescent STI screening, adolescent mental health screening, pediatric mental health screening and early childhood developmental screening 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) 23 Leveraging Partnerships with Medicaid/CHIP to Use Data to Guide Inform QI: Positive Experiences 1. 2. 3. 4. 5. 6. Leveraging Partnerships with Medicaid/CHIP to Use Data to Guide Inform QI: Positive Experiences Alabama – Developmental screening billing data in a specific project for baseline data. – CHIPRA core measure with data elements in our AI projects. Oregon – Core measure data – Exploring “child id” across public systems – Leveraged External Quality Review to have MCOs collect robust medical chart review data related to develpomental screening, referral and follow-up Utah – Work state public health department and Medicaid – We work directly with one of the data coordinators to ‘massage’ the data to meet our needs. 
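Baseline screening rates pulled from billing data, as in the Alabama example above, usually come down to counting children with a screening claim against an attributed denominator; a minimal pandas sketch with made-up rows (96110 is the CPT code commonly billed for standardized developmental screening, but confirm the code set against your own claims specification):

```python
# Developmental screening rate by practice from a Medicaid claims/billing extract.
# The rows below are illustrative; only the overall pattern (numerator of screened
# children over an attributed denominator) reflects the approach described above.
import pandas as pd

claims = pd.DataFrame({
    "child_id": [101, 101, 102, 103, 104, 105],
    "practice": ["A", "A", "A", "B", "B", "B"],
    "cpt_code": ["96110", "99392", "99391", "96110", "99392", "99391"],
})
eligible = pd.DataFrame({
    "child_id": [101, 102, 103, 104, 105, 106],
    "practice": ["A", "A", "B", "B", "B", "B"],
})

screened = (
    claims.loc[claims["cpt_code"] == "96110"]
    .groupby("practice")["child_id"].nunique()
)
denominator = eligible.groupby("practice")["child_id"].nunique()
rates = screened.reindex(denominator.index, fill_value=0).div(denominator).mul(100)
print(rates.round(1).rename("pct_screened"))
```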
Alabama Idaho New Jersey Oregon Utah Washington DC South Carolina – See Follow-Up Slides 2013 National Improvement Partnership Meeting Facilitation by the Oregon Pediatric Improvement Partnership (OPIP) South Carolina’s Work Around the CHIPRA Quality Measures – Introduced 18 of 24 quality core measures Jan 2011: Emergency Department visits ** st Well-child visits (WCV):1 15 mo. ** Developmental Screening 1st 3 yrs. ** Follow-up ADHD medication ** QTIP PDSAs Jan 2011 to August 2013 South Carolina’s Work Around the CHIPRA Quality Measures 180 153 160 July 2011: Access to Primary Practitioners ** Preventative Dental Services ** CAHPS 144 133 140 117 120 100 February 2012 Frequency of Ongoing Prenatal Care % of live births < 2,500 grams C-Section Asthma … w/ 1+ Asthma ED visit ** July 2012 BMI ** Pediatric Hemoglobin A1C testing January 2013 Follow-up …Mental Illness July 2013 Adolescent Well-Care Visits** Timeliness of Prenatal Care Immunizations for adolescents Chlamydia Screening ** SC received approval American Board of Pediatrics (ABP) to offer MOC credits on 10 quality measures. See ** plus Family Centered Care and Behavioral Health 80 60 75 75 69 56 44 31 40 20 20 8 4 1 1 3 1 7 2 2 0 0 141 Example from South Carolina 142 Example from South Carolina 143 144 24 Leveraging Partnerships with Medicaid/CHIP to Use Data to Guide Inform QI: Barriers Encountered Idaho: • Tried to use our Medicaid data team for reporting baseline information. Experienced significant barriers. New Jersey: • This has been a real challenge Washington DC • Engaged with DC Medicaid at IP start-up, but this aspect has been less of a focus • DC Medicaid has undergone successive changes in leadership, priorities. Building Measurement into Your IPs: Staffing Models New Mexico: o 1.5 FTE related to data central: turning raw data into fast feedback reports so coaches can communicate with sites; managing our access data base; and assisting with creation of online data collection (Google Docs/Drive) o 1.0 FTE MPH to help with data synthesis and overall program evaluation and reporting. Since this person is new plan is to have this person more involved with up front design of measures and management of database o Added the 1.0 FTE MPH and 0.5 data analyst because our increased work and changes in the way we share data with sites required more people Bronx o Evaluation Expert: 0.05 FTE, Project Director and Assistant each at 1 FTE Utah o QI coaches and Senior Program Manager are trained in measurement and data and help facilitate the processes in all our projects Washington DC – We have grown our own staff/capacity- still learning & growing. Not as strong in evaluation as in front end QI coaching, data collection. – Growth is subject to funding- hard to sustain steady FTE’s when projects come & go. – Identified alternate funding in academic health system to sustain baseline functionality Building Measurement into Your IPs: Staffing Models Vermont o Have staff specifically focused on measurement and data. o Also have contracted with specifically to provide evaluation to other national and state level organizations. Oregon o Staff specifically focused on measurement and data, but from a senior level. Found not enough work for a junior-level, analyst only position. Exploring models to share analyst staff with other in Department Arizona: o We have some staff who are responsible for collecting the data and putting together the slides to share with practices on phone calls. 
New Hampshire: o Jo Porter (.05 FTE), Research Director for the NHIHPP, is serving as our data/evaluation design lead, o Data collection and reporting will be done by a Holly Tutko (.25 FTE), NH PIP Project Director and a Research Associate (TBD, 30 FTE). Sharing an Analyst in Another Department in Same Organization Vermont • Share staff with College of Medicine through UVM routinely. • This model has worked well when a % of shared staff’s FTE is paid through the IP. Kentucky • Within our department of pediatrics, there is a health services research unit (brand new) and we can obtain statistical support there New Jersey • Utilized the same analyst for several different QI projects/initiatives – currently we work with 2 different researchers. • As all of our programs are preventative or wellness oriented, within the context of a medical home framework, using the same analyst has helped them to get to know our agency and team, and how we work as a team. Contracting Out for Measurement and Evaluation 1. 2. 3. 4. 5. AL ID New Jersey South Carolina Utah Break. Snack is located in foyer. Provided by: Each share the positive and negative experiences. National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013 25 Session Objectives • Attendees will learn best practices and share strategies on how to teach QI with limited time • Attendees will learn how to recognize when a participating practice should be let go from a project • Attendees will discuss how to promote honesty in practice teams’ data collection Teaching Quality Improvement – Effective QI Coaching Strategies Tamara N Johnson, MPH Quality Improvement Coach Goldberg Center for General Pediatrics and Community Health Children’s National Medical Center Mary Jo Paladino, MSA Executive Director, CHIP-IN for Quality Child Health Improvement Partnership Indiana National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013 DC snapshot • Primary Care Access, 2005 • Poverty in DC, 2000 Teaching QI When Time is a Factor Tamara John, MPH Quality Improvement Coach District of Columbia Partnership to Improve Children's Health Care Quality Children’s National Health System Dark green Moderate to Highly privileged areas Yellow, Orange & Red less privileged blocks in order of increasing adversity Barriers to QI DC-PICHQ • • • DC Partnership to Improve Children’s Healthcare Quality – Launched November 2005 with start-up funding & guidance from VCHIP & Commonwealth Fund – Matching funds & institutional home: Children’s National Medical Center (D.C.) • Key partnerships with DC Medicaid, DC AAP chapter, academic health centers, FQHC’s and provider community, managed care plans (Medicaid) – Sharing & mentoring from VCHIP and NIPN state improvement partnerships Vision: To improve healthcare quality & outcomes for children in the District of Columbia (& region) – Goals: • Build an enduring regional partnership (& funding) • Engage provider participation & leadership with key stakeholders: government & payers • Utilize demonstrated quality improvement (QI) methodologies to promote incremental change at practice- and system-based level • Produce data-driven & measurable outcomes District of Columbia: unique jurisdiction, demographics & region – Metropolitan Washington DC region: DC, MD and VA – 3 state governments, Medicaid programs, AAP chapters, politics • Provider Time • Fear of the unknown • • – QI Is too hard! 
– Bad experience with QI
• Staffing
– QI coaches
– Practice: small staff, comfort with project topic, lack of leadership, part-time staff
• Technology
– EHR implementation
– No control/limited access
• Distance
• Competing priorities
How do you overcome these barriers and teach the basics of QI in 1 hour or less?

Working/Driving in the DMV
• DC is only 68 sq. miles
• Has some of the worst traffic in the nation
– Drivers spend ~67 hrs. a year in traffic

Asthma QI Learning Collaborative 2012-2013
Improving Asthma Care in Pediatric Practice QI MOC program, 2012-2013
• ~50 practices, 202 active participants in DC, MD and VA
• Partnered with: Maryland & DC AAP, IMPACT DC
• ABP MOC Part 4 credit (25 points) & CME credit (up to 33.5 hours)

Site visit planning: Organization is Key
The more you can do beforehand the better!
• Meet with your project team and decide what the project should look like
– Number of Learning Sessions
– Number of chart audit reports
– Number of team meetings
– Roles and responsibilities for participating providers/practices
– What are your expectations for participation
– Flexibility vs. requirement

Create a Project Map
Helps to quickly identify project activities. Created using Microsoft Visio or PPT.
[Project work flow diagram: Children's National Health Network Asthma Learning Collaborative – monthly learning sessions, monthly QI conference calls with team leaders, monthly practice team meetings, office detailing site visits, and chart audits, October 2012 through June 2013. Session titles on the map include The Planned Asthma Visit, Practical Asthma Education, and Advanced Asthma Management.]
Learning sessions and faculty shown on the map:
• October 3, 2012 – Kick-off meeting: Quality Improvement 101 / Introduction to Pediatric Asthma
• November 7, 2012 – Molly Savitz, RN; Rhonique Harris, MD
• December 19, 2012 – Molly Savitz, MSN, FNP; Caitlin Munoz, MSW, MPH; Stephen Teach, MD
• January 29, 2013 – Dinesh Pillai, MD; Hemant Sharma, MD
• March 7, 2013 – Breathing Easy: Environmental Management of Asthma – Molly Savitz, MSN, FNP; Jerome Paulson, MD; Joy Purcell, Esq.
• April 3, 2013 – Coding to Improve Reimbursement – Mark Weissman, MD
• May 14, 2013 – Pulling It All Together – Panelists: asthma experts
Chart audits: baseline audit (30 charts) due October 31, 2012; then 10 charts per month, November 2012 through June 2013.
PDSA Cycle/Progress Reports: Report 1 (November–December); Report 2 (January–February); Report 3 (March–April).

Create easy-to-read guidelines:
Provider Responsibilities and Practice Responsibilities/Requirements flow charts (created using Microsoft Visio or PPT)
• CNHN Obesity Learning Collaborative – Team Requirement
• CNHN Asthma Learning Collaborative – How to Earn MOC Part 4 Credit as a Provider
(An illustrative sketch of this eligibility logic appears after the tools section below.)

How to earn MOC Part 4 credit: Team Requirement – Mandatory Activities for the Practice Team
• Kick-off webinar: watch the live webinar (Option 1) or the recorded webinar (Option 2) and submit a signed presentation attestation form; without the form, the team cannot receive credit.
• 1 team meeting a month for 9 months (total of 9 meetings). Submit 1 meeting report a month to QI TeamSpace by the 5th of the following month; all members earning MOC credit should be present. (2)
• 1 baseline chart audit: submit to QI TeamSpace by Oct 31, 2012; 30 charts must be reviewed (residents: 20 charts). (2)
• 1 chart audit a month for 8 months (total of 9 chart audits). Submit 1 chart audit report a month to QI TeamSpace by the 5th of the following month; 10 charts must be reviewed. (2)
• 1 QI conference call a month (total of 8 calls): 1 monthly call with the QI team; all practice members are welcome to participate, and attendance is required from at least 1 member working toward MOC credit.
• 3 PDSA projects (total of 3 reports): conduct at least 3 PDSA projects over the course of the learning collaborative and submit 3 PDSA reports on QI TeamSpace by the end of the reporting month. (3)
Once team requirements are completed, providers may move on to the individual requirements; please refer to "How to earn MOC Part 4 credit as a Provider" to confirm that all requirements are met before data submission.

How to earn MOC Part 4 credit as a Provider – Mandatory Provider Activities
• You are the documented provider for a minimum of 12 patients over the course of the entire project: your initials are documented monthly on the practice chart audit tool for at least 12 patients, with a minimum of 1 chart per provider per audit report.
• You attended a minimum of 5 out of 9 team meetings: your name is documented on at least 5 submitted team meeting forms.
• You attended/watched at least 4 webinars and submitted a signed presentation attestation form for each webinar.*
• Team requirements are completed (refer to "How to earn MOC Part 4 credit: Team Requirement" to confirm that all requirements are met before data submission).
Providers are eligible for MOC Part 4 credit if all individual provider requirements are met; if you did not meet the minimum requirements for any of the mandatory activities, your participation will be reviewed on a case-by-case basis.
Notes: (1) Providers can only earn MOC Part 4 credit if all chart audits and reports are submitted to QI TeamSpace by June 30, 2013. (2) For chart review and team meeting requirements, please review "How to earn MOC Part 4 credit as a Provider."
*Rules for submitting webinar attestation forms: ♦ All forms should be submitted by June 30, 2013. ♦ 4 out of 7 submitted attestation forms must be submitted within 30 days from the day the recorded webinars are posted.

Workbook tools with directions: Chart Audits, Pre-visit Planning (created using Microsoft Word & Excel)
• Talk to your subject experts:
– Discuss your project measures and how they should be documented
• Create practice-friendly tools with directions for:
– QI basics
– Data collection
– Meeting notes
– PDSA reports
Tools (cont.)
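The MOC Part 4 requirements above amount to a simple checklist. As an illustration only (not part of the original training materials), the sketch below shows how an IP data manager might automate that eligibility check; the thresholds mirror the requirements described above, but the record structure and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProviderRecord:
    """Hypothetical per-provider tallies, e.g. pulled from QI TeamSpace exports."""
    documented_patients: int      # patients with this provider's initials on the chart audit tool
    team_meetings_attended: int   # out of 9 monthly team meetings
    webinars_completed: int       # live or recorded, each with a signed attestation form
    pdsa_reports: int             # PDSA cycle / progress reports submitted
    chart_per_audit_met: bool     # at least 1 chart per provider on every audit report
    team_requirements_met: bool   # practice-level requirements (9 meetings, 9 audits, 3 PDSAs)

def moc_part4_eligible(p: ProviderRecord) -> bool:
    """Mirrors the provider requirements listed above: 12+ documented patients,
    5 of 9 team meetings, 4+ webinars, 3 PDSA reports, 1+ chart per audit,
    and completed team requirements."""
    return (p.team_requirements_met
            and p.documented_patients >= 12
            and p.team_meetings_attended >= 5
            and p.webinars_completed >= 4
            and p.pdsa_reports >= 3
            and p.chart_per_audit_met)

# Example: a provider who met every requirement except the webinar minimum
# is flagged for case-by-case review rather than automatic credit.
print(moc_part4_eligible(ProviderRecord(14, 6, 3, 3, True, True)))  # False
```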
The Site Visit: Scheduling the visit
• Plan a time that works best for both you and the practice:
– Try to schedule 45-minute to 1-hour time slots
• Beware of the time constraints:
– Early morning: limited time for questions, because patient visits are scheduled to begin and providers will need to prepare for the day.
– Lunch: providers might come in late or sporadically because they are finishing up with their patient visits, and they may leave early to prepare for their next visit.
– End of the day: providers may be late because they are finishing up with patients, but they will have more time to ask questions if something is not clear.

During the Site Visit
• Create a basic handbook/cheat sheet that you can provide to your practices and coaches. This helps to:
– Structure your time
– Set the tone and expectations for the project
– Guide the conversation
– Provide guidance when the QI coach leaves
• Bring a snack for the practice – it makes the visit easier

Site visit breakdown (sample agenda sheet): IP Project Site Visit – Practice; Coach name / Email / Phone
Today's Agenda
1) Introductions
2) Review of Project Work Flow
3) How to earn credit: 1) Team, 2) Provider
4) Review of QI TeamSpace
5) Chart Audits
6) Monthly Meetings
7) PDSA Cycles
8) Questions
Thank you for your time!

Site Visit Tips:
• Send them your site visit packet ahead of time so they know what to expect
• Include all of your handouts and instructions
• If you need a computer with internet access, make sure there is one available or bring your own
• If you use a website to collect data, provide step-by-step instructions on how to access the site
• Provide due dates when you can
• Leave enough time to discuss PDSA cycles
• Be flexible with your agenda

After the coach has left…
Create the tools you need to make life easier throughout the project:
• Schedule your Learning Sessions and Team Leader calls during lunch
– These are recorded for those who cannot attend live
• Practice teams can discuss their practice changes with each other during project calls
• Chart audit results: have a 1-2 day turnaround time
• Access to the QI coach / content experts through email and phone

What have other IPs done?
• VCHIP
– Large group training sessions with ongoing support using coaching calls
• New Jersey:
– Practices should have a committed project team in place
– Face-to-face visits work best
– Assess the practice's level of understanding and commitment
– Allows the IP to bring resources from the community to create sustainable collaborations
• IHAWCC
– Face-to-face visits combined with WebEx, individualized phone calls & practice updates
• Alabama
– Do not employ individual practice coaches to visit practice teams
– Use monthly webinar calls, a QI tool of the month, and a refresher on clinical guidelines
– The project physician or QI manager will help practices who are struggling
• Arizona:
– Send the agenda and slides a few days before so practices can review and come prepared with questions/discussions
– Allow less time to review data and more time to discuss barriers/successes/idea sharing

Teaching QI Measures and Data
Teaching Quality Improvement – Measures & Data
Mary Jo Paladino, MSA, Executive Director, CHIP-IN for Quality, Child Health Improvement Partnership Indiana
National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013

Indiana Experience: 3 examples
• Medical Home Learning Collaborative
• Chronic Condition Management – Asthma
• Developmental Screening – MOC

Medical Home Learning Collaborative: The MHLC Structure
• Three-year Indiana Community Integrated Systems of Services (IN CISS) grant
– Nine practices joined in October 2009
– Nine more in October 2010
• Diverse in size, demographics, location and culture
• All implementing the AAP's Medical Home Tool Kit in their practices (http://www.pediatricmedhome.org/)
• Bi-weekly conference calls
• Face-to-face site visits every 8-12 weeks
• Annual Spring and Fall meetings

Conference Call Topics
• Updates
• Huddles
• Quality Improvement
• Team Meetings
• Pre-planned Visits
• Improved Access
• Buy-In to Medical Home
• National Committee for Quality Assurance (NCQA) Standards
• Electronic Health Records – Meaningful Use
• Registries
• Family / Parent Partner Recruitment and Involvement
• Medical Home Billing Codes

AAP National Center for Medical Home Implementation and Center for Medical Home Improvement – Building Your Medical Home ~ Toolkit ~
• Supports your development and/or improvement of a pediatric Medical Home.
• Prepares you to apply for and potentially meet the National Committee for Quality Assurance (NCQA) Physician Practice Connections® Patient Centered Medical Home (PPC-PCMH™) Recognition program requirements.
• Offers capacity to chart progress.
Web site: http://www.pediatricmedhome.org

Tracking Your Progress
[Charts: Scores on the Medical Home Index for the Indiana Medical Home Learning Collaborative, pre-collaborative and follow-up]

Chronic Condition Management – Asthma MOC
• Practices chose their measures
• Identified through collaborative discussion – 4 measures:
1) Education
2) Asthma Action Plans
3) Diagnosis-Specific Coding
4) Controller Medication
• Baseline data and monthly chart reviews
• Asthma Quality Improvement Dashboard
• Maintenance of Certification (MOC)

Example: Asthma Action Plans
AIM: Within a year, have 80% of patients with asthma action plans in their charts.
• Plan: Build an asthma registry and choose a tool
• Do: Distribute asthma action plans to patients with asthma
• Study: Chart reviews and patient feedback
• Act: No action plan in chart – schedule a follow-up asthma appointment

Asthma Quality Improvement Dashboard
[Dashboard charts, Practices 1-5 and COMBINED, January-August:
• Asthma Baseline Chart Review – Education; Asthma Action Plan data
• Asthma Action Plan: percentage with plan in the chart; documented in the medical record; goals met or barriers
• Education: percentage of patients with asthma who have education re: proper use of a spacer device
• Diagnosis-Specific Coding: "well" or "not well" controlled; severity code
• Controller Medications: persistent asthma on controller meds]

Developmental Screening MOC
• M-CHAT 18- and 24-month screens
– How many children seen for the 18-month visit
– How many children screened
– Number of passes
– Number of fails
– Number of referrals
– If not referred, why?
(A small illustrative sketch of how these monthly screening and dashboard percentages can be tabulated follows the discussion questions below.)

Discussion with Practices after looking at the Data
• What surprised you?
• What works well / what would you tell others / helpful hints?
• What were some of the challenges and how have you addressed them?
• What would you like to see happen next?
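To make the arithmetic behind dashboards like these concrete, here is a minimal, purely illustrative sketch (not from the Indiana collaborative; the practice names and fields are made up) of turning monthly chart-audit rows into the kind of per-practice percentages plotted above, such as the percentage of audited charts with an asthma action plan.

```python
from collections import defaultdict

# Hypothetical chart-audit rows: (practice, month, chart has an asthma action plan?)
audit_rows = [
    ("Practice 1", "Jan", True), ("Practice 1", "Jan", False),
    ("Practice 1", "Feb", True), ("Practice 1", "Feb", True),
    ("Practice 2", "Jan", True), ("Practice 2", "Feb", False),
]

def action_plan_rates(rows):
    """Percent of audited charts with an action plan, by (practice, month)."""
    counts = defaultdict(lambda: [0, 0])   # (practice, month) -> [charts with plan, total charts]
    for practice, month, has_plan in rows:
        counts[(practice, month)][0] += int(has_plan)
        counts[(practice, month)][1] += 1
    return {key: 100.0 * with_plan / total
            for key, (with_plan, total) in counts.items()}

for (practice, month), pct in sorted(action_plan_rates(audit_rows).items()):
    print(f"{practice}, {month}: {pct:.0f}% of audited charts have an action plan")
```

The same tally-and-divide pattern applies to the M-CHAT measures (children screened out of children seen, passes, fails, referrals) or to a COMBINED series that pools all practices before dividing.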
Lessons Learned
• Find the leader who does QI naturally and label what they do
• Identify a problem, brainstorm, try something, get feedback, make a change – that is PDSA
• Start small
• Celebrate early successes – it helps build enthusiasm and buy-in
• Use data to inform

Activity
• Break into 4 groups
• Each table has a different topic
• Each table needs a recorder and someone to report out
• Paper at each table will describe the scenario

After the Coach Leaves: Strategies to Increase Practice QI Sustainability
R.J. Gillespie, MD, MHPE, FAAP, Medical Director, Oregon Pediatric Improvement Partnership
NIPN Operations Training Meeting, Arlington, VA, October 2013

Objective
• To understand key steps in ensuring practice-level sustainability throughout the course of a project.

As the Project Starts
• Understanding the clinic's change culture
• Knowing who in the clinic needs to be engaged
• Getting the backing of clinic leadership
Key thought: Understanding how the practice typically addresses change and decision-making will facilitate project spread.

As the Project is Underway
• Developing "healthy habits" as a team
• Understanding common pitfalls to change, as well as facilitators to change
Key thought: Helping coach the practice as they experience both pitfalls and facilitators to change will help them maintain a functioning team.

As the Project Finishes
• Leaving the practice with a long-term infrastructure
• Spreading QI skills and knowledge to other practice members
Key thought: QI skills and knowledge can't live in the brain of a single individual (or small group of individuals) if change is to be sustained.

Food for thought
• Given that medical home transformation is a flexible, long-term process… how can you build your project team and do your project-level work in a way that sustains the work beyond the timeframe of the learning collaborative?

Enabling and Sustaining Change: Who do you Need to Engage?
• Who will the change affect – directly and indirectly?
• What are the drivers and barriers to change?
• Who needs to be involved, consulted, informed?
– Remember that this often means engaged participation of the non-provider office staff (can sink or float a project)
• To what extent will contextual and cultural factors such as team dynamics or differences in approach between departments affect the success of change?
• How can I develop a shared vision?

AS THE PROJECT STARTS: INTENTIONALLY BUILDING THE PROJECT TEAM & ENSURING CLINIC-LEVEL ENGAGEMENT

Key Questions: Understanding Your Clinic's Change Culture
• How are changes made in your practice?
• Who holds decision-making authority in your practice?
• How can you engage other providers to participate in changes made during the learning collaborative?
• What are the structural supports needed to maintain continued growth as a medical home?

Lessons from the National Center for Medical Home Improvement (CMHI)
• Evaluated practices that improved on their "medical home"-ness AND sustained their improvements
• Lessons from these sustained innovators – "If you do nothing else…":
– Identify your population of CSHCN
– Gain family participation/feedback
– Develop the capacity for practice-based care coordination and the use of care plans
Adapted from Cooley, W.C. (2012, June). Care coordination – Assuring a family-centered approach [PowerPoint slides].

Maxims of Patient-Centered Care
• The needs of the patient come first
• Nothing about me without me
• Every patient is the only patient
From: D. Berwick,
What 'Patient-Centered' Should Mean: Confessions of an Extremist. Health Affairs 28, no. 4 (2009): w555-w565.

Key Steps in Leadership Driving Change
• Senior leadership (clinical and administrative) must play an active and visible role
• Change should be conducted by a multidisciplinary team
• Leadership needs to create a sense of urgency and dissatisfaction with the status quo

Leadership Driving Improvement
• Who are the key leaders needed to achieve this development in practice, and where are they positioned?
• Does this leadership reflect the range of stakeholders?
• How do we develop a shared vision amongst the leaders?
• Are there any gaps in leadership, and how can this be addressed?
• How can the vision be shared to inspire and engage leadership at all levels?

Engaging Others
• Leadership is needed at all levels to drive forward continuous improvement. Leadership involves:
– inspiring, enabling and transforming others
– facilitating and supporting change
– developing staff
– and acting as a role model for others.

AS THE PROJECT IS UNDERWAY: STEPS TO IMPLEMENT AS THE PROJECT IS RUN – FORMING "HEALTHY HABITS" AS A TEAM

Simple Steps to Implement Now
• Working on team identity and function
– Are you meeting regularly outside of practice facilitation visits? Do you create an agenda? Are you dividing accountabilities?
• Finding ways to share project information, goals, and aim statements with others
– What are the avenues for sharing information with other providers and staff? Are there standing meetings that you need to get yourself on the agenda for?
• Publicizing project data with other staff members and providers
– How is performance data shared with others in the practice?

Conducting Pilots in a Way that Engages Different Types of Adopters in Your Clinic
• Small tests of change are critical before widespread clinic-level changes are conducted
• Pilots should test different processes AND can engage different providers and stakeholders in testing out the change
– By being part of the test, they can become advocates for the change
• Intentionally recruit internal "thought leaders" into the pilot testing
• Intentionally recruit people who approach the project differently in the practice

Common Pitfalls in Change
• Implementations that require fundamental changes in established routines are harder to carry out.
• Limiting participation to a few people playing traditional roles.
• Absence of a dedicated support structure.
• Continued dependence on a few individuals to sustain momentum.

Factors Supporting Sustained Change
• A culture of openness and transparency
• Staff at all levels involved
• A system for communication of good practice development projects
• Clear organizational goals in respect of quality
• Coordination of activity across projects
• National and local initiatives within a coherent operational plan
• A financial infrastructure to support development of practice and associated education

Structural Supports
• Implementing large-scale change calls for dedicated support structures
• Success increases if multiple tactics are used
• A Quality Improvement Strategic Plan may facilitate long-term change

AS THE PROJECT FINISHES: FINE-TUNING QUALITY IMPROVEMENT EFFORTS – BUILDING A QI INFRASTRUCTURE FOR YOUR CLINIC

What a QI Committee Should Do
• Medical home "transformation" is about broad systems change. Assessing and improving the practice based on recognized medical home definitions requires QI.
• Interpreting policy / health reform changes and how they will impact the clinic also informs QI.
• Building the clinic's adaptive reserve should be a primary goal of the QI committee.
– Other providers / office staff should be competent in QI
– "Sensemaking" is an adaptive reserve skill that needs to be fostered
• Consider developing a QI Strategic Plan.
– If a practice has a lot of projects going on… how do they connect to each other and inform the direction that the clinic wants to take?
– What are the goals of the clinic related to improvement?

Adaptive Reserve
• What is predictive of medical home transformation is the characteristics of the practices themselves… specifically adaptive reserve
• The ability of a practice to be resilient, to bend, and to survive and thrive under force. Facilitates adaptation during times of dramatic change.

Template for Strategic Plan Components

What is Sensemaking?
• "Sensemaking is the process through which the fluid, multilayered world is given order, within which people can orient themselves, find purpose, and take effective action. Organizations don't discover sense, they create it." – Don Berwick
• When it comes to quality measures, sensemaking is the process of creating a story about data to give that data meaning within a clinical context.
• Once you can give the data meaning within a practice, an action plan becomes a simple next step.

QI Committee Structure
• A multidisciplinary team is needed
– Nursing staff, front office, operations, parent, tech staff, etc.
– Only way to ensure that clinical processes are improved broadly
• Need to determine clear accountabilities for tasks of the committee, meeting schedule, leadership, etc.
• Consider how to insert the committee into the existing governance structure.
• A Parent Advisory Committee should be considered. Some materials from the QI Committee should be vetted through a parent group.

Engaging Families and/or Youth
• In working with practices, this is difficult but meaningful in many ways
• Some ideas for how to engage families:
– Recruiting families for QI teams or standing clinic committees
– Focus groups
• Recruit a group of parents to discuss specific topics
• Example: focus group to review service needs for CYSHCN
– Parent Advisory Group
• Can also be subject-specific, or have the agenda driven by the parents
– Survey patients and families about their experience of care
• Formal surveys
• Shorter surveys on topics of interest

Final Thoughts
• Sustainability needs to be considered as soon as the practice is engaged in a project
• Specific knowledge and skills can be imparted during the project
• Sustainability is easier if an infrastructure is developed
• Engaging families and patients will hold the practice's "feet to the fire" as they continue their QI efforts

THANK YOU! Please Complete the Evaluation Form
National Improvement Partnership Network (NIPN) Improvement Partnership (IP) Operations Training October 2-4, 2013