Partnership Plan Waiver Managed Long Term Care (MLTC) Amendment - August 2012

EVALUATION PLAN

New York Department of Health

Partnership Plan Medicaid Section 1115 Demonstration

Start Date of Demonstration Period:          August 1, 2011
End Date of Demonstration Period:             December 31, 2014

As a component of the Special Terms and Conditions (STCs) for the Partnership Plan Medicaid Section 1115 Demonstration (No. 11–W–00114/2), the New York State Department of Health (DOH) hereby submits this draft evaluation plan for approval to the Centers for Medicare and Medicaid Services (CMS).

This evaluation plan will assess the degree to which the Demonstration goals have been achieved and/or key activities have been implemented. The evaluation plan includes a discussion of the Demonstration´s major goals and activities, evaluation questions, and measures and data that will be used in the evaluation.

In accordance with the Special Terms and Conditions for the Demonstration extension, the State will submit two evaluation reports during the extension period: one for the Demonstration extension as a whole, including preliminary findings for the Hospital–Medical Home (H–MH) and Potentially Preventable Readmissions (PPR) demonstrations, and one presenting final findings for the H–MH and PPR demonstrations.

The New York State Department of Health will select and contract with an independent outside vendor for completion of the evaluations described above. The DOH will be responsible for the quarterly and annual reporting requirements.

OVERVIEW OF THE DEMONSTRATION

In July 1997, New York State received approval from the Health Care Financing Administration (HCFA) for its Partnership Plan Section 1115 Demonstration. The State´s goal in implementing the Demonstration was to improve the health status of low–income New Yorkers by improving access to health care in the Medicaid program, improving the quality of health services delivered, and expanding coverage to additional low–income New Yorkers.

Through the original Demonstration, the State implemented a mandatory Medicaid managed care program in counties with sufficient managed care capacity and the infrastructure to manage the enrollment processes essential to a mandatory program. The Demonstration also enabled the extension of coverage to certain individuals who would otherwise be without health insurance.

The initial Demonstration was approved in 1997 to enroll most Medicaid recipients into managed care organizations (MCOs). In 2001, the Family Health Plus program (FHP), implemented as an amendment to the Demonstration, began providing comprehensive health coverage to low–income uninsured adults (with and without children) who have income and/or assets greater than Medicaid eligibility standards. In 2002, the Demonstration was further amended to provide family planning services to women losing Medicaid eligibility as well as certain other adults of childbearing age.

With the original Demonstration and subsequent amendments, the Partnership Plan Demonstration includes five major components:

  • A Medicaid managed care program providing Medicaid State Plan benefits through comprehensive managed care organizations to most recipients eligible under the State plan;
  • A Family Health Plus program providing a more limited benefit package, with cost–sharing imposed, to adults with and without children with specified income and assets;
  • A Family Planning Expansion program provided to men and women of childbearing age with net incomes at or below 200 percent of the federal poverty level (FPL) and to women who lose Medicaid eligibility under the Partnership Plan at the conclusion of their 60–day postpartum period; and
  • Two hospital quality demonstration programs:
    • The Hospital–Medical Home Demonstration is intended to improve the coordination, continuity and quality of care for individuals receiving primary care in settings affiliated with teaching hospitals, and facilitate incorporation of patient–centered medical home concepts into residency training;
    • The Potentially Preventable Readmissions Demonstration will test strategies for reducing the rate of preventable readmissions within the Medicaid population.
  • The Managed Long–Term Care program will expand mandatory Medicaid managed care enrollment to dually–eligible individuals over age 21 who receive community–based long–term care services in excess of 120 days, and provide dually–eligible individuals ages 18–21, as well as nursing home eligible non–dual individuals age 18 and older, the option to enroll in the MLTC program. In addition, this amendment permits the state to expand eligibility to ensure continuity of care for individuals who are moving from an institutional long–term care setting to receive community–based long–term care services through the managed long–term care program.

With CMS approval extending the Demonstration through 2014, New York State is planning to commission evaluations for the Partnership Plan, the Family Planning Expansion program, the hospital quality demonstration programs, and the Managed Long–Term Care Program to determine the degree to which the State has added to the successes that it has already achieved with the Partnership Plan Demonstration.

Goals and Major Activities

The primary goals of the Partnership Plan Demonstration are to increase access, improve quality, and expand coverage to low–income New Yorkers. In the years since initial approval of its Partnership Plan Demonstration, New York has made significant progress in meeting these goals.

Specifically, the Demonstration will allow continued eligibility for the managed care program, Family Health Plus program, and the Family Planning Expansion Program, as follows:1

Medicaid Managed Care Program

State Plan Mandatory and Optional Groups: FPL Level and/or other qualifying criteria
Pregnant women: Up to 200% FPL
Children under age 1: Up to 200% FPL
Children 1 through 5: Up to 133% FPL
Children 6 through 18: Up to 133% FPL
Children 19–20: Income at or below the monthly income standard (determined annually)
Parents and caretaker relatives: Income at or below the monthly income standard (determined annually)
Demonstration Eligible Groups
Adults who were recipients of or eligible for Safety Net cash assistance but are otherwise ineligible for Medicaid: Income based on Statewide Standard of Need (determined annually)

Family Health Plus

Demonstration Eligible Groups: FPL Level and/or other qualifying criteria
Parents and caretaker relatives of a child under the age of 21 (who could otherwise be eligible under section 1931 of the Medicaid State plan): Income above the Medicaid monthly income standard but gross family income at or below 160% FPL.
Non–pregnant, non–disabled ("childless") adults (19–64): Income above the Statewide standard of need but gross household income at or below 100% FPL.

Family Planning Expansion Program

Demonstration Eligible Groups
Women who lose Medicaid eligibility at the conclusion of their 60–day postpartum period
Men and women of childbearing age with net incomes at or below 200% FPL who are not otherwise eligible for Medicaid or other public or private health insurance coverage that provides family planning services

TECHNICAL APPROACH

As noted above, the primary goals of the Partnership Plan Demonstration are to increase access, improve quality, and expand coverage to low–income New Yorkers. To accomplish these goals, the Demonstration includes several key activities, including enrollment of new populations, quality improvement, and coverage expansions. This evaluation plan will assess the degree to which the key goals of the Demonstration have been achieved and/or the key activities of the Demonstration have been implemented.

Evaluation Plan Approach

The process of designing the evaluation plan first involved identifying and documenting the Demonstration´s key goals and activities, which were included in the State´s Demonstration extension proposal and the Special Terms and Conditions.

With key goals and activities identified, the process of designing the evaluation plan involved selecting evaluation questions that correspond to each of the major Demonstration goals and activities, building on the previous evaluation plan. The evaluation itself will seek to answer the evaluation questions, which in turn will assess the degree to which the Demonstration has been effective in implementing the key activities identified, directly achieving the goals of the Demonstration, or both.

The specific evaluation questions to be addressed by the evaluation were based on the following criteria:

  1. Potential for improvement, consistent with the key goals of the Demonstration;
  2. Potential for measurement, including (where possible and relevant) baseline measures that can help to isolate the effects of Demonstration initiatives and activities over time; and
  3. Potential to coordinate with the DOH´s ongoing performance evaluation and monitoring efforts.

Once research questions were selected to address the Demonstration´s major program goals and activities, specific variables and measures were then identified to correspond to each research question. Finally, a process was developed for identifying data sources that are most appropriate and efficient in answering each of the evaluation questions.

The evaluation team will use all available data sources. The timing of data collection periods will vary depending on the data source. Enrollment data will be collected monthly, provider network data quarterly, QARR/HEDIS data annually, and CAHPS data every two years. For this three–year period, the evaluation team will have three years of QARR data (2009, 2010 and 2011) and three rounds of CAHPS data (2009, 2011 and 2013). Data related to the hospital quality demonstration programs will be collected through required grantee progress reports.

Analysis Plan

While the Demonstration seeks to generate cost savings and promote quality care, observed changes may be attributed to the Demonstration itself and/or external factors, including other State– or national–level policy or market changes or trends. The evaluation team will develop a theoretical framework depicting how specific Demonstration goals, tasks, and activities are causally connected. This theoretical framework, which may include a logic model, will incorporate any known or possible external influences to the extent possible (such as policy changes or market shifts) and their potential interactions with the Demonstration´s goals and activities.

The theoretical framework will be used as a reference for the evaluation team in isolating the degree to which the Demonstration is associated with observed changes in relevant outcomes. Specifically, the evaluation team will seek to isolate the effects of the Demonstration on the observed outcomes in several ways:

  1. To the extent possible, the evaluation team will gather and describe credible contextual evidence that attempts to isolate the Demonstration´s contribution to any observed effects as well as describe the relative contributions of other factors influencing the observed effects. This will include documenting any relevant legal, regulatory, or policy changes or other trends – including the sequence, scope, and duration of such changes – at both a State and national level that are likely to influence the observed outcomes.
  2. Where possible and relevant, the evaluation will incorporate baseline measures for each of the selected variables included in the evaluation. Data for each of the targeted variables and measures will be collected regularly so that changes in outcome measures and variables can be observed on a longitudinal basis. Baseline measures will include measures for variables reported in the previous evaluation.
  3. The evaluation will compare rates of performance and measures with State and national benchmarks, where relevant and feasible. Incorporating benchmark measures will allow for external comparisons of Demonstration measures to State and national trends, further isolating the impacts of the Demonstration by controlling for external factors influencing the observed effects.

The evaluation features described above (analysis of qualitative contextual information, the use of baseline measures, ongoing data collection, and benchmarking) represent quasi–experimental means by which the evaluation team will determine the effects of the Demonstration. Evaluation conclusions will include key findings associated with individual research questions addressed as well as integrated information combining the results of individual evaluation questions to make broad conclusions about the effects of the Demonstration as a whole.
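To make the benchmarking step above concrete, the following sketch (illustrative only; the rates shown are hypothetical placeholders rather than actual QARR or HEDIS results) shows one simple way the percent change in a Demonstration quality measure could be compared against the corresponding change in a national benchmark, so that broader trends can be netted out of the observed effect.

# Illustrative sketch only: the rates below are hypothetical placeholders, not
# actual QARR/HEDIS results, and the net-of-benchmark calculation is one simple
# way to operationalize the benchmarking step described above.

def percent_change(baseline: float, current: float) -> float:
    """Percent change from a baseline value to a current value."""
    return (current - baseline) / baseline * 100.0

# Hypothetical annual rates (percent of enrollees meeting a quality measure).
demonstration_rates = {2009: 62.0, 2010: 64.5, 2011: 67.0}   # NY Medicaid managed care
benchmark_rates = {2009: 63.0, 2010: 63.5, 2011: 64.0}       # national benchmark

baseline_year, latest_year = 2009, 2011

demo_change = percent_change(demonstration_rates[baseline_year],
                             demonstration_rates[latest_year])
bench_change = percent_change(benchmark_rates[baseline_year],
                              benchmark_rates[latest_year])

# Change in the Demonstration measure net of the benchmark trend: a rough,
# quasi-experimental indicator of the Demonstration's contribution.
net_change = demo_change - bench_change

print(f"Demonstration change: {demo_change:.1f}%")
print(f"Benchmark change:     {bench_change:.1f}%")
print(f"Net of benchmark:     {net_change:.1f} percentage points of relative change")

Where the data support it, more formal approaches (for example, regression–based difference–in–differences models with covariate adjustment) may be applied in place of this simple comparison.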

In addition, the evaluation will include specific recommendations of best practices and lessons learned that can be useful for DOH, other States, and CMS. Moreover, to the extent possible, the evaluation team will integrate and/or compare evaluation conclusions and recommendations to the evaluation report submitted to CMS on January 29, 2010 and/or previous studies or evaluations of relevance.

The DOH will have a contract with an External Quality Review Organization (EQRO) to conduct the federally–required review of Managed Care Entities (MCE) as defined in 42 CFR 438 Subpart E. As the expansion of managed care to selected populations and counties is an important component of this Demonstration, the findings of EQRO activities and ongoing internal monitoring of managed care activities will be made available, as necessary, to assist the vendor selected to conduct the evaluations and write the interim and final reports.


PARTNERSHIP PLAN:
EVALUATION GOALS, ACTIVITIES, MEASURES, AND DATA

The evaluation tool, appended, forms the foundation of our evaluation plan by identifying and organizing the goals, activities, key evaluation questions, outcome measures and variables and data sources that will be used to measure the State´s success in achieving the major goals of the Partnership Plan Demonstration.

Table 1 outlines the evaluation strategy for measuring the success of the Partnership Plan, including the Family Planning Expansion Program and the Clinic Uncompensated Care Program. Also included are evaluation parameters for: Twelve Month Continuous Coverage; implementation of the Enrollment Center; HIV Special Needs Plans; mandatory enrollment of individuals living with HIV; Medicaid Advantage Plans; and Managed Long–Term Care.

Table 2 provides evaluation parameters for the Hospital–Medical Home (H–MH) Demonstration and Table 3 details the evaluation plan for the Potentially Preventable Readmission (PPR) Demonstration.

The draft final evaluation report due to CMS by July 31, 2014 will include all program components as detailed in Tables 1–3. However, as findings for the H–MH and PPR demonstrations may be preliminary as of that date, a separate report providing final findings on the H–MH and PPR demonstrations will be submitted to CMS by April 30, 2015.



Evaluation Tool for the New York State
Partnership Plan Demonstration
Demonstration Period:
October 1, 2009 through December 31, 2014

This tool describes the key goals, evaluation questions, measures/variables, activities and data sources related to the New York State Partnership Plan Demonstration.

Goal 1: To expand managed care enrollment
Continue to expand managed care enrollment in New York´s Medicaid Program
  Research Questions Measures/Variable Data Sources
1 How many beneficiaries were enrolled in Medicaid managed care as a result of the demonstration? Number of beneficiaries enrolled in managed care, statewide and by beneficiary type, analyzed across age category, county and with percent change over time OHIP Data Mart

Goal 2: To improve health care access for Medicaid beneficiaries in New York
Continue to improve health care access for Medicaid beneficiaries in New York
  Research Questions Measures/Variable Data Sources
1 To what extent has the demonstration improved access to primary care? Rates of physician participation in Medicaid managed care plans by county for primary care providers; ratio of primary care providers per 1,000 enrollees; number of primary care visits per member per month (PMPM); access measures for primary care in New York City and Rest of State OHIP Data Mart; HEDIS/Quality Assurance Reporting Requirements (QARR); Provider Network Data System (PNDS); Consumer Assessment of Healthcare Providers and Systems (CAHPS)
2 To what extent has the demonstration improved access to specialty care? Rates of physician participation in Medicaid managed care plans by county for specialists; ratio of specialty providers per 1,000 enrollees; number of specialty care visits PMPM; access measures for specialty care in New York City and Rest of State OHIP Data Mart; HEDIS/QARR; CAHPS; PNDS
3 To what extent has the provision of continuous eligibility affected the stability and continuity of coverage and care to adults? Average length of enrollment/eligibility, pre– and post– OHIP Data Mart
4 How has implementation of the statewide Enrollment Center impacted "churning" by demonstration participants? Compare rates of continuous enrollment pre– and post– Enrollment Center Enrollment Center data; OHIP Data Mart
5 How has the family planning expansion program expanded access to family planning services among the target population? Number of beneficiaries in target population receiving family planning; utilization data OHIP Data Mart
6 How has additional funding provided under the Clinic Uncompensated Care program increased the use of patient–centered medical homes (PCMH) and electronic health records (EHR)? Number of grantee clinics that participate in medical homes and have implemented EHRs Grantee reports; NCQA monthly feed of PCMH providers allows for determination of the number of PCMH providers in the clinics and possibly utilization trends if matched to OHIP Data Mart claims and encounters

Goal 3: To continue to improve the quality of care
To determine the extent to which the New York Partnership Demonstration Plan improved the quality of care for Medicaid beneficiaries
  Research Questions Measures/Variable Data Sources
1 How has quality of care for Medicaid managed care organizations changed over the life of the Demonstration? Changes in managed care quality measures for the health plans that serve Medicaid managed care enrollees; covered areas will include, but not be limited to: provider network; child & adolescent health; women´s health; adults living with illness; behavioral health; access and service

Changes in rates of member satisfaction for health plans that serve Medicaid managed care enrollees

Comparison of quality measures between New York Medicaid managed care and national benchmarks
QARR; CAHPS; National HEDIS data
2 How does quality of care for New York Medicaid managed care enrollees compare with national benchmarks?
3 Has the gap in measures of quality and satisfaction narrowed between New York Medicaid managed care plans and commercial plans? Comparison of quality and satisfaction measures between New York Medicaid managed care and commercial plans QARR; CAHPS
4 How have Medicaid financial mechanisms/payment methods evolved to support program objectives to advance a higher quality health care system? Qualitative descriptions of new financial mechanisms to support Demonstration goals, such as pay–for–performance initiatives, quality incentives, move to risk–adjusted capitation rates, etc., and early experience with their use. Internal New York State Department of Health (DOH) documents and reports
5 Has the HIV Special Needs Plan been a successful model for delivery of care to persons living with HIV/AIDS and their eligible dependents? Number of beneficiaries enrolled in HIV SNPs by beneficiary type, age category, and county; description of factors that influence ability to increase enrollment; number of persons living with HIV/AIDS enrolled in managed care and SNPs, with percent change over time; ability of SNPs to meet fixed expenses and to accomplish expansion and growth MMIS; DOH Administrative Data; QARR; DOH Internal Data; HIV QUAL Data; Solvency reports
6 How effective have provider/enrollee education and outreach efforts been in minimizing the impact of the transition of individuals living with HIV into mandatory Medicaid managed care? Changes in member satisfaction related to care, communication and knowledge of Medicaid managed care; quality and utilization by county; complaints and regulatory action; provider training and utilization caps QARR; CAHPS; Survey Data (Note: CAHPS sampling and survey data will require additional resources); DOH Internal Data
7 How effective has the state´s plan oversight and compliance monitoring been in minimizing the impact of the transition of individuals living with HIV into mandatory Medicaid managed care?
8 To what extent has the mandatory enrollment of individuals living with HIV into Medicaid managed care impacted the perceptions of care (Fee For Service v. Special Needs Plan (SNP) v. mainstream)? HIV SNP satisfaction measures; mainstream Medicaid managed care plan satisfaction measures CAHPS, Survey Data; (Note: CAHPS sampling and survey data will require additional resources); DOH Internal data (Rapkin, Knowledge and Attitude interviews summary, available as pre– measure of perceptions)
9 Has the Medicaid Advantage Program been successful in integrating Medicare and Medicaid covered services for dually eligible beneficiaries? Number of managed care plans and beneficiaries participating in integrated programs; quality measures for enrollees of integrated plans; cost efficiencies realized by Medicaid program as a result of integration of Medicare and Medicaid; utilization of services; number of complaints; number of expansions OHIP Data Mart; CMS receives HEDIS and CAHPS for Medicare Advantage enrollees; DOH Internal Data; HEDIS for Medicaid Advantage as of 2012 reporting year; QARR; Medicare CAHPS; DOH administrative data; MMIS
10 Has the required enrollment of individuals living with HIV into Medicaid managed care (either mainstream plans or HIV SNPs) impacted quality outcomes? HIV SNP quality measures compared to mainstream Medicaid managed care plans in NYC; quality measures for enrollees after Sept. 2010 or October 2011 compared with enrollees prior to Sept. 2010 or Oct. 2011; changes in member satisfaction measures for the health plans that serve Medicaid managed care enrollees; rates of emergency room and inpatient hospital use pre– and post– enrollment QARR; CAHPS; OHIP Data Mart; DOH Internal Data (payor HIV QUAL data can be stratified by region)
11 To what extent have SNPs improved overall quality of care? Evaluation of SNPs patient health outcomes QARR; OHIP Data Mart; DOH Internal Data (HIV QUAL data that is SNP specific includes outcome measures not reflected in QARR)

Goal 4: Expanded Health Care Coverage
Continue to reduce the number of uninsured New Yorkers
  Research Questions Measures/Variable Data Sources
1 How has expanded eligibility in the Family Health Plus program (FHP) affected health coverage for low–income uninsured adults? Number of beneficiaries enrolled in FHP, by beneficiary type, age category, and county OHIP Data Mart; Current Population Survey data
2 How many individuals have enrolled in employer sponsored health insurance (ESHI) through the FHPlus Premium Assistance Program? Number of beneficiaries enrolled in ESHI through FHPlus by beneficiary type, age category, and county OHIP Data Mart

Goal 5: Expanded Managed Long–Term Care
Make managed long–term care available to a greater number of eligible Medicaid recipients
  Research Questions Measures/Variable Data Sources
1 How has enrollment in MLTC plans increased over the length of the demonstration? Number of beneficiaries enrolled in MLTC plans, by county and percent change over time. OHIP Data Mart
2 What are the demographic characteristics of the MLTC population? Are they changing over time? Year to year comparison of demographic composition of MLTC beneficiaries, including age, race, gender, language, risk factors, enrollment, payment source, location, living situation, and top diagnoses. SAAM
3 What are the functional and cognitive deficits of the MLTC population? Are they changing over time? Year to year comparison of average statewide MLTC beneficiary scores on Activities of Daily Living Measures, Urinary Incontinence Frequency, Bowel Incontinence Frequency, Cognitive Functioning, When Confused, When Anxious, Frequency of Pain, and Depressive Feelings. SAAM
4 Are the statewide and plan–specific overall functional indices decreasing or staying the same over time? Average Overall Functioning score by health plan and statewide average with percent change over time. SAAM
5 Are the average cognitive and plan–specific attributes decreasing or staying the same over time? Year to year comparison of plan–specific scores on Activities of Daily Living Measures, Urinary Incontinence Frequency, Bowel Incontinence Frequency, Cognitive Functioning, When Confused, When Anxious, Frequency of Pain, and Depressive Feelings. SAAM
6 Are the individual care plans consistent with the functional and cognitive abilities of the enrollees? This evaluation question will be included when there is sufficient data available in 2014 to provide accurate measures.  
7 Access to Care: To what extent are enrollees able to receive access to personal, home care and other services such as dental care, optometry and audiology? Percentages of MLTC beneficiaries with a wait time of less than one month for routine Dentistry, Eye Care, Foot Care, and Audiology. Percentages of new MLTC enrollees that stated that accessing Personal Care and Home Care was the same or better than it was before joining the plan. MLTC Member Satisfaction Survey; MLTC Satisfaction Survey of New Enrollees
8 Quality of Care: Are enrollees accessing necessary services such as flu shots and dental care? Percentage of MLTC beneficiaries who received a flu shot within the last year. Percentage of MLTC beneficiaries who saw a dentist within the last year. SAAM, Encounter Data
9 Patient Safety: Are enrollees managing their medications? What are the fall rates and how are they changing over time? The risk–adjusted percentage of MLTC beneficiaries who independently manage oral medication with percent change over time; Statewide percentage of MLTC beneficiaries that fell within the last six months with percent change over time. SAAM
10 Satisfaction: What are the levels of satisfaction with the timeliness (how often services were on time/how often the enrollee was able to see the provider at the scheduled time) and quality of network providers? Percentages of MLTC beneficiaries who rated Home Health Aide, Care Manager, and Regular Visiting Nurse timeliness as Usually or Always. Percentages of MLTC beneficiaries who rated Home Health Aide, Care Manager, and Regular Visiting Nurse quality as Good or Excellent. MLTC Member Satisfaction Survey
11 Costs: What are the PMPM costs of the population? Sum of payments divided by MLTC beneficiary member months in one year. OHIP Data Mart
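The per member per month (PMPM) cost measure in the final row above is a simple ratio. The sketch below (illustrative only; the payment and member–month figures are hypothetical placeholders, not actual OHIP Data Mart values) shows how the measure could be computed for a single year.

# Illustrative sketch only: the payment total and member months below are
# hypothetical placeholders, not actual OHIP Data Mart figures.

def pmpm_cost(total_payments: float, member_months: int) -> float:
    """Per member per month cost: total payments divided by member months."""
    if member_months == 0:
        raise ValueError("member_months must be greater than zero")
    return total_payments / member_months

# Hypothetical one-year totals for the MLTC population.
total_payments = 540_000_000.00   # sum of MLTC payments over the year (dollars)
member_months = 150_000           # sum of enrolled months across all MLTC beneficiaries

print(f"PMPM cost: ${pmpm_cost(total_payments, member_months):,.2f}")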

Evaluation Tool for the New York State
Partnership Plan Demonstration – Hospital Medical Home
Demonstration Period:
August 1, 2011 through December 31, 2014

This tool describes the key goals, evaluation questions, measures/variables, activities and data sources related to the New York State Partnership Plan Demonstration – Hospital Medical Home.

Goal 6: Improved Hospital Outpatient Primary Care
Improve the coordination, continuity and quality of care for individuals receiving primary care in hospital outpatient departments operated by teaching hospitals, as well as other primary care settings used by teaching hospitals to train resident physicians
  Research Questions Measures/Variable Data Sources
1 Has the State´s Hospital–Medical Home (H–MH) demonstration resulted in demonstrable improvements in the quality of care received by Demonstration participants? Development of at least five clinical performance metrics consistent with QARR and/or meaningful use measures relevant to populations served Baseline and annual rates for each measure (submitted in state reports); NCQA monthly feed of Patient Centered Medical Home (PCMH) providers; primary care physician rosters and Quality Assurance Reporting Requirements (QARR) member–level files
2 How has the H–MH demonstration helped selected facilities improve their systemic performance and their quality and safety performance under each implemented initiative? Key measures of the Quality and Safety Improvement Projects (QSIP) will be used to ascertain improvement in the systemic and quality and safety performance of facilities Each QSIP will have specific measures included and related data sources; additional milestones may also be included to enable the implementation of the measures specific to the intervention
3 To what extent has the H–MH demonstration produced replicable residency program design features that enhance training in medical home concepts? Evidence–based processes and outcomes will be measured to determine achievement among programs Baseline and annual data for each measure (submitted in state reports)
A. Continuity of Care
1 To what extent have operations been restructured to enhance continuity of care? Implementation of an initiative to restructure operations to enhance patients´ continuity of care experience in conjunction with developing a Patient Centered Medical Home (PCMH); increase in number of training sites and resident time in ambulatory settings; new site trainings done beyond the hospital environment Data will be gathered per project plan (state progress reports will incorporate milestones to measure success/objective measures of progress)
2 How has increased training supported the core activities of medical home transformation? Pre– and post– evaluation of trainings to determine impact on medical home transformation Data will be gathered per project plan (state progress reports will incorporate measures of success/objective measures of progress)
3 How will demonstration changes related to the restructuring be sustained following termination of the demonstration? Sustainment will be compared to formal recognition of care consistent with NCQA requirements Data will be gathered per project plan (state progress reports will incorporate measures to assess changes)
B. Care Transition/Medication Reconciliation
1 Have better care transitions/medication reconciliation reduced readmissions and improved access to care? Develop transition bridge between management and medication reconciliation; number of patients participating in medication reconciliation; evaluation of readmissions and other utilization and quality metrics; clinical communication protocol; quality and quantity data related to continuity and follow–up care Patient registry; shared electronic information or medication list; avoidable re–admission data; patient risk assessment systems data
C. Integration of Physical/Medical Home
1 How has implementation of care improved the H–MH systems for coordinating physical and behavioral health care? Quality metrics related to integration of care; number of referrals to behavioral health; Communication protocol process for consults; procedures for coordinated case management PSYKES Data; Training session data; Average wait time
2 How have training programs helped integrate care for behavioral health patients within Medical Home? Quality metrics related to integration of care; Quantity of BH patients using MH model Data will be gathered per project plan (state progress reports will incorporate measures to assess changes)
3 Has the creation of a PSYKES system to receive reports improved care integration? Quality metrics related to integration of care; PSYKES system data
4 How has the initiative of Improving Access and Coordination between Primary and Specialty Care improved the system of the H–MH? Quality/Satisfaction Metrics related to primary, specialty, and follow–up care; Provider and patient satisfaction rates of specialty services; Rate of Primary Care follow–up services; Inclusion of specialists in team care Baseline data; specialty referral data; NCQA PCMH monthly files; QARR; CAHPS; separated select service area hospital data; primary–specialty care management protocols
5 How does the referral process affect access and coordination between primary and specialty care? Metrics related to wait times and appointment backlog; number of PCPs able to provide low–level specialty care Survey data (primary and specialty); Baseline data
6 How has the enhancement of interpretation services and culturally competent care improved the system of the H–MH? Access to language services; workforce sensitivity training; quality and quantity measurement of adequate interpretation technology; cultural analysis of service area Workforce data; HEDIS; DOH Administrative Data; Census Data
D. Avoidable Preterm Births
1 How has the Quality and Safety Improvement Project of ´Avoidable Preterm Births: Reducing Elective Delivery Prior to 39 Weeks Gestation´ demonstrated better care in the hospital setting? Percent of scheduled inductions, C–sections and deliveries at 36(0/7) to 38(6/7) weeks without medical or obstetrical indication documented, of all scheduled inductions or deliveries; percent of infants born at 36(0/7) to 38(6/7) weeks gestation by scheduled delivery who went to the NICU; percent of mothers informed about risks and benefits of scheduled deliveries at 36(0/7) to 38(6/7) weeks gestation documented in the medical record; percent of scheduled deliveries at 36(0/7) to 38(6/7) weeks that have documentation in the medical record of meeting optimal criteria of gestational age assessment; percent with all four elements in place: gestational age >= 39 weeks; monitor fetal heart rate for reassurance of fetal status; pelvic exam: assess to determine dilation, effacement, station, cervical position and consistency, and fetal presentation; monitor and manage hyperstimulation QARR; HEDIS; Peer Reviewed Journals; Milestones Data; Baseline Performance data; SPARCS Data; Hospital Data

Evaluation Tool for the New York State
Partnership Plan Demonstration – Preventable Readmissions
Demonstration Period:
August 1, 2011 through December 31, 2014

This tool describes the key goals, evaluation questions, measures/variables, activities and data sources related to the New York State Partnership Plan Demonstration – Preventable Readmissions.

Goal 7: Reduce the Rate of Potentially Preventable Readmissions
Test strategies for reducing the rate of preventable readmissions within the Medicaid population, with the related longer–term goal of developing reimbursement policies that provide incentives to help people stay out of the hospital
  Research Questions Measures/Variable Data Sources
1 How have results of the Potentially Preventable Readmissions (PPR) demonstration program informed changes in reimbursement policies that provide incentives to help people stay out of the hospital? Readmission numbers by participating hospital Hospital Data; grantee reports
2 How has the PPR demonstration program improved quality and cost savings at selected facilities? PPR numbers compared with the prior year Hospital Data; grantee reports
3 To what extent are the interventions tested both replicable and sustainable? Assessment of success in implementing activities and reducing PPRs Hospital Data; grantee reports

____________________________________________________

1. Subject to exclusions and exemptions as outlined in the STCs.

September 28, 2012