PIP 101: Introduction to Performance Improvement Projects
Outline
Overview of the PIP process
PIP Summary Form review
PIP scoring methodology
What is a PIP?
Quality improvement project.
Purpose is to assess and improve processes and outcomes of care.
Time frame is minimum of 3 years from beginning to termination (according to CMS).
The PIP process provides an opportunity to:
Identify and measure a targeted area (clinical or nonclinical).
Analyze the results.
Implement interventions for improvement.
HSAG’s role is to:
Validate PIPs using the CMS protocol, Validating Performance Improvement Projects: A Protocol for Use in Conducting Medicaid External Quality Review Activities, Final Protocol, Version 1.0.
Evaluate the likely validity and reliability of the study's findings.
Provide PIP Validation Reports.
Why do we validate PIPs?
The Balanced Budget Act of 1997 (BBA), Public Law 105-33, requires that states conduct an annual evaluation of managed care organizations (MCOs) or prepaid inpatient health plans (PIHPs) to determine the MCOs’ and PIHPs’ compliance with federal regulations and quality improvement standards. According to the BBA, the quality of health care delivered to Medicaid members in the MCOs and PIHPs must be tracked, analyzed, and reported annually. As one of the mandatory external quality review activities under the BBA, the states are required to validate PIPs. To meet this validation requirement, the states contract with Health Services Advisory Group, Inc. (HSAG), as the external quality review organization. The BBA requires HSAG to assess each health plan’s “strengths and weaknesses with respect to the quality, timeliness, and access to health care services furnished to Medicaid recipients” (42 Code of Federal Regulations [CFR] 438.364 [a][2]).
The purpose of a health care quality PIP is to assess and improve processes, and thereby outcomes, of care. For such projects to achieve real improvements in care, and for interested parties to have confidence in the reported improvements, PIPs must be designed, conducted, and reported in a methodologically sound manner.
PIP Completion Instructions
Are used to ensure each HSAG evaluation element has been addressed.
Help to simplify PIP submissions.
Promote efficiency in preparing PIP documentation.
10 PIP Activities
CMS rationale
HSAG evaluation elements
Activity I: Select the Study Topics
CMS Rationale:
The study topics should:
Impact a significant portion of the members.
Reflect Medicaid enrollment in terms of demographic characteristics, prevalence of disease, and the potential consequences (risks) of the disease.
Support the goal to improve processes and outcomes of health care.
Be specified by the State or on the basis of Medicaid member input.
HSAG Evaluation Elements:
The study topic:
Reflects high-volume or high-risk conditions.
Is selected following collection and analysis of data (including plan-specific data).
Addresses a broad spectrum of care and services.
Includes all eligible populations that meet the study criteria.
Includes members with special health care needs. If any population was excluded, explain why.
Has the potential to affect member health, functional status, or satisfaction.
Activity I: Study Topic Examples
Improving Diabetic Screening.
Improving Well-Child Visits in the First 15 Months of Life.
Member or Provider Satisfaction.
Access to Care and Services.
Improving Blood Lead Screening.
Activity II: Define the Study Questions
CMS Rationale:
Stating the questions helps maintain the focus of the PIP and sets the framework for data collection, analysis, and interpretation.
HSAG Evaluation Elements:
The study question:
States the problem to be studied in simple terms.
Is answerable.
In general, the study question should illustrate this point: Does doing X result in an increase or decrease in Y?
Example: Do targeted interventions increase the percentage of children who receive 6 or more well-child visits in the first 15 months of life?
Activity III: Select the Study Indicators
CMS Rationale:
The study indicators:
Represent a quantitative or qualitative characteristic (a variable).
Represent discrete events (member has or has not experienced Event X).
Are appropriate for the study topic.
Are objective, clearly and unambiguously defined.
HSAG Evaluation Elements:
The study indicators:
Are well-defined, objective, and measurable.
Are based on practice guidelines, with sources identified.
Allow for the study question to be answered.
Align with the study question.
Measure changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
Have available data that can be collected on each indicator.
Are nationally recognized measures such as HEDIS, when appropriate.
Include the basis on which each indicator was adopted, if internally developed.
Activity III: Study Indicator Example: The percentage of members turning 15 months of age during the measurement year who received 6 or more well-child visits.
Activity IV: Identify the Study Population
CMS Rationale:
The study population:
Represents the entire Medicaid-eligible enrolled population.
Allows systemwide measurement.
Allows the implementation of improvement efforts to which the study indicators apply.
HSAG Evaluation Elements:
The method for identifying the eligible population:
Is accurately and completely defined.
Includes requirements for the length of a member’s enrollment in the plan.
Captures all members to whom the study questions apply.
Example: All members who turned 15 months of age during the measurement year who were continuously enrolled from 31 days through 15 months of age with no more than one gap in enrollment of up to 45 days during the continuous enrollment period.
CPT codes: 99381, 99382, 99391, 99432
ICD-9 codes: V20.2, V70.0, V70.3, V70.5, V70.6, V70.8, V70.9
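The continuous-enrollment rule in the example above (enrolled from 31 days through 15 months of age, with no more than one gap of up to 45 days) can be sketched as a small eligibility check. This is an illustrative simplification, not HSAG or HEDIS logic: ages are expressed as day offsets, 15 months is approximated as day 456, the function requires enrollment at both edges of the window, and all names are hypothetical.

```python
def continuously_enrolled(segments, window_start=31, window_end=456,
                          max_gap_days=45):
    """Sketch of the example's continuous-enrollment rule: enrolled for
    the whole window (ages in days), allowing at most one gap of up to
    max_gap_days.  segments are sorted, non-overlapping
    (start_day, end_day) pairs, inclusive on both ends."""
    # Clip each enrollment segment to the measurement window.
    clipped = [(max(s, window_start), min(e, window_end))
               for s, e in segments if e >= window_start and s <= window_end]
    # Simplification: the member must be enrolled on both window edges.
    if not clipped or clipped[0][0] > window_start or clipped[-1][1] < window_end:
        return False
    # Days of non-enrollment between consecutive segments.
    gaps = [clipped[i + 1][0] - clipped[i][1] - 1
            for i in range(len(clipped) - 1)]
    gaps = [g for g in gaps if g > 0]
    return len(gaps) <= 1 and all(g <= max_gap_days for g in gaps)
```

For example, a member with one 30-day lapse in coverage would qualify, while a member with a 60-day lapse, or two separate lapses, would not.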
Activity V: Valid Sampling Techniques*
CMS Rationale:
Sample size impacts the level of statistical confidence in the study.
Statistical confidence is a numerical statement of the probable degree of certainty or accuracy of an estimate.
The sample size is large enough to detect improvement in indicators between measurement periods.
The sample is representative of the entire eligible population.
HSAG Evaluation Elements:
Sampling techniques:
Consider and specify the true or estimated frequency of occurrence.
Identify the sample size (or use the entire population).
Specify the confidence interval to be used (or use the entire population).
Specify the acceptable margin of error.
Ensure a representative sample of the entire population.
Are in accordance with generally accepted principles of research design and statistical analysis.
*Activity V is only scored if sampling techniques were used. If the entire population was used, document this in Activity V.
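As an illustration of how the confidence level and margin of error listed above drive sample size, the standard formula for estimating a proportion is n = z²p(1−p)/e², with an optional finite-population correction for small eligible populations. A minimal sketch, assuming conventional values (z = 1.96 for 95 percent confidence, p = 0.5 when the true frequency is unknown); these are textbook defaults, not HSAG requirements:

```python
import math

def sample_size(p=0.5, margin_of_error=0.05, z=1.96, population=None):
    """Minimum sample size to estimate a proportion p within the given
    margin of error, at the confidence level implied by z.
    p = 0.5 is the most conservative choice when the true or estimated
    frequency of occurrence is unknown."""
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        # Finite-population correction shrinks n for small populations.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)
```

With the defaults this yields 385 records; for an eligible population of 1,000, the correction reduces it to 278.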
Activity VI: Reliably Collect Data
CMS Rationale:
Procedures used to collect data for a given PIP must ensure that the data collected on the PIP indicators are valid and reliable.
Administrative data collection.
Manual data collection.
HSAG Evaluation Elements:
The data collection techniques:
Provide clearly defined data elements to be collected.
Clearly specify sources of data.
Provide for a clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected.
Provide a timeline for the collection of baseline and remeasurement data.
If data were collected manually, the PIP should:
Provide qualifications, training, and experience of manual data collection staff members.
Ensure consistent and accurate collection of data according to indicator specifications.
Support inter-rater reliability.
Provide clear and concise written instructions that include an overview of the study.
For administrative data collection, the PIP should include:
An administrative data collection algorithm, data flow chart, or narrative description that outlines the steps in the production of the indicators.
An estimated degree of administrative data completeness and supporting documentation for how the percentage was determined.
Activity VII: Implement Intervention and Improvement Strategies
CMS Rationale:
An intervention is designed to change behavior at an institutional, practitioner, or member level.
An intervention increases the likelihood of measurable change.
HSAG Evaluation Elements:
Planned/implemented strategies for improvement are:
Related to causes/barriers identified through data analysis and quality improvement (QI) processes.
System changes that are likely to induce permanent change.
Revised if original interventions are not successful.
Standardized and monitored if interventions are successful.
Planned/implemented strategies for improvement should be:
Realistic, feasible, and clearly defined.
Implemented in a reasonable amount of time to be effective.
Examples of Improvement Strategies:
Member: Reminder postcard mailings to children due for a well-child visit, which continue monthly based on the child’s date of birth.
Member: Development of an on-hold message addressing the importance of well-child visits.
Provider: Face-to-face meetings with low-reporting providers to provide educational materials.
Causal/Barrier Analysis
Methods:
Quality improvement committee.
Develop an internal task force.
Tools:
Fishbone diagram
Process mapping
Barrier/intervention table
Fishbone Diagram
Activity VIII: Analyze Data and Interpret Study Results
CMS Rationale:
Data analysis begins with examining performance on the selected clinical or nonclinical indicators.
Data analysis and interpretation is initiated using statistical analysis techniques defined in the data analysis plan.
HSAG Evaluation Elements:
The data analysis:
Is conducted according to the data analysis plan in the study design.
Allows for generalization of the results to the study population (if sample selected).
Identifies factors that threaten the internal or external validity of findings.
Includes an interpretation of findings.
Is presented in a way that provides accurate, clear, and easily understood information.
Identifies initial measurement and remeasurement of the study indicators.
Identifies statistical differences between initial measurement and remeasurement.
Identifies factors that affect the ability to compare initial measurement with remeasurement.
Includes the extent to which the study was successful.
For PIPs that provide baseline data, Evaluation Elements 1–5 will be scored in Activity VIII of the PIP Validation Tool.
Activity IX: Assess for Real Improvement*
CMS Rationale:
Change represents “real” change.
Results show the probability that improvement is true improvement.
Results show the degree to which change is statistically significant.
HSAG Evaluation Elements:
The remeasurement methodology is the same as the baseline methodology.
There is documented improvement in processes or outcomes of care.
The improvement appears to be the result of intervention(s).
There is statistical evidence that observed improvement is true improvement.
*Activity IX will be scored when the PIP has progressed to Year 2 (Remeasurement 1).
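One standard way to show that an observed improvement between baseline and remeasurement is statistically significant is a two-proportion z-test. This technique is a common choice, not one prescribed by the CMS protocol, and all numbers below are illustrative:

```python
import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Two-proportion z-test for a change in an indicator rate between
    baseline and remeasurement, using the pooled standard error."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Illustrative data: 300 of 500 members met the indicator at baseline,
# 350 of 500 at Remeasurement 1.
z = two_proportion_z(300, 500, 350, 500)
significant = abs(z) > 1.96  # two-sided test at the 0.05 level
```

Here z is roughly 3.3, well beyond the 1.96 cutoff, so the 10-percentage-point improvement would be statistically significant at the 0.05 level.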
Activity X: Assess for Sustained Improvement*
CMS Rationale:
Change results from modifications in the processes of health care delivery.
If real change has occurred, the project should be able to sustain improvement.
HSAG Evaluation Elements:
Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.
*Activity X is not scored until the PIP has reported baseline and at least two annual remeasurements of data (Year 3).
PIP Scoring Methodology
HSAG Evaluation Tool
13 Critical Elements
53 Evaluation Elements (including the Critical Elements)
Overall PIP Score
Percentage Score: Calculated by dividing the total Met by the sum of the total Met, Partially Met, and Not Met.
Percentage Score of Critical Elements: Calculated by dividing the total critical elements Met by the sum of the critical elements Met, Partially Met, and Not Met.
Validation Status:
Met
Partially Met
Not Met
For a Met validation status: All critical elements were Met, and 80 percent to 100 percent of all elements were Met across all activities.
For a Partially Met validation status: All critical elements were Met and 60 percent to 79 percent of all elements were Met across all activities; or one or more critical elements were Partially Met.
For a Not Met validation status: All critical elements were Met and less than 60 percent of all elements were Met across all activities; or one or more critical elements were Not Met.
Not Applicable (NA) elements (including critical elements) were removed from all scoring.
Not Assessed elements (including critical elements) were removed from all scoring.
A Point of Clarification is used when documentation for an evaluation element includes the basic components to meet requirements for the evaluation element (as described in the PIP narrative), but enhanced documentation would demonstrate a stronger understanding of CMS protocols.
Example 1:
Met = 43, Partially Met = 2, Not Met = 0, NA = 8, and all critical elements were Met. The plan receives an overall Met status, indicating the PIP is valid. The score for the plan is calculated as 43/45 = 95.6 percent.
Example 2:
Met = 52, Partially Met = 0, Not Met = 1, NA = 0, and one critical element was Not Met. The plan receives an overall Not Met status and the PIP is not valid.
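The scoring arithmetic above can be sketched as a small function. This is an illustrative reading of the stated rules, not HSAG's actual validation tool; the names are hypothetical, and NA and Not Assessed elements are assumed to be excluded from the counts before calling it:

```python
def pip_validation(met, partially_met, not_met,
                   critical_partially_met=0, critical_not_met=0):
    """Sketch of the overall PIP score and validation status.
    Counts exclude NA and Not Assessed elements, which are removed
    from all scoring."""
    # Percentage score: Met divided by (Met + Partially Met + Not Met).
    pct = 100.0 * met / (met + partially_met + not_met)
    all_critical_met = critical_partially_met == 0 and critical_not_met == 0
    if critical_not_met > 0 or (all_critical_met and pct < 60):
        status = "Not Met"
    elif critical_partially_met > 0 or pct < 80:
        status = "Partially Met"
    else:
        status = "Met"
    return status, round(pct, 1)
```

Applied to the worked examples: Example 1 (43 Met, 2 Partially Met, all critical elements Met) yields a Met status at 95.6 percent, and Example 2 (one critical element Not Met) yields Not Met regardless of its percentage score.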
PIP Tips
Complete the demographic page before submission.
Label ALL attachments and reference them in the body of the PIP study. Make sure to submit attachments with PIP submission.
HSAG does not require personal health information to be submitted. Submit only aggregate results.
Ensure all HSAG evaluation elements have been addressed in the PIP Summary Form.
Notify HSAG when the PIP documents are uploaded to the secure FTP site and state the number of documents uploaded.
This is a desk audit: document, document, document!
FTP Site Document
Health Services Advisory Group, Inc. PIP-Val_Introduction_V1.0_0411
HEDIS is a registered trademark of the National Committee for Quality Assurance (NCQA).