Desk Audit Report for the State Agency (DAR-SA) Item-by-Item Instruction Guide

DAR-SA Item, Definition, and Outlier Rule/Guidelines / Supplemental Report(s) / Specific Instructions
Bad Data
(This appears above the Survey Name and Number)
It means the Univ. of CO team was unable to process the survey because portions of the survey data were missing. / Bad Data Survey Report in BadDataSurvey.pdf / Review the Bad Data report for information on data concerns.
  • If the survey has critical element responses from only one surveyor, contact QTSO help desk. The most likely cause is that the TC is not marking the survey as being complete and a secondary surveyor is uploading their version of the survey after the TC, thus overwriting the TC’s full survey with Stage 2 responses from a single secondary surveyor.
  • If the survey has no Stage 1 data and no critical element responses in Stage 2, the survey file is basically blank. Contact QTSO help desk.

TC In Audit Table
(This appears above the Survey Name and Number)
It means the Univ. of CO team is unable to match CE responses to specific surveyors. /
  • We are not sure why this occurs. We believe it is a software issue, not something the surveyors have done.
  • Potentially due to a change in TC during the survey. If this is not the case, please contact the Univ. of CO QIS team. We need your help in determining the cause of this issue.
  • Any survey with the TC in the Audit Table will not have items 21 and 22 calculated.

  1. Number of Residents With Incomplete Stage 1 Data

This is the count of residents with incomplete Stage 1 data for Admission and Census Sample residents.
Outlier rules
High outlier only
Greater than or equal to double the national average. / Missing Stage 1 Data Report in MissingStage1Data.pdf / Review the Missing Stage 1 Data report for any survey that had incomplete Stage 1 data. The report shows the specific Stage 1 data item(s) that remained incomplete. The report also displays any notes entered by the TC on the Verification of Stage 1 Data screen.
The DAR-SA report displays the average count of residents with incomplete Stage 1 data for Admission and Census Sample residents across the set of surveys being reviewed.
If concerns are identified, examine the QIS process steps that are associated with Stage 1 data:
●Ensure each resident in the Stage 1 Admission and Census sample has a check mark.
●If an entire Admission Sample resident’s clinical record review is incomplete, during the initial team meeting ensure that the team coordinator determines assignments for Admission Sample residents who are in the facility, but not in the Census Sample.
●If an entire Admission Sample resident’s clinical record review is incomplete, ensure that the team is crossing off the Admission Sample resident as the record is completed on the team copy of the report.
●During the Stage 1 team meetings, ensure that the team coordinator discusses team progress and determines the remaining workload for Stage 1. During the Verification of Stage 1 data, ensure that the TC or the involved surveyor resolves any incomplete Admission or Census Sample resident information by entering the missing Stage 1 responses on the TC’s laptop. (The missing information can be entered on the secondary laptop responsible for the missing data; however, the secondary surveyor then needs to synchronize with the TC.)
●Ensure surveyors finish their Stage 1 work. During the Stage 1 team meetings, each surveyor should discuss his/her progress by verifying that his/her census sample residents are complete and the team also should determine the workload remaining in Stage 1.
●If a resident interview was terminated prior to the completion of the interview, ensure that the surveyor made additional attempts to complete the interview. If an interview remains incomplete, the surveyor should document the reason for the interview’s incomplete status in the Relevant Findings.
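The outlier rule for this item (high outlier only, at a report average greater than or equal to double the national average) can be sketched as follows; the function name is illustrative and not part of the DAR-SA software:

```python
def is_high_outlier_incomplete_stage1(report_avg, national_avg):
    """High outlier only: the average count of residents with incomplete
    Stage 1 data is at least double the national average.
    Illustrative helper; not the official DAR-SA calculation."""
    return report_avg >= 2 * national_avg
```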
  2. Care Areas and Facility Task Triggering Status

Triggering indicator (“T”) for each care area and non-mandatory task. The report average shows the rate at which each care area and non-mandatory facility task was triggered across the set of surveys being reviewed.
Report, State, and National average calculations:
Numerator: Number of surveys on which the care area or non-mandatory facility task was triggered.
Denominator: Number of surveys being reviewed.
Excludes: care areas based exclusively on MDS data (i.e., Behavior and Emotional Status, Hearing, Infections, UTI, and Vision).
Outlier rules
Low Outlier-
If the national average is <26%, then the low outlier is ½ or less of the national average.
If the national average is ≥26% and <61%, then the low outlier is 15 or more points less than the national average.
If the national average is ≥61%, then the low outlier is 20 or more points less than the national average.
High outlier-
If the national average is >11%, then the high outlier is 40 or more points greater than the national average.
If the national average is ≤11%, then the high outlier is quadruple the national average or more.
Note: Social Service has no high outlier and is an exception to the above rule. / None / Identify any high or low outliers for the report and state average compared to the national average.
Identify any 0% rate across the report and state and compare to the national average.
If concerns are identified, examine the QIS process steps that are associated with triggering rates:
●For any care area or non-mandatory facility task with a low/high average triggering rate, refer to the QCLI Dictionary to identify the Stage 1 question(s) and response(s) that cause the care area or facility task to trigger.
Trend this information to determine whether further training is needed to ensure that surveyors:
●Understand the intent of the questions,
●Ask the questions accurately,
●Do not ask leading questions,
●Do not repeat the question to elicit a positive response,
●Ask all of the questions,
●Answer the questions correctly,
●Conduct appropriate observations, and
●Conduct appropriate clinical record reviews.
●Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.
For example, assume the physical restraints care area has a low average triggering rate compared to the national average. The physical restraint care area is triggered from the staff interview and resident observation. Ensure that:
●Surveyors ask the staff interview physical restraint questions exactly as written, including the parenthetical portion of the question;
●Surveyors do not ask leading questions;
●Surveyors ask all of the physical restraint questions, when necessary, based on a prior response;
●Surveyors answer the physical restraint questions correctly based on the staff member’s response;
●Surveyors understand the type of restraints that should be marked during Stage 1, especially for side rails;
●Surveyors understand the exclusion for side rails;
●Surveyors make multiple observations of the resident to determine whether a restraint is in place when the resident is in and out of bed;
●Surveyors answer the physical restraint observation questions correctly; and
●Surveyors understand the intent behind the physical restraint questions for the staff interview and resident observation.
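The tiered low/high outlier rules for triggering rates in this item can be sketched as follows. This is an illustrative reading of the rules, not the official DAR-SA code; Social Service, which has no high outlier, would need to be special-cased by the caller:

```python
def triggering_outlier(report_avg, national_avg):
    """Classify a triggering rate (in percent) against the national average.
    Illustrative sketch of the tiered DAR-SA rules; Social Service has no
    high outlier and is not handled here."""
    # Low-outlier tiers depend on where the national average falls.
    if national_avg < 26:
        low = report_avg <= national_avg / 2
    elif national_avg < 61:
        low = (national_avg - report_avg) >= 15
    else:
        low = (national_avg - report_avg) >= 20
    # High-outlier tiers.
    if national_avg > 11:
        high = (report_avg - national_avg) >= 40
    else:
        high = report_avg >= 4 * national_avg
    if low:
        return "low"
    if high:
        return "high"
    return "none"
```

For example, a 10% triggering rate against a 30% national average falls 20 points below and is a low outlier, while an 80% rate against the same average is 50 points above and is a high outlier.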
  3. Count of QCLIs That Exceeded the Threshold

Count of all QCLIs that exceeded the threshold.
Count of non-MDS based QCLIs that exceeded the threshold per survey.
Outlier rules
Low outlier-
5 or more points less than the national average.
High outlier-
10 or more points greater than the national average. / None / Identify any high or low outliers for the report and state average compared to the national average.
If the report average is comparable to the national average, review the individual surveys to identify how much variability there is in the count of QCLIs that exceed the threshold.
If concerns are identified, examine the QIS process steps that are associated with QCLIs. Ensure surveyors:
●Understand the intent of the questions,
●Ask the questions accurately,
●Do not ask leading questions,
●Do not repeat the question to elicit a positive response,
●Ask all of the questions,
●Answer the questions correctly,
●Conduct appropriate observations, and
●Conduct appropriate clinical record reviews.
Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.
Ensure the team coordinator documents concerns during team meetings.
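The point-difference outlier rule for this item (low at 5 or more points below, high at 10 or more points above the national average) amounts to the following; the helper is illustrative only:

```python
def qcli_count_outlier(report_avg, national_avg):
    """Item 3 outlier check: low if 5 or more points below, high if 10 or
    more points above the national average. Illustrative helper only."""
    diff = report_avg - national_avg
    if diff <= -5:
        return "low"
    if diff >= 10:
        return "high"
    return "none"
```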
4. Distribution of QCLIs That Exceeded the Threshold by Data Source
If only two QCLIs exceed the threshold, one from the MDS and one from resident observation, the rates for MDS and observation would each be 50% and the rates for all other data sources would be 0%. The rates for a single survey should sum to 100%.
Report, State, and National average calculations: Average rate of QCLIs that exceeded the threshold by data source across the set of surveys being reviewed.
The seven (7) data sources include: MDS, resident observation (RO), resident interview (RI), family interview (FI), staff interview (SI), census clinical record review (CR), and admission clinical record review (AR).

Rate (one for each data source) =

Numerator: Number of QCLIs from the data source that exceeded the threshold.
Denominator: Total number of QCLIs that exceeded the threshold on the survey.
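The worked example above (two QCLIs exceeding the threshold, one from the MDS and one from resident observation, giving 50% each) can be reproduced with a short sketch. The input shape, a list of data-source codes, is an assumption for illustration and not the DAR-SA file format:

```python
from collections import Counter

def qcli_distribution(exceeding_sources):
    """Rate per data source = QCLIs from that source that exceeded the
    threshold, divided by all QCLIs that exceeded the threshold on the
    survey, as a percentage. Input is a list of data-source codes
    (illustrative shape only)."""
    sources = ("MDS", "RO", "RI", "FI", "SI", "CR", "AR")
    counts = Counter(exceeding_sources)
    total = len(exceeding_sources)
    return {s: (100.0 * counts[s] / total if total else 0.0) for s in sources}
```

For example, `qcli_distribution(["MDS", "RO"])` yields 50% for MDS and RO, 0% for the other five sources, and the rates sum to 100%.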
Outlier rules
Low outlier-
5 or more points less than the national average.
High outlier-
10 or more points greater than the national average. / None / Review each individual survey to ensure there isn’t a 0% rate for the MDS. The MDS should not be 0% since the MDS QCLIs are automatically calculated when the survey shell is exported. If the rate is 0% for the MDS (for Item 4 and Item 23), the MDS QCLIs were not calculated and the QTSO help desk should be notified if not already contacted during the survey.
Identify any high or low outliers for the report and state average compared to the national average.
Review the individual surveys to identify how often QCLIs did not exceed the threshold (0%) for each data source (e.g., resident observation QCLIs do not exceed the threshold on many of the surveys included in the report).
If concerns are identified, examine the QIS process steps that are associated with QCLIs. Ensure surveyors:
●Understand the intent of the questions,
●Ask the questions accurately,
●Do not ask leading questions,
●Do not repeat the question to elicit a positive response,
●Ask all of the questions,
●Answer the questions correctly,
●Document the correct response,
●Conduct appropriate observations, and
●Conduct appropriate clinical record reviews.
Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.
Ensure the team coordinator documents concerns during team meetings.
For example, if a team continues to have a low average rate of QCLIs that exceed the threshold for resident observations, the trainer may need to follow up to ensure surveyors are performing multiple, quality observations during Stage 1.
5. “Negative” Responses by Data Source
A “negative” response is a response that includes a resident in the “QCLI Criteria Met” category of the QCLI calculation results, or is part of a set of questions that includes a resident in the “QCLI Criteria Met” category.

Rate (one for each data source) =

Numerator: Number of negative responses by data source for each surveyor.
Denominator: Number of responses by data source for each surveyor.
Outlier rules
Low outlier-
Half or less than the national average
(Except for FI and CR: a low outlier is 0%).
High outlier-
Double or more than the national average
(Except for SI: a high outlier is ≥20%). / None / Identify any low or high outliers by data source for the report and state average compared to the national average.
Review the surveyor rates to determine whether any surveyor has a low or high rate of negative responses by data source compared to the national average.
Survey-level: Counts are made within a survey and across surveyors.
Surveyor-level: Counts are made across surveys for individual surveyors.
Report-level: Survey-level rates are averaged across surveys.
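The outlier rule for negative response rates, with its FI/CR and SI exceptions, can be sketched as follows (rates in percent; illustrative only, not the official DAR-SA calculation):

```python
def negative_response_outlier(rate, national_avg, source):
    """Item 5 outlier check by data source.
    Low: half or less of the national average (FI and CR: low only at 0%).
    High: double or more of the national average (SI: high at 20% or more).
    Illustrative sketch only."""
    if source in ("FI", "CR"):
        low = rate == 0.0
    else:
        low = rate <= national_avg / 2
    if source == "SI":
        high = rate >= 20.0
    else:
        high = rate >= 2 * national_avg
    if low:
        return "low"
    if high:
        return "high"
    return "none"
```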
Determine whether a surveyor has an unusual occurrence of 0% for Stage 1 negative responses or has a considerably lower/higher average rate of negative responses compared with the state or national average for each data source. If concerns are identified, examine QIS process steps that are associated with negative response rates. Ensure surveyors:
●Understand the intent of the questions,
●Ask the questions accurately,
●Do not ask leading questions,
●Do not repeat the question to elicit a positive response,
●Ask all of the questions,
●Answer the questions correctly,
●Conduct appropriate observations, and
●Conduct appropriate clinical record reviews.
Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.
Ensure the team coordinator documents concerns during team meetings.
6. Census Sample Interview Rate
Count of resident interviews conducted; count of predicted interviews on the Census Sample report per the Brief Interview for Mental Status (BIMS) score; and the interview rate.
Interview Rate =
Numerator: Count of resident interviews conducted.
Denominator: Count of predicted interviews on the Census Sample report per the BIMS score calculations.
Exclusion: Any resident on the Census Sample report without a BIMS score (e.g., any newly admitted resident added to the census sample during resident reconciliation).
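The interview rate is a simple ratio; note that it can exceed 100% when surveyors conduct more interviews than the BIMS scores predicted. A sketch, with residents lacking a BIMS score assumed already excluded from the predicted count:

```python
def interview_rate(conducted, predicted):
    """Item 6: resident interviews conducted divided by interviews predicted
    from BIMS scores, as a percentage. Residents without a BIMS score are
    assumed already excluded from `predicted`. Illustrative helper only."""
    if predicted == 0:
        return 0.0
    return 100.0 * conducted / predicted
```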
Outlier rules
Low outlier-
A low outlier is ≤77%.
High outlier-
A high outlier is ≥121%. / Resident Interview Rates Report in CompareInterviewStatusvsConducted.pdf / Identify any low or high outliers for the DAR-SA report average and state average compared to the national average.
Review the individual surveyor rates to identify any surveyor with a low or high rate of actual interviews.
Include as part of the analysis the consideration for survey participation. For all surveyor-level items, consider the number of surveys the surveyor participated in and the number of opportunities the surveyor had to conduct the task. For example, a surveyor may have an interview rate of only 33%; however, the surveyor was only on one survey and had only three interviewable residents on that survey.
Refer to the “Resident Interview Rates Report” to determine how many residents were pre-determined by the BIMS score to be interviewable and who the surveyor actually interviewed.
If concerns are identified, examine the QIS process steps that are associated with interview rates:
●Ensure surveyors screen all census sample residents to determine the resident’s actual interview status.
●Ensure surveyors prioritize the resident interviews and resident observations before spending time conducting the clinical record reviews.
●The trainer may need to address issues related to surveyor organizational skills or assist the surveyor in developing an effective system to interview all of the applicable residents.
●Problem solve with surveyors to identify effective strategies for conducting interviews with short-stay residents within the given time frame.
7.1 Census Sample Refusal Rate
Count of resident interview refusal responses, count of interviewable residents (surveyor determined), and the refusal rate.
Refusal Rate =
Numerator: Count of resident refused responses.
Denominator: Count of interviewable residents as determined by the surveyor.
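The refusal rate and its high-outlier check (5 or more points above the national average) can be sketched as follows; the helpers are illustrative only:

```python
def refusal_rate(refused, interviewable):
    """Item 7.1: refused responses divided by surveyor-determined
    interviewable residents, as a percentage. Illustrative helper only."""
    if interviewable == 0:
        return 0.0
    return 100.0 * refused / interviewable

def refusal_high_outlier(rate, national_avg):
    """High outlier only: 5 or more points greater than the national average."""
    return (rate - national_avg) >= 5
```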
Outlier rule
High Outlier only-
5 or more points greater than the national average. / Resident Interview Rates Report in CompareInterviewStatusVSConducted.pdf / Identify any high outliers for the DAR-SA report average and state average compared to the national average.
Review the individual surveyor rates to identify any surveyor with a high refusal rate compared to the national average.
Include as part of the analysis the consideration for survey participation. For all surveyor-level items, consider the number of surveys the surveyor participated in and the number of opportunities the surveyor had to conduct the task.
Refer to the “Resident Interview Rates” report to determine what residents the surveyor marked as unavailable.
Survey-level: Counts are made within a survey and across surveyors.
Surveyor-level: Counts are made across surveys for individual surveyors.
Report-level: Survey-level rates are averaged across surveys.
If concerns are identified, try to determine the cause of the extreme refusal rate:
●Determine whether the surveyor can identify why he/she has a high refusal rate.
●The trainer may need to provide guidance to the surveyor on developing resident rapport or may need to observe the surveyor’s interview technique to determine why he/she has a high refusal rate.