Desk Audit Report for the State Agency (DAR-SA) Item-by-Item Instruction Guide
University of Colorado Denver
Revised 11/19/13


Instructions:

This document defines each data item and provides an overview and instructions for data item use on the DAR-SA.

Review the supplemental report(s) for any item that lists one. Enter any notes on the DAR-SA in the space provided under each data item.

These notes can be used for activities such as QI and training.

Bad Data

(This appears above the Survey Name and Number)

It means the Univ. of CO team was unable to process the survey because portions of the survey data were missing.

Supplemental Report(s): Bad Data Survey Report in BadDataSurvey.pdf

Specific Instructions

Review the Bad Data report for information on data concerns.

If the survey has critical element responses from only one surveyor, contact the QTSO help desk. The most likely cause is that the TC is not marking the survey as complete and a secondary surveyor is uploading his/her version of the survey after the TC, thus overwriting the TC’s full survey with Stage 2 responses from a single secondary surveyor.

If the survey has no Stage 1 data and no critical element responses in Stage 2, the survey file is basically blank. Contact the QTSO help desk. We are not sure why this occurs; we believe it is a software issue, not something the surveyors have done.

TC In Audit Table

(This appears above the Survey Name and Number)

It means the Univ. of CO team is unable to match CE responses to specific surveyors.

Specific Instructions

This is potentially due to a change in TC during the survey. If this is not the case, please contact the Univ. of CO QIS team; we need your help in determining the cause of this issue.

Any survey with the TC in the Audit Table will not have items 21 and 22 calculated.


1. Number of Residents With Incomplete Stage 1 Data

This is the count of residents with incomplete Stage 1 data for Admission and Census Sample residents.

Outlier rules

High outlier only-
Greater than or equal to double the national average.

Supplemental Report(s): Missing Stage 1 Data Report in MissingStage1Data.pdf

Specific Instructions

Review the Missing Stage 1 Data report for any survey that had incomplete Stage 1 data. The report shows the specific Stage 1 data item(s) that remained incomplete. The report also displays any notes entered by the TC on the Verification of Stage 1 Data screen.

The DAR-SA report displays the average count of residents with incomplete Stage 1 data for Admission and Census Sample residents across the set of surveys being reviewed.

If concerns are identified, examine the QIS process steps that are associated with Stage 1 data:

● Ensure each resident in the Stage 1 Admission and Census Sample has a check mark.

● If an entire Admission Sample resident’s clinical record review is incomplete, during the initial team meeting ensure that the team coordinator determines assignments for Admission Sample residents who are in the facility, but not in the Census Sample.

● If an entire Admission Sample resident’s clinical record review is incomplete, ensure that the team is crossing off the Admission Sample resident as the record is completed on the team copy of the report.

● During the Stage 1 team meetings, ensure that the team coordinator discusses team progress and determines the remaining workload for Stage 1. During the Verification of Stage 1 Data, ensure that the TC or the involved surveyor resolves any incomplete Admission or Census Sample resident information by entering the missing Stage 1 responses on the TC’s laptop. (The missing information can be entered on the secondary laptop that is responsible for the missing data; however, the secondary then needs to synchronize with the TC.)

● Ensure surveyors finish their Stage 1 work. During the Stage 1 team meetings, each surveyor should discuss his/her progress by verifying that his/her census sample residents are complete and the team also should determine the workload remaining in Stage 1.

● If a resident interview was terminated prior to the completion of the interview, ensure that the surveyor made additional attempts to complete the interview. If an interview remains incomplete, the surveyor should document the reason for the interview’s incomplete status in the Relevant Findings.


2. Care Areas and Facility Task Triggering Status

Triggering indicator (“T”) for each care area and non-mandatory task. The report average shows the rate at which each care area and non-mandatory facility task was triggered across the set of surveys being reviewed.

Report, State, and National average calculations:

Numerator: Number of surveys on which the care area or non-mandatory facility task was triggered.
Denominator: Number of surveys being reviewed.

Excludes: care areas based exclusively on MDS data (i.e., Behavior and Emotional Status, Hearing, Infections, UTI, and Vision).

Outlier rules

Low outlier-
If the national average is <26%, then the low outlier is ½ or less than the national average.
If the national average is ≥26% and <61%, then the low outlier is 15 or more points less than the national average.
If the national average is ≥61%, then the low outlier is 20 or more points less than the national average.

High outlier-
If the national average is >11%, then the high outlier is 40 points or more than the national average.
If the national average is ≤11%, then the high outlier is quadruple the national average.

Note: Social Service has no high outlier and is an exception to the above rule.

Supplemental Report(s): None
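For illustration, the tiered rule above can be read as a simple classification. The following is a minimal sketch, assuming rates are expressed as percentages from 0 to 100; the function name and example values are hypothetical and are not part of the DAR-SA.

```python
# Minimal sketch of the Item 2 outlier rules above; percentages run from 0 to 100.
# Function and variable names are illustrative only.

def triggering_outlier(report_avg, national_avg, care_area=None):
    """Classify a report/state triggering rate against the national average."""
    # Low outlier tiers
    if national_avg < 26:
        low = report_avg <= national_avg / 2      # half or less of the national average
    elif national_avg < 61:
        low = report_avg <= national_avg - 15     # 15 or more points below
    else:
        low = report_avg <= national_avg - 20     # 20 or more points below

    # High outlier tiers (Social Service has no high outlier)
    if care_area == "Social Service":
        high = False
    elif national_avg > 11:
        high = report_avg >= national_avg + 40    # 40 or more points above
    else:
        high = report_avg >= national_avg * 4     # quadruple the national average

    if low:
        return "low outlier"
    if high:
        return "high outlier"
    return "within expected range"

# Example: a care area triggered on 2 of 10 surveys (20%) against a 55% national average.
print(triggering_outlier(report_avg=20, national_avg=55))   # -> "low outlier"
```

The same tiered thresholds reappear for the citation rates in Items 12 and 19.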

Specific Instructions

Identify any high or low outliers for the report and state average compared to the national average.

Identify any 0% rate across the report and state and compare to the national average.

If concerns are identified, examine the QIS process steps that are associated with triggering rates:

● For any care area or non-mandatory facility task with a low/high average triggering rate, refer to the QCLI Dictionary to identify the Stage 1 question(s) and response(s) that cause the care area or facility task to trigger.

Trend this information to determine whether further training is needed to ensure that surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly,

● Conduct appropriate observations, and

● Conduct appropriate clinical record reviews.

● Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.

For example, assume the physical restraints care area has a low average triggering rate compared to the national average. The physical restraint care area is triggered from the staff interview and resident observation. Ensure that:

● Surveyors ask the staff interview physical restraint questions exactly as written, including the parenthetical portion of the question;

● Surveyors do not ask leading questions;

● Surveyors ask all of the physical restraint questions, when necessary, based on a prior response;

● Surveyors answer the physical restraint questions correctly based on the staff member’s response;

● Surveyors understand the type of restraints that should be marked during Stage 1, especially for side rails;

● Surveyors understand the exclusion for side rails;

● Surveyors make multiple observations of the resident to determine whether a restraint is in place when the resident is in and out of bed;

● Surveyors answer the physical restraint observation questions correctly; and

● Surveyors understand the intent behind the physical restraint questions for the staff interview and resident observation.



3. Count of QCLIs That Exceeded the Threshold

Count of all QCLIs that exceeded the threshold.

Count of non-MDS based QCLIs that exceeded the threshold per survey.

Outlier rules

Low outlier-
5 or more points less than the national average.

High outlier-
10 or more points greater than the national average.

Supplemental Report(s): None

Specific Instructions

Identify any high or low outliers for the report and state average compared to the national average.

If the report average is comparable to the national average, review the individual surveys to identify how much variability there is in the count of QCLIs that exceed the threshold.

If concerns are identified, examine the QIS process steps that are associated with QCLIs. Ensure surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly,

● Conduct appropriate observations, and

● Conduct appropriate clinical record reviews.

Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.

Ensure the team coordinator documents concerns during team meetings.


4. Distribution of QCLIs That Exceeded the Threshold By Data Source

If only two QCLIs exceed the threshold, one from MDS and one from Resident Observation, the rates for MDS and observation would each be 50% and the rates for all other data sources would be 0%. The rates for a single survey should sum to 100%.

The seven (7) data sources include: MDS, resident observation (RO), resident interview (RI), family interview (FI), staff interview (SI), census clinical record review (CR), and admission clinical record review (AR).

Report, State, and National average calculations: Average rate of QCLIs that exceeded the threshold by data source across the set of surveys being reviewed.

Rate (one for each data source) =
Numerator: Number of QCLIs from the data source that exceeded the threshold.
Denominator: Total number of QCLIs that exceeded the threshold on the survey.

Outlier rules

Low outlier-
5 or more points less than the national average.

High outlier-
10 or more points greater than the national average.

Supplemental Report(s): None

Specific Instructions

Review each individual survey to ensure there isn’t a 0% rate for the MDS. The MDS should not be 0%, since the MDS QCLIs are automatically calculated when the survey shell is exported. If the rate is 0% for the MDS (for Item 4 and Item 23), the MDS QCLIs were not calculated and the QTSO help desk should be notified if not already contacted during the survey.

Identify any high or low outliers for the report and state average compared to the national average.

Review the individual surveys to identify how often QCLIs did not exceed the threshold (0%) for each data source (e.g., resident observation QCLIs do not exceed the threshold on many of the surveys included in the report).

If concerns are identified, examine the QIS process steps that are associated with QCLIs. Ensure surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly,

● Document the correct response,

● Conduct appropriate observations, and

● Conduct appropriate clinical record reviews.

Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.

Ensure the team coordinator documents concerns during team meetings.

For example, if a team continues to have a low average rate of QCLIs that exceed the threshold for resident observations, the trainer may need to follow up to ensure surveyors are performing multiple, quality observations during Stage 1.
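As a worked illustration of the rate definition above, the following minimal sketch computes the per-data-source distribution for one survey; the counts are hypothetical and invented for the example.

```python
# Minimal sketch of the Item 4 rate calculation for a single survey.
# The counts below are hypothetical; keys follow the seven data sources
# named above (MDS, RO, RI, FI, SI, CR, AR).

exceeded_by_source = {"MDS": 4, "RO": 3, "RI": 2, "FI": 0, "SI": 1, "CR": 2, "AR": 0}

total_exceeded = sum(exceeded_by_source.values())   # all QCLIs that exceeded the threshold

if total_exceeded:
    rates = {src: 100 * n / total_exceeded for src, n in exceeded_by_source.items()}

    # The rates for a single survey should sum to 100%.
    assert abs(sum(rates.values()) - 100) < 1e-9

    # A 0% MDS rate signals that the MDS QCLIs were never calculated (see above).
    if rates["MDS"] == 0:
        print("MDS rate is 0% -- contact the QTSO help desk")

    print({src: round(r, 1) for src, r in rates.items()})
```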


5. “Negative” Responses by Data Source

A “negative” response is a response that includes a resident in the “QCLI Criteria Met” category of the QCLI calculation results, or is part of a set of questions that includes a resident in the “QCLI Criteria Met” category.

Rate (one for each data source) =
Numerator: Number of negative responses by data source for each surveyor.
Denominator: Number of responses by data source for each surveyor.

Outlier rules

Low outlier-
Half or less than the national average (except for FI and CR: a low outlier is 0%).

High outlier-
Double or more than the national average (except for SI: a high outlier is ≥20%).

Supplemental Report(s): None

Specific Instructions

Identify any low or high outliers by data source for the report and state average compared to the national average.

Review the surveyor rates to determine whether any surveyor has a low or high rate of negative responses by data source compared to the national average.

Survey-level: Counts are made within a survey and across surveyors.

Surveyor-level: Counts are made across surveys for individual surveyors.

Report-level: Survey-level rates are averaged across surveys.
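The three levels above differ only in how the counts are pooled. A minimal sketch, using hypothetical negative-response counts for one data source; record layout and values are illustrative only.

```python
# Minimal sketch of the three aggregation levels defined above.
# Each hypothetical record is (survey_id, surveyor_id, negative_responses, total_responses).

records = [
    ("S1", "A", 12, 80), ("S1", "B", 5, 60),
    ("S2", "A", 9, 70), ("S2", "C", 14, 90),
]

def rate(neg, total):
    return 100 * neg / total if total else 0.0

# Survey-level: counts are pooled within a survey, across surveyors.
survey_counts = {}
for survey, _, neg, total in records:
    n, t = survey_counts.get(survey, (0, 0))
    survey_counts[survey] = (n + neg, t + total)
survey_rates = {s: rate(n, t) for s, (n, t) in survey_counts.items()}

# Surveyor-level: counts are pooled across surveys for each individual surveyor.
surveyor_counts = {}
for _, surveyor, neg, total in records:
    n, t = surveyor_counts.get(surveyor, (0, 0))
    surveyor_counts[surveyor] = (n + neg, t + total)
surveyor_rates = {s: rate(n, t) for s, (n, t) in surveyor_counts.items()}

# Report-level: the survey-level rates are averaged across surveys.
report_rate = sum(survey_rates.values()) / len(survey_rates)

print(survey_rates, surveyor_rates, round(report_rate, 1))
```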

Determine whether a surveyor has an unusual occurrence of 0% for Stage 1 negative responses or has a considerably lower/higher average rate of negative responses compared with the state or national average for each data source. If concerns are identified, examine QIS process steps that are associated with negative response rates. Ensure surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly,

● Conduct appropriate observations, and

● Conduct appropriate clinical record reviews.

Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.

Ensure the team coordinator documents concerns during team meetings.


6. Census Sample Interview Rate

Count of resident interviews conducted; count of predicted interviews on the Census Sample report per the Brief Interview for Mental Status (BIMS) score; and the interview rate.

Interview Rate =
Numerator: Count of resident interviews conducted.
Denominator: Count of predicted interviews on the Census Sample report per the BIMS score calculations.

Exclusion: Any resident on the Census Sample report without a BIMS score (e.g., any newly admitted resident added to the census sample during resident reconciliation).

Outlier rules

Low outlier-
A low outlier is ≤77.

High outlier-
A high outlier is ≥121.
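A minimal sketch of the interview-rate calculation above, assuming each census sample resident can be represented by a hypothetical record of (resident ID, BIMS-predicted interview status, interview conducted); residents without a BIMS score are excluded, as noted in the exclusion above.

```python
# Minimal sketch of the Item 6 interview rate; records and values are hypothetical.

census_sample = [
    ("R1", True,  True),   # predicted interviewable by BIMS, interview conducted
    ("R2", True,  False),  # predicted interviewable, interview not conducted
    ("R3", False, False),  # not predicted interviewable, not interviewed
    ("R4", None,  True),   # no BIMS score (added at reconciliation) -> excluded
    ("R5", True,  True),
]

included  = [(bims, done) for _, bims, done in census_sample if bims is not None]
predicted = sum(1 for bims, _ in included if bims)   # denominator
conducted = sum(1 for _, done in included if done)   # numerator

# Rates above 100 are possible when surveyors interview residents the BIMS
# score did not predict to be interviewable, which is how a high outlier
# (>=121) can occur.
interview_rate = 100 * conducted / predicted if predicted else 0.0
print(round(interview_rate))   # 67 -> a low outlier under the <=77 rule above
```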

Supplemental Report(s): Resident Interview Rates Report in CompareInterviewStatusvsConducted.pdf

Specific Instructions

Identify any low or high outliers for the DAR-SA report average and state average compared to the national average.

Review the individual surveyor rates to identify any surveyor with a low or high rate of actual interviews.

Include consideration of survey participation as part of the analysis. For all surveyor-level items, consider the number of surveys the surveyor participated in and the number of opportunities the surveyor had to conduct the task. For example, a surveyor may have an interview rate of only 33%; however, the surveyor was only on one survey and had only three interviewable residents on that survey.

Refer to the “Resident Interview Rates Report” to determine how many residents were pre-determined by the BIMS score to be interviewable and who the surveyor actually interviewed.

If concerns are identified, examine the QIS process steps that are associated with interview rates:

● Ensure surveyors screen all census sample residents to determine the resident’s actual interview status.

● Ensure surveyors prioritize the resident interviews and resident observations before spending time conducting the clinical record reviews.

● The trainer may need to address issues related to surveyor organizational skills or assist the surveyor in developing an effective system to interview all of the applicable residents.

● Problem solve with surveyors to identify effective strategies for conducting interviews with short-stay residents within the given time frame.


7.1 Census Sample Refusal Rate

Count of resident interview refusal responses; count of interviewable residents (surveyor determined); and the refusal rate.

Refusal Rate =
Numerator: Count of resident refused responses.
Denominator: Count of interviewable residents as determined by the surveyor.

Outlier rule

High outlier only-
5 or more points greater than the national average.

Supplemental Report(s): Resident Interview Rates Report in CompareInterviewStatusVSConducted.pdf

Specific Instructions

Identify any high outliers for the DAR-SA report average and state average compared to the national average.

Review the individual surveyor rates to identify any surveyor with a high refusal rate compared to the national average.

Include as part of the analysis the consideration for survey participation. For all surveyor-level items, consider the number of surveys the surveyor participated in and the number of opportunities the surveyor had to conduct the task.

Refer to the “Resident Interview Rates” report to determine which residents the surveyor marked as refused.

Survey-level: Counts are made within a survey and across surveyors.

Surveyor-level: Counts are made across surveys for individual surveyors.

Report-level: Survey-level rates are averaged across surveys.

If concerns are identified, try to determine the cause of the extreme refusal rate:

● Determine whether the surveyor can identify why he/she has a high refusal rate.

● The trainer may need to provide guidance to the surveyor on developing resident rapport or may need to observe the surveyor’s interview technique to determine why he/she has a high refusal rate.

● Identify any patterns or trends where refusal rates are higher (e.g., facilities with large sub-acute units).

7.2 Census Sample Unavailable Rate

Count of resident interview unavailable responses; count of interviewable residents (surveyor determined); and the rate of interviewable residents who were unavailable.

Unavailable Rate =
Numerator: Count of resident unavailable responses.
Denominator: Count of interviewable residents as determined by the surveyor.

Outlier rule

High outlier only-
5 or more points greater than the national average.

Supplemental Report(s): Resident Interview Rates Report in CompareInterviewStatusVSConducted.pdf

Specific Instructions

Identify any high outliers for the DAR-SA report average and the state average compared to the national average.

Review the individual surveyor rates to identify any surveyor with a high unavailable rate compared to the national average.

Include consideration of survey participation as part of the analysis; consider the number of surveys the surveyor participated in and the number of opportunities the surveyor had to conduct the task.

Refer to the “Resident Interview Rates” report to determine what residents the surveyor marked as unavailable.

Survey-level: Counts are made within a survey and across surveyors.

Surveyor-level: Counts are made across surveys for individual surveyors.

Report-level: Survey-level rates are averaged across surveys.



If concerns are identified, try to determine the cause of the extreme unavailable rate:

● Determine whether the surveyor can identify why he/she has a high unavailable rate.

● Identify and address issues related to surveyor organizational skills or assist the surveyor in developing an effective system for interviewing all applicable residents if he/she has a high unavailable rate.

● Determine whether a surveyor makes sufficient attempts to conduct resident interviews prior to marking the resident as unavailable.

● Ensure surveyors prioritize the resident interviews and resident observations before spending time conducting the clinical record review.

● Identify patterns or trends when unavailable rates are higher (e.g., facilities with large sub-acute units).

● Problem solve with surveyors to identify effective strategies for conducting interviews with short-stay residents in the given time frame.


8. Relevant Findings For “Negative” Responses

Count of relevant findings for all negative responses.

A “negative” response is one that includes a resident in the “QCLI Criteria Met” category of the QCLI calculation results, or is part of a set of questions that includes a resident in the “QCLI Criteria Met” category.

Count of negative responses with relevant findings; count of negative responses; and the rate of negative responses with relevant findings.

Rate =
Numerator: Count of documented relevant findings for negative responses.
Denominator: Count of negative responses.

Outlier rule

Low outlier only-
Anything less than the national average.

Supplemental Report(s): Counts of Relevant Findings for All Negative Responses Report in RelevantFindingsforSelectNegatives.pdf

Specific Instructions

Identify any low outliers for the report average and state average compared to the national average (the goal is to have 100% documented relevant findings for negative responses).

Review the individual surveyor rates to identify any surveyor with a low rate of documenting relevant findings.

Refer to the supplemental report to determine which Stage 1 negative responses did not have a relevant finding by surveyor.

Survey-level: Counts are made within a survey and across surveyors.

Surveyor-level: Counts are made across surveys for individual surveyors.

Report-level: Survey-level rates are averaged across surveys.

If any low outliers are identified:

● Determine whether surveyors probe appropriately for the specifics behind a “negative” response during interviews.

● Ensure surveyors document relevant findings for “negative” responses for the resident observations, interviews, and census and admission clinical record reviews.

Review the Stage 1 Relevant Findings report to identify any trends or patterns regarding “negative” responses that do not have a relevant finding.

Review deficiencies for adequacy of evidence. If the evidence is inadequate to support the deficiency or the deficiency could be strengthened, determine whether a Stage 1 observation or interview response should have been documented with a relevant finding.

For example: A deficiency includes only one observation of a resident being poorly positioned during Stage 2.

The single observation contributes to a reviewer’s questioning of whether one observation can support a determination of noncompliance and the facility may argue that the observation was a one-time event.

Further review of Stage 1 data reveals the surveyor observed and marked the resident as being poorly positioned; however, there was no relevant finding documenting the date, time, or details of the observation.

The absence of a relevant finding means there is no documented evidence to support multiple observations of the same failure. It may be helpful to:

● Identify surveyors who have the highest rate of documented relevant findings and share their documentation techniques and skills with other surveyors.

● Ensure surveyors utilize all techniques and options that facilitate entry of relevant findings, such as reviewing the tablet PC tutorial on handwriting recognition.


11. Count of Investigated Non-Mandatory Facility Tasks

Count of investigated non-mandatory facility tasks.

Report, State, and National average calculations:
Average count of investigated non-mandatory facility tasks across the set of surveys being reviewed.

Outlier rules

Low outlier-
Half or less than the national average.

High outlier-
Double or more than the national average.

Supplemental Report(s): None

12. Citations for Facility Tasks

Citations for all facility tasks (“y” indicates a citation).

Report, State, and National average calculations:

Rate (for each task) =
Numerator: Number of citations.
Denominator: Number of surveys on which the task was investigated.

Outlier rules

Low outlier-
If the national average is <26%, then the low outlier is ½ or less than the national average.
If the national average is ≥26% and <61%, then the low outlier is 15 or more points less than the national average.
If the national average is ≥61%, then the low outlier is 20 or more points less than the national average.

High outlier-
If the national average is >11%, then the high outlier is 40 points or more than the national average.
If the national average is ≤11%, then the high outlier is quadruple the national average.

Supplemental Report(s): F-tag Report in FtagReport.pdf

Specific Instructions

Identify any low or high outliers for the report average and state average compared to the national average.

If concerns are identified, examine the QIS process steps that are associated with non-mandatory tasks.

If a trend is identified that a task is investigated less/more frequently, ensure surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly, and

● Conduct appropriate observations.

Ensure surveyors discuss pertinent non-mandatory facility task findings for resident assignments during team meetings.

Ensure the team coordinator documents concerns during team meetings.

If the line is blank/no entries, there were no citations.

Determine whether the report average and state average have a 0% citation rate for a particular facility task.

Identify whether the report and state have a low or high citation rate for any facility tasks compared to the national average.

For non-mandatory tasks, consider [the number of times the task triggered or was initiated] to be aware of potential citation opportunities (see Item 19).

If concerns are identified, examine the QIS process steps that are associated with facility task citations:

● Ensure surveyors follow the guidance on the facility task pathway when conducting the investigation.

● Ensure surveyors conduct a comprehensive and thorough investigation.

● Ensure surveyors discuss pertinent findings for facility task assignments during team meetings.

● Ensure the team coordinator documents concerns during team meetings.

● Ensure surveyors make accurate compliance decisions.

Additionally:

● Determine which facility tasks are never cited. Is it reasonable that those tasks are never cited?



14. Surveyors with Blank Critical Element Responses

Surveyors with blank Critical Element responses.

No outlier rule

Supplemental Report(s): Missing Stage 2 Data Report in MissingStage2Data.pdf

Specific Instructions

● Identify areas of possible concern for re-training or additional in-depth training.

Identify surveyor(s) with any number of blank CE responses.

If concerns are identified, examine QIS process steps that are associated with complete Stage 2 data:

● Determine if a different surveyor completed the investigation.

● Determine the reason the surveyor did not complete the investigation.

● Determine if the team correctly initiated the resident/task investigation to be investigated by the alternate surveyor.

● Determine if the TC documented concerns in the verification of Stage 1 screen.

● Determine if the TC documented concerns in the team meeting notes.

● Determine if the team addressed the concerns regarding the resident(s)/task(s) to be investigated.


15. Number of Triggered Care Areas/Tasks

Count of triggered care areas and non-mandatory tasks.

Report, State, and National average calculations:
Average count for triggered care areas and non-mandatory tasks across the set of surveys being reviewed.

Outlier rules

Low outlier-
5 or more points less than the national average.

High outlier-
5 or more points greater than the national average.

Supplemental Report(s): None

Specific Instructions

Identify whether the report and state have a low or high average count of triggered care areas compared to the national average.

If the report average is comparable to the national average, review the individual surveys to determine how much variability there is across triggered care areas for each survey.

If concerns are identified, examine the QIS process steps that are associated with triggering. Ensure surveyors:

● Understand the intent of the questions,

● Ask the questions accurately,

● Do not ask leading questions,

● Do not repeat the question to elicit a positive response,

● Ask all of the questions,

● Answer the questions correctly,

● Conduct appropriate observations, and

● Conduct appropriate clinical record reviews.

Ensure surveyors discuss pertinent findings for both resident and facility task assignments during team meetings.

Ensure the team coordinator documents concerns during team meetings.

16. Number of New Care Areas/Tasks that were Surveyor-Initiated

Count of care area and non-mandatory facility task initiations, excluding initiations of care areas/tasks that were already triggered, and the four mandatory care areas (hospice, ventilator, PASRR, and dialysis).

The initiated care area or non-mandatory facility task is identified.

Report, State, and National average calculations:
Average count of care area and non-mandatory facility task initiations across the set of surveys being reviewed.

Outlier rule

High outlier only-
Double or more than the national average.

Supplemental Report(s): F-tag Report in FtagReport.pdf; Surveyor Initiated and Removed Care Areas Report in Initiated&RemovedCareAreas.pdf

Specific Instructions

Review the Surveyor Initiated and Removed Care Areas Report to determine whether the team investigated complaints during the survey (i.e., to determine if the initiations are due to the complaint). Once you have identified the initiations related to complaints, you can ignore them and focus on the other initiations.

Review each individual survey to identify the care areas or tasks that were initiated (excluding those initiated for a complaint).

Review the F-tag Report to determine whether an F-tag was cited for the initiated care areas or tasks.

Although care area initiations are not always cited, they have a high probability of being cited. The care area was initiated because the surveyor(s) had an issue with the care area.

Determine the total number of initiated care areas not cited divided by the total number of initiated care areas to determine the rate of initiated care areas not cited across the set of surveys being reviewed.

Review the Surveyor Initiated and Removed Care Areas and F-tags Report to ensure all surveyors on the team, not assigned dining and infection control, initiated the task.



Review the Surveyor Initiated and Removed Care Areas and F-tags Report to ensure nutrition was initiated when dialysis was initiated.

Review the Surveyor Initiated and Removed Care Areas and F-tags Report to ensure survey teams are initiating PASRR, hospice, ventilator, and dialysis as appropriate.

If concerns are identified:

● Ensure surveyors initiate care areas and non-mandatory facility tasks appropriately (e.g., confirm that all Stage 1 information was answered correctly, the concern was discussed with the team, and the concern warranted initiation).

● Identify any patterns or trends in a particular care area or task being initiated.

● Determine if the initiation was warranted even though it did not lead to a deficiency.

● Identify any patterns or trends in a particular care area or task being initiated and not cited.

If there is a high rate of initiated care areas or non-mandatory tasks not being cited, ensure surveyors:

● Follow the guidance on the facility task and CE pathway during investigations.

● Enter complaint investigation information on the Review Materials screen.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.


17. Status of Surveyor-Initiated Tags

17.1 Count of unique surveyor-initiated F-tags. Regardless of the number of times an F-tag is initiated, it is only counted once.

17.2 Count of surveyor-initiated F-tags that were not cited.

17.3 List of all instances of surveyor-initiated F-tags (including those not cited) for each surveyor, with those associated with a care area or facility task highlighted (red font F-tags).

Report Averages/Surveyor-level

Outlier rule

High outlier only-
Double or more than the national average (for 17.1, 17.2, and 17.3).

Note: To identify high outliers at the surveyor level (17.3), divide the national average by the number of surveyors normally on the team (usually this is 4) to obtain a surveyor average. High outliers are double or more than this computed surveyor average.
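A minimal worked example of the note above; the national average, team size, and surveyor count are hypothetical.

```python
# Minimal sketch of the surveyor-level high-outlier computation described in the
# note above. All values are hypothetical.

national_avg = 6.0    # national average count of initiated F-tags (team level)
team_size = 4         # number of surveyors normally on a team

surveyor_avg = national_avg / team_size   # 1.5 per surveyor
surveyor_count = 4                        # initiated F-tags for one surveyor

is_high_outlier = surveyor_count >= 2 * surveyor_avg
print(is_high_outlier)   # True: 4 >= 3.0
```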

Supplemental Report(s): Surveyor-Initiated F-tags with Mapping to a Care Area/Task Report in Initiated&RemovedCareAreas.pdf; Stage 2 Report in Stage 2 Report.pdf; Unnecessary Medication Use Residents Report in UnnDrugReviewResidents.pdf; F-tag Report in FTagReport.pdf

Specific Instructions

Identify whether the report and state have a high average count of unique F-tags, F-tags not cited, and F-tags associated with a care area compared to the national average.

Look at each individual surveyor to identify whether a particular surveyor has a high count of initiated F-tags associated with a care area compared to the national average.

Anytime an F-tag that is mapped to a care area or task is initiated, the software warns the surveyor by listing the care areas or tasks associated with the F-tag. The surveyor should select the appropriate care area or task instead of initiating the F-tag directly.

For more information, see the Mapping document on the University of Colorado website: http://www.ucdenver.edu/academics/colleges/medicalschool/departments/medicine/hcpr/qis/Pages/default.aspx

For any initiated red font F-tag listed in the DAR-SA, determine whether a care area or facility task is available to guide the investigation of the identified concern. If an applicable care area is available in QIS, the care area should be initiated instead of the F-tag to ensure a comprehensive and thorough investigation is completed.

The supervisor may want to review the deficiency on the CMS-2567 to identify weaknesses in the documented evidence that may have been avoided had the surveyor followed the structured investigation of the CE or Facility pathway. For example, the surveyor initiated F314 instead of initiating the pressure ulcer care area.

Once the deficient practice is identified, review the Stage 2 Report to ensure the resident wasn’t already being investigated for the applicable care area or task in Stage 2. For example, a surveyor initiated F221 when the surveyor for that resident was already investigating the physical restraints care area. If the resident was being investigated for the applicable care area or task, the surveyor should have answered the applicable CE as No and cited the F-tag under the investigated care area or task.

Based on the review of the Stage 2 Report, if the care area or task was not being investigated for the resident, the surveyor should have initiated the applicable care area or task for the resident.

Based on the review of the Stage 2 Report, if the mapped care area or task is not applicable to the initiated F-tag, there is no concern.

For any F-tag in red font, since the surveyor did not use the mapped care area or task, ensure the correct F-tag was cited.

If F329 is initiated, and the cited concern is related to the Unnecessary Medication pathway, review the ‘Unnecessary Medication Use Residents’ report to determine whether the resident was investigated for the care area.


If the team initiated an F-tag as a result of Stage 1 findings, the supervisor can review the QCLI Results Report to determine whether the surveyors answered the Stage 1 information correctly and captured the issue during Stage 1.

Example: The team directly initiated F323 for observed large gaps between the side rail and mattress but did not indicate there were any concerns in Stage 1 that would have been included in the QCLI Results Report under Accidents (QP218). The trainer should ensure that surveyors answer the Stage 1 questions appropriately, or are aware that the Stage 1 question encompasses the concern.

Ensure surveyors cite the appropriate F-tag if they did not use the applicable CE or facility task pathway.

Identify trends in initiated red font F-tags.

Determine whether surveyors initiate F-tags for care areas or facility tasks that are already being investigated in Stage 2. If the resident was included in the Stage 2 Sample and was investigated for the issue, the surveyor should answer the appropriate CE as “No” instead of initiating the F-tag directly.

For care areas, ensure surveyors are aware that the compliance decision screen lists the F-tag next to each CE.

For facility tasks, ensure surveyors are aware that the F-tag is listed next to each CE.

Example: The surveyor was investigating nutrition for resident #1. The surveyor initiated F282 for a concern with the facility not implementing care plan approaches regarding nutrition concerns for resident #1. The surveyor should have answered CE3 under the nutrition care area for resident #1 as “No” and cited F282 under the nutrition care area.

During team meetings, ensure surveyors discuss pertinent findings with the team. The team can also ensure that similar concerns are investigated thoroughly.

Identify whether surveyors initiate F-tags in order to cite a resident for a facility task concern. Ensure surveyors understand that they should not initiate an F-tag for a resident to cite the resident under a facility task. Instead, to cite resident-specific findings for a facility task, the surveyor should go to Options, Select Stage 2 Resident, and put a check in the “On Report?” column for the resident to be cited. The surveyor should then cite the resident-specific finding under the applicable CE for the facility task.

Example: The surveyor initiated F241 for resident #2 for a dignity concern observed during dining observation.

The surveyor should have marked the resident for inclusion under the Select Stage 2 Resident option.

Subsequently, on the dining observation screen, the surveyor should have included the resident-specific finding, including the resident’s ID, under CE3 related to the observed dignity concern.


18. Survey Citation Rate for Triggered Care Areas and Tasks

Rate of citation for triggered care areas and non-mandatory facility tasks.

Report, State, and National average calculations:
Average rate of triggered and cited care areas and non-mandatory facility tasks across the set of surveys being reviewed.

Rate =
Numerator: Number of triggered and cited care areas and non-mandatory facility tasks for the survey.
Denominator: Number of triggered care areas and non-mandatory facility tasks for the survey.

Outlier rules

Low outlier-
5 or more points less than the national average.

High outlier-
10 or more points greater than the national average.

Supplemental Report(s): None

Specific Instructions

Identify whether the report and state have a low or high survey citation rate compared to the national average.

If the report average is comparable to the national average, review the individual surveys to determine how much variability there is in the survey citation rate.

Identify the frequency of a 0% citation rate across the set of surveys being reviewed. If there is a 0% citation rate, review Item 19 to determine how many care areas/tasks triggered to get a sense of how many areas were triggered and not cited.

If concerns are identified, examine the QIS process steps that are associated with citation rates. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.

For example: Sufficient Nursing Staff triggers on all surveys but yields no citations. The trainer may want to review the Facility pathway with the team to ensure all information is being gathered and reviewed during team compliance decision deliberations.


19. Care Area Citation Rate for Triggered Care Areas and Tasks

Lists all care areas and facility tasks that can be triggered.

Identifies triggered care areas and tasks (T), cited care areas and tasks (C), and surveyor-initiated care areas and tasks (S). Survey totals for Triggered, Triggered and Cited, and Triggered and Not Cited are given at the end of each survey column.

Looking across the report, the Totals column shows the number of times each care area was triggered and cited, the number of times each care area triggered, and the percentage of times each care area was triggered and cited.

Outlier rules

Low outlier-
If the national average is <26%, then the low outlier is ½ or less than the national average.
If the national average is ≥26% and <61%, then the low outlier is 15 or more points less than the national average.
If the national average is ≥61%, then the low outlier is 20 or more points less than the national average.

High outlier-
If the national average is >11%, then the high outlier is 40 points or more than the national average.
If the national average is ≤11%, then the high outlier is quadruple the national average.

Note: Social Service has no high outlier and is an exception to the above rule.

Supplemental Report(s): None

Specific Instructions

Identify the care areas and tasks that were triggered and not cited (0%) for the report (Totals) and state.

Identify whether the report and state have a low or high citation rate for triggered care areas and tasks compared to the national average.

When evaluating the citation rate, also consider the number of times the care area or task triggered. A citation rate of 0% when the care area only triggered once may not have as much significance for this item as a care area that triggered on all ten surveys but was only cited once for a 10% rate.

Determine trends or patterns regarding whether a particular care area or task continues to have a low/high citation rate when triggered.

If there is a low citation rate for a triggered care area or task, examine the QIS process steps that are associated with citations. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.



20. Removed Care Areas/Tasks/Tags and Reason for Removal

Identifies care areas, facility tasks, and F-tags that were removed, along with the reason for the removal.

If the surveyor marked “8=other,” the documented rationale for the removal is provided.

The surveyor who removed the care area, task, or F-tag is identified.

No outlier rule

Supplemental Report(s): Surveyor-Initiated F-tags with Mapping to a Care Area/Task Report in Initiated&RemovedCareAreas.pdf and the F-tag Report in FTagReport.pdf

Specific Instructions

Review the documented rationale and determine whether the rationale seems appropriate.

If a care area or task was removed because Stage 1 information was recorded incorrectly or the resident was transferred or discharged from the facility (codes 1-5 and 7), review the Surveyor Initiated and Removed Care Areas and F-tag report to ensure another resident who met the criteria was initiated for the care area, as available.

Determine trends or patterns regarding whether there is a particular care area, task, or F-tag being removed.

If the surveyor selects reason 8=other, ensure that the documented rationale does not indicate that the assignment was removed because it was reassigned to another surveyor. To make workload adjustments in Stage 2, the original owner of the assignment should not remove the care area or task. The surveyor assuming responsibility for the assignment should initiate the care area or task. This will ensure that the reassigned area is actually completed when the surveyors synchronize their Stage 2 data.

For example: Surveyor 1 wants surveyor 2 to complete the activity care area investigation for resident #3.

Surveyor 1 leaves the activity care area incomplete on his/her tablet. Surveyor 2 initiates the activity care area for resident #3 on his/her tablet and completes the investigation. When surveyors 1 and 2 synchronize their Stage 2 data to the TC, the activity care area will display as complete provided surveyor 2 completed the investigation.


21. Rate of CE Responses of “No”

Count of CE “No” responses; count of CE responses; and the rate of CE responses of “No.”

The number of CE “No” responses represents the number of potential noncompliance determinations.

Rate =
Numerator: Count of CE responses of “No.”
Denominator: Count of CE responses.

Note: Excludes Infection Control, Dining, Medication Administration, Medication Storage, and Environment because multiple CE responses to the same task are not recorded in the software.

Outlier rules

Low outlier-
2/3 or less than the national average.

High outlier-
Double or more than the national average.

Supplemental Report(s): CE Responses of NO and Status of Associated F-tags Report in CEResponseNoAndFtagSelected.pdf

Specific Instructions

Identify whether the report and state have a low or high “No” designation rate compared to the national average.

Review the individual surveyor rates to identify whether a particular surveyor has a low or high “No” designation rate compared to the national average.

Survey-level: Counts are made within a survey and across surveyors.

Surveyor-level: Counts are made across surveys for individual surveyors.

Report-level: Survey-level rates are averaged across surveys.

If concerns are identified, examine the QIS process steps associated with Stage 2 investigations. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.


22. Rate of CE Responses of “N/A”

Count of CE “N/A” responses; count of CE responses with an “N/A” option; and the rate of CE responses of “N/A.”

Rate =
Numerator: Count of CE responses of “N/A.”
Denominator: Count of CE responses with an “N/A” option.

Note: Any CE that is automatically marked as “N/A” due to an automatic skip pattern is not counted as an “N/A” for this DAR item. Also excluded are Infection Control, Dining, Medication Administration, Medication Storage, and Environment, as multiple responses to the same task are not recorded by the software.

Outlier rule

High outlier only-
5 or more points greater than the national average.

Supplemental Report(s): CE Responses of N/A and Associated F-Tag Report in CEResponseNA.pdf; CE Responses of Overridden N/A’s and Associated F-Tag Report in CEResponseNAOverride.pdf

Specific Instructions

Identify whether the report and state have a high N/A designation rate compared to the national average.

Review the individual surveyor rates to identify whether a particular surveyor has a high N/A designation rate compared to the national average.

Review the CE Responses of N/A and Associated F-Tag Report to identify what CEs are being marked as N/A, as well as any trends in N/A designation. Also identify whether there is any pattern of N/A designations within a care area or task. The report only displays the care area or tasks manually answered by the surveyor.

Review the CE Responses of Overridden N/A’s and Associated F-Tag Report to determine whether the surveyor is overriding an automatic skip pattern. If F280 is on the report, review the CE Responses of No and Associated F-Tag Report to determine why the F-tag was skipped (i.e., either F272 or F279 was answered as No).

If concerns are identified, examine the QIS process steps associated with CE responses:

● Ensure surveyors mark N/A appropriately and follow the N/A guidance in the Facility and CE pathways.

● Determine whether surveyors prematurely determine compliance by marking a CE as N/A instead of No, thereby preventing the opportunity for the team to determine compliance and adequately analyze the scope of a potential finding.


23. QCLI Denominators By Data Source

Count of residents in the denominator for selected QCLIs.

QP013 (MDS) should not be zero for Item 4 and Item 23.

QP096/QP205 (Resident Observation): observations for residents who are cognitively impaired (QP096) / total number of Census Sample residents (QP205).

The denominator for QP253 (Resident Interview), plus the denominator for QP096, the count of resident interview refusals, and the count of residents marked unavailable, should match the denominator for QP205.

QP236 (Family Interview) should be at least three.

QP265 (Staff Interview) number should match QP205.

Report calculation:
Average count by QCLI across the set of surveys being reviewed.

No outlier rule
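The relationships above can be checked mechanically. A minimal sketch with hypothetical denominator values; the QP labels follow the text above, and the dictionary layout is illustrative only.

```python
# Minimal sketch of the Item 23 consistency checks described above.
# Denominator values, refusal and unavailable counts are hypothetical.

denoms = {"QP013": 38, "QP096": 10, "QP205": 40, "QP236": 3, "QP253": 24, "QP265": 40}
refusals, unavailable = 4, 2

checks = {
    "MDS (QP013) is not zero": denoms["QP013"] > 0,
    "QP253 + QP096 + refusals + unavailable matches QP205":
        denoms["QP253"] + denoms["QP096"] + refusals + unavailable == denoms["QP205"],
    "Family interviews (QP236) is at least three": denoms["QP236"] >= 3,
    "Staff interviews (QP265) matches QP205": denoms["QP265"] == denoms["QP205"],
}

for name, ok in checks.items():
    print(("OK   " if ok else "CHECK") + " " + name)
```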

Supplemental Report(s): Resident Interview Rates Report in CompareInterviewStatusVSConducted.pdf

Specific Instructions

Ensure the MDS is not 0. If item 23 is 0 and item 4 is 0%, it means there was no MDS data for this survey. This needs to be caught before the survey is conducted. The TC should be checking the size of the sampling pool before distributing the shell to team members. See Checklist Item 1.

Ensure the sample size reflects the facility’s actual census (The facility’s licensed bed size is listed in the top section of Item 1):

Above 100 = Census Sample of 40

61-100 = Census Sample of 35

33-60 = Census Sample of 30

1-32 = Census Sample will be 90% of the actual census
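A minimal sketch of the census sample tiers listed above; the function name is illustrative, and rounding the 90% tier up is an assumption, since the guide does not state a rounding rule.

```python
# Minimal sketch of the census sample sizes listed above.
# "census" is the facility's actual resident census; the ceil() rounding is an assumption.

import math

def expected_census_sample(census):
    if census > 100:
        return 40
    if census >= 61:
        return 35
    if census >= 33:
        return 30
    return math.ceil(0.9 * census)   # 1-32 residents: 90% of the actual census

print(expected_census_sample(120), expected_census_sample(80),
      expected_census_sample(45), expected_census_sample(20))
# -> 40 35 30 18
```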

Ensure teams conduct three family interviews per survey.

If a team consistently fails to complete three family interviews, determine the adequacy of the team’s meeting planning and communication regarding the completion of family interviews. Review the Resident Interview Rates Report to determine the number of opportunities for family and resident interviews.

Ensure the team conducted the appropriate number of staff interviews (i.e., the number of staff interviews should equal the census sample).

Ensure resident observations, interviews, and other data sources are not ‘0’.

If a team is consistently not completing interview data sources (Staff Interview, Resident Interview, and Family Interview), provide education and direction to the surveyors on the team. Incomplete interviews potentially affect the number of triggered care areas for Stage 2 investigation.


24. Survey Time

CMS-670 components include time related to pre-survey, onsite, and offsite.

No outlier rule

Supplemental Report(s): None

Specific Instructions

Specific Instructions

Ensure surveyors spend a reasonable amount of time completing pre-survey tasks. Time estimates for pre-survey tasks include one hour for the team coordinator and no more than 15 minutes for each surveyor acting as the secondary, less any pre-survey offsite meeting time.

Determine whether a surveyor, team, or district office spends considerably less/more time onsite over time compared with other surveys with similar circumstances (i.e., similar number of triggered care areas, potential citations and scope and severity -- Items 15, 27 and 33).

Ensure surveyors spend a reasonable amount of offsite time based on the number of potential citations per surveyor. Take into consideration the potential number of citations per surveyor -- Items 26 and 27.

Additionally, take into consideration the scope and severity of the citation -- Item 33.

25. Stage 2 Assignments

Count of Stage 2 assignments and percentage of Stage 2 workload.

There are no comparison report, state, or national averages.

Rate (for each surveyor) =
Numerator: Number of Stage 2 assignments for the individual surveyor.
Denominator: Number of Stage 2 assignments for the team.

The average contribution of Stage 2 assignments for the individual surveyor across the set of surveys being reviewed.

No outlier rule

Supplemental Report(s): None

Specific Instructions

Some care areas and tasks take longer to investigate than others and some investigations are more complex. Consider both of these factors when reviewing the distribution of survey assignments.

Determine whether a surveyor has a considerably lower/higher rate of Stage 2 assignments over time compared with other surveyors.

For any surveyor with a low rate of Stage 2 assignments, identify and address issues related to surveyor organizational skills and assist the surveyor with developing an effective and efficient management system for his/her Stage 2 workload.


26. Number and Rate of Citations by Surveyor

Count of surveyor contributions to each F-tag, limited to one contribution per surveyor per F-tag; multiple surveyors may contribute to an F-tag.

Contribution percentage of QIS potential citations for each surveyor.

Rate (for each surveyor) =

Numerator: Number of QIS potential citations for the individual surveyor.

Denominator: Number of QIS potential citations for the team.

Average contribution of potential QIS citations for the individual surveyor across the set of surveys being reviewed.
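Because a surveyor’s contribution is limited to one per F-tag, a sketch of the numerator would deduplicate (surveyor, F-tag) pairs before counting. The data structure and names below are assumptions for illustration:

    def surveyor_contribution_counts(contributions):
        # contributions: iterable of (surveyor, f_tag) pairs noted during Stage 2
        unique = set(contributions)  # at most one contribution per surveyor per F-tag
        counts = {}
        for surveyor, _ in unique:
            counts[surveyor] = counts.get(surveyor, 0) + 1
        return counts

    # Rate for a surveyor = counts[surveyor] / number of QIS potential citations for the team.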

No outlier rule

None

Specific Instructions

Determine whether a surveyor cites a considerably lower/higher rate of potential QIS citations over time.

For any surveyor with a low/high citation rate, determine whether the surveyor also has a low/high rate of Stage 2 assignments (Item 25). For example, it may be of greater concern if a surveyor has the highest rate of Stage 2 assignments and the lowest citation rate compared to a surveyor with the lowest rate of Stage 2 assignments and citations.

If concerns are identified, examine the QIS process steps associated with Stage 2 investigations. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.


DAR-SA Item, Definition, and

Outlier Rule/Guidelines

Supplemental

Report(s)

27. Number of QIS Potential Citations

Count of QIS potential citations. This is the total number of potential citations determined onsite by the survey team during the analysis and decision-making meeting.

Report calculation:

An average count of all QIS potential citations across the set of surveys being reviewed.

None

Outlier rules

Low outlier-

2 or less than the national average.

High outlier-

4 or more than the national average.

28. Number of Actual CMS-2567 Citations

None

Count of Citations on the CMS-2567.

Report calculation:

The average count of actual CMS-2567 citations across the set of surveys being reviewed.

Outlier rules

Low outlier-

2 or less than the national average.

High outlier-

4 or more than the national average.
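The count-based outlier rules for Items 27 and 28 can be checked with a small helper. This reads the rule as a difference of 2 or more below, or 4 or more above, the national average, which is one interpretation of the wording above:

    def citation_count_outlier(average_count, national_average):
        # Low outlier: 2 or more below the national average (assumed reading)
        if average_count <= national_average - 2:
            return "low outlier"
        # High outlier: 4 or more above the national average (assumed reading)
        if average_count >= national_average + 4:
            return "high outlier"
        return None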

Specific Instructions

Determine whether a district office or team has a considerably lower/higher average count of QIS potential citations over time compared with the state and national average.

If concerns are identified, examine the QIS process steps associated with Stage 2 investigations and compliance decisions. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.

Determine whether a district office or team has a considerably lower/higher average count of CMS-2567 citations over time compared with the state and national average.

Determine whether surveyors add or delete deficiencies after the team has made compliance decisions during the analysis and decision-making meeting. (See DAR-SA item #27)

Determine whether supervisors/reviewers add or delete deficiencies during the internal QA review process.

Determine whether the CMS-2567 indicates numerous changes made to the QIS potential citations.

If there are numerous differences between the QIS potential citations identified during the survey and the final CMS-2567, determine the underlying reasons for the differences (e.g., is the deletion due to an inadequate investigation or insufficient documentation, are supervisors adding F-tags that the team should have identified, is training warranted?).


DAR-SA Item, Definition, and

Outlier Rule/Guidelines

Supplemental

Report(s)

29. Number of Tags with No Citation Decision

Count of F-tags that do not have a status of “Cited” or “Not Cited” on the Potential Citations Screen.

The status for any F-tag on the Potential Citations Screen should not remain as “Not Determined.”

Report calculation:

Average count of F-tags that do not have a status of “Cited” or “Not Cited” across the set of surveys being reviewed.

No outlier rule

F-tag Report in FTagReport.pdf

30. Potential Findings That Are Not Cited

F-tag Report in FTagReport.pdf

Count of QIS potential findings not selected by the team during the analysis and decision-making meeting.

Report calculation:

Average count of potential findings not selected by the team

Outlier rule

High outlier only-

Double or more than the national average.

Note: To identify high outliers at the surveyor level (17.3), divide the national average by the number of surveyors normally on the team (usually this is 4) to obtain a surveyor average. High outliers are double or more than this computed surveyor average.
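A worked sketch of the surveyor-level computation described in the note, assuming the usual team of four surveyors:

    def surveyor_high_outlier_threshold(national_average, team_size=4):
        surveyor_average = national_average / team_size  # per-surveyor share of the national average
        return 2 * surveyor_average                      # double or more is a high outlier

    # Example: a national average of 8 findings not cited gives a surveyor average of 2,
    # so a surveyor averaging 4 or more uncited findings would be a high outlier.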

Specific Instructions

Identify any survey with any number other than ‘0’, which indicates no citation decision was made for an F-tag.

Ensure surveyors make team compliance decisions for all F-tags on the Potential Citations Screen.

Ensure surveyors compare the Potential Citation Report to the Potential Citations Screen to ensure all intended citations are represented before the team coordinator loads citations into ASE.

Identify whether the report and state have a high average count of potential findings not cited compared to the national average.

Review each individual surveyor to identify whether a particular surveyor has a high average count compared to the national average, on this report and over time.

Review the F-tag Report for any F-tag(s) not cited to ensure an appropriate rationale was provided.

Review the findings not cited for adequacy of evidence. If the evidence is inadequate to support the finding, determine whether a Stage 1 observation or interview response included a relevant finding.

If concerns are identified, examine the QIS process steps that are associated with Stage 2 investigations and compliance decisions. Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.

Determine whether surveyors make compliance decisions prematurely. For example: Five residents are listed on the Potential Citations Screen for F221. If a resident is not marked for inclusion, but the F-tag is still cited, the resident who was not included is counted as a potential finding that was not cited, since that resident will not be included in the citation.


DAR-SA Item, Definition, and

Outlier Rule/Guidelines

Supplemental

Report(s)


31. QIS F-Tag Frequency Analysis

List of up to ten specific F-tags that were cited on at least half of the surveys included on the report.

Report, State, and National average calculations: Rate of frequency for cited F-tags across the set of surveys being reviewed.

None

Outlier rules

Low Outlier-

If the national average is <26%, then the low outlier is ½ or less than the national average.

If the national average is ≥26% and <61%, then the low outlier is 15 or more points less than the national average.

If the national average is ≥61%, then the low outlier is 20 or more points less than the national average.

High outlier-

If the national average is >11%, then the high outlier is 40 points or more than the national average.

If the national average is ≤11%, then the high outlier is quadruple or more than the national average.
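The tiered thresholds above can be summarized in one sketch. Rates are expressed as whole percentage points, and treating “quadruple the national average” as the high-outlier floor is an assumption:

    def ftag_frequency_outlier(rate, national_average):
        # rate and national_average are percentages, e.g., 45 for 45%
        if national_average < 26:
            low = rate <= national_average / 2
        elif national_average < 61:
            low = rate <= national_average - 15
        else:
            low = rate <= national_average - 20
        if national_average > 11:
            high = rate >= national_average + 40
        else:
            high = rate >= 4 * national_average
        if low:
            return "low outlier"
        if high:
            return "high outlier"
        return None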

Specific Instructions


Determine whether a district office or team has a considerably lower/higher rate of frequently cited F-tags over time compared with the state and national average.

If concerns are identified, examine the QIS process steps associated with Stage 2 investigations and compliance decisions.

Ensure surveyors:

● Follow the guidance on the facility task and CE pathway when conducting the investigation.

● Conduct a comprehensive and thorough investigation.

● Discuss pertinent findings with the team to assess the potential for noncompliance and determine the need for additional investigation.

● Make accurate compliance decisions.


DAR-SA Item, Definition, and

Outlier Rule/Guidelines

Supplemental

Report(s)

32. Sample Size Expansions for Single Content Care Areas/Tasks/Tags and Severity and Scope (CA n>3)

32.1

Lists the triggered care area(s), facility task(s), and F-tags (direct citations) that had a sample size greater than three. The sample size listed includes the residents chosen by ASE-Q and the additional surveyor-initiated residents.

32.2

Lists the final QIS determination for those triggered care area(s), facility task(s), or direct citations that had a sample size greater than three. Lists the most severe scope and severity, or if not cited (NC).

Note: There is no way for Univ. of Colorado staff to determine from the data the rationale for the sample size expansion or at what point during the survey the expansion occurred.

No outlier rule

F-tag Report in FTagReport.pdf

Specific Instructions

Identify any task, care area, or F-tag that was expanded but not cited. Determine whether there is a pattern.

Determine whether the team appropriately expands the sample.

Identify any task, care area, or F-tag that was expanded but cited at an isolated level (D-level). Determine whether there is a pattern.

If cited at an isolated level (D-level), review the F-tag Report to determine how many residents were actually cited to ensure the isolated scope is appropriate. Determine whether the expanded sample contributed to a higher severity or scope determination.

Trend the information to determine whether the sample is expanded for a particular care area, facility task, or F-tag, but the team does not cite the area or cites the area at an isolated D-level.

To calculate the percentage of care areas that were expanded but not cited, sum the number of expanded care areas not cited and divide by the total number of care areas expanded.
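A minimal sketch of that percentage (names are illustrative):

    def pct_expansions_not_cited(num_expanded_not_cited, num_expanded_total):
        # Percentage of sample-size expansions that did not result in a citation
        return 100.0 * num_expanded_not_cited / num_expanded_total

    # Example: 3 of 12 expanded care areas not cited -> 25.0%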


DAR-SA Item, Definition, and

Outlier Rule/Guidelines

Supplemental

Report(s)

33. Severity and Scope

Rate of citations at each scope and severity level.

Rate (one for each scope and severity level) =

Numerator: Number of citations at the specific scope and severity level for the survey.

Denominator: Number of citations for the survey.

Report, State, and National average calculations:

• Average rate of citations at each scope and severity level across the set of surveys being reviewed.
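A sketch of the per-level rate calculation for a single survey; the list-of-letters data structure is an assumption for illustration:

    def scope_severity_rates(citation_levels):
        # citation_levels: scope/severity letters ('A' through 'L') for one survey's citations
        total = len(citation_levels)
        if total == 0:
            return {}  # surveys with no citations are excluded from the distribution
        counts = {}
        for level in citation_levels:
            counts[level] = counts.get(level, 0) + 1
        return {level: n / total for level, n in counts.items()}

    # Example: ['D', 'D', 'E', 'F'] -> {'D': 0.5, 'E': 0.25, 'F': 0.25}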

Outlier rules

Low Outlier-

There is no low outlier for A-C and H-L level citations.

The low outlier for D and E level citations is 10% or less than the national average.

The low outlier for F and G level citations is half or less than the national average.

High outlier-

The high outlier for A-C and F-G level citations is double or more than the national average.

The high outlier for D-E level citations is 10% or more than the national average.

The high outlier for H-L level citations is ≥2%.

Any citation without a scope and severity is considered to be a high outlier.

None

Specific Instructions

Identify whether the report and state have a lower or higher frequency of citing certain scope and severity levels compared to the national average.

Determine whether a district office or team has a considerably lower/higher rate for each severity and scope level over time compared with the state and national average.

If a district office or team consistently has a high rate of isolated level deficiencies, determine whether the team is identifying patterns appropriately.

If a district office or team consistently has a high rate of severity-level 1 deficiencies and a low/high rate of severity-level 2 and 3 deficiencies, ensure teams follow the key elements for severity determinations outlined in the investigative protocols in Appendix PP.

Ensure surveyors refer to the Psychosocial Outcome Severity Guide in Appendix P to help determine the severity of psychosocial outcomes resulting from the identified noncompliance at a specific F-tag.

Surveys with no citations are not included in the distribution of severity and scope or report average calculations.


