QUALITY TODAY

A blog dedicated to advancing healthcare quality.

Why Inter-Rater Reliability Matters & How to Do It

Posted by Revee White on Jun 27, 2016 10:00:00 AM


As new payment models continue to drive demand for healthcare big data, it is essential that organizations produce quality data that is complete, valid, and reliable. Often, your data relies on the skills, training, and commitment of your abstraction team. Data abstraction results should be validated through ongoing Inter-Rater Reliability (IRR) review to determine accuracy and completeness.

Incorporating IRR into your routine can reduce data abstraction errors by identifying the need for abstractor education or re-education and give you confidence that your data is not only valid, but reliable.

IRR review should be a regular part of your abstraction process on a monthly, quarterly and annual basis. In addition, we recommend organizations perform IRR in these four instances:

  1. After a specifications manual update
  2. When a new abstractor starts
  3. When new measure sets are introduced
  4. During performance evaluations

The IRR sample should be randomly selected from each population using the entire list of cases, not just those with measure failures. Each case should be independently re-abstracted by someone other than the original abstractor.
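As a concrete illustration, random selection from the full case list can be sketched in a few lines (the case IDs and sample size here are hypothetical, not taken from any particular measure population):

```python
import random

# Hypothetical list of ALL cases in one measure's population --
# the entire list, not just cases with measure failures.
all_cases = [f"case-{n:04d}" for n in range(1, 201)]

# Draw an IRR sample of, say, five cases without replacement.
irr_sample = random.sample(all_cases, k=5)
print(irr_sample)
```

Sampling without replacement from the whole population avoids biasing the review toward cases already known to be problematic.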

The IRR abstractor then inputs and compares the answer values for each Data Element and the Measure Category Assignments to identify any mismatches. The results are reviewed and discussed with the original abstractor, and the case is updated with all necessary corrections prior to submission deadlines.
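To make the comparison step concrete, here is a minimal sketch that checks the two abstractions element by element and reports mismatches along with an agreement rate (the data element names and values are invented for illustration):

```python
# Hypothetical abstractions for one case: data element -> recorded value.
original   = {"arrival_time": "14:05", "ecg_time": "14:20", "aspirin_given": "yes"}
irr_review = {"arrival_time": "14:05", "ecg_time": "14:35", "aspirin_given": "yes"}

# Identify data elements where the two abstractors disagree.
mismatches = {k: (original[k], irr_review[k])
              for k in original if original[k] != irr_review[k]}

# Agreement rate across all data elements for this case.
agreement = (len(original) - len(mismatches)) / len(original) * 100
print(mismatches)                      # {'ecg_time': ('14:20', '14:35')}
print(f"{agreement:.0f}% agreement")   # 67% agreement
```

In practice each mismatch, like the `ecg_time` disagreement above, would be discussed with the original abstractor and corrected before submission.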

If the original and IRR abstractor are unable to reach consensus, we recommend submitting questions to QualityNet for clarification. Lessons learned from mismatches should be applied to all future abstractions. Results should be analyzed for patterns of mismatches to identify the need for additional IRR Reviews and/or targeted education for staff.

Primaris' process includes randomly selecting three to five medical records per quarter per measure for each of our quality measure abstractors. IRR is also incorporated into the new abstractor orientation and training process: the team lead and the new abstractor jointly review a minimum of five records before the abstractor independently reviews five records. This cycle of independent abstraction followed by IRR continues until the abstractor achieves a score of 95% or greater on each assigned measure. Abstractors are then expected to maintain a minimum of 95% accuracy on IRR review; if a reviewer falls below that threshold, intensive one-on-one training is conducted.
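The 95% accuracy check described above can be sketched the same way (the measure names and scores here are hypothetical):

```python
# Hypothetical per-measure IRR scores for one abstractor.
scores = {"AMI-1": 100.0, "HF-2": 96.0, "SEP-1": 92.5}

THRESHOLD = 95.0  # minimum accuracy required on each assigned measure

# Flag any measure below the threshold, which would trigger
# intensive one-on-one retraining on that measure.
needs_retraining = [m for m, s in scores.items() if s < THRESHOLD]
print(needs_retraining)  # ['SEP-1']
```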

In addition to individual feedback, general feedback on patterns or trends identified during IRR is provided to all reviewers. Areas for improvement and lessons learned during Primaris' quality control processes are shared among all abstractors to ensure continued compliance and accuracy. Closing the feedback loop, Primaris reviews IRR results with clients each quarter.

A focused IRR process is why Primaris-trained abstractors maintain an IRR rate of 98 percent, which means clients can be certain that their data is timely, informative and, most importantly, accurate.

Many providers are drowning in data; they can't keep up with quality reporting demands. Dedicating resources and providing proper training are crucial to taming a highly complex, time-consuming process that significantly impacts an organization's financial and competitive standing.

Primaris provides outsourced quality measures abstraction and quality improvement services to hospitals and physicians across the country, performing more than 10,000 data abstractions per month.

Email us today at engage@primaris.org to discuss how outsourced quality measures abstraction can benefit your facility or to have us help you conduct an IRR review at your facility.

Download Checklist


Topics: operational efficiency, IRR, data abstraction
