CDAR2_QRDA_R1D1_2008SEPT

Implementation Guide For CDA Release 2
Levels 1, 2 and 3
Quality Reporting Document Architecture (QRDA)

Based on HL7 CDA Release 2.0

(US Realm)

Draft Standard for Trial Use

First Ballot

September 2008

Yellow = text that needs review/embellishment

Purple = known things to check/verify or change prior to ballot

© 2008 Health Level Seven, Inc.
Ann Arbor, MI
All rights reserved.

Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Current Working Group also includes: / [Name, Name, Name, Name, Name]


Acknowledgments

This Guide was produced and developed through the efforts of the Quality Reporting Document Architecture (QRDA) Project, supported by the Child Health Corporation of America (CHCA) to develop and support a standard for quality reporting. Through this project, CHCA supports the HL7 Pediatric Data Standards Special Interest Group and others in developing a draft standard for reporting quality measure data.

The QRDA committee, made up of ………, was instrumental in guiding the project so that alignment occurred between the <organizations> interested in and working on various aspects of quality reporting.

The co-editors would also like to express their appreciation for the support and sponsorship of the Structured Documents Working Group and the Pediatric Data Standards Special Interest Group.

Finally, we would like to acknowledge the foundational work on Health Level Seven (HL7) Version 3 and the Reference Information Model (RIM), the HL7 domain committees, especially Patient Care, and the work done on CDA itself.

We would also like to acknowledge the collaborative effort of ASTM and HL7, which produced the Continuity of Care Document (CCD). All these efforts were critical ingredients to the development of this DSTU and the degree to which it reflects these efforts will foster interoperability across the spectrum of health care.

Revision History (to be removed prior to putting up for ballot)

Rev / Date / By Whom / Changes / Note
New / May 21, 2008 / Gay Giannone


Table of Contents

1 Introduction

1.1 Purpose

1.2 Scope

1.3 Audience

1.4 Process of Formalizing a Measure

1.4.1 Role of Professional Societies

1.4.2 Role of Technical Groups

1.4.3 Suggested Collaboration Process

1.4.4 Definition of a Quality Measure

1.5 Process of Reporting on a Measure

1.5.1 Role of Processing Entity

1.5.2 Role of Quality Manager

1.6 Approach

1.6.1 Organization of this Guide

1.6.2 Use of Templates

1.7 Conventions Used in This Guide

1.7.1 Explanatory Statements

1.7.2 Conformance Requirements

1.7.3 Vocabulary Conformance

1.7.4 XPath Notation

1.7.5 Keywords

1.7.6 XML Samples

1.7.7 Contents of the Ballot Package

2 Category One QRDA CDA

2.1 Header Constraints

2.1.1 Header Attributes

2.1.2 Participants

2.2 Category One Body Constraints

2.3 QRDA Category One Section Constraints

2.3.1 Measure Set Section Conformance

2.3.2 Measure Section Conformance

2.3.3 Reporting Parameters Section Conformance

2.3.4 Patient Data Section Conformance

3 Category Two QRDA CDA

3.1 Header Constraints

3.1.1 Header Attributes

3.1.2 Participants

3.1.3 Header Relationships

3.2 Category Two Body Constraints

3.3 QRDA Category Two Section Constraints

3.3.1 Required Section

3.3.2 Entry Patterns

4 Category Three QRDA CDA

4.1 Header Constraints

4.1.1 Header Attributes

4.1.2 Participants

4.1.3 Header Relationships

4.2 Category Three Body Constraints

4.3 Section Constraints

4.3.1 Required Section

4.3.2 Entry Patterns

5 References

Appendix A — External Constraints

Introduction

CCD Constraints

Medications (templateID 9.99.99999…)

Social History (templateID 9.99.99999…)

Appendix B — Template IDs Defined in this Guide

6 Open Issues

Table of Figures

Figure 1: clinicalDocument example

Figure 2: realmCode Category One example

Figure 3: clinicalDocument/templateId Category One example

Figure 4: recordTarget Category One example

Figure 5: assignedAuthor Category One example

Figure 6: Informant Category One example

Figure 7: Custodian Category One example

Figure 8: legalAuthenticator Category One example

Figure 9: Measure Set Section example

Figure 11: MeasureAct example

Figure 12: realmCode Category Two example

Figure 13: Null flavor recordTarget example

Figure 14: AssignedAuthor as a processing entity example

Figure 15: Informant Category Two example

Figure 16: Custodian Category Two example

Figure 17: legalAuthenticator Category Two example

Figure 18: documentationOf Category Two example

Figure 19: realmCode Category Three example

Figure 20: Null flavor recordTarget example

Figure 21: AssignedAuthor as a processing entity example

Figure 22: Informant Category Three example

Figure 23: Custodian Category Three example

Figure 24: legalAuthenticator Category Three example

Figure 25: documentationOf Category Three example

Table of Tables

Table 1: Contents of the Ballot Package

Table 2: Administrative Gender Value Set

Table 3: Template IDs Defined in this Guide

Table 4: Sections Used in this Guide and in QQQ

1  Introduction

1.1  Purpose

The IOM definition of quality is “the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge” [1]. In order for knowledge about care quality to be evaluated, it must be gathered and communicated to the appropriate organizations.

The purpose of this document is to describe constraints on CDA Header and Body elements for Quality Reporting Documents. Quality Reporting Document Architecture (QRDA) is a document format that provides a standard structure with which to report quality measures to organizations (such as CMS, HHS, HEDIS, etc.) that will analyze and interpret the data they receive. Measuring quality in health care is complex. Accurate, interpretable data, efficiently gathered and communicated, are key to correctly assessing the quality of care delivered.

1.2  Scope

In scope:

Standardizing the representation and communication of measure data.

Out of scope:

Having HL7 vote on the actual eMeasure data elements.

Measures can be defined by professional societies and other organizations (this does not necessarily preclude HL7 from creating measures). The clinical measure development process is out of scope. As described below (see Process of Formalizing a Measure), the formal representation is ideally developed in parallel with the measure definition itself. From there, the measure’s data elements can be communicated via QRDA.

The data elements comprising a measure are out of scope.

Representation of the formal eMeasure logic definition is out of scope. (This could be mentioned again in a Future Work section: a tighter coupling with the Collaborative, and the ability to semi-automatically generate QRDAs from formal eMeasure definitions.)

The goal of QRDA is to nationally standardize the framework of quality reports and to define the way the data about measures are structured to create interoperability between reporting and receiving institutions. QRDA is the framework for communicating measure data defined by professional societies.

1.3  Audience

The audience for this document includes software developers and consultants responsible for implementing reporting capabilities within Electronic Health Record (EHR) systems, as well as developers and analysts in receiving institutions and in local, regional, and national health information exchange networks who wish to create and/or process CDA reporting documents created according to this specification.

1.4  Process of Formalizing a Measure

Ideally, a process is in place whereby the measure is formally specified through a process that involves both domain experts and a computable representation. From there, one could theoretically auto-generate the QRDA Category I, II, and III specifications. However, in some cases the development of a QRDA specification will come before there is an agreed-upon formal representation of a particular measure. In these cases, care must be taken to establish a planned collaboration process between the domain experts and the measure representation designers, so that there is consensus that the intent is captured and the output is useful.

1.4.1  Role of Professional Societies

1.4.2  Role of Technical Groups

1.4.3  Suggested Collaboration Process

describe the goal of re-using EHR data here?

1.4.4  Definition of a Quality Measure

A quality measure is a mechanism that enables the user to quantify the quality of a selected aspect of care by comparing it to a criterion. Quality measures are used for three general purposes: quality improvement, accountability, and research.

Quality measures may be used for external quality improvement in programs operated by state, regional, or national entities; other quality improvement organizations; or professional organizations. External agencies frequently collect the performance measurement data, verify their accuracy, and report quality performance results among providers of care in a format that allows direct comparison of providers. External agencies may also provide "benchmark" results that can be used to encourage providers to strive to perform at the best level shown to be achievable.[2]

CDA’s role in quality measures is to standardize the modeling of these measures to enable interoperability between all of the stakeholder organizations.

describe the goal of re-using EHR data here?

1.4.4.1  Measure Set

A measure set is a group of related measures with the same initial patient population, such as all patients with pneumonia. (One outlier is the NQF pregnancy and related conditions measure set.)

The measures within the measure set may or may not have the same denominator population. For example, the measures within the NQF-endorsed measure set for pneumonia have the same initial population of patients with at least one ICD-9 code for pneumonia from a given value set. An individual measure’s population subset (denominator) will include an ICD-9 code for pneumonia and may also include another observation, such as cigarette smoker, or immunocompromised status with an ICU admission. The numerator population (measurement criteria) for each measure within a measure set is different, such as patients who received smoking cessation education or who received a particular antibiotic.
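The measure set structure described above can be sketched as a CDA section. The fragment below is illustrative only: the templateId root and measure identifier are placeholder OIDs in the style used elsewhere in this Guide (9.99.99999…), not values this Guide defines.

```xml
<!-- Illustrative Measure Set section sketch; all OIDs are placeholders -->
<section>
  <templateId root="9.99.99999.1"/><!-- placeholder: Measure Set section template -->
  <title>Measure Set: Pneumonia</title>
  <text>NQF-endorsed pneumonia measure set for the reporting period.</text>
  <entry>
    <!-- One act per measure in the set; the id identifies the individual measure -->
    <act classCode="ACT" moodCode="EVN">
      <id root="9.99.99999.2" extension="PneumoniaMeasure1"/>
    </act>
  </entry>
</section>
```

The normative Measure Set and Measure section constraints are specified in the Category One section of this Guide (see Figure 9 and Figure 11).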

1.5  Process of Reporting on a Measure

(Include a reference to HITSP IS06.) Give a concise summary (a couple of paragraphs), including material mentioned below.

1.5.1  Role of Processing Entity

Could talk about alternate workflows; could talk about various header participants and who is which one (noting that we’re not standardizing the workflow per se)

1.5.2  Role of Quality Manager

1.6  Approach

Overall, the approach taken here is consistent with balloted Implementation Guides for CDA. These publications view …. There could be bullets with links:

·  List documents/standards/guides reviewed …

1.6.1  Organization of this Guide

The requirements laid out in the body of this document are normative and are subject to change only through the ballot process. These cover the header, body, and section requirements. The IG defines Category I header, body, and section requirements, followed by Category II header, body, and section requirements, and finally Category III header, body, and section requirements.

1.6.1.1  Framework

Talk about how the IG is a framework for all measure reporting. Describe how the IG will define the three types. Describe how framework + individual measure IG = <measure> QRDA.

The measure-specific implementation guide(s) will describe the exact content and modeling (entries) for each measure.

1.6.1.1.1  Category One

Individual patient-level reports with the full clinical data defined in the measure. Data points that meet exclusion criteria may be included and identified.

1.6.1.1.2  Category Two

Aggregate data across a defined population. The report may or may not identify individual patients’ data within the summary, or may simply state each patient’s inclusion in or exclusion from the measure. The defined population may represent the final measure numerator, the denominator, or both.

1.6.1.1.3  Category Three

Calculated results for a given population and period of time, identifying exclusions. The report conveys the separate values for numerator and denominator and the calculated result.

1.6.2  Use of Templates

When valued in an instance, the template identifier signals the imposition of a set of template-defined constraints. The value of this attribute provides a unique identifier for the templates in question.
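For example, a templateId asserted on a section might look like the following. The root OID here is a placeholder in the style of this Guide (9.99.99999…), not a template identifier this Guide defines.

```xml
<!-- Asserting template conformance on a section; the root OID is a placeholder -->
<section>
  <templateId root="9.99.99999.3"/><!-- signals that the section claims conformance
                                        to the constraints defined by that template -->
  <title>Reporting Parameters</title>
  <text>Reporting period and other parameters would appear here.</text>
</section>
```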

1.6.2.1  Originator Responsibilities

An originator can apply a templateId if there is a desire to assert conformance with a particular template.

In the most general forms of CDA exchange, an originator need not apply a templateId for every template that an object in an instance document conforms to. When template IDs are required for conformance, this requirement shall be asserted within the IG.

1.6.2.2  Recipient Responsibilities

A recipient may reject an instance that does not contain a particular templateId (e.g., a recipient looking to only receive CCD documents can reject an instance without the appropriate templateId).

A recipient may process objects in an instance document that do not contain a templateId (e.g., a recipient can process entries that contain substanceAdministration acts within a Medications section, even if the entries do not have templateIds).

If an object does not have a templateId, a recipient shall not report a conformance error about a failure to conform to a particular template on classes that do not claim conformance to that template and which are not required to be conformant by other templates.

1.6.2.3  Levels of Constraint

This specification defines additional constraints on CDA Header and Body elements used in the three levels of Quality Reporting documents in the US realm. It reuses CCD entry-level templates where appropriate, provides examples of conforming fragments in the body of the document, and provides examples of conforming XML instances as an appendix.

Within this DSTU, the required and optional clinical content within the body is identified.

This DSTU specifies three levels of conformance requirements:

• Level 1 requirements specify constraints upon the CDA header and the content of the document.

• Level 2 requirements specify constraints at the section level of the structuredBody of the ClinicalDocument element of the CDA document.

• Level 3 requirements specify constraints at the entry level within a section.

Note that these levels are rough indications of what a recipient can expect in terms of machine-processable coding and content reuse. They do not reflect the level or type of clinical content.
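The three levels can be pictured as nested scopes of a CDA instance. The skeleton below is a sketch only; the templateId roots are placeholder OIDs in the style of this Guide, and elided header content is shown as comments.

```xml
<!-- Skeleton showing where each level of constraint applies; OIDs are placeholders -->
<ClinicalDocument xmlns="urn:hl7-org:v3">
  <!-- Level 1: constraints on the header and overall document content -->
  <templateId root="9.99.99999.1"/>
  <!-- ... other header elements (recordTarget, author, custodian, etc.) ... -->
  <component>
    <structuredBody>
      <component>
        <!-- Level 2: constraints at the section level of the structuredBody -->
        <section>
          <templateId root="9.99.99999.2"/>
          <title>Patient Data</title>
          <text>Narrative for the section.</text>
          <!-- Level 3: constraints at the entry level within a section -->
          <entry>
            <observation classCode="OBS" moodCode="EVN">
              <templateId root="9.99.99999.3"/>
            </observation>
          </entry>
        </section>
      </component>
    </structuredBody>
  </component>
</ClinicalDocument>
```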