HEALTHCARE COST & UTILIZATION PROJECT



FINAL GUIDANCE DOCUMENT BASED ON JOINING FORCES PROJECT


January 29, 2010



Executive Summary

The Agency for Healthcare Research and Quality (AHRQ) sponsors the Healthcare Cost and Utilization Project (HCUP), which collects all-payer statewide administrative data through a Federal-State-Industry partnership with 40 state government organizations, hospital associations, and private data organizations. In June 2007, AHRQ issued two Requests for Proposals designed to support HCUP Partners in their efforts to enhance their existing administrative data by adding (or investigating what it would take to add) more detailed clinical information to their hospital discharge data.

AHRQ awarded contracts to three HCUP Partners (Florida Agency for Healthcare Administration (AHCA), Minnesota Hospital Association (MHA), and Virginia Health Information (VHI)) to conduct in-depth pilot projects to add or link hospital clinical information to administrative (discharge abstracts or claims) data. AHRQ also awarded one planning contract to Washington state, which was not yet ready to engage in a pilot but sought to enhance its administrative dataset with more clinical data.

Through the creation of a peer-to-peer learning network, AHRQ provided technical assistance and networking opportunities to the participating Partners through its contractors, Thomson Reuters and the National Academy for State Health Policy. From September 2007 through October 2009, the three pilot sites developed project informational materials, recruited hospitals to participate, identified data elements to add to administrative datasets, standardized lab data using LOINC, contracted with consultants to link data, developed processes for data transmission and merging, provided technical assistance to hospitals in their efforts to retrieve and report the data, and analyzed results.

The variations in approach, processes, and tools across the pilot sites, which represented a state agency, a hospital association, and a private data organization, demonstrate that each type of Partner can successfully undertake a project of this nature. Despite their differences, each pilot succeeded in recruiting hospitals to participate, collecting and incorporating POA and laboratory data within administrative datasets, and developing a dataset for analysis. The main lessons learned by the pilot sites fall into three areas:

Project initiation

Data standards and transmission

Communicating with hospitals and other stakeholders

Pilot sites remain committed to adding clinical data to their administrative datasets and analyzing the joined data to inform quality reporting and quality improvement efforts.

TABLE OF CONTENTS

Executive Summary
The Potential Value of Adding Clinical Data
Learning Network Methodology
Project Results and Lessons
Next Steps

The Potential Value of Adding Clinical Data

The Agency for Healthcare Research and Quality (AHRQ) is the Federal agency charged with improving the quality, safety, efficiency, and effectiveness of healthcare for all Americans. AHRQ sponsors the Healthcare Cost and Utilization Project (HCUP) which collects all-payer statewide administrative data through a Federal-State-Industry partnership with 40 state government organizations, hospital associations, and private data organizations (known as HCUP Partners).

HCUP's objectives1 are to:

Administrative (discharge abstracts or claims) data include information on diagnoses and procedures but lack more detailed, clinically important information that could be useful in efforts to improve healthcare quality. These datasets do not include physiological data for accurate measurement of illness severity (e.g., lab values, vital signs) and often do not include present on admission (POA) indicators for diagnoses. POA indicators distinguish between conditions that are present at admission and those arising during the hospital stay, such as hospital-acquired infections or a post-admission heart attack. POA information may help alleviate some hospital public reporting concerns because hospital scores can be adjusted for patients who were sicker at the time of admission without adjusting for worsening of patients' condition during the hospital stay that may have been the consequence of poor-quality care. POA also helps distinguish patient safety events that occurred in the hospital from those that occurred before the patient was hospitalized.
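
For illustration only, the minimal Python sketch below shows how a POA indicator attached to each coded diagnosis can be used to separate conditions present at admission from those acquired during the stay. The field names, diagnosis records, and POA values ("Y"/"N") follow common billing conventions but are assumptions, not the pilots' actual file layout.

    # Illustrative sketch only; field names and records are hypothetical.
    discharge_diagnoses = [
        {"dx_code": "I21.09", "poa": "Y"},   # acute MI documented at admission
        {"dx_code": "J95.851", "poa": "N"},  # ventilator-associated pneumonia during stay
        {"dx_code": "E11.9", "poa": "Y"},    # chronic diabetes, present on admission
    ]

    present_on_admission = [d for d in discharge_diagnoses if d["poa"] == "Y"]
    hospital_acquired = [d for d in discharge_diagnoses if d["poa"] == "N"]

    print("Present on admission:", [d["dx_code"] for d in present_on_admission])
    print("Acquired during stay:", [d["dx_code"] for d in hospital_acquired])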

There is a growing national consensus on the need to bolster the clinical component of administrative data with POA indicators and data elements for laboratory values and other clinical measures. For patients and purchasers to make sound healthcare choices, and for providers to improve the quality of the healthcare they deliver, data that are timely, relatively inexpensive, actionable, and credible are needed. More clinical detail is essential for public reporting on the quality and costs of care as well as for hospital efforts to improve quality and value. Adding clinical data to administrative datasets could improve quality measurement, providing hospitals with metrics to track and monitor performance improvement and consumers with more accurate and credible information for decision-making.

______________________

1 http://www.hcup-us.ahrq.gov/overview.jsp

Learning Network Methodology

Charge to Pilots

In June 2007, AHRQ issued two Requests for Proposals designed to support HCUP Partners in enhancing their existing administrative data by adding (or investigating what it would take to add) more detailed clinical data to their existing hospital discharge abstracts. AHRQ awarded contracts to three HCUP Partners (Florida Agency for Healthcare Administration (AHCA), Minnesota Hospital Association (MHA), and Virginia Health Information (VHI)) to conduct in-depth pilot projects to add or link hospital clinical information to administrative data. AHRQ also awarded one planning contract to Washington state, which was not yet ready to engage in a pilot but wanted to investigate the feasibility of such an effort.

The three primary objectives of this project were to:

AHRQ theorized that improving the clinical robustness of HCUP Partner data would enable the statewide data organizations to produce more accurate and expanded quality assessments of hospitals in their state. Adding POA indicators and clinical data would enable HCUP Partners to differentiate conditions present on admission from those acquired as a result of the hospital stay, assess the severity of the condition on admission through lab results, and more accurately adjust mortality rates, which would improve the value of these data to consumers, providers, and researchers.

Enhancing discharge records with clinical data would produce information that: has credibility with clinicians by accurately measuring severity of illness through POA and lab results at or shortly after admission; can accurately identify areas of low quality performance by providing key clinical information with which better risk adjustment can be obtained; is actionable by hospitals in identifying quality and safety issues and measuring improvements over time; and could improve transparency through eventual public reporting.

The pilots would demonstrate the technical feasibility of adding clinical data from existing and cost-effective electronic hospital data systems, avoiding the need for expensive data abstraction; expand the data capabilities of pilot state data organizations; develop a reproducible approach; and set the stage for integrating clinical and administrative data streams in the future.

Contracted Partners were charged with developing an implementation plan and testing methods for merging clinical data with the administrative data they collect from hospitals in their state. They were responsible for:

Learning Network Process

AHRQ created a learning network of the four participating Partners to enable them to learn from and collaborate with peers, understand and anticipate hurdles faced in implementation, and forge solutions to make the process easier for other states to follow. The California Office of Statewide Health Planning and Development was invited to participate as well, given its plans to enact regulations focused on this issue. Through the learning network, AHRQ provided technical assistance and networking opportunities to the pilot sites through its contractors, Thomson Reuters and the National Academy for State Health Policy. Pilot projects participated in monthly conference calls to share progress, get updates on relevant information from AHRQ, and discuss common issues. Pilot projects also participated in annual in-person meetings, posted information on a project wiki, and received expert advice on technical issues. During their regular meetings and calls, the participating Partners shared ideas and tools and incorporated lessons learned by their peers. For example, Partners reviewed each other's hospital surveys and marketing materials, such as one-page project summaries, and later adapted these materials for use in their own states.

Project Results and Lessons

This section provides results and lessons from the three Partner pilot sites. Specific project results and lessons are categorized according to whether they relate to project initiation, data standards/transmission, or communication.

From September 2007 through October 2009, the three pilot sites:

developed project informational materials
recruited hospitals to participate
identified data elements to add to administrative datasets
standardized lab data using LOINC
contracted with consultants to link data
developed processes for data transmission and merging
provided technical assistance to hospitals in their efforts to retrieve and report the data

Pilot sites are also analyzing results, but were not able to complete the analysis within the pilot timeframe.

Each project had its own challenges and achievements. These variations in part were due to differing types of Partner organizations, relationships with hospitals, and expertise available within the data organization. The Florida Agency for Healthcare Administration (AHCA), Minnesota Hospital Association (MHA), and Virginia Health Information (VHI) represented a state agency, a hospital association, and a private data organization, respectively. Each had similar but slightly different reasons for undertaking the project.

The variability in the Partner organizations demonstrates that each type of Partner can successfully undertake a project of this nature. Despite differences, each pilot succeeded in recruiting hospitals to participate, collecting and incorporating POA and laboratory data within administrative datasets, and developing a dataset for analysis. Table 1 provides information on the number of hospitals participating in each pilot site and the clinically enhanced data elements that were added to administrative datasets.

Table 1. Participating hospitals by pilot and clinically enhanced data elements

Pilot state | Number of participating hospitals | Clinically enhanced data elements
Florida     | 22 | 34 lab values
Minnesota   | 13 | POA; 26 selected numerical chemistry, blood gas, and hematology test results
Virginia    | 27 | POA; approximately 30 lab values; several linking variables

Project Initiation

Formalizing the project through contractual agreements: Each pilot faced some administrative delays in starting their projects, from standard delays related to getting business associate agreements signed (VHI) or amended (MHA) to unexpectedly long delays in obtaining budget approval and contracting with experts (AHCA). Pilots found that some hospitals have specific concerns about data security and may require more explicit contract language to address this issue. Pilots recommend anticipating and allowing time for negotiations with contractors and hospitals; they also suggest government agencies leave ample time for budget approval, especially during tight fiscal times.

Hospital recruitment: Pilot projects found that establishing and communicating to hospitals a solid business case or benefit for adding clinical data was important to recruiting them to participate. Pilots found it useful to address issues of hospital costs and competing resources; VHI pointed out to hospitals that no additional, costly data abstraction would be required. They also marketed the feedback reports that they planned to provide to hospitals as a way to demonstrate the value of participating. These reports would be designed to provide hospitals with useful information for quality improvement initiatives. Pilots reached out to personal contacts at hospitals to encourage their participation. Each of the pilot projects developed marketing and recruitment materials (e.g., presentations, promotional newsletters, and recruitment letters) to engage hospitals in the process, and learned that they needed to communicate with several contacts at each institution. Each pilot also hosted a kick-off meeting where experts gave presentations to hospital staff; these meetings were critical to explaining the project. Because the staff actually doing the work were often not included in initial meetings, additional communication about the project was subsequently required with staff who were new to it. HCUP Partners designed hospital surveys to assess the technical capabilities of participating hospitals, gauge hospital readiness to provide data, and guide the selection of initial data elements. Pilot sites noted the large up-front investment associated with recruiting hospitals: high turnover in hospital staff added to the cost burden throughout the project as Partners had to engage and orient additional contacts.

Technical expertise: Data collection organizations may not have the in-house expertise to analyze results and may need a contractor to help ensure that they fully understand the technical aspects of the task (such as Logical Observation Identifiers Names and Codes, or LOINC) before hospitals are approached, and to help analyze results. Field experts can also help make the case for data collection and increase project credibility. VHI involved a pathologist to design tools for mapping inpatient lab tests and results to LOINC standards in electronic reports. The consultant also helped to develop screens and edits for submitted laboratory values. An expert in operations research and health outcomes measurement helped design systems to measure the accuracy of submitted POA information and assess the improvements in outcomes measurement when laboratory and POA information were added. MHA's primary needs for consultants were in the areas of dataset linkages, data integrity, analytic reports, and severity-adjusted quality reports. Cardinal Health provided MHA with technical expertise, research presentations, and comparative data to support the project. 3M Health Information Systems (HIS) was under contract with AHCA to help hospitals in Florida translate laboratory tests to LOINC. 3M also analyzed the resulting dataset created by joining the data and provided an in-depth analysis on predicting hospital quality indicators from the combined data.

Dr. Michael Pine assisted both MHA and VHI with overall project design, technical consultation, presentations of research findings, data linkage development, quality indicator measure refinement, edit screens for assessing POA indicator quality, and comparative reports. He assisted in assessing the extent to which adding laboratory and POA information to administrative data would improve Partners' ability to measure health outcomes.

Data Standards and Transmission

Data Standards

All three pilot projects found that the use of data standards (LOINC, Health Level 7 (HL7)) was not widespread and that this project was cutting-edge in its use of LOINC standards for lab data, as described below.

LOINC: Use of a common set of data elements, along with common codes, definitions, and other attributes, facilitates aggregation of data and common analyses. LOINC has been identified as the common code set for laboratory test names, so the pilot sites decided it was worth pursuing in their projects. Although many hospitals were not using LOINC, the pilot Partners each developed translations from the codes used in each hospital's internal clinical information systems to these standard codes. By providing hospitals with LOINC maps and training, including examples and instructions for the lab elements of interest, the pilots found that many hospitals could map their internal codes to LOINC values without significant technical assistance. However, hospital maps did require review and some modification.
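
A minimal sketch of the kind of local-code-to-LOINC crosswalk hospitals were asked to build is shown below. The local test codes and the structure of the crosswalk are assumptions for illustration; the LOINC codes shown are published codes for common tests but should be verified against the current LOINC release before use.

    # Hypothetical crosswalk from a hospital's internal lab codes to LOINC.
    local_to_loinc = {
        "GLU": "2345-7",   # Glucose, serum/plasma
        "NA":  "2951-2",   # Sodium, serum/plasma
        "HGB": "718-7",    # Hemoglobin, blood
    }

    def map_result(local_code, value, units):
        """Translate one local lab result to a LOINC-coded record."""
        loinc = local_to_loinc.get(local_code)
        if loinc is None:
            # Unmapped tests are flagged for manual review, mirroring the
            # review-and-correction step the pilots applied to hospital maps.
            return {"status": "unmapped", "local_code": local_code}
        return {"status": "mapped", "loinc": loinc, "value": value, "units": units}

    print(map_result("GLU", 98, "mg/dL"))
    print(map_result("TROP", 0.02, "ng/mL"))  # not in the crosswalk -> flagged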

HL7: All three pilot projects found using the HL7 format standards for exchanging clinical and administrative data to be a major challenge. The file format is appropriate for its intended purpose of transmitting real-time transactions but is overly complex for a one-time, focused, retrospective study using batch data transfers. A fully compliant HL7 record includes many data elements not needed for this project, and correctly programming the complex data structures proved to be a challenge. Inexperience with HL7 among many hospitals posed another barrier. All projects maintained flexibility when transmission of the full HL7 record proved too complicated; each decided to use a shorter, flat-file transmission format, and two states (Virginia and Minnesota) decided to use an "HL7-like" format that retained HL7 data field definitions.
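
As an illustration of the flat-file approach (not the pilots' actual specification), the sketch below serializes one lab result per pipe-delimited line while keeping HL7 OBX-style field semantics; the field list and record layout are hypothetical.

    # Hypothetical "HL7-like" flat record: one pipe-delimited line per lab result,
    # keeping HL7 OBX-style fields (observation identifier, value, units,
    # reference range, observation date/time) without the full message envelope.
    FIELDS = ["record_id", "loinc_code", "test_name", "value", "units",
              "reference_range", "observation_datetime"]

    def to_flat(result):
        """Serialize one result dict to a pipe-delimited line."""
        return "|".join(str(result.get(f, "")) for f in FIELDS)

    def from_flat(line):
        """Parse a pipe-delimited line back into a dict."""
        return dict(zip(FIELDS, line.rstrip("\n").split("|")))

    example = {
        "record_id": "A000123", "loinc_code": "2345-7", "test_name": "Glucose",
        "value": "98", "units": "mg/dL", "reference_range": "70-99",
        "observation_datetime": "200903150830",
    }
    line = to_flat(example)
    print(line)
    print(from_flat(line)["loinc_code"])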

Data Transmission

Web portals: All pilot sites initially utilized secure web portals for data transmission. One unexpected difficulty at each pilot site was the size of the laboratory data files and insufficient capacity to handle them. Partners needed to enhance their data-handling capacity to support larger files and to offer Secure File Transfer Protocol (SFTP) and mailed physical media as alternatives for very large files. AHCA found that hospital firewalls (and even its contractor's firewall) made downloading the FTP software and/or connecting to the Partner's secure FTP site challenging; AHCA resolved the issue by working closely with hospital IT staff.
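
The sketch below illustrates one way a batch lab file might be pushed over SFTP, the alternative transport the pilots turned to for large files. The host name, account, and paths are placeholders, and the example assumes the third-party paramiko package; it is not the pilots' actual transfer process.

    # Sketch of a batch upload over SFTP. Requires "pip install paramiko".
    import paramiko

    HOST = "sftp.example-partner.org"   # hypothetical Partner SFTP endpoint
    USER = "hospital_upload"            # hypothetical account
    KEY_PATH = "/path/to/private_key"   # key-based authentication

    def upload(local_path, remote_path):
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.RejectPolicy())  # only known hosts
        client.connect(HOST, username=USER, key_filename=KEY_PATH)
        try:
            sftp = client.open_sftp()
            sftp.put(local_path, remote_path)   # transfer the batch lab file
            sftp.close()
        finally:
            client.close()

    # upload("labs_2009Q1.txt", "/incoming/labs_2009Q1.txt")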

Both Minnesota and Virginia considered adding vital sign data but found that it was often not recorded electronically by participating hospitals. Although minor issues occurred with formatting, coding conventions, and data extraction, each pilot project found enhancement of claims data with POA modifiers and a limited set of numerical laboratory results to be feasible and of great potential value.

Communicating with Hospitals and Other Stakeholders

Stakeholder involvement: The three pilot projects involved a variety of stakeholders in their projects, including hospital personnel, hospital associations, health information and health quality organizations, health insurers, physicians, and consumer communities. In hindsight, some pilots believe that establishing a steering committee at the outset would have helped lend credibility to the effort and facilitate communication among stakeholders in hospital departments and data collection organizations.

Consistent, ongoing communication with hospitals to support their participation: Each pilot site found it necessary to communicate regularly with the hospitals providing data through the pilot project. This need arose for a variety of reasons: hospital personnel (quality assurance managers, lab information staff, and IT personnel) often may not communicate internally, high staff turnover within hospitals left gaps in knowledge, and competing priorities among hospital staff led to delays in data transmission. Pilot sites communicated with hospitals in a variety of ways, such as newsletters, phone calls, email, in-person meetings, and conference calls. Florida even created a hospital learning network of its own to facilitate hospital-to-hospital problem-solving and peer-to-peer learning. Pilot sites used these forums to communicate project updates as well as to check in regularly with hospitals to help solve problems causing delays in data submission. In its planning project, Washington convened two free, one-day symposia for stakeholders and conducted a series of one-on-one phone interviews with selected key stakeholders. Two locations were selected for the symposia as a result of an initial meeting the state held with the Washington State Hospital Association to gather advice. Holding events in two regions helped ensure that hospitals in urban and rural areas were able to participate and that both perspectives were represented in the state's findings. During the symposia, stakeholders participated in facilitated discussions about the potential benefits, issues, uses, and barriers relating to collecting, sharing, and using a limited set of clinical data combined with existing hospital administrative data.

Communication with hospital staff at multiple levels and in multiple departments: Pilot sites noted the need to engage hospital CEOs as well as the staff responsible for implementing the project. "Language" differences with hospital IT staff were particularly challenging. Pilot sites had to be able to speak in simple, non-technical terms for non-IT staff as well as communicate information at a more technical level for IT staff. Minnesota found that meetings and conference calls with the participating hospitals and consultants helped keep hospitals engaged, and that having quality assurance staff engage the IT staff also helped. Providing hospitals with concise handouts that clearly explained the project's task and purpose also proved helpful; hospital staff could share the marketing materials internally as new employees became involved in the project or staff needed a refresher. For its planning project, Washington reached out to Chief Executive Officers, Chief Financial Officers, Quality Assurance/Quality Improvement Directors, and Medical Records Directors. This communication revealed the unique issues faced by small, marginally viable hospitals in rural areas that might need additional support for adding clinical data.

Technical assistance from HCUP Partners: Participating hospitals needed assistance with LOINC-coded lab results and POA coding, as well as with processes to ensure continued accurate results. Pilot Partners created resources for this purpose, including an educational video for possible CME credit, LOINC mapping tools, a verification process, and feedback after data submission. HCUP Partners may need a dedicated person to manage the process, including communications, contractual arrangements, meeting coordination, and data acquisition. As part of the planning process, one Partner recommended the development of a detailed communications plan that includes identification of key contacts, a timeline, communications, a kick-off event, ongoing meetings, and a web page for reference materials. Each pilot site developed a website or wiki for easy access to useful information and for sharing and cataloguing reference materials.

Feedback to participating hospitals: In addition to regular communication, pilot projects found that feedback to hospitals on data quality, analysis, and results was critical to demonstrate to participating hospitals that their efforts were worthwhile. HCUP Partners marketed the project as an opportunity for hospitals to obtain useful information for quality improvement initiatives and developed customized data quality and analytical reports. With the assistance of technical expert consultants, Partners also provided data edit screens to help hospitals assess their own data quality and identify errant POA coding. Project continuation will require that the projects prove their value to hospitals by showing how useful metrics can be established to gauge and monitor quality improvement.
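
A minimal sketch of the kind of edit checks such screens might apply is shown below; the plausibility ranges and POA code set are illustrative assumptions, not the pilots' actual edit specifications.

    # Illustrative edit checks: flag lab values outside broad plausibility
    # ranges and POA codes outside a valid code set. Values are examples only.
    PLAUSIBLE_RANGES = {           # LOINC code -> (low, high) in reporting units
        "2345-7": (10, 1500),      # glucose, mg/dL
        "718-7":  (2, 25),         # hemoglobin, g/dL
    }
    VALID_POA = {"Y", "N", "U", "W", "1"}   # commonly used POA reporting values

    def check_lab(loinc, value):
        low, high = PLAUSIBLE_RANGES.get(loinc, (None, None))
        if low is None:
            return "no edit rule"
        return "pass" if low <= value <= high else "flag: implausible value"

    def check_poa(poa):
        return "pass" if poa in VALID_POA else "flag: invalid POA code"

    print(check_lab("2345-7", 98))     # pass
    print(check_lab("718-7", 180))     # flagged as implausible
    print(check_poa("N"), check_poa("X"))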

Flexibility in responding to hospital needs: By maintaining regular communication with and providing consistent feedback to multiple staff at hospitals, pilot sites remained aware of challenges or delays experienced by hospitals and could adjust timelines or processes to accommodate hospital needs or constraints. Pilot sites learned that small or critical access hospitals had different needs and challenges (particularly related to IT) compared to larger hospitals. Upon learning that some hospitals were unable to send data in the requested format, one Partner decided to reformat the data itself. Partners also revised their lists of data elements based on what hospitals regularly collected. Additionally, linking was sometimes done by the hospital and sometimes by the data organization when a hospital was unable to do it (a linkage of this kind is sketched below). By remaining flexible in responding to hospital needs, Partners believe they conveyed their commitment to minimizing the use of hospital staff time and resources while maximizing the value of the results to hospitals.
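
As a hedged sketch of that linkage step, the example below joins submitted lab results to discharge abstracts on hypothetical linking variables (medical record number and admission date) using pandas; the actual linking variables and rules varied by pilot and hospital.

    # Sketch of the linkage step; requires "pip install pandas".
    # Linking fields and data are hypothetical.
    import pandas as pd

    discharges = pd.DataFrame({
        "mrn": ["001", "002"],
        "admit_date": ["2009-03-15", "2009-03-16"],
        "drg": ["291", "470"],
    })
    labs = pd.DataFrame({
        "mrn": ["001", "001", "002"],
        "admit_date": ["2009-03-15", "2009-03-15", "2009-03-16"],
        "loinc": ["2345-7", "718-7", "2951-2"],
        "value": [98, 13.2, 141],
    })

    # Left join keeps every discharge record even when no labs were submitted.
    linked = discharges.merge(labs, on=["mrn", "admit_date"], how="left")
    print(linked)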

Next Steps

Given the positive experiences with these contracts and the many lessons learned from them, participating Partners have a variety of next steps, as well as recommendations for future assistance that could be provided by AHRQ.

Next Steps for Partners

Pilot site Partners would like and/or plan to continue collecting and analyzing these data. Most of the project costs are front-loaded, so the costs associated with sustaining data collection are small. At the same time, some Partners believe additional staff time will need to be dedicated to the activity, particularly data analysis, and future funding and resources are unknown. Pilot projects have not yet fully developed or captured the analytic value of adding clinical data, which may pose a challenge for sustaining hospital interest and participation. However, Partners remain confident that when the analyses are completed, the value of collecting these data will be clear.

The Partners’ pilot projects complemented various state initiatives. MHA plans to continue and expand its pilot project with a position funded through a project to collect and report hospital performance data required as part of state health reform legislation in 2008.

VHI currently publishes the AHRQ Quality Indicators and expects to update this public reporting with POA-enhanced administrative data when POA reporting is complete. VHI also intends to improve the risk-adjustment methodology of its Cardiac Care report by using laboratory data. In addition, VHI intends to use the results of this project to demonstrate the importance of laboratory reporting to all hospitals in Virginia in order to expand data collection efforts. Finally, VHI will use the results to continue to engage a wide variety of stakeholders in data-driven transparency initiatives.

AHCA is moving forward with data collection and will be creating reports for hospital use. Lessons learned from AHCA's pilot project may help inform the state's creation of a strategic plan for health information technology (HIT).

Pilots Suggested Next Steps for AHRQ

In their final reports, pilot projects noted a variety of types of assistance AHRQ can provide to sustain data collection and analysis and help new states enhance administrative datasets by adding clinical data. These suggestions include:

Pilot sites remain committed to adding clinical data to their administrative datasets and analyzing the joined data to inform quality reporting and quality improvement efforts. They value the contract experience and look forward to continued assistance from and in partnership with AHRQ.


Internet Citation: Final Guidance Document Based on Joining Forces Project. Healthcare Cost and Utilization Project (HCUP). July 2011. Agency for Healthcare Research and Quality, Rockville, MD. www.hcup-us.ahrq.gov/datainnovations/clinicaldata/Project_Summary.jsp.