
ADDING CLINICAL DATA TO STATEWIDE ADMINISTRATIVE DATA


FINAL REPORT
VIRGINIA HEALTH INFORMATION
September 30th, 2009



Executive Summary

  1. Team members and consultants

    The team members and consultants were:
    Michael Lundberg (Virginia Health Information, Executive Director),
    Chris Delcher (Virginia Health Information, Analyst),
    Deborah Waite (Virginia Health Information, Operations Manager),
    Ramesh Shukla, PhD (Virginia Commonwealth University, subcontractor),
    Michael Pine PhD (Michael Pine and Associates, Inc. subcontractor),
    Sallie Cook MD (Virginia Health Quality Center, clinical pathologist, subcontractor)

    Virginia Health Information (VHI) also worked closely with state pilot partners and the Agency for Healthcare Research and Quality (AHRQ) consultants over the course of the project.


  2. Project summary

    VHI successfully recruited 27 Virginia hospitals into the pilot program. This resulted in the collection of nearly 400,000 present-on-admission (POA) enhanced patient records with accompanying lab values, representing almost half of all Virginia discharges. During the project, VHI provided data quality feedback via custom reporting showing the distribution of POA values, provided customized reports that identified potentially hospital-acquired conditions using the POA indicators, and provided AHRQ Quality Indicator reports showing the impact on rates before and after use of the POA indicator. During the course of the project, POA reporting was mandated in Virginia. Towards the end of the project, VHI began engaging the health insurance, physician, and consumer communities on the use of this information for quality initiatives. Feedback from these stakeholders is expected in November 2009.


  3. Number of hospitals participating in the pilot

    There were 27 participating hospitals. Even those hospitals that could not participate because of technical issues expressed interest in the project.


  4. Materials/tools developed for hospitals

    We developed a host of materials including business associate agreements, presentations, promotional newsletters, LOINC mapping templates and record layouts for hospital use, a project website, and customized data quality and analytical reports.


  5. Data elements collected

    In addition to the POA indicators on all discharges, we collected approximately 30 laboratory values (see page 5-8, Attachment 1). The lab values were based on recommendations from Michael Pine and Associates, Inc. because of their experience with risk-adjusting cardiac care outcomes. We also collected several "linking" variables to help with joining back to administrative data sets.


  6. How data elements are standardized

    The POA indicators were standardized by the Centers for Medicare and Medicaid Services (CMS). For the lab values, we used the Logical Observation Identifiers Names and Codes (LOINC) standard (http://loinc.org). Additional data elements were standardized according to VHI's long-standing data submission layout.


  7. Data transmission methods

    VHI established a secure FTP site. Occasionally, hospitals sent lab records via password-protected CD because file sizes exceeded the limits of the web transfer.


  8. Stakeholders engaged in the process

    There were several stakeholder groups engaged in the process including the Board of Virginia Health Information, the Virginia Hospital and Healthcare Association, the Virginia Health Quality Center, and 30 participating hospitals. Towards the end of the project, VHI recruited new stakeholders from the health insurance, physician, and consumer communities to examine the utility of project results. These results are scheduled for presentation in November 2009.


  9. Major challenges and their resolutions

    Initially, VHI expected the HL7 format to be a major challenge. Indeed, after examining the technical details, VHI decided to move forward with a different transmission format. VHI faced few obstacles, beyond standard administrative delays, in getting business associate agreements signed. As the data began to flow, there were challenges when file formats were not followed to the letter. While troublesome, these problems were readily solved. Additionally, one of the major challenges that VHI faced was in designing custom reports that would be useful to hospitals. VHI typically followed a process of internal development with our subcontractors followed by release to hospitals. Hospital feedback was limited, so it was a challenge to ascertain whether these reports were useful to them. VHI is planning to hold a final project meeting to request more explicit feedback on these materials.


  10. Project websites

    VHI developed http://www.vhi.org/hybriddata.asp for this project. VHI also actively participated in the project wiki. VHI has also included a Master Timeline in Attachment 8.
  11. Project Overview

  12. State for which you are reporting

    Virginia


  13. Description of the purpose and intention of your project. How does it relate to your organization's current activities and how might the data be used? How has it evolved since the beginning of the project?

    Virginia Health Information (VHI) sought this contract to add present on admission and clinical data to Virginia's established administrative data system. In the "Technical Proposal", VHI proposed three major objectives, all of which were met without major deviation, for the project.


    1. "In order to improve the health data VHI collects, VHI will collaborate with health care stakeholders to establish the feasibility of and link clinical data and a present on admission indicator for diagnoses to VHI's administrative statewide inpatient hospital database. VHI has used AHRQ Quality Indicators software for quality improvement and public health surveillance. Outcomes do vary by geographic region (see map) and by hospital."

      2007 Selected Infections Due to Medical Care (Area-wide PSI 23). (Updated for the final report)

      Map of Virginia that indicates discharge rate (unadjusted) per 100,000 population at-risk for each county. The discharge rate ranges indicated in the map are: 0 to 10; 11 to 20; 21 to 30; 31 to 40; 41 to 120.

      "Improving VHI's ability to discern infections or better adjust mortality rates by adding POA indicators or clinical data will improve the value of these data to consumers, providers and researchers. Through previous work with Virginia trauma registry data, outpatient surgery data, EMS and vital records information, VHI has utilized existing patient identifiers needed to link additional information to the administrative files. In addition, VHI has extensive experience with the technical problems that can arise in data linkage and approaches and techniques needed to overcome those challenges to acquire and produce enhanced files for analysis and reporting. VHI's relations with hospitals and their association is a positive one where collaboration and open communication are the norm. Through a process of inclusion and problem solving, VHI will develop information and data sets that could, if supported legislatively by stakeholders, lead to broader statewide collection and be customized for a variety of purposes including public reporting of quality information on mortality, readmissions and other outcomes to help foster consumer choice and quality improvement. The enhanced data would also be expected to benefit researchers and public health scientists already using VHI administrative data and others in the future."

      Through the course of the project, VHI incorporated the POA indicator into the AHRQ quality indicator software and provided before-after POA scenarios for participating hospitals.


    2. "VHI recognizes there is variation in state data systems, present on admission data collection and hospital clinical data systems such as laboratory, vital signs and other key clinical values. VHI expects there could be a need to address concerns about data confidentiality, data extraction issues, changing lab formats and incompatibility of data formats. Hospitals may change key personnel involved in the exchange and be re-recruited. Thankfully, there are also many common approaches to data collection and integration. VHI intends to develop a reproducible approach for linkage and use of data for quality improvement and reporting by identifying how hospital data streams vary (by system and vendor) and developing a plan to uniformly integrate these data into a statewide data set. By documenting and sharing these challenges and solutions through a series of information sharing and dissemination activities, VHI will help foster these abilities in other states."

      In the early stages of the project, VHI developed a technical survey (see Attachment 2) to assess the technical capabilities of participating hospitals. The results of this survey played a significant role in the design of the data elements. Data submission was never quite uniform despite the fairly strict technical specifications so there was a significant time investment in formatting and compatibility.


    3. "Meeting contract objectives is the start point and not the end point of this contract. VHI seeks this contract to set the stage for ongoing integration and use of these additional clinical data in the future. This goal will be met through a process of collaboration, demonstration of proof of concept which may lead to legislative action with the support of its stakeholders." VHI did, in fact, successfully implement legislation to mandate the collection of POA beginning July 1, 2009. We believe that this project helped demonstrate the value and relative ease of POA reporting for hospitals.

    Virginia's team members and consultants were:

    Michael Lundberg (VHI, Executive Director),
    Chris Delcher (VHI, Analyst),
    Deborah Waite (VHI, Operations Manager),
    Ramesh Shukla, PhD (Virginia Commonwealth University, subcontractor),
    Michael Pine PhD (Michael Pine and Associates, Inc. subcontractor),
    Sallie Cook MD (Virginia Health Quality Center, clinical pathologist, subcontractor).

  14. Key stakeholders and their roles

    During the start-up phase, the Virginia Hospital and Healthcare Association (www.vhha.com), which represents 44 member health systems and hospitals comprising 106 community, psychiatric, rehabilitation, and specialty hospitals throughout Virginia, played an important role in assisting with hospital recruitment.


  15. Consultants and their major contributions. Include your reasons for adding consultants to your team.

    VHI has many years of experience in collecting, analyzing and publishing health care information. As a small staff of 8, VHI benefits greatly from those with expertise outside of VHI's core capabilities. For this contract, VHI engaged the help of both local and national experts. Dr. Sallie Cook, a pathologist, was very helpful in designing tools for the proper mapping of in-hospital lab tests to a standard called LOINC. She was also instrumental in helping to develop screens and edits for submitted laboratory values. Dr. Ramesh Shukla, an expert in operations research and health outcomes measurement, was involved in helping design systems to measure the accuracy of submitted POA information and in assessing the improvements in outcomes measurement when laboratory and POA information were added. Dr. Michael Pine is experienced in the analysis of health data for outcomes measurement. His experience with lab and POA data was valuable in assessing the extent to which adding laboratory and POA information to administrative data will improve our ability to measure health outcomes.

    Project Planning

    What issues did you consider and what materials did you prepare prior to contacting hospitals for participation?

    VHI requested that VHHA send the following announcement to member hospitals via its electronic newsletter.

    "VHI Awarded Contract to Link Clinical Data with Hospital Discharge Data" Virginia Health Information (VHI) has been awarded a two-year contract to work with hospitals to develop a method to improve Virginia’s patient level data system’s ability to predict and evaluate mortality and other outcomes of care. The contract effective date is September 30, 2007, and is provided by the Federal Agency for Health Care Research and Quality (AHRQ.) The contract has a total possible award amount of $327,704.VHI will work with participants to identify the data elements to be collected and develop enhanced quality reports for hospital internal use. VHHA and VHI view this effort as a potential method to enhance the patient level data system for measurement and improvement of the quality of care. Hospitals participating in the pilot can influence the development process, including information collected, and benefit from enhanced quality information for internal use and an early opportunity to use this information to improve care. For more information on participating in this effort, contact Michael Lundberg,Executive Director of VHI, at michael@vhi.org or (804) 644-7026.

    Initial Recruitment Announcement sent to Hospitals via the VHHA electronic newsletter.

    Some of the initial issues with recruitment were whether or not the CEOs should be contacted directly and in what format (because letters tend to "get lost on desks"), the development of a follow-up schedule with strategies to handle non-responsive hospitals, and whom to contact first inside of multi-hospital systems.

    VHI developed several key materials to help ensure hospital participation:

    The full recruitment package (see Attachment 3) contained three elements:

    1) A cover letter from VHI's Executive Director explaining the importance of the project

    2) A graphic overview of the project (see below) and

    3) A clear description of the roles and responsibilities of all parties involved. This letter was sent to three contacts: the hospital administrator, the quality assurance coordinator, and the patient level data contact.

    It is worth describing a couple of key points about these recruitment materials. On the graphic itself, VHI consulted with its internal nursing staff and we determined that the most compelling feature of the project was that "No additional data abstraction [would be] required." Given the work burden of the quality contacts, our nurses thought that knowing this up-front would increase the likelihood of participation.

    diagram

    Virginia Health Information Adding Clinical Data to Administrative Data An Overview

    Hospitals provide VHI with Present on Admission (POA) Indicators and laboratory tests; VHI provides hospitals with Quality Indicators

    First step is determining current hospital capabilities:
    Hospital Responsibilities:
    January 2008
    Survey of current capabilities:
    • POA collection
    • Labs
    • Vitals
    Followed by initial introduction/kickoff meeting:
    March 2008
    Kickoff
    • Clinical discussion:
      • Researchers and hospitals
    • Technical discussion:
      • Health IT professionals
    Finally providing training, provider education, and technical assistance:
    VHI (Patient-level data) Responsibilities:
    Education
    • POA training (hospital/physician)
    • Custom quality reports for internal hospital use
    • Comparative reports on POA use among hospitals
    Technical assistance
    • POA collection
    • Labs
    • Vitals (feasibility)
    No additional data abstraction required
    Partners: VHI logo, AHRQ logo, Virginia Health Quality Center logo, MPA logo. end of diagram

    When the opportunity arose, VHI's Executive Director hand-delivered these recruitment materials.

    A couple of weeks after the packages were mailed, VHI began to follow up with data quality contacts by phone. Usually after that discussion, an email was sent with additional follow-up items (see Attachment 4). This follow-up email contained two key components: a link to the "science behind the project" and a link to an electronic newsletter description of the AHRQ project. A screenshot of the print version of that story is below:

    diagram

    Newsletter with a description of the AHRQ project from January 2008. Description reads:

    VHI Wins Federal Contract to Enhance Data

    VHI is the premier source of reporting on health care quality in Virginia and the federal Agency for Healthcare Research and Quality seems to agree. As one of only three states receiving this contract, VHI will work to improve the usefulness of hospital discharge data, also known as administrative data, for measuring quality of health care in Virginia.

    We know that while people may be treated equally when they walk through the hospital door, their hospital experience can vary widely based on certain risk factors. For example, common sense tells us that surgical risks are inherently higher for an 85-year-old than they are for a 35-year-old. VHI has always attempted to account for these somewhat obvious differences (think age, multiple diseases or illnesses) through a process known as risk-adjustment.

    Over the years, with the assistance of our university partners at UVA and VCU, we have developed sophisticated statistical models for risk adjusting. Just think of it as leveling the playing field. Having been in this business for 15 years, we also know that the field can be made more level with enhanced data. That's exactly why we applied for the contract.

    The idea is simple. Instead of using costly approaches such as medical chart review to discover the clinical differences between people, VHI will be able to mimic this manual process by adding clinical data to administrative data. The goal is to improve our statistical adjustments to include these not-so-obvious differences that people have as they walk through the hospital door. In the end, VHI wants to increase the value of these data to consumers, providers, and researchers alike.

    112 — The risk-adjusted rate (per 100,000 population) of discharges for adult asthma in Virginia, 2006

    1,532 — The number of invasive cardiology procedures performed by CJW Medical Center, 2006

    end of diagram

    One of the purposes of including a link to this newsletter was to encourage participation by giving the hospitals the sense that many people around the state would read about the project and to give them refined materials to share with internal key players and link to on their own websites if desired.

    After the kickoff meeting, VHI developed two additional web-based materials that the hospitals could use to "market" and follow the project. The first was another electronic newsletter which included a story called "VHI Hosts AHRQ Kickoff Meeting" with a map of all participating hospitals. (This graphic template was used periodically throughout the project, approximately 10 times, to provide project updates in a format that would be visually appealing.) A larger version of the graphic shows a list of participating hospitals, their locations in Virginia, color-coded by the status of their contract, and a pie-chart showing the percentage of hospital discharges also color coded by contract status.

    diagram

    "VHI Hosts AHRQ Kickoff Meeting" (Newsletter)

    March 7, 2008

    In March 2008, VHI hosted a kickoff meeting for the Agency for Healthcare Research and Quality pilot project called "Adding Clinical Data to Administrative Data." The meeting gave quality, information technology and coding staff from over 30 participating hospitals the opportunity to learn more about project objectives, timelines and technicalities. VHI brought together experts in the field of present-on-admission (POA) coding, clinical pathology and statistical modeling for quality reporting to present on a range of topics from POA coding scenarios to anticipated quality reporting that VHI will provide for participants. Virginia was one of three states in the country to be awarded this cutting edge contract. Please visit our project page at www.vhi.org/hybriddata.

    Map showing the locations of hospitals involved in the "Adding Clinical Data to Administrative Data" project.

    end of diagram

    diagram

    "Adding Clinical Data to Administrative Data" Status Map

    Map of Virginia with participating hospitals indicated.

    31 hospitals/systems involved (as of June 1, 2008)!

    Confirmed Hospitals (31 percent of total hospital discharge volume by contract status (2006 discharges))
    VCU Health System
    Carilion Medical Center
    University of Virginia Medical Center
    Prince William Hospital
    Fauquier Hospital
    Centra Health
    Retreat Hospital
    CJW Medical Center
    Henrico Doctors' Hospital
    Mary Washington
    Reston Hospital Center
    Lewis-Gale Medical Center
    Pulaski Community Hospital
    Montgomery Regional Hospital
    Alleghany Regional Hospital

    Contract Sent Hospitals (15 percent of total hospital discharge volume by contract status (2006 discharges))
    Bon Secours Richmond Community Hospital
    Rappahannock General Hospital
    John Randolph Hospital
    Clinch Valley Medical Center
    Shore Memorial Hospital
    Southampton Memorial Hospital
    Northern Virginia Community Hospital
    Culpeper Regional Hospital
    Bon Secours St. Mary's Hospital
    Riverside Regional Medical Center
    Sentara Norfolk General Hospital
    Riverside Tappahannock Hospital
    Carilion Franklin Memorial Hospital

    Interested Hospitals (4 percent of total hospital discharge volume by contract status (2006 discharges))
    Warren Memorial Hospital
    Winchester Medical Center
    Shenandoah Memorial Hospital

    Hospitals with a contract status of "Other" accounted for 50 percent of total hospital discharge volume by contract status (2006 discharges).

    end of diagram

    The second was the development of a Hybrid data project home page. A screen shot of the home page, as of July 2009, is given below. Quarterly reporting was added later in the project but the initial page contained three important elements.

    1) VHI-AHRQ Kickoff Meeting Materials. This provided hospitals with access to all presentations given during the kickoff meeting. This section did not change throughout the project.

    2) Additional Materials. This section was updated as needed. For example, the POA fact sheet, which came from CMS, changed during the project so this link was updated. We also provided a list of all project coordinators that could be used as a resource by participants.

    3) Frequently Asked Questions. This section also evolved through time. Each question and the response can be viewed at www.vhi.org/hybriddata.asp.

    diagram

    Screenshot of the Hybrid data project home page as of July 2009.

    end of diagram

    VHI developed a comprehensive business associate agreement and modified it to meet hospitals' legal requirements as necessary.


  16. List the data elements initially chosen to add to your administrative data set and discuss why these elements were chosen.

    Data elements initially chosen were those provided by Michael Pine and Associates, Inc. (MPA). MPA also provided the following response via email in January 2008:

    "We [Virginia] will begin with the MPA list but will expand it in collaboration with pilot sites. We will substitute hemoglobin for hematocrit based on work done recently in California evaluating the clinical utility and credibility of the two almost interchangeable measures. We will consult with Cardinal Health about their experience with Atlas laboratory variables not included on the MPA list and will consider some new variables such as BNP that appear to be breaking through as potentially important risk factors. All potential variables will be ranked on two scales: one for potential utility and a second for ease of collection. These rankings will be used to guide final selection of laboratory data elements."

    Data elements were removed from consideration after the results of the survey. The final list of laboratory data elements is found in Attachment 2.


  17. List any outside sources you referenced when determining which data elements to collect, such as expert input, data standards, articles, research, or other material.

    Several outside consultants were very helpful during the course of this project, including Ed Hammond and Linda Hyde.


  18. Project Initiation

  19. Describe the process used to involve hospitals with the project:

    1. Did you contact the hospital organization? What was the purpose of the contact?

      Often, there were, at least, two levels of contact for each hospital. VHI's analyst contacted the quality assurance coordinators by phone and then via email while VHI's Executive Director contacted other key personnel either by phone or in person. The purpose of these contacts was to "put a face" to the project and give hospital personnel an opportunity to ask preliminary questions. VHI also hosted two technical conference calls. See Attachment 5 for the agenda.


    2. Were all hospitals contacted or only selected hospitals? What criteria were used to select hospitals for participation if only selected hospitals were included?

      First, VHI gauged the potential interest among hospitals by relying on the initial responses to the VHHA email above. Once we determined that the interest was there, we sent a mass email to all hospital contacts with the full recruitment package.


    3. Who within the hospital did you contact (e.g., administration, IT, coders)?

      For all hospitals, the quality assurance coordinator was contacted.

      For some hospitals, higher-level personnel were contacted. During recruitment, no IT or coding personnel were contacted. However, these people were invited to the kick-off meeting and sometimes became the primary contacts as the project evolved.


    4. Describe any products or materials developed as part of this pilot/planning process (presentations, reports, brochures, fact sheets, etc). Include products in an appendix if they can be shared.

      VHI hosted a Kickoff meeting modeled after Minnesota's equivalent at the Virginia Hospital and Healthcare Association in Richmond, Virginia. VHI planned for 45 participants (including staff and subcontractors) and had an actual turnout of 32 people. For Kickoff meeting materials, please see www.vhi.org/hybriddata.asp.


    5. What incentive did the hospitals have to participate?

      In the recruitment package and kickoff meeting, VHI outlined six primary benefits of participation:

      1) Hospitals will have the opportunity to evaluate and improve data quality.
      2) Hospitals will have the benefit of comparative performance as a guide to quality improvement.
      3) Hospitals will help design a program with the least administrative burden and greatest value.
      4) Hospitals will receive quality reports using the AHRQ Quality and Patient Safety Indicators.
      5) Hospitals will receive enhanced cardiac care mortality and readmission information.
      6) Hospitals will receive comparative information on use of POA values and other reports that hospitals suggest throughout the project.


    6. Did you offer to provide any information to hospitals in return for their participation?

      Yes. Hospitals knew that they would receive feedback and reports, listed in Items 4-6, from data submitted.


    7. What other issues did you encounter in establishing hospital buy-in to this project?

      There were significant delays in getting a signed contract from one hospital system because they felt that it was necessary to put the contract through Institutional Review Board (IRB) review. The internal approval process took approximately one month.


  20. How did you assess a hospital's readiness to participate? If you used a survey, please include the survey questions and results in an appendix and summarize the findings here. Did the results cause you to alter your approach in any way?

    Survey Summary

    There were 11 respondents to the survey, representing single and multiple hospitals. 9 of 11 (81%) indicated that they could provide a "supplementary electronic file containing POA values and the 'linking' variables." The dominant file types for POA submission were text (6) and EXCEL (5). 8 of 10 (80%) respondents said that they could provide a "supplementary electronic file containing lab values and the linking variables." We also wanted to know if hospitals had the capability of submitting pre-admission labs by linking pre-admission lab results to the subsequent admission. 9 of 10 (90%) respondents indicated that they could link the results. There were a variety of primary lab vendors listed; "in-house" was listed by 3 respondents. 9 of 10 (90%) respondents indicated that most of the lab values that we requested were available. However, availability of ProBNP, Troponin T, and Neutrophil values was more limited. We also found that 7 of 9 (77%) respondents did not have vital signs available electronically. 5 of 8 (63%) respondents said that they would not be able to map data elements to LOINC, and an equal number said that they would not be able to provide a data dictionary to VHI for mapping. For those that could do the mapping, we asked a follow-up question to estimate the number of hours needed to do the mapping. The answers were "unknown but significant", "Uncertain", and "4."

    Altering Approach

    VHI made several significant adjustments based on the survey. First, we decided not to collect vital signs because most hospitals did not have them electronically. MPA's research indicated that vital signs did not add significantly to the power of the models so we were comfortable with this decision. Second, we decided to provide a LOINC mapping template to make the process of assigning LOINC values as easy as possible for those hospitals that indicated that they would have problems with the mapping. The template is included in Appendix X. We spent a significant amount of time in developing this template which is described in [section].


  21. Describe any administrative hurdles encountered in moving forward with this project.

    There were no major administrative barriers to project implementation.

  22. What changes were made during initiation that were not anticipated during the planning phase?

    The major changes included the decision to use a data transmission format other than HL-7 and the exclusion of vital signs.


  23. Project Implementation

  24. How many hospitals provided data?

    Out of 28 hospitals, 28 (100%) provided at least 3 quarters of POA data. 15 of 28 (54%) provided at least 3 quarters of laboratory data.

    1. Provide a general description of the types of hospitals that were recruited (such as number of community hospitals, children's hospitals, specialty hospitals, number of beds, urban/rural, teaching/nonteaching, other key descriptors)

      AHRQ Project Status # Hospitals Mean Beds Mean Profit ($) % Not-for-profit % w/ Teaching status % Urban
      Final Pilot Participant 27 327 $24,566,692 66% 37% 88%
      Contract Sent but Declined 9 151 $11,197,654 77% 0% 22%
      No Contract Sent 48 158 $9,456,088 81% 17% 47%


    2. Discuss problems participating hospitals encountered in complying with data requests (e.g., allocation of staff and technological resources, other commitments during certain times of year, other issues).

      For hospitals that committed to the project, the technical difficulties were minimal. As a means of encouraging them to stay engaged in the project, we reminded them that understanding the technical barriers to project implementation was an objective and that we would like to understand the issues. Although some hospital staff turned over, the transitions were relatively smooth and sometimes required us to resend the data layout package.


    3. How were these problems resolved?

      As we initiated the project, we tried to minimize some of the technical problems by allowing hospitals to submit test files of about 100 records. The following is a typical exchange from VHI to the hospital after the submission of the test file:

      "Took a look at the file this morning. Overall, looks like everything is there but I have just a few questions.

      Is the date in the record the admit or discharge date? One or the other looks to be missing.

      Is the value that starts with the 8 leading zeros the PCN or MRN? One or the other looks to be missing.

      I don't see where the LOINC code has been used but I do see the test name. Will you be providing the LOINC map as well?

      What does the 0 in the MPN field mean?

      Is the last time in the record the observation date/time or the analysis date/time?"


      Although the field format was fixed length, submissions sometimes varied depending on the file format. For example, the hospital identifier, a six-digit number with no spaces, was sometimes sent with leading and trailing zeros. We made the decision to accept these types of variations if they could easily be corrected on our end. For one hospital system with four participating hospitals, there were significant delays in receiving their POA files because they could not transpose the data so that each row was a patient record. Because we were eager to have them participate, we accepted the submission and transposed it on our end. This hospital system was never able to provide lab data due to a system conversion. We also requested that the ECODEs be extracted from the diagnosis codes and placed into separate columns at the end of the file. This proved to be problematic for many hospitals because they had to write special extraction routines. We finally just accepted the files without the specific ECODE columns.
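      A minimal sketch of this kind of clean-up is shown below; the identifier handling and the one-result-per-row layout are illustrative assumptions, not VHI's actual specification or code.

```python
# Sketch of two clean-up steps described above (hypothetical field names and layouts):
# 1) normalize a six-digit hospital identifier submitted with extra leading/trailing zeros,
# 2) "transpose" a file with one result per row into one record per patient.
from collections import defaultdict

def normalize_hospital_id(raw: str, width: int = 6) -> str:
    """Drop whitespace and surplus zero padding, then re-pad to the expected width.
    (The real corrections varied by hospital; this handles only the padding case.)"""
    digits = raw.strip().lstrip("0")
    if len(digits) > width and digits.endswith("0"):
        digits = digits[:width]  # drop surplus trailing zeros
    return digits.zfill(width)

def one_record_per_patient(rows):
    """Collapse (patient_control_number, field_name, value) rows into one record per patient."""
    patients = defaultdict(dict)
    for pcn, field, value in rows:
        patients[pcn][field] = value
    return dict(patients)

rows = [
    ("12345", "Sodium", "138"),
    ("12345", "Hemoglobin", "13.2"),
    ("67890", "Sodium", "141"),
]
print(normalize_hospital_id("0049002300"))  # -> 490023
print(one_record_per_patient(rows))         # -> one dictionary per patient
```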


    4. Describe key hospital characteristics that led to successful hospital participation.

      On average, larger, urban hospitals with larger profit margins participated in the pilot. However, obtaining data was easiest from smaller hospitals where the data quality contact was directly connected to IT support or through hospital systems that were able to export a single file for multiple hospitals. Data submission was more problematic for large, teaching hospitals not affiliated with systems.


    5. Did you assess fiscal impact to hospitals for participation? If so, what resources did hospitals need to participate?

      As of July 31, 2009, we have not assessed the fiscal impact. However, it is our intention to do so using the template survey provided by Florida.


  25. Describe issues encountered in standardizing data elements.

    1. Did you use HL-7 and/or LOINC?

      i. If not, why not? What other coding method was selected and why was it chosen?

      HL7

      VHI did not use a fully compliant HL7 message for this project. Some hospitals did indicate the capability to export in an HL-7 format and that it would be fairly straightforward. For example, one hospital wrote:

      "We already have a lab interface [name of interface] in HL7 format. Is it ok to send you our lab data in this format? If so, we'll just need to filter out the tests you're not requesting and can have a test file ready by July."

      However, during the kick-off meeting, many hospitals indicated that they did not have HL-7 capability. Although the idea of using the pilot to "force" them to provide information in a fully compliant HL-7 format as a means of "getting them used to it" was considered, VHI decided that this would be inappropriate for this purpose for four reasons:

      1) the project was voluntary and VHI did not want to risk excluding hospitals that could not provide HL-7 easily;
      2) the universe of data elements needed for a fully compliant HL-7 message was much larger than needed for this project;
      3) this project was not a real-time transaction of data, which is one of the primary reasons for using HL-7; and
      4) VHI was also inexperienced with the format.

      Ed Hammond was very helpful in explaining the HL-7 format to VHI (see Attachment 6). VHI worked through several examples of how to construct a message for this project before deciding on an HL7-like file format.

      ii. If so, describe any challenges and how they were resolved.

      Once VHI discovered that LOINC mapping would be very challenging for some hospitals, it was decided to develop a simplified map, with examples and instructions, for just those lab elements of interest. The following table was provided as an example for instructing hospitals on how to complete the LOINC worksheet. VHI did not request that the maps be returned for verification. According to Michael Pine and Associates, Inc., the quality of the lab data suggests that Virginia hospitals were successful in mapping LOINC values without significant technical assistance.

      Example LOINC Code Worksheet

      Columns provided by VHI: # | Test Name | Spec. Type | Unit | LOINC Code | LOINC Name | Comment
      Columns TO BE COMPLETED BY HOSPITAL: Spec. Type | Unit | Normal Range | Method | Comment

      25A | O2 Sat. Arterial | Arterial Blood | % | 2708-6 | O2 % BldA | FIO2 if available | Arterial Blood | % | 94-100 | Pulse oximetry | ABG11 FIO2 not available
      28 | Sodium | Serum / Plasma | mEq/l = mmol/l | 2951-2 | Sodium SerPl-sCnc | | Serum | mEq/l | 135-145 | Ion exchange-gravimetric | Na1
         |        |                |                |        |                   | | Serum | mEq/l | 135-145 | Flame photometry | Na2
      32B | White Blood Count | Whole Blood | 10^9 cells/ul | 6690-2 | WBC # Bld Auto | | Whole Blood | 10^9 cells/ul | 4.3-10.8 | Flow cytometry | WBC101
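      In effect, a completed worksheet gives VHI a lookup from each hospital's local test identifiers (the Na1, Na2, WBC101, and ABG11 entries in the comment column above) to LOINC codes and units. A minimal sketch of applying such a map to incoming lab rows is below; treating those entries as local test codes, and the record layout itself, are assumptions for illustration.

```python
# Sketch of applying a completed LOINC worksheet to incoming lab rows.
# LOINC codes and names come from the example worksheet above; treating Na1/WBC101/ABG11
# as local hospital test codes, and the record layout, are illustrative assumptions.

LOINC_MAP = {
    "Na1":    {"loinc": "2951-2", "name": "Sodium SerPl-sCnc", "unit": "mEq/l"},
    "Na2":    {"loinc": "2951-2", "name": "Sodium SerPl-sCnc", "unit": "mEq/l"},
    "WBC101": {"loinc": "6690-2", "name": "WBC # Bld Auto",    "unit": "10^9 cells/ul"},
    "ABG11":  {"loinc": "2708-6", "name": "O2 % BldA",         "unit": "%"},
}

def to_loinc(local_code, value):
    """Translate a hospital's local test code into a standardized LOINC-coded record."""
    entry = LOINC_MAP.get(local_code)
    if entry is None:
        # Unmapped code: keep the raw name and flag it for follow-up with the hospital.
        return {"loinc": None, "test": local_code, "value": value}
    return {"loinc": entry["loinc"], "test": entry["name"], "value": value, "unit": entry["unit"]}

print(to_loinc("Na1", "138"))
print(to_loinc("XYZ9", "4.2"))   # unmapped -> reported back to the hospital
```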


    2. Include a copy of your format for data collection in the appendix.

      Please see Attachment 1 for the data file format and technical specifications.


    3. What advice would be useful to other states in understanding/employing HL-7 and/or LOINC?

      Although VHI spent a significant amount of time understanding the pros and cons of the HL-7 format, it was decided not to use it for two reasons:

      1) Most hospitals that VHI surveyed either were not aware of the standard at all or indicated that they would face significant barriers to exporting in an HL-7 format.

      2) Given that VHI was expecting quarterly transfers of batch files and that HL-7 was designed for real-time transactions, VHI did not think it would be appropriate to impose the format unnecessarily.


  26. Did you use any specific communications or tools with hospitals to ease their collection efforts?

    1. Describe the communications/tools and furnish copies in an appendix.

      VHI took several steps to help ensure that participation in the pilot would be as easy as possible. First, in all of the data layout packages, VHI provided very specific examples. For example, the table below was provided to help describe valid POA formats.

      POA Coding Scenarios for Data Verification for "Adding Clinical Data to Administrative Data" project

      Code type (example) | Valid? | Positions 1 2 3 4 5 6* 7* 8 (POA) | Reason
      3-digit DX code (486, Pneumonia, organism unspecified) | YES | 4 8 6 _ _ _ _ N | POA value in the 8th position
        | NO | 4 8 6 N _ _ _ _ | POA value not in 8th position
        | NO | 4 8 6 _ _ _ _ _ | No POA value
        | YES | 4 8 6 _ _ _ _ 1 | POA value in the 8th position
      4-digit DX code (5990, Urinary Tract Infection) | YES | 5 9 9 0 _ _ _ Y | POA value in the 8th position
        | NO | 5 9 9 0 Y _ _ _ | POA value not in 8th position
        | NO | 5 9 9 0 _ _ _ _ | No POA value
        | YES | 5 9 9 0 _ _ _ 1 | POA value in the 8th position
      5-digit DX code (70707, Decubitus Ulcer on Heel - Bedsore or Pressure Ulcer) | YES | 7 0 7 0 7 _ _ W | POA value in the 8th position
        | NO | 7 0 7 0 7 W _ _ | POA value not in 8th position
        | NO | 7 0 7 0 7 _ _ _ | No POA value
        | YES | 7 0 7 0 7 _ _ 1 | POA value in the 8th position
      ECODE, 4 digit** (E871, Foreign Object Left in Body during Procedure) | YES | E 8 7 1 _ _ _ N | POA value in the 8th position
        | NO | E 8 7 1 N _ _ _ | POA value not in 8th position
        | NO | E 8 7 1 _ _ _ _ | No POA value
        | YES | E 8 7 1 _ _ _ U | POA value in the 8th position
      ECODE, 5 digit (E8714, Foreign Object Left in Body during Procedure) | YES | E 8 7 1 4 _ _ N | POA value in the 8th position
        | NO | E 8 7 1 4 W _ _ | POA value not in 8th position
        | NO | E 8 7 1 4 _ _ _ | No POA value
        | YES | E 8 7 1 4 _ _ U | POA value in the 8th position
      Note: "_" indicates a blank position.
      POA: The 8th position is always reserved for the POA indicator (see next page for valid POA indicators)
      * The 6th and 7th positions are always left blank for data submission during this project but consistent with UB04 reporting requirements
      ** The ECODE always starts with an "E" in the first position
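      Because the table fixes the POA indicator in the 8th position of each diagnosis field, the layout lends itself to a simple automated edit. A minimal sketch is below; this is an illustrative check, not VHI's production edit, and the valid-value list follows the POA legend referenced above.

```python
# Sketch of an automated edit for the 8-position diagnosis field convention shown above:
# the diagnosis code occupies positions 1-5, positions 6-7 are blank, and position 8
# is reserved for the POA indicator. Illustrative only, not VHI's actual edit.

VALID_POA = {"Y", "N", "U", "W", "E", "1"}

def check_dx_field(field: str) -> str:
    """Return 'VALID' or the reason an 8-character DX field fails the layout."""
    if len(field) != 8:
        return "Field is not 8 characters wide"
    if field[5:7].strip():
        return "Positions 6 and 7 must be blank"
    poa = field[7]
    if poa == " ":
        return "No POA value"
    if poa not in VALID_POA:
        return f"Invalid POA value '{poa}'"
    return "VALID"

print(check_dx_field("486    N"))   # VALID
print(check_dx_field("486 N   "))   # No POA value (POA placed outside position 8)
print(check_dx_field("E8714   "))   # No POA value
```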


    2. How did the communications/tools assist the hospitals?


  27. Describe the process and technologies used for hospitals to transmit the data, and your organization to receive them. What problems were encountered with data transmission and how were they resolved?

    VHI also provided a web-based data upload site. Although this site was sometimes insufficient for transmitting very large files (> 30 MB) and FedEx was used as the fallback, overall the hospitals were competent users of the site. A screenshot of the upload page is found below.

    Screenshot of the web-based data upload page.

    Page includes the name of the hospital, the purpose of the upload page, the timeframe for data submission, and a place to upload the data file. The page also shows files previously uploaded by the hospital, as well as files uploaded by VHI. Additional options featured on the page are to update facility information and update contact information.

    When files needed to be transmitted via FedEx, most hospitals chose to zip and password-protect the files. Hospitals provided the passwords via email. One hospital sent a USB drive with proprietary encryption software installed. Although slightly more involved, this did not cause any problems.


  28. Data Analysis

  29. How were the clinical data linked to the administrative data, and how was the correctness of the linkage verified?

    POA

    As mentioned previously, all hospitals were given the opportunity to submit test files with a small number of POA-enhanced records. Using SAS, we wrote code to import files according to the fixed file format that we requested, and this code was applied, unaltered, as the first attempt to read the data. Although the field lengths were sometimes off by only a few digits, this would often lead to compounding errors throughout the file. It was rare for a hospital to submit data exactly as requested; however, unless the error was egregious, we would accept the file as submitted. Eventually, a combination of SAS, ACCESS, and EXCEL was used to read files. The import procedure within ACCESS was very useful because it provides a visual ruler for counting field lengths.
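    A minimal sketch of that fixed-width import step is below; the (name, width) layout is hypothetical rather than the actual record layout in Attachment 1.

```python
# Sketch of reading a fixed-width POA record with simple slicing.
# The (name, width) layout here is hypothetical; the real layout is in Attachment 1.
# An off-by-a-few-characters error in any width shifts every later field, which is
# exactly the "compounding error" problem described above.

LAYOUT = [("hospital_id", 6), ("pcn", 12), ("mrn", 12), ("admit_date", 8), ("dx1", 8)]

def parse_record(line: str) -> dict:
    record, pos = {}, 0
    for name, width in LAYOUT:
        record[name] = line[pos:pos + width].strip()
        pos += width
    return record

sample = "490023000001234567MRN000987   20071015486    N"
print(parse_record(sample))
```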

    For all data submissions, at a minimum VHI requested a patient control number (PCN) and medical record number (MRN). The PCN was the primary means of linkage to the administrative data. While this unique identifier was usually consistent between the two data sets, sometimes the PCN submitted for the pilot would be in a different format. Often, the formatting difference was attributable to segments (such as trailing and leading zeros) of the identifiers. The difference probably occurred because the pilot project bypassed routine VHI data standardization.

    Once the PCN was standardized, the first step to ensure that the linkage was effective was to perform a count of discharges by hospital for the quarter of interest. [Provide some descriptive statistics of variance.] This step often revealed when more records than necessary were being included in the POA data. The difference in counts revealed that some hospitals were sending outpatient data.
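    A minimal sketch of that count comparison is below; the data structures and the tolerance are illustrative assumptions, and the real check ran against VHI's administrative files.

```python
# Sketch of the linkage sanity check described above: compare discharge counts per
# hospital/quarter between the submitted POA file and the administrative file.
# The counts, keys, and the 5% tolerance are illustrative assumptions.
from collections import Counter

admin_counts = Counter({("490023", "2007Q4"): 1568, ("490048", "2007Q4"): 2210})
poa_counts   = Counter({("490023", "2007Q4"): 1547, ("490048", "2007Q4"): 2950})

def flag_count_mismatches(admin, poa, tolerance=0.05):
    """Flag hospital/quarters where POA counts differ from admin counts by more than
    the tolerance -- e.g., when a hospital accidentally includes outpatient records."""
    flagged = []
    for key, expected in admin.items():
        got = poa.get(key, 0)
        if expected and abs(got - expected) / expected > tolerance:
            flagged.append((key, expected, got))
    return flagged

print(flag_count_mismatches(admin_counts, poa_counts))
# -> [(('490048', '2007Q4'), 2210, 2950)]  e.g., outpatient records mixed in
```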

    We usually provided an immediate email back to the sender to indicate that the POA counts were within a reasonable range of what had been submitted for their administrative data. This process sometimes involved sending a small subset of problematic records back to the sender. These records were typically uploaded to the same hospital-specific web page used for submitting the data.

    At this time, we also started to develop preliminary POA reports and hold conference calls with our subcontractors to discuss results. Initially, these reports were sent to the hospitals in "real-time" as the data were received, but the release time, format, and content evolved significantly through time. One of the earliest versions is shown below. You can see the admin vs. POA record count above the table, which shows the distribution of the POA indicator across 18 diagnosis code fields and 3 ECODE fields. An automated highlight was applied when the percentage of POA=N flags in any given DX field was greater than 10%. We also began developing reports on potentially hospital-acquired conditions based on fact sheets available from CMS.
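    Before the report itself, the tallying and highlight logic can be sketched as follows; the inputs are simplified and the 10% threshold is applied directly, whereas the production reports were generated from the full fixed-width files.

```python
# Sketch of the POA distribution tally with the automated 10% POA=N highlight
# described above. Input format is simplified for illustration.
from collections import Counter

# Each record: list of (dx_field_name, poa_value) pairs taken from DX1-DX18 / ECODE1-3.
records = [
    [("DX1", "N"), ("DX2", "Y")],
    [("DX1", "Y"), ("DX2", "Y")],
    [("DX1", "Y"), ("DX2", "E")],
]

def poa_distribution(records, threshold=0.10):
    tallies = {}
    for rec in records:
        for field, poa in rec:
            tallies.setdefault(field, Counter())[poa] += 1
    report = {}
    for field, counts in tallies.items():
        total = sum(counts.values())
        pct_n = counts["N"] / total
        report[field] = {"counts": dict(counts), "highlight": pct_n >= threshold}
    return report

for field, row in poa_distribution(records).items():
    print(field, row)
# DX1 is highlighted because POA=N accounts for at least 10% of its values (1 of 3).
```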

    Diagram

    VHI logo

    "Adding Clinical Data to Administrative Data"

    DRAFT REPORT OF POA DISTRIBUTION FOR HOSPITAL 1

    Discharges Oct-Dec 2007

    Admin Records 1,568 POA Records 1,547

    DX#/ECODE#   ICD9 Present   POA=Y   %   POA=N   %   POA=E   %   POA=U   %   POA=W   %
    DX1 1,532 1,096 71.5 182 11.9 213 13.9 41 2.7 2 0.1
    DX2 1,549 1,284 82.9 43 2.8 215 13.9 7 0.5 . .
    DX3 1,354 1,015 75.0 105 7.8 186 13.7 48 3.5 . .
    DX4 1,184 909 76.8 97 8.2 152 12.8 26 2.2 1 0.1
    DX5 1,078 830 77.0 73 6.8 146 13.5 29 2.7 . .
    DX6 970 746 76.9 72 7.4 130 13.4 22 2.3 . .
    DX7 899 668 74.3 75 8.3 129 14.3 27 3.0 . .
    DX8 829 586 70.7 59 7.1 158 19.1 26 3.1 . .
    DX9 769 519 67.5 59 7.7 166 21.6 25 3.3 . .
    DX10 701 453 64.6 51 7.3 178 25.4 19 2.7 . .
    DX11 645 385 59.7 55 8.5 182 28.2 23 3.6 . .
    DX12 582 333 57.2 40 6.9 184 31.6 25 4.3 . .
    DX13 518 285 55.0 31 6.0 176 34.0 26 5.0 . .
    DX14 468 223 47.6 26 5.6 202 43.2 17 3.6 . .
    DX15 411 181 44.0 36 8.8 181 44.0 13 3.2 . .
    DX16 368 162 44.0 25 6.8 166 45.1 15 4.1 . .
    DX17 308 129 41.9 19 6.2 147 47.7 13 4.2 . .
    DX18 261 99 37.9 19 7.3 135 51.7 8 3.1 . .
    ECODE1 270 184 68.1 60 22.2 16 5.9 10 3.7 . .
    ECODE2 72 37 51.4 18 25.0 14 19.4 3 4.2 . .
    ECODE3 . . . . . . . . . . .
    TOTAL 14,768 10,124 68.6% 1,145 7.8% 3,076 20.8% 423 2.9% 3 0.0%
    Code Present - The number of records having the DX#/ECODE# indicated
    POA = Y - The number of records where the value Y (YES) was found in the position reserved for the POA value
    POA = N - The number of records where the value N (NO) was found in the position reserved for the POA value
    POA = 1/E - The number of records where the value (1 or E) (EXEMPT) was found in the position reserved for the POA value
    POA = U - The number of records where the value U (INSUFFICIENT) was found in the position reserved for the POA value
    POA = W - The number of records where the value W (UNDETERMINED) was found in the position reserved for the POA value
    For DX1, ECODE1, and ECODE2, POA=N was reported for greater than or equal to 10 percent of all POA values


    VHI Logo "Adding Clinical Data to Administrative Data"

    DRAFT REPORT OF POTENTIALLY HOSPITAL ACQUIRED CONDITIONS FOR HOSPITAL 1

    Discharges Oct-Dec 2007

    CMS CONDITION (ICD9 CODE) POA=N % POA=Y % POA=U % TOTAL
    Clostridium difficile - Assoc. Dis (008.45) 3 15.8 16 84.2 0 0.0 19
    Deep Vein Thrombosis (453.40) 0 0.0 4 100.0 0 0.0 4
    Deep Vein Thrombosis (453.41) 1 9.1 9 81.8 1 9.1 11
    Deep Vein Thrombosis (453.42) 1 14.3 5 71.4 1 14.3 7
    Pressure Ulcer (707.03) 0 0.0 12 92.3 1 7.7 13
    Pressure Ulcer (707.07) 0 0.0 5 100.0 0 0.0 5
    Pressure Ulcer (707.09) 0 0.0 3 100.0 0 0.0 3
    Pulmonary Embolism (415.19) 8 30.8 17 65.4 1 3.8 26
    Staph. Septicemia (998.59) 3 37.5 5 62.5 0 0.0 8
    Vasc. Cath. Assoc. Inf. (999.31) 0 0.0 3 100.0 0 0.0 3
    TOTAL 16 16.2 79 79.8 4 4.0 99
    CMS flagged conditions are based on information found at http://www.cms.hhs.gov/HOSPITALAcqCond/
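    The report above is essentially a cross-tabulation of POA values restricted to CMS-flagged ICD-9 codes. A minimal sketch is below; the code list is a small excerpt taken from the rows above (with decimal points dropped), not the full CMS list, and the input records are illustrative.

```python
# Sketch of the potentially-hospital-acquired-conditions tabulation shown above:
# restrict to CMS-flagged ICD-9 codes and cross-tabulate their POA values.
# The code list is an excerpt from the report above, not the full CMS list.
from collections import Counter, defaultdict

CMS_FLAGGED = {
    "00845": "Clostridium difficile - Assoc. Dis.",
    "41519": "Pulmonary Embolism",
    "70703": "Pressure Ulcer",
    "99931": "Vasc. Cath. Assoc. Inf.",
}

# (icd9_code, poa_value) pairs pulled from discharge records -- illustrative data.
coded_diagnoses = [("00845", "Y"), ("00845", "N"), ("41519", "Y"), ("70703", "Y")]

def hac_report(diagnoses):
    table = defaultdict(Counter)
    for code, poa in diagnoses:
        if code in CMS_FLAGGED:
            table[CMS_FLAGGED[code] + f" ({code})"][poa] += 1
    return table

for condition, counts in hac_report(coded_diagnoses).items():
    total = sum(counts.values())
    print(condition, dict(counts), f"total={total}")
```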


    These reports were also used as additional data quality checks and to generate further discussion during meetings with subcontractors. Non-identifiable versions were also presented to VHI's Board of Directors as a means of demonstrating pilot progress.

    Through the course of subcontractor meetings, additional tables were requested such as the "Top 50 Diagnoses NOT PRESENT ON ADMISSION in the principal diagnosis" and "Top 50 Diagnoses EXEMPT FROM POA REPORTING in the principal diagnosis". Additional information was added to the "Top 50 Diagnoses NOT PRESENT ON ADMISSION in the principal diagnosis" table that indicated when the "condition may be inconsistent with the definition of a principal diagnosis or POA coding guidelines." This was our attempt to draw hospital attention to potential coding problems.

    This report evolved into an eight-table quarterly report called "VHI-AHRQ Pilot Project Adding Clinical Data to Administrative Data: 1st Quarter Results (4Q 2007 data)". The cover for this report is shown below and the full contents can be read at www.vhi.org/hybriddata.asp.

    Image of cover of report: "VHI-AHRQ Pilot Project Adding Clinical Data to Administrative Data: 1st Quarter Results (4Q 2007 data)"

    The summary included the paragraph

    "Based on initial commitments to the project, VHI expected to receive 99, 350 hospital discharges with the POA indicator added for 4Q 2007. VHI received 92,749 (93%) discharges. Although minor issues with formatting, coding conventions, and data extraction were encountered along the way, the POA phase of the project has been very successful."

    By the release of the second quarterly report (see Attachment 7) representing 1Q and 2Q 2008 discharges, the information had expanded significantly because of the addition of several important tables:

    • AHRQ Hospital Patient Safety Indicators (PSI) with Pre-POA and Post-POA rates specific to their hospital and for all pilot sites aggregated.
    • AHRQ Patient Safety Indicators (PSI) Comparison of Pilot Data, a National Inpatient Sample, and a Veterans Administration Sample (shown below)


    VHI logo
    Table 12 Patient Safety Indicators (PSI) Comparison (1Q and 2Q 2008 Aggregated Data for 27 Pilot Hospitals)
    Indicator Number | Indicator Description | Virginia PRE-POA Obs Rate | Virginia POST-POA Obs Rate | National Inpatient Sample (2006) Risk Adj Rate | Veterans Administration Sample (2001-2005) Obs Rate | Veterans Administration Sample (2001-2005) Risk Adj Rate
    1* Complications of anesthesia 0.50 0.50 0.66 0.74 0.73
    2 Death in low mortality DRGs 0.70 0.70 0.43 2.83 .
    3 Decubitus Ulcer 33.10 8.70 24.57 15.20 15.92
    4 Death among surgical inpatients w/serious treatable comp. 151.70 189.90 114.00 137.18 137.60
    5* Foreign body left in during procedure, secondary DX field** 0.00 1000.00 0.09 0.12 .
    6 Iatrogenic pneumothorax, secondary DX field 0.60 0.50 0.61 0.85 1.13
    7 Selected Infections due to medical care, secondary DX field 2.20 1.60 2.19 1.95 1.57
    8* Post-operative hip fracture 0.30 0.10 0.31 0.39 0.53
    9 Post-operative hemorrhage or hematoma 2.60 2.30 2.40 3.19 2.71
    10 Post-operative physiologic and metabolic derangements 1.30 0.50 0.42 2.08 1.90
    11 Post-operative respiratory failure 7.70 6.70 10.39 13.36 10.28
    12 Post-operative pulmonary embolism or deep vein thrombosis 15.40 8.70 11.18 11.29 9.40
    13 Post-operative sepsis 18.10 15.90 15.06 6.60 6.60
    14* Post-operative wound dehiscence 2.70 2.70 2.63 6.47 3.61
    15 Accidental puncture or laceration, secondary DX field 3.70 3.30 4.57 2.96 4.53
    16* Transfusion reaction, secondary DX field 0.00 0.00 0.00 . .
    17 Birth trauma-Injury to Neonate 4.00 4.00 1.58 . .
    18 OB Trauma-vaginal delivery with instrument 149.90 149.90 160.55 . .
    19 OB Trauma-vaginal delivery without instrument 32.00 32.00 36.20 . .
    20 OB Trauma-cesarean section 5.50 5.50 3.93 . .
    Sources: Virginia Health Information (Virginia rates); AHRQ (National Inpatient Sample); Shimada et al. 2008 (Veterans Administration Sample)
    * Virginia's numerator is less than or equal to 30
    ** Currently, the numerator/denominator for PSI 5 is 1 resulting in a rate of 1,000. See the aggregate PSI report for the number of PSI 5 events.
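    The gap between the pre-POA and post-POA observed rates above comes from excluding conditions that were already present on admission from the indicator numerators. A minimal sketch of that logic is below; the complication code and records are illustrative, and the project's actual rates were produced with the AHRQ QI software.

```python
# Sketch of why post-POA PSI rates differ from pre-POA rates: secondary diagnoses
# flagged as present on admission are no longer counted as in-hospital events.
# The complication code prefix and records are illustrative only.

DECUBITUS_ULCER = "7070"   # illustrative ICD-9 prefix for decubitus ulcer (PSI 3-type events)

records = [
    {"secondary_dx": [("70703", "N")]},   # developed in hospital -> counts either way
    {"secondary_dx": [("70707", "Y")]},   # present on admission -> excluded post-POA
    {"secondary_dx": [("5990", "Y")]},    # unrelated diagnosis
]

def psi_numerator(records, use_poa):
    hits = 0
    for rec in records:
        for code, poa in rec["secondary_dx"]:
            if code.startswith(DECUBITUS_ULCER) and (not use_poa or poa != "Y"):
                hits += 1
                break  # count each discharge at most once
    return hits

print("Pre-POA numerator: ", psi_numerator(records, use_poa=False))  # 2
print("Post-POA numerator:", psi_numerator(records, use_poa=True))   # 1
```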

    LAB

    Initially, VHI began the process of data quality monitoring and report writing from the POA and lab files simultaneously. When the pace of POA submission began to pick up, VHI began to concentrate data management effort on the POA side. Before that time, VHI did develop a number of internal lab reports based on subcontractor requests. As with POA, VHI counted unique patients in the lab files and compared the counts to the administrative data files. A more sophisticated report was the distribution of lab tests by discharge timeframe.
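    A minimal sketch of the timeframe bucketing behind that report (shown below) follows; it uses day-level arithmetic only and reads category G as "3 days after admission through 1 day prior to discharge," which is an interpretation of the report's column headings.

```python
# Sketch of assigning a lab test to one of the discharge-timeframe buckets (A-I) used
# in the report below. Day-level arithmetic only; handling of times and missing dates,
# and the exact production bucket definitions, are omitted or interpreted.
from datetime import date

def timeframe_bucket(test_date: date, admit: date, discharge: date) -> str:
    days_from_admit = (test_date - admit).days
    if days_from_admit < -30:
        return "A. > 30 Days Prior to Admit Day"
    if days_from_admit < -7:
        return "B. 30 to 8 Days Prior to Admit"
    if days_from_admit < 0:
        return "C. 7 to 1 Days Prior to Admit"
    if test_date > discharge:
        return "I. After Discharge Day"
    if test_date == discharge:
        return "H. Discharge Day"   # a same-day admit/discharge test lands here
    if days_from_admit == 0:
        return "D. Admit Day"
    if days_from_admit == 1:
        return "E. 1 Day After Admit Day"
    if days_from_admit == 2:
        return "F. 2 Days After Admit Day"
    return "G. 3 Days to 1 Day Prior to Discharge"

print(timeframe_bucket(date(2007, 10, 16), admit=date(2007, 10, 15), discharge=date(2007, 10, 20)))
# -> "E. 1 Day After Admit Day"
```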

    Draft Lab Value Distribution Report
    Discharges OCT 1 2007 thru DEC 31 2007
    Patient population = 27,299
    Number of Hospitals in system = 10
    NAME Total # of Patients Receiving Test Average # of Tests per Patient Total # of Tests A. > 30 Days Prior to Admit Day B. 30 to 8 Days Prior to Admit C. 7 to 1 Days Prior to Admit D. Admit Day E. 1 Day After Admit Day F. 2 Days After Admit Day G. 3 Days to 1 Day Prior to Discharge H. Discharge Day I. After Discharge Day
    N % N % N % N % N % N % N % N % N % N % N %
    AST (SGOT) 17,619 2.1 36,438 4 0.01 293 0.8 2,524 6.93 12,811 35.16 5,101 14 2,811 7.71 11,746 32.24 1,143 3.14 5 0.01
    Albumin 17,749 2.3 40,350 4 0.01 293 0.73 2,526 6.26 12,865 31.88 5,383 13.34 3,225 7.99 14,560 36.08 1,489 3.69 5 0.01
    Alkaline Phosphatase 17,617 2.1 36,406 4 0.01 293 0.8 2,524 6.93 12,805 35.17 5,091 13.98 2,803 7.7 11,747 32.27 1,134 3.11 5 0.01
    Amylase 2,982 1.4 4,213 . . 2 0.05 445 10.56 2,103 49.92 518 12.3 304 7.22 748 17.75 88 2.09 5 0.12
    BNP 2,322 1.4 3,314 . . . . 221 6.67 1,578 47.62 425 12.82 297 8.96 727 21.94 66 1.99 . .
    Base Units Excess 3,535 3.3 11,772 . . 16 0.14 220 1.87 2,957 25.12 1,641 13.94 1,069 9.08 5,650 48 219 1.86 . .
    Bicarbonate 3,537 3.3 11,776 . . 16 0.14 220 1.87 2,959 25.13 1,642 13.94 1,069 9.08 5,651 47.99 219 1.86 . .
    Bilirubin Total 18,009 2.1 38,531 4 0.01 293 0.76 2,525 6.55 12,954 33.62 5,486 14.24 3,275 8.5 12,697 32.95 1,288 3.34 9 0.02
    C Reactive Protein 575 1.4 786 . . 16 2.04 8 1.02 144 18.32 156 19.85 89 11.32 342 43.51 31 3.94 . .
    Calcium 19,784 3.8 75,216 4 0.01 548 0.73 3,385 4.5 14,958 19.89 12,666 16.84 8,727 11.6 30,879 41.05 4,042 5.37 7 0.01
    Creatine Kinase (CPK) 3,998 2.2 8,929 . . . . 569 6.37 4,487 50.25 2,243 25.12 500 5.6 1,020 11.42 100 1.12 10 0.11
    Creatine Kinase MB8 7,181 3.4 24,158 2 0.01 190 0.79 1,223 5.06 8,695 35.99 4,826 19.98 1,779 7.36 6,692 27.7 746 3.09 5 0.02
    Creatinine Serum 22,189 4.0 87,743 6 0.01 737 0.84 3,802 4.33 16,638 18.96 14,347 16.35 9,997 11.39 37,396 42.62 4,813 5.49 7 0.01
    Glucose8 23,711 10.9 259,208 6 0 742 0.29 4,377 1.69 32,126 12.39 37,162 14.34 28,652 11.05 141,626 54.64 12,694 4.9 1,823 0.7
    Hemoglobin 25,393 3.6 91,036 6 0.01 778 0.85 4,603 5.06 19,708 21.65 16,330 17.94 10,491 11.52 34,493 37.89 4,622 5.08 5 0.01
    INR 10,460 2.6 27,639 4 0.01 354 1.28 1,361 4.92 6,853 24.79 3,485 12.61 2,783 10.07 10,737 38.85 2,062 7.46 . .
    Inhaled oxygen8 1,939 2.2 4,276 . . . . 77 1.8 1,224 28.62 613 14.34 375 8.77 1,927 45.07 60 1.4 . .
    Lactate Dehydrogenase (LDH) 1,135 1.4 1,630 . . 15 0.92 49 3.01 673 41.29 196 12.02 138 8.47 509 31.23 45 2.76 5 0.31
    Lactic Acid 383 1.3 482 . . . . 15 3.11 167 34.65 103 21.37 51 10.58 134 27.8 12 2.49 . .
    Neutrophils Band 5,118 2.6 13,254 . . 74 0.56 746 5.63 3,787 28.57 2,352 17.75 1,475 11.13 4,111 31.02 709 5.35 . .
    O2 Saturation Arterial 2,821 3.2 8,989 . . 14 0.16 197 2.19 2,375 26.42 1,332 14.82 814 9.06 4,079 45.38 178 1.98 . .
    Partial Thromboplastin Time 8,585 2.2 18,883 3 0.02 342 1.81 1,248 6.61 6,134 32.48 2,547 13.49 1,544 8.18 6,639 35.16 426 2.26 . .
    Platelet Count 24,695 3.3 81,132 6 0.01 744 0.92 4,448 5.48 18,438 22.73 13,202 16.27 8,657 10.67 31,553 38.89 4,080 5.03 4 0
    Potassium 22,160 4.0 88,780 6 0.01 738 0.83 3,811 4.29 16,934 19.07 14,618 16.47 10,197 11.49 37,667 42.43 4,802 5.41 7 0.01
    Prothrombin Time 10,462 2.6 27,688 4 0.01 354 1.28 1,361 4.92 6,856 24.76 3,489 12.6 2,785 10.06 10,773 38.91 2,066 7.46 . .
    Sodium 22,138 3.9 86,581 6 0.01 737 0.85 3,795 4.38 16,629 19.21 14,303 16.52 9,934 11.47 36,490 42.15 4,680 5.41 7 0.01
    Troponin I 5,157 2.3 11,779 . . 1 0.01 572 4.86 5,775 49.03 3,537 30.03 611 5.19 1,178 10 103 0.87 2 0.02
    Urea Nitrogen Blood (BUN) 22,171 3.9 87,465 6 0.01 737 0.84 3,799 4.34 16,624 19.01 14,326 16.38 9,966 11.39 37,217 42.55 4,783 5.47 7 0.01
    White Blood Count 24,492 3.3 79,954 6 0.01 744 0.93 4,441 5.55 18,011 22.53 13,057 16.33 8,519 10.65 31,145 38.95 4,027 5.04 4 0.01
    pCO2 Arterial 3,537 3.3 11,781 . . 16 0.14 220 1.87 2,963 25.15 1,642 13.94 1,069 9.07 5,652 47.98 219 1.86 . .
    pH Arterial 3,537 3.3 11,782 . . 16 0.14 220 1.87 2,963 25.15 1,642 13.94 1,070 9.07 5,652 47.97 219 1.86 . .
    pO2 Arterial 3,534 3.4 12,183 . . 16 0.13 231 1.9 3,208 26.33 1,696 13.92 1,092 8.96 5,714 46.9 226 1.86 . .
    pro-BNP 1,170 1.3 1,509 . . . . 145 9.61 954 63.22 105 6.96 56 3.71 225 14.91 24 1.59 . .
    Missing/Invalid LOINC 608 1.3 811 . . 3 0.37 62 7.64 242 29.84 123 15.17 70 8.63 280 34.53 31 3.82 . .
    All 27,299   1,306,474 81 0.01 9,122 0.7 55,970 4.28 288,598 22.09 205,385 15.72 136,294 10.43 547,656 41.92 61,446 4.7 1,922 0.15
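
    The timing windows in the table above (columns A through I) are defined relative to each record's admission and discharge dates. The sketch below shows, under stated assumptions, how a single lab result's collection date could be assigned to one of these windows; the function and field names are illustrative and are not part of VHI's actual processing.

    from datetime import date

    def timing_bucket(test_date: date, admit_date: date, discharge_date: date) -> str:
        """Assign a lab test to one of the report's timing windows (A-I).

        The precedence among overlapping windows (e.g., a test one day after
        admission that is also the discharge day) is an assumption.
        """
        days_before_admit = (admit_date - test_date).days
        days_after_admit = (test_date - admit_date).days

        if days_before_admit > 30:
            return "A. > 30 Days Prior to Admit Day"
        if 8 <= days_before_admit <= 30:
            return "B. 30 to 8 Days Prior to Admit"
        if 1 <= days_before_admit <= 7:
            return "C. 7 to 1 Days Prior to Admit"
        if test_date == admit_date:
            return "D. Admit Day"
        if days_after_admit == 1:
            return "E. 1 Day After Admit Day"
        if days_after_admit == 2:
            return "F. 2 Days After Admit Day"
        if test_date > discharge_date:
            return "I. After Discharge Day"
        if test_date == discharge_date:
            return "H. Discharge Day"
        return "G. 3 Days to 1 Day Prior to Discharge"

    For example, timing_bucket(date(2008, 1, 5), date(2008, 1, 3), date(2008, 1, 10)) falls in the "F. 2 Days After Admit Day" window.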


    Another report graphically shows each type of lab test, as a percentage of all requested lab tests, by hospital. This chart helped us see that hospital 490023 was providing substantially more Neutrophil Band data than the other hospitals in the pilot at that time.

    Lab Tests Sent by Hospital as a Percentage of All the Requested Lab Tests
    Line Chart
    Chart does not include data values, so the ranges of percentages are indicated here.
      Hospital (by Number)
    490020 490023 490032 490048 490071 490107 490112 490116 490118 490126 490110
    AST (SGOT) 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Albumin Fraction 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Amylase 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Base Units Excess 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Bilirubin Total 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Calcium 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 0% to 5% 5% to 10% 5% to 10% 5% to 10% 5% to 10%
    Creatine Kinase MB 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Glucose 20% to 25% 5% to 10% 5% to 10% 20% to 25% 20% to 25% 20% to 25% 20% to 25% 20% to 25% 20% to 25% 20% to 25% 20% to 25%
    INR 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Lactate Dehydrogenase (LDH) 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Neutrophils Band 0% to 5% 5% to 10% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Partial Thromboplastin Time 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    Potassium 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10%
    Sodium 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10%
    Troponin I 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    White Blood Count 5% to 10% 10% to 15% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10% 5% to 10%
    pH Arterial 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%
    pro-BNP 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5% 0% to 5%


  30. Describe the procedures you employed to ensure that the information you received was reasonable and accurate. How did you decide what data were reasonable enough to use (e.g., checks for missing data, appropriate lab tests for diagnoses, reasonableness of lab results, lab data confirmation of POA, data appropriate for field, outliers)? Include details in the appendix (e.g. list of edit checks).

    In addition to the data quality checks already mentioned, one of the most useful means of checking for reasonableness was requesting similar POA distributions from California, New York, and Florida. Although the analysis was never formal, seeing the basic distribution in other states gave us a good sense that Virginia's data were reasonable. The multi-state comparison is shown below; a minimal sketch of how such a distribution could be tabulated follows the table notes. This table has not yet been shared with participating hospitals.

    Multi-state Comparison New York+ Virginia* Florida** California***
    Principal Diagnosis n % n % n % n %
    Exempt (allows blank, 1, E, or Y) 101,117 2.97 9,311 10.04 130,282 9.76 74,179 1.85
    Y 2,840,069 83.49 75,619 81.53 1,177,292 88.17 3,787,112 94.38
    N 284,228 8.36 3,117 3.36 26,271 1.97 127,001 3.16
    U 161,509 4.75 73 0.08 1,340 0.10 23,680 0.59
    W 14,389 0.42 28 0.03 91 0.01 730 0.02
    Missing (but not exempt) NA   2,340 2.52 - 0.00 not validated
    Exempt (but miscoded, any value other than blank, 1, E, or Y) 189 0.01 2,261 2.44 - 0.00 72 0.00
    Total of Principal Diagnosis 3,401,501 100.00 92,749 100.00 1,335,276 100.00 4,012,774 100.00
    *Q4 2007 (26 hospitals)
    **Q1-Q3 2008
    ***Full year 2007 (454 hospitals) - no validation for 2007 or first half of 2008, validation will begin with report period Jul-Dec 2008
    (Invalid = 72 records miscoded with a value other than the acceptable values)
    + Q4 2007 (235 hospitals)
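
    The sketch below shows one way such a POA distribution could be tabulated from discharge records, assuming each record carries a principal-diagnosis POA code and an exemption flag. The category handling is simplified to the values listed in the table (blank, 1, E, or Y allowed for exempt diagnoses), and the field handling is illustrative, not VHI's actual edit logic.

    from collections import Counter

    VALID_POA = {"Y", "N", "U", "W"}
    EXEMPT_OK = {"", "1", "E", "Y"}  # values the table allows for exempt diagnoses

    def poa_distribution(poa_codes, exempt_flags):
        """Tabulate principal-diagnosis POA codes into the report's categories.

        poa_codes: one POA code per discharge (principal diagnosis)
        exempt_flags: parallel booleans, True if the diagnosis is POA-exempt
        Returns {category: (count, percent of total)}.
        """
        counts = Counter()
        for poa, exempt in zip(poa_codes, exempt_flags):
            poa = (poa or "").strip().upper()
            if exempt:
                key = "Exempt" if poa in EXEMPT_OK else "Exempt (but miscoded)"
            elif poa in VALID_POA:
                key = poa
            else:
                key = "Missing (but not exempt)"
            counts[key] += 1
        total = sum(counts.values()) or 1  # guard against empty input
        return {k: (n, round(100.0 * n / total, 2)) for k, n in counts.items()}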


  31. Describe analyses performed, methods/process used, statistical modeling, and results/outcomes, if applicable. Include details of models used in an appendix.

    As of July 31, 2009, VHI has applied 15 POA data screens developed by Michael Pine and Associates, Inc. to measure the quality of POA coding. These screens give a sense of the reliability of medical coding in a hospital and were initially developed using New York State SPARCS data from 108 hospitals for 2003 through 2005. A simplified example of how one of these screens is applied follows:

    A patient is admitted to the hospital for high-risk pneumonia. The patient also has lung cancer, which is also coded. Lung cancer is considered a chronic condition and should almost never be coded as hospital acquired. In other words, the POA indicator on the ICD-9 code for the lung cancer should be "Y" for "yes, the condition was present on admission."

    The interim results of this specific screen in Virginia's participating hospitals indicated that 19% of hospitals (5 hospitals) reported a chronic condition, such as cancer, as not present on admission (POA indicator = "N") in more than 2%[1] of their high-risk medical condition codes, such as pneumonia. In other words, a cancerous condition was coded as hospital acquired too often in these hospitals. Each of these hospitals received a score of 3, on a scale of 1 to 4, for this screen. This and 14 additional screens were averaged to produce a composite screening score.
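
    The sketch below illustrates, under stated assumptions, how a screen of this kind could be scored. The 2% threshold comes from the text; the record structure, the mapping of failure rates onto the 1-to-4 scores, and the function names are illustrative assumptions, not MPA, Inc.'s actual algorithm.

    def chronic_not_poa_rate(cases):
        """Fraction of high-risk medical cases (e.g., pneumonia admissions) in which
        a chronic secondary diagnosis (e.g., lung cancer) was coded as not present
        on admission (POA = 'N').

        cases: iterable of dicts with a 'chronic_poa_codes' list holding the POA
               indicators of the case's chronic secondary diagnoses (illustrative).
        """
        cases = list(cases)
        if not cases:
            return 0.0
        flagged = sum(1 for c in cases if "N" in c["chronic_poa_codes"])
        return flagged / len(cases)

    def screen_score(rate, threshold=0.02):
        """Map a screen failure rate to a score from 1 (best) to 4 (worst).
        Only the 2% threshold is from the report; the other cut points are assumed."""
        if rate <= threshold / 2:
            return 1
        if rate <= threshold:
            return 2
        if rate <= 2 * threshold:
            return 3
        return 4

    def composite_score(scores):
        """Average the individual screen scores (15 screens in the pilot)."""
        return sum(scores) / len(scores)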

    Interim results using the composite score indicated that 6 of 26 participating hospitals would fail the POA coding quality screens. However, further analysis identified a data quality problem in the original data submissions from two of the hospitals that failed. VHI requested new submissions. To date, one hospital has successfully resubmitted, while the other's resubmission is still being processed.

    Lab data results will be presented as an Addendum to the final report.


    [1] The 2% level was determined by Michael Pine and Associates, Inc.


  32. Project Results

  33. Describe your project's overall success during the pilot/planning process.

    1. Identify major accomplishments.

      Although we do not have direct evidence that the pilot project led to the legislation requiring the submission of POA indicators in Virginia, we believe the positive working relationship with participating hospitals and the overall success of the project reduced potential barriers to adoption.

      According to our pilot project partners, our model of data collection was very effective. To the extent that this model is reproducible, we believe it is a major accomplishment. We also believe that our reporting was innovative and as timely as possible, helping VHI build an even stronger foundation of collaboration on quality improvement issues with the hospital community.


    2. How does the end result compare to the initial vision of the project?

      VHI met all aspects of our proposed vision of the contract.

      • VHI proposed and succeeded in recruiting hospitals to participate in this effort. AHRQ requirements were for a minimum of 5 hospitals. VHI had 27 hospitals participating.
      • Our goal of collecting and incorporating POA and laboratory data with our administrative data set was also met. Hospitals were able to provide this information to VHI.
      • VHI sought and completed a preliminary analysis before the September 2009 end of the contract. An analytical dataset was developed and delivered, and a preliminary analysis was conducted. Further analysis will continue after the end of the contract.
      • In addition, working with hospitals to obtain POA information led to full-scale implementation of POA values as part of the regular quarterly submissions of hospital discharge data to VHI for all hospitals.

    3. What unexpected hurdles did you encounter and how were they resolved?

      One of the largest unexpected hurdles was the file size of the laboratory data. Not only was our online uploading tool insufficient for handling files of this size, but we also crashed one of our hard drives during overnight processing.


  34. List the clinical data elements you were able to add to your administrative data set (please specify—POA, lab values, vital signs).

    1. Why were these elements chosen?

      The POA indicator has been officially added to the administrative data as of July 31, 2009. We are awaiting results of the laboratory models to see if requiring lab data submission will be feasible.


    2. Were there any data elements you had hoped to include but were not able to collect?

      No.


    3. If so, what were those data elements and what barriers did you find to adding them?

  35. Describe your methods and any related challenges in the following areas, if not already discussed:

    1. Hospital participation

      The following issues are important to keep in mind for garnering hospital participation:

      • formalizing the relationship through contractual obligations even though the project is voluntary;
      • a balanced communication pattern that respects hospitals' time;
      • willingness to provide on- and off-site technical support;
      • immediate feedback after data submission, if only to confirm that the files were received;
      • comprehensive reporting so that hospitals do not feel like their data is going into a "black box";
      • when asking hospitals to go above and beyond, showing them that you understand the voluntary nature of the project and that their time is important;
      • being able to discuss the project at multiple levels, from those in the hospital interested in the statistical methods (e.g., one hospital was interested in hierarchical modeling) to those interested in medical coding improvement;
      • meeting with hospital contacts in other professional venues; and
      • providing marketable materials that hospitals can share internally.


    2. Hospital training and education

      VHI tried to help inform hospitals on the quality of coding through the various tables provided in the quarterly reporting. We will also be providing the results of the POA screens developed by Michael Pine and Associates, Inc.

    3. Data formats, coding, and standardization

      VHI struggled with whether to require the use of HL7. VHI understood the value of HL7 and its important role in the real-time transmission of data, but in the end it was decided that the priority of the project was to obtain the data as quickly and easily as possible without excluding hospitals that had no ability to transmit HL7 messages.


    4. Data transmission

      For the most part, hospitals were willing to send the data by any means necessary. Some sent flash drives with special encryption software while others sent CDs via FedEx. Hospitals were always willing to send even when it cost them money.


    5. Data cleaning

      See above.


    6. Data merging

      See above.


    7. Data security

      VHI applied standard security procedures to incoming data. First, the upload site via the web was secure. Second, once the files were received they were downloaded to a desktop. Any files received via CD were locked in a cabinet.


    8. Data risk adjustment

      Michael Pine and Associates, Inc. presented preliminary results at the final partners meeting in September 2009. We have not yet had an opportunity to analyze these findings; we will do so in an addendum to this final report.


    9. Model results

      See above.


    10. Summary findings

      See above.


  36. Explain your process for assessing the value of adding clinical data to administrative data sets and the outcome of your assessment. What did the added data contribute to your analyses? What were the benefits to users?

    On the POA side, VHI successfully generated hospital-level reports of potentially hospital acquired conditions and POA-adjusted versions of the AHRQ quality indicators. VHI showed hospitals in Virginia the value of the POA indicator by demonstrating significant differences in rates for certain indicators before and after the use of the POA indicator. VHI also demonstrated that the distribution of POA values was very similar to what other states found, building additional confidence in the data set for future use.
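
    The sketch below shows the kind of before/after comparison this refers to: the unadjusted rate counts every case carrying a complication code, while the POA-adjusted rate excludes cases in which the complication was flagged as present on admission. The record structure is illustrative, and this is not the AHRQ QI software's actual logic.

    def complication_rates(cases):
        """Compare an unadjusted complication rate with a POA-adjusted rate.

        cases: iterable of dicts with a boolean 'has_complication_code' and a
               string 'complication_poa' key (illustrative structure).
        Returns (unadjusted, poa_adjusted) rates per 1,000 eligible cases.
        """
        cases = list(cases)
        if not cases:
            return 0.0, 0.0

        # Without POA: any case carrying the complication code counts as an event.
        crude = sum(1 for c in cases if c["has_complication_code"])

        # With POA: complications already present on admission are not counted
        # as potentially hospital-acquired events.
        adjusted = sum(
            1 for c in cases
            if c["has_complication_code"] and c["complication_poa"] != "Y"
        )

        denom = len(cases)
        return 1000.0 * crude / denom, 1000.0 * adjusted / denom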

    On the lab side, VHI anticipates analytical results by November 2009.


  37. How do you expect this information will be used in your state (e.g., enhanced analysis of quality indicators, transparency initiatives, academic research)?

    VHI currently publishes the AHRQ quality indicators on its website at www.vhi.org/aqi.asp. VHI expects to update this site with POA-enhanced administrative data when POA reporting is complete. VHI also produces a Cardiac Care report at www.vhi.org/cardiac_reports.asp and intends to improve the risk-adjustment methodology of this report by using laboratory data. VHI also intends to use the results of this project to demonstrate the importance of laboratory reporting to all hospitals in Virginia in order to expand data collection efforts. Finally, VHI will use the results to continue to engage a wide variety of stakeholders in data-driven transparency initiatives.


  38. How likely is it that your state will continue to collect this clinical data or to expand collection beyond current participating hospitals?

    VHI believes it is very likely.


  39. Do you plan to disseminate the results of the pilot/planning project? If so, how? Who is your audience (general public, HIT, coders, others)?

    VHI plans to create a series of reports using the POA screens developed by MPA, Inc., as well as reports based on outcomes from the lab data analysis. Hospitals will receive hard copies of these reports and will also have a face-to-face meeting to review the results. VHI currently has meetings planned with hospitals, health insurance companies, physicians, and consumers to present results and discuss ways to move forward with quality improvement using these data.


  40. Are you expecting to encounter political or other challenges in dissemination?

    No major challenges are expected. VHI will be able to anticipate additional challenges in November 2009.


  41. Review

  42. What do you believe were the critical success factors to facilitating the involvement of hospitals in your pilot/planning process (e.g. state team, state infrastructure, relationships, data issues, etc)?

    Please see above.


  43. FOR THE PILOT PROJECT: Describe your state's plans for continuing the work of the pilot. In what ways can AHRQ be of assistance to you in this?

    VHI has entered into a contract with the Brookings Institution to convene a group of stakeholders (health insurance companies, physicians, and hospitals) to discuss the value of adding clinical data to administrative data. This work is ongoing, but we expect to have results in December 2009.


  44. FOR THE PLANNING PROJECT: Explain how you feel your project, as planned, would perform as a pilot.

    Not applicable.


  45. What would you do differently if you were to start over (e.g., contact more or different departments within a facility, host more meetings, put additional mechanisms in place to facilitate collection, collect additional information)?

    Given the relative ease of collecting the POA data, VHI would have spent more time focusing on lab data collection. Although VHI had more hospitals participating than AHRQ required, VHI would have liked more complete laboratory reporting. VHI had 4 quarters of data from 14 hospitals, 2 quarters from 3 hospitals, 1 quarter from 6 hospitals, and no lab data from 4 hospitals. Although the data files themselves were quite large, per MPA, Inc., the final sample size was limited for several of the conditions being analyzed.

    VHI might also request the data less frequently. Even though the data came from the same hospital (and sometimes even the same person), the format was frequently different from the previous quarter's, which meant that two files from the same hospital had to be checked against each other. Having the hospitals submit quarterly was useful in keeping the project "fresh in their minds," but less frequent submission may have reduced the probability of multiple errors over time. VHI would also have increased the capacity of the website to handle very large data files.
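
    As a minimal sketch of the kind of layout check this implies, the function below compares the header rows of two quarterly submissions from the same hospital, assuming delimited text files with a header row; the file paths and delimiter are illustrative assumptions.

    import csv

    def compare_layouts(path_q1, path_q2, delimiter="|"):
        """Report header differences between two quarterly submissions."""
        def header(path):
            with open(path, newline="") as f:
                return next(csv.reader(f, delimiter=delimiter))

        h1, h2 = header(path_q1), header(path_q2)
        return {
            "only_in_q1": [c for c in h1 if c not in h2],
            "only_in_q2": [c for c in h2 if c not in h1],
            "reordered": sorted(h1) == sorted(h2) and h1 != h2,
        }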


  46. What support do you believe other states will need to build upon your experience (e.g. tools, communications examples, technical assistance)?

    It was very important to have a basic knowledge of the distribution of POA values (i.e., tables showing descriptive statistics from other states).


  47. What are the most important lessons that you learned during this process?

    There was nothing surprising about what VHI encountered during the process. Constant communication and feedback to hospitals are key. VHI had to remain cognizant of hospitals' commitments of time and effort to this project and adjust expectations accordingly.


  48. What would you recommend as AHRQ's next steps in helping states add clinical data to administrative data sets?

    In order to make a commitment to adding clinical data to administrative data, states would greatly benefit from additional information in specific areas. Virginia Health Information notes that further work is necessary to demonstrate the benefits of a HYBRID dataset to the public, health care providers, and the organizations that would develop a HYBRID dataset.

    In its work with hospitals, health insurance companies, and physicians, and as a health data organization, VHI sees the value of further defining applications for clinically enhanced administrative data, evaluating their value, and assessing the feasibility of and developing a business case for HYBRID datasets. In considering these issues, a number of related questions arise:

    • What is the level of improvement in health outcomes measurement across a variety of health outcome measures, both for AHRQ and other quality measures?


    • What are the most important laboratory tests needed to assess these health outcomes measures?


    • What are the initial implementation costs at a hospital level for submission? How can hospitals use this information for internal quality improvement?


    • Can these lessons be applied to those developing electronic health records for further analysis of laboratory data outside the hospital environment?


    • What are the start-up and ongoing costs to develop and operate a hybrid dataset?

    Virginia Health Information believes AHRQ's sponsorship of the pilot contracts "Adding Clinical Data to Administrative Data" has demonstrated that HYBRID datasets can be developed for use. An important next step for AHRQ is to further sponsor pilot efforts with VHI and others to demonstrate the value to outcomes measurement and outline the business case for implementation of HYBRID datasets on an operational level.


  49. Are you willing and interested in helping other states as they work toward adding clinical data to administrative data sets?

    VHI has enjoyed the opportunity to work collaboratively with its partners on this project and looks forward to sharing our collective experience with other states as appropriate.

Internet Citation: Virginia Final Report. Healthcare Cost and Utilization Project (HCUP). July 2016. Agency for Healthcare Research and Quality, Rockville, MD. www.hcup-us.ahrq.gov/datainnovations/clinicaldata/AHRQFINALREPORTVHI.jsp.