The Role of the Electronic Health Record in Patient Safety Events
Pa Patient Saf Advis 2012 Dec;9(4):113-21.   
 

Erin Sparnon, MEng
Senior Patient Safety Analyst

William M. Marella, MBA
Program Director
Pennsylvania Patient Safety Authority

ABSTRACT

As adoption of health information technology solutions like electronic health records (EHRs) has increased across the United States, so has attention to the safety and risk profile of these technologies. However, several groups have identified the lack of available safety data as a major challenge to assessing EHR safety, and this study was performed to inform the field about the types of EHR-related errors and problems reported to the Pennsylvania Patient Safety Authority and to serve as a basis for further study. Authority analysts queried the Pennsylvania Patient Safety Reporting System for reports related to EHR technologies and performed an exploratory analysis of 3,099 reports using a previously published classification structure specific to health information technology. The majority of EHR-related reports involved errors in human data entry, such as entry of “wrong” data or failure to enter data, while relatively few reports indicated technical failures on the part of the EHR system. This may reflect the clinical mindset of the frontline caregivers who report events to the Authority.

Introduction

Adoption of electronic medical records (EMRs) and electronic health records (EHRs)* in US healthcare facilities is growing: HIMSS Analytics reports that, as of the second quarter of 2012, over three-quarters of US healthcare facilities had achieved at least stage 3 of its seven-stage EMR Adoption Model.1 Stage 3 reflects a facility having the cumulative capabilities for electronic flow sheets, error checking, and picture archiving and communication systems (PACS) available outside of the radiology department.1 However, as adoption grows, so does concern over the potential safety implications of these systems. The recently released Institute of Medicine report Health IT and Patient Safety: Building Safer Systems for Better Care2 noted a lack of hazard and risk reporting data on health information technology (HIT) as a hindering factor in building safer systems. In response to this need for information on the scope and extent of EHR risks posed by today’s implemented systems, Pennsylvania Patient Safety Authority analysts identified EHR events in the Authority’s Pennsylvania Patient Safety Reporting System (PA-PSRS).

_________________________
* For the purposes of this article, the term “EHR” is used to denote a family of technologies that includes electronic medical records and electronic medication administration records, except in instances in which “EHR” constitutes a search or manufacturer-specific term.
_________________________

Methods

Authority analysts queried the PA-PSRS database on May 23, 2012, using the keywords “emr,” “ehr,” “adt,” “electronic med,” “electronic health,” “information system,” “dropdown,” “default,” “selection,” “mouse,” “no record,” and “link,” in conjunction with EHR supplier and system names. The query returned 8,003 reports from June 2, 2004, through May 18, 2012. Analysts noted that the search query returned some types of reports in which EHR involvement was either incidental or could not be confirmed, such as the following:

  • An event (e.g., a fall) that was reported in the EHR but for which no EHR systems were involved in or contributed to the event
  • Manual errors that were committed outside EHR systems, such as pulling the wrong medication from a cabinet or applying the wrong label to a specimen
  • Reports that indicated the use of a paper-based chart or did not specify whether an electronic system was involved
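
To make this screening step concrete, the following is a minimal Python sketch of a keyword screen like the one described above. The report records, field names, and example narratives are hypothetical, and in practice the EHR supplier and system names mentioned in the text would be appended to the keyword list.

    # Minimal sketch of the keyword screen described in the Methods. The
    # report records and field names are hypothetical; the keyword list is
    # taken from the text above. Plain substring matching is deliberately
    # crude and also catches inflected forms (e.g., "defaulted").
    KEYWORDS = [
        "emr", "ehr", "adt", "electronic med", "electronic health",
        "information system", "dropdown", "default", "selection",
        "mouse", "no record", "link",
    ]

    def matches_query(narrative: str) -> bool:
        """True if any query keyword appears in the free-text narrative."""
        text = narrative.lower()
        return any(kw in text for kw in KEYWORDS)

    # Hypothetical example reports, for illustration only.
    reports = [
        {"id": 1, "narrative": "Order defaulted to next-day start in the EMR."},
        {"id": 2, "narrative": "Patient fall in hallway, no injury."},
    ]
    candidates = [r for r in reports if matches_query(r["narrative"])]
    print([r["id"] for r in candidates])  # -> [1]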

A random sample of approximately 20% of these event reports was created by assigning each of the 8,003 queried reports a random number between 0 and 1 and reviewing those reports with a randomly assigned number between 0 and 0.2. This random sample was manually reviewed by one analyst with a background in clinical and biomedical engineering to classify the events as relevant or not relevant to the topic of patient safety events involving the EHR; 933 (59.5%) of the 1,567 manually reviewed reports were identified as relevant.
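
The sampling step is a uniform random draw; a minimal sketch under the numbers stated above (8,003 reports, a 0.2 cutoff):

    import random

    # Each queried report receives a uniform random number in [0, 1); those
    # drawn below 0.2 form the ~20% manual-review sample. The seed is
    # arbitrary and shown only so the example is reproducible.
    random.seed(42)
    report_ids = range(8003)  # stand-ins for the 8,003 queried reports
    sample = [rid for rid in report_ids if random.random() < 0.2]
    print(len(sample))  # close to 1,600; the study's sample was 1,567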

With the intent of reducing manual review of irrelevant reports, the data set of manually reviewed event reports (n = 1,567) was divided into training and validation data sets for a machine-learning model. The objective of the model was to estimate the probability of relevance of unlabeled cases using an algorithm trained on manually labeled cases. The training data set contained 70% (n = 1,097) of the manually reviewed reports, while 30% (n = 470) of reports were used in 10-fold cross-validation with stratified sampling. The best-performing model, using a Naïve Bayes kernel classifier, achieved an area under the receiver operating characteristic (ROC) curve of 0.927±0.023 after dropping uncertain predictions (i.e., those with less than 90% confidence).
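
As a rough illustration of this modeling step, here is a sketch using scikit-learn. The study used a Naïve Bayes kernel classifier, for which scikit-learn has no direct equivalent, so this sketch substitutes a multinomial Naïve Bayes over bag-of-words features; the toy narratives and variable names are placeholders, not the study’s data.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import (StratifiedKFold, cross_val_score,
                                         train_test_split)
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for the 1,567 manually labeled narratives (1 = relevant).
    narratives = (["order defaulted to wrong start time in the ehr"] * 20
                  + ["patient fall in hallway, no injury"] * 20)
    labels = [1] * 20 + [0] * 20

    # 70/30 split mirroring the study's training/validation division.
    X_train, X_val, y_train, y_val = train_test_split(
        narratives, labels, train_size=0.70, stratify=labels, random_state=0)

    model = make_pipeline(TfidfVectorizer(), MultinomialNB())

    # 10-fold stratified cross-validation scored by area under the ROC curve.
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    auc = cross_val_score(model, X_train, y_train, scoring="roc_auc", cv=cv)
    print(f"AUC {auc.mean():.3f} +/- {auc.std():.3f}")

    # Score unlabeled reports, dropping uncertain predictions (<90%
    # confidence) as described in the text.
    model.fit(X_train, y_train)
    unlabeled = ["nurse could not acknowledge respiratory order in system"]
    proba = model.predict_proba(unlabeled)
    confident = np.max(proba, axis=1) >= 0.90
    relevant = (proba[:, 1] > 0.5) & confident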

This model was then applied to the remaining 6,436 queried reports that had not been manually classified. The machine-learning tool identified 2,500 of the 6,436 reports as relevant. These 2,500 reports were then manually screened, and analysts deemed 2,166 of them (87%) relevant to EHRs. In total, 3,099 reports were confirmed as relevant to EHRs (933 from the initial random sample and 2,166 from the machine-learning sample), and these reports were subjected to further analysis. Analysts noted that EHR-related reports are increasing over time, which was to be expected given that EHR adoption is growing in the United States overall (see Figure 1).

Figure 1. Reports Related to Electronic Health Records (June 2004 through May 2012)
Results

Classification by Harm Score

Reported events were categorized by their reporter-selected harm score (see Table 1). Of the 3,099 EHR-related events, 2,763 (89%) were reported as “event, no harm” (i.e., an error did occur, but there was no adverse outcome for the patient), and 320 (10%) were reported as “unsafe conditions,” which did not result in a harmful event. Fifteen reports involved temporary harm to the patient due to the following: entering wrong medication data (n = 6), administering the wrong medication (n = 3), ignoring a documented allergy (n = 2), failure to enter lab tests (n = 2), and failure to document (n = 2). Only one event report, related to a failure to properly document an allergy, involved significant harm.

Table 1. Classification of Reports Related to Electronic Health Records, by Harm Score

The report involving significant harm read as follows:

Patient with documented allergy to penicillin received ampicillin and went into shock, possible [sic] due to anaphylaxis. Allergy written on some order sheets and “soft” coded into Meditech but never linked to pharmacy drug dictionary.

Although the vast majority of EHR-related reports did not document actual harm to the patient, analysts believe that further study of EHR-related near misses and close calls is warranted as a proactive measure.

Classification by Event Type

EHR-related reports represented many event types in the Authority’s classification system (see Table 2); however, the vast majority of reported events (81%) involved medication errors, mostly wrong-drug, -dose, -time, -patient, or -route errors (50%) or omitted dose (10%). The only other event type with a significant number of reports was complications of procedures, treatments, or tests (13%), most of which involved lab test errors (7%). Analysts attributed this distribution of event types to the wide-reaching nature of potential EHR-related problems. EHR systems are used for the ordering, validation, and administration of medications, laboratory tests, and diagnostic and therapeutic procedures. Therefore, it is not surprising that reported errors related to EHR use are associated with these event types.

Table 2. Classification of Reports Related to Electronic Health Records, by Event Type

Relevant cases were further classified by the same analyst according to an HIT-specific taxonomy developed by Magrabi et al.3 This taxonomy includes classifications for problems with data input, transfer, output, general technical issues, and contributing factors (see Figure 2). Analysts considered applying the HIT taxonomy contained in the new Agency for Healthcare Research and Quality (AHRQ) Common Formats for risk reporting; however, insufficient detail was present in the narrative reports to properly apply this taxonomy.

Figure 2. Magrabi et al. Classification of Reports Related to Health Information Technology

Analysts identified four new categories, expanding the Magrabi et al. classification to include specific problems with unit errors in wrong data entry (1.2.1.1), data entered into wrong fields (1.2.1.2), misreading or misinterpreting displayed information (3.4.5), and default values in system configurations (4.4.2.1).
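
For reference, the four added codes can be written as a simple code-to-description mapping; the descriptions below paraphrase this article, and the numbering follows the Magrabi et al. scheme.

    # The four codes Authority analysts added to the Magrabi et al. taxonomy.
    # Descriptions paraphrase this article.
    NEW_CODES = {
        "1.2.1.1": "wrong input - units error (e.g., lb vs. kg)",
        "1.2.1.2": "wrong input - data entered into the wrong field",
        "3.4.5": "misreading or misinterpreting displayed information",
        "4.4.2.1": "software issue - system configuration - default values",
    }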

Some reports were tagged with more than one problem type, such as in the following example:

Patient was ordered albuterol 0.5 mL Q4H [every four hours] and ipratropium 2 mL Q4H nebulized breathing treatments at 8:00 a.m. into ProTouch system. The order was acknowledged by nursing, but nursing did not notify RT [the respiratory therapy department] of new orders. RT did not become aware of orders until eight hours later. Due to limitations of ProTouch, RT cannot acknowledge respiratory orders; thus, therapist on duty was unaware of the new orders until overdue order report run at end of shift (two doses of each medication missed by that time). Patient did not experience any adverse effects from delay in respiratory therapy treatment; patient’s respirations were unlabored.

  • This report was tagged with:
    • 3.4.4, not alerted, because the system was not set up to alert respiratory therapists
    • 4.4.1, software issue—functionality, because the system does not allow alerting of respiratory therapists

An additional example is as follows:

A pharmacist entered correct day start time (9/10) for Lovenox®, but interface between pharmacy system and Bridge [administration system] caused the order to default to next day start time. The nurse signed off order without confirming correct order entry and did not "Add Dose" in Bridge to correct start time; patient missed one dose.

  • This report was tagged with:
    • 2.2, system interface issues, because the interface between the pharmacy and Bridge systems changed the order settings
    • 3.3, output/display error, because the Bridge system output an incorrect start time
    • 3.4.2, missing data (did not look at complete record), because the nurse did not confirm correct order entry
    • 4.4.2.1, software issue—system configuration—default, because the Bridge system was configured to change to a default start time

Another report read:

Acetate component was not ordered under the component section but was ordered in the administration instructions, which is a free-text field that does not link with the TPN [total parenteral nutrition] additives and was missed by pharmacy upon verification and transcription into the TPN program. Acetate should have been ordered as meq/kg and not acetate 50:50, which was in the administration instructions.

  • This report was tagged with:
    • 1.2.1.2, wrong input—wrong field, because the component order was placed in the wrong field
    • 3.4.2, missing data (did not look at complete record), because the pharmacist did not pull information from the administration instructions field

Overall, 96% of the reports were tagged with only one or two tags (see Table 3), and 3,946 problems were identified in the 3,099 relevant reports.
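
The tally behind Table 3 reduces to counting tags per report; a minimal sketch using the tags from the three reports quoted above (the report identifiers are hypothetical):

    from collections import Counter

    # Tags applied to the three example reports quoted above; the report
    # identifiers are hypothetical.
    report_tags = {
        "rpt-001": ["3.4.4", "4.4.1"],
        "rpt-002": ["2.2", "3.3", "3.4.2", "4.4.2.1"],
        "rpt-003": ["1.2.1.2", "3.4.2"],
    }
    tags_per_report = Counter(len(tags) for tags in report_tags.values())
    total_problems = sum(len(tags) for tags in report_tags.values())
    print(tags_per_report)  # Counter({2: 2, 4: 1})
    print(total_problems)   # the study found 3,946 problems in 3,099 reports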

Table 3. Number of Tags Assigned per Report



Comparison with Other Data Sets

In general, narrative reports from the Authority database exhibited a very different pattern of problem types than the two sets of data tagged by Magrabi et al. (the US Food and Drug Administration’s [FDA] Manufacturer and User Facility Device Experience [MAUDE] database, in which there were 712 problems from 432 reports, and Australia’s Advanced Incident Management System, in which there were 117 problems). Analysts noted that the most commonly used tags for reports to the Authority were related to wrong input (applied to 47% of reports), failure to update data (18%), or default system configuration (10%).3 Many of the classifications developed by Magrabi et al.—especially those that focused on failures of the network, hardware, or software—applied to few or no reports. (See Table 4.)

Table 4. Application of Magrabi et al. Taxonomy to Queried Reports

Wrong Input

Problems related to wrong input (n = 1,867) spanned a wide range of event types and outcomes: transposition or transcription errors in the entry of orders or administration information, entry of incorrect patient parameters (like weight or blood glucose) that trigger calculations of incorrect therapy, and even entry of the wrong physician name, resulting in reports being sent to the wrong recipient. Authority analysts identified two new categories to describe specific types of wrong-input problems that deserved more attention: 1.2.1.1 wrong input—units error (n = 18) and 1.2.1.2 wrong input—wrong fields (n = 65). Reports tagged with “units error” typically involved mix-ups between patient weight units (lb versus kg) or selection or entry of an incorrect dosing unit for a medication (e.g., weight-based dosing like mg/kg/hr versus non-weight-based dosing like mg/hr), and analysts noted that default values contained in EHR systems were mentioned as contributing factors in three of these reports. Reports tagged with “wrong fields” typically indicated unfamiliarity with the configuration or function of a facility’s EHR system. Users were entering data in a field that was inappropriate for the intended data, as in the following example:

A patient received two extra doses of oral magnesium oxide 400 mg. Order originally placed by physician for [magnesium] oxide 400 mg [twice a day] for two days or four doses. Physician did not place stop date into ProTouch as per proper procedure but instead wrote instructions in the free-text box of ProTouch. When the order was verified by the pharmacist, instructions in the text box [were] not acknowledged. When the nursing staff administered the medication, written instructions [were] not acknowledged. Event [was] discovered by pharmacist after the patient had received six doses of medication.
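
A short worked example with hypothetical numbers shows how the lb-versus-kg mix-up described above propagates into a weight-based dose:

    # Hypothetical numbers illustrating a lb-vs-kg mix-up in an infusion
    # ordered as weight-based dosing (mg/kg/hr).
    LB_PER_KG = 2.2046

    def hourly_dose_mg(rate_mg_per_kg_hr: float, weight_kg: float) -> float:
        """Hourly dose for a weight-based infusion rate."""
        return rate_mg_per_kg_hr * weight_kg

    true_weight_kg = 70.0
    weight_in_lb_entered_as_kg = true_weight_kg * LB_PER_KG  # 154.3 "kg"

    correct = hourly_dose_mg(1.0, true_weight_kg)                # 70 mg/hr
    erroneous = hourly_dose_mg(1.0, weight_in_lb_entered_as_kg)  # ~154 mg/hr
    print(correct, erroneous)  # the erroneous dose is more than double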

Default Values

This classification was created when Authority analysts noted that a large proportion of system configuration issues mentioned errors due to default values. Like wrong-input problems, default-value problems spanned a wide range of event types and outcomes, but reports generally fell into one of two categories: (1) a user failed to modify a prepopulated default value for dose, time, route, or other parameters in an order, or (2) after entry of an order, a system replaced entered information with default values, often for start times. After correspondence with Magrabi,4 reports of the first type (“user failure to modify a default,” n = 70) were retagged as 1.2.3, failure to update data, and reports of the second type (“system inserts a default after human entry,” n = 221) were tagged with a new code, 4.4.2.1, software issue—system configuration—default.
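
The second failure mode behaves like a configuration bug in which a default is applied unconditionally rather than only when a value is missing. A hypothetical sketch (the field names and behavior are illustrative, not those of any named product):

    from datetime import date, timedelta

    # Hypothetical sketch of a "system inserts a default after human entry"
    # failure: an interface applies a configured next-day default start time
    # unconditionally, overwriting the start date the user entered.
    SYSTEM_DEFAULTS = {"start_date_offset_days": 1}

    def transfer_order(order: dict) -> dict:
        received = dict(order)
        # Bug pattern: the default should apply only when "start_date" is
        # missing, but here it is applied unconditionally.
        received["start_date"] = date.today() + timedelta(
            days=SYSTEM_DEFAULTS["start_date_offset_days"])
        return received

    order = {"drug": "enoxaparin", "start_date": date.today()}  # same-day start
    print(transfer_order(order)["start_date"])  # next day; a dose is missed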

Failure to Update Data

Problems related to failure to update data (n = 762) largely involved four event types: (1) users failing to transcribe written or verbal orders into an electronic order or pharmacy system, (2) users failing to enter lab results into an information system, (3) users failing to modify a default value to an intended value (as described in the discussion regarding default values), or (4) users reporting that they did not properly document a clinical activity like removing a medication from stock or administering a therapy. Analysts noted that many failure-to-document events involved situations in which documentation was completed in a paper system but not an electronic system (n = 85). By attempting to use both paper and electronic systems in the course of workflow, users created confusing and conflicting situations in which patient care was compromised, such as in the following case:

A patient was admitted to [the emergency department] with [a urinary tract infection]. A physician prescribed ciprofloxacin 500 mg [by mouth, once]. Patient had been in the [emergency department] for a while, and the previous nurse had administered the dose without documenting it on the physician’s order sheet. The next nurse also administered the dose because she did not see it documented. When she went into the EHR, she saw that the previous nurse had documented [the initial administration] in the computer. She called the nurse to double-check that the [medication] had been given. The physician was notified about the double dose.

Discussion

The pattern of reported problems present in the PA-PSRS database was different than that found by Magrabi et al. in FDA’s MAUDE database. Analysts attribute this difference in problem patterns to (1) differences in both the databases themselves and the people who populate them and (2) limitations of the existing PA-PSRS data set.

PA-PSRS and MAUDE differ in scope and reporting requirements. The MAUDE database is populated by mandatory and voluntary reports of device failures and device-related errors. Currently, devices and systems like radiology information systems (RIS) and PACS are FDA-cleared medical devices with mandated reporting requirements, while EHR systems, laboratory information systems, and computerized provider order entry (CPOE) and pharmacy (PhIS) systems are not. Therefore, the MAUDE database is likely to contain more reported events related to PACS than CPOE. The query string used for this analysis also differs from the string used by Magrabi et al.; it specifically targeted EHR- and EMR-related events and did not include terms related to RIS or other more broadly defined HIT technologies.

PA-PSRS and MAUDE also differ in the background of reporting individuals. The MAUDE database is typically populated by biomedical and clinical engineers employed by facilities and manufacturers, while the PA-PSRS database is typically populated by risk management professionals who are collecting clinical narrative event reports. Both reporting systems receive reports that are framed by the reporter’s experience. Frontline caregivers will likely recognize if they have failed to perform a duty or have entered incorrect information, but they will rarely have enough information to suspect a problem with device components or network components. Therefore, most frontline caregiver reports of system availability errors may indicate that the “computer system was down” (tag 4.1), even if the underlying cause is a device interface issue, a network configuration problem, or an access problem.

Limitations

Specific limitations of this study may also shape the nature and frequency of EHR-related events present in the PA-PSRS database query.

Reporting statutes of PA-PSRS. Pennsylvania healthcare facilities are required to report Serious Events, Incidents, and Infrastructure Failures through the Authority’s PA-PSRS. However, Infrastructure Failures are accessible only by the Pennsylvania Department of Health. In many facilities, a failure of a computer, information system, or network may be classified as an Infrastructure Failure and would not appear in the Authority’s data set.

Awareness of EHRs as a potential contributing factor to an error. As noted previously, frontline caregivers may not suspect that an EHR system has contributed to a human error. Events in the PA-PSRS database were not picked up by the analysts’ query if they did not specifically call out a particular system or EHRs in general. Therefore, if a frontline caregiver did not suspect that the configuration of an EHR somehow contributed to their choosing the wrong drug for a patient, they may have simply reported that they selected the wrong drug and not mentioned the EHR.

Limitations of narrative reporting. These limitations affected both the types of reports queried and the tags applied. Unless a narrative report specifically included the search query terms, the report was not captured by this query. Unless a problem type or contributing factor was specifically mentioned in a narrative report, it could not be tagged. Perhaps because of these limitations, few of the contributing factors identified by Magrabi et al. could be applied to queried reports. Although analysts may suspect that EHR-related errors can stem from inadequate training, interruption, or multitasking, they could not apply these tags unless the narrative specifically identified these problems.

The limitations of using narrative review to identify EHR-related reports could be alleviated through the use of an EHR- or HIT-specific event taxonomy like that used for the AHRQ Common Formats. Going forward, it may be advantageous for the Authority to include EHR- or HIT-specific options in the event type taxonomy and provide educational materials on the use of this taxonomy. This would prompt reporters to specify whether they believe EHR systems played a role in the reported event and would reduce the burden of manually reviewing irrelevant queried reports. As in any scientific study, adding to reporter knowledge will likely increase the quality of reports and decrease the number of missed risk events, allowing the Authority a greater understanding of HIT risks.

Query design. This study’s query focused on EHR system names and usage terms. Terms related to missing, lost, or corrupted data were not specifically included in the search string, although reports of this type were identified in the study. Further study using a more focused query string could identify more reports of system errors resulting in missing, lost, or corrupted data.

Further refinement of the machine-learning tool. Analysts have not manually confirmed the remaining 3,936 queried reports that were identified as “not EHR-relevant” by the machine-learning algorithm. Therefore, events that were falsely tagged by the machine-learning algorithm as “not EHR-relevant” were excluded from this analysis. Identification of false-negative machine-learning results could allow for refinement of the machine-learning algorithm.

Conclusions

Overall, 3,946 problems were identified in the 3,099 reports of EHR-related events identified through this query of the Authority’s database, and several themes that may prove fruitful for further study were identified in reviewed reports, including the following:

  • The types of reported human-related problems (e.g., wrong entry, wrong field, failure to update data) could have many underlying causes, which could not be captured in the current data set of narrative reports. Further study could provide more insight into the root causes of these errors, which may include issues in workflow design or policies and procedures, usability or functionality gaps in the design or configuration of an electronic system, or gaps in the training or understanding of the user population.
  • Ongoing study of incident reports can help identify the common types of problems seen with EHRs. The Authority can help improve patient safety by characterizing and systematically addressing these common problem types even in the absence of root-cause data.
  • EHR- and HIT-related reports that are classified by reporting facilities as Infrastructure Failures are accessible by the Pennsylvania Department of Health but not by the Authority. Because many facilities classify failures of information technology networks and systems as Infrastructure Failures, this type of report is likely to be underrepresented in the Authority’s database. A query of the Infrastructure Failure reports may identify more machine- and system-related reports of EHR and HIT events.
  • Adding EHR- and HIT-specific event types and taxonomy to the Authority’s reporting system may increase the number and quality of event reports related to EHRs and HIT.
  • Dual workflow that uses both paper-based and electronic records seems particularly problematic and may be of interest for further study as more facilities transition between paper-based and electronic systems.
  • The configuration of electronic systems, especially the use of default values, seems to lead to certain types of errors in medication orders and documentation. Further study could shed some light on best practices in the use of default values in system configuration.

Acknowledgments

Edward Finley, BS, Pennsylvania Patient Safety Authority, contributed to data acquisition and validity for this article.

Notes

  1. HIMSS Analytics Database. US EMR Adoption Model [online]. 2012 [cited 2012 Jul 26]. http://www.himssanalytics.org/stagesGraph.asp.
  2. Institute of Medicine. Health IT and patient safety: building safer systems for better care [online]. 2011 Nov 8 [cited 2012 Jul 25]. http://www.iom.edu/Reports/2011/Health-IT-and-Patient-Safety-Building-Safer-Systems-for-Better-Care.aspx.
  3. Magrabi F, Ong MS, Runciman W, et al. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc 2012 Jan-Feb;19(1):45-53.
  4. Magrabi, Farah. E-mail to: Pennsylvania Patient Safety Authority. 2012 Oct 31.
 