VA View Module 7: Systems Design and Designing for Safety
Introduction
[Jeanie Scott, MT (ASCP), Program Director - Informatics Patient Safety, Department of Veterans Affairs, Veterans Health Administration, Office of Informatics and Analytics]
In Module 1, we learned about the history of patient safety in healthcare by reviewing the IOM (Institute of Medicine) report, “To Err Is Human,” and the subsequent establishment of several patient safety organizations, such as the Leapfrog Group and the Agency for Healthcare Research and Quality (AHRQ). [Screenshot of AHRQ webpage] We also reviewed the introduction of national patient safety goals by the Joint Commission.
Health information technology implementation, particularly Computerized Provider Order Entry (CPOE), was recognized as an important part of achieving safer healthcare and reducing or even eliminating certain medical errors. CPOE, along with other features of electronic health records such as up-to-date diagnostic results and decision support, provides VA healthcare professionals with the tools to deliver safer healthcare.
Health Care and Safety
[Health Care and Safety: Automobile Safety Analogy]
As in other high-reliability organizations that use technology, we as humans often encounter design limitations, and even designs that introduce new types of errors. When discussing healthcare and safety, there is often a reference to the aviation industry. While this comparison does recognize the complexity of both industries, it is difficult for many of us to relate to flying an airplane or even being an air traffic controller. That is a set of tasks and experiences that many of us will most likely never encounter, but almost every one of us deals with healthcare, either as a provider or as a recipient of care. That is, we are either on the giving or receiving end of healthcare.
Therefore, I'd like to take healthcare and health information technology and make an analogy to the automobile industry and the evolution of safety and safety design. Every one of us has either driven a car or been a passenger in one, so we can relate to automobile safety. Let's take a common feature of automobile safety: the safety restraint system, that is, seatbelts, airbags, even car seats. [Collage of a man locking the door of a vehicle, a woman fastening her seatbelt and a man putting a child in a car seat]
For many decades, automobiles had none of these, and then in the latter part of the 20th century high-end automobiles began incorporating more sophisticated and easy-to-use safety features. [Photograph of a child in a car seat] As seatbelts evolved, first we had lap belts, then shoulder belts, and some autos even had automatic seatbelt systems. [Photographs of a lap belt system, a man driving with a shoulder belt, and a woman wearing an automatic seatbelt system]
Many of us can even remember driving or being passengers and not even considering using the seatbelt. As information became available about auto design and collision data [Photograph of a car crashing into a wall during collision testing in a factory] (isn't that what a medical error is? A collision of just the wrong processes resulting in an adverse event?), we saw automobile safety restraint system design evolve to include not only the seatbelt but also impact protection: the airbag. Airbags were first introduced on the driver's side and then added to the passenger side. Now, some of us even have side curtains. [Photograph of a car interior with inflated airbags] So, what happened when passenger-side airbags were implemented? Instead of saving lives, passengers such as small adults, children, and infants were being seriously injured and even killed.
But wait! We all know that there were those warnings on the dashboard and the visor that told us of the hazard of the airbag's explosive deployment and that these types of passengers should ride in the back seat. [A hazard sign indicating that a child in a car seat should not ride in a vehicle's front seat] Maybe it was even so specific as to tell us to ride in the center of the back seat. So, were parents and other people bad users of the car's safety restraint system when they rode with their infants? Of course these infants were in rear-facing car seats in the front seat; these parents wanted to keep an eye on that precious cargo. “Oh my, what if the baby needs me and I can't see him or her in the back seat?” Well, we had a dilemma: the same safety feature that could save lives could also take lives. So the design was modified.
Most automobiles built after 2003 began to have smart airbags. [Photograph of a car interior with inflated front and side smart airbags] The auto industry kept the passenger airbag but introduced a sensor that could detect when a small passenger (one weighing less than 70 pounds), or, in my case, my laptop, was sitting on the seat, or when the passenger seat was too close to the dashboard. In those cases, the electronics in the safety restraint system would disable the airbag. This is an example of designing for safety and taking into account the types of users the design needs to accommodate.
Safety Design in the VA
[Safety Design in Health Information Technology within the VA]
Now let's look at the safety design of health information technology within the VA. Just as with automobiles and different types of passengers, health IT has different types of users who all interact with the same essential information. We'll begin by looking at the medication use process. [Medication Use Process] Who are our main users that interact with the system? At minimum, we have the ordering provider, the dispensing pharmacist or pharmacy technician, and the administering nurse. [Photographs of a provider ordering on the phone, a dispensing pharmacist looking at a medicine bottle, and a pharmacy technician and an administering nurse going over a chart]
This is a very simple model of users, and within each of those roles it is a little more complicated. But each of these users depends on the same basic data elements associated with a medication, such as the name of the medication, the dosage, the schedule, and the route of administration. [Medication Data Elements: Name, Dosage, Schedule, Route]
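To make these shared data elements concrete, here is a minimal sketch of a medication order as a small Python structure. This is an illustration only, not the actual CPRS or VistA data model; the field names and the list of allowed routes are assumptions. The point it shows is the route captured as a discrete, validated field that every downstream user sees.

    from dataclasses import dataclass

    # Illustrative only; not the VA's actual order schema.
    ALLOWED_ROUTES = {"IV", "INTRATHECAL", "INTRAMUSCULAR", "BLADDER"}

    @dataclass
    class MedicationOrder:
        name: str       # medication name
        dosage: str     # e.g., "250 mg"
        schedule: str   # e.g., "Q12H"
        route: str      # route of administration, one of ALLOWED_ROUTES

        def __post_init__(self):
            # Reject routes that are not discrete, recognized values.
            if self.route not in ALLOWED_ROUTES:
                raise ValueError(f"Unrecognized route: {self.route}")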
I want to talk about a close call involving an infusion order that revealed a series of design issues that affected each of these basic user types. When using a certain medication order dialog in the VA CPRS software, the only way a provider could actually indicate the route of infusion was to enter it as a free-text comment. [Photograph of a Pharmacist entering information in a computer] This order would then transmit to the pharmacy module. Let me just back up a little bit. The original name for this particular dialog was the IV dialog, not the Infusion dialog, so when we ordered through the IV dialog, the assumption was an IV route of administration. When this dialog was recognized to include more than IV ordering and its name was changed to reflect that, and we started calling it the Infusion dialog, the actual dialog elements were not also redesigned to accommodate an entry for the various routes of infusion, such as IV, intrathecal, or intramuscular.
Now, back to our infusion order, which is now transmitting to the pharmacy software. [Photograph of a pharmacy technician looking at a computer screen while he places an order by phone] Well, the design of the pharmacy software, to make things quicker for pharmacists, also defaulted the data field for infusion type to IV. Do we now see where there is a design problem? The ordering provider had no way to enter the route of infusion as a discrete data input, and the pharmacist would always see IV as a prepopulated data field. If the pharmacy user didn't recognize from the ordering provider's comment that the infusion was other than the IV route, the medication would be dispensed as IV and sent to our third user: those using the medication administration software. [Photograph of a woman working on medication administration software] In the VA, that is the Bar Code Medication Administration software, or BCMA, and the order would arrive there as an IV infusion.
Now, to add to the design flaws: even if the pharmacist edited the default route of administration from IV to the route indicated in the ordering provider's comments, let's say intrathecal, and that correct route, let's say we call it IC, did not happen to have a matching abbreviation in the file setup, the BCMA software was designed so that it displayed no indication of the intended route of infusion when there was no abbreviation for that route. The nurse administering it would then be under the assumption, “This is showing on the IV tab of BCMA, so it must be an intravenous route of infusion,” and the nurse would administer an intrathecal-intended medication via an intravenous route.
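To illustrate the kind of mitigation being described, here is a simplified sketch of safer display logic in Python. This is hypothetical, not the actual BCMA code or the VA's implemented fix; the abbreviation table and messages are assumptions. The point it shows is that a display should surface the ordered route and flag a missing or unrecognized route for verification, rather than silently defaulting to or implying IV.

    # Hypothetical sketch; not the actual BCMA logic or the VA's fix.
    ROUTE_ABBREVIATIONS = {
        "IV": "Intravenous",
        "IT": "Intrathecal",
        "IM": "Intramuscular",
    }

    def route_for_display(ordered_route_abbrev):
        """Return the route label to show the administering nurse."""
        if not ordered_route_abbrev:
            # No discrete route on the order: force review instead of assuming IV.
            return "ROUTE NOT SPECIFIED - VERIFY WITH PHARMACY"
        label = ROUTE_ABBREVIATIONS.get(ordered_route_abbrev.upper())
        if label is None:
            # Unrecognized local abbreviation (for example, "IC"): flag it, don't hide it.
            return "UNRECOGNIZED ROUTE '" + ordered_route_abbrev + "' - VERIFY WITH PHARMACY"
        return label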
I just want to note that this is not a design flaw found only in VA software; there are documented cases at other healthcare institutions where this same medical error occurred. There are documented cases where either chemotherapy agents or analgesics were administered via the wrong route, intravenous instead of intrathecal, and the results were lethal.
Suffice it to say, in the VA case that we worked on, this was recognized with a bladder infusion of an antibiotic, and design changes were implemented to mitigate this risk. It was through investigation and evaluation of the entire medication use process, identifying the different user roles and their respective uses of the data elements, that the design in the three systems was corrected.
Modifying Designs
[Modification of the Design of Laboratory Information Displays]
In another example regarding software design and the uniqueness of user types, the VA's informatics patient safety team, including a cognitive engineer, collaborated with CPRS developers and the testing team to modify the design of laboratory information displays. [Assemble a representative user group: cognitive engineer, primary care physicians, nurses, laboratory technologists, pathologists, health informaticist, software developer] In this case study, the cognitive engineer took users such as primary care physicians, nurses, laboratory technologists, pathologists, and clinical informatics specialists through clinical scenarios in which the task was to retrieve laboratory results.
The manner in which these clinical users sought out diagnostic results depended not only on their professional role (physician versus nurse versus laboratorian) but also on their clinical specialty as it relates to acute or long-term care.
While our initial study was intended only to evaluate one of several information displays on the CPRS lab tab, it revealed that even within a single user type, let's say physicians, the physician-specific workflow and information-seeking needs mean that one design format versus another produces either desirable or undesirable satisfaction, and ultimately one design may be a better fit for the task.
[Determine how different users in the representative group interact with the software.] We used various methods to uncover how different user types seek out information, just for the simple task of retrieving the value of an ordered lab test. One of the methods we used was projecting the as-is design and asking users to think aloud, telling us what they were interpreting, as we incrementally recorded their remarks and slowly moved to their next intended action on the software display.
After gathering the as-is use, we then engaged similar users in a moderate-fidelity, user-centered design process. [Conduct study through user-centered design process.] The moderate fidelity was simply to project the as-is display onto a whiteboard, such as what you see behind me, [Display as-is design.] and have the users take whiteboard markers and literally redraw their design changes. [Solicit feedback from users indicating desired changes.] Oh, and by the way, we had a developer in the room who could actually observe what the users were thinking and doing. [Observe and document users' design changes.] We also had requirements analysts in the room who actively documented the specifications and changes needed.
Now, we weren't done. [Incorporate design changes into program] After the developer made the requested user changes, the next part of our job still remained. We then went to two different medical facilities, [Test new design] neither of which was a test site for the proposed version upgrade. [Collage of two medical facilities] Using the same clinical scenarios that had uncovered the need for a design change back in our user-centered design phase, we again engaged clinical users, ranging from physicians to nurses to pharmacists and across acute, primary, and long-term care, to quantify how effective the design change was. Users were asked to complete an information-seeking task using the same scenario and known data; that is, we as the testing team knew the results based on what we had populated, so we could see how well our users, given a scenario, could consistently find the correct result.
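As a rough illustration of what that kind of scenario-based validation might look like, here is a small Python sketch. The field names and scoring are assumptions for illustration, not the VA testing team's actual artifacts; it simply records the seeded (known) result for each scenario and measures how often users retrieve it.

    from dataclasses import dataclass

    # Illustrative only; not the VA testing team's actual test artifacts.
    @dataclass
    class ScenarioTestCase:
        scenario: str       # clinical scenario given to the user
        seeded_result: str  # lab value the testing team populated
        user_role: str      # e.g., "physician", "nurse", "pharmacist"

    def proportion_correct(trials):
        """trials: list of (ScenarioTestCase, value the user reported)."""
        hits = sum(1 for case, reported in trials if reported == case.seeded_result)
        return hits / len(trials)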
This last part of what I've been describing, testing the user-centered design, is just as critical as the user-centered design phase, because this testing validates the assumptions from the user-centered design. After observing not only the user-centered design methodology but also being part of a testing validation team, I would definitely say one of the most critical parts of the process is designing the test cases. Designing not just the IT product, but also a set of criteria and a sample population that will demonstrate the design is indeed a valid assumption, is what completes the user-centered design process. When our goal is to design out error conditions, we also need to design how we will measure whether the outcome is better than the original. There is much covered in the systems design and designing for safety module, and these are just a couple of examples of these methods and how they were applied within the VA to continuously evaluate and improve health information tools for our various clinical users. Designing for safety is incremental and, like the safety restraint systems in the automobile industry, continuously evolving as the environment and user base change.