Introduction to patient safety

Welcome to the “Introduction to Patient Safety” module. Please note that this module is a prerequisite for the other modules in the series.

Course Description

This course will examine patient safety and the potential for medication incidents from two aspects:
(1) the medication-use system (e.g. prescribing, order entry, dispensing, administration, and monitoring); and
(2) patient care (e.g. preventable adverse drug events experienced by patients).

It will build on materials from the Institute for Safe Medication Practices Canada (ISMP Canada), the Canadian Patient Safety Institute (CPSI), and the concept of continuous quality improvement in pharmacy practice. The CPSI Patient Safety Competency Domains will be applied to topics covered in this course.

Rationale for Inclusion in the Curriculum

As pharmacists practice patient-centered care and take on leadership roles in the Canadian health care system, they need to embrace a culture of medication safety in which shared learning from non-punitive reporting of near misses and medication incidents is possible, and to establish a practice environment where standardized continuous quality improvement supports the consistent delivery of safe and effective patient care. This course will give students the opportunity to use tools for medication-use system improvement.

Learning objectives

After completing this module, learners will be able to:

  • Describe the scope of medical and medication incidents in our health care system.
  • Recognize the individual and system-level factors that contribute to the complexity of health care environments, and therefore increase the risk of errors.
  • Explain the principles of human factors engineering.
  • Identify error prevention strategies that are system-based to account for human factors.

Module outline

There are four key components to this module: (1) the scope of medical and medication incidents, (2) systems theory, (3) human factors engineering, and (4) error prevention strategies.

The scope of medical and medication incidents

Definitions:

  • A medication incident: Any preventable event that may cause or lead to inappropriate medication use or patient harm while the medication is in the control of the health care professional, patient, or consumer. Medication incidents may be related to professional practice, drug products, procedures, and systems, and can involve prescribing, order communication, product labeling, packaging or nomenclature, compounding, dispensing, administration, patient education, or monitoring. A simplified definition is a mistake with medication, or a problem that could cause a mistake with medication.
  • High-alert medications: These are medications that bear a heightened risk of causing significant patient harm when they are used in error. Although mistakes may or may not be more common with these drugs, the consequences of an error are clearly more devastating to patients.
    ISMP in the U.S. developed lists of high-alert medications per health care setting – for acute care, community and ambulatory care, and long-term care. 
  • Adverse events can be defined as unintended injuries or complications caused by health care providers rather than by the patient’s disease, which lead to death, disability, or an extended hospital stay.

Examples of medication incidents

These are all examples shared with the Institute for Safe Medication Practices Canada, also known as ISMP Canada.

Drug interactions

This ISMP Canada newsletter was based on pharmacoepidemiological studies that showed an association between certain drug interactions and an increased risk of hospitalization. It provided pharmacists with tools to help recognize the interactions, communicate the risk to prescribers, and recommend alternative therapies to reduce the risk of hospitalization. The next incident was shared during a live webinar so that others could learn from the error and make system-level improvements to avoid a similar error in their practice.

Inadvertent Overdose

Example: An opioid-tolerant palliative patient on a medicine unit was prescribed morphine 1-5 mg per hour. An infusion bag was sent up containing 100 mg of morphine in 100 mL. Two nurses hung the bag and programmed the smart pump. Because morphine is a high-alert medication, an independent double-check was performed, but neither nurse was aware that continuous infusions (in which the medication is given slowly over a long period of time) were not the only morphine setting in the pump library; on that particular medicine unit, morphine is usually dosed intermittently, so the “infusion time” default was set to only 30 minutes.
The full 100 mL bag was infused over 30 minutes as an intermittent dose, so the patient received 100 mg of morphine (40 to 200 times the intended dose). After the 30-minute infusion was complete, the pump beeped, and the nurse returned to the patient. The physician and family were notified of the error immediately, and the decision was made to honor the DNR order (which meant no rescue attempts were made); the patient passed away 3 hours later.
The webinar is available on the Med Safety Exchange website.
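
As a rough check of the figures in this example (the doses and times are taken from the text; the calculation itself is only illustrative):

```python
# Rough check of the figures in the morphine example (values from the text).
bag_dose_mg = 100                  # 100 mg of morphine in a 100 mL bag
infusion_time_h = 0.5              # pump default: 30 minutes
intended_rate_mg_per_h = (1, 5)    # prescribed range: 1-5 mg per hour

# Amount the patient should have received in 30 minutes at the prescribed rate
intended_low = intended_rate_mg_per_h[0] * infusion_time_h    # 0.5 mg
intended_high = intended_rate_mg_per_h[1] * infusion_time_h   # 2.5 mg

# Actual amount delivered: the entire bag
actual_mg = bag_dose_mg

print(f"Intended over 30 min: {intended_low}-{intended_high} mg")
print(f"Actual: {actual_mg} mg ({actual_mg / intended_high:.0f} to "
      f"{actual_mg / intended_low:.0f} times the intended dose)")
```
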
The next incident was shared in a newsletter created for consumers and patients so that they can be empowered to optimize their health and safety.

Dose Miscalculation

A child was given the wrong dose of a medication because her weight was misunderstood. The child was seen in a clinic for a sore throat. Two days later, test results showed that she had a throat infection, and her mother was called. Over the phone, the mother reported the child’s weight as “18”. This information was shared with the doctor, who prescribed an antibiotic at a dose based on a weight of 18 pounds.
The child was brought back to the clinic five days later, with persistent fever and sore throat. This time, the doctor weighed the child and discovered that she weighed 18 kilograms (or 40 pounds). The antibiotic dose had been based on the wrong body weight and was too low to treat the infection.
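
A worked version of the weight mix-up; the pound-to-kilogram conversion is standard, but the 50 mg/kg regimen below is a hypothetical value used only to show the size of the resulting underdose (it is not from the bulletin):

```python
# Illustration of the pound/kilogram mix-up. The 50 mg/kg regimen is a
# hypothetical value used only to show the size of the resulting underdose.
LB_PER_KG = 2.2

reported_weight = 18              # the number given over the phone, assumed to be pounds
actual_weight_kg = 18             # the child's true weight in kilograms (about 40 lb)

assumed_weight_kg = reported_weight / LB_PER_KG    # ~8.2 kg
dose_mg_per_kg_per_day = 50                        # hypothetical weight-based regimen

prescribed_daily_dose = dose_mg_per_kg_per_day * assumed_weight_kg   # ~409 mg
intended_daily_dose = dose_mg_per_kg_per_day * actual_weight_kg      # 900 mg

print(f"Dose based on the misread weight: {prescribed_daily_dose:.0f} mg/day")
print(f"Dose the child actually needed:   {intended_daily_dose:.0f} mg/day")
print(f"The child received about {prescribed_daily_dose / intended_daily_dose:.0%} "
      f"of the intended dose.")
```
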
The message to patients was the importance of knowing and recording weight in kilograms. They were encouraged to state the units (kilograms) when talking to health care providers about their weight. This newsletter is available from the SafeMedicationUse.ca website.

IV Compounding Error

A patient was discharged from the hospital after surgical excision of a cancerous tumor and was further treated, in a collaborative arrangement, by a conventional medical team and a naturopathic doctor at a complementary care center. The naturopathic doctor prescribed a complex tissue- and wound-healing formulation, which included selenium, for twice-weekly intravenous administration. The selenium solution was prepared by a compounding pharmacy and was added to the formulation on-site at the care center.
The patient had received this healing formula on 12 previous occasions, with no reported reactions. However, shortly after initiation of the 13th dose infusion, she became nauseous and diaphoretic. The infusion was stopped, and homeopathic remedies were administered, with no clinical improvement. Over the next several hours, the patient’s condition continued to deteriorate. When the patient began to experience hypotension, shortness of breath to the point of cyanosis, and chest pain, she was transferred to the emergency department of a local hospital, where she later died. Postmortem investigations showed that the selenium concentration in the infusion was 1000 times greater than intended due to a calculation and weighing error, which likely contributed to the patient’s death. This bulletin is available from the ISMP Canada website.

Medication error statistics

In the United States, medical error ranks as the third leading cause of death, following heart disease and cancer.

The Canadian Adverse Events Study

  • 7.5% of patients admitted to Canadian hospitals experienced an adverse event, which accounted for approximately 187,500 patients. Extrapolation suggests that as many as 9,250 to 23,750 people died in a Canadian hospital as a result of medical errors.
  • The study reviewed 3,745 patient charts from participating hospitals across 5 provinces. In total, 365 adverse events were identified in 255 patients (a rough back-calculation of these figures is sketched after this list).
    • 24% of these adverse events were associated with medication or fluid administration.
    • 37% of the adverse events were determined to be preventable.
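
The national figures above can be reproduced from the study’s rate; a small sketch (the admission denominator is implied by the numbers quoted above rather than stated in the text):

```python
# Back-of-the-envelope reconstruction of the extrapolated figures (values from the text).
adverse_event_rate = 0.075        # 7.5% of admissions experienced an adverse event
patients_with_events = 187_500    # extrapolated national estimate

# Implied number of annual hospital admissions behind the extrapolation
implied_admissions = patients_with_events / adverse_event_rate
print(f"Implied annual admissions: {implied_admissions:,.0f}")    # ~2,500,000

# Chart-review figures from the study itself
adverse_events = 365
preventable_fraction = 0.37
print(f"Preventable adverse events in the chart review: {adverse_events * preventable_fraction:.0f}")
```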

A decade after the Baker and Norton study, a follow-up review suggested that system-wide safety strategies were still lacking and that the impact of interventions on workload had not been assessed.

Systems Theory

Foundational Principles

  • Errors occur at all levels of health care
  • Anyone can be involved in an error
  • Errors tend to fall in recurrent patterns regardless of the personnel involved
  • Errors can be prevented by building a system that is resilient to expected human errors

Reality of Health Care Environments

These include:

  • Increased complexity of patients, potential treatment options, and technologies
  • Frequent interruptions that take their focus away from the task at hand
  • Cognitive overload: trying to remember and consider all competing pieces of information needed to deliver safe care
  • Increased workload with reduced resources
  • Competing priorities, particularly in emergency situations or in high-risk patient care areas, such as intensive care units or emergency departments. These are also situations and environments where many high-alert medications are used.
  • The need to multi-task in order to deliver care

High-reliability organizations

High-reliability organizations have learned to address potential errors before the errors become disastrous.

There are five key characteristics of high-reliability organizations:
  • Preoccupation with failure: It is not enough to have avoided errors for a long time. Attention is given to even the smallest problems that could compromise safety.
  • Reluctance to simplify interpretations and observations: Recognizing subtle changes, and communicating how they could lead to safety issues, can prevent small issues from becoming larger problems. Safety can be complex, so observations need to be precise to get to the core of a problem.
  • Sensitivity to operations: Changes to the operation of an organization tend to be the originating cause of safety incidents. Thus, in HROs, those closely involved with operations are expected to report any changes that cause deviations in expected performance. Speaking up is considered an obligation.
  • Commitment to resilience: Resilience is key to identifying and mitigating errors and containing them before they evolve into larger concerns.
  • Deference to expertise: When a problem arises, authority is given to the individual with the most knowledge and expertise in that area rather than according to the organizational hierarchy; the person best able to manage the problem takes the lead.

Adapting lessons from these industries to health care would improve quality and safety.

Which one of these is not a characteristic of high-reliability organizations?

  • Preoccupation with failure
  • Reluctance to simplify interpretations
  • Sensitivity to operations
  • Compliance with hierarchy
  • Commitment to resilience
  • Deference to expertise

Answer: Compliance with hierarchy is not a characteristic of high reliability organizations. In fact, when there is an adverse event, control is given to the best-suited personnel. This is usually the person with the most knowledge and experience with that type of event.

The systems approach

The systems approach counters the traditional medical model, which stresses individual performance as the primary determinant of outcomes.
Instead, incidents are viewed as the result of predictable human failings in the context of poorly designed systems. The approach focuses on improving the processes, systems, and environment in which people work, rather than attempting only to improve individual skills and performance.

The Swiss Cheese model is often used in systems theory to show how incidents can occur. It essentially shows us that for an error to get through to a patient, there were multiple failures. Each slice is a step or a process or a technology or a person, and the holes in each slice show us the vulnerabilities for each.
For an error to get through to a patient, it had to have capitalized on the vulnerabilities within each one. And the weaker the system, the more holes, and the easier it is for an error to cause patient harm.
An example of a weak point in the design of the system is shift changes between nurses in a hospital. Many errors can occur due to lack of communication between nurses during the change in shift.
The systems-based perspective focuses on improving the processes, systems, and environment in which people work, rather than attempting only to improve individual skills and performance. In the case of shift changes, incorporation of standardized debriefs and handoffs between shifts can help prevent errors.
But how can we differentiate system errors from personnel errors?

Latent and active failures

Latent and active failures are a way of differentiating between these two types of errors.
Latent failures are errors of the organization or design. Consider them to be factors that may have contributed to an error. Flaws in the design make it easy for anyone to cause an error in those conditions. In other words, “errors are waiting to happen”.
An active failure is an error that occurs when frontline personnel interact with the organization. These are errors that occur at the point of human interaction.
You can also think of latent failures as blunt ends and active failures as sharp ends.
Personnel at the sharp end may literally be holding a scalpel when the error is committed (e.g., the surgeon performing the incorrect procedure) or may be administering any kind of treatment.
The blunt end refers to the many layers of the health care system not in direct contact with patients, but which influence the personnel and equipment at the sharp end that come into contact with patients. The blunt end thus consists of those who set policy, manage health care institutions, or design medical devices, and other people and forces, which—though removed in time and space from direct patient care—nonetheless affect how care is delivered.

True or False? The Swiss Cheese Model illustrates that incidents are due to failures at many levels.
This statement is True. The Swiss Cheese Model shows that the alignment of many “holes” in a system can lead to an error.

True or False? A health care provider incorrectly programming an infusion pump is an example of a latent failure.
This statement is false. Latent failures are not personnel-based. They are failures of the organization or design. Although this is an example of an active failure, additional latent failures might have made this error more likely to happen – if the hospital’s training program did not cover how to program the infusion pump, or if the pump was designed in a way that made it difficult to understand and use, those would be examples of organization or design failures.

Human factors engineering

Human factors engineering focuses on:

  • The design of systems, tools, processes, and machines that take into account human capabilities, limitations, and characteristics.
    Basically, it means that there’s always a human behind the design of a process or a tool, and there’s a human as an end-user using the process or tool, and as humans, we have our own inherent limitations that need to be considered in design.
  • Making the work environment function in a way that seems natural to people – i.e., tailoring the environment to the worker, instead of the other way around. This is certainly applicable in health care.

Human Factors

The human factors discussed in this module include limits of working memory, inattentional blindness, confirmation bias, workarounds, and automation bias and complacency.

Our working memory

Our working memory is typically able to hold about 7 pieces of information. However, health care workers are required to hold many pieces of information in their minds. Often, when one thing is forgotten, it feels like “how did I forget that ONE thing?” In reality, it was 1 out of 100 things. Thus, creating opportunities to reduce reliance on working memory can help reduce errors. For example, consider a pharmacist checking hundreds of prescriptions per day, each time considering whether the medication will be safe and effective for the patient and whether it interacts with any of the other medications the patient is taking. While checking a new prescription for a patient, the pharmacist overlooks a drug interaction with one of the patient’s chronic medications.
A potential environmental solution would be to have

  • A list of commonly seen drug-drug interactions available to the pharmacist.
  • A computer software system to alert the pharmacist to a potential drug-drug interaction (a minimal sketch of such an alert follows below).
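
A minimal sketch of the second solution, assuming a simple lookup table of interacting pairs; the drug pairs listed are illustrative examples only, not a clinical reference:

```python
# Minimal sketch of a drug-drug interaction alert at the point of dispensing.
# The interaction table is illustrative only and is not a clinical reference.
KNOWN_INTERACTIONS = {
    frozenset({"warfarin", "trimethoprim-sulfamethoxazole"}),
    frozenset({"digoxin", "clarithromycin"}),
    frozenset({"simvastatin", "clarithromycin"}),
}

def interaction_alerts(new_drug, current_medications):
    """Return any current medications that interact with the newly prescribed drug."""
    return [med for med in current_medications
            if frozenset({new_drug.lower(), med.lower()}) in KNOWN_INTERACTIONS]

# The pharmacist no longer has to hold every interaction in working memory:
alerts = interaction_alerts("clarithromycin", ["digoxin", "metformin"])
if alerts:
    print(f"ALERT: potential interaction with {', '.join(alerts)} - review before dispensing")
```
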
Inattentional blindness

Inattentional blindness refers to failing to see what should have been plainly visible because attention was focused elsewhere.
Interruptions, for example, can make it more difficult to focus on the details within one task.
Example: Consider a pharmacist compounding an oral suspension who is then asked by a pharmacy technician for clarification regarding the medications for a patient with kidney disease. Because the pharmacist’s attention is diverted to the kidney-related medication question, he reaches onto the shelf and inadvertently pulls out the wrong medication for the suspension. After mixing, however, no one can visually confirm which medication is in the suspension because both products are white powders.
This type of error, and many more in health care and other industries, have happened under similar circumstances; the person performing the task fails to see what should have been plainly visible, and later, they cannot explain the lapse. In many cases, people involved in the errors have been labelled as careless and negligent. However, these types of accidents are common and can be made by intelligent, vigilant, and attentive people.
An environmental solution would be to:

  • Implement a new work environment design where the compounding area is segregated from the regular pharmacy to prevent interruptions and allow for complete focus on the task at hand, as compounding is a complex process.
  • Incorporate a bar-code scanner for automated identification to verify the ingredients used (sketched below). This way, even if the wrong product is pulled off the shelf due to inattention and multi-tasking, the scanner will alert the user to the error before it reaches the patient.
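
A sketch of the bar-code verification idea; the product code and ingredient name below are made up for illustration:

```python
# Sketch of bar-code verification during compounding.
# The product code and ingredient name are fictitious, for illustration only.
MASTER_FORMULA = {
    "00012345": "illustrative ingredient powder",   # what the master formula actually calls for
}

def verify_scan(scanned_code):
    """Alert the user if the scanned container is not an expected ingredient."""
    if scanned_code not in MASTER_FORMULA:
        print(f"STOP: scanned product {scanned_code} is not in the master formula.")
        return False
    print(f"OK: {MASTER_FORMULA[scanned_code]} verified.")
    return True

# Even if the wrong white powder is pulled from the shelf, the scan catches it:
verify_scan("00099999")
```
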
Confirmation bias

Confirmation bias is the tendency to see what we expect to see. It is also the reason that, when we make a substitution error, we often do not catch it ourselves.
In terms of deciphering prescriptions, confirmation bias can lead to errors when certain details, such as the medication name, dose, or directions for use, appear to be similar to something else.
ISMP Canada’s “Do Not Use” list of dangerous abbreviations, symbols, and dose designations identifies abbreviations that have frequently been misinterpreted and involved in harmful medication errors, and that should never be used when communicating medication information. Two examples follow.
In the first example, the letter “u”, written as an abbreviation for the word “units”, has often been misinterpreted as a “0”, leading to a 10-fold dose error. Here, the intended “6u” was misinterpreted as “60”, and the patient received 60 units of regular, short-acting insulin.
You may think that only handwritten prescriptions are misinterpreted, but the second example shows a typed infusion rate. The infusion was administered at 25 mL/h instead of 5 mL/h as intended. Whether hand-written or computer-generated, the “@” symbol can be misread as the number “2” or “5”, leading to a substantial overdose of medication.
When deciphering and inputting prescriptions into the dispensing software, the fact that some medications have very similar names can be problematic. For example, there have been mix-ups between amlodipine and amiodarone due to the orthographic similarity of the names and also because both are cardiac-related medications likely to be used in similar circumstances. Our confirmation bias makes it easy to assume one medication is the other because of those overlapping cues.
TALLman lettering is a method of applying uppercase lettering to sections of look-alike, sound-alike (LASA) drug names to bring attention to their points of dissimilarity. ISMP Canada and ISMP in the U.S. each maintain lists of confusable drug name pairs with TALLman lettering to help distinguish between them. Although the lengths of the lists differ, the TALLman lettering for medication names is kept consistent between the two lists.
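
A sketch of how dispensing software might apply TALLman lettering when displaying drug names; the name pairs and capitalization below are illustrative only, and the authoritative renderings are the ones in the ISMP Canada and ISMP (U.S.) lists:

```python
# Sketch: apply TALLman lettering when displaying look-alike/sound-alike drug names.
# Capitalization here is illustrative; consult the ISMP lists for the official renderings.
TALLMAN = {
    "hydralazine": "hydrALAZINE",
    "hydroxyzine": "hydrOXYzine",
    "prednisone": "predniSONE",
    "prednisolone": "prednisoLONE",
}

def display_name(drug):
    """Return the TALLman rendering if the drug is on the look-alike list."""
    return TALLMAN.get(drug.lower(), drug)

print(display_name("hydroxyzine"))   # hydrOXYzine
print(display_name("metformin"))     # metformin (unchanged)
```
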
Confirmation bias is particularly problematic when it comes to labeling and packaging. Even if you’ve never worked in a pharmacy, you’re probably well aware of how similar certain medication packages look.
Look-alike labeling and packaging is an important contributing factor to med errors. Confirmation bias can kick in if the cues that tell us the products are different are small or easily overlooked, and all the more obvious similarities convince us that we’ve selected the right product.
In the image shown, the injectable morphine in 2 mg/mL and 10 mg/mL concentrations look very similar. If our mind takes in enough of the shared cues – the overall design of the box, the yellow bar around the concentration, the gray-ish/green-ish bar underneath (which can look similar under the shadow of a shelf) – we could inadvertently pick the wrong one if we don’t “see” the one distinguishing factor: the 2 versus the 10.
An environmental solution would be to:

  • Implement a new storage design where products with look-alike labeling and packaging are stored separately and not side-by-side, as this minimizes the risk that the user will reach for one product and inadvertently retrieve the other.
  • Minimize the similarities between the labels and packages to make them more distinctive and less likely to cause a selection error. For example, the colours can be made more distinctive – yellow and dark green – so that the front of the box isn’t all white for both products.
Workarounds

Workarounds are a natural tendency to take shortcuts to make the completion of tasks easier or to increase efficiency.
In health care, high workload can often push health care workers to “cut corners” to be efficient. But this can be risky behaviour if established processes and protocols that are intended to be safeguards against errors are bypassed or circumvented.
An example from an ISMP Canada Safety Bulletin on student-associated errors can illustrate this point. A pharmacy student was asked to refill the metformin bin in an automated dispensing machine, a process which involved selecting and scanning the bottle label prior to pouring the tablets into the machine. The student picked up 4 bottles of medication but scanned the label of only 1 bottle 4 times instead of scanning each individually, with the aim of improving efficiency in the busy pharmacy. The scanned bottle contained metformin but 1 of the other bottles selected contained Tylenol #3 tablets; both products are round, white tablets. As a result, 2 different medications were added to the same compartment of the automated dispensing machine.
The bar code scanner is intended to be a safeguard against selection error, so the student’s workaround to not scan each of the 4 bottles prevented the scanner from being able to alert the user that 1 of the 4 bottles was incorrect.
A possible environmental solution would be to ensure

  • Adequate staffing for the workload to decrease the perceived need for workarounds that circumvent safety processes.
  • Appropriate training for all staff on the policies and procedures regarding use of the bar-code system, including the scanning of every item being used (one way software can reinforce this is sketched below).
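
A sketch of how the dispensing software itself could make this workaround harder, assuming each stock bottle carries a scannable serial or lot identifier (an assumption made for illustration):

```python
# Sketch: require a distinct scan for every physical container added to a dispensing-cabinet bin.
# Assumes each bottle label carries a unique serial/lot barcode (illustrative assumption).

def refill_bin(expected_bottle_count, scans):
    """Reject the refill unless every bottle was scanned individually."""
    if len(set(scans)) < expected_bottle_count:
        print("STOP: each bottle must be scanned individually - duplicate or missing scans detected.")
        return False
    print("Refill accepted: all bottles individually verified.")
    return True

# Scanning one bottle four times no longer satisfies the check:
refill_bin(4, ["LOT123-A", "LOT123-A", "LOT123-A", "LOT123-A"])   # rejected
refill_bin(4, ["LOT123-A", "LOT123-B", "LOT123-C", "LOT123-D"])   # accepted
```
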
Automation Bias and Complacency

An ISMP Canada Safety Bulletin regarding over-reliance on technology defines automation bias and automation complacency.

  • Automation bias is the tendency to favour or give greater credence to information obtained from an automated decision-making system and to ignore a non-automated source of information that provides contradictory information.
  • Automation complacency is an overlapping term that refers to the monitoring of an automated process less frequently or with less vigilance than optimal based on a strong belief in the accuracy of the technology.

An example of automation bias from the bulletin was a nurse’s trust in the information within the automated dispensing system rather than the handwritten documentation. In the hospital, a patient’s medication orders included phenytoin, and the brand name, Dilantin, was written on the prescription order. A pharmacy staff member entered the Dilantin order into the pharmacy computer system so that the nurse could get the medication from an automated dispensing cabinet (or ADC). To enter the order, the first 3 letters of the medication name were typed in, and then the pharmacy staff member was interrupted, so after typing D-I-L-, diltiazem was selected instead of Dilantin.
On the unit, the prescription order for Dilantin was correctly transcribed by hand onto the medication administration record (or MAR). The MAR entry was verified against the prescriber’s order sheet and was cosigned by a nurse. Another nurse, who went to obtain the medications from the ADC, noticed the discrepancy between the MAR and the ADC display (i.e., the MAR said “Dilantin” and the ADC said “diltiazem”) but accepted the information displayed in the ADC as correct. The next morning, the patient exhibited significant hypotension and bradycardia, which was attributed to the administration of the unordered diltiazem. An environmental solution for this example would involve a workflow change that requires

  • Standardized manual checks. This process would address identified medication discrepancies, including verification of the original prescriber’s order before medication administration. Part of the verification process should also include assessing the appropriateness of the medication based on the patient’s medical history and treatment plan. This manual verification counteracts automation complacency that can occur with technological outputs from the medication use process.
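
The selection error in this example began with a three-letter search (“D-I-L”), which matched more than one product. As a complementary, system-level illustration, the sketch below shows why short prefix searches are hazardous and how order-entry software can force a more specific search; the formulary list and the five-character minimum are illustrative assumptions, not a description of any particular system:

```python
# Sketch: why a 3-letter prefix search is hazardous for look-alike drug names.
# The formulary entries and the 5-character minimum are illustrative assumptions.
FORMULARY = ["Dilantin", "diltiazem", "Dilaudid"]

def search(prefix, minimum_chars=5):
    """Return matches only when the search term is specific enough; otherwise ask for more letters."""
    matches = [name for name in FORMULARY if name.lower().startswith(prefix.lower())]
    if len(prefix) < minimum_chars and len(matches) > 1:
        print(f"'{prefix}' matches {matches} - type more characters before selecting.")
        return []
    return matches

print(search("DIL"))      # ambiguous: the user is forced to refine the search
print(search("Dilan"))    # ['Dilantin']
```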

Case: A health care provider is talking to a patient who wants to be vaccinated against rabies before a vacation in Colombia. IMOVAX® Polio is inadvertently selected from the shelf instead of IMOVAX® Rabies. Fortunately, the product is double-checked prior to administration, and the error is recognized.
Which human factor was most likely illustrated in this example?

  • Working Memory
  • Inattentional Blindness
  • Automation Bias & Complacency
  • Workaround
  • Confirmation Bias

The correct answer was confirmation bias.
The two medications have similar names which can make them easy to mix up. Recall that confirmation bias leads us to see information that confirms our expectations and would convince us that we have selected the right product.
Can you think of a possible environmental solution to prevent this error in the future?
An environmental solution at the pharmacy level could involve storing these products with look-alike names separately on the shelf, avoiding side-by-side placement.
Another environmental solution at the pharmacy level could involve implementing a bar-code scanning system to alert users to selection errors.

Error prevention strategy

Overview

When you’re developing actions and solutions to prevent errors, encourage and focus on system-level changes, which, if implemented, will have lasting effects on safety, as opposed to person-based strategies, i.e., issuing verbal warnings or cautions such as “Pay more attention” or “You need to be more careful next time.” These statements focus on the person and do little to target the system-based vulnerabilities that contributed to the error.

When developing a strategy to prevent an error from occurring or recurring, it is helpful to consider the hierarchy of effectiveness, presented here from an ISMP Canada bulletin. There are 6 components to this hierarchy: solutions and strategies ranked by their effectiveness in preventing errors. As you can see, the higher-leverage strategies are system-based, whereas the lower-leverage strategies are person-based, meaning that they rely on individuals to effect change. The 3 components on the top right are more system-based and are usually less feasible because they require more money and a longer time to implement. The 3 components on the bottom left involve the individual and are usually less expensive and can be implemented in a shorter period of time.
Consider the example of a community pharmacy dispensing a medication and the patient experiencing a previously known allergic reaction. No one at the pharmacy asked about medication allergies; had they asked, the patient would have been able to tell them about the allergic reaction they had to a similar type of medication a couple of years before. We’ll use this example to develop different types of strategies along the hierarchy that the community pharmacy can consider implementing to prevent this type of error from recurring.

Person-Based
Education

Starting from the bottom left, you have education and information, which heavily relies on human memory, and we know that over-reliance on that is a human factor that can lead to errors. Using the example, the pharmacy manager might choose to hold an informal education session to teach staff about the importance of asking a patient about any medication allergies before dispensing medication. Of course, this depends on the individual remembering this education at the point of asking questions and creating a patient profile.

Rules and policies

For the case, the pharmacy manager may realize that the policy and procedure are not sufficiently detailed about what information needs to be asked and documented at the point where the patient brings in a prescription (or at least before it is picked up). Thus, the policy on creating a patient profile in the computer system is updated to include more detail specific to medication allergies and circulated to all staff members. This is a little more effective than standalone education because there is an accessible document that can be reviewed instead of relying only on memory.

Reminders

Next are reminders, checklists, and double-checks. In this example, a sticky note that says, “Remember to ask about ALLERGIES” on the side of the computer screen that all staff use to create patient profiles can serve as a reminder when talking to the patient. This is moderately more effective because any individual, whether they remember the education session or not, or whether they’ve accessed the new policy or not, will hopefully see the reminder at the time it’s needed (which is when asking a patient questions to create their profile in the system). Of course, this still requires the individual to notice and use that reminder, and we know humans are at risk of inattentional blindness.
We’ve covered the three types of person-based strategies that rely on the individual to remember education, access policies and procedures, and/or use reminders and checklists. However, remember that an effective error prevention strategy should focus less on the individual and more on the system – this incorporates systems theory and human factors engineering principles, so we’ll look at those examples now.

System-Based
Simplification and Standardization

Moving from person-based to system-based, we start with the moderately effective simplification and standardization. Using the example, the pharmacy manager might develop a standardized form that each patient must complete for their profile in the pharmacy computer system, which would include a question about medication allergies and the reaction that occurred. Standardizing this intake form reduces the onus on the individual staff member to remember to verbally ask about allergies, and provides a visual prompt to ask about it if that information is not completed on the form.

Automation and computerization

Next, we move into higher leverage strategies such as automation and computerization. In this case, the computer software can move through the patient profile sequentially so that the pharmacy staff member is automatically cued to each question that needs to be asked when creating the profile; most importantly in this case, the question about medication allergies.

Forcing functions

Lastly, we have forcing functions and constraints. In the example, the pharmacy software can be updated to include a forcing function on completion of the allergy field in a patient profile. Essentially, each time a pharmacy staff member creates a patient profile, the system will not allow the user to move forward and enter the prescription information until the allergy field has been completed, even if it is only to say “no known allergies” (a simple sketch of this kind of forcing function follows).
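
A minimal sketch of such a forcing function in dispensing software; the field names and error type are hypothetical:

```python
# Minimal sketch of a forcing function: the patient profile cannot be saved, and no
# prescription can be entered, until the allergy field is completed. Field names are hypothetical.

class IncompleteProfileError(Exception):
    pass

def save_patient_profile(profile):
    """Block progress until allergy status is documented (even if it is 'no known allergies')."""
    allergies = profile.get("allergies")
    if allergies is None or str(allergies).strip() == "":
        raise IncompleteProfileError("Allergy field must be completed before entering a prescription.")
    return profile

save_patient_profile({"name": "J. Doe", "allergies": "no known allergies"})   # accepted

try:
    save_patient_profile({"name": "J. Doe"})    # allergy field missing
except IncompleteProfileError as err:
    print(f"Blocked: {err}")
```
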
You can see how the upper three types of strategies were more system-based and relied more on standardized practice, computerization, or a forcing function that makes the user fill in the necessary information before moving forward. Although it is most effective to aim for the higher-leverage strategies, there is still a place for person-based strategies, which may support the system-based ones or serve as an interim solution until resources are available for higher-level solutions.

Case: The quality and patient safety team at the hospital noted that several physicians were not prescribing antibiotics for pneumonia per the recommendations from the guidelines. To resolve these prescribing differences, the hospital created a pre-printed order set to standardize the prescribing of antibiotics for pneumonia.

What type of error prevention strategy was implemented?

  • Forcing functions / Constraint
  • Automation / Computerization
  • Simplification / Standardization
  • Reminders, Checklists, Double-Checks
  • Rules & Policies
  • Education & Information

Answer: The use of pre-printed order sets is an example of standardization.

According to the hierarchy of effectiveness, how effective are standardized order sets?

  • Most effective
  • Moderately effective
  • Least effective

Answer: Standardized order sets are moderately effective; simplification and standardization sit in the middle of the hierarchy of effectiveness.

Now let’s try another case and question.
Case: A child was brought to the family doctor because of a skin infection. The doctor prescribed cephalexin. The family doctor did not ask about medication allergies, and the parent forgot to mention that the child was allergic to penicillin. The pharmacy also did not ask about medication allergies and dispensed the cephalexin.
Note: Cephalexin is an antibiotic similar to penicillin; therefore, a true allergy to penicillin would be a concern with cephalexin as well.
Rank the following strategies from most effective to least effective, with 1 being the most effective and 3 being the least effective.

  1. Add a note near the computer to always ask about medication allergies.
  2. Talk to the individual doctor and pharmacist about the importance of knowing a patient’s medication allergies.
  3. Design the computer software, in both the doctor’s office and pharmacy, to prevent the user from proceeding with a prescription until the allergy field is complete – even if it’s to document “No Known Allergies”.

Answer: The computer software design that requires completion of the allergy field is the most effective; it is an example of a forcing function and constraint, a system-based error prevention strategy. Adding a note near the computer is moderately effective because it serves as a reminder. Finally, informing the pharmacist and doctor of the importance of asking about allergies is the least effective because education is a person-based error prevention strategy.

Key Learning Points

  • Health care environments are complex, therefore the risk of error is relatively high
  • Systems theory focuses on improving the environment in which people work, not individuals
  • The Swiss Cheese Model demonstrates that errors occur due to the alignment of latent and active failures
  • Human factors engineering principles identify human limitations, and the Hierarchy of Effectiveness helps design strategies to overcome those human factors

We talked about why errors occur in healthcare environments, realizing that the personnel are working in a complex field and are frequently under pressure. We then moved on to discuss how high-reliability organizations are able to virtually avoid major incidents due to their preoccupation with potential failures. Learning from how these organizations address failures to prevent incidents would improve health care safety.
A systems-based approach to addressing errors looks at determining organizational or system failures, also known as latent failures, that can eventually lead to active failures at the point of interaction. The Swiss Cheese Model demonstrates this connection: errors occur when vulnerabilities in multiple safeguards are compromised at the same time.
Finally, human factors engineering is about understanding how certain human factors can affect individual performance and how we can use different types of environmental solutions, especially by using the hierarchy of effectiveness, to mitigate the risk of error. James Reason aptly said, “We cannot change the human condition, but we can change the conditions under which people work.”
Here are some additional resources that you can visit to learn more about patient and medication safety.

PDF Files

  • Course Introduction
  • Course Outline
  • Contact Information
  • Readings and Resources
  • VIC Teaching Modules Disclosure