FEATURE: User Interaction, Understanding is Vital to Medical Technology Success
When new technology in healthcare is introduced without adequate analysis of how staff will interact with such devices, adverse events and near-misses are common, according to experts.
Reporting of such events is sporadic, however, with few measures in place to help healthcare providers learn from others’ mistakes. Safety leaders agree that the technology itself is not always the problem; rather, it is the lack of testing and understanding by users before the technology is integrated into the care-delivery process.
“We have a cascade of gadgets and equipment that’s just raining down on the healthcare system,” Rosemary Gibson, senior advisor to the Hastings Center, told Modern Healthcare.
According to Gibson, productivity demands are forcing physicians, nurses and other clinical staff to work faster. When that directive is coupled with new devices and equipment, “even the most competent people in the world can’t do that safely,” Gibson said.
Recent studies have found that rapid implementation of electronic health records, patient-monitoring devices, and other new medical technology tools can lead to adverse patient events when those tools are not properly integrated into clinical workflows.
According to a Joint Commission report, human-factors issues were the most frequently identified root causes of adverse events (such as treatment delays and medication errors) from 2011 to 2013.
“It’s the interface of the human with the technology that creates a problem,” said Dr. Ana Pujols-McKee, the commission’s chief medical officer.
Some hospitals and health systems have responded to these growing concerns, and to alarming experiences of their own, by establishing human-factors teams. These teams investigate what could go wrong during deployment and scrutinize new devices from both human and technical perspectives.
Root Causes Overshadowed By Complex Systems
In June, researchers at the Veterans Health Administration Center for Innovations in Quality, Effectiveness and Safety reported that complicated and confusing electronic health records pose a serious threat to patient safety. The more complex a system, the more difficult it is to trace the root cause of a mistake.
“The problem is not always the tool,” said Dr. David Chang of the University of California, San Diego. “The people using it, that’s the part many people are not paying attention to.”
A Food and Drug Administration report on device recalls noted that for the most part “the problems have not been with the technology itself, but rather with clinical use of the technology.”
MedStar Health, a 10-hospital not-for-profit system, launched its National Center for Human Factors in Healthcare in 2010 to address safety issues associated with new technology deployment. The center works to uncover problems and determine what changes to the products or to the healthcare environment will produce safe and effective outcomes.
Over the past year, the team has evaluated dozens of devices, including health IT software, infusion pumps, patient beds and wound-treatment devices.
In a simulation that took place last week, staff at MedStar’s center demonstrated how an error could easily occur with a cardiac defibrillator.
When an attending physician noted that a patient’s heart was in distress, a nurse was ordered to deliver a synchronized, low-level shock using a defibrillator, a process that helps re-establish a normal heart rhythm.
When the first shock was ineffective, another jolt was issued. Between the two shocks, however, the machine defaulted back to non-synchronized mode, which could have made a real patient’s heart stop beating.
“We know that even well-trained doctors who know how to use it right will naturally make that error,” Terry Fairbanks, the center’s director, said following the simulation. “We can’t depend on doctors remembering. We need to design the device so that it signals to the doctor that it has changed modes.”
Fairbanks and his colleagues rely on frontline providers to discover problems and report them to the center for testing, but they have found that sometimes clinical staff are anxious about reporting problems because they blame themselves.
“If you don’t work on opening up the culture, they might keep it quiet,” said Fairbanks. “Then you don’t learn about where opportunities are to design out the mistake.”
Simulation Centers Allow for Progress
The Society for Simulation in Healthcare has identified 165 simulation centers in the United States. Many, however, focus on training clinical staff on new procedures and devices rather than working out human interaction problems with new technology.
Still, more hospitals are assembling multidisciplinary teams to evaluate significant technological changes such as EHR implementation. But some problems don’t get flagged until after an adverse incident puts patients at risk.
Mary Logan, president of the Association for the Advancement of Medical Instrumentation, says hospitals should standardize the way they purchase new technologies and involve key users before making the buying decision.
“This is where a lot of organizations make mistakes,” Logan said. “The team that does the technology assessment should not be driven by the one person who wants the shiny object.”