THE SKINNY ON PATIENT SAFETY

Posted in General


Case study of patient safety:

A woman was admitted to hospital for surgery on a mass involving her right kidney. The morning of her surgery, the resident came to see her and said that they were going to remove both kidneys and then get her ready for dialysis. She told him that only one kidney was to be removed. Remarking that she looked upset, he went off to check the chart and never came back. This was a near miss, and it is why I am so interested in patient safety and why I have been following the thought leaders in the field.

Sidney Dekker is a thought leader and a genius. He is Professor in the School of Humanities at Griffith University in Brisbane, Australia. He is also a commercial pilot. There is no one on the planet with a better understanding of the core issues of patient safety. Too bad not everyone is listening.

Professor Dekker’s “human factors” approach looks for sources of safety and risk everywhere in the healthcare system. His book “Patient Safety: A Human Factors Approach” examines the designs of medical devices, the teamwork and co-ordination between different providers, communication across hierarchical and gender barriers, the cognitive processes involved in diagnosis and treatment decisions, the constraints and goals of the organization, the financial and human resources provided, the technology that is available, and the politics and culture of the organization. He is exhaustive in his review of patient safety.

When comparing healthcare and aviation, the incidence of adverse events in aviation is about 10^-6 (one in a million) while in healthcare it is about 10^-4 (one in ten thousand), a hundredfold difference. A big difference if you are a patient.
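To make the gap concrete, here is a minimal sketch in Python (the two rates come from the comparison above; the volume figure is an illustrative assumption, not a real statistic):

```python
# Back-of-the-envelope comparison using the rates quoted above.
aviation_rate = 1e-6     # adverse events per flight (from the post)
healthcare_rate = 1e-4   # adverse events per encounter (from the post)

volume = 1_000_000       # illustrative annual volume, not a real statistic

print(f"Aviation:   {aviation_rate * volume:.0f} adverse event(s) per {volume:,}")
print(f"Healthcare: {healthcare_rate * volume:.0f} adverse events per {volume:,}")
print(f"Healthcare risk is {healthcare_rate / aviation_rate:.0f}x higher")
```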

These differences in adverse events can be explained by varied assumptions about competence, continuity, assurance and maintenance. Other aspects of competence relate to the ability to collaborate on a complex task in a team and to the use of standardized communication for problem solving: Alpha. Bravo. Charlie.

Aviation limits the amount of work pilots can do in order to avoid fatigue; sleep deprivation and the absence of fatigue management decrease the accuracy of judgement. Aviation has standard-format briefings for each new operational phase, division of labour for tasks and the extensive use of checklists. In aviation, skills, competence and safety levels demonstrated in one situation are not considered transferable to another: pilots are checked out at least twice per year in the simulator.

Healthcare has none of these safety measures. Healthcare’s motto used to be: see one, do one, teach one. Unfortunately, this motto is out of date.

Another source of adverse events is the complexity of the healthcare system. Errors seem to be systematically connected to features of people’s tools, tasks and operating environment. The doctor’s work space is filled with ambiguity, uncertainty and moral choices. This work space is not guided by the healthcare organization’s rules and regulations. It is guided by the local rationality of the work space.

According to Dekker, the system is not safe. The logical targets for intervention are the error-producing conditions present in the working environment. A complex system pursuing the conflicting goals of better, faster and cheaper outcomes is “drifting towards failure”. The system becomes more prone to adverse events as economic constraints and production pressures increase.

The healthcare working environment is characterized by complexities, uncertainties, pressures, unlimited variability, shortcomings and contradictions between multiple goals that workers have to reconcile, decode and pursue at the same time.

The irony is that bad outcomes can arise even when everyone is doing good work and is following the rules. In complex situations, adverse events can arise without really bad assessments or bad decisions.

Because of the complexity of the system and of the patient’s context, a common error is cognitive lock-up, or fixation: doctors hold on to existing ideas even in the face of contradictory data. Another error stems from the expectation, held by patients and by other doctors, that problems require immediate answers. Striving for causal simplicity, doctors deliver quick answers that over-simplify the case and may prove to be wrong.

From the mass of uncertain, incomplete and contradictory data, doctors develop a plausible explanation. Multiple threads of activity, task and cause can disrupt the doctor’s attention, and there is more certainty and comfort in continuing with prior conclusions. Escalation of a problem also increases the co-ordination demands across people, and trouble can arise because the processes for informing, updating and working with others are inefficient. The technology and culture have not evolved to make collaboration seamless and essential.

Organizational risk is a characteristic of a complex system. Adverse events are the result of structural interactive complexity and tight coupling within the system. The only way to reduce risk is to reduce complexity. System vulnerability arises from unintended and complex interactions between seemingly normal organizational, managerial and administrative features of the system. There is a steady progression of small steps toward greater risk. Mistakes are embedded in everyday organizational life and worsened by scarcity and competition, unprecedented and uncertain technology, barriers in information flows, standardization, and a lack of communication and collaboration.

So instead of allowing our healthcare system to drift into failure, here are some productive strategies for improving patient safety:

- promoting adverse event investigations without the threat of punishment
- systems for adverse event reporting
- rapid response teams
- narratives about near misses
- improved soft skills, such as communication and interaction, for producing safe and successful outcomes
- diversity in decision making, achieved by including persons with diverse skill sets and knowledge
- checklists with check-off provisions (see the sketch after this list)
- operational discretion at the level of front-line operations staff
- promoting accountability by empowering people to change their work conditions
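As a toy illustration of “checklists with check-off provisions”, here is a minimal sketch in Python (the checklist items are invented for illustration; real surgical checklists are clinically developed and validated):

```python
# Minimal sketch of a checklist with check-off provisions.
class Checklist:
    def __init__(self, name, items):
        self.name = name
        self.items = {item: False for item in items}  # every item starts unchecked

    def check_off(self, item):
        if item not in self.items:
            raise KeyError(f"{item!r} is not on the {self.name} checklist")
        self.items[item] = True

    def outstanding(self):
        return [item for item, done in self.items.items() if not done]

    def is_complete(self):
        return not self.outstanding()

# Invented items; a check like the second one would catch the near miss above.
pre_op = Checklist("pre-operative", [
    "patient identity confirmed",
    "planned procedure confirmed with the patient",
    "consent form matches the planned procedure",
])

pre_op.check_off("patient identity confirmed")
if not pre_op.is_complete():
    print("Do not proceed. Outstanding:", pre_op.outstanding())
```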

Success in a complex system does not result entirely from following best practices but from a diversity of responses and from effective adaptation that allows the system to cope with a changing and dynamic environment. Ironically, making changes to some components within a complex system may lead to an exponential growth in relationships and associated transaction costs, and therefore to more complexity with markedly increased system costs.
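The standard pairwise-links count n(n-1)/2 gives a conservative feel for that growth (strictly quadratic rather than exponential, and real coordination also happens in groups larger than pairs). The agent counts below are illustrative:

```python
# With n independent agents, the number of possible pairwise
# relationships is n*(n-1)/2, so each added component creates
# new links (and transaction costs) faster than linearly.
def pairwise_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (5, 10, 20, 40):  # illustrative agent counts
    print(f"{n:>3} agents -> {pairwise_links(n):>3} possible relationships")
```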

Advances in the understanding of unstructured work in complex systems are driving research and tool building in an emerging field called “adaptive case management”. Coincidentally, Rupert Case Management has pioneered the use of “adaptive case management” for over a decade, using advanced technology platforms to the benefit of our clients and their families.

Important changes to a complex system will involve improving the quality and efficiency of the relationships between the independent agents that work within it. All care providers must come out of their silos and learn to communicate and collaborate more effectively. That should be the focus of our evolving information technology platforms and of the curricula of our new health science educational institutions.

The following diagram depicts the complex healthcare system as a network of independent agents doing specialized knowledge work to help their patients achieve their treatment goals. Making the system less complex and safer should be a top priority for everyone, including health policy makers.