First posted on Educate the Young on 12/17/2012

We can’t change the human condition, but we can change the conditions under which humans work. – James Reason

David Mayer MD, Vice President of Quality and Safety for MedStar Health

I had the chance to attend a mini-course on the Science of Safety at IHI’s 24th Annual Forum last week, led by Don Berwick and others. I have heard him give this talk before, but it is a good message…plus Don could speak on hand soap and still totally engage his audience while making the talk educational. His focus was on what we can learn by applying human factors engineering principles to our healthcare work. Don referred to James Reason’s quote above and focused his presentation on five human factors engineering lessons healthcare must adopt within its culture:

  1. Avoid reliance on memory
  2. Simplify
  3. Standardize
  4. Use constraints and forcing functions
  5. Use protocols and checklists

Human factors engineering expertise is increasingly being invited into the safety and quality conversation. Like the perspective of patients and families, this set of eyes and knowledge has long been missing from discussions that lead to meaningful change in our care systems. Large integrated health systems, like MedStar Health, are taking the next step and making major investments in human factors engineering in their quest to make care safer for their patients. Terry Fairbanks and his team at the National Center for Human Factors Engineering in Healthcare represent this new model. The only large center of its kind in the United States, Terry’s team is available to help redesign our systems (and others) in the best interest of patient safety, as well as design safer and more efficient systems altogether.

What is human factors engineering? A simple explanation from Terry’s website describes it as:

…an interdisciplinary approach to evaluating and improving the safety, efficiency, and robustness of work systems, such as healthcare delivery. Human Factors scientists and engineers study the intersection of people, technology, policy, and work across multiple domains, using an interdisciplinary approach that draws from cognitive psychology, organizational psychology, human performance, industrial engineering, systems engineering, and economic theory.

As Don Berwick emphasized in his talk, a human factors approach puts science into the safety conversation, providing new ways to look at old problems. Albert Einstein warned that “we can’t solve problems by using the same kind of thinking we used when we created them,” and Terry’s team introduces new and different ways of thinking about the problems in our healthcare systems that continue to put patients at risk.

On March 11-12, 2013, Terry’s team will be the official host of the Human Factors and Ergonomics Society (HFES) conference in Baltimore. This will not only help HFES draw healthcare providers and administrators into its work, but also allow attendees to better understand how human factors, applied to healthcare, creates a safer environment for both patients and providers. It is a “must attend” conference for those looking to take quality and safety learning to the next level.

Patricia Salber MD, MBA (@docweighsin)
Patricia Salber, MD, MBA is the Founder and Editor-in-Chief of The Doctor Weighs In and the CEO of Health Tech Hatch, the sister site of TDWI that helps innovators tell their stories to the world. She is a physician executive who has worked in all aspects of healthcare: as a practicing emergency physician, a health plan executive, and a consultant to employers, CMS, and other organizations. A board-certified internist and emergency physician, she loves to write about just about anything that has to do with healthcare.


  1. Interesting that this is being called “human factors” because what it is trying to do is engineer out the “human factors” that contribute to errors in the first place.

    Always remember that “environment is stronger than willpower” and look to offload as much of the responsibility for a process as possible onto the environment, in the form of checklists and built-in steps you HAVE to complete to avoid mistakes. This eliminates relying on someone in the process to remember a particular step (memory is a “human factor”) based on their experience (we turn the “human factor” of experience into a standardized checklist for all to follow) in order to be successful.

    Here is an example from industry. A huge metal press that stamps out parts is often loaded by the press operator placing the unstamped parts in the press by hand. If there were only one button to activate the stamper … we would have a LOT of one-handed press operators out there.

    These machines are designed so the press only operates when you press two buttons at once. The buttons are placed so that you must use both hands: each is a long way from where the stamp piston comes down and a long way from the other. To complete the process you must load the unpressed sheet of metal, take your hands out of the machine, place each one on its own button, and press both at once. The button placement takes out the “human element” and makes it so you literally can’t cut your hands off as the operator (any more).

    I would hope that in medicine we are always trying to set up our processes to be as failsafe as this example.

    Dike Drummond MD


All comments are moderated. Please allow 1-2 days for your comment to appear.