BIOETHICS FORUM ESSAY
Friends in High Places: Doing Bioethics at 36,000 Feet
I recently spent several invigorating days at Lund University in Sweden as a guest lecturer at a “learning laboratory” for professionals who share an interest in the role of human factors in how complex systems make and respond to errors. When I give talks, the audience usually includes physicians, nurses, and other clinicians – but not pilots, air traffic controllers, and crash-site investigators. And because the usual context in which I give talks is the U.S. health care system, I’m used to questions about the legal and financial consequences of disclosure, apology, and compensation – but European and Canadian systems work differently, and professionals in these systems worry about different things.
My role at this gathering turned out to be less “continuing medical (or nursing) education” – for once, no one asked me to specify up to three learning objectives using approved verbs – and more like doing philosophy on my feet, or while imagining myself at 36,000 feet. Or as Sidney Dekker, who directs the human factors research program at Lund, told our group: it’s a challenge to introduce new ideas into complex systems like aviation or health care. Do you tell your managers to go read a book – or do you show them how to use ideas?
(Dekker’s own book, Just Culture: Balancing Safety and Accountability – which draws on research from The Hastings Center’s Promoting Patient Safety project – has one very famous reader: Captain Chesley Sullenberger, who left his library copy on US Airways Flight 1549 after landing his plane in the Hudson River back in January. When New York City Mayor Michael Bloomberg gave “Sully” the keys to the city, the mayor also presented him with a replacement copy of the book.)
I learned that aviation safety experts, like patient safety experts, have difficulty making the case for dedicating time to conceptual work inside systems whose leaders push them to keep safety purely “operational,” even when it’s clear that current operations encourage flawed reasoning about what actually promotes safety and what hinders it. One conceptual challenge that came up regularly, and that was shared across fields and professions, was the ethics of problem-solving in the absence of professional guidance, or when guidance doesn’t match real-world conditions.
What’s the ethical distinction between a work-around (a practice that dodges rather than solves the problem, may introduce new risk, and may be rewarded by peers) and innovation (a practice that may solve the problem, may involve risk-taking, may introduce new risk, and may be punished by superiors)?
A Dutch pilot described a case in which guidelines didn’t match the reality of a plane’s interior, requiring each crew to come up with a work-around to distribute the weight of passengers and baggage evenly. When this work-around failed, resulting in damage to an aircraft, the crew was written up for having “no situation awareness” – even though (to this outside observer) it looked like a faulty system relied on them to be rather too aware of their situation.
And I learned that familiar ethical relationships don’t always map neatly onto other fields. In aviation, the analogue to “physician-patient” is not “pilot-passenger” or even “pilot-cargo,” as a pilot remains in the professional role whether she’s transporting royalty or ferrying an empty plane. The analogue may be more like “pilot-mission” – but perhaps understanding this profession in its own terms is more important than trying to understand it by means of analogy.
When I asked about fidelity as a professional virtue, it was an air traffic controller who spoke up: she viewed her primary ethical duty as one of fidelity to each pilot crossing her airspace. Other ATCs said it could be psychologically overwhelming to think about each dot on a screen as a container full of people – and yet, said one, “it’s not a video game, either.”
Published on: June 8, 2009
Published in: Bioethics