Our brains are fickle lumps of flesh. Easily led astray.
As nurses, one of the things we need to be aware of is the many biases, cognitive errors and prejudices that can flavour and misdirect our decision-making process.
Such erroneous thinking has been referred to as cognitive dispositions to respond, or CDRs. It can lead to adverse outcomes, errors and a generally poor quality of nursing care.
The big problem is that when you are working from within a place of CDR, it is very difficult to remember to step back and examine your thinking process objectively.
This is why critical thinking and reflective practice are such an important part of our nursing kit.
In a paper titled The Importance of Cognitive Errors in Diagnosis and the Strategies to Minimize Them, Professor Pat Croskerry examines some of the more common CDRs in the medical field.
Decision-making theorists in medicine have clung to normative, often robotic, models of clinical decision making that have little practical application in the real world of decision making. What is needed, instead, is a systematic analysis of what …. [has been] called flesh and blood decision-making. This is the real decision making that occurs at the front line, when resources are in short supply, when time constraints apply, and when shortcuts are being sought. When we look more closely at exactly what cognitive activity is occurring when these clinical decisions are being made, we may be struck by how far it is removed from what normative theory describes….
…Medical decision makers and educators have to do three things: (1) appreciate the full impact of diagnostic errors in medicine and the contribution of cognitive errors in particular; (2) refute the inevitability of cognitive diagnostic errors; and (3) dismiss the pessimism that surrounds approaches for lessening cognitive bias.
Here are some of the biases Professor Croskerry has listed, which you may (or may not) recognise from your own experiences of the nursing process or from observations of doctors at work.
Yes, there are plenty of pitfalls… and I can clearly recall (yipes! …lots and lots of) instances where I have become entangled in one or more of them.
- Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are invoking the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g., ordering x-rays or other tests when guidelines indicate none are required.
- Anchoring: the tendency to perceptually lock onto salient features in the patient's initial presentation too early in the diagnostic process, and failing to adjust this initial impression in the light of later information. This CDR may be severely compounded by the confirmation bias.
- Ascertainment bias: occurs when a physician's thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.
- Availability: the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available), it may be underdiagnosed.
- Base-rate neglect: the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate, and distorting Bayesian reasoning. However, in some cases, clinicians may (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of "rule out worst-case scenario" to avoid missing a rare but significant diagnosis.
- Commission bias: results from the obligation toward beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency toward action rather than inaction. It is more likely in over-confident physicians. Commission bias is less common than omission bias.
- Confirmation bias: the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.
- Diagnosis momentum: once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite, and all other possibilities are excluded.
- Feedback sanction: a form of ignorance trap and time-delay trap CDR. Making a diagnostic error may carry no immediate consequences, as considerable time may elapse before the error is discovered, if ever, or poor system feedback processes prevent important information on decisions getting back to the decision maker. The particular CDR that failed the patient persists because of these temporal and systemic sanctions.
- Framing effect: how diagnosticians see things may be strongly influenced by the way in which the problem is framed, e.g., physicians' perceptions of risk to the patient may be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient might die or might live. In terms of diagnosis, physicians should be aware of how patients, nurses, and other physicians frame potential outcomes and contingencies of the clinical problem to them.
- Fundamental attribution error: the tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities, and other marginalized groups tend to suffer from this CDR. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.
- Gambler's fallacy: attributed to gamblers, this fallacy is the belief that if a coin is tossed ten times and is heads each time, the 11th toss has a greater chance of being tails (even though a fair coin has no memory). An example would be a physician who sees a series of patients with chest pain in clinic or the emergency department, diagnoses all of them with an acute coronary syndrome, and assumes the sequence will not continue. Thus, the pretest probability that a patient will have a particular diagnosis might be influenced by preceding but independent events.
- Gender bias: the tendency to believe that gender is a determining factor in the probability of diagnosis of a particular disease when no such pathophysiological basis exists. Generally, it results in an overdiagnosis of the favored gender and underdiagnosis of the neglected gender.
- Hindsight bias: knowing the outcome may profoundly influence the perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision makerâ€™s abilities.
- Omission bias: the tendency toward inaction and rooted in the principle of nonmaleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but it may prove disastrous. Omission biases typically outnumber commission biases.
- Outcome bias: the tendency to opt for diagnostic decisions that will lead to good outcomes, rather than those associated with bad outcomes, thereby avoiding chagrin associated with the latter. It is a form of value bias in that physicians may express a stronger likelihood in their decision-making for what they hope will happen rather than for what they really believe might happen. This may result in serious diagnoses being minimized.
- Overconfidence bias: a universal tendency to believe we know more than we do. Overconfidence reflects a tendency to act on incomplete information, intuitions, or hunches. Too much faith is placed in opinion instead of carefully gathered evidence. The bias may be augmented by both anchoring and availability, and catastrophic outcomes may result when there is a prevailing commission bias.
- Playing the odds: (also known as frequency gambling) is the tendency in equivocal or ambiguous presentations to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one. It may be compounded by the fact that the signs and symptoms of many common and benign diseases are mimicked by more serious and rare ones. The strategy may be unwitting or deliberate and is diametrically opposed to the rule out worst-case scenario strategy (see base-rate neglect).
- Premature closure: a powerful CDR accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision-making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim: "When the diagnosis is made, the thinking stops."
- Psych-out error: psychiatric patients appear to be particularly vulnerable to the CDRs described in this list and to other errors in their management, some of which may exacerbate their condition. They appear especially vulnerable to fundamental attribution error. In particular, comorbid medical conditions may be overlooked or minimized. A variant of psych-out error occurs when serious medical conditions (e.g., hypoxia, delirium, metabolic abnormalities, CNS infections, head injury) are misdiagnosed as psychiatric conditions.
- Search satisfying: reflects the universal tendency to call off a search once something is found. Comorbidities, second foreign bodies, other fractures, and coingestants in poisoning may all be missed. Also, if the search yields nothing, diagnosticians should satisfy themselves that they have been looking in the right place.
- Sutton's slip: takes its name from the apocryphal story of the Brooklyn bank robber Willie Sutton who, when asked by the judge why he robbed banks, is alleged to have replied: "Because that's where the money is!" The diagnostic strategy of going for the obvious is referred to as Sutton's law. The slip occurs when possibilities other than the obvious are not given sufficient consideration.
- Sunk costs: the more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of CDR more associated with investment and financial considerations. However, for the diagnostician, the investment is time and mental energy and, for some, ego may be a precious investment. Confirmation bias may be a manifestation of such an unwillingness to let go of a failing diagnosis.
- Triage cueing: the triage process occurs throughout the health care system, from the self-triage of patients to the selection of a specialist by the referring physician. In the emergency department, triage is a formal process that results in patients being sent in particular directions, which cues their subsequent management. Many CDRs are initiated at triage, leading to the maxim: "Geography is destiny."
- Unpacking principle: failure to elicit all relevant information (unpacking) in establishing a differential diagnosis may result in significant possibilities being missed. The more specific a description of an illness that is received, the more likely the event is judged to exist. If patients are allowed to limit their history-giving, or physicians otherwise limit their history-taking, unspecified possibilities may be discounted.
- Vertical line failure: routine, repetitive tasks often lead to thinking in silos – predictable, orthodox styles that emphasize economy, efficacy, and utility. Though often rewarded, the approach carries the inherent penalty of inflexibility. In contrast, lateral thinking styles create opportunities for diagnosing the unexpected, rare, or esoteric. An effective lateral thinking strategy is simply to pose the question: "What else might this be?"
- Visceral bias: the influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, both negative and positive feelings toward patients, may result in diagnoses being missed. Some attribution phenomena (fundamental attribution error) may have their origin in countertransference.
- Yin-Yang out: when patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the Yin-Yang. The Yin-Yang out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient, i.e., the physician is let out of further diagnostic effort. This may prove ultimately to be true, but to adopt the strategy at the outset is fraught with the chance of a variety of errors.
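The base-rate neglect entry in the list above is easiest to feel with actual numbers. Here is a minimal sketch of the Bayesian reasoning it refers to; the prevalence, sensitivity and specificity figures are invented purely for illustration (they are not from Croskerry's paper), but the arithmetic shows how a seemingly "95% accurate" positive test can still leave the true probability of disease below 10% when the disease is rare:

```python
# Hypothetical figures, chosen only to illustrate base-rate neglect:
sensitivity = 0.95   # P(positive test | disease)
specificity = 0.90   # P(negative test | no disease)
prevalence = 0.01    # P(disease) -- the base rate, a rare condition

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
posterior = sensitivity * prevalence / p_positive

print(round(posterior, 3))  # 0.088 -- under 9%, despite the "95% accurate" test
```

The false positives from the large healthy population swamp the true positives from the small diseased one; ignore the 1% base rate and the positive result feels far more conclusive than it really is.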
Source: The Importance of Cognitive Errors in Diagnosis and the Strategies to Minimize Them (pdf file);
Academic Medicine: August 2003 – Volume 78 – Issue 8 – p 775-780