(And Who Says They’re Wrong)
Heather Stone, Ph.D.
Clinical Psychologist, PSY 21112
Daniel M. Wegner uses the phrase “authorship confusion” to describe how people mistakenly assume they caused an event simply because the thought preceded the occurrence.
Theodore Sarbin offers the phrase “believed-in imaginings” to suggest that people affirm the existence of improbable things. A hypothetical perspective, such as a “what if” thought, morphs into an “it is” thought – creating the feeling that an imagined scenario actually exists. The person considers something as if it were true, and pretty soon it appears to be true. When this happens, the belief and the phenomenon cannot easily be differentiated.
Causal Mistakes and Reasoning Errors.
Jean Piaget states that from a very young age, people develop mistaken beliefs about causal relationships between the mind and the physical world. Examples might include certain rituals such as counting, checking or repeating – compulsions that are intended to bring about something good or prevent something bad. The mind errs by taking unrelated events and connecting them, creating the feeling that there is a meaningful relationship between the two. Suddenly, meaningless things take on unique significance.
Steven C. Hayes explains that we believe our thoughts to be literally true when perhaps they are not, and this happens because ideas arise convincingly inside our heads in the form of language. In this way, thoughts become convincing and we become “fused” with our thinking. The assertion, “I’m right and I can give you the reasons” is a strong indicator of cognitive fusion. Matthew McKay and Patrick Fanning agree: “There is no one so sure as someone totally deluded.”
Cognitive Motivation to Reduce Uncertainty.
Leonard Zusne and Warren Jones observe that we all have a cognitive motivation to secure explanations, however faulty. This drive to remove uncertainty is so powerful that the mind would rather fill the gap with incorrect information (even with catastrophic explanations) than tolerate the unknown. Jeffrey S. Victor explains that even disturbing beliefs receive credibility: “A mistaken explanation for emotional pain can be preferable for a confused person to the ambiguity of uncertainty.”
Congruence is Preferred Over Truth.
Zusne and Jones also describe how people want to believe something simply because it matches how they feel. Illness, fatigue, chronic pain, menopause, and PMS are good examples of this, where events or interactions become exaggerated or misinterpreted. The clearest example is a panic attack, in which people interpret the “spike” in their nervous system as evidence that they are dying, going crazy, or losing control. But just because a thought feels congruent with our physical or emotional state, that doesn’t make it true.
Magical Thinking.
Sigmund Freud, Margaret Mahler, and Ernest Becker have discussed magical thinking as a “primitive” (early) defense mechanism that was originally designed to protect us from feeling helpless. Magical thinking is a universal condition that continues throughout everyone’s lifetime, often emerging in the face of “existential anxieties” surrounding separation, death or uncertainty. But in other moments where we feel a loss of control (like when chance, hope, luck, fear, or danger are present), magical thinking will show up as well. Magical thinking is the belief that thoughts and reality are connected and that thinking can influence the actual world. Omnipotence (the belief that we caused something by thinking about it) is one form of magical thinking, and so are superstitions. In fact, most of the distortions in this handout are, to some extent, a form of magical thinking. From a behavioral perspective, magical thinking exists largely to control the uncontrollable.
Evolution Favors Anxious Genes.
Human beings have evolved to become very anxious, but this trait helped our ancestors more than it helps us. Aaron T. Beck explains that in earlier times when our physical survival was at stake, we could not afford to miss any danger signals. “It is better to have ‘false positives’ (false alarms) than ‘false negatives’ (which miss the danger) in an ambiguous situation. One false negative – and you are eliminated from the gene pool.” This is why it is said that “evolution favors anxious genes”: Our hypervigilant ancestors passed their genes on to us, and now we suffer from “negativity bias” – a propensity to focus on negative events or even perceived threats to the exclusion of neutral or positive things.
Overvalued Ideation.
This phrase means that the person “importantizes” or “overvalues” certain ideas, making random, fleeting thoughts more meaningful or threatening than they actually are. For example, having a disease or illness can seem plausible simply because it was on TV, mentioned in a conversation, or encountered in some other innocuous context. The International Obsessive-Compulsive Foundation describes this phenomenon as “when the person with OCD has great difficulty understanding that his/her worry is senseless.” Jonathan Grayson similarly says it is “the belief that the concerns underlying the symptoms are entirely realistic.”
Thought-Action Fusion.
Stanley Rachman describes a tendency to confuse thinking about an action with the action itself. For example, we all have senseless, random thoughts, such as, “What if I just drove into oncoming traffic right now?” or “What if I stood up and shouted an obscenity in the middle of church?” As often seen with OCD, these transient thoughts make some people concerned that they might actually do those things. However, actions require execution and volition; thinking the thought isn’t the same as actually doing it.
©2013 Heather Stone, Ph.D.