Book Review – Leadership and Self-deception: Getting Out of the Box

I’ll preface my review by noting that this book is not directly about psychology; it has nothing to do with neuroscience. However, it has everything to do with interpersonal relationships and social interactions, two areas frequently addressed in psychology. I have no affiliation with The Arbinger Institute; I just enjoyed the book.

The book Leadership and Self-Deception: Getting Out of the Box was written by The Arbinger Institute, and the work was derived from the ideas of C. Terry Warner, a U.S. philosopher. The Arbinger Institute is a management training and consulting firm that works with businesses and individuals to help them improve their organizations and lives. The 168-page book is easy to read; it is written in simple prose, like a novel.

The main character in the book is Tom, a recently hired mid- to upper-level manager at the fictional company Zagrum. Throughout the book Tom mainly interacts with two other characters – Bud, his boss and the executive vice president of the company, and Kate, Zagrum’s president. Both Bud and Kate take time out of their busy schedules to teach Tom about “the box”, which is self-deception.

The gist of the book is that much conflict between people is based on self-betrayal and self-deception. It comes from viewing other people as objects, as “things” that either help or hinder our own progress. The self-deception is that we are more important than other people and that they exist only to help us (or at least not stop us) self-actualize (I’m using different terms than the book does; the author(s), by the way, are not particularly fond of the humanistic concept of self-actualization). We deceive ourselves further when we think that if we want improved relationships with others – especially strained ones – then it is others who need to change, not ourselves.

Self-betrayal occurs, according to the author(s), when we are not true to the part of ourselves that is other-centered; the result is self-centeredness. In the book the author(s) give an example of how self-betrayal occurs, which I’ll summarize.

At night a husband and wife are sleeping. The husband wakes up when the baby in the other room starts to cry. His first thought is to get up and get the child before his wife wakes; after all, she works so hard all day and needs all the sleep she can get. His next thought, though, is that he too works hard all day and needs to get up early for a meeting. “Why should I get up? My responsibility in this family is to go to work and earn money so we can live. I need all the sleep I can get so I can function at my job – I have a big project to complete tomorrow. [Baby continues to cry.] Why doesn’t my wife get up and get that baby? Doesn’t she realize I need to get sleep? Okay, I know she’s awake now. Why doesn’t she get up? Now she’s just being lazy.” [And so on.]

These types of thoughts often become self-fulfilling prophecies: our own actions and thoughts inflate our self-worth (i.e., we see ourselves as good, hard-working people) while simultaneously deflating the worth of another (i.e., we ascribe traits to them – “lazy” or “inconsiderate” – and much of what we see them do afterward only supports that hypothesis). While this specific example has not happened to everyone, we have all experienced similar situations. Maybe at work you had the thought that you should do something but then didn’t do it. When that created a problem, you rationalized your behavior and blamed someone else (“I would have done X had Susan done her job” or “I was just too busy with other things to get X done”). Basically, self-betrayal results from not being true to what you [hopefully] know is the right thing to do. When we don’t do what we know is right, the normal human response is to rationalize and justify our action or inaction in order to protect our egos, so to speak. This leads us to shift the blame from ourselves onto others. We start to view others as hindering our progress; when this occurs they stop being people and start being objects (in other words, people are viewed as either starting blocks or stumbling blocks – they help us or hinder us).

It is relatively straightforward to see how this can lead to interpersonal problems – at home or at work. The problem is that we do not know that we are betraying and deceiving ourselves, so we continue to ascribe most of our problems to others. The author(s) further point out that even if we recognize our self-betrayal and self-deception, we will never be completely free of these behaviors; however, we can reduce them and improve our relationships with others.

Overall, this book provides an important and novel way to approach interpersonal behavior. The overarching message is that we should not worry about changing others (or even ourselves! – but I’ll let you read the book to understand that); we should instead recognize that the problem lies within ourselves and go from there. One very creative application of this philosophy is in businesses, where it is being used to increase productivity, improve human and public relations, and even raise the profitability of the company. I’ll let my readers read the book to understand how this philosophical approach to interacting with others can help a business make more money.

One of my criticisms of this approach to interpersonal behavior is that it is fairly esoteric and difficult to grasp conceptually. That’s not necessarily a negative; however, it means that most people will really have to study and ponder the concepts in order to understand them. The book also serves only as a brief – but important – introduction to the topic, leaving one a bit unsure exactly how to implement this new attitude and these new behaviors in one’s own life (although there is enough information in the book that an astute reader can understand enough to follow this method of interpersonal interaction). This is where the Arbinger Institute’s training workshops and seminars come in. Additionally, C. Terry Warner wrote a book called Bonds That Make Us Free: Healing Our Relationships, Coming to Ourselves, which is a more complete description of the concepts found in Leadership and Self-Deception.

I recommend Leadership and Self-Deception: Getting Out of the Box to anyone seeking to develop insight about themselves and their interpersonal interactions.

The Death of Psychotherapy

I’ll preface this post by stating that it was written to help me think through the relationship between neuroscience and therapy. As such, it is a philosophical journey through some of my thoughts and is not necessarily what I really believe, because I’m still working on discovering what I believe. Thought processes like this are one way I try to keep my beliefs about psychology and neuroscience balanced. If I start leaning too strongly one way, I’ll start looking for things that disconfirm those beliefs and see what I discover. It’s a bit of playing Devil’s Advocate with myself and a bit of philosophizing. Some of my friends and I used to do things like this in junior high and high school – holding philosophical discussions in which we even tried to argue for positions we didn’t necessarily believe (e.g., the classic topic of supposing that this world and universe aren’t real but are just reflections of reality; again, not something I believe, but we would speculate). What does this all have to do with psychology and neuroscience?

The brain is what drew me to psychology initially. However, I vowed I would never go into clinical psychology because I didn’t think I would like therapy or dealing with people’s problems. Over time I discovered neuropsychology. Most neuropsychologists are clinical psychologists, so in order for me to be a neuropsychologist, I had to be trained as a clinical psychologist. There are many things I enjoy about clinical psychology, but therapy is not one of them. Granted, most neuropsychologists do not actually do therapy, but we have to be trained in it. I enjoy talking with people in sessions, but I haven’t been that impressed with therapy as a whole so far. Maybe that’s just because I haven’t yet found the particular therapeutic method that really “clicks” with me. Cognitive-behavioral therapy is fine, but so much of actual therapy in practice is just plain common sense. However, not everyone has a lot of common sense, so they need some training in it. Part of me recognizes the validity of therapy, but another part of me struggles with it. Now on to my main article.

The more I study the brain and the more exposure I have to therapy (giving, not receiving), the more biased towards the brain I become. What I mean is that the more we discover about the brain, the more behavior we can explain based on biology or neurophysiology, and the less important I think therapy is. I’ve written about this topic in the past but wanted to briefly revisit it; this is somewhat of a second chapter to that post. Before I continue, I want to expose one of my biases: I believe humans have free will. Even though some of my beliefs about the brain could be seen as mechanistic and deterministic, I do not believe that a strongly biological foundation for behavior rules out free will. You can still assume biological foundations without assuming determinism if, for example, you have a monistic set of assumptions that incorporates both mind – “nonmaterial” – and body – “material” – in one. [I have quotes around nonmaterial and material because mind is not necessarily nonmaterial and body is not necessarily material, at least philosophically speaking.] Monism is a similar idea to a unified field theory (e.g., a Grand Unified Theory) or the Theory of Everything for which some theoretical physicists are searching. That’s not what I’m going to write about, and if it didn’t make sense, don’t worry about it (I discussed this topic in a couple of different posts: here {I linked to that post previously} and here). To summarize, I view behavior through a strong biological bias, but I do not assume determinism.

As I said earlier, the more I learn about the brain and behavior (through research and observation), the more I lean towards neuroscience and away from “traditional psychology.” However, I still appreciate the psychosocial aspects of behavior; the nature versus nurture dispute will never be resolved because both are important. The environment is important – all external stimuli are important – but the problem with downplaying biology is that it is the medium of behavior. What I mean is, everything we think, sense, perceive, or do is translated and transmitted through the firing of neurons. This means that all abnormal behavior, which is what psychologists often are interested in, originates in a neuron or related cell. Whether the cause of that behavior is internal or external is irrelevant; all events and stimuli are translated into patterns of neuronal firings.

This is why I think that understanding the biology of the brain is the best way to understand a person’s behavior. However, because we have an imperfect understanding of the biology of the brain, we have an imperfect understanding of the biological foundations of behavior. This means that until we have a perfect understanding, we cannot ignore the psychosocial aspects of behavior; even with a perfect understanding we couldn’t, because even if we understand the “translation” process, we may not understand the origin of what needs to be translated. This is where traditional talk therapy can be most beneficial. However, I believe less and less that talk therapy is the best solution for dealing with many psychological issues. Over time, as we discover more and more about the brain, therapy will become even less important.

That is a fairly radical position to take as a student of clinical psychology – it’s more in line with psychiatry, or rather, I believe it’s more in line with neuroscience. I’m not saying that therapy is useless; I’m saying that as we gain a more perfect understanding of the brain and how various chemicals interact in it, we will have less need for people to help others by “talking” through their problems. The better we understand the physiology of the brain, the more natural our pharmaceuticals will be. In other words, it will be easier to mimic and create normal brain functioning. Of course, many will ask, “What is normal?” That’s a good question.

Some may argue that with depression, for example, many people have a negative self-image and negative self-evaluations, which can lead to depression. That is true, but it’s the classic chicken-and-egg question. Which came first? Did the negative thoughts cause the depression, or did the person experiencing negative thoughts have a biological predisposition to those thoughts and to depression? In other words, it is possible that biology originally led to the negative thoughts and not vice versa. This is all speculation, but I think there is increasing evidence for this view.

The big problem with my position, though, is that at some point it does become a deterministic system, in that we could medicate away people’s free will. This is an unacceptable outcome. There would be a lot of power in this knowledge and many opportunities for abuse. That’s an ethical discussion for a later time.

To summarize, I think that as we (speaking collectively) gain a more perfect understanding of the brain (and even of individual differences in the brain), we will be better able to eradicate and prevent many or most psychological disorders. We could potentially stop schizophrenia through genetic engineering or other modifications. Again, I’m not addressing whether or not we should; I believe we will have the ability at some point. This is why, at the moment, I lean more towards neuroscience than psychotherapy. Of course, tomorrow I could [I won’t] write a post that completely contradicts this one. As I said, this is a process. I think it’s important to argue both sides of the issue.

The Philosophy of Science

I wrote this response to someone who questioned the assertion I made (on a different website) that science is not impartial. Let me know what you think.

I don’t really have room or time to get into a full philosophical discussion; this is a discussion that could take months. As a note, I’m not just making things up; I studied epistemology in college. One of the philosophical foundations of science is empiricism (science is all based on philosophy, which is one reason that in the U.S. all science doctorates are PhDs – Doctor of Philosophy). I’ll quote from Wikipedia because in this case it is accurate.

“Empiricism is one of several competing views about how we know things, part of the branch of philosophy called epistemology, or “theory of knowledge”. Empiricism emphasizes the role of experience and evidence, especially sensory perception, in the formation of ideas, while discounting the notion of innate ideas.”

“In the philosophy of science, empiricism emphasizes those aspects of scientific knowledge that are closely related to evidence, especially as discovered in experiments. It is a fundamental part of the scientific method that all hypotheses and theories must be tested against observations of the natural world, rather than resting solely on a priori reasoning, intuition, or revelation. Hence, science is considered to be methodologically empirical in nature.”

There are other philosophies competing with empiricism. Rationalism is one, although in our day some ideas of rationalism are combined with empiricism. Materialism (the view that all entities are matter and reducible to smaller entities, e.g., atoms) is another foundation for most science.

Because modern science is based on specific philosophies with specific assumptions (e.g., that all is matter), it cannot be completely impartial, because science (forgive the anthropomorphism) inherently disregards anything not based on its same assumptions and philosophies (e.g., religion). Science has one particular view of the world and states that everything else is false, or at least unknowable. That’s not impartial – that’s bias. That’s like Americans saying, “Our world view is the only correct world view.” Now, maybe it is true, but that does not make it less biased. Everyone and everything has biases, even the philosophies that form the foundation of science.

As I said, this is some pretty deep philosophy. People have been arguing over this for thousands of years and will be for thousands more.

One last example. We tend to believe that mathematics is perfect and unbiased. Kurt Gödel showed that it isn’t. Now, not everyone agrees with his ideas, but he convincingly showed that mathematics is at least incomplete: any consistent formal system powerful enough to express arithmetic contains true statements it cannot prove. Math does not equal science, but most science is founded on mathematical principles.

I answered your question, hopefully without coming across as a troll. As I said in my original post, I’m not trying to discredit science (science is my job), but blindly accepting that science is perfect, completely unbiased, and the only way to knowledge demonstrates as much faith in science as many have in religion.

After a reply that expressed complete disbelief (and also insulted my intelligence) 🙂, here’s my final response:

“I did not say that philosophy and science are the same; I said that science is based on specific philosophies. As I said, it’s some pretty heavy stuff that most people (rightly) don’t care about. Again, I didn’t say philosophy and science are the same. The relationship (and this isn’t a perfect example) is more like philosophy:science::arithmetic:calculus.”

Am I completely off base here? I haven’t had extensive training in epistemology, but I’ve had a fair amount. I remember that in one of my classes some people just didn’t get it. They were very bright people; it’s just that philosophy requires a different way of thinking (not better, just different). It takes practice; I happened to start having serious philosophical discussions with friends pretty early on in school.

Biological Determinism

As someone with a strong neurobiological foundation, I believe that the brain is the center of all behavior. What is the evidence for that belief? Remove someone’s brain and see if they have any behavior (note: I’m not endorsing this; I’m merely postulating a hypothetical situation). Without the brain, there is no behavior. So the brain is necessary for behavior, but is it sufficient?

In psychology we often talk about necessary and sufficient conditions for behavior. That is, you may need a certain factor in order for a behavior to happen, but without other factors the behavior still will not occur. For example, you need water to live – it is necessary – but you also need food, so water is not sufficient. So the brain is necessary for behavior, but can all behavior be explained solely by the brain? Another way of phrasing this question is, “Does biology determine all behavior?” The term for this belief is biological determinism.

To answer the question we first have to investigate and uncover other potential influences on (causes of) behavior. If behavior is biologically determined, do people have free will? That is, do people really have the ability to consciously make and choose different behavior? Or are all behaviors simply determined at the neuronal (or genetic) level and free will is only an illusion? This post is an expansion on one of my previous posts concerning alternative assumptions to naturalism in neuroscience.

If you really believe that the brain (and, by reduction, genes) is solely responsible for behavior, then you cannot believe that people have free will. You also cannot believe that the environment is directly responsible for behavior – it can influence behavior – but at the core, your genes and your neurons create behaviors. Alternatively, you can believe that humans have free will, that we can make choices because of or in spite of our biology. Agency can influence biology and biology can influence agency – they are not mutually exclusive categories. While the brain is a necessary condition for behavior, it is not sufficient; agency is a factor in human behavior.