Cognition is “the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses.”1 The word dates back to the 15th century, when it meant “thinking and awareness.”2 Today the term refers to a diverse collection of psychological activities, encompassing processes such as attention, comprehension, memory, judgment, evaluation, reasoning, decision-making, problem-solving, and the use of language. Human cognition may be concrete or abstract, conscious or unconscious, and intuitive or conceptual.3
The science of cognition – cognitive science – is an interdisciplinary approach that tries to understand the mind, intelligence, and the workings of cognition by drawing upon insights from psychology, philosophy, artificial intelligence, neuroscience, linguistics, and anthropology, among others.4 There are a number of different approaches to understanding cognition, but the two main paradigms are the computational and evolutionary approaches. The first wave of cognitive science focused on computational processes that generate knowledge about the world, looking at cognition in terms of inputs, processes, and outputs. One of the main tenets of cognitive science since its origins is that thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures.5 A more recent approach is that of evolutionary psychology, which tries to understand human cognition as the product of a set of evolved psychological adaptations to perennial challenges in our historical, social, and natural environment.6 This evolutionary perspective can be seen as the second wave of the cognitive revolution.7 It views the human brain as composed of evolved computational systems, formed through natural and social selection to use information to adaptively regulate physiology and behavior, both social and physical. This shift in focus—from knowledge acquisition to the adaptive regulation of behavior—provides new ways of thinking about different areas of cognition.8
One of the primary overarching organizational structures of cognition is its use of abstraction to create a hierarchy composed of various levels. In an evolutionary sense, our biological brains can be seen as a primitive brain stem, or lizard brain, wrapped in a mammal brain, inside a primate brain, inside a human brain, the most recently evolved part of which is the frontal lobe of the neocortex.9 The human brain functions hierarchically in its capacity to modify and control the earlier evolved, more primitive parts of the brain through the more advanced parts. The psychologist Abraham Harold Maslow made a holistic attempt to classify this hierarchical structure of the brain and its corresponding needs in what is called Maslow’s hierarchy of needs. Although Maslow’s hierarchy may not correspond exactly to the underlying complexity of the brain, it does capture a general hierarchical structure to cognition, motivation, and the different emotional needs that people have. Within this hierarchy, people have a set of basic physiological needs, such as food, sex, and sleep. But in addition to these basic needs, we also seek a set of higher psychological needs: desires to be safe, to be loved, to have self-esteem, and to experience what Maslow called self-actualization.
The brainstem is the area associated with the most primitive functions, governing the most basic bodily requirements, such as regulating one’s heartbeat or coordinating the limbs while walking. Much of our cognition takes place unconsciously in these most primitive parts of the brain, which are also where our emotions are housed. These emotions essentially make quick decisions for us that are mostly adaptive, evolved strategies, including fear, lust, hunger, anxiety, disgust, happiness, and sadness. The idea is that emotions provide direct and rapid behavioral motivation so that we do not have to perform the conscious act of reasoning things through. For example, rather than calculating the risk of encountering a predator, we simply experience the emotion of fear and then act upon it.
The primitive parts of our brain can experience basic emotions like hunger, but only our much more evolved neocortex can experience an emotion like the need for self-actualization. The neocortex, or the frontal part of our brain, has executive function whereas more primitive parts of the brain are the seats of emotion and instinct. The neocortex is involved with social behavior, long-term planning, and inhibition.
The brain is not a unified whole with some centralized command center – the perception that it is so is a construct. In effect, the brain consists of networks of different functional domains and hierarchical levels. Many different parts interact and communicate to create, in aggregate, the illusion of the one unified consciousness that we think of as ourselves. But this is simply a construct; the reality involves much tension and conflict between the different brain regions as they each attempt to fulfill their own function and purpose.
This is most explicit in the relationship between the more advanced neocortex, with its executive function, and the more primitive parts through which we experience direct emotions and instincts. The conflict between the different parts creates cognitive dissonance, which is the holding of two or more beliefs or ideas that are manifestly contradictory. We do not like cognitive dissonance and thus are motivated to resolve the conflict.10 The individual may resolve this cognitive dissonance in a number of ways. They may exert downward control from the executive part to constrain lower-level drives. They may avoid the dissonance by compartmentalizing beliefs and ideas so as to keep them separate. One instance of this might involve special pleading, a form of fallacious argument that attempts to cite something as an exception to a generally accepted rule or principle without justification for defining it as an exception.11 In so doing, one can keep ideas, beliefs, or motives separate from other categories that might create conflict.
Humans are very good at inventing reasons to justify their desired beliefs, which is called rationalizing. Often, rather than imposing these high-level functions on our more primitive desires, the neocortex may rationalize – through the use of special pleading, for example – decisions that are made by more primitive regions in order to resolve the cognitive dissonance. The different brain regions come into conflict, and once the conflict is resolved, our brains give us a small amount of dopamine, the reward neurotransmitter, to make us feel good. When we meet our psychological needs, our brain gives us a reward that makes us feel good. There is a basic reward and punishment system hardwired into the biochemistry of the brain. When we do something that is likely to be evolutionarily advantageous, we feel good, which equates to a release of dopamine to our reward centers.12 With functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation, contemporary neuroscience is increasingly finding the neurological correlates of what psychologists have been demonstrating for decades. With these tools, we can now see different brain areas in conflict and the conflict resolution that the individual experiences when a decision or rationalization is achieved.
The desire for control, or at least for the sense or illusion of control, is one of the most basic and primitive needs that motivates us. We do not like to feel that we are victims of a whimsical world or that we are helpless in the face of uncertain forces or random events. This desire for control can be seen as a direct product of evolution: unknown and uncontrollable environments limit our capacity for predictability and security, which threatens our survival. We like to think that we exert some causal control over ourselves, the events that affect us, and our environment.
One manifestation of this desire for control is a belief in superstitions. We tend to develop beliefs that if we engage in a certain activity, it will protect us or enable us to succeed. Wanting more control or certainty is an important driving force behind most forms of superstition. We invariably look for some kind of rule or explanation for why things happen, and we feel discontent if we do not find one, leaving us open to the temptation to simply invent one.13 Superstitious practices give us the illusion that we can exert some control over otherwise random events. We also have a desire for simplicity, because the simpler things are, the more control we can have over them. Therefore, we are motivated to oversimplify the things that we are confronted with. We stereotype because it enables us to reduce a potentially complicated set of factors into a few simple rules. This can be advantageous when we understand that the rule is just a simplified representation, but often we take the simplified model to be the complex reality that it represents, and this leads to errors because the two are different.
Along with a desire for control, we have an innate desire for meaning in our lives, a sense of unity, a desire for connection with something greater than ourselves. These are strong motivators towards belief in supernatural powers or forces that fulfill all of these psychological needs. The drive towards belief is very strong in humans, particularly towards things that we want to believe in. Not only have we evolved to identify patterns rapidly, but we also inherently desire control and a sense of stability, order, and unity. We are driven to believe many things because they provide this sense of unity to our interpretation of the world and relieve the burden of having to make a full inquiry into a complex reality. Added to this, many things are simply beneficial for us to believe. When we sense that we are connected to something profound, it feels good. This can then be reinforced by confirmation bias, which involves seeking out data that seems to confirm our beliefs.
Beliefs are largely expressive of our emotional state:14 when we are feeling positive we believe the future will be good, and when we are feeling negative we believe the future will be bad. Thus it can be very difficult to change people’s behavior by making a rational argument to them, because their behavior is still overwhelmed by their beliefs and emotions. Addressing the individual’s emotions, however, is much more effective. This has been demonstrated by the success of modern advertising, which speaks intentionally to the emotions while largely bypassing reason, knowing this to be more effective for most people.
The social brain hypothesis was proposed by the British anthropologist Robin Dunbar, who argues that human intelligence did not evolve primarily as a means of solving ecological problems, but rather as a means of surviving and reproducing in large and complex social groups.15 The development of the human brain can be closely associated with the development of ever more complex social systems: the use of language for communication between members, and the other socio-economic factors involved in the formation of large formal social organizations.
Social norms, the customary rules that govern behavior in groups and societies, strongly shape individual human behavior.16 Most people are driven much more by social influence than by reason. Telling someone what others do in a situation is much more effective than telling them the logical thing to do. For example, telling people that others do not drink and drive is much more effective than telling them the negative implications of such behavior.17 Individual humans have a need for self-esteem: people want to feel good about themselves, but just as importantly we want to know that others think positively of us. People want their behavior and beliefs to seem consistent to others. In order to do this, we often rationalize what we want to believe so as to put a socially acceptable spin on our behavior. Equally, people do not like to admit that they may be wrong or that they have flaws, because that is a threat to their self-esteem.
The brain gives us the experience of free will and control over ourselves, but research overwhelmingly shows that human beings generally have poor self-control. Typically, about 95 percent of the time, people will fail to alter their own behavior through conscious effort alone – for example, to quit some habit – because doing so takes significant energy and conscious commitment on the part of the individual.18 The brain is, however, highly adaptive and plastic, meaning its ways of thinking can change over time. Therefore, if one practices and makes a concerted effort to behave in certain ways, those behaviors will become inculcated and easier over time. Exerting executive control over the more basic parts of the brain is a habit that can be learned. This is where critical thinking can be of value: in examining one’s behavior carefully, in understanding errors, and in enabling us to exert an effort to correct flaws.