The perfectible brain.

Why does the perfectible brain seemingly perceive the simplest truths as most attractive, and how can we avoid the temptation of influencing others with such uncritical thinking?
These are a few challenging questions that can, and should, stimulate further in-depth research.

Perfectible: capable of improvement or perfection.

Introduction

The human brain and the effects of using it have fascinated researchers for more than 5,000 years. From the first reported euphoric effect of the poppy plant in ancient Sumerian records from lower Mesopotamia (c. 3,400 BC) to recent findings on circumventing the brain’s natural defences when they are counterproductive (Song et al., 2020), mankind has always considered the brain a mouldable concept. The brain is a complex organ with the ability to change continuously throughout an individual’s life. Neuroplasticity is important for healthy development, learning, memory, and recovery from brain injury on multiple scales, from small changes in individual neurons to major changes such as cortical remapping (Pascual-Leone et al., 2011). The brain can reorganize itself, both physically and functionally, in response to our environment, behaviour, thinking, and emotions, such that every new experience enhances or changes the mind’s old dimensions. As the American jurist Oliver Wendell Holmes Jr. once said: ‘A mind that is stretched by a new experience can never go back to its old dimensions’. Improving one’s intelligence, solving problems, integrating new ideas, and even experiencing unpleasant emotions not only restructure and alter the neurochemistry of our brain but also affect the lives of people around us.

Ancient philosophers already affirmed that gaining knowledge improves our decision-making, memory, and thinking, and influences our environment. Alcmaeon was the first ancient scholar to recognize the brain as the human body’s most important organ (Zemelka, 2017). According to Alcmaeon, the brain plays a guiding role (Gr. hegemonikon) in the body. Unlike Empedocles (Longrigg, 1993), Alcmaeon sought the source of intelligence in the body itself. Although the primacy of the heart prevailed in pre-Classic times, Alcmaeon argued that the brain is the centre of human mental life. His homeostasis theory and his view of sensory cognition as closely related to the cerebrum were a unique approach to the topic, and his conceptualization of the brain remains largely valid even today (Zemelka, 2017).

Why is it important not to let this three-pound organ, responsible for our intelligence, sensation, body movement, and behaviour, fall into oblivion? If we consider the brain such a significant organ, why do we accept that it seemingly perceives the simplest truth as most attractive? Conversely, to what extent are we responsible for how our thoughts influence others and their thinking?

Red pill or blue pill

In The Matrix, the Wachowskis’ 1999 futuristic film, the rebel leader Morpheus offers Neo, the protagonist, a choice between a red pill and a blue pill. The red pill signifies an uncertain future; it offers freedom from the slavish control of a machine-generated world and allows him to escape to the real world, which is cruel and more difficult. The blue pill represents the world in which he lives, a dream world without lack or fear: the simulated reality of the matrix. Neo chooses the red pill, with clear references to Alice in Wonderland by Lewis Carroll.

This is your last chance. After this, there is no turning back. You take the blue pill – the story ends; you wake up in your bed and believe whatever you want to believe. You take the red pill – you stay in Wonderland, and I show you how deep the rabbit hole goes. Remember, all I am offering is the truth. Nothing more. (The Matrix, 1999)

The decision regarding which pill to take is crucial because the choice is irrevocable. The Australian writer, philosopher, and literary critic Russell Blackford questions whether the person is fully informed about the consequences of taking the red pill and thereby opting for the ‘real’ world, choosing physical reality over simulated reality. Such a choice may not be beneficial or valid for all people (Blackford, 2004). Cypher, another character in the movie, regrets taking the red pill and states that had he been fully informed about the ‘real’ situation, he would have told Morpheus to ‘shove the red pill right up [his] ass’. Ignorance is bliss, it seems, but Blackford argues that even in the worst-case scenario, taking the red pill is worthwhile because one lives and dies authentically.

Authenticity is exactly what we are looking for. Descartes wrote, ‘Cogito, ergo sum: I think, therefore I am’. Nietzsche, conversely, claimed ‘Sum, ergo cogito’, holding that it is by apperceiving that one concludes one must be a thinking being. He argued that social ontology, including metaphysical, logical, linguistic, and conceptual elements, may in fact be a precondition for Descartes’ conclusion about human existence from such predetermined values (Monte, 2015). It could be misleading to consider this a simple truth, as according to Nietzsche, the cogito is only a part of one’s mental status. However, thinking – in whatever order – is connected to the concept of authentic being and to the creation of one’s own reality.

Why, then, is ignorance bliss for so many people (Gray, 1743)? Why do some people think it is better not to know? Why do they not think more critically? Why do they not question false information? People seem to flee into obscure theories rather than accept the truth. Is it their worldview? Is it the way they exist in their own universe? Is it because society told them not to question authority, not to question life? Sometimes people are simply mirroring the reality, and therefore the mind, of others. Conversely, are people who do use the capabilities of the brain considered gifted? Do they form an elite group that can influence the world and our reality? Do we have to question everything? Why choose a safe haven and not take risks? Is there a reason not to use our brains, not to make use of our neuroplastic possibilities? In this world of instant gratification, it is difficult to choose either the path of critical thinking or a mindset that is a priori beneficial for everyone. Yet there must be a generally beneficial thought. How can we change something we are not aware of? Do we keep ourselves busy rather than be alone with our thoughts?

Quid est veritas?

‘You can’t handle the truth!’ shouts Jack Nicholson’s character in A Few Good Men (1992). But what is the truth? Truth is popularly defined as ‘a fact or belief that is accepted as true’. This definition has two major problems. First, in itself, it defines nothing at all: any definition that uses the concept being defined within the definition itself is pure nonsense, like ‘goodness is something good’. Second, it expresses a self-destructive subjectivism: if ‘accepting’ or ‘agreeing’ is what makes something true, then truth does not actually exist. How can our brain process such abstract thoughts, and how can we express them without prior consideration? How do our brains see the world, and how does this affect our ability to create and innovate?

Perception is the basis of human experience. A phenomenon known as motivated perception has been studied in psychological research for decades. The world as we perceive it in our consciousness is not an exact representation of what it really is. The endpoint seems to be a set of probable events from which our consciousness chooses, and this choice also influences and affects others; our perception is therefore often biased, selective, and malleable. People tend to believe that their perception is a veridical representation of the world, yet they often report perceiving what they want to see or hear. It remains unclear whether this reflects an actual change in what people perceive or only a bias in their responses (Leong et al., 2019). People assume that they see the world as it really is, and there is not always a strong motivation to reason critically. How, then, do we solve this issue? Are we, in general, responsible for our thoughts, even before any action, out of regard for the minds of others?

Post-truth

False information has become more and more common, and worse, the human brain loves it. ‘Post-truth’, as a philosophical concept denoting the disappearance of shared objective norms of truth and the blurring of fact, knowledge, opinion, belief, and truth, is fairly new, but it can be traced back to earlier debates on relativism, postmodernity, and mendacity in politics. After the Brexit referendum and the election of Donald Trump as president of the United States, Oxford Dictionaries chose ‘post-truth’ as its Word of the Year for 2016. The current COVID-19 situation has the same characteristics. These are all situations in which public discourse is formed by the thinking of others, resting on emotion, personal conviction, or ‘felt truth’ rather than on objective facts (Post-Truth: perspectives, strategies, prospects – KU Leuven conference, 16–17 January 2020).

This strange phenomenon is characteristic of the 21st century and reflects a profound lack of confidence in the legitimacy of the dispensers of facts (i.e. experts) among groups that are suspicious of any truth or claim to truth. Science is not spared, as demonstrated by the climate controversy and the rejection of vaccines by a large part of the population. The defenders of this ‘new’ truth also defend their right to proclaim and act on it. Unfortunately, post-truth possesses formidable self-defence mechanisms. Spreading corrective updates, however factual, often reinforces false information simply because the falsehood is repeated and propagated. Attacking an alternative fact also gives it weight, making it more credible and memorable than it deserves (Rivera, 2017). Worse still, a recent study shows that even when we succeed in changing false beliefs, this does not guarantee a change in behaviour; so why bother? Is there a way to change this alternative reality? Do we, the human species, have a certain responsibility towards others to keep these alternative truths under control? Every now and then, it is amusing to check on the latest conspiracy theories, yet we also remember the moment when 11 people were shot to death at the Tree of Life synagogue by a gunman believed to be inspired by an anti-Semitic internet conspiracy theory. It all started with a certain momentum in the thinking process of one irresponsible individual.

Hyperbrain and the fear of real life

Scientists say our ‘mind’ is not confined to our brain or even our body; nevertheless, we need a brain. Research has shown that high intelligence is predictive of positive outcomes, including educational success and income levels. However, little is known about the difficulties faced by people with high intellectual capacity (i.e. a hyperbrain). These individuals exhibit overexcitability in various domains that can predispose them to certain psychological disorders as well as physiological conditions involving heightened sensory reactivity and altered immune and inflammatory responses (i.e. a hyperbody) (Karpinski, 2017). Ruth Karpinski, the study’s principal author, developed a hyperbrain/hyperbody theory of integration, which states that individuals with high cognitive ability respond to their environment with overexcitable emotional and behavioural reactions. Partly because of their increased awareness of their environment, people with a high IQ (whatever that means) tend to experience an overexcitable, hyper-reactive central nervous system. This suggests that high intelligence is a potential genetic piece of a psycho-neuroimmunological puzzle. Can our unconscious self realize this? Are we unconsciously programmed not to use our brain to its full capabilities? Are we unable to think critically, not towards an endpoint of our thinking (i.e. the top of the brainwave) but from the starting point of our conscious momentum? Is it our duty to take our own pre-action thinking into account?

Designer brains

Science is on the eve of technological breakthroughs that could improve our mental capabilities beyond recognition. It will soon be possible to increase the power of the human brain with electronic ‘plug-ins’ or even through genetic improvement. What does this mean for the future of humanity? A better brain condition is only a pill or a chip away. Brain amelioration is already a reality, but should we rush towards this technology, or should we pause and listen to scientists, doctors, theologians, philosophers, and politicians? Memory-enhancing genes have already been identified, and intensive research is being conducted to develop memory-enhancing drugs. So far, the focus has been on therapeutic strategies. Should this seemingly endless ability to innovate, adapt, and improve be a concern?

In a world where a chip, a pill, or a gene can so easily form and transform the brain, how can we ensure that we do not promote inequality, become less autonomous, develop more brain-related pathologies, or end up with ‘lazy brains’ (Wiltfang, 2018)? In 2016, scientists at Columbia University in New York used light-based technology to control the mind of a living being, demonstrating that brains can be manipulated to change behaviours and processes far beyond what was previously thought possible.

In our increasingly complex society, cognitive functioning is key. Numerous ways to augment brain functions have been proposed, prompting discussions about their ethical, societal, and medical implications. From a public point of view, cognitive enhancement is often seen as a single, indivisible issue, but on closer inspection it is clearly multifaceted: there is no ‘one solution’ to augmenting the brain’s functions, only a great variety of interventions, clustered into biochemical, behavioural, and physical actions, that can result in cognitive enhancement. Can we, with this enhancement of the mind, conduct our own thinking in the near or far future in such a way that we take a certain a priori responsibility towards our fellow man or woman? Can we take action before the action, or is this all very futuristic?

Ambicentric thinking

A new way of thinking is emerging. It stands to reason that our thinking – not only its result (i.e. the action) but also the a priori momentum in which our thoughts are conceived – influences others. Training our consciousness to take ‘the other’ into account can prevent interferences. From the first organizing or decomposition of a complex problem into more manageable parts, the relevance of the possible later action to the thinking of others can be part of the process itself. This should be a segment of the identification of the problem: the determination of why the problem exists and what the consequences would be if no action were taken to solve it. This requires epistemic inclusion in our thinking processes, which contrasts with the more pragmatic way of thinking (i.e. bringing ourselves closer to the goal) and enables us to uncover information about ‘the other’ that is hidden at first sight.

This ambicentric behaviour – as in a moderated anthropocentrism, where human interests take precedence but people have qualified duties to the system (Marie, 2005) – can result in an ambicentric thinking process that takes into account the importance and relevance of one’s own mind while recognizing that the same mind has a task in relation to ‘the other’. Ambicentric thinking should be applied to other thinking principles as well. For example, while critical thinking generally improves our thinking by analyzing, assessing, and reconstructing itself (i.e. self-directed, self-disciplined, and self-corrective), we can add an ambicentric element to that very process. The same is true for systems thinking, where the use of certain events, patterns, and structures as parts of an overall system can be reconsidered in terms of how the thinking process (and the resulting action) influences the thinking and behaviour of others, not only from an action point of view but from every step taken in that specific thinking process.

This responsibility towards, and recognition of, the relevance of other people’s minds will also require an ethical framework. Adding these ambicentric aspects to our known thinking processes will result in more balanced actions, to the benefit of others and of society at large.

In a world full of different truths, enhancing our thinking processes is crucial. Thinking is often considered an elitist practice; too much intelligence or too much knowledge is viewed with suspicion. Alternative truths give just enough facts to feed our initial fears. Accordingly, the creation and conservation of these logical fallacies and proportionality biases are disruptive to the well-being of a society. There seems to be a very thin line between thinking about facts and inventing facts in order to understand the world and how it works. There is certainly more going on than we can understand, and an ambicentric approach can illuminate our struggle with the complexity of everyday life. More research is needed to explore how far we can bend our thinking processes towards a more holistic understanding of how we influence the world around us, not only through the result of our thinking (i.e. the action) but also through our efforts to understand the phenomena. We do this for ourselves and for others.

Should we, as a society, educate ourselves in this matter? Apart from the fact that the brain clearly delights in receiving untruths, there is also a clear need for holistic formation and education within all layers of the population from an early age. How can we introduce our brain to a new way of enhancing our thinking so that we can innovate or create new ideas? Our educational system should emphasize the mindset rather than the action or the result, creating a framework for identifying thinking processes in such a way that it adds new dimensions to our worldview and provides us, early in the process, with a glimpse into the worldview of others.

Conclusion

Since the Enlightenment, critical thinking has often taken precedence over truth. In the domain of religion, ‘truth’ could relate to quite different things. The distinction between intra-religious truths (the supernatural) and extra-religious truths (i.e. historical events) (Bronk, 1998) had more to do with faith than with thinking as such, but the Enlightenment did somehow take the non-discussable, dogmatic, and easily comprehensible truth away from society. A new era of enlightened thinking celebrated human rationality but also added some uncertainty to the human mind in relation to its environment. Individualism was a prominent theme of the Enlightenment, but along the way human society came to need more than just strong, self-reliant, and assertive individual thinking. Like our daily communication, thinking should be a multidirectional process whereby numerous agents transmit information, not only in their actions but also in the process prior to the action: the active thinking. This will not alter our individuation process, but it will add an extra dimension to our thinking and, in the end, broaden the individual’s worldview.


Sources:

  • Song, E., Mao, T., Dong, H., Boisserand, L. S. B., Antila, S., Bosenberg, M., Alitalo, K., Thomas, J.-L., & Iwasaki, A. (2020). VEGF-C-driven lymphatic drainage enables immunosurveillance of brain tumours. Nature.
  • Pascual-Leone, A., Freitas, C., Oberman, L., Horvath, J. C., Halko, M., Eldaief, M., et al. (2011). Characterizing brain cortical plasticity and network dynamics across the age-span in health and disease with TMS-EEG and TMS-fMRI.
  • Zemelka, A. M. (2017). Alcmaeon of Croton – Father of Neuroscience? Brain, Mind and Senses in Alcmaeon’s Study. Journal of Neurology and Neuroscience.
  • Longrigg, J. (1993). Greek Rational Medicine: Philosophy and Medicine from Alcmaeon to the Alexandrians. London: Routledge.
  • Blackford, R. K. (2004). Try the blue pill: what’s wrong with life in a simulation? In M. Kapell & W. G. Doty (Eds.), Jacking Into The Matrix Franchise: Cultural Reception and Interpretation (1st ed., pp. 169–182). New York: Continuum International Publishing Group.
  • Monte, J. (2015). Sum, Ergo Cogito: Nietzsche Re-orders Descartes. Aporia, 25(2).
  • Leong, Y. C., Hughes, B. L., Wang, Y., et al. (2019). Neurocomputational mechanisms underlying motivated seeing. Nature Human Behaviour, 3, 962–973. https://doi.org/10.1038/s41562-019-0637-z
  • Rivera, C. (2017, August 22). Human Brain Loves False Information. Retrieved from https://galindes.wordpress.com/2017/08/21/human-brain-loves-false-information/
  • Post-Truth: perspectives, strategies, prospects. KU Leuven conference, 16–17 January 2020.
  • Karpinski, R. I. (2017, October 11). Hyper Brain, Hyper Body: The Trouble With High IQ. NeuroscienceNews.
  • Marie, M., Edwards, S., Gandini, G., Reiss, M., & von Borell, E. (2005). Animal Bioethics: Principles and Teaching Methods (1st ed.). Wageningen, The Netherlands: Wageningen Academic Publishers.
  • Wiltfang, J. (2018). Unlecture session 18 – Designer Brains in the Pursuit of Human Perfection.
  • Bronk, A. (1998). Truth and Religion Reconsidered: An Analytical Approach. Retrieved from https://www.bu.edu/wcp/Papers/Reli/ReliBron.htm