International Journal of Brain and Cognitive Sciences

p-ISSN: 2163-1840    e-ISSN: 2163-1867

2024;  12(1): 10-15

doi:10.5923/j.ijbcs.20241201.02

Received: Aug. 29, 2024; Accepted: Sep. 10, 2024; Published: Sep. 13, 2024

 

Biological Consciousness: A Novel Framework Including a 10-Parameter Definition to Explain the Qualitative Experience of Consciousness

Jabi Shriki1, Ted Selker2

1Associate Professor, Department of Radiology, University of Washington School of Medicine, Staff Radiologist, Puget Sound VA Healthcare System, USA

2Research Professor, University of Maryland, USA

Correspondence to: Jabi Shriki, Associate Professor, Department of Radiology, University of Washington School of Medicine, Staff Radiologist, Puget Sound VA Healthcare System, USA.


Copyright © 2024 The Author(s). Published by Scientific & Academic Publishing.

This work is licensed under the Creative Commons Attribution International License (CC BY).
http://creativecommons.org/licenses/by/4.0/

Abstract

Artificial intelligence has opened new perspectives on consciousness and potentiates resolution of the “hard problem of consciousness”. However, there are important differences between how biological entities actualize consciousness and how computer systems simulate the experience of consciousness. We propose a novel conceptualization of biological consciousness that considers the features of biologically conscious beings, and also incorporates the context of their interactions with their environment. Our model considers ten features that are necessary components of the qualitative experience of consciousness for biological entities: environmental features, borders around the conscious entity, input sensing systems, response systems, internal decision algorithms, feedback loops, memory systems, interoception, proprioception, and predictive systems. We believe that this definition of biological consciousness is informed by relevant insights from the neurosciences and that this definition clarifies the limitations of efforts to simulate and approximate biological consciousness in computer systems.

Keywords: Consciousness, Biological consciousness

Cite this paper: Jabi Shriki, Ted Selker, Biological Consciousness: A Novel Framework Including a 10-Parameter Definition to Explain the Qualitative Experience of Consciousness, International Journal of Brain and Cognitive Sciences, Vol. 12 No. 1, 2024, pp. 10-15. doi: 10.5923/j.ijbcs.20241201.02.

1. Introduction

There has never been a better time in history to study the phenomenon of consciousness. Our ability to look deep inside a living, thinking, awake brain as it functions, and to dissect it down almost to the neuronal level, has never been greater. Mapping tracts in the brain has shown us the complex elegance of the information superhighway that exists inside our own heads. (Yamada et al., 2009) We have even been able to map the entire brain of smaller mammals on a cell-by-cell level, creating an atlas of the cell types that comprise their minds. (Yao et al., 2023) Progress in functional MRI has shown us the waves of dynamic, metabolic activity in our brains as they propagate—thinking as it’s happening. (Wang et al., 2016; Bandettini, 2012)
However, translating anatomic clarity into a functional understanding of the processes of consciousness has been difficult. The study of consciousness often encounters the impasse of the “hard problem of consciousness”, which refers to the set of problems in explaining how physical processes give rise to the qualitative experience of consciousness. (Chalmers, 1995) Attempts to clarify the definition of consciousness have given rise to dualistic models that consider the mind as a distinct entity from the physical brain. (Jackson, 2003) However, dualistic models have been criticized for being non-scientific, and some authors, notably Daniel Dennett, have denied that the hard problem of consciousness exists, arguing that consciousness is a complex, but physically real, mechanistic process. (Dennett, 1996; Cohen and Dennett, 2011)
An additional problem is that understanding consciousness requires a multi-disciplinary approach, since the existing questions around consciousness span many disciplines, such as philosophy, anatomy, physiology, radiology, neuroscience, and computer science. (Grandpierre et al., 2013; Lozovanu and Lazariuc, 2021) In an era where the sciences are increasingly siloed off from each other, tackling such a complex, trans-disciplinary problem is even more difficult. (McLevey et al., 2018) As a result of these hurdles, there is no universally accepted definition of consciousness. (Velmans, 2009; Graziano, 2022) Many proposed definitions fail to incorporate some of the complexities of biological consciousness. (Edelman et al., 2011)
One of the recent revelations in our study of consciousness, elucidated by Anil Seth and colleagues, is that it may be a spatially and temporally heterogeneous epiphenomenal state, consisting of brain activity that dynamically bounces around inside our heads and results in tightly coordinated activation of parts of our brain that are far apart from each other. (Seth and Bayne, 2022) Studies seem to show that the degree of spatially distributed, coordinated activation is the main differentiating factor between an awake brain and a brain that is unconscious or under the influence of anesthesia. So, the key to unlocking consciousness may lie in the recognition that signals have to be rapidly processed in multiple different areas of the brain for consciousness to emerge. (Revach and Salti, 2022)
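As a purely schematic illustration of this idea (and not a reproduction of any published analysis), a toy calculation can stand in for the coordination measures such studies quantify: the mean pairwise correlation between simulated regional signals is high when activity is widely coordinated and low when regions fluctuate independently.

# Toy illustration (not drawn from the cited studies): mean pairwise correlation
# between simulated regional signals as a stand-in for "coordinated activation
# of distant brain regions". Higher values loosely correspond to the widespread,
# coordinated activity associated with the awake state.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 8, 500

# "Awake" condition: every region shares a common driving signal plus noise.
shared = rng.standard_normal(n_timepoints)
awake = shared + 0.5 * rng.standard_normal((n_regions, n_timepoints))

# "Anesthetized" condition: regions fluctuate independently.
anesthetized = rng.standard_normal((n_regions, n_timepoints))

def mean_pairwise_correlation(signals):
    """Average correlation over all distinct pairs of regions."""
    corr = np.corrcoef(signals)
    return corr[np.triu_indices_from(corr, k=1)].mean()

print("awake:", round(mean_pairwise_correlation(awake), 2))                # high
print("anesthetized:", round(mean_pairwise_correlation(anesthetized), 2))  # near zero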
We will propose a set of definitions that complement this functional understanding. This set of definitions adds a deeper explanation of the steps needed to generate the qualitative experience of biological consciousness.

2. Framework for the Definition of Biological Consciousness

Our model is built on two assumptions: First, some components of the qualitative experience of biological consciousness are not limited to the brain, but also require participation of the entire organism. Second, in order to understand consciousness, some system-level issues need to be considered. In other words, an organism can’t be conscious in isolation. There must be an environment and a physically real or perceivable external world in order for consciousness to exist. (Nagel, 2013; Bierman, 2001)
First, let’s briefly explain why consciousness, as we qualitatively experience it, is more than just a process limited to the brain. There is consensus among most researchers in this field that the brain is not the only component of a conscious mind (Edelman et al., 2011), and that there must be other neural and hormonal underpinnings. Blood flow to the brain is tightly regulated and is a critical component of neuronal activation, and it depends on the rules of our circulatory system and its governing hemodynamic parameters. The chemical milieu of the vat in which our brains sit is made up of components that are influenced by our dietary intake, factors related to our use of other body parts, distributions of neurotransmitters, and other hormonal, hematological, and chemical factors. There are also biochemical, genetic, and entrainment factors that act at the neurotransmitter level and may vary over time and across different regions of the brain. So, there is broad consensus that the conscious mind is more than just the brain, and is also tied into several interlinked physical, chemical, and biological processes of our bodies that may vary over time. This may seem like an obvious point, but it will be shown to be an important preface to the definitions that we provide subsequently.
Second, let’s briefly review why consciousness is a system-level process. In isolation, an organism cannot be conscious without a physically real world or other environmental system and associated governing rules. In the study of the brain and consciousness, we often don’t have sufficient appreciation for the fact that consciousness is a dyadic relationship. It’s not just about how an entity behaves; it’s also about the features of the environment that are perceived by the observer. So, consciousness is defined partially as a property of a system that includes both the observer and the environment. Without three-dimensional space, an entity would not have awareness of spatial location. Without the passage of time, biological processes underlying consciousness cannot take place.
Stemming from these assumptions, the following properties, which pertain not only to the individual but also to the surrounding environment, are necessary for consciousness to emerge (a deliberately simplified computational sketch of these five requirements follows the list):
1. Physical environmental features: These are the physical features and rules of the environment that need to be present before consciousness can occur. The existence of these parameters may seem axiomatic, but they are important to delineate and consider. There must be space and dimensionality for consciousness to emerge within that space. There also must be time and causality, so that interactions between the individual and the environment can occur. So, a set of laws of physics needs to be in place and needs to govern the environmental features that are within the realm of perception.
2. Borders: Just as there must be dimensionality of the environment, there must be borders between the environment and the conscious individual. We need to be able to define where a conscious entity ends and where the environment begins. This may again seem axiomatic, but this is an important consideration for our subsequent analysis, and we will reference this further as we consider the development of higher-order consciousness.
3. Input sensing system: For consciousness to arise within a system, there must be a way for signaling to be passed from the environment to the individual. An example would be a chemical gradient in the surrounding milieu that can act as the basis of chemotaxis. Another example would be the presence of sound waves that are propagated in an atmosphere. The organism or individual generally needs to have a way of sensing the environment, so that interactions can occur, and this is a prerequisite for consciousness. Any individual sensory system can be disabled in disease states, but the underlying infrastructure of processing and consciousness would remain present.
4. Ability to respond to the environment: As a prerequisite for the emergence of consciousness within a system, the individual must be able to induce some sort of change as a response to a change in the environment. This change may consist of motion, buttressing of barriers, shape changes, or other changes in position, behavior, or characteristics as a response to signals. If the individual cannot change themselves or their position, then there is no way to detect or measure the interactions that might constitute consciousness. If we have an animal that does not respond to differences in temperature, then we would have no way to conclude that it is conscious of temperature changes.
5. Internal decision calculus: The conscious individuals within an environment can’t just be idealized billiard balls, reacting to changes around them purely as external forces dictate. They need to be able to make internal decisions or internally create algorithms for determining what interaction they will have with the environment. If the behavior of the individual is exclusively determined by environmental features and external governing rules, with no internal ability to recalculate, adjust, or refine interactions, then there can only be reactions, rather than consciousness. This internal decision calculus can be very complex and layered to give rise to higher forms of consciousness, with engagement of more complicated or elegant decision trees. The rules of how individuals respond to the environment have to be different from the physical rules of the environment itself.
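To make these five requirements more concrete, the following deliberately simplified listing (written in Python; all class and function names are hypothetical and of our own devising) sketches how they might be arranged in a toy agent-environment simulation. Consistent with our argument, such code simulates interactions with an environment rather than actualizing consciousness.

# Minimal, purely illustrative sketch of the five basic requirements
# (environment, border, sensing, response, internal decision rule).
# All names are hypothetical; this is a toy simulation, not an
# implementation of consciousness.
import random

class Environment:
    """Parameter 1: a space with dimensionality, time steps, and fixed rules."""
    def __init__(self, size=20):
        self.size = size
        self.time = 0
        # A simple chemical gradient: concentration rises toward the far end.
        self.gradient = [x / size for x in range(size)]

    def step(self):
        self.time += 1  # causality: the environment advances in time

class Agent:
    """Parameter 2: a bordered individual occupying a position distinct from the environment."""
    def __init__(self, position=0):
        self.position = position

    def sense(self, env):
        """Parameter 3: an input sensing system (reads the local gradient)."""
        return env.gradient[self.position]

    def decide(self, signal):
        """Parameter 5: an internal decision rule, distinct from the environment's physics."""
        # Move toward higher concentration below a threshold, with occasional internal noise.
        return 1 if signal < 0.8 or random.random() < 0.1 else 0

    def act(self, env, move):
        """Parameter 4: the ability to respond by changing position."""
        self.position = max(0, min(env.size - 1, self.position + move))

env, agent = Environment(), Agent()
for _ in range(30):
    signal = agent.sense(env)
    agent.act(env, agent.decide(signal))
    env.step()
print("final position:", agent.position, "after", env.time, "steps")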
These features, described above, are the parameters that the system must have for the minimum definition of consciousness. We can overlay additional features on these minimal requirements to create higher or more complex forms of consciousness that would be more familiar to us as what we would generally regard as consciousness. These additional features include the following (a second illustrative sketch, layering these features onto the toy loop above, follows this list):
1. Internal feedback loops: For higher orders of consciousness, or the experience of consciousness that we are more familiar with, we must have internal feedback loops that can be interpreted internally as positive or negative responses. These can be pleasure, pain, or internal rewards as a response to how the individual navigates the environment. These feedback loops may also have varying degrees of complexity. The human brain seems particularly adept at accomplishing this. We can react to changes in our environments, and then we are able to react to our own responses to those changes, so that we engage even more complex responses to our own actions as well. Internal feedback provides the ability to interpret scenarios and develop secondary, tertiary, or other higher-order reactions to the interplay between environmental changes and our reactions to those changes.
2. Memory systems: Individuals within a system interacting with their environment can have memories for navigating, creating, and overlaying their internally generated mnemonic map over preexisting landmarks or stimuli in the environment. Memory systems can consist of laying down and recalling memories as well as overlaying other narrative information with varying degrees of complexity.
3. Interoception: In our parameters of simple consciousness, we essentially regarded conscious entities as point sources within their respective environments. In order to consider more spatially variable, potentially multidimensional beings, we need to add to individuals the ability to map their internal structure as the infrastructure of a rough, internal stimulus-response system. Interoception also brings into the system the ability for individuals to define and understand their own borders relative to the environment. More simple forms of consciousness might only require that a border exists, but higher orders of consciousness likely require a more refined internal map of where those borders lie. We aren’t just vaguely conscious. We also experience consciousness in relation to our bodies on a moment-to-moment basis. We’re aware of where our bodies are and how our limbs are positioned relative to each other, and we integrate this information nearly instantaneously as well.
4. Proprioception: In order to create a more complicated model of positioning and velocity within maps of self-environment axes, conscious individuals need to also have proprioception. Proprioception consists of senses that confer awareness of movement and position within an environment. This is necessary for higher order self-to-environment dyadic interactions, for individuals to develop more elegant methodologies of navigating their environments. Higher orders of proprioception confer the ability for the individual to sense their position and velocity and map and anticipate changes in the self-to-environment relationship.
5. Predictive systems: While we added internal decision calculus as a prerequisite for simple forms of consciousness, we can add to this a system for anticipating and predicting future changes, creating even more complicated systems for informing the internal decision calculus. Predictive systems may grow out of refined combinations of the other features of consciousness, such as memory systems and feedback loops. Predictive systems can also introduce more complicated algorithms, including theory of mind, which models how knowledge is distributed among individual organisms so that differential predictions can be made about how other entities within the system will respond.
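The following, equally schematic listing layers these five higher-order features onto the minimal loop sketched earlier. Again, every name is hypothetical and illustrative; the code is a toy demonstration of how feedback, memory, interoception, proprioception, and prediction can interlock, not a model of consciousness.

# Schematic sketch (hypothetical names, toy logic only) of the five higher-order
# features layered onto the minimal loop above: internal feedback, memory,
# interoception, proprioception, and prediction.
class HigherOrderAgent:
    def __init__(self, size=20):
        self.size = size
        self.position = 0      # location relative to the bordered environment
        self.energy = 1.0      # interoception: a crude map of internal state
        self.velocity = 0      # proprioception: sensed own movement
        self.memory = []       # memory system: past (position, signal) pairs

    def sense(self, gradient):
        return gradient[self.position]

    def predict(self, gradient):
        """Predictive system: naively extrapolate the next signal from memory."""
        if len(self.memory) < 2:
            return self.sense(gradient)
        (_, s1), (_, s2) = self.memory[-2], self.memory[-1]
        return s2 + (s2 - s1)

    def step(self, gradient):
        signal = self.sense(gradient)
        self.memory.append((self.position, signal))

        # Internal feedback loop: rising signals are rewarding, falling ones are not,
        # and staying still slowly depletes the internal energy state.
        reward = signal - self.memory[-2][1] if len(self.memory) > 1 else 0.0
        self.energy = max(0.0, min(1.0, self.energy + reward - 0.03))

        # Decision informed by the prediction and by the interoceptive state.
        move = 1 if self.predict(gradient) > signal or self.energy < 0.5 else 0
        self.position = max(0, min(self.size - 1, self.position + move))
        self.velocity = move   # proprioception: track the movement just made

gradient = [x / 20 for x in range(20)]   # the same kind of environmental gradient as above
agent = HigherOrderAgent()
for _ in range(30):
    agent.step(gradient)
print("position:", agent.position, "energy:", round(agent.energy, 2), "velocity:", agent.velocity)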
These facets give rise to varying layers of complexity in the actualization of consciousness. There are mechanisms for increasing the complexity and elegance of several of these features to create higher-order and more complex degrees of consciousness. For example, the internal decision calculus can have varying degrees of complexity. In addition, internal feedback loops, memory systems, interoception, proprioception, and predictive systems can also form more complex systems to create very elegant maps of ourselves and our environments. We should consider that there could be several ways that intellect and intelligence can be sharpened and expanded by developing various forms of expanded and optimized neurological pathways for these abilities.
It should be noted that we have defined separate components of a conscious mind, but each does not need to correspond to a specific anatomical pathway within the brain. Some of the individual brain systems, such as the somatosensory cortex—the part of the brain that maps and represents the body—would seem likely to correlate with processes such as interoception. However, there may be several layered and tightly integrated parts of the brain that connect and execute several of these processes simultaneously. So, there need not be a one-to-one correlation between individual systems within the brain and the individual components of the process of consciousness that we have delineated.

3. Discussion

The parameters that we have listed constitute a mechanistic explanation for the qualitative experience of consciousness. Each parameter is rooted in a neurological mechanism that is well understood and well documented. Some of these parameters, however, are underestimated in the existing literature of the artificial intelligence academic community, underscoring that there is a gap between the underpinnings of biological, qualitative consciousness and computer systems that simulate consciousness.
We believe that our framework, which includes 10 different facets of consciousness, is novel and has not been previously advanced by other writers. Although there are other discussions and definitions of biological consciousness, none incorporate the disparate somatic, sensory, and intellectual capacities of the brain and nervous system.
Baars and Edelman considered the physical mechanisms of consciousness, but did not specifically delineate the neurological pathways that we have listed. (Baars and Edelman, 2012) Changeux considered molecular mechanisms in murine models of consciousness, although this analysis was focused more on understanding mechanisms at the neurotransmitter level. (Changeux, 2007) Recently, Damasio and Damasio advanced a model of consciousness that acknowledges the importance of proprioception, but this model did not incorporate the remaining features that we have included. (Damasio and Damasio, 2022)
At present, the question of whether a physical model of the mind or a dualistic model is the correct depiction of the qualitative nature of consciousness is still unresolved. (Velmans, 2009; Rickabaugh and Moreland, 2023) One of the advantages of our model is that each integrated facet of consciousness that we outline can be recognized as a physically real process. A disadvantage of this framework is that the processes that we have outlined are likely inter-related. As a result, further studies seeking to prove that our definitions are each mappable in the brain may be complex. (Jabakhanji et al., 2022)
It's important to recognize the inter-relation of each of the facets of consciousness that we have described. Each of the attributes of consciousness that we have mapped out can interact within neurological systems, with itself, and with the other attributes. This may give rise to some of the epiphenomenal manifestations of consciousness and emergent qualitative experiences described by some authors. (Halligan and Oakley, 2021; Feinberg and Mallatt, 2020) Complex epiphenomena may arise as an outgrowth of how we take these components of consciousness that we have described and combine and integrate them into a concept of the single, monolithic whole of the individual.
Our framework tends to support the view that consciousness is composed of physically real processes. Is a worm conscious? We can work out almost every detail of the ways that a worm thinks about, navigates, and reacts to its surroundings. (Birch, 2022) Is a mouse conscious? It’s likely that we could look closely into the brain of a mouse and find more rudimentary mirrors for almost every feature of consciousness that we find in the human brain. (Yao et al., 2023; Andrews, 2024) As the orders of magnitude in complexity increase, we will likely come to see consciousness as a spectrum and understand that we are merely manifesting a very convoluted, complex, higher form of it. This would dissolve the hard problem of consciousness into a very integrated choreography of several, more basic models of the mind that exist elsewhere in the animal kingdom.
Consciousness seems to be tied to activation of multiple separate areas of the brain. (Seth and Bayne, 2022; Revach and Salti, 2022) It’s likely that our qualitative experience of consciousness is tied to the way we experience multiple facets of consciousness simultaneously, or with close timing. It may be that the “qualia” of consciousness that we experience are a shorthand for the ten features of consciousness that we experience on a moment-to-moment basis.
Our framework also supports the proposed notion that there are important feedback loops in the human brain that are part of our qualitative experience of consciousness. (Graziano, 2022; Modolo et al., 2020) Within our model, there are several ways that feedback loops can arise explicitly in the definitions that we’ve laid out, and also implicitly in the interrelation of the components of consciousness. Our model also helps to navigate some of the somatosensory problems of consciousness (such as the rubber hand illusion), since our definitions more thoroughly consider not just the brain, but the entire organism as a conscious individual. (Bretas et al., 2020; Golaszewski et al., 2021)
It may be that the definitions that we’ve laid out still make it difficult to understand and explain what we experience as consciousness, and why it exists. We’ve described processes that are mechanistic and biologically based, but these are layered and looping processes that interact with each other. It may be that the epiphenomenal features of consciousness which transcend the definitions that we’ve laid out will never be understood by any individual human, due to the complexity of the brain. Arguably, there is no single individual who completely understands all the interactions that can occur even at the interface of a single synapse—the connection between neurons. The knowledge required to map out all the biological information exchange at a single synapse and integrate that with the many levels of hierarchy that form the brain and its signaling pathways may be too much information for our minds to hold. We may have to produce an artificial brain to be able to resolve all these processes at once.
Some of our definitions of processes involved in consciousness may still be difficult to bring to bear on this complex debate. To make these concepts clearer, let’s consider the nation of Canada as an analogy for some of the definitions that we have offered. Canada is a physically real entity, with physically real properties, but the nationality and national identity of Canada are conceptual constructs that define a shorthand for an identity beyond the physical properties alone. At the basal level, which we can analogize to the brain and nervous system, there are physical properties of Canada. These include resources, governmental buildings, stacks of paperwork, border crossing points, and other physical resources. But built upon these physical properties, we overlay the conceptual definition of the nationality of Canada. If you ask Canadians what nation they’re in, they’ll say that they’re in Canada instantaneously, without having to check any geographical landmarks or external cues. So, just as the brain is real and has physical properties, there is an experienced consciousness that emerges and works as a convenient shorthand for an identity built on all of these physically real phenomena. This is the difference between the brain and the conscious mind, although this is an imperfect analogy.
Just as there has never been a better time to study the human brain, there has likely never been a better time to engage in a conversation about what consciousness is. Our evolving maps of our own experiences of consciousness are shining a brighter light on what it means to be a human on a neuroanatomic level. Our evolving creations in the field of artificial intelligence are approaching and almost simulating the act of thinking, creating an opportunity to map the differences between human thinking and artificial thinking. As we understand better what our minds are and what they aren’t, we’ll have an even sharper, crystallized view of the emergent features of our minds, like consciousness.

References

[1]  Andrews, K. (2024). “All animals are conscious”: Shifting the null hypothesis in consciousness science. Mind & Language.
[2]  Baars, B. J., & Edelman, D. B. (2012). Consciousness, biology and quantum hypotheses. Physics of life reviews, 9(3), 285-294.
[3]  Bandettini, P. A. (2012). Twenty years of functional MRI: the science and the stories. Neuroimage, 62(2), 575-588.
[4]  Bierman, D. J. (2001). On the nature of anomalous phenomena: Another reality between the world of subjective consciousness and the objective world of physics. The physical nature of consciousness, 269-292.
[5]  Birch, J. (2022). The search for invertebrate consciousness. Noûs, 56(1), 133-153.
[6]  Bretas, R. V., Taoka, M., Suzuki, H., & Iriki, A. (2020). Secondary somatosensory cortex of primates: beyond body maps, toward conscious self-in-the-world maps. Experimental Brain Research, 238(2), 259-272.
[7]  Chalmers, D. J. (1995). Facing up to the problem of consciousness. Journal of Consciousness Studies, 2(3), 200-219.
[8]  Changeux, J. P. (2007). The molecular biology of consciousness. In Consciousness Transitions (pp. 123-160). Elsevier Science BV.
[9]  Cohen, M. A., & Dennett, D. C. (2011). Consciousness cannot be separated from function. Trends in Cognitive Sciences, 15(8), 358-364.
[10]  Damasio, A., & Damasio, H. (2022). Homeostatic feelings and the biology of consciousness. Brain, 145(7), 2231-2235.
[11]  Dennett, D. C. (1996). Facing backwards on the problem of consciousness. Journal of Consciousness Studies, 3, 4-6.
[12]  Edelman, G. M., Gally, J. A., & Baars, B. J. (2011). Biology of consciousness. Frontiers in psychology, 2, 4.
[13]  Feinberg, T. E., & Mallatt, J. (2020). Phenomenal consciousness and emergence: eliminating the explanatory gap. Frontiers in Psychology, 11, 1041.
[14]  Golaszewski, S., Frey, V., Thomschewski, A., Sebastianelli, L., Versace, V., Saltuari, L., ... & Nardone, R. (2021). Neural mechanisms underlying the Rubber Hand Illusion: A systematic review of related neurophysiological studies. Brain and behavior, 11(8), e02124.
[15]  Grandpierre, A., Chopra, D., Doraiswamy, P. M., Tanzi, R., & Kafatos, M. C. (2013). A multidisciplinary approach to mind and consciousness. NeuroQuantology, 11(4), 607-617.
[16]  Graziano, M. S. (2022). A conceptual framework for consciousness. Proceedings of the National Academy of Sciences, 119(18), e2116933119.
[17]  Halligan, P. W., & Oakley, D. A. (2021). Giving up on consciousness as the ghost in the machine. Frontiers in Psychology, 12, 571460.
[18]  Jabakhanji, R., Vigotsky, A. D., Bielefeld, J., Huang, L., Baliki, M. N., Iannetti, G., & Apkarian, A. V. (2022). Limits of decoding mental states with fMRI. Cortex, 149, 101-122.
[19]  Jackson, F. (2003). Mind and illusion. Royal Institute of Philosophy Supplements, 53, 251-271.
[20]  Lozovanu, E., & Lazariuc, C. (2021). The phenomenon of consciousness from an inter and multidisciplinary perspective. Journal of Social Sciences, (4), 16-24.
[21]  McLevey, J., Graham, A. V., McIlroy-Young, R., Browne, P., & Plaisance, K. S. (2018). Interdisciplinarity and insularity in the diffusion of knowledge: an analysis of disciplinary boundaries between philosophy of science and the sciences. Scientometrics, 117, 331-349.
[22]  Modolo, J., Hassan, M., Wendling, F., & Benquet, P. (2020). Decoding the circuitry of consciousness: From local microcircuits to brain-scale networks. Network Neuroscience, 4(2), 315-337.
[23]  Nagel, T. (2013). ‘Consciousness and objective reality'. In Minds and Bodies (pp. 217-222). Routledge.
[24]  Revach, D., & Salti, M. (2022). Consciousness as the temporal propagation of information. Frontiers in systems neuroscience, 16, 759683.
[25]  Rickabaugh, B., & Moreland, J. P. (2023). The Substance of Consciousness: A Comprehensive Defense of Contemporary Substance Dualism. John Wiley & Sons.
[26]  Seth, A. K., & Bayne, T. (2022). Theories of consciousness. Nature Reviews Neuroscience, 23(7), 439-452.
[27]  Velmans, M. (2009). How to define consciousness: And how not to define consciousness. Journal of consciousness studies, 16(5), 139-156.
[28]  Wang, K. S., Smith, D. V., & Delgado, M. R. (2016). Using fMRI to study reward processing in humans: past, present, and future. Journal of neurophysiology, 115(3), 1664-1678.
[29]  Yamada, K., Sakai, K., Akazawa, K., Yuen, S., & Nishimura, T. (2009). MR tractography: a review of its clinical applications. Magnetic resonance in medical sciences, 8(4), 165-174.
[30]  Yao, Z., van Velthoven, C. T., Kunst, M., Zhang, M., McMillen, D., Lee, C., ... & Zeng, H. (2023). A high-resolution transcriptomic and spatial atlas of cell types in the whole mouse brain. Nature, 624(7991), 317-332.