


Below is the unedited draft of the article that has been accepted for publication (© Cognitive Processing, 2009, V. 10. No 2. P. 105–111.)

BRAIN AND MIND OPERATIONAL ARCHITECTONICS AND MAN-MADE "MACHINE" CONSCIOUSNESS

Andrew A. Fingelkurts, Alexander A. Fingelkurts, Carlos F.H. Neves

BM-Science – Brain & Mind Technologies Research Centre, PO Box 77, FI-02601, Espoo, Finland

Abstract

Building a truly conscious robot requires that the robot's "brain" be capable of supporting phenomenal consciousness, as the human brain does. By exploring the temporal structure of information flow and inter-area interactions within the network of functional neuronal populations (through examination of topographic sharp transition processes in the scalp electroencephalogram (EEG) on the millisecond scale), the Operational Architectonics framework reveals and describes an EEG architecture which is analogous to the architecture of the phenomenal world. This suggests that the task of creating "machine" consciousness would require a machine implementation that can support the kind of hierarchical architecture found in the EEG.

Key words: EEG; neural assemblies; artificial intelligence; robot mind; cognition; phenomena; operation; synchrony; metastability; subjective experience; embodied approach; virtual reality; synthetic.

Introduction

For decades it has been believed that the approach to machine intelligence does not have to follow what human minds do. Therefore, as we stated in Fingelkurts and Fingelkurts (2004a), with all the advances in classical robotics and artificial intelligence, such machines are still fundamentally passive, have no self-understanding, and do not capture all of the information-processing features that are unified and integrated in the human brain and consciousness (Sloman and Chrisley 2005), whereas human beings are essentially consciously active and self-aware.

This is why some researchers believe that practical machine cognition has proven notoriously difficult to implement by traditional computational means (Haikonen 2007a). However, it appears that many researchers and engineers have already realized that machine cognition (and eventually consciousness) requires so-called biological "grounding" (Aleksander and Dunmall 2003; Sloman and Chrisley 2005; Clowes et al. 2007; Haikonen 2007a,b; Gamez 2008; Ziemke 2007, to mention just a few). This signifies that the importance of a more sophisticated architectural approach to artificial systems (robots) is beginning to be widely acknowledged. In this context the term "machine consciousness" has come to be used in many ways.
The increasing contemporary interest in this emerging field can be seen from the number of 'Machine Consciousness' symposia held in Cold Spring Harbor (2001), Skövde (2001), Memphis (2002), Birmingham (2003), Turin (2003), Antwerp (2004), Hatfield (2005), Agrigento (2005; for the conference contributions see Chella and Manzotti 2007), Bristol (2006), Lesvos (2006), Washington (2007) and San Luis, Brazil (2008). It seems to us, however, that in connection with artificial man-made systems, "consciousness" is usually understood as perception of the surrounding world and adaptable behavior guided by those surroundings. For example, Holland (2003) calls for a 'strongly embodied approach to machine consciousness', which he implements in the anthropomimetic robot CRONOS. This approach requires a robot whose physical instantiation (especially its means of movement) strongly resembles the means and modalities of movement available to human beings. However, human beings can perform relatively complex behaviours, such as driving home from work, shopping, or moving along a street, without being conscious of these actions (Revonsuo 2006; Gamez 2008). At the same time, patients with locked-in syndrome (who are almost completely paralyzed; Laureys et al. 2005) are just as conscious as healthy subjects (Koch and Tononi 2008). Therefore, at best such a robot could replicate conscious human behaviour without experiencing phenomenal states.

This behaviouristic interpretation of the notion of 'consciousness' is rather far from the definition agreed upon in cognitive neuroscience. There, the concept of consciousness refers to the world of subjective experiences (phenomena such as seeing, hearing, touching, feeling, embodiment, moving, and thinking) that happen to a person right now, and it is thus called phenomenal consciousness (Revonsuo 2006). An important practical consequence of this definition of consciousness is that a "conscious machine" should be seen as a man-made artificial system (e.g., a robot) that enjoys subjective phenomenal experiences, that is, a genuinely conscious machine (rather than one which merely shows outward signs or internal organizational features of consciousness)1. If we wish to give such a capacity to our robots2, we have to study and understand the architecture of the phenomenal world (and its biological constituents) of human beings.

There have been numerous attempts to describe the phenomenological properties of mind (for a review see Moran 2000). However, these were mostly purely philosophical attempts. William James was the first scientist to give a metaphorical but deep description of the structure of human consciousness, in his 'Stream of Thought' essay (James 1890). His main observation was that consciousness is dynamic: it continually moves from one relatively stable thought to another relatively stable thought. The most recent and most comprehensive analysis of the phenomenological architecture of consciousness has been done by the neuroscientist Antti Revonsuo in his stimulating book 'Inner Presence' (Revonsuo 2006). Even though a full account of the human phenomenal (virtual) world is beyond the scope of this paper (for an extensive reference list and detailed discussion see Revonsuo 2006), below we illustrate this world by describing its most important features, which are required for the instantiation of phenomenal consciousness.
It is worth mentioning that there are several other accounts of the phenomenal world (Nagel 1974; Block 1995; Stubenberg 1998; Dainton 2000; Zeki 2003). However, they do not contain a full hierarchical description of phenomenal experience from the first-person perspective, but rather concentrate on one or several important attributes of phenomenality (Nagel 1974; Block 1995), propose controversial microconsciousness entities (Zeki and Bartels 1999; for a critical discussion see Revonsuo 2006), or even deny any interesting phenomenology at all (Dennett 1991).

1 This distinction is well known as the dichotomy between Weak Artificial Consciousness and Strong Artificial Consciousness (Holland 2003), where Weak Artificial Consciousness deals with the design and construction of machines that simulate consciousness or the cognitive processes usually correlated with it, while Strong Artificial Consciousness aims to design a truly conscious machine. This separation between Weak and Strong Artificial Consciousness mirrors the separation between the 'easy' and the 'hard' problem of consciousness (Chalmers 1996). According to this distinction, the 'easy problem' of understanding consciousness refers to explaining the ability to discriminate, integrate information, report mental states, focus attention, etc., whereas the 'hard problem' requires answering the question of why subjective awareness of sensory information exists at all. Both mentioned dichotomies are strongly related to a third one: the dichotomy between 'access' and 'phenomenal' consciousness (Block 1995). In the framework of this distinction, access consciousness is defined as availability for use in reasoning and in rationally guiding speech, action and thought. In contrast, phenomenal consciousness is subjective experience.

2 Even though there are first attempts to create so-called 'synthetic phenomenology' (Gamez 2005, 2006; Stening et al. 2005; Chrisley and Parthemore 2007; Kiverstein 2007; Hesslow and Jirenhed 2007; Ikegami 2007; Haikonen 2007a,b), it is too early to speak about even the possibility of achieving genuinely conscious machines. Despite productive work, there is a strong awareness in the field of synthetic phenomenology that something crucial is still missing in the current implementations of autonomous systems (see Manzotti 2007; Ziemke 2007; Dreyfus 2007; Koch and Tononi 2008).

Architecture of the human phenomenal consciousness

The human phenomenal consciousness (i.e. the virtual world) is characterized by a complex hierarchical architecture (as described in Revonsuo 2006):

• phenomenal space: for any phenomenon to be present for a person it must manifest itself somewhere within his/her perceptual space (coordinate system);
• phenomenal features (qualities): the "stuff" that the experiences per se are made of;
• phenomenal patterns: carefully organized qualities. Patterns of phenomenal experience are self-presenting, and they are both the acts (operations) and the objects at the same time (Stubenberg 1998);
• phenomenal objects: complex phenomenal patterns that make up shapes, forms, and objects. They are characterized by coherent perceptual wholes (Gestalt windows) and by immediate meaningful wholes (semantic windows);
• phenomenal map: it provides an answer to the question of where the currently present place with the phenomenal "self" is located in relation to other places not currently present.
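For engineering-minded readers, the hierarchy above can be summarized as a nested data structure. The following Python sketch is purely illustrative: the class and field names are our own hypothetical shorthand for the levels just listed, not a formalization taken from Revonsuo (2006), and it records only the containment hierarchy, not the co-presence and discontinuity properties discussed next.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PhenomenalFeature:
        """An elemental quality: the 'stuff' experiences are made of."""
        kind: str       # hypothetical labels, e.g. 'color', 'pitch', 'pressure'
        value: float

    @dataclass
    class PhenomenalPattern:
        """Carefully organized qualities; both an act (operation) and an object."""
        features: List[PhenomenalFeature]

    @dataclass
    class PhenomenalObject:
        """Complex patterns forming coherent (Gestalt) and meaningful (semantic) wholes."""
        patterns: List[PhenomenalPattern]

    @dataclass
    class PhenomenalSpace:
        """The coordinate system in which any phenomenon must manifest;
        each object is tagged with a location in perceptual space."""
        objects: List[Tuple[PhenomenalObject, Tuple[float, float, float]]] = field(default_factory=list)

    @dataclass
    class PhenomenalMap:
        """Relates the currently present place of the 'self' to places not currently present."""
        here: str
        elsewhere: List[str] = field(default_factory=list)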
Moreover, every part of this hierarchical phenomenal world is present to every other part simultaneously, creating their spatial co-presence (temporal coincidence) in the same, unified phenomenal world. One important feature of this phenomenal world is that any phenomenal pattern, object or thought can suddenly become incarnated as some other pattern, object or thought, in such a way that the previous instant simply turns abruptly into the new one, thus creating phenomenal discontinuity (Revonsuo 2006; Fingelkurts and Fingelkurts 2001). Another important point is that each part of the human hierarchical phenomenal world is not monolithic; instead it has its own inherent fine structure (Fingelkurts and Fingelkurts 2003). And last, but not least, all these phenomenal patterns or objects are transparent to us (we do not experience them as representations of external objects or scenes; instead we just perceive those objects or scenes right there, outside the brain; for a detailed and accurate conceptualization see Revonsuo 2006; for an alternative approach see Manzotti 2006).

Operational Architectonics of brain-mind functioning

According to biological realism as a path to the study of consciousness, phenomenal consciousness is grounded in material carrier processes that take place in the brain3 (Pockett 2000; McFadden 2002; Koch 2004; Fingelkurts and Fingelkurts 2005; Revonsuo 2006; Haikonen 2007a). The important question is: which objective measure is the most useful for studying such brain processes? Treating both brain activity and consciousness as dynamic processes, we are able to observe certain of their features/parameters over time; and since the coherency of conscious states has been found to be achieved rapidly and effortlessly, a temporal resolution on the order of milliseconds is of special interest (Fingelkurts and Fingelkurts 2006). Here, the electroencephalogram (EEG)4 provides a satisfactory scale for accessing the temporal evolution of the brain activity associated with conscious performance (for a detailed discussion and argumentation see Nunez 2000; Freeman 2007a). If so, then another important question arises: what might be the EEG architecture which would reflect, or even instantiate, this kind of phenomenal world (considering that there should be 'well-defined' and 'well-detected' EEG phenomena)?

Even though we still do not have a good overview of all the features and mechanisms (or their interrelations) that characterize the EEG, which makes theory construction very difficult, the Operational Architectonics framework described below can already be useful for finding good solutions to engineering problems. We will use an informal mode of description (modeling aspects5 will not be elaborated here), hoping that the lack of technical detail will be seen as a welcome attempt at maintaining intelligibility for a broad audience. It is also worth mentioning that this paper does not attempt to propose and defend one particular type of architecture as 'right' and others as 'wrong'. The scientific study of consciousness is still at a very young stage, and it is therefore too soon to attempt to rule out any of the research directions currently pursued in this area (Metzinger 2003; Sloman and Chrisley 2005; Gamez 2008). However, the framework presented here gives engineers one more path to consider.

3 In other words, phenomenal consciousness is a higher level of biological organization in the brain (Revonsuo 2006).
4 Although it is often claimed that volume conduction is the main obstacle in interpreting EEG data, we have shown through modeling experiments that a proper EEG methodology reveals an EEG architecture which is sensitive to the morpho-functional organization of the cortex rather than to volume conduction and/or the reference electrode (for relevant details, we address the reader to Kaplan et al. 2005).

5 These are largely still to be devised.

The Operational Architectonics theory6 (Fingelkurts and Fingelkurts 2001, 2003, 2004b, 2005, 2006) offers a plausible framework which states that whenever any pattern of phenomenality (including reflective thought) is instantiated, there is a neurophysiological pattern (revealed directly by EEG) of the appropriate kind that corresponds to it (Fingelkurts and Fingelkurts 2001, 2005). These neurophysiological patterns (expressed as virtual operational modules) are brought into existence by the joint operations of many functional and transient neuronal assemblies in the brain7 (Fingelkurts and Fingelkurts 2005, 2006). The activity of neuronal assemblies is "hidden" in the complex nonstationary structure of the EEG (Fingelkurts and Fingelkurts 2001, 2005; Kaplan et al. 2005). Proper EEG analysis8 reveals an EEG architecture which is remarkably similar to the architecture of the phenomenal world (see the previous section):

• single neurons (highly distributed across the cortex) can quickly become associated (or dis-associated) by synchronizing their activity/operations, giving rise to transient neuronal assemblies (von der Malsburg 1999);
• transient neuronal assemblies maintain discrete elemental brain operations, some of which have a phenomenal/subjective ontology in addition to the neurophysiological one. At the EEG level such operations of the neuronal assemblies are reflected in quasi-stationary EEG periods/segments (~milliseconds) within different frequency ranges, registered from different brain locations (Fingelkurts and Fingelkurts 2001, 2003, 2005, 2008); a toy sketch illustrating such segmentation follows this list and its footnotes;
• temporal synchronization of different brain operations executed simultaneously by different local neuronal assemblies (operational synchrony9; a toy synchrony index is sketched at the end of this section) gives rise to a new level of brain abstractness: metastable brain states (Fingelkurts and Fingelkurts 2001, 2004b; for a practical measurement see Fingelkurts and Fingelkurts 2008). These metastable brain states, or functional Operational Modules (OM)10, as we name them, underlie the realization of complex brain macrooperations: cognitive percepts, phenomenal objects, and reflective thoughts within the operational space-time continuum (Fingelkurts and Fingelkurts 2004b, 2006). The key point here is that OMs have a more complex structure than the operations which constitute them. However, OMs carry less fine-grained information, since only the information essential for the emergent cognitive percept, phenomenal object, or reflective thought is preserved. Thus, in accordance with the information theory of Shannon (Shannon 1948), the operational synchrony process "abstracts out" the information carried by OMs, meaning that OMs are transparent to the original raw (neurophysiological) data and sensitive only to the spatial-temporal pattern of activation that is embodied in the set of involved neuronal assemblies (Fingelkurts and Fingelkurts 2003, 2005). So, the information that remains is only an abstraction of certain aspects of the original data, including physical (non-mental) processes in the brain (see also Freeman 2007a,b);
• a sequence of these metastable OMs may represent the stream of consciousness (for details see the second part of Fingelkurts and Fingelkurts 2001). The main idea is that the structure of the electrical brain field (EEG), the structure of cognition, and the phenomenal structure of consciousness all have the same organization: a succession of discrete and relatively stable periods (metastable OMs, cognitive acts, or thoughts, correspondingly) separated by rapid transitive processes (abrupt changes between OMs, cognitive acts, or thoughts, correspondingly);
• OMs (being themselves the result of synchronized operations going on in distributed and local brain structures) can be operationally synchronized with each other (on a new time scale), thus forming a more abstract and more complex OM which constitutes a new and more integrated phenomenal experience (Fingelkurts and Fingelkurts 2003). Each of the new OMs is not just a sum of simpler OMs. Rather, the more complex OM is most naturally a union of abstractions about simpler OMs. This implies that complex OMs reflect mostly phenomenological (and not nonconscious11) information. The described hierarchy enables the brain to build complex phenomenal patterns/objects from primitive ones, so that the semantic value of the complex representation is determined by, and dependent on, the semantic values of the primitives (Fingelkurts and Fingelkurts 2003, 2005). Since at the top level of abstractness (reflective consciousness) we no longer have direct access to the brain's neurophysiological processes12, this subjective (conscious) experience seems so strange and mysterious to us;
• the reverse process is also possible: a complex phenomenal pattern, object, or reflective thought (represented in the brain by a complex OM), guided by attention, can be decomposed into several simpler phenomenal parts (OMs), which in their turn may be further decomposed into even simpler ones (Fingelkurts and Fingelkurts 2003, 2005). The price for this decomposition is narrowly focused attention and, consequently, a focused reflective conscious state (Revonsuo 2006).

6 Even though this framework has many similarities with other theoretical conceptualizations, it is quite distant from them in its core principles (for a detailed comparative analysis, see Fingelkurts and Fingelkurts 2006). Additionally, in the context of the OA framework (and in contrast to other theories) there is a range of methodological tools which make it possible in practice to measure the postulated entities of the theory (Fingelkurts and Fingelkurts 2008).

7 These EEG phenomena are rarely exploited due to the lack of analytical tools and methodology. Special techniques (which take into consideration the inherent nature and structure of the EEG signal) are required for their detection (Fingelkurts and Fingelkurts 2001, 2008).

8 See Fingelkurts and Fingelkurts (2005).

9 Quantitatively this phenomenon is assessed through the measure of synchronization of EEG segments (structural synchrony) obtained from different brain locations (Fingelkurts and Fingelkurts 2001, 2005).

10 Each OM is a metastable spatial-temporal pattern of brain activity; it is so because the neuronal assemblies which constitute an OM have different operations/functions, and each does its own inherent "job" (thus expressing an autonomous tendency), while still, at the same time, being temporally entangled with each other (thus expressing coordinated activity) in order to execute a common complex operation or a complex cognitive act of a higher hierarchy (Fingelkurts and Fingelkurts 2004b, 2005). As has been proposed by Kelso (1995), metastability relates exactly to the phenomenon of a constant interplay between these autonomous and interdependent tendencies (see also Bressler and Kelso 2001).

11 Note that we use the term 'nonconsciousness' instead of 'unconsciousness'. We will not enter into the extensive debate about the difference between these terms, but they are quite distinct and should not be intermixed. In short, unconscious material is still present somewhere within the mental sphere, but is usually inaccessible to awareness. Only within the mental world does it make sense to speak of conscious or unconscious events or processes (Allen 1994). In contrast, nonconscious processes are nonmental in nature: they are simply not available to mental experience; they are physical or neurophysiological processes (Searle 1992).
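As flagged in the list above, the following minimal Python sketch illustrates the general idea of cutting one filtered EEG channel into quasi-stationary segments by detecting rapid transition periods. The window sizes, threshold rule, and function name are our own simplifications introduced for illustration; the actual Operational Architectonics methodology is described in Fingelkurts and Fingelkurts (2008).

    import numpy as np

    def segment_quasi_stationary(x, fs, win_ms=40, step_ms=10, thresh=2.0):
        """Toy change-point detector: marks boundaries where short-window
        amplitude variance jumps sharply (a stand-in for rapid transition
        periods; not the published OA segmentation procedure)."""
        win = int(fs * win_ms / 1000)
        step = int(fs * step_ms / 1000)
        # Short-window amplitude variability along the recording
        var = np.array([x[i:i + win].var() for i in range(0, len(x) - win, step)])
        # Relative change between adjacent windows
        change = np.abs(np.diff(var)) / (var[:-1] + 1e-12)
        # A putative segment boundary wherever the relative change is large
        return (np.where(change > thresh)[0] + 1) * step

    if __name__ == "__main__":
        fs = 500  # Hz, assumed sampling rate
        rng = np.random.default_rng(0)
        # Synthetic "EEG": noise whose amplitude abruptly grows after 1 s
        x = np.concatenate([rng.normal(0, 1, fs), rng.normal(0, 2.5, fs)])
        print(segment_quasi_stationary(x, fs))  # boundary detected near sample 500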
In this way the multidimensional and hierarchical Operational Architectonics framework provides plausible and productive foundations for the fine texture of phenomenal consciousness; it shows how bioelectrical brain dynamics can be integrated in order to tell a unified story about how the mind works. Of course, the contents of consciousness are not reflected in the so-called primary EEG characteristics (Revonsuo 2001). However, the way of looking at the EEG described here (Fingelkurts and Fingelkurts 2001, 2005) and the methods used for its analysis (Fingelkurts and Fingelkurts 2008) reveal a highly hierarchical organization of the EEG phenomenon, in which different levels of that hierarchy are strongly associated with particular features of phenomenality and cognition. We are now at the very beginning of studying this architecture in relation to different aspects of phenomenal consciousness; the results so far, however, are encouraging (Fingelkurts and Fingelkurts 2006). There is much more to be said about the additional details required for all of this to work, but space constraints rule that out here. Whether this framework can provide a sufficient underpinning for machine phenomenology remains to be shown, but we offer one more theoretical framework to the range of possible scenarios to consider.

12 The actual neurophysiological machinery is hidden from our awareness; it is transparent to us. In other words, we have access only to the content, but not the vehicle, of the phenomenal information (see also Revonsuo 2006; Haikonen 2007b).
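The operational (structural) synchrony of footnote 9 can be given an equally schematic rendering, as flagged in the list above: an index comparing how often segment boundaries from two channels coincide against the coincidence rate expected by chance. This is a hedged sketch under our own simplifying assumptions (surrogate boundaries drawn uniformly at random), not the published structural synchrony measure (Fingelkurts and Fingelkurts 2001, 2008).

    import numpy as np

    def structural_synchrony(bnds_a, bnds_b, n_samples, tol=10, n_surr=1000, seed=0):
        """Toy index: observed boundary coincidences (within `tol` samples)
        divided by the chance level estimated from random surrogate boundaries.
        Values well above 1 suggest operational coupling of the two channels."""
        def coincidences(a, b):
            return sum(bool(np.any(np.abs(b - t) <= tol)) for t in a)

        observed = coincidences(bnds_a, bnds_b)
        rng = np.random.default_rng(seed)
        chance = np.mean([
            coincidences(bnds_a, rng.integers(0, n_samples, size=len(bnds_b)))
            for _ in range(n_surr)
        ])
        return observed / (chance + 1e-12)

    # Boundaries falling at nearly the same moments yield an index well above 1
    a = np.array([100, 480, 900])
    b = np.array([105, 488, 910])
    print(structural_synchrony(a, b, n_samples=1000))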
Functional isomorphism between brain operational architectonics and the mind's phenomenal world

Even though almost any neural phenomenon in the brain could conceivably correlate with mental states and consciousness, only very particular phenomena could ever constitute them, that is, be the underlying mechanism that realizes them (for a discussion, see Revonsuo 2001). By analysing and considering our phenomenal world as a hierarchy of carefully organized self-presenting phenomenal patterns (which are both the objects and the acts/operations at the same time; see Stubenberg 1998; Revonsuo 2006; Manzotti 2006), produced by the activity of some physical system (the brain as a whole, or its parts), it is possible to reveal a functionally isomorphic13 hierarchical architecture in the electrical brain activity.

Functional isomorphism is 'visible' only at the level at which similarities between otherwise disparate realizations can be seen, and so it is at this level that we must look for laws ranging over them14. Unified/coherent phenomenal objects or thoughts can be presented in the internal subjective world by means of entities of distributed and operationally unified neuronal brain assemblies (Fingelkurts and Fingelkurts 2001, 2004b, 2005). The basic idea is that synchronized brain operations produced by local neuronal assemblies are linked to the required phenomenal operation (or operational act), and only those local activities evoked by common objects, scenes, or tasks (either physical or phenomenal) are bound together in dynamically formed metastable OMs.

The notion of operation, then, is fundamental and central in bridging the gap between brain and phenomenal consciousness: it is precisely by means of this notion that it is possible to identify what belongs at the same time to the phenomenal conscious level and to the neurophysiological level of brain activity organization, and what mediates between them (Fingelkurts and Fingelkurts 2005). "Operation" stands for a process, or series of acts/functions, that is limited in time, and can be broadly defined as the state of being in effect. This is so regardless of whether the process is conceptual/phenomenal or physical/biological. In fact, everything that can be represented by a process is an operation. This provides a basis for discussing the relative complexity of operations, where a more complex operation/operational act subsumes the simpler ones (Fingelkurts and Fingelkurts 2003, 2005). Understanding the operation as a process, and considering its combinatorial nature, seems especially well suited for describing and studying the mechanisms by which the internal subjective world can be presented by means of entities of distributed neuronal brain assemblies15.

13 Isomorphism is generally defined as a mapping of one entity into another having the same elemental structure, whereby the behaviors of the two entities are identically describable (Warfield 1977). A functional isomorphism, on the other hand, requires functional connectivity between its component entities (Lehar 2003). It is an extension of Müller's psychophysical postulate (Müller 1896) and Chalmers' principle of structural coherence (Chalmers 1995). Therefore, two systems that are functionally isomorphic are, in virtue of this fact, different realizations of the same kind (Shapiro 2000). In other words, two functionally isomorphic but different systems bring about the same function that defines the kind. But if two particulars differ only in properties that do not in any way affect the achievement of the defining capacity of a kind, then there is no reason to say that they are tokens of different realizations of the kind (Shapiro 2000).

14 Such an approach coincides with the positive methodology suggested by Chalmers (1995) for facing up to the hard problem of consciousness. The main points of this methodology are: a) pay careful attention both to physical processing and to phenomenology; b) find systematic regularities between the two; c) work down to the simpler principles which explain these regularities in turn; and d) ultimately explain the connection in terms of a simple set of fundamental laws.
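To make footnote 13's definition concrete in the simplest possible terms, the toy Python check below treats two labeled hierarchies as isomorphic when an element-to-element mapping preserving composition exists, which for ordered nested tuples reduces to having the same nesting shape. It deliberately ignores the functional-connectivity requirement that distinguishes functional isomorphism proper, and the level names are merely our informal glosses of the two architectures compared in this section.

    def same_structure(a, b):
        """Leaves can always be mapped onto one another; internal nodes must
        match pairwise, so the two nesting shapes have to coincide."""
        if isinstance(a, tuple) != isinstance(b, tuple):
            return False
        if not isinstance(a, tuple):
            return True
        return len(a) == len(b) and all(same_structure(x, y) for x, y in zip(a, b))

    phenomenal = ("features", ("patterns", ("objects", "map")))
    operational = ("operations", ("OMs", ("complex OMs", "OM sequence")))
    print(same_structure(phenomenal, operational))  # True: same elemental structure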
The description of the two architectures, of phenomenal consciousness (second section) and of operational brain functioning (third section), clearly shows that they are isomorphic to each other through this shared notion of operation16. Elsewhere we have argued that such isomorphism signifies an epistemic correspondence between the two described architectures (Fingelkurts and Fingelkurts 2001). Consequently, this gives us reason to speculate that if the phenomenological architecture of consciousness and the brain's operational architectonics are so remarkably correspondent to each other, then they might also have ontological identity. If this holds true, then we may make a further claim: by reproducing the one architecture, we can observe the self-emergence of the other17. In contrast to attempts to rebuild the physical structure of the brain (Markram 2006), which is extremely difficult if not impossible (Koch and Tononi 2008), we suggest trying to model the proper level of brain functional organization, namely its Operational Architectonics. In this way, the road is open to systematic and detailed research on the physical (neurophysiological) basis of the phenomenal world, and to the engineering of the kind of hierarchical architecture found in the EEG, which is isomorphic to such a world and may constitute it.

Conclusion

If consciousness is a biological phenomenon in the brain (Revonsuo 2006), realized by highly organized macro-level electrophysiological (EEG) phenomena (metastable OMs), which are brought into existence by the coordinated electrical activity (operational synchrony) of many neuronal populations dispersed throughout the brain (Fingelkurts and Fingelkurts 2001, 2003, 2004b, 2005, 2006, 2008), then the problem of producing man-made "machine" consciousness is the problem of duplicating the whole level of architecture (with its inherent rules and mechanisms) found in the EEG, which can constitute this phenomenal level (Revonsuo 2006).

15 An alternative approach to consciousness, based on a process-oriented ontology (Whitehead 1929/1978; Griffin 1998), has been suggested by Manzotti (2006). According to Manzotti, consciousness and physical reality can be conceived as two perspectives on the same processes. In this case there is no problem of re-presentation, since the experience and the occurrence of the world are identical. More precisely, phenomenal experiences do not represent reality but are reality (for details see Manzotti 2006).

16 Both the material neurophysiological organization that characterizes the brain and the informational order that characterizes the mind necessarily involve such events as operations at their cores (Fingelkurts and Fingelkurts 2003, 2005).

17 This may seem an extreme point of view, but it is one that is gaining some currency in recent discussions of brain-mind interaction (see, for example, Pockett 2000; McFadden 2002; Revonsuo 2006; Freeman 2007a,b).

Acknowledgement

This theoretical work was supported by BM-Science.

References:

Aleksander I, Dunmall B (2003) Axioms and tests for the presence of minimal consciousness in agents. J Conscious Stud 10:7–18
Allen JA (1994) Delineating conscious and unconscious processes: commentary on Baars on contrastive analysis. Psyche 1(9). URL: http://psyche.cs.monash.edu.an/v2/psyche-1-9-allen.html
Block N (1995) On a confusion about a function of consciousness. Behav Brain Sci 18:227–287
Bressler SL, Kelso JAS (2001) Cortical coordination dynamics and cognition. Trends Cogn Sci 5:26–36
Chalmers DJ (1995) Facing up to the problem of consciousness. J Conscious Stud 2:200–219
Chalmers DJ (1996) The conscious mind: In search of a fundamental theory. Oxford University Press, New York
Chella A, Manzotti R (2007) Artificial consciousness. Imprint Academic, Exeter, UK
Chrisley R, Parthemore J (2007) Synthetic phenomenology: exploiting embodiment to specify the non-conceptual content of visual experience. J Conscious Stud 14:44–58
Clowes R, Torrance S, Chrisley R (2007) Machine consciousness: embodiment and imagination. J Conscious Stud 14:7–14
Dainton B (2000) Stream of consciousness. Routledge, London
Dennett DC (1991) Consciousness explained. Little Brown, Boston
Dreyfus HL (2007) Why Heideggerian AI failed and how fixing it would require making it more Heideggerian. Philos Psychol 20:247–268
Fingelkurts AnA, Fingelkurts AlA (2001) Operational architectonics of the human brain biopotential field: Towards solving the mind-brain problem. Brain Mind 2:261–296
Fingelkurts AnA, Fingelkurts AlA (2003) Operational architectonics of perception and cognition: A principle of self-organized metastable brain states. VI Parmenides Workshop "Perception and Thinking" of the Institute of Medical Psychology, University of Munich. Elba, Italy, April 5–10, 2003
Fingelkurts AnA, Fingelkurts AlA (2004a) The Operational Architectonics concept of brain and mind functioning. Congress on Modeling Mental Processes and Disorders. Kusadasi, Turkey, May 24–29, 2004
Fingelkurts AnA, Fingelkurts AlA (2004b) Making complexity simpler: multivariability and metastability in the brain. Int J Neurosci 114:843–862
Fingelkurts AnA, Fingelkurts AlA (2005) Mapping of the brain operational architectonics. In: Chen FJ (ed) Focus on brain mapping research, chapter 2. Nova Science Publishers Inc, New York, pp 59–98
Fingelkurts AnA, Fingelkurts AlA (2006) Timing in cognition and EEG brain dynamics: discreteness versus continuity. Cogn Process 7:135–162
Fingelkurts AnA, Fingelkurts AlA (2008) Brain-mind Operational Architectonics imaging: technical and methodological aspects. Open Neuroimag J 2:73–93
Freeman WJ (2007a) Indirect biological measures of consciousness from field studies of brains as dynamical systems. Neural Netw 20:1021–1031
Freeman WJ (2007b) Definitions of state variables and state space for brain-computer interface. Part 1. Multiple hierarchical levels of brain function. Cogn Neurodyn 1:3–14
Gamez D (2005) An ordinal probability scale for synthetic phenomenology. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB05 symposium on next generation approaches to machine consciousness. Hatfield, UK, pp 85–94
Gamez D (2006) The XML approach to synthetic phenomenology. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB06 symposium on integrative approaches to machine consciousness. Bristol, UK, pp 128–135
Gamez D (2008) Progress in machine consciousness. Conscious Cogn 17:887–910
Griffin DR (1998) Unsnarling the world-knot: Consciousness, freedom, and the mind-body problem. University of California Press, Berkeley
Haikonen PO (2007a) Robot brains: Circuits and systems for conscious machines. Wiley, UK
Haikonen PO (2007b) Essential issues of conscious machines. J Conscious Stud 14:72–84
Hesslow G, Jirenhed D-A (2007) The inner world of a simple robot. J Conscious Stud 14:85–96
Holland O (2003) Editorial introduction. J Conscious Stud 12:1–6
Ikegami T (2007) Simulating active perception and mental imagery with embodied chaotic itinerancy. J Conscious Stud 14:111–125
James W (1890) The principles of psychology. Vol. I. Dover, New York
Kaplan AYa, Fingelkurts AnA, Fingelkurts AlA, Borisov SV, Darkhovsky BS (2005) Nonstationary nature of the brain activity as revealed by EEG/MEG: Methodological, practical and conceptual challenges. Signal Process 85:2190–2212
Kelso JAS (1995) Dynamic patterns: The self-organization of brain and behavior. MIT Press, Cambridge
Kiverstein J (2007) Could a robot have a subjective point of view? J Conscious Stud 14:127–139
Koch C (2004) The quest for consciousness: A neurobiological approach. Roberts & Company Publishers, Englewood (Col)
Koch C, Tononi G (2008) Can machines be conscious? IEEE Spectr 6:47–51
Laureys S, Pellas F, Van Eeckhout F, Ghorbel S, Schnakers C, et al (2005) The locked-in syndrome: what is it like to be conscious but paralyzed and voiceless? In: Laureys S (ed) Progress in brain research, Vol. 150. Elsevier, Amsterdam, pp 495–511
Lehar S (2003) Gestalt isomorphism and the primacy of subjective conscious experience: a gestalt bubble model. Behav Brain Sci 26:375–408
Manzotti R (2006) A process oriented view of conscious perception. J Conscious Stud 13:7–41
Manzotti R (2007) Towards artificial consciousness. APA Newsl Philos Comput 07:12–15
Markram H (2006) The blue brain project. Nat Rev Neurosci 7:153–160
McFadden J (2002) Synchronous firing and its influence on the brain's electromagnetic field. Evidence for an electromagnetic field theory of consciousness. J Conscious Stud 9:23–50
Metzinger T (2003) Being no one. MIT Press, Cambridge
Moran D (2000) Introduction to phenomenology. Routledge, London
Müller GE (1896) Zur Psychophysik der Gesichtsempfindungen. Z Psychol 10:1–82
Nagel T (1974) What is it like to be a bat? Philos Rev 83:435–450
Nunez PL (2000) Toward a quantitative description of large-scale neocortical dynamic function and EEG. Behav Brain Sci 23:371–398
Pockett S (2000) The nature of consciousness: A hypothesis. Writers Club Press, Lincoln
Revonsuo A (2001) Can functional brain imaging discover consciousness in the brain? J Conscious Stud 8:3–23
Revonsuo A (2006) Inner presence: Consciousness as a biological phenomenon. MIT Press, Cambridge
Searle JR (1992) The rediscovery of the mind. MIT Press, Cambridge
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423, 623–656
Shapiro LA (2000) Multiple realizations. J Phil 97:635–654
Sloman A, Chrisley RL (2005) More things than are dreamt of in your biology: Information-processing in biologically inspired robots. Cogn Syst Res 6:145–174
Stening J, Jacobsson H, Ziemke T (2005) Imagination and abstraction of sensorimotor flow: Towards a robot model. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB05 symposium on next generation approaches to machine consciousness. Hatfield, UK, pp 50–58
Stubenberg L (1998) Consciousness and qualia. John Benjamins, Amsterdam
von der Malsburg C (1999) The what and why of binding: the modeler's perspective. Neuron 24:95–104
Warfield JN (1977) Crossing theory and hierarchy mapping. IEEE Trans Syst Man Cybern 7:505–523
Whitehead AN (1929/1978) Process and reality. Free Press, London
Zeki S (2003) The disunity of consciousness. Trends Cogn Sci 7:214–218
Zeki S, Bartels A (1999) Towards a theory of visual consciousness. Conscious Cogn 8:225–259
Ziemke T (2007) What's life got to do with it? In: Chella A, Manzotti R (eds) Artificial consciousness. Imprint Academic, Exeter, UK, pp 48–66