Abstract
The research and philosophical communities currently lack a clear way to quantify, measure, and characterize the degree of consciousness in a mind or AI entity. This paper addresses that gap by providing a numerical measure of consciousness. Implicit in our approach is a definition of consciousness itself. Underlying this is our assumption that consciousness is not a single unified characteristic but a constellation of features, mental abilities, and thought patterns. Although some people may experience their own consciousness as a unified whole, we assume that consciousness is a multi-dimensional set of attributes, each of which can be present to differing degrees in a given mind. These attributes can be measured and therefore the degree of consciousness can be quantified with a number, much as IQ attempts to quantify human intelligence.
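To make the kind of quantification described above concrete, here is a minimal sketch, in Python, of reducing a multi-dimensional attribute profile to a single score. It is our illustration, not the paper's instrument: the attribute names, the function name consciousness_index, and the use of a 0–3 degree scale (with 3 meaning human-level, as in the notes below) are assumptions made for the example; the actual assessment uses a 45-question questionnaire.

```python
# Minimal sketch (our illustration, not the paper's instrument): treat
# consciousness as a set of attributes, each present to a differing degree,
# and reduce the profile to one number on a 0-100 scale.

from statistics import mean

def consciousness_index(profile: dict[str, int], scale_max: int = 3) -> float:
    """Map per-attribute degrees (0..scale_max) to a 0-100 index."""
    if not profile:
        raise ValueError("profile must contain at least one attribute")
    for name, degree in profile.items():
        if not 0 <= degree <= scale_max:
            raise ValueError(f"degree for {name!r} must lie in 0..{scale_max}")
    return 100 * mean(profile.values()) / scale_max

# Two hypothetical minds; the attribute names are illustrative only.
narrow_assistant = {"self_model": 0, "episodic_memory": 1, "theory_of_mind": 0}
human_baseline = {"self_model": 3, "episodic_memory": 3, "theory_of_mind": 3}
print(round(consciousness_index(narrow_assistant), 1))  # 11.1
print(round(consciousness_index(human_baseline), 1))    # 100.0
```

The normalization that makes an all-human-level profile come out at exactly 100 is the same idea as the multiplier discussed in Note 2 below.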
Notes
1. To make this (subjective) definition a clearer and more discrete target for future discussions of the nature of consciousness, let us name the present methodology and implied definition of consciousness “Porter's Definition and Assessment of AI Consciousness” so as to distinguish it from other definitions.
2. This multiplier was chosen so that an answer of “3 – HUMAN” for all 45 questions will yield a score of 100. If questions are added or deleted, the multiplier will need to be adjusted accordingly (see the sketch following these notes).
3. Perhaps being able to form a friendship with a thinking entity is a useful indicator of whether that entity is conscious. We suggest that it would be possible to form a reasonably recognizable friendship with any AI entity able to score highly on this assessment. For example, if the features listed here could be added to Siri, then there is no question that Siri would appear to be more conscious than she does now.
4. It is for this reason that we are presenting our implicit definition of consciousness as one possible standard definition among many, rather than suggesting it is more valid or correct than competing definitions of consciousness.
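For concreteness, the arithmetic behind Note 2 can be spelled out: with 45 questions and a top answer of “3 – HUMAN”, the maximum raw total is 135, so the multiplier is 100/135 ≈ 0.7407. The snippet below is our restatement of that normalization, not code from the paper; the names score_multiplier and TOP_ANSWER are ours, and the 50-question case is a hypothetical to show the required adjustment.

```python
# Restatement of the normalization in Note 2 (our sketch, not the paper's
# code): the multiplier is chosen so that answering "3 - HUMAN" on every
# question yields a score of exactly 100, and it must be recomputed whenever
# questions are added or deleted.

TOP_ANSWER = 3  # the "3 - HUMAN" ceiling for a single question

def score_multiplier(num_questions: int) -> float:
    """Multiplier that maps an all-HUMAN answer sheet onto a score of 100."""
    return 100 / (num_questions * TOP_ANSWER)

m = score_multiplier(45)                 # 100 / 135 = 20/27
print(round(m, 4))                       # 0.7407
print(round(m * 45 * TOP_ANSWER))        # 100, the all-HUMAN ceiling
print(round(score_multiplier(50), 4))    # 0.6667 if five questions were added
```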
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Porter III, H.H. (2016). A Methodology for the Assessment of AI Consciousness. In: Steunebrink, B., Wang, P., Goertzel, B. (eds) Artificial General Intelligence. AGI 2016. Lecture Notes in Computer Science, vol. 9782. Springer, Cham. https://doi.org/10.1007/978-3-319-41649-6_31
DOI: https://doi.org/10.1007/978-3-319-41649-6_31
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-41648-9
Online ISBN: 978-3-319-41649-6
eBook Packages: Computer Science, Computer Science (R0)