Information, Volume 3, Issue 2 (June 2012) – 5 articles, Pages 175–255

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
The World Within Wikipedia: An Ecology of Mind
by Andrew M. Olney, Rick Dale and Sidney K. D’Mello
Information 2012, 3(2), 229-255; https://doi.org/10.3390/info3020229 - 18 Jun 2012
Cited by 4 | Viewed by 5176
Abstract
Human beings inherit an informational culture transmitted through spoken and written language. A growing body of empirical work supports the mutual influence between language and categorization, suggesting that our cognitive-linguistic environment both reflects and shapes our understanding. By implication, artifacts that manifest this cognitive-linguistic environment, such as Wikipedia, should represent language structure and conceptual categorization in a way consistent with human behavior. We use this intuition to guide the construction of a computational cognitive model, situated in Wikipedia, that generates semantic association judgments. Our unsupervised model combines information at the language structure and conceptual categorization levels to achieve state-of-the-art correlation with human ratings on semantic association tasks including WordSimilarity-353, semantic feature production norms, word association, and false memory.
(This article belongs to the Special Issue Cognition and Communication)
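The core technique behind such corpus-based models, scoring word pairs by the similarity of their contexts, can be sketched in a few lines. The Python snippet below is a minimal, hypothetical illustration, not the authors' model: it builds co-occurrence vectors over a toy corpus and scores pairs by cosine similarity, the kind of setup against which benchmarks such as WordSimilarity-353 are typically evaluated.

```python
from collections import Counter, defaultdict
from math import sqrt

def cooccurrence_vectors(sentences, window=2):
    """Build sparse co-occurrence vectors: vectors[w][c] counts how often
    context word c appears within `window` tokens of target word w."""
    vectors = defaultdict(Counter)
    for tokens in sentences:
        for i, w in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vectors[w][tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

# Toy stand-in for a Wikipedia-scale corpus (invented example sentences).
corpus = [
    "the king ruled the old kingdom".split(),
    "the queen ruled the old kingdom".split(),
    "the cat chased the small mouse".split(),
]
vecs = cooccurrence_vectors(corpus)
print(cosine(vecs["king"], vecs["queen"]))  # high: identical contexts
print(cosine(vecs["king"], vecs["mouse"]))  # lower: few shared contexts
```

A real system would replace the toy corpus with Wikipedia text and typically apply weighting (e.g., PMI) and dimensionality reduction before comparing vectors; the paper's model additionally exploits Wikipedia's conceptual categorization, which this sketch omits.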
Book Review
Mark Burgin’s Theory of Information
by Joseph E. Brenner
Information 2012, 3(2), 224-228; https://doi.org/10.3390/info3020224 - 1 Jun 2012
Cited by 2 | Viewed by 7492
Abstract
A review is presented of a major, definitive source book on the foundations of information theory.
(This article belongs to the Section Information Theory and Methodology)
Article
Information and Physics
by Vlatko Vedral
Information 2012, 3(2), 219-223; https://doi.org/10.3390/info3020219 - 11 May 2012
Cited by 13 | Viewed by 8465
Abstract
In this paper I discuss the question: what comes first, physics or information? The two have had a long-standing, symbiotic relationship for almost a hundred years, out of which we have learnt a great deal. Information theory has enriched our interpretations of quantum physics and, at the same time, offered us deep insights into general relativity through the study of black hole thermodynamics. Whatever the outcome of this debate, I argue that physicists will be able to benefit from continuing to explore connections between the two.
(This article belongs to the Special Issue Information and Energy/Matter)
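The black hole thermodynamics the abstract alludes to centers on the Bekenstein-Hawking formula, stated here for context only: it ties a black hole's entropy (an informational quantity) to the area of its event horizon (a geometric one).

```latex
% Bekenstein-Hawking entropy: a black hole with horizon area A carries
% entropy proportional to A measured in units of the Planck area.
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar}
\;=\; k_B\, \frac{A}{4\, \ell_P^2},
\qquad \ell_P^2 = \frac{G \hbar}{c^3}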
Article
Physical Computation as Dynamics of Form that Glues Everything Together
by Gordana Dodig Crnkovic
Information 2012, 3(2), 204-218; https://doi.org/10.3390/info3020204 - 26 Apr 2012
Cited by 23 | Viewed by 9644
Abstract
A framework is proposed in which matter relates to energy as structure relates to process and as information relates to computation. In this scheme matter corresponds to structure, which corresponds to information; energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair (matter/energy, structure/process, information/computation) is analogous to the relationship between being and becoming, where being is the persistence of an existing structure while becoming is the emergence of a new structure through the process of interactions. This approach presents a unified view built on two fundamental ontological categories: information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks evolving through processes of natural computation helps to make more coherent models of nature, connecting the non-living and living worlds. It presents a suitable basis for incorporating current developments in the understanding of biological/cognitive/social systems as generated by the complexification of physicochemical processes through the self-organization of molecules into dynamic adaptive complex systems by morphogenesis, adaptation and learning, all of which are understood as information processing.
(This article belongs to the Special Issue Information: Its Different Modes and Its Relation to Meaning)
Article
Beyond Bayes: On the Need for a Unified and Jaynesian Definition of Probability and Information within Neuroscience
by Christopher D. Fiorillo
Information 2012, 3(2), 175-203; https://doi.org/10.3390/info3020175 - 20 Apr 2012
Cited by 21 | Viewed by 11736
Abstract
It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dispute about the definition of probabilities themselves. The "frequentist" view is that probabilities are (or can be) essentially equivalent to frequencies, and that they are therefore properties of a physical system, independent of any observer of the system. E. T. Jaynes developed the alternative "Bayesian" definition, in which probabilities are always conditional on a state of knowledge through the rules of logic, as expressed in the maximum entropy principle. In doing so, Jaynes and others provided the objective means for deriving probabilities, as well as a unified account of information and logic (knowledge and reason). However, the neuroscience literature virtually never specifies any definition of probability, nor does it acknowledge any dispute concerning the definition. Although there has recently been tremendous interest in Bayesian approaches to the brain, even in the Bayesian literature it is common to find probabilities that are purported to come directly and unconditionally from frequencies. As a result, scientists have mistakenly attributed their own information to the neural systems they study. Here I argue that the adoption of a strictly Jaynesian approach will prevent such errors and will provide us with the philosophical and mathematical framework that is needed to understand the general function of the brain. Accordingly, our challenge becomes the identification of the biophysical basis of Jaynesian information and logic. I begin to address this issue by suggesting how we might identify a probability distribution over states of one physical system (an "object") conditional only on the biophysical state of another physical system (an "observer"). The primary purpose in doing so is not to characterize information and inference in exquisite, quantitative detail, but to be as clear and precise as possible about what it means to perform inference and how the biophysics of the brain could achieve this goal.
(This article belongs to the Special Issue Information and Energy/Matter)
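For context, the maximum entropy principle the abstract invokes assigns, given only known constraints (for example, expected values F_k of functions f_k), the distribution that maximizes Shannon entropy subject to those constraints. The standard textbook result, not specific to this paper, is the exponential-family solution:

```latex
% Maximize Shannon entropy subject to known expectations:
%   maximize   H(p) = -\sum_i p_i \log p_i
%   subject to \sum_i p_i = 1, \quad \sum_i p_i f_k(x_i) = F_k
% Lagrange multipliers \lambda_k yield the exponential-family form:
p_i \;=\; \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big),
\qquad
Z(\lambda) \;=\; \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big)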
Figure 1. Are the information and probabilities used in neuroscience properties of the environment (an observed object), the neural system under investigation, or the scientist? The frequentist view is that probabilities are a property of a physical system (or object) that generates frequency distributions, and they exist independent of any observer of that system. The physical system could correspond to a neural system or anything else. Here I argue in favor of the view of Jaynes (and Laplace, shown at left) that probabilities are always conditional on the information of an observer about an object. I presume that the observer's information must be in a physical form inside the observer. There could be many observers, but in neuroscience, the observer of interest would usually correspond either to the scientist (who observes a neural entity as well as its environment from a "third-person" perspective), or to the neural entity under investigation (for example, a brain, neuron or ion channel, which observes its environment from its "first-person" perspective). The arrows indicate the typical direction of information flow. The distinction between "observer" and "object" is merely one of perspective, and does not imply any fundamental distinction between the qualities of the two.

Figure 2. The General Methodological Sequence of Jaynes.

Figure 3. Illustration of neural systems that are sometimes said to be "random, stochastic, noisy, or probabilistic." (a) Fixed and known inputs to these systems result in variable outputs. From top to bottom: in response to a certain concentration of neurotransmitter, or a specific membrane voltage, an ion channel can be open or closed; in response to arrival of an action potential, a presynaptic terminal may or may not release a vesicle of neurotransmitter; in response to a given amount of excitatory postsynaptic current, a neuron may or may not generate an action potential; in response to a given temperature, a rat may or may not remove its tail from a hotplate. According to contemporary physics, none of these systems are "random", "stochastic", etc. These terms purport to describe a physical system, but they only indicate our ignorance of the system. (b) An example of a typical model of an ion channel. The channel can exist in any one of multiple states. With a patch electrode, we can only discriminate open ("O") from closed ("C") states. The various closed states are "hidden" from our observation, but we can infer their existence through careful analysis. The other three cases illustrated in (a) are analogous, but because they are less microscopic in scale, their "hidden" states can be more readily observed.
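The multi-state channel model described in Figure 3(b) can be made concrete with a small simulation. The Python sketch below is a hypothetical three-state Markov chain with invented transition probabilities, not the paper's model: its two closed states are lumped into a single observable "C" label, mimicking what a patch electrode can and cannot distinguish.

```python
import random

# Hypothetical 3-state ion channel: two closed states (C1, C2) and one
# open state (O). Only "C" vs "O" is observable at the electrode; C1 and
# C2 are the "hidden" states the figure caption refers to.
STATES = ["C1", "C2", "O"]
P = {  # per-step transition probabilities (illustrative values; rows sum to 1)
    "C1": {"C1": 0.90, "C2": 0.10, "O": 0.00},
    "C2": {"C1": 0.05, "C2": 0.80, "O": 0.15},
    "O":  {"C1": 0.00, "C2": 0.20, "O": 0.80},
}

def step(state):
    """Draw the next state from the current state's transition row."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return state  # numerical safety net

def record(n_steps, state="C1"):
    """Return the observable trace: 'O' for open, 'C' for either closed state."""
    trace = []
    for _ in range(n_steps):
        state = step(state)
        trace.append("O" if state == "O" else "C")
    return "".join(trace)

print(record(60))  # e.g. "CCCCCOOOCC..."; dwell-time statistics of such
                   # traces are what allow inference of the hidden states
```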