Causal entropic forces
Background: optics :: Optic-flow
-
APS: 2013 Focus: Model Suggests Link between Intelligence and Entropy
-
A. D. Wissner-Gross and C. E. Freer (2013) Causal Entropic Forces
Stunning… has that feeling of 'could be an entire field unto itself'. 72 citations to date
{ Tweeted } … as if by magic
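A rough sketch, for my own reference, of the paper's central object: the causal entropic force F(X0) = T_c ∇_X S_c(X, τ) evaluated at X0, i.e. a push up the gradient of the entropy of causally reachable futures. The toy below is not the paper's algorithm; it approximates causal path entropy by the entropy of sampled trajectory endpoints for a particle in a 1D box (parameter names and values are mine), which is just enough to reproduce the qualitative "keep your futures open" behaviour: the force points away from the walls, toward the region with the most reachable futures.

```python
# Illustrative sketch in the spirit of Wissner-Gross & Freer (2013).
# NOT their algorithm: causal path entropy is approximated here by the
# Shannon entropy of trajectory *endpoints*, a crude proxy chosen only
# to keep the example short.
import numpy as np

rng = np.random.default_rng(0)

L = 1.0          # box size: the particle lives on [0, L]
TAU = 50         # time horizon (steps) over which futures are sampled
SIGMA = 0.02     # step size of the random-walk futures
N_PATHS = 2000   # sampled futures per evaluation point
N_BINS = 20      # histogram bins used for the entropy estimate
T_C = 1.0        # "causal path temperature" (sets the force scale)

def endpoint_entropy(x0):
    """Entropy (nats) of endpoint positions of random-walk futures from x0,
    with clamped walls at 0 and L -- a stand-in for causal path entropy."""
    x = np.full(N_PATHS, x0)
    for _ in range(TAU):
        x = x + rng.normal(0.0, SIGMA, size=N_PATHS)
        x = np.clip(x, 0.0, L)          # crude reflecting-ish boundary
    hist, _ = np.histogram(x, bins=N_BINS, range=(0.0, L), density=True)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def causal_entropic_force(x0, dx=0.02):
    """F ~ T_c * dS_c/dx, estimated by a central finite difference."""
    return T_C * (endpoint_entropy(x0 + dx) - endpoint_entropy(x0 - dx)) / (2 * dx)

for x0 in (0.1, 0.5, 0.9):
    print(f"x0={x0:.1f}  F≈{causal_entropic_force(x0):+.2f}")
# Expect F > 0 near the left wall, F < 0 near the right wall, and roughly
# zero (up to Monte Carlo noise) at the centre: the force pushes the
# particle toward the region with the most open futures.
```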
"“Information thermodynamics” is an emerging field focused on info theory+2nd thermodyn law…incl biochem adaptation"
- Prokopenko and Einav: "Information thermodynamics of near-equilibrium computation"
{ Tweeted }
-
5 results for 'vision' in citing articles:
- Automated search for new quantum experiments
- The anatomy of choice: dopamine and decision-making
- Self-organization in complex systems as decision making (2014)
The idea is advanced that self-organization in complex systems can be treated as decision making (as it is performed by humans) and, vice versa, decision making is nothing but a kind of self-organization in the decision maker nervous systems. A mathematical formulation is suggested based on the definition of probabilities of system states, whose particular cases characterize the probabilities of structures, patterns, scenarios, or prospects. In this general framework, it is shown that the mathematical structures of self-organization and of decision making are identical. This makes it clear how self-organization can be seen as an endogenous decision making process and, reciprocally, decision making occurs via an endogenous self-organization. The approach is illustrated by phase transitions in large statistical systems, crossovers in small statistical systems, evolutions and revolutions in social and biological systems, structural self-organization in dynamical systems, and by the probabilistic formulation of classical and behavioral decision theories. In all these cases, self-organization is described as the process of evaluating the probabilities of macroscopic states or prospects in the search for a state with the largest probability. The general way of deriving the probability measure for classical systems is the principle of minimal information, that is, the conditional entropy maximization under given constraints. Behavioral biases of decision makers can be characterized in the same way as analogous to quantum fluctuations in natural systems.
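Note to self: the "principle of minimal information" invoked at the end of that abstract is constrained entropy maximization. A generic (unconditional) statement of it, in my notation rather than the paper's:

```latex
% Entropy maximization under a normalization and an average-"cost" constraint.
% Generic notation; C_j is whatever quantity is constrained on average.
\begin{align*}
  \text{maximize}\quad   & S[p] = -\sum_j p_j \ln p_j \\
  \text{subject to}\quad & \sum_j p_j = 1, \qquad \sum_j p_j\, C_j = \bar{C}.
\end{align*}
% Introducing Lagrange multipliers and setting the variation to zero gives
% the familiar Gibbs-type measure
\begin{equation*}
  p_j = \frac{e^{-\beta C_j}}{\sum_k e^{-\beta C_k}},
\end{equation*}
% with \beta fixed by the average-cost constraint.
```

The abstract's "search for a state with the largest probability" is then just picking the j that maximizes p_j under this measure.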
-
Fascinating articles
In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of the system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.
- 17th Aug (via @biochemistries)
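Since transfer entropy is the central quantity in that abstract (which, as far as I can tell, is the Prokopenko–Einav paper quoted above), here is a minimal plug-in estimator for two binary time series with history length 1, just to pin the definition down. Variable names are mine; real analyses use longer histories and bias-corrected estimators.

```python
# Minimal plug-in estimator of transfer entropy T_{Y->X} for binary series,
# history length 1. Illustrative only.
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """T_{Y->X} = sum p(x1,x0,y0) * log2[ p(x1|x0,y0) / p(x1|x0) ], in bits."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))           # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                        # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * math.log2(p_cond_xy / p_cond_x)
    return te

# Toy example: x copies y with a one-step lag, so information flows y -> x.
random.seed(1)
y = [random.randint(0, 1) for _ in range(2000)]
x = [0] + y[:-1]
print(transfer_entropy(x, y))   # T(y->x): should be close to 1 bit
print(transfer_entropy(y, x))   # T(x->y): should be close to 0
```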
-
Acknowledged in new Science paper on expansion microscopy. http://syntheticneurobiology.org/PDFs/15.01.chen.FULL.pdf
-
Slides now available from my AGI 2014 keynote on the thermodynamics of artificial general intelligence. http://agi-conf.org/2014/schedule/
-
Featured in Complexity article on the most interesting new advances in #complexity research. http://onlinelibrary.wiley.com/doi/10.1002/cplx.21451/full (via @permutans)
-
"Daniel Kovach's new paper on information-theoretic approaches to artificial intelligence. www.scirp.org/journal/PaperInformation.aspx?PaperID=50204#.VCxttSldU7s”
-
Late 2014: The Computational Theory of Intelligence: Information Entropy
-
“Content without method leads to fantasy; method without content to empty sophistry.”
— Johann Wolfgang von Goethe (“Maxims and Reflections”, 1892)
“Perhaps the most important news of our day is that datasets — not algorithms — might be the key limiting factor to development of human-level artificial intelligence,” according to Alexander Wissner-Gross in a written response to the question posed by Edge: “What do you consider the most interesting recent scientific news?”
At the dawn of the field of artificial intelligence, two of its founders famously predicted that solving the problem of machine vision would take only a summer. We now know that they were off by half a century. Wissner-Gross began to ponder the question: "What took the AI revolution so long?" By reviewing the timing of the most publicized AI advances over the past 30 years, he found evidence suggesting a provocative explanation: perhaps many major AI breakthroughs have actually been constrained by the availability of high-quality training datasets, and not by algorithmic advances.
- "Featured in new DIS Magazine essay by architect Philippe Morel on ideology, computation, and architecture." -
- archived here
Philippe Morel, A. D. Wissner-Gross diagram. "Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we (A. D. Wissner-Gross and C. E. Freer) show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage." The map shows optimal intermediate trading node locations (small circles) for all pairs of 52 major securities exchanges (large circles).
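To make the "optimal intermediate trading node" idea concrete: if I read the paper right, in the symmetric special case the optimal coordination point for a pair of exchanges is simply their great-circle midpoint (in general it shifts toward one end). A quick sketch with approximate, purely illustrative coordinates:

```python
# Great-circle midpoint and light-propagation delays between two exchanges.
# Coordinates are approximate and only for illustration; the midpoint is the
# optimal node only in the paper's symmetric special case.
import math

def great_circle_midpoint(lat1, lon1, lat2, lon2):
    """Midpoint of the great-circle arc between two (lat, lon) points, in degrees."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    bx = math.cos(phi2) * math.cos(lam2 - lam1)
    by = math.cos(phi2) * math.sin(lam2 - lam1)
    phi_m = math.atan2(math.sin(phi1) + math.sin(phi2),
                       math.hypot(math.cos(phi1) + bx, by))
    lam_m = lam1 + math.atan2(by, math.cos(phi1) + bx)
    return math.degrees(phi_m), math.degrees(lam_m)

def light_delay_ms(lat1, lon1, lat2, lon2, radius_km=6371.0, c_km_s=299792.458):
    """One-way light propagation delay along the great circle, in milliseconds."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    d_sigma = math.acos(math.sin(phi1) * math.sin(phi2) +
                        math.cos(phi1) * math.cos(phi2) * math.cos(lam2 - lam1))
    return radius_km * d_sigma / c_km_s * 1000.0

nyc = (40.71, -74.01)      # New York (approx.)
london = (51.51, -0.13)    # London (approx.)
mid = great_circle_midpoint(*nyc, *london)
print("midpoint (lat, lon):", mid)
print("NYC <-> London delay:", round(light_delay_ms(*nyc, *london), 2), "ms")
print("midpoint -> NYC (= midpoint -> London):",
      round(light_delay_ms(*mid, *nyc), 2), "ms")
```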
-
"Cited in new theoretical analysis of the trade-off between cost, speed, and reliability when erasing a bit. http://arxiv.org/pdf/1410.1710.pdf (via)
-
An equation for intelligence. TED talk
-
Networking faster than light. Another TED talk
-
Self-Knowledge Dim-Out: Stress Impairs Metacognitive Accuracy
-
Last item: Cellular tagging as a neural network mechanism for behavioural tagging
Behavioural tagging is the transformation of a short-term memory, induced by a weak experience, into a long-term memory (LTM) due to the temporal association with a novel experience. The mechanism by which neuronal ensembles, each carrying a memory engram of one of the experiences, interact to achieve behavioural tagging is unknown. Here we show that retrieval of a LTM formed by behavioural tagging of a weak experience depends on the degree of overlap with the neuronal ensemble corresponding to a novel experience. The numbers of neurons activated by weak training in a novel object recognition (NOR) task and by a novel context exploration (NCE) task, denoted as overlapping neurons, increases in the hippocampal CA1 when behavioural tagging is successfully achieved. Optical silencing of an NCE-related ensemble suppresses NOR–LTM retrieval. Thus, a population of cells recruited by NOR is tagged and then preferentially incorporated into the memory trace for NCE to achieve behavioural tagging.