- published: 12 Dec 2013
- views: 1348
Ray Solomonoff (July 25, 1926 – December 7, 2009) was the inventor of algorithmic probability and of its associated general theory of inductive inference (also known as universal inductive inference), and a founder of algorithmic information theory. He was an originator of the branch of artificial intelligence based on machine learning, prediction, and probability. He circulated the first report on non-semantic machine learning in 1956.
Solomonoff first described algorithmic probability in 1960, publishing the theorem that launched Kolmogorov complexity and algorithmic information theory. He first presented these results at a conference at Caltech in 1960 and in a February 1960 report, "A Preliminary Report on a General Theory of Inductive Inference." He developed these ideas more fully in his 1964 publications, "A Formal Theory of Inductive Inference," Part I and Part II.
Algorithmic probability is a mathematically formalized combination of Occam's razor and the principle of multiple explanations. It is a machine-independent method of assigning a probability value to each hypothesis (algorithm/program) that explains a given observation, with the simplest hypothesis (the shortest program) having the highest probability and increasingly complex hypotheses receiving increasingly small probabilities.
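The weighting described above can be illustrated with a toy sketch. This is not Solomonoff's actual universal prior (which is incomputable); it is a minimal illustration in which the hypothesis names and their description lengths are assumed for the example, with each hypothesis weighted by 2 raised to the negative of its program length and the weights normalized:

```python
# Toy illustration of algorithmic-probability-style weighting.
# The hypotheses and their description lengths (in characters) are
# hypothetical stand-ins, not output of a real universal machine.
hypotheses = {
    "repeat 'ab' 16 times": 11,      # short program -> high prior
    "literal 32-char string": 32,    # long program -> low prior
}

# Weight each hypothesis by 2^(-length), then normalize so the
# values form a probability distribution over these hypotheses.
weights = {h: 2.0 ** -length for h, length in hypotheses.items()}
total = sum(weights.values())
priors = {h: w / total for h, w in weights.items()}

for h, p in priors.items():
    print(f"{h}: {p:.8f}")
```

As expected, the shorter description dominates: an 11-character program outweighs a 32-character one by a factor of 2^21.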
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of the shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as descriptive complexity, Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity. It is named after Andrey Kolmogorov, who first published on the subject in 1963.
For example, consider two strings of 32 lowercase letters and digits: the first consisting of "ab" repeated 16 times, the second an arbitrary-looking sequence with no evident pattern. The first string has a short English-language description, namely "ab 16 times", which consists of 11 characters. The second has no obvious simple description (using the same character set) other than writing down the string itself, which takes 32 characters.
More formally, the complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (the complexity depends on the choice of description language only up to an additive constant). It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself. Strings like the abab example above, whose Kolmogorov complexity is small relative to their length, are not considered complex.
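True Kolmogorov complexity is incomputable, but a general-purpose compressor gives a computable upper bound on description length, which is enough to see the contrast above. A minimal sketch using Python's zlib (the arbitrary-looking string is just an example chosen for illustration):

```python
import zlib

# Compressed size as a crude, computable proxy for Kolmogorov complexity.
# A compressor only gives an upper bound on the true (incomputable) value.
def approx_complexity(s: str) -> int:
    return len(zlib.compress(s.encode("ascii")))

regular = "ab" * 16                             # "abab...ab", 32 chars
arbitrary = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"  # no evident pattern, 32 chars

print(approx_complexity(regular), approx_complexity(arbitrary))
```

The regular string compresses to well under its raw 32-character length, while the patternless string does not compress at all (zlib's framing overhead even makes it slightly longer), mirroring the short-description/long-description distinction in the text.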
Paper: http://agi-conf.org/2010/wp-content/uploads/2009/06/paper_41.pdf This paper is about Algorithmic Probability (ALP) and Heuristic Programming and how they can be combined to achieve AGI. It is an update of a 2003 report describing a system of this kind (Sol03). We first describe ALP, giving the most common implementation of it, then the features of ALP relevant to its application to AGI. They are: Completeness, Incomputability, Subjectivity and Diversity. We then show how these features enable us to create a very general, very intelligent problem solving machine. For this we will devise "Training Sequences" - sequences of problems designed to put problem-solving information into the machine. We describe a few kinds of training sequences. The problems are solved by a "generate and t...
What is algorithmic probability? Source: Wikipedia.org article, adapted under the https://creativecommons.org/licenses/by-sa/3.0/ license. In algorithmic information theory, algorithmic (Solomonoff) probability is a mathematical method of assigning a prior probability to a given observation. It was invented by Ray Solomonoff in the 1960s. It is used in inductive inference theory and in the analysis of algorithms. In his general theory of inductive inference, Solomonoff uses the prior obtained by this formula in Bayes' rule for prediction. In the mathematical formalism used, the observations have the form of finite binary strings, and the universal pri...
Transcript: http://matchingpennies.com/solomonoff_induction/
Grace Solomonoff presents her talk "A Few Notes on Multiple Theories and Conceptual Jump Size" at the Ninth Conference on Artificial General Intelligence (AGI-16) in New York (http://agi-conf.org/2016/). Paper author: Grace Solomonoff. Abstract: These are a few notes about some of Ray Solomonoff's foundational work in algorithmic probability, focusing on the universal prior and conceptual jump size, including a few illustrations of how he thought. His induction theory gives a way to compare the likelihood of different theories describing observations. He used Bayes' rule of causation to discard theories inconsistent with the observations. Can we find good theories? Lsearch may give a way to search, and the conceptual jump size a measure for this.
Held at Monash University, Solomonoff Memorial Conference 2011, late November.
http://www.donationalerts.ru/r/solomonoff Guests: Weizel Sergej https://www.youtube.com/channel/UCxnD96rPnaAxSLx09AZ01Dg Alex Diamond https://www.youtube.com/channel/UCz-jsYsxwiD3I-TJhq0hxGQ Link to the second channel with PsychoStreams: https://goo.gl/vM32DI
During the 2016 Copernicus Festival, Gregory Chaitin delivered a lecture entitled "Beauty in physics, mathematics and biology," in which he explored what constitutes beauty in the natural sciences. "The theory is beautiful when it easily explains complicated things," he said. Gregory John Chaitin is an Argentine-American mathematician and computer scientist. Beginning in the late 1960s, Chaitin made contributions to algorithmic information theory and metamathematics, in particular a computer-theoretic result equivalent to Gödel's incompleteness theorem. He is considered one of the founders of what is today known as Kolmogorov (or Kolmogorov-Chaitin) complexity, together with Andrei Kolmogorov and Ray Solomonoff. Today, algorithmic information theory is a common subject in any computer sc...
Winter Intelligence Oxford - AGI12 - http://agi-conference.org/2012 ==Differences between Kolmogorov Complexity and Solomonoff Probability: Consequences for AGI== (This video includes discussion afterwards, though the video is blurred.) Abstract: Kolmogorov complexity and algorithmic probability are compared in the context of universal algorithmic intelligence. The accuracy of time-series prediction based on a single best model and on averaging over multiple models is estimated. A connection between inductive behavior and multi-model prediction is established. Uncertainty as a heuristic for reducing the number of models used, without loss of universality, is discussed. The conclusion is made that a plurality of models is an essential feature of artificial general intelligence, and this feature ...
http://www.solomonoff85thmemorial.monash.edu.au/
Problems of inductive reasoning or extrapolation, such as "guessing the next number in a series" or, more generally, "understanding the hidden structure in observations," are fundamental if we ever want to build an artificial intelligence. One might get the impression that these problems are not mathematically well defined, yet there exists a rigorous mathematical theory of inductive reasoning and extrapolation, founded on principles from computability theory. This theory was defined 50 years ago by Ray Solomonoff, but we are only now beginning to acquire the mathematical tools to apply it in practice, thanks to techniques from probability theory, data compression, differential geometry, and information theory. We will give...
Episode 245: Dr. Hans Utter returns to continue our investigational series into the history of FM radio, titled "The [Secret] History of FM Radio, Part 2: Context & Networks, The Radical Left & Negative Intelligence". Dr. Hans Utter returns for his 11th time. This episode was recorded January 27 and is being released on Sunday, January 31, 2016. Chain-Links by Frigyes Karinthy https://djjr-courses.wdfiles.com/loca... Connectivity of Random Nets by Ray Solomonoff and Anatol Rapoport http://world.std.com/~rjs/publication... Dr. Hans Utter's latest article: www.gnosticmedia.com/txtfiles/HansUtter_DoWhatThouWilt.pdf Previous episodes wi...
Monday, October 14, 2013, Wood Auditorium. REX Principal Joshua Prince-Ramus, Project Manager David Menicovich, and On-Site Architect Ishtiaq Rafiuddin; Façade Consultant Marc Simmons, Front; and Interior Designer Sefer Çağlar, Autoban. Responses by Merve Çağlar, SAHA Contemporary Art, and Galia Solomonoff, GSAPP. An x-ray look at one building by key players in its realization. The Vakko Fashion Center, a 40,000-square-foot headquarters for the Turkish fashion house Vakko and its related media company, which was completed in 2012, began in Southern California, with REX's design for a new building at Caltech. A sudden change in the school's administration halted plans and freed them up for a new client in Istanbul, Vakko, which needed a new home within one year. In this unusual story, constr...