The Department of Logic & Philosophy of Science presents
"Topics in Inductive Logic"
Thursday, March 19, 2015 - Friday, March 20, 2015
9:00 a.m. - 6:00 p.m.
Social & Behavioral Sciences Gateway, Room 1517
A two-day workshop
Inductive logic is a research program, started by Rudolf Carnap in the 1940s, that involves
the philosophical and mathematical analysis of inductive inference. It has a close
connection to the Bayes-Laplace tradition in probability theory and statistics. While
inductive logic has largely disappeared from the radar of mainstream philosophy of science,
progress has been made in various fields related to inductive logic. The
aim of this workshop is to discuss some of these new developments.
Organized by Simon Huttegger, Associate Professor of Logic and Philosophy of Science and Chancellor’s Fellow, Department of Logic and Philosophy of Science, University of California, Irvine. Attendance is free, but registration is recommended. Register by contacting Patty Jones (patty.jones@uci.edu) on or before Friday, March 13.
Speakers
Frederick Eberhardt (Caltech), Simon Huttegger (UCI), Jeff B. Paris (Manchester), Jan-Willem Romeijn (Groningen), Gerhard Schurz (Düsseldorf), Tom Sterkenburg (Groningen), Marta Sznajder (LMU), Jon Williamson (Kent), Sandy Zabell (Northwestern)
Schedule of Talks
Thursday, March 19
9:00am
Sandy Zabell, Professor of Statistics and Mathematics, Northwestern University: “The Problem of Zero Probability”
Abstract: The de Finetti representation theorem affords a nice resolution of Hume’s
classic problem of inductive inference. There are several important pieces to the
puzzle: providing an operational definition for probability, justifying its static
and dynamic properties as codified by the standard axioms, invoking some form of exchangeability.
But, at least in the form of the problem put forward by Hume, there is also what some
have characterized as “fine print”: the assumption that one’s prior not be dogmatic,
in the sense that appropriate subsets of the parameter space are not assigned an initial
probability of zero. In his talk, Dr. Zabell will discuss the question of whether or when
a Bayesian should ever assign a probability of zero to events that are initially thought
to be possible. The first part of the talk will survey the general question; the second
half will draw lessons for the resolution of the problem of induction.
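(For orientation, and not part of the abstract: in its simplest form, for an exchangeable sequence of binary random variables $X_1, X_2, \ldots$, de Finetti's representation theorem states that there is a unique probability measure $\mu$ on $[0,1]$ such that
\[
P(X_1 = x_1, \ldots, X_n = x_n) = \int_0^1 \theta^{k}(1-\theta)^{n-k}\, d\mu(\theta), \qquad k = \sum_{i=1}^n x_i,
\]
for all $n$ and all $x_1, \ldots, x_n \in \{0,1\}$. The “fine print” mentioned above concerns the mixing measure $\mu$: parameter regions to which $\mu$ assigns probability zero can never be recovered by conditioning, however the evidence turns out.)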
10:45am
Marta Sznajder, Doctoral Fellow, LMU (Munich): “Inductive Logic, Conceptual Spaces, and Theory Change”
Abstract: The idea that we could and possibly should represent concepts in a geometrical
manner can be found in two seemingly separate traditions. First, it was introduced
by Rudolf Carnap under the guise of attribute spaces, a concept developed in his late
work on inductive logic (Carnap 1971, 1980). Second, currently flourishing in cognitive
science is the theory of conceptual spaces, advanced among others by Peter Gärdenfors. It
aims at empirically informed and accurate geometrical representations of concepts.
In her talk, Ms. Sznajder will show how the two proposals relate to each other, not
only insofar as Carnap’s attribute spaces can be seen as predecessors of the modern
conceptual spaces, but also through the possibilities of extending the Basic System
(and, more broadly, inductive logic) with the insights brought in by the conceptual
spaces program. Further, Ms. Sznajder will discuss a recent proposal of Zenker and
Gärdenfors (2014) on describing theory change in terms of changes in the relevant
conceptual spaces. Finally, Ms. Sznajder will argue that the authors’ conclusion that
the use of conceptual spaces can replace the three-level view of scientific theories,
postulated by Michael Friedman, cannot be arrived at quite yet.
1:15pm
Jon Williamson, Professor of Philosophy, University of Kent: “Classical Inductive Logic, Carnap's
Programme and the Objective Bayesian Approach”
Abstract: In his talk, Dr. Williamson will introduce what he calls classical inductive
logic, due to Wittgenstein, and its limitations. He will explain how Carnap's approach,
originally due to W.E. Johnson, tried to overcome these problems and why it failed.
Finally, he will introduce the objective Bayesian approach to inductive logic and
argue that it can overcome the limitations of the classical and Carnapian approaches.
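(As a rough illustration of the classical approach, offered as background rather than taken from the abstract: on a monadic language with one predicate and $N$ individuals there are $2^N$ state descriptions, and the Wittgensteinian measure assigns each of them the same weight,
\[
m^{\dagger}(s) = 2^{-N}, \qquad c^{\dagger}(h, e) = \frac{m^{\dagger}(h \wedge e)}{m^{\dagger}(e)}.
\]
Under $c^{\dagger}$ the atomic sentences come out probabilistically independent, so $c^{\dagger}(Pa_{n+1},\, Pa_1 \wedge \cdots \wedge Pa_n) = 1/2$ no matter how many positive instances have been observed: the resulting logic cannot learn from experience, which is one of the limitations at issue in the talk.)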
3:00pm
Gerhard Schurz, Professor of Philosophy, Heinrich-Heine-Universität Düsseldorf: “Inductive Logic,
Wolpert's No-Free-Lunch Theorem, and the Optimality of Meta-Induction”
Abstract: Wolpert's No-Free-Lunch Theorem is a mathematical proof of Hume's inductive
scepticism. Wolpert proves that, under mild assumptions, every prediction method - induction,
anti-induction, guessing, or whatever else - has the same expected predictive success
according to a uniform prior probability distribution over possible worlds.
In the first part of his talk, Dr. Schurz will discuss how strong the limitations
are that follow from this result for the program of Inductive Logic. In the second
part, he will present a different approach to Hume's problem of induction that is based
on the optimality of meta-induction. According to this result, there exist meta-inductive
prediction-selection methods whose predictive success is optimal with respect to all
prediction methods whose output is accessible to them. Dr. Schurz will discuss this
result in the light of inductive logic and Wolpert's no-free-lunch theorem.
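(A toy version of the result, offered only as an illustration and not in Wolpert's own formulation: take the possible worlds to be the binary sequences of length $n$ and put a uniform prior over them. Then for any prediction method $f$ that maps an observed prefix to a guess about the next bit,
\[
P\bigl(X_{t+1} = f(X_1, \ldots, X_t) \mid X_1 = x_1, \ldots, X_t = x_t\bigr) = \tfrac{1}{2}
\]
for every prefix, since under the uniform prior both continuations remain equally likely. Induction, anti-induction, and blind guessing therefore all have the same expected success rate of $1/2$.)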
4:30pm
Frederick Eberhardt, Professor of Philosophy, Caltech: “Causal Variables in a Pixel World”
Abstract: How can one construct causal macro-variables from the ground up? Dr. Eberhardt
will present some of the problems, some formal limitations, and finally a successful
account of how to define and automatically identify
a cause of some target behavior in a domain where we have no prior knowledge that
delineates the candidate causes. This work is motivated by an aim to weaken the standard
assumption in causal discovery of a given set of well-defined variables, since in
many domains we have very little knowledge of what the candidate causes are. Dr. Eberhardt
will use vision as an example, since visual causes are sometimes very clearly delineated,
such as when a traffic light turns green, and sometimes very subtle, such as the
increased judgments of attractiveness for more symmetric human faces. The resulting
account illustrates in what sense Goodman's new riddle can be reconstructed for causal
variables.
Friday, March 20
9:00am
Jeff B. Paris, Professor of Mathematics, University of Manchester: “Analogy in Pure Inductive Logic”
Abstract: Dealing as it does with the assignment of probabilities on solely rational or logical grounds, in the absence of any intended interpretation, Pure Inductive Logic would seem to provide a natural context in which to investigate the idea of ‘reasoning by analogy’. In particular: In what sense does this concept exist in its own right? Can it be formalized? Is it ‘rational’, and if so, what consequence does it have for the rational assignment of probabilities?
In his talk, Jeffrey Paris will briefly describe the context and default assumptions
of Pure Inductive Logic and then go on to explain four general principles of analogical
support which have been proposed in this framework.
10:45am
Tom Sterkenburg, PhD Student, Rijksuniversiteit Groningen, Department of Philosophy: “Occam’s Razor
in Algorithmic Information Theory”
Abstract: The notion of Kolmogorov complexity provides a quantification of how much we can possibly compress (i.e., describe in a shorter way) a given sequence of data. As the name suggests, this compressibility is to reflect the data sequence's complexity: the lower the Kolmogorov complexity, the simpler the sequence. A parallel definition can be employed in the context of sequential prediction. That is, we can specify an idealized prediction method that assigns future data elements a higher probability as the combination of past and future elements exhibits a sequence that is more compressible. Furthermore, under the assumption that the data is generated in a computable way, we can formally prove that this prediction method will almost always converge to the best possible predictions.
Given this identification of compressibility and simplicity, the prediction method shows a bias towards simple sequences, and so it is customarily presented as a formalization of the principle of Occam's razor. Indeed, it is often suggested that the proof of the performance of this prediction method constitutes a demonstration that a preference for simplicity will lead us to the truth, and can therefore provide us with a genuine justification of Occam's razor.
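(A compressed statement of the construction alluded to above, included for orientation rather than taken from the talk: Solomonoff's universal prior assigns a finite binary string $x$ the weight
\[
M(x) = \sum_{p \,:\, U(p) \text{ begins with } x} 2^{-|p|},
\]
the sum ranging over the minimal programs $p$ on which a universal monotone machine $U$ produces output beginning with $x$. Up to small additive terms, $-\log_2 M(x)$ agrees with the (monotone) Kolmogorov complexity of $x$, so compressible strings receive higher weight. Prediction is by conditioning, $M(a \mid x) = M(xa)/M(x)$, and Solomonoff's convergence theorem says that if the data are generated by any computable probability measure $\mu$, then $M$'s conditional predictions converge to $\mu$'s with $\mu$-probability one.)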
In his talk, Tom Sterkenburg will explicate the relevant argument that is to justify
Occam's razor. He will then recast the argument in Bayesian terms, thereby revealing
the hidden assumptions and showing why, unfortunately, the argument has no such justificatory
force. On the way, he will also discuss the close affinity of algorithmic information
theory to Carnap's early programme of inductive logic.
1:15pm
Simon Huttegger, Professor of Logic and Philosophy of Science, UCI: “Analogical Predictive Probabilities”
Abstract: How should analogical considerations enter inductive reasoning? This question
was raised after the introduction of Carnap's early systems of inductive logic, and
various inductive rules have been developed since. Most of these proposals do not
have an axiomatic foundation along the lines of W. E. Johnson’s and Carnap’s work.
Thus, it is at least to some extent unclear to which inductive problems they are supposed
to apply. Taking cues from de Finetti’s ideas about analogy, Dr. Huttegger will present a new analogical
inductive logic that is based on a rigorous foundation. The axioms of the new theory
extend the axioms of Johnson and Carnap in fairly minimal ways, and they allow us
to discuss in a precise way the merits and limitations of the resulting system of
inductive logic.
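(For readers who want the shape of the Johnson-Carnap result that the abstract presupposes, stated here as standard background rather than as a summary of the new system: for observations sorted into $k$ mutually exclusive categories, exchangeability together with Johnson's sufficientness postulate yields, for $k \geq 3$, the continuum of inductive methods
\[
P(\text{next observation falls in category } j \mid n_1, \ldots, n_k) = \frac{n_j + \lambda/k}{n + \lambda}, \qquad \lambda > 0,
\]
where $n_j$ is the number of the $n$ observations so far that fell in category $j$. Analogical inductive logics modify this rule so that observations in one category can also raise the probability of similar categories.)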
3:00pm
Jan-Willem Romeijn, Professor of Philosophy of Science, University of Groningen: “Analogy by Proximity
in Conceptual Space”
Abstract: This paper exploits tools from Bayesian statistics to develop an account of analogical inductive predictions. In the first part of the talk Dr. Romeijn will consider the well-known link between prediction rules and Bayesian statistics due to de Finetti. In the second part of the talk, which presents work in progress with Marta Sznajder, the Bayesian framework will be employed to facilitate a refinement of the observation algebra on which predictions are defined. This leads to a new understanding of the role of similarity in inductive predictions.
The first part starts with the observation that Carnapian prediction rules for multiple predicate families have analogical effects built in. This can be illuminated by representing the prediction rules in terms of Dirichlet priors over Bernoulli hypotheses. Carnap's analogical prediction rules can be illuminated in much the same way, by simply choosing a different class of priors. Romeijn argues that the representation of analogical predictions in terms of such priors offers conceptual advantages. It brings out the reasoning that underlies analogical predictions.
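(A minimal sketch of the representation mentioned above, stated only as standard background: the continuum rule displayed after Dr. Huttegger's abstract is exactly the predictive distribution of i.i.d. categorical sampling with chances $\theta = (\theta_1, \ldots, \theta_k)$ under a symmetric Dirichlet prior with parameters $\lambda/k$,
\[
P(\text{next} = j \mid n_1, \ldots, n_k) = \int \theta_j \,\mathrm{Dir}\bigl(\theta \mid n_1 + \tfrac{\lambda}{k}, \ldots, n_k + \tfrac{\lambda}{k}\bigr)\, d\theta = \frac{n_j + \lambda/k}{n + \lambda}.
\]
Choosing a different class of priors, as the abstract notes, is then the route to analogical prediction rules.)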
The longer second part develops a particular driver of analogical predictions: similarities among predicates. Romeijn shows that the notion of similarity-as-distance can be developed further by representing predicates as regions in a conceptual space. This representation invites the definition of a refined predictive system, in which a probability density over a conceptual space is adapted in the light of observation. It turns out that the Bayesian methods of part one can be used again in this context, offering a new approach to analogical predictions based on similarity.