# Formal Epistemology Symposium: Justification and Probabilistic Support

The Department of Theoretical Philosophy hosts a symposium on Formal Epistemology on April 30 from 13:00 to 17:00 in room Omega of the Faculty of Philosophy, Oude Boteringestraat 52, Groningen.

Timetable (there are short breaks between sessions):

- 13:15 Martin Smith (Glasgow): "When does evidence suffice for conviction?"
- 14:30 Paul Pedersen (Berlin): "Dogmas, Dominance, and Disintegrations"
- 15:45 Simon Huttegger (UC Irvine): "Foundations for boundedly rational learning"

Admission is free. Everybody is cordially invited!

##### Abstracts

###### When does evidence suffice for conviction? - Martin Smith (Glasgow)

There is something puzzling about statistical evidence. One place this manifests is in the law, where courts are reluctant to use evidence of this kind, in spite of the fact that it is quite capable of meeting the standards of proof enshrined in legal doctrine. After surveying some proposed solutions to this problem, I shall outline a somewhat different approach - one that makes use of a notion of *normalcy* that is distinct from the idea of statistical frequency. The problem is not, however, merely a legal one. Our unwillingness to base beliefs on statistical evidence is by no means limited to the courtroom, and is at odds with almost every general principle that epistemologists have ever proposed as to how we ought to manage our beliefs.

###### Dogmas, Dominance, and Disintegrations - Paul Pedersen (Berlin)

A familiar theme in foundational discussions of probability, inference, and decision making is the normative adequacy of the orthodox doctrine of subjective probability and expected utility associated with de Finetti and Savage. A number of authors have raised objections against dogmas the doctrine has taken for granted without due warrant. These objections, as well as the accounts proposed as alternatives that abandon one or more of these dogmas, generally presuppose another dogma---the dogma of real-valued measurability of probability and expectation. Just as some have argued that indeterminate probabilities are not only rationally permissible but also sometimes rationally obligatory, I contend that if judgments of probability and expectation entail commitments to the standards of deliberation usually supposed of them, then probabilities and expectations failing to conform to the dogma of real-valued measurability---and hence failing to be Archimedean---are not only rationally permissible but also sometimes rationally obligatory. In the absence of moral hazards, deliberating agents should be committed to conforming to the principle of weak dominance---the criterion enjoining an agent to reject as unacceptable for choice any option that, by his own lights, is possibly worse and certainly no better than another option open for him to choose. If this is correct, then the dogma of real-valued measurability ought to be abandoned if the core tenets of the doctrine of subjective probability and expectation are to be retained---even if other alleged dogmas of the orthodox doctrine are also rejected.
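The weak-dominance criterion described above can be illustrated with a small sketch. This is not code from the talk; it is a hypothetical toy model in which options are represented as lists of state-indexed payoffs, and an option is rejected whenever another available option is at least as good in every state and strictly better in some state.

```python
def weakly_dominates(a, b):
    """True if option a weakly dominates option b: a is at least as good as b
    in every state, and strictly better in at least one state.
    a, b are payoff lists indexed by state (a toy representation)."""
    return (all(x >= y for x, y in zip(a, b)) and
            any(x > y for x, y in zip(a, b)))


def admissible(options):
    """Apply the weak-dominance criterion: reject any option that is weakly
    dominated by some other option open to the agent."""
    return {name: pay for name, pay in options.items()
            if not any(weakly_dominates(q, pay)
                       for q in options.values() if q != pay)}


# Hypothetical example: B dominates A (no worse anywhere, better in state 2)
# and dominates C (better in state 1, no worse in state 2), so only B survives.
options = {"A": [1, 2], "B": [1, 3], "C": [0, 3]}
print(admissible(options))  # → {'B': [1, 3]}
```

The point of the sketch is only the shape of the criterion: "possibly worse and certainly no better" becomes a pointwise comparison with at least one strict inequality.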

In this talk I shall introduce a normative theory of subjective probability and expected utility that rests upon qualitative criteria regulating preference judgments (or comparative judgments of expectation) and that abandons the commitment to the technically convenient but rationally non-obligatory dogma of real-valued measurability. Like other normative theories, such as Savage’s theory of personal probability, the theory I advance does not presume that a given agent makes numerical judgments of probability or expectation. I shall explain a key lemma behind a full numerical representation of preference in terms of subjective expected utilities formed from possibly non-Archimedean probabilities and utilities. Expected utilities take a very simple numerical form in terms of power series in a single infinitesimal, with addition and multiplication naturally defined by means of the familiar operations of addition and multiplication of power series, and with a natural lexicographic ordering. Other accounts of non-Archimedean probability or expectation are insufficiently general and philosophically inadequate.
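The arithmetic of "power series in a single infinitesimal with a lexicographic ordering" mentioned above can be sketched concretely. The following is an illustrative toy (not the speaker's construction): a quantity a0 + a1·ε + a2·ε² + … is stored as its coefficient list, addition is coefficient-wise, multiplication is the usual Cauchy product of power series, and comparison is lexicographic on coefficients, which makes ε positive yet smaller than every positive real.

```python
from itertools import zip_longest


class PS:
    """Truncated power series a0 + a1*eps + a2*eps**2 + ... in an
    infinitesimal eps, with lexicographic ordering on coefficients."""

    def __init__(self, *coeffs):
        self.c = list(coeffs)

    def __add__(self, other):
        # Coefficient-wise addition of power series.
        return PS(*(x + y for x, y in
                    zip_longest(self.c, other.c, fillvalue=0)))

    def __mul__(self, other):
        # Cauchy product: coefficient of eps**k sums a_i * b_j over i + j = k.
        out = [0] * (len(self.c) + len(other.c) - 1)
        for i, a in enumerate(self.c):
            for j, b in enumerate(other.c):
                out[i + j] += a * b
        return PS(*out)

    def _key(self, n=8):
        # Pad with zeros so series of different lengths compare correctly.
        return tuple(self.c + [0] * (n - len(self.c)))

    def __lt__(self, other):
        # Lexicographic order: the real part dominates, then the eps part, etc.
        return self._key() < other._key()

    def __eq__(self, other):
        return self._key() == other._key()


eps = PS(0, 1)           # the infinitesimal itself
one = PS(1)
# eps is strictly positive yet below every positive real: 0 < eps < 1.
print(PS(0) < eps < one)        # → True
# eps**2 is infinitesimal relative to eps: the order is non-Archimedean,
# since no finite multiple of eps**2 exceeds eps under this comparison.
print(eps * eps < eps)          # → True
```

The non-Archimedean character is visible in the ordering: no matter how large a real coefficient multiplies ε, the result stays below 1, which is exactly the failure of the Archimedean property that real-valued measurability would rule out.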

Time permitting, I shall explain a qualitative criterion of coherence reminiscent of de Finetti’s numerical criterion of coherence and make remarks about disintegrations and dominance. The qualitative criterion of coherence is formulated for a truly arbitrary collection of gambles (random quantities), free of structural constraints, and it does not require the binary relation reflecting an agent’s comparative judgments to be reflexive, complete, or even transitive. Nonetheless, the qualitative criterion of coherence satisfies an analogue of de Finetti’s Fundamental Theorem of Prevision, ensuring that a coherent system of comparative judgments can be extended to a coherent weakly ordered system of comparative judgments over any space of gambles (random quantities), also furnishing a basis for an account admitting a representation in terms of numerically indeterminate, and possibly non-Archimedean, probabilities and expected utilities in the style of, for example, Levi, Walley, Williams, and Smith.

###### Foundations for boundedly rational learning - Simon Huttegger (UC Irvine)

The core idea of rational Bayesian learning is that learning from experience should be consistent with one's inductive assumptions. I will use this idea to develop foundations for so-called boundedly rational learning rules, which require far less informational input than Bayesian conditioning. The result is a set of foundations for boundedly rational learning closely analogous to those of the more standard Bayesian type.

Last modified: 09 May 2014, 11:39 a.m.