Evidential Decision Theory via Partial Markov Categories (Di Lavore, Román)
- Article: pdf.
- Slides: Nordic Workshop on Programming Theory.
- Extended Abstract: pdf.
Abstract. We introduce partial Markov categories. In the same way that Markov categories encode stochastic processes, partial Markov categories encode stochastic processes with constraints, observations and updates. In particular, we prove a synthetic Bayes theorem; we use it to define a syntactic partial theory of observations on any Markov category, whose normalisations can be computed in the original Markov category. Finally, we formalise Evidential Decision Theory in terms of partial Markov categories, and provide implemented examples.
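As a rough illustration of the Bayes update on subdistributions and the normalisation mentioned in the abstract, here is a minimal finite sketch in Python. The representation (dicts with total mass ≤ 1) and the names `push`, `observe`, `normalise` are assumptions for illustration, not the paper's implementation.

```python
# A minimal sketch of Bayesian updating on finite subdistributions, assuming
# the usual encoding of a (sub)distribution as a dict whose values sum to at
# most 1.  Names are illustrative, not the paper's API.

def push(prior, channel):
    """Form the joint of a prior p(x) and a channel f(y|x) as a dict over (x, y)."""
    return {(x, y): px * pyx
            for x, px in prior.items()
            for y, pyx in channel(x).items()}

def observe(joint, evidence):
    """Condition on an observed output: keep only the mass consistent with
    `evidence`.  The result is a subdistribution over x (total mass may be < 1)."""
    return {x: p for (x, y), p in joint.items() if y == evidence}

def normalise(subdist):
    """Rescale a nonzero subdistribution back to a distribution; this is the
    normalisation that lands back in the original Markov category."""
    mass = sum(subdist.values())
    return {x: p / mass for x, p in subdist.items()}

# Example: a prior over coins and a noisy observation of one flip.
prior = {"fair": 0.7, "biased": 0.3}

def flip(coin):
    return {"heads": 0.5, "tails": 0.5} if coin == "fair" else {"heads": 0.9, "tails": 0.1}

posterior = normalise(observe(push(prior, flip), "heads"))
print(posterior)  # {'fair': 0.564..., 'biased': 0.435...}
```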
Notes on the paper.
- Evidential decision theory
- An implementation of Newcomb's problem (see the sketch after this list)
- Partial Markov category
- Discrete partial Markov category
- Partial Markov categories: Bayes update on subdistributions
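As a quick illustration of the EDT reading of the Newcomb problem noted above, here is a minimal Python sketch: the chosen action is treated as evidence about the prediction, and the agent maximises conditional expected utility. The predictor accuracy and payoffs are assumed values; the paper's implemented example is phrased in terms of partial Markov categories rather than this direct computation.

```python
# A minimal sketch of the Newcomb problem under Evidential Decision Theory.
# The accuracy and payoff values are assumptions for illustration only.

ACCURACY = 0.99  # probability that the predictor matches the agent's choice

def prediction_given(action):
    """Conditional distribution over the prediction, given the action as evidence."""
    other = "two-box" if action == "one-box" else "one-box"
    return {action: ACCURACY, other: 1 - ACCURACY}

def payoff(action, prediction):
    opaque = 1_000_000 if prediction == "one-box" else 0   # filled only if one-boxing was predicted
    visible = 1_000 if action == "two-box" else 0          # transparent box, always 1,000
    return opaque + visible

def conditional_expected_utility(action):
    return sum(p * payoff(action, prediction)
               for prediction, p in prediction_given(action).items())

for action in ("one-box", "two-box"):
    print(action, conditional_expected_utility(action))
# one-box 990000.0   <- EDT recommends one-boxing
# two-box 11000.0
```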