30 May 2011, 12:30

Time Inconsistent Stochastic Control Theory

Tomas Björk, Stockholm School of Economics
Aula Consiglio, 7th floor
Abstract

We present a theory for stochastic control problems which, in various ways, are time inconsistent in the sense that they do not admit a Bellman optimality principle. We attack these problems by viewing them within a game theoretic framework, and we look for subgame perfect Nash equilibrium points. For a general controlled Markov process and a fairly general objective functional we derive an extension of the standard Hamilton-Jacobi-Bellman equation, in the form of a system of non-linear equations, for the determination of the equilibrium strategy as well as the equilibrium value function. We also study some concrete examples.
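For orientation, here is a sketch of the kind of extended HJB system the abstract refers to, written in the notation of Björk and Murgoci's work on Markovian time-inconsistent control. The particular reward functional, the functions F and G, and the auxiliary functions f and g below are assumptions chosen for illustration; they are not specified in the abstract itself. For a reward functional of the form

\[
J(t,x,\mathbf{u}) \;=\; \mathbb{E}_{t,x}\!\bigl[F\bigl(x,\,X_T^{\mathbf{u}}\bigr)\bigr]
\;+\; G\!\bigl(x,\ \mathbb{E}_{t,x}\bigl[X_T^{\mathbf{u}}\bigr]\bigr),
\]

the equilibrium value function \(V\) and the auxiliary functions \(f^{y}\) and \(g\) solve, schematically,

\[
\begin{aligned}
&\sup_{u}\Bigl\{ (\mathcal{A}^{u} V)(t,x) - (\mathcal{A}^{u} f)(t,x,x) + (\mathcal{A}^{u} f^{x})(t,x)
- \mathcal{A}^{u}\bigl(G \diamond g\bigr)(t,x) + (\mathcal{H}^{u} g)(t,x) \Bigr\} = 0, \\
&(\mathcal{A}^{\hat u} f^{y})(t,x) = 0, \qquad f^{y}(T,x) = F(y,x), \\
&(\mathcal{A}^{\hat u} g)(t,x) = 0, \qquad g(T,x) = x, \\
&V(T,x) = F(x,x) + G(x,x),
\end{aligned}
\]

where \(\mathcal{A}^{u}\) is the controlled infinitesimal generator, \(\hat u\) is the control attaining the supremum (the equilibrium strategy), \((G \diamond g)(t,x) = G\bigl(x, g(t,x)\bigr)\), and \((\mathcal{H}^{u} g)(t,x) = G_{y}\bigl(x, g(t,x)\bigr)\,(\mathcal{A}^{u} g)(t,x)\). Probabilistically, \(f^{y}(t,x) = \mathbb{E}_{t,x}\bigl[F(y, X_T^{\hat u})\bigr]\) and \(g(t,x) = \mathbb{E}_{t,x}\bigl[X_T^{\hat u}\bigr]\); when \(F\) does not depend on its first argument and \(G\) vanishes, the extra terms drop out and the system reduces to the standard HJB equation.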

