Markov Decision Processes

Concept

A mathematical framework for modeling sequential decision-making in situations where outcomes are partly random and partly under the control of a decision-maker. An MDP is defined by a set of states, a set of actions, transition probabilities, and rewards, together with a discount factor. It relies on the Markov property: the current state contains all information about the past that is relevant to predicting the future.
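The idea can be made concrete with value iteration, the standard dynamic-programming method for solving an MDP. Below is a minimal sketch over a hypothetical two-state, two-action MDP (the states, actions, probabilities, and rewards are illustrative, not from the source):

```python
# P[s][a] = list of (probability, next_state, reward) transitions.
# Hypothetical toy MDP: action 0 = "stay", action 1 = "move".
P = {
    0: {0: [(1.0, 0, 0.0)],                    # staying in state 0 yields nothing
        1: [(0.8, 1, 1.0), (0.2, 0, 0.0)]},   # moving usually reaches state 1
    1: {0: [(1.0, 1, 2.0)],                    # staying in state 1 pays reward 2
        1: [(1.0, 0, 0.0)]},                   # moving returns to state 0
}
gamma = 0.9  # discount factor for future rewards

# Value iteration: repeatedly apply the Bellman optimality backup
# V(s) <- max_a sum_{s'} P(s'|s,a) * (r + gamma * V(s')).
V = {s: 0.0 for s in P}
for _ in range(1000):
    V_new = {
        s: max(
            sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
            for a in P[s]
        )
        for s in P
    }
    done = max(abs(V_new[s] - V[s]) for s in P) < 1e-9
    V = V_new
    if done:
        break

# Extract the greedy policy with respect to the converged values.
policy = {
    s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a]))
    for s in P
}
```

Note that the backup only ever reads the current state's value, never the history of how the agent got there: that is the Markov property at work.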