Policy Explanation and Model Refinement in Decision-Theoretic Planning
OZ Khan - 2013 - uwspace.uwaterloo.ca
Decision-theoretic systems, such as Markov Decision Processes (MDPs), are used for
sequential decision-making under uncertainty. MDPs provide a generic framework that can …
Abstraction and approximate decision-theoretic planning
R Dearden, C Boutilier - Artificial Intelligence, 1997 - Elsevier
Markov decision processes (MDPs) have recently been proposed as useful conceptual
models for understanding decision-theoretic planning. However, the utility of the associated …
Efficient solution algorithms for factored MDPs
This paper addresses the problem of planning under uncertainty in large Markov Decision
Processes (MDPs). Factored MDPs represent a complex state space using state variables …
Planning with hidden parameter polynomial MDPs
For many applications of Markov Decision Processes (MDPs), the transition function cannot
be specified exactly. Bayes-Adaptive MDPs (BAMDPs) extend MDPs to consider transition …
Decision-theoretic planning: Structural assumptions and computational leverage
Planning under uncertainty is a central problem in the study of automated sequential
decision making, and has been addressed by researchers in many different fields, including …
Stochastic dynamic programming with factored representations
Markov decision processes (MDPs) have proven to be popular models for decision-theoretic
planning, but standard dynamic programming algorithms for solving MDPs rely on explicit …
Decision making under uncertainty: operations research meets AI (again)
C Boutilier - AAAI/IAAI, 2000 - researchgate.net
Models for sequential decision making under uncertainty (e.g., Markov decision
processes, or MDPs) have been studied in operations research for decades. The recent …
Reasoning about MDPs Abstractly: Bayesian Policy Search with Uncertain Prior Knowledge
J Molhoek - 2024 - repository.tudelft.nl
Many real-world problems fall in the category of sequential decision-making under
uncertainty; Markov Decision Processes (MDPs) are a common method for modeling such …
An introduction to fully and partially observable Markov decision processes
P Poupart - Decision theory models for applications in artificial …, 2012 - igi-global.com
The goal of this chapter is to provide an introduction to Markov decision processes as a
framework for sequential decision making under uncertainty. The aim of this introduction is …