Statistics Papers

Document Type

Journal Article

Date of this Version

5-1976

Publication Source

Statistical Decision Theory and Related Topics

Start Page

57

Last Page

91

DOI

10.1016/B978-0-12-307560-4.50008-9

Abstract

This chapter focuses on stochastic control and decision processes that arise in a variety of theoretical and applied contexts: statistical decision problems, stochastic dynamic programming problems, gambling processes, optimal stopping problems, stochastic adaptive control processes, and so on. It has long been recognized that these are all closely related mathematically, and they can therefore all be viewed as variations on a single theoretical formulation. The chapter presents general conditions under which optimal policies are guaranteed to exist. The theoretical formulation is flexible enough to include most variants of these types of processes. In statistical problems, the distribution of the observed variables depends on the true value of a parameter. The parameter space carries no topological or other structure here; it is merely a set indexing the possible distributions, so the formulation is not restricted to those problems known in the statistical literature as parametric problems. In nonstatistical contexts, the distribution does not depend on an unknown parameter. All such problems may be included in the formulation by the device of choosing the parameter space to consist of a single point, corresponding to the given distribution.

Copyright/Permission Statement

© 1976. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.


Date Posted: 27 November 2017

This document has been peer reviewed.