Simulation output analysis using standardized time series

Article ID: iaor1990756
Country: United States
Volume: 15
Issue: 1
Start Page Number: 1
End Page Number: 7
Publication Date: Feb 1990
Journal: Mathematics of Operations Research
Authors: ,
Abstract:

The method of standardized time series (STS) was proposed by Schruben as an approach for constructing asymptotic confidence intervals for the steady-state mean from a single simulation run. The STS method ‘cancels out’ the variance constant, whereas other methods attempt to estimate it consistently. The authors’ goal in this paper is to generalize the STS method and to study some of its basic properties. Starting from a functional central limit theorem (FCLT) for the sample mean of the simulated process, a class of mappings of C[0,1] to ℝ is identified, each of which leads to an STS confidence interval. One of these mappings leads to the batch means method. A lower bound is obtained for the expected length of the asymptotic (as the run length becomes large) STS confidence intervals. This lower bound is not attained by STS confidence intervals, but it can be approached arbitrarily closely; methods that consistently estimate the variance constant do realize it. The variance of the length of an STS confidence interval is of larger order (in the run length) than that of the regenerative method.
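The batch means method mentioned above can be illustrated with a minimal sketch. The following Python snippet is not from the paper; the function name batch_means_ci, the choice of 10 batches, and the AR(1) example process are illustrative assumptions. It shows how a single-run confidence interval for the steady-state mean is formed from non-overlapping batch means.

```python
import numpy as np
from scipy import stats

def batch_means_ci(x, num_batches=10, alpha=0.05):
    """Confidence interval for the steady-state mean from a single run,
    using non-overlapping batch means (one special case of the STS
    framework described in the abstract)."""
    n = (len(x) // num_batches) * num_batches      # drop leftover observations
    batches = np.asarray(x[:n]).reshape(num_batches, -1)
    means = batches.mean(axis=1)                   # one sample mean per batch
    grand_mean = means.mean()
    # t critical value with num_batches - 1 degrees of freedom,
    # applied to the standard error of the batch means.
    half_width = (stats.t.ppf(1 - alpha / 2, num_batches - 1)
                  * means.std(ddof=1) / np.sqrt(num_batches))
    return grand_mean - half_width, grand_mean + half_width

# Illustrative example: a stationary AR(1) sequence with true mean 0.
rng = np.random.default_rng(0)
x = np.empty(100_000)
x[0] = 0.0
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.normal()
print(batch_means_ci(x))
```

With a fixed number of batches, the half-width above does not rely on a consistent estimate of the variance constant, which is the sense in which STS-type intervals ‘cancel out’ that constant.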
