### Abstract

A sequential estimator is proposed for the autoregression parameter of a first-order autoregressive process (AR(1)). It is constructed on the basis of the generalized least squares method (GLSM) with a special choice of weight coefficients in the sum of squared residuals. Under some natural conditions on the noise distribution function, this is a prescribed-precision estimator in the sense that it estimates the unknown parameter with any fixed mean-square accuracy at the moment the observation is terminated. In contrast to the sequential least squares estimator, the proposed estimator has the important property of uniform asymptotic normality in the parameter over the whole real line. Using this result, one can show that the proposed sequential estimator is asymptotically optimal in the minimax sense for a power loss function within a wide class of sequential and nonsequential procedures.
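The idea of estimation with prescribed precision can be illustrated with a simplified sketch: observe the AR(1) process until the accumulated information Σ x_{t-1}² first crosses a threshold h, then form the least-squares ratio. This is the plain sequential least-squares rule, not the paper's weighted GLSM construction; the function name, the threshold value, and the simulation setup below are illustrative assumptions.

```python
import random

def sequential_ar1_estimate(xs, h):
    """Simplified sequential least-squares estimator for x_t = theta*x_{t-1} + noise.

    Stops at the first time the accumulated information sum(x_{t-1}^2)
    reaches the threshold h; larger h yields higher (prescribed) precision.
    Returns (estimate, stopping_time), or (None, None) if h is never reached.
    """
    num = 0.0   # running sum of x_{t-1} * x_t
    info = 0.0  # running sum of x_{t-1}^2 (accumulated information)
    for t in range(1, len(xs)):
        num += xs[t - 1] * xs[t]
        info += xs[t - 1] ** 2
        if info >= h:
            return num / info, t
    return None, None

# Illustrative simulation: AR(1) with theta = 0.5 and standard normal noise.
random.seed(0)
theta = 0.5
xs = [0.0]
for _ in range(10000):
    xs.append(theta * xs[-1] + random.gauss(0.0, 1.0))

est, tau = sequential_ar1_estimate(xs, h=2000.0)
```

Because the stopping rule fixes the accumulated information at roughly h, the conditional variance of the estimate is about sigma²/h regardless of theta, which is the mechanism behind the fixed mean-square accuracy described in the abstract.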

| Original language | English |
|---|---|
| Pages (from-to) | 678-694 |
| Number of pages | 17 |
| Journal | Theory of Probability and its Applications |
| Volume | 41 |
| Issue number | 4 |
| DOIs | https://doi.org/10.1137/S0040585X97975691 |
| Publication status | Published - Dec 1996 |

### Keywords

- Autoregression process
- Local asymptotic normality
- Prescribed precision estimators
- Uniform asymptotic normality

### ASJC Scopus subject areas

- Statistics and Probability

### Cite this

*Theory of Probability and its Applications*, *41*(4), 678-694. https://doi.org/10.1137/S0040585X97975691