Open Access
Strong consistency of the least squares estimator in regression models with adaptive learning
Norbert Christopeit, Michael Massmann
Electron. J. Statist. 13(1): 1646-1693 (2019). DOI: 10.1214/19-EJS1558

Abstract

This paper studies the strong consistency of the ordinary least squares (OLS) estimator in linear regression models with adaptive learning. It is a companion to Christopeit & Massmann (2018), which considers the estimator’s convergence in distribution and its weak consistency in the same setting. Under constant gain learning, the model is closely related to stationary, (alternating) unit root or explosive autoregressive processes. Under decreasing gain learning, the regressors in the model are asymptotically collinear. The paper examines, first, the issue of strong convergence of the learning recursion: it is argued that, under constant gain learning, the recursion does not converge in any probabilistic sense, while, for decreasing gain learning, rates are derived at which the recursion converges almost surely to the rational expectations equilibrium. Second, the paper establishes the strong consistency of the OLS estimators, under both constant and decreasing gain learning, as well as the rates at which the estimators converge almost surely. In the constant gain model, separate estimators for the intercept and slope parameters are juxtaposed with the joint estimator, drawing on the recent literature on explosive autoregressive models. Third, it is emphasised that strong consistency is obtained in all models although the near-optimal condition for the strong consistency of OLS in linear regression models with stochastic regressors, established by Lai & Wei (1982a), is not always met.
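
For readers unfamiliar with the setting, the sketch below simulates a prototypical adaptive-learning regression and computes the OLS estimator. The specification y_t = alpha + beta*a_{t-1} + eps_t with updating rule a_t = a_{t-1} + gamma_t*(y_t - a_{t-1}) is an assumption borrowed from the standard adaptive-learning literature, not necessarily the paper's exact model; gamma_t = gamma corresponds to constant gain, gamma_t = gamma/t to decreasing gain, and the rational expectations equilibrium is alpha/(1 - beta).

    # Minimal simulation sketch (illustrative only, not the paper's code).
    # Assumed model: y_t = alpha + beta*a_{t-1} + eps_t, with expectations
    # updated by a_t = a_{t-1} + gamma_t*(y_t - a_{t-1}), where gamma_t = gamma
    # (constant gain) or gamma_t = gamma/t (decreasing gain).
    import numpy as np

    def simulate_and_estimate(alpha, beta, gamma, T, gain="decreasing", a0=0.0, seed=0):
        """Simulate the learning model and return the OLS estimate of (alpha, beta)."""
        rng = np.random.default_rng(seed)
        a = np.empty(T + 1)
        y = np.empty(T)
        a[0] = a0
        for t in range(T):
            y[t] = alpha + beta * a[t] + rng.standard_normal()
            g = gamma if gain == "constant" else gamma / (t + 1)
            a[t + 1] = a[t] + g * (y[t] - a[t])       # adaptive learning recursion
        X = np.column_stack([np.ones(T), a[:T]])      # regressors (1, a_{t-1})
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
        return coef, a

    # Under decreasing gain, a_t should approach the rational expectations
    # equilibrium alpha/(1 - beta); the regressors then become asymptotically
    # collinear, which is what makes the strong consistency of OLS non-trivial.
    coef, a = simulate_and_estimate(alpha=1.0, beta=0.5, gamma=1.0, T=100_000)
    print("OLS (alpha, beta):", coef, "  a_T:", a[-1], "  REE:", 1.0 / (1 - 0.5))

In this illustrative setup, switching gain="constant" reproduces the constant gain case discussed in the abstract, where a_t does not settle down and the model behaves like a (possibly explosive) autoregression.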

Citation

Norbert Christopeit, Michael Massmann. "Strong consistency of the least squares estimator in regression models with adaptive learning." Electron. J. Statist. 13(1): 1646-1693, 2019. https://doi.org/10.1214/19-EJS1558

Information

Received: 1 August 2018; Published: 2019
First available in Project Euclid: 17 April 2019

zbMATH: 07056160
MathSciNet: MR3939590
Digital Object Identifier: 10.1214/19-EJS1558

Subjects:
Primary: 62F10, 62H12, 62J05, 62M10
Secondary: 91A26, 91B64, 91B84

Keywords: Adaptive learning, almost sure convergence, non-stationary regression, ordinary least squares
