Abstract
Exact inference for hidden Markov models requires the evaluation of all distributions of interest – filtering, prediction, smoothing and likelihood – with a finite computational effort. This article provides sufficient conditions for exact inference for a class of hidden Markov models on general state spaces, given a set of discretely collected indirect observations linked nonlinearly to the signal, together with a set of practical algorithms for inference. The conditions we obtain concern the existence of a certain type of dual process, an auxiliary process embedded in the time reversal of the signal, which in turn allows one to represent the distributions and functions of interest as finite mixtures of elementary densities or products thereof. We describe explicitly how to update recursively the parameters involved, yielding qualitatively similar results to those obtained with Baum–Welch filters on finite state spaces. We then provide practical algorithms for implementing the recursions, as well as approximations thereof via an informed pruning of the mixtures, and we show superior performance to particle filters both in accuracy and computational efficiency. The code for optimal filtering, smoothing and parameter inference is made available in the Julia package DualOptimalFiltering.
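The informed pruning of the mixtures mentioned above can be illustrated with a minimal, hypothetical sketch: after each recursion the mixture representation may grow, so components carrying negligible weight are dropped and the remaining weights renormalized. This is a generic illustration of the idea in Python (the function name, threshold and criterion are assumptions for exposition, not the package's actual algorithm, which lives in the Julia package DualOptimalFiltering).

```python
import numpy as np

def prune_mixture(weights, params, eps=1e-2):
    """Drop mixture components with weight <= eps and renormalize.

    Illustrative only: the pruning criterion used in practice may be
    more informed (e.g. keeping a fixed number of components or a
    target fraction of the total mass).
    """
    weights = np.asarray(weights, dtype=float)
    keep = weights > eps                       # boolean mask of retained components
    new_w = weights[keep]
    new_w = new_w / new_w.sum()                # renormalize to a probability vector
    new_p = [p for p, k in zip(params, keep) if k]
    return new_w, new_p

# Example: a 4-component mixture where two components are negligible.
w, p = prune_mixture([0.6, 0.39, 0.007, 0.003], ["a", "b", "c", "d"])
```

After pruning, `w` holds the two dominant weights rescaled to sum to one, and `p` the corresponding component parameters.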
Acknowledgements
The third author was partially supported by the Italian Ministry of Education, University and Research (MIUR) through PRIN 2015SNS29B and through “Dipartimenti di Eccellenza” grant 2018-2022.
We thank an anonymous reviewer for interesting suggestions.
Citation
Guillaume Kon Kam King, Omiros Papaspiliopoulos, Matteo Ruggiero. "Exact inference for a class of hidden Markov models on general state spaces." Electron. J. Statist. 15(1): 2832–2875, 2021. https://doi.org/10.1214/21-EJS1841