
Approximate Inference and Learning of State Space Models with Laplace Noise

Abstract: State space models have been extensively applied to model and control dynamical systems in disciplines including neuroscience, target tracking, and audio processing. A common modeling assumption is that both the state and data noise are Gaussian, because it simplifies the estimation of the system's state and model parameters. However, in many real-world scenarios where the noise is heavy-tailed or includes outliers, this assumption does not hold, and the performance of the model degrades. In this paper, we present a new approximate inference algorithm for state space models with Laplace-distributed multivariate data that is robust to a wide range of non-Gaussian noise. Exact inference is combined with an expectation propagation algorithm, leading to filtering and smoothing that outperform existing approximate inference methods for Laplace-distributed data, while retaining a speed similar to that of the Kalman filter. Further, we present a maximum a posteriori expectation-maximization (EM) algorithm that learns the parameters of the model in an unsupervised way, automatically avoids over-fitting the data, and provides better model estimation than existing methods for the Gaussian model. The quality of the inference and learning algorithms is exemplified through a diverse set of experiments and an application to non-linear tracking of audio frequency.
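To make the setting concrete, the sketch below simulates the situation the abstract describes: a linear state space model whose observations carry heavy-tailed Laplace noise, filtered with the standard Kalman filter. The Kalman filter is the exact-inference baseline that the paper's expectation-propagation method is compared against; this is a minimal illustrative sketch, not the authors' algorithm, and all matrix names (`A`, `C`, `Q`, `R`) are generic conventions rather than notation taken from the paper.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, m0, P0):
    """Standard Kalman filter: exact inference under Gaussian state and
    observation noise. Under Laplace (heavy-tailed) observation noise its
    Gaussian assumption is misspecified, which is the failure mode the
    paper addresses with expectation propagation."""
    n, d = len(y), m0.shape[0]
    means = np.zeros((n, d))
    covs = np.zeros((n, d, d))
    m, P = m0, P0
    for t in range(n):
        # Predict step: propagate the state estimate through the dynamics
        m_pred = A @ m
        P_pred = A @ P @ A.T + Q
        # Update step: correct with the (possibly outlier-laden) observation
        S = C @ P_pred @ C.T + R
        K = P_pred @ C.T @ np.linalg.inv(S)
        m = m_pred + K @ (y[t] - C @ m_pred)
        P = (np.eye(d) - K @ C) @ P_pred
        means[t], covs[t] = m, P
    return means, covs

# Simulate a 1-D random-walk state observed under Laplace noise
rng = np.random.default_rng(0)
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[1.0]])
x = np.cumsum(rng.normal(0.0, 0.1, 100))            # latent state trajectory
y = x[:, None] + rng.laplace(0.0, 1.0, (100, 1))    # heavy-tailed observations
means, covs = kalman_filter(y, A, C, Q, R, np.zeros(1), np.eye(1))
```

Because the Laplace distribution puts far more mass in its tails than a Gaussian with the same scale, the filter above is pulled off track by outlying observations; the paper's contribution is an inference scheme that stays robust in exactly this regime while keeping comparable runtime.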
Document type :
Journal articles
Contributor : Roland Badeau
Submitted on : Wednesday, June 9, 2021 - 2:35:58 PM
Last modification on : Tuesday, October 19, 2021 - 11:16:31 AM
Julian Neri, Philippe Depalle, Roland Badeau. Approximate Inference and Learning of State Space Models with Laplace Noise. IEEE Transactions on Signal Processing, Institute of Electrical and Electronics Engineers, 2021, ⟨10.1109/tsp.2021.3075146⟩. ⟨hal-03255319⟩