Publication | Closed Access
Anomaly Detection of Time Series With Smoothness-Inducing Sequential Variational Auto-Encoder
Citations: 190
References: 53
Year: 2020
Data Augmentation · Anomaly Detection · Machine Learning · Data Science · Engineering · Deep Generative Models · Generative Adversarial Network · Autoencoders · Outlier Detection · Novelty Detection · Generative Models · Generative Model · Computer Science · Generative AI · Deep Learning · Signal Processing · Nonlinear Time Series
Deep generative models have demonstrated their effectiveness in learning latent representations and modeling complex dependencies of time series. In this article, we present a smoothness-inducing sequential variational auto-encoder (SISVAE) model for robust estimation and anomaly detection of multidimensional time series. Our model is based on the variational auto-encoder (VAE), with a recurrent neural network backbone that captures latent temporal structures of time series in both the generative model and the inference model. Specifically, our model parameterizes the mean and variance at each time stamp with flexible neural networks, yielding a nonstationary model that does not require the constant-noise assumption commonly made by existing Markov models. However, such flexibility may make the model fragile to anomalies. To achieve robust density estimation, which also benefits detection tasks, we propose a smoothness-inducing prior over possible estimations. The proposed prior works as a regularizer that penalizes nonsmooth reconstructions. Our model is learned efficiently with a novel stochastic gradient variational Bayes estimator. In particular, we study two decision criteria for anomaly detection: reconstruction probability and reconstruction error. We show the effectiveness of our model on both synthetic data sets and public real-world benchmarks.
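The two ingredients the abstract highlights can be illustrated compactly: a smoothness regularizer that penalizes differences between consecutive reconstructed means, and per-timestamp anomaly scores based on reconstruction error or reconstruction (log-)probability under a diagonal Gaussian. The sketch below is a simplified, hypothetical rendering of those ideas in numpy, not the paper's actual implementation (which uses an RNN-backed VAE trained with stochastic gradient variational Bayes); the function names and the scalar weight `lam` are our own.

```python
import numpy as np

def smoothness_penalty(mu, lam=1.0):
    """Toy smoothness-inducing regularizer: penalize nonsmooth
    reconstructions via squared differences of consecutive means.
    mu: (T, D) array of per-timestamp reconstruction means."""
    diffs = np.diff(mu, axis=0)                # (T-1, D) consecutive deltas
    return lam * np.sum(diffs ** 2)

def anomaly_scores(x, mu, sigma):
    """Per-timestamp scores for the two decision criteria:
    - reconstruction error: squared L2 distance to the mean
    - reconstruction probability: negative log-likelihood under a
      diagonal Gaussian with per-timestamp mean and variance."""
    recon_error = np.sum((x - mu) ** 2, axis=-1)
    nll = 0.5 * np.sum(
        np.log(2 * np.pi * sigma ** 2) + (x - mu) ** 2 / sigma ** 2,
        axis=-1,
    )
    return recon_error, nll
```

In a full model, `mu` and `sigma` would be produced per timestamp by the recurrent decoder, and timestamps whose score exceeds a threshold would be flagged as anomalous.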