Bibliography
Ba, Jimmy Lei, Jamie Ryan Kiros, and Geoffrey E. Hinton. “Layer normalization.” arXiv preprint arXiv:1607.06450 (2016).
Bai, Shaojie, J. Zico Kolter, and Vladlen Koltun. “An empirical evaluation of generic convolutional and recurrent networks for sequence modeling.” arXiv preprint arXiv:1803.01271 (2018).
Behrmann, Jens, Will Grathwohl, Ricky T. Q. Chen, David Duvenaud, and Jörn-Henrik Jacobsen. “Invertible residual networks.” International Conference on Machine Learning. 2019.
Du, Nan, et al. “Recurrent Marked Temporal Point Processes: Embedding Event History to Vector.” The 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2016.
Gal, Yarin, and Zoubin Ghahramani. “A theoretically grounded application of dropout in recurrent neural networks.” Advances in Neural Information Processing Systems. 2016.
Hyndman, Rob J., and George Athanasopoulos. Forecasting: principles and practice, 3rd edition. OTexts, 2021.
Krueger, David, Tegan Maharaj, János Kramár, Mohammad Pezeshki, Nicolas Ballas, Nan Rosemary Ke, Anirudh Goyal, Yoshua Bengio, Aaron Courville, and Chris Pal. “Zoneout: Regularizing RNNs by randomly preserving hidden activations.” arXiv preprint arXiv:1606.01305 (2016).
Lai, Guokun, et al. “Modeling long- and short-term temporal patterns with deep neural networks.” The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. ACM, 2018.
Lim, Bryan, Sercan O. Arik, Nicolas Loeff, and Tomas Pfister. “Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting.” International Journal of Forecasting 37.4 (2021): 1748-1764.
Makridakis, Spyros, Evangelos Spiliotis, and Vassilios Assimakopoulos. “The M4 Competition: 100,000 time series and 61 forecasting methods.” International Journal of Forecasting 36.1 (2020): 54-74.
Merity, Stephen, Bryan McCann, and Richard Socher. “Revisiting activation regularization for language RNNs.” arXiv preprint arXiv:1708.01009 (2017).
Müller, Rafael, Simon Kornblith, and Geoffrey E. Hinton. “When does label smoothing help?” Advances in Neural Information Processing Systems. 2019.
Oord, Aaron van den, et al. “Wavenet: A generative model for raw audio.” arXiv preprint arXiv:1609.03499 (2016).
Paine, Tom Le, et al. “Fast wavenet generation algorithm.” arXiv preprint arXiv:1611.09482 (2016).
Rangapuram, Syama Sundar, et al. “Deep state space models for time series forecasting.” Advances in Neural Information Processing Systems. 2018.
Salinas, David, Valentin Flunkert, and Jan Gasthaus. “DeepAR: Probabilistic forecasting with autoregressive recurrent networks.” arXiv preprint arXiv:1704.04110 (2017).
Shchur, Oleksandr, et al. “Intensity-free Learning of Temporal Point Processes.” International Conference on Learning Representations. 2020.
Turkmen, Caner, et al. “Intermittent Demand Forecasting with Deep Renewal Processes.” Learning with Temporal Point Processes Workshop, NeurIPS. 2019.
Wen, Ruofeng, et al. “A multi-horizon quantile recurrent forecaster.” arXiv preprint arXiv:1711.11053 (2017).
Yu, Hsiang-Fu, Nikhil Rao, and Inderjit S. Dhillon. “High-dimensional time series prediction with missing values.” arXiv preprint arXiv:1509.08333 (2015).