
5. Bibliography

[1]

Maurice S. Bartlett. On the theoretical specification and sampling properties of autocorrelated time-series. Supplement to the Journal of the Royal Statistical Society, 8(1):27–41, 1946.

[2]

John W. Tukey. Exploratory Data Analysis. Addison-Wesley, Reading, MA, 1977.

[3]

David A. Dickey and Wayne A. Fuller. Distribution of the estimators for autoregressive time series with a unit root. Journal of the American Statistical Association, 74(366a):427–431, 1979.

[4]

Robert B. Cleveland, William S. Cleveland, Jean E. McRae, and Irma Terpenning. STL: a seasonal-trend decomposition procedure based on loess. Journal of Official Statistics, 6(1):3–73, 1990.

[5]

George E. P. Box and David A. Pierce. Distribution of residual autocorrelations in autoregressive-integrated moving average time series models. Journal of the American Statistical Association, 65(332):1509–1526, 1970.

[6]

Greta M. Ljung and George E. P. Box. On a measure of lack of fit in time series models. Biometrika, 65(2):297–303, 1978.

[7]

Hirotugu Akaike. A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19(6):716–723, 1974.

[8]

Gideon Schwarz. Estimating the dimension of a model. The Annals of Statistics, 6(2):461–464, 1978.

[9]

Clifford M. Hurvich and Chih-Ling Tsai. Regression and time series model selection in small samples. Biometrika, 76(2):297–307, 1989.

[10]

D. C. Montgomery, L. A. Johnson, and J. S. Gardiner. Forecasting and Time Series Analysis. Industrial Engineering Series. McGraw-Hill, 1990. ISBN 9780070428584. URL: https://books.google.com.co/books?id=t9HuAAAAMAAJ.

[11]

Robert Goodell Brown. Smoothing, Forecasting and Prediction of Discrete Time Series. Courier Corporation, 2004.

[12]

B. Abraham and J. Ledolter. Statistical Methods for Forecasting. Wiley Series in Probability and Statistics. Wiley, 2009. ISBN 9780470317297. URL: https://books.google.com.co/books?id=WIPxdb2P8sAC.

[13]

D. W. Trigg and A. G. Leach. Exponential smoothing with an adaptive response rate. Journal of the Operational Research Society, 18:53–59, 1967.

[14]

Charles C. Holt. Forecasting seasonals and trends by exponentially weighted moving averages. International Journal of Forecasting, 20(1):5–10, 2004.

[15]

Peter R. Winters. Forecasting sales by exponentially weighted moving averages. Management Science, 6(3):324–342, 1960.

[16]

R. Hyndman, A. B. Koehler, J. K. Ord, and R. D. Snyder. Forecasting with Exponential Smoothing: The State Space Approach. Springer Series in Statistics. Springer Berlin Heidelberg, 2008. ISBN 9783540719182. URL: https://books.google.com.co/books?id=GSyzox8Lu9YC.

[17]

Herman Wold. A study in the analysis of stationary time series. PhD thesis, Almqvist & Wiksell, 1938.

[18]

George Udny Yule. On a method of investigating periodicities in disturbed series with special reference to Wolfer's sunspot numbers. In Statistical Papers of George Udny Yule, pages 389–420. 1971.

[19]

Søren Bisgaard and Murat Kulahci. Time Series Analysis and Forecasting by Example. John Wiley & Sons, 2011.

[20]

P. J. Brockwell and R. A. Davis. Time Series: Theory and Methods. Springer Series in Statistics. Springer New York, 1991. ISBN 9780387974293. URL: https://books.google.com.co/books?id=ZW_ThhYQiXIC.

[21]

Casimir Michael Stralkowski. Lower order autoregressive-moving average stochastic models and their use for the characterization of abrasive cutting tools. PhD thesis, University of Wisconsin, 1968.

[22]

George Udny Yule. VII. On a method of investigating periodicities in disturbed series, with special reference to Wolfer's sunspot numbers. Philosophical Transactions of the Royal Society of London. Series A, Containing Papers of a Mathematical or Physical Character, 226(636–646):267–298, 1927.

[23]

Gilbert Thomas Walker. On periodicity in series of related terms. Proceedings of the Royal Society of London. Series A, Containing Papers of a Mathematical and Physical Character, 131(818):518–532, 1931.

[24]

Maurice H. Quenouille. Approximate tests of correlation in time-series 3. In Mathematical Proceedings of the Cambridge Philosophical Society, volume 45, pages 483–484. Cambridge University Press, 1949.

[25]

G. M. Jenkins and others. Tests of hypotheses in the linear autoregressive model. 1956.

[26]

H. E. Daniels. The approximate distribution of serial correlation coefficients. Biometrika, 43(1/2):169–185, 1956.

[27]

George E. P. Box, Gwilym M. Jenkins, Gregory C. Reinsel, and Greta M. Ljung. Time Series Analysis: Forecasting and Control. John Wiley & Sons, 2015.

[28]

William W. S. Wei. Time Series Analysis: Univariate and Multivariate Methods. 2nd edition. Pearson Addison Wesley, USA, 2006. Chapter 10, pages 212–235.

[29]

George C. Tiao and George E. P. Box. Modeling multiple time series with applications. Journal of the American Statistical Association, 76(376):802–816, 1981.

[30]

Ruey S. Tsay and George C. Tiao. Consistent estimates of autoregressive parameters and extended sample autocorrelation function for stationary and nonstationary ARMA models. Journal of the American Statistical Association, 79(385):84–96, 1984.

[31]

Johannes Ledolter and Bovas Abraham. Some comments on the initialization of exponential smoothing. Journal of Forecasting, 3(1):79–84, 1984.

[32]

S. Theodoridis. Machine Learning: A Bayesian and Optimization Perspective. Elsevier Science, 2020. ISBN 9780128188040. URL: https://books.google.com.co/books?id=l-nEDwAAQBAJ.

[33]

J. Brownlee. Deep Learning with Python: Develop Deep Learning Models on Theano and TensorFlow Using Keras. Machine Learning Mastery, 2017. URL: https://books.google.com.co/books?id=eJw2nQAACAAJ.

[34]

David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 323(6088):533–536, 1986.

[35]

Arthur E. Bryson Jr., Walter F. Denham, and Stewart E. Dreyfus. Optimal programming problems with inequality constraints. AIAA Journal, 1(11):2544–2550, 1963.

[36]

Razvan Pascanu, Caglar Gulcehre, Kyunghyun Cho, and Yoshua Bengio. How to construct deep recurrent neural networks. arXiv preprint arXiv:1312.6026, 2013.

[37]

Alex Graves, Abdel-rahman Mohamed, and Geoffrey Hinton. Speech recognition with deep recurrent neural networks. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, pages 6645–6649. IEEE, 2013.

[38]

Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997.

[39]

Klaus Greff, Rupesh K. Srivastava, Jan Koutník, Bas R. Steunebrink, and Jürgen Schmidhuber. LSTM: a search space odyssey. IEEE Transactions on Neural Networks and Learning Systems, 28(10):2222–2232, 2016.

[40]

Rafal Jozefowicz, Wojciech Zaremba, and Ilya Sutskever. An empirical exploration of recurrent network architectures. In International Conference on Machine Learning, pages 2342–2350. PMLR, 2015.

[41]

Ilya Sutskever, James Martens, and Geoffrey E. Hinton. Generating text with recurrent neural networks. In Proceedings of the 28th International Conference on Machine Learning (ICML-11), pages 1017–1024, 2011.

[42]

Shujie Liu, Nan Yang, Mu Li, and Ming Zhou. A recursive recurrent neural network for statistical machine translation. In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1491–1500, 2014.

[43]

Alex Graves and Navdeep Jaitly. Towards end-to-end speech recognition with recurrent neural networks. In International Conference on Machine Learning, pages 1764–1772. PMLR, 2014.

[44]

Andrej Karpathy and Li Fei-Fei. Deep visual-semantic alignments for generating image descriptions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 3128–3137, 2015.

[45]

Youngjoo Seo, Manuel Morante, Yannis Kopsinis, and Sergios Theodoridis. Unsupervised pre-training of the brain connectivity dynamic using residual D-net. In Neural Information Processing: 26th International Conference, ICONIP 2019, Sydney, NSW, Australia, December 12–15, 2019, Proceedings, Part III, pages 608–620. Springer, 2019.

[46]

Tiago E. Pratas, Filipe R. Ramos, and Lihki Rubio. Forecasting Bitcoin volatility: exploring the potential of deep learning. Eurasian Economic Review, pages 1–21, 2023.