References

Amatriain, Xavier. 2015. “In Machine Learning, What Is Better: More Data or Better Algorithms?” http://www.kdnuggets.com/2015/06/machine-learning-more-data-better-algorithms.html.

Caban, Jesus J., Ulas Bagci, Alem Mehari, Shoaib Alam, Joseph R. Fontana, Gregory J. Kato, and Daniel J. Mollura. 2012. “Characterizing Non-Linear Dependencies Among Pairs of Clinical Variables and Imaging Data.” Conf Proc IEEE Eng Med Biol Soc 2012 (August): 2700–2703. doi:10.1109/EMBC.2012.6346521.

Fernández-Delgado, Manuel, Eva Cernadas, Senén Barro, and Dinani Amorim. 2014. “Do We Need Hundreds of Classifiers to Solve Real World Classification Problems?” Journal of Machine Learning Research 15: 3133–81. http://jmlr.csail.mit.edu/papers/volume15/delgado14a/delgado14a.pdf.

Fortmann-Roe, Scott. 2012. “Understanding the Bias-Variance Tradeoff.” http://scott.fortmann-roe.com/docs/BiasVariance.html.

Engineering Statistics Handbook (NIST/SEMATECH). 2013. “Measures of Skewness and Kurtosis.” http://www.itl.nist.gov/div898/handbook/eda/section3/eda35b.htm.

Hyndman, Rob J. 2010. “Why Every Statistician Should Know About Cross-Validation.” https://robjhyndman.com/hyndsight/crossvalidation/.

———. 2017. “ARIMA Modelling in R.” https://www.otexts.org/fpp/8/.

Izbicki, Mike. 2011. “Converting Images into Time Series for Data Mining.” https://izbicki.me/blog/converting-images-into-time-series-for-data-mining.html.

Kuhn, Max. 2017. “Recursive Feature Elimination in R Package Caret.” https://topepo.github.io/caret/recursive-feature-elimination.html.

McNeese, Bill. 2016. “Are the Skewness and Kurtosis Useful Statistics?” https://www.spcforexcel.com/knowledge/basic-statistics/are-skewness-and-kurtosis-useful-statistics.

Raschka, Sebastian. 2017. “Machine Learning FAQ.” http://sebastianraschka.com/faq/docs/evaluate-a-model.html.

Reshef, David N., Yakir A. Reshef, Hilary K. Finucane, Sharon R. Grossman, Gilean McVean, Peter J. Turnbaugh, Eric S. Lander, Michael Mitzenmacher, and Pardis C. Sabeti. 2011. “Detecting Novel Associations in Large Data Sets.” Science 334 (6062): 1518–24. doi:10.1126/science.1205438.

stackoverflow.com. 2017. “What Is Entropy and Information Gain?” http://stackoverflow.com/questions/1859554/what-is-entropy-and-information-gain.

stats.stackexchange.com. 2015. “Gradient Boosting Machine vs. Random Forest.” https://stats.stackexchange.com/questions/173390/gradient-boosting-tree-vs-random-forest.

———. 2017a. “How to Interpret Mean Decrease in Accuracy and Mean Decrease Gini in Random Forest Models.” http://stats.stackexchange.com/questions/197827/how-to-interpret-mean-decrease-in-accuracy-and-mean-decrease-gini-in-random-fore.

———. 2017b. “Percentile vs. Quantile vs. Quartile.” https://stats.stackexchange.com/questions/156778/percentile-vs-quantile-vs-quartile.

Wikipedia. 2017a. “Distribution of Wealth.” https://en.wikipedia.org/wiki/Distribution_of_wealth.

———. 2017b. “Monotonic Function.” https://en.wikipedia.org/wiki/Monotonic_function.

———. 2017d. “White Noise - Time Series Analysis and Regression.” https://en.wikipedia.org/wiki/White_noise.