
References

The following sources are referenced in Predictive and Specialized Modeling.

Agrawal, R., and Srikant, R. (1994). “Fast Algorithms for Mining Association Rules.” In Proceedings of the 20th VLDB Conference. Santiago, Chile: IBM Almaden Research Center.

Bates, D. M., and Watts, D. G. (1988). Nonlinear Regression Analysis and Its Applications. New York: John Wiley & Sons.

Benford, F. (1938). “The law of anomalous numbers.” Proceedings of the American Philosophical Society 78:551–572.

Benjamini, Y., and Hochberg, Y. (1995). “Controlling the False Discovery Rate: A Practical and Powerful Approach to Multiple Testing.” Journal of the Royal Statistical Society, Series B 57:289–300.

Benjamini, Y., et al. (2005). “False discovery rate: adjusted multiple confidence intervals for selected parameters.” Journal of the American Statistical Association 100:71–93.

Bloomfield, P. (2004). Fourier Analysis of Time Series: An Introduction. John Wiley & Sons.

Box, G. E. P., Jenkins, G. M., and Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control. 3rd ed. Englewood Cliffs, NJ: Prentice-Hall.

Candes, E. J., Li, X., Ma, Y., and Wright, J. (2009). “Robust Principal Component Analysis?” Journal of the ACM 58:1–37.

Cleveland, W. S. (1994). Visualizing Data. Summit, NJ: Hobart Press.

Conover, W. J. (1999). Practical Nonparametric Statistics. 3rd ed. New York: John Wiley & Sons.

Cureton, E. E. (1967). “The Normal Approximation to the Signed-Rank Sampling Distribution when Zero Differences are Present.” Journal of the American Statistical Association 62:1068–1069.

Donoho, D. L. (1995). “De-noising by soft-thresholding.” IEEE Transactions on Information Theory 41:613–627.

Du, P., Kibbe, W. A., and Lin, S. M. (2006). “Improved peak detection in mass spectrum by incorporating continuous wavelet transform-based pattern matching.” Bioinformatics 22(17):2059–2065.

Efron, B. (1981). “Nonparametric standard errors and confidence intervals.” Canadian Journal of Statistics 9:139–172.

Geladi, P., MacDougall, D., and Martens, H. (1985). “Linearization and scatter-correction for near-infrared reflectance spectra of meat.” Applied Spectroscopy 39:491–500.

Hahsler, M. (2015). “A Probabilistic Comparison of Commonly Used Interest Measures for Association Rules.” https://mhahsler.github.io/arules/docs/measures.

Haldane, J. B. (1956). “The estimation and significance of the logarithm of a ratio of frequencies.” Annals of Human Genetics 20:309–311.

Hand, D. J., Mannila, H., and Smyth, P. (2001). Principles of Data Mining. Cambridge, MA: MIT Press.

Hastie, T. J., Tibshirani, R. J., and Friedman, J. H. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. 2nd ed. New York: Springer-Verlag.

Hawkins, D. M., and Kass, G. V. (1982). “Automatic Interaction Detection.” In Topics in Applied Multivariate Analysis, edited by D. M. Hawkins, 267–300. Cambridge: Cambridge University Press.

Hirschberg, J., and Lye, J. (2010). “A geometric comparison of the delta and Fieller confidence intervals.” The American Statistician 64(3): 234–241.

Hoffelder, T. (2019). “Equivalence analyses of dissolution profiles with the Mahalanobis distance.” Biometrical Journal 61(5): 1120–1137.

Huber, P. J., and Ronchetti, E. M. (2009). Robust Statistics. 2nd ed. New York: John Wiley & Sons.

Hyndman, R. J., Koehler, A. B., Ord, J. K., and Snyder, R. D. (2008). Forecasting with Exponential Smoothing: The State Space Approach. Berlin: Springer-Verlag.

Jolliffe, I. T. (2002). Principal Component Analysis. New York: Springer-Verlag.

Kass, G. V. (1980). “An Exploratory Technique for Investigating Large Quantities of Categorical Data.” Journal of the Royal Statistical Society, Series C 29:119–127.

Lee, M., Shen, H., Huang, J. Z., and Marron, J. S. (2010). “Biclustering via sparse singular value decomposition.” Biometrics 66(4): 1087–1095.

Lehmann, E. L. (2006). Nonparametrics: Statistical Methods Based on Ranks. 2nd ed. New York: Springer.

Lin, Z., Chen, M., and Ma, Y. (2013). “The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices.” arXiv preprint arXiv:1009.5055.

Mason, R. L., and Young, J. C. (2002). Multivariate Statistical Process Control with Industrial Applications. Philadelphia: SIAM.

McCullagh, P., and Nelder, J. A. (1989). Generalized Linear Models. 2nd ed. London: Chapman & Hall.

Nagelkerke, N. J. D. (1991). “A Note on a General Definition of the Coefficient of Determination.” Biometrika 78:691–692.

Nason, G. P. (2008). Wavelet Methods in Statistics with R. New York: Springer.

Nelder, J. A., and Wedderburn, R. W. M. (1972). “Generalized Linear Models.” Journal of the Royal Statistical Society, Series A 135:370–384.

Paixão, P., Gouveia, L. F., Silva, N., and Morais, J. A. (2017). “Evaluation of dissolution profile similarity–Comparison between the f2, the multivariate statistical distance and the f2 bootstrapping methods.” European Journal of Pharmaceutics and Biopharmaceutics 112:67–74.

Parker, R. J. (2015). Efficient Computational Methods for Large Spatial Data. Ph.D. diss., Department of Statistics, North Carolina State University. https://repository.lib.ncsu.edu/ir/bitstream/1840.16/10572/1/etd.pdf.

Platt, J. (1998). Sequential minimal optimization: A fast algorithm for training support vector machines. Technical Report MSR-TR-98-14, Microsoft Research. https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf.

Qian, P. Z., Wu, H., and Wu, C. F. (2008). “Gaussian process models for computer experiments with qualitative and quantitative factors.” Technometrics 50:383–396.

Ramsay, J. O., and Silverman, B. W. (2005). Functional Data Analysis. 2nd ed. New York: Springer.

Ratkowsky, D. A. (1990). Handbook of Nonlinear Regression Models. New York: Marcel Dekker.

Sall, J. (2002). “Monte Carlo Calibration of Distributions of Partition Statistics.” SAS Institute Inc., Cary, NC. https://www.jmp.com/content/dam/jmp/documents/en/white-papers/montecarlocal.pdf.

Santner, T., Williams, B., and Notz, W. (2003). The Design and Analysis of Computer Experiments. New York: Springer-Verlag.

SAS Institute Inc. (2023). SAS/ETS® User’s Guide. Cary, NC: SAS Institute Inc. https://go.documentation.sas.com/api/collections/pgmsascdc/v_044/docsets/etsug/content/etsug.pdf.

Savitzky, A., and Golay, M. J. (1964). “Smoothing and differentiation of data by simplified least squares procedures.” Analytical Chemistry 36:1627–1639.

Schäfer, J., and Strimmer, K. (2005). “A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics.” Statistical Applications in Genetics and Molecular Biology 4, Article 32.

Schuirmann, D. J. (1987). “A Comparison of the Two One-sided Tests Procedure and the Power Approach for Assessing the Equivalence of Average Bioavailability.” Journal of Pharmacokinetics and Biopharmaceutics 15:657–680.

Shiskin, J., Young, A. H., and Musgrave, J. C. (1967). The X-11 Variant of the Census Method II Seasonal Adjustment Program. Technical Report 15, US Department of Commerce, Bureau of the Census.

Shmueli, G., Patel, N. R., and Bruce, P. C. (2010). Data Mining for Business Intelligence: Concepts, Techniques, and Applications in Microsoft Office Excel with XLMiner. 2nd ed. Hoboken, NJ: John Wiley & Sons.

Shmueli, G., Bruce, P. C., Stephens, M. L., and Patel, N. R. (2017). Data Mining for Business Analytics: Concepts, Techniques, and Applications with JMP Pro. Hoboken, NJ: John Wiley & Sons.

Smyth, G. K. (2004). “Linear models and empirical Bayes methods for assessing differential expression in microarray experiments.” Statistical Applications in Genetics and Molecular Biology 3(1).

Tippey, K. G., and Longnecker, M. T. (2016). “An ad hoc method for computing pseudo-effect size of mixed models.” Proceedings of the South Central SAS Users Group Forum.

Westfall, P. H., Tobias, R. D., and Wolfinger, R. D. (2011). Multiple Comparisons and Multiple Tests Using SAS. 2nd ed. Cary, NC: SAS Institute Inc.
