
An update on statistical boosting in biomedicine. (English) Zbl 1397.92018

Summary: Statistical boosting algorithms have attracted considerable research interest over the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering practical advantages such as automated variable selection and implicit regularization of effect estimates. They are also highly flexible: the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (the target function to be optimized, which defines the type of regression setting). In this review article, we highlight the most recent methodological developments in statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview of relevant applications of statistical boosting in biomedicine.
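The review itself contains no code, but the mechanism it describes (fitting candidate base-learners to the negative gradient of a loss function and updating only the best-fitting component, which yields automatic variable selection and shrinkage) can be sketched in a few lines. The following is a minimal illustrative Python sketch, not the authors' implementation: the function name `componentwise_l2_boost` and the tuning values `m_stop` and `nu` are hypothetical, the base-learners are simple per-column least-squares fits, and the loss is squared error (so the negative gradient is just the residual vector).

```python
import numpy as np

def componentwise_l2_boost(X, y, m_stop=100, nu=0.1):
    """Illustrative component-wise gradient boosting with linear
    base-learners and squared-error (L2) loss."""
    n, p = X.shape
    coef = np.zeros(p)
    offset = y.mean()            # initialize with the empirical mean
    fit = np.full(n, offset)
    for _ in range(m_stop):
        residuals = y - fit      # negative gradient of the L2 loss
        # fit every candidate base-learner: simple least squares per column
        betas = X.T @ residuals / (X ** 2).sum(axis=0)
        rss = ((residuals[:, None] - X * betas) ** 2).sum(axis=0)
        j = rss.argmin()         # update only the best-fitting component
        coef[j] += nu * betas[j]
        fit += nu * betas[j] * X[:, j]
    return offset, coef          # unselected coefficients remain exactly zero
```

Because each of the `m_stop` iterations updates a single coefficient by a small step `nu`, stopping the algorithm early leaves the coefficients of uninformative variables at zero, which is the implicit variable selection and regularization the summary refers to; mature implementations of this idea exist, e.g. the R package mboost discussed in the reviewed paper.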

MSC:

92B15 General biostatistics
62P10 Applications of statistics to biology and medical sciences; meta analysis
62J05 Linear regression; mixed models
68W25 Approximation algorithms
