
Provides partial least squares (PLS) regression for linear, generalized linear, and Cox models applied to big data. Missing data in the explanatory variables are allowed. The package also offers repeated k-fold cross-validation of such models using various criteria, as well as bootstrap confidence interval construction.

References

Maumy, M., Bertrand, F. (2023). PLS models and their extension for big data. Joint Statistical Meetings (JSM 2023), Toronto, ON, Canada.

Maumy, M., Bertrand, F. (2023). bigPLS: Fitting and cross-validating PLS-based Cox models to censored big data. BioC2023 — The Bioconductor Annual Conference, Dana-Farber Cancer Institute, Boston, MA, USA. Poster. https://doi.org/10.7490/f1000research.1119546.1

Bastien, P., Bertrand, F., Meyer, N., and Maumy-Bertrand, M. (2015). Deviance residuals-based sparse PLS and sparse kernel PLS for binary classification and survival analysis. BMC Bioinformatics, 16, 211.

Author

Maintainer: Frederic Bertrand <frederic.bertrand@lecnam.net>


Examples

set.seed(314)
library(bigPLScox)
data(sim_data)
head(sim_data)
#>                    status         X1         X2         X3        X4         X5
#> 0.0013236229370777      1  0.5448667 -0.9205711  1.1017160 1.3558567  1.4346174
#> 0.193665925040523       1 -0.5641483  0.2733279  0.9731780 1.1232252  0.2652977
#> 0.0167866701431944      1  1.4921118  0.2598002 -1.5436997 0.1165158  1.2208183
#> 0.0584127055299712      1 -0.6430141 -0.9807448 -1.2294945 0.8006227  1.5492078
#> 0.732960708716205       1  0.1876928 -1.2571263  0.9016827 1.3562191 -1.6809553
#> 0.508483386474255       0 -0.6141516 -0.8162560  0.2633415 0.4188066  0.2791399
#>                            X6         X7         X8          X9        X10
#> 0.0013236229370777 -0.8727406  1.5161252  0.7801527 -0.53617252 -0.6990319
#> 0.193665925040523   1.5046047  0.9096495 -1.2200395 -1.57280359  0.8347194
#> 0.0167866701431944 -0.6451659  1.2515692  0.5867273 -0.20080821  0.7492891
#> 0.0584127055299712  1.2557210  0.6188920  0.7123894 -0.67379538 -1.2377412
#> 0.732960708716205   0.7304366 -1.1223302  0.9633307  0.14016470 -0.9996676
#> 0.508483386474255  -0.0538974 -0.1410697 -0.8637916  0.01669784  1.5589135
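A minimal follow-up sketch is given below. It assumes that the survival times are stored in the row names of sim_data (as the printed output above suggests) and that the package exposes a PLS-based Cox fitting function such as coxgpls(); that function name and its arguments are assumptions about the interface, so consult the package reference index for the exact signature.

# Split the simulated data into its pieces; the row names appear to hold
# the survival times, with "status" as the event indicator.
sim_df <- as.data.frame(sim_data)
X <- as.matrix(sim_df[, grep("^X", names(sim_df))])
time <- as.numeric(rownames(sim_df))
status <- sim_df$status
# Hypothetical fit of a PLS-based Cox model (function name and arguments
# are assumptions, not the documented API; not run):
# fit <- coxgpls(X, time, event = status, ncomp = 3)
# summary(fit)

The repeated cross-validation and bootstrap confidence intervals mentioned in the description follow the same pattern; see the corresponding helpers listed in the package reference index for their exact interfaces.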