This function provides a predict method for the class "plsRcoxmodel".
An object of the class "plsRcoxmodel".
An optional data frame in which to look for variables with which to predict. If omitted, the fitted values are used.
A single value giving the number of components to use for prediction.
Type of predicted value. Choices are the linear predictor ("lp"), the risk score exp(lp) ("risk"), the expected number of events given the covariates and follow-up time ("expected"), the terms of the linear predictor ("terms") or the scores ("scores").
If TRUE, pointwise standard errors are produced for the predictions using the Cox model.
Vector of case weights. If weights is a vector of integers, then the estimated coefficients are equivalent to estimating the model from data with the individual cases replicated as many times as indicated by weights.
Selects the way of predicting the response or the scores of the new data. For complete rows, without any missing value, there are two different ways of computing the prediction. As a consequence, for mixed datasets, with complete and incomplete rows, there are two ways of computing the prediction: either predict every row as if it contained missing values (missingdata) or select the prediction method according to the completeness of each row (adaptative).
Should some details be displayed?
Arguments to be passed on to survival::coxph and to plsRglm::PLS_lm.
When type is "lp", "risk", "expected" or "terms", the predicted values are returned (as a vector, or as a matrix for "terms"). When type is "scores", a score matrix is returned.
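A minimal sketch of the call pattern, assuming a fitted plsRcoxmodel object modpls as built in the Examples below (shapes match the output shown there):
lp  <- predict(modpls)                    # linear predictor, the default (type = "lp")
rsk <- predict(modpls, type = "risk")     # risk score exp(lp)
sc  <- predict(modpls, type = "scores")   # matrix with one column per extracted component
pse <- predict(modpls, se.fit = TRUE)     # list with components $fit and $se.fit
dim(sc)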
Frederic Bertrand, Philippe Bastien, Nicolas Meyer and Myriam Maumy-Bertrand (2014). plsRcox, Cox-Models in a high dimensional setting in R. Proceedings of useR! 2014, Los Angeles, page 152.
Philippe Bastien, Frederic Bertrand, Nicolas Meyer and Myriam Maumy-Bertrand (2015). Deviance residuals-based sparse PLS and sparse kernel PLS regression for censored data. Bioinformatics, 31(3):397-404, doi:10.1093/bioinformatics/btu660.
data(micro.censure)
data(Xmicro.censure_compl_imp)
X_train_micro <- apply((as.matrix(Xmicro.censure_compl_imp)),FUN="as.numeric",MARGIN=2)[1:80,]
Y_train_micro <- micro.censure$survyear[1:80]
C_train_micro <- micro.censure$DC[1:80]
modpls <- plsRcox(X_train_micro,time=Y_train_micro,event=C_train_micro,nt=3)
#> ____************************************************____
#> ____Component____ 1 ____
#> ____Component____ 2 ____
#> ____Component____ 3 ____
#> ____Predicting X without NA neither in X nor in Y____
#> ****________________________________________________****
#>
predict(modpls)
#> 1 2 3 4 5 6
#> -3.91149897 -0.01103427 2.29488351 0.77390739 1.81792717 -2.07323031
#> 7 8 9 10 11 12
#> -2.31730512 4.07156130 -1.81103896 1.32578127 2.48317798 -1.91438957
#> 13 14 15 16 17 18
#> 0.69486087 -1.40225489 -0.70950316 -0.24670449 -3.67870398 -4.16645669
#> 19 20 21 22 23 24
#> 1.81768091 -3.02734320 -2.13745459 -1.92831286 0.98589347 1.14971022
#> 25 26 27 28 29 30
#> -5.89616340 -2.34994661 4.42353417 0.80061019 -0.51396920 -3.08443396
#> 31 32 33 34 35 36
#> 3.56453166 -3.55117411 -2.73950599 0.22916899 2.63357746 0.08206085
#> 37 38 39 40 41 42
#> 5.74551756 -0.15211159 -3.34205848 0.50932362 1.49286943 1.00697968
#> 43 44 45 46 47 48
#> -1.69053571 -0.43807672 -1.88100209 0.57105812 -1.94086294 0.13737583
#> 49 50 51 52 53 54
#> 2.91478046 -0.19794750 -0.53317032 1.81309897 0.05410920 6.74653169
#> 55 56 57 58 59 60
#> 6.02372222 -3.15552724 -5.38533967 1.93977635 0.36991240 -1.30359496
#> 61 62 63 64 65 66
#> 3.40800015 2.42212390 6.44149568 -1.36663839 3.82119344 0.41050691
#> 67 68 69 70 71 72
#> -5.59789050 -1.73207956 3.64545451 -2.33472431 1.90409918 2.74143690
#> 73 74 75 76 77 78
#> 2.27208915 -1.50499888 1.61558478 2.36442373 -1.52350702 -1.62287584
#> 79 80
#> -1.11760516 -5.22936005
# The default call above, predict(modpls), is identical to predict(modpls,type="lp")
predict(modpls,type="risk")
#> 1 2 3 4 5 6
#> 2.001048e-02 9.890264e-01 9.923280e+00 2.168222e+00 6.159078e+00 1.257788e-01
#> 7 8 9 10 11 12
#> 9.853878e-02 5.864846e+01 1.634842e-01 3.765126e+00 1.197927e+01 1.474318e-01
#> 13 14 15 16 17 18
#> 2.003430e+00 2.460415e-01 4.918885e-01 7.813716e-01 2.525569e-02 1.550711e-02
#> 19 20 21 22 23 24
#> 6.157562e+00 4.844417e-02 1.179547e-01 1.453933e-01 2.680205e+00 3.157278e+00
#> 25 26 27 28 29 30
#> 2.749975e-03 9.537425e-02 8.339048e+01 2.226899e+00 5.981168e-01 4.575593e-02
#> 31 32 33 34 35 36
#> 3.532291e+01 2.869093e-02 6.460225e-02 1.257555e+00 1.392349e+01 1.085522e+00
#> 37 38 39 40 41 42
#> 3.127855e+02 8.588924e-01 3.536409e-02 1.664165e+00 4.449846e+00 2.737321e+00
#> 43 44 45 46 47 48
#> 1.844207e-01 6.452763e-01 1.524373e-01 1.770139e+00 1.435800e-01 1.147259e+00
#> 49 50 51 52 53 54
#> 1.844476e+01 8.204129e-01 5.867419e-01 6.129413e+00 1.055600e+00 8.511017e+02
#> 55 56 57 58 59 60
#> 4.131134e+02 4.261593e-02 4.583283e-03 6.957195e+00 1.447608e+00 2.715538e-01
#> 61 62 63 64 65 66
#> 3.020478e+01 1.126977e+01 6.273444e+02 2.549626e-01 4.565867e+01 1.507582e+00
#> 67 68 69 70 71 72
#> 3.705673e-03 1.769161e-01 3.830018e+01 9.683718e-02 6.713357e+00 1.550925e+01
#> 73 74 75 76 77 78
#> 9.699644e+00 2.220175e-01 5.030829e+00 1.063791e+01 2.179462e-01 1.973304e-01
#> 79 80
#> 3.270621e-01 5.356952e-03
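# Sketch (not in the original example): per the 'type' description, the risk
# score should equal exp(lp); this can be checked directly.
all.equal(predict(modpls, type = "risk"), exp(predict(modpls)))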
predict(modpls,type="expected")
#> [1] 2.883280e-02 1.380608e-02 8.122308e-03 3.764026e-03 7.153829e-01
#> [6] 3.147782e-04 2.466064e-04 4.834946e-01 2.282118e-03 1.405932e-03
#> [11] 1.524722e-02 1.206745e-04 5.013850e-03 6.175850e-02 1.836756e-04
#> [16] 2.917712e-04 3.525507e-04 3.880860e-05 7.152067e-01 1.215990e-02
#> [21] 4.772813e-04 1.850566e-04 6.707570e-03 3.667208e-01 6.902670e-04
#> [26] 3.561361e-05 3.374238e-01 9.010726e-03 1.496867e-03 3.745170e-05
#> [31] 2.891217e-02 2.365264e-04 9.017997e-04 4.542828e-02 2.807208e-01
#> [36] 1.260844e-01 1.265627e+00 1.198950e-02 1.320527e-05 6.011681e-02
#> [41] 1.661613e-03 2.256632e-02 3.201537e-04 7.494946e-02 5.692145e-05
#> [46] 1.501472e-01 3.603979e-02 4.642167e-03 4.616046e-02 1.182121e+00
#> [51] 2.672665e-02 1.538534e+00 1.832518e-03 1.477510e+00 1.542603e-01
#> [56] 5.948868e-04 1.150443e-03 2.597879e-03 1.193400e-02 3.154123e-02
#> [61] 3.844464e-02 1.573176e-01 1.570013e+00 6.380777e-04 2.079795e+00
#> [66] 5.629446e-04 1.499430e-05 1.458487e-03 1.032203e+00 1.124773e-02
#> [71] 7.797629e-01 9.656531e-01 3.924777e-02 1.830300e-03 1.259032e-02
#> [76] 9.023316e-01 2.531466e-02 7.984600e-04 4.565545e-03 2.167589e-05
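# Sketch (not in the original example): type = "expected" gives the expected
# number of events given covariates and follow-up time, so observed status
# minus this quantity behaves like a martingale residual.
head(C_train_micro - predict(modpls, type = "expected"))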
predict(modpls,type="terms")
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.48886332 -0.235532283
#> 2 -0.81612842 -0.15285272 0.957946868
#> 3 1.98230440 0.21817877 0.094400341
#> 4 -0.03763442 0.71295698 0.098584833
#> 5 2.76422524 -0.99036777 0.044069693
#> 6 -0.67244092 -1.10346519 -0.297324191
#> 7 -2.58500221 0.75832999 -0.490632893
#> 8 4.44407584 0.24616686 -0.618681405
#> 9 -0.22541827 -0.58024569 -1.005375007
#> 10 -0.84227497 0.28025877 1.887797465
#> 11 2.04910586 0.79213735 -0.358065231
#> 12 -2.69538483 0.85828666 -0.077291400
#> 13 1.92380082 -1.33725629 0.108316345
#> 14 -1.86302758 0.61053657 -0.149763886
#> 15 0.64929894 -0.66908891 -0.689713189
#> 16 1.81629288 -2.07233767 0.009340293
#> 17 -2.48051171 -0.85356060 -0.344631671
#> 18 -2.62201033 -1.05144670 -0.492999659
#> 19 1.09917459 1.95539108 -1.236884761
#> 20 -2.72786696 0.70395989 -1.003436130
#> 21 -2.31982425 -1.16242677 1.344796432
#> 22 -0.33710697 -1.77249772 0.181291831
#> 23 0.83670151 -0.49844873 0.647640690
#> 24 0.92709769 0.37666249 -0.154049958
#> 25 -4.17839003 -1.00436509 -0.713408282
#> 26 -3.23849773 0.82991867 0.058632456
#> 27 3.76915233 0.10875940 0.545622446
#> 28 2.77597103 -1.04890130 -0.926459549
#> 29 -1.62093845 1.14441654 -0.037447280
#> 30 -3.52772907 0.02533532 0.417959794
#> 31 2.05389237 0.24889469 1.261744606
#> 32 -4.12827545 -0.66279231 1.239893653
#> 33 -2.77012808 -0.46454837 0.495170455
#> 34 0.32516821 1.28520067 -1.381199884
#> 35 3.50256781 -0.94459787 0.075607524
#> 36 -0.07995553 0.65671168 -0.494695292
#> 37 3.03853777 1.97459796 0.732381837
#> 38 -1.45636298 0.93856907 0.365682320
#> 39 -3.53282556 -0.26343210 0.454199186
#> 40 0.74036729 -0.64349416 0.412450488
#> 41 -0.86913640 1.77284506 0.589160765
#> 42 2.68507617 -1.49011174 -0.187984755
#> 43 -3.60834238 0.57507560 1.342731063
#> 44 -1.23093622 -0.58200803 1.374867531
#> 45 -1.59763012 0.84059090 -1.123962874
#> 46 1.10000467 -0.82419326 0.295246713
#> 47 -0.50059170 -0.96816688 -0.472104369
#> 48 1.24805046 -1.02426486 -0.086409764
#> 49 2.96084678 -1.01995965 0.973893338
#> 50 0.06882002 0.22611338 -0.492880907
#> 51 -1.11588680 0.98052575 -0.397809267
#> 52 4.01703405 -2.83545722 0.631522149
#> 53 0.90902159 -0.48356886 -0.371343527
#> 54 2.13579403 2.45286756 2.157870098
#> 55 3.74121422 1.27919360 1.003314392
#> 56 -2.77980196 -0.43857585 0.062850575
#> 57 -4.24248067 -0.52612401 -0.616734987
#> 58 -0.68204492 1.66466779 0.957153476
#> 59 -2.32344267 2.10728040 0.586074673
#> 60 -1.79884966 0.91229467 -0.417039976
#> 61 2.23759637 1.86135124 -0.690947459
#> 62 1.65814317 -0.08633734 0.850318072
#> 63 6.58084023 0.49206731 -0.631411857
#> 64 1.53997918 -2.73887126 -0.167746309
#> 65 3.35162942 0.65176772 -0.182203701
#> 66 -0.02929053 0.92472060 -0.484923160
#> 67 -3.19754704 -1.17547958 -1.224863879
#> 68 0.18287169 -0.71576532 -1.199185933
#> 69 2.37682745 1.57609851 -0.307471453
#> 70 -1.24791377 -0.41580352 -0.671007023
#> 71 -0.48138068 0.95498256 1.430497301
#> 72 2.02397146 2.38359745 -1.666132008
#> 73 2.40114203 -0.50630766 0.377254774
#> 74 -1.80842119 0.21331778 0.090104528
#> 75 2.86058583 -0.78279774 -0.462203310
#> 76 3.06726572 -0.79914281 0.096300823
#> 77 -1.26148410 0.15994260 -0.421965529
#> 78 -0.80170514 -0.09059412 -0.730576581
#> 79 -1.37219178 1.09425077 -0.839664154
#> 80 -3.95053329 -1.58030167 0.301474909
#> attr(,"constant")
#> [1] -3.218013e-16
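# Sketch (not in the original example): summing the per-component terms (plus
# the reported constant, essentially zero here) should recover the linear predictor.
tms <- predict(modpls, type = "terms")
all.equal(unname(rowSums(tms)) + attr(tms, "constant"), unname(predict(modpls)))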
predict(modpls,type="scores")
#> Comp_1 Comp_2 Comp_3
#> 1 -1.51471639 -1.61173906 -0.44174968
#> 2 -0.56522390 -0.16546763 1.79666550
#> 3 1.37287931 0.23618504 0.17705140
#> 4 -0.02606437 0.77179724 0.18489957
#> 5 1.91441216 -1.07210272 0.08265437
#> 6 -0.46571063 -1.19453407 -0.55764274
#> 7 -1.79028814 0.82091488 -0.92020050
#> 8 3.07782184 0.26648299 -1.16036032
#> 9 -0.15611733 -0.62813331 -1.88561876
#> 10 -0.58333215 0.30338850 3.54063538
#> 11 1.41914382 0.85751237 -0.67156485
#> 12 -1.86673554 0.92912097 -0.14496294
#> 13 1.33236164 -1.44761991 0.20315139
#> 14 -1.29027208 0.66092409 -0.28088782
#> 15 0.44968325 -0.72430875 -1.29358311
#> 16 1.25790515 -2.24336748 0.01751807
#> 17 -1.71792143 -0.92400487 -0.64636970
#> 18 -1.81591875 -1.13822249 -0.92463946
#> 19 0.76125243 2.11676931 -2.31982404
#> 20 -1.88923160 0.76205763 -1.88198233
#> 21 -1.60663454 -1.25836174 2.52221645
#> 22 -0.23346928 -1.91878178 0.34001967
#> 23 0.57947215 -0.53958566 1.21467455
#> 24 0.64207759 0.40774841 -0.28892651
#> 25 -2.89381652 -1.08725524 -1.33802415
#> 26 -2.24287780 0.89841177 0.10996738
#> 27 2.61039185 0.11773530 1.02333549
#> 28 1.92254691 -1.13546701 -1.73760984
#> 29 -1.12260905 1.23886512 -0.07023379
#> 30 -2.44318998 0.02742624 0.78389936
#> 31 1.42245880 0.26943594 2.36644962
#> 32 -2.85910880 -0.71749249 2.32546733
#> 33 -1.91850027 -0.50288750 0.92871087
#> 34 0.22520089 1.39126815 -2.59049250
#> 35 2.42576411 -1.02255543 0.14180476
#> 36 -0.05537459 0.71091002 -0.92781969
#> 37 2.10439206 2.13756133 1.37360977
#> 38 -1.00862945 1.01602908 0.68585099
#> 39 -2.44671964 -0.28517313 0.85186771
#> 40 0.51275421 -0.69660167 0.77356645
#> 41 -0.60193549 1.91915778 1.10499325
#> 42 1.85959610 -1.61309051 -0.35257250
#> 43 -2.49902013 0.62253653 2.51834278
#> 44 -0.85250624 -0.63004109 2.57861593
#> 45 -1.10646646 0.90996478 -2.10803478
#> 46 0.76182732 -0.89221385 0.55374635
#> 47 -0.34669346 -1.04806959 -0.88544956
#> 48 0.86435900 -1.10879734 -0.16206477
#> 49 2.05058581 -1.10413682 1.82657371
#> 50 0.04766250 0.24477450 -0.92441674
#> 51 -0.77282677 1.06144844 -0.74610629
#> 52 2.78206663 -3.06946722 1.18444362
#> 53 0.62955868 -0.52347775 -0.69646880
#> 54 1.47918121 2.65530247 4.04716679
#> 55 2.59104283 1.38476532 1.88175400
#> 56 -1.92520009 -0.47477147 0.11787862
#> 57 -2.93820361 -0.56954497 -1.15670974
#> 58 -0.47236205 1.80205266 1.79517746
#> 59 -1.60914054 2.28119403 1.09920516
#> 60 -1.24582455 0.98758626 -0.78217421
#> 61 1.54968619 2.01496837 -1.29589803
#> 62 1.14837582 -0.09346275 1.59480363
#> 63 4.55767509 0.53267757 -1.18423676
#> 64 1.06653930 -2.96491002 -0.31461453
#> 65 2.32122911 0.70555804 -0.34172992
#> 66 -0.02028567 1.00103770 -0.90949169
#> 67 -2.21451669 -1.27249179 -2.29727842
#> 68 0.12665096 -0.77483736 -2.24911847
#> 69 1.64611310 1.70617376 -0.57667431
#> 70 -0.86426433 -0.45011974 -1.25849899
#> 71 -0.33338854 1.03379717 2.68295167
#> 72 1.40173655 2.58031550 -3.12489345
#> 73 1.66295258 -0.54809318 0.70755556
#> 74 -1.25245347 0.23092288 0.16899444
#> 75 1.98114835 -0.84740196 -0.86687975
#> 76 2.12428810 -0.86509599 0.18061583
#> 77 -0.87366270 0.17314265 -0.79141227
#> 78 -0.55523481 -0.09807084 -1.37022395
#> 79 -0.95033523 1.18455918 -1.57482181
#> 80 -2.73601038 -1.71072380 0.56542757
predict(modpls,se.fit=TRUE)
#> $fit
#> 1 2 3 4 5 6
#> -3.91149897 -0.01103427 2.29488351 0.77390739 1.81792717 -2.07323031
#> 7 8 9 10 11 12
#> -2.31730512 4.07156130 -1.81103896 1.32578127 2.48317798 -1.91438957
#> 13 14 15 16 17 18
#> 0.69486087 -1.40225489 -0.70950316 -0.24670449 -3.67870398 -4.16645669
#> 19 20 21 22 23 24
#> 1.81768091 -3.02734320 -2.13745459 -1.92831286 0.98589347 1.14971022
#> 25 26 27 28 29 30
#> -5.89616340 -2.34994661 4.42353417 0.80061019 -0.51396920 -3.08443396
#> 31 32 33 34 35 36
#> 3.56453166 -3.55117411 -2.73950599 0.22916899 2.63357746 0.08206085
#> 37 38 39 40 41 42
#> 5.74551756 -0.15211159 -3.34205848 0.50932362 1.49286943 1.00697968
#> 43 44 45 46 47 48
#> -1.69053571 -0.43807672 -1.88100209 0.57105812 -1.94086294 0.13737583
#> 49 50 51 52 53 54
#> 2.91478046 -0.19794750 -0.53317032 1.81309897 0.05410920 6.74653169
#> 55 56 57 58 59 60
#> 6.02372222 -3.15552724 -5.38533967 1.93977635 0.36991240 -1.30359496
#> 61 62 63 64 65 66
#> 3.40800015 2.42212390 6.44149568 -1.36663839 3.82119344 0.41050691
#> 67 68 69 70 71 72
#> -5.59789050 -1.73207956 3.64545451 -2.33472431 1.90409918 2.74143690
#> 73 74 75 76 77 78
#> 2.27208915 -1.50499888 1.61558478 2.36442373 -1.52350702 -1.62287584
#> 79 80
#> -1.11760516 -5.22936005
#>
#> $se.fit
#> 1 2 3 4 5 6 7 8
#> 0.8162219 0.2763778 0.4841867 0.1945131 0.4939491 0.4477485 0.5608206 0.9168810
#> 9 10 11 12 13 14 15 16
#> 0.4488860 0.5799729 0.5418386 0.4943679 0.3531245 0.3569619 0.2650980 0.4302105
#> 17 18 19 20 21 22 23 24
#> 0.7619582 0.8630618 0.6109658 0.7205740 0.6316150 0.4856943 0.2938761 0.2498695
#> 25 26 27 28 29 30 31 32
#> 1.2262466 0.5876082 0.9420914 0.4544296 0.2932452 0.7015508 0.7918338 0.8726003
#> 33 34 35 36 37 38 39 40
#> 0.6132601 0.4553099 0.6526979 0.1856844 1.1957187 0.2515113 0.7444075 0.2118061
#> 41 42 43 44 45 46 47 48
#> 0.4586467 0.4458471 0.6223899 0.4201203 0.5349032 0.2406794 0.4248530 0.2337450
#> 49 50 51 52 53 54 55 56
#> 0.7266590 0.1480826 0.2696072 0.7941012 0.1632820 1.4785420 1.2541028 0.6686363
#> 57 58 59 60 61 62 63 64
#> 1.1303158 0.5371830 0.4846529 0.3809646 0.7815128 0.5475350 1.4139451 0.5831624
#> 65 66 67 68 69 70 71 72
#> 0.8122094 0.2311461 1.1765103 0.4766329 0.7851201 0.5009383 0.5542336 0.8408220
#> 73 74 75 76 77 78 79 80
#> 0.5247789 0.3476377 0.4720976 0.5791829 0.3427235 0.3778105 0.4108927 1.1068457
#>
# The call above, predict(modpls,se.fit=TRUE), is identical to predict(modpls,type="lp",se.fit=TRUE)
predict(modpls,type="risk",se.fit=TRUE)
#> $fit
#> 1 2 3 4 5 6
#> 2.001048e-02 9.890264e-01 9.923280e+00 2.168222e+00 6.159078e+00 1.257788e-01
#> 7 8 9 10 11 12
#> 9.853878e-02 5.864846e+01 1.634842e-01 3.765126e+00 1.197927e+01 1.474318e-01
#> 13 14 15 16 17 18
#> 2.003430e+00 2.460415e-01 4.918885e-01 7.813716e-01 2.525569e-02 1.550711e-02
#> 19 20 21 22 23 24
#> 6.157562e+00 4.844417e-02 1.179547e-01 1.453933e-01 2.680205e+00 3.157278e+00
#> 25 26 27 28 29 30
#> 2.749975e-03 9.537425e-02 8.339048e+01 2.226899e+00 5.981168e-01 4.575593e-02
#> 31 32 33 34 35 36
#> 3.532291e+01 2.869093e-02 6.460225e-02 1.257555e+00 1.392349e+01 1.085522e+00
#> 37 38 39 40 41 42
#> 3.127855e+02 8.588924e-01 3.536409e-02 1.664165e+00 4.449846e+00 2.737321e+00
#> 43 44 45 46 47 48
#> 1.844207e-01 6.452763e-01 1.524373e-01 1.770139e+00 1.435800e-01 1.147259e+00
#> 49 50 51 52 53 54
#> 1.844476e+01 8.204129e-01 5.867419e-01 6.129413e+00 1.055600e+00 8.511017e+02
#> 55 56 57 58 59 60
#> 4.131134e+02 4.261593e-02 4.583283e-03 6.957195e+00 1.447608e+00 2.715538e-01
#> 61 62 63 64 65 66
#> 3.020478e+01 1.126977e+01 6.273444e+02 2.549626e-01 4.565867e+01 1.507582e+00
#> 67 68 69 70 71 72
#> 3.705673e-03 1.769161e-01 3.830018e+01 9.683718e-02 6.713357e+00 1.550925e+01
#> 73 74 75 76 77 78
#> 9.699644e+00 2.220175e-01 5.030829e+00 1.063791e+01 2.179462e-01 1.973304e-01
#> 79 80
#> 3.270621e-01 5.356952e-03
#>
#> $se.fit
#> 1 2 3 4 5 6
#> 0.11546146 0.27485715 1.52524793 0.28641823 1.22585771 0.15879541
#> 7 8 9 10 11 12
#> 0.17604657 7.02168369 0.18149889 1.12537539 1.87536232 0.18982169
#> 13 14 15 16 17 18
#> 0.49982148 0.17706227 0.18592584 0.38028552 0.12109069 0.10747497
#> 19 20 21 22 23 24
#> 1.51607717 0.15859861 0.21692523 0.18519753 0.48111434 0.44398632
#> 25 26 27 28 29 30
#> 0.06430462 0.18146940 8.60302677 0.67813619 0.22678997 0.15006617
#> 31 32 33 34 35 36
#> 4.70611196 0.14780451 0.15587215 0.51058791 2.43548975 0.19346158
#> 37 38 39 40 41 42
#> 21.14717201 0.23309178 0.13998837 0.27323526 0.96749953 0.73764731
#> 43 44 45 46 47 48
#> 0.26728056 0.33747878 0.20884343 0.32021565 0.16098511 0.25036470
#> 49 50 51 52 53 54
#> 3.12080913 0.13412824 0.20651660 1.96600797 0.16775979 43.13446322
#> 55 56 57 58 59 60
#> 25.48988026 0.13803078 0.07652232 1.41690048 0.58311772 0.19852377
#> 61 62 63 64 65 66
#> 4.29510626 1.83810102 35.41486238 0.29446098 5.48819583 0.28380960
#> 67 68 69 70 71 72
#> 0.07161916 0.20047844 4.85888346 0.15588535 1.43602756 3.31130759
#> 73 74 75 76 77 78
#> 1.63438469 0.16380247 1.05889179 1.88905148 0.15999948 0.16783053
#> 79 80
#> 0.23498683 0.08101137
#>
predict(modpls,type="expected",se.fit=TRUE)
#> $fit
#> [1] 2.883280e-02 1.380608e-02 8.122308e-03 3.764026e-03 7.153829e-01
#> [6] 3.147782e-04 2.466064e-04 4.834946e-01 2.282118e-03 1.405932e-03
#> [11] 1.524722e-02 1.206745e-04 5.013850e-03 6.175850e-02 1.836756e-04
#> [16] 2.917712e-04 3.525507e-04 3.880860e-05 7.152067e-01 1.215990e-02
#> [21] 4.772813e-04 1.850566e-04 6.707570e-03 3.667208e-01 6.902670e-04
#> [26] 3.561361e-05 3.374238e-01 9.010726e-03 1.496867e-03 3.745170e-05
#> [31] 2.891217e-02 2.365264e-04 9.017997e-04 4.542828e-02 2.807208e-01
#> [36] 1.260844e-01 1.265627e+00 1.198950e-02 1.320527e-05 6.011681e-02
#> [41] 1.661613e-03 2.256632e-02 3.201537e-04 7.494946e-02 5.692145e-05
#> [46] 1.501472e-01 3.603979e-02 4.642167e-03 4.616046e-02 1.182121e+00
#> [51] 2.672665e-02 1.538534e+00 1.832518e-03 1.477510e+00 1.542603e-01
#> [56] 5.948868e-04 1.150443e-03 2.597879e-03 1.193400e-02 3.154123e-02
#> [61] 3.844464e-02 1.573176e-01 1.570013e+00 6.380777e-04 2.079795e+00
#> [66] 5.629446e-04 1.499430e-05 1.458487e-03 1.032203e+00 1.124773e-02
#> [71] 7.797629e-01 9.656531e-01 3.924777e-02 1.830300e-03 1.259032e-02
#> [76] 9.023316e-01 2.531466e-02 7.984600e-04 4.565545e-03 2.167589e-05
#>
#> $se.fit
#> [1] 3.406705e-02 1.373203e-02 8.318021e-03 4.414805e-03 3.323965e-01
#> [6] 5.283104e-04 4.339381e-04 2.960415e-01 2.844664e-03 2.018374e-03
#> [11] 1.421937e-02 2.144790e-04 5.855288e-03 6.315959e-02 3.126687e-04
#> [16] 4.834845e-04 5.653308e-04 8.141610e-05 4.461899e-01 1.605200e-02
#> [21] 7.969296e-04 3.192707e-04 7.374734e-03 1.680227e-01 1.201299e-03
#> [26] 7.095361e-05 1.822379e-01 9.960405e-03 2.109728e-03 7.457524e-05
#> [31] 2.537306e-02 4.225683e-04 1.316041e-03 3.711109e-02 1.651624e-01
#> [36] 7.893326e-02 6.531176e-01 1.205310e-02 2.846891e-05 4.379305e-02
#> [41] 2.297955e-03 2.067452e-02 5.466169e-04 6.211512e-02 1.098280e-04
#> [46] 8.881650e-02 3.543230e-02 5.465062e-03 3.813097e-02 9.892654e-01
#> [51] 2.356531e-02 1.040102e+00 2.412576e-03 1.021390e+00 1.547032e-01
#> [56] 9.035359e-04 1.917129e-03 3.440573e-03 1.339237e-02 2.815578e-02
#> [61] 3.315152e-02 1.023384e-01 1.012679e+00 1.022091e-03 8.780925e-01
#> [66] 8.589894e-04 3.456508e-05 1.978790e-03 4.882389e-01 1.153031e-02
#> [71] 4.938505e-01 6.351421e-01 3.160159e-02 2.423648e-03 1.265539e-02
#> [76] 4.133363e-01 2.244496e-02 1.206846e-03 5.308087e-03 4.848122e-05
#>
predict(modpls,type="terms",se.fit=TRUE)
#> $fit
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.48886332 -0.235532283
#> 2 -0.81612842 -0.15285272 0.957946868
#> 3 1.98230440 0.21817877 0.094400341
#> 4 -0.03763442 0.71295698 0.098584833
#> 5 2.76422524 -0.99036777 0.044069693
#> 6 -0.67244092 -1.10346519 -0.297324191
#> 7 -2.58500221 0.75832999 -0.490632893
#> 8 4.44407584 0.24616686 -0.618681405
#> 9 -0.22541827 -0.58024569 -1.005375007
#> 10 -0.84227497 0.28025877 1.887797465
#> 11 2.04910586 0.79213735 -0.358065231
#> 12 -2.69538483 0.85828666 -0.077291400
#> 13 1.92380082 -1.33725629 0.108316345
#> 14 -1.86302758 0.61053657 -0.149763886
#> 15 0.64929894 -0.66908891 -0.689713189
#> 16 1.81629288 -2.07233767 0.009340293
#> 17 -2.48051171 -0.85356060 -0.344631671
#> 18 -2.62201033 -1.05144670 -0.492999659
#> 19 1.09917459 1.95539108 -1.236884761
#> 20 -2.72786696 0.70395989 -1.003436130
#> 21 -2.31982425 -1.16242677 1.344796432
#> 22 -0.33710697 -1.77249772 0.181291831
#> 23 0.83670151 -0.49844873 0.647640690
#> 24 0.92709769 0.37666249 -0.154049958
#> 25 -4.17839003 -1.00436509 -0.713408282
#> 26 -3.23849773 0.82991867 0.058632456
#> 27 3.76915233 0.10875940 0.545622446
#> 28 2.77597103 -1.04890130 -0.926459549
#> 29 -1.62093845 1.14441654 -0.037447280
#> 30 -3.52772907 0.02533532 0.417959794
#> 31 2.05389237 0.24889469 1.261744606
#> 32 -4.12827545 -0.66279231 1.239893653
#> 33 -2.77012808 -0.46454837 0.495170455
#> 34 0.32516821 1.28520067 -1.381199884
#> 35 3.50256781 -0.94459787 0.075607524
#> 36 -0.07995553 0.65671168 -0.494695292
#> 37 3.03853777 1.97459796 0.732381837
#> 38 -1.45636298 0.93856907 0.365682320
#> 39 -3.53282556 -0.26343210 0.454199186
#> 40 0.74036729 -0.64349416 0.412450488
#> 41 -0.86913640 1.77284506 0.589160765
#> 42 2.68507617 -1.49011174 -0.187984755
#> 43 -3.60834238 0.57507560 1.342731063
#> 44 -1.23093622 -0.58200803 1.374867531
#> 45 -1.59763012 0.84059090 -1.123962874
#> 46 1.10000467 -0.82419326 0.295246713
#> 47 -0.50059170 -0.96816688 -0.472104369
#> 48 1.24805046 -1.02426486 -0.086409764
#> 49 2.96084678 -1.01995965 0.973893338
#> 50 0.06882002 0.22611338 -0.492880907
#> 51 -1.11588680 0.98052575 -0.397809267
#> 52 4.01703405 -2.83545722 0.631522149
#> 53 0.90902159 -0.48356886 -0.371343527
#> 54 2.13579403 2.45286756 2.157870098
#> 55 3.74121422 1.27919360 1.003314392
#> 56 -2.77980196 -0.43857585 0.062850575
#> 57 -4.24248067 -0.52612401 -0.616734987
#> 58 -0.68204492 1.66466779 0.957153476
#> 59 -2.32344267 2.10728040 0.586074673
#> 60 -1.79884966 0.91229467 -0.417039976
#> 61 2.23759637 1.86135124 -0.690947459
#> 62 1.65814317 -0.08633734 0.850318072
#> 63 6.58084023 0.49206731 -0.631411857
#> 64 1.53997918 -2.73887126 -0.167746309
#> 65 3.35162942 0.65176772 -0.182203701
#> 66 -0.02929053 0.92472060 -0.484923160
#> 67 -3.19754704 -1.17547958 -1.224863879
#> 68 0.18287169 -0.71576532 -1.199185933
#> 69 2.37682745 1.57609851 -0.307471453
#> 70 -1.24791377 -0.41580352 -0.671007023
#> 71 -0.48138068 0.95498256 1.430497301
#> 72 2.02397146 2.38359745 -1.666132008
#> 73 2.40114203 -0.50630766 0.377254774
#> 74 -1.80842119 0.21331778 0.090104528
#> 75 2.86058583 -0.78279774 -0.462203310
#> 76 3.06726572 -0.79914281 0.096300823
#> 77 -1.26148410 0.15994260 -0.421965529
#> 78 -0.80170514 -0.09059412 -0.730576581
#> 79 -1.37219178 1.09425077 -0.839664154
#> 80 -3.95053329 -1.58030167 0.301474909
#> attr(,"constant")
#> [1] -3.218013e-16
#>
#> $se.fit
#> tt.1 tt.2 tt.3
#> 1 0.474934889 0.386146562 0.078545808
#> 2 0.177224300 0.039643364 0.319458164
#> 3 0.430462288 0.056586108 0.031480827
#> 4 0.008172408 0.184910113 0.032876280
#> 5 0.600258328 0.256858439 0.014696455
#> 6 0.146022205 0.286191005 0.099152305
#> 7 0.561339605 0.196677904 0.163617303
#> 8 0.965042028 0.063845006 0.206319194
#> 9 0.048950133 0.150490562 0.335274601
#> 10 0.182902087 0.072686969 0.629546723
#> 11 0.444968391 0.205446068 0.119408357
#> 12 0.585309462 0.222602330 0.025775301
#> 13 0.417758091 0.346826273 0.036121566
#> 14 0.404561032 0.158346702 0.049943580
#> 15 0.140996866 0.173532640 0.230007025
#> 16 0.394412477 0.537474496 0.003114821
#> 17 0.538649234 0.221376593 0.114928504
#> 18 0.569376009 0.272699663 0.164406577
#> 19 0.238688472 0.507143624 0.412478967
#> 20 0.592363036 0.182576660 0.334628020
#> 21 0.503755555 0.301483081 0.448465581
#> 22 0.073203610 0.459709022 0.060457586
#> 23 0.181691797 0.129275980 0.215976598
#> 24 0.201321549 0.097689911 0.051372908
#> 25 0.907347699 0.260488736 0.237908915
#> 26 0.703247767 0.215244902 0.019552876
#> 27 0.818480723 0.028207470 0.181955336
#> 28 0.602808955 0.272039497 0.308957706
#> 29 0.351990782 0.296812007 0.012487999
#> 30 0.766055066 0.006570882 0.139382123
#> 31 0.446007792 0.064552485 0.420769281
#> 32 0.896465194 0.171899574 0.413482379
#> 33 0.601540143 0.120483695 0.165130499
#> 34 0.070611078 0.333325303 0.460605482
#> 35 0.760591236 0.244987711 0.025213758
#> 36 0.017362541 0.170322522 0.164972041
#> 37 0.659825968 0.512125056 0.244236256
#> 38 0.316252812 0.243424104 0.121948520
#> 39 0.767161782 0.068322860 0.151467313
#> 40 0.160772582 0.166894471 0.137544868
#> 41 0.188735112 0.459799106 0.196474588
#> 42 0.583070912 0.386470347 0.062689557
#> 43 0.783560444 0.149149666 0.447776817
#> 44 0.267300835 0.150947637 0.458493755
#> 45 0.346929320 0.218012817 0.374821535
#> 46 0.238868726 0.213759981 0.098459503
#> 47 0.108704721 0.251100490 0.157438372
#> 48 0.271017234 0.265649875 0.028816112
#> 49 0.642955178 0.264533290 0.324776027
#> 50 0.014944438 0.058644003 0.164366976
#> 51 0.242317570 0.254305845 0.132662283
#> 52 0.872308848 0.735394607 0.210601353
#> 53 0.197396280 0.125416787 0.123836431
#> 54 0.463792939 0.636167444 0.719611121
#> 55 0.812413894 0.331767332 0.334587422
#> 56 0.603640851 0.113747550 0.020959544
#> 57 0.921265139 0.136453747 0.205670098
#> 58 0.148107737 0.431742615 0.319193582
#> 59 0.504541304 0.546537125 0.195445431
#> 60 0.390624639 0.236609665 0.139075380
#> 61 0.485899569 0.482753769 0.230418632
#> 62 0.360069878 0.022392161 0.283565883
#> 63 1.429045683 0.127620914 0.210564572
#> 64 0.334410275 0.710344397 0.055940397
#> 65 0.727814593 0.169040274 0.060761679
#> 66 0.006360511 0.239832412 0.161713209
#> 67 0.694355224 0.304868413 0.408470218
#> 68 0.039711039 0.185638475 0.399907082
#> 69 0.516133942 0.408771584 0.102536236
#> 70 0.270987552 0.107841396 0.223768853
#> 71 0.104533002 0.247681051 0.477045290
#> 72 0.439510393 0.618201782 0.555625254
#> 73 0.521413913 0.131314244 0.125807726
#> 74 0.392703120 0.055325380 0.030048250
#> 75 0.621183266 0.203023778 0.154136545
#> 76 0.666064314 0.207262980 0.032114604
#> 77 0.273934382 0.041482173 0.140717964
#> 78 0.174092248 0.023496185 0.243634236
#> 79 0.297974828 0.283801184 0.280012992
#> 80 0.857868045 0.409861702 0.100536495
#>
predict(modpls,type="scores",se.fit=TRUE)
#> Comp_1 Comp_2 Comp_3
#> 1 -1.51471639 -1.61173906 -0.44174968
#> 2 -0.56522390 -0.16546763 1.79666550
#> 3 1.37287931 0.23618504 0.17705140
#> 4 -0.02606437 0.77179724 0.18489957
#> 5 1.91441216 -1.07210272 0.08265437
#> 6 -0.46571063 -1.19453407 -0.55764274
#> 7 -1.79028814 0.82091488 -0.92020050
#> 8 3.07782184 0.26648299 -1.16036032
#> 9 -0.15611733 -0.62813331 -1.88561876
#> 10 -0.58333215 0.30338850 3.54063538
#> 11 1.41914382 0.85751237 -0.67156485
#> 12 -1.86673554 0.92912097 -0.14496294
#> 13 1.33236164 -1.44761991 0.20315139
#> 14 -1.29027208 0.66092409 -0.28088782
#> 15 0.44968325 -0.72430875 -1.29358311
#> 16 1.25790515 -2.24336748 0.01751807
#> 17 -1.71792143 -0.92400487 -0.64636970
#> 18 -1.81591875 -1.13822249 -0.92463946
#> 19 0.76125243 2.11676931 -2.31982404
#> 20 -1.88923160 0.76205763 -1.88198233
#> 21 -1.60663454 -1.25836174 2.52221645
#> 22 -0.23346928 -1.91878178 0.34001967
#> 23 0.57947215 -0.53958566 1.21467455
#> 24 0.64207759 0.40774841 -0.28892651
#> 25 -2.89381652 -1.08725524 -1.33802415
#> 26 -2.24287780 0.89841177 0.10996738
#> 27 2.61039185 0.11773530 1.02333549
#> 28 1.92254691 -1.13546701 -1.73760984
#> 29 -1.12260905 1.23886512 -0.07023379
#> 30 -2.44318998 0.02742624 0.78389936
#> 31 1.42245880 0.26943594 2.36644962
#> 32 -2.85910880 -0.71749249 2.32546733
#> 33 -1.91850027 -0.50288750 0.92871087
#> 34 0.22520089 1.39126815 -2.59049250
#> 35 2.42576411 -1.02255543 0.14180476
#> 36 -0.05537459 0.71091002 -0.92781969
#> 37 2.10439206 2.13756133 1.37360977
#> 38 -1.00862945 1.01602908 0.68585099
#> 39 -2.44671964 -0.28517313 0.85186771
#> 40 0.51275421 -0.69660167 0.77356645
#> 41 -0.60193549 1.91915778 1.10499325
#> 42 1.85959610 -1.61309051 -0.35257250
#> 43 -2.49902013 0.62253653 2.51834278
#> 44 -0.85250624 -0.63004109 2.57861593
#> 45 -1.10646646 0.90996478 -2.10803478
#> 46 0.76182732 -0.89221385 0.55374635
#> 47 -0.34669346 -1.04806959 -0.88544956
#> 48 0.86435900 -1.10879734 -0.16206477
#> 49 2.05058581 -1.10413682 1.82657371
#> 50 0.04766250 0.24477450 -0.92441674
#> 51 -0.77282677 1.06144844 -0.74610629
#> 52 2.78206663 -3.06946722 1.18444362
#> 53 0.62955868 -0.52347775 -0.69646880
#> 54 1.47918121 2.65530247 4.04716679
#> 55 2.59104283 1.38476532 1.88175400
#> 56 -1.92520009 -0.47477147 0.11787862
#> 57 -2.93820361 -0.56954497 -1.15670974
#> 58 -0.47236205 1.80205266 1.79517746
#> 59 -1.60914054 2.28119403 1.09920516
#> 60 -1.24582455 0.98758626 -0.78217421
#> 61 1.54968619 2.01496837 -1.29589803
#> 62 1.14837582 -0.09346275 1.59480363
#> 63 4.55767509 0.53267757 -1.18423676
#> 64 1.06653930 -2.96491002 -0.31461453
#> 65 2.32122911 0.70555804 -0.34172992
#> 66 -0.02028567 1.00103770 -0.90949169
#> 67 -2.21451669 -1.27249179 -2.29727842
#> 68 0.12665096 -0.77483736 -2.24911847
#> 69 1.64611310 1.70617376 -0.57667431
#> 70 -0.86426433 -0.45011974 -1.25849899
#> 71 -0.33338854 1.03379717 2.68295167
#> 72 1.40173655 2.58031550 -3.12489345
#> 73 1.66295258 -0.54809318 0.70755556
#> 74 -1.25245347 0.23092288 0.16899444
#> 75 1.98114835 -0.84740196 -0.86687975
#> 76 2.12428810 -0.86509599 0.18061583
#> 77 -0.87366270 0.17314265 -0.79141227
#> 78 -0.55523481 -0.09807084 -1.37022395
#> 79 -0.95033523 1.18455918 -1.57482181
#> 80 -2.73601038 -1.71072380 0.56542757
# As above, the default type would be "lp"; here type="risk" is requested for the new data
predict(modpls,newdata=X_train_micro[1:5,],type="risk")
#> 1 2 3 4 5
#> 0.02001048 0.98902638 9.92327996 2.16822181 6.15907848
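# Sketch (not in the original example): for rows taken from the training data,
# predictions computed through 'newdata' should match the corresponding fitted values.
all.equal(unname(predict(modpls, type = "risk")[1:5]),
          unname(predict(modpls, newdata = X_train_micro[1:5, ], type = "risk")))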
#predict(modpls,newdata=X_train_micro[1:5,],type="expected")
predict(modpls,newdata=X_train_micro[1:5,],type="terms")
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.4888633 -0.23553228
#> 2 -0.81612842 -0.1528527 0.95794687
#> 3 1.98230440 0.2181788 0.09440034
#> 4 -0.03763442 0.7129570 0.09858483
#> 5 2.76422524 -0.9903678 0.04406969
#> attr(,"constant")
#> [1] -3.218013e-16
predict(modpls,newdata=X_train_micro[1:5,],type="scores")
#> Comp_1 Comp_2 Comp_3
#> [1,] -1.51471639 -1.6117391 -0.44174968
#> [2,] -0.56522390 -0.1654676 1.79666550
#> [3,] 1.37287931 0.2361850 0.17705140
#> [4,] -0.02606437 0.7717972 0.18489957
#> [5,] 1.91441216 -1.0721027 0.08265437
# As above, the default type would be "lp"; here type="risk" with se.fit=TRUE is requested for the new data
predict(modpls,newdata=X_train_micro[1:5,],type="risk",se.fit=TRUE)
#> $fit
#> 1 2 3 4 5
#> 0.02001048 0.98902638 9.92327996 2.16822181 6.15907848
#>
#> $se.fit
#> 1 2 3 4 5
#> 0.1154615 0.2748572 1.5252479 0.2864182 1.2258577
#>
#predict(modpls,newdata=X_train_micro[1:5,],type="expected",se.fit=TRUE)
predict(modpls,newdata=X_train_micro[1:5,],type="terms",se.fit=TRUE)
#> $fit
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.4888633 -0.23553228
#> 2 -0.81612842 -0.1528527 0.95794687
#> 3 1.98230440 0.2181788 0.09440034
#> 4 -0.03763442 0.7129570 0.09858483
#> 5 2.76422524 -0.9903678 0.04406969
#> attr(,"constant")
#> [1] -3.218013e-16
#>
#> $se.fit
#> tt.1 tt.2 tt.3
#> 1 0.474934889 0.38614656 0.07854581
#> 2 0.177224300 0.03964336 0.31945816
#> 3 0.430462288 0.05658611 0.03148083
#> 4 0.008172408 0.18491011 0.03287628
#> 5 0.600258328 0.25685844 0.01469646
#>
predict(modpls,newdata=X_train_micro[1:5,],type="scores")
#> Comp_1 Comp_2 Comp_3
#> [1,] -1.51471639 -1.6117391 -0.44174968
#> [2,] -0.56522390 -0.1654676 1.79666550
#> [3,] 1.37287931 0.2361850 0.17705140
#> [4,] -0.02606437 0.7717972 0.18489957
#> [5,] 1.91441216 -1.0721027 0.08265437
predict(modpls,newdata=X_train_micro[1:5,],type="risk",comps=1)
#> 1 2 3 4 5
#> 0.1122414 0.4421401 7.2594524 0.9630650 15.8667424
predict(modpls,newdata=X_train_micro[1:5,],type="risk",comps=2)
#> 1 2 3 4 5
#> 0.02532491 0.37946947 9.02937514 1.96466659 5.89354378
predict(modpls,newdata=X_train_micro[1:5,],type="risk",comps=3)
#> 1 2 3 4 5
#> 0.02001048 0.98902638 9.92327996 2.16822181 6.15907848
try(predict(modpls,newdata=X_train_micro[1:5,],type="risk",comps=4))
#> Error in predict.plsRcoxmodel(modpls, newdata = X_train_micro[1:5, ], :
#> Cannot predict using more components than extracted.
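# Sketch (not in the original example): one way to check how many components
# were extracted before choosing 'comps' is to look at the score matrix.
ncol(predict(modpls, type = "scores"))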
predict(modpls,newdata=X_train_micro[1:5,],type="terms",comps=1)
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 1.461454e-16 -2.071822e-17
#> 2 -0.81612842 1.461454e-16 -2.071822e-17
#> 3 1.98230440 1.461454e-16 -2.071822e-17
#> 4 -0.03763442 1.461454e-16 -2.071822e-17
#> 5 2.76422524 1.461454e-16 -2.071822e-17
#> attr(,"constant")
#> [1] -3.218013e-16
predict(modpls,newdata=X_train_micro[1:5,],type="terms",comps=2)
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.4888633 -2.071822e-17
#> 2 -0.81612842 -0.1528527 -2.071822e-17
#> 3 1.98230440 0.2181788 -2.071822e-17
#> 4 -0.03763442 0.7129570 -2.071822e-17
#> 5 2.76422524 -0.9903678 -2.071822e-17
#> attr(,"constant")
#> [1] -3.218013e-16
predict(modpls,newdata=X_train_micro[1:5,],type="terms",comps=3)
#> tt.1 tt.2 tt.3
#> 1 -2.18710337 -1.4888633 -0.23553228
#> 2 -0.81612842 -0.1528527 0.95794687
#> 3 1.98230440 0.2181788 0.09440034
#> 4 -0.03763442 0.7129570 0.09858483
#> 5 2.76422524 -0.9903678 0.04406969
#> attr(,"constant")
#> [1] -3.218013e-16
try(predict(modpls,newdata=X_train_micro[1:5,],type="terms",comps=4))
#> Error in predict.plsRcoxmodel(modpls, newdata = X_train_micro[1:5, ], :
#> Cannot predict using more components than extracted.
predict(modpls,newdata=X_train_micro[1:5,],type="scores",comps=1)
#> Comp_1
#> [1,] -1.51471639
#> [2,] -0.56522390
#> [3,] 1.37287931
#> [4,] -0.02606437
#> [5,] 1.91441216
predict(modpls,newdata=X_train_micro[1:5,],type="scores",comps=2)
#> Comp_1 Comp_2
#> [1,] -1.51471639 -1.6117391
#> [2,] -0.56522390 -0.1654676
#> [3,] 1.37287931 0.2361850
#> [4,] -0.02606437 0.7717972
#> [5,] 1.91441216 -1.0721027
predict(modpls,newdata=X_train_micro[1:5,],type="scores",comps=3)
#> Comp_1 Comp_2 Comp_3
#> [1,] -1.51471639 -1.6117391 -0.44174968
#> [2,] -0.56522390 -0.1654676 1.79666550
#> [3,] 1.37287931 0.2361850 0.17705140
#> [4,] -0.02606437 0.7717972 0.18489957
#> [5,] 1.91441216 -1.0721027 0.08265437
try(predict(modpls,newdata=X_train_micro[1:5,],type="scores",comps=4))
#> Error in predict.plsRcoxmodel(modpls, newdata = X_train_micro[1:5, ], :
#> Cannot predict using more components than extracted.
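# Sketch (not in the original example): collecting the component-restricted risk
# predictions side by side; with comps equal to the number of extracted
# components (3 here), the restricted prediction coincides with the full one.
risk_by_comps <- sapply(1:3, function(k)
  predict(modpls, newdata = X_train_micro[1:5, ], type = "risk", comps = k))
colnames(risk_by_comps) <- paste0("comps=", 1:3)
round(risk_by_comps, 4)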