Light version of PLS_glm for cross-validation purposes, on either complete or incomplete datasets.

PLS_glm_wvc(
  dataY,
  dataX,
  nt = 2,
  dataPredictY = dataX,
  modele = "pls",
  family = NULL,
  scaleX = TRUE,
  scaleY = NULL,
  keepcoeffs = FALSE,
  keepstd.coeffs = FALSE,
  tol_Xi = 10^(-12),
  weights,
  method = "logistic",
  verbose = TRUE
)

Arguments

dataY

response (training) dataset

dataX

predictor(s) (training) dataset

nt

number of components to be extracted

dataPredictY

predictor(s) (testing) dataset

modele

name of the PLS glm model to be fitted ("pls", "pls-glm-Gamma", "pls-glm-gaussian", "pls-glm-inverse.gaussian", "pls-glm-logistic", "pls-glm-poisson", "pls-glm-polr"). Use "modele=pls-glm-family" to enable the family option.

family

a description of the error distribution and link function to be used in the model. This can be a character string naming a family function, a family function or the result of a call to a family function. (See family for details of family functions.) To use the family option, please set modele="pls-glm-family". User defined families can also be defined. See details.

scaleX

scale the predictor(s): must be set to TRUE for modele="pls" and should also be set to TRUE for glm pls models.

scaleY

scale the response: Yes/No. Ignored since scaling is not always possible for glm responses.

keepcoeffs

whether the coefficients of the linear fit on link scale of unstandardized eXplanatory variables should be returned or not.

keepstd.coeffs

whether the coefficients of the linear fit on link scale of standardized eXplanatory variables should be returned or not.

tol_Xi

minimal value for Norm2(Xi) and \(\mathrm{det}(pp' \times pp)\) if there is any missing value in the dataX. It defaults to \(10^{-12}\).

weights

an optional vector of 'prior weights' to be used in the fitting process. Should be NULL or a numeric vector.

method

logistic, probit, complementary log-log or cauchit (corresponding to a Cauchy latent variable).

verbose

should info messages be displayed?

Value

valsPredict

nrow(dataPredictY) * nt matrix of the predicted values

list("coeffs")

If the coefficients of the eXplanatory variables were requested:
i.e. keepcoeffs=TRUE.
ncol(dataX) * 1 matrix of the coefficients of the the eXplanatory variables
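
A minimal sketch of accessing these components (assuming the Cornell data shipped with plsRglm); valsPredict and, when requested, coeffs are returned as elements of a list:

data(Cornell)
res <- PLS_glm_wvc(dataY=Cornell[,8], dataX=Cornell[,1:7], nt=2,
                   modele="pls-glm-gaussian", keepcoeffs=TRUE, verbose=FALSE)
res$valsPredict   # nrow(dataPredictY) * nt matrix of predicted values
res$coeffs        # ncol(dataX) * 1 matrix of coefficients (since keepcoeffs=TRUE)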

Details

This function is called by PLS_glm_kfoldcv_formula in order to perform cross-validation on either complete or incomplete datasets.
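
As a sketch of that use (assuming the Cornell data shipped with plsRglm), a leave-one-out loop around PLS_glm_wvc could accumulate a PRESS statistic per number of components:

data(Cornell)
XCornell <- Cornell[,1:7]
yCornell <- Cornell[,8]
nt <- 3
press <- rep(0, nt)
for (i in seq_len(nrow(XCornell))) {
  # refit on all observations but the i-th, then predict the i-th
  res <- PLS_glm_wvc(dataY=yCornell[-i], dataX=XCornell[-i,], nt=nt,
                     modele="pls-glm-gaussian", dataPredictY=XCornell[i,],
                     verbose=FALSE)
  press <- press + as.vector((yCornell[i] - res$valsPredict)^2)
}
press  # PRESS for 1, ..., nt components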

There are seven different predefined models with predefined link functions available:

"pls"

ordinary pls models

"pls-glm-Gamma"

glm Gamma with inverse link pls models

"pls-glm-gaussian"

glm gaussian with identity link pls models

"pls-glm-inverse.gaussian"

glm inverse gaussian with 1/mu^2 link pls models

"pls-glm-logistic"

glm binomial with logit link pls models

"pls-glm-poisson"

glm poisson with log link pls models

"pls-glm-polr"

glm polr with logit link pls models

Using the "family=" option and setting "modele=pls-glm-family" allows changing the family and link function the same way as for the glm function. As a consequence user-specified families can also be used.

The gaussian family

accepts the links (as names) identity, log and inverse.

The binomial family

accepts the links logit, probit, cauchit (corresponding to logistic, normal and Cauchy CDFs respectively), log and cloglog (complementary log-log).

The Gamma family

accepts the links inverse, identity and log.

The poisson family

accepts the links log, identity and sqrt.

The inverse.gaussian family

accepts the links 1/mu^2, inverse, identity and log.

The quasi family

accepts the links logit, probit, cloglog, identity, inverse, log, 1/mu^2 and sqrt.

The function power

can be used to create a power link function.
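
As a sketch, the following family objects use some of the links listed above and could be passed through the family argument together with modele="pls-glm-family" (whether a given family/link pair is appropriate depends, of course, on the response):

binomial(link="cloglog")          # binomial with complementary log-log link
inverse.gaussian(link="1/mu^2")   # inverse gaussian with its canonical link
quasi(link=power(1/3))            # power link built with the power() function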

Non-NULL weights can be used to indicate that different observations have different dispersions (with the values in weights being inversely proportional to the dispersions); or equivalently, when the elements of weights are positive integers w_i, that each response y_i is the mean of w_i unit-weight observations.
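
A minimal sketch of passing such prior weights (unit weights here, assuming the Cornell data shipped with plsRglm; replace them with known precision weights):

data(Cornell)
w <- rep(1, nrow(Cornell))   # w_i inversely proportional to the dispersion of y_i
PLS_glm_wvc(dataY=Cornell[,8], dataX=Cornell[,1:7], nt=2,
            modele="pls-glm-gaussian", weights=w, verbose=FALSE)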

References

Nicolas Meyer, Myriam Maumy-Bertrand and Frédéric Bertrand (2010). Comparing the linear and the logistic PLS regression with qualitative predictors: application to allelotyping data. Journal de la Société Française de Statistique, 151(2), pages 1-18. http://publications-sfds.math.cnrs.fr/index.php/J-SFdS/article/view/47

See also

PLS_glm for more detailed results, PLS_glm_kfoldcv for cross-validating models and PLS_lm_wvc for the corresponding function dedicated to plsR models.

Examples


data(Cornell)
XCornell<-Cornell[,1:7]
yCornell<-Cornell[,8]
PLS_glm_wvc(dataY=yCornell,dataX=XCornell,nt=3,modele="pls-glm-gaussian",
dataPredictY=XCornell[1,])
#> ____************************************************____
#> 
#> Family: gaussian 
#> Link function: identity 
#> 
#> ____Predicting X without NA neither in X nor in Y____
#> ____Component____ 1 ____
#> ____Component____ 2 ____
#> ____Component____ 3 ____
#> ****________________________________________________****
#> 
#> $valsPredict
#>       [,1]     [,2]    [,3]
#> 1 95.03164 97.08409 97.4436
#> 
PLS_glm_wvc(dataY=yCornell,dataX=XCornell,nt=3,modele="pls-glm-family",
family=gaussian(),dataPredictY=XCornell[1,], verbose=FALSE)
#> $valsPredict
#>       [,1]     [,2]    [,3]
#> 1 95.03164 97.08409 97.4436
#> 
PLS_glm_wvc(dataY=yCornell[-1],dataX=XCornell[-1,],nt=3,modele="pls-glm-gaussian",
dataPredictY=XCornell[1,], verbose=FALSE)
#> $valsPredict
#>       [,1]     [,2]     [,3]
#> 1 93.74777 95.32475 96.08522
#> 
PLS_glm_wvc(dataY=yCornell[-1],dataX=XCornell[-1,],nt=3,modele="pls-glm-family",
family=gaussian(),dataPredictY=XCornell[1,], verbose=FALSE)
#> $valsPredict
#>       [,1]     [,2]     [,3]
#> 1 93.74777 95.32475 96.08522
#> 
rm("XCornell","yCornell")

# \donttest{
## With an incomplete dataset (X[1,2] is NA)
data(pine)
ypine <- pine[,11]
data(XpineNAX21)
PLS_glm_wvc(dataY=ypine,dataX=XpineNAX21,nt=10,modele="pls-glm-gaussian")
#> ____************************************************____
#> Only naive DoF can be used with missing data
#> 
#> Family: gaussian 
#> Link function: identity 
#> 
#> ____There are some NAs in X but not in Y____
#> ____Predicting X with NA in X and not in Y____
#> ____Component____ 1 ____
#> ____Component____ 2 ____
#> ____Component____ 3 ____
#> ____Component____ 4 ____
#> ____Component____ 5 ____
#> ____Component____ 6 ____
#> ____Component____ 7 ____
#> ____Component____ 8 ____
#> ____Component____ 9 ____
#> Warning : reciprocal condition number of t(cbind(res$pp,temppp)[XXNA[1,],,drop=FALSE])%*%cbind(res$pp,temppp)[XXNA[1,],,drop=FALSE] < 10^{-12}
#> Warning only 9 components could thus be extracted
#> ****________________________________________________****
#> 
#> $valsPredict
#>             [,1]        [,2]       [,3]        [,4]        [,5]        [,6]
#> 1   1.4539431721  2.58686302  2.6090759  2.65655978  3.15883072  3.42926182
#> 2   0.9803000295  1.16058533  1.1833005  1.28488164  1.18517618  1.17351469
#> 3   1.5421433729  1.20822242  1.3853553  1.15999892  1.06119789  0.99352272
#> 4   0.8634757400  0.61153667  0.9598232  0.63676710  0.73217251  0.72439005
#> 5   1.1046967056  0.72056610  0.5615100  0.48002572  0.05031434 -0.02890243
#> 6   1.3430089378  1.44124063  1.2350616  1.35213181  1.07499774  1.00153522
#> 7   0.4653011509 -0.55778608 -0.1345810 -0.40312929 -0.11007833  0.02486483
#> 8   0.7245706631  0.77453614 -0.1329786  0.06547790  0.31942685 -0.06589345
#> 9   1.4768276059  1.51567612  1.6250574  1.59073347  1.97691808  2.06744335
#> 10  1.0240822662  0.87210913  0.6081081  0.51995031  1.25860279  1.17823388
#> 11  0.3376379927  1.17893648  0.7989349  0.59109134  0.67492159  0.66971895
#> 12 -0.1361926285  0.32400051  0.1088808  0.02503985  1.09413721  0.91614621
#> 13  1.7251506342  1.94257097  2.3721225  2.30673885  2.08307096  2.12519360
#> 14  1.2983306330  1.75308300  1.6929078  1.99562658  1.58229858  1.60648610
#> 15  1.2673647925  1.36309370  1.3509504  1.50028142  1.56068200  1.66788433
#> 16 -0.2742212930 -0.54947877 -0.4625418 -0.45217350 -0.43900249 -0.54284805
#> 17  0.0726132592  0.09110250  0.2767850  0.38572317 -0.46094657 -0.37755959
#> 18  1.0119232701  1.24823667  1.0358333  0.93127818  1.04973056  1.07268197
#> 19  1.1712169069  0.45709538  0.7154129  0.69975186  1.04627874  1.12890339
#> 20  0.3380528810 -0.05282248 -0.1359717 -0.17511644  0.01795423 -0.03635962
#> 21 -0.0006184277  0.21271797  1.0933839  1.10740476  0.60193100  0.61977791
#> 22  0.6475528007  0.64676752  0.9677756  0.88988795  0.74500454  0.78221342
#> 23  0.9069905284  0.88140911  0.7681153  1.08147288  0.95938321  1.05823268
#> 24  0.2908076924  0.12961264  0.8282876  0.95768372  0.56951175  0.59933236
#> 25  0.6399372516  0.14296644 -0.4236403 -0.37407716 -0.44329114 -0.69039703
#> 26  0.9183707395  0.90065654  0.8887530  1.09037413  0.98475758  1.18762561
#> 27 -0.0660539634  0.61584030  0.2968035  0.37286685 -0.40233133 -0.23591051
#> 28  1.6294091667  1.01981261  1.4487065  1.30885327  1.04296746  1.02750273
#> 29  1.4527596503  1.39098952  1.6208012  1.60355751  1.81341376  1.91094735
#> 30  0.7455927900  0.74230137  0.7595845  0.45727580  0.79191445  0.79398217
#> 31  0.9251914554  1.22235139  0.9675384  1.27926735  1.60673050  1.62225025
#> 32  0.6030109074  1.10766233  0.9423622  0.97635769  0.97873680  1.07730348
#> 33  0.2463233753  0.05289306 -0.1983792 -0.15328541  0.20920986  0.37188736
#>           [,7]        [,8]        [,9]
#> 1   3.41392627  2.96083291  3.23542261
#> 2   1.09536968  0.93205421  0.88650726
#> 3   1.29732525  1.20790868  1.22562598
#> 4   0.93697105  0.96071216  1.00900607
#> 5   0.34395975  0.31660424  0.35664778
#> 6   1.25769379  1.27286284  1.31025110
#> 7   0.10061500  0.20716458  0.22337146
#> 8  -0.06749700 -0.45235173 -0.53735541
#> 9   1.84752454  1.96619927  1.95995370
#> 10  1.07258576  1.25598280  1.30528130
#> 11  0.65182003  0.67547129  0.66502359
#> 12  0.82760907  0.63803800  0.59597012
#> 13  2.25015844  2.33499027  2.34358840
#> 14  1.83700630  1.92773146  2.00021878
#> 15  1.64781018  1.69047876  1.67309741
#> 16 -0.53535935 -0.55349482 -0.58253557
#> 17 -0.09360753 -0.07937692  0.06042429
#> 18  0.73514311  0.39733701  0.23952431
#> 19  1.03072981  1.07397224  1.07267962
#> 20 -0.08754010 -0.20699608 -0.25853347
#> 21  0.58792158  0.61901916  0.60223241
#> 22  0.72354893  0.65192523  0.61620016
#> 23  0.86644827  0.94744835  0.95335224
#> 24  0.38330515  0.46140202  0.44202144
#> 25 -0.43279587 -0.59408406 -0.57013118
#> 26  0.93104122  0.98212754  0.92705304
#> 27  0.20209342  0.32284534  0.38231639
#> 28  1.01171393  1.02783639  0.97721921
#> 29  1.78120212  1.87739994  1.82673047
#> 30  0.47349187  0.52149603  0.52673297
#> 31  1.46531670  1.50285087  1.51032980
#> 32  0.87652870  1.16647296  1.16995441
#> 33  0.28101723  0.47212049  0.52841275
#> 
rm("XpineNAX21","ypine")
#> Warning: object 'XpineNAX21' not found

data(pine)
Xpine<-pine[,1:10]
ypine<-pine[,11]
PLS_glm_wvc(ypine,Xpine,10,modele="pls", verbose=FALSE)
#> $valsPredict
#>           [,1]        [,2]        [,3]        [,4]         [,5]        [,6]
#> 1   1.57673729  2.00433040  2.00216849  2.01939633  1.937245401  1.88550891
#> 2   1.01203030  1.15199511  1.11571861  1.24570213  1.235802532  1.20179188
#> 3   1.48993594  1.24149458  1.36818432  1.53585542  1.486704711  1.51671911
#> 4   0.82287026  0.63250520  0.93956109  1.04161870  1.007232341  1.04125064
#> 5   1.03843648  0.73560935  0.56129947  0.58902449  0.528441708  0.57134580
#> 6   1.35675180  1.40730081  1.25451432  1.37731103  1.366862539  1.39997555
#> 7   0.32014006 -0.32532009 -0.14347163 -0.17938500 -0.165426534 -0.11535438
#> 8   0.71839021  0.73099520  0.30866945  0.39516648  0.412751496  0.32418945
#> 9   1.48896024  1.52799376  1.67587131  1.60139990  1.652717167  1.65007715
#> 10  0.99804354  0.88959103  0.97569244  0.86352520  0.947912813  0.94847153
#> 11  0.43405362  0.86608054  0.97012047  0.64182858  0.616856901  0.63882421
#> 12 -0.07021385  0.26999887  0.59312200  0.92157715  1.008253831  0.98380367
#> 13  1.76424139  1.90090137  2.11336407  2.19169300  2.166416346  2.19486369
#> 14  1.37638458  1.70611149  1.58504677  1.84688715  1.857927378  1.90455582
#> 15  1.29114690  1.38753932  1.35365960  1.43462512  1.476219895  1.51027552
#> 16 -0.31402446 -0.47052602 -0.50369169 -0.48552205 -0.489472457 -0.51351160
#> 17  0.07868727  0.09889902 -0.01564596  0.15763382  0.083278591  0.10574381
#> 18  1.03444868  1.14007872  1.09595610  0.88458167  0.852892800  0.78377367
#> 19  1.08137124  0.67811651  0.73915351  0.82968208  0.879143668  0.88040097
#> 20  0.27838475  0.03080907 -0.03681897 -0.06160100 -0.054332827 -0.07763891
#> 21  0.04732997  0.24640318  0.51964685  0.71563687  0.653313657  0.61594736
#> 22  0.64993047  0.65252960  0.76864397  0.76705111  0.732874483  0.71165796
#> 23  0.91576469  0.95152885  0.70877390  0.68206508  0.726537887  0.71111241
#> 24  0.28714240  0.25747771  0.32732647  0.36607751  0.340816493  0.27266379
#> 25  0.55621083  0.20897499 -0.13621852 -0.01161858 -0.029168653 -0.04249364
#> 26  0.92489518  0.94649310  0.77909889  0.64060202  0.670826485  0.67364464
#> 27  0.01702572  0.38181268  0.20327363  0.07976920  0.023726459  0.15754257
#> 28  1.54352449  1.13003525  1.14773845  0.98446655  0.935861794  0.90956750
#> 29  1.45181367  1.42783969  1.54487387  1.46033916  1.495705868  1.51049933
#> 30  0.73123225  0.66412607  0.84569291  0.49408354  0.482259156  0.42326984
#> 31  0.98197420  1.24443724  1.14351551  1.31718138  1.406786445  1.39942166
#> 32  0.66850881  0.94811355  0.92220097  0.50456716  0.531183973  0.55048739
#> 33  0.21787107  0.10572384  0.04295933 -0.08122119 -0.008152348  0.04161271
#>           [,7]         [,8]        [,9]       [,10]
#> 1   1.95676227  1.948450849  1.93896093  1.94144962
#> 2   1.21592652  1.235442425  1.24794773  1.24303207
#> 3   1.54977163  1.524520187  1.53266664  1.53900851
#> 4   1.05581321  1.021862826  1.03178760  1.03548430
#> 5   0.58624206  0.562391104  0.56933856  0.58993331
#> 6   1.38175170  1.360508864  1.35664763  1.36575593
#> 7  -0.07128741 -0.047428997 -0.05482975 -0.06507565
#> 8   0.35423193  0.306999026  0.28300642  0.26870841
#> 9   1.63319069  1.631509959  1.64240206  1.64555642
#> 10  0.90453538  0.837015730  0.82600711  0.82493397
#> 11  0.66440603  0.619480777  0.61806414  0.61605314
#> 12  1.06477362  1.031255436  1.02700955  1.02592619
#> 13  2.16399005  2.163911302  2.15049944  2.14429918
#> 14  1.84909183  1.845501860  1.84534215  1.84361941
#> 15  1.53220512  1.567310225  1.56848600  1.56902383
#> 16 -0.51165239 -0.500336508 -0.52095379 -0.50662867
#> 17  0.01828375 -0.002607768  0.01667444 -0.02206645
#> 18  0.88281941  0.918670514  0.94195909  0.93284893
#> 19  0.88632519  0.909007733  0.91379654  0.90631555
#> 20 -0.03748213 -0.025963454 -0.03697219 -0.04597628
#> 21  0.56329746  0.606867697  0.60725951  0.63204655
#> 22  0.71534817  0.734795585  0.73328612  0.71985709
#> 23  0.66101344  0.690882417  0.69856666  0.69360847
#> 24  0.17883038  0.221072111  0.21360194  0.22005958
#> 25 -0.02445300 -0.070467998 -0.05897223 -0.03210573
#> 26  0.69268993  0.763387338  0.78043784  0.79028544
#> 27  0.19700010  0.215177839  0.20405047  0.19800511
#> 28  0.87932761  0.893834308  0.86843397  0.85997782
#> 29  1.52036274  1.552065698  1.53445170  1.53023569
#> 30  0.38164488  0.326448839  0.35187384  0.34979385
#> 31  1.37872533  1.385981563  1.39371317  1.39564352
#> 32  0.49904917  0.493233383  0.48196396  0.49873722
#> 33  0.04746533  0.049219132  0.06349274  0.06165365
#> 
PLS_glm_wvc(ypine,Xpine,10,modele="pls-glm-Gamma", verbose=FALSE)
#> $valsPredict
#>         [,1]      [,2]      [,3]      [,4]      [,5]      [,6]      [,7]
#> 1  1.7021625 3.3482083 2.7139565 2.4058093 3.2467830 2.2037424 2.5521714
#> 2  0.6639038 0.6387631 0.6121317 0.7600887 0.9482646 0.9294531 0.8653260
#> 3  1.3088626 0.8281449 0.8781982 0.8175537 1.0345037 1.1660944 1.3209167
#> 4  0.5006322 0.4014678 0.4796826 0.4658354 0.4844424 0.4841038 0.5449712
#> 5  0.6854677 0.5251666 0.4284003 0.3615301 0.3586455 0.3703400 0.3739103
#> 6  1.1713845 1.2407194 0.9038012 0.8025733 0.7106149 0.7508921 0.7329991
#> 7  0.3549836 0.2691848 0.2777883 0.2608888 0.2663969 0.2714017 0.2817687
#> 8  0.5120913 0.5899263 0.5143725 0.4952195 0.4492835 0.4160523 0.3486881
#> 9  1.4945619 1.8984358 3.2406348 2.6184777 2.9855578 3.5895484 2.8384646
#> 10 0.6500508 0.7579531 1.0736393 0.7529701 0.5609868 0.5445539 0.4945195
#> 11 0.3849210 0.4622240 0.4980263 0.3569521 0.3024332 0.2995254 0.3421787
#> 12 0.2795105 0.3152286 0.4814740 0.6678051 0.6433927 0.5911492 0.7463694
#> 13 4.2141188 3.0076725 3.4791475 4.2092704 3.2534025 2.9808121 3.3581408
#> 14 1.2733917 1.8325125 1.2115241 1.6556086 1.3004955 1.5582862 1.6386778
#> 15 1.0345150 1.2033342 1.0290148 1.0809521 1.1747941 1.4991592 1.7732216
#> 16 0.2576160 0.2298102 0.2306029 0.2393888 0.2219856 0.2038692 0.1991727
#> 17 0.3181072 0.2641593 0.2394775 0.2465446 0.2434009 0.2454800 0.2454886
#> 18 0.6689827 0.6826523 0.6317041 0.5589350 0.7682575 0.8003709 0.8141919
#> 19 0.7212759 0.5417550 0.5624130 0.6169171 0.7357709 0.7839123 0.6859125
#> 20 0.3544669 0.3125267 0.3071243 0.3004883 0.2983363 0.2888686 0.2809863
#> 21 0.3010344 0.2531795 0.2689006 0.3545433 0.3516489 0.2928523 0.2888406
#> 22 0.4524566 0.3845643 0.3876906 0.4082754 0.4216088 0.3971631 0.3949778
#> 23 0.6507469 0.6608146 0.5157721 0.5355239 0.5223465 0.5289271 0.4564660
#> 24 0.3581027 0.2902701 0.2841434 0.3507472 0.3384621 0.2851532 0.2518528
#> 25 0.4407103 0.3888439 0.3464864 0.3246940 0.3226049 0.3196535 0.2975328
#> 26 0.6435771 0.6370758 0.4976960 0.4812621 0.5249918 0.5591198 0.5489260
#> 27 0.3096927 0.3200657 0.2664525 0.2208262 0.1962616 0.2068674 0.2457191
#> 28 1.7027982 0.7863943 0.6191251 0.5376817 0.5259622 0.4714803 0.3925142
#> 29 1.3624280 1.3395903 1.3437875 1.1678080 1.0829979 1.0899791 1.0607518
#> 30 0.4809691 0.4482798 0.5264782 0.4211320 0.4080366 0.3796394 0.3529260
#> 31 0.6795166 0.9667644 1.0310662 1.5914495 1.4876477 1.6576885 1.4173828
#> 32 0.4867971 0.5929337 0.5403197 0.3962608 0.3103903 0.2955165 0.2975775
#> 33 0.3501638 0.3513783 0.3489678 0.3059872 0.2892922 0.3083446 0.3264570
#>         [,8]      [,9]     [,10]
#> 1  2.4848460 2.5383659 2.5386650
#> 2  0.8818512 0.8504103 0.8504414
#> 3  1.3240837 1.3853373 1.3858636
#> 4  0.5790253 0.5768843 0.5769496
#> 5  0.3921145 0.4055536 0.4055933
#> 6  0.7330597 0.7621053 0.7622316
#> 7  0.2660918 0.2660232 0.2660038
#> 8  0.2912579 0.2917354 0.2917732
#> 9  3.2596209 3.2484587 3.2483633
#> 10 0.4443817 0.4485445 0.4486040
#> 11 0.3542984 0.3444401 0.3444444
#> 12 0.6853879 0.6926198 0.6926340
#> 13 3.2365795 3.1305159 3.1301005
#> 14 1.7486322 1.7094242 1.7099539
#> 15 1.6651777 1.7173913 1.7167684
#> 16 0.2027942 0.2069252 0.2069125
#> 17 0.2423289 0.2281793 0.2282071
#> 18 0.7941254 0.7399260 0.7398598
#> 19 0.6071330 0.6085899 0.6085694
#> 20 0.2637502 0.2634246 0.2634166
#> 21 0.3540481 0.3637124 0.3636558
#> 22 0.3923977 0.3816768 0.3816594
#> 23 0.4520624 0.4454465 0.4454376
#> 24 0.2724808 0.2743025 0.2742785
#> 25 0.3016025 0.3133463 0.3133922
#> 26 0.6115410 0.6177281 0.6175658
#> 27 0.2518436 0.2473006 0.2472828
#> 28 0.3629759 0.3659599 0.3659410
#> 29 0.9255826 0.9503453 0.9500157
#> 30 0.3723393 0.3588267 0.3588724
#> 31 1.3702410 1.3900717 1.3901525
#> 32 0.3220191 0.3261020 0.3260694
#> 33 0.3243259 0.3203266 0.3203212
#> 
PLS_glm_wvc(ypine,Xpine,10,modele="pls-glm-family",family=Gamma(), verbose=FALSE)
#> $valsPredict
#>         [,1]      [,2]      [,3]      [,4]      [,5]      [,6]      [,7]
#> 1  1.7021625 3.3482083 2.7139565 2.4058093 3.2467830 2.2037424 2.5521714
#> 2  0.6639038 0.6387631 0.6121317 0.7600887 0.9482646 0.9294531 0.8653260
#> 3  1.3088626 0.8281449 0.8781982 0.8175537 1.0345037 1.1660944 1.3209167
#> 4  0.5006322 0.4014678 0.4796826 0.4658354 0.4844424 0.4841038 0.5449712
#> 5  0.6854677 0.5251666 0.4284003 0.3615301 0.3586455 0.3703400 0.3739103
#> 6  1.1713845 1.2407194 0.9038012 0.8025733 0.7106149 0.7508921 0.7329991
#> 7  0.3549836 0.2691848 0.2777883 0.2608888 0.2663969 0.2714017 0.2817687
#> 8  0.5120913 0.5899263 0.5143725 0.4952195 0.4492835 0.4160523 0.3486881
#> 9  1.4945619 1.8984358 3.2406348 2.6184777 2.9855578 3.5895484 2.8384646
#> 10 0.6500508 0.7579531 1.0736393 0.7529701 0.5609868 0.5445539 0.4945195
#> 11 0.3849210 0.4622240 0.4980263 0.3569521 0.3024332 0.2995254 0.3421787
#> 12 0.2795105 0.3152286 0.4814740 0.6678051 0.6433927 0.5911492 0.7463694
#> 13 4.2141188 3.0076725 3.4791475 4.2092704 3.2534025 2.9808121 3.3581408
#> 14 1.2733917 1.8325125 1.2115241 1.6556086 1.3004955 1.5582862 1.6386778
#> 15 1.0345150 1.2033342 1.0290148 1.0809521 1.1747941 1.4991592 1.7732216
#> 16 0.2576160 0.2298102 0.2306029 0.2393888 0.2219856 0.2038692 0.1991727
#> 17 0.3181072 0.2641593 0.2394775 0.2465446 0.2434009 0.2454800 0.2454886
#> 18 0.6689827 0.6826523 0.6317041 0.5589350 0.7682575 0.8003709 0.8141919
#> 19 0.7212759 0.5417550 0.5624130 0.6169171 0.7357709 0.7839123 0.6859125
#> 20 0.3544669 0.3125267 0.3071243 0.3004883 0.2983363 0.2888686 0.2809863
#> 21 0.3010344 0.2531795 0.2689006 0.3545433 0.3516489 0.2928523 0.2888406
#> 22 0.4524566 0.3845643 0.3876906 0.4082754 0.4216088 0.3971631 0.3949778
#> 23 0.6507469 0.6608146 0.5157721 0.5355239 0.5223465 0.5289271 0.4564660
#> 24 0.3581027 0.2902701 0.2841434 0.3507472 0.3384621 0.2851532 0.2518528
#> 25 0.4407103 0.3888439 0.3464864 0.3246940 0.3226049 0.3196535 0.2975328
#> 26 0.6435771 0.6370758 0.4976960 0.4812621 0.5249918 0.5591198 0.5489260
#> 27 0.3096927 0.3200657 0.2664525 0.2208262 0.1962616 0.2068674 0.2457191
#> 28 1.7027982 0.7863943 0.6191251 0.5376817 0.5259622 0.4714803 0.3925142
#> 29 1.3624280 1.3395903 1.3437875 1.1678080 1.0829979 1.0899791 1.0607518
#> 30 0.4809691 0.4482798 0.5264782 0.4211320 0.4080366 0.3796394 0.3529260
#> 31 0.6795166 0.9667644 1.0310662 1.5914495 1.4876477 1.6576885 1.4173828
#> 32 0.4867971 0.5929337 0.5403197 0.3962608 0.3103903 0.2955165 0.2975775
#> 33 0.3501638 0.3513783 0.3489678 0.3059872 0.2892922 0.3083446 0.3264570
#>         [,8]      [,9]     [,10]
#> 1  2.4848460 2.5383659 2.5386650
#> 2  0.8818512 0.8504103 0.8504414
#> 3  1.3240837 1.3853373 1.3858636
#> 4  0.5790253 0.5768843 0.5769496
#> 5  0.3921145 0.4055536 0.4055933
#> 6  0.7330597 0.7621053 0.7622316
#> 7  0.2660918 0.2660232 0.2660038
#> 8  0.2912579 0.2917354 0.2917732
#> 9  3.2596209 3.2484587 3.2483633
#> 10 0.4443817 0.4485445 0.4486040
#> 11 0.3542984 0.3444401 0.3444444
#> 12 0.6853879 0.6926198 0.6926340
#> 13 3.2365795 3.1305159 3.1301005
#> 14 1.7486322 1.7094242 1.7099539
#> 15 1.6651777 1.7173913 1.7167684
#> 16 0.2027942 0.2069252 0.2069125
#> 17 0.2423289 0.2281793 0.2282071
#> 18 0.7941254 0.7399260 0.7398598
#> 19 0.6071330 0.6085899 0.6085694
#> 20 0.2637502 0.2634246 0.2634166
#> 21 0.3540481 0.3637124 0.3636558
#> 22 0.3923977 0.3816768 0.3816594
#> 23 0.4520624 0.4454465 0.4454376
#> 24 0.2724808 0.2743025 0.2742785
#> 25 0.3016025 0.3133463 0.3133922
#> 26 0.6115410 0.6177281 0.6175658
#> 27 0.2518436 0.2473006 0.2472828
#> 28 0.3629759 0.3659599 0.3659410
#> 29 0.9255826 0.9503453 0.9500157
#> 30 0.3723393 0.3588267 0.3588724
#> 31 1.3702410 1.3900717 1.3901525
#> 32 0.3220191 0.3261020 0.3260694
#> 33 0.3243259 0.3203266 0.3203212
#> 
PLS_glm_wvc(ypine,Xpine,10,modele="pls-glm-gaussian", verbose=FALSE)
#> $valsPredict
#>           [,1]         [,2]        [,3]         [,4]        [,5]          [,6]
#> 1   1.57673729  2.112703071  2.04925275  2.025720690  1.99988877  1.9833376732
#> 2   1.01203030  1.097055838  1.16768196  1.188171412  1.15718087  1.2424038010
#> 3   1.48993594  1.408750619  1.48576816  1.417980819  1.47582762  1.4982947689
#> 4   0.82287026  0.920756079  1.01128635  0.901257566  0.97491134  0.9909505955
#> 5   1.03843648  0.659104154  0.56302369  0.539466433  0.54522439  0.5188201727
#> 6   1.35675180  1.363858458  1.34528931  1.394223504  1.37942281  1.3306250748
#> 7   0.32014006 -0.208936487 -0.12375412 -0.181775915 -0.06686014 -0.0207190905
#> 8   0.71839021  0.466159761  0.40368534  0.513587769  0.48884774  0.3578640505
#> 9   1.48896024  1.577050840  1.58976688  1.576666292  1.60118999  1.6121834444
#> 10  0.99804354  0.955521315  0.91591324  0.915171230  0.97502649  0.8174125637
#> 11  0.43405362  1.004865089  0.70022268  0.636399772  0.65960111  0.6245784031
#> 12 -0.07021385  0.625346725  0.85018705  0.906678742  1.04976341  1.0296261669
#> 13  1.76424139  2.130136647  2.23589140  2.197201851  2.19275527  2.1697925483
#> 14  1.37638458  1.681218703  1.75409814  1.849649772  1.80173880  1.8135307368
#> 15  1.29114690  1.361382091  1.39654708  1.475461977  1.50322338  1.5668160958
#> 16 -0.31402446 -0.467204941 -0.38855760 -0.373068343 -0.38143767 -0.4913222929
#> 17  0.07868727 -0.009485106  0.00923828 -0.036909766 -0.14037506  0.0003648119
#> 18  1.03444868  1.006290938  0.84699467  0.805974957  0.80434945  0.9670404542
#> 19  1.08137124  0.652213314  0.80163323  0.811023094  0.86280576  0.9155210110
#> 20  0.27838475 -0.032388253 -0.02071891 -0.009618538  0.02030676  0.0147427279
#> 21  0.04732997  0.463005953  0.75191452  0.690072944  0.59756638  0.5787557488
#> 22  0.64993047  0.723791521  0.78678336  0.734843252  0.71613035  0.7677772238
#> 23  0.91576469  0.661826559  0.63945101  0.723928013  0.65099726  0.6902564661
#> 24  0.28714240  0.226510895  0.44320269  0.409261632  0.27917608  0.2269039458
#> 25  0.55621083 -0.024808263 -0.08836434 -0.049135741 -0.04935391 -0.1262912840
#> 26  0.92489518  0.696833356  0.61254496  0.676658656  0.64557351  0.7620516870
#> 27  0.01702572  0.383663431  0.09067225  0.142983517  0.15295203  0.2265845367
#> 28  1.54352449  1.118282238  1.14733091  1.067690338  1.02839043  0.9429046207
#> 29  1.45181367  1.511879599  1.55176709  1.566714783  1.60155782  1.5828522058
#> 30  0.73123225  0.675786261  0.50840164  0.348494547  0.32640081  0.3072993447
#> 31  0.98197420  1.138509770  1.22086791  1.348938103  1.34491760  1.3622819484
#> 32  0.66850881  0.900496012  0.64433614  0.644006458  0.60182468  0.4779691553
#> 33  0.21787107 -0.010176189 -0.13235770 -0.087719821 -0.02952413  0.0287906836
#>           [,7]        [,8]        [,9]       [,10]
#> 1   1.95998953  1.94645747  1.93186966  1.94144962
#> 2   1.25598752  1.24287885  1.23986128  1.24303207
#> 3   1.49520811  1.51724573  1.53404270  1.53900851
#> 4   0.99891106  1.01599568  1.03395498  1.03548430
#> 5   0.51896770  0.56790205  0.58389496  0.58993331
#> 6   1.32539225  1.34603852  1.36392791  1.36575593
#> 7  -0.06196351 -0.05464413 -0.06141636 -0.06507565
#> 8   0.33392253  0.26216021  0.26205140  0.26870841
#> 9   1.65121630  1.65247099  1.64949308  1.64555642
#> 10  0.84217676  0.81747252  0.83103198  0.82493397
#> 11  0.62520462  0.61633899  0.61661614  0.61605314
#> 12  1.05350867  1.02479904  1.02287949  1.02592619
#> 13  2.13743805  2.13939290  2.14749253  2.14429918
#> 14  1.80845036  1.81864863  1.84537429  1.84361941
#> 15  1.56195461  1.57681740  1.57071745  1.56902383
#> 16 -0.50961890 -0.49310993 -0.50826612 -0.50662867
#> 17 -0.02051672 -0.06780741 -0.01988610 -0.02206645
#> 18  0.98706465  0.95905848  0.92650134  0.93284893
#> 19  0.91934368  0.91222031  0.90916925  0.90631555
#> 20 -0.01218066 -0.03394867 -0.04715924 -0.04597628
#> 21  0.59411953  0.64146282  0.62996595  0.63204655
#> 22  0.74682044  0.72671756  0.72012579  0.71985709
#> 23  0.70899785  0.69938422  0.69733168  0.69360847
#> 24  0.23235605  0.23628193  0.22277700  0.22005958
#> 25 -0.08891154 -0.05778064 -0.04168003 -0.03210573
#> 26  0.78675646  0.81693166  0.79133604  0.79028544
#> 27  0.15249578  0.18906869  0.19908660  0.19800511
#> 28  0.88885380  0.87522045  0.86246092  0.85997782
#> 29  1.55342327  1.55350046  1.53516894  1.53023569
#> 30  0.37656252  0.34741004  0.35160380  0.34979385
#> 31  1.40075691  1.39407660  1.39778420  1.39564352
#> 32  0.49263192  0.51865275  0.50530217  0.49873722
#> 33  0.05468043  0.06268582  0.06658628  0.06165365
#> 
PLS_glm_wvc(ypine,Xpine,10,modele="pls-glm-family",family=gaussian(log), verbose=FALSE)
#> Warning: glm.fit: algorithm did not converge
#> $valsPredict
#>          [,1]       [,2]       [,3]       [,4]       [,5]       [,6]       [,7]
#> 1  1.94034398 2.71462895 2.43467867 2.34032266 2.49387242 2.52177239 2.36881223
#> 2  0.75808067 0.69121040 0.65620818 0.70721651 1.04029001 0.98040711 0.93783784
#> 3  1.40934523 1.05235219 1.16507753 1.04330692 1.27590261 1.35755324 1.37414207
#> 4  0.37708016 0.35352178 0.46140678 0.36128092 0.61457152 0.56530388 0.62061981
#> 5  0.77818425 0.53104971 0.43760160 0.37373842 0.34144739 0.34208810 0.33370653
#> 6  1.51169791 1.44153372 1.26663946 1.31087294 1.06609858 1.04718818 1.03518591
#> 7  0.16713747 0.08885717 0.09752329 0.06901046 0.10159776 0.10286735 0.11602626
#> 8  0.52169803 0.45181518 0.39494988 0.45271597 0.35170317 0.30990886 0.24700339
#> 9  1.72168458 2.00061811 2.45997527 2.60364609 2.49269049 2.67113548 2.60678716
#> 10 0.69017124 0.87663394 1.23358029 1.21064587 0.88829904 0.83085920 0.79099293
#> 11 0.24761936 0.59543246 0.61011139 0.43755305 0.41628614 0.43157686 0.53045450
#> 12 0.07788296 0.13938476 0.23445656 0.21678007 0.54126772 0.46650102 0.53587964
#> 13 2.54607327 2.73230801 2.90764248 2.80193225 2.88622102 2.71241728 2.82081773
#> 14 1.66719840 1.88972107 1.63632610 1.91063363 1.84158614 1.73408495 1.86193764
#> 15 1.34895761 1.33460645 1.23252238 1.29571895 1.25365525 1.45657367 1.58051773
#> 16 0.06341544 0.04781972 0.04523887 0.03588003 0.06230984 0.03910136 0.03595447
#> 17 0.13804789 0.10480764 0.08384257 0.07535406 0.14547767 0.10826649 0.12411130
#> 18 0.79061549 0.82574247 0.74064232 0.69586410 0.87182540 1.09564836 1.06117605
#> 19 0.78786507 0.45717855 0.51628995 0.52758141 0.65036236 0.66698463 0.63632866
#> 20 0.18952817 0.13436883 0.12758699 0.10907087 0.14319721 0.12855285 0.12477305
#> 21 0.10522301 0.09131829 0.09479376 0.08113521 0.28581761 0.13871715 0.12566135
#> 22 0.34302383 0.29785876 0.29338908 0.25447835 0.42009515 0.35272268 0.35674133
#> 23 0.81820041 0.66972983 0.54855397 0.64001770 0.59570640 0.55276326 0.50977543
#> 24 0.19474253 0.12654062 0.12041800 0.11763698 0.25419417 0.12872552 0.10208317
#> 25 0.35299833 0.22248594 0.19757934 0.19485657 0.19771813 0.17411768 0.14585414
#> 26 0.80124529 0.67773527 0.52264350 0.53603714 0.54379607 0.60854112 0.60682863
#> 27 0.14150887 0.22211358 0.13221850 0.08582109 0.08069217 0.09245398 0.14580144
#> 28 1.81316065 0.97154149 0.87534137 0.78129461 0.64910997 0.54803145 0.45120061
#> 29 1.61509512 1.55735811 1.60106153 1.52704385 1.31898041 1.40585691 1.42994630
#> 30 0.39043233 0.49977789 0.66789433 0.57577267 0.69466570 0.59985551 0.53879331
#> 31 0.83475962 0.99424242 1.05056360 1.31875471 1.42449237 1.39059830 1.37115108
#> 32 0.47025561 0.82350400 0.75180162 0.62855746 0.42921486 0.37592533 0.38664416
#> 33 0.19382221 0.20784671 0.20961391 0.18342774 0.18891735 0.21202125 0.24981935
#>          [,8]       [,9]      [,10]
#> 1  2.31091578 2.37167744 2.36198559
#> 2  0.93585220 0.95791298 0.98575560
#> 3  1.41641694 1.50522613 1.47941643
#> 4  0.59950218 0.70071774 0.71423505
#> 5  0.37190673 0.42675025 0.42402377
#> 6  1.09060470 1.11874494 1.08702344
#> 7  0.09571100 0.09647244 0.09630241
#> 8  0.20785619 0.16125305 0.14088436
#> 9  2.71378257 2.69868104 2.70464247
#> 10 0.73029614 0.65922545 0.60326641
#> 11 0.54022679 0.51472598 0.49819043
#> 12 0.35506281 0.45808959 0.46151119
#> 13 2.79802911 2.76063955 2.78066411
#> 14 1.98170008 1.95001758 1.93229013
#> 15 1.53052186 1.51841242 1.52042283
#> 16 0.02766734 0.04128363 0.04540010
#> 17 0.14200686 0.10908344 0.10682024
#> 18 1.11860371 0.97590985 0.97740363
#> 19 0.58502710 0.55929325 0.55484711
#> 20 0.10381511 0.10249376 0.10126389
#> 21 0.10867268 0.22973117 0.29869098
#> 22 0.33080890 0.33655123 0.35050213
#> 23 0.54181217 0.50162257 0.50663290
#> 24 0.09203855 0.13818992 0.16532608
#> 25 0.15063372 0.18519359 0.18156725
#> 26 0.65085067 0.71168397 0.76264933
#> 27 0.14727015 0.14028361 0.13815779
#> 28 0.43710022 0.39895696 0.39737902
#> 29 1.28688769 1.23785532 1.23709320
#> 30 0.62334349 0.59651237 0.59626615
#> 31 1.30383078 1.34676942 1.34563447
#> 32 0.40329953 0.46335753 0.48230863
#> 33 0.24694643 0.23442405 0.22839915
#> 
PLS_glm_wvc(round(ypine),Xpine,10,modele="pls-glm-poisson", verbose=FALSE)
#> $valsPredict
#>         [,1]       [,2]       [,3]       [,4]       [,5]       [,6]       [,7]
#> 1  1.6399850 2.60506445 2.11819574 2.41837386 2.53667962 2.20251218 2.06903053
#> 2  0.7290121 0.69629584 0.86286774 0.91041443 1.03494013 1.08047292 1.04940245
#> 3  1.4007318 0.91577118 0.87914923 0.91048321 0.88943352 1.13236954 1.27356987
#> 4  0.5073121 0.58670097 0.49153953 0.47288217 0.47195939 0.58602584 0.68553431
#> 5  0.7167976 0.26869425 0.17547231 0.16659766 0.14906365 0.20126291 0.20750054
#> 6  1.2277121 0.87836123 0.79196664 0.74357948 0.62570190 0.70241020 0.67883278
#> 7  0.2394851 0.11484061 0.10723855 0.09952527 0.09561722 0.09178740 0.10346041
#> 8  0.4486498 0.21249182 0.25417881 0.29270841 0.19704511 0.07643748 0.06392087
#> 9  1.5375425 1.72997866 1.88584119 2.06164431 2.14385483 2.24654776 2.35711461
#> 10 0.7082187 0.70892020 0.65884300 0.68326209 0.48035002 0.29285649 0.30175497
#> 11 0.2936374 0.73061296 0.24263912 0.23403044 0.20470584 0.23168872 0.22840093
#> 12 0.1322042 0.56376787 0.95488027 1.07792637 0.91182710 0.93220801 0.97645652
#> 13 2.3010540 3.38048229 3.41269178 3.09515690 3.11675468 2.83446755 2.94222186
#> 14 1.3291560 1.58321807 1.75503319 1.50912761 1.35057561 1.76448864 1.65562250
#> 15 1.1448439 1.08877131 1.25968183 1.27173335 1.27622471 1.76272105 1.72201423
#> 16 0.0941790 0.08628985 0.09230834 0.08013583 0.07468272 0.04857939 0.04622128
#> 17 0.1757439 0.12211384 0.07899382 0.05819776 0.05831164 0.05675775 0.05570430
#> 18 0.7307741 0.57135156 0.43624010 0.56728509 0.74050220 0.81621686 0.79616814
#> 19 0.8036405 0.39016521 0.60474619 0.63620204 0.64631151 0.57833457 0.61425877
#> 20 0.2296892 0.13681561 0.13816133 0.13921219 0.12832775 0.08814754 0.08562427
#> 21 0.1690309 0.41209378 0.59658496 0.47032464 0.67844078 0.64008147 0.65138413
#> 22 0.4128838 0.44811365 0.42513686 0.40081655 0.45309870 0.36374930 0.36640901
#> 23 0.6719322 0.38499657 0.43763564 0.42067876 0.42831773 0.38428972 0.34730722
#> 24 0.2518950 0.26446399 0.40504213 0.33075720 0.42496338 0.22271054 0.21188843
#> 25 0.3415030 0.09895810 0.09308877 0.09857795 0.07905177 0.08172748 0.07994832
#> 26 0.6718422 0.41760796 0.39983789 0.41407059 0.51103696 0.75685751 0.71134635
#> 27 0.1598578 0.23017899 0.06417506 0.04623807 0.03896774 0.08594351 0.07861887
#> 28 1.6103573 0.64304401 0.62041835 0.58564249 0.58885702 0.26270837 0.26265212
#> 29 1.4503379 1.50168396 1.65079230 1.65474205 1.64013824 1.36785798 1.36596556
#> 30 0.4619784 0.46051634 0.26232493 0.28912455 0.31006305 0.21938924 0.23797224
#> 31 0.7331324 0.89331268 1.39212273 1.45116888 1.34444176 1.50249618 1.40404240
#> 32 0.4547769 0.70774332 0.33469867 0.29588706 0.26928216 0.24247816 0.22480412
#> 33 0.2201032 0.16657890 0.11747299 0.11349275 0.10047155 0.14341775 0.14484710
#>          [,8]       [,9]      [,10]
#> 1  2.02482801 2.09337142 2.09445057
#> 2  1.02979098 1.00523770 1.00859344
#> 3  1.24588728 1.28630576 1.28831559
#> 4  0.67844519 0.67808489 0.67912103
#> 5  0.20598501 0.22263909 0.22289776
#> 6  0.67635225 0.70420679 0.70561084
#> 7  0.10133294 0.09847205 0.09737196
#> 8  0.06139630 0.06185310 0.06202701
#> 9  2.43317037 2.42608241 2.43213747
#> 10 0.31175798 0.31302351 0.31392542
#> 11 0.23326785 0.22472545 0.22453972
#> 12 0.96450569 0.96172419 0.95912503
#> 13 2.92826816 2.88041181 2.87784898
#> 14 1.63943166 1.61203045 1.61806746
#> 15 1.72636185 1.72600708 1.71786827
#> 16 0.04614683 0.04911642 0.04884304
#> 17 0.05209752 0.04344071 0.04389911
#> 18 0.78681869 0.75517340 0.75514721
#> 19 0.60876974 0.59789477 0.59656484
#> 20 0.08364279 0.08252597 0.08207915
#> 21 0.65597011 0.69554797 0.69687802
#> 22 0.35829121 0.33954857 0.33920155
#> 23 0.35016186 0.33960578 0.34056020
#> 24 0.21327505 0.21575584 0.21648404
#> 25 0.07937426 0.08865757 0.08919000
#> 26 0.72799199 0.73683412 0.73410737
#> 27 0.07702918 0.07296722 0.07209219
#> 28 0.26006772 0.26199357 0.26110683
#> 29 1.38309722 1.38333502 1.37253094
#> 30 0.24610086 0.23542502 0.23820983
#> 31 1.42270195 1.41766315 1.42184343
#> 32 0.24013435 0.24834876 0.24777468
#> 33 0.14754717 0.14199043 0.14158704
#> 
PLS_glm_wvc(round(ypine),Xpine,10,modele="pls-glm-family",family=poisson(log), verbose=FALSE)
#> $valsPredict
#>         [,1]       [,2]       [,3]       [,4]       [,5]       [,6]       [,7]
#> 1  1.6399850 2.60506445 2.11819574 2.41837386 2.53667962 2.20251218 2.06903053
#> 2  0.7290121 0.69629584 0.86286774 0.91041443 1.03494013 1.08047292 1.04940245
#> 3  1.4007318 0.91577118 0.87914923 0.91048321 0.88943352 1.13236954 1.27356987
#> 4  0.5073121 0.58670097 0.49153953 0.47288217 0.47195939 0.58602584 0.68553431
#> 5  0.7167976 0.26869425 0.17547231 0.16659766 0.14906365 0.20126291 0.20750054
#> 6  1.2277121 0.87836123 0.79196664 0.74357948 0.62570190 0.70241020 0.67883278
#> 7  0.2394851 0.11484061 0.10723855 0.09952527 0.09561722 0.09178740 0.10346041
#> 8  0.4486498 0.21249182 0.25417881 0.29270841 0.19704511 0.07643748 0.06392087
#> 9  1.5375425 1.72997866 1.88584119 2.06164431 2.14385483 2.24654776 2.35711461
#> 10 0.7082187 0.70892020 0.65884300 0.68326209 0.48035002 0.29285649 0.30175497
#> 11 0.2936374 0.73061296 0.24263912 0.23403044 0.20470584 0.23168872 0.22840093
#> 12 0.1322042 0.56376787 0.95488027 1.07792637 0.91182710 0.93220801 0.97645652
#> 13 2.3010540 3.38048229 3.41269178 3.09515690 3.11675468 2.83446755 2.94222186
#> 14 1.3291560 1.58321807 1.75503319 1.50912761 1.35057561 1.76448864 1.65562250
#> 15 1.1448439 1.08877131 1.25968183 1.27173335 1.27622471 1.76272105 1.72201423
#> 16 0.0941790 0.08628985 0.09230834 0.08013583 0.07468272 0.04857939 0.04622128
#> 17 0.1757439 0.12211384 0.07899382 0.05819776 0.05831164 0.05675775 0.05570430
#> 18 0.7307741 0.57135156 0.43624010 0.56728509 0.74050220 0.81621686 0.79616814
#> 19 0.8036405 0.39016521 0.60474619 0.63620204 0.64631151 0.57833457 0.61425877
#> 20 0.2296892 0.13681561 0.13816133 0.13921219 0.12832775 0.08814754 0.08562427
#> 21 0.1690309 0.41209378 0.59658496 0.47032464 0.67844078 0.64008147 0.65138413
#> 22 0.4128838 0.44811365 0.42513686 0.40081655 0.45309870 0.36374930 0.36640901
#> 23 0.6719322 0.38499657 0.43763564 0.42067876 0.42831773 0.38428972 0.34730722
#> 24 0.2518950 0.26446399 0.40504213 0.33075720 0.42496338 0.22271054 0.21188843
#> 25 0.3415030 0.09895810 0.09308877 0.09857795 0.07905177 0.08172748 0.07994832
#> 26 0.6718422 0.41760796 0.39983789 0.41407059 0.51103696 0.75685751 0.71134635
#> 27 0.1598578 0.23017899 0.06417506 0.04623807 0.03896774 0.08594351 0.07861887
#> 28 1.6103573 0.64304401 0.62041835 0.58564249 0.58885702 0.26270837 0.26265212
#> 29 1.4503379 1.50168396 1.65079230 1.65474205 1.64013824 1.36785798 1.36596556
#> 30 0.4619784 0.46051634 0.26232493 0.28912455 0.31006305 0.21938924 0.23797224
#> 31 0.7331324 0.89331268 1.39212273 1.45116888 1.34444176 1.50249618 1.40404240
#> 32 0.4547769 0.70774332 0.33469867 0.29588706 0.26928216 0.24247816 0.22480412
#> 33 0.2201032 0.16657890 0.11747299 0.11349275 0.10047155 0.14341775 0.14484710
#>          [,8]       [,9]      [,10]
#> 1  2.02482801 2.09337142 2.09445057
#> 2  1.02979098 1.00523770 1.00859344
#> 3  1.24588728 1.28630576 1.28831559
#> 4  0.67844519 0.67808489 0.67912103
#> 5  0.20598501 0.22263909 0.22289776
#> 6  0.67635225 0.70420679 0.70561084
#> 7  0.10133294 0.09847205 0.09737196
#> 8  0.06139630 0.06185310 0.06202701
#> 9  2.43317037 2.42608241 2.43213747
#> 10 0.31175798 0.31302351 0.31392542
#> 11 0.23326785 0.22472545 0.22453972
#> 12 0.96450569 0.96172419 0.95912503
#> 13 2.92826816 2.88041181 2.87784898
#> 14 1.63943166 1.61203045 1.61806746
#> 15 1.72636185 1.72600708 1.71786827
#> 16 0.04614683 0.04911642 0.04884304
#> 17 0.05209752 0.04344071 0.04389911
#> 18 0.78681869 0.75517340 0.75514721
#> 19 0.60876974 0.59789477 0.59656484
#> 20 0.08364279 0.08252597 0.08207915
#> 21 0.65597011 0.69554797 0.69687802
#> 22 0.35829121 0.33954857 0.33920155
#> 23 0.35016186 0.33960578 0.34056020
#> 24 0.21327505 0.21575584 0.21648404
#> 25 0.07937426 0.08865757 0.08919000
#> 26 0.72799199 0.73683412 0.73410737
#> 27 0.07702918 0.07296722 0.07209219
#> 28 0.26006772 0.26199357 0.26110683
#> 29 1.38309722 1.38333502 1.37253094
#> 30 0.24610086 0.23542502 0.23820983
#> 31 1.42270195 1.41766315 1.42184343
#> 32 0.24013435 0.24834876 0.24777468
#> 33 0.14754717 0.14199043 0.14158704
#> 
rm(list=c("pine","ypine","Xpine"))
#> Warning: object 'pine' not found


data(Cornell)
XCornell<-Cornell[,1:7]
yCornell<-Cornell[,8]
PLS_glm_wvc(yCornell,XCornell,10,modele="pls-glm-inverse.gaussian", verbose=FALSE)
#> $valsPredict
#>        [,1]     [,2]     [,3]     [,4]     [,5]     [,6]
#> 1  95.04599 96.84573 97.43211 97.72960 97.67692 97.66926
#> 2  96.68764 98.13364 98.03691 97.94523 98.18530 98.04886
#> 3  96.09893 97.85843 97.65344 97.12746 96.98194 97.11752
#> 4  95.02359 91.98294 91.88913 91.79131 91.91479 91.89828
#> 5  87.87811 86.94625 86.08537 86.10526 86.02670 86.01328
#> 6  93.46390 90.91980 91.39057 91.61375 91.49733 91.58550
#> 7  81.89068 81.77545 81.85201 81.95061 82.04635 82.07739
#> 8  82.40022 82.61020 82.63964 82.58216 82.62615 82.70790
#> 9  82.17002 82.45931 82.55178 82.57494 82.62086 82.69025
#> 10 82.58963 83.07833 83.11779 83.00651 83.01600 83.12446
#> 11 82.15442 81.74862 82.05216 81.73070 81.63852 81.40000
#> 12 87.59686 88.64130 88.29910 88.84247 88.76913 88.66730
#> 
PLS_glm_wvc(yCornell,XCornell,10,modele="pls-glm-family",
family=inverse.gaussian(), verbose=FALSE)
#> $valsPredict
#>        [,1]     [,2]     [,3]     [,4]     [,5]     [,6]
#> 1  95.04599 96.84573 97.43211 97.72960 97.67692 97.66926
#> 2  96.68764 98.13364 98.03691 97.94523 98.18530 98.04886
#> 3  96.09893 97.85843 97.65344 97.12746 96.98194 97.11752
#> 4  95.02359 91.98294 91.88913 91.79131 91.91479 91.89828
#> 5  87.87811 86.94625 86.08537 86.10526 86.02670 86.01328
#> 6  93.46390 90.91980 91.39057 91.61375 91.49733 91.58550
#> 7  81.89068 81.77545 81.85201 81.95061 82.04635 82.07739
#> 8  82.40022 82.61020 82.63964 82.58216 82.62615 82.70790
#> 9  82.17002 82.45931 82.55178 82.57494 82.62086 82.69025
#> 10 82.58963 83.07833 83.11779 83.00651 83.01600 83.12446
#> 11 82.15442 81.74862 82.05216 81.73070 81.63852 81.40000
#> 12 87.59686 88.64130 88.29910 88.84247 88.76913 88.66730
#> 
rm(list=c("XCornell","yCornell"))


data(Cornell)
XCornell<-Cornell[,1:7]
yCornell<-Cornell[,8]
PLS_glm_wvc(dataY=yCornell,dataX=XCornell,nt=3,modele="pls-glm-gaussian",
dataPredictY=XCornell[1,], verbose=FALSE)
#> $valsPredict
#>       [,1]     [,2]    [,3]
#> 1 95.03164 97.08409 97.4436
#> 
PLS_glm_wvc(dataY=yCornell[-1],dataX=XCornell[-1,],nt=3,modele="pls-glm-gaussian",
dataPredictY=XCornell[1,], verbose=FALSE)
#> $valsPredict
#>       [,1]     [,2]     [,3]
#> 1 93.74777 95.32475 96.08522
#> 
rm("XCornell","yCornell")

data(aze_compl)
Xaze_compl<-aze_compl[,2:34]
yaze_compl<-aze_compl$y
PLS_glm(yaze_compl,Xaze_compl,10,modele="pls-glm-logistic",typeVC="none", verbose=FALSE)$InfCrit
#>                 AIC      BIC Missclassed Chi2_Pearson_Y    RSS_Y      R2_Y
#> Nb_Comp_0  145.8283 148.4727          49      104.00000 25.91346        NA
#> Nb_Comp_1  118.1398 123.4285          28      100.53823 19.32272 0.2543365
#> Nb_Comp_2  109.9553 117.8885          26       99.17955 17.33735 0.3309519
#> Nb_Comp_3  105.1591 115.7366          22      123.37836 15.58198 0.3986915
#> Nb_Comp_4  103.8382 117.0601          21      114.77551 15.14046 0.4157299
#> Nb_Comp_5  104.7338 120.6001          21      105.35382 15.08411 0.4179043
#> Nb_Comp_6  105.6770 124.1878          21       98.87767 14.93200 0.4237744
#> Nb_Comp_7  107.2828 128.4380          20       97.04072 14.87506 0.4259715
#> Nb_Comp_8  109.0172 132.8167          22       98.90110 14.84925 0.4269676
#> Nb_Comp_9  110.9354 137.3793          21      100.35563 14.84317 0.4272022
#> Nb_Comp_10 112.9021 141.9904          20      102.85214 14.79133 0.4292027
#>             R2_residY RSS_residY
#> Nb_Comp_0          NA   25.91346
#> Nb_Comp_1   -6.004879  181.52066
#> Nb_Comp_2   -9.617595  275.13865
#> Nb_Comp_3  -12.332217  345.48389
#> Nb_Comp_4  -15.496383  427.47839
#> Nb_Comp_5  -15.937183  438.90105
#> Nb_Comp_6  -16.700929  458.69233
#> Nb_Comp_7  -16.908851  464.08033
#> Nb_Comp_8  -17.555867  480.84675
#> Nb_Comp_9  -17.834439  488.06552
#> Nb_Comp_10 -17.999267  492.33678
PLS_glm_wvc(yaze_compl,Xaze_compl,10,modele="pls-glm-logistic", keepcoeffs=TRUE, verbose=FALSE)
#> $valsPredict
#>           [,1]        [,2]        [,3]        [,4]        [,5]        [,6]
#> 1   0.43119151 0.702047114 0.631858261 0.673867317 0.633105720 0.643502088
#> 2   0.21927519 0.208200622 0.166440274 0.108288423 0.122814698 0.163115357
#> 3   0.04106998 0.029963092 0.006791838 0.002965336 0.003725325 0.003746401
#> 4   0.30929185 0.554282284 0.414180215 0.441249985 0.520547128 0.604562658
#> 5   0.05118382 0.067233546 0.022736369 0.006104612 0.005217977 0.002943517
#> 6   0.05290489 0.111062277 0.030654567 0.012385420 0.011599399 0.007397041
#> 7   0.04770848 0.012461554 0.009969069 0.011387816 0.019758711 0.057213719
#> 8   0.23030599 0.203968184 0.120876571 0.103096864 0.048403176 0.031122482
#> 9   0.79603342 0.646912665 0.498819889 0.545037945 0.543238786 0.423550027
#> 10  0.18420109 0.281904502 0.319927429 0.355711963 0.419764286 0.422912933
#> 11  0.86850308 0.895570974 0.856803244 0.853618586 0.877371706 0.906873854
#> 12  0.81973108 0.742594700 0.819314935 0.772099212 0.770185219 0.741949824
#> 13  0.10378096 0.034374513 0.061664137 0.105513481 0.144646667 0.124182122
#> 14  0.24065500 0.216121428 0.236771150 0.254073363 0.180029130 0.162304231
#> 15  0.25406843 0.189921149 0.122309327 0.046077032 0.051655065 0.046711552
#> 16  0.11742108 0.026636030 0.021144841 0.009932451 0.011928074 0.012867736
#> 17  0.62187722 0.356241370 0.300877326 0.222486735 0.223971302 0.259218578
#> 18  0.63085304 0.495186298 0.710135221 0.596235794 0.529760711 0.488161297
#> 19  0.15384411 0.299159188 0.246846331 0.150702653 0.120019849 0.138670219
#> 20  0.23938183 0.102233996 0.261323652 0.418478866 0.480248654 0.437149596
#> 21  0.31843739 0.359915314 0.702415128 0.553592625 0.486242603 0.471686531
#> 22  0.26948160 0.218115310 0.330039231 0.176032976 0.142744996 0.144657867
#> 23  0.38688875 0.322522571 0.194282977 0.216329749 0.180302487 0.177187916
#> 24  0.25929423 0.113805680 0.125244332 0.065622331 0.055343844 0.053635436
#> 25  0.23956717 0.451995286 0.543962067 0.419106683 0.580554368 0.530330948
#> 26  0.19444922 0.251820723 0.123966447 0.177175557 0.150373132 0.093400279
#> 27  0.70737780 0.733381154 0.462512849 0.414579021 0.345820167 0.355863574
#> 28  0.13829538 0.102635881 0.082813672 0.085265556 0.055242281 0.061020998
#> 29  0.40496093 0.233837866 0.285045914 0.369925740 0.447381566 0.509598027
#> 30  0.12221316 0.186893799 0.059111670 0.048296062 0.062374711 0.061380243
#> 31  0.36560681 0.422607656 0.224275870 0.377066732 0.416945477 0.375434519
#> 32  0.58998317 0.392983408 0.151910276 0.143422224 0.185402553 0.124318557
#> 33  0.72886354 0.793463890 0.580779612 0.545477866 0.603245033 0.464806210
#> 34  0.31934676 0.184466274 0.101014622 0.077194233 0.075925319 0.066154173
#> 35  0.23746345 0.058469811 0.031578958 0.024285467 0.016659570 0.009334068
#> 36  0.35013391 0.355313187 0.415796281 0.521253111 0.426498990 0.456656482
#> 37  0.64850050 0.375923248 0.304390434 0.224221121 0.215331147 0.211685762
#> 38  0.38755032 0.137284188 0.101796277 0.084379325 0.091286905 0.115888877
#> 39  0.27424019 0.226697218 0.139175059 0.183685469 0.193530386 0.216586409
#> 40  0.65388494 0.519887626 0.478567247 0.400516903 0.394105599 0.554735843
#> 41  0.12946450 0.097012345 0.102677452 0.045309845 0.068832652 0.045887603
#> 42  0.66689831 0.490170524 0.514407085 0.617870569 0.618113005 0.602192147
#> 43  0.34774313 0.607127514 0.681870162 0.723512977 0.612203171 0.599031952
#> 44  0.20534214 0.060514691 0.034936078 0.023923047 0.022811279 0.030127494
#> 45  0.10787137 0.110762018 0.078045740 0.062754149 0.056126233 0.054466980
#> 46  0.66116234 0.725582784 0.700422712 0.842037688 0.798801727 0.748214349
#> 47  0.17500187 0.181257265 0.147033836 0.075688012 0.052217574 0.054286615
#> 48  0.27826924 0.312332943 0.156648719 0.138061479 0.102556222 0.094175338
#> 49  0.67906352 0.518547155 0.484272837 0.608913994 0.596609568 0.610163418
#> 50  0.40298084 0.482554614 0.223122702 0.198767758 0.288197158 0.261541091
#> 51  0.28408032 0.163890090 0.188384267 0.161186131 0.189991487 0.226812553
#> 52  0.02120634 0.007494379 0.010418897 0.005706003 0.004028951 0.003211954
#> 53  0.53874100 0.450203729 0.614709704 0.386004761 0.328383647 0.295044939
#> 54  0.67152137 0.406158600 0.409417001 0.411703449 0.449462796 0.485883982
#> 55  0.08777970 0.075522741 0.132824654 0.178966392 0.124633556 0.141516794
#> 56  0.61855060 0.721686274 0.584022743 0.783926814 0.737841421 0.696846758
#> 57  0.52725323 0.671536671 0.902964122 0.912756588 0.928084614 0.938575849
#> 58  0.67512847 0.587075631 0.694863898 0.715487331 0.707353396 0.646163105
#> 59  0.39617134 0.436010575 0.346619901 0.369125532 0.274518456 0.240623252
#> 60  0.20362696 0.227843791 0.394195242 0.515590659 0.637767577 0.535068332
#> 61  0.67200417 0.660036839 0.747335781 0.662234187 0.679349829 0.720216240
#> 62  0.50396071 0.635184174 0.695678303 0.564600033 0.608155992 0.607249320
#> 63  0.53548611 0.537052534 0.518321788 0.506052818 0.560106390 0.705117054
#> 64  0.81456207 0.923746901 0.940505447 0.946852982 0.937110807 0.963914099
#> 65  0.76930107 0.652078773 0.779696124 0.750240414 0.759297145 0.774107137
#> 66  0.74081281 0.538838825 0.490449388 0.561092475 0.628698596 0.645415477
#> 67  0.91280565 0.985524499 0.987385270 0.995952381 0.995809795 0.994173319
#> 68  0.27025468 0.642218762 0.881003868 0.892387298 0.893777605 0.913554112
#> 69  0.72400506 0.847099614 0.763912533 0.755718333 0.751986129 0.842526567
#> 70  0.75276334 0.922407667 0.915955573 0.924372638 0.952274824 0.956119967
#> 71  0.50141585 0.318030343 0.310744148 0.218311644 0.212515092 0.174944911
#> 72  0.12927414 0.099124892 0.030697353 0.026288318 0.034633861 0.054380083
#> 73  0.48132481 0.465376787 0.500862912 0.742159597 0.688353402 0.689246028
#> 74  0.68772032 0.778090832 0.874976023 0.935754083 0.949673750 0.969407185
#> 75  0.72031937 0.606769993 0.622494495 0.572143218 0.579477288 0.627568141
#> 76  0.55793917 0.494421342 0.437434284 0.460400091 0.460542480 0.494890446
#> 77  0.79943991 0.855257753 0.935928508 0.978005670 0.981273972 0.984309678
#> 78  0.85745937 0.830618813 0.865134636 0.898312327 0.913928051 0.928170413
#> 79  0.83345117 0.919224715 0.856369203 0.791325587 0.745470889 0.766729075
#> 80  0.57235513 0.414039021 0.351476404 0.311958419 0.290518135 0.263925112
#> 81  0.86879852 0.898679681 0.879900894 0.773949255 0.704553708 0.671359206
#> 82  0.45535619 0.764274615 0.797532046 0.744461196 0.824764221 0.811884622
#> 83  0.82827167 0.900830434 0.938275422 0.933835357 0.951546879 0.956093466
#> 84  0.70805640 0.903075834 0.967809228 0.979031867 0.972328032 0.963104641
#> 85  0.58750948 0.270646003 0.275457584 0.262474288 0.265058291 0.259707331
#> 86  0.85501392 0.950653792 0.950972848 0.957421636 0.940415633 0.911863862
#> 87  0.59350862 0.367675846 0.541557780 0.623191067 0.628609893 0.603566405
#> 88  0.78701299 0.779778499 0.528980639 0.595241939 0.624596722 0.612263975
#> 89  0.53053475 0.463363952 0.357941654 0.342680704 0.325066419 0.411465198
#> 90  0.78742155 0.970653016 0.964721638 0.978839943 0.985003070 0.987145933
#> 91  0.35243467 0.320053491 0.479839722 0.640766735 0.611879896 0.666491216
#> 92  0.89720093 0.966418149 0.960220418 0.963882817 0.955612296 0.958551535
#> 93  0.68862173 0.907543424 0.913645809 0.913758045 0.951448175 0.959000320
#> 94  0.28015169 0.404075368 0.545346353 0.760763289 0.680088744 0.672242170
#> 95  0.47946796 0.826642022 0.699988396 0.535143591 0.588217592 0.502063440
#> 96  0.29303531 0.260466286 0.575477371 0.479152176 0.513734415 0.428690041
#> 97  0.57134153 0.493862376 0.662480104 0.534413997 0.483990436 0.485013225
#> 98  0.80738973 0.800712471 0.861510658 0.899175983 0.915135265 0.882660615
#> 99  0.17424886 0.084872199 0.039330110 0.089167782 0.089911231 0.075436072
#> 100 0.69464297 0.849056789 0.901977691 0.852790391 0.855378024 0.837083296
#> 101 0.42957598 0.523749819 0.697708069 0.781939401 0.831423860 0.869246953
#> 102 0.77532567 0.726430234 0.850516228 0.849288906 0.876686813 0.866961583
#> 103 0.30244539 0.633632746 0.580168854 0.600867278 0.516381330 0.595553649
#> 104 0.72830077 0.856329971 0.822297122 0.839564063 0.843377821 0.894234435
#>            [,7]        [,8]         [,9]        [,10]
#> 1   0.638787122 0.659736147 0.6929134622 0.6943989562
#> 2   0.183028361 0.177705421 0.1926067471 0.1927561148
#> 3   0.003376060 0.003152591 0.0032092459 0.0031900703
#> 4   0.637363116 0.601345955 0.6315525475 0.6128454701
#> 5   0.001804324 0.001087005 0.0009059437 0.0008702787
#> 6   0.007558967 0.008124069 0.0075922059 0.0072190675
#> 7   0.075010051 0.090577306 0.0904145007 0.0948080986
#> 8   0.037693727 0.036957495 0.0385194884 0.0366579947
#> 9   0.425135095 0.416115302 0.4234960049 0.4324032991
#> 10  0.343557178 0.355791649 0.3338844672 0.3416736098
#> 11  0.886266198 0.869985302 0.8690005512 0.8662229884
#> 12  0.710054194 0.673535568 0.6614385720 0.6495800417
#> 13  0.113655657 0.099642192 0.0950823806 0.0871083484
#> 14  0.162253329 0.157798411 0.1528392913 0.1395625569
#> 15  0.052536503 0.054898188 0.0458127882 0.0418375835
#> 16  0.014844229 0.016856869 0.0175371209 0.0173133552
#> 17  0.265167590 0.277072182 0.2975291066 0.3102814947
#> 18  0.471171553 0.456928590 0.4615499112 0.4590135395
#> 19  0.124799505 0.110005858 0.0939806189 0.0901182975
#> 20  0.390670737 0.441849813 0.4361859267 0.4353931743
#> 21  0.476731092 0.500960968 0.5071133610 0.5064639572
#> 22  0.122666126 0.121059180 0.1160295322 0.1176406817
#> 23  0.157621354 0.130954478 0.1380573789 0.1433585464
#> 24  0.060662413 0.064337031 0.0732478070 0.0788979344
#> 25  0.578979810 0.545258742 0.5619282569 0.5595832463
#> 26  0.108427152 0.149040860 0.1378520800 0.1466056822
#> 27  0.395406681 0.355201942 0.3518622723 0.3491848585
#> 28  0.063474701 0.061968018 0.0571164691 0.0556497892
#> 29  0.437293949 0.440015737 0.4520380190 0.4550505243
#> 30  0.090222972 0.106807614 0.1110660966 0.1078204935
#> 31  0.352227836 0.338481138 0.3335947194 0.3461925622
#> 32  0.137231222 0.139858851 0.1304182629 0.1232302292
#> 33  0.429294436 0.474754408 0.4757750462 0.4534418743
#> 34  0.075702396 0.089288097 0.0976117186 0.1002116059
#> 35  0.008228014 0.007198486 0.0075462359 0.0077113361
#> 36  0.416343698 0.433478675 0.4314919966 0.4291386780
#> 37  0.214184092 0.233031978 0.2466886171 0.2451334330
#> 38  0.131467992 0.133488355 0.1335614164 0.1313510107
#> 39  0.155762668 0.148717710 0.1567326194 0.1594062043
#> 40  0.577240742 0.554673059 0.5497730900 0.5509835451
#> 41  0.048124309 0.050731059 0.0500083553 0.0448639866
#> 42  0.557924005 0.621291699 0.6152093088 0.6041143234
#> 43  0.634717435 0.625848315 0.6179020932 0.6180509570
#> 44  0.030924644 0.034659867 0.0306744277 0.0329719302
#> 45  0.044139143 0.041342057 0.0370482993 0.0355143768
#> 46  0.765226062 0.719351450 0.7165147398 0.7371329957
#> 47  0.057359793 0.045401532 0.0448929095 0.0487307228
#> 48  0.078230361 0.075182597 0.0721409234 0.0707122052
#> 49  0.694411530 0.720080600 0.7210842527 0.7183599426
#> 50  0.235732262 0.220747630 0.2261186866 0.2145370126
#> 51  0.234065793 0.220978475 0.1869430913 0.1995610601
#> 52  0.003437084 0.003779608 0.0035686761 0.0033152141
#> 53  0.303245796 0.306092227 0.3003119432 0.3125280275
#> 54  0.523115204 0.502477841 0.4860084012 0.4799221158
#> 55  0.159531646 0.148936794 0.1471387050 0.1461051611
#> 56  0.704311058 0.670809998 0.6863833984 0.6954430065
#> 57  0.943204808 0.926497752 0.9285111732 0.9189983080
#> 58  0.654185441 0.637253621 0.6339188721 0.6237031375
#> 59  0.211731291 0.194702818 0.2153012812 0.2224698451
#> 60  0.578481709 0.553430985 0.5498083458 0.5552296673
#> 61  0.717399252 0.688899651 0.6833342733 0.6938073689
#> 62  0.580236946 0.561386908 0.5344644041 0.5476964614
#> 63  0.651909358 0.666423048 0.6518698084 0.6663502131
#> 64  0.966570975 0.967919914 0.9629859429 0.9607163779
#> 65  0.769033500 0.771933407 0.7688031852 0.7654739625
#> 66  0.618660601 0.605251254 0.6147034915 0.6115411360
#> 67  0.992233615 0.992370008 0.9912205187 0.9908677967
#> 68  0.926820425 0.940010587 0.9433622300 0.9472604786
#> 69  0.820022341 0.828019425 0.8345071602 0.8307300622
#> 70  0.952446809 0.942761722 0.9464573655 0.9474262795
#> 71  0.177146497 0.202296797 0.2065928146 0.2148691609
#> 72  0.059294591 0.044850251 0.0396625181 0.0359318012
#> 73  0.764615989 0.774200149 0.7720125792 0.7645581755
#> 74  0.977761403 0.984747935 0.9840257594 0.9833411230
#> 75  0.691909459 0.682556639 0.6572532436 0.6429766278
#> 76  0.532574707 0.494727136 0.4906445050 0.5066664293
#> 77  0.987637116 0.991228314 0.9916479530 0.9915530258
#> 78  0.945274182 0.949158918 0.9500366289 0.9503794757
#> 79  0.754986019 0.776998067 0.7816242563 0.7871699651
#> 80  0.228016301 0.223396570 0.2186964646 0.2082883519
#> 81  0.666420005 0.684833773 0.6766117373 0.6632488136
#> 82  0.801528750 0.829859317 0.8267776343 0.8261734713
#> 83  0.959791502 0.967581315 0.9699116096 0.9732698475
#> 84  0.956332714 0.964097103 0.9633606742 0.9650418710
#> 85  0.265579761 0.306826150 0.3330031055 0.3383605790
#> 86  0.891734459 0.888000574 0.8923024430 0.8958113953
#> 87  0.634861265 0.637836802 0.6497185838 0.6508400647
#> 88  0.592698628 0.583637434 0.5435709244 0.5588191361
#> 89  0.424076781 0.404623342 0.4313744898 0.4218769380
#> 90  0.989332252 0.992616886 0.9943134281 0.9946703862
#> 91  0.675576650 0.734303755 0.7206461704 0.7140897952
#> 92  0.953076020 0.959646001 0.9545742998 0.9480225668
#> 93  0.970105937 0.974883444 0.9754998983 0.9782804092
#> 94  0.680953582 0.649567373 0.6240201397 0.6168279359
#> 95  0.561170807 0.599211294 0.5856984481 0.5961543896
#> 96  0.396894305 0.339334143 0.3662679214 0.3640990048
#> 97  0.484857170 0.487769044 0.4938813223 0.4960597646
#> 98  0.881013093 0.859741343 0.8710430696 0.8736903224
#> 99  0.072463010 0.082375646 0.0886148007 0.0861258464
#> 100 0.853833336 0.877254170 0.8697760323 0.8733522761
#> 101 0.873059397 0.868378502 0.8660074209 0.8659618513
#> 102 0.877121140 0.890714101 0.8995141757 0.9033306611
#> 103 0.555036219 0.587526765 0.6104774849 0.6020532420
#> 104 0.871930923 0.882906865 0.8840853461 0.8836606796
#> 
#> $coeffs
#>                       [,1]
#> tempConstante -2.276982302
#>               -1.068275295
#>                3.509231595
#>               -1.651869135
#>                2.207538418
#>                0.568523938
#>               -0.059691869
#>               -0.214529856
#>               -1.405223273
#>                0.396973880
#>               -0.782167532
#>                0.677591817
#>               -0.972259676
#>                0.650745841
#>                0.723667343
#>                0.477540145
#>                0.638755948
#>                1.666070158
#>               -0.005938234
#>                0.482766293
#>               -0.904425334
#>                0.300460249
#>                1.367992779
#>               -1.201977825
#>               -1.536120691
#>               -1.983144986
#>                1.544435411
#>                1.410302156
#>               -0.495400138
#>                0.454129717
#>                1.240250301
#>               -0.222933455
#>               -2.822712745
#>                0.026369914
#> 
rm("Xaze_compl","yaze_compl")
# }