Finds an optimal solution (minimum) of the function Q.func within the given parameter bounds.
Arguments
- Q.func: name of the function to be minimized.
- bounds: bounds for the parameters; a matrix with one row per parameter and the columns "lower" and "upper" (see Examples).
- round.n: number of digits after the decimal point; default: 5.
- parms.coding: parameter coding, none or log2; default: none.
- fminlower: minimal value for the function Q.func; default: 0.
- flag.find.one.min: stop after finding one minimum? Default: FALSE.
- show: show plots of the DIRECT algorithm: none, final iteration only, or all iterations; default: none.
- N: number of start points; see Details.
- maxevals: the maximum number of DIRECT function evaluations; default: 500.
- pdf.name: file name for pdf output.
- pdf.width: width of the pdf output; default: 12.
- pdf.height: height of the pdf output; default: 12.
- my.mfrow: plot layout (mfrow); default: c(1,1).
- verbose: print progress information? Default: TRUE.
- seed: random seed.
- ...: additional argument(s) passed to Q.func (for example x, y, family and foldid in the Examples).
Value
- fmin: minimal value of Q.func on the interval defined by bounds.
- xmin: parameter value(s) at which the minimum was attained.
- iter: number of iterations.
- neval: number of visited points.
- maxevals: the maximum number of DIRECT function evaluations.
- seed: random seed.
- bounds: bounds for the parameters.
- Q.func: name of the function that was minimized.
- points.fmin: the set of points sharing the same fmin.
- Xtrain: visited points.
- Ytrain: output of Q.func at the visited points Xtrain.
- gp.seed: seed(s) for the Gaussian Process.
- model.list: detailed information on the search process.
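The result is a plain list, so the components above can be extracted with the $ operator. A minimal illustration, assuming fit holds the object returned by EPSGO (as in the Examples below):

# Access the key components of a fitted EPSGO object (illustrative only)
fit$fmin     # minimal value of Q.func found within the bounds
fit$xmin     # parameter value(s) at which the minimum was attained
fit$neval    # number of visited points
fit$Xtrain   # all visited points
fit$Ytrain   # Q.func values at the visited points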
Details
If the number of start points N is not supplied by the user, it is chosen according to the dimensionality of the parameter space: N = 10D + 1, where D is the number of parameters. For high-dimensional parameter spaces with more than six dimensions the initial set is restricted to 65 points, and for a one-dimensional parameter space N is set to 21 for stability reasons.
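A minimal sketch of this default rule; the helper default_start_points and its argument d are illustrative names, not part of the package:

# Illustrative only: default number of start points for a D-dimensional
# parameter space, following the rule described above.
default_start_points <- function(d) {
  if (d == 1) {
    21L            # one-dimensional case: 21 points for stability
  } else if (d > 6) {
    65L            # more than 6 dimensions: initial set capped at 65
  } else {
    10L * d + 1L   # general rule: N = 10 * D + 1
  }
}

default_start_points(1)   # 21, matching the 21 start points in the Examples
default_start_points(3)   # 31
default_start_points(10)  # 65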
The idea of EPSGO (Efficient Parameter Selection via Global Optimization): starting from an initial Latin hypercube sample of N points, we train an online Gaussian Process (GP), look for the point with the maximal expected improvement, sample there, and update the GP. It is therefore not essential that the GP models the error surface of the SVM in parameter space exactly; what matters is that it points to potentially interesting regions of the parameter space where we should sample next. We continue sampling points until a convergence criterion is met. The expected-improvement criterion is sketched below.
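For minimization, the expected improvement at a candidate point follows directly from the GP's predictive mean and standard deviation. The sketch below shows the standard closed-form criterion in base R; ei() and its arguments are illustrative names, and the package's internal implementation may differ in detail:

# Expected improvement (EI) for minimization, given the GP prediction at a
# candidate point: mu = predictive mean, sigma = predictive standard
# deviation, fmin = best (smallest) Q.func value observed so far.
ei <- function(mu, sigma, fmin) {
  if (sigma <= 0) return(0)
  z <- (fmin - mu) / sigma
  (fmin - mu) * pnorm(z) + sigma * dnorm(z)
}

# The point with the largest EI is evaluated next, the GP is updated with the
# new observation, and the loop repeats until convergence.
ei(mu = 0.450, sigma = 0.01, fmin = 0.442)  # ~0.0012: little expected gain
ei(mu = 0.430, sigma = 0.01, fmin = 0.442)  # ~0.0126: clearly worth sampling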
DIRECT is a sampling algorithm that requires no knowledge of the objective function's gradient. Instead, the algorithm samples points in the domain and uses the information gathered so far to decide where to search next. The DIRECT algorithm converges globally to the optimum of the objective function. The name DIRECT is a shortening of the phrase 'DIviding RECTangles', which describes the way the algorithm moves towards the optimum.
The source code was adapted from the MATLAB originals; special thanks to Holger Froehlich.
References
Froehlich, H. and Zell, A. (2005). Efficient parameter selection for support vector machines in classification and regression via model-based global optimization. In Proc. Int. Joint Conf. Neural Networks, 1431–1438.
Sill, M., Hielscher, T., Becker, N. and Zucknick, M. (2014). c060: Extended Inference with Lasso and Elastic-Net Regularized Cox and Generalized Linear Models. Journal of Statistical Software, 62(5), 1–22. https://doi.org/10.18637/jss.v062.i05.
Examples
set.seed(1010)
n=1000;p=100
nzc=trunc(p/10)
x=matrix(rnorm(n*p),n,p)
beta=rnorm(nzc)
fx= x[,seq(nzc)] %*% beta
eps=rnorm(n)*5
y=drop(fx+eps)
px=exp(fx)
px=px/(1+px)
ly=rbinom(n=length(px),prob=px,size=1)
set.seed(1011)
# \donttest{
# y - binomial
y.classes<-ifelse(y>= median(y),1, 0)
set.seed(1234)
nfolds = 10
foldid <- balancedFolds(class.column.factor=y.classes, cross.outer=nfolds)
#> 121 
#> 2 
#> 3 
#> 4 
#> 5 
#> 6 
#> 7 
#> 8 
#> 9 
#> 10 
bounds <- t(data.frame(alpha=c(0, 1)))
colnames(bounds)<-c("lower","upper")
 
fit <- EPSGO(Q.func="tune.glmnet.interval", 
             bounds=bounds, 
             parms.coding="none", 
             seed = 1234, 
             show="none",
             fminlower = -100,
             x = x, y = y.classes, family = "binomial", 
             foldid = foldid,
             type.min = "lambda.1se",
             type.measure = "mse")
#> [1] "parms.coding"
#> [1] "none"
#>          [,1]
#>  [1,] 0.73445
#>  [2,] 0.85311
#>  [3,] 0.53235
#>  [4,] 0.13097
#>  [5,] 0.60814
#>  [6,] 0.62236
#>  [7,] 0.99216
#>  [8,] 0.26607
#>  [9,] 0.31969
#> [10,] 0.15851
#> [11,] 0.67349
#> [12,] 0.86540
#> [13,] 0.41169
#> [14,] 0.90698
#> [15,] 0.01183
#> [16,] 0.48959
#> [17,] 0.22937
#> [18,] 0.35578
#> [19,] 0.09411
#> [20,] 0.45474
#> [21,] 0.77450
#> [1] "alpha= 0.73445"
#> [1] "alpha= 0.85311"
#> [1] "alpha= 0.53235"
#> [1] "alpha= 0.13097"
#> [1] "alpha= 0.60814"
#> [1] "alpha= 0.62236"
#> [1] "alpha= 0.99216"
#> [1] "alpha= 0.26607"
#> [1] "alpha= 0.31969"
#> [1] "alpha= 0.15851"
#> [1] "alpha= 0.67349"
#> [1] "alpha= 0.8654"
#> [1] "alpha= 0.41169"
#> [1] "alpha= 0.90698"
#> [1] "alpha= 0.01183"
#> [1] "alpha= 0.48959"
#> [1] "alpha= 0.22937"
#> [1] "alpha= 0.35578"
#> [1] "alpha= 0.09411"
#> [1] "alpha= 0.45474"
#> [1] "alpha= 0.7745"
#>          X         Q
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "loop 1"
#> [1] "fmin= 0.441921792409136"
#> [1] "EImax= 0.360494188095381"
#> [1] "xmax:"
#>            upper
#> alpha 0.04183813
#> [1] "history:"
#>       Iteration Nr Function Count      f_min
#>  [1,]            1              3  0.0000000
#>  [2,]            2              9  0.0000000
#>  [3,]            3             27 -0.2824064
#>  [4,]            4             29 -0.3465652
#>  [5,]            5             83 -0.3533452
#>  [6,]            6             87 -0.3551043
#>  [7,]            7             93 -0.3556474
#>  [8,]            8            255 -0.3580834
#>  [9,]            9            257 -0.3593507
#> [10,]           10            261 -0.3597449
#> [11,]           11            267 -0.3598733
#> [12,]           12            275 -0.3599158
#> [13,]           13            285 -0.3599299
#> [14,]           14            297 -0.3599347
#> [15,]           15            311 -0.3599362
#> [16,]           16            791 -0.3604942
#> [1] "iteration : 1   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.04184
#>     Xtrain    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> [1] "loop 2"
#> [1] "alpha= 0.04184"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.28764932113945"
#> [1] "xmax:"
#>              upper
#> alpha 9.408382e-07
#> [1] "history:"
#>       Iteration Nr Function Count      f_min
#>  [1,]            1              3  0.0000000
#>  [2,]            2              9  0.0000000
#>  [3,]            3             27 -0.1696142
#>  [4,]            4             29 -0.1891384
#>  [5,]            5             83 -0.2656889
#>  [6,]            6             87 -0.2810880
#>  [7,]            7            249 -0.2855411
#>  [8,]            8            255 -0.2869601
#>  [9,]            9            261 -0.2874261
#> [10,]           10            271 -0.2875807
#> [11,]           11            283 -0.2876322
#> [12,]           12            769 -0.2876493
#> [1] "iteration : 2   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>      alpha
#> [1,]     0
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> [1] "loop 3"
#> [1] "alpha= 0"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.288438449825165"
#> [1] "xmax:"
#>            upper
#> alpha 0.01859379
#> [1] "history:"
#>       Iteration Nr Function Count      f_min
#>  [1,]            1              3  0.0000000
#>  [2,]            2              9  0.0000000
#>  [3,]            3             27 -0.2870820
#>  [4,]            4             29 -0.2870820
#>  [5,]            5             83 -0.2870820
#>  [6,]            6             87 -0.2870820
#>  [7,]            7            247 -0.2870820
#>  [8,]            8            253 -0.2870820
#>  [9,]            9            259 -0.2880008
#> [10,]           10            267 -0.2883041
#> [11,]           11            277 -0.2884049
#> [12,]           12            763 -0.2884384
#> [1] "iteration : 3   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.01859
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> [1] "loop 4"
#> [1] "alpha= 0.01859"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.376644447641587"
#> [1] "xmax:"
#>            upper
#> alpha 0.03436934
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.000000e+00
#>  [2,]            2              9  0.000000e+00
#>  [3,]            3             27 -8.834395e-08
#>  [4,]            4             29 -3.743045e-01
#>  [5,]            5             83 -3.765597e-01
#>  [6,]            6             87 -3.765597e-01
#>  [7,]            7             93 -3.766393e-01
#>  [8,]            8            255 -3.766444e-01
#>  [9,]            9            263 -3.766444e-01
#> [10,]           10            273 -3.766444e-01
#> [11,]           11            285 -3.766444e-01
#> [12,]           12            299 -3.766444e-01
#> [13,]           13            315 -3.766444e-01
#> [14,]           14            333 -3.766444e-01
#> [15,]           15            817 -3.766444e-01
#> [1] "iteration : 4   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.03437
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> [1] "loop 5"
#> [1] "alpha= 0.03437"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.365043353841314"
#> [1] "xmax:"
#>            upper
#> alpha 0.02537723
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.000000e+00
#>  [2,]            2              9  0.000000e+00
#>  [3,]            3             27 -1.551645e-05
#>  [4,]            4             29 -3.332488e-01
#>  [5,]            5             83 -3.332488e-01
#>  [6,]            6             87 -3.449330e-01
#>  [7,]            7            249 -3.579414e-01
#>  [8,]            8            253 -3.607789e-01
#>  [9,]            9            259 -3.616089e-01
#> [10,]           10            265 -3.618739e-01
#> [11,]           11            275 -3.619609e-01
#> [12,]           12            759 -3.650434e-01
#> [1] "iteration : 5   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.02538
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> [1] "loop 6"
#> [1] "alpha= 0.02538"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.33165134229888"
#> [1] "xmax:"
#>             upper
#> alpha 0.005647852
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.000000e+00
#>  [2,]            2              9  0.000000e+00
#>  [3,]            3             27 -2.058179e-11
#>  [4,]            4             29 -3.310340e-01
#>  [5,]            5             83 -3.310340e-01
#>  [6,]            6             87 -3.310340e-01
#>  [7,]            7            249 -3.316411e-01
#>  [8,]            8            255 -3.316411e-01
#>  [9,]            9            263 -3.316507e-01
#> [10,]           10            273 -3.316513e-01
#> [11,]           11            285 -3.316513e-01
#> [12,]           12            771 -3.316513e-01
#> [1] "iteration : 6   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.00565
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> [1] "loop 7"
#> [1] "alpha= 0.00565"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.169139300761214"
#> [1] "xmax:"
#>            upper
#> alpha 0.02254907
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.000000e+00
#>  [2,]            2              9  0.000000e+00
#>  [3,]            3             27 -3.120980e-16
#>  [4,]            4             81 -1.283425e-03
#>  [5,]            5             83 -9.553696e-02
#>  [6,]            6            245 -1.690474e-01
#>  [7,]            7            247 -1.690474e-01
#>  [8,]            8            251 -1.690474e-01
#>  [9,]            9            257 -1.690576e-01
#> [10,]           10            265 -1.691317e-01
#> [11,]           11            747 -1.691393e-01
#> [1] "iteration : 7   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.02255
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> [1] "loop 8"
#> [1] "alpha= 0.02255"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.298921709400262"
#> [1] "xmax:"
#>             upper
#> alpha 0.008814713
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.000000e+00
#>  [2,]            2              9  0.000000e+00
#>  [3,]            3             27 -2.719416e-08
#>  [4,]            4             29 -1.207255e-01
#>  [5,]            5             83 -2.721111e-01
#>  [6,]            6             87 -2.721111e-01
#>  [7,]            7            249 -2.764528e-01
#>  [8,]            8            253 -2.987879e-01
#>  [9,]            9            257 -2.987879e-01
#> [10,]           10            263 -2.988964e-01
#> [11,]           11            745 -2.989217e-01
#> [1] "iteration : 8   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.00881
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> [1] "loop 9"
#> [1] "alpha= 0.00881"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> 29 0.00881 0.4553102
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.358893503444292"
#> [1] "xmax:"
#>            upper
#> alpha 0.01514844
#> [1] "history:"
#>       Iteration Nr Function Count        f_min
#>  [1,]            1              3  0.000000000
#>  [2,]            2              9  0.000000000
#>  [3,]            3             27 -0.007906843
#>  [4,]            4             29 -0.131613696
#>  [5,]            5             83 -0.334659883
#>  [6,]            6             87 -0.355827428
#>  [7,]            7            249 -0.356846019
#>  [8,]            8            253 -0.358737403
#>  [9,]            9            259 -0.358890975
#> [10,]           10            267 -0.358890975
#> [11,]           11            751 -0.358893503
#> [1] "iteration : 9   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.01515
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> 29 0.00881 0.4553102
#> [1] "loop 10"
#> [1] "alpha= 0.01515"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> 29 0.00881 0.4553102
#> 30 0.01515 0.4539218
#> [1] "fmin= 0.441921792409136"
#> ...done
#> ...done
#> ...done
#> ...done
#> ...done
#> [1] "EImax= 0.344941817080621"
#> [1] "xmax:"
#>             upper
#> alpha 0.002584483
#> [1] "history:"
#>       Iteration Nr Function Count         f_min
#>  [1,]            1              3  0.0000000000
#>  [2,]            2              9  0.0000000000
#>  [3,]            3             27 -0.0001563221
#>  [4,]            4             29 -0.1340091716
#>  [5,]            5             83 -0.3425694657
#>  [6,]            6             87 -0.3425694657
#>  [7,]            7            247 -0.3449022032
#>  [8,]            8            253 -0.3449022032
#>  [9,]            9            259 -0.3449390412
#> [10,]           10            267 -0.3449418038
#> [11,]           11            279 -0.3449418038
#> [12,]           12            767 -0.3449418171
#> [1] "iteration : 10   fmin =  0.441921792409136"
#> [1] "finished? FALSE"
#> [1] "X"
#>        alpha
#> [1,] 0.00258
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> 29 0.00881 0.4553102
#> 30 0.01515 0.4539218
#> [1] "loop 11"
#> [1] "alpha= 0.00258"
#>      alpha    Ytrain
#> 1  0.73445 0.4427822
#> 2  0.85311 0.4422716
#> 3  0.53235 0.4419218
#> 4  0.13097 0.4444470
#> 5  0.60814 0.4435479
#> 6  0.62236 0.4434463
#> 7  0.99216 0.4438921
#> 8  0.26607 0.4440248
#> 9  0.31969 0.4426631
#> 10 0.15851 0.4444240
#> 11 0.67349 0.4431161
#> 12 0.86540 0.4422268
#> 13 0.41169 0.4432506
#> 14 0.90698 0.4420846
#> 15 0.01183 0.4534574
#> 16 0.48959 0.4423172
#> 17 0.22937 0.4430704
#> 18 0.35578 0.4419665
#> 19 0.09411 0.4462451
#> 20 0.45474 0.4426950
#> 21 0.77450 0.4425921
#> 22 0.04184 0.4502101
#> 23 0.00000 0.4563529
#> 24 0.01859 0.4535900
#> 25 0.03437 0.4498726
#> 26 0.02538 0.4515460
#> 27 0.00565 0.4550679
#> 28 0.02255 0.4518852
#> 29 0.00881 0.4553102
#> 30 0.01515 0.4539218
#> 31 0.00258 0.4558921
#> [1] "fmin= 0.441921792409136"
#> [1] "No changes in the last 10 iterations, break iterations"
summary(fit)
#>             Length Class      Mode     
#> fmin         1     -none-     numeric  
#> xmin         1     -none-     numeric  
#> iter         1     -none-     numeric  
#> neval        1     -none-     numeric  
#> maxevals     1     -none-     numeric  
#> seed         1     -none-     numeric  
#> bounds       2     -none-     numeric  
#> Q.func       1     -none-     character
#> points.fmin  2     data.frame list     
#> Xtrain      31     -none-     numeric  
#> Ytrain      31     -none-     numeric  
#> gp.seed     10     -none-     numeric  
#> model.list  31     -none-     list     
# }
if (FALSE) { # \dontrun{
# y - multinomial: 1 = low (<= 25% quantile), 2 = middle, 3 = high (>= 75% quantile)
y.classes<-ifelse(y <= quantile(y,0.25),1, ifelse(y >= quantile(y,0.75),3, 2))
set.seed(1234)
nfolds = 10
foldid <- balancedFolds(class.column.factor=y.classes, cross.outer=nfolds)
bounds <- t(data.frame(alpha=c(0, 1)))
colnames(bounds)<-c("lower","upper")
 
fit <- EPSGO(Q.func="tune.glmnet.interval", 
             bounds=bounds, 
             parms.coding="none", 
             seed = 1234, 
             show="none",
             fminlower = -100,
             x = x, y = y.classes, family = "multinomial", 
             foldid = foldid,
             type.min = "lambda.1se",
             type.measure = "mse")
summary(fit)
} # }
if (FALSE) { # \dontrun{
##poisson
N=500; p=20
nzc=5
x=matrix(rnorm(N*p),N,p)
beta=rnorm(nzc)
f = x[,seq(nzc)]%*%beta
mu=exp(f)
y.classes=rpois(N,mu)
nfolds = 10
set.seed(1234)
foldid <- balancedFolds(class.column.factor=y.classes, cross.outer=nfolds)
fit <- EPSGO(Q.func="tune.glmnet.interval", 
             bounds=bounds, 
             parms.coding="none", 
             seed = 1234, 
             show="none",
             fminlower = -100,
             x = x, y = y.classes, family = "poisson", 
             foldid = foldid,
             type.min = "lambda.1se",
             type.measure = "mse")
summary(fit)
} # }
if (FALSE) { # \dontrun{
#gaussian
set.seed(1234)
x=matrix(rnorm(100*1000,0,1),100,1000)
y <- x[1:100,1:1000]%*%c(rep(2,5),rep(-2,5),rep(.1,990))
foldid <- rep(1:10,each=10)
fit <- EPSGO(Q.func="tune.glmnet.interval", 
             bounds=bounds, 
             parms.coding="none", 
             seed = 1234, 
             show="none",
             fminlower = -100,
             x = x, y = y, family = "gaussian", 
             foldid = foldid,
             type.min = "lambda.1se",
             type.measure = "mse")
summary(fit)  
} # }
# y - Cox model: see the package vignette