This function computes the optimal ridge regression model based on cross-validation.
```r
ridge.cv(X, y, lambda = NULL, scale = TRUE, k = 10, plot.it = FALSE,
         groups = NULL, method.cor = "pearson", compute.jackknife = TRUE)
```
| Argument | Description |
|---|---|
| `X` | Matrix of input observations. The rows of `X` contain the observations, the columns contain the variables. |
| `y` | Vector of responses. The length of `y` must equal the number of rows of `X`. |
| `lambda` | Vector of penalty terms. |
| `scale` | Scale the columns of `X`? Default is `scale = TRUE`. |
| `k` | Number of splits in `k`-fold cross-validation. Default is `k = 10`. |
| `plot.it` | Plot the cross-validation error as a function of `lambda`? Default is `FALSE`. |
| `groups` | An optional vector with the same length as `y` that encodes a grouping of the observations, used when the cross-validation splits are formed. |
| `method.cor` | How should the correlation to the response be computed? Default is `"pearson"`. |
| `compute.jackknife` | Logical. If `TRUE`, the regression coefficients on each cross-validation split are stored (see `coefficients.jackknife` below). Default is `TRUE`. |
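To make the mechanics concrete, here is a minimal base-R sketch of the kind of computation `ridge.cv` performs: for each candidate `lambda`, ridge regression is fitted on k-1 folds and the mean squared error is recorded on the held-out fold. The function `ridge_cv_sketch`, its default `lambda` grid, and its internals are illustrative assumptions, not the parcor implementation.

```r
# Illustrative sketch of k-fold cross-validation for ridge regression.
# Not the parcor implementation; names and defaults are assumptions.
ridge_cv_sketch <- function(X, y, lambda = 10^seq(-3, 3, length.out = 20),
                            k = 10, scale = TRUE) {
  if (scale) X <- base::scale(X)
  n <- nrow(X); p <- ncol(X)
  # randomly assign each observation to one of k folds
  folds <- sample(rep(seq_len(k), length.out = n))
  cv.error.matrix <- matrix(NA_real_, nrow = k, ncol = length(lambda))
  for (i in seq_len(k)) {
    train <- folds != i
    Xt <- X[train, , drop = FALSE]; yt <- y[train]
    # center the training data so the intercept is handled separately
    xm <- colMeans(Xt); ym <- mean(yt)
    Xc <- sweep(Xt, 2, xm); yc <- yt - ym
    for (j in seq_along(lambda)) {
      # ridge solution: (X'X + lambda * I)^{-1} X'y on the training fold
      beta <- solve(crossprod(Xc) + lambda[j] * diag(p), crossprod(Xc, yc))
      pred <- ym + sweep(X[!train, , drop = FALSE], 2, xm) %*% beta
      cv.error.matrix[i, j] <- mean((y[!train] - pred)^2)
    }
  }
  cv.error <- colMeans(cv.error.matrix)
  list(cv.error.matrix = cv.error.matrix,
       cv.error = cv.error,
       lambda.opt = lambda[which.min(cv.error)])
}
```

Each row of the returned error matrix corresponds to one cross-validation split, matching the structure of the values described below.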
The function returns a list containing, among others, the following components:

- matrix of cross-validated errors based on mean squared error; a row corresponds to one cross-validation split
- vector of cross-validated errors based on mean squared error
- optimal value of `lambda`, based on mean squared error
- intercept of the optimal model, based on mean squared error
- vector of regression coefficients of the optimal model, based on mean squared error
- matrix of cross-validated errors based on correlation; a row corresponds to one cross-validation split
- vector of cross-validated errors based on correlation
- optimal value of `lambda`, based on correlation
- intercept of the optimal model, based on correlation
- vector of regression coefficients of the optimal model, based on correlation
- `coefficients.jackknife`: array of the regression coefficients on each of the cross-validation splits; the dimension is `ncol(X) x length(lambda) x k`
Based on the regression coefficients `coefficients.jackknife` computed on the cross-validation splits, we can estimate their mean and their variance using the jackknife. We remark that under a fixed design and the assumption of normally distributed `y`-values, we can also derive the true distribution of the regression coefficients.
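The jackknife estimates described above can be sketched as follows. The function `jackknife_summary` and the input `B` are hypothetical names: `B` is assumed to be a `p x k` matrix whose columns hold the coefficients at the optimal `lambda` on each of the `k` cross-validation splits, extracted from the `coefficients.jackknife` array.

```r
# Hypothetical helper: jackknife mean and variance of ridge coefficients.
# B is assumed to be a p x k matrix; column i holds the coefficients
# fitted on cross-validation split i (a slice of coefficients.jackknife).
jackknife_summary <- function(B) {
  k <- ncol(B)
  bbar <- rowMeans(B)  # jackknife mean of each coefficient
  # jackknife variance: (k - 1)/k times the sum of squared deviations
  # of the split-wise coefficients from their mean
  v <- (k - 1) / k * rowSums((B - bbar)^2)
  list(mean = bbar, variance = v)
}
```

A large jackknife variance for a coefficient signals that its estimate is unstable across the cross-validation splits.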
Author: Nicole Kraemer