Kernel Logistic PLS (klogitpls)

We first extract latent scores with Kernel PLS (KPLS):

$$ T = K_c U, $$

where $K_c = H K(X,X) H$ is the centered Gram matrix, with $H = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^\top$ the centering matrix, and the columns of $U$ are the dual score directions obtained from the KPLS deflation.
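Below is a minimal NumPy sketch of this extraction step, assuming an RBF kernel and NIPALS-style kernel deflation in the spirit of Rosipal & Trejo; the names `rbf_kernel`, `center_gram`, and `kpls_scores` are illustrative, not the package's API. The returned basis absorbs the usual $(T^\top K_c U)^{-1}$ factor so that $T = K_c U$ holds with the undeflated centered kernel.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # K(a, b) = exp(-gamma * ||a - b||^2)  (illustrative kernel choice)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def center_gram(K):
    # K_c = H K H with H = I - (1/n) 1 1^T
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kpls_scores(Kc, y, n_comp):
    """Extract latent scores T and a dual basis U such that T = Kc @ U."""
    n = Kc.shape[0]
    Kd = Kc.copy()                       # deflated kernel
    r = y.astype(float) - y.mean()       # deflated (centered) response
    T = np.zeros((n, n_comp))
    Ud = np.zeros((n, n_comp))
    for a in range(n_comp):
        u = r / np.linalg.norm(r)        # single-response dual weight
        t = Kd @ u
        t /= np.linalg.norm(t)
        T[:, a], Ud[:, a] = t, u
        P = np.eye(n) - np.outer(t, t)   # deflate kernel and response
        Kd = P @ Kd @ P
        r = r - t * (t @ r)
    # fold the (T^T Kc Ud)^{-1} correction into the stored basis so T = Kc @ U
    U = Ud @ np.linalg.inv(T.T @ Kc @ Ud)
    return T, U
```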

We then fit a logistic link in the latent space using IRLS:

$$ \eta = \beta_0 + T \beta, \qquad p = \sigma(\eta), $$

$$ W = \mathrm{diag}\big(p(1-p)\big), \qquad z = \eta + \frac{y - p}{p(1-p)}. $$

At each iteration, solve the weighted least squares system for $\theta = [\beta_0, \beta]$:

$$ (\tilde{M}^\top \tilde{M})\,\theta = \tilde{M}^\top \tilde{z}, \qquad \tilde{M} = W^{1/2}[\mathbf{1}, T], \quad \tilde{z} = W^{1/2} z. $$
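A minimal sketch of this IRLS loop with the scores $T$ held fixed (no step-halving, ridge penalty, or other safeguards; the names `sigmoid` and `irls_logistic` are illustrative):

```python
import numpy as np

def sigmoid(eta):
    return 1.0 / (1.0 + np.exp(-eta))

def irls_logistic(T, y, n_iter=25, tol=1e-8, eps=1e-10):
    """Fit p = sigmoid(beta0 + T @ beta) by iteratively reweighted least squares."""
    n = T.shape[0]
    M = np.column_stack([np.ones(n), T])       # design matrix [1, T]
    theta = np.zeros(M.shape[1])               # theta = [beta0, beta]
    for _ in range(n_iter):
        eta = M @ theta
        p = sigmoid(eta)
        w = np.clip(p * (1.0 - p), eps, None)  # IRLS weights, kept away from zero
        z = eta + (y - p) / w                  # working response
        Mw = M * np.sqrt(w)[:, None]           # M~ = W^{1/2} [1, T]
        zw = np.sqrt(w) * z                    # z~ = W^{1/2} z
        theta_new = np.linalg.solve(Mw.T @ Mw, Mw.T @ zw)
        if np.max(np.abs(theta_new - theta)) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta[0], theta[1:]                 # beta0, beta
```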

Optionally, we alternate: replace $y$ by $p$ and recompute KPLS to refresh $T$ for a few steps.
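Tying the pieces together, a sketch of this optional alternation, reusing `kpls_scores` and `irls_logistic` from the sketches above; the number of outer refreshes `n_outer` is an assumed tuning choice, and the exact refresh schedule in the package may differ.

```python
def fit_klogitpls(Kc, y, n_comp=3, n_outer=3):
    """Alternate KPLS score extraction with the latent-space logistic fit."""
    target = y.astype(float)                     # first pass uses y itself
    for _ in range(n_outer):
        T, U = kpls_scores(Kc, target, n_comp)   # refresh latent scores
        beta0, beta = irls_logistic(T, y)        # logistic fit in latent space
        target = sigmoid(beta0 + T @ beta)       # replace y by p for the next refresh
    return T, U, beta0, beta
```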
Prediction on new data uses the centered cross-kernel $K_c(X_*, X)$ and the stored KPLS basis $U$: $$ T_* = K_c(X_*, X) \, U, \qquad \hat{p}_* = \sigma\!\big(\beta_0 + T_* \beta\big). $$
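For completeness, a sketch of the prediction step under the same assumptions (RBF kernel, reusing `rbf_kernel` and `sigmoid` from above); the cross-kernel is centered against the training Gram matrix using the standard test-kernel centering formula.

```python
def predict_klogitpls(X_new, X_train, K_train, U, beta0, beta, gamma=1.0):
    """Compute p_* = sigmoid(beta0 + T_* beta) with T_* = K_c(X_*, X) U."""
    n = K_train.shape[0]
    K_t = rbf_kernel(X_new, X_train, gamma)      # raw cross-kernel K(X_*, X)
    ones_mn = np.ones((K_t.shape[0], n)) / n
    H = np.eye(n) - np.ones((n, n)) / n
    K_tc = (K_t - ones_mn @ K_train) @ H         # centered cross-kernel K_c(X_*, X)
    T_new = K_tc @ U                             # T_* = K_c(X_*, X) U
    return sigmoid(beta0 + T_new @ beta)         # predicted probabilities p_*

# usage (illustrative): p_hat = predict_klogitpls(X_test, X_train, K, U, beta0, beta)
```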