SVCLoss

class SVCLoss(**kwargs)[source]

Bases: KernelLoss

This class provides a kernel loss function for classification tasks by fitting an SVC model from scikit-learn. Given training samples, \(x_{i}\), with binary labels, \(y_{i}\), and a kernel, \(K_{\theta}\), parameterized by values, \(\theta\), the loss is defined as:

\[SVCLoss = \sum_{i} a_i - 0.5 \sum_{i,j} a_i a_j y_{i} y_{j} K_{\theta}(x_i, x_j)\]

where \(a_i\) are the optimal Lagrange multipliers found by solving the standard SVM quadratic program. Note that the hyperparameter C for the soft-margin penalty can be specified through the keyword arguments.

Minimizing this loss over the parameters, \(\theta\), of the kernel is equivalent to maximizing a weighted kernel alignment, which in turn yields the tightest upper bound on the SVM generalization error for a given parameterization.

See https://arxiv.org/abs/2105.03406 for further details.
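The loss above can be sketched directly with scikit-learn, without any quantum-kernel machinery: fit an SVC on a precomputed kernel matrix and evaluate the dual objective from the fitted multipliers. The `svc_loss` helper below is a hypothetical illustration of the formula, not the library's implementation; it relies on the fact that `SVC.dual_coef_` stores \(a_i y_i\) for the support vectors.

```python
# Illustrative sketch of the SVCLoss formula (hypothetical helper,
# not the qiskit-machine-learning implementation).
import numpy as np
from sklearn.svm import SVC

def svc_loss(kernel_matrix, labels, **kwargs):
    """Compute sum_i a_i - 0.5 * sum_ij a_i a_j y_i y_j K(x_i, x_j)."""
    svc = SVC(kernel="precomputed", **kwargs)
    svc.fit(kernel_matrix, labels)
    signed = svc.dual_coef_[0]          # a_i * y_i for each support vector
    a = np.abs(signed)                  # a_i (Lagrange multipliers)
    sv = svc.support_                   # indices of the support vectors
    K_sv = kernel_matrix[np.ix_(sv, sv)]
    # Dual objective: sum_i a_i - 0.5 * (a*y)^T K (a*y)
    return np.sum(a) - 0.5 * signed @ K_sv @ signed

# Toy example: linear kernel on four 1-D points, margin between -1 and 1.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0, 0, 1, 1])
K = X @ X.T
loss = svc_loss(K, y)
print(loss)  # dual objective at the optimum; 0.5 for this dataset
```

In `SVCLoss` itself, the kernel matrix would be produced by the parameterized quantum kernel at the given `parameter_values`, so minimizing the loss over \(\theta\) re-shapes the kernel rather than the classifier.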

Parameters

**kwargs -- Arbitrary keyword arguments passed to the SVC constructor during SVCLoss evaluation.

Methods

evaluate(parameter_values, quantum_kernel, ...)

Evaluate the loss of a kernel function on a labeled dataset.