Optimizers (qiskit.algorithms.optimizers)¶
This module contains a variety of classical optimizers for use by quantum variational algorithms, such as VQE.
Logically, these optimizers can be divided into two categories:

Local Optimizers
Given an optimization problem, a local optimizer is a function that attempts to find an optimal value within the neighboring set of a candidate solution.

Global Optimizers
Given an optimization problem, a global optimizer is a function that attempts to find an optimal value among all possible solutions.
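The distinction can be illustrated on a toy one-dimensional double-well function: a local optimizer started in the wrong basin converges to the nearer but higher minimum, while a global search over the whole domain finds the lower one. The following pure-Python sketch is illustrative only; the function and both optimizers are toy stand-ins, not Qiskit's implementations.

```python
def f(x):
    # Double well: local minima near x = -1 and x = +1; the one near -1 is lower.
    return (x * x - 1.0) ** 2 + 0.2 * x

def local_descent(f, x0, lr=0.01, steps=2000, h=1e-6):
    """Finite-difference gradient descent: a toy *local* optimizer."""
    x = x0
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)  # central-difference gradient
        x -= lr * grad
    return x

def global_grid(f, lo, hi, n=1001):
    """Exhaustive grid search: a brute-force *global* optimizer."""
    pts = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(pts, key=f)

x_local = local_descent(f, x0=0.8)      # started in the right-hand basin
x_global = global_grid(f, -2.0, 2.0)    # searches the whole interval
```

Here the local optimizer stays in the basin it started in (near x = +1), while the global search locates the lower minimum near x = -1.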
Optimizer Base Class¶
Support level enum for features such as bounds, gradients, and initial points. 

Base class for optimization algorithms. 
Local Optimizers¶
Adam and AMSGRAD optimizers. 

Analytic Quantum Gradient Descent (AQGD) with Epochs optimizer. 

Conjugate Gradient optimizer. 

Constrained Optimization By Linear Approximation optimizer. 

Limited-memory BFGS Bound optimizer. 

Gaussian-smoothed Line Search. 

The gradient descent minimization routine. 

Nelder-Mead optimizer. 

Nakanishi-Fujii-Todo algorithm. 

Parallelized Limited-memory BFGS optimizer. 

Powell optimizer. 

Sequential Least SQuares Programming optimizer. 

Simultaneous Perturbation Stochastic Approximation (SPSA) optimizer. 

The Quantum Natural SPSA (QNSPSA) optimizer. 

Truncated Newton (TNC) optimizer. 

A general Qiskit Optimizer wrapping scipy.optimize.minimize. 
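To give a feel for how a stochastic local optimizer such as SPSA operates, the sketch below estimates the gradient from only two function evaluations per iteration, regardless of dimension, by perturbing all parameters simultaneously along a random ±1 direction. This is a minimal pure-Python illustration; the gain-sequence parameters are generic textbook defaults, not Qiskit's, and the real SPSA class offers calibration, callbacks, and resampling that are omitted here.

```python
import random

def spsa_minimize(f, x0, maxiter=200, a=0.2, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Toy SPSA sketch: two evaluations of f per iteration, any dimension."""
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for k in range(1, maxiter + 1):
        ak = a / k ** alpha          # decaying step size
        ck = c / k ** gamma          # decaying perturbation size
        # Random simultaneous perturbation direction with entries in {-1, +1}.
        delta = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        xp = [xi + ck * di for xi, di in zip(x, delta)]
        xm = [xi - ck * di for xi, di in zip(x, delta)]
        diff = (f(xp) - f(xm)) / (2.0 * ck)
        # One scalar difference yields a gradient estimate for every component.
        x = [xi - ak * diff / di for xi, di in zip(x, delta)]
    return x

# Minimize a simple quadratic; the true minimum is at (1, -2).
result = spsa_minimize(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

The two-evaluation gradient estimate is what makes SPSA attractive for noisy, high-dimensional objectives such as variational circuit costs, where each evaluation is expensive.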
Qiskit also provides the following optimizers, which are built out using the optimizers from the scikit-quant package. The scikit-quant package is not installed by default but must be explicitly installed, if desired, by the user. The optimizers therein are provided under various licenses, so it has been made an optional install for the end user to choose whether to install it or not. To install the scikit-quant dependent package you can use pip install scikit-quant.
Bound Optimization BY Quadratic Approximation algorithm. 

IMplicit FILtering algorithm. 

Stable Noisy Optimization by Branch and FIT algorithm. 
Global Optimizers¶
The global optimizers here all use NLopt for their core functionality and can only be used if the NLopt package on which they depend is manually installed. See the following section for installation instructions.
The global optimizers are as follows:
Controlled Random Search (CRS) with local mutation optimizer. 

DIviding RECTangles Locally-biased optimizer. 

DIviding RECTangles Locally-biased Randomized optimizer. 

ESCH evolutionary optimizer. 

Improved Stochastic Ranking Evolution Strategy optimizer. 
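As a rough illustration of the evolutionary approach behind global optimizers such as ESCH and ISRES, here is a toy elitist evolution strategy in pure Python. It mutates the current best point with Gaussian noise, keeps any improvement, and slowly shrinks the mutation scale. This is a deliberately simplified assumption-laden sketch; the actual ESCH and ISRES algorithms use far more sophisticated population, ranking, and constraint-handling schemes.

```python
import random

def evolution_strategy(f, lo, hi, dim, generations=100, lam=20, seed=1):
    """Toy (1+lambda) evolution strategy over the box [lo, hi]^dim."""
    rng = random.Random(seed)
    best = [rng.uniform(lo, hi) for _ in range(dim)]
    best_val = f(best)
    sigma = (hi - lo) / 4.0                  # initial mutation scale
    for _ in range(generations):
        for _ in range(lam):
            # Gaussian mutation of the incumbent, clipped to the bounds.
            cand = [min(hi, max(lo, xi + rng.gauss(0.0, sigma))) for xi in best]
            val = f(cand)
            if val < best_val:               # elitist: keep only improvements
                best, best_val = cand, val
        sigma *= 0.95                        # gradually shift from exploration to refinement
    return best, best_val

def himmelblau(v):
    # Multimodal test function with four global minima, all of value 0.
    x, y = v
    return (x * x + y - 11.0) ** 2 + (x + y * y - 7.0) ** 2

best, best_val = evolution_strategy(himmelblau, lo=-5.0, hi=5.0, dim=2)
```

Because the early, large mutations sample broadly across the whole box before the scale shrinks, such strategies can escape poor basins that would trap a purely local optimizer.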