# Optimizers (qiskit.aqua.components.optimizers)

Aqua contains a variety of classical optimizers for use by quantum variational algorithms, such as VQE. Logically, these optimizers can be divided into two categories:

Local Optimizers

Given an optimization problem, a local optimizer is a function that attempts to find an optimal value within the neighboring set of a candidate solution.

Global Optimizers

Given an optimization problem, a global optimizer is a function that attempts to find an optimal value among all possible solutions.
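The distinction can be illustrated with a toy one-dimensional function. This is a conceptual sketch in plain Python, not the Aqua API; the function names and constants are made up for illustration:

```python
# Conceptual sketch (not the Aqua API): a one-dimensional function with two
# minima shows why a local optimizer's answer depends on its starting point,
# while a global strategy searches the whole domain.

def f(x):
    # Local minimum near x = 1.13, global minimum near x = -1.30
    return x ** 4 - 3 * x ** 2 + x

def local_descent(x, step=1e-3, iters=20000):
    """Naive gradient descent with a finite-difference gradient."""
    for _ in range(iters):
        grad = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        x -= step * grad
    return x

def global_grid_search(lo=-3.0, hi=3.0, n=10001):
    """Brute-force search over the whole interval."""
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    return min(xs, key=f)

x_local = local_descent(2.0)     # stays in the nearby basin, x ≈ 1.13
x_global = global_grid_search()  # finds the lower basin, x ≈ -1.30
```

Started at `x = 2.0`, the local method settles into the neighboring basin; the global search, which examines all candidates, finds the strictly lower one.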

# Optimizer Base Class

| Class | Description |
|---|---|
| `OptimizerSupportLevel` | Support level enum for features such as bounds, gradient and initial point. |
| `Optimizer` | Base class for optimization algorithms. |

# Local Optimizers

| Class | Description |
|---|---|
| `ADAM` | Adam and AMSGRAD optimizers. |
| `AQGD` | Analytic Quantum Gradient Descent (AQGD) with Epochs optimizer. |
| `CG` | Conjugate Gradient optimizer. |
| `COBYLA` | Constrained Optimization By Linear Approximation optimizer. |
| `L_BFGS_B` | Limited-memory BFGS Bound optimizer. |
| `GSLS` | Gaussian-smoothed Line Search optimizer. |
| `NELDER_MEAD` | Nelder-Mead optimizer. |
| `NFT` | Nakanishi-Fujii-Todo algorithm. |
| `P_BFGS` | Parallelized Limited-memory BFGS optimizer. |
| `POWELL` | Powell optimizer. |
| `SLSQP` | Sequential Least SQuares Programming optimizer. |
| `SPSA` | Simultaneous Perturbation Stochastic Approximation (SPSA) optimizer. |
| `TNC` | Truncated Newton (TNC) optimizer. |
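To give a flavor of one of these, SPSA estimates the gradient from only two objective evaluations per iteration, regardless of the number of parameters, which is why it is popular for noisy variational circuits. Below is a minimal pure-Python sketch of the idea, not Aqua's implementation; the gain exponents 0.602 and 0.101 follow Spall's standard recommendations, while the function name and the constants `a` and `c` are illustrative choices for this toy problem:

```python
import random

# Minimal illustrative SPSA (not Aqua's implementation): each iteration
# estimates the gradient from only TWO objective evaluations by perturbing
# all parameters simultaneously along a random +/-1 direction.

def spsa_minimize(f, x0, iters=2000, a=0.2, c=0.1, seed=7):
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602              # step-size gain schedule
        ck = c / k ** 0.101              # perturbation gain schedule
        delta = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        f_plus = f([xi + ck * di for xi, di in zip(x, delta)])
        f_minus = f([xi - ck * di for xi, di in zip(x, delta)])
        g = (f_plus - f_minus) / (2 * ck)  # one scalar difference, reused per coordinate
        x = [xi - ak * g / di for xi, di in zip(x, delta)]
    return x

# Toy quadratic objective with its minimum at (1, -2)
f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2
x = spsa_minimize(f, [5.0, 5.0])
```

Note that the two evaluations happen at simultaneously perturbed points, so the cost per iteration stays constant as the parameter count grows, unlike coordinate-wise finite differences.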

Qiskit Aqua also provides the following optimizers, which are built on the scikit-quant package. scikit-quant is not installed by default: the optimizers it contains are provided under various licenses, so it has been made an optional install that the user must choose explicitly. To install it, run `pip install 'qiskit-aqua[skquant]'`.

| Class | Description |
|---|---|
| `BOBYQA` | Bound Optimization BY Quadratic Approximation algorithm. |
| `IMFIL` | IMplicit FILtering algorithm. |
| `SNOBFIT` | Stable Noisy Optimization by Branch and FIT algorithm. |

# Global Optimizers

The global optimizers here all rely on NLopt for their core functionality and can only be used if the NLopt package is installed manually. See the following section for installation instructions.

The global optimizers are as follows:

| Class | Description |
|---|---|
| `CRS` | Controlled Random Search (CRS) with local mutation optimizer. |
| `DIRECT_L` | DIviding RECTangles Locally-biased optimizer. |
| `DIRECT_L_RAND` | DIviding RECTangles Locally-biased Randomized optimizer. |
| `ESCH` | ESCH evolutionary optimizer. |
| `ISRES` | Improved Stochastic Ranking Evolution Strategy optimizer. |
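To give a flavor of the first entry, Controlled Random Search maintains a population of candidate points spread over the whole search domain and repeatedly tries to replace the worst point with a simplex-style reflection built from random population members. Here is a toy pure-Python sketch of that idea, not NLopt's implementation; the function name, population size, and iteration count are illustrative:

```python
import random

# Toy sketch of the Controlled Random Search (CRS) idea (not NLopt's
# implementation): a population covers the whole bounded domain, and the
# worst member is repeatedly replaced by a reflection of one random member
# through the centroid of other random members, accepted only on improvement.

def crs_minimize(f, bounds, pop_size=30, iters=3000, seed=3):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        pop.sort(key=f)                    # pop[-1] is the current worst point
        sample = rng.sample(pop[:-1], dim + 1)
        centroid = [sum(p[i] for p in sample[:-1]) / dim for i in range(dim)]
        trial = [2 * centroid[i] - sample[-1][i] for i in range(dim)]
        ok = all(lo <= t <= hi for t, (lo, hi) in zip(trial, bounds))
        if ok and f(trial) < f(pop[-1]):
            pop[-1] = trial                # accept only when the trial improves
    return min(pop, key=f)

# Himmelblau's function: four global minima, all with value 0
f = lambda v: (v[0] ** 2 + v[1] - 11) ** 2 + (v[0] + v[1] ** 2 - 7) ** 2
best = crs_minimize(f, [(-5.0, 5.0), (-5.0, 5.0)])
```

Because the population is scattered over the entire bounded region rather than following a single trajectory, the method can locate a good basin even when the objective has several minima.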