
Note

This page was generated from docs/tutorials/03_minimum_eigen_optimizer.ipynb.

Minimum Eigen Optimizer

Introduction

An interesting class of optimization problems to be addressed by quantum computing is that of Quadratic Unconstrained Binary Optimization (QUBO) problems. Finding the solution to a QUBO is equivalent to finding the ground state of a corresponding Ising Hamiltonian, which is an important problem not only in optimization, but also in quantum chemistry and physics. For this translation, the binary variables taking values in \(\{0, 1\}\) are replaced by spin variables taking values in \(\{-1, +1\}\), which allows the spin variables to be replaced by Pauli Z matrices and the objective to be rewritten as an Ising Hamiltonian. For more details on this mapping we refer to [1].
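For intuition, a minimal sketch of this substitution (our notation, not taken verbatim from [1]): writing each binary variable as

\[ x_i = \frac{1 - z_i}{2}, \qquad z_i \in \{-1, +1\}, \]

and substituting into a QUBO objective \(x^T Q x + c^T x\) yields, after promoting each spin variable \(z_i\) to the Pauli operator \(Z_i\) acting on qubit \(i\), an Ising Hamiltonian of the form

\[ H = \sum_{i<j} J_{ij}\, Z_i Z_j + \sum_i h_i\, Z_i + \text{const}. \]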

Qiskit provides automatic conversion from a suitable QuadraticProgram to an Ising Hamiltonian, which then allows leveraging all the MinimumEigensolver implementations, such as VQE, QAOA, or NumPyMinimumEigensolver (a classical exact method).

Qiskit wraps the translation to an Ising Hamiltonian (in Qiskit Aqua also called Operator), the call to a MinimumEigensolver, as well as the translation of the results back to an OptimizationResult in the MinimumEigenOptimizer.

In the following we first illustrate the conversion from a QuadraticProgram to an Operator and then show how to use the MinimumEigenOptimizer with different MinimumEigensolvers to solve a given QuadraticProgram. The algorithms in Qiskit automatically try to convert a given problem to the supported problem class if possible; for instance, the MinimumEigenOptimizer will automatically translate integer variables to binary variables or add linear equality constraints as quadratic penalty terms to the objective. It should be mentioned that a QiskitOptimizationError will be thrown if a direct conversion of a quadratic program with integer variables to an Ising Hamiltonian is attempted.
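As an illustration of these automatic conversions, the sketch below applies the corresponding converters from qiskit_optimization.converters by hand to a small, hypothetical problem with an integer variable and a linear equality constraint. The MinimumEigenOptimizer performs equivalent steps internally, so this is for illustration only, and the variable names and bounds are ours.

# Hypothetical example: turn an integer variable and an equality constraint
# into a plain QUBO by hand (the MinimumEigenOptimizer does this automatically).
from qiskit_optimization import QuadraticProgram
from qiskit_optimization.converters import IntegerToBinary, LinearEqualityToPenalty

problem = QuadraticProgram()
problem.integer_var(lowerbound=0, upperbound=3, name='w')
problem.binary_var('v')
problem.linear_constraint(linear={'w': 1, 'v': 1}, sense='==', rhs=2, name='lin_eq')
problem.minimize(linear={'w': 1, 'v': -2})

# 1) encode the integer variable with binary variables
problem_bin = IntegerToBinary().convert(problem)
# 2) replace the equality constraint by a quadratic penalty term
qubo_from_int = LinearEqualityToPenalty().convert(problem_bin)
print(qubo_from_int.export_as_lp_string())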

The circuit depth of QAOA potentially has to be increased with the problem size, which might be prohibitive for near-term quantum devices. A possible workaround is Recursive QAOA, as introduced in [2]. Qiskit generalizes this concept to the RecursiveMinimumEigenOptimizer, which is introduced at the end of this tutorial.

Converting a QUBO to an Operator

[1]:
from qiskit import BasicAer
from qiskit.utils import algorithm_globals, QuantumInstance
from qiskit.algorithms import QAOA, NumPyMinimumEigensolver
from qiskit_optimization.algorithms import MinimumEigenOptimizer, RecursiveMinimumEigenOptimizer, SolutionSample, OptimizationResultStatus
from qiskit_optimization import QuadraticProgram
from qiskit.visualization import plot_histogram
from typing import List, Tuple
import numpy as np
[2]:
# create a QUBO
qubo = QuadraticProgram()
qubo.binary_var('x')
qubo.binary_var('y')
qubo.binary_var('z')
qubo.minimize(linear=[1, -2, 3], quadratic={('x', 'y'): 1, ('x', 'z'): -1, ('y', 'z'): 2})
print(qubo.export_as_lp_string())
\ This file has been generated by DOcplex
\ ENCODING=ISO-8859-1
\Problem name: CPLEX

Minimize
 obj: x - 2 y + 3 z + [ 2 x*y - 2 x*z + 4 y*z ]/2
Subject To

Bounds
 0 <= x <= 1
 0 <= y <= 1
 0 <= z <= 1

Binaries
 x y z
End

Next we translate this QUBO into an Ising operator. This results not only in an Operator but also in a constant offset that has to be taken into account to shift the resulting value.

[3]:
op, offset = qubo.to_ising()
print('offset: {}'.format(offset))
print('operator:')
print(op)
offset: 1.5
operator:
-1.75 * ZII
+ 0.25 * IZI
+ 0.5 * ZZI
- 0.5 * IIZ
- 0.25 * ZIZ
+ 0.25 * IZZ

Sometimes a QuadraticProgram might also directly be given in the form of an Operator. For such cases, Qiskit also provides a converter from an Operator back to a QuadraticProgram, which we illustrate in the following.

[4]:
qp = QuadraticProgram()
qp.from_ising(op, offset, linear=True)
print(qp.export_as_lp_string())
\ This file has been generated by DOcplex
\ ENCODING=ISO-8859-1
\Problem name: CPLEX

Minimize
 obj: x0 - 2 x1 + 3 x2 + [ 2 x0*x1 - 2 x0*x2 + 4 x1*x2 ]/2
Subject To

Bounds
 0 <= x0 <= 1
 0 <= x1 <= 1
 0 <= x2 <= 1

Binaries
 x0 x1 x2
End

This converter allows, for instance, translating an Operator to a QuadraticProgram and then solving the problem with other algorithms that are not based on the Ising Hamiltonian representation, such as the GroverOptimizer.
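For example, the reconstructed problem could, in principle, be handed to the GroverOptimizer. Below is a minimal sketch, assuming the GroverOptimizer class from qiskit_optimization.algorithms and illustrative parameter values (num_value_qubits must be large enough to encode the range of objective values); it is not part of the original notebook.

# Sketch: solve the reconstructed QuadraticProgram with GroverOptimizer
# (illustrative parameter choices only)
from qiskit import BasicAer
from qiskit.utils import QuantumInstance
from qiskit_optimization.algorithms import GroverOptimizer

grover_optimizer = GroverOptimizer(
    num_value_qubits=6,
    num_iterations=10,
    quantum_instance=QuantumInstance(BasicAer.get_backend('qasm_simulator')),
)
print(grover_optimizer.solve(qp))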

Solving a QUBO with the MinimumEigenOptimizer

We start by initializing the MinimumEigensolver we want to use.

[5]:
algorithm_globals.random_seed = 10598
quantum_instance = QuantumInstance(BasicAer.get_backend('statevector_simulator'),
                                   seed_simulator=algorithm_globals.random_seed,
                                   seed_transpiler=algorithm_globals.random_seed)
qaoa_mes = QAOA(quantum_instance=quantum_instance, initial_point=[0., 0.])
exact_mes = NumPyMinimumEigensolver()

Then, we use the MinimumEigensolvers to create the corresponding MinimumEigenOptimizers.

[6]:
qaoa = MinimumEigenOptimizer(qaoa_mes)   # using QAOA
exact = MinimumEigenOptimizer(exact_mes)  # using the exact classical numpy minimum eigen solver

We first use the MinimumEigenOptimizer based on the classical exact NumPyMinimumEigensolver to get the optimal benchmark solution for this small example.

[7]:
exact_result = exact.solve(qubo)
print(exact_result)
optimal function value: -2.0
optimal value: [0. 1. 0.]
status: SUCCESS

Next we apply the MinimumEigenOptimizer based on QAOA to the same problem.

[8]:
qaoa_result = qaoa.solve(qubo)
print(qaoa_result)
optimal function value: -2.0
optimal value: [0. 1. 0.]
status: SUCCESS
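
Besides QAOA and the exact solver, any other MinimumEigensolver, e.g., VQE as mentioned in the introduction, can be plugged into the MinimumEigenOptimizer as well. Below is a minimal sketch, assuming the VQE class from qiskit.algorithms of this Qiskit version; the TwoLocal ansatz and COBYLA optimizer are illustrative choices of ours, not from the original notebook.

# Sketch: MinimumEigenOptimizer with VQE (illustrative ansatz and optimizer)
from qiskit.algorithms import VQE
from qiskit.algorithms.optimizers import COBYLA
from qiskit.circuit.library import TwoLocal

ansatz = TwoLocal(qubo.get_num_binary_vars(), 'ry', 'cz', reps=1, entanglement='linear')
vqe_mes = VQE(ansatz=ansatz, optimizer=COBYLA(maxiter=100), quantum_instance=quantum_instance)
vqe = MinimumEigenOptimizer(vqe_mes)
print(vqe.solve(qubo))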

Analysis of Samples

OptimizationResult provides useful information in the form of SolutionSamples (here denoted as samples). Each SolutionSample contains information about the input values (x), the corresponding objective function value (fval), the probability of obtaining that result (probability), and the solution status (status: SUCCESS, FAILURE, INFEASIBLE).

[9]:
print('variable order:', [var.name for var in qaoa_result.variables])
for s in qaoa_result.samples:
    print(s)
variable order: ['x', 'y', 'z']
SolutionSample(x=array([0., 1., 0.]), fval=-2.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 0., 0.]), fval=0.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 1., 0.]), fval=0.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 0., 0.]), fval=1.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 0., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 0., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 1., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 1., 1.]), fval=4.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)

We may also want to filter samples according to their status or probabilities.

[10]:
def get_filtered_samples(samples: List[SolutionSample],
                         threshold: float = 0,
                         allowed_status: Tuple[OptimizationResultStatus, ...] = (OptimizationResultStatus.SUCCESS,)):
    # keep only samples with an allowed status and a probability above the threshold
    res = []
    for s in samples:
        if s.status in allowed_status and s.probability > threshold:
            res.append(s)

    return res
[11]:
filtered_samples = get_filtered_samples(qaoa_result.samples,
                                        threshold=0.005,
                                        allowed_status=(OptimizationResultStatus.SUCCESS,))
for s in filtered_samples:
    print(s)
SolutionSample(x=array([0., 1., 0.]), fval=-2.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 0., 0.]), fval=0.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 1., 0.]), fval=0.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 0., 0.]), fval=1.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 0., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 0., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([0., 1., 1.]), fval=3.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)
SolutionSample(x=array([1., 1., 1.]), fval=4.0, probability=0.12499999999999994, status=<OptimizationResultStatus.SUCCESS: 0>)

To obtain a better perspective of the results, statistics are very helpful, both with respect to the objective function values and their respective probabilities. Mean and standard deviation are the very basics for understanding the results.

[12]:
fvals = [s.fval for s in qaoa_result.samples]
probabilities = [s.probability for s in qaoa_result.samples]
[13]:
np.mean(fvals)
[13]:
1.5
[14]:
np.std(fvals)
[14]:
1.9364916731037085
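
Since every sample also carries a probability, one could additionally compute probability-weighted statistics. A short sketch using numpy (not part of the original notebook):

# probability-weighted mean and standard deviation of the objective values
weighted_mean = np.average(fvals, weights=probabilities)
weighted_std = np.sqrt(np.average((np.asarray(fvals) - weighted_mean) ** 2, weights=probabilities))
print(weighted_mean, weighted_std)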

Finally, despite all the number-crunching, visualization is usually the best early-analysis approach.

[15]:
samples_for_plot = {' '.join(f'{qaoa_result.variables[i].name}={int(v)}'
                             for i, v in enumerate(s.x)): s.probability
                    for s in filtered_samples}
samples_for_plot
[15]:
{'x=0 y=1 z=0': 0.12499999999999994,
 'x=0 y=0 z=0': 0.12499999999999994,
 'x=1 y=1 z=0': 0.12499999999999994,
 'x=1 y=0 z=0': 0.12499999999999994,
 'x=0 y=0 z=1': 0.12499999999999994,
 'x=1 y=0 z=1': 0.12499999999999994,
 'x=0 y=1 z=1': 0.12499999999999994,
 'x=1 y=1 z=1': 0.12499999999999994}
[16]:
plot_histogram(samples_for_plot)
[16]:
../_images/tutorials_03_minimum_eigen_optimizer_31_0.png

RecursiveMinimumEigenOptimizer

The RecursiveMinimumEigenOptimizer takes a MinimumEigenOptimizer as input and applies the recursive optimization scheme to reduce the size of the problem one variable at a time. Once the size of the generated intermediate problem is below a given threshold (min_num_vars), the RecursiveMinimumEigenOptimizer uses another solver (min_num_vars_optimizer), e.g., an exact classical solver such as CPLEX or the MinimumEigenOptimizer based on the NumPyMinimumEigensolver.

In the following, we show how to use the RecursiveMinimumEigenOptimizer with the two MinimumEigenOptimizers introduced before.

First, we construct the RecursiveMinimumEigenOptimizer such that it reduces the problem size from 3 variables to 1 variable and then uses the exact solver for the last variable. Then we call solve to optimize the considered problem.

[17]:
rqaoa = RecursiveMinimumEigenOptimizer(qaoa, min_num_vars=1, min_num_vars_optimizer=exact)
[18]:
rqaoa_result = rqaoa.solve(qubo)
print(rqaoa_result)
optimal function value: -2.0
optimal value: [0. 1. 0.]
status: SUCCESS
[19]:
filtered_samples = get_filtered_samples(rqaoa_result.samples,
                                        threshold=0.005,
                                        allowed_status=(OptimizationResultStatus.SUCCESS,))
[20]:
samples_for_plot = {' '.join(f'{rqaoa_result.variables[i].name}={int(v)}'
                             for i, v in enumerate(s.x)): s.probability
                    for s in filtered_samples}
samples_for_plot
[20]:
{'x=0 y=1 z=0': 1.0}
[21]:
plot_histogram(samples_for_plot)
[21]:
../_images/tutorials_03_minimum_eigen_optimizer_39_0.png
[22]:
import qiskit.tools.jupyter
%qiskit_version_table
%qiskit_copyright

Version Information

Qiskit Software        Version
qiskit-terra           0.18.3
qiskit-aer             0.9.0
qiskit-optimization    0.2.3

System information
Python                 3.8.12 (default, Sep 13 2021, 08:28:12) [GCC 9.3.0]
OS                     Linux
CPUs                   2
Memory (Gb)            6.790924072265625

Wed Oct 06 16:47:34 2021 UTC

This code is a part of Qiskit

© Copyright IBM 2017, 2021.

This code is licensed under the Apache License, Version 2.0. You may
obtain a copy of this license in the LICENSE.txt file in the root directory
of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.

Any modifications or derivative works of this code must retain this
copyright notice, and modified files need to carry a notice indicating
that they have been altered from the originals.
