.. title: Epsilon-Constrained Optimization
.. slug: 29
.. date: 2022-08-29 18:00
.. tags: documentation,genetic algorithms,english,hardware,hamradio,optimization,open source
.. description:
.. wp-status: publish
.. has_math: true
.. |--| unicode:: U+2013 .. en dash
.. |ohm| unicode:: U+02126 .. Omega
   :trim:
.. |_| unicode:: U+00A0 .. Non-breaking space
   :trim:
.. |epsilon| unicode:: U+03B5 .. epsilon
.. |alpha| unicode:: U+03B1 .. alpha
.. |alp| unicode:: U+03B1 .. alpha
   :trim:
.. |delta| unicode:: U+03B4 .. delta
.. |-| unicode:: U+02011 .. Non-breaking hyphen
   :trim:

Many optimization problems involve constraints that valid solutions must
satisfy. A `constrained optimization problem`_
is typically written as a nonlinear programming problem [1]_.

.. _`constrained optimization problem`:

.. math::

    \hbox{Minimize} \; & f_i(\vec{x}), & i &= 1, \ldots, I \\
    \hbox{Subject to} \; & g_j(\vec{x}) \le 0, & j &= 1, \ldots, J \\
    & h_k(\vec{x}) = 0, & k &= 1, \ldots, K \\
    & x_m^l \le x_m \le x_m^u, & m &= 1, \ldots, M

In this problem, there are :math:`M` variables (the vector
:math:`\vec{x}` is of size :math:`M`), :math:`J` inequality constraints,
:math:`K` equality constraints, and each variable :math:`x_m` must lie in
the range :math:`[x_m^l, x_m^u]` (often called a box constraint). The
functions :math:`f_i` are called the objective functions. If there is more
than one objective function, the problem is called a multi-objective
optimization problem, as described previously in this blog [2]_.
In the following I will use some terms that were introduced in that
earlier blog post without further explanation.

The objective functions are not necessarily minimized (as in the
formula above) but can also be maximized if the best solutions to a
problem require maximization. Note that the inequality constraints are
often written with a :math:`\ge` sign, but such a constraint is easily
converted (e.g. by multiplying by -1) to use a :math:`\le` sign.

Since it is very hard to fulfil equality constraints, especially if they
involve nonlinear functions of the input variables, equality constraints
are often converted to inequality constraints using a |delta| |-|
neighborhood:

.. math::

    -\delta \le h_k(\vec{x}) \le \delta

Here |delta| is chosen according to the requirements of the problem
for which a solution is sought.
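
This conversion is easy to sketch in code. The following is an
illustrative Python snippet (the function name and interface are my own
invention, not part of any library): each equality constraint yields two
inequality constraints that both must be :math:`\le 0`.

```python
# Sketch: convert an equality constraint h(x) = 0 into two inequality
# constraints using a delta-neighborhood. Names here are illustrative.
def equality_as_inequalities(h, delta):
    """Return functions g1, g2 with g1(x) <= 0 and g2(x) <= 0
    exactly when -delta <= h(x) <= delta."""
    return (lambda x: h(x) - delta,    # enforces h(x) <= delta
            lambda x: -h(x) - delta)   # enforces h(x) >= -delta

# Example: h(x) = x^2 - 2, i.e. x should be close to sqrt(2)
g1, g2 = equality_as_inequalities(lambda x: x * x - 2.0, delta=1e-6)
x = 2.0 ** 0.5
assert g1(x) <= 0 and g2(x) <= 0    # x = sqrt(2) satisfies both
assert g1(2.0) > 0                  # x = 2 violates the neighborhood
```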

One very successful method of solving constrained optimization problems
is to consider a lexicographic ordering of constraint violations and
objective function values. Candidate solutions to the optimization
problem are first sorted by violated constraints (typically the sum over
all violated constraints) and then by the objective function value(s)
[1]_. When comparing two individuals during selection in the genetic
algorithm there are three cases: If both individuals violate
constraints, the individual with the lower constraint violation wins. If
one violates constraints and the other does not, the one not violating
constraints wins. In the last case, where neither individual violates
constraints, the normal comparison is used (which depends on the
algorithm and on whether we're minimizing or maximizing).

This method, originally proposed by Deb [1]_, is implemented in the
genetic algorithm package I'm maintaining, PGAPack_, and in my Python
wrapper for it, PGAPy_.
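
The three cases above can be sketched as a small comparison function.
This is a hypothetical illustration of Deb's rule; the function and
parameter names are my own, not the PGAPack or PGAPy API:

```python
# Hypothetical sketch of Deb's comparison rule for tournament selection.
def deb_compare(viol_a, viol_b, fitness_a, fitness_b, maximize=False):
    """Return True if individual A wins the tournament against B.

    viol_*    -- sum of constraint violations (0 means feasible)
    fitness_* -- objective function value
    """
    if viol_a > 0 and viol_b > 0:
        # Both infeasible: lower total violation wins
        return viol_a < viol_b
    if viol_a > 0 or viol_b > 0:
        # Exactly one infeasible: the feasible individual wins
        return viol_a == 0
    # Both feasible: normal comparison on the objective
    return fitness_a > fitness_b if maximize else fitness_a < fitness_b

assert deb_compare(0.0, 0.5, 10.0, 1.0)      # feasible beats infeasible
assert deb_compare(0.1, 0.5, 10.0, 1.0)      # lower violation wins
assert not deb_compare(0.0, 0.0, 10.0, 1.0)  # both feasible: minimize
```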

With this algorithm for handling constraints, the constraints are
optimized first, before the algorithm "looks" at the objective
function(s) at all. It often happens that the algorithm ends up
searching in a region of the input space where no good solutions exist
(but no constraints are violated). Hard problems often contain equality
constraints (converted to inequality constraints as indicated earlier)
or other "hard" constraints. In my previous blog post [2]_ on
antenna optimization I wrote: "the optimization algorithm has a hard
time finding the director solutions at all. Only in one of a handful
experiments I was able to obtain the pareto front plotted above".
In that experiment I was running 50 searches and only 5 of them did
*not* get stuck in a local optimum. A similar thing happens for
problem (7) in Deb's paper [1]_, which has equality constraints. I've
implemented this as `example problem 7`_ in PGAPack_. It only finds a
solution near the (known) optimum when :math:`\delta \ge 10^{-2}` for
all equality constraints (I didn't experiment with different random
seeds for the optimizer; maybe a better solution would be possible with
a different random seed). In the paper [1]_, Deb uses
:math:`\delta = 10^{-3}` for the same reason.

One method for handling this problem was appealing because it is so
easy to understand and implement:

Takahama and Sakai first experimented with a method for relaxing
constraints during the early stages of optimization with a formulation
they called an |alpha| |-| constrained genetic algorithm [3]_. They
later simplified the formulation and called the resulting algorithm
|epsilon| constrained optimization. It can be applied to different
optimization algorithms, not just genetic algorithms and variants [4]_.
Of special interest is the application of the method to differential
evolution [5]_, [6]_, but of course it can also be applied to other
forms of genetic algorithms.

Note that the |epsilon| in the name of the algorithm *can* be used for
the |delta| introduced when converting an equality constraint to
inequality constraints but is not limited to this case.

During the run of the optimizer, a new value of |epsilon| is computed in
each generation. The comparison of individuals outlined above is
modified so that an individual is treated as if it were not violating
any constraints whenever its constraint violation is below |epsilon|. So
if both individuals have constraint violations larger than |epsilon|,
the one with the lower violation wins. If one violation is below
|epsilon| and the other above, the individual with the violation below
|epsilon| wins. And finally, if the constraint violations of both
individuals are below |epsilon|, the normal comparison takes place.

The last case is the key to the success of this algorithm: even though
the search proceeds in a direction where the constraint violations are
minimized, *at the same time* good solutions in terms of the objective
function are found.
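
The modified comparison can be sketched as a variant of Deb's rule with
an |epsilon| level. Again a hypothetical illustration; names are my own,
not the PGAPack/PGAPy API:

```python
# Hypothetical sketch of the epsilon-level comparison.
def eps_compare(viol_a, viol_b, fitness_a, fitness_b, eps, maximize=False):
    """Return True if individual A wins against B under epsilon level eps."""
    # Violations at or below eps are treated as "feasible"
    feas_a = viol_a <= eps
    feas_b = viol_b <= eps
    if feas_a and feas_b:
        # Both within the epsilon level: normal objective comparison
        return fitness_a > fitness_b if maximize else fitness_a < fitness_b
    if feas_a != feas_b:
        # Only one within the epsilon level: that one wins
        return feas_a
    # Both above the epsilon level: lower violation wins
    return viol_a < viol_b

# With eps = 0.2 both violations 0.15 and 0.1 count as feasible,
# so the better objective value (minimization) decides:
assert eps_compare(0.15, 0.1, 1.0, 2.0, eps=0.2)
```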

The algorithm begins by initializing :math:`\varepsilon_0` with the
constraint violation of the individual with index :math:`\theta` in the
initial population sorted by constraint violation, where
:math:`\theta` is a parameter of the algorithm between 1 and the
population size. A good value uses the individual at about 20%
of the population size, which is also the default in PGAPack_.
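
The initialization can be sketched as follows; the helper and its names
are illustrative, not the PGAPack API:

```python
# Sketch of the epsilon_0 initialization: take the constraint violation
# of the individual at (1-based) index theta of the initial population
# sorted by violation; theta defaults to 20% of the population size.
def initial_epsilon(violations, theta=None):
    """Return eps_0 from the list of initial constraint violations."""
    if theta is None:
        theta = max(1, int(0.2 * len(violations)))
    ordered = sorted(violations)
    return ordered[theta - 1]

pop = [0.0, 3.0, 1.0, 7.0, 2.0, 9.0, 4.0, 5.0, 8.0, 6.0]
assert initial_epsilon(pop) == 1.0  # 2nd-smallest of 10 (theta = 2)
```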

In each generation :math:`t`, :math:`\varepsilon_t` is computed by

.. math::

    \varepsilon_t = \varepsilon_0 \left(1-\frac{t}{T_c}\right)^{cp}

up to generation :math:`T_c`. After that generation, |epsilon| is set to
0. The exponent :math:`cp` is between 2 and 10. The 2010 paper [6]_
recommends setting :math:`cp = 0.3\, cp + 0.7\, cp_\min` at
generation :math:`T_\lambda = 0.95\, T_c`, where :math:`cp_\min` is
the fixed value 3. The initial value of :math:`cp` is chosen so that
:math:`\varepsilon_\lambda = 10^{-5}` at generation :math:`T_\lambda`,
unless the resulting :math:`cp` is smaller than :math:`cp_\min`, in
which case it is set to :math:`cp_\min`. PGAPack_ implements this
schedule for :math:`cp` by default but allows changing :math:`cp`
at the start and during the run of the optimizer, so it's easy to
implement a different schedule for :math:`cp` |--| the default works
quite nicely, though.
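
The |epsilon| schedule itself is only a few lines. A sketch in Python
(function and parameter names are my own; PGAPack uses its own naming,
e.g. the ``epsilon_generation`` parameter for :math:`T_c`):

```python
# Sketch of the epsilon schedule: eps_t = eps_0 * (1 - t/T_c)**cp,
# and 0 from generation T_c onwards.
def epsilon_schedule(t, eps0, t_c, cp):
    """Epsilon level at generation t."""
    if t >= t_c:
        return 0.0
    return eps0 * (1.0 - t / t_c) ** cp

eps0, t_c, cp = 5.0, 100, 5
assert epsilon_schedule(0, eps0, t_c, cp) == eps0    # starts at eps_0
assert epsilon_schedule(t_c, eps0, t_c, cp) == 0.0   # 0 from T_c on
# The level decreases monotonically over the run:
levels = [epsilon_schedule(t, eps0, t_c, cp) for t in range(t_c + 1)]
assert all(a > b for a, b in zip(levels, levels[1:]))
```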

With the |epsilon| constraint method, example 7 from Deb [1]_ can be
optimized with a precision of :math:`10^{-6}` in my experiments; see the
``epsilon_generation`` parameter in the `optimizer example`_.

The `antenna-optimizer`_ with an |epsilon| |-| generation of 50 (that's
the :math:`T_c` parameter of the algorithm) gets stuck in the local
optimum in only *one* of 50 cases; all other cases find good results:

.. iframe:: /content/2022/plot-50.html

In that picture, all solutions that are dominated by solutions from
another run are drawn in black. It can be seen that the data from run
number 16 did not contribute any non-dominated solutions (the number 16
is missing from the legend on the right side). You can turn off the
display of the dominated solutions by clicking on the black dot in the
legend.

When I increase the |epsilon| |-| generation to 60, the run with random
seed 16 also finds a solution:

.. iframe:: /content/2022/plot-50+60.html

We also see that the solutions of all runs are quite good: even the
black "shadow" of the dominated solutions is close to the real pareto
front, so a single run of the algorithm is enough to find a good set of
solutions.
.. [1] Kalyanmoy Deb. An efficient constraint handling method for
genetic algorithms. Computer Methods in Applied Mechanics and
Engineering, 186(2–4):311–338, June 2000.
.. [2] Ralf Schlatterbeck. `Multi-objective antenna optimization`_.
Blog post, Open Source Consulting, December 2021.
.. [3] Tetsuyuki Takahama and Setsuko Sakai. Constrained optimization
by |alpha| constrained genetic algorithm ( |alp| GA). Systems
and Computers in Japan, 35(5):11–22, May 2004.
.. [4] Tetsuyuki Takahama and Setsuko Sakai. Constrained optimization
by |epsilon| constrained particle swarm optimizer with |epsilon|
|-| level control. In Abraham et al., editors, Soft Computing as
Transdisciplinary Science and Technology, volume 29 of Advances
in Soft Computing, pages 1019–1029. Springer, 2005.
.. [5] Tetsuyuki Takahama and Setsuko Sakai. Constrained optimization by
the |epsilon| constrained differential evolution with gradient-based
mutation and feasible elites. In IEEE International Conference on
Evolutionary Computation (CEC). Vancouver, BC, Canada, July 2006.
.. [6] Tetsuyuki Takahama and Setsuko Sakai. Constrained optimization by
the |epsilon| constrained differential evolution with an archive and
gradient-based mutation. In IEEE Congress on Evolutionary Computation
(CEC), Barcelona, Spain, July 2010.
.. _`Multi-objective antenna optimization`:
https://blog.runtux.com/posts/2021/12/27/
.. _`PGAPack`: https://github.com/schlatterbeck/pgapack
.. _`PGAPy`: https://github.com/schlatterbeck/pgapy
.. _`example problem 7`:
https://github.com/schlatterbeck/pgapack/blob/master/examples/deb/deb7.c
.. _`optimizer example`:
https://github.com/schlatterbeck/pgapack/blob/master/examples/deb/optimize.c
.. _`antenna-optimizer`: https://github.com/schlatterbeck/antenna-optimizer