par_sigma0 (float) – scalar, initial standard deviation in each coordinate.
par_sigma0 should be about 1/4 of the width of the search domain region
in which the optimum is expected.
options (dict) – Optimizer options that are directly passed on to cma.
Enhanced Scatter Search (ESS) global optimization.
Scatter search is a meta-heuristic for global optimization. A set of points
(the reference set, RefSet) is iteratively adapted to explore the parameter
space and to follow promising directions.
This implementation is based on [1][2],
but does not implement any constraint handling beyond box constraints.
The basic steps of ESS are:
Initialization: Generate a diverse set of points (RefSet) in the
parameter space.
Recombination: Generate new points by recombining the RefSet points.
Improvement: Improve the RefSet by replacing points with better ones.
The steps are repeated until a stopping criterion is met.
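These steps can be illustrated with a highly simplified, self-contained sketch (a toy recombination scheme on a box-constrained problem; this is not pyPESTO's actual implementation):

```python
import random

def ess_sketch(objective, lb, ub, dim_refset=5, max_iter=50, seed=0):
    """Toy illustration of the ESS loop: initialize, recombine, improve."""
    rng = random.Random(seed)
    # Initialization: a diverse set of points within the box [lb, ub]
    refset = [
        [rng.uniform(l, u) for l, u in zip(lb, ub)] for _ in range(dim_refset)
    ]
    fvals = [objective(x) for x in refset]
    for _ in range(max_iter):
        for i in range(dim_refset):
            # Recombination: combine two RefSet members (midpoint + noise),
            # clipped back into the box constraints
            j = rng.randrange(dim_refset)
            child = [
                min(max((a + b) / 2 + rng.gauss(0, 0.1 * (u - l)), l), u)
                for a, b, l, u in zip(refset[i], refset[j], lb, ub)
            ]
            f = objective(child)
            # Improvement: replace a member if the new point is better
            if f < fvals[i]:
                refset[i], fvals[i] = child, f
    best = min(range(dim_refset), key=fvals.__getitem__)
    return refset[best], fvals[best]
```

Since members are only ever replaced by better points, the best objective value decreases monotonically over iterations.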
ESS is gradient-free, unless a gradient-based local optimizer is used
(local_optimizer).
Various hyperparameters control the behavior of ESS.
Initialization is controlled by dim_refset and n_diverse.
Local optimizations are controlled by local_optimizer, local_n1,
local_n2, and balance.
The optimization stops if any of the following criteria are met:
The maximum number of iterations is reached (max_iter).
The maximum number of objective function evaluations is reached
(max_eval).
The maximum wall-time is reached (max_walltime_s).
One of these criteria needs to be provided.
Note that the wall-time and function evaluation criteria are not checked
after every single function evaluation, and thus, the actual number of
function evaluations may slightly exceed the given value.
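The interplay of these criteria can be sketched as a simple per-iteration check (hypothetical helper for illustration, not pyPESTO API):

```python
import time

def should_stop(n_iter, n_eval, t0, max_iter=None, max_eval=None,
                max_walltime_s=None):
    """Check the three ESS stopping criteria; at least one must be set."""
    if max_iter is None and max_eval is None and max_walltime_s is None:
        raise ValueError("At least one stopping criterion must be provided.")
    if max_iter is not None and n_iter >= max_iter:
        return True
    if max_eval is not None and n_eval >= max_eval:
        return True
    if max_walltime_s is not None and time.time() - t0 >= max_walltime_s:
        return True
    return False
```

Because such a check runs once per iteration rather than per evaluation, the evaluation count can overshoot max_eval, as noted above.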
Objective function evaluations inside ESSOptimizer can be
parallelized using multiprocessing or multithreading by passing a value
>1 for n_procs or n_threads, respectively.
For plausible values of hyperparameters, see Villaverde et al.[3].
Parameters:
dim_refset (int) – Size of the ReferenceSet. Note that in every iteration at least
dim_refset**2-dim_refset function evaluations will occur.
max_iter (int) – Maximum number of ESS iterations.
local_n1 (int) – Minimum number of iterations before first local search.
Ignored if local_optimizer=None.
local_n2 (int) – Minimum number of iterations between consecutive local
searches. At most one local search is performed per iteration.
Ignored if local_optimizer=None.
local_optimizer (Optimizer | OptimizerFactory | None) – Local optimizer for refinement, a callable that creates a
pypesto.optimize.Optimizer, or None to skip local searches.
In case of a callable, it will be called with the keyword arguments
max_walltime_s and max_eval, which should be passed to the optimizer
(if supported) to honor the overall budget.
See SacessFidesFactory for an example.
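A minimal sketch of such a factory callable (hypothetical; a real factory would construct and return a configured pypesto.optimize.Optimizer, e.g. a FidesOptimizer, honoring the passed budget):

```python
def local_optimizer_factory(max_walltime_s=None, max_eval=None):
    # ESS calls the factory with the remaining budget; a real implementation
    # would forward these limits to the optimizer it constructs and return
    # that optimizer. Here we only record them to show the expected signature.
    return {"max_walltime_s": max_walltime_s, "max_eval": max_eval}
```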
n_diverse (int) – Number of samples to choose from to construct the initial RefSet.
max_eval – Maximum number of objective function evaluations allowed. This criterion is
only checked once per iteration, not after every objective
evaluation, so the actual number of function evaluations may exceed
this value.
max_walltime_s – Maximum walltime in seconds. Will only be checked between local
optimizations and other simulations, and thus, may be exceeded by
the duration of a local search.
balance (float) – Quality vs. diversity balancing factor with
\(0 \leq balance \leq 1\); 0 = only quality,
1 = only diversity.
Affects the choice of starting points for local searches. I.e.,
whether local optimization should focus on improving the best
solutions found so far (quality), or on exploring new regions of
the parameter space (diversity).
Ignored if local_optimizer=None.
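As an illustration, one possible quality/diversity scoring scheme controlled by balance (hypothetical; pyPESTO's actual selection rule may differ):

```python
def select_start_point(fvals, distances, balance):
    """Pick a local-search start point by blending a quality ranking
    (low objective value) with a diversity ranking (large distance to
    previously refined points), weighted by `balance` in [0, 1]."""
    n = len(fvals)
    quality_rank = sorted(range(n), key=lambda i: fvals[i])
    diversity_rank = sorted(range(n), key=lambda i: -distances[i])
    score = [0.0] * n
    for rank, i in enumerate(quality_rank):
        score[i] += (1 - balance) * rank  # balance=0: quality only
    for rank, i in enumerate(diversity_rank):
        score[i] += balance * rank        # balance=1: diversity only
    return min(range(n), key=score.__getitem__)
```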
n_procs – Number of parallel processes to use for parallel function
evaluation. Mutually exclusive with n_threads.
n_threads – Number of parallel threads to use for parallel function evaluation.
Mutually exclusive with n_procs.
history – History of the best values/parameters found so far.
(Monotonously decreasing objective values.)
result_includes_refset (bool) – Whether the minimize() result should include the final
RefSet, or just the local search results and the overall best
parameters.
hessian_update (None | HessianApproximation) – Hessian update strategy. If None, a hybrid approximation
that switches from the Hessian (approximation) provided by
problem.objective to a BFGS approximation will be used.
method – Local or global Optimizer to use for minimization.
local_method – Local method to use in combination with the global optimizer
(for the MLSL family of solvers) or to solve a subproblem (for the
AUGLAG family of solvers).
options (dict) – Optimizer options. scipy option maxiter is automatically
transformed into maxeval and takes precedence.
local_options (dict) – Optimizer options for the local method
allow_failed_starts (bool) – Flag indicating whether we tolerate that exceptions are thrown during
the minimization process.
report_sres (bool) – Flag indicating whether sres will be stored in the results object.
Deactivating this option will improve memory consumption for large
scale problems.
report_hess (bool) – Flag indicating whether hess will be stored in the results object.
Deactivating this option will improve memory consumption for large
scale problems.
history_beats_optimizer (bool) – Whether the optimal value recorded by pyPESTO in the history has
priority over the optimal value reported by the optimizer (True)
or not (False).
par_popsize (float) – Number of particles in the swarm; default: 10.
options (dict) – Optimizer options that are directly passed on to pyswarms.
c1: cognitive parameter
c2: social parameter
w: inertia parameter
Default values are (c1, c2, w) = (0.5, 0.3, 0.9).
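For example, the defaults above could be passed explicitly (option names as listed; values are the stated defaults):

```python
# Explicit pyswarms hyperparameters: cognitive (c1), social (c2), inertia (w)
options = {"c1": 0.5, "c2": 0.3, "w": 0.9}
```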
Examples
Arguments that can be passed to options:
maxiter:
used to calculate the maximal number of function evaluations.
Default: 1000
__call__() will forward the walltime limit and function evaluation
limit imposed on SacessOptimizer to FidesOptimizer.
Besides that, default options are used.
A shared-memory-based implementation of the
Self-Adaptive Cooperative Enhanced Scatter Search (SaCeSS) algorithm
presented in Penas et al.[4]. This is a meta-heuristic for
global optimization. Multiple processes (workers) run
enhanced scatter searches (ESSs) in parallel.
After each ESS iteration, depending on the outcome, there is a chance
of exchanging good parameters, and changing ESS hyperparameters to those of
the most promising worker. See Penas et al.[4] for details.
SacessOptimizer can be used with or without a local optimizer, but
it is highly recommended to use one.
A basic example using SacessOptimizer to minimize the Rosenbrock
function:
>>> from pypesto.optimize import SacessOptimizer
>>> from pypesto.problem import Problem
>>> from pypesto.objective import Objective
>>> import scipy as sp
>>> import numpy as np
>>> import logging
>>> # Define some test Problem
>>> objective = Objective(
...     fun=sp.optimize.rosen,
...     grad=sp.optimize.rosen_der,
...     hess=sp.optimize.rosen_hess,
... )
>>> dim = 6
>>> problem = Problem(
...     objective=objective,
...     lb=-5 * np.ones((dim, 1)),
...     ub=5 * np.ones((dim, 1)),
... )
>>> # Create and run the optimizer
>>> sacess = SacessOptimizer(
...     num_workers=2,
...     max_walltime_s=5,
...     sacess_loglevel=logging.WARNING,
... )
>>> result = sacess.minimize(problem)
List of the histories of the best values/parameters
found by each worker. (Monotonously decreasing objective values.)
See pypesto.visualize.optimizer_history.sacess_history() for
visualization.
List of argument dictionaries passed to
ESSOptimizer.__init__(). Each entry corresponds to one worker
process. I.e., the length of this list is the number of ESSs.
Ideally, this list contains some more conservative and some more
aggressive configurations.
Resource limits such as max_eval apply to a single ESS
iteration, not to the full search.
Mutually exclusive with num_workers.
num_workers (int | None) – Number of workers to be used. If this argument is given,
(different) default ESS settings will be used for each worker.
Mutually exclusive with ess_init_args.
See get_default_ess_options() for details on the default
settings.
max_walltime_s (float) – Maximum walltime in seconds. It will only be checked between local
optimizations and other simulations, and thus, may be exceeded by
the duration of a local search. Defaults to no limit.
Note that in order to impose the wall time limit also on the local
optimizer, the user has to provide a wrapper function similar to
SacessFidesFactory.__call__().
tmpdir (Path | str) – Directory for temporary files. This defaults to a directory in the
current working directory named SacessOptimizerTemp-{randomsuffix}.
When setting this option, make sure any optimizers running in
parallel have a unique tmpdir. Expected to be empty.
mp_start_method (str) – The start method for the multiprocessing context.
See multiprocessing for details. Running SacessOptimizer
under Jupyter may require mp_start_method="fork".
Note that if this function is called from a multithreaded program (
multiple threads running at the time of calling this function) and
the multiprocessing start method is set to fork, deadlocks are
likely. Postpone spawning threads until after minimize, or change
the start method to spawn.
Parameters:
problem (Problem) – Minimization problem.
Problem.startpoint_method() will be used to sample random
points. SacessOptimizer will deal with non-evaluable points.
Therefore, using pypesto.startpoint.CheckedStartpoints
with check_fval=True or check_grad=True is not recommended
since it would create significant overhead.
startpoint_method (StartpointMethod) – Method for choosing starting points.
Deprecated. Use ``problem.startpoint_method`` instead.
_ – Result object with optimized parameters in
pypesto.Result.optimize_result.
Results are sorted by objective. At least the best parameters are
included. Additional results may be included - this is subject to
change.
manager_initial_rejection_threshold (float) – Initial threshold for relative objective improvements that
incoming solutions have to pass to be accepted. If the number of
rejected solutions exceeds the number of workers, the threshold is
halved until it reaches manager_minimum_rejection_threshold.
manager_minimum_rejection_threshold (float) – Minimum value to which the rejection threshold may be halved
(see manager_initial_rejection_threshold).
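The manager's threshold adaptation described above can be sketched as follows (hypothetical helper mirroring the text, not pyPESTO API):

```python
def updated_threshold(threshold, n_rejected, num_workers, minimum):
    """Halve the acceptance threshold once the number of rejected
    solutions exceeds the number of workers, but never below `minimum`."""
    if n_rejected > num_workers:
        threshold = max(threshold / 2, minimum)
    return threshold
```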
worker_acceptance_threshold (float) – Minimum relative improvement of the objective compared to the best
known value to be eligible for submission to the Manager.
Hyperparameters that control when the workers will adapt their settings
based on the performance of the other workers.
The adaptation step is performed if all the following conditions are
met:
The number of function evaluations since the last solution was sent
to the manager times the number of optimization parameters is greater
than adaptation_min_evals.
The number of solutions received by the worker since the last
solution it sent to the manager is greater than
adaptation_sent_coeff * n_sent_solutions + adaptation_sent_offset,
where n_sent_solutions is the number of solutions sent to the
manager by the given worker.
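The two adaptation conditions can be expressed directly (hypothetical helper following the text above):

```python
def should_adapt(n_evals_since_sent, n_parameters, n_received_since_sent,
                 n_sent_solutions, adaptation_min_evals,
                 adaptation_sent_coeff, adaptation_sent_offset):
    """Both conditions must hold for the worker to adapt its settings."""
    # Evaluations since the last sent solution, times problem dimension
    cond_evals = (n_evals_since_sent * n_parameters) > adaptation_min_evals
    # Solutions received since the last sent solution
    cond_received = n_received_since_sent > (
        adaptation_sent_coeff * n_sent_solutions + adaptation_sent_offset
    )
    return cond_evals and cond_received
```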
Returns settings for num_workers parallel scatter searches, combining
more aggressive and more conservative configurations. Mainly intended for
use with SacessOptimizer. For details on the different options,
see keyword arguments of ESSOptimizer.__init__().
Setting appropriate values for n_threads and local_optimizer is
left to the user. Defaults to single-threaded and no local optimizer.
dim – Problem dimension (number of optimized parameters).
local_optimizer – The local optimizer to use (see the same argument in
ESSOptimizer): a boolean indicating whether to use the default
local optimizer (currently FidesOptimizer), an Optimizer
instance, or a Callable returning an optimizer instance.
The latter can be used to propagate walltime limits to the local
optimizers. See SacessFidesFactory.__call__() for an example.
The current default optimizer assumes that the optimized objective
function can provide its gradient. If this is not the case, the user
should provide a different local optimizer or consider using
pypesto.objective.finite_difference.FD to approximate the
gradient using finite differences.
result (Result) – A result object to append the optimization results to. For example,
one might append more runs to a previous optimization. If None,
a new object is created.
progress_bar (bool) – Whether to display a progress bar.
options (OptimizeOptions) – Various options applied to the multistart optimization.
history_options (HistoryOptions) – Optimizer history options.
filename (Union[str, Callable, None]) – Name of the hdf5 file, where the result will be saved. Default is
None, which deactivates automatic saving. If set to
Auto it will automatically generate a file named
year_month_day_profiling_result.hdf5.
Optionally a method, see docs for pypesto.store.auto.autosave().
overwrite (bool) – Whether to overwrite result/optimization in the autosave file
if it already exists.
problem (Optional[Problem]) – The problem to find optimal parameters for.
If None, bounds will be assumed to be [-inf, inf] for checking for
admissible points.