Optimize
Multistart optimization with support for various optimizers.
class pypesto.optimize.CmaesOptimizer(par_sigma0: float = 0.25, options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global optimization using CMA-ES.

Package homepage: https://pypi.org/project/cma-es/
__init__(par_sigma0: float = 0.25, options: Optional[Dict] = None)

Initialize.

- Parameters
par_sigma0 – Scalar, initial standard deviation in each coordinate. par_sigma0 should be about 1/4 of the width of the search domain in which the optimum is expected.
options – Optimizer options that are directly passed on to cma.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
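As a rough illustration of the par_sigma0 rule of thumb above (a sketch only; the bounds are made up, and the commented constructor call assumes pypesto and cma are installed):

```python
import numpy as np

# Hypothetical box constraints of a 2-parameter problem
lb = np.array([-5.0, -5.0])
ub = np.array([5.0, 5.0])

# Rule of thumb from the docstring: about 1/4 of the search domain width
par_sigma0 = 0.25 * float(np.min(ub - lb))

print(par_sigma0)  # 2.5
# optimizer = pypesto.optimize.CmaesOptimizer(par_sigma0=par_sigma0)
```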
class pypesto.optimize.DlibOptimizer(options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Use the Dlib toolbox for optimization.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
class pypesto.optimize.FidesOptimizer(hessian_update: Optional[fides.HessianApproximation] = 'default', options: Optional[Dict] = None, verbose: Optional[int] = 20)

Bases: pypesto.optimize.optimizer.Optimizer

Global/Local optimization using the trust region optimizer fides.

Package homepage: https://fides-optimizer.readthedocs.io/en/latest
__init__(hessian_update: Optional[fides.HessianApproximation] = 'default', options: Optional[Dict] = None, verbose: Optional[int] = 20)

Initialize.

- Parameters
hessian_update – Hessian update strategy. If this is None, a hybrid approximation that switches from the Hessian (approximation) provided by problem.objective to a BFGS approximation will be used.
options – Optimizer options.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
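The verbose default of 20 is a Python logging level. A small sketch of what that number means (the constructor call is commented out and assumes fides is installed):

```python
import logging

# The default verbose=20 corresponds to logging.INFO;
# logging.WARNING (30) would silence per-iteration output.
print(logging.INFO, logging.WARNING)  # 20 30

# optimizer = pypesto.optimize.FidesOptimizer(verbose=logging.WARNING)
```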
class pypesto.optimize.IpoptOptimizer(options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Use Ipopt (https://pypi.org/project/ipopt/) for optimization.
__init__(options: Optional[Dict] = None)

Initialize.

- Parameters
options – Options are directly passed on to cyipopt.minimize_ipopt.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
class pypesto.optimize.NLoptOptimizer(method=None, local_method=None, options: Optional[Dict] = None, local_options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global/Local optimization using NLopt.

Package homepage: https://nlopt.readthedocs.io/en/latest/
__init__(method=None, local_method=None, options: Optional[Dict] = None, local_options: Optional[Dict] = None)

Initialize.

- Parameters
method – Local or global optimizer to use for minimization.
local_method – Local method to use in combination with the global optimizer (for the MLSL family of solvers) or to solve a subproblem (for the AUGLAG family of solvers).
options – Optimizer options. The scipy option maxiter is automatically transformed into maxeval and takes precedence.
local_options – Optimizer options for the local method.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
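The documented maxiter-to-maxeval translation can be sketched as follows (a simplification for illustration, not the actual implementation; the option values are made up):

```python
# scipy-style options as a user might pass them
options = {"maxiter": 500, "xtol_rel": 1e-6}

# pypesto maps the scipy key `maxiter` onto NLopt's `maxeval`
nlopt_options = {
    ("maxeval" if key == "maxiter" else key): value
    for key, value in options.items()
}

print(nlopt_options)  # {'maxeval': 500, 'xtol_rel': 1e-06}
```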
class pypesto.optimize.OptimizeOptions(allow_failed_starts: bool = True, report_sres: bool = True, report_hess: bool = True, history_beats_optimizer: bool = True)

Bases: dict

Options for the multistart optimization.
- Parameters
allow_failed_starts – Flag indicating whether we tolerate that exceptions are thrown during the minimization process.
report_sres – Flag indicating whether sres will be stored in the results object. Deactivating this option will reduce memory consumption for large-scale problems.
report_hess – Flag indicating whether hess will be stored in the results object. Deactivating this option will reduce memory consumption for large-scale problems.
history_beats_optimizer – Whether the optimal value recorded by pyPESTO in the history has priority over the optimal value reported by the optimizer (True) or not (False).
__init__(allow_failed_starts: bool = True, report_sres: bool = True, report_hess: bool = True, history_beats_optimizer: bool = True)

Initialize self. See help(type(self)) for accurate signature.
-
static
assert_instance
(maybe_options: Union[pypesto.optimize.options.OptimizeOptions, Dict]) → pypesto.optimize.options.OptimizeOptions[source]¶ Return a valid options object.
- Parameters
maybe_options (OptimizeOptions or dict) –
class pypesto.optimize.Optimizer

Bases: abc.ABC

Optimizer base class, not functional on its own.

An optimizer takes a problem, and possibly a start point, and then performs an optimization. It returns an OptimizerResult.
abstract minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
class pypesto.optimize.PyswarmOptimizer(options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global optimization using pyswarm.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
class pypesto.optimize.PyswarmsOptimizer(par_popsize: float = 10, options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global optimization using pyswarms.

Package homepage: https://pyswarms.readthedocs.io/en/latest/index.html
- Parameters
par_popsize – Number of particles in the swarm; default value 10.
options – Optimizer options that are directly passed on to pyswarms. c1: cognitive parameter; c2: social parameter; w: inertia parameter. Default values are (c1, c2, w) = (0.5, 0.3, 0.9).

Examples

Arguments that can be passed to options:

- maxiter: used to calculate the maximal number of function evaluations. Default: 1000
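The documented defaults written out as an explicit options dict (values taken from above; the constructor call is commented out and assumes pyswarms is installed):

```python
# Swarm hyperparameters with the documented default values
options = {
    "c1": 0.5,  # cognitive parameter
    "c2": 0.3,  # social parameter
    "w": 0.9,   # inertia parameter
    "maxiter": 1000,
}

# optimizer = pypesto.optimize.PyswarmsOptimizer(par_popsize=10, options=options)
print(options["w"])  # 0.9
```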
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
class pypesto.optimize.ScipyDifferentialEvolutionOptimizer(options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Global optimization using scipy’s differential evolution optimizer.

Package homepage: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html
- Parameters
options – Optimizer options that are directly passed on to scipy’s optimizer.

Examples

Arguments that can be passed to options:

- maxiter: used to calculate the maximal number of function evaluations by maxfevals = (maxiter + 1) * popsize * len(x). Default: 100
- popsize: population size; default value 15.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
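The documented evaluation budget can be computed directly; a worked instance of the formula above, using the documented defaults and a hypothetical 2-parameter problem:

```python
# Formula from the docs: maxfevals = (maxiter + 1) * popsize * len(x)
maxiter = 100   # documented default
popsize = 15    # documented default
n_par = 2       # dimension of a hypothetical 2-parameter problem

maxfevals = (maxiter + 1) * popsize * n_par
print(maxfevals)  # 3030
```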
class pypesto.optimize.ScipyOptimizer(method: str = 'L-BFGS-B', tol: Optional[float] = None, options: Optional[Dict] = None)

Bases: pypesto.optimize.optimizer.Optimizer

Use the SciPy optimizers.

Find details on the optimizer and configuration options at: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize
__init__(method: str = 'L-BFGS-B', tol: Optional[float] = None, options: Optional[Dict] = None)

Initialize base class.
minimize(problem: pypesto.problem.Problem, x0: numpy.ndarray, id: str, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None)
pypesto.optimize.fill_result_from_history(result: pypesto.result.optimize.OptimizerResult, optimizer_history: pypesto.objective.history.OptimizerHistory, optimize_options: Optional[pypesto.optimize.options.OptimizeOptions] = None) → pypesto.result.optimize.OptimizerResult

Overwrite some values in the result object with values in the history.
- Parameters
result (Result as reported from the used optimizer.) –
optimizer_history (History of function values recorded by the objective.) –
optimize_options (Options on e.g. how to override.) –
- Returns
result – the in-place modified result object.
- Return type
OptimizerResult
pypesto.optimize.minimize(problem: pypesto.problem.Problem, optimizer: Optional[pypesto.optimize.optimizer.Optimizer] = None, n_starts: int = 100, ids: Optional[Iterable[str]] = None, startpoint_method: Optional[Union[pypesto.startpoint.base.StartpointMethod, Callable, bool]] = None, result: Optional[pypesto.result.result.Result] = None, engine: Optional[pypesto.engine.base.Engine] = None, progress_bar: bool = True, options: Optional[pypesto.optimize.options.OptimizeOptions] = None, history_options: Optional[pypesto.objective.history.HistoryOptions] = None, filename: Optional[Union[str, Callable]] = 'Auto', overwrite: bool = False) → pypesto.result.result.Result

Do multistart optimization.
- Parameters
problem – The problem to be solved.
optimizer – The optimizer to be used n_starts times.
n_starts – Number of starts of the optimizer.
ids – Ids assigned to the startpoints.
startpoint_method – Method for how to choose start points. False means the optimizer does not require start points, e.g. for the ‘PyswarmOptimizer’.
result – A result object to append the optimization results to. For example, one might append more runs to a previous optimization. If None, a new object is created.
engine – Parallelization engine. Defaults to sequential execution on a SingleCoreEngine.
progress_bar – Whether to display a progress bar.
options – Various options applied to the multistart optimization.
history_options – Optimizer history options.
filename – Name of the hdf5 file, where the result will be saved. Default is “Auto”, in which case it will automatically generate a file named year_month_day_optimization_result.hdf5. Deactivate saving by setting filename to None. Optionally a method, see docs for pypesto.store.auto.autosave.
overwrite – Whether to overwrite result/optimization in the autosave file if it already exists.
- Returns
Result object containing the results of all multistarts in result.optimize_result.
- Return type
Result
pypesto.optimize.optimization_result_from_history(filename: str, problem: pypesto.problem.Problem) → pypesto.result.result.Result

Convert a saved hdf5 History to an optimization result.
Used for interrupted optimization runs.
- Parameters
filename – The name of the file in which the information is stored.
problem – Problem, needed to identify what parameters to accept.
- Returns
A result object in which the optimization result is constructed from history, but with the "Time", "Message" and "Exitflag" keys missing.
pypesto.optimize.read_result_from_file(problem: pypesto.problem.Problem, history_options: pypesto.objective.history.HistoryOptions, identifier: str) → pypesto.result.optimize.OptimizerResult

Fill an OptimizerResult from history.
- Parameters
problem – The problem to find optimal parameters for.
identifier – Multistart id.
history_options – Optimizer history options.
pypesto.optimize.read_results_from_file(problem: pypesto.problem.Problem, history_options: pypesto.objective.history.HistoryOptions, n_starts: int) → pypesto.result.result.Result

Fill a Result from a set of histories.
- Parameters
problem – The problem to find optimal parameters for.
n_starts – Number of performed multistarts.
history_options – Optimizer history options.